We are not far away from having all available knowledge at our lips and fingertips in an instant.
New from Google Research! REALM: https://t.co/kS2oTyxAAj
We pretrain an LM that sparsely attends over all of Wikipedia as extra context. We backprop through a latent retrieval step on 13M docs. Yields new SOTA results for open domain QA, breaking 40 on NaturalQuestions-Open! pic.twitter.com/DYDFX69Td8
Kelvin Guu (@kelvin_guu), February 11, 2020
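The key phrase in the tweet is "backprop through a latent retrieval step": the retrieved document is treated as a latent variable, the model marginalizes over it, and the retriever's scores therefore receive gradients from the end task. Below is a minimal toy sketch of that idea, not the released REALM code; the dimensions, the tiny "reader", and all function names are hypothetical placeholders, and the real system scores ~13M Wikipedia passages with a learned neural retriever rather than a handful of random vectors.

```python
# Toy sketch of marginalizing over a latent retrieved document (REALM-style idea).
# Everything here is illustrative: sizes, the reader, and parameter names are made up.
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp

NUM_DOCS, DIM = 8, 16  # toy stand-ins for the ~13M Wikipedia passages


def reader_logprob(reader_w, query_emb, doc_emb):
    """Toy 'reader' p(y | x, z): scalar log-probability of the answer given query + doc."""
    joint = jnp.concatenate([query_emb, doc_emb])
    return jax.nn.log_sigmoid(joint @ reader_w)


def neg_log_marginal(params, query_emb):
    """-log p(y | x) = -log sum_z p(z | x) p(y | x, z), summed over candidate docs."""
    doc_embs, reader_w = params["doc_embs"], params["reader_w"]
    # Latent retrieval distribution p(z | x): softmax over dense inner-product scores.
    retrieve_logits = doc_embs @ query_emb                 # shape [NUM_DOCS]
    log_p_z = jax.nn.log_softmax(retrieve_logits)
    # Reader score for the answer under each candidate document.
    log_p_y_given_z = jax.vmap(
        lambda d: reader_logprob(reader_w, query_emb, d)
    )(doc_embs)
    # Marginalize over the latent document; gradients flow into the retriever scores.
    return -logsumexp(log_p_z + log_p_y_given_z)


key = jax.random.PRNGKey(0)
params = {
    "doc_embs": jax.random.normal(key, (NUM_DOCS, DIM)),
    "reader_w": jax.random.normal(key, (2 * DIM,)),
}
query = jax.random.normal(key, (DIM,))

loss, grads = jax.value_and_grad(neg_log_marginal)(params, query)
print(loss, grads["doc_embs"].shape)  # the document embeddings get gradients too
```

Because the loss is a sum weighted by the retrieval softmax, improving the answer likelihood pushes probability mass toward documents that actually help, which is what lets the retriever be trained end to end from the language-modeling / QA signal.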