
Language Models are Unsupervised Multitask Learners

Alec Radford*, Jeffrey Wu*, Rewon Child, David Luan, Dario Amodei**, Ilya Sutskever**

Abstract

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets. We demonstrate that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText. When conditioned on a document plus questions, the answers generated by the language model reach 55 F1 on the CoQA dataset - matching or exceeding the performance of 3 out of 4 baseline systems without using the 127,000+ training examples. The capacity of the language model is essential to the success of zero-shot task transfer and increasing it improves performance in a log-linear fashion across tasks.
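
The zero-shot question answering the abstract describes can be reproduced in outline with the publicly released GPT-2 weights. The following is a minimal sketch, assuming the Hugging Face transformers library and its "gpt2" checkpoint; the prompt template is an illustrative assumption rather than the paper's exact evaluation protocol, though the paper similarly primes the model with a final "A:" token on CoQA.

# Minimal zero-shot QA sketch: condition a language model on a document
# plus a question and read off the generated continuation as the answer.
# Assumes the Hugging Face transformers library and the public "gpt2"
# checkpoint; the prompt format below is illustrative, not the paper's
# exact conditioning scheme.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

document = "The Eiffel Tower was completed in 1889 in Paris, France."
question = "When was the Eiffel Tower completed?"
prompt = f"{document}\nQ: {question}\nA:"  # "A:" cues an answer continuation

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=10,
    do_sample=False,  # greedy decoding
    pad_token_id=tokenizer.eos_token_id,
)
# Decode only the tokens generated after the prompt.
answer = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:])
print(answer.strip())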

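The 55 F1 figure refers to the word-overlap F1 metric standard in conversational QA evaluation. Below is a small sketch of that metric, assuming simple whitespace tokenization; CoQA's official scorer additionally normalizes case and punctuation.

# Token-overlap F1 between a predicted and a reference answer, as used
# (with extra normalization) by CoQA-style evaluation scripts.
from collections import Counter

def token_f1(prediction: str, reference: str) -> float:
    pred_tokens = prediction.split()
    ref_tokens = reference.split()
    common = Counter(pred_tokens) & Counter(ref_tokens)  # multiset overlap
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

print(round(token_f1("in 1889", "1889"), 2))  # 0.67: partial credit for overlap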
