GPT-2

| Original author(s) | OpenAI |
|---|---|
| Initial release | 14 February 2019 |
| Current version | |
| Repository | https://github.com/openai/gpt-2 |
| Predecessor | GPT-1 |
| Successor | GPT-3 |
| Type | |
| License | |
| Website | openai.com |
Generative Pre-trained Transformer 2 (GPT-2) is an open-source artificial intelligence created by OpenAI in February 2019.[2][3][4][5] GPT-2 can translate text, answer questions, summarize passages,[6] and generate text output. While its output is sometimes human-like,[7] it can become repetitive or nonsensical when generating long passages.[8] GPT-2 is a general-purpose learner, not specifically trained to perform any particular task,[9][6] and was created as a "direct scale-up" of OpenAI's 2018 GPT model,[10] with a tenfold increase in both its parameter count and the size of its training dataset.[5]
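As a concrete illustration of the text generation described above, the following minimal sketch samples a continuation from a GPT-2 checkpoint. It assumes the third-party Hugging Face `transformers` library (with PyTorch installed) rather than OpenAI's original TensorFlow release at https://github.com/openai/gpt-2; the prompt and sampling parameters are arbitrary choices for the example.

```python
# Minimal sketch: sampling text from GPT-2 via the Hugging Face
# "transformers" library (an assumption; not OpenAI's original code).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # smallest (124M) checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In a shocking finding, scientists discovered"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Sample a continuation. Long generations tend to drift or repeat,
# which is the failure mode noted in the paragraph above; greedy
# decoding (do_sample=False) makes the repetition especially visible.
output_ids = model.generate(
    input_ids,
    max_length=60,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```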
References
1. https://openai.com/blog/gpt-2-1-5b-release/.
2. Piper, Kelsey. "A poetry-writing AI has just been unveiled. It's ... pretty good." Vox. 15 May 2019 [19 December 2020]. Archived from the original on 7 November 2020.
3. Johnson, Khari. "OpenAI releases curtailed version of GPT-2 language model." VentureBeat. 20 August 2019 [19 December 2020]. Archived from the original on 18 December 2020.
4. Vincent, James. "OpenAI has published the text-generating AI it said was too dangerous to share." The Verge. 7 November 2019 [19 December 2020]. Archived from the original on 11 June 2020.
5. "Better Language Models and Their Implications." OpenAI. 14 February 2019 [19 December 2020]. Archived from the original on 19 December 2020.
6. Hegde. "Unsupervised Paraphrase Generation using Pre-trained Language Models." arXiv:2006.05477.
7. Kaiser, Caleb. "Too big to deploy: How GPT-2 is breaking servers." Towards Data Science. 31 January 2020 [27 February 2021]. Archived from the original on 15 February 2020.
8. Hern, Alex. "New AI fake text generator may be too dangerous to release, say creators." The Guardian. 14 February 2019 [19 December 2020]. Archived from the original on 14 February 2019.
9. Radford, Alec; Wu, Jeffrey; Child, Rewon; Luan, David; Amodei, Dario; Sutskever, Ilya. "Language models are unsupervised multitask learners" (PDF) 1 (8). 14 February 2019 [19 December 2020]. Archived (PDF) from the original on 6 February 2021.
10. Radford, Alec; Narasimhan, Karthik; Salimans, Tim; Sutskever, Ilya. "Improving Language Understanding by Generative Pre-Training" (PDF). OpenAI: 12. 11 June 2018 [23 January 2021]. Archived (PDF) from the original on 26 January 2021.