DeepPavlov/rubert-base-cased-conversational

Conversational RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and the social-media segment of the Taiga corpus[2]. We assembled a new vocabulary for the Conversational RuBERT model on this data and initialized the model with RuBERT.
08.11.2021: uploaded the model with MLM and NSP heads.
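
As a quick illustration (not part of the original card), the checkpoint can be loaded with the Hugging Face transformers library. The sketch below assumes transformers and torch are installed; it uses the masked-language-modeling head mentioned above to fill a [MASK] token in a conversational Russian sentence.

```python
# Minimal usage sketch (assumption: `transformers` and `torch` are installed).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

name = "DeepPavlov/rubert-base-cased-conversational"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name)

# Mask one token in a conversational Russian sentence.
text = f"Привет, как {tokenizer.mask_token}?"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the [MASK] position and take the highest-scoring vocabulary token.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```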
[1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016).
[2]: T. Shavrina and O. Shapovalova, 2017, To the Methodology of Corpus Construction for Machine Learning: "Taiga" Syntax Tree Corpus and Parser. In Proceedings of the "CORPORA 2017" International Conference, Saint-Petersburg, 2017.
