BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Time: 2021-11-22 02:05:49
[File Attributes]:
File name: BERT_Pre-training of Deep Bidirectional Transformers for Language Understanding
File size: 578KB
File format: PDF
Last updated: 2021-11-22 02:05:49
NLP: BERT, short for Bidirectional Encoder Representations from Transformers, is a method for pre-training language representations. A general-purpose "language understanding" model is trained on a large text corpus (such as Wikipedia) and then applied to downstream NLP tasks such as machine translation and question answering. Project page: https://github.com/google-research/bert#fine-tuning-with-bert
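Below is a minimal sketch of the pre-train-then-fine-tune workflow the description refers to. It uses the Hugging Face transformers library rather than the linked google-research/bert code; the bert-base-uncased checkpoint, the two-label classification head, and the learning rate are illustrative assumptions, not details taken from the original project.

# Minimal fine-tuning sketch (Hugging Face transformers, not google-research/bert).
# Checkpoint name, label count, and learning rate are illustrative assumptions.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # hypothetical 2-class downstream task
)

# Encode one example and run a forward pass with a label to obtain the task loss.
inputs = tokenizer("BERT learns bidirectional representations.",
                   return_tensors="pt", truncation=True, padding=True)
labels = torch.tensor([1])
outputs = model(**inputs, labels=labels)

# One fine-tuning step: backpropagate the loss into the pre-trained weights.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))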
