Pretrained model roundup: download links & surveys
A collection of download links for common pretrained models such as BERT, ALBERT, RoBERTa, and BART
1. Download links
Hugging Face
https://huggingface.co/hfl/chinese-roberta-wwm-ext/tree/main
makcedward/nlp
https://github.com/makcedward/nlp
Chinese pretrained models
https://github.com/lonePatient/awesome-pretrained-chinese-nlp-models
https://github.com/ymcui/MacBERT
https://github.com/lonePatient/NeZha_Chinese_PyTorch
https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/NEZHA-TensorFlow
Chinese BERT
https://github.com/stupidHIGH/bert_family_tasks
Chinese RoBERTa
https://github.com/ymcui/Chinese-BERT-wwm
WoBERT
https://github.com/ZhuiyiTechnology/WoBERT
N-gram masking: ERNIE / SpanBERT
https://www.zhihu.com/question/315190433/answer/618543076
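The n-gram masking idea used by ERNIE and SpanBERT replaces contiguous spans of tokens rather than isolated tokens, forcing the model to predict whole phrases from context. Below is a minimal self-contained sketch of the span-masking step only; the function name `ngram_mask` and the uniform span-length choice are illustrative assumptions, not code from any of the linked repos, and the real pretraining recipes add further details (whole-word alignment for Chinese, the 80/10/10 corruption split, geometric span lengths in SpanBERT) that are omitted here.

```python
import random

MASK = "[MASK]"  # placeholder token, as in BERT-style vocabularies


def ngram_mask(tokens, mask_prob=0.15, max_ngram=3, seed=None):
    """Mask contiguous spans (n-grams) of tokens, SpanBERT-style.

    Repeatedly picks a random span length in [1, max_ngram] and a random
    start position, masking tokens until roughly ``mask_prob`` of the
    sequence has been replaced with [MASK].
    """
    rng = random.Random(seed)
    tokens = list(tokens)
    num_to_mask = max(1, int(round(len(tokens) * mask_prob)))

    masked_positions = set()
    while len(masked_positions) < num_to_mask:
        n = rng.randint(1, max_ngram)          # span length for this draw
        start = rng.randrange(0, len(tokens))  # span start position
        for i in range(start, min(start + n, len(tokens))):
            if len(masked_positions) >= num_to_mask:
                break  # stop exactly at the masking budget
            masked_positions.add(i)

    out = [MASK if i in masked_positions else t for i, t in enumerate(tokens)]
    return out, sorted(masked_positions)


# Usage: mask ~30% of a 9-token sentence in contiguous chunks.
tokens = "the quick brown fox jumps over the lazy dog".split()
masked, positions = ngram_mask(tokens, mask_prob=0.3, seed=0)
```

Compared with single-token masking, masking spans makes the cloze task harder: the model cannot recover a masked token from its immediate masked neighbors, which is the motivation given in both the ERNIE and SpanBERT papers.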
2. Surveys:
Introduction to BERT
Survey of pretrained models, by Xipeng Qiu
Survey of pretrained models
P.S.: Most of the download pages above also include usage instructions and links to the related papers.