TensorFlow versions of the models:
Link: https://pan.baidu.com/s/10tjVfypoQy6G_mkZK6cqOQ?pwd=yljr
Extraction code: yljr
The tables below list, for each model: file name, alias, training corpus, framework, download link, and (where available) the Hugging Face ID used for loading.

GitHub: https://github.com/ymcui/Chinese-BERT-wwm (Pre-Training with Whole Word Masking for Chinese BERT)

| Model file | Alias | Corpus | Framework | Download | Hugging Face ID |
|---|---|---|---|---|---|
| chinese_roberta_wwm_large_ext_L-24_H-1024_A-16 | RoBERTa-wwm-ext-large, Chinese | Chinese Wikipedia plus other encyclopedia, news, and QA data; 5.4B tokens in total | tensorflow | Baidu Netdisk (link above) | |
| hfl/chinese-roberta-wwm-ext-large | | | pytorch | https://huggingface.co/hfl/chinese-roberta-wwm-ext-large | hfl/chinese-roberta-wwm-ext-large |
| chinese_roberta_wwm_ext_L-12_H-768_A-12 | RoBERTa-wwm-ext, Chinese | | tensorflow | Baidu Netdisk (link above) | |
| hfl/chinese-roberta-wwm-ext | | | pytorch | https://huggingface.co/hfl/chinese-roberta-wwm-ext | hfl/chinese-roberta-wwm-ext |
| chinese_bert_wwm_ext_L-12_H-768_A-12 | BERT-wwm-ext, Chinese | | tensorflow | Baidu Netdisk (link above) | |
| hfl/chinese-bert-wwm-ext | | | pytorch | https://huggingface.co/hfl/chinese-bert-wwm-ext | hfl/chinese-bert-wwm-ext |
| chinese_bert_wwm_L-12_H-768_A-12 | BERT-wwm, Chinese | | tensorflow | Baidu Netdisk (link above) | |
| hfl/chinese-bert-wwm | | | pytorch | | hfl/chinese-bert-wwm |
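The Hugging Face entries above can be loaded with the `transformers` library. One caveat noted on the hfl model cards: despite the "roberta" in the names, these checkpoints use the BERT architecture, so they must be loaded with the `Bert*` classes rather than `Roberta*`. A minimal sketch (the checkpoint is downloaded on first use):

```python
def load_wwm(model_id: str = "hfl/chinese-roberta-wwm-ext"):
    """Load one of the hfl whole-word-masking checkpoints listed above.

    These models are BERT-architecture despite the RoBERTa name, so the
    Bert* classes are required (Roberta* will not load them correctly).
    """
    from transformers import BertTokenizer, BertModel
    tokenizer = BertTokenizer.from_pretrained(model_id)
    model = BertModel.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_wwm()
    inputs = tokenizer("今天天气不错", return_tensors="pt")
    hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
```

The same pattern works for any of the `hfl/*` ids in the table; only `model_id` changes.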
GitHub: https://github.com/brightmart/roberta_zh (RoBERTa pre-trained models for Chinese)

| Model file | Alias | Corpus | Framework | Download | Loading |
|---|---|---|---|---|---|
| roberta_zh_l12 | RoBERTa_zh_L12 | 30 GB of raw text: nearly 0.3B sentences and 10B Chinese characters (tokens), yielding 0.25B training instances; covers news, community QA, and several encyclopedia sources | tensorflow | roberta_zh_l12.zip (Baidu Netdisk) | load directly as a BERT checkpoint |
| roeberta_zh_L-24_H-1024_A-16 | RoBERTa-zh-Large | | tensorflow | roeberta_zh_L-24_H-1024_A-16.zip (Baidu Netdisk) | load directly as a BERT checkpoint |
| RoBERTa_zh_L12_PyTorch | RoBERTa_zh_L12 | | pytorch | RoBERTa_zh_L12_PyTorch.zip (Baidu Netdisk) | load directly with the PyTorch version of BERT |
GitHub: https://github.com/google-research/bert (TensorFlow code and pre-trained models for BERT)

| Model file | Alias | Corpus | Download |
|---|---|---|---|
| chinese_L-12_H-768_A-12 | bert_base, Chinese | Chinese Wikipedia | https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip |
| uncased_L-2_H-128_A-2 | bert_tiny (one of the 24 bert_uncased models) | | https://storage.googleapis.com/bert_models/2020_02_20/all_bert_models.zip |
| uncased_L-4_H-256_A-4 | bert_mini (one of the 24 bert_uncased models) | | (same archive) |
| uncased_L-4_H-512_A-8 | bert_small (one of the 24 bert_uncased models) | | (same archive) |
| uncased_L-8_H-512_A-8 | bert_medium (one of the 24 bert_uncased models) | | (same archive) |
| uncased_L-12_H-768_A-12 | bert_base (one of the 24 bert_uncased models) | | (same archive) |
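The checkpoint file names above encode their architecture: in `uncased_L-12_H-768_A-12`, `L` is the number of transformer layers, `H` the hidden size, and `A` the number of attention heads. A small helper (hypothetical, purely for illustration) to decode this convention:

```python
import re

def parse_bert_name(name: str) -> dict:
    """Parse the L/H/A naming convention used by the checkpoints above:
    L = transformer layers, H = hidden size, A = attention heads."""
    m = re.search(r"L-(\d+)_H-(\d+)_A-(\d+)", name)
    if not m:
        raise ValueError(f"unrecognized checkpoint name: {name}")
    layers, hidden, heads = map(int, m.groups())
    return {"layers": layers, "hidden": hidden, "heads": heads}

print(parse_bert_name("uncased_L-12_H-768_A-12"))
# {'layers': 12, 'hidden': 768, 'heads': 12}
```

The same convention is used by the Chinese checkpoints above (e.g. `chinese_roberta_wwm_large_ext_L-24_H-1024_A-16` is a 24-layer model with hidden size 1024 and 16 heads).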
GitHub: https://github.com/ymcui/Chinese-XLNet (Pre-Trained Chinese XLNet)

| Model file | Alias | Corpus | Framework | Download | Hugging Face ID |
|---|---|---|---|---|---|
| chinese_xlnet_mid_L-24_H-768_A-12 | XLNet-mid, Chinese | Chinese Wikipedia plus other encyclopedia, news, and QA data; 5.4B tokens in total | tensorflow | Baidu Netdisk (link above) | hfl/chinese-xlnet-mid |
| hfl/chinese-xlnet-mid | | | pytorch | https://huggingface.co/hfl/chinese-xlnet-mid | |
| chinese_xlnet_base_L-12_H-768_A-12 | XLNet-base, Chinese | | tensorflow | Baidu Netdisk (link above) | hfl/chinese-xlnet-base |
| hfl/chinese-xlnet-base | | | pytorch | https://huggingface.co/hfl/chinese-xlnet-base | |
GitHub: https://github.com/brightmart/albert_zh (A Lite BERT for self-supervised learning of language representations; Chinese ALBERT models pre-trained on large corpora)

| Model file | Alias | Download |
|---|---|---|
| albert_tiny_zh | albert_tiny_zh | https://storage.googleapis.com/albert_zh/albert_tiny.zip |
| albert_tiny_489k | albert_tiny_zh (trained longer; 2B training samples in total) | https://storage.googleapis.com/albert_zh/albert_tiny_489k.zip |
| albert_tiny_zh_google | albert_tiny_google_zh (1B training samples; Google-format version) | https://storage.googleapis.com/albert_zh/albert_tiny_zh_google.zip |
| albert_small_zh_google | albert_small_google_zh (1B training samples; Google-format version) | https://storage.googleapis.com/albert_zh/albert_small_zh_google.zip |
| albert_large_zh | albert_large_zh | https://storage.googleapis.com/albert_zh/albert_large_zh.zip |
| albert_base_zh | albert_base_zh (small trial version) | https://storage.googleapis.com/albert_zh/albert_base_zh.zip |
| albert_base_zh_additional_36k_steps | albert_base_zh (trained on an extra 0.15B instances, i.e. 36k steps × batch size 4096) | https://storage.googleapis.com/albert_zh/albert_base_zh_additional_36k_steps.zip |
| albert_xlarge_zh_177k | albert_xlarge_zh_177k | https://storage.googleapis.com/albert_zh/albert_xlarge_zh_177k.zip |
| albert_xlarge_zh_183k | albert_xlarge_zh_183k (try this one first) | https://storage.googleapis.com/albert_zh/albert_xlarge_zh_183k.zip |

PyTorch versions on Hugging Face:

| Model | Framework | Download | Hugging Face ID |
|---|---|---|---|
| voidful/albert_chinese_tiny | pytorch | https://huggingface.co/voidful/albert_chinese_tiny | voidful/albert_chinese_tiny |
| voidful/albert_chinese_small | pytorch | https://huggingface.co/voidful/albert_chinese_small | voidful/albert_chinese_small |
| voidful/albert_chinese_base | pytorch | https://huggingface.co/voidful/albert_chinese_base | voidful/albert_chinese_base |
| voidful/albert_chinese_large | pytorch | https://huggingface.co/voidful/albert_chinese_large | voidful/albert_chinese_large |
| voidful/albert_chinese_xlarge | pytorch | https://huggingface.co/voidful/albert_chinese_xlarge | voidful/albert_chinese_xlarge |
| voidful/albert_chinese_xxlarge | pytorch | https://huggingface.co/voidful/albert_chinese_xxlarge | voidful/albert_chinese_xxlarge |
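One caveat with the `voidful/albert_chinese_*` conversions, noted on their model cards: they ship a BERT-style Chinese wordpiece vocabulary rather than a sentencepiece model, so the tokenizer must be `BertTokenizer` even though the weights load as `AlbertModel`. A sketch:

```python
def load_albert_chinese(model_id: str = "voidful/albert_chinese_tiny"):
    """Load one of the voidful ALBERT conversions listed above.

    Per the model cards, these checkpoints use a BERT-style wordpiece
    vocab, so AlbertTokenizer (which expects sentencepiece) will not
    work here -- BertTokenizer must be used instead.
    """
    from transformers import BertTokenizer, AlbertModel
    tokenizer = BertTokenizer.from_pretrained(model_id)
    model = AlbertModel.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_albert_chinese()
```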
GitHub: https://github.com/ymcui/Chinese-ELECTRA (Pre-trained Chinese ELECTRA)

| Model file | Alias | Framework | Download | Hugging Face ID |
|---|---|---|---|---|
| electra_180g_large | ELECTRA-180g-large, Chinese | tensorflow | Baidu Netdisk (link above) | hfl/chinese-electra-180g-large-discriminator |
| hfl/chinese-electra-180g-large-discriminator | | pytorch | https://huggingface.co/hfl/chinese-electra-180g-large-discriminator | |
| electra_180g_base | ELECTRA-180g-base, Chinese | tensorflow | Baidu Netdisk (link above) | hfl/chinese-electra-180g-base-discriminator |
| hfl/chinese-electra-180g-base-discriminator | | pytorch | https://huggingface.co/hfl/chinese-electra-180g-base-discriminator | |
| electra_180g_small_ex | ELECTRA-180g-small-ex, Chinese | tensorflow | Baidu Netdisk (link above) | hfl/chinese-electra-180g-small-ex-discriminator |
| hfl/chinese-electra-180g-small-ex-discriminator | | pytorch | https://huggingface.co/hfl/chinese-electra-180g-small-ex-discriminator | |
| electra_180g_small | ELECTRA-180g-small, Chinese | tensorflow | Baidu Netdisk (link above) | hfl/chinese-electra-180g-small-discriminator |
| hfl/chinese-electra-180g-small-discriminator | | pytorch | https://huggingface.co/hfl/chinese-electra-180g-small-discriminator | |
GitHub: https://github.com/ymcui/MacBERT (Revisiting Pre-trained Models for Chinese Natural Language Processing)

| Model file | Alias | Framework | Download | Hugging Face ID |
|---|---|---|---|---|
| chinese_macbert_large | MacBERT-large, Chinese | tensorflow | Baidu Netdisk (link above) | hfl/chinese-macbert-large |
| hfl/chinese-macbert-large | | pytorch | https://huggingface.co/hfl/chinese-macbert-large | |
| chinese_macbert_base | MacBERT-base, Chinese | tensorflow | Baidu Netdisk (link above) | hfl/chinese-macbert-base |
| hfl/chinese-macbert-base | | pytorch | https://huggingface.co/hfl/chinese-macbert-base | |
Other PyTorch models listed without links:

| Model | Framework |
|---|---|
| Erlangshen-SimCSE-110M-Chinese | pytorch |
| unsup-simcse-bert-base-uncased | pytorch |
| sup-simcse-roberta-large | pytorch |
| sup-simcse-bert-base-uncased | pytorch |
| albert-base-chinese-cluecorpussmall | pytorch |
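The SimCSE checkpoints in this last group are sentence-embedding models: encode each sentence, take the pooled vector, and compare sentences by cosine similarity. A sketch, with the caveat that the list above omits the Hugging Face organization names, so the hub id used below is an assumption:

```python
import math

def cosine(u, v):
    # Plain cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def embed(sentences, model_id="princeton-nlp/sup-simcse-bert-base-uncased"):
    # NOTE: model_id is an assumption -- the list above gives only
    # "sup-simcse-bert-base-uncased" without the hub organization.
    from transformers import AutoTokenizer, AutoModel
    import torch
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    batch = tokenizer(sentences, padding=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    return out.pooler_output  # one embedding per sentence

if __name__ == "__main__":
    a, b = embed(["A man is playing guitar.",
                  "A person plays an instrument."])
    print(cosine(a.tolist(), b.tolist()))  # higher = more similar
```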