Supported pretrained weights:
- Google's original BERT: https://github.com/google-research/bert
- brightmart's RoBERTa: https://github.com/brightmart/roberta_zh
- HIT's RoBERTa-wwm: https://github.com/ymcui/Chinese-BERT-wwm
- Google's original ALBERT [example]: https://github.com/google-research/ALBERT
- brightmart's ALBERT: https://github.com/brightmart/albert_zh
- Converted ALBERT: https://github.com/bojone/albert_zh
- Huawei's NEZHA: https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/NEZHA-TensorFlow
- Huawei's NEZHA-GEN: https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/NEZHA-Gen-TensorFlow
- Zhuiyi's in-house language models: https://github.com/ZhuiyiTechnology/pretrained-models
- T5: https://github.com/google-research/text-to-text-transfer-transformer
- GPT_OpenAI: https://github.com/bojone/CDial-GPT-tf
- GPT2_ML: https://github.com/imcaspar/gpt2-ml
- Google's original ELECTRA: https://github.com/google-research/electra
- HIT's ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- CLUE's ELECTRA: https://github.com/CLUEbenchmark/ELECTRA
- LaBSE (multilingual BERT): https://github.com/bojone/labse
- Models from the Chinese-GEN project: https://github.com/bojone/chinese-gen
- T5.1.1: https://github.com/google-research/text-to-text-transfer-transformer/blob/master/released_checkpoints.md#t511
- Multilingual T5: https://github.com/google-research/multilingual-t5/
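All of the checkpoints above are loaded through one entry point. The sketch below assumes the bert4keras-style `build_transformer_model` API; the helper name `load_pretrained` and the path values are placeholders, not part of this list, and the exact `model` strings accepted depend on the installed bert4keras version:

```python
def load_pretrained(config_path, checkpoint_path, model='bert'):
    """Build a Keras model and load one of the checkpoints listed above.

    `model` selects the architecture variant, e.g. 'bert' (also used for
    the RoBERTa checkpoints), 'albert', 'nezha', 'electra', 't5'.
    Hypothetical wrapper; the real work is done by bert4keras.
    """
    # Deferred import so the helper can be defined without bert4keras installed.
    from bert4keras.models import build_transformer_model
    return build_transformer_model(
        config_path=config_path,          # e.g. 'bert_config.json'
        checkpoint_path=checkpoint_path,  # e.g. 'bert_model.ckpt'
        model=model,
    )
```

Usage would look like `load_pretrained('bert_config.json', 'bert_model.ckpt', model='bert')` after downloading one of the checkpoints above; the paths here are illustrative.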