Commit 842c374
Parent(s): d83e41f
Delete bert/chinese-roberta-wwm-ext-large/README.md
bert/chinese-roberta-wwm-ext-large/README.md
DELETED
@@ -1,64 +0,0 @@
---
language:
- zh
tags:
- bert
license: "apache-2.0"
---

How to download the files:
```
curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
sudo apt-get install git-lfs
git clone https://huggingface.co/hfl/chinese-roberta-wwm-ext-large
```

# Please use 'Bert' related functions to load this model!
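
With the Hugging Face `transformers` library, that means loading the checkpoint through the BERT classes rather than the RoBERTa ones. A minimal sketch, assuming `transformers` and `torch` are installed and using the hub ID `hfl/chinese-roberta-wwm-ext-large` from the clone URL above:

```
# Minimal sketch: load this checkpoint with BERT classes, not RoBERTa classes.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext-large")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext-large")

# Encode a short Chinese sentence and run a forward pass.
inputs = tokenizer("使用全词掩码的中文预训练BERT", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```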

## Chinese BERT with Whole Word Masking
To further accelerate Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.

**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu

This repository is developed based on: https://github.com/google-research/bert

You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation
If you find the technical report or resource useful, please cite the following technical report in your paper.
- Primary: https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
    title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
    author = "Cui, Yiming and
      Che, Wanxiang and
      Liu, Ting and
      Qin, Bing and
      Wang, Shijin and
      Hu, Guoping",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
    pages = "657--668",
}
```
- Secondary: https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
  title={Pre-Training with Whole Word Masking for Chinese BERT},
  author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
  journal={arXiv preprint arXiv:1906.08101},
  year={2019}
}
```