mllmTeam committed on
Commit 384924e · verified · Parent(s): 2820e4c

Update README.md

Files changed (1): README.md +15 -2
README.md CHANGED
@@ -78,5 +78,18 @@ The training dataset PhoneLM used is comprised of a filtered mixture of open-sou
 | Gemma-2B | 71.4 | 65.2 | 78.4 | 91.4 | 69.9 | 72.3 | 42.0 | 70.09 |
 | Gemma 2-2B | 55.0 | 68.7 | 78.7 | 96.0 | 73.6 | 80.3 | 46.9 | 71.31 |
 
-## LICENSE
-* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-1.5B/blob/main/LICENSE) License.
+## License
+* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-1.5B/blob/main/LICENSE) License.
+
+## Citation
+```
+@misc{yi2024phonelmanefficientcapablesmall,
+      title={PhoneLM: an Efficient and Capable Small Language Model Family through Principled Pre-training},
+      author={Rongjie Yi and Xiang Li and Weikai Xie and Zhenyan Lu and Chenghua Wang and Ao Zhou and Shangguang Wang and Xiwen Zhang and Mengwei Xu},
+      year={2024},
+      eprint={2411.05046},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2411.05046},
+}
+```