Update README.md #7
by adenn · opened

README.md CHANGED
@@ -7,6 +7,15 @@ tags:
 base_model: black-forest-labs/FLUX.1-dev
 instance_prompt: null
 license: mit
+datasets:
+- fka/awesome-chatgpt-prompts
+language:
+- zh
+metrics:
+- accuracy
+new_version: nvidia/Llama-3.1-Nemotron-70B-Instruct-HF
+pipeline_tag: text-classification
+library_name: adapter-transformers
 ---
 📢 [[Project Page](https://ali-vilab.github.io/In-Context-LoRA-Page/)] [[Github Repo](https://github.com/ali-vilab/In-Context-LoRA)] [[Paper](https://arxiv.org/abs/2410.23775)]
 # 🔥 Latest News
@@ -95,4 +104,4 @@ If you find this work useful in your research, please consider citing:
 
 Weights for these models are available in Safetensors format.
 
-[Download](/ali-vilab/In-Context-LoRA/tree/main) them in the Files & versions tab.
+[Download](/ali-vilab/In-Context-LoRA/tree/main) them in the Files & versions tab.