Tags: Text Generation · Transformers · Safetensors · GGUF · gemma3_text · turkish · türkiye · english · ai · lamapi · gemma3 · next · next-x1 · efficient · open-source · 1b · huggingface · large-language-model · llm · causal · transformer · artificial-intelligence · machine-learning · ai-research · natural-language-processing · nlp · finetuned · lightweight · creative · summarization · question-answering · chat-model · generative-ai · optimized-model · unsloth · trl · sft · chemistry · biology · finance · legal · music · art · code · climate · medical · agent · text-generation-inference · conversational
Update README.md
README.md CHANGED

@@ -124,26 +124,6 @@ library_name: transformers
 
 ---
 
-<style>
-table { width:fit-content; border-collapse:separate; border-spacing:0 3px;font-family:system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, 'Open Sans', 'Helvetica Neue', sans-serif;color:rgb(255, 255, 255)!important;background:rgb(28, 41, 59);border-radius:16px;padding: 10px; border:none;transition:.2s all ease;}
-thead th { text-align:center; padding:4px 10px; font-size:13px; text-transform:uppercase; color:rgb(255, 255, 255)!important;border:none; }
-tbody tr { transition: transform 0.18s ease, box-shadow 0.18s ease; border:none !important;transition:.2s all ease;border-radius:16px;background:rgba(11, 23, 27, 1);}
-tbody .next:hover {box-shadow:0 6px 15px rgba(0, 76, 148, 0.1);background: rgb(0, 59, 225)}
-tbody tr:hover { box-shadow:0 0px 15px rgba(12, 12, 12, 0.4); background:rgba(17, 34, 53, 1)}
-td { padding:8px 10px;border:0px transparent !important;outline:transparent !important; text-align:center; }
-td:first-child { font-weight:600;text-align:left }
-/* tbody .turkish td { background: rgba(255, 0, 0, 0.2) !important; color:rgb(200,200,200); font-weight:400;border:0px !important; scale:1.0; } */
-/* tbody .next td { background: rgba(0, 89, 255, 0.49)!important; color:rgb(200,200,200); font-weight:600;border:0px !important; scale:1.00;outline:none;border:none !important;} */
-.next{
-background: rgb(0, 89, 255);
-}
-tbody tr td:first-child { border-top-left-radius:12px; border-bottom-left-radius:12px; }
-tbody tr td:last-child { border-top-right-radius:12px; border-bottom-right-radius:12px; }
-strong{
-font-size:16px;font-weight:700;color:rgba(255, 255, 255, 1)!important;
-}
-em{opacity:1;font-size:11px !important;}
-</style>
 ## 📖 Overview
 
 **Next-1B** is a **1-billion parameter causal language model** based on **Gemma 3**, designed for **efficiency, low-resource deployment, and reasoning-focused natural language understanding**.
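Since the card tags the model for the Transformers library (with safetensors weights) and for conversational text generation, a minimal loading-and-generation sketch might look like the following. This is an illustration only, not a snippet from the card: the repository id `Lamapi/next-1b`, the presence of a chat template, and the sample prompt are assumptions.

```python
# Minimal sketch (assumed repo id and chat template): load a small
# Gemma-3-based causal LM with Hugging Face Transformers and generate a reply.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Lamapi/next-1b"  # hypothetical id; substitute the model's real Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # spread weights across available devices (requires accelerate)
)

messages = [
    {"role": "user", "content": "In two sentences, why do 1B-parameter models suit low-resource deployment?"}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The GGUF tag suggests the weights are also published in a llama.cpp-compatible format, though the specific GGUF files are not listed on this page.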