---
library_name: transformers
license: apache-2.0
language: [ko]
tags: [lora, peft, adapters]
task_categories: [text-generation]
base_model: OpenPipe/gemma-3-27b-it-text-only
---

# sapie-model/SQL-grpo-lora-766

- ์ด ๋ฆฌํฌ๋Š” **LoRA/์–ด๋Œ‘ํ„ฐ ๊ฐ€์ค‘์น˜**๋งŒ ํฌํ•จํ•ฉ๋‹ˆ๋‹ค. ์ถ”๋ก  ์‹œ์—๋Š” ๋ฒ ์ด์Šค ๋ชจ๋ธ `OpenPipe/gemma-3-27b-it-text-only` ๊ณผ ํ•จ๊ป˜ ๋กœ๋“œํ•˜์„ธ์š”.

## Usage Example

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base = "OpenPipe/gemma-3-27b-it-text-only"
model_id = "sapie-model/SQL-grpo-lora-766"  # this adapter repo

tok = AutoTokenizer.from_pretrained(base)
# Load the base model first, then attach the LoRA adapter weights on top of it
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype='auto', device_map='auto')
model = PeftModel.from_pretrained(model, model_id)
```
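As a quick check that the adapter is applied, here is a minimal generation sketch; the prompt string is only an illustrative example and assumes the tokenizer ships a chat template:

```python
# Minimal inference sketch; the prompt below is only an illustrative example.
messages = [{"role": "user", "content": "Write a SQL query that lists all customers who placed an order in 2024."}]
inputs = tok.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt
print(tok.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```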