---
library_name: transformers
license: apache-2.0
language: [ko]
tags: [lora, peft, adapters]
task_categories: [text-generation]
base_model: OpenPipe/gemma-3-27b-it-text-only
---
# sapie-model/SQL-sft-grpo-5000
- This repo contains only the **LoRA/adapter weights**. For inference, load them together with the base model `OpenPipe/gemma-3-27b-it-text-only`.
## Usage example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base = "OpenPipe/gemma-3-27b-it-text-only"
model_id = "sapie-model/SQL-sft-grpo-5000"  # this adapter repo

# Load the base model and tokenizer, then attach the LoRA adapter.
tok = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(model, model_id)
```
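
Once the adapter is attached, generation works as with any Gemma instruct model. A minimal sketch, assuming the tokenizer ships a chat template; the SQL prompt below is only an illustration:

```python
# Hypothetical prompt for illustration; replace with your own instruction.
messages = [{"role": "user", "content": "Write a SQL query that counts the rows in a table named orders."}]
inputs = tok.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tok.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```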