---
library_name: transformers
license: apache-2.0
language:
  - ko
tags:
  - lora
  - peft
  - adapters
task_categories:
  - text-generation
base_model: OpenPipe/gemma-3-27b-it-text-only
---

# sapie-model/SQL-sft-grpo-5000

  • ์ด ๋ฆฌํฌ๋Š” LoRA/์–ด๋Œ‘ํ„ฐ ๊ฐ€์ค‘์น˜๋งŒ ํฌํ•จํ•ฉ๋‹ˆ๋‹ค. ์ถ”๋ก  ์‹œ์—๋Š” ๋ฒ ์ด์Šค ๋ชจ๋ธ OpenPipe/gemma-3-27b-it-text-only ๊ณผ ํ•จ๊ป˜ ๋กœ๋“œํ•˜์„ธ์š”.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base = "OpenPipe/gemma-3-27b-it-text-only"
model_id = "sapie-model/SQL-sft-grpo-5000"  # this adapter repository

# Load the base model first, then attach the LoRA adapter on top of it.
tok = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(model, model_id)
```
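Since the adapter targets SQL generation, a typical workflow is to build a text-to-SQL prompt, call `model.generate`, and extract the SQL from the decoded output. The prompt layout and the fenced-SQL response format below are assumptions for illustration, not documented by this repository; adjust them to match your fine-tuning data.

```python
import re


def build_prompt(question: str, schema: str) -> str:
    # Hypothetical prompt layout; align this with the format used in training.
    return (
        "### Schema\n" + schema.strip() + "\n\n"
        "### Question\n" + question.strip() + "\n\n"
        "### SQL\n"
    )


def extract_sql(generated: str) -> str:
    # Prefer a fenced ```sql block if the model emits one; otherwise
    # fall back to the raw generated text.
    m = re.search(r"```sql\s*(.*?)```", generated, re.DOTALL | re.IGNORECASE)
    return (m.group(1) if m else generated).strip()
```

With the model and tokenizer from the snippet above, this would be used roughly as: `inputs = tok(build_prompt(q, schema), return_tensors="pt").to(model.device)`, then `extract_sql(tok.decode(model.generate(**inputs, max_new_tokens=256)[0], skip_special_tokens=True))`.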