iansotnek committed
Commit a9c6436 · 1 Parent(s): 2e26302

Update README.md

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -57,8 +57,8 @@ Including `torch_dtype=torch.bfloat16` is generally recommended if this type is
 It is also fine to remove it if there is sufficient memory.
 
 ```python
-import torch
 from transformers import pipeline
+import torch
 
 generate_text = pipeline(model="aisquared/chopt-research-1_3b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
 ```
@@ -74,9 +74,9 @@ Alternatively, if you prefer to not use `trust_remote_code=True` you can downloa
 store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
 
 ```python
-import torch
 from instruct_pipeline import InstructionTextGenerationPipeline
 from transformers import AutoModelForCausalLM, AutoTokenizer
+import torch
 
 tokenizer = AutoTokenizer.from_pretrained("aisquared/chopt-research-1_3b", padding_side="left")
 model = AutoModelForCausalLM.from_pretrained("aisquared/chopt-research-1_3b", device_map="auto", torch_dtype=torch.bfloat16)
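
Note that the change above only reorders `import torch` relative to the other imports; it is behavior-preserving. Python caches each imported module in `sys.modules`, so the relative order of independent top-level imports does not change what gets initialized. A minimal stdlib sketch (using `json` and `os` as stand-ins for `transformers` and `torch`, since this illustration avoids heavyweight dependencies):

```python
import sys

# Importing two independent modules in one order...
import json  # stand-in for `from transformers import pipeline`
import os    # stand-in for `import torch`

# ...and then "again" in the opposite order returns the same cached objects:
import os as os_again
import json as json_again

# Each module was initialized exactly once and lives in sys.modules.
assert os is os_again and json is json_again
assert "os" in sys.modules and "json" in sys.modules
```

This is why the README edit is purely cosmetic (grouping the `torch` import with the code that uses `torch.bfloat16`) rather than a functional fix.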