Incorrect path_or_model_id
#5 opened by officialtgmteam
Hello, I have a problem running the model on my PC. I want to run it in offline mode, but I get an error when I run it:
"Incorrect path_or_model_id: 'D:/Qwen--Qwen2.5-3B-Instruct/.cache/models--Qwen--Qwen2.5-3B-Instruct/snapshots/aa8e72537993ba99e69dfaafaafaafa59ed015b17504d1'. Please provide either the path to a local folder or the repo_id of a model on the Hub."
I have downloaded the model into this folder, but it doesn't see it, and I don't know why it returns this error. My code is here:
try:
    from pathlib import Path

    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    # Make sure the cache directory exists.
    Path(self.qwen_cache_dir).mkdir(parents=True, exist_ok=True)

    pretrained_net = "Qwen/Qwen2.5-3B-Instruct"
    # Local snapshot folder of the downloaded model:
    # D:/Qwen--Qwen2.5-3B-Instruct/.cache/models--Qwen--Qwen2.5-3B-Instruct/snapshots/aa8e72537993ba99e69dfaafaafaafa59ed015b17504d1
    pretrained = self.qwen_model_path
    model_name = "Qwen/Qwen2.5-3B-Instruct"
    self.local_model = "Qwen/Qwen2.5-3B-Instruct"
    device_map = "auto"
    torch_dtype = torch.float16

    try:
        self.tokenizer = AutoTokenizer.from_pretrained(
            pretrained,
            local_files_only=True,
            cache_dir=self.qwen_cache_dir,
        )
        print('Tokenizer run.')
    except Exception as e:
        print(f'{e}')
        return

    try:
        self.model = AutoModelForCausalLM.from_pretrained(
            pretrained,
            torch_dtype=torch_dtype,
            device_map=device_map,
            local_files_only=True,
            cache_dir=self.qwen_cache_dir,
        )
        self.model.eval()
        print('Qwen run.')
        self.model_name = model_name
    except Exception as e:
        print(f'{e}')
        return
Can you please help me run it without a connection to the internet? Thank you so much.
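
For context, this is roughly what I understood offline loading should look like (a minimal sketch based on my understanding of the docs, not code that currently works for me; the local_dir path is a placeholder and the use of HF_HUB_OFFLINE / TRANSFORMERS_OFFLINE is my assumption):

    # Minimal sketch of offline loading. Assumes the local folder already
    # contains config.json, the tokenizer files, and the model weights.
    import os

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Tell huggingface_hub and transformers not to touch the network.
    os.environ["HF_HUB_OFFLINE"] = "1"
    os.environ["TRANSFORMERS_OFFLINE"] = "1"

    local_dir = "D:/models/Qwen2.5-3B-Instruct"  # placeholder: folder containing config.json etc.

    tokenizer = AutoTokenizer.from_pretrained(local_dir, local_files_only=True)
    model = AutoModelForCausalLM.from_pretrained(
        local_dir,
        torch_dtype=torch.float16,
        device_map="auto",
        local_files_only=True,
    )
    model.eval()

Is there something missing in my folder, or is the path itself wrong?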