Failed when running the generated code snippets on Google Colab
#35 opened by hieultima
When I ran the model in a Google Colab notebook, I got the following error. As suggested in the notebook guidelines, I am filing this issue both here and on the GitHub model repo.
Below is the full, cell-by-cell output history of my current session:
Command 1: !pip install -U transformers
Output:
Requirement already satisfied: transformers in /usr/local/lib/python3.12/dist-packages (4.55.2)
Collecting transformers
Downloading transformers-4.55.4-py3-none-any.whl.metadata (41 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.0/42.0 kB 1.9 MB/s eta 0:00:00
Requirement already satisfied: filelock in /usr/local/lib/python3.12/dist-packages (from transformers) (3.19.1)
Requirement already satisfied: huggingface-hub<1.0,>=0.34.0 in /usr/local/lib/python3.12/dist-packages (from transformers) (0.34.4)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.12/dist-packages (from transformers) (2.0.2)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.12/dist-packages (from transformers) (25.0)
Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.12/dist-packages (from transformers) (6.0.2)
Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.12/dist-packages (from transformers) (2024.11.6)
Requirement already satisfied: requests in /usr/local/lib/python3.12/dist-packages (from transformers) (2.32.4)
Requirement already satisfied: tokenizers<0.22,>=0.21 in /usr/local/lib/python3.12/dist-packages (from transformers) (0.21.4)
Requirement already satisfied: safetensors>=0.4.3 in /usr/local/lib/python3.12/dist-packages (from transformers) (0.6.2)
Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.12/dist-packages (from transformers) (4.67.1)
Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.12/dist-packages (from huggingface-hub<1.0,>=0.34.0->transformers) (2025.3.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.12/dist-packages (from huggingface-hub<1.0,>=0.34.0->transformers) (4.14.1)
Requirement already satisfied: hf-xet<2.0.0,>=1.1.3 in /usr/local/lib/python3.12/dist-packages (from huggingface-hub<1.0,>=0.34.0->transformers) (1.1.7)
Requirement already satisfied: charset_normalizer<4,>=2 in /usr/local/lib/python3.12/dist-packages (from requests->transformers) (3.4.3)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.12/dist-packages (from requests->transformers) (3.10)
Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.12/dist-packages (from requests->transformers) (2.5.0)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.12/dist-packages (from requests->transformers) (2025.8.3)
Downloading transformers-4.55.4-py3-none-any.whl (11.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.3/11.3 MB 84.9 MB/s eta 0:00:00
Installing collected packages: transformers
Attempting uninstall: transformers
Found existing installation: transformers 4.55.2
Uninstalling transformers-4.55.2:
Successfully uninstalled transformers-4.55.2
Successfully installed transformers-4.55.4
Command 2:
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("microsoft/OmniParser-v2.0", trust_remote_code=True, torch_dtype="auto"),
Output:
---------------------------------------------------------------------------
JSONDecodeError Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/transformers/configuration_utils.py in _get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
742 # Load config dict
--> 743 config_dict = cls._dict_from_json_file(resolved_config_file)
744
8 frames
/usr/local/lib/python3.12/dist-packages/transformers/configuration_utils.py in _dict_from_json_file(cls, json_file)
844 text = reader.read()
--> 845 return json.loads(text)
846
/usr/lib/python3.12/json/__init__.py in loads(s, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
345 parse_constant is None and object_pairs_hook is None and not kw):
--> 346 return _default_decoder.decode(s)
347 if cls is None:
/usr/lib/python3.12/json/decoder.py in decode(self, s, _w)
337 """
--> 338 obj, end = self.raw_decode(s, idx=_w(s, 0).end())
339 end = _w(s, end).end()
/usr/lib/python3.12/json/decoder.py in raw_decode(self, s, idx)
355 except StopIteration as err:
--> 356 raise JSONDecodeError("Expecting value", s, err.value) from None
357 return obj, end
JSONDecodeError: Expecting value: line 1 column 1 (char 0)
During handling of the above exception, another exception occurred:
OSError Traceback (most recent call last)
/tmp/ipython-input-3342156634.py in <cell line: 0>()
1 # Load model directly
2 from transformers import AutoModel
----> 3 model = AutoModel.from_pretrained("microsoft/OmniParser-v2.0", trust_remote_code=True, torch_dtype="auto"),
/usr/local/lib/python3.12/dist-packages/transformers/models/auto/auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
545 _ = kwargs.pop("quantization_config")
546
--> 547 config, kwargs = AutoConfig.from_pretrained(
548 pretrained_model_name_or_path,
549 return_unused_kwargs=True,
/usr/local/lib/python3.12/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1248 code_revision = kwargs.pop("code_revision", None)
1249
-> 1250 config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
1251 has_remote_code = "auto_map" in config_dict and "AutoConfig" in config_dict["auto_map"]
1252 has_local_code = "model_type" in config_dict and config_dict["model_type"] in CONFIG_MAPPING
/usr/local/lib/python3.12/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
647 original_kwargs = copy.deepcopy(kwargs)
648 # Get config dict associated with the base config file
--> 649 config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
650 if config_dict is None:
651 return {}, kwargs
/usr/local/lib/python3.12/dist-packages/transformers/configuration_utils.py in _get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
745 config_dict["_commit_hash"] = commit_hash
746 except (json.JSONDecodeError, UnicodeDecodeError):
--> 747 raise OSError(f"It looks like the config file at '{resolved_config_file}' is not a valid JSON file.")
748
749 if is_local:
OSError: It looks like the config file at '/root/.cache/huggingface/hub/models--microsoft--OmniParser-v2.0/snapshots/6600256cb0f1b07651e3bc86166196307bad7e2d/config.json' is not a valid JSON file.
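For anyone else hitting this: the traceback shows that whatever gets resolved as config.json for this repo cannot be parsed as JSON, so AutoModel never obtains a usable transformers config. A diagnostic I would try (a minimal sketch using only the standard huggingface_hub API, not an official fix for this repo) is to re-download config.json with a forced download and print its first bytes. That distinguishes a corrupted cache entry from a repo whose top-level config.json simply is not a transformers-style JSON config:

```python
# Diagnostic sketch: inspect what is actually cached as config.json.
# Uses only huggingface_hub; nothing here is specific to OmniParser.
import json

from huggingface_hub import hf_hub_download

repo_id = "microsoft/OmniParser-v2.0"

# force_download=True bypasses a possibly corrupted cached copy.
path = hf_hub_download(repo_id=repo_id, filename="config.json", force_download=True)

with open(path, "r", encoding="utf-8") as f:
    text = f.read()

print("Resolved path:", path)
print("First 200 characters:")
print(text[:200])

try:
    json.loads(text)
    print("config.json parses as valid JSON")
except json.JSONDecodeError as exc:
    # If this still triggers after a fresh download, the file on the Hub is not a
    # transformers JSON config, and AutoModel.from_pretrained cannot load the repo
    # as a single model; the sub-models would need to be loaded individually,
    # following the instructions in the model repo's README.
    print(f"config.json is NOT valid JSON: {exc}")
```

(Separately, and unrelated to the error: the trailing comma after `from_pretrained(...)` in Command 2 wraps the returned model in a one-element tuple and is worth removing.)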
I have the same problem