The model type in 'config.json' does not match the model types registered in 'transformers'.

#1
by ZmuX1n - opened

There appears to be a problem with the 'config.json' file in this model repository. I hit the following error when running the code:
```
Traceback (most recent call last):
  File "/user/checkdata/metaclip_check.py", line 7, in <module>
    model = AutoModel.from_pretrained("nielsr/metaclip-2-huge-worldwide-378", torch_dtype=torch.bfloat16, attn_implementation="sdpa")
  File "/user/zhangjincheng/ocr_data/code/transformers/src/transformers/models/auto/auto_factory.py", line 547, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/user/code/transformers/src/transformers/models/auto/configuration_auto.py", line 1271, in from_pretrained
    raise ValueError(
ValueError: The checkpoint you are trying to load has model type metaclip_2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```

The error went away after I changed the 'model_type' field in 'config.json' from {"model_type": "metaclip_2"} to {"model_type": "metaclip-2"}.
As it stands, this issue breaks downloading and loading the model directly through transformers, so I hope the author can fix it soon.
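As a local workaround while the repository is unchanged, the same edit can be applied to a downloaded copy of 'config.json'. A minimal sketch (the helper name and paths are my own, not part of transformers):

```python
import json
import tempfile
from pathlib import Path

def patch_model_type(config_path, new_type):
    """Rewrite the model_type field of a config.json in place (hypothetical helper)."""
    path = Path(config_path)
    config = json.loads(path.read_text())
    config["model_type"] = new_type
    path.write_text(json.dumps(config, indent=2))
    return config["model_type"]

# Demonstration against a throwaway config file:
with tempfile.TemporaryDirectory() as tmp:
    cfg = Path(tmp) / "config.json"
    cfg.write_text(json.dumps({"model_type": "metaclip_2"}))
    print(patch_model_type(cfg, "metaclip-2"))  # metaclip-2
```

You would then point `AutoModel.from_pretrained` at the patched local directory instead of the Hub repo id.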

Hi, please read the model card; this architecture isn't merged into Transformers yet.
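For architectures that have landed on the main branch of Transformers but not yet in a release, the usual way to pick them up is to install from source (check the model card for the exact requirement for this checkpoint):

```shell
pip install --upgrade git+https://github.com/huggingface/transformers.git
```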
