I can't load HuggingFace (transformers) pretrained models. I'm trying to run the .from_pretrained() function as follows:
from transformers import BertModel, AutoModel
model = BertModel.from_pretrained('bert-base-uncased')
and I get the following error:
AttributeError: module 'torch' has no attribute '_utils'
In case it helps, this is the full traceback I get when I run it:
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ in <module>:2 │
│ │
│ 1 from transformers import BertModel, AutoModel │
│ ❱ 2 model = BertModel.from_pretrained('bert-base-uncased') │
│ 3 │
│ │
│ /Users/gpt_env/lib/python3.10/site-packages/transformers/modeling_utils.py:2600 │
│ in from_pretrained │
│ │
│ 2597 │ │ if from_pt: │
│ 2598 │ │ │ if not is_sharded and state_dict is None: │
│ 2599 │ │ │ │ # Time to load the checkpoint │
│ ❱ 2600 │ │ │ │ state_dict = load_state_dict(resolved_archive_file) │
│ 2601 │ │ │ │
│ 2602 │ │ │ # set dtype to instantiate the model under: │
│ 2603 │ │ │ # 1. If torch_dtype is not None, we use that dtype │
│ │
│ /Users/gpt_env/lib/python3.10/site-packages/transformers/modeling_utils.py:446 │
│ in load_state_dict │
│ │
│ 443 │ """ │
│ 444 │ if checkpoint_file.endswith(".safetensors") and is_safetensors_available(): │
│ 445 │ │ # Check format of the archive │
│ ❱ 446 │ │ with safe_open(checkpoint_file, framework="pt") as f: │
│ 447 │ │ │ metadata = f.metadata() │
│ 448 │ │ if metadata.get("format") not in ["pt", "tf", "flax"]: │
│ 449 │ │ │ raise OSError( │
│ │
│ /Users/gpt_env/lib/python3.10/site-packages/torch/storage.py:779 in from_file │
│ │
│ 776 │ │ untyped_storage: UntypedStorage = UntypedStorage.from_file( │
│ 777 │ │ │ filename, │
│ 778 │ │ │ shared, │
│ ❱ 779 │ │ │ size * torch._utils._element_size(cls.dtype)) │
│ 780 │ │ storage = cls(wrap_storage=untyped_storage) │
│ 781 │ │ return storage │
│ 782
I have also tried other models with the same from_pretrained() function, and similar errors occur.
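For example, a call like the following fails the same way (the checkpoint name here is only illustrative; the exact models I tried are not listed):

from transformers import AutoModel
# Any other hub checkpoint triggers the same AttributeError on my machine
model = AutoModel.from_pretrained('roberta-base')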
Any help is greatly appreciated!
1 Answer
Could you try replicating my package versions? It works for me with those.
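The concrete version list is not included in the answer above. As a minimal sketch for comparing the two environments (the package set here is assumed from the traceback: torch, transformers, safetensors, tokenizers), something like this prints the installed versions and checks that torch._utils, where the traceback ends, is actually importable:

import importlib
from importlib.metadata import version, PackageNotFoundError

# Print the versions of the packages involved in the failing call
for pkg in ("torch", "transformers", "safetensors", "tokenizers"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")

# The traceback dies calling torch._utils._element_size, so verify that the
# submodule imports and exposes that helper in this interpreter
try:
    torch_utils = importlib.import_module("torch._utils")
    print("torch._utils imports fine:", hasattr(torch_utils, "_element_size"))
except Exception as exc:
    # A failure here points at a broken or partially installed torch
    print("torch._utils failed to import:", exc)

If the versions differ from the answerer's, or if torch._utils fails to import, reinstalling torch (and safetensors) in a clean environment is a reasonable next step; that suggestion is my own assumption, not something the original answer states.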