spacy-transformers - update transformers compatibility

4dbbbstv · asked 5 months ago · 4 answers · 51 views

I'm using spacy-transformers 1.3.4, but it's incompatible with the latest transformers release (4.37.2). Are there any plans for an update? Thanks!

wkftcu5l #1

Is this just a version incompatibility, because we pin transformers to <4.37.0, or were you actually able to update your local transformers install and have everything still work as expected?
If I may ask, which version of spaCy are you using? Because as of 3.7 we've started moving towards https://github.com/explosion/spacy-curated-transformers - have you tried that?
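As a side note, the distinction being asked about here is whether only the declared pin fails, or the code itself. A minimal stdlib-only sketch (not from the thread; version strings are examples) of how an exclusive upper-bound pin like <4.37.0 rejects the reported 4.37.2:

```python
def below_pin(installed: str, upper: str) -> bool:
    """True if `installed` is strictly below the exclusive upper bound `upper`.

    Naive comparison of plain X.Y.Z version strings; enough to illustrate
    how a pin such as transformers<4.37.0 is evaluated.
    """
    as_tuple = lambda v: tuple(int(part) for part in v.split(".")[:3])
    return as_tuple(installed) < as_tuple(upper)

print(below_pin("4.36.2", "4.37.0"))  # True  - satisfies the pin
print(below_pin("4.37.2", "4.37.0"))  # False - the version the asker has
```

If the pin is the only problem, pip refuses the combination even though the code might run; that is exactly what the answer is asking the user to distinguish.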

z6psavjg #2

I'm using spacy==3.7.3.
Can I uninstall spacy-transformers in favour of spacy-curated-transformers?
With spacy-transformers==1.3.4 everything seems to work fine; I only get the version error.
With spacy-curated-transformers I get this error:

ValueError: [E002] Can't find factory for 'transformer' for language English (en). This usually happens when spaCy calls `nlp.create_pipe` with a custom component name that's not registered on the current language class. If you're using a custom component, make sure you've added the decorator `@Language.component` (for function components) or `@Language.factory` (for class components).

Available factories: attribute_ruler, tok2vec, merge_noun_chunks, merge_entities, merge_subtokens, token_splitter, doc_cleaner, parser, beam_parser, lemmatizer, trainable_lemmatizer, entity_linker, entity_ruler, tagger, morphologizer, ner, beam_ner, senter, sentencizer, spancat, spancat_singlelabel, span_finder, future_entity_ruler, span_ruler, textcat, textcat_multilabel, en.lemmatizer

when running:
spacy.load(path)

xoshrz7s #3

We had to pull the 3.7.3 release (for unrelated reasons - a bug in the multiprocessing code), so please try updating to 3.7.4.

Can I uninstall spacy-transformers in favour of spacy-curated-transformers?
Yes, but then you need to use curated_transformer as the factory instead of plain transformer. You can see an example config here:

import spacy
import spacy_curated_transformers  # noqa: F401 - registers the "curated_transformer" factory

spacy.load("path")

Which model are you loading? If it's a pretrained pipeline that uses the old spacy-transformers transformer factory, then you'll still need spacy-transformers. If it's one of our pretrained pipelines, you can probably update.
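One way to check which package a saved pipeline expects, as the answer above describes, is to look at the transformer component's factory name in its config.cfg. A small stdlib-only sketch (not part of the answer; the path and helper name are hypothetical) - spaCy's config format is INI-like enough for configparser to read a simple factory line:

```python
import configparser

def transformer_backend(config_path: str) -> str:
    """Map the transformer component's factory name to the package it needs."""
    cfg = configparser.ConfigParser()
    cfg.read(config_path)
    # spaCy config values are quoted, e.g. factory = "curated_transformer"
    factory = cfg.get("components.transformer", "factory", fallback="").strip('"')
    return {
        "transformer": "spacy-transformers",
        "curated_transformer": "spacy-curated-transformers",
    }.get(factory, "unknown")
```

If this reports spacy-transformers, the pipeline still needs the old package installed; if it reports spacy-curated-transformers, uninstalling spacy-transformers should be safe.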

pbpqsu0x #4

Thanks. I want to train a spancat pipeline with transformers. I installed spacy-curated-transformers and spacy==3.7.4, but ran into this error:

catalogue.RegistryError: [E892] Unknown function registry: 'span_getters'.

Available names: architectures, augmenters, batchers, callbacks, cli, datasets, displacy_colors, factories, initializers, languages, layers, lemmatizers, loggers, lookups, losses, misc, model_loaders, models, ops, optimizers, readers, schedules, scorers, tokenizers, vectors

I used the auto-generated partial config from the spaCy website ("This is an auto-generated partial config."), but it only covers spacy-transformers, so I tried to adapt it to spacy-curated-transformers. This is the actual cfg file I'm using for !python -m spacy init labels mycfg.cfg ...:

[paths]
train = null
dev = null
vectors = null
init_tok2vec = null

[system]
gpu_allocator = "pytorch"
seed = 0

[nlp]
lang = "en"
pipeline = ["transformer","spancat"]
batch_size = 512
disabled = []
before_creation = null
after_creation = null
after_pipeline_creation = null
tokenizer = {"@tokenizers":"spacy.Tokenizer.v1"}
vectors = {"@vectors":"spacy.Vectors.v1"}

[components]

[components.spancat]
factory = "spancat"
max_positive = null
scorer = {"@scorers":"spacy.spancat_scorer.v1"}
spans_key = "sc"
threshold = 0.5

[components.spancat.model]
@architectures = "spacy.SpanCategorizer.v1"

[components.spancat.model.reducer]
@layers = "spacy.mean_max_reducer.v1"
hidden_size = 128

[components.spancat.model.scorer]
@layers = "spacy.LinearLogistic.v1"
nO = null
nI = null

[components.spancat.model.tok2vec]
@architectures = "spacy-curated-transformers.TransformerListener.v1"
grad_factor = 1.0
pooling = {"@layers":"reduce_mean.v1"}
upstream = "*"

[components.spancat.suggester]
@misc = "spacy.ngram_suggester.v1"
sizes = [1,2,3]

[components.transformer]
factory = "curated_transformer"
max_batch_items = 4096
set_extra_annotations = {"@annotation_setters":"spacy-curated-transformers.null_annotation_setter.v1"}

[components.transformer.model]
@architectures = "spacy-curated-transformers.RobertaTransformer.v1"
name = "roberta-base"
mixed_precision = false

[components.transformer.model.get_spans]
@span_getters = "spacy-curated-transformers.strided_spans.v1"
window = 128
stride = 96

[components.transformer.model.grad_scaler_config]

[components.transformer.model.tokenizer_config]
use_fast = true

[components.transformer.model.transformer_config]

[corpora]
...other..
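The E892 error above is plausibly because the span_getters registry is created by spacy-transformers, which is no longer installed; spacy-curated-transformers handles span batching through an architecture instead of a span getter. A hypothetical adaptation, not confirmed in the thread (registered names taken from spacy-curated-transformers and may differ by version), would drop the [components.transformer.model.get_spans] block and use something like:

```ini
[components.transformer.model.with_spans]
@architectures = "spacy-curated-transformers.WithStridedSpans.v1"
stride = 96
window = 128
```

The listener and annotation-setter entries may need similar renaming, since the spacy-transformers and spacy-curated-transformers registries are not interchangeable.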
