NLP Starry-Sky Intelligent Dialogue Robot Series: A Deep Dive into Transformers for Natural Language Processing, SRL Sample 5
This post is the fifth SRL sample in this series. Unlike the earlier samples, it does not repeat one verb several times. Instead, Sample 5 contains a word that can take on several functions and meanings. It goes beyond polysemy, because the word "round" can have not only different meanings but also different grammatical functions: "round" can be a noun, an adjective, an adverb, a transitive verb, or an intransitive verb.
As a transitive or intransitive verb, "round" can mean to attain perfection or completion. In this sense, "round" can be used with "off".
The following sentence uses "round" in the past tense:
"The bright sun, the blue sky, the warm sand, the palm trees, everything round off."
"Round" is used here in the sense of "to bring to perfection", so the best grammatical form would have been "rounded". Will the transformer still find the right verb in this rather poetic-sounding sentence? Let's run Sample 5 in SRL.ipynb:
!echo '{"sentence": "The bright sun, the blue sky, the warm sand, the palm trees, everything round off."}' | \
allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -
The output is as follows:
2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.ignore_span_metric = False
2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl
2020-12-20 09:09:05,837 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517
2020-12-20 09:09:05,837 - INFO - transformers.configuration_utils - Model config BertConfig {
"architectures": [
"BertForMaskedLM"
],
"attention_probs_dropout_prob": 0.1,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"max_position_embeddings": 512,
"model_type": "bert",
"num_attention_heads": 12,
"num_hidden_layers": 12,
"pad_token_id": 0,
"type_vocab_size": 2,
"vocab_size": 30522
}
2020-12-20 09:09:06,048 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
2020-12-20 09:09:08,747 - INFO - allennlp.nn.initializers - Initializing parameters
2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias
2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight
2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight
2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.LayerNorm.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.LayerNorm.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.dense.bias
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.dense.weight
2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.key.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.key.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.query.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.query.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.value.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.value.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.intermediate.dense.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.intermediate.dense.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.LayerNorm.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.LayerNorm.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.dense.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.dense.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.LayerNorm.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.LayerNorm.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.dense.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.dense.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.key.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.key.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.query.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.query.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.value.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.value.weight
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.intermediate.dense.bias
2020-12-20 09:09:08,750 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.intermediate.dense.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.LayerNorm.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.LayerNorm.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.dense.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.dense.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.LayerNorm.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.LayerNorm.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.dense.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.dense.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.key.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.key.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.query.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.query.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.value.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.value.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.intermediate.dense.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.intermediate.dense.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.LayerNorm.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.LayerNorm.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.dense.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.dense.weight
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.LayerNorm.bias
2020-12-20 09:09:08,751 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.LayerNorm.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.dense.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.dense.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.key.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.key.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.query.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.query.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.bias
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.weight
2020-12-20 09:09:08,752 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.weight
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.bias
2020-12-20 09:09:08,753 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.weight
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.bias
2020-12-20 09:09:08,754 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.bias
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.weight
2020-12-20 09:09:08,755 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.bias
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.weight
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.bias
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.weight
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.bias
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.weight
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.bias
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.weight
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.bias
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.weight
2020-12-20 09:09:08,756 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.bias
2020-12-20 09:09:08,786 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.weight
2020-12-20 09:09:08,787 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.bias
2020-12-20 09:09:08,787 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.weight
2020-12-20 09:09:08,787 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.bias
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.weight
2020-12-20 09:09:08,788 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.bias
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.weight
2020-12-20 09:09:08,789 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.bias
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.weight
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.bias
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.weight
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - tag_projection_layer.bias
2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - tag_projection_layer.weight
2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.type = srl
2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.lazy = False
2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.cache_directory = None
2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.max_instances = None
2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False
2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False
2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.token_indexers = None
2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None
2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased
2020-12-20 09:09:09,561 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
input 0: {"sentence": "The bright sun, the blue sky, the warm sand, the palm trees, everything round off."}
prediction: {"verbs": [], "words": ["The", "bright", "sun", ",", "the", "blue", "sky", ",", "the", "warm", "sand", ",", "the", "palm", "trees", ",", "everything", "round", "off", "."]}
2020-12-20 09:09:10,283 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpn3jl1yco
The output displays no verbs; the transformer did not identify the predicate. In fact, it found no verbs at all:
prediction: {"verbs": [], "words": ["The", "bright", "sun", ",", "the", "blue", "sky", ",", "the", "warm", "sand", ",", "the", "palm", "trees", ",", "everything", "round", "off", "."]}
Since we like our BERT-based transformer, we will be nice to it. Let's change the sentence from the past tense to the present tense:
"The bright sun, the blue sky, the warm sand, the palm trees, everything rounds off."
Let's try SRL.ipynb again with the present tense:
!echo '{"sentence": "The bright sun, the blue sky, the warm sand, the palm trees, everything rounds off."}' | \ allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -
The output is as follows:
2020-12-20 09:09:12,636 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.
2020-12-20 09:09:12.789933: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1
2020-12-20 09:09:14,547 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.
2020-12-20 09:09:15,750 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724
2020-12-20 09:09:15,750 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724
2020-12-20 09:09:15,751 - INFO - filelock - Lock 139884787906432 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock
2020-12-20 09:09:15,751 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date
2020-12-20 09:09:15,751 - INFO - filelock - Lock 139884787906432 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock
2020-12-20 09:09:15,751 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724
2020-12-20 09:09:15,751 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpuj2lb1i1
2020-12-20 09:09:19,983 - INFO - allennlp.common.params - type = from_instances
2020-12-20 09:09:19,983 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpuj2lb1i1/vocabulary.
2020-12-20 09:09:19,983 - INFO - filelock - Lock 139884137381784 acquired on /tmp/tmpuj2lb1i1/vocabulary/.lock
2020-12-20 09:09:20,009 - INFO - filelock - Lock 139884137381784 released on /tmp/tmpuj2lb1i1/vocabulary/.lock
2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.type = srl_bert
2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.regularizer = None
2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased
2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.embedding_dropout = 0.1
2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.initializer = <allennlp.nn.initializers.InitializerApplicator object at 0x7f39504e07b8>
2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.label_smoothing = None
2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.ignore_span_metric = False
2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl
2020-12-20 09:09:20,306 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517
2020-12-20 09:09:20,307 - INFO - transformers.configuration_utils - Model config BertConfig {
"architectures": [
"BertForMaskedLM"
],
"attention_probs_dropout_prob": 0.1,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"layer_norm_eps": 1e-12,
"max_position_embeddings": 512,
"model_type": "bert",
"num_attention_heads": 12,
"num_hidden_layers": 12,
"pad_token_id": 0,
"type_vocab_size": 2,
"vocab_size": 30522
}
2020-12-20 09:09:20,499 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
2020-12-20 09:09:23,169 - INFO - allennlp.nn.initializers - Initializing parameters
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.bias
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.bias
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.weight
2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.LayerNorm.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.LayerNorm.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.dense.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.output.dense.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.key.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.key.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.query.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.query.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.value.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.attention.self.value.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.intermediate.dense.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.intermediate.dense.weight
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.LayerNorm.bias
2020-12-20 09:09:23,171 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.LayerNorm.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.dense.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.1.output.dense.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.LayerNorm.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.LayerNorm.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.dense.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.output.dense.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.key.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.key.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.query.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.query.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.value.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.attention.self.value.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.intermediate.dense.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.intermediate.dense.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.LayerNorm.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.LayerNorm.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.dense.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.10.output.dense.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.LayerNorm.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.LayerNorm.weight
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.dense.bias
2020-12-20 09:09:23,172 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.output.dense.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.key.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.key.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.query.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.query.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.value.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.value.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.intermediate.dense.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.intermediate.dense.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.LayerNorm.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.LayerNorm.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.dense.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.dense.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.LayerNorm.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.LayerNorm.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.dense.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.dense.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.key.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.key.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.query.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.query.weight
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.bias
2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.weight
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.bias
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.weight
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.bias
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.weight
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.bias
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.weight
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.bias
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.weight
2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.weight
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.bias
2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.weight
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.bias
2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.weight
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.bias
2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.weight
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.bias
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.weight
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.bias
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.weight
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.bias
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.weight
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.bias
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.weight
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.bias
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.weight
2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.bias
2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.weight
2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.bias
2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.weight
2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.bias
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.weight
2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.bias
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.weight
2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.bias
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.weight
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.bias
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.weight
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - tag_projection_layer.bias
2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - tag_projection_layer.weight
2020-12-20 09:09:23,707 - INFO - allennlp.common.params - dataset_reader.type = srl
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.lazy = False
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.cache_directory = None
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.max_instances = None
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.token_indexers = None
2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None
2020-12-20 09:09:23,709 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased
2020-12-20 09:09:23,994 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
input 0: {"sentence": "The bright sun, the blue sky, the warm sand, the palm trees, everything rounds off."}
prediction: {"verbs": [{"verb": "rounds", "description": "[ARG1: The bright sun , the blue sky , the warm sand , the palm trees] , [R-ARG1: everything] [V: rounds] off .", "tags": ["B-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "O", "B-R-ARG1", "B-V", "O", "O"]}], "words": ["The", "bright", "sun", ",", "the", "blue", "sky", ",", "the", "warm", "sand", ",", "the", "palm", "trees", ",", "everything", "rounds", "off", "."]}
2020-12-20 09:09:24,932 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpuj2lb1i1
The raw output shows that the predicate was found this time, as the following excerpt shows:
prediction: {"verbs": [{"verb": "rounds", "description": "[ARG1: The bright sun , the blue sky , the warm sand , the palm trees] , [R-ARG1: everything] [V: rounds] off .", "tags": ["B-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "I-ARG1", "O", "B-R-ARG1", "B-V", "O", "O"]}], "words": ["The", "bright", "sun", ",", "the", "blue", "sky", ",", "the", "warm", "sand", ",", "the", "palm", "trees", ",", "everything", "rounds", "off", "."]}
If we run this sentence on the AllenNLP online demo, we get an intuitive, visual representation of the result.
Our BERT-based transformer did well this time, because the word "round" also exists in the inflected form "rounds". The BERT model initially failed to produce the result we expected, but with a little help from its friends, this sample ended well.
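To make this workaround reusable in the notebook, one could wrap the predictor in a small fallback helper. This is only a sketch under the same assumptions as the earlier snippet (allennlp and allennlp-models installed), and the paraphrase still has to be written by hand:
# Illustrative fallback (not part of the original workflow): if the SRL
# predictor returns no frames for a sentence, retry with a hand-written
# paraphrase, mirroring the past-tense to present-tense fix above.
from allennlp.predictors.predictor import Predictor
import allennlp_models.structured_prediction

predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz"
)

def predict_with_fallback(predictor, sentence, paraphrase=None):
    result = predictor.predict(sentence=sentence)
    if not result["verbs"] and paraphrase is not None:
        result = predictor.predict(sentence=paraphrase)
    return result

result = predict_with_fallback(
    predictor,
    "The bright sun, the blue sky, the warm sand, the palm trees, everything round off.",
    paraphrase="The bright sun, the blue sky, the warm sand, "
               "the palm trees, everything rounds off.",
)
print(len(result["verbs"]), "frame(s) found")  # 1 frame, for the verb "rounds"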