Graph-Bert: source code for Graph-Bert

Posted: 2021-05-23 18:25:45
[File attributes]
Filename: Graph-Bert: source code for Graph-Bert
File size: 2.16 MB
File format: ZIP
Updated: 2021-05-23 18:25:45
Python Graph-Bert. Depending on your transformers toolkit version, the import statements may need to be adjusted, e.g.:

- from transformers.modeling_bert import BertPreTrainedModel, BertPooler
+ from transformers.models.bert.modeling_bert import BertPreTrainedModel, BertPooler

(Please check your transformers toolkit version and update the imports accordingly.) Graph-Bert: Only Attention is Needed for Learning Graph Representations. From the IFM Lab.
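One way to make the code work across both transformers layouts, rather than editing the import by hand, is a fallback import. This is a minimal sketch, assuming only that the two module paths shown above correspond to newer (>= 4.0) and older (< 4.0) releases of the transformers library:

```python
# Sketch: try the newer transformers (>= 4.0) module path first,
# then fall back to the pre-4.0 flat layout mentioned above.
try:
    from transformers.models.bert.modeling_bert import BertPreTrainedModel, BertPooler
except ImportError:
    try:
        from transformers.modeling_bert import BertPreTrainedModel, BertPooler
    except ImportError:
        # transformers is not installed at all; leave placeholders so the
        # failure surfaces where the classes are actually used.
        BertPreTrainedModel = BertPooler = None
```

With this pattern, the same source file runs unmodified against either toolkit version, and a missing transformers installation is deferred to first use instead of failing at import time.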
[File preview]
Graph-Bert-master
----script_2_pre_train.py(5KB)
----code()
--------EvaluateClustering.py(1KB)
--------MethodGraphBertNodeConstruct.py(2KB)
--------MethodGraphBertGraphRecovery.py(2KB)
--------ResultSaving.py(700B)
--------MethodGraphBert.py(6KB)
--------MethodHopDistance.py(1KB)
--------EvaluateAcc.py(393B)
--------__init__.py(369B)
--------base_class()
--------MethodGraphBatching.py(943B)
--------MethodGraphBertGraphClustering.py(2KB)
--------__pycache__()
--------DatasetLoader.py(7KB)
--------MethodBertComp.py(7KB)
--------MethodWLNodeColoring.py(2KB)
--------Settings.py(774B)
--------MethodGraphBertNodeClassification.py(6KB)
----script_3_fine_tuning.py(6KB)
----script_1_preprocess.py(4KB)
----README.md(6KB)
----script_4_evaluation_plots.py(3KB)
----data()
--------cora()
----result()
--------GraphBert()
--------Hop()
--------Batch()
--------PreTrained_GraphBert()
--------WL()
--------screenshot()