File name: attention
File size: 115KB
File format: ZIP
Updated: 2024-03-29 21:20:31
Jupyter Notebook
Code for the EMNLP 2019 paper "Attention is not not Explanation" by Wiegreffe & Pinter. When using this codebase, please cite:

@inproceedings{wiegreffe-pinter-2019-attention,
    title = "Attention is not not Explanation",
    author = "Wiegreffe, Sarah and Pinter, Yuval",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
    year = "2019"
}
[File preview]:
attention-master
----train_and_run_experiments_bc.py(3KB)
----Seed_graphs.ipynb(116KB)
----seed_graphs.py(4KB)
----common_code()
--------metrics.py(2KB)
--------plotting.py(4KB)
--------common.py(699B)
----model()
--------modules()
--------modelUtils.py(3KB)
--------__init__.py(0B)
--------Binary_Classification.py(9KB)
----run_frozen_attn.sh(128B)
----configurations.py(1KB)
----run_baselines.sh(126B)
----seeds.txt(46B)
----__init__.py(0B)
----ExperimentsBC.py(1KB)
----LICENSE(34KB)
----README.md(3KB)
----run_seeds.sh(143B)
----Trainers()
--------TrainerBC.py(7KB)
--------DatasetBC.py(5KB)
----run_adversarial.sh(175B)