reference
The paper 'Pretrain-KGE: Learning Knowledge Representation from Pretrained Language Models' was published in Findings of the Association for Computational Linguistics: EMNLP 2020.
Authors
Sources
- Practices, opportunities and challenges in the fusion of knowledge ... www.frontiersin.org via serper
Referenced by nodes (3)
- Association for Computational Linguistics entity
- pre-trained language models concept
- knowledge representation concept