reference
BERTRL, proposed by Zha et al. in 2022, leverages pre-trained language models and fine-tunes them using relation instances and reasoning paths as training samples.
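As a rough illustration of this training-sample construction (a minimal sketch; the exact linearization, separators, and naming below are assumptions for illustration, not taken from the BERTRL paper), a candidate triple and a supporting reasoning path can be serialized into a single text sequence that a BERT-style classifier then scores:

```python
def linearize(triple, path):
    """Serialize a (head, relation, tail) triple plus a reasoning path
    into one text sequence for a BERT-style scorer.
    Format details (the [SEP] separator, '; ' between hops) are assumed."""
    head, relation, tail = triple
    target = f"{head} {relation} {tail}"
    # Each hop of the reasoning path is itself a triple; join the hops.
    path_text = "; ".join(f"{h} {r} {t}" for h, r, t in path)
    return f"{target} [SEP] {path_text}"

# Hypothetical example: score whether "Paris capital_of France" holds,
# supported by a two-hop path through the knowledge graph.
sample = linearize(
    ("Paris", "capital_of", "France"),
    [("Paris", "located_in", "Ile-de-France"),
     ("Ile-de-France", "region_of", "France")],
)
```

Fine-tuning then treats such sequences as positive or negative examples depending on whether the target triple is true, so the language model learns to judge a relation instance in the context of its supporting paths.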
Authors
Sources
- Practices, opportunities and challenges in the fusion of knowledge ... www.frontiersin.org via serper
Referenced by nodes (2)
- knowledge graphs concept
- pre-trained language models concept