- [1] Gers F A, Schmidhuber J, Cummins F. Learning to forget: continual prediction with LSTM[J]. Neural Computation, 2000(10).
- [2] Liu Y, Ott M, Goyal N, et al. RoBERTa: a robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692, 2019.
- [3] Devlin J, Chang M W, Lee K, et al. BERT: pre-training of deep bidirectional transformers for language understanding[C]. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019.
- [5] Zhou P, Shi W, Tian J, et al. Attention-based bidirectional long short-term memory networks for relation classification[C]. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016.
- [6] Peters M E, Neumann M, Iyyer M, et al. Deep contextualized word representations[C]. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2018.
- [7] 徐琳宏, 林鸿飞, 潘宇, 任惠, 陈建美. Constructing the affective lexicon ontology (情感词汇本体的构造)[J]. 情报学报, 2008(02).
- [8] Zhang Y, Yang J. Chinese NER using lattice LSTM[C]. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2018.
- [9] Wei C H, Peng Y, Leaman R, et al. Overview of the BioCreative V chemical disease relation (CDR) task[C]. Proceedings of the Fifth BioCreative Challenge Evaluation Workshop, 2015.