1 parent 89666d3 commit 9e06b01

README.md
@@ -105,7 +105,7 @@ We iterated the training process for 20 epochs with batch size 64 and early stop
Please see [https://github.com/ncbi-nlp/NCBI_BERT](https://github.com/ncbi-nlp/NCBI_BERT).
-## Citing NCBI BERT
+## Citing BLUE
* Peng Y, Yan S, Lu Z. Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets. In *Proceedings of the Workshop on Biomedical Natural Language Processing (BioNLP)*. 2019.