Title | Research-based-named Entity Recognition Learning Text Biomedical Extraction by Adoption of Training Bidirectional Language Model (BiLM) |
---|---|
Volume/Issue | 31:4 |
Authors | Alshreef Abed, Yuan Jingling, Lin Li |
Pages | 157-173 |
Keywords | big data processing, biomedical text data, computer, CNN, data mining, engineering sciences, EI, MEDLINE, Scopus |
Publication Date | 202008 |
DOI | 10.3966/199115992020083104012 |
Named entity recognition with deep learning is a fundamental task in biomedical text extraction; it is a promising research area in which end-to-end configurations can be adopted without any need for hand-engineered features. However, most current methods rely on high-quality labeled data, which is expensive to obtain. To address this limitation, combining text mining for discovery with biomedical knowledge extraction is an appropriate direction for computer science applications. In this article, we combine biomedical named entity recognition (NER) with entity learning to increase the amount of labeled data extracted. Adopting a bidirectional language model (BiLM) in the NER setting, we proceed in two stages. First, we evaluate the BiLM-NER F1 score when training on unlabeled data and transferring the learned representations; the evaluation is set up on four benchmark datasets to report gains and F1 scores. The BiLM-NER results show high F1 performance; however, a challenge concerning time cost remains. To address this issue, in the second stage we conduct a comparative study between BiLM-NER and canonical correlation analysis (CCA): compared with a baseline F1 score of 70.09%, BiLM-NER achieves 72.82%, a gap of 0.3% over the CCA approach. These results confirm the strength of our proposed NER-BiLM approach. This work can be considered a new contribution to data mining and biomedical research.
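The core idea described in the abstract — feeding the hidden states of a pretrained bidirectional language model into a NER tagger alongside word embeddings — can be illustrated with a minimal sketch. This is not the paper's implementation: all dimensions, parameters, and the simple tanh recurrence below are hypothetical stand-ins chosen only to show how forward and backward LM states are concatenated with token embeddings before tag scoring.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper's actual sizes are not stated here).
vocab_size, emb_dim, lm_dim, n_tags = 50, 8, 6, 3  # tags e.g. O, B-GENE, I-GENE

# Stand-ins for pretrained parameters (randomly initialised for illustration).
word_emb = rng.normal(size=(vocab_size, emb_dim))
W_fwd = rng.normal(size=(emb_dim + lm_dim, lm_dim)) * 0.1      # forward LM recurrence
W_bwd = rng.normal(size=(emb_dim + lm_dim, lm_dim)) * 0.1      # backward LM recurrence
W_tag = rng.normal(size=(emb_dim + 2 * lm_dim, n_tags)) * 0.1  # per-token tag scorer

def bilm_states(token_ids):
    """Run a simple recurrent LM over the sentence in both directions."""
    x = word_emb[token_ids]                       # (T, emb_dim)
    T = len(token_ids)
    h_f = np.zeros((T, lm_dim))
    h_b = np.zeros((T, lm_dim))
    h = np.zeros(lm_dim)
    for t in range(T):                            # left-to-right pass
        h = np.tanh(np.concatenate([x[t], h]) @ W_fwd)
        h_f[t] = h
    h = np.zeros(lm_dim)
    for t in reversed(range(T)):                  # right-to-left pass
        h = np.tanh(np.concatenate([x[t], h]) @ W_bwd)
        h_b[t] = h
    return x, h_f, h_b

def tag_scores(token_ids):
    """Concatenate word embedding + both BiLM states, then score each tag."""
    x, h_f, h_b = bilm_states(token_ids)
    feats = np.concatenate([x, h_f, h_b], axis=1)  # (T, emb_dim + 2*lm_dim)
    return feats @ W_tag                           # (T, n_tags)

scores = tag_scores([3, 17, 42, 5])
print(scores.shape)  # one row of tag scores per token: (4, 3)
```

Because the BiLM parameters can be pretrained on unlabeled text, only the small tag scorer needs labeled NER data — which is the transfer setting the abstract evaluates.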