DATLMedQA: A Data Augmentation and Transfer Learning Based Solution for Medical Question Answering
By: Shuohua Zhou, Yanping Zhang
Format: Article
Published: MDPI AG, 2021-11-01
Description
With the outbreak of COVID-19 prompting an increased focus on self-care, more and more people seek disease knowledge on the Internet. In response to this demand, medical question answering and question generation have become important tasks in natural language processing (NLP). However, samples of medical questions and answers are limited, and existing question generation systems cannot fully meet non-professionals' needs for medical questions. In this research, we propose a BERT-based medical pretraining model that uses GPT-2 for question augmentation and T5-Small for topic extraction, computes the cosine similarity of the extracted topics, and uses XGBoost for prediction. With GPT-2 augmentation, the prediction accuracy of our model exceeds state-of-the-art (SOTA) performance. Our experimental results demonstrate the strong performance of our model on medical question answering and question generation tasks, and its potential to address other biomedical question answering challenges.
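The abstract mentions computing the cosine similarity of extracted topics. The paper's exact implementation is not given here, but the core measure can be sketched as follows; the vectors and values below are hypothetical topic embeddings, not data from the paper.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors; 0.0 if either is zero."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0
    return float(np.dot(a, b) / denom)

# Hypothetical embeddings of two extracted topics (e.g., from T5-Small output)
topic_question = np.array([0.2, 0.7, 0.1])
topic_answer = np.array([0.25, 0.65, 0.05])
score = cosine_similarity(topic_question, topic_answer)
```

In a pipeline like the one described, such a score could serve as one feature fed to the downstream XGBoost predictor.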