Description
Instructional prompts are a novel technique that can significantly improve performance on natural language processing tasks by specifying the task instruction to the language model. This is the first work to use instructional prompts to improve performance on the question answering task in the biomedical domain. The work makes two significant contributions. First, a question answering dataset of 600K question-answer pairs was developed from the medical textbook ‘Differential Diagnosis Primary Care’, which describes how to diagnose a patient from observed disease symptoms. Second, a question answering language model augmented with instructional prompts was developed by training on the medical information extracted from ‘Differential Diagnosis Primary Care’. Experiments demonstrate that it outperforms a comparable question answering model that does not use instructional prompts. Instructional prompts build on prompt tuning and prefix tuning, techniques that adapt a language model to specific downstream tasks by keeping the majority of model parameters frozen and optimizing only a small number of continuous task-specific vectors (called the prefixes).
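
The frozen-backbone idea behind prompt and prefix tuning can be summarized in a short sketch. The snippet below is a minimal illustration of the general technique, not the thesis's actual implementation: the backbone model, embedding dimension, and prompt length are placeholder assumptions, and the backbone is assumed to accept input embeddings directly. Only the prepended prompt vectors receive gradients.

```python
import torch
import torch.nn as nn


class PromptTunedQAModel(nn.Module):
    """Minimal prompt-tuning sketch: trainable continuous prompt vectors are
    prepended to the input embeddings while the backbone stays frozen."""

    def __init__(self, backbone: nn.Module, embed_dim: int, prompt_length: int = 20):
        super().__init__()
        self.backbone = backbone
        # Freeze every backbone parameter; only the prompt is optimized.
        for p in self.backbone.parameters():
            p.requires_grad = False
        # Continuous task-specific vectors (the "soft prompt" / prefix).
        self.prompt = nn.Parameter(torch.randn(prompt_length, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim)
        batch = input_embeds.size(0)
        prefix = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        # Prepend the trainable prompt vectors to the token embeddings.
        return self.backbone(torch.cat([prefix, input_embeds], dim=1))


if __name__ == "__main__":
    # Toy backbone standing in for a pretrained language model.
    backbone = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 64))
    model = PromptTunedQAModel(backbone, embed_dim=64, prompt_length=8)
    x = torch.randn(2, 16, 64)  # (batch, seq_len, embed_dim)
    out = model(x)              # (batch, 8 + 16, 64)
    trainable = [n for n, p in model.named_parameters() if p.requires_grad]
    print(out.shape, trainable)  # only the prompt parameter is trainable
```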

Details

Title
• Medical Question Answering using Instructional Prompts
Date Created
• 2021
Resource Type
• Text
Note
• Partial requirement for: M.S., Arizona State University, 2021
• Field of study: Computer Science
