Acquisition of a Lexicon for Family History Information: Bidirectional Encoder Representations From Transformers–Assisted Sublanguage Analysis

Liu et al [13] constructed a corpus-driven semantic lexicon based on the UMLS, augmented with lexical variants and usage information mined from clinical text. Regarding deep learning–based approaches, pretrained language models such as bidirectional encoder representations from transformers (BERT) [14] can learn the structure of language (ie, basic semantic and syntactic information) through unsupervised training on a large corpus of unlabeled text [15].

Liwei Wang, Huan He, Andrew Wen, Sungrim Moon, Sunyang Fu, Kevin J Peterson, Xuguang Ai, Sijia Liu, Ramakanth Kavuluru, Hongfang Liu

JMIR Med Inform 2023;11:e48072