Large Language Models in Biomedical Research

Biomedical research is increasingly adopting large language models (LLMs) for applications such as disease prediction, causal inference, and question answering. Recent studies probe how well LLMs can connect SNP variants with ECG phenotypes for cardiovascular risk prediction, and LLM-based agents have been used to automate confounder discovery and subgroup analysis in causal inference, making treatment effect estimation more robust (a minimal illustrative sketch of that workflow follows below). New benchmark datasets such as HealthBranches, which synthesizes clinically grounded question-answering data from decision pathways, enable evaluation of LLMs' multi-step inference capabilities.

Noteworthy papers include LLM-BI, a conceptual pipeline for automating Bayesian inference workflows, and Semantic Bridge, a universal framework for controllably generating sophisticated multi-hop reasoning questions via AMR-driven graph synthesis. The Knowledge-Reasoning Dissociation study highlights fundamental limitations of LLMs in clinical natural language inference, revealing a dissociation between the models' stored knowledge and their reasoning capabilities.
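To make the confounder-discovery idea concrete, here is a minimal sketch of how an LLM-proposed confounder set could feed a simple backdoor adjustment. It is not the pipeline from the cited paper: `propose_confounders` is a hypothetical stand-in for the LLM agent, the data are synthetic, and the adjustment is plain OLS.

```python
"""Illustrative sketch: LLM-proposed confounders + backdoor adjustment.
`propose_confounders` is a hypothetical placeholder, not the paper's API."""
import numpy as np


def propose_confounders(variable_descriptions: dict[str, str]) -> list[str]:
    # Hypothetical: an LLM agent would read the variable descriptions and
    # return plausible confounders of the treatment-outcome relationship.
    # Here we simply return a fixed guess for demonstration purposes.
    return ["age", "smoking"]


def adjusted_ate(treatment, outcome, covariates):
    """Estimate the average treatment effect by OLS with the proposed
    confounders included as covariates (a simple backdoor adjustment)."""
    n = len(treatment)
    X = np.column_stack([np.ones(n), treatment, covariates])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[1]  # coefficient on the treatment indicator


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5_000
    # Synthetic data: age and smoking confound both treatment and outcome.
    age = rng.normal(50, 10, n)
    smoking = rng.binomial(1, 0.3, n)
    treat_prob = 1 / (1 + np.exp(-(0.03 * (age - 50) + smoking)))
    treatment = rng.binomial(1, treat_prob)
    outcome = 2.0 * treatment + 0.1 * age + 1.5 * smoking + rng.normal(0, 1, n)

    descriptions = {"age": "patient age in years",
                    "smoking": "current smoker indicator"}
    confounders = propose_confounders(descriptions)
    # A full agent would map proposed names to data columns automatically;
    # here the mapping is hard-coded to match the returned list.
    covariates = np.column_stack([age, smoking])

    naive = outcome[treatment == 1].mean() - outcome[treatment == 0].mean()
    print(f"naive difference in means: {naive:.2f}")
    print(f"confounder-adjusted ATE:   {adjusted_ate(treatment, outcome, covariates):.2f}")
```

On this synthetic data the naive difference in means is biased upward by the confounders, while the adjusted estimate recovers a value close to the true effect of 2.0, which is the kind of robustness gain the agent-based workflow targets.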

Sources

How Effectively Can Large Language Models Connect SNP Variants and ECG Phenotypes for Cardiovascular Risk Prediction?

LLM-based Agents for Automated Confounder Discovery and Subgroup Analysis in Causal Inference

HealthBranches: Synthesizing Clinically-Grounded Question Answering Datasets via Decision Pathways

LLM-BI: Towards Fully Automated Bayesian Inference with Large Language Models

Position: Causal Machine Learning Requires Rigorous Synthetic Experiments for Broader Adoption

Transforming Questions and Documents for Semantically Aligned Retrieval-Augmented Generation

Semantic Bridge: Universal Multi-Hop Question Generation via AMR-Driven Graph Synthesis

Technical Report: Facilitating the Adoption of Causal Inference Methods Through LLM-Empowered Co-Pilot

The Knowledge-Reasoning Dissociation: Fundamental Limitations of LLMs in Clinical Natural Language Inference
