Large Language Models in Chip Design and Verification

The field of chip design and verification is undergoing a significant shift as Large Language Models (LLMs) are integrated into the workflow. Researchers are exploring LLMs for generating high-quality Verilog code, optimizing Power-Performance-Area (PPA) constraints, and automating root cause analysis of formal verification failures. Notable advances include frameworks that improve the functional and syntactic correctness of LLM-generated Verilog, as well as techniques that enhance structural optimization in Register-Transfer Level (RTL) logic synthesis. LLMs are also being applied to mitigate intellectual property (IP) leakage in RTL code generation and to produce high-quality schematic diagrams for analog circuits. Overall, the field is moving toward greater automation and efficiency in chip design and verification, with LLMs driving much of this progress.

Noteworthy papers include LLM-VeriPPA, which achieves state-of-the-art results in Verilog code generation and PPA optimization; FVDebug, which automates root cause analysis of formal verification failures with high hypothesis quality and strong Pass@k fix rates; VeriGRAG, which substantially improves the correctness of generated Verilog using structure-aware soft prompts; EEschematic, which generates high-quality schematic diagrams for analog circuits with a multimodal LLM; SmaRTLy, which optimizes RTL logic synthesis through logic inferencing and structural rebuilding; QiMeng-SALV, which proposes a signal-aware learning approach for Verilog code generation; and CircuitGuard, which mitigates LLM memorization in RTL code generation to prevent IP leakage.
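
To make the common generate-then-verify workflow concrete, the sketch below shows a minimal Verilog generation loop that feeds compiler diagnostics back to a model until the code parses. This is an illustrative outline only, not the method of any paper cited here: the query_llm stub, the use of Icarus Verilog (iverilog) as a syntax checker, and the retry logic are all assumptions made for this example.

```python
import subprocess
import tempfile

def query_llm(prompt: str) -> str:
    # Hypothetical LLM client; plug in any chat-completion API here.
    raise NotImplementedError("wire up your preferred LLM API")

def syntax_check(verilog_source: str) -> tuple[bool, str]:
    """Compile the module with Icarus Verilog and return (ok, compiler output)."""
    with tempfile.NamedTemporaryFile(mode="w", suffix=".v", delete=False) as f:
        f.write(verilog_source)
        path = f.name
    result = subprocess.run(
        ["iverilog", "-t", "null", path],  # -t null: parse and elaborate only
        capture_output=True, text=True,
    )
    return result.returncode == 0, result.stderr

def generate_verilog(spec: str, max_attempts: int = 3) -> str:
    """Generate Verilog from a natural-language spec, feeding compiler errors
    back to the model until the code compiles or attempts run out."""
    prompt = f"Write a synthesizable Verilog module for the following spec:\n{spec}"
    for _ in range(max_attempts):
        code = query_llm(prompt)
        ok, errors = syntax_check(code)
        if ok:
            return code
        # Append the compiler diagnostics so the next attempt can repair them.
        prompt = (f"The previous attempt failed to compile:\n{errors}\n"
                  f"Fix the module. Original spec:\n{spec}")
    raise RuntimeError("could not produce syntactically valid Verilog")
```

The surveyed systems go well beyond this loop, adding functional verification, PPA feedback, and structure-aware prompting, but the repair-on-diagnostics pattern is the shared starting point.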

Sources

LLM-VeriPPA: Power, Performance, and Area Optimization aware Verilog Code Generation with Large Language Models

FVDebug: An LLM-Driven Debugging Assistant for Automated Root Cause Analysis of Formal Verification Failures

VeriGRAG: Enhancing LLM-Based Verilog Code Generation with Structure-Aware Soft Prompts

EEschematic: Multimodal-LLM Based AI Agent for Schematic Generation of Analog Circuit

SmaRTLy: RTL Optimization with Logic Inferencing and Structural Rebuilding

QiMeng-SALV: Signal-Aware Learning for Verilog Code Generation

CircuitGuard: Mitigating LLM Memorization in RTL Code Generation Against IP Leakage
