Large Language Models in Hardware Design and Optimization

The field of hardware design and optimization is undergoing a significant shift with the integration of large language models (LLMs). Recent work demonstrates their potential to automate and streamline many aspects of hardware design, including algorithm-to-hardware translation, constrained multi-objective optimization, and analog circuit design, enabling more efficient, reliable, and adaptable hardware systems. LLMs are being used to generate high-quality hardware description language (HDL) code, optimize design parameters, and assist in the design of complex systems such as photonic integrated circuits. Their application also extends to federated learning, where multiple parties collaboratively enhance a shared LLM for automated hardware design generation while protecting proprietary data.

Noteworthy papers include A2HCoder, which proposes a hierarchical framework for algorithm-to-hardware translation, and FedChip, which introduces a federated fine-tuning approach for collaborative LLM enhancement. EvoVerilog contributes a novel framework that combines LLMs with evolutionary algorithms for automatic Verilog code generation and refinement, while Image2Net and White-Box Reasoning demonstrate the potential of LLMs in analog circuit design and optimization. HiFo-Prompt and MAHL show promise in LLM-based automatic heuristic design and hierarchical chiplet design generation, respectively. Other notable works in this area include Discrete Optimization of Min-Max Violation and its Applications, ChronoLLM, AI Agents for Photonic Integrated Circuit Design Automation, and Incremental-Decremental Maximization.
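Federated fine-tuning schemes of the kind FedChip describes typically build on federated averaging (FedAvg): each party trains on its own proprietary data and only parameter updates are shared and merged. A minimal sketch of the aggregation step, with plain float lists standing in for model parameters (the function name and data layout are illustrative assumptions, not FedChip's actual API):

```python
# Minimal federated-averaging (FedAvg) sketch: aggregate locally fine-tuned
# parameters without sharing any party's training data. Weights are plain
# lists of floats for illustration; a real system would operate on model
# state dicts or adapter tensors.

def fed_avg(client_weights, client_sizes):
    """Average client parameter vectors, weighted by local dataset size."""
    total = sum(client_sizes)
    merged = [0.0] * len(client_weights[0])
    for weights, size in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            merged[i] += w * (size / total)
    return merged

# Two parties with different data volumes; the larger one dominates the merge.
merged = fed_avg([[1.0, 0.0], [0.0, 1.0]], client_sizes=[30, 10])
# merged == [0.75, 0.25]
```

The weighting by dataset size is the standard FedAvg choice; other aggregation rules (e.g. uniform averaging or robust medians) slot into the same loop.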
Computational Intelligence based Land-use Allocation Approaches for Mixed Use Areas is also a significant contribution. Collectively, these papers highlight the growing impact of LLMs on hardware design and optimization.
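The LLM-plus-evolutionary-algorithm loop that EvoVerilog-style frameworks use can be sketched in a few lines: an LLM proposes rewrites of candidate designs, a fitness function scores them, and elitist selection keeps the best. The stub below replaces the LLM with a random token perturbation and uses a toy fitness; every name here is hypothetical, not EvoVerilog's actual implementation:

```python
import random

# Illustrative LLM-in-the-loop evolutionary search, in the spirit of
# frameworks like EvoVerilog. The "LLM" is a stub that perturbs a candidate
# string; a real flow would prompt a model for Verilog rewrites and score
# fitness via lint/simulation.

def llm_mutate(candidate: str, rng: random.Random) -> str:
    """Stand-in for an LLM rewrite: drop or duplicate one token."""
    tokens = candidate.split()
    i = rng.randrange(len(tokens))
    if rng.random() < 0.5 and len(tokens) > 1:
        del tokens[i]                 # simplify the design
    else:
        tokens.insert(i, tokens[i])   # elaborate the design
    return " ".join(tokens)

def fitness(candidate: str) -> float:
    """Stand-in score: a real flow would check correctness, area, timing."""
    return -abs(len(candidate) - 40)  # toy objective: target ~40 characters

def evolve(seed: str, generations: int = 30, pop_size: int = 8) -> str:
    rng = random.Random(0)
    population = [seed]
    for _ in range(generations):
        offspring = [llm_mutate(rng.choice(population), rng)
                     for _ in range(pop_size)]
        # Elitist selection: keep the best of parents and offspring.
        population = sorted(population + offspring, key=fitness,
                            reverse=True)[:pop_size]
    return max(population, key=fitness)

seed = "module add(input a, input b, output y); assign y = a ^ b; endmodule"
best = evolve(seed)
```

Because selection is elitist, the returned candidate is never worse than the seed under the fitness function; the interesting engineering in real systems lies in the prompt design and the simulation-based fitness.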

Sources

A2HCoder: An LLM-Driven Coding Agent for Hierarchical Algorithm-to-HDL Translation

LLM4CMO: Large Language Model-aided Algorithm Design for Constrained Multiobjective Optimization

EvoVerilog: Large Language Model Assisted Evolution of Verilog Code

Image2Net: Datasets, Benchmark and Hybrid Framework to Convert Analog Circuit Diagrams into Netlists

FedChip: Federated LLM for Artificial Intelligence Accelerator Chip Design

White-Box Reasoning: Synergizing LLM Strategy and gm/Id Data for Automated Analog Circuit Design

HiFo-Prompt: Prompting with Hindsight and Foresight for LLM-based Automatic Heuristic Design

Discrete Optimization of Min-Max Violation and its Applications Across Computational Sciences

ChronoLLM: Customizing Language Models for Physics-Based Simulation Code Generation

MAHL: Multi-Agent LLM-Guided Hierarchical Chiplet Design with Adaptive Debugging

AI Agents for Photonic Integrated Circuit Design Automation

Incremental-Decremental Maximization

Computational Intelligence based Land-use Allocation Approaches for Mixed Use Areas
