The field of digital system design is undergoing a significant shift as large language models (LLMs) are integrated into multiple stages of the design process. Recent research has focused on benchmarks and frameworks that apply LLMs to complex digital design tasks such as RTL synthesis and circuit design assistance, with the goal of improving the efficiency and scalability of traditional design flows. LLMs are also being applied to materials science, where they enable the extraction of structure-function relationships from plants and the design of new bioinspired materials. Noteworthy papers in this area include:

- ArchXBench: introduces a comprehensive benchmark suite for LLM-driven RTL synthesis, highlighting the capabilities and limitations of current state-of-the-art LLMs.
- Generative Artificial Intelligence Extracts Structure-Function Relationships from Plants for New Materials: presents a framework that combines generative AI with literature from plant science and materials engineering to design new bioinspired materials.
- MuaLLM: proposes a multimodal LLM agent for circuit design assistance, enabling efficient and comprehensive analysis of circuit design research papers.
- DiffAxE: introduces a generative approach to hardware accelerator generation and design space exploration, achieving significant improvements in efficiency and accuracy over conventional methods.
- AnalogSeeker: proposes an open-source foundation language model for analog circuit design that achieves results competitive with mainstream commercial models.
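
To make the LLM-driven RTL synthesis workflow concrete, the sketch below shows one way such a benchmark loop could be wired up: a natural-language spec is sent to a model, the returned Verilog is compiled and simulated against a reference testbench, and a pass/fail verdict is recorded. This is an illustrative sketch rather than the actual ArchXBench harness; the names `generate_rtl`, `check_with_iverilog`, `adder4`, and the inline testbench are hypothetical, and it assumes Icarus Verilog (`iverilog`/`vvp`) is installed.

```python
import subprocess
import tempfile
from pathlib import Path

def generate_rtl(spec: str) -> str:
    """Return Verilog for the given spec.

    In a real harness this would call an LLM API with the spec as the prompt;
    here a fixed module is returned so the sketch runs without model access.
    """
    return (
        "module adder4(input [3:0] a, input [3:0] b, output [4:0] sum);\n"
        "  assign sum = a + b;\n"
        "endmodule\n"
    )

# Minimal reference testbench; a benchmark suite would ship far richer ones.
TESTBENCH = """
module tb;
  reg [3:0] a, b;
  wire [4:0] sum;
  adder4 dut(.a(a), .b(b), .sum(sum));
  initial begin
    a = 4'd7; b = 4'd9; #1;
    if (sum !== 5'd16) $display("FAIL"); else $display("PASS");
    $finish;
  end
endmodule
"""

def check_with_iverilog(design: str, testbench: str) -> bool:
    """Compile generated RTL with a testbench using Icarus Verilog and simulate it."""
    with tempfile.TemporaryDirectory() as tmpdir:
        tmp = Path(tmpdir)
        (tmp / "design.v").write_text(design)
        (tmp / "tb.v").write_text(testbench)
        build = subprocess.run(
            ["iverilog", "-o", str(tmp / "sim.out"),
             str(tmp / "design.v"), str(tmp / "tb.v")],
            capture_output=True, text=True,
        )
        if build.returncode != 0:
            return False  # generated RTL does not compile
        run = subprocess.run(["vvp", str(tmp / "sim.out")],
                             capture_output=True, text=True)
        # Convention assumed here: the testbench prints FAIL on any mismatch.
        return run.returncode == 0 and "FAIL" not in run.stdout

if __name__ == "__main__":
    spec = "Write synthesizable Verilog for a 4-bit adder named adder4."
    rtl = generate_rtl(spec)
    print("pass" if check_with_iverilog(rtl, TESTBENCH) else "fail")
```

A full benchmark would iterate this loop over many design specs of increasing architectural complexity and aggregate compile and functional pass rates per model, which is the kind of measurement that exposes the capability gaps the survey above refers to.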