Advances in Graph Neural Networks and Symbolic Regression

The field of graph neural networks and symbolic regression is advancing rapidly, with a focus on improving the accuracy and efficiency of these methods. Recent work has introduced new architectures and techniques, such as physics-informed neural networks and the integration of topology optimization, with impact on applications ranging from the analysis of complex systems to the design of optimal structures. Notably, graph neural networks applied to topology optimization have shown promising results, enabling the generation of stress-constrained, manufacturable topologies. In parallel, symbolic regression methods have improved the ability to discover governing equations directly from data, with applications in fields such as physics and epidemiology. Noteworthy papers include TOFLUX, a differentiable topology optimization framework for multiphysics fluidic problems, and EQUATE, a data-efficient fine-tuning framework for symbolic equation discovery. Together, these advances stand to drive progress across a broad range of fields and applications.
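To make the equation-discovery theme concrete, the following is a minimal, illustrative sketch of library-based sparse symbolic regression for a dynamical system (in the spirit of sequentially thresholded least squares). It does not reproduce any of the cited methods (EQUATE, TOFLUX, etc.); the toy system, candidate term library, and sparsity threshold are all assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative sketch only: toy system, term library, and threshold are assumed,
# not taken from any of the papers listed below.

# Simulate a toy 1-D dynamical system dx/dt = -2*x + 0.5*x**3
dt = 0.01
t = np.arange(0, 5, dt)
x = np.empty_like(t)
x[0] = 1.5
for i in range(len(t) - 1):
    x[i + 1] = x[i] + dt * (-2.0 * x[i] + 0.5 * x[i] ** 3)

# Approximate the time derivative with finite differences
dxdt = np.gradient(x, dt)

# Candidate term library: [1, x, x^2, x^3]
library = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])

# Sequentially thresholded least squares: fit, zero small coefficients, refit
coef = np.linalg.lstsq(library, dxdt, rcond=None)[0]
for _ in range(10):
    small = np.abs(coef) < 0.1          # sparsity threshold (assumed)
    coef[small] = 0.0
    active = ~small
    coef[active] = np.linalg.lstsq(library[:, active], dxdt, rcond=None)[0]

terms = ["1", "x", "x^2", "x^3"]
recovered = " + ".join(f"{c:.2f}*{n}" for c, n in zip(coef, terms) if c != 0)
print("dx/dt ≈", recovered)   # expected to recover roughly -2.00*x + 0.50*x^3
```

The design choice here is the defining trait of library-based symbolic regression: candidate terms are enumerated up front and sparsity does the model selection, whereas the neural and foundation-model approaches surveyed above learn or search the expression space directly.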

Sources

Enhanced Hybrid Technique for Efficient Digitization of Handwritten Marksheets

Physics-Inspired Spatial Temporal Graph Neural Networks for Predicting Industrial Chain Resilience

TOFLUX: A Differentiable Topology Optimization Framework for Multiphysics Fluidic Problems

Unveiling the Actual Performance of Neural-based Models for Equation Discovery on Graph Dynamical Systems

Weisfeiler-Lehman meets Events: An Expressivity Analysis for Continuous-Time Dynamic Graph Neural Networks

Automated discovery of finite volume schemes using Graph Neural Networks

Graph Neural Network-Based Topology Optimization for Self-Supporting Structures in Additive Manufacturing

Numerical Optimization for Tensor Disentanglement

Efficiently Generating Multidimensional Calorimeter Data with Tensor Decomposition Parameterization

Data-Efficient Symbolic Regression via Foundation Model Distillation

The Return of Structural Handwritten Mathematical Expression Recognition

Dominant H-Eigenvectors of Tensor Kronecker Products Do Not Decouple

Discovering equations from data: symbolic regression in dynamical systems

Optimization on the Extended Tensor-Train Manifold with Shared Factors
