The field of sequence modeling is shifting toward State Space Models (SSMs), which can capture long-range dependencies at linear or near-linear cost in sequence length, in contrast to the quadratic cost of self-attention. Recent research has focused on SSM architectures that handle complex sequences efficiently, and these models have shown strong performance across tasks including genomic data analysis, graph-level anomaly detection, and visual recognition. Notably, combining selective (input-dependent) state mechanisms with attention has enabled SSMs to reach state-of-the-art results in several domains. Noteworthy papers in this area include Gene42, which introduced a family of Genomic Foundation Models capable of handling very long context lengths in genomics, and GLADMamba, which proposed a framework for unsupervised graph-level anomaly detection built on selective state space models. Additionally, vGamba and Q-MambaIR showcased the potential of SSMs in visual recognition and image restoration, respectively.
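To make the "selective state" idea concrete, below is a minimal sketch of the input-dependent state space recurrence that Mamba-style models build on. Everything here is an illustrative assumption for exposition: the function name, the parameter shapes, and the simplified discretization are hypothetical and are not code from any of the papers above.

```python
import numpy as np

def selective_ssm_scan(x, A_log, W_B, W_C, W_dt):
    """Sketch of a diagonal selective SSM recurrence (hypothetical, not from the papers).

    x:        (T, D)  input sequence, T steps of D channels
    A_log:    (D, N)  log-magnitude of the learned state-decay rates
    W_B, W_C: (D, N)  weights producing input-dependent B_t and C_t
    W_dt:     (D,)    weights producing the input-dependent step size
    Returns y: (T, D)
    """
    T, D = x.shape
    N = A_log.shape[1]
    h = np.zeros((D, N))                  # one N-dimensional state per channel
    y = np.zeros((T, D))
    for t in range(T):
        # "Selective": the SSM parameters depend on the current input x[t].
        dt = np.log1p(np.exp(x[t] * W_dt))            # softplus keeps the step size positive, (D,)
        B_t = x[t][:, None] * W_B                     # input-dependent input matrix, (D, N)
        C_t = x[t][:, None] * W_C                     # input-dependent readout matrix, (D, N)
        # Discretize the continuous dynamics h' = A h + B x with step dt
        # (zero-order hold for A, a common first-order simplification for B).
        A_bar = np.exp(dt[:, None] * -np.exp(A_log))  # decay factors in (0, 1), (D, N)
        h = A_bar * h + dt[:, None] * B_t             # O(1) state update per step
        y[t] = np.sum(C_t * h, axis=-1)               # per-channel readout
    return y

# Usage with random weights, purely to show the shapes involved.
rng = np.random.default_rng(0)
T, D, N = 1024, 16, 8
y = selective_ssm_scan(rng.normal(size=(T, D)),
                       A_log=rng.normal(size=(D, N)),
                       W_B=0.1 * rng.normal(size=(D, N)),
                       W_C=0.1 * rng.normal(size=(D, N)),
                       W_dt=0.1 * rng.normal(size=(D,)))
```

Because the hidden state is updated with constant work per step, the whole scan costs O(T·D·N), i.e. linear in the sequence length T; this is the complexity advantage over quadratic self-attention that the paragraph above refers to.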