The field of tabular learning and bipartite prediction is advancing rapidly, with work aimed at improving model performance, efficiency, and interpretability. Recent developments enhance gradient boosting algorithms, introduce new frameworks for tabular transformers, and examine how hyperparameters affect random forest performance. Researchers are also investigating the role of positional encodings in tabular data and proposing new methods for data augmentation and minority-class oversampling. These advances promise better predictive performance, robustness, and scalability in applications such as drug-target interaction prediction, RNA-disease association, and regulatory network inference.

Noteworthy papers include:

- Oxytrees proposes a proxy-based biclustering model-tree approach for bipartite learning, achieving up to a 30-fold reduction in training time compared to state-of-the-art biclustering forests.
- MorphBoost introduces a self-organizing universal gradient boosting framework with adaptive tree morphing, demonstrating state-of-the-art performance with superior consistency and robustness.
- Tab-PET explores graph-based positional encodings for tabular transformers, finding that they can significantly improve generalization performance.
- iLTM presents an integrated Large Tabular Model that unifies tree-derived embeddings, dimensionality-agnostic representations, and retrieval, achieving consistently superior performance across tabular classification and regression tasks.
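The minority-class oversampling mentioned above is commonly realized by SMOTE-style interpolation: new minority samples are synthesized on the line segments between a minority point and one of its nearest minority-class neighbours. The sketch below is a minimal, generic illustration of that idea in numpy; the function name and parameters are hypothetical and do not come from any of the papers summarized here.

```python
import numpy as np

def oversample_minority(X_min, n_new, k=3, rng=None):
    """SMOTE-style oversampling sketch (illustrative only): synthesize
    n_new samples by interpolating between a random minority point and
    one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]           # k nearest neighbours per point
    base = rng.integers(0, n, size=n_new)       # random seed points
    nbr = nn[base, rng.integers(0, k, size=n_new)]
    gap = rng.random((n_new, 1))                # interpolation weight in [0, 1)
    return X_min[base] + gap * (X_min[nbr] - X_min[base])
```

Because each synthetic point is a convex combination of two minority samples, the generated data stays inside the convex hull of the minority class, which is the main design choice distinguishing interpolation-based oversampling from naive duplication.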
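To make the idea of graph-based positional encodings for tabular transformers concrete, one plausible instantiation (inspired by, but not reproducing, Tab-PET, whose exact construction is not detailed here) is to build a graph over columns from feature correlations and use Laplacian eigenvectors as per-column positional encodings. The function name, threshold, and graph construction below are assumptions for illustration.

```python
import numpy as np

def laplacian_column_encodings(X, dim=4, thresh=0.3):
    """Hypothetical graph-based positional encoding sketch: connect columns
    whose absolute correlation exceeds thresh, then return the smallest
    non-trivial eigenvectors of the graph Laplacian as one encoding row
    per column (to be added to the column-token embeddings)."""
    C = np.abs(np.corrcoef(X, rowvar=False))    # column-by-column correlation
    A = (C > thresh).astype(float)              # adjacency from strong correlations
    np.fill_diagonal(A, 0.0)
    L = np.diag(A.sum(axis=1)) - A              # combinatorial graph Laplacian
    vals, vecs = np.linalg.eigh(L)              # eigenvalues in ascending order
    return vecs[:, 1:1 + dim]                   # skip the constant eigenvector
```

Unlike sequence positions, table columns have no natural order, so an encoding derived from inter-feature structure gives the transformer a permutation-invariant notion of "where" each column sits relative to the others.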