Advances in Quantile Computation and Multiclass Classification

The field of data analytics and machine learning is seeing significant advances in quantile computation and multiclass classification, with researchers developing new methods to improve the accuracy and efficiency of these core primitives. One key direction is the design of exact and approximate algorithms for quantile computation, which is crucial for large-scale data analytics (a minimal illustrative sketch appears after the paper list below). Another focus is improving multiclass classification, including sharper dimension-based bounds on agnostic PAC sample complexity. Noteworthy papers in this area include:

An improved approximation algorithm for k-Median, which gives a polynomial-time algorithm with an improved approximation guarantee for the k-Median problem.

Sample Complexity of Agnostic Multiclass Classification, which proves nearly tight agnostic sample complexity bounds for multiclass PAC learning.

An FPTAS for 7/9-Approximation to Maximin Share Allocations, which presents an algorithm achieving a 7/9-approximation to the maximin share when allocating indivisible goods under additive valuations.

Dimension-Free Correlated Sampling for the Hypersimplex, which reduces the factor for correlated sampling from the hypersimplex to O(log k), independent of the ambient dimension n.
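To make the distributed quantile-computation thread concrete, here is a minimal sketch of one standard way to compute an exact quantile over data split across shards: binary search on the value domain, where each shard only reports how many of its values fall below a probe. This is an illustrative textbook-style approach, not the method of the cited paper; the shard layout, the nearest-rank quantile convention, and the function name are assumptions for the example.

```python
from typing import List

def distributed_quantile(shards: List[List[int]], q: float) -> int:
    """Exact q-quantile (nearest-rank convention) over integer data split
    across shards. Each probe only needs per-shard counts, not raw data.
    Illustrative sketch only; not the method of the cited paper."""
    n = sum(len(s) for s in shards)
    # Rank of the q-quantile (1-indexed), clamped to [1, n].
    target_rank = max(1, min(n, int(q * n) + 1))

    lo = min(min(s) for s in shards)
    hi = max(max(s) for s in shards)
    # Find the smallest value v with at least target_rank elements <= v.
    while lo < hi:
        mid = (lo + hi) // 2
        # One round of communication: each shard reports a single count.
        count = sum(sum(1 for x in s if x <= mid) for s in shards)
        if count >= target_rank:
            hi = mid
        else:
            lo = mid + 1
    return lo

# Example: median of data spread over three shards.
shards = [[7, 1, 9], [4, 12], [3, 8, 10, 2]]
print(distributed_quantile(shards, 0.5))  # -> 7
```

The number of communication rounds grows only logarithmically with the value range, which is the basic reason exact distributed quantile selection can avoid shipping the raw data to a coordinator.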

Sources

A Quick and Exact Method for Distributed Quantile Computation

An improved approximation algorithm for k-Median

Sample Complexity of Agnostic Multiclass Classification: Natarajan Dimension Strikes Back

An FPTAS for 7/9-Approximation to Maximin Share Allocations

Dimension-Free Correlated Sampling for the Hypersimplex

Sample-Adaptivity Tradeoff in On-Demand Sampling
