The field of graph data analysis is shifting toward a greater emphasis on privacy preservation, driven in part by the need to comply with regulations such as the General Data Protection Regulation (GDPR). Researchers are exploring methods to balance privacy and utility, particularly in scenarios where the data publisher and the data user are distinct entities. One key challenge is ensuring unbiased recovery of graph structure while enforcing differential privacy (DP) at the data publishing stage. Another active area is benchmarking fraud detectors on private graph-structured data, which requires anticipating realistic privacy attacks and designing evaluation procedures that satisfy formal DP guarantees.

Notable papers in this area include:
- Graph Structure Learning with Privacy Guarantees for Open Graph Data, which proposes a privacy-preserving estimation framework for open graph data.
- Benchmarking Fraud Detectors on Private Graph Data, which introduces a realistic privacy attack on such a benchmarking system and studies how to evaluate algorithms under a formal DP guarantee.
- Scalable contribution bounding to achieve privacy, which proposes an efficient distributed algorithm for enforcing user-level differential privacy on large datasets.
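To make the first challenge concrete, here is a minimal sketch of publishing a graph under edge-level DP with randomized response, followed by an unbiased debiasing step for a downstream statistic (the edge count). This is a standard textbook mechanism used for illustration, not the specific framework from the papers above; the function names and the adjacency-matrix input format are assumptions.

```python
import math
import random

def randomized_response_edges(adj, epsilon, rng=random):
    """Publish an undirected graph under edge-level DP: each potential
    edge bit is kept with probability e^eps / (e^eps + 1) and flipped
    otherwise (randomized response)."""
    n = len(adj)
    keep_p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    noisy = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            bit = adj[i][j] if rng.random() < keep_p else 1 - adj[i][j]
            noisy[i][j] = noisy[j][i] = bit
    return noisy

def debias_edge_count(noisy, epsilon):
    """Unbiased estimate of the true edge count m from the noisy graph.
    With N = n(n-1)/2 candidate pairs and keep probability p,
    E[noisy_count] = p*m + (1-p)*(N-m), so
    m_hat = (noisy_count - (1-p)*N) / (2p - 1)."""
    n = len(noisy)
    total_pairs = n * (n - 1) // 2
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    noisy_count = sum(noisy[i][j] for i in range(n) for j in range(i + 1, n))
    return (noisy_count - (1 - p) * total_pairs) / (2 * p - 1)
```

The debiasing step is the point of interest here: the published graph itself is biased by the noise, but inverting the known flip probability recovers unbiased estimates of aggregate structure, which is the flavor of guarantee the open-graph-data work targets.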
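For the user-level DP setting, the core idea behind contribution bounding can be sketched as follows: cap each user at k records so that adding or removing one user changes any count by at most k, then add Laplace noise calibrated to that sensitivity. This is a simplified single-machine illustration of the general technique, not the distributed algorithm from the paper; the function names and the `(user, item)` record format are assumptions.

```python
import math
import random
from collections import defaultdict

def bound_contributions(records, k, rng=random):
    """Keep at most k records per user, sampling uniformly when a user
    exceeds the cap. Afterward, one user's presence or absence changes
    any record count by at most k (the sensitivity)."""
    by_user = defaultdict(list)
    for user, item in records:
        by_user[user].append(item)
    bounded = []
    for user, items in by_user.items():
        if len(items) > k:
            items = rng.sample(items, k)
        bounded.extend((user, item) for item in items)
    return bounded

def dp_count(records, k, epsilon, rng=random):
    """User-level DP count: bound contributions to k, then add Laplace
    noise with scale k / epsilon (sensitivity k under user-level DP)."""
    bounded = bound_contributions(records, k, rng)
    scale = k / epsilon
    # Sample Laplace(0, scale) via inverse CDF of a uniform in (-0.5, 0.5).
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return len(bounded) + noise
```

The cap k trades bias (dropping records from heavy users) against noise (larger k means larger Laplace scale); choosing and enforcing it efficiently at scale is what the distributed-algorithm work addresses.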