Data privacy has become one of the most pressing challenges for individuals and organizations. We share pieces of ourselves with every interaction—booking a flight, posting on social media, or browsing the web. But who controls that data, and how is it protected?
Merav Yuravlivker, Chief Learning Officer at Data Society, sheds light on the reality: “The way that we generate data now is with third-party platforms, which automatically means we don’t always have ownership over our data. And we just have to understand that this is the world we live in as of now.”
This stark reality highlights the critical need for increased clarity, transparency, and accountability in the collection, storage, and use of data.
Laws like GDPR and CCPA have been game-changers, emphasizing transparency, user consent, and the right to be forgotten. These regulations ensure that individuals have greater control over their personal information, but they also highlight a significant tradeoff: convenience versus control.
As Yuravlivker explains: “Most people are okay with the tradeoff that, in order to use the service, you will be using my data. But the most important piece is to be upfront about it and to make sure we are agreeing in advance.”
Transparency isn’t just a regulatory requirement—it’s a trust-building opportunity. A McKinsey study found that “about half of the consumer respondents said they are more likely to trust a company that asks only for information relevant to its products or that limits the amount of personal information requested.” Organizations that communicate clearly how data is used and give users meaningful choices will stand out in an increasingly privacy-conscious marketplace.
The rise of generative AI has introduced new complexities to data privacy. Publicly available data, like blogs or social media posts, is often used to train AI models, sometimes without explicit consent. This raises questions about where the line between privacy and innovation should be drawn.
“When we think about AI models, there are public models trained on data that’s broadly accessible, and there are proprietary models using internal data. The context matters, but we need to define clearer ownership standards to avoid misuse and protect privacy,” says Yuravlivker.
As AI becomes more integrated into our daily lives, ensuring that these systems respect data privacy will be critical to maintaining trust and mitigating ethical risks. As the World Economic Forum recommends: “Governments need to incorporate foresight mechanisms to anticipate future risks and adapt their policies accordingly.” Organizations looking for an immediate path forward can focus on internal data privacy frameworks, policies, and controls that keep them transparent with users. This not only gives them a concrete initiative to act on today; it also prepares them for future regulatory requirements.
Organizations that proactively approach data privacy can turn a challenge into a competitive advantage. Here are three steps to start:
As Yuravlivker points out, “We’re moving much faster as a society in terms of technology than we are in terms of policy.” The gap between innovation and regulation is widening, making it more critical than ever for businesses to lead by example.
By building robust privacy practices today, organizations can comply with current laws, prepare for future regulations, and strengthen trust with their customers.
How is your organization navigating data privacy challenges in 2025, and what steps are you taking to stay ahead? Contact Data Society if you want to learn how we can help.