In the corporate world, data privacy stands for the ethical business decision to use collected consumer data in a safe, secure and compliant way. Data processing brings many benefits, but the data behind it must be acquired with people’s consent, or else businesses risk damaging their compliance, reputation, brand value and security.
Customers increasingly care about data privacy, as highlighted by a recent global Axway survey revealing that 85 per cent of people are concerned about the security of their online data and 90 per cent want to know what specific data companies have collected about them.
Businesses should treat access to customer data as an earned privilege, but in recent years this access has been taken for granted and abused, and the legal and consumer backlash against data processing has begun.
Inspirations: Companies with privacy-centered cultures
Companies that adopt a privacy-centered culture stand only to gain. Once explicit consent is obtained, they can track behaviour, offer opt-ins and exchange data, allowing them to understand their markets and optimise their products, services and marketing to better serve their customers.
Necessary data must be collected in strict compliance with regulations. This means businesses must ask for permission to collect, process and store sensitive data, adhering to legal frameworks such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).
In simple terms, companies with a privacy-centered culture treat consumers’ data with utmost integrity and security – and provide reassurances of ethical data usage readily and transparently.
Consumer data analytics has been around for decades, but digital technologies, omnipresent connectivity, social media networks, data science and machine learning have made customer profiling far more sophisticated, and have made lax or unethical practices, whether deliberate or inadvertent, both possible and prevalent.
Big tech companies such as Google and Facebook now capture millions of user data points every day, from general demographic data like “age” or “gender” to more granular insights such as “income”, “past browsing history” or “recently visited geo-locations”.
When combined, such personally identifiable information (PII) can be used to approximate the user’s exact address, frequently purchased goods, political beliefs and medical conditions – with the information sold on to third parties, potentially without consent.
This is when ethical issues arise
The 2018 Cambridge Analytica data scandal, which saw 50 million Facebook profiles being harvested to target American voters, is a prime example of consumer data that was unethically exploited.
Google has also faced a series of regulatory issues over the years surrounding consumer privacy breaches. In 2021, a Google Chrome browser update put 2.6 billion users at risk of “surveillance, manipulation and abuse” by providing third parties with data on device usage. Google was also taken to court the same year for failing to fully disclose the tracking performed in Google Chrome’s incognito mode, and still has a five-billion-dollar lawsuit pending over it. Failing to securely protect customer data is a costly endeavor.
As of 2022, Google Analytics is considered GDPR non-compliant and was even branded “illegal” by data protection authorities in several European countries. Products such as Matomo are GDPR compliant by design, making it easier for privacy professionals and marketers to ensure they are treating customer browsing data with respect.
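To illustrate what being “compliant by design” can look like in practice, below is a minimal sketch of consent-gated tracking using the Matomo JavaScript tracker’s documented consent calls (requireConsent, rememberConsentGiven, forgetConsentGiven). It assumes the standard Matomo snippet is already loaded on the page, and the consent-banner callbacks are hypothetical glue code that a real site would wire to its own banner (shown here in TypeScript).

    // Minimal sketch: no tracking requests and no cookies until the visitor agrees.
    // Assumes matomo.js is already loaded; the banner callbacks are hypothetical.
    const _paq: unknown[][] = ((window as any)._paq = (window as any)._paq || []);

    _paq.push(['requireConsent']);      // hold all tracking until consent is given
    _paq.push(['trackPageView']);       // queued locally, not sent yet
    _paq.push(['enableLinkTracking']);

    // Wire these to the site's own consent banner.
    export function onConsentAccepted(): void {
      _paq.push(['rememberConsentGiven']); // start tracking and remember the choice
    }

    export function onConsentDeclined(): void {
      _paq.push(['forgetConsentGiven']);   // withdraw consent and delete tracking cookies
    }

Until consent is given, the tracker sets no cookies and sends no data, so the privacy-safe state is the default rather than something bolted on afterwards.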
Security risks of non-compliance
With analytics tools making data so easy to access, collecting and processing it can feel like a given, but choosing to do so opens businesses up to a spectrum of risks. Data collection also demands data protection, yet many businesses focus on the former and neglect the latter.
Relaxed attitudes towards consumer data protection have consequently caused a major spike in data breaches. For example, Check Point research found that cyberattacks increased 50 per cent year-on-year, with companies globally facing an average of 925 cyberattacks per week, many of them succeeding as a result of poor data security.
With poor data protection, billions of stolen records can be made public or sold on the dark web. Stolen emails can then be used in social engineering attacks, to distribute malware and to harvest even more data.
Legal risks of non-compliance
Globally, 71 per cent of countries have legislation covering data collection and customer privacy, yet businesses continually try to evade existing rules and occasionally break them, to their regret.
In serious cases, companies that fail to comply with the GDPR and/or suffer a data breach can face a fine of up to €20 million, or four per cent of annual global turnover, whichever is higher.
As for the CCPA, civil penalties can reach $7,500 per intentional violation, or $2,500 per unintentional violation, applied after the business has been given notice and a 30-day opportunity to cure.
Data protection should accordingly be respected, both to reduce future compliance burdens and to protect the business from direct losses.
Reputational risks of non-compliance
As trust is the new currency, data negligence and consumer privacy violations are the fastest ways to lose it. This loss of trust can lead to brand damage, with lasting effects on business results.
When businesses fail to collect and use data with integrity, consumers are fast to cut ties. CMSWire even reported that 71 per cent of people said they would purchase less from a business that had lost their trust.
A company can lose market share, brand equity and competitive positioning the moment information about a privacy violation becomes public, to say nothing of the associated security implications.
Post-data breach, an Aon report estimates that companies can lose around 25 per cent of their initial value, and in some cases losses can be significantly higher.
By respecting consumer data, organisations construct additional layers of protection around their business, saving it from direct and indirect losses.
Therefore, a strong data privacy culture is essential to enhance security posture and regulatory compliance. Organisations that embed a culture of respect for personal data across their teams as part of the discipline of good data practice will retain the trust of valuable customers, and set themselves up for future success.
Thomas Steur is CTO at Innocraft, the makers of Matomo, an open-source, privacy-friendly web analytics solution, now used by over 1,000,000 websites worldwide. Alongside Co-Founder and CEO Matthieu Aubry, he aims to create an approachable and empowering analytics solution with outstanding customer service, backed by a passionate team.