In today’s highly connected society, the old saying that “data is king” has never been truer. Data is now the modern-day currency fueling our economy, and companies and organizations of all kinds are capitalizing on this reality.
There are numerous ways that people share information — face-to-face, over the phone, on online forms, and via email or text. Even more numerous are the various types of information we share, such as physical addresses, phone numbers, social security numbers, financial data, and health records, not to mention the other personal details that make up our digital footprints. Such details include the things we buy, the content we post on social media, the retina scans used to unlock our cellphones and authorize access to select apps, our movement through public and private spaces, and more. The personal information that we provide voluntarily — or involuntarily — is gathered by scores of entities and used for a range of purposes of which we may not even be aware.
For Instance, Did You Know…
- Your DNA data is valuable. The saliva that people willingly share with genealogy and genetic testing companies to trace their ancestry or build a health profile may be resold to pharmaceutical companies for research or sales purposes. According to Kirsten Ostherr, PhD, director of medical humanities and the Medical Futures Lab at Rice University, quoted in a Consumer Reports investigation of the privacy problems of direct-to-consumer genetic testing, “there’s a real risk that such a profile could then be sold to other companies looking to set a life insurance rate or a home loan interest rate for you, or to a potential future employer looking for background.” Plus, under certain circumstances, the DNA information collected may also be accessed by law enforcement personnel without consent as part of a criminal investigation.
- People’s online browser and product purchase histories are tracked by apps to tailor customized ads. Targeted ads may be relatively harmless depending on the context, but identifying contextual harms is not a strong suit of machine learning algorithms. For example, a targeted ad for a piece of clothing similar to a past purchase may be seen as a helpful level of personalization, whereas product advertisements targeted towards people with certain medical conditions could be considered predatory.
- Personal data shared on popular social media platforms – e.g., photos, videos, text posts, etc. – is routinely reviewed by artificial intelligence programs to better understand patterns and develop algorithms designed to help companies achieve a wide variety of business goals. Many people don’t realize that a “private message” isn’t necessarily private.
Transparency Is Key
Ultimately, information is power. In all of the above cases, “the user received something in return for allowing a corporation to monetize their [personal] data,” confirmed Louise Matsakis of technology publication Wired.
However, this often-unwitting exchange isn’t something all users take lightly. A recent Pew Research Center study revealed that four out of five people surveyed feel that they have little control over the data that companies or government agencies collect on them and are either “very” or “somewhat” concerned about how companies are using it.
In light of growing ethical concerns and the alarming incidence of personal data breaches and other cybercrime, which Cybersecurity Ventures forecasts will incur more than US$10 trillion in damages worldwide by 2025, most countries have enacted some level of data privacy legislation that sets parameters around how data is collected, used, and shared. However, these laws aren’t standardized across countries, and in some cases, such as the U.S., they aren’t even centralized at the federal level, leaving individual countries and states to enact their own data privacy laws and penalties for non-compliance. For organizations serving a global population, this patchwork can be especially difficult to navigate.
Privacy By Design
Given that data privacy definitions aren’t yet standardized, experts advise organizations to address data privacy and transparency up-front rather than retrofit them later. In other words, personal data will be better insulated, and companies better protected from the legal and financial repercussions of data privacy non-compliance, when they make a concerted effort to build the key pillars of data privacy into their product development process from the outset.
The European Union formally embraced this theory by adopting the concept of “privacy by design,” in which technology is used to engineer data privacy into products at the earliest stages of development. It’s an approach that savvy companies are watching closely, in the best interests of both their customers’ privacy and security and their organization’s integrity and brand.
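As a minimal sketch of what engineering privacy into a product can look like in practice, the snippet below pseudonymizes a personal identifier with a keyed one-way hash before it ever reaches storage or analytics, a common privacy-by-design tactic. The key handling, function names, and record shape here are illustrative assumptions, not anything prescribed by a specific regulation or by the source above.

```python
import hmac
import hashlib

# Assumption for illustration: in a real system this key would live in a
# secrets manager and be rotated, never hard-coded.
SECRET_KEY = b"example-key-managed-elsewhere"

def pseudonymize(identifier: str) -> str:
    """Return a keyed, one-way token that stands in for the raw identifier,
    so downstream systems can correlate records without seeing the PII."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# A raw event containing personal data...
event = {"email": "user@example.com", "purchase": "raincoat"}

# ...is stripped of the identifier before storage: only the token travels on.
stored = {"user_token": pseudonymize(event["email"]), "purchase": event["purchase"]}
```

The design point is simply that the transformation happens at the edge of the pipeline, so no later component needs to be trusted with the raw identifier.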
Position Your Organization for Success
The protection of privacy and personal data is an essential human right, and one that requires organizations to take action to ensure data privacy for their users. Ideally, that work begins in the product development stage, with every member of the product team understanding privacy by design and how to put its guidelines into practice.
The IEEE | IAPP Data Privacy Engineering Collection delivers critical training, resources, and content for engineers and technology professionals tasked with protecting and maintaining data privacy.
This series of online learning courses from IEEE and the International Association of Privacy Professionals (IAPP) will help learners understand the principles of data privacy, apply the latest strategies related to legal and ethical data use, address corporate privacy challenges, and ensure that their organization’s products and operations meet privacy goals and mitigate risks.
For more information or to register, visit IEEE Xplore Digital Library.
Matsakis, Louise. (15 February 2019). “The WIRED Guide to Your Personal Data (and Who Is Using It).” Wired.
“Americans And Privacy: Concerned, Confused And Feeling Lack Of Control Over Their Personal Information.” (19 November 2019). Pew Research Center.
Morgan, Steve. (13 November 2020). “Cybercrime To Cost the World $10.5 Trillion Annually By 2025.” Cybercrime Magazine.
“Data Privacy Laws: What You Need to Know in 2023.” (14 December 2022). Osano.
Nudson, Rae. (9 April 2020). “When Targeted Ads Feel a Little Too Targeted.” Vox.