Online Privacy in the Age of Data Brokers

The Biden administration’s recent executive order restricting the transfer of sensitive personal data to “countries of concern” is likely the opening move toward much stricter regulation of privacy in general, and of data brokers in particular.

The issues at stake are profound. Attention is the engine that drives revenue on the internet, and our data is the fuel for that engine. Everyday data such as simple buying patterns and the sites we visit might seem innocuous, and on its own it might be, but that isn’t the whole picture.

If I connect your shopping patterns to your location data, along with a history of all the sites you visit and when, your social media connections, friends, family and preferences, I know a lot more about you than you might imagine, and I can predict a lot more about you than you are likely comfortable with. Data brokers are the ones collecting and selling this data.

Most algorithms optimize dispassionately for only one thing: attention, and that tends to appeal to our baser instincts. Generally speaking, anger drives more attention than debate, performance drives more attention than real life, and glitz draws more attention than the mundane. Left to its own devices, an algorithm with broad access to our data devolves into the echo chambers we see on social media today, and if you’re a company looking to sell a product, you go where the attention lies.

This is problematic enough on its own, but let’s extrapolate beyond ad targeting. Imagine an algorithm that is not dispassionate, one steered, say, by a foreign government that wants to influence our kids or sow discontent among voters in a certain state. Nothing stands in its way, and the more data collected on each of us, the more successful the campaign.

Open access to our data doesn’t require an algorithm, or much complexity at all, to create serious concerns. The ability to gather compromising data on a congressperson, for an ex to stalk you, or for your health records and religious beliefs to be shared with anyone willing to pay, or with law enforcement without a warrant, is unacceptable in general, and particularly unacceptable when a foreign government has its hands on the controls.

The Consumer Financial Protection Bureau’s (CFPB) announcement starts to address the issue. The bureau wants data brokers to comply with the Fair Credit Reporting Act (FCRA), which would dramatically change the way data brokers are allowed to sell data.

The FCRA requires a strictly defined purpose when using someone’s credit data (e.g., approving a line of credit or screening a job applicant). The law aims to protect individuals from misuse or overuse of data intended only for specific use cases, and the rules are strict for good reason: it’s critical that this data is both accurate and controlled. Regulated entities such as the credit reporting agencies already operate under these rules.

Given AI and open access to our data, it’s too easy to trace our every move, so regulating data brokers makes good sense. We don’t want our next door neighbors to be able to pull our entire background report.

The size of this market varies based on what is measured, but it is enormous — at over $300 billion globally. Location tracking in the U.S. alone, for example, is a $12 billion a year market and growing quickly.

The Data Collection Problem

For companies trying to comply with a directive such as “don’t share data with ‘countries of concern,’” it seems easy enough, but it’s actually harder than it looks. To be clear, most companies are not intentionally sharing data with countries of concern. But the opaque, complex ecosystem in which our data is collected, shared, bought, and sold online makes it far more difficult than one might think to ensure a website isn’t inadvertently sharing data with foreign entities.

The interplay between data brokers, data lakes, third-party apps, and ad tech creates a complex web of data flows that have little regard for national boundaries. Stopping the oversharing of data in the first place is essential.

The core issue is that data brokers are collecting unprecedented amounts of personal data, from location tracking and biometrics to financial and health records. This data is often collected through seemingly benign services and apps that connect to other services and apps, which in turn connect to yet others, so the number of entities with access to our data grows almost exponentially.

Data collection can range from cookies and tracking pixels to more sophisticated techniques like browser fingerprinting and location tracking. This is not to suggest that all data collection is bad.

Some use cases have legitimate purposes, such as providing access to our health records online, helping us find our friends more easily, and showing us more relevant content on websites. These technologies serve their intended purposes, but they also enable the collection of vast amounts of personal data without users’ explicit consent.
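
Of the techniques mentioned above, browser fingerprinting is a useful one to make concrete, because it shows how little is needed to identify a device without a single cookie. The sketch below is an illustration only, not any vendor’s actual implementation: it combines a handful of signals every modern browser exposes through standard Web APIs and hashes them into a quasi-stable identifier.

```typescript
// Minimal browser-fingerprinting sketch (illustration only).
// Every signal below comes from standard Web APIs available in the page.

async function sketchFingerprint(): Promise<string> {
  // A handful of passively available signals; real trackers combine dozens,
  // including canvas rendering, installed fonts, and audio-stack quirks.
  const signals = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
    String(navigator.hardwareConcurrency ?? "n/a"),
  ].join("|");

  // Hash the combined signals into a compact, quasi-stable identifier.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

sketchFingerprint().then((id) => console.log("fingerprint:", id));
```

Because none of this touches cookies, clearing cookies or declining them in a banner does nothing against it, which is why fingerprinting sits on the “more sophisticated” end of the spectrum.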

Also, the data does not simply stay with the original companies that collected it. It is often sold or shared with a complex network of data brokers, advertisers, and other third parties. Therefore, it can indirectly and often inadvertently end up with foreign adversaries.

Overly open access to this data has become a pressing and difficult problem to solve, and it starts with the simple fact that we all share and collect too much data without giving much thought to where it might end up or how it might be used.

In our recent research study, we examined how frequently data is shared directly with companies in Russia or China. We found that 2% of U.S. companies have web trackers on their websites that share data with these foreign adversaries. That percentage may seem small at first, but its significance becomes apparent in the broader context.

According to a Siteefy survey, there are approximately 133 million websites in the U.S. Applying the 2% figure to that total, we estimate that around 2.7 million websites share data with entities based in foreign countries. Additionally, 12% of the sites we scanned linked to TikTok and 47% linked to Meta, including 33% of healthcare companies, even after a year of nonstop litigation over sharing health data, even indirectly, with Meta. This points again to a technological problem every organization running a modern website has to address.
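
A simplified version of that kind of scan is easy to picture. The sketch below is not the methodology of the study cited above, just an illustration of the general idea, assuming Playwright as the headless-browser tool and a purely hypothetical watchlist: load a page, observe every outbound request, and flag any host that matches the list.

```typescript
import { chromium } from "playwright";

// Hypothetical watchlist of tracker domains; a real scan would use a
// curated, regularly updated list mapped to corporate ownership and country.
const WATCHLIST = ["tracker.example-cn.com", "metrics.example-ru.com"];

async function scanSite(url: string): Promise<string[]> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  const flagged = new Set<string>();

  // Record every host the page contacts while it loads.
  page.on("request", (request) => {
    const host = new URL(request.url()).hostname;
    if (WATCHLIST.some((d) => host === d || host.endsWith(`.${d}`))) {
      flagged.add(host);
    }
  });

  await page.goto(url, { waitUntil: "networkidle" });
  await browser.close();
  return [...flagged];
}

scanSite("https://www.example.com").then((hosts) =>
  console.log("watchlisted hosts contacted:", hosts),
);
```

A production scanner would also resolve each flagged domain to its corporate owner and hosting country, which is where most of the real work lies.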

The current methods of protecting against this have been underwhelming. Take traditional cookie consent management as an example: our study found that 67% of companies have a consent banner, but 98% drop cookies or trackers before a user ever interacts with that banner.
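
That gap is easy to check for any individual site. The sketch below, again assuming Playwright and intended purely as an illustration, loads a page without clicking or scrolling and then reports the cookies set and third-party hosts contacted, all of which happened before a user could possibly have interacted with a consent banner.

```typescript
import { chromium } from "playwright";

// Load a page with no clicks or scrolling, then report what happened
// before any consent interaction was possible.
async function preConsentActivity(url: string) {
  const browser = await chromium.launch();
  const context = await browser.newContext();
  const page = await context.newPage();

  const firstPartyHost = new URL(url).hostname;
  const thirdPartyHosts = new Set<string>();

  page.on("request", (request) => {
    const host = new URL(request.url()).hostname;
    // Crude first-party check, good enough for illustration.
    if (host && !host.endsWith(firstPartyHost)) thirdPartyHosts.add(host);
  });

  await page.goto(url, { waitUntil: "networkidle" });
  const cookies = await context.cookies(); // cookies set with zero user input
  await browser.close();

  return { cookiesSet: cookies.length, thirdPartyHosts: [...thirdPartyHosts] };
}
```

Any nonzero cookie count or lengthy third-party host list at this point is exactly the behavior behind the 98% figure.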

And the burden of asking a consumer for broad consent is itself unreasonable. Even for a technologist, it’s next to impossible to meaningfully consent to a long list of trackers. Managing this more precisely, by blocking individual trackers, is where we’re headed, and companies need the right tools to do it.
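
The mechanics of precision blocking are simpler than the policy around it: intercept each outbound request and decide, tracker by tracker, whether it may proceed. The sketch below shows that decision loop using a headless browser’s request routing and a hypothetical blocklist; real deployments apply the same logic in the page, in a tag manager, or at the network edge, keyed to the user’s actual consent choices.

```typescript
import { chromium } from "playwright";

// Hypothetical blocklist; real deployments key this to the user's actual
// consent choices rather than a hard-coded list.
const BLOCKLIST = ["pixel.example-tracker.com", "cdn.example-analytics.net"];

async function openWithPrecisionBlocking(url: string) {
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Intercept every request; abort the ones headed to blocked trackers
  // and let everything else through untouched.
  await page.route("**/*", (route) => {
    const host = new URL(route.request().url()).hostname;
    const blocked = BLOCKLIST.some((d) => host === d || host.endsWith(`.${d}`));
    return blocked ? route.abort() : route.continue();
  });

  await page.goto(url); // the page loads with tracker requests dropped
  await browser.close();
}
```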

The Executive Order

The executive order starts to define standards, and that’s something everyone can agree is needed. The order gives consistent direction to the following agencies:

  • The Departments of Justice and Homeland Security to work together to set high security standards to prevent access by foreign adversaries;
  • The Departments of Health and Human Services, Defense, and Veterans Affairs to help ensure that Federal grants, contracts, and awards are not used to facilitate access to Americans’ sensitive health data;
  • The Committee for the Assessment of Foreign Participation in the United States Telecommunications Services Sector (often called “Team Telecom”) to consider the threats to Americans’ sensitive personal data in its reviews of submarine cable licenses.

The collective goal is to establish a more secure and transparent data supply chain, preventing our information from being exploited by anyone, particularly foreign governments. Much more specificity is required for companies to comply with the order, but it’s a good start.

About the author: As CEO & Founder of LOKKER, Ian Cohen is dedicated to providing solutions that empower companies to take control of their privacy obligations. Before founding LOKKER in 2021, Cohen served as CEO of Credit.com and as CPO of Experian, where he focused on consumer-permissioned data.

Related Items:

What Is the American Privacy Rights Act, and Who Supports It?

How to Help Your Data Teams Put Privacy First

MOAB Puts a Bow on Data Privacy Week

 

 


