If you are hoping to get more control over how your personal data is collected and used, the good news is there's not much for you to do. That's also, unfortunately, the bad news.

“Regrettably, there is only so much individuals can do to protect their data,” said Jennifer King, privacy and data policy fellow at the Stanford University Institute for Human-Centered Artificial Intelligence (Stanford HAI). “It’s one of the biggest reasons why so many people in the privacy space advocate for a federal-level law. We have a patchwork of state laws to address consumer data privacy issues, but we don’t have a federal law.”

King shares ways individuals can help advocate for federal data protections, along with other tips for safeguarding your data:

Don’t think you have ‘nothing to hide’

A lot of folks think data privacy isn't that important because they have "nothing to hide." But the concern shouldn't be whether the data shows you doing anything wrong or secret. The concern is the extent to which companies aggregate that data and use it to make inferences about your habits and behaviors.

Smart cars are an excellent example. They collect a tremendous amount of data, and we already see instances of people facing discrimination from insurance companies because of the way these systems make judgments about their driving habits. You could say "bad drivers should pay more," but you should always be skeptical about how these systems reach those judgments. For example, what if the car didn't differentiate between me and my teenager who is learning to drive?

Choose cookies and browsers wisely

We are often tracked through browsers, and that's a difficult place to cut down on data leakage. Even in contexts where you would assume you're dealing with first-party sites that wouldn't sell or share your data, they often do.

The primary way to cut down on that type of data leakage is to reject browser cookies whenever possible or use a browser that tries to avoid cross-site tracking, like DuckDuckGo, Brave, or Mozilla’s Firefox.

The General Data Protection Regulation and the ePrivacy Directive are pieces of European legislation (that affect international sites too) that attempt to curb some of the complications surrounding cookies. Under those laws, companies basically aren't allowed to present cookie options to you in deceptive, confusing ways. In most cases, they'll allow you to just reject everything. It's a huge improvement over the past, but it still addresses only a piece of the problem of protecting our data.
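
To make the consent-gating pattern concrete, here is a minimal sketch in TypeScript of what those laws push sites toward: non-essential trackers load only after an explicit opt-in, and rejecting is as easy as accepting. The `loadAnalytics` function and the tracker URL are hypothetical placeholders, not any particular site's implementation.

```ts
// Minimal cookie-consent gating sketch. `loadAnalytics` and the tracker
// URL are hypothetical; real sites often use a consent-management platform.

type Consent = "accepted" | "rejected";

function getStoredConsent(): Consent | null {
  const value = localStorage.getItem("cookie-consent");
  return value === "accepted" || value === "rejected" ? value : null;
}

function loadAnalytics(): void {
  // Placeholder: inject a third-party analytics script only after opt-in.
  const script = document.createElement("script");
  script.src = "https://analytics.example.com/tracker.js"; // hypothetical URL
  document.head.appendChild(script);
}

function onBannerChoice(choice: Consent): void {
  localStorage.setItem("cookie-consent", choice);
  if (choice === "accepted") loadAnalytics();
  // On "rejected", the tracker is simply never loaded: no pre-ticked
  // boxes, and rejecting takes one click, as the rules intend.
}

// On page load, run trackers only if the user previously opted in.
if (getStoredConsent() === "accepted") {
  loadAnalytics();
}
```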

The argument I made in the white paper we published with Stanford HAI is that you should be able to automate these choices out of your life through some kind of global preference. [More on that in the next point.]

Get familiar with Global Privacy Control

Some browsers respect a signal called Global Privacy Control (GPC), a setting that lets people broadcast a universal opt-out signal across all their browsing in one step. Some browsers don't offer it yet, though. In California, a proposed state bill would mandate that all browsers respect Global Privacy Control.
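
For the curious, the mechanics are simple: per the GPC proposal, the browser sends a `Sec-GPC: 1` request header to every site and exposes the preference to page scripts as `navigator.globalPrivacyControl`. Here is a sketch of how a site could detect and honor it; `disableSaleOfData` is a hypothetical placeholder for whatever opt-out handling a site implements.

```ts
// Detect the Global Privacy Control signal in the browser.
// `globalPrivacyControl` is not yet in TypeScript's standard DOM types,
// so we widen the Navigator interface; `disableSaleOfData` is hypothetical.

declare global {
  interface Navigator {
    globalPrivacyControl?: boolean;
  }
}

function disableSaleOfData(): void {
  // Placeholder: record an opt-out of sale/sharing for this visitor.
  console.log("GPC detected: treating visitor as opted out of data sale.");
}

if (navigator.globalPrivacyControl === true) {
  // California regulators have said businesses must treat this signal
  // as a valid opt-out of the sale or sharing of personal information.
  disableSaleOfData();
}

// Server side, the same preference arrives on every request as a header:
//   Sec-GPC: 1
// e.g. in Express: if (req.headers["sec-gpc"] === "1") { /* honor it */ }

export {};
```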

We also passed a recent law here in California, the Delete Act, which provides a one-stop mechanism for consumers to request that all registered data brokers delete their data. That moves us in the right direction, too, but we still end up with the problem that people have to know about these things and take action themselves.

Turn off your location data when you don’t need it

One of the most concerning data vulnerabilities on a mobile device is location data. I advise people to turn off GPS location when they’re not using it. (And I even sometimes struggle with this because it’s often a five- or six-step process on some phones.)
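
You can see the same permission mechanics on the web. Here is a short sketch using the standard browser Permissions and Geolocation APIs that checks whether a site already has location access and, if so, reads a position with no further prompt; it only illustrates what sites can do once you have granted access.

```ts
// Check a site's geolocation permission before requesting a position.
// Uses the standard Permissions and Geolocation browser APIs.

async function checkLocationAccess(): Promise<void> {
  const status = await navigator.permissions.query({ name: "geolocation" });
  // status.state is "granted", "prompt", or "denied".
  console.log(`Geolocation permission: ${status.state}`);

  if (status.state === "granted") {
    // Once granted, the site can read your position silently.
    navigator.geolocation.getCurrentPosition((pos) => {
      console.log(`lat=${pos.coords.latitude}, lon=${pos.coords.longitude}`);
    });
  }
}

checkLocationAccess();
```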

Location data is particularly sensitive because of the pattern it creates when it's aggregated over time. A lot of people make the "nothing to hide" argument here, again. But, for example, that pattern could show you taking yourself or a family member to medical appointments, and it could be used to draw sensitive or inaccurate conclusions.

Location data is probably also behind those situations where you start getting real-time ads based on a verbal conversation you just had. If you share your location with certain apps, they know when you are near a friend (who also has location turned on) using the same app, and the app can make inferences about your shared interests or track real-time behavior, like a search. Then you may both see ads, often within seconds, based on those assumptions. The fact that an ad may target something you never searched for or clicked on shows just how detailed and responsive these aggregate data profiles can be.
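
To illustrate why co-location is such an easy inference to draw, here is a toy sketch, with made-up data and thresholds, of how timestamped location pings from two users could be joined to conclude they were together. Real ad systems are far more sophisticated; this only shows the principle.

```ts
// Toy co-location inference over timestamped location pings (made-up data).

interface Ping {
  userId: string;
  timestamp: number; // Unix seconds
  lat: number;
  lon: number;
}

// Rough planar distance in meters; fine for short distances.
function distanceMeters(a: Ping, b: Ping): number {
  const mPerDegLat = 111_320;
  const mPerDegLon = mPerDegLat * Math.cos((a.lat * Math.PI) / 180);
  return Math.hypot((a.lat - b.lat) * mPerDegLat, (a.lon - b.lon) * mPerDegLon);
}

// Flag pings from different users within 50 meters and 5 minutes of each other.
function findCoLocations(pings: Ping[]): Array<[Ping, Ping]> {
  const pairs: Array<[Ping, Ping]> = [];
  for (let i = 0; i < pings.length; i++) {
    for (let j = i + 1; j < pings.length; j++) {
      const a = pings[i];
      const b = pings[j];
      if (
        a.userId !== b.userId &&
        Math.abs(a.timestamp - b.timestamp) <= 300 &&
        distanceMeters(a, b) <= 50
      ) {
        pairs.push([a, b]);
      }
    }
  }
  return pairs;
}

// Two users at (almost) the same spot, two minutes apart.
const sample: Ping[] = [
  { userId: "alice", timestamp: 1_700_000_000, lat: 37.4275, lon: -122.1697 },
  { userId: "bob", timestamp: 1_700_000_120, lat: 37.4276, lon: -122.1698 },
];
console.log(findCoLocations(sample).length); // 1: inferred co-location
```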

Don’t depend on Incognito Mode

Incognito Mode does not do what you think it does. It keeps your browsing session private from other users of the same computer; it does not let you browse anonymously without any tracking. Your browser still knows what you're doing in Incognito Mode. There was actually a recent lawsuit over the confusion companies have created by building these modes.

If you're on a public computer at a library, Incognito Mode is a good thing to use, but it does nothing to stop companies from observing what you're doing.

Cover your cameras

Covering your camera is a pretty low-effort way to make sure that if somebody is running malware on your machine, they can't record you without your knowledge. Although it's not common, people do get targeted for hacking attempts. I personally know somebody who was targeted in a ransomware scheme. It happens.

Make your voice heard

If people care about the privacy of their data, they should tell their members of Congress that they want to be protected from companies exploiting their data.

Politicians know data is valuable. That's at least part of the motivation behind trying to ban TikTok, in addition to concerns about national security. But many of us in the privacy space feel we're passing up the opportunity to address this more broadly, at the right level. There's even a bill, introduced last congressional session, that would do this, but it never made it to the floor.

California is one of the dozen or so states with a consumer privacy act (the CCPA), and voters passed Prop. 24 in 2020 to expand the law. Our new state privacy agency is still implementing Prop. 24, and one of the pieces the California Privacy Protection Agency board is working out right now, which is open for public comment, concerns automated decision-making technology, including artificial intelligence. One of the things the rulemaking aims to do is give people a way to opt out when a technology uses AI to make a consequential decision, like a hiring or housing application. Whether that ends up being good or bad depends, I think, on the context. But people should know these decisions are happening right now if they want to weigh in.