You search for a product on one website and suddenly hundreds of related products are filling every ad you see. You travel to a new city and your algorithm starts pushing you videos of the best restaurants and things to do there. You post a photo of a dog and are inundated with chew toy ads soon after. Unless you’ve sworn off social media completely, it’s likely some (or all) of these things have happened to you. But why?
The answer is somewhere within those 80-something pages of terms and conditions you scrolled through and accepted upon your first download of a social media app. While what you agreed to is probably not as nefarious as what you’d expect from a Black Mirror episode, by clicking “OK,” you gave these apps your consent to collect and use your data for their benefit. But while the convenience of having an app predict what you want is surely a technological advancement to marvel at, should it also raise an eyebrow over how safe it actually is?
This question has been at the core of ethical debates, research studies, and even congressional hearings over the years. Most recently, the U.S. government implemented a ban that resulted in the popular video app TikTok going dark for U.S. users just hours before the ban took effect. TikTok, which is owned by Chinese tech company ByteDance, was initially targeted under the first Trump administration over concerns about data security and foreign interference, and has continued to be the subject of data privacy debates both in and out of Congress. The ban was contingent on ByteDance selling TikTok to an American company by a Jan. 19, 2025, deadline, and when it was clear that wasn’t going to happen, TikTok took itself offline in the U.S. That dark period lasted only 14 hours, though, as TikTok execs struck a compromise with President Donald Trump (even though Trump had not yet been sworn into office at the time). Trump then postponed the TikTok ban for 75 days to give the administration more time to negotiate a deal — which means it’s possible TikTok could still go away again in early April.
So, while TikTok is safe for now, the reason for the ban has sparked questions about what it even means for a company to have your data: Why is the U.S. so concerned with a Chinese-owned company having Americans’ data in the first place? What’s so valuable, and therefore dangerous, about the data?
“I think sometimes we have a failure of imagination of what people can do with this information,” Debbie Reynolds, a data privacy and emerging technology expert, tells Her Campus in an exclusive interview. Reynolds, who boasts 450,000+ listeners on her “The Data Diva” Talks Privacy Podcast, works with tech companies — TikTok being one of them — to share her expertise on optimizing data use while protecting users, and also uses her social media to educate internet users on the privacy risks of social media apps and how to best navigate them.
Reynolds’s career in data privacy consulting has spanned multiple presidential administrations, and during that time, she’s watched hit social media apps come and go. So, it’s safe to say that, throughout the years, she’s seen some things — both good and bad. “Technology is like a double-edged sword where it slices both ways, so it has some benefits to it, but then also you have to not be so excited about the benefits that you forget about what those risks are,” she says.
And what, exactly, are those risks when it comes to your data? Here’s a rundown of how social media apps actually use your data, and what precautions you can take to protect yourself.
First things first, what does it really mean for social media apps to have your data?
The concept of a social media app using your data can seem pretty silly. Surely, Instagram execs don’t personally care about what you ate for lunch today or that your dating anniversary is next week… right? But the truth is, that type of information is actually exactly what social media apps use to keep the lights on.
“Almost anything you could possibly imagine can be packaged up and sold to a company that wants to advertise, especially if that service is free,” Reynolds says. That could cover a range of things — from personal information such as your name, email, and gender, to where you’re located and who you follow, to the content that you like and engage with, and even device data like your browser and IP address. Advertisers use your demographics and interests to push out ads for products you’re most likely to purchase. If you’re based in Florida, for example, you’re likely not getting ads for ski equipment — unless you’ve been using the internet and social media to do research for planning a trip to the mountains; then, bring on the snowsuit sponcon!
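To make that feel a little less abstract, here’s a purely hypothetical sketch, in Python, of the kind of profile an ad system could assemble from those signals. Every field name and value below is invented for illustration; it’s not any app’s actual data schema or targeting code.

```python
# Purely hypothetical example: the kinds of signals described above, bundled
# into a profile a toy ad-targeting system could use. All names are made up.
ad_profile = {
    "user": {
        "age_range": "18-24",
        "gender": "female",
        "location": "Miami, FL",
    },
    "device": {
        "browser": "Safari",
        "os": "iOS",
        "ip_address": "203.0.113.42",  # documentation-range IP, not a real one
    },
    "behavior": {
        "follows": ["@travelblogger", "@skiresortdeals"],
        "recent_searches": ["ski trips colorado", "snowsuit"],
        "liked_topics": ["travel", "winter sports"],
    },
}


def matches_campaign(profile: dict, campaign_keywords: set) -> bool:
    """Toy targeting check: does the user's recent activity overlap the campaign?"""
    interests = set(profile["behavior"]["liked_topics"])
    for search in profile["behavior"]["recent_searches"]:
        interests.update(search.split())
    return bool(interests & campaign_keywords)


# A Florida user normally wouldn't see ski ads -- until their activity says otherwise.
print(matches_campaign(ad_profile, {"ski", "snowsuit"}))  # True
```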
While apps typically have some degree of customizable privacy settings, and claim to be transparent about how your data is used, it’s hard to ignore the fear of such personal data being sold or shared with entities that want to do more than just sell a product. Think of it this way: If that data is so easily accessible, who’s to say it can’t be used to influence voting habits or prey on vulnerable groups like minors, the elderly, or marginalized communities? Reynolds says that calls to protect sensitive data have begun showing up in proposed legislation; for example, in June 2023, a bipartisan group of senators introduced a bill titled the “Platform Accountability and Transparency Act” that would support independent research into major social media companies. The bill has yet to be voted on.
What’s the issue the U.S. has with TikTok?
TikTok and Trump actually go way back. In 2020, Trump signed an executive order effectively banning TikTok over national security concerns that China was accessing U.S. users’ information, though TikTok maintained that its data was stored outside of China. The ban never came to be, though, as a federal judge blocked it, and then former President Joe Biden revoked it when he took office in 2021. Now, four years later, Trump has changed his tune significantly, calling TikTok a “unique medium for freedom of expression” in his brief to the Supreme Court asking to pause the ban. So, why the switch-up? Well, there’s no way to know for sure, but it’s worth noting that Trump has now accrued 14.7 million followers on the app — and won an election (in part due to Gen Z voters, the largest demographic on the app), so some people have theorized that’s why he may be viewing TikTok a little more favorably.
Reynolds believes that the government’s initial concern with TikTok had less to do with national security and more to do with wanting to ride the coattails of an international tech success. “I think a lot of the attention and concern with TikTok in general has been just because it’s so popular,” Reynolds says, noting that ByteDance is far from the only company offering Chinese-made apps in the U.S., yet it’s the one getting all this attention from the U.S. government. “They have done something that some of the other incumbent social media companies haven’t done, which is capture a young audience.”
As for what a U.S. purchase of TikTok might bring, Reynolds doesn’t think much would change, saying that the algorithm is what American companies are coveting most. There’s a small caveat, though: A purchase of TikTok by an American company doesn’t necessarily mean the algorithm comes with it. “A lot of people don’t remember this, but back in the previous Trump administration when he first was talking about trying to ban TikTok, China put an export control on the algorithm,” Reynolds says. Export controls are laws that regulate how certain goods and technologies can be sold abroad, and in 2020, shortly after Trump announced his first intent to ban TikTok, China updated its export controls to cover technologies it deemed sensitive. While TikTok’s algorithm wasn’t explicitly mentioned, the language used to describe the restricted technology sounded a lot like the app’s recommendation algorithm, which many have dubbed its “secret sauce.” This meant that, regardless of a sale, a U.S. company would likely never obtain the same algorithm that made TikTok so successful in the first place.
So, how do you keep yourself and your data safe on social media?
Thankfully, there’s a better (and more realistic) answer than just “don’t use it.” While social media users don’t have much control over what the government or social media giants do in terms of data protection, there are some precautions you can take on the individual level that could help keep your information safer.
It’s not uncommon to get a notification on apps like TikTok asking for your information in order to “better tailor your ads.” Reynolds advises thinking before you click yes. “Ask yourself, ‘Do I really want to share this information with this company?’” Reynolds says. “Figure out the apps you really like, the ones you really, really want to interact with, and you can share your data with that app. If you want to personalize your experience and you trust this app, I don’t see a problem with that. But for a random app or a random website, maybe think first, ‘I don’t really know you like that!’”
Also, this may seem like internet safety 101, but Reynolds reminds readers that it’s important to be mindful of your data privacy everywhere online, not just on social media. Like social media apps, nearly all websites in the U.S. use “cookies” — small text files that store bits of data in your browser — to tailor their web advertising. When you visit a website, you’re typically prompted with a message asking you to “accept all” or “manage cookies,” and accepting them all lets the site collect info like where you’re located, your computer settings, and the accessibility modes you use, all of which can be fed into advertising.
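For the curious, here’s a tiny Python sketch, using only the standard library’s http.cookies module, that shows just how unglamorous a cookie actually is. The cookie name, value, and lifetime below are made up for illustration; real sites set their own.

```python
# A cookie is just a small named piece of text plus a few attributes, set by a
# website and stored by your browser. This parses a made-up example of the kind
# of long-lived advertising cookie described above.
from http.cookies import SimpleCookie

raw_set_cookie = "ad_id=abc123xyz; Domain=.example.com; Path=/; Max-Age=31536000; Secure"

cookie = SimpleCookie()
cookie.load(raw_set_cookie)

morsel = cookie["ad_id"]
print("value:", morsel.value)              # abc123xyz -- the identifier that follows you around
print("domain:", morsel["domain"])         # .example.com -- readable on every page of that site
print("lifetime (s):", morsel["max-age"])  # 31536000 -- roughly a year
```

In practice, the “manage cookies” button is just you deciding which of these little identifiers a site gets to plant in your browser in the first place.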
The fight for privacy in these spaces is ongoing — even in places you might not think to look. Back in 2022, when Roe v. Wade was overturned, it was theorized that governments in states where abortion is criminalized might seek data from period tracking apps to find out whether or not a person could be pregnant, and how far along they are, in order to make decisions about providing or denying them reproductive healthcare — or even to prosecute them for seeking an abortion or losing their pregnancy. Reynolds says that theory isn’t as far-fetched as you might think.
“It can happen, and it is happening,” Reynolds says. “Unfortunately for women, at a state level, you have different privacy rights than men. So if you go to Texas as a woman, more of your medical or health information can be shared with law enforcement. It’s creating a very uneven privacy landscape for men and women based on the state that they live in.”
While hearing all of this might make you want to throw your phone in a large body of water, it’s not a total cause for panic. Reynolds says the best thing you can do to protect yourself and your data online is to be mindful of how you interact with social media and share your information. “Someone who is using [an] app who doesn’t share a lot of their personal information is going to have a safer experience than someone who shares more,” Reynolds says. “So I think you have to be selective about what you share, when you share, and how you share.”