The days you can keep using TikTok might be numbered.
In fact, a new bill passed the House this weekend that gives ByteDance, the Chinese company behind the social media app, exactly 270 days to sell it or face a ban.
Packaged with another bill that includes aid for Israel and Ukraine, the measure heads to the Senate next and then to President Biden, who has already voiced his support.
This is the second time the House of Representatives has approved a TikTok ban. Lawmakers are targeting the app because they believe it could be used to interfere in the next election and because there are legitimate security risks around how user data is stored and who can access it.
If the bill passes, President Biden has the option to extend ByteDance’s deadline to sell the app by 90 additional days.
If the bill clears the Senate and ByteDance refuses to sell, the repercussions include blocking access to the app in app stores and through internet providers. In other words, the U.S. government could effectively shut down access to TikTok for everyone.
For the average TikTok user, this is a dramatic turn of events. It might be easy to swipe through the news and assume the legislation doesn’t have enough momentum, especially since the first bill fizzled out. Yet packaging a measure inside a larger bill is a tried-and-true way to move it through the Senate, especially when foreign aid is involved.
For its part, ByteDance claims the data it processes for its 170 million U.S. users is routed through a third-party provider based in the U.S.
I see this as a free speech issue as much as a privacy and security issue. On the one hand, the ban could trigger a momentous change in how deeply the federal government can get involved in technology, and specifically in the apps we all use daily. It has “Big Brother” ramifications not only for what is a mostly harmless (and fairly inconsequential) entertainment app but for any app that collects data from its users. The TikTok algorithm has always struck me as invasive in that it constantly watches everything we do and adjusts the feed accordingly, but in practice, that usually just means sending me more videos about NBA basketball.
On the other hand, I have real concerns about the security risks. The axiom “what we don’t know can’t hurt us” doesn’t apply here. Frankly, there is a distinct possibility that TikTok could misuse our data and influence us during the election; we simply don’t know enough about how the data is stored and used within the app, or how it is stored or sold outside of the U.S. All it would take is a small change in the algorithm to steer voters away from harmless content and toward something far more nefarious, manipulative, and coercive.
Essentially, this is one case where I can see both sides of the issue quite easily. TikTok is both harmless at first glance and potentially dangerous. Harmless because, well, I’ve seen the videos: they are fun to watch and easy to forget. Potentially dangerous because we’re so easily sucked into them, and there’s no telling whether we’re being nudged into accepting a certain viewpoint.
My solution? As usual, it’s moderating my own usage. I often delete the app until I need to use it again for research or because, say, it’s the NBA playoffs. (This is your time, Timberwolves!) In the end, the future of the app is in the hands of the Senate and President Biden, whether we like that or not.