Last summer, The Onion published an article with the ingenious headline: “Teens Flock To New App Where They Just Enter Own Personal Data Into Form.” I think about this article a lot these days, particularly when the social-media app TikTok is in the news. At 28, I’m not that old (yet), but I’ve already reached an age when I simply do not (or refuse to) understand some of the newest technology. But, as best as I can understand it, TikTok is an app that enables people to create and share short videos. It is popular among Zoomers, the generation that follows my own, which is itself a reason not to trust it. But there are also reputable reports that the app is compromised by the Chinese government.
Despite these defects, the Biden administration saw fit to use TikTok to promote coronavirus vaccination (a worthy goal, to be sure) rather than pursue the superior alternative of using a . . . shall we say . . . generous interpretation of public-health and national-security powers to resurrect Vine, the similar, homegrown social-media app killed by Twitter a few years ago and still fondly remembered by many. Team Biden's effectively promoting TikTok (by paying its most popular users to promote vaccination) is quite the executive whiplash from this time last year, when the Trump administration tried to force ByteDance, TikTok's Chinese parent company, to sell the app, first to the U.S.-based Microsoft, then to Oracle. President Biden — who, at age 78, is probably even less capable of understanding TikTok than I am — backed off this effort earlier this year.
At any rate, I still don’t trust TikTok. And a recent report in the Wall Street Journal provides ample reason to believe that this app might be a tick worse even than the other apps and platforms so prevalent in modern life. For “How TikTok Serves Up Sex and Drug Videos to Minors,” the Journal created some bot accounts on TikTok to get a sense of how the site’s algorithm worked:
An analysis of the videos served to these accounts found that through its powerful algorithms, TikTok can quickly drive minors—among the biggest users of the app—into endless spools of content about sex and drugs.
TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts.
TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.
Still others encouraged eating disorders and glorified alcohol, including depictions of drinking and driving and of drinking games.
What makes TikTok particularly insidious is that it takes the monitoring-and-learning function of other apps and websites, which "learn" about users from what they actually click on, to the next level:
An earlier video investigation by the Journal found that TikTok only needs one important piece of information to figure out what a user wants: the amount of time you linger over a piece of content. Every second you hesitate or re-watch, the app tracks you.
Through that one powerful signal, TikTok can learn your most hidden interests and emotions, and drive users of any age deep into rabbit holes of content—in which feeds are heavily dominated by videos about a specific topic or theme.
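The feedback loop the Journal describes — dwell time as the dominant signal, each lingering view tilting the feed further toward that topic — can be sketched in a few lines. To be clear, everything below (the function names, the completion-ratio weighting) is my own illustration of the general mechanism, not TikTok's actual code:

```python
from collections import defaultdict

def update_interest_profile(profile, topic, seconds_watched, video_length):
    """Accumulate a per-topic interest score from dwell time alone.

    A view counts for more the larger the fraction of the video the
    user actually watched; re-watching pushes the ratio above 1.0.
    """
    completion = seconds_watched / video_length
    profile[topic] += completion
    return profile

def rank_candidates(profile, candidates):
    """Order candidate videos by the user's accumulated topic scores."""
    return sorted(candidates, key=lambda v: profile[v["topic"]], reverse=True)

# A user who lingers on one topic sees the feed tilt toward it.
profile = defaultdict(float)
update_interest_profile(profile, "cars", 3, 30)      # skipped after 3 seconds
update_interest_profile(profile, "dieting", 45, 30)  # watched 1.5x (re-watched)
feed = rank_candidates(profile, [{"id": 1, "topic": "cars"},
                                 {"id": 2, "topic": "dieting"}])
```

Because every watch feeds back into the profile that ranks the next batch of videos, a single topic can come to dominate the feed after only a handful of lingering views — the "rabbit hole" dynamic the Journal documented.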
The whole article is worth reading. TikTok claims to be actively moderating its content, but the sheer volume of that content, as well as the app's growing user base, complicates its efforts. And even a less lewd TikTok is still an app designed to entrap users, particularly younger ones. Fortunately, the choice is always theirs — and, if not theirs, then that of parents or others in their lives — to stop using it. As for me: I have no intention of ever starting in the first place. And not just because I don't understand exactly what TikTok is.