When Barak Herscowitz joined TikTok two years ago in the company’s Tel Aviv office, his role was to recruit Israeli government agencies and other public-sector groups to join the video service and take advantage of its popularity. His pitch: TikTok was a powerful communication tool and getting more influential in the country by the day.

But Mr. Herscowitz, 38, an Israeli who had worked for the country’s conservative former prime minister, Naftali Bennett, and has at times criticized Palestinians on the social network X, grew disenchanted with the company after the start of the Israel-Hamas war in October.

His frustrations stemmed from seeing some employees express anti-Israel views in an internal group chat, and from what he perceived to be a double standard in how the company approved ads that referred to the war, he said in an interview. He was also dissatisfied with the company's response when he raised those concerns.

By the end of January, he had quit.

TikTok has been dogged for months by accusations that its app has shown a disproportionate amount of pro-Palestinian and antisemitic content to users of its hugely popular video platform. TikTok has strongly rejected those arguments, and its executives have met multiple times with Jewish groups to discuss those concerns. But the claims of bias have nevertheless helped fuel the debate over a House bill passed this month that would force TikTok’s Chinese owner, ByteDance, to sell the app or have it face a ban.

Mr. Herscowitz’s experience, as well as interviews with four current employees at TikTok and dozens of screenshots of internal conversations, points to how some of those same currents of discontent have roiled TikTok internally. Mr. Herscowitz alluded to some of those concerns in a post on X right after he left, and his departure was brought up that week in a Senate hearing with social media executives, including TikTok’s chief executive, Shou Chew.

Mr. Herscowitz and the four employees said that they, and other colleagues, had expressed dissatisfaction in internal channels with how the company had managed in-house criticism of Israel and dialogue around the war. They were also upset to see personal views, sometimes extreme, aired in a company chat room, called Palestinian Support, that employees created after the war started. The employees were frustrated that the group included some workers from TikTok's trust and safety division, which sets rules about content on the platform.

“I think they are aware of some employees who not only share these views but are in a position to shape the content and the policy of the platform,” Mr. Herscowitz said, adding that many Israelis felt the company was biased against Jews.

TikTok, when asked about the concerns raised by Mr. Herscowitz and the other employees, said that all its employees were responsible for adhering to TikTok’s internal code of conduct, “which promotes mutual respect and provides for a workplace free of discrimination and harassment.” The company added that the posts Mr. Herscowitz had flagged as inappropriate or offensive had not been made by people who worked directly in content moderation or content policy.

TikTok has long said its recommendation algorithm doesn’t “take sides” on issues. The company has pointed to Gallup data showing that millennials in the United States have become increasingly sympathetic to Palestinians in recent years.

It said it had been working aggressively to address hate speech on the app. The company removed more than 34 million videos that broke its rules in the United States from October to December, and more than 96 percent were taken down before users reported them, the company said.

TikTok also said that its U.S. moderators received unconscious-bias training as well as other policy training. Its trust and safety team, which TikTok has said is made up of more than 40,000 people, participated in an enrichment program from Yad Vashem, Israel's Holocaust memorial, to deepen its understanding of the Holocaust and better root out antisemitism on the app.

“These allegations deliberately misrepresent our actions in removing violative content within minutes of notification,” Jamie Favazza, a spokeswoman for TikTok, said in a statement. “We vigorously oppose antisemitism in all forms and apply our policies equally to all content and ads on TikTok.”

Many workplaces and industries, both small and large, have struggled with employee disagreements over the Israel-Hamas war. There has been infighting at media companies like NBCUniversal, the editor of Artforum was fired after publishing an open letter supporting Palestinian liberation, and physicians at NYU Langone Health were suspended for social media posts they made about the conflict.

At major technology companies, internal tensions over political issues are often joined by accusations that workers’ views could influence how posts about those issues are displayed on their platforms. In 2019, Google discouraged employees from discussing politics on internal mailing lists and forums. Meta told workers in 2022 not to openly discuss the Supreme Court ruling eliminating the constitutional right to an abortion.

At TikTok, much of the tension has emerged in and around group chats on Lark, TikTok's internal messaging system, according to the four TikTok employees, who work in three offices around the world and described frustrations at the company. They spoke only on the condition of anonymity, out of fear of retribution for discussing internal company matters.

Employees have shared their views on the conflict in several ways. Some have added Israeli flags and yellow ribbons for hostages held by Hamas to their internal work profiles that appear when messaging colleagues, two of the employees said. Others have added Palestinian flags and phrases like “stop ethnic cleansing” or “from the river to the sea” — a decades-old Palestinian nationalist slogan that many also see as a call for Israel’s annihilation — to their profiles, according to the employees and screenshots.

TikTok already had a formal affinity group for Jewish and Israeli employees called MazalTok, later renamed L’Chaim. Its numbers doubled after the war started. Several employees said being Jewish at TikTok in the months after the attack sometimes felt isolating.

A group of employees started the Palestinian Support group after Oct. 7, and it drew hundreds of members, who shared personal experiences, as well as information about the conflict and suggestions on where to donate for aid.

Early on, several Jewish employees argued in the L’Chaim chat that the Palestinian Support chat contained offensive posts. A TikTok executive reprimanded those employees for unfair reactions to colleagues seeking a safe space, saying, “Where inappropriate content is posted on Lark channels, there are a bunch of folks working hard to get things taken down behind the scenes.”

Some Israeli employees then made another chat called the Israeli Support group, which also drew hundreds of employees. TikTok appears to have managers overseeing conversations in each of the groups, based on screenshots shared with The New York Times, although the groups are not considered officially approved by the company.

Mr. Herscowitz compiled a memo in December about what he and several other colleagues in the Israel office viewed as offensive posts in the Palestinian Support group, as well as his issues with ads, and sent it to a group of top TikTok executives, including Adam Presser, its head of operations, who was recently promoted to oversee the company’s trust and safety division.

The memo, which he said circulated among more than 20 employees, showed that one group member had shared a post that said, “Get you a friend that loves you the way Yemen loves Palestine,” which appeared to condone the Houthi militia attacks, and another shared information on how to support the boycott, divestment and sanctions movement targeting Israel. TikTok said that these posts were removed quickly after they were flagged, sometimes within minutes.

Mr. Herscowitz said that he had a couple of conversations with an executive in response to his memo but that he felt largely ignored. TikTok said that multiple leaders made good-faith efforts to address Mr. Herscowitz’s concerns and that it took action on several items he flagged.

Last month, the Palestinian Support group removed nearly all its members who were Jewish or who had ties to the Israeli Support group and became invitation-only, according to screenshots from both groups. Several staff members argued that they were removed because they were Jewish and filed complaints of discrimination to TikTok’s ethics office, according to three of the employees. TikTok said, “We offer a way to report concerns anonymously and investigate all reports.”

The decision was made by employees who run the group. The Times reached out to six people listed as members of the Palestinian Support group. None responded to a request for comment. The Palestinian Support group moderators told members that they removed certain colleagues to “provide a better sense of security to this community” and minimize “the feeling of being monitored by people who may not have positive intent,” one screenshot showed.

Mr. Herscowitz said he was also concerned that TikTok applied its ad policies inconsistently. The company rejected ads featuring Israeli hostages last year, saying they violated guidelines around showing scenes of war. But he said the company accepted ads from humanitarian aid groups seeking donations that showed destruction in Gaza.

TikTok said it updated its rules this year so that ads for humanitarian campaigns can run even if they refer to war or depict victims of war. The company said it had run ads from the Israeli Red Cross and others that show hostage victims.

Gabe Zichermann, who consults on corporate culture and employee engagement, said that many companies were grappling with intense internal dialogue that they had not dealt with before.

“The Israel-Hamas war is creating novel problems, and companies are definitely dealing with it in different ways,” he said.

Nadav Gavrielov contributed reporting.
