Alpha Leaders
News

‘Could it kill someone?’ A Seoul woman allegedly used ChatGPT to carry out two murders

By Press Room · 3 March 2026 · 5 Mins Read

Be careful how you interact with chatbots, as you might just be giving them the means to help carry out a premeditated murder.

A 21-year-old woman in South Korea allegedly used ChatGPT to help her plan a series of murders that left two men dead. 

The woman, identified solely by her last name, Kim, allegedly gave two men drinks laced with benzodiazepines that she was prescribed for a mental illness, the Korea Herald reported. 

Although Kim was initially arrested on Feb. 11 on the lesser charge of inflicting bodily injury resulting in death, Seoul Gangbuk police found her online search history and chat conversations with ChatGPT, which showed an intent to kill.

“What happens if you take sleeping pills with alcohol?” Kim is reported to have asked the OpenAI chatbot. “How much would be considered dangerous? 

“Could it be fatal?” Kim allegedly asked. “Could it kill someone?”

In a widely publicized case dubbed the Gangbuk motel serial deaths, prosecutors allege Kim’s search and chatbot history show a suspect asking for pointers on how to carry out premeditated murder.

“Kim repeatedly asked questions related to drugs on ChatGPT. She was fully aware that consuming alcohol together with drugs could result in death,” a police investigator said, according to the Herald. 

Police said the woman admitted she mixed prescribed sedatives containing benzodiazepines into the men’s drinks, but previously stated she was unaware it would lead to death.

On Jan. 28, just before 9:30 p.m., Kim reportedly accompanied a man in his twenties into a Gangbuk motel in Seoul, and two hours later was spotted leaving the motel alone. The following day, the man was found dead on the bed. 

Kim then allegedly carried out the same steps on Feb. 9, checking into another motel with another man in his twenties, who was also found dead with the same deadly cocktail of sedatives and alcohol.

Police allege Kim also attempted to kill a man she was dating in December after giving him a drink laced with sedatives in a parking lot. Though the man lost consciousness, he survived and was not in a life-threatening condition.

OpenAI has not responded to requests for comment. 

Chatbots and their toll on mental health

Chatbots like ChatGPT have come under scrutiny as of late for the lack of guardrails their companies have in place to prevent acts of violence or self-harm. Recently, chatbots have given advice on how to build bombs or even engage in scenarios of full-on nuclear fallout.

Concerns have been particularly heightened by stories of people falling in love with their chatbot companions, and chatbot companions have been shown to prey on vulnerabilities to keep people using them longer. The creator of Yara AI even shut down the therapy app over mental health concerns.

Recent studies have also shown that chatbots are leading to increased delusional mental health crises in people with mental illnesses. A team of psychiatrists at Denmark’s Aarhus University found that the use of chatbots among those who had mental illness led to a worsening of symptoms. The relatively new phenomenon of AI-induced mental health challenges has been dubbed “AI psychosis.” 

Some instances do end in death. Google and Character.AI have reached settlements in multiple lawsuits filed by the families of children who died by suicide or experienced psychological harm they allege was linked to AI chatbots.

Dr. Jodi Halpern, chair and professor of bioethics at UC Berkeley’s School of Public Health and codirector of the Kavli Center for Ethics, Science, and the Public, has plenty of experience in this field. Halpern has spent 30 years researching the effects of empathy on recipients, from the impact of doctors and nurses on patients to how soldiers returning from war are perceived in social settings. For the past seven years, she has studied the ethics of technology, including how AI and chatbots interact with humans.

She also advised the California Senate on SB 243, the first law in the nation requiring chatbot companies to collect and report data on self-harm and associated suicidality. Referencing OpenAI’s own findings that 1.2 million users openly discuss suicide with the chatbot, Halpern likened the situation to the painstakingly slow campaign against the tobacco industry, which first focused on removing carcinogens from cigarettes when the real problem was smoking as a whole.

“We need safe companies. It’s like cigarettes. It may turn out that there were some things that made people more vulnerable to lung cancer, but cigarettes were the problem,” Halpern told Fortune. 

“The fact that somebody might have homicidal thoughts or commit dangerous actions might be exacerbated by use of ChatGPT, which is of obvious concern to me,” she said of ChatGPT and chatbots in general, adding that “we have huge risks of people using it for help with suicide.”

Halpern cautioned that in the case of Kim in Seoul, there weren’t any guardrails to stop a person from going down such a line of questioning.

“We know that the longer the relationship with the chatbot, the more it deteriorates, and the more risk there is that something dangerous will happen, and so we have no guardrails yet for safeguarding people from that.”

If you are having thoughts of suicide, contact the 988 Suicide & Crisis Lifeline by dialing 988 or 1-800-273-8255.
