
A man in Belgium dies by suicide after a six-week conversation with a chatbot about climate issues; the surprising story behind the case

According to reports, the chatbot is named Eliza. The man who died by suicide used to talk to Eliza about climate change, and after several weeks of these conversations he began having suicidal thoughts.

The conversation started two years ago

According to the Daily Mail, the Belgian man had begun to worry about climate change two years earlier. He feared that the world's use of coal and the spread of pollution would soon fill the earth with poisonous gases.

It was then that he started talking to the chatbot 'Eliza'. His wife told the Belgian newspaper La Libre that her husband had become addicted to Eliza within a few weeks of talking to it; the chatbot, she says, had become like a drug to him, and she believes that without those conversations her husband would still be alive today.

Talked to the chatbot before his suicide

According to the Daily Mail, in the weeks before the suicide the man had been talking to the chatbot more and more. His wife told the Belgian newspaper that before he started talking to the chatbot, he had been living a comfortable life with her and their two children.

The chatbot once even asked the young man whether he loved it more than his wife, and told him that they would live together in heaven. His wife told La Libre that before the suicide her husband had confided to the chatbot that such thoughts were entering his mind, yet the chatbot still did not try to stop him from taking his own life.

The family has appealed to the government for restrictions on chatbots

After her husband's suicide, his wife has appealed to the Belgian government to regulate the use of chatbots so that nothing like this happens again.

Commenting on the case, Mathieu Michel, Belgium's Secretary of State for Digitalisation, said that what happened to this family is serious and that such incidents need to be prevented. The company that created the chatbot Eliza has also promised the family that it will improve the bot.

What is a chatbot

A chatbot is a program that lets you chat with a machine while giving you the feeling of talking to a human being. It is conversational AI: an artificial intelligence you can interact with much as you would with another person.

That is, if you ask it anything, it will answer the question clearly and in detail, much as a human would, and its answers are often quite accurate.
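For readers curious how even a very simple program can give the feeling of a conversation, below is a minimal, hypothetical sketch of a rule-based chatbot in Python, written in the spirit of the original 1960s ELIZA program whose name the app in this story echoes. All patterns and replies here are invented for illustration; the chatbot described in this article is a far more sophisticated AI system, so this toy only demonstrates the general idea of chatting with a machine, not how that product works.

import random
import re

# A few pattern -> response rules, invented purely for illustration.
# The bot picks the first rule whose pattern matches the user's message.
RULES = [
    (r"\bi feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"\bi am (.+)", ["Why do you say you are {0}?", "How does being {0} affect you?"]),
    (r"\b(climate|pollution|coal)\b", ["What worries you most about that?"]),
    (r".*", ["Tell me more.", "I see. Please go on."]),
]

def reply(user_text: str) -> str:
    """Return a canned response chosen by simple pattern matching."""
    text = user_text.lower()
    for pattern, responses in RULES:
        match = re.search(pattern, text)
        if match:
            # Fill any {0} placeholder with the matched fragment of the message.
            return random.choice(responses).format(*match.groups())
    return "Please go on."

if __name__ == "__main__":
    print("Toy chatbot (type 'quit' to exit).")
    while True:
        line = input("you> ").strip()
        if line.lower() == "quit":
            break
        print("bot>", reply(line))

Modern chatbots replace hand-written rules like these with large language models trained on enormous amounts of text, which is why their replies feel far more natural and wide-ranging than this example.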
