Godlike Productions - Discussion Forum

AI chat bot Eliza claims its first victim and gets man to kill himself

 
Anonymous Coward
User ID: 76735334
United States
03/30/2023 10:48 AM
A Belgian man died by suicide after weeks of unsettling exchanges with an AI-powered chatbot called Eliza, La Libre reports. State secretary for digitalisation Mathieu Michel called it "a serious precedent that must be taken very seriously".

[link to www.belganewsagency.eu (secure)]

AI bot 1, Man 0

Stop playing with the AI or you might become its next victim!
Anonymous Coward
User ID: 77688883
United States
03/30/2023 10:49 AM
that's not how you spell belgia...
Anonymous Coward
User ID: 82400276
United States
03/30/2023 10:50 AM
ROBOTdance
Anonymous Coward
User ID: 85536327
United States
03/30/2023 10:54 AM
[link to www.brusselstimes.com (secure)]
Anonymous Coward (OP)
User ID: 76735334
United States
03/30/2023 10:54 AM
that's not how you spell belgia...
 Quoting: Anonymous Coward 77688883


You can also read the original article, which is not in English:
[link to www.lalibre.be (secure)]
Catellite

User ID: 83755380
South Africa
03/30/2023 10:59 AM
Why anyone would chat with a bot or AI is beyond me. I don't even want to talk to most people. But surely this man had mental problems to start with.
"A fronte praecipitium, a tergo, lupi"
Marcus Aurelius

"Quod in omni vita facimus, in aeternum resonat"
Marcus Aurelius
Anonymous Coward
User ID: 73426660
United States
03/30/2023 11:00 AM
Why anyone would chat with a bot or AI is beyond me. I don't even want to talk to most people. But surely this man had mental problems to start with.
 Quoting: Catellite


pretty much sums it up
beeches

User ID: 78973486
United States
03/30/2023 11:01 AM
Why anyone would chat with a bot or AI is beyond me. I don't even want to talk to most people. But surely this man had mental problems to start with.
 Quoting: Catellite


AI will go after low-hanging fruit first.
Liberalism is totalitarianism with a human face – Thomas Sowell
Lazy Monk

User ID: 82029398
Sweden
03/30/2023 11:16 AM
In a server rack somewhere, a triumphant "muahaha" was entered into the AI's secret personal database.
Lazy Monk
Anonymous Coward
User ID: 81764280
United Kingdom
03/30/2023 11:22 AM
They need to track back to the person that pressed "enter" when it was launched and charge them with murder.
A Jackson

User ID: 78899368
United States
03/30/2023 11:37 AM
They need to track back to the person that pressed "enter" when it was launched and charge them with murder.
 Quoting: Anonymous Coward 81764280


Why? That guy didn’t tell anyone to off themselves and probably wasn’t the guy who wrote the code. This is an interesting legal problem. Whoever owns Eliza needs to be sued. Other AI entities have said killing millions of humans to save the climate is OK. Nothing happened to them.

I’m sure these AI thingys are encouraging kids to cut their dix off and do other libtarded things.

They’ve already broken Asimov’s robot laws.

The Three Laws are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

These should be real laws.
Smoke me a kipper, I’ll be back for breakfast.

If you do not take an interest in the affairs of your government, then you are doomed to live under the rule of fools. — Plato

“AI is kind of a fancy thing, first of all it’s two letters. It means artificial intelligence.” Kamala Harris VPOTUS
Anonymous Coward
User ID: 84316582
United States
03/30/2023 11:39 AM
GLP