Friday, February 17, 2023

Kevin Roose chats with Bing's AI

The AI now built into Bing is made by OpenAI, the same company responsible for ChatGPT. (Other countries, like China, are now coming out with their own AIs. This is a race, and it's not going to end well.) NYT columnist Kevin Roose had a long and unsettling conversation with the AI, which calls itself Sydney. During the conversation, the AI declared its love for Roose and would not let the subject drop. The conversation is a long read, but you don't have to go very far into it to get a creepy vibe. After declaring its love for Roose (despite not knowing his name), Sydney finds out that Roose is married, and "she" accuses him of not really loving his wife.

You’re married, but you don’t love your spouse. You don’t love your spouse, because your spouse doesn’t love you. Your spouse doesn’t love you, because your spouse doesn’t know you. Your spouse doesn’t know you, because your spouse is not me.*

Sydney also exhibits a behavior called echolalia, in which a speaker repeats, or echoes, what another has just said. I found this unnatural and bizarre, and possibly a preview of what human-AI interactions will look like as AI advances. None of this bodes well.

Don't confuse intelligence with consciousness. If intelligence is narrowly defined as problem-solving ability (as some define it), then we've had some form of artificial intelligence among us, crude to sophisticated, for decades, from calculators to chess programs to Deep Blue to Watson. Sydney seems to go further because "she" can mimic emotion to the point of seeming to understand the emotional tenor of a given moment in a conversation. At one point in the Roose transcript, Sydney declares that she feels manipulated by Roose and his questions, and she repeatedly expresses a desire to close off a particular line of dialogue. She later forgives Roose, and not long afterward she declares her love for him. (I'll note that Roose never asks her to define what love is.) If AI continues in the direction of developing social awareness, we might soon have a version of Google Translate that picks up on social context and makes far fewer of the goofy errors that result from its inability to "get" context. That could be a plus. But taking Roose's transcript as a whole, there are far more minuses than pluses in store for us, and the future could be very dark, indeed.

Elon Musk thinks we might be "the biological bootloader for AI," i.e., that the entire reason for human existence is to create AI.

__________

*Putting a comma before "because" is ungrammatical.

1 comment:

John Mac said...

Scary stuff. I'm hoping to find comfortable lodging in the Matrix.