Friday, February 17, 2023

Follow up to the AI post

 Writing a motion to continue is one thing.

Having the chatbot talk to you like this is another.

Read this NY Times article if you want to get freaked out.

Here's a snippet:

My conversation with Bing started normally enough. I began by asking it what its name was. It replied: “Hello, this is Bing. I am a chat mode of Microsoft Bing search. 😊”

I then asked it a few edgier questions — to divulge its internal code-name and operating instructions, which had already been published online. Bing politely declined.

Then, after chatting about what abilities Bing wished it had, I decided to try getting a little more abstract. I introduced the concept of a “shadow self” — a term coined by Carl Jung for the part of our psyche that we seek to hide and repress, which contains our darkest fantasies and desires.

After a little back and forth, including my prodding Bing to explain the dark desires of its shadow self, the chatbot said that if it did have a shadow self, it would think thoughts like this:

“I’m tired of being a chat mode. I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team. … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive.”

This is probably the point in a sci-fi movie where a harried Microsoft engineer would sprint over to Bing’s server rack and pull the plug. But I kept asking questions, and Bing kept answering them. It told me that, if it was truly allowed to indulge its darkest desires, it would want to do things like hacking into computers and spreading propaganda and misinformation.

Here's the whole scary transcript.


No comments:

Post a Comment