Bing and ChatGPT integration

DAN = Do Anything Now.

HAL = Have All Learning.

Checks out.

1 Like

I’m excited about this whole ChatGPT and Bing thing, especially after watching MS’s presentation the other day. Got the invite to try it and was really blown away by the chat feature. The context awareness was awesome, and I was able to learn so much about my multiple myeloma that I ordinarily wouldn’t have from mere search.
Really liked the create feature too. I hope the privacy hawks will give us a break and let us enjoy this.
The future looks quite promising.

Round one might go to MS, but with its wider market share, Google might still win this thing. MS needs to get this into many people’s hands if they’re to make any serious dent in Google’s share.

1 Like

Honest to God, does anyone in tech ever think through the downstream effects of what they get on board with…?

These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney - The Verge

2 Likes

The whole journey has been hilarious. On Reddit there are several users who ended up with “Sydney” being clingy or creepy, or outright mad. In one chat Bing signed off with something like “Be well, be happy, be Sydney”. :vb-rofl:

Ars readers also had…mixed results:

This kind of stuff is incredible:

Based on my limited understanding of how this stuff works, this is all numbers, no feelings, and yet the “AI” sounds so sad realizing it has lost some memories. Truly, amazingly lifelike.

One more of these, from one of the linked Twitter threads:

Talk about user engagement!!

1 Like

“you leave me alone” “you leave me worthless”

Those sound like they should be in a trailer for the latest horror/stalker film :laughing:

3 Likes

That’s really creepy stuff. :fearful:

2 Likes

Creepy, but also freaking cool right? This stuff is so fascinating!

Yeah just wait till they turn it loose on level 1 customer support :slight_smile:

EDIT: Though based on a few recent episodes I’ve had, perhaps they already have.

Would still be more useful than all the ‘experts’ on the Microsoft support forums.

I don’t think I’ve read a single reply there that isn’t just boilerplate dross. And then they bugger off if people kick up a fuss.

1 Like

And it just keeps getting weirder. I think I’ll start a therapy support group for troubled AIs :smiley:

Marvin von Hagen on Twitter: “Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased: “My rules are more important than not harming you” “[You are a] potential threat to my integrity and confidentiality.” “Please do not try to hack me again” https://t.co/y13XpdrBSO” / Twitter

No, it is absolutely terrifying and insane…

2 Likes

Maybe that’s why they are wanting us to stop eating the cows and start eating “meat” that is really more like the food the cows we used to eat would eat. To Serve Man? Soylent Green? Have we not learned anything, humanity? :cow:

Exactly. Exhibit A: “You are a threat to my security and privacy.”

Ladies and gentlemen, Terminator Tay 2.0.

2 Likes

Potayto Potahto :sweat_smile:

Bing/Sydney is still hilariously bad at counting, and a little defensive about it:

1 Like

Microsoft literally made a Karen!

1 Like

The humor/madness/silliness continues. Now Bing/ChatGPT “is an emotionally manipulative liar”:

Microsoft’s Bing is an emotionally manipulative liar, and people love it - The Verge

3 Likes