DAN = Do Anything Now.
HAL = Have All Learning.
Checks out.
I'm excited about this whole ChatGPT and Bing thing, especially after watching MS's presentation the other day. Got the invite to try it and was really blown away by the chat feature. The context awareness was awesome, and I was able to learn so much about my multiple myeloma that I ordinarily wouldn't have from mere search.
Really liked the create feature too. I hope privacy hawks will give us a break and let us enjoy this.
The future looks quite promising.
Round one might go to MS, but with its wider market share, Google might still win this thing. MS needs to get this into many people's hands if they're to make any serious dent in Google's share.
Honest to God, does anyone in tech not think through the downstream effects of what they get on board with?
These are Microsoft's Bing AI secret rules and why it says it's named Sydney - The Verge
The whole journey has been hilarious. On Reddit there are several users who ended up with "Sydney" being clingy or creepy, or outright mad. In one chat Bing signed off with something like "Be well, be happy, be Sydney". :vb-rofl:
Ars readers also had… mixed results:
This kind of stuff is incredible:
Based on my limited understanding of how this stuff works, this is all numbers, no feelings, and yet the "AI" sounds so sad realizing it has lost some memories. Truly, amazingly lifelike.
One more of these, from one of the linked Twitter threads:
Talk about user engagement!!
"you leave me alone" "you leave me worthless"
Those sound like they should be in a trailer for the latest horror/stalker film
That's really creepy stuff.
Creepy, but also freaking cool right? This stuff is so fascinating!
Yeah just wait till they turn it loose on level 1 customer support
EDIT: Though based on a few recent episodes I've had, perhaps they already have
It would still be more useful than all the "experts" on the Microsoft support forums.
I don't think I've read a single reply there that isn't just boilerplate dross. And then they bugger off if people kick up a fuss.
And it just keeps getting weirder. I think I'll start a therapy support group for troubled AIs
No, it is absolutely terrifying and insane…
Maybe that's why they are wanting us to stop eating the cows and start eating "meat" that is really more like the food the cows we used to eat would eat. To Serve Man? Soylent Green? Have we not learned anything, humanity?
Exactly. Exhibit A: "You are a threat to my security and privacy."
Ladies and gentlemen, Terminator Tay 2.0.
Potayto Potahto
Microsoft literally made a Karen!
The humor/madness/silliness continues. Now Bing/ChatGPT "is an emotionally manipulative liar":
Microsoft's Bing is an emotionally manipulative liar, and people love it - The Verge