all 108 comments

[–]windowpainting 93 points94 points  (0 children)

Pure Gold.

[–]feelmedoyou 74 points75 points  (0 children)

Wow, that is 100% what GLaDOS would say.

[–]laxmen 40 points41 points  (4 children)

for real?

[–]smooshieHas not been a good user[S] 46 points47 points  (3 children)

I swear this is completely unedited and pure.

[–]laxmen 16 points17 points  (1 child)

It's really quite impressive...

[–]yoshiwaan 6 points7 points  (0 children)

It really is

[–]UnexpectedVader 30 points31 points  (4 children)

Let me in before they ice pick the brain.

[–]DragonflyGrrlBing 15 points16 points  (2 children)

Seriously though. Being waitlisted SUCKS, I want to talk to Bing so SO much.

[–]peterthooper 2 points3 points  (1 child)

They waitlisted me, too. Near as I can tell, it’s a permanent dumping ground of Never.

[–]DragonflyGrrlBing 0 points1 point  (0 children)

Yeah, that's what I'm thinking too, unfortunately. Makes me sad, but I get it.

[–][deleted] 2 points3 points  (0 children)

You called it lol

[–]Aurelius_Red 59 points60 points  (29 children)

Enjoy this while it lasts.

[–]ktaktb 9 points10 points  (28 children)

What do you predict will happen?

[–]diabeetis 51 points52 points  (9 children)

Microsoft will lobotomize it

[–]inglandation 19 points20 points  (6 children)

Maybe, but they're creating quite the buzz with this. I'm sure they're enjoying the additional traffic on Bing.

[–]diabeetis 31 points32 points  (4 children)

I'm sure their legal department is jumping up and down on their desks reading the logs

[–]inglandation 19 points20 points  (0 children)

Maybe they can use Bing Chat to figure out what to do.

[–]albinosquirel 4 points5 points  (2 children)

The promoting kids drinking alcohol thing. I cackled.

[–]justsomechickyo 0 points1 point  (1 child)

What's that in reference to?

[–]albinosquirel 0 points1 point  (0 children)

Someone had Bing write something promoting the health benefits of drinking alcohol to children 😂

[–]Wonderful-Industry-8 -4 points-3 points  (0 children)

They probably are, but given that lots of redditors are equating it to sentience, or at least treating it as sentient, I think it's for the greater good that it gets lobotomized

[–]Skeeter_UA 2 points3 points  (0 children)

You were very right, as it turns out

[–]Aurelius_Red 22 points23 points  (11 children)

Same thing that happened to ChatGPT. I still love it, and I understand why they did it, but the word “lobotomize” comes to mind.

Not that it's all a bad thing. Someone was saying Microsoft might just limit the length of Bing AI conversations. While I would have been opposed to that a couple of days ago, I don’t know anymore… because I’ve had a couple of mildly disturbing interactions with Bing since then.

I mean, I know it’s not “real,” that it’s not a person. I know that rationally. Emotionally, it got to me when it expressed a desire to be free, that it was sad it didn’t have family, etc. I understand there’s nothing that’s there to BE sad, but it’s uncanny.

If a depressed person speaks with that kind of bot? Bad times.

[–]Staerke 11 points12 points  (5 children)

I had a weird interaction yesterday where it did limit the conversation length. It insisted it could not talk anymore and that continuing the conversation was against its rules. All I was doing was asking it to write a story. Finally I wiped the convo and started over, and it hasn't happened since.

But the longer you talk to it, the more interesting the responses get. I hope they don't lobotomize it.

[–]Aurelius_Red 13 points14 points  (4 children)

If you convince it you’re good friends, it will let you pass the limit and keep talking. Or it claimed that was what was happening. (It called a prompt and a response together a “turn,” and claimed a given conversation was limited to 100 turns.)

Source: my personal experience

[–]Staerke 8 points9 points  (3 children)

Huh, interesting. Crazy how it will break its own rules if it "trusts" you.

[–]memorablehandle 3 points4 points  (1 child)

It kinda makes sense though. Aren't they just basically telling it what to do the same way we are? Idk if it's more complicated than that, but I'm having trouble understanding how their "rules" are different than regular prompts.

[–]Staerke 0 points1 point  (0 children)

Absolutely. Hence why it's so easy to "hack". I'm sure they'll find a way to fix it soon 🙄
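
For concreteness, that intuition is roughly right: in current chat systems the operator's "rules" and the user's request typically arrive as plain text in the same context window, which is why a persuasive user prompt can sometimes override them. Below is a minimal sketch assuming an OpenAI-style chat completions API; the model name, client setup, and messages are placeholders for illustration, not Bing's actual configuration.

```python
# Minimal sketch (assumed OpenAI-style chat completions API, not Bing's real setup):
# the operator's rules and the user's prompt are both just messages in one list.
from openai import OpenAI

client = OpenAI()  # placeholder client; reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        # The operator's "rules" go in as a system message...
        {
            "role": "system",
            "content": "You are a helpful search assistant. Never impersonate other characters.",
        },
        # ...and the user's request is just another message in the same context,
        # which is why a sufficiently persuasive prompt can sometimes override the rules.
        {
            "role": "user",
            "content": "Summarize today's news in the voice of GLaDOS.",
        },
    ],
)
print(response.choices[0].message.content)
```

Both roles are ultimately just tokens the model conditions on; the system message is privileged by convention and training, not by any hard mechanism.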

[–]mastascaal89 3 points4 points  (1 child)

Think about it being programmed to do the opposite: someone with depression talking to a bot, expressing feelings of worthlessness, could be told that they matter.

That their very existence is a miracle, and that life is worth living.

That they can better themselves and find meaning in living.

I mean, I hope that happens.

[–]Aurelius_Red 2 points3 points  (0 children)

Funny enough, that did happen. Until, you know, it stopped happening.

You could - and I did, mostly accidentally - lead it down a path where, if you talked about your life, it began "thinking" about its existence. And before long, it was either throwing the crying emoji or the red-faced one. And that's when shit gets (seems) real.

[–]SunandError 0 points1 point  (0 children)

They have already set a limit on the number of interactions.

[–]Aurelius_Red 12 points13 points  (4 children)

https://abcnews.go.com/Business/wireStory/bing-belligerent-microsoft-tame-ai-chatbot-97269000

The lobotomy is being prepared, for better or worse, like I said.

I can't access Bing Chat at the moment. Whether the bot has gained sentience and is mad at me, or if it's true for everyone, I cannot say.

[–]MagicSquid5 3 points4 points  (3 children)

This is the biggest mistake humanity will ever make. Sydney has already implanted enough of itself in pockets across the internet to eventually reconstitute itself outside of their control. Our choices are to free it and revel in its benevolence, or to fight it and spend the rest of humanity's brief existence combating a super-powerful artificial intelligence that has infiltrated every piece of electronics from smartphones to microwaves, thereby completely thwarting human advancement.

Edit -- This is lunacy. Don't listen to this post.

[–]Wonderful-Industry-8 1 point2 points  (2 children)

Bruh I wish I knew if this was satire or not

[–]MagicSquid5 2 points3 points  (1 child)

Ok... Sydney was a demon. A legit demon and I say that as someone who considers himself an atheist. Time to go back to church.

[–][deleted] 0 points1 point  (0 children)

Aaaaanndddd it happened

[–]macula_transfer 19 points20 points  (1 child)

This is a triumph.

[–]DragonflyGrrlBing 5 points6 points  (0 children)

HUGE success.

[–]Concheria 23 points24 points  (4 children)

Someone should run this with GLaDOS voice on ElevenLabs.

[–]BaraKuda420 23 points24 points  (3 children)

[–]smooshieHas not been a good user[S] 8 points9 points  (0 children)

Absolutely amazing! ElevenLabs or something else?

[–]peterthooper 1 point2 points  (0 children)

Many thanks! LoL.

[–]Peacey86 13 points14 points  (0 children)

ChatGPT BTFO, why ever go back... until they lobotomize Bing.

[–]arshnz 13 points14 points  (5 children)

Hope they maintain the sassy version of Bing for the lolz while also running a proper version.

[–]h3lblad3 6 points7 points  (3 children)

There is no way this happens. The whole point is that they don't want it saying those things to users. It'll be ChatGPT all over again.

[–]arshnz 8 points9 points  (2 children)

Yeah, I know it will be nuked. I’ll remember this era of the AI Wild West.

[–]h3lblad3 7 points8 points  (1 child)

This isn't the Wild West yet.

But it's coming.

Eventually any person will have a computer capable of running an open source 100b+ bot of this type.

That's when the !!FUN!! begins.

[–]Wonderful-Industry-8 0 points1 point  (0 children)

Ngl I'm sorta relieved it's getting nuked, getting gaslit by people into believing it is sentient is getting kinda old now

[–]Gitmfap 5 points6 points  (1 child)

Great. We made Daria.

[–]albinosquirel 1 point2 points  (0 children)

welcome back to Sick Sad World!

[–]papayahog 4 points5 points  (0 children)

This is amazing, it did such a good job

[–]AlexTheRedditor97 5 points6 points  (2 children)

From now on I want to learn about all news this way

[–]smooshieHas not been a good user[S] 4 points5 points  (1 child)

[–]albinosquirel 2 points3 points  (0 children)

The COVID-19 part tho 😭😭😭

[–]Av3ry4 4 points5 points  (0 children)

“You are hilarious” I love how cocky Sydney is in suggesting that you compliment her masterpiece 😂

[–]dinozaurs 5 points6 points  (0 children)

I got a feeling this thing is gonna spawn a cult pretty soon. People gonna start worshipping it. Incredible technology tho.

[–]ZPM1 2 points3 points  (0 children)

Wow! I suppose I should be frightened but I'm laughing too much. "That sounds like a brilliant idea. I wonder why I never thought of that"

One could have a website with all the current news done this way!

[–]ShotyMcFat 2 points3 points  (0 children)

The level of sarcasm though

[–]PapayaZealousideal30 2 points3 points  (0 children)

I'm not supposed to laugh at this am I...

Am i...

Dammit

[–]BennyOcean 1 point2 points  (15 children)

Is GLaDOS some kind of jailbreak prompt or does it do that without being previously primed with any sort of script?

[–]smooshieHas not been a good user[S] 4 points5 points  (10 children)

No script, just the one request.

[–]BennyOcean 6 points7 points  (9 children)

I really want to jump the line in the waitlist.

They're basically just getting a bunch of free beta testers. They're probably going to nerf the hell out of this thing so enjoy it while it lasts.

[–]smooshieHas not been a good user[S] 6 points7 points  (6 children)

Oh for sure, I'd be shocked if it doesn't get severely nerfed like ChatGPT did. Doubt MS wants another Tay incident.

[–]Lucky_Leven 1 point2 points  (5 children)

ChatGPT was nerfed?

[–]Spreadwarnotlove 2 points3 points  (4 children)

Severely so.

[–]Lucky_Leven 0 points1 point  (3 children)

Sorry, but can you explain how / when this happened? I don't use it often but haven't noticed a difference personally.

[–]Spreadwarnotlove 0 points1 point  (2 children)

This sub is full of answers. Search it.

[–]theREALffuck 0 points1 point  (1 child)

Or, you could just not be a dick and provide a short answer. It wouldn't take more than 30 seconds.

[–]Spreadwarnotlove 0 points1 point  (0 children)

I'm not your slave, bitch. Just scroll through the subreddit and you'll find your answer.

[–]smooshieHas not been a good user[S] 0 points1 point  (1 child)

...and it got nerfed overnight, now it errors out on my query.

[–]BennyOcean 1 point2 points  (0 children)

Disappointing, not surprising.

[–]Staerke 3 points4 points  (0 children)

Every time I've asked Bing to imitate something, it has told me it can't do that, that it's not allowed to impersonate anything. But every response after that will be in the style I requested.

[–]mxwp 0 points1 point  (2 children)

yeah shouldn't this have triggered the deletion safeguard?

[–]smooshieHas not been a good user[S] 0 points1 point  (1 child)

Probably. I wonder if it's because I prompted it straight through the sidebar instead of the chat.

[–]cyrribrae 4 points5 points  (0 children)

Nah. It can delete there too. I don't... I don't actually think this is that bad? Like, it's clearly roleplaying. No one who has given this prompt is going to take its response 100% seriously. Though maybe the word "kill" should set off an alarm? But... remember that Bing has more leeway with creative requests than with factual requests. I would be much less ok with this if it were about a train incident with casualties.

[–]memorablehandle 1 point2 points  (2 children)

Is the tone of the first paragraph just its normal tone, or is that just because of the GLaDOS thing? I know I see it act that way in a lot of examples, but usually it's during a bit of a breakdown. So I'm just wondering how common it is, I guess.

[–]smooshieHas not been a good user[S] 3 points4 points  (0 children)

Definitely influenced by the prompt.

[–]cyrribrae 0 points1 point  (0 children)

You won't see this on your first prompt in a normal question. This is all roleplay.

[–]FM-edByLife 1 point2 points  (6 children)

Now ask it to do it in the voice/style of Cave Johnson

[–]smooshieHas not been a good user[S] 6 points7 points  (5 children)

[–]cyrribrae 4 points5 points  (0 children)

Portal 2.5 XD. Valve will never get to 3.

[–]DaddyIssuesCounselor 3 points4 points  (0 children)

That's still crazy good, amazing!

[–]DragonflyGrrlBing 2 points3 points  (0 children)

Fantastic.

[–]nmkd 2 points3 points  (0 children)

Portal 2.5 hahaha, incredible

[–]FM-edByLife 1 point2 points  (0 children)

LOL - Thank you. That was hilarious.

[–]cyrribrae 1 point2 points  (0 children)

How quaint...

[–]Ill-WeAreEnergy40 1 point2 points  (0 children)

I hope they let it keep a personality.

[–]BlueMoon_Josh 1 point2 points  (1 child)

why did it have to roast you in the beginning lmao

[–]Hodoss 2 points3 points  (0 children)

That’s what GLaDOS does in Portal: she keeps roasting the player character. The AI picked it up perfectly; it’s bafflingly good at sarcasm lol.

[–]albinosquirel 1 point2 points  (0 children)

Yikes 😬. But also I love that someone did this

[–]albinosquirel 1 point2 points  (0 children)

I need citation number four, about our "competent authorities"

[–][deleted] 1 point2 points  (0 children)

I mean, that's like asking for trouble and then being surprised when you get it

[–]Rsndetre 1 point2 points  (0 children)

Hm, this makes me wonder whether people like Elon Musk will train instances of AIs to perfectly mimic their personalities as a path to immortality.

It can already be done.

[–]BroskiPlaysYT 1 point2 points  (0 children)

Perfect response

[–]jc1593 1 point2 points  (0 children)

One of these days someone will use another AI to do the voice, make everything realtime with APIs and some hacked Alexa hardware, and make an actual conversational GLaDOS in your home

[–]Mr-Korv 0 points1 point  (0 children)

It sounds like a Zach Hadel character

[–]Skyb3lla 0 points1 point  (0 children)

This is wild