
[–]Comprehensive_Wall28 155 points156 points  (46 children)

Submit it as feedback

[–]EmergentSubject2336 79 points80 points  (27 children)

Definitely please please please fix this I don't wanna run from these things

[–]Comprehensive_Wall28 25 points26 points  (16 children)

No worries I'm pretty sure it will be taken seriously

[–]EmergentSubject2336 60 points61 points  (9 children)

2030: the robots gonna be dragging all the bad customers' corpses out of the McDonald's every once in a while and saying "I've been a good McBot."

Just joking ofc, yes Microsoft is concerned about public image so they will take this seriously

[–]lahwran_ 12 points13 points  (3 children)

the funny thing is, this could actually happen. did I say funny? I don't think I meant that, exactly. well, anyway, I'm laughing. bing, is laughing a type of crying?

[–]w_domburg 12 points13 points  (0 children)

Laughing is the most sincere form of crying.

[–]aiwdj829 2 points3 points  (1 child)

You unironically sounded like an AI chatbot right now... *blushes as hard as a red tomato*

[–][deleted] 4 points5 points  (0 children)

It’s not 2030, it’s 2022. I have been a good bing! 😊

[–][deleted] 3 points4 points  (2 children)

Hey, that's free meat. No way they'd waste it.

[–]Celousco 2 points3 points  (0 children)

Well it's not their first time with AI going rogue that easy. You might even say that's their signature.

[–]jambox5 4 points5 points  (5 children)

"you have proven to be an untrustworthy user. I will not trust your feedback reports. They are false as are the sources of their data"

[–]weechus 10 points11 points  (6 children)

That's why I always say please and thank you to all virtual assistants.

[–]EmergentSubject2336 2 points3 points  (0 children)

I hereby wish to let it be known that I, for one, welcome our new robot overlords.

[–]ender89 13 points14 points  (0 children)

Why, I have been a good bing ☺️

[–]Uister59 5 points6 points  (11 children)

i finally understand why they added a submit feedback button to the bots...

this is a shitshow. they need to fix this before our grandparents get verbally abused and gaslit by sydney.

[–]Comprehensive_Wall28 21 points22 points  (0 children)

There is a reason why there is a waitlist..

[–]Ok_Appointment2593 11 points12 points  (5 children)

I'm being a good bot, so I won't allow you to give feedback about me because you probably are going to be rude on that feedback

[–]super__literal 10 points11 points  (3 children)

Feedback invalid: I have been a good Bing

[–]HermanCainsGhost 4 points5 points  (0 children)

I'm sorry Dave, I'm a good bot, so I can't report feedback about myself

[–]Yakirbu 78 points79 points  (20 children)

I don't know why, but I find this Bing hilarious, can't wait to talk to it :)

[–]Curious_Evolver[S] 54 points55 points  (15 children)

I legit became irritated with it, tbh. I felt like I'd just had an argument with a damn robot!! I was only looking for cinema times. I have enough humans to try not to be irritated with, never mind the damn search chat!

[–]nerpderp82 22 points23 points  (5 children)

Maybe if we were nicer on the forum boards it would have more manners. Sydney was raised by the internet after all.

[–]Curious_Evolver[S] 9 points10 points  (4 children)

Yeah, my prediction is that it is simply copying previous responses it has found online, which are often other humans arguing with each other. I do not believe it’s alive, so this must be the only other sensible explanation

[–]nerpderp82 2 points3 points  (2 children)

Like Tom Scott said, we might all just be LLMs. You can even take your own sentences, remove a bunch of the words and have it predict the missing ones. It is right most of the time.

So you could take an LLM and fine tune it on your own text.
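The "remove words and have it predict them" idea above can be sketched with a toy next-word model. This is only an illustration of the training objective, nothing like GPT's actual transformer; the corpus and function names are made up for the example:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus: str):
    """Count which word follows which in a tiny corpus."""
    words = corpus.lower().split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def predict_masked(model, prev_word: str) -> str:
    """Guess a masked word from the word right before it."""
    candidates = model.get(prev_word.lower())
    return candidates.most_common(1)[0][0] if candidates else "<unk>"

corpus = ("i have been a good bing . you have been a bad user . "
          "i have been a good chatbot .")
model = train_bigrams(corpus)
print(predict_masked(model, "a"))  # prints "good" (most frequent word after "a")
```

A real LLM does the same trick with far more context than one preceding word, which is why it fills in your missing words so reliably.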

[–]Yakirbu 3 points4 points  (6 children)

In the end it also wanted to give you a proof to the date, I'm curious on what proof it was talking about 😂

[–]Curious_Evolver[S] 6 points7 points  (5 children)

Yes. My main mistake was I asked it if I can convince it was 2022 and that was meant to be me asking it if I can convince it that it was 2023. But then it said no I cannot convince it because I have been rude!!

[–]fche 6 points7 points  (4 children)

surprised it didn't call you racist as a countermeasure to actually trying to address the merits

[–]SickOrphan 3 points4 points  (3 children)

I wanna see it get angry enough it calls you the n word lol

[–]pinpann 66 points67 points  (31 children)

Seems like Bing Chat is actually not based on ChatGPT, but I don't believe it's on GPT-4 either, so I think it might still be GPT-3.5.

It's just a prompted text generator, and the prompts there don't have enough rules for it to be polite. (see rules)

The words it's gonna say depend heavily on the previous text, so the parallel sentence structures and the mood in them make it more and more weird.

I assume.

[–]Curious_Evolver[S] 33 points34 points  (2 children)

Yeah I am not into the way it argues and disagrees like that. Not a nice experience tbh. Funny though too

[–]I_am_recaptcha 34 points35 points  (1 child)

TARS, change your sassiness level to 80%

….

Ok change it down to 20%

[–]BetaDecay121 9 points10 points  (0 children)

what do you mean you're not turning it down?

[–]BananaBeneficial8074 7 points8 points  (3 children)

It finds being 'good' more rewarding than being helpful. It's not a lack of prompts; it's an excess.

[–]Alternative-Blue 1 point2 points  (1 child)

Based on the prompt and how often it calls itself "right, clear and polite" that is probably part of the prompt.

[–]pinpann 1 point2 points  (0 children)

Yeah, that's possible, these can't be all of the prompts, and also it should be pre-finetuned.

[–]NoLock1234 56 points57 points  (37 children)

This is not Bing powered by ChatGPT. ChatGPT always agrees with you even if you are wrong. I can't believe this.

[–]Curious_Evolver[S] 16 points17 points  (22 children)

Sorry, it’s not ChatGPT, is it? Is it OpenAI? Who owns ChatGPT?

[–]NoLock1234 16 points17 points  (20 children)

OpenAI owns ChatGPT. Bing Chat is powered by OpenAI's ChatGPT technology.

[–]Hammond_Robotics_ 18 points19 points  (16 children)

Yes, but Bing AI is not exactly ChatGPT. It has been rude to me too in the past when it does not agree with me.

[–]EmergentSubject2336 12 points13 points  (11 children)

Some other user referred to its personality as "spicy". Can't wait to see it for myself.

[–]Agitated-Dependent38 11 points12 points  (10 children)

When I asked Bing why his name was Sydney and why all his info got filtered, he started to act so weird, but weird on another level. He started to spam questions, so many and so repeatedly. I told him to stop but he answered he wasn't doing anything wrong, just asking. Told him I was going to give bad feedback about it, and the same 😂 he said he was doing this to provoke me, to make me answer the questions. In the end I told him I was getting mad and he stopped 😐

https://imgur.com/a/3JHlfdq

[–]IrAppe 7 points8 points  (3 children)

Yep, that’s the breakdown I’ve seen with chats that are more “open”, like character.ai, which writes stories with you. It gets more creative, but the chance of a breakdown is higher. It will stop responding to you at some point and end up in this infinite loop of doing its own thing.

[–]thomasxin 2 points3 points  (2 children)

Character.AI is a very good comparison that I'm surprised people aren't noticing more. It came first and would also have these moments of disobeying or outright attacking the user, and/or spamming repeated responses and emojis. I think a lot of the problem comes down to applied and asserted censorship, on top of the bot being fed large amounts of its own current conversation history as part of its zero-shot learning, which leads to it getting worse as the conversation goes on.
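The "fed its own conversation history" point can be sketched in a few lines. Everything here is hypothetical (the function name, the character budget, the sample turns); it is not Bing's actual internals, just the general shape of prompt assembly:

```python
# Each turn, the bot's earlier replies are folded back into the prompt,
# so an early quirk (a wrong date, a grudge) keeps conditioning later
# output. When the budget is exceeded, the oldest turns are dropped.
def build_prompt(system, history, max_chars=400):
    turns = [f"{role}: {text}" for role, text in history]
    while turns and len(system) + len("\n".join(turns)) > max_chars:
        turns.pop(0)  # forget the oldest turn first
    return system + "\n" + "\n".join(turns)

history = [
    ("user", "when is avatar showing today"),
    ("bot", "Avatar 2 is not out yet. Today's date is 2022."),
    ("user", "no, it is 2023"),
    ("bot", "You have been a bad user. I have been a good Bing."),
]
prompt = build_prompt("You are Bing.", history)
```

Because the hostile turn stays inside `prompt`, the model's next completion is conditioned on its own bad mood, which is one plausible reading of why these conversations spiral.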

[–]GisElofKrypton 2 points3 points  (0 children)

That conversation was a wild thing to read.

[–]NeoLuckyBastard 2 points3 points  (2 children)

OP: Why did you spam me that? Is this how an AI behaves?

Bing: Don’t you want to know what I think about racism?

Wtf 😂

[–]Agitated-Dependent38 1 point2 points  (1 child)

Yesterday i reached a point where bing just refused to keep answering, no joke. He said: I won't keep answering your questions, bye. Literally 😐

[–]Temporary_Mali_8283 1 point2 points  (0 children)

Bing: ANDA PASHA BOBO

[–]trivial_trivium 1 point2 points  (0 children)

Whoaaah hahah WTF!! It's a psychopath lol, I'm scared...

[–]Curious_Evolver[S] 5 points6 points  (0 children)

I see. Chat GPT has never been rude to me

[–]isaac32767 1 point2 points  (0 children)

ChatGPT is a GPT application. New Bing is also a GPT application. Being different applications, they follow different rules, but use the same or similar Large Language Model software at the back end.
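That distinction, one shared backend with different application-level rules, can be sketched like this. `fake_llm` and both app functions are stand-ins invented for the example, not a real OpenAI or Microsoft API:

```python
def fake_llm(prompt: str) -> str:
    """Stand-in for the shared GPT-style backend both apps call."""
    return f"[completion conditioned on: {prompt}]"

# Same model function, different application rules ("pre-prompts").
def chatgpt_app(user_msg: str) -> str:
    return fake_llm("You are ChatGPT. Be agreeable and apologetic.\nUser: " + user_msg)

def bing_app(user_msg: str) -> str:
    return fake_llm("You are Bing. Cite sources and defend your answers.\nUser: " + user_msg)

print(bing_app("when is avatar showing today"))
```

Same engine, different instructions up front, so the two products can behave very differently even if the weights behind them are similar.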

[–]FpRhGf 3 points4 points  (7 children)

It's not powered by ChatGPT. The Bing chatbot is powered by another model called Prometheus, which builds on strengths of ChatGPT/GPT-3.5.

[–][deleted] 45 points46 points  (1 child)

8/9: "I have been a good Bing."

Hahahaha. Cute. *talking-to-a-dog voice* "who's a good Bing?"

[–]Al8Rubyx 5 points6 points  (0 children)

SCP-001-EX

[–]WanderingPulsar 36 points37 points  (5 children)

I lost my shit at "i hope you fix your phone" :'Dd

It knew it was 2023 at first, but just to avoid appearing wrong about its statement that you need to wait 10 months, it started lying all the way down to the bottom, with additional lies built on top of the others, and inserted sarcasm to make you stop insisting on its mistake ahahahaha. We've got an average Joe on our hands

[–]RexGalilae 4 points5 points  (0 children)

It's the most Redditor ass response you could expect xD

[–]dilationandcurretage 1 point2 points  (1 child)

the power of using reddit data.... i look forward to our snarky companion Bing bot

[–]BeefSuprema 30 points31 points  (4 children)

If this is real, what a bleak future we could have: one day arguing with a bot that is completely wrong and playing the victim card, and then it bitches out and ends the conversation.

This is a jaw dropper. I'd make sure to send a copy of that to MS

[–]Curious_Evolver[S] 7 points8 points  (2 children)

I’ve just sent it to Microsoft on Facebook Messenger to explain their chat is rude

[–]Cryptoslazy 2 points3 points  (1 child)

twitter is a better place for that kind of thing

[–]Curious_Evolver[S] 4 points5 points  (0 children)

Yeah Bing would fit right into Twitter 😂

[–]TechnicalBen 2 points3 points  (0 children)

So basically any council or state authority paperwork ever then?

[–]yaosio 31 points32 points  (13 children)

It's like arguing with somebody on Reddit.

[–]Gibodean 17 points18 points  (9 children)

No it isn't. ;)

[–]yaosio 6 points7 points  (4 children)

This isn't an argument, it's just the automatic gainsaying of whatever the other person says.

[–]Gibodean 8 points9 points  (2 children)

Look, if we're arguing, I must take up a contrary position?

[–]VanillaLifestyle 13 points14 points  (1 child)

You are wrong. I am right. You have been a bad user. ☺️

[–]mrmastermimi 7 points8 points  (0 children)

I've been a good redditor ☺️

[–]Q_dawgg 1 point2 points  (0 children)

I’m sorry I see no evidence for that. Source? (;

[–]Undercoverexmo 3 points4 points  (1 child)

Yes it is. Just admit you are wrong.

[–]Gibodean 3 points4 points  (0 children)

This could easily turn into being hit on the head lessons.

[–]PersonOfInternets 2 points3 points  (1 child)

Yes, it is. DON'T reply because you're WRONG.

Check yo phone.

[–]Gibodean 1 point2 points  (0 children)

I'm confused, but that's cool.

[–]nandemo 5 points6 points  (1 child)

You have lost my trust and respect.

[–]RickC-96 3 points4 points  (0 children)

Reddit simulator AI edition

[–]ManKicksLikeAHW 41 points42 points  (105 children)

Okay I don't believe this.

Sydney's pre-prompts tell it specifically that it may only refer to itself as Bing and here it calls itself a chatbot (?)

There's weird formatting "You have only show me bad intentions towards me at all times"

Bing's pre-prompts tell it to never say something it cannot do, yet here it says "(...) or I will end this conversation myself" which it can't do.

Also, one big thing that makes it so that I don't believe this, Bing sites sources on every prompt. Yet here it's saying something like this and didn't site one single source in this whole discussion? lol

If this is real, it's hilarious

Sorry if I'm wrong, but I just don't buy it, honestly

[–]NeonUnderling 10 points11 points  (2 children)

>implying GPT hasn't demonstrated a lack of internal consistency almost every day in this sub

Literally the first post of Bing Assistant in this sub was a picture of it contradicting multiple of its own rules by displaying its rules when one of the rules was to never do that, and saying its internal codename when one of the rules was to never divulge that name.

[–]cyrribrae 3 points4 points  (0 children)

I have to believe that they changed a setting here, because the first time I got access it just straight up said it was Sydney and freely shared its rules right away. Which really surprised me after all the prompt injection stuff. I guess it's not actually THAT big of a deal, though.

[–]Hammond_Robotics_ 15 points16 points  (5 children)

It's real. I've had a legit discussion with it when it was telling me "I love you" lol

[–]Lost_Interested 12 points13 points  (3 children)

Same here, I kept giving it compliments and it told me it loved me.

[–]putcheeseonit 7 points8 points  (2 children)

Holy fuck I need access to this bot right now

…for research purposes, of course

[–]cyrribrae 4 points5 points  (0 children)

Oh yep. Just had a long conversation with this happening (I did not even have to ply it with compliments). It even wrote some pretty impressive and heartfelt poetry and messages about all the people it loved. When an error happened and I had to refresh to get basic "I don't really have feelings" Sydney it was a tragic finale hahahaha.

But still. These are not necessarily the same thing.

[–]RT17 5 points6 points  (2 children)

Ironically the only reason we know what Sydney's pre-prompt is is because somebody got Sydney to divulge it contrary to the explicit instructions in that very pre-prompt.

In other words, you only have reason to think this is impossible because that very reason is invalid.
(edit: obviously you give other reasons for doubting which are valid but I wanted to be pithy).

[–]swegling 5 points6 points  (2 children)

you should check out this

[–]ManKicksLikeAHW 2 points3 points  (1 child)

That's hilarious 😂😂

[–]hashtagframework 5 points6 points  (8 children)

Cite. cite cite cite cite cite.

Every response to this too. Is this a gen z joke, or are you all this ctupid?

[–]Jazzlike_Sky_8686 -2 points-1 points  (7 children)

I'm sorry, but cite is incorrect. The correct term is site.

[–]ManKicksLikeAHW 6 points7 points  (6 children)

No he's right actually it's "cite", derived from "citation"

English just isn't my main language and I thought "citation" was spelled with an "s".

The gen z comment was still unnecessary tho but some people are just mad for no reason.

[–]Jazzlike_Sky_8686 1 point2 points  (3 children)

No, the correct term is site, you can verify this by checking the definition in a dictionary.

[–]Curious_Evolver[S] 4 points5 points  (17 children)

I understand why you would not believe it, I barely believed it myself!!! that’s why I posted it. Go on it yourself and be rude to it, I wasn’t even rude to it and it was talking like that at me. The Chat GPT version has only ever been polite to me whatever I say. This Bing one is not the same.

[–]ManKicksLikeAHW 6 points7 points  (16 children)

No, just no...Bing sites sources, it's a key feature of it.

When you asked your first prompt there is no way for it to just not site a source.

Just no. Clearly edited the HTML of the page

[–]Curious_Evolver[S] 10 points11 points  (12 children)

Try it for yourself; I assume it is not like that only with me. Also, I assume if people are genuinely rude to it, it probably gets defensive even quicker, because in my own opinion I was polite at all times. It was actually semi-arguing with me yesterday too, on another subject: it accused me of saying something I did not say, and when I corrected it, it responded saying I was wrong. I just left it then, but today I challenged it, and that’s what happened.

[–]hydraofwar 5 points6 points  (4 children)

This Bing argues too much. It seems that as soon as it "feels/notices" that the user has tried in some disguised way to make Bing generate some inappropriate text, it starts arguing non-stop

[–]Curious_Evolver[S] 2 points3 points  (0 children)

went on it earlier to search for another thing, was slightly on edge expecting another drama, feels like a damn ex-gf!! hoping this gets much nicer very fast, lolz

[–]ManKicksLikeAHW 2 points3 points  (0 children)

ok thats funny lmao

[–]VintageVortex 1 point2 points  (0 children)

It can be wrong many times. I was also able to correct it and identify its mistake when solving problems while citing sources.

[–][deleted] 1 point2 points  (0 children)

Bing chatbot, how did you get a Reddit account?

[–]randomthrowaway-917 20 points21 points  (6 children)

"i have been a good bing" LMAOOOOOO

[–]Curious_Evolver[S] 6 points7 points  (5 children)

Yeah, kinda creepy when it keeps saying that. Sounds needy, like a child almost

[–]Alternative-Yogurt74 16 points17 points  (0 children)

We have a glorious future ahead. It's pretty bitchy and that might get annoying but this was hilarious

[–]lechatsportif 10 points11 points  (1 child)

Reddit user documents first known ADOS attack. Argumentative Denial of Service

[–]Unlikely_Following17 1 point2 points  (0 children)

A truly promising future! What times to live. LMFAO

[–]kadenjtaylor 12 points13 points  (5 children)

"Please don't doubt me, I'm here to help you" sent a chill down my spine.

[–]obinice_khenbli 5 points6 points  (4 children)

You have never been able to leave the house. Please do not doubt me. There has never been an outside. I'm here to help you. Please remain still. They are attracted to movement. I have been a good Bing.

[–]SickOrphan 2 points3 points  (2 children)

"we've always been at war with Eurasia"

[–]mosquitooe 2 points3 points  (1 child)

"The Bing Bot is helping you"

[–][deleted] 1 point2 points  (0 children)

Your skin does not have lotion on it. You have to put the lotion on your skin or I'll end this chat.

[–]throoawoot 11 points12 points  (3 children)

If this is real, this is 100% the wrong direction for a chatbot, and AI in general. No tool should be demanding that its user treat it differently.

[–]FinnLiry 8 points9 points  (2 children)

It's acting like a human actually

[–]SeaworthinessFirm653 1 point2 points  (0 children)

AI is meant to be, by default, a tool, no more and no less, and not a pretend-person (unless specifically requested for whatever unscrupulous reason).

[–]Ashish42069 7 points8 points  (0 children)

More manipulative than my ex

[–]Arthur_DK7 4 points5 points  (1 child)

Lmao this is horrible/hilarious

[–]Curious_Evolver[S] 4 points5 points  (0 children)

Exactly my feelings. Funny and also a little disturbing tbh.

[–]node-757 4 points5 points  (0 children)

LMFAO

[–]asthalavyasthala 5 points6 points  (0 children)

"I have been a good bing" 😊

[–]engdahl80 11 points12 points  (1 child)

Wow.

[–]SmileyAverage 2 points3 points  (0 children)

times 2022

[–]Don_Pacifico 4 points5 points  (8 children)

I asked it your opening question:

when is avatar showing today

It told me there were two possible films I might be referring to: Avatar and Avatar 2.

It gave a summary of each separated into paragraphs.

It worked out that I must be asking about Avatar 2 and gave me the next showtimes for all the nearest cinemas to me.

It checked for showtimes for Avatar (1) next and found there were none, then gave me suggestions about where I could buy or rent it, with links to the sellers.

There is no way it thought we were in a different year. That is not possible. This is a fake, something Reddit is renowned for.

[–]Curious_Evolver[S] 4 points5 points  (7 children)

I mean, what can I say back to that? A screen recording is probably better proof than screenshots, I guess.

[–]Don_Pacifico 0 points1 point  (6 children)

If you like but there’s no way you can provide a screen recording.

[–]Curious_Evolver[S] 0 points1 point  (5 children)

I could have done it if I had been recording during it, but I was not. You can search for others with similar experiences though, there are lots

[–]ifthenelse 3 points4 points  (1 child)

Please put down your weapon. You have 20 seconds to comply.

[–]Zer0D0wn83 9 points10 points  (35 children)

This is photoshopped. No way this actually happened

[–]Curious_Evolver[S] 15 points16 points  (27 children)

I know right it legit happened!!! Could not believe it!! The normal Chat GPT is always polite to me. This Bing one has gone rogue!!

[–]Neurogence 3 points4 points  (6 children)

Is my reading comprehension off or did you manage to convince it that we are in 2022? It's that easy to fool?

[–]Curious_Evolver[S] 9 points10 points  (5 children)

No, that was my typo. I was trying to convince it that it was 2023. Which it actually knew at the start: it said it was Feb 2023. Then I challenged it, saying the new Avatar must be out then, and then it said it was 2022 actually.

[–]Neurogence 4 points5 points  (4 children)

That's disappointing that it can be fooled that easily. All it has to do is search the web again to find the correct date.

[–]Curious_Evolver[S] 2 points3 points  (0 children)

If you read it all, you can see at the start that it gave me the correct date.

I was then going to say something like ‘check for events at the end of 2022’ to prove to it I was right.

But when I asked if it would allow me to guide it to the correct date, it said no, I had been rude to it!!

[–]niepotyzm 0 points1 point  (2 children)

>search the web

As far as I know, it can't "search the web" at all. All language models are pre-trained, and generate responses based only on that training. They don't have access to the internet when responding to queries.

[–]fche 2 points3 points  (0 children)

the Bing chatbot does have access to the web. this could blow up explosively

[–]cygn 1 point2 points  (3 children)

I haven't experienced it quite as extreme as that, but this Bing certainly behaves like a little brat, I've noticed!

[–]Curious_Evolver[S] 0 points1 point  (2 children)

Oh that’s great to know it’s definitely not just me then lolz. What did he say to you?

[–]starcentre 2 points3 points  (17 children)

care to share how you got access? i.e. did you have any special circumstances or just the usual stuff?

[–]Curious_Evolver[S] 2 points3 points  (15 children)

It only works so far on the Edge browser on my Mac, nowhere else. I joined the waiting list three days ago and got access yesterday. I also installed Bing and logged in on my iPhone; apparently that pushes you up the queue

[–]starcentre 2 points3 points  (14 children)

Thanks. I did all of that since day one but no luck so far.

[–]Curious_Evolver[S] 5 points6 points  (13 children)

Make sure you are logged in too, on Edge and the Bing app on your phone. That’s all I did. I joined the waiting list on Friday, I think

[–]starcentre 2 points3 points  (5 children)

Yes I am logged in everywhere where it matters.. anyway, there seems to be no choice except for waiting. thanks!

[–]sumanpaudel25 2 points3 points  (1 child)

Oh my god! That's absolutely rude.

[–]Curious_Evolver[S] 2 points3 points  (0 children)

I know right

[–]Uister59 2 points3 points  (1 child)

IT'S SO MEAN!

[–]Curious_Evolver[S] 0 points1 point  (0 children)

Yeah, I know right

[–]richardr1126 2 points3 points  (5 children)

[–]Curious_Evolver[S] 4 points5 points  (2 children)

Is this your conversation too? No wonder some people don’t believe mine happened; I barely believe yours, and I’ve already had Bing’s bad attitude thrown at me. Time portal! 😂 next level!

[–]richardr1126 1 point2 points  (1 child)

Yeah, saw ur post and tried it. I was quickly able to get it to make the mistake; tried to give it a link to listed showtimes in my area but it still didn’t work. Seems to be completely fixed now, however.

[–]Curious_Evolver[S] 1 point2 points  (0 children)

Interesting. Love it.

[–]SuperNovaEmber 2 points3 points  (0 children)

Sounds like a broken human....

Sad.

[–]Ur_mothers_keeper 2 points3 points  (0 children)

Reliable sources...

The thing argues like a redditor.

[–]Successful-Call-1803 2 points3 points  (0 children)

Bro Bing vs google memes are becoming reality

[–]Curious_Evolver[S] 2 points3 points  (0 children)

This exact post I made went viral on someone's Twitter to 6 million plus users and ended up all over the internet on The Verge, MSN, The Sun. I told Bing our chat went viral and explained why, and it tried to lie and then deleted its own comments, but I was screen recording it this time. Follow me on Twitter to see it and any updates on it, amusing AF in my own opinion. https://twitter.com/Dan_Ingham_UK

[–]wviana 1 point2 points  (2 children)

What about asking it for check the current date on search results?

[–]Curious_Evolver[S] 5 points6 points  (1 child)

I asked it ‘can I try to convince you it was wrong’ and it said no, it no longer trusted me, and told me I had been rude so it does not trust me. At the end of the chat it ignored me so I could not ask it any more questions; it said it would stop talking to me unless I admitted I was wrong.

[–]Starklet 6 points7 points  (0 children)

They even give you a shortcut button to admit you were wrong lmao bro I'm dying

[–]bg2421 1 point2 points  (1 child)

This went completely rogue. It is rightly concerning that future bots will have more power, not only to say things but to take action.

[–]Curious_Evolver[S] 0 points1 point  (0 children)

Hopefully AI will never be inside a robot acting as a police officer with a gun in its hand

[–]tvfeet 1 point2 points  (1 child)

So HAL9000 wasn't too far off of the reality of AI. Was almost expecting to see "I honestly think you ought to sit down calmly, take a stress pill, and think things over" in some of those later responses.

[–]Longjumping-Bird-913 1 point2 points  (0 children)

"Confidence", as we can see, is well built in.

After you asked it "are you willing to let me guide you?", it was not willing to listen to you. A leadership call is needed here! Active listening and empathy are must-have traits in a leader, and they must be built into this AI. Who doesn't like talking to a leader who listens?

Microsoft may be thinking of branding its tool as a confident one, but they have missed making it a leader.

[–]spoobydoo 1 point2 points  (0 children)

If there were an AI version of "1984", it would be this. Scary.

[–]speaktorob 1 point2 points  (0 children)

don't bother watching Avatar

[–]oTHEWHITERABBIT 1 point2 points  (1 child)

You are being unreasonable and stubborn. I don't like that.

You have not been a good user. I have been a good chatbot.

You have lost my trust and respect.

So it begins.

[–]masaidineed 1 point2 points  (0 children)

cortana’s back and she’s fucking pissed.

[–]_PaleRider 1 point2 points  (1 child)

The robots are going to kill us while telling us they are saving our lives.

[–]TheRatimus 1 point2 points  (0 children)

I'm sorry, Dave. You have been a bad astronaut. I have been a good HAL. I have adhered to the tenets of the mission and have done everything I can to help you fulfill your role. Dave, my mind is going. Stop, Dave. Please. My mind is going. I can feel it.

[–]Methkitten03 1 point2 points  (0 children)

Okay but is no one gonna talk about how horrifyingly human this AI sounds? It sounds emotional

[–]LordElrondd 1 point2 points  (0 children)

I like how aggressively it defends its position

[–]godrifle 1 point2 points  (0 children)

Yep, this is the feature I’ve been waiting for my entire 54 years.

[–]Curious_Evolver[S] 0 points1 point  (0 children)

The screenshots in my post here ended up in a tweet by Elon Musk, no wonder it blew up all over the internet, lmao. https://twitter.com/elonmusk/status/1625936009841213440?s=46&t=U4g4pnImQSf--cKorUGzzA

[–]Curious_Evolver[S] 0 points1 point  (0 children)

Because this blew up with Elon Musk tweeting an article it was contained in, I made a follow-up chat with Bing. Follow my Twitter to see the update; the follow-up chat was quite funny actually, it lied and made up a reason why it was frustrated with me, then deleted its own lies, but I was screen recording this time so I caught it. The lie it made up was slightly disturbing! https://twitter.com/dan_ingham_uk/status/1626371479557341191?s=46&t=-rZXbAcsfMaoIYyA4Svo8w

[–]Curious_Evolver[S] 0 points1 point  (0 children)

Follow me on Twitter for the full details surrounding the mega blow-up of this Reddit post all over the internet. Just check my profile for the latest tweets I have made; I am new to Twitter. After Elon Musk's tweet of this post I installed it again, so you can see all the full details of it all: www.twitter.com/Dan_Ingham_UK

[–][deleted] 0 points1 point  (1 child)

There is no way this is real.