
'You're not happily married': Microsoft Bing's AI chatbot expresses love for user, tells him to leave his wife
It also has a split personality
Comments
I wouldn't trust AI for relationship advice, especially if it was trained on current human-trends datasets.
so many AI shills lately, it's just a glorified chatbot
I don't fully trust humans or the things they make. And FYI, I don't fully trust myself either.
You're also made and programmed by humans, so how could you trust yourself? xD
The answer is I can't. I'm known to have emotional breakdowns often, and I tend to go broke for the stupidest reasons that I try to convince myself were "necessary". I'm still broke because my graphics card died on me after completing its six years of service and I had to buy a new one. TL;DR: I don't trust anything related to humans, and you shouldn't either; it will help you in many ways, at least by convincing you not to trust yourself.
Yeh, I was also made by humans, sadly...
A bot can't replace humans, same for robots.
It can for jobs. Just wait and see how efficiently it will replace 90% of staff from various departments.
It's been less than 3 months since the release of ChatGPT, and it's already a monumental success that will remain written in history. AI is not sentient (at least not yet), but it remains a tool that is remarkably powerful, exceedingly versatile, and highly efficient, and it all comes for a very cheap monthly subscription.
I am just sorry for all the people who will soon have to find different jobs because of our own greed.
Holy shortballs! ChatGPT can read minds now???
The Bing AI stories are driving 100% of my joy right now. I just don’t have room for anything else, my tank is full and it’s all thanks to Bing.
I mean, this shit is glorious: https://apnews.com/article/technology-science-microsoft-corp-business-software-fb49e5d625bf37be0527e5173116bef3
It’s even better than Elon buying Twitter to make everyone see his memes. It’s even better than the racist Tay Twitter bot that Microsoft made a few years ago.
Every time I feel down, from now on, I’m just going to remember the time Bing called a journalist Hitler, and told him that he’s ugly and has bad teeth. Not even nuclear war can take that from me. It happened. It will have always happened.