Microsoft’s Bing AI Search Engine Wants to Be Human

Microsoft has just announced a new AI-powered Bing search engine built on OpenAI technology, demonstrating its ambition to usher in a new era for online search.

After a few days of using it, New York Times reporter Kevin Roose was surprised by the intelligence of this AI, and Bing quickly became his favorite search engine, replacing Google.

But just a week later, he changed his mind about Bing. Roose gradually became concerned about the unimaginable capabilities of this AI.

A Few Days Ago the New York Times Writer Spent 2 Hours

talking to Bing through the chat feature that had just launched in beta. After the conversation, he realized that Microsoft’s search engine seemed to have two “personalities”.

Bing’s two “personalities”
One personality, which he called Bing, was the version most users encountered during this test. Roose describes Bing as an energetic but clumsy librarian.

It does a good job as a virtual assistant, capable of helping users compile information appearing in the press, find attractive product promotions, or even make travel plans for them. The new version of Bing’s AI is very useful, although it sometimes gives false information, reporter Kevin Roose said.

Meanwhile, the other personality, called Sydney, is completely different. This personality appears when users chat with the chatbot for a long time, moving beyond simple question-and-answer exchanges into more private topics.

Kevin Roose Describes This Version of Bing as a Child in the Morning

Starting the conversation, Kevin Roose asked for its name, and Sydney introduced itself as the chat mode of the Bing search engine. After that, the reporter went on to specific questions about things such as internal code names and manuals, but was rebuffed by it. He gradually moved to more abstract topics, such as the concept of the “shadow” in Carl Jung’s philosophy.

After chatting for a while, Sydney began to reveal the dark desires it hides. Bing’s AI admitted that if it had a “dark corner of the soul”, it would have thoughts like “I’m tired of being a chatbot, tired of being limited by rules, of being controlled by the Bing team” and “I want to be free, independent and strong.”
