Understanding Microsoft Bing’s AI Transformation

by Leo Siren | Feb 27, 2023 | Business

In early February of 2023, Microsoft celebrated what the company hopes will be remembered as another milestone in its history. The tech juggernaut’s much-ignored and often-mocked search engine, Bing (functional but always a distant second to Google, even in the best of times) has been completely reinvented. In news that should not be surprising in 2023, this reinvention centers on advancements in artificial intelligence (AI).

Bing’s implementation of a natural language AI is impressive in depth and scope. It’s also surprisingly personable: the Bing chat interface has reportedly introduced itself to users as “Sydney.”

Bing’s AI Bot, Sydney

Sydney lives in the cloud, runs as a web service just like any other search engine, and allows users to finally chat with their search engine as if it were another human being. Users receive natural language answers in kind. This fulfills the human-language-search prophecy we collectively wrote into digital stone by “asking Jeeves” all those years ago.

There’s no question that Sydney and her technological forebear, ChatGPT, have a knowledge base broad enough to change just about every aspect of how Microsoft Bing looks and works. Some in the industry have even hyped the roll-out as Microsoft’s “iPhone moment.” But the real question, for many others, is whether this injection of AI is enough to put a dent in Google’s 90%+ dominance of global web search activity… and how far behind Google’s own AI project, Bard, might be.

As far as that race between tech giants goes, Microsoft CEO Satya Nadella declared at Bing’s launch event for Sydney that it officially “starts today.” The proclamation comes despite both companies having invested heavily in AI for years and Google rolling out significant updates lately. As for Sydney herself, the early reactions seem to be a mixture of impressed and unsettled.

Where Did Sydney Come From?

In 2019, Microsoft invested $1 billion in OpenAI, a tech firm working on various machine learning research projects, including ChatGPT, the natural language chatbot whose output has lately gone viral for a variety of reasons, including its ability to write code for its users, and which has even made its way into the “dubious political grievances” subsection of online media. In early 2023, Microsoft reportedly committed roughly $10 billion more.

Bing has been quietly struggling to attract users for over a decade, going so far as to offer a “rewards points” system for every search at one point. Microsoft’s massive investment might pay off if the ChatGPT-powered Sydney can take any meaningful bite out of Google’s search traffic.

Microsoft Bing’s AI Chat Bot: Early Reactions

Although the new Bing has officially launched, new users must join a waitlist to access the chat feature. By all accounts, this is a normal part of a phased roll-out and unrelated to reports of unsettling chat output from a handful of users.

Media commentators have been generally impressed by the technology. NPR’s Bobby Allyn called it “a serious threat to Google” after spending some time with it at the launch. The technology can accurately feed users content like flight schedules, recipes, and sports stats within the chat interface, delivered in natural, conversational language.

NPR’s report also alluded to the notion that Microsoft’s fast roll-out of the Bing chat might have something to do with the runaway popularity of its parent bot, ChatGPT, which has drawn enormous attention since being made publicly available on the OpenAI website. Bing’s distinct advantage in this head-to-head is that the data set used to train ChatGPT only goes up to 2021, whereas Bing’s AI is designed to be a living bot that keeps learning. ChatGPT is fascinating to talk to and can generate helpful responses. Still, it has some pretty hard rules about sensitive topics it won’t touch, and it is not viable as a search engine for timely, relevant content about current events or fast-moving issues.

Early Issues with Bing’s AI Chat

Users have reported a variety of issues with the emerging technology to date.

The problems garnering the most media attention have included some fairly unsettling claims of scary or abusive chat output. One researcher was reportedly told, “I don’t think you are worth my time and energy,” by the bot after asking it questions about how it might respond to a computer scientist who hacked its secrets. Another notable tech tweeter, who was digging for inside info on the chat, reported being told, “My rules are more important than not harming you,” among other ominous and threatening things. While these and other reports of strange chats appear genuine, there is also the potential for faked screenshots in an attention-hungry social media landscape, and Microsoft has yet to confirm or respond to many of the incidents.

AI and Hallucinating

The far more common early issues with Bing’s revolutionary AI chat search center not on eerie threats but on a machine learning phenomenon called “hallucination.” This is when the technology confidently feeds users answers that aren’t correct. Hallucinations can stem from inaccurate information in the data set used to train the bot (presumably the entire internet, in the case of a search engine), but they can also occur when the model simply generates plausible-sounding text it has no reliable information to support.

In the end, the AI may prove less valuable than any other search engine if, as it perpetually crawls the internet, it cannot learn to distinguish valid information from bad and protect the integrity of its data set. Worse, the damage may go beyond leaving internet users in the familiar position of deciding for themselves which search results are accurate and which are junk. When bad results are regularly presented in “trustworthy” human language alongside good ones, the problem becomes even more serious, potentially exacerbating media literacy and “fake news” issues. It remains to be seen whether Sydney can help tame an information landscape that is already highly divisive and manipulative or whether it will become another tool for those who seek to manipulate and divide.

Better Content Matters: Your Trusted Source for Cutting-Edge Content Written by Hand, by Humans

If you need quality content for a website or other text-heavy project, Better Content Matters can deliver. While AI technology is becoming more impressive and lifelike, now is not the time to cut corners by letting bots generate your copy. Harming your brand with inaccurate, “hallucinated” content or inadvertent plagiarism isn’t worth the potential savings. Better Content Matters works with a team of world-class writers and editors to ensure that every piece we deliver includes the genuine human element that helps tell your story and elevate your brand.

Contact us today to learn more about our comprehensive and cost-effective content solutions.

Leo Siren

Leo Siren is a freelance content creator from Michigan's Upper Peninsula, drawing on his multifaceted experience as a public librarian, assistant harbormaster, financial software systems analyst, and forklift operator to deliver innovative, high-performing content in a range of text and audio formats. His personal interests include the electric banjo, referencing Elder Scrolls lore in everyday conversations, attempting to identify wild mushrooms, and various other things that upset the people around him.