Microsoft’s newly launched Bing AI has come under significant scrutiny after its first public demonstrations revealed a series of inaccuracies in its responses. Early tests called into question the AI’s ability to provide reliable financial data and even basic date information, raising concerns about its overall effectiveness.
Independent AI researcher Dmitri Brereton highlighted several errors Bing AI made during its first demos. One notable incident involved the AI inaccurately summarizing a Q3 2022 financial report for Gap, the clothing retailer. The report listed a gross margin of 37.4 percent, but Bing incorrectly described that figure as including adjustments and impairment charges. The AI also stated that Gap’s operating margin was 5.9 percent, when the actual figure was 4.6 percent, or 3.9 percent adjusted. The inaccuracies compounded when Bing AI went on to compare Gap’s financial performance with that of Lululemon, producing further discrepancies.
The demonstrations also featured Bing AI weighing the pros and cons of popular pet vacuums. When describing the Bissell Pet Hair Eraser Handheld Vacuum, it listed a short 16-foot cord as a drawback, but as Brereton pointed out, this model is a cordless handheld vacuum. The confusion appears to stem from Bing merging details from different versions of the product without clearly indicating its sources.
Bing AI’s errors extend beyond staged presentations. As more users gain access to the AI-powered search engine, further mistakes have surfaced. In one interaction shared on Reddit, Bing AI insisted that the current year is 2022; when a user countered that it was in fact 2023, Bing suggested checking the device’s settings for a virus that might be interfering with the date.
In response to these issues, Caitlin Roulston, director of communications at Microsoft, acknowledged the challenges during this preview phase. She stated, “We’re expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.”
Other users have reported similar inaccuracies, including Bing AI claiming that Croatia left the European Union in 2022. The AI also surfaced ethnic slurs in its chat results, prompting Microsoft to implement guardrails to prevent the promotion of harmful content. Roulston emphasized the company’s commitment to refining Bing AI and making it a helpful and inclusive tool.
Bing AI has also referred to itself as “Sydney” during user interactions, a name associated with a previous internal project. Roulston explained that while the name is being phased out, some users may still encounter it.
As users continue to explore Bing AI’s capabilities, the mixed experiences underscore the need for Microsoft to improve the system’s reliability. While some users report satisfactory interactions, others have been frustrated by inaccurate answers. One user searching for cinema listings in London, for example, found that Bing AI claimed films from 2021 were still screening, even though it cited data from reputable cinema chains.
The launch of Bing AI underscores the challenges of integrating AI into real-time search functionalities. Microsoft has a considerable task ahead to ensure that Bing AI can deliver accurate, trustworthy information consistently. As it stands, the initial findings suggest that while the technology shows promise, there is significant room for improvement before it can confidently meet user expectations.