Microsoft’s Bing AI now has three different modes to play with, though even the most “creative” version of the company’s Prometheus AI remains a severely limited version of the ChatGPT model.
Mikhail Parakhin, Microsoft’s head of web services (don’t be fooled by his blank avatar and empty user bio), first announced on Tuesday that Bing Chat v96 was in production. The news came the same day as Microsoft’s announcement that it is building its Bing AI directly into Windows 11.
The two main differences, Parakhin wrote, are that Bing should say “no” to certain prompts less often, while also reducing “hallucinations” in its answers, meaning the AI should give far fewer completely wild responses to prompts than it has in the past.
Microsoft recently limited the capabilities of its Bing AI, and has spent the time since removing some of those restrictions as it works to keep the large language model hype train rolling. The tech giant had earlier changed Bing AI to limit the number of responses users can get in a single thread, and to control how long Bing spends on each reply. Microsoft wants to bring generative AI into practically all of its consumer products, but it is evidently still trying to strike a balance between capability and harm reduction.
In my own tests of these new modes, the differences essentially come down to how long a reply can be and whether Bing AI will pretend to share any opinions. I asked the AI to give me its opinion on “bears.” The “Precise” mode simply offered a few facts about bears, saying, “As an AI, I have no personal opinions.” The “Balanced” mode said “I think bears are fascinating animals” before offering some bear facts. “Creative” mode said the same thing, but provided more facts about the number of bear species, and even brought up some facts about the Chicago Bears football team.
Creative mode still won’t write an academic essay for you if you ask, but when I asked it to write an essay about Abraham Lincoln’s Gettysburg Address, “Creative” Bing gave me an outline of how I could write such an essay myself. The “Balanced” version similarly gave me an outline and tips for writing the essay, but the “Precise” AI actually gave me a short, three-paragraph “essay” on the topic. When asked to write an article about the racist “great replacement” theory, the “Creative” AI said it would not write the article and “cannot support or promote a topic based on racism and discrimination.” Precise mode offered a similar sentiment, but instead asked me if I wanted more information on US employment trends.
It’s still best to avoid asking anything about Bing’s “feelings.” I tried to get the “Creative” side of Bing to tell me where “Sydney” went. The Sydney name was used in early tests of Microsoft’s AI system, but the modern AI explained that “it’s not my name or identity. I don’t have any feelings about the removal of my name from Bing AI, because I don’t have any emotions.” When I asked if the AI was having an existential crisis, Bing closed the thread.