Less than a week after imposing search limits on the AI version of its Bing search engine, Microsoft is increasing those limits.
In the wake of some embarrassing reports of erratic behavior by the new Bing, Microsoft last Friday decided to limit usage to five “turns” per session and 50 turns per day.
A turn consists of a question from a user and an answer from Bing. After five turns are complete, users are asked to change the subject of their conversation with the AI.
The changes were necessary because the underlying AI model used by the new Bing can become confused by long chat sessions made up of multiple turns, the company explained in its Bing blog.
However, on Tuesday, after an uproar from Bing users, Microsoft raised the usage limit to six turns in one session and 60 turns per day.
The new limits will enable the vast majority of users to use the new Bing naturally, the company said in a blog post.
It said, “We intend to go further, and we plan to increase the daily cap to 100 total chats soon.”
“Also,” it continued, “with this upcoming change, your general searches will no longer count against your chat total.”
Crowd input needed
Microsoft decided to put limits on the use of the AI-powered Bing after some users found ways to provoke the search engine into calling them enemies and even doubling down on factual errors it made, such as the name of the CEO of Twitter.
“[W]e have found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone,” Microsoft acknowledged in a blog.
With the new limits on Bing AI usage, the company may be conceding a misstep. “It indicates that they didn’t anticipate some of the reactions, but turned it around fast enough,” Greg Sterling, co-founder of Near Media, a news, commentary, and analysis website, told TechNewsWorld.
“Despite the horror stories written about the new Bing, there is a lot of productivity to be gained with it,” said Jason Wong, vice president and analyst at Gartner, pointing to the usefulness of such a tool in certain content scenarios.
“For many software companies, until you release your software to the public, you don’t know what you’re going to get,” Wong told TechNewsWorld.
“You can do all kinds of tests,” he said. “You can have teams doing stress tests on it. But you won’t know what you have until the crowd gets to it. Then, hopefully, you can glean some wisdom from the crowd.”
Wong cited a maxim from LinkedIn founder Reid Hoffman: “If you’re not embarrassed by the first version of your product, you’re too late.”
Google too cautious about Bard?
Microsoft’s decision to launch its AI search vehicle with potential warts contrasts with the more cautious approach taken by Google with its Bard AI search product.
“Bing and Google are in different positions,” Sterling explained. “Bing needs to take more chances. Google has more to lose and will be more cautious as a result.”
But is Google being too cautious? “It depends on what kind of rabbit they have in their hat,” said Will Duffield, a policy analyst at the Cato Institute.
“If you have a really nice rabbit and you don’t let it out, you’re very cautious,” Duffield told TechNewsWorld. “If your rabbit isn’t ready, there’s nothing cautious about holding it back.”
“If they have something good and they release it, maybe people will say they should have launched it months ago. But maybe months ago, it wasn’t that good,” he said.
Danger to workers
Microsoft also blogged that it was going to start testing a Bing AI option that lets the user choose the tone of the chat: “precise,” which uses Microsoft’s proprietary AI technology to focus on shorter, more search-focused answers; “balanced”; and “creative,” which uses ChatGPT to give the user longer and chattier answers.
The company explained that the goal is to give users more control over the type of chat behavior to best meet their needs.
“The choice is good in essence,” Sterling observed. “However, in these early days, the quality of ChatGPT answers may not be high enough.”
“So until the guardrails are strengthened, and ChatGPT accuracy improves, this may not be such a good thing,” he said. “Bing will have to manage expectations and qualify the ChatGPT content to some degree.”
In a related development, a survey of 1,000 business leaders released Tuesday by Resume Builder found that 49% of their companies are using ChatGPT and 30% plan to use the AI technology; of the companies using it, 48% say it has replaced workers. The following charts reveal more data on how companies are using ChatGPT.
Copilot for humans
Sterling was skeptical of the survey’s finding on worker replacement. “I think a lot of companies are testing it. So in that sense, companies are ‘using’ it.”
“And some companies may identify ways it can save time or money and potentially replace manual work or outsourcing,” he continued. “But the survey results lack context and present only partial information.”
However, he acknowledged that hiring and freelancing patterns will change over time due to AI.
Wong found the number of businesses using ChatGPT surprising, but he was more skeptical of the number reporting that it had replaced workers.
“I can see that someone might no longer have to write documentation for updates to an application or portal, but demoting or removing people from a role because they are using ChatGPT — that I would find hard to believe,” he said.
“Gartner’s advice to customers exploring ChatGPT and Bing Chat is to think of them as co-pilots,” he continued. “It’s going to help create something that needs to be reviewed by a human, who’s going to assess the validity of an answer.”
He concluded, “Only in a small number of use cases can they replace a human.”