by putting caps on the number and length of responses, noting that many users seemed to want the "long and intricate chat sessions" with the bot codenamed "Sydney" back.
"The first step we are taking is we have increased the chat turns per session to six and expanded to 60 total chats per day," the statement reads. "Our data shows that for the vast majority of you this will enable your natural daily use of Bing." Microsoft added that it plans to "go further" and "increase the daily cap to 100 total chats soon" — a bizarre change of heart given that until a few days ago, there was no limit at all.
Hopefully, the aforementioned option will allow users to get more reliable and appropriate answers, especially considering the kind of made-up nonsense the bot has been spouting. While it's undoubtedly a good thing that Microsoft is not only listening to user feedback but responding swiftly, this latest statement makes it sound like the company is making it up as it goes along, something that doesn't exactly inspire confidence in its ability to stay on...
This whole debacle has certainly been a fascinating glimpse into the limits of large language models like Bing — and we can't lie, we're excited to see where this AI train wreck goes next.