Artificial intelligence chatbots have been billed as productivity tools for consumers — they can help you plan a trip, for example, or give advice on writing a confrontational email to your landlord. But they often sound stilted or oddly stubborn or just downright weird.
And despite the proliferation of chatbots and other AI tools, many people still struggle to trust them and haven’t necessarily wanted to use them on a daily basis.
Now, Microsoft is trying to fix that by focusing on its chatbot’s “personality” and how it makes users feel, not just what it can do for them.
Microsoft on Tuesday announced a major update to Copilot, its AI system, which it says marks the first step toward creating an “AI companion” for users.
The earlier iteration of the Microsoft AI chatbot received some backlash for unexpected changes in tone and sometimes downright concerning responses. The bot would start off an interaction sounding empathetic but could turn sassy or rude during long exchanges. In one instance, the bot told a New York Times reporter he should leave his wife because “I just want to love you and be loved by you.” (Microsoft later limited the number of messages users can exchange with the chatbot in any one session, to prevent such responses.)
Some experts have also raised broader concerns about people forming emotional attachments to bots that sound too human at the expense of their real-world relationships.
To address those concerns while still developing Copilot’s personality, Microsoft has assembled a team of dozens of creative directors, language specialists, psychologists and other non-technical workers who interact with the model and give it feedback on the ideal ways to respond.
“We’ve really crafted an AI model that is designed for conversation, so it feels more fluent, it’s more friendly,” Microsoft AI CEO Mustafa Suleyman told CNN. “It’s got, you know, real energy … Like, it’s got character. It pushes back occasionally, it can be a little bit funny, and it’s really optimizing for this long-term conversational exchange, rather than a question-answer thing.”
Suleyman added that if you tell the new Copilot that you love it and would like to get married, “it’s going to know that that isn’t something it should be talking to you about. It will remind you, politely and respectfully, that that’s not what it’s here for.”
And to avoid the kind of criticism that dogged OpenAI over a chatbot voice that resembled that of actor Scarlett Johansson, Microsoft paid voice actors to provide training data for four voice options intentionally designed not to imitate well-known figures.
Source: https://www.cnn.com/2024/10/01/tech/microsoft-copilot-ai-chatbot-update-friendly/index.html