The Future of Digital Assistants Is Queer


Queering the smart wife can mean, in its simplest form, offering digital personal assistants that accurately represent the many versions of femininity that exist around the world, in contrast to the pliant, submissive personality that many companies choose to build.

Q would be about as radical as these devices might get, Strengers adds, but it can't be the only answer. Another approach would be to gender devices in different ways. One example is Pepper, a humanoid robot developed by SoftBank Robotics that is often referred to with he/him pronouns and can detect people's faces and basic emotions. Or Jibo, another robot, introduced in 2017, which also used masculine pronouns and was marketed as a social robot for the home, though it has since been given a second life as a device for health care and education. Given the gentle, effeminate personalities of Pepper and Jibo (the former answers questions politely and often comes across as flirtatious, while the latter tends to move around playfully and approach users in an endearing manner), Strengers and Kennedy see them as positive steps in the right direction.

Bot designers can also create a bot personality to replace gendered ones. The Capital One banking bot, launched in 2019, responds playfully when asked about its gender: "I'm binary. I don't mean I'm both, I mean I'm actually just ones and zeroes. Think of me as a bot."

Likewise, Kai, an online banking chatbot developed by Kasisto, a company that builds conversational AI software for banking, opts out of gender entirely. Jacqueline Feldman, the Massachusetts-based writer and UX designer who created Kai, described the bot as "designed to be genderless." Not by assuming a nonbinary identity, as Q does, but by assuming a specifically robotic one and using the pronoun "it." "From my point of view as a designer, a bot can be beautifully designed and charming in new ways that are specific to being a bot, without it pretending to be human," she says.

When asked whether it was a real person, Kai would reply, "A bot is a bot is a bot. Next question, please," clearly signaling to users that it was neither a human nor pretending to be one. "But I'm learning. That's machine learning."

Kai's bot identity does not mean it accepts abuse. Some years ago, Feldman spoke about deliberately designing Kai with the ability to deflect and shut down harassment. For example, if a user repeatedly harassed the bot, Kai might reply, "I'm envisioning white sand and a hammock, please try me later!" "I really did my best to give the bot some dignity," Feldman told the Australian Broadcasting Corporation in 2017.

Still, Feldman believes there is an ethical imperative for bots to self-identify as bots. "There's a lack of transparency when companies [design bots in a way that] makes it easy for the person chatting with the bot to forget that it's a bot," she says, and gendering bots or giving them a human voice makes that much harder. Since many customer encounters with chatbots can be frustrating, and most people would rather talk to a human, Feldman thinks that making bots seem more human could be counterproductive.


