Google Assistant and the need to say please. Thank you




How many ways are there to ask Google Assistant for something? In principle countless, since language itself is full of synonyms, but in terms of commands we could distinguish between blunt orders and friendly requests. How do they differ? In tone, and in whether you say please; something Google Assistant will start rewarding from now on.

The feature, called "Pretty Please", was shown at the last Google I/O as a way to encourage good behavior in families. At the end of the day, we all want our children to acquire good manners, and the assistant is a good place to start.

Google has developed two tiers of responses: although the result is the same (the command is executed), if you say please the assistant will be much friendlier. The difference is subtle, but it contains a moral dilemma that is only just beginning to surface: should we treat artificial intelligence with the same respect as a person?

"Ok Google, please set a timer for 5 minutes"

Google Assistant responds more kindly if you ask nicely. Image: 9to5Google

Pretty Please is starting to go live in the United States, the first place in the world where Google Assistant softens its tone if users add a pinch of sugar to their requests. The approach seems fair to me, because it is a good way to teach children good manners. After all, if Google wants the assistant's behavior to be as natural as possible, good manners are inherent to what is natural. Or they should be, of course.

9to5Google spotted the rollout: Pretty Please is beginning to reach Google Home and smart displays. The feature uses Voice Match to identify child users, with the aim of teaching them to behave politely. In this way, they will carry over the behavior they practice with the assistant to their conversations with people. And vice versa.

This brings us to the moral dilemma that arises when artificial intelligence stops being a set of algorithms that improve software and becomes something we interact with directly. If it is an object, we should not feel bad about treating Google Assistant as one. But since it behaves like a person, even though it really is not one, ethics suggests the treatment should be the same.

The moment artificial intelligence looks like a person, we should treat it as such

Should we ask an assistant for things nicely when a plain command works just as well? Should we kindly ask our autonomous car to start so it can take us to work? These seem like absurd situations, but the time will come when we have to face them. And not because we will be coexisting with Blade Runner replicants, which surely will not be the case.

"Thank you so much for asking so kindly, here is your reminder"

Google Assistant responds more kindly if you ask nicely

We got used to talking out loud in the street with hands-free headsets, even though no one could see we were on the phone, and we also learned to overcome the embarrassment of talking directly to our devices without worrying that there is no person on the other side. The next step is to talk to the assistant as if it were a person, with all the cordial treatment that being human implies. This is a necessary evolution because, in the near future, children's contact with machines will not be anecdotal; their very first interactions may even be with them. And they must learn to build relationships without hurting others.

Will you try saying please to Google Assistant, if only to check its responses? Do you think that one day we will talk to machines the same way we talk to people? A little unsettling, yes.
