Google has a problem with its quick answers: they spread lies and rumors


For some time, Google has been trying to make our lives even easier with its so-called quick answers. When we ask a particularly important or common question, especially on mobile, instead of offering several results, Google directly recites the most popular answer to our query.

But this attempt to provide the best possible response instantly is still far from fully reliable. Search Engine Land has compiled a set of these answers showing that the system sometimes makes Google respond with rumors or completely false information.

For example, one of the most controversial answers appears when Google is asked which presidents of the United States have been members of the Ku Klux Klan: it recites a list of presidents, accusing them without any kind of evidence and using a WordPress blog of rather dubious credibility as its source.

The problem lies mostly in how the search engine selects its quick answers. It does not seem to do any kind of analysis to find the most reliable source; it simply recites the answer from the best-ranked page. In other words, everything depends on SEO and the ranking algorithm rather than on credibility.

The method may work well for most questions, but when we venture into murkier territory and ask about morbid topics, fake news, or rumors, where the best-ranked pages are not necessarily reliable, everything falls apart. This is especially worrying at a time when people often blindly trust what is shared on the Internet without bothering to check whether the information is true or false.

The misinformation can reach the point where Google asserts outrageous things, such as that Obama is planning a communist coup, or that dinosaurs are an invention to indoctrinate children into not believing that the Earth is only a few thousand years old, as the Bible says. Answers that, as we said, can be dangerous and misinform more than they inform.

Virtual assistants will have to deal with this

This problem with Google’s quick answers directly affects assistants like Google Home, which answer our questions based on web searches. As seen in a tweet also picked up by Search Engine Land, this can lead Google’s virtual assistant to recite sexist or insulting answers, such as that every woman has a prostitute inside her.

This, as you can imagine, is a problem for an imminent future in which more and more homes have these connected devices. As other outlets covering this topic, such as The Outline, have noted, these kinds of Google responses may be even worse than the fake news we have been talking about in recent months.

So how can this be solved? According to Eric Enge, CEO of Stone Temple Consulting, the quickest solution would be for Google to allow more and more users to interact with its quick answers, or featured snippets. To do that, it would have to roll the feature out globally, since, for example, we still cannot access this type of answer.

And what would that achieve? If you look at the screenshots, under each answer there is a “Feedback” link, so it would be the users themselves who help fix the algorithm. It would be better to do this soon, because Google and other companies like Amazon are in a real race to get their virtual assistants into our homes, and the fact that the information they offer can be false may become a problem in the medium or long term.

Written by suNCh8
