A Google whistleblower launches a project to uphold ethics in technology




Employees of technology companies should have the right to know when they are working on projects they might find ethically unacceptable, a former Google whistleblower has said.

In 2018, Jack Poulson hit the headlines after resigning from Google over the company's (now abandoned) plan to build a censored search engine for the Chinese market. Now, he wants to make sure other tech workers can fight for the same causes without putting their livelihoods in jeopardy.

Poulson has launched Tech Inquiry, a non-profit organisation that aims to make it easier for conscientious developers to speak up within their companies when they feel ethical boundaries are being crossed. Just as importantly, he is pushing for greater transparency, to keep workers from being quietly steered into work they would never do voluntarily.

"I think tech workers need informed consent about when their work could result in loss of life or the removal of human rights or fundamental freedoms," Poulson told the Guardian in London, where he was due to speak on Saturday at the Open Rights Group's annual conference on democracy. "How can we help tech workers who have seen something go wrong? How can we ensure that they have a trusted avenue, a place to reach out, that isn't necessarily going straight to a journalist?"

Tech Inquiry aims to harness the growing wave of employee discontent that has spread through Silicon Valley, giving workers the tools they need to confront their leaders and the information they need to know when such pushback is warranted. Poulson's stand against Project Dragonfly, Google's effort to build a censored search engine to re-enter the Chinese market, is just one example of this movement. He also cites unrest within Google over Project Maven, a Pentagon artificial intelligence project launched in April 2017 from which the company later withdrew, as well as similar episodes at other companies including Microsoft, Amazon and Intel.

Poulson's key goal is to extend this power to workers during the design and development phases, where work is often done in the greatest secrecy and where the public is least able to notice that a project is moving in a dangerous or alarming direction. "Internally, and especially in a research department like this, you get the impression that you have a free pass until you're ready to press the button to launch it."

When Dragonfly was revealed to the public, for example, "the response of [Google's chief executive] Sundar Pichai was to argue that there should be no accountability because it was 'an exploratory project' … And it was not specific to Dragonfly. I encountered the same thing at YouTube, on a project using conversational recommendation engines."

Adults tend not to search conversationally, says Poulson, but to type more structured queries: rather than searching for "how do I tie a bowtie", they may simply search for "how to make a bow tie". Because of this, YouTube needed a dataset of more natural-sounding queries, and opted for children's.

"One of the project leaders explained that we would use a dataset from YouTube drawn from a younger audience, because they use more natural language. I spoke up and said, 'We can't use children's data,' and the answer was, 'It's OK, their parents agreed.' I talked to co-workers and they said, 'Don't worry, when we launch the product, we'll have a thorough privacy review.' When you do R&D, there is this idea that you can save time and let the privacy team deal with it later."

Giving "experiments" a free pass on ethics is dangerous, says Poulson. First, by the time a project nears completion, "you have created a whole team whose career investment is in launching it. On top of that, you have normalised the behaviour for several engineers. There is something fundamental here: the first time you see your company doing something questionable is when you get to speak up. After that, a switch flips for you and you accept it."
