Google used to have a simple motto: Don’t be evil. Now, with the firing of a data scientist whose job was to identify and mitigate the harm that the company’s technology could do, it has yet again demonstrated how far it has strayed from that laudable goal.
Timnit Gebru was one of Google’s brightest stars, part of a group hired to work on accountability and ethics in artificial intelligence — that is, to develop fairness guidelines for the algorithms that increasingly decide everything from who gets credit to who goes to prison. She was a lead researcher on the Gender Shades project, which demonstrated the flaws of facial recognition software developed by IBM, Amazon, Microsoft and others, particularly in identifying the faces of dark-skinned women. As far as I can ascertain, she was fired for doing her job: specifically, for critically assessing models that allow computers to converse with people — an area in which Google is active.
Full disclosure: It’s hard for me to untangle my opinion on this from my personal and professional loyalties. I’m not acquainted with Gebru, but we have quite a few friends in common and I’ve admired her work for some time. I signed a letter supporting her. I also run an independent company that specializes in auditing algorithms for bias, so I have an interest in getting big tech firms to use my services rather than do their vetting in-house.
All that said, I genuinely believe that Gebru’s story illustrates a broader issue: You can’t trust companies to check their own work, particularly where the result might conflict with their financial interests. My favorite example is Theranos, which insisted that its research into a novel blood test was so amazing and valuable that it couldn’t be shown to outsiders — until it proved to be a dangerous fraud. The warning applies no less to tech companies such as Google, IBM, Microsoft and Facebook, which have created internal ethics groups and external tools in an effort to display responsibility and keep their algorithms unregulated.
I’ll admit that to some extent, I envy the people who work on the accountability teams. They have fascinating jobs, with access to tons of data that they’d never be able to play with in academia. At the same time, though, they have little or no influence to push their employers to actually implement the fairness frameworks that they so carefully develop. Their scientific papers are often heavily edited or even censored, as I learned when I once tried to co-author one (I quit the project).
I often wondered about Gebru and others working at Google: How could they stand the bureaucracy, or express their very real concerns in that environment? As it turns out, they couldn’t.
Gebru, along with co-authors from academia as well as Google, was trying to get the company’s approval to submit a paper on some unintended consequences of large language models. One problem is that their energy consumption and carbon footprint have been rapidly expanding along with their use of computing power. Another is that, after ingesting a large chunk of the entire history of all written text, they’re troublingly likely to use nasty, racist or otherwise inappropriate language.
The findings, while perfectly good and interesting, were not particularly new. Which makes it all the more bizarre that someone higher up at Google decided, with no explanation, that Gebru had to back out of publishing the paper. When she demanded to know what the actual complaints were so she could address them, she was fired (with her boss informed only after the fact).
Aside from making the paper go viral, the incident offered a shocking indication of how little Google can tolerate even mild pushback, and how easily it can shed all pretense of scientific independence. The fact that Gebru was one of the company’s only Black female researchers makes it a particularly egregious example of punching down in the same old tired way.
Embarrassing as this episode should be for Google — the company’s CEO has apologized — I’m hoping policy makers grasp the larger lesson. The artificial intelligence that plays a growing role in our lives requires outside scrutiny, from people who have the proper incentives to be independent and the power to compel meaningful reform. Otherwise, algorithms will be doomed to repeat and amplify the flaws of the humans who made them. - Bloomberg