
Google Crusades for ‘Fairness’

Google appears to prefer proactively shaping reality to faithfully reflecting it.

James O’Keefe’s Project Veritas operates on the ethical periphery, and to some degree, it has to. In its efforts to infiltrate and expose the world of clandestine bias buried in left-wing institutions, it has played with proverbial fire, emerging both burned (most notably by trying and failing to sell a Washington Post reporter on a fabricated sexual-assault allegation) and as kindling in the immolation of institutions begging to be burned. It’s a polarizing organization, in that way, and its videos and leaks ought to be evaluated on their own merits.

Project Veritas recently released clips of a conversation it recorded between a Veritas operative and Google’s Head of Responsible Innovation, Jen Gennai. The content of that conversation, combined with internal memos released and explicated by an anonymous whistleblower at the company, was meant to demonstrate a pervasive and insidious left-wing bias at Google. How insidious that bias is remains an open question, but if O’Keefe’s footage and documents are to be believed, there are certainly people at the company promoting intersectional and other critical theories designed to influence algorithms and search outcomes.

Take, for instance, the so-called “Machine Learning Fairness” algorithms used by Google, designed to avoid producing results that are, as an internal memo describes, facilitated by the “unjust or prejudicial treatment of people that is related to sensitive characteristics, such as race, income, sexual orientation or gender through algorithmic systems or algorithmically aided decision-making.” Google calls this phenomenon “algorithmic unfairness,” which sounds benign enough; later in the document, however, Google expounds upon precisely what this means in practice.

When a search result “is factually accurate” — or, in other words, when the company’s search algorithm delivers an accurate and precise representation of the world as it is — Google insists that this can “still be algorithmic unfairness.”

The memo lays out an example of this phenomenon: “Imagine that a Google image query for CEOs shows predominantly men. Even if it were a factually accurate representation of the world, it would still be algorithmic unfairness.”

If this memo is genuine — and it certainly comports with the spirit of Google’s publicly available summary of its Machine Learning Fairness philosophy — then the company is indeed more interested in proactively shaping reality than faithfully reflecting it. This wishful reconstruction of reality is the charge of activists, writers, and the creative class, not the world’s largest search engine and information company. A search engine that dabbles in sanitizing basic realities that are inherently political — in a way distinct from filtering out violent or pornographic material — is abdicating something essential about what a search engine is for. If it isn’t charged with faithfully reflecting reality as it is, a search engine becomes little more than a canvas for the biases of its programmers.

And if the attitudes of some Google executives are representative of those biases, that canvas might well be imbued with political assumptions.

The clips of Project Veritas’s surreptitiously recorded conversation with Google executive Jen Gennai are presented in a vacuum, and, for her part, Gennai insists her remarks were “taken out of context.” That’s possible. But her remarks do, to some extent, stand on their own as indicators of her political views and the effect those views have on her approach to fostering “Responsible Innovation.”

Discussing what “fairness” means to her, Gennai insists that her “definition of fairness and bias specifically talks about historically marginalized communities; and that’s who I care about. Communities who are in power and have traditionally been in power are not who I’m solving fairness for.” This comports quite well with what appear to be the philosophical bases of the “Machine Learning Fairness” algorithm. If these documents and conversations are to be believed, Google is intentional in its desire to avoid further perpetuating the influence of those “communities” that “have traditionally been in power.”

It would seem rather important, then, to be in the good graces of Jen Gennai as she chooses what communities to “[solve] fairness for.”

Later in the conversation, Gennai described the pressure Google feels to “fill the gap of what should be done” if “the White House and Congress won’t play a role in making things more fair.” The context of these remarks is not provided, and it’s easy to read too much into what she’s saying here.

That said, to whatever extent it is the rightful prerogative of “the White House and Congress” to make “things more fair” (and I cannot conceive of a more loaded phrase), Google presumes to take on the awesome responsibility of acting as an extra-governmental facilitator of “fairness.”

Whatever the flaws of James O’Keefe and Project Veritas — and there are many — most of the statements Gennai makes in that video are revealing in themselves. If Google thinks itself in the business of discarding “factually accurate representations” of reality in its search results and insisting, to quote an internal document, upon contemplating “how we might help society reach a more fair and equitable state,” it has become something other than a dispassionate search engine.

But maybe that’s the point.
