Algorithms are Gods
Welcome to new subscribers. For those just joining the party, The Generalist exists to help the curious go beyond the headlines and understand how technology is changing the world. This week you'll learn about Sergey Brin's view of god, pangolins' big break, the "Mad Dog" forced into retirement, Sam Altman's move to monetize, and Bill Gates' video game. Thank you for being here.
Editor's letter: The case for algorithmic affirmative action
What's the worst way to begin a newsletter?
Invoking a Latin phrase is surely up there. And yet here we are, staring at three fusty words in a dead tongue:
Deus ex machina.
It's a phrase I have thought a lot about this week, "god in the machine." Originating in Ancient Greece, the term was first used to describe the work of the playwright Aeschylus, later that of Euripides. It described a dramatic device in which the main action of a play is resolved by calling on a deity. In Medea, for example, the titular character is rescued by Helios, the sun-god, who appears apropos of nothing to whisk Medea to safety. To give the entrance of these almighties their due, productions usually leveraged "machines" of some sort — cranes, or trapdoors. Hence the name, and more truthful translation, "god from the machine."
Over the past two decades, the deus ex machina label has come to fit the tech industry better than the dramatic arts. The ambition of iconic founders has been to embed the divine in lines of code, to put god in their machines. Uber, notoriously, built "God View," which gave viewers total visibility into the movement of passengers in a given city, while Sergey Brin, co-founder of Google, once said, "[T]he perfect search engine would be like the mind of God."
For twenty years or so, that may have been the goal. But an apotheosis has occurred. We once dreamt of building god in the machine. Today, the machine is god. Our lives are ruled by invisible algorithms that decide what we buy and read and watch, what healthcare we are given, what job offers we receive, whether or not we go to prison. Amazon's recommendation algorithm is said to influence a third of our decisions on the platform; Netflix's drives 80% of what we watch. These are minor examples, but as Francis Bacon noted, "God hangs the greatest weights upon the smallest wires."
These gods are, of course, nothing like the one Bacon would have recognized. Unlike man, supposedly made in god's image, these deities have been made in ours. Which is to say that they are unjust and unfair, prejudiced against those most in need. Last week, we alighted on this subject to note that technology can be better used to correct racial imbalances: we should begin with algorithmic affirmative action.
Women and minorities endure systemic biases from technology. That may not surprise, but it should still sting.
COMPAS, a system used in courtrooms across the country to predict the likelihood of an individual committing further crimes, was shown to be biased against African-Americans. These "risk assessments" informed bond amounts and decisions about which individuals were set free.
An algorithm used by healthcare systems serving millions of patients graded black patients as lower risk than white patients with similar ailments. The result was that black patients received less than half the additional care they would have received had they been white.
A fintech platform offering mortgages charges African-American and Latinx borrowers 5.3 basis points more in interest than white applicants. Refinancing an existing mortgage costs these borrowers an extra 2.0 basis points.
So: money, health, freedom. Each fundamentally altered by the decision-making of machine learning models. While in some rare cases, prejudiced algorithms may be the product of some rogue, race-baiting developer, the more common truth is less vicious, harder to solve. More often than not, algorithms suffer because of the data on which they rely. Or as the Yale Law Journal summarizes, "Bias In, Bias Out." The world is biased — how could algorithmic reflections of it be any different?
As with anything of sufficient complexity, solutions are not simple. Some have suggested unveiling the workings of each "black-box," laying bare the cogs and wheels by which decisions are made. That way lies madness. Not only would it require trade secrets to be revealed, it would also allow bad actors to game the system more effectively. If Google were to expose the inner workings of its page ranking algo, it would become much easier to spam, potentially exacerbating the problem.
A better place to start may be in reframing what we expect from these systems, what we consider fair. A change in mindset is required: while most algorithms are "procedurally" fair (which is to say, they apply the same rules to everyone, irrespective of the outcome), they are rarely "substantively" fair. That is, whatever the methodology used, they do not produce just results.
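To make the distinction concrete, here is a minimal, purely illustrative sketch in Python. The loan decisions and group labels below are hypothetical, and real fairness audits are far more involved; the point is simply that a substantive check examines outcomes by group rather than asking whether the same procedure was applied to everyone.

```python
# Hypothetical example: a substantive-fairness check looks at outcomes by group.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates):
    """Ratio of the lowest to the highest group approval rate.
    Values well below 1.0 point to unjust outcomes, even if the
    decision procedure is identical for every applicant."""
    return min(rates.values()) / max(rates.values())

# Hypothetical loan decisions: (applicant group, approved?)
decisions = [("group_a", True), ("group_a", True), ("group_a", False),
             ("group_b", True), ("group_b", False), ("group_b", False)]
rates = approval_rates(decisions)
print(rates)                    # roughly {'group_a': 0.67, 'group_b': 0.33}
print(disparate_impact(rates))  # 0.5, a large outcome gap despite one shared procedure
```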
How do we make this shift? How do we ensure that even in the digital dominion, the moral universe bends towards justice?
Above all, it may be a question of paying attention to the inputs. Just as a "Prince who is not himself wise cannot be well-advised" (Machiavelli), so too an algorithm fed incomplete data cannot help but be flawed. Recent research from MIT has shown how increasing the diversity of training data can lead to less biased outcomes. The approach determines how much more data is needed from a particular group to improve the accuracy of predictions for that population. For example, the system might suggest adding data from Asian-Americans to better predict mortality rates for Asian-Americans in an ICU environment. Crucially, augmenting the data set in this way does nothing to diminish the efficacy of predictions for other groups.
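As a toy illustration of that idea (not the MIT team's actual method), one might measure a model's error separately for each subgroup and flag the worst-served groups as candidates for additional data collection. The group names, predictions, and tolerance below are all hypothetical.

```python
# Hypothetical sketch: find the subgroups a model serves worst, so more data
# can be gathered for them. Not the MIT method, just the underlying intuition.

def per_group_error(records):
    """records: list of (group, prediction, actual) -> error rate per group."""
    totals, wrong = {}, {}
    for group, pred, actual in records:
        totals[group] = totals.get(group, 0) + 1
        wrong[group] = wrong.get(group, 0) + int(pred != actual)
    return {g: wrong[g] / totals[g] for g in totals}

def groups_needing_data(errors, tolerance=0.05):
    """Flag groups whose error exceeds the best-served group's by more than `tolerance`."""
    best = min(errors.values())
    return [g for g, err in errors.items() if err - best > tolerance]

# Hypothetical held-out predictions: (group, predicted label, actual label)
records = [("group_1", 1, 1), ("group_1", 0, 0), ("group_1", 1, 1),
           ("group_2", 1, 0), ("group_2", 0, 0), ("group_2", 0, 1)]
errors = per_group_error(records)
print(errors)                       # roughly {'group_1': 0.0, 'group_2': 0.67}
print(groups_needing_data(errors))  # ['group_2'], the candidate for more data
```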
For all of its idealism and professions of tolerance, tech is unlikely to make such changes unbidden. Real regulation will need to be enacted. The call to form a sort of "FDA for algorithms," designed to evaluate the costs and benefits of different models, is an interesting suggestion, as is the meat of the "Algorithmic Accountability Act," which is still under consideration by Congress. The bill would require companies to "study and fix flawed computer algorithms."
The challenge in all of this will be securing justice without neutering innovation. In that endeavor, the private sector may be invaluable. Will data-collection companies emerge to serve different ethnic and affinity groups? Could a data-munger create proportionate data sets by nation, city, county to ensure adequate representation? Can "prejudice-detection" systems be used to alert companies of algorithmic bias? Asking for more, asking for better does not need to stifle invention.
A phrase is inscribed on the gravestone of the famed psychoanalyst Carl Jung. Latin, of course, though we need not dwell on those old words. In English, it reads, "Summoned or unsummoned, god is present." Whether we like it or not, our world is governed by an invisible, nearly omnipotent power or set of powers. It is up to us to ensure that we create deities of which we can be proud, godly not only in their might but in their mercy.
In other news...
A previous Editor's Letter explored the acquisitions big tech may make in the coming months. A version of that piece was published in Marker this week.
The S-1 Club released a new edition on Tuesday. Tanay Jaipuria, Tom Guthrie, Jon Hale, an anonymous contributor, and yours truly, covered car e-commerce platform Vroom, which made its public bow this week. After debuting, VRM rose over 100%. Quite a ride. I'll be announcing the next group of contributors on Tuesday. If you haven't had a chance yet, enter your predictions for Vroom in the Typeform here. The most accurate prediction will win the much-coveted "Golden Graham" prize.
Finally, the second edition of RFS 100 was shared on Friday. It featured ideas from investors and operators at Zetta Venture Partners, Gelt VC, Faire, Haystack, Reshape, Firstminute, and others. This was the last edition sent out to The Generalist's subscriber base.
Overheard
"Each generation must discover its mission, fulfill it or betray it, in relative opacity." — Franz Fanon, The Wretched of the Earth
Good news
Virus-free
A crowd of 20K filed into Forsyth Barr Stadium yesterday as New Zealand celebrated 20 days with no new coronavirus cases. To date only 22 Covid-19-related deaths have been recorded in the country. The Super Rugby match finished 28-27 with the Otago Highlanders besting the Waikato Chiefs in the dying minutes.
Pangolin protections
As covered in a previous edition, pangolins are the most-trafficked mammal outside of humans. This week saw the creatures get a boost as China categorized them as Class 1 protected animals under the country's wildlife protection law, making domestic trade illegal. Adding to the good news, the government excluded pangolin scales from the list of approved ingredients for traditional medicine.
Long tail
"Too dangerous to release"
That was how OpenAI once described its text-generation model. Just over a year later, the ship steered by Sam Altman is rolling out "the API," a commercialized version of the GPT-3 program. The cloud service is designed to help companies better comprehend human language and is already serving Google and Algolia. OpenAI's service may also be used to power chatbots. Like many such bots, GPT-3 is no stranger to racist and sexist monologues, responding to a prompt from Wired by riffing, "We are being beaten and raped…vast immigration started in the '60s and never stopped." Lovely.
Machine learning models are being applied to African satellite imagery to provide a clearer picture of citizen consumption. More here
Hackers hijacked machine learning clusters on Microsoft's Azure platform, using them to mine cryptocurrency for free. More here
The superpower of speaking Japanese
Despite limited testing and social distancing, Covid-19 remains relatively rare in Japan, with just 0.5 cases per 100K people. Strangely enough, language may have something to do with it. Unlike English or Chinese, speaking Japanese requires relatively little inhalation and exhalation, potentially reducing the opportunities for pathogens to spread. Though it may sound far-fetched, a study from the 1960s showed that singing — which requires more breaths in and out — produced six times as many airborne droplets as talking. All to say, the structure of language may play some role in transmission.
University students specializing in language studies often spend their final year abroad. Many are preparing for virtual immersion. More here
Though explicit racism has declined in America, implicit bias is alive and well in our words. More here
The "Middle Kingdom," moderated
More than 170K accounts were kicked off Twitter this week. Revealed to be a concerted pro-China influence operation based in the People's Republic, the accounts primarily espoused messaging favorable to the Communist Party. Of the 170K, ~23K were classified as highly active, with the remainder serving as "amplifier accounts." Roughly 1K Russian-linked accounts were also booted.
An electric-battery manufacturer in southeastern China believes its newest power pack will run for 1.24 million miles, a new record. More here
At the Chinese government's behest, Zoom shut down the accounts of activists commemorating the Tiananmen Square massacre. More here
A cluster of coronavirus cases has been discovered around a meat and vegetable market in Beijing, prompting immediate action. More here
Deep breathing
Some front-line medical workers in the UK have been given access to a new virtual reality game designed to reduce anxiety. "DEEP" allows users to swim through a digital world, with their speed controlled by breathing. The deeper a player breathes, the further they swim. So far, it has garnered positive feedback, illuminating how gaming can be used to treat stress and other mental health issues.
AT&T is considering selling off its Warner Bros. gaming division for $4B. EA, Blizzard, and others are reportedly interested in the unit which owns titles like "Mortal Kombat." More here
Jian Zihao, one of China's most famous e-sports athletes, is retiring, citing ill health. Despite being just 23, "Mad Dog," as he is known, described facing mental health issues and receiving a type-2 diabetes diagnosis. More here
In 1981, Bill Gates created a video game called "Donkey." You can now play it on your phone. More here
Jobs
VC Analyst + Intern - AlleyCorp - NYC - Apply here
VC Intern - HOF - NYC - Apply here
Chief of Staff - Tap Tap Send - NYC - Apply here
VC Head - New Media - NYC/SF - Apply here
VC Investor - Emergence - SF - Apply here
VC Associate - Maveron - SF - Apply here
AI VC - Point72 - SF - Apply here
VC Analyst - Meadow - SF - Apply here
Chief of Staff - AppDynamics - SF - Apply here
Investment Intern - DocSend - Remote - Apply here
Puzzler
I fly like a bird over land and sea. I am white but have no feathers. If you hold me, I will die. What am I?
As always, clues are given to those curious enough to message. And all guesses are welcomed, no matter how speculative.
A new winner sped to victory this week, as Mathias P pulled ahead of the peloton to claim The Generalist's version of the maillot jaune. Just over the next hill and hot on his trail were NBT, Justin A, Santhosh N, Monica V, Abhinav Sapru, Sam P, Jimmy S, Jeremy D, Aya M, Violette Z, Martin S, Vinish G, SEV, NMT, and Eli K. All impressively answered last week's riddle, below.
I am a word of six. My first three letters refer to an automobile, my last three letters refer to a household animal. My first four letters make a fish, in my entirety, I can be found on your floor. What am I?
The answer? Carpet, which succeeds in referencing an automobile (car), a household animal (pet), a fish (carp), and a floor covering (carpet!). Congratulations to all. Impressively adept word-wrangling.
Thank you for reading. Did you enjoy today's edition? If so, you might consider sending it to the friend most likely to retweet a Russian bot account. Wishing you a restful Sunday from a cabin in upstate New York. It's a brisk and sunny morning here.