This review previously ran in April to coincide with a special online release.

AI is becoming more and more ubiquitous in today's society, particularly as large companies like Netflix, Amazon, Facebook, and Spotify continually deploy AI-driven features that interact with consumers every day, often behind the scenes. Machine learning (ML) is a potential AI solution, but we need to define the problem before prescribing that solution. Only through proper positioning against Netflix's core business problem did these ideas become the reality they are today. Netflix, for instance, ended up presenting thumbnails that matched a user's ethnicity. The algorithm (arguably) made misleading thumbnail recommendations featuring supporting Black actors and actresses who didn't really represent what a movie was about, yet those thumbnails did see higher click rates among certain ethnic audiences. Does that mean there is a wrong way? We've seen the limitations of algorithms that "overdo it": the Netflix algorithm presented misleading thumbnails to people of color because it optimized for clicks, effectively tricking users into clicking bait.

A documentary film is one genuine effort to draw these connections. Take, for example, the film's discussion of superintelligence, the theory that animates many apocalyptic AI scenarios. And when an algorithm developed by Google to filter online comments gives the statement "I am a gay black woman" a toxicity rating of 87 percent, even the most bombastic documentary makers should be able to express why poorly applied AI is worrying.
As Microsoft AI researcher Timnit Gebru pointed out on Twitter, this is genuinely a "difficult feat to achieve," considering the gender diversity in the field. Throughout the documentary, women and children are interviewed on the street, and their relaxed, informal comments, like "Oh my god I trust my computer so much," are consistently contrasted with the assured expertise of men. Paine, who previously directed the well-received 2006 documentary Who Killed the Electric Car?, is trying to give an ambitious overview of the threat and potential of AI, but he does so in the same way that satellite imagery provides a "good overview" of where you left your car keys: there's just not enough detail to be useful. It's like ending a documentary on violence in cities by saying, "Forget about muggings, your neighbor could be making a nuclear bomb in their garage right now!" That may be technically true, but it's not particularly helpful. In Do You Trust This Computer?, every human being is just a user: your feelings and reactions are being steered by systems that keep an eye on where you are and what kind of people are around you.

Now that we know how Netflix turns images into numbers in a machine learning model, what insights has Netflix found from all the data processing and A/B tests it has conducted over so many years? What if Netflix created a custom thumbnail for each user, optimized to increase clicks? While not perfect, Netflix's algorithms suggest that this level of personalization based on user-profile characteristics increases click-through rates. Note, though, that such a "feature" can actually reduce viewership, since negative reviews discourage users from trying out a video. If a pilot MVP version showed that users who engaged with this new feature stayed longer, came back more often, or helped drive more word of mouth about Netflix, it could warrant further resources.
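To make the personalization idea concrete, here is a minimal sketch of per-user thumbnail selection framed as a click-through-rate optimization problem, using a simple epsilon-greedy bandit. The class name, thumbnail IDs, and parameters are all hypothetical illustrations; Netflix's actual production system is far more sophisticated than this.

```python
import random

class ThumbnailBandit:
    """Toy epsilon-greedy chooser: pick the thumbnail with the best observed CTR."""

    def __init__(self, thumbnail_ids, epsilon=0.1):
        self.epsilon = epsilon  # probability of exploring a random thumbnail
        self.clicks = {t: 0 for t in thumbnail_ids}
        self.shows = {t: 0 for t in thumbnail_ids}

    def ctr(self, thumbnail_id):
        # Observed click-through rate; 0.0 before any impressions.
        shows = self.shows[thumbnail_id]
        return self.clicks[thumbnail_id] / shows if shows else 0.0

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.shows))  # explore a random option
        return max(self.shows, key=self.ctr)        # exploit best CTR so far

    def record(self, thumbnail_id, clicked):
        self.shows[thumbnail_id] += 1
        if clicked:
            self.clicks[thumbnail_id] += 1
```

The sketch only captures the explore/exploit trade-off in miniature; in practice this would be run per user segment and validated with large-scale A/B tests, as described above.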
And the most alarming disclosure is that these AI machines are no longer under anyone's control.

We'll answer the second question further below. But once Netflix annotates each thumbnail and assigns metadata describing what's in it, we have a numeric representation of that unstructured data. Why does the thumbnail matter so much? It turns out that back in 2014, Netflix conducted studies showing just how important it is: Nick Nelson, Netflix's global manager of creative services, explained that research in early 2014 found artwork was "not only the biggest influencer" in a user's decision about what to watch; it also accounted for over 82 percent of their focus while browsing Netflix. That was their hypothesis: adjusting the artistic content of a thumbnail image could have a strong link to viewership. Yes, it would also be a pretty compelling use case to leverage natural language processing (NLP) to understand your post-episode commentary in context. This is yet another example of a business need superseding a popular user need!
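The step of turning annotated thumbnails into numbers can be sketched as a simple multi-hot encoding over annotation tags. The tag vocabulary and example tags below are purely illustrative; Netflix's actual metadata schema is not public.

```python
# Illustrative tag vocabulary (an assumption, not Netflix's real schema).
TAG_VOCAB = ["face", "smiling", "villain", "landscape", "text-overlay", "action"]

def encode_thumbnail(tags):
    """Multi-hot encode a thumbnail's annotation tags into a numeric feature vector."""
    tag_set = set(tags)
    return [1 if tag in tag_set else 0 for tag in TAG_VOCAB]

features = encode_thumbnail(["face", "smiling", "text-overlay"])
print(features)  # [1, 1, 0, 0, 1, 0]
```

Once every thumbnail is a vector like this, it can be joined with click and viewing data and fed into the kind of models and A/B tests discussed above.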