Data Streams

Hito Steyerl and Kate Crawford

The New Inquiry

2017-01-23

“We are being seen with ever greater resolution, even while the systems around us increasingly disappear into the background.”

On November 7, 2016, the day before the US presidential election, the New Inquiry recorded a Skype conversation between artist and writer Hito Steyerl and academic and writer Kate Crawford. The two discussed NSA bros, dataveillance, apex predators, AI, and empathy machines. This is the first in a two-part series. The second part of the conversation, which takes place after the election, will be published in February.

HITO STEYERL. I used a couple of their metaphors, namely, the “sea of data” they described–which you also used, Kate, in your essay for the Whitney–to try to think through the sinking feeling of basically being surrounded by this data.

KATE CRAWFORD. What I loved–going back to our panel at the Whitney and thinking about your “Sea of Data” piece–is that we both hit on these singular images about the limits of knowing.

HITO STEYERL. I’m really fascinated by quantifying social interaction and this idea of abstracting every kind of social interaction by citizens or human beings into just a single number; this could be a threat score, it could be a credit score, it could be an artist ranking score, which is something I’m subjected to all the time.
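
To make that reduction concrete, here is a minimal hypothetical sketch in Python, not drawn from any real scoring system: a handful of invented behavioral signals collapsed into one number by weights that are themselves arbitrary design choices.

```python
# Hypothetical single-number "score": every signal name and weight below is
# invented for illustration and does not come from any real scoring system.
signals = {"late_payments": 2, "flagged_contacts": 1, "night_activity_hours": 7.5}
weights = {"late_payments": 0.6, "flagged_contacts": 1.2, "night_activity_hours": 0.1}

# Heterogeneous traces of a life reduced to one figure by a weighted sum.
score = sum(weights[k] * value for k, value in signals.items())
print(f"single 'threat score': {score:.2f}")  # 2*0.6 + 1*1.2 + 7.5*0.1 = 3.15
```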

HITO STEYERL. This reminds me of the late 19th century, when a lot of scientific effort was being invested in deciphering hysteria, or so-called “women’s mental diseases.” And there were so many criteria identified for pinning down this mysterious disease. I feel we are kind of back in the era of crude psychologisms, trying to diagnose social, mental, or social-slash-mental illnesses or deficiencies with frankly absurd and unscientific markers.

HITO STEYERL. I think that maybe the source of this is a paradigm shift in the methodology. As far as I understand it, statistics has moved from constructing models and testing them against empirical data to just using the data and letting the patterns somehow emerge from it. This is a methodology based on correlation. They keep repeating that correlation replaces causation. But correlation is entirely based on identifying surface patterns, right? The questions–why are they arising? why do they look the way they look?–are secondary now.
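
As a rough illustration of the correlation-first methodology described here (my own sketch, nothing from the conversation), the snippet below mines pairwise correlations out of pure random noise. With many variables and few observations, strong-looking surface patterns appear even though there is no underlying model and no causation at all.

```python
# Correlation mining on pure noise: "surface patterns" with nothing behind them.
# Sizes are arbitrary and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_vars = 50, 200
data = rng.normal(size=(n_obs, n_vars))   # no model, no causes, just noise

corr = np.corrcoef(data, rowvar=False)    # all pairwise correlations
np.fill_diagonal(corr, 0.0)               # ignore trivial self-correlation

i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"strongest 'pattern': variables {i} and {j}, r = {corr[i, j]:.2f}")
```

Asking why those two variables line up is exactly the question this way of working leaves aside; here the honest answer is that nothing connects them.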

HITO STEYERL. I really love the beginning of the text I was just reading when you called, “Artificial Intelligence Is Hard To See.” As you say, it’s the “weak AI systems” that are currently causing the most severe social fallout. I absolutely agree with this. I call these systems “artificial stupidity” and I think they are already having a major impact on our lives. The main fallout will of course be automation. Automation is already creating major inequality and also social fragmentation–nativist, semi-fascist, and even fascist movements. The more “intelligent” these programs become, the more social fragmentation and polarization will increase. I think a lot of the political turmoil we are already seeing today is due to artificial stupidity.

KATE CRAWFORD. It’s interesting because in my work I’ve been thinking about the rise of the AI super-predator. The narrative that’s being driven by Silicon Valley is that the biggest threat from AI is going to be the creation of a superintelligence that will dominate and subjugate humanity. In reality, the only people who would experience that as a newly emerging threat are the current apex predators themselves. If you are a rich white man who runs a major technology company or a venture capital company in Silicon Valley, the singularity might sound very threatening to you. But to everybody else, those threats are already here. We are already living with systems that are subjugating human labor and particular subsets of the human population more harshly than others.

HITO STEYERL. As people get replaced by systems, one of the few human jobs that seems to remain is security. I just went into a supermarket in Holland and there was literally no cashier left, just a security person. All the female cashiers–it used to be a mainly female job–had been replaced by just one security guy. It was kind of extraordinary.

KATE CRAWFORD. In the long tail of human labor, the last remnant will be security. The security guard will be the last person to leave the building.

HITO STEYERL. It’s like low-grade military.

KATE CRAWFORD. It reminds me of a joke that you told a long time ago when we were talking about the singularity. I was talking about my apex predator theory and you said, “You know, people think that this superintelligence is off in the future, but it’s already here. It’s called neoliberalism.”

HITO STEYERL. But it is, no?

KATE CRAWFORD. I love that joke. It’s my favorite Hito joke.

As I was reading your work again this morning, I was going back to what you wrote about the idea of the comrade, and I was thinking about the value of comradeship. What does it mean to be comrades in this field of machine vision and autonomous agents? Can an idea like that persist in these spaces? Where do you find comradeship?

HITO STEYERL. This is definitely one of the assumptions that’s thoroughly lacking in the development of all of these systems. Today I was reading a very interesting research paper about computer-based social simulations. They made this agent that is quite humanlike in that it is not rational–it has emotions and so on–but any time they let a population of them loose, they start killing one another very quickly. They can’t survive. It’s pure genocide. And I think–I have no idea, but I expect–one of the reasons for that was that they did not manage to formalize empathy and solidarity. There is probably a formula that they use for homophily, or preferences based on affinity. But homophily is not solidarity, and I really wonder how these systems would look if anything like that was introduced. They would probably look very different, but we don’t know. Until then, all these simulations are presented as scientific renditions of human behavior.
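
The paper is not named in the conversation, so what follows is only a hypothetical toy model of the distinction drawn here between homophily and solidarity. Agents share a scarce resource under one of two invented rules: a homophily rule (help only agents from your own group) and something closer to solidarity (help whoever is worst off, regardless of group). The agent design, parameters, and thresholds are all assumptions made for illustration.

```python
# Toy agent-based simulation (hypothetical, not the paper discussed above):
# agents lose energy each round and can share surplus with needy agents.
import random

def simulate(rule, n_agents=40, rounds=100, seed=1):
    rng = random.Random(seed)
    # Each agent has a group tag (the basis for "similarity") and an energy level.
    agents = [{"group": rng.choice("AB"), "energy": 10.0} for _ in range(n_agents)]
    for _ in range(rounds):
        for a in agents:
            if a["energy"] <= 0:                    # exhausted agents sit out
                continue
            a["energy"] -= rng.uniform(0.5, 1.5)    # cost of living
            a["energy"] += rng.choice([0, 0, 3])    # occasional windfall
            needy = [b for b in agents if 0 < b["energy"] < 3 and b is not a]
            if rule == "homophily":                 # help only your own group
                needy = [b for b in needy if b["group"] == a["group"]]
            if needy and a["energy"] > 6:           # share only from surplus
                worst_off = min(needy, key=lambda b: b["energy"])
                a["energy"] -= 2
                worst_off["energy"] += 2
    return sum(a["energy"] > 0 for a in agents)     # count survivors

print("survivors under homophily only:", simulate("homophily"))
print("survivors under solidarity:   ", simulate("solidarity"))
```

Whether the solidarity rule actually keeps more agents alive depends entirely on these arbitrary parameters; the point is only that whatever gets formalized, a similarity preference or something closer to solidarity, determines what the simulation can show.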

