Ramon Amaro discusses his new book with Bob Mills
23 June 2021
Dr Ramon Amaro, who joined the Department last year, speaks to Bob Mills about his research and forthcoming book 'The Black Technical Object'.
Hi Ramon. Can you tell us a bit about yourself and your academic journey?
I’d describe that journey as particularly non-traditional. I started out as a programme design engineer and worked for a long time in that industry. Then I went into engineering policy and became a project manager, which made me realise the limitations of programme design and led me to think more deeply about the human impact on those processes. And that’s when I decided to go back to graduate school.
First, I did a Master's in Sociological Research, which led me to explore how to qualify or even quantify black lived experience. For me this question ran deeper than simply seeing race as a construct or as something that's objective. I wanted to think about how race enacts itself in the way a person sees themselves as being in the world. This led to my doctoral research, where I combined thinking about engineering and advanced computational systems with thinking about black phenomenology, or lived experience. The Centre for Cultural Studies at Goldsmiths, where I was based for the PhD, was an important space for this kind of disciplinary collision.
Your book, The Black Technical Object: On Machine Learning and the Aspiration of Black Being, comes out later this year. First off, could you explain ‘machine learning’ to our readers?
Essentially, machine learning is a computer programming system designed for taxonomic patterning. It makes use of algorithms, which are step-by-step instructions for mathematical functions that process data to produce a particular, usually predictive, output. The technology effectively guesses what the outcome might be; a human then intervenes and considers whether the guess is appropriate. That judgment is fed back into the system, and that's how the technology learns.
A good example is the music app Spotify. Spotify is essentially just a random collection of data: millions of songs, genres etc. An algorithm is used to organise this data, for example by genre – ‘pop’, ‘jazz’ etc. But machine learning really gains its power when the user starts intervening. If, for instance, Spotify plays a Taylor Swift song and the user indicates that they like that song, the algorithm begins to learn that they’re a Swift fan. Spotify then correlates this information with other data (e.g. music similar to Swift’s) and begins to recommend other songs on this basis. But if you skip a Taylor Swift song quickly, it learns that you’re not a fan. So the app begins to home in very specifically on your tastes and needs. And this is very powerful because the same algorithm and catalogue of music can be used to produce highly individualised experiences.
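The feedback loop described above can be sketched in a few lines of code. This is a deliberately minimal, hypothetical illustration: the catalogue, the `TasteModel` class, and the like/skip scoring are all invented for this example and bear no relation to Spotify's actual systems.

```python
from collections import defaultdict

# Toy catalogue mapping songs to genres. Invented for illustration only;
# this is not Spotify's data model or recommendation algorithm.
CATALOGUE = {
    "song_a": "pop",
    "song_b": "pop",
    "song_c": "jazz",
    "song_d": "jazz",
}

class TasteModel:
    """Learns a per-user preference score for each genre from feedback."""

    def __init__(self):
        self.scores = defaultdict(float)  # genre -> learned preference

    def feedback(self, song, liked):
        # The human-intervention step: a like raises the genre's score,
        # a quick skip lowers it. Feeding this back is how the model 'learns'.
        genre = CATALOGUE[song]
        self.scores[genre] += 1.0 if liked else -1.0

    def recommend(self):
        # The predictive step: guess the user's preferred genre and
        # suggest songs from it.
        best_genre = max(CATALOGUE.values(), key=lambda g: self.scores[g])
        return [s for s, g in CATALOGUE.items() if g == best_genre]

model = TasteModel()
model.feedback("song_a", liked=True)   # user likes a pop song
model.feedback("song_c", liked=False)  # user skips a jazz song
print(model.recommend())               # pop songs are now recommended
```

The key point of the sketch is the last three lines: the same algorithm and the same catalogue produce different recommendations for each user, because what is learned is the individual history of likes and skips.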
But machine learning can also make decisions that reproduce prejudice. Can you explain how, in the book, you connect algorithmic bias with a deeper history of bias?
The Black Technical Object begins with a very practical analysis of machine learning. Throughout my research I found it very difficult to distinguish between machine learning and statistical analysis more broadly, because each uses the other’s methods. I started also to investigate the genealogy of statistical analysis and in so doing uncovered a long history of data being used to impose racial hierarchies, or for the purposes of human sorting or population control. By tracing this history, I hope to bring to light that what we’re witnessing today as ‘machine learning bias’ isn’t new as such. Rather, it’s part of a pervasive logic of statistical sorting that makes use of contemporary technologies to enact its aims, be they colonial, eugenic or some other mechanism.
I want to avoid the idea that we simply take for granted that something exists called 'algorithmic bias' and that, if we just clean up the data or change the algorithm, the bias will go away. Instead, I show that this bias is steeped even within the history of the mathematics itself, and so requires a greater intervention.
The book’s subtitle refers to the aspiration of black being. Could you say more about this idea and how it connects with your argument about machine learning?
This is an important issue for the second part of the book. The identification of people of difference through statistical analysis has a very explicit alienating effect which, for me, is psychosocial. This history has resulted in a negating type of embodiment, where a person feels alienated from their own representation through technology, which produces a type of neurosis.
One response to this is a desire for increased representation. If a minoritized person encounters machine learning bias, there’s been a tendency to react by attempting simply to insert difference into the algorithm. But while this may seem satisfying, it’s just replicating this idea that you negate your own position in favour of the ‘other’ which the algorithm has essentially recognised you as. I use a primary case study to show that this desire for recognition disrupts any aspiration just to be. As black beings on this earth, you’re already in the position of living in the world through duress but the book explores how the black being can also resist this negating experience and instead focus on that which is aspirational. Here I draw especially on Frantz Fanon’s work to think about the idea of the black object.
The latter part of the book is quite experimental. Here I bring together a practical analysis of machine learning with a theoretical analysis of black negation and ask what needs to be altered to fully aspire. We don’t have to get rid of processes such as machine learning. Rather, we can live through such processes and enact another sense of being. This is where, for me, art comes in.
Could you say more about how you locate your research on machine learning within a history of art and visual culture?
Around 2012, when I began this research, there were very few conversations about this type of racial incursion specifically related to machine learning. Of course, we see it everywhere today, but back then the art world was a context where I could experiment with these ideas and where audiences were interested in them. Some artists within that world had a prescient view of where things were going and were very much at the forefront of the debate. The more I interacted with those communities, the more I saw the value of the history and practice of art, and it grew from there, to the point where I now see the technologies and art as being inseparable. Indeed, today one of the places for experimentation by big tech companies is art. Questions such as 'what is beauty or taste?', or 'what is form?' are seen as being the last frontiers for the algorithmic structure.
You were appointed as a lecturer in art and visual cultures of the global South. Could you say more about the relationship between your research and this category of the ‘global South’?
European canons take for granted their location within a very specific cultural milieu. But what's fascinating about visual cultures of the global South is their refusal to take their own historicity and lived experience for granted. Such work tends to be highly experimental and experiential, a challenge to multiple audiences. People can connect to it because it's so affectual and challenges everyone's perceptual matrix. Many artists who originate from the global South, or for whom it is their subject matter, take us on a journey through that experience and confront us with it. That's exciting for me.
Finally, where next with your research?
In many ways this book, The Black Technical Object, is a primer for the next, which will mainly be a book on art. I want to think there about the relationship between black abstraction and machine abstraction. How can we learn the lessons from abstraction in visual art—lessons that big tech companies are also attempting to learn—to think about a new type of world making?