NERO is an international publishing house devoted to art, criticism and contemporary culture. Founded in Rome in 2004, it publishes artists’ books, catalogs, editions and essays.


Sound Artifacts and Lively Data

A podcast by Radio Papesse & ALMARE: What Do Sounds Want – Episode III

You’ve been working late for days, an unmade bed awaits you at home, along with an antisocial roommate you share a bare fridge with: two yogurts gone bad, a jam blooming with mold, a leftover Chinese takeaway from three days ago. 

You know it, but you open the fridge anyway, hoping that a ready-made soup has made its way onto your shelf.

If you had a smart fridge, it would know what you like. 

Listen to “3. Sound artifacts and lively data” on Spreaker.

A smart fridge would have shopped for you online… but your fridge, it just keeps cooling. It takes no initiative. It knows nothing about you.

But what if it did know you? It’s not science fiction.

In 2017, Whirlpool appealed to the United States government to impose duties on its competitors LG and Samsung, guilty of selling smart technology cheaply: the Korean fridges had eaten up the market and were about to eat Whirlpool whole.

LG and Samsung understood, ahead of others, that in a data-driven market you win by keeping costs low: you build a customer base, oil the data stream, and cash in over the long term.
In other words, the fridge costs less because I am prepared to sell the data you provide for free.

It’s a matter of numbers. The logic of accumulation works like this: it turns a refrigerator into a machine that produces, collects and transmits data. And how that data can be used now propels new ways of doing business, governance and surveillance.

In this episode we’re talking about data, how it shapes our understanding of the world and of ourselves, and how sound and voice have been enmeshed in its logic.

The voice era is here: despite a crescendo of concerns, a rising buzz of anxiety about privacy, biometric data, state control, AI taking over our lives and Black Mirror-like scenarios… voice-enabled technology is becoming pervasive. 

Not only do we live in times of ubiquitous listening—where we are surrounded by sound artifacts in every moment—but we also live in times where we are listened to, wall to wall, again and again. Our homes have ears, the devices we surround ourselves with have ears.

Your fridge listens to you. 
And beyond the metaphor, Alexa, just to name a technology you may be familiar with, is listening to you, and she’s also learning to mimic your voice: it is now widely reported that she is even able to replicate the voices of the dead… creepy, no less than eerie…

But, what for? 
Well, data is the answer.
And what is data?

Data is the raw material produced by reducing reality into categories. 
Data is the intentional by-product of a measurement. 
“Without data, you’re just another person with an opinion,” statistician W. Edwards Deming once said.

Today we could argue that collected data also depends on choices and decisions, which in turn are the result of individual stories and therefore, human errors, biases, personal interests, privileges, ideologies.

And again, data does not exist regardless of ideas or technologies, it depends on the context in which it is generated. 

Yes, “generated”, because as Jathan Sadowski writes in When Data is Capital, “data is more than knowledge about the world, it is discrete bits of info that is digitally recorded, machine processable, easily agglomerated and highly mobile.”

He argues that “data is not out there waiting to be discovered as if it already exists in the world, like crude oil. The framing of data as a natural resource that is everywhere and free for the taking, reinforces regimes of data accumulation.” 

That’s the reason why, for instance, he speaks of data manufacturing, or data extraction, instead of data mining: data manufacturing because data is a recorded abstraction of the world, created and valorised by people using technology. 

Data extraction, because it emphasizes the people targeted by data surveillance and the datafication of their lives: their consent is not requested, and the information produced is not fairly compensated.

In short, collecting data means creating data.
In the words of Jathan Sadowski, “to know the world is to exercise power over it and to exercise power, is to know it, to examine its features and characteristics, to sort it into categories and norms, to render it legible and observable.”

But let’s take a closer look: why all this interest in data?

If you have not listened to the first two episodes of What Do Sounds Want?, this podcast complements Life Chronicles of Dorothea Iesj S.P.U., a sci-fi film and audio novel that narrates the many adventures of researcher Dorothea, as she extracts—and smuggles—sound finds from the past. 

The film investigates the link between data capitalism, technology, and value creation, reflecting on the use of archaeological artifacts, archives, and memory as instruments of power and control: what happens when everything, every surface and object can record us? When we are subject to a pervasive acoustic surveillance? 

Well, this is not exactly 100% science fiction. 
It is not something to worry about in the future. This is real. 
The datafication of our bodies is real, we are tracked down and listened to, aware or not. And this datafication affects identity formation, as individuals come to perceive themselves and others through the lens of data.

To better understand how people and their digital data make each other, and to single out the relation between data, sound, recording and listening practices, we’ve spoken with sociologist Deborah Lupton, author of books such as The Quantified Self and Data Selves, and with Columbia University lecturer Audrey Amsellem, who’s been writing a great deal about sound and surveillance.

Audrey Amsellem: Part of my work is tracing this kind of starvation for data, this data harvesting, this constant notion in Western history of extracting, of taking away whether we are talking about land or culture…

Nick Couldry would call it Data Colonialism, a capitalization of life without limits. Shoshana Zuboff speaks of surveillance capitalism. 

Were we to trace the origins of the ubiquitous surveillance of our digital behavior, we could perhaps go back to the early 2000s. 

9/11 and music piracy have normalized the fact that our online beings could be monitored. Piracy surveillance, as Sonia Katyal writes, gave a form of legitimacy to actually tracking people’s behavior online and to doing it in a way that was unprecedented. 

Audrey Amsellem…

Audrey Amsellem: To me, this is a major historical precedent for surveillance capitalism, because it becomes okay for entities to track and analyze your online behavior. And it’s interesting that it always starts with culture, with cultural material, with music, right? But surveillance capitalism, what it does is it enables listening, recording, gathering, analyzing your online behavior and your subjectivity. And, you know, all of this is turned into data about people for commercial profit.

Audrey Amsellem: 9/11 is actually another important precedent here as well, and particularly in relation to state surveillance, that all of a sudden, because you have this terrorist threat, that it’s okay to be surveilled as well. And so people start to accept it and not necessarily think about the implications of that.

But this is no longer the case: since the 2018 Cambridge Analytica/Facebook scandal and the European Union’s adoption of the General Data Protection Regulation, the risks and the sociopolitical implications of such control are very much talked about, especially in the Global North, and our relation to our data is more complicated than the one surveillance studies often dwell on.

Deborah Lupton: I mean, one thing that we’ve often heard is the internet knows everything about us because we are tracked and monitored and surveyed and, um, you know, our personal data is recorded every time we go online. But actually, it’s a lot more complicated than that.

On one side, data can be empowering for individuals; on the other, people, communities and groups increasingly resist data surveillance and reclaim control over their data. This is what Deborah Lupton writes about in Data Selves: More-than-Human Perspectives.

Deborah Lupton: We understand ourselves through digital media, but that’s only a small part of how we understand ourselves. 

So digital data generated with and through our interactions online and with digital devices is, you know, for many people, one form of knowledge about themselves and one form of contributing to their ideas of self, but they’re only just one of many ways that we learn about ourselves and think about ourselves.
Um, you know, the whole concept of surveillance capitalism by Shoshana Zuboff, which has got a huge amount of attention, particularly in surveillance studies and media and communications studies, is a very, um, I think, quite a simplistic way, which deprives people in some ways, it sees people as very passive sort of absorbers of or unknowing participants in the way that they use.

And I guess what I try to do in the book that I did call Data Selves, is challenge that and sort of talk more about people’s agency and talk about why people might choose to, you know, generate their own data and use it, use their data themselves for their own purposes. It’s not always about being manipulated by third parties…

Many people indeed do really appreciate being able to use digital devices and apps to monitor their bodies: in the self-tracking cultures, which Deborah Lupton writes about in The Quantified Self, people can have agency over their own data.

Deborah Lupton: So again, I would emphasize that agency that people do have and that the critical capacities they have to recognise that those  data only, you know, tells one facet of themselves and their bodies and their health and other social relationships. I guess where the big difference comes into it is that there, of course, are many, many other ways that digital data is generated about people that they don’t have a lot of control over, and they don’t know where their data is going and they don’t know who is using that data. And of course, there are many situations like that.

So, self-tracking practices go far beyond the individual pursuit of self-optimization, as championed for instance by the techno-optimists affiliated with The Quantified Self movement initiated by Kevin Kelly, founding executive editor of Wired. Deborah Lupton is rather clear about that in her book of the same name, The Quantified Self: self-tracking practices are culturally embedded; they have made their way into various social realms, such as education, healthcare and insurance assessments, and very often they are imposed and exploitative.

And here it’s time to bring sound into the equation between control and data. Audrey Amsellem has been researching the legal and ethical ramifications of our use of apps and devices that measure and keep track of our habits by listening to and recording our voice or our music consumption.

She talks about a neoliberal ear.

Audrey Amsellem: So if we look at the hallmarks of neoliberalism, we have the idea of the free market. We have this notion of valuing profit over public good. We have this notion of individualism. So that’s the ideology of neoliberalism, but it’s also a mode of governance, that is coupled with the recording and tracking capacities of modern technology, and together that forms a specific way of listening. So we see emerging not just the ability or, I guess, the perceived ability to listen for and to collect human subjectivity but we actually see, um, the deep rooted belief that this collection of subjectivity has inherent value and that it can be marketed and quantified and sold for profit.
So, um, the neoliberal ear is a form of listening that is the product of neoliberal ideology and neoliberal governance.

[Audio: “Alexa, play…”]

The voice-activated technologies you might have at home literally listen to you. Always. You know it, and that’s also one of their main appeals: it makes you feel in control of things, right?

Stephen Neville, who also studies auditory surveillance, has introduced the concept of “eavesmining.” It stems from the combination of data mining and eavesdropping, and describes a set of digital listening processes, a mode of surveillance that operates at the edge of acoustic space and digital infrastructure. 

Let’s take Alexa: she is listening to everything that happens around her because she must be ready to answer your questions when you pronounce the word ALEXA. But what you might not know is that she is constantly recording to process sound… 

Amazon and others are building devices that normalize constant recording of users.

Amsellem argues that people should know when they are being recorded.

Audrey Amsellem: But the ambiguity is whether those particular snippets of their voices or sonic moments and their lives, are those snippets stored, or if they are kept or analyzed to then be sort of fed back to them in forms of ads or even at times, form of manipulation.
There is an ethical necessity, in my opinion, to let people know that they are being recorded because it is actually a different experience of being in the world. We have this notion of listening that’s ephemeral, right? And we have recordings that now are forever. But it’s not only that. They can be distorted. They can be used for purposes that are outside of our control. And this is where this ambiguity exists. And it’s also a very unsettling one.

In her book Sound and Surveillance: The Making of the Neoliberal Ear, Audrey Amsellem writes that listening responds to the constant data starvation of these big companies and that the voice, “as a means of identification, seems to offer a more stable and truthful version of the selves (and therefore more identifiable habits).”

As she says: “Our searches, our queries, our questions across platforms, are but mere echoes of our thoughts taken as truth by tech companies and the advertisers who purchase our recordings to form a picture of our identities. Our echoes, in the form of data, are a reflective distortion of us, but not us. Tech companies are aware of this issue, and developing voice technology is their solution.”

Audrey Amsellem: So we have, you know, these companies that, they gather data about people and they treat this data as raw, as objective, as valuable in itself, and they analyze it to and for some kind of information about our subjectivities and who we are, what we think. But we are actually complex beings. We are beings that have very unclear inputs and outputs, so these little bits of information that they can collect, they’re not us. They are what I call echoes. They’re tied to us in some way but they distort us.
But what voice does, and this is according to tech companies and their discourse and the hundreds of patent applications I studied, right, is that it’s supposed to offer context. So what they believe is that they can infer what you actually mean by analyzing your voice data and that full sentence in the context in which it’s operating in. So in order to make this analysis, they have to record you, not just in a moment. They have to do it over time so that they know, you know, what is your neutral voice? Uh, when are you being sarcastic? When are you being serious? So we have, you know, an obvious privacy issue here. And this is also the constant starvation for more data and this idea that the more data you have, the more precise of a picture you can paint. But the caveat is that voice is actually not as stable as it seems. So they believe that with voice they can identify gender, age, accent, race and they believe that it’s useful data to them. But the reality actually tends to be much more complex. And, they tend to sort of treat complex problems with, in a way, simple solutions.

So both Deborah Lupton and Audrey Amsellem seem to agree that human subjectivity can’t be fully and easily categorized or commodified as data, not even through one’s voice, which, luckily for us, is more opaque and unstable than any prediction.

And furthermore, as Lupton points out, data always comes from the past: whatever has been recorded of my habits is already old, past tense…
She asks: what if we thought of our personal data as a new form of human remains? As archaeological artifacts? Are they like our bones?

Deborah Lupton: So in the book Data Selves, one thing that really sparked my thinking and I guess creative thinking about how we might understand digital data, um, in terms of how we understand its relationship to ourselves and our identities, is making that analogy between human body parts. It gets back again to how there’s a digital data economy where people’s data is not all of it, but, you know, some of their digital data is sold by data brokers and used for commercial purposes. So, I’ve been quite interested in what’s called the bio economy, which is the way that human cells or blood or gametes are marketed, uh, become commodities for sale, their body organs, those kinds of things, parts of human bodies. That’s where I sort of talk about digital data in a way, as being both part of an economy of parts of humans, because you could see digital data as parts of humans in terms of generating information about their bodies and their selves and their everyday lives and practices. I also read a really interesting book about the way that human bones became used—back in the sort of early days of anatomy teaching and medical science, so in the late 1700s into the 1800s in Europe—skeletons—or for that matter corpses—were stolen and sold anonymously to teach medicine. But there was this actual job where people would take the bones from people’s skeletons and sort of put them together as a teaching skeleton that could be, you know, wired together for teaching purposes at medical schools. And so that really sparked my, really built on my idea of the fact that, well, digital data about people’s bodies, it’s not a sort of material part of their bodies, like bones are or, um, blood or cells, but it’s also part of the bio economy these days. But the other analogy I was trying to draw there, particularly with the bones, human bones, is that they can be used in lots of different ways.
They can be in museums, they can be in medical schools, they can be in graveyards. So they can be used in art and I was arguing that the same can be said of people’s personal data, that they are brought together and sort of processed and, you know, mixed up in lots of different ways. I use the term lively data, by the way, to talk about people’s personal data, because it does have its own sort of life that once it’s generated by the people the data is about, it goes on to be formed and reformed and reprocessed, or maybe just sitting in archives, like, I guess like dead bodies, but might be brought out again to be used and reused again. So I guess that was the sort of analogy really I was trying to make, by talking about remains, the human remains and sort of making those, sort of riffing off that idea, particularly of human skeletons or human bones.

So let’s go back to Life Chronicles of Dorothea Ïesj S.P.U.
Let’s go back to Dorothea and the sounds she extracts from the past and smuggles…
[Audio excerpt from Life Chronicles of Dorothea Ïesj S.P.U.]

Gli audio-reperti, i ritrovamenti acustici in generale,
profanati dal suo meticoloso setaccio origliatorio.
Secretioni uditive. Reliquie parlanti.

The audio-specimens are talking relics.

E beninteso, ECHO 
non estrae invero i suoni dalla materia,
ma ne formula una resa, una simulatione.
Questa la sua proprietà fondamentale
e il suo limite intrinseco.

ECHO—the technology Dorothea is using—does not in fact extract sound from matter: it formulates a rendering, a simulation. This is ECHO’s fundamental property, and its intrinsic limitation.

We might say that sounds, like bones, are lively data.

This podcast is a companion to Life Chronicles of Dorothea Ïesj S.P.U., a project by ALMARE curated by Radio Papesse, promoted in collaboration with Timespan and produced thanks to the support of the Italian Council, a program to promote Italian art by the Directorate-General for Contemporary Creativity of the Italian Ministry of Culture.

If you’d like to know more about Life Chronicles of Dorothea Ïesj S.P.U., please visit and

In the next episode we’ll dive into sonic fiction. Stay tuned.

You can find the Italian translation of the third episode here.


ALMARE is an artistic and curatorial collective dedicated to contemporary art practices that use sound as an expressive means. It was founded in Turin in 2017 by Amos Cappuccio, Giulia Mengozzi, Luca Morino and Gabbi Cattani. ALMARE works between curatorial and artistic practices, through collective writing, research, sound and music production, and by organizing concerts, performance lectures, talks and exhibitions.
Radio Papesse is a web radio and sound archive dedicated to contemporary artistic practices, founded by Ilaria Gadenz and Carola Haupt. It hosts and commissions experimental sound and radio works, inviting artists and sound makers to renew the rules of broadcasting and narration.