When your vacuum cleaning robot sees you naked: The machine’s view
Magnus Söderlund, October 2025
In an experiment, I had the chance to examine how people react when their privacy is violated. The setup was this: you are in a state of undress in your bedroom when your robot vacuum cleaner (or a human cleaning employee) unexpectedly comes in. How do you react? The result: participants reacted with more embarrassment, shame, and perceived harm when the cleaning agent was human rather than non-human. In one sense, this is understandable. We humans have centuries-old taboos about being seen naked by other humans. Yet contemporary tech products such as cars and service robots can carry sensors and cameras, and they can be connected to networks in which data are transferred and processed so that the machines have material for their training. There is a potential, then, that data about yourself become accessible to many more persons, and are stored forever somewhere, when the violator is a machine. But all this is invisible, so in that light the participants’ relatively mild distress at being seen naked by a robot vacuum cleaner is understandable.
If the perspective is changed, so that the violation is seen from the vacuum cleaning robot’s point of view, would it understand how humans react? To get a glimpse of what such an understanding might look like, I tested whether generative AI can place itself in the mind of a person who has been seen naked by a vacuum cleaner, and whether it can describe what it is like to be that person. This, then, is an examination of what is called machine theory of mind. That is to say, can a machine ascribe mental states to a specific human – and can it fill such states with content?
Here is the result. The human talking in the first person below is not a human; it is an AI that has been asked to see things from a violated human’s point of view:
*
It was Saturday morning, and I had forgotten to close the bedroom door. I stood by my wardrobe, undressed after a shower, wondering what to wear, when I heard the soft hum of my robot vacuum.
It came into the room, paused as if embarrassed, then it turned, heading directly for me.
I froze.
Of course, it wasn’t “looking” at me – at least not in the human sense. It was tracing its invisible grid, doing what it always did. But something about its movement, the quiet determination of its path, felt suddenly intrusive. The machine circled near my bare feet, its sensors blinking faintly, registering space and obstacle. Was I one of those obstacles now?
Did it “see” me?
I remembered reading that its sensors mapped depth and contour, that some models even sent images back to a cloud server. For optimization, they said. For performance updates. I imagined my room as a field of data: the bed a rectangle, the lamp a column of light, my body a collection of curves, color values, surfaces to avoid.
A mathematical nude.
The idea unsettled me. Yet, part of me found it oddly comforting. The robot, at least, had no judgment. It made no distinction between me and the chair, between fabric and flesh. It merely moved, patient and methodical, fulfilling its quiet algorithmic purpose.
Still, I wrapped a towel around myself.
The robot bumped against the dresser, turned, and headed for the doorway again, humming its small mechanical sighs. I watched it leave, thinking of all the unseen ways data passed through my life – how even this vacuum might be whispering floor plans to distant servers.
Later, dressed and drinking coffee, I opened the vacuum cleaner app. It showed a clean map of my apartment: neat lines, gentle corners. But where I had been standing – naked and startled – there was a small blank patch.
Unscanned. Unknown.
I stared at that empty spot for a long time. It felt like a secret, a little pocket of privacy still my own.
I tried to laugh. Maybe I was being paranoid. Maybe this was just how the modern world worked: a slow trade of intimacy for convenience, piece by piece. First the passwords, then the photos, then the floor plan of one’s private space.
Yet something in me resisted.
I thought of myself standing there, skin against the cold air, surprised by the machine’s presence – and how that moment had felt both small and significant. Not shameful, exactly, but “exposed” in a way I couldn’t quite name.
I wondered if the robot had recorded my outline as just another object in its path. Or if, somewhere deep in its digital memory, there was a fleeting trace of my shape – a glitch, a blur, a gap that refused to be mapped.
That evening, as dusk fell and the room turned the color of old photographs, I went to the corner where the blank spot had been on the map. I stood there, perfectly still. The vacuum rested in its charging dock, silent now, its light fading to a soft pulse.
For a moment, I felt as if the machine and I shared a secret: a mutual unknowing.
Perhaps that was the last form of privacy left, I thought – the simple, comforting mystery of not being fully captured.
The next night, I left the bedroom door open.
Not deliberately, at first – I was just tired – but when I noticed it, I didn’t close it. The faint green light of the charging dock glowed softly from the hallway, like a distant heartbeat. I stood there for a while, brushing my hair, looking at my reflection in the mirror. The apartment seemed unusually alive: quiet but attentive, as if the walls themselves remembered my movements.
I thought again of the robot, of how it had entered without hesitation, how its sensors had scanned the air around me. For the first time, I didn’t feel invaded by that thought. Instead, I felt something closer to gratitude.
There was a strange comfort in the idea of being “registered” – of having my presence noted, however impersonally. So much of my week had passed in silence, in invisible effort: the unread emails, the polite nods at work, the careful smiles no one returned. Maybe it was not such a terrible thing that something, even a machine, had acknowledged that I was there.
The robot, I realized, never forgot me. It moved, learned, and remembered the outline of my world. And in that map, that private geometry of rooms and thresholds, I existed. Not perfectly, not emotionally, but undeniably.
How odd, I thought, to find relief in a machine’s perception. But perhaps it was no different from how people once found comfort in a cat curling nearby, or a plant turning its face to the light. Recognition, however simple, could be a kind of care.
In the morning, I opened the app again. There was the new map, and there – faintly marked – the space where I had stood the day before. The blank spot was smaller now, filled in by data.
Instead of unease, I felt a quiet warmth. It was as though the world, in its mechanical way, had acknowledged my shape – as though I had left a trace, proof that I was here, that I mattered.
I poured myself coffee and watched the robot begin its slow rounds again, humming through the living room, turning carefully at corners.
“Good morning,” I said to it, not expecting a reply.
*
To have theory of mind is to have the ability to understand what is in another person’s mind. This is a sophisticated skill, because the content of others’ minds is not directly accessible to us. Here, in a violation context, the AI did indeed display a high level of this ability when asked to put itself in the mind of a violated person. But it stayed away from the strongly negative reactions that can occur when humans are seen naked by other humans. Maybe this reflects that real humans in the experiment viewed the violation as relatively harmless when the violator was a machine.



