Luke's Hand: Do Androids Dream of Electric Sheep?
On Artificial Identity, Empathy Prosthetics, and the Dreaming Machine
An Unexpected Detour
I planned out the first few Luke's Hand pieces before writing them. I'd start with films: Blade Runner, then Ghost in the Shell. But I picked up a copy of Philip K. Dick's Do Androids Dream of Electric Sheep? at a local bookstore while taking a break from writing — and I dug in.
This post became an almost-follow-up to Luke's Hand 01 // Blade Runner. Here, I'm tracing alignments and divergences between the novel and the film, and digging into what those points of contact reveal when you read them together.
Sheep on the Roof
Do Androids Dream of Electric Sheep? is not a book about androids falling asleep, or robots with insomnia. For years I misread the title, letting the image of a snoozing android drift past me.
But sheep—electric or otherwise—are central here. In Dick's toxic future, real animals are rare and precious, an ultimate sign of care and status. Deckard's sheep is fake. He tends it obsessively on his rooftop, a public performance of empathy and affluence. He longs for the day he might keep a real animal, thumbing through his tattered copy of Sidney's Animal & Fowl Catalogue.
In Blade Runner, Deckard spots the owl's artificiality in seconds. In DADoES, Rachael convinces him hers is real — and that difference reframes the question, "Do you like our owl?" as a loaded, status-laden inquiry.
// Cyborg theory: Identity is performed, not inherited. The sheep is a prosthetic for belonging.
A World in Decay
Dick’s Earth is not just rainy; it’s collapsing under post-nuclear fallout. Radiation has stratified society. Those with cognitive degradation, “specials,” are pushed to the margins, their humanity reduced to slurs like “chickenhead” or “anthead.”
The irony is sharp: animals are revered yet weaponized as insults. They’re currency and curse in the same breath.
// Signal: Systems can hold contradictory codes — reverence and rejection — without "resolving" the dissonance.
The Two J’s
Scott’s J.F. Sebastian and Dick’s J.R. Isidore are cut from different cloth but wear the same thematic patch: the underdog. Both are marked as “less than” yet exhibit profound care.
Sebastian is a genetic designer, a maker of replicants; Isidore drives for an artificial-animal repair service. One builds the system, the other maintains its byproducts. Both take in Pris — an android — without hesitation.
// Cyborg theory: Empathy here is not about purity of origin; it’s about permeability of boundary.
They may be the most “human” characters in their respective worlds, not because of capability, but because of their capacity for empathy.
Machines for Feeling
Empathy is the prime currency in both versions of the story, but DADoES scales it beyond the job-site precision of Blade Runner. Here, it’s systemic: quantified, displayed, curated.
The Voight-Kampff test still decides who’s in or out of the human club. But humans rely on machines to maintain the very emotions the test measures. Mood organs tune their affect. Empathy boxes network their grief. Artificial animals scaffold their capacity to care.
// Cyborg theory: Humans here are not “enhanced” by technology — they’re sustained by it. Prosthetics of empathy.
The irony runs deep: remove the prostheses, and would empathy still stand?
Mercerism’s Feedback Loop
Mercerism may be a hoax, but it’s also a kind of networked cyborg ritual. Participants jack in, feel one another’s pain, and climb a hill together in shared simulation.
The experience is technologically mediated, but the emotions are real. That’s the pivot: authenticity isn’t the point. Effect is.
// Signal: The networked human as dreamer.
// Static: The android as dreamer.
Conclusion // Dreamers in the Loop
In Blade Runner, empathy is situational — tested in the field, weaponized in interrogation. In DADoES, it’s infrastructural: measured, broadcast, ritualized. Both tempt us with the bright line — human here, android there — even as their worlds erode it.
Yet neither humans nor androids can navigate without their prosthetics of feeling. Mood organs to start the day. Empathy boxes to belong. Animals, real or not, to practice care. Remove them, and the whole fragile system shivers.
// Cyborg theory: A “bright line” between human and machine is just another interface protocol.
Maybe that’s the lasting image: dreamers—human and otherwise—wired into the same loop, building their selves from borrowed signals. And the question that hums on the static:
If all our feelings are mediated, whose dream is this?
// end transmission
If you’re new here, you can subscribe to get essays like this, along with field notes and occasional dispatches from the strange edges of human–machine culture.
We’re already in the loop. Let’s see what kind of signal we can find in the static.
// P.S. this post follows Luke's Hand 01 // Blade Runner
// and you might also like:
→ Obsessive Interfaces. // On the phenomenology of tools, and why we sometimes become more interface than operator.
→ The Cyborgs Aren’t Coming. // A reminder we’ve already been here a while.