Robots, Bodies, and Souls

What do you think a robot sees as it goes about its day? What do you think its world is like? Does it think? If it does, what about? There's a very common view of robots as "obeying their programming": you've almost certainly heard some android in some movie say it "can't disobey my programming." One of the first movies to feature a shot from a robot character's point of view was The Terminator, in 1984, and it looked like this:

A still from The Terminator showing the T-800's point of view of a dog as grainy and red, with machine code on the sides of its vision.

The terminator's code is on his visual display, as though it's a script he reads and then acts out. It reflects the view that machines are beings of objective logic, reading human-provided code and carrying it out emotionlessly. I think this view is caught up in a sort of "homunculus theory" of both human and robot behavior. Under this view, everything you see, feel, hear, etc. is transmitted to a tiny person in your brain (your mind, your soul, etc.), who then has feelings about those sights and instructs your body to act according to its whims. A machine also has a homunculus, but this homunculus doesn't have feelings, only Logic and a script written in code. It reads the code and instructs the body to do what the code says, no matter what. This is why the code is displayed on the camera feed, even though that's nonsensical (how would the robot read the code it sees? more code?): we are seeing from the machine-homunculus's point of view, and it sees both code and sense data.

I would like to offer a different view of a robot's perspective. In my view, the code isn't read by the homunculus; the code is the homunculus. And it's not code, it's circuitry. Circuitry is a physical system, only "logical" or "obedient" in the sense that literally everything is "obedient" to physics. The robot doesn't experience what it's doing as "following programming," but rather as "acting on instinct." Try to view this robot as a strange kind of insect or jellyfish, not an obedient servant.

This robot isn't a T-800 or even a cutting-edge machine learning algorithm. It's a simple, basic robot you probably interact with every single day. You've seen thousands of them, praised some and cursed thousands of others. Without further ado, let me present the...

.

.

.

.

.

Traffic light!!!

Specifically, this is a traffic light at a 2-way intersection with left turn lanes in only one direction.

To understand how a traffic light's programming feels to it like instinct and feeling, not logic, you should program yourself to be a traffic light. First, pick a body part, like your left hand. Watch the leftmost light: when it's red, relax your hand; when it's green, clench your fist; and when it's yellow, straighten out your fingers. Once you've got the hang of it, pick a second body part, like your right hand, for the middle light. Finally, pick a third part, like your right foot, for the rightmost light.
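That exercise is really just a lookup table: each light color maps straight to an action, with no reasoning in between. Here's a minimal sketch of that reflex (the names and structure are my own, not anything from the actual traffic light):

```python
# The "program" you just gave yourself, as a lookup table.
# One color in, one action out -- pure reflex, no deliberation.
ACTIONS = {
    "red": "relax",
    "green": "clench",
    "yellow": "straighten",
}

def react(light_color):
    """React to a single light, reflex-style: see color, do action."""
    return ACTIONS[light_color]
```

Notice there's no "if I want to" step anywhere in there; that's the sense in which the light acts on instinct.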

Once you've got it a little bit (don't worry about perfection, there's a ways to go), scroll down for the next step. Don't forget to take breaks, and don't clench/extend too hard!

The diagram on the left shows the traffic light's point of view. The rectangular bars are its three traffic sensors, and the blinking dot on top is its internal clock, its sense of time. Try to get your hand-clenching right from that left-hand view alone. It might be helpful to put on some music at 60 or 120 BPM to get the timing right.

Hard, right? Even something "simple" like a traffic light is hard to copy very accurately. Try this for a sec, but then move on, because there's more we can learn from our little traffic-managing friend.
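The timing exercise above can be sketched as a clock-driven cycle: the internal clock ticks like a metronome, and each phase of a light lasts some number of beats. The phase durations below are my own illustrative guesses, not the actual timings from the diagram:

```python
import itertools

# Illustrative timings (my assumption): how many clock ticks (beats)
# each phase of one light lasts before the next phase begins.
PHASES = [("green", 8), ("yellow", 2), ("red", 10)]

def light_states():
    """Yield the light's color on each tick of its internal clock."""
    for color, beats in itertools.cycle(PHASES):
        for _ in range(beats):
            yield color

clock = light_states()
first_ten = [next(clock) for _ in range(10)]  # 8 beats green, then 2 yellow
```

Copying the light by hand means running this loop in your head, beat by beat, and that's exactly the part that's hard to do from sense data alone.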


That terminator screenshot got one thing right: how we see the world isn't just determined by our sense data. When people are mad, they "see red," and when people are happy, they're "glowing." It's not just feelings: countless optical illusions show that our knowledge of the world changes what we see. This diagram incorporates some information about the traffic light's mental state, which I interpret as a sort of common ancestor to both thinking and feeling.

Copying the traffic light gets much easier once you know what it knows and feel what it feels, right? Once you've got it down to your satisfaction, take a look inside the traffic light's circuits.

A bit harder to understand, yeah? There's a reason we make friends by talking to each other and not peeking inside brains. All the information is there, but that's too much information to be useful! We need it to be abstracted out to the level of senses, thoughts, and feelings, which I hope I've been able to do here.

Views From Everywhere

Now that you can understand the traffic light's point of view, see how it looks side-by-side with a bird's-eye view of the same intersection. I find it relaxing to watch, like a little traffic light fishtank. You should also mess with the traffic slider - maybe I'm projecting, but I feel like the traffic light gets stressed at rush hour! And next time you're waiting at a traffic light, try to imagine it from these two views as well as your own.
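One hedged guess at why heavier traffic might "stress" the light: real traffic lights are often actuated, extending a green phase while their sensors still detect waiting cars, up to some maximum. This tiny sketch is my own invention, not the controller in the demo:

```python
# Illustrative actuated-signal rule (my assumption): one extra beat of
# green per waiting car, clamped between a minimum and a maximum.
MIN_GREEN, MAX_GREEN = 4, 12

def green_duration(cars_waiting):
    """How many beats of green to give, based on the sensor reading."""
    return min(MAX_GREEN, MIN_GREEN + cars_waiting)
```

At rush hour the sensors stay saturated, every phase runs at its maximum, and cars still pile up - which is about as close as a traffic light gets to a bad day at work.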

Bird's-eye view background is a modification of https://commons.wikimedia.org/wiki/File:Street_intersection_diagram.svg