When Pigs Fly

Perceivings by Alan Dean Foster

With Tesla, Elon Musk has been putting computer chips in cars for roughly a decade. Today it was revealed that he has put one in a pig.

Without going into details, the chip basically reads the pig’s brain waves and broadcasts them wirelessly to a receiver. There they are downloaded for viewing. We can now see what a pig is thinking, even if we cannot yet interpret it.

It’s a good thing Elon used a pig. Had the chip been implanted in a cat or dog, animal activists would have been up in arms. Even though a pig was used, it’s hard for folks who just finished bacon and eggs for breakfast to protest at the idea of a pig being employed for the benefit of science. It just doesn’t raise the emotions the way a plate of dog and eggs would. Also, I’m not sure I want to know what my cats are thinking. They’re predators, after all.

Elon may be a little mad, but history shows us that brilliant people often are. There is always method to his madness. Science-fiction has been giving us stories centered around brain-computer interfacing for some time now, the idea being that once we can develop such an interface, users will be able to tell a chipped machine what they want simply by thinking about it. Instead of yelling for a partner to make toast, we’ll simply think “toast,” and the necessary device will pick up the brain-broadcast request and make toast.

Tesla already sells cars that respond to verbal commands. Additionally, if one hits the signal lever to change lanes, the car complies and then straightens out in the new lane—provided no other vehicle is occupying that lane. This is very useful since the car has 360° vision and I don’t.

Now imagine that you are chipped and the car is appropriately equipped to receive your thoughts. The physical turn signal disappears because all you have to do is think “change into left lane.” Beyond that we’ll move to commands that are now done tactilely or verbally. “Raise temperature — tune to KFAC — slow to 65,” and so on.

This won’t stop (or necessarily begin) with cars. The driving force behind human-computer interfacing is to enhance the quality of life of patients with severe neurological impairments like Lou Gehrig’s Disease. If a sufferer can think “turn on the TV” without having to struggle to reach for a physical remote, that’s a huge benefit right there to a lot of physically impaired people.

Stretch the tech a little: You’re a paraplegic in bed. You think “bring me a glass of water.” The thought is picked up by your bedside robot, which departs and returns with the needed water. Straw, sir? Had enough? In fact, to voice the verbal command “I’ve had enough” or just “stop,” you’d first have to un-mouth the straw. With an interface, you could just think the request.

This technology is coming, and faster than most people think. It is one reason why certain aspects of contemporary science-fiction film have always troubled me. Have you ever noticed that whether it’s an X-wing fighter from Star Wars, or the command deck of the spaceship Enterprise, or the cockpit of the Earth-moon shuttle in 2001: A Space Odyssey, ship controls always consist of pressing buttons, with the occasional use of voice commands?

Here we have ships that can travel faster than light, hand-held communication devices that let us talk to aliens, robots equipped with serious artificial intelligence, artificial gravity— but everyone is still typing and pushing buttons. I suppose we have to cut Hollywood something of a break here, because watching a cockpit or command deck full of people just thinking would make for some pretty dull cinema. But I would dearly love to see it on screen, even if just once. Because that is how people on a future spaceship will interface with it and give it commands, by thinking at it through the special chips embedded in their brains. Or, if a less intrusive method is preferred, via some sort of mesh arrangement that fits comfortably over one’s head. Think medieval snood, only with chips instead of gems.

Filtering will be a major concern. Thinking back (no chip involved here) to the example of the toast, assuming the toaster is not preloaded, what if the husband wants whole wheat and the wife wants an English muffin? What’s a poor interfacing home robot to do? Such situations might actually inspire people to be more polite to one another lest via their mental requests they lock up expensive machinery.

For the next step, we can imagine two-way interfacing. Think (assuming you are equipped with the right chip and that you’ve paid your monthly fee), “I want to see the film Gunga Din.” No TV, no movie screen — the film unspools, as it were, right inside your brain. Music would be even easier.

Imagine the boost to creativity. In the late 1940s the SF writer Katherine MacLean wrote a story in which people simply thought new music, and it became available. No paper, no external computer involved. True, it’s a long step from reading the brain waves of a pig to composing music simply by thinking about it, but it was only a couple of hundred years from the sailing ship to the 787. Two hundred years from now, a lot of the tools we use today will be obsolete. Thinking about work will take on an entirely new meaning. And folks like me will no longer need a keyboard. Or a computer.

Toast, yes.

Prescott resident Alan Dean Foster is the author of 130 books. Follow him at
