Simulated Reality (NSFW)

What does it mean for something to be real? This seems like a question with an obvious answer: things are real if I can see and feel them. My house is real. My cup of coffee is real. My trip to Greenland was real (although ill-advised). My computer is real. But are the things you see on your computer real? What does it mean for something to be data – does it exist in the obvious way that we think something is real? Can a computer have a mind that thinks and feels – and if so, is that mind real?

These might seem like esoteric questions, or even pedantic bickering, but the answers have real-world applications. For instance, violent video game usage is often correlated with aggressive behavior. But does that make playing a violent video game itself immoral? If one takes a consequentialist view, then it is: things that cause people to do immoral things are themselves immoral. But not all people who play violent video games do violent things – it is not a direct cause-and-effect scenario. To say it were would be to take any sort of agency away from the person playing the game, reducing their actions to a mere effect of external factors. Even so, there are those who would seek legislation to keep certain video games off the market, or at the very least, out of the hands of impressionable children.

But I want to step away from this consequentialist argument, because there is an equally interesting question: Can an act be immoral if it is simulated? Computer graphics, robotics, and artificial intelligence are starting to reach a point where it’s difficult not to get our empathy mixed up in the simulation itself. Some people can feel sadness or anger at inanimate objects to greater or lesser extents, depending on how “real” the simulation is. This means that we feel empathetic toward simulations. If we accept that much of our moral sensibility stems from our ability to be empathetic (sociopaths are people who do not feel empathy, which is why they act immorally), then it is not a stretch to say that these simulations have some level of moral character. But what sorts of implications does this have? Consider the following trailer for the video game “Hatred,” which has quite a few people in an uproar (video is NSFW):

In case you don’t want to watch the video (or can’t stomach it): in this game, you play a character who decides he hates and despises the world and everyone in it, and so goes on a “genocide crusade” to kill as many people as he can, in as brutal a fashion as possible, until finally being violently gunned down. Your goal in the game is simply to kill innocent people in a violent, and serious, fashion until you are killed. A common argument I’ve read for why people hate this game but not something like Grand Theft Auto 5, where you can also just go around brutally killing innocent people, is that in “Hatred” brutally gunning down and stabbing innocent people is the goal of the game, whereas in GTA5 it is the player’s choice to do this and has nothing to do with the goals of the game (in fact, it will hinder your ability to accomplish those goals). I think people are not wrong when they say that “Hatred” somehow has a lower moral character than GTA5 (whether a work of art can have a good or bad moral character is a different subject, which I’ll address at some point), but I think they’re wrong in using this argument.

In “Hatred” the simulations are very realistic. The victims sob, cry for mercy, and ask why. They are generated to look very lifelike, which makes committing violent acts against them look and feel very real. And yet, anyone who sees the trailer, or plays the game, knows that those people are not real. They aren’t feeling any real pain. There is no actual life that is really being extinguished. None of those simulated people had any real hopes or dreams or memories or loved ones. They are pixels. And yet, even ignoring the consequentialist argument – that people playing this game may be influenced toward aggressive behavior – there is something that feels morally wrong about this game.

So, what happens when simulated people become even more realistic? What happens when they are able to simulate more of a sense of agency? When, if left alone, they do simulate a life that seems very real, very personal, and even very conscious? Would it be immoral to make a game – or a virtual reality – where people can just go on a killing spree and brutally murder these simulations? Would that make those players (or game developers) actual murderers?

And what about simulating sex acts that we find immoral? It’s not far-fetched to think that robots that simulate sex will one day become very real – perhaps sooner than we think.

Sex Robots

These things could become very realistic, and that doesn’t seem so bad. There are already such things as sex dolls. Does it make a difference if the robot can simulate emotions? Desires? Or even fear and pain? And what happens if we make a sex robot that simulates a child in look, feel, and simulated emotions? We all know that the chassis itself is not a child, and that the simulated mind isn’t a thinking, feeling child whose life will be significantly altered by what someone does to it. So is it still immoral for someone to commit sexual acts against a robot that simulates the experience of a child?

These are certainly uncomfortable questions to consider, but I think with the accelerating pace of technology, they are questions that must be considered. And once they are, there is always the hairsplitting about when a simulation of a person becomes something that is sentient, with its own first-person subjective experience of the world. It’s amazing and fascinating that we now live in a world where such questions can be considered, but it’s equally frightening, as we feel our way through Plato’s cave. I think these sorts of inquiries are going to be very real, possibly even within the next ten years. So, don’t shy away from the discomfort, because these things will come up whether you pay attention to them or not.

Electrifying Intelligence

If someone offered you a pill that would make you smarter, how much would you be willing to pay for it?

While there are such things as nootropics – racetams and ampakines, for example – which purportedly help with memory and attention, the effects are generally somewhat small and the chemicals can be expensive to buy or difficult to find. But there is another method with fairly well-established evidence for making you smarter, helping you concentrate, alleviating depression and anxiety, and increasing the speed at which you learn. And it all comes from zapping your brain with electricity, like at the end of “One Flew Over the Cuckoo’s Nest.” Well, not exactly like that, anyway. It’s called transcranial direct current stimulation, or tDCS.

tDCS uses low-amperage (< 2 mA) direct current (rather than alternating current) to either stimulate or inhibit (depending on the direction of polarity) a section of your brain. Neurons in the human brain send signals down their axons electrically: an action potential is a depolarization driven by an influx of sodium, followed by a repolarization driven by an efflux of potassium. Running a weak current through an area can nudge the neurons there toward depolarization or hyperpolarization. Anodal (positive) stimulation increases activity, while cathodal (negative) stimulation inhibits it.
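As a back-of-the-envelope illustration of those numbers, Ohm’s law (V = I × R) tells you roughly what voltage a tDCS device has to supply to push a given current through the head. The scalp/electrode resistance used here (~5 kΩ) is an illustrative assumption, not a measured spec:

```python
# Rough Ohm's-law sketch of what it takes to drive a tDCS current.
# The 2 mA figure is the ceiling mentioned in the text; the 5 kOhm
# scalp/electrode resistance is an assumed, illustrative value.

def required_voltage(current_amps: float, resistance_ohms: float) -> float:
    """Ohm's law: V = I * R, the voltage needed to drive a current through a resistance."""
    return current_amps * resistance_ohms

CURRENT_A = 2e-3              # 2 mA, the upper end of typical tDCS current
SCALP_RESISTANCE_OHM = 5_000  # assumed ~5 kOhm between the two electrodes

v = required_voltage(CURRENT_A, SCALP_RESISTANCE_OHM)
print(f"{v:.1f} V to drive {CURRENT_A * 1000:.0f} mA through {SCALP_RESISTANCE_OHM} ohms")
# With these assumed numbers: 10.0 V
```

Under these assumptions the answer lands around 10 V, which is why DIY builds tend to start from a small battery plus a current-regulating circuit – the regulation matters, since actual scalp resistance varies from person to person and moment to moment.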

What’s great about tDCS is that the equipment is cheap and the technique is non-invasive. The electrodes can be placed on the head without removing hair or breaking the skin. And depending on where you place the electrodes, you can produce different effects:


While it’s still too early to say whether this is a cure-all for various ailments such as depression and anxiety, or a boon for late-night study sessions, there is already quite a bit of scientific evidence that the technique is effective, as well as anecdotal evidence from DIY users. This 25-minute Radiolab podcast showcases just how amazing the technique might be.


Imagine being able to get an extra boost in brain power and clarity when you’re tired but still have to work. Imagine learning to play an instrument, picking up a second language, or studying for a math final, and retaining it all much more quickly. Imagine getting home from a stressful day, strapping a couple of electrodes to your head, and almost instantly having the stress melt away.

And while this is all great, I’d like to extrapolate a bit. Imagine having micro-electrodes implanted under the skin of your head, delivering currents that can be targeted at smaller, more specific areas of the brain and run simultaneously. You could get multiple tDCS effects at once without having to strap electrodes to your head. It could have a power source you carry with you like an MP3 player (or maybe even one embedded in your body). Depending on what you were doing, you could switch between settings that optimize your brain for the task. Does this seem possible? Is this something you might be willing to do?