The uncanny valley is that region in which an artificial system, in gradually becoming better at mimicking a human’s appearance or behaviour, becomes eerily off-putting. In robots, in computer graphics, in children’s toys, we innately distrust dead things telling us they’re alive.
Our unease at the recent creeping intrusion of social networks into our privacy is, I think, a facet of this innate distrust, and it’s perhaps unfair to assign too much malevolent motive to mechanical systems that were not designed with ill intent, or perhaps any intent, in mind.
I like that my browser remembers what sites I visit. It saves me repeatedly typing in the same URL, or trawling Google for an article I was reading yesterday. Clearing the cache is a crucial feature, but a browser that retained no memory of the sites you visit, ever, would become very frustrating, very quickly. It would be akin to constantly giving instructions to a person with neither long- nor short-term memory, and no capacity to learn from your previous instructions.
Computers aren’t people though. Once told to remember something, they hold on to that fact until they are told to forget it, or their magnetic bits fade and fail, whichever comes first. Here lies the problem. Computers are too good at remembering, the capacity and extent of their memories far beyond any human’s. Divulging a secret to a computer means it will be remembered forever, not because computers are nefarious, but because that’s just what they do. Though not the fault of the computer, and indeed an intended and deliberate feature, it’s unnatural, and falls into that uncanny valley that our brains tell us to fear and distrust (and perhaps rightly so).
Browser histories, locations, fitness data, email logs: as creepy as it is to picture all this sensitive data about our lives being compiled and archived, it would be incredibly frustrating if our devices were so stupid that they tracked none of it, and in many cases doing so would negate the entire benefit of owning these devices, using these programs, or subscribing to these services. The problem is that there’s no distinct line between what is useful and what is creepy.
This is not to argue that we shouldn’t be upset that this data can potentially be accessed without our consent, or to claim that the motives behind large companies collecting our data are entirely benevolent, and not a cause for concern. This is a call for reflection and moderation. Before we raise pitchforks and torches against our silicon creations, and demand they be destroyed or lobotomized, we should remember why we gave them un-life in the first place.
- Header image from https://www.flickr.com/photos/downhilldom1984/