It might be lunchtime or nearly lunchtime where you live. If so, you’ll probably want some lighthearted cyber-dystopian reading material to peruse at your desk while you eat your Pret sandwich. Well here you go.
Ai Weiwei Is Living In Our Future by Hans de Zwart on Medium: a famous artist’s experience of life under permanent, overt surveillance, a life we may all be experiencing in the not-too-distant future. But it’s not all about the state eavesdropping on us, because guess what? We’re doing it too:
Put a collar with a GPS chip around your dog’s neck and from that moment onwards you will be able to follow your dog on an online map and get a notification on your phone whenever your dog is outside a certain area. You want to take good care of your dog, so it shouldn’t be a surprise that the collar also functions as a fitness tracker. Now you can set your dog goals and check out graphs with trend lines. It is as Bruce Sterling says: “You are Fluffy’s Zuckerberg”.
On Nerd Entitlement by Laurie Penny in the New Statesman: triggered by, among other things, a discussion thread on Scott Aaronson’s blog, this piece looks critically at the sense of persecution often experienced by male geeks. It’s not unsympathetic, but it rightly points out that teenage trauma, although authentic, doesn’t negate the privilege that male geeks enjoy later in life. Now, you might say this isn’t really cyber-dystopian, but I’d say it is. As technology exerts a greater influence on our lives, the great risk is that it will be used to enforce and amplify the social advantages enjoyed by those who control it: and, for the time being, that tends to be white male nerds (like me). Addressing the issues raised in this article would go a long way to making a technology-driven future far more inclusive and a little less dystopian.
Finally, in Inadvertent Algorithmic Cruelty, Eric Meyer talks about the emotional effect of Facebook’s “Year In Review” app appearing uninvited on his timeline. The app chose, as its main image, a photo of Eric’s six-year-old daughter, who had tragically died that year. As you can imagine, it was a deeply upsetting experience. Yes, it’s an example of the insensitivity of computer algorithms, but it’s also an example of a failure of design (not for the first time at Facebook) for reasons similar to the ones mentioned above. Eric Meyer has since posted a follow-up where he states, rightly, that this isn’t a Facebook problem but one common to design teams everywhere: worst-case scenarios, or even slightly unusual ones, are often labelled “edge cases” and then dismissed. Either way, this is a horrible example of how technology can still cause harm without anyone intending to be harmful.