Thursday, 10 March 2011

I.am.Robot

It would seem the virtual is frazzling our wires in reality. Lauren De’Ath asks: could myths of a Cyborg Nation really be true?

Questioning the detriments of technology in mainstream modern society is nothing new; after all, it seems fairly Orwellian, and it is something the Daily Mail consistently reminds us of. Sociability is apparently in decline and, we read, kids have been taking murderous inspiration from video games and slasher movies for years; nothing new there. However, new investigative science is now asking just how safe a society that breeds walking digital sponges really is.

It was the stuff of science fiction: straight from the pages of Orwell, from the delirious mind of Russell T. Davies, or pure Matrix Reloaded. But now we have proof: tech culture is killing our brains. For years, as we sat transfixed before the television, our mothers whimpered beside us, prophesying some ridiculous ailment called ‘square eyes’. Pfft, what of it? CITV was on, mum. Then later, we were chastised for overzealous mobile phone usage; Chinese kids had forgotten how to use their forefingers in an age of texting thumbs, and radio waves were carrying cancer straight to our brains. But... but, my iPhone?! And so life continues.

We are exposed to over 3,000 commercial messages a day via various media outlets (Bluetooth, spam email, Internet pop-ups), and now new science is calling time on this somewhat overwhelming contact, claiming our brains are reaching critical mass. Funny how our mothers always know best. For although we can now access information better than ever before, more importantly, that information can now access us.

Years later, CITV is no more, yet, poised as we are on the threshold of tumultuous socio-cultural changes, Stanford science historian Professor Robert Proctor has come up with an ingenious new phrase to summarise society’s big problem. He calls it ‘agnotology’: the study of culturally constructed ignorance. His theory for human “down-culture” is simple apathy: we just don’t care. Nor is it, he says, necessarily a bad thing. Rather, it is an instinctive coping mechanism, as a relative system overload has left us bereft. But he is neither alone nor the first in studying how digitization has pushed us to our limits.

It all began way back in 1997 with our Nokia 210s. Microsoft researcher Linda Stone was mulling over what was then a very primitive human-technology affiliation: one enchanted with downloading ringtones, ‘cool’ screensavers and calling taxi-dad when you were stuck on the wrong side of town. Having worked for a multitude of computing companies, Stone was amongst the first to see how twenty-first-century mod-cons could turn sour. “We were in the sweet spot of it,” she says, when we catch up with her (over smartphone email, of all things), “delighted with these devices that offered convenience: ‘I'm lost, I'll just call them on my cell phone,’ or ‘I'm running late, I'll just text and give them a heads up.’”

Noting the growing dependency on the new best friend permanently attached to our ears, she coined the revolutionary phrase Continuous Partial Attention. It alluded to a society that never shut down, socially or mentally, and that had resulted in an extreme form of multi-tasking to cope. “When I talked about CPA in those days, I suggested to audiences that, for the moment, we were excited by this opportunity to be connected anywhere, anyplace, anytime. But ultimately… we would grow weary of this.”

And, of course, we did.

Aside from a more complex relationship with our mobiles, a barrage of social networking sites suddenly meant we had an online presence: a presence accessible to anyone and, even more alarming, one locked in cyberspace. Modern connectivity took a menacing turn when Facebook announced ownership of uploaded content; likewise, an account could never be deleted should you wish, only made dormant. In short, you could never leave and never die. It was a revelation that shook the world. Furthermore, it was revealed in 2008 that you could actually develop something called IAD (Internet Addiction Disorder), of which Facebook was the most common form. Symptoms included withdrawal, anxiety attacks and obsessive thinking about what could be going on online. It was proof enough that our dependency on the Web had spiralled out of control.

The fear that we will become unwitting androids is one that has long plagued sociologists, but whilst this was all rumour and superstition, could science have accidentally stumbled upon proof that we are living a logged-on half-life? Linda Stone namedrops something called ‘email apnea’: a temporary suspension of breathing whilst on email or mobile, as if we are quite literally ‘plugging in’ to an online moment. The question, however, is this: once we become an avatar, how much of us is actually human?

David Giles specializes in media psychology at Winchester University, and through his own studies of social networking sites (SNSs) he has deduced a striking theme that isn’t far off actual science fiction. He explains: “SNSs are a means of replicating yourself and leaving part of you behind, possibly forever; leaving all this material online in photos, personal preferences, blogs etc. is a small step towards immortality. Most SNS homepages only get seen by a few hundred people at most, but I also think that people still have, to some extent, a belief that everyone is logging on and looking at their websites. Thus they believe that simply having a website is a step towards immortality.” Perhaps, like Cypher in The Matrix movies, we find that life is easier as an avatar. Giles continues: “We have the capacity to be the better person online; we can edit, delete and ascribe a personality to ourselves that might not be strictly true, but better.”

In the late 90s, before mobile technology and widespread computing had frazzled our brains, talk of a future cyborg globe all seemed something of a distant hallucination. We probably reacted as if Mum had just tried to change over our cartoons: ‘don’t overreact!’ Stone comments on our overzealous technological presence: “What we're seeing today is a shift into an always-on mode. Even when we hope to take a break for a meal or a movie, an evening or a vacation, we find ourselves checking emails and texts; we find that we're not able to break away. We're like the hamster on the wheel. In motion. Staying in motion. This contributes a lot of wear and tear on the body and the psyche.”

Our online issues became the stuff of widespread debate. New words such as ‘mouse potato’ and ‘stress puppy’ became commonplace in beehive offices to describe an unprecedented wave of net-addicts and Hulk-like work ethics. The first foray into forecasting the future of our descendants came in 2006 with the release of Mike Judge’s cult movie Idiocracy, a satire set in 2505 AD that ran with the tagline ‘The Future is a No-Brainer’. It depicted a prospective world where the inhabitants of Earth were so thoroughly inbred, lazy and addicted to token pop culture that they were facing extinction.

Then, in 2008, Atlantic journalist Nicholas Carr asked the question on the tip of everyone’s lips: ‘Is Google making us stupid?’ It was a pertinent point at a time when contemplative intelligence was being radically swept under the carpet to make way for prospective hardships: a looming recession caused by, what’s that, a lack of attention to detail. Carr went on to dedicate an entire book, ‘The Shallows’, to the subject in 2010, to great critical acclaim. Meanwhile, author Susan Hill hit back at online reading, identifying it as the sole cause of her sudden lack of concentration. She wrote, polemically: “Too much internet usage fragments the brain and dissipates concentration so that, after a while, one's ability to spend long, focused hours immersed in a single subject becomes blunted. Information comes pre-digested in small pieces, one grazes on endless ready-meals and snacks of the mind, and the result is mental malnutrition.”

Stone continues: “CPA is an attention strategy; there are times when it is a terrific strategy for an activity, and other times when it's not. Because we are working so hard at trying to stay on top of everything, we shift into a ‘fight or flight’ state: a state of vigilance. In such a state, we are more likely to have inattentional blindness, where we see only what we are looking for so vigilantly that we miss everything else.” We have simply lost our capacity to commit to one thing for a period of time, and information overload is the reason why.

Of course, these theories have been consistently challenged, overruled and deemed timely fear-mongering. One such techno-optimist, Jamais Cascio, through his work with the Institute for the Future and the Institute for Ethics and Emerging Technologies, defends the computing age against claims of flagging intelligence, saying we are developing new parts of our brain as we go. “It's happening all around us,” he says in comments for research centre Pew Research, “across the full spectrum of how we understand intelligence: it's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines. We are developing fluid intelligence: the ability to find meaning in confusion and solve new problems, independent of acquired knowledge.”

However, one fact cannot be escaped: whilst we have developed, and indeed established, new areas of digital intelligence, this has ruffled a few feathers along the way. The computing age has led to a systematic dependency on digital media in just about every area of human life, a lack of social skills and, many argue, a cultural time bomb waiting to explode.

Furthermore, there is a flipside to this so-called social erosion: with any information available at the flick of a button, what we know, and can know, becomes endless, and perhaps, some query, we have become all the more fearful of being incorrect. In agnotological instances, even simple debates we once thought we knew the answers to have become clouded. Normally, we expect society to progress, amassing deeper scientific understanding and basic facts every year. We assume that knowledge can surely only increase. Well, apparently not. According to Proctor, when it comes to many contentious subjects, our usual relationship to information is reversed and ignorance increases. To quote Farhad Manjoo in his book ‘True Enough: Learning to Live in a Post-Fact Society’: “If we argue about what a fact means, we're having a debate. If we argue about what the facts are, it is agnotological Armageddon, where reality dies screaming.”

Of course, the fear for many is that we cannot undo the grave cultural mess we have gotten ourselves into; the solution, however, according to Linda Stone, is far simpler than one could ever have supposed. In a society where we spend over 30 hours a week logged on, plugged in and absorbing, scientists are studying the benefits of something called Earthing. “Nature is the antidote,” says Stone. “In grass, earth, trees, water and sand, we're more likely to breathe fully. We're more likely to feel replenished.”

Strange to think that, after all our technological accomplishments, from television to mobiles, there would come a time when we would fall back on the age-old saying that, sometimes, ignorance really is bliss.
