Talking Out a Revolution: Part One – Swift Evolution

After every communication advance in human history, a great shift in our behaviour and development has occurred. The alphabet allowed the emergence of democracy; the Renaissance could not have happened without the printing press. We are currently living through the greatest and swiftest communication advance in the whole of human history, and we are still utterly blind as to what it might be doing to us, and how we may eventually emerge from its digital chrysalis. What is our increasing reliance on technology doing to us as a species? What does the future hold for us, with technology shaping our evolution, or for the children who have never known anything but an interconnected world?

A quick test: are you nostalgic about any of the following things?

Happy Days, milkmen, Crackerjack, hot rollers, The Jackson 5, click-clacks, nothing (and I mean NOTHING) being open on a Sunday, Ziggy Stardust, getting your new comic every week, making a mixtape, Bewitched (NOT B*witched), learning to skate (not rollerblade!), Tales of the Unexpected, school milk at morning break, Freddie Mercury, Rubik’s Cubes, The A-Team, Yo-yos, skipping ropes and your watch which was ALSO SOMEHOW A CALCULATOR

If you suddenly felt all childlike and joyous at the mention of some of the above – watch out, for you are a child of one of the last generations that will ever know what it is like to live without technology aiding and supplementing your every waking thought. Your method of thinking is dying out; your methods of problem solving are no longer fast enough. You cannot do enough at once; you are already a dinosaur – an analogue thinker in a digital age.

Do you remember the first computer you ever saw? In about 1984 or ’85 my Dad, a naval officer, showed me the “supercomputer” housed in one of the Navy’s buildings. It could make calculations faster than we could ever dream of, he said. Now, while we all know that government operations are always among the very last to update their technology, in the early 80s the Navy’s supercomputer was the size of a small aircraft hangar. It was vast, and the noise was deafening: tape reels spinning round, industrial fans working to cool the machinery, everything clacking and whirring.

Cut to only five or six years later, and my brother and I were playing computer games at home on a machine the size of a large microwave oven. We put a TAPE in a player that made a noise like a fax machine, and in about half an hour we could make some pixelated sprites jump up and down on screen making “bloop bloop” noises. Fast forward another three years and suddenly our games were on floppy discs – the better games spread across a dozen of them – and “Please insert disc 2” might be the phrase I most associate with rainy weekends.

The rate of change – from tape to disc to CD and console – is astonishing. The fact that computer games now regularly display the sort of graphics that used to be reserved for digital effects engineers, and show no sign of slowing, is indicative of the huge progress technology has made in the last 20 years. I was amazed a few years ago to discover that one of the games that used to take 12 discs and half an hour to load was now available on my phone, barely taking up a fraction of its working memory. Remember using Ceefax? Remember when all your mobile could do was call or text? When was the last time anybody used the Yellow Pages? You understand what I’m getting at here – you’ve lived through it.

Despite our shared nostalgia about some of my list above, there are marked differences between those of you reading this piece, based almost entirely on the time into which you were born. Research shows that while those born in the “Baby Boomer” generation have a “live to work” attitude (and of course, the bastards all bought houses for a song), Generation X (those born from the mid-sixties to the early eighties) tend more to view themselves as “working to live”. Meanwhile, Generation Y, or ‘Millennials’ (those born from the early 1980s onwards), seem to show higher levels of entitlement and narcissism than previous generations; in spite of this, however, research suggests they are a more tolerant and confident group than their predecessors. Studies of the generation coming up behind that are still being done, but there are already early indications that the system as we know and understand it is in for a shock of the most seismic variety.

Even the UN has commissioned research into ways to manage the differing methodologies and expectations of employees of different generations. While these differences may warrant expensive studies, the evident similarity between them is this: each of the generations I mention above, except the last, has had to learn how to incorporate technology into their lives. That process has ended with us; those children born since the late 90s have known nothing but the ability to reference all of mankind’s knowledge at the swipe of a finger.

I have long been fascinated by what our increasing reliance on technology is doing to us as a species. If every communication advance in history brings about a great shift in human behaviour and development, then we are living through the greatest shift there has ever been – still utterly blind as to its long-term effects, and as to what the history books will come to see as our generation’s defining moments.

Perhaps, one day, historians will look at the laptop – or more likely, the smartphone – as the tool that redefined humanity once and for all. Instant communication from anywhere in the world sent directly to you. A window into your friends’ lives is now a mere ‘status update’ away, with the constraints of geography no longer an issue. It’s no wonder that as we have developed an online persona, we have each begun to curate an exhibition of the self. First impressions are no longer necessary – we can make our minds up about your character or achievements simply by looking you up. If at any point we wish to understand what the world thinks about any given subject – be it a brand, an event, the weather, that restaurant, that film everyone’s talking about – all we now have to do is look.

In the late 80s it was said that we received more information daily through the media than our Victorian counterparts received in a year. Today the figure is said to be close to three times that amount.

It is obvious to point out that this MUST be having an effect on us, both physically and mentally. Now that we are surrounded by a constant stream of information and content, we have learned to discern better what is worth our time, and anyone hoping to grab more than a moment of it must grapple with a series of factors: the time of day, whether we are at work or at home, and our relative level of interest in the content in question.

It was not long ago that every statistician was citing studies, since debunked, suggesting that the average human attention span had dropped to eight seconds – shy of what it was in 2004 – and was lower than that of a goldfish. Whoever made the original goldfish study up deserves some credit; it has been doing the rounds for over a decade. But what is certain is that in an ever-more crowded space, we are more likely to interact with content that we trust or are familiar with in some way, and will only give our time to something we deem worthwhile. Brands have had to resort to ever-more desperate measures to contend with this.

Our thumbs, thanks to texting and gaming, are more powerful and developed than those of the generations that came before. A Swiss study in 2014 found that smartphone use changes the way the brain and the thumb interact; the study’s lead scientist believes that sensory processing in the brain is continuously shaped by use of digital technology. Those who use social media frequently also show changes in mental activity: one 2011 study showed a direct correlation between the amount of grey matter in the brain – the tissue where mental processing takes place – and the number of friends the subjects had on Facebook (really!). In a study of the hormones active during certain tasks, the production of oxytocin, the hormone that stimulates trust and empathy, was shown to increase rapidly when those tested were online. There is a direct and scientific link between mental processing capacity and time spent on social networks.

A wealth of scientific studies shows that your brain – a computer still beyond our full understanding – learns to solve problems through repetition. Within the brain, a very complex sequence of electrical and neurochemical reactions occurs in response to every form of stimulation, and repeated stimulation of one area at the neglect of others can fundamentally alter the brain’s operation – one study showed that just five hours of internet surfing can change the way it works. Alarmists would (and do) suggest that “Google rewires the brain”. But then, everything rewires the brain; as any neuroscientist will attest, that is how the brain works. Greater activity in any area, even greater use of any particular muscle, causes the brain to devote more “runtime” to it.

As we dedicate more of our “runtime” to the internet and social networking, our brains and our behaviour have begun to adapt. This phenomenon, combined with the increasingly liberal viewpoints of successive generations of parents, is already having a marked and (probably) unstoppable effect on society.

Next month: Generation Why


