I love my Microsoft Surface tablet but the darn thing doesn’t take a charge anymore, so it has been rendered useless. I can no longer access my email account and, thus, I’m out of touch with the world for the duration of my beach vacation. My apologies if communications go unanswered.
I borrowed my son’s laptop to use in blogging, but now that wretched contraption won’t take a charge! (I’m now using my wife’s laptop, which means I’m blogging on borrowed time.) Meanwhile, Facebook stopped accepting my password. When I tried to reset the password, the security screen asked me to identify the faces of various Facebook “friends.” As it happens, I know only a small fraction of the people who have friended me, so I failed that test miserably. And in an apparently unrelated phenomenon, my Gmail account booted me out as well! Aargh. I think I’ll just go work on a puzzle.
These irritations all transpired within the space of a single day, which left me gnashing my teeth and temporarily unfit for beach-time companionship. My petty travails are inconsequential to anyone but me, but they seem symptomatic of a larger malaise: Stuff just doesn’t work like it’s supposed to. We’ve got this incredible technology, and it’s so cool that we can’t live without it, but then… it suffers from incessant glitches. Sometimes, I feel like society is headed toward one giant, Obamacare rollout-style breakdown.
Security issues are a part of the problem. Viruses, malware, spam and phishing are omnipresent threats, which means we’re required to continually update and patch our computer security. The problem gets worse over time as new technologies emerge without supplanting all of the old ones, requiring systems to be kludged together. As the Internet of Things becomes a reality, the number of connected devices grows exponentially from billions to trillions, providing more access points and vulnerabilities for infiltrators to exploit.
Another problem is the increasing complexity of IT systems. Just as hardware is kludged together, so is software. When programs have millions of lines of code (or is it billions of lines now?), there’s more stuff to go wrong. When someone tries to link incompatible systems, the complexity — and the potential for fatal conflicts — increases exponentially.
Then there’s the human factor. I’m willing to invest time learning how to use PCs, laptops, tablets, iPhones, email, and WordPress blogging software. But there comes a point when I’m tired of learning new stuff. I don’t want to have to learn my car’s IT interface, much less that of my stove, refrigerator, lights and front door lock. I just want to flip on the lights or turn on the ignition and have stuff work. I realize that young people have a bigger appetite for novelty than old guys like me, but there are millions of other old guys who think that the incremental improvement to our lives is just not worth the effort. There are limits to the technological ubiquity that humans are cognitively capable of and temperamentally willing to absorb, and I fear we’re bumping up against them.
According to Singularity theorists, computing power and artificial intelligence are advancing so rapidly that mankind will be capable of solving all these problems. But I don’t see it. While technology and IT are progressing at a geometric rate, I would argue that kludging, complexity, and the capabilities of the malevolent are hurtling along at a slightly faster rate.
In 1970, Alvin Toffler wrote a book, “Future Shock,” arguing that too much change was occurring too rapidly for people to adapt. That was 46 years ago. Now we’re experiencing Present Shock. It can’t end well.