As the year draws to a close, common themes stand out. COVID-19 is still the most obvious and traumatic, but it's not the only one. Predating the pandemic and accelerated by it, the other main thread is the rapid shift from dedicated hardware to software running on faster, cheaper, more capable microprocessors, linked by Ethernet, wireless, the Internet and cloud computing, all now under the digitalization umbrella. I've repeatedly heard experts comment that it's a faster and likely bigger switch than the transitions of the first Industrial Revolution, when populations began moving from agriculture in the country to manufacturing in the cities, though I'm not sure how they're measuring and comparing events in these two time periods.
In any case, these two eras share a common thread: technical improvements by a few innovators run far ahead of everyone else's understanding, yet are still universally adopted, and irrevocably alter the world and everyone's lives. The problem, and the price, is that most biological brains, communities and cultures are much slower to catch up and adapt to environments they didn't evolve to live in, just like all the deer and other wildlife that aren't equipped to evade 75-mph vehicles with floodlights zooming through their neighborhoods.
Sure, we humans can learn to use all the new labor-saving tools, such as increasingly powerful piercing weapons, compasses and chronometers for navigation, internal combustion engines, and computers that have shrunk from room-sized to chip-sized to invisible, virtual programs—ironically supported by room-sized racks of servers.
However, these advances are accompanied by biological and psychological lags that can dilute, if not derail, them. For instance, it took tens of thousands of years for people to learn the world was round, despite ample visual evidence on every shoreline with a cliff, and despite calculations by Greek astronomers in the 3rd century B.C. that apparently proved it. Likewise, animals and people were rendered in 2-D for thousands of years before Renaissance architect Filippo Brunelleschi and others began using perspective to convey the illusion of 3-D on flat surfaces, even though foreshortening and vanishing points have always been part of our everyday vision.
Similar mental lags appear in popular culture. For example, whenever anything is damaged in Star Trek or other sci-fi dramas, why does steam shoot out everywhere as if it were a pre-diesel locomotive? And don't tell me it's "plasma." These are just antiquated concepts persisting into speculative futures because supposedly creative minds in the present are too lazy to break free of the past. It's the same reason cybersecurity content is still illustrated with locks and barriers, and software topics are rendered with ones and zeros. No one seems able to show what these subjects truly are and what they're really doing.
Likewise, hardwiring was long believed to be more reliable than wireless mainly because users could see and touch it. This prejudice was based on appearances, too, and didn't take into account that wireless devices and networks are constantly cycling and polling, making them more reliable in practice than hardwiring. It just took some added time and attention for users to understand this counterintuitive concept.
That's the solution: time and attention. In other words, a little patience from the famously impatient, plus a willingness to look beyond the ends of our noses, which so many of us are even more unwilling to do.
The good news is most technical professionals in process control and automation and many other fields have already made these leaps. In fact, the whole discipline is based on using trustworthy devices to sense physical states, properties and parameters, which are far removed from the usual sight and senses, and communicating them via trusted networks to points where analysis can happen and better decisions can be made. Granted, Ethernet, wireless, Internet and the cloud continue to stretch these networks and relationships, but they were already lengthened before, so there's no good reason why we can't go a little further, right?