Happy 40th anniversary, PCMag! Times like these call for nostalgia and a look back at what we got right and wrong. I was the editor-in-chief of PC Magazine for 14 years. In the September 2001 issue, for the 20th anniversary of the IBM PC, I made some predictions about technology and how I expected it to look 20 years into the future. So now is the perfect time to revisit those assertions and tally up my hits and misses. Let’s say I didn’t quite have crystal-ball vision. But all in all, I didn’t fare too badly.

Miss: The Utmost Importance of the Smartphone

“Digital cameras will be ubiquitous, with just about everyone using computers to edit photos and digital video. Every business will use the Internet for communications, and web services will start to take shape this year. Over the next few years, your calendar will be available on the web and accessible wherever you are. You’ll be able to share it with multiple people.”

I was right: these things did happen, but I should have taken the prediction further. By 2011, digital cameras and the Internet were everywhere, and you could easily share content. What I missed was how the smartphone would swallow the digital camera market and, more important, how its portability would make it most people’s primary computing device, displacing the PC on the desk. Apple launched the iPhone in 2007, with the App Store following a year later. The rest is history.

Hit: The Genesis of Cloud Computing

“The applications I really want–real-time, accurate voice recognition and translation–are still years away, but they’re coming. In the next few years, we’ll see advances in peer-to-peer computing not only for file sharing but also for harnessing all the computing power we have out there to solve big problems.”

Yes, the idea of what we now call “scale-out” computing was already taking off. We already had software-as-a-service (SaaS) offerings such as Salesforce and, depending on how you look at it, precedents going as far back as ADP processing payroll on mainframes. Amazon Web Services launched in 2002 and soon evolved into what we now call “cloud computing.”

These platforms started as more efficient ways of running traditional applications, but they also let organizations collect, store, and analyze massive amounts of information cost-effectively. This enabled new applications and business models, accompanied by various pros and cons. I had yet to realize how important they would become as software-development platforms.

And it was the ability to train deep neural networks on GPUs, typically running those massive models on cloud architectures, that enabled speech recognition and, later, translation. Siri launched in 2010 and Alexa in 2014, and since then, such platforms have become more and more accurate, with real-time translation vastly improving in the past couple of years.

Hit: Broadband Becomes Big

“The broadband and wireless revolutions are still in early stages, and the telecommunications market is overbuilt. But I’m convinced we’ll eventually have fantastic broadband and wireless applications.”

This one’s a no-brainer, of course. As I mentioned earlier, I underestimated everything we would be doing on smartphones. But it did take years for internet traffic to catch up with, and then exceed, the capacity built during the dot-com era.

Hit: AI As a Double-Edged Sword

“I also take seriously the very real concerns about where technology is headed. I find some comfort in the slow progress within the field of artificial intelligence, but the ideas from folks like Ray Kurzweil and Vernor Vinge make me wonder.”

I was right to be concerned about where technology was headed, but I didn’t account for the AI explosion of the past decade. Deep-learning neural networks were an academic backwater when I wrote this; it would be another ten years before researchers started training them on GPUs. Combined with the massive amount of data we now have available and the cloud infrastructure to handle it, this technology has brought new accuracy to image recognition, voice recognition, and, later, all sorts of other applications.

We’ve seen a lot of utility from machine-learning algorithms and the applications they’ve made possible. Still, we’ve also seen plenty of instances in which these applications have produced unintended or biased results, along with much controversy over how they’ve been applied in the real world. We’re still grappling with these issues, and there’s no end in sight.

Hit: Nanotechnology and Biotechnology

“Nanotechnology and biotechnology are more fertile grounds for excitement and concern. For instance, the controversy about bioengineered food presages more arduous debates.”

We’ve seen many nanotechnology and biotechnology advances in the past 20 years–mRNA vaccines for COVID-19 among them–along with plenty of debate on these topics. Much of that progress has come more slowly than I might have guessed, but let’s call it a hit.
