Were 2021 Tech Predictions On the Mark? Here Are Three That Didn’t Pan Out – Toolbox

Be it Bill Gates announcing the end of spam in 2004, Blockbuster's CEO dismissing the rise of Netflix, or Ethernet inventor Robert Metcalfe predicting the Internet would collapse by 1996, tech predictions have been going wrong for decades. If Bill Gates, one of the best-known technologists of all time, could once doubt that a 32-bit operating system would ever be made, anyone can go wrong in predicting what will and will not trend in the world of tech.

In October, Jeff Bezos, the world's second-richest person and the former CEO of Amazon, tweeted a 1999 news clipping that called him "just another middleman" and forecast Amazon's failure. "This was just one of the many stories telling us all the ways we were going to fail. Today, Amazon is one of the world's most successful companies and has revolutionized two entirely different industries," Bezos tweeted, indicating how premature judgment of new technologies and ideas can spectacularly backfire. Elon Musk, who had overtaken Bezos as the world's richest person a few days earlier, congratulated him in his characteristic style.

Bezos and Musk weren't the only tech entrepreneurs to face criticism in their early days. Even Steve Jobs was fired from Apple, the company he co-founded in a garage. With this in mind, let's look at some of the top tech predictions made for 2021 that didn't pan out as expected.

See More: What Will 2022 Hold for Technology? 8 Predictions As the Pandemic Marks a Return

Three Predictions That Didn’t Quite Pan Out As Expected

AI governance will significantly improve in 2021

The expanding role of artificial intelligence in shaping society was one of the top technology trends forecast for 2021. AI was expected to enable the proliferation of chatbots, workflow automation, autonomous driving, drug discovery, and cybersecurity, and it has; it now forms the core of digital transformation projects initiated by organizations worldwide. However, a significant worry accompanying AI adoption was controlling its use cases and preventing its use for malicious purposes.

Several technology experts predicted that 2021 would mark the year of AI governance: organizations across sectors would take steps to enhance customer trust in AI and mitigate AI bias, and governments would roll out new rules and regulations to govern its use. However, as the Carnegie Council for Ethics in International Affairs notes, AI systems and algorithmic technologies are being embedded and scaled far more quickly than existing governance frameworks are evolving. The failure to monitor the use of AI effectively can have significant ramifications for society.

While the use of AI for military purposes is concerning enough, what should worry organizations in the technology sector is the rise of adversarial AI. Brooks Wallace, VP EMEA at Deep Instinct, warns that hacker groups have already started using AI to launch cyberattacks against organizations and spread malware.

“Adversarial AI manipulates the analytic and decision-making powers of AI and ML to develop cyber attacks in ways that were previously impossible by using ML tools to attack other ML tools. It exploits weaknesses in an organization’s network to fool their systems into thinking the incoming attacks are harmless, and therefore granting free access and movement virtually undetected,” he said.
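To make Wallace's point concrete, here is a minimal, entirely hypothetical sketch of an evasion attack against a toy detector. The model, weights, and inputs are invented for illustration (real attacks target deep models, and this is not Deep Instinct's method); the idea is the gradient-sign trick: nudge a malicious input in the direction that raises the model's "harmless" score.

```python
import numpy as np

# Hypothetical toy "detector": logistic regression scoring how benign
# a network event looks. Weights and inputs are invented purely to
# illustrate the idea.
w = np.array([2.0, -1.0])
b = 0.0

def benign_score(x):
    """Probability the detector assigns to 'harmless'."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def evade(x, eps=1.0):
    """Gradient-sign evasion: shift each feature of a malicious input
    in the direction that raises the benign score, with each change
    bounded by eps so the input stays close to the original."""
    # d(benign_score)/dx = p*(1-p)*w, whose sign is simply sign(w).
    return x + eps * np.sign(w)

attack = np.array([-1.0, 0.5])                 # event the model flags
print(round(benign_score(attack), 2))          # ≈ 0.08: blocked as malicious
print(round(benign_score(evade(attack)), 2))   # ≈ 0.62: now waved through
```

A small, targeted perturbation flips the detector's decision even though the underlying event is unchanged, which is exactly the "granting free access virtually undetected" failure mode Wallace describes.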

Adversarial AI will only increase in the years to come, Wallace argued, and organizations mustn't be naïve about the genuine threat these attacks pose to their business. The most sophisticated hackers operate from the safety of their own borders and rely on anonymity to commit their crimes, so merely rolling out regulations won't be enough to tackle the menace.

Wallace said organizations could counter adversarial AI using deep learning (DL) techniques. Unlike traditional machine learning, DL can identify more complex, high-dimensional patterns and is more resilient, allowing it to outpace such attacks and resist attempts to change the model's labeling.

"With threats as sophisticated as adversarial AI, we need to make 2022 a year of cyber change. The only way organizations can do this is if we look toward genuinely innovative solutions that don't simply focus on mitigation, detection, and response. We all need to level-up and not only meet but surpass the techniques being used by our cyber adversaries," he added.

AI governance was always a huge challenge. Did the good guys lose the battle before it even began?

See More: 5 Ways Enterprises Can Make the Most of Their AI Investments in 2022

Adoption of emerging technologies to thrive in 2021

Another major technology prediction for 2021 was the accelerating adoption of new technologies across sectors. This prediction was a no-brainer. The cloud had already become commonplace and had ushered in a new industry of SaaS tools and solutions. Recent advances in networking, network security, cloud data management, and AI had organizations racing to replace legacy technologies as quickly as possible. The initial pandemic-driven disruptions had also subsided. What could go wrong?

In 2021, the Great Resignation arrived. Harvard Business Review found that more than four million Americans quit their jobs in April, and by the end of July, businesses were advertising over 10 million open positions. Worse, the phenomenon shows no sign of ending soon. A study by Korn Ferry suggests that by 2030, there will be a global talent shortage of more than 85 million people, which could result in about $8.5 trillion in unrealized annual revenues.

The study also predicted that the talent crunch would erode the US's leadership in technology, inflicting $162 billion in revenue losses annually. It will impact almost every major economy, including China, Russia, Japan, and the European Union. Bucking the worldwide trend, India could become the next tech leader, boasting a surplus of more than one million high-skilled tech workers by 2030.

The significant talent crunch, which wasn't widely discussed until recently, severely impacted the adoption of new technologies, as there weren't enough people to run them. Per Gartner's research, talent shortage was the most significant adoption barrier for 64% of emerging technologies in 2021, ahead of other serious factors such as implementation cost and security risk. In contrast, worker shortages affected the adoption of just 4% of 111 emerging technologies in 2020. According to IT leaders who participated in Gartner's survey, a shortage of skilled workers affected the adoption of 75% of automation technologies and 41% of digital workplace technologies worldwide.

See More: Can the Great Resignation be Reversed in 2022? Here’s What Employees Really Want

Data center industry will grow spectacularly in 2021

Predicting that the data center industry would thrive in 2021 was another no-brainer. Organizations' need to store, secure, and process vast amounts of data had already catapulted data centers to the center of attention in previous years, and hyperscale data centers now drive all of the leading public cloud platforms. For 2021, experts predicted rising demand for edge data centers, hyperscale infrastructure, better monitoring and visibility services, and software-defined data centers. MarketsandMarkets also forecast that the global software-defined data center (SDDC) market would grow from $43.7 billion in 2020 to $120.3 billion by 2025, at a CAGR of 22.4%.
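As a quick sanity check, the MarketsandMarkets figures are internally consistent: the growth rate implied by the 2020 and 2025 market sizes works out to the stated 22.4%.

```python
# Implied CAGR from MarketsandMarkets' SDDC forecast:
# $43.7B (2020) to $120.3B (2025), i.e. five years of growth.
start, end, years = 43.7, 120.3, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 22.4%
```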

However, just like a shortage of people disrupted the adoption of new technologies, a lack of semiconductors severely impacted the data center industry in 2021. 

Earlier this month, Final Fantasy XIV, a massively popular multiplayer online game with over 24 million registered players worldwide, suddenly stalled as Square Enix, its developer, couldn’t add new servers fast enough to meet the demand. 

“When it comes to adding new Worlds, we need tens of ‘server machines’ for every World that we add. Server machines are high-performance computers, which utilize numerous semiconductors. However, due to the Covid-19 countermeasures currently in place, many factories across the globe which produce semiconductors, have halted production or have faced labor shortages,” said producer and director Naoki Yoshida.

“This has ultimately led to a decrease in the number of semiconductors being produced, and has resulted in a worldwide semiconductor shortage. We have made considerable investment—even more so than usual—to secure the required hardware, but even so, a long lead time will be needed to prepare the server hardware.”

Even though semiconductor manufacturers ramped up production in 2021, they have struggled to meet the rapidly rising worldwide demand for chips. IBM CEO Arvind Krishna believes the semiconductor shortage won't be fully resolved until 2023 or 2024, and that hoping for a resolution in 2022 is overly optimistic. According to Logicalis Insights, data center hardware lead times stretched to 52 weeks in 2021, forcing many organizations to turn to the cloud to accelerate business-critical projects.

“This demand has pushed the semiconductor industry to the brink as global chip shortages, brought on by labor shortages and the lack of substrates that hold the chip components, threaten to bring key industries to a halt. The sudden increase in demand also creates a domino effect downstream, impacting hardware lead times for business-critical projects. 

"The result is that the shortage of available data center hardware is directly affecting organizations with on-premises IT that need to add capacity for new projects/workloads. Businesses are therefore left with a stark choice: Pause programs, perhaps indefinitely, or move to the public cloud," the company said.

Do you think the tech industry did not see the talent crunch coming? Let us know on LinkedIn, Twitter, or Facebook. We would love to hear from you!
