Recapping a Decade in Information Technology

Things changed a lot in the last ten years. Computer science arguably changed more during the 2010s than it did during the 1990s. In the 1990s, the personal computer became truly personal, the internet gained widespread adoption, and ecommerce started to eat the world. In the 2000s, smartphones and social media began to revolutionize communications. During the 2010s, however, we began to see the flowering of computing concepts that were first proposed back in the days of Alan Turing.

The Cloud

Although cloud computing was first theorized in the 1960s and first realized in 2006 with the launch of AWS, the cloud gained so much momentum in 2010 that it was suddenly hard to imagine the internet without it.

The year 2010 saw two events that launched the cloud into full-scale popularity. First, Microsoft launched Azure, its cloud computing platform. Although Azure lags AWS in usage, it has also become the basis for Office 365, the world’s most popular suite of productivity tools. The fact that Microsoft was now selling a cloud product gave the cloud a new air of legitimacy.

More important, however, was the launch of OpenStack. This open-source software, released as part of a collaboration between Rackspace and NASA, allowed enterprises to turn any data center into a cloud infrastructure. This enabled a proliferation of public clouds, private clouds, SaaS offerings, and more. In addition, nearly every significant innovation of the 2010s was enabled in some way by the cloud.

Artificial Intelligence

Back in the 1960s, a university professor named Seymour Papert tasked a small team of graduate students with solving the problem of image recognition. He estimated that it would take about the length of a summer break. About half a century later, we have finally reached a point where image recognition is easy, repeatable, and widely available – and so are many other forms of artificial intelligence.

AI is not like it is in the movies. It’s not conversational, it’s not adaptable, and it’s not sentient. What it is, however, is better and faster than humans when it comes to several discrete, well-defined tasks. Information security monitoring, anomaly detection, low-level customer support, manufacturing, image recognition, and so much more – these are all areas where AI has begun to replace human intelligence and effort.
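
To show just how accessible image recognition has become, here is a minimal sketch that classifies a photo with an off-the-shelf pretrained model. It assumes PyTorch and torchvision are installed, and the file name photo.jpg is only a placeholder – the point is how little code the task now takes, not any particular tool.

    # A minimal sketch of off-the-shelf image recognition.
    # Assumes PyTorch/torchvision are installed; "photo.jpg" is a placeholder file name.
    import torch
    from PIL import Image
    from torchvision import models

    # Load a network that has already been trained on the ImageNet dataset.
    weights = models.ResNet18_Weights.DEFAULT
    model = models.resnet18(weights=weights)
    model.eval()

    # Resize, crop, and normalize the image the way the model expects.
    image = weights.transforms()(Image.open("photo.jpg")).unsqueeze(0)

    # Run the model and print the label it is most confident about.
    with torch.no_grad():
        probabilities = torch.softmax(model(image), dim=1)
    confidence, class_index = probabilities.max(dim=1)
    label = weights.meta["categories"][class_index.item()]
    print(f"Predicted '{label}' with confidence {confidence.item():.2f}")

Fifty-odd years after Papert’s summer project, a working classifier is a dozen or so lines built on top of freely available pretrained models.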

DevOps, Containerization and Microservices

Powered by the cloud and enabled by artificial intelligence, revolutionary new ways to deliver and support applications have emerged this decade.

  • First proposed in 2009, DevOps has grown to essentially take over the world of software development. DevOps has allowed companies to speed their deployments through continuous integration, testing, and delivery. Cloud environments make DevOps possible by allowing the creation of massive test environments where updates are simulated in near-real-world conditions.
  • Containerization further improved delivery speed by enabling more lightweight applications. Orchestration platforms such as Kubernetes brought AI and machine learning into the mix, at times using analytics to automatically detect and remediate errors.
  • Microservices have completely changed what software looks like. Instead of a single monolithic structure, software now takes the form of a cloud of lightweight application components, like the example sketched after this list.
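
To make the “lightweight component” idea concrete, here is a minimal sketch of the kind of tiny, single-purpose service a microservices architecture is built from. Flask is used purely as an example framework, and the route names and port number are hypothetical.

    # A minimal sketch of a single-purpose microservice, using Flask as an
    # example framework. Route names and the port number are illustrative.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/health")
    def health():
        # Orchestrators such as Kubernetes poll an endpoint like this to decide
        # whether a container is healthy or should be restarted.
        return jsonify(status="ok")

    @app.route("/orders/<order_id>")
    def get_order(order_id):
        # In a microservices design this service would own only order data;
        # users, billing, and inventory would live in separate services.
        return jsonify(order_id=order_id, status="shipped")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)

Packaged into a container image, a service like this can be built, tested, and deployed by a CI/CD pipeline on its own schedule, independently of the other components that make up the application.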

Thanks to this new paradigm, creating and updating applications will never be the same.

Quantum Computing

Back in the days of mainframes and slide rules, computer scientists predicted that an entirely new kind of computing hardware would come to pass. Quantum computers would rely not on ordinary bits, but on “qubits,” which can exist in a superposition of 0 and 1 at the same time, in order to solve problems of unparalleled complexity.
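
For a concrete (if greatly simplified) picture of what superposition means, the sketch below simulates a single qubit with NumPy: a Hadamard gate puts the qubit into an equal superposition, and measurement collapses it to 0 or 1 at random. This is a classical toy simulation for illustration only, not how a real quantum computer is programmed.

    # A toy, classical simulation of one qubit, for illustration only.
    import numpy as np

    ket_zero = np.array([1.0, 0.0])               # the qubit starts as a definite 0
    hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    # A Hadamard gate places the qubit in an equal superposition of 0 and 1.
    state = hadamard @ ket_zero

    # Measurement collapses the superposition; each outcome's probability is the
    # squared magnitude of its amplitude (here, 50/50).
    probabilities = np.abs(state) ** 2
    samples = np.random.choice([0, 1], size=10, p=probabilities)
    print("Amplitudes:", state)
    print("Ten simulated measurements:", samples)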

One of the reasons why the last ten years have been so notable is that not one, but two decades-old computer science predictions have been realized. Quantum computing has arrived. A company known as D-Wave created the first viable quantum computing system back in 2011, but it took almost another ten years for a second, more important milestone to be reached.

Although early quantum computers worked, their architecture wasn’t terribly sophisticated. As a result, they were unable to perform even as well as classical computers. Computer scientists argued that in order for a quantum computer to be considered useful, it needed to solve at least one problem that no ordinary computer could solve in any feasible amount of time – a milestone known as “quantum supremacy.”

Google’s quantum computer reached the quantum supremacy milestone in October 2019. Meanwhile, practical quantum computing applications are beginning to emerge, with Volkswagen using D-Wave computers to generate mathematically optimal bus routes. Proponents of quantum computing suggest that the field is advancing at a double exponential rate, which means that the next decade of computing looks to be even more interesting than the 2010s.

Protect Yourself from the Downside of Advancement with Device42

One thing that we learned from this decade is that every silver lining has a cloud around it. The decade that gave us machine learning and quantum computing also gave us ransomware, Cambridge Analytica, and monumentally increasing workloads for IT, development, and security staff.

Here at Device42, we’re invested in making the future easier to navigate. Our comprehensive IT asset management software helps you keep track of your changing infrastructure and application environments, map shifting dependencies, and regain control of your applications – all while continuing to integrate with emerging platforms such as Kubernetes. If you want to learn more about how Device42 can save you from future shock, sign up for a free demo today!
