As a futurist, every year, I look ahead and predict the key tech trends that will shape the next few months. There are so many innovations and breakthroughs happening right now, and I can’t wait to see how they help to transform business and society in 2022.
Let’s take a look at my list of key tech trends that everyone should be ready for, starting today.
1. Computing Power
What makes a supercomputer so super? Can it leap tall buildings in a single bound or protect the rights of the innocent? The truth is a bit more mundane. Supercomputers can process complex calculations very quickly.
As it turns out, that’s the secret behind computing power. It all comes down to how fast a machine can perform an operation. Everything a computer does breaks down into math. Your computer’s processor interprets any command you execute as a series of math problems. Faster processors can handle more calculations per second than slower ones, and they’re also better at handling really tough calculations.
Within your computer’s CPU is an electronic clock. The clock’s job is to create a series of electrical pulses at regular intervals. This allows the computer to synchronize all its components and it determines the speed at which the computer can pull data from its memory and perform calculations.
When you talk about how many gigahertz your processor has, you’re really talking about clock speed. The number refers to how many electrical pulses your CPU sends out each second. A 3.2 gigahertz processor sends out around 3.2 billion pulses each second. While it’s possible to push some processors to speeds faster than their advertised limits — a process called overclocking — eventually a clock will hit its limit and will go no faster.
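To make that arithmetic concrete, here’s a quick Python sketch; the clock speed and instructions-per-cycle figures are placeholders for illustration, not measurements of any particular chip.

```python
# Back-of-the-envelope arithmetic for clock speed.
# Both figures below are illustrative assumptions, not specs of a real CPU.

clock_speed_ghz = 3.2                       # advertised clock speed
pulses_per_second = clock_speed_ghz * 1e9   # 1 GHz = one billion pulses per second

# Many modern CPUs can complete more than one instruction per pulse;
# the value 4 here is just a placeholder for illustration.
instructions_per_cycle = 4
instructions_per_second = pulses_per_second * instructions_per_cycle

print(f"{pulses_per_second:,.0f} pulses per second")               # 3,200,000,000
print(f"~{instructions_per_second:,.0f} instructions per second")  # ~12,800,000,000
```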
As of March 2010, the record for processing power belonged to a Cray XT5 computer called Jaguar. The Jaguar supercomputer could process up to 2.3 quadrillion calculations per second [source: National Center for Computational Sciences].
Computer performance can also be measured in floating-point operations per second, or flops. Current desktop computers have processors that can handle billions of floating-point operations per second, or gigaflops. Computers with multiple processors have an advantage over single-processor machines, because each processor core can handle a certain number of calculations per second. Multiple-core processors increase computing power while using less electricity [source: Intel].
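As a rough sketch of how those factors combine, the snippet below estimates a theoretical peak in gigaflops from an assumed core count, clock speed and number of floating-point operations per cycle; none of the figures describe a specific real processor.

```python
# A back-of-the-envelope estimate of theoretical peak floating-point performance.
# All figures are assumed purely for illustration; real processors vary widely.

cores = 8                      # number of processor cores (assumed)
clock_hz = 3.2e9               # clock speed in hertz (assumed)
flops_per_cycle_per_core = 8   # e.g. via vector/SIMD units (assumed)

peak_flops = cores * clock_hz * flops_per_cycle_per_core
print(f"Theoretical peak: {peak_flops / 1e9:.1f} gigaflops")   # 204.8 gigaflops
```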
Even fast computers can take years to complete certain tasks. Finding two prime factors of a very large number is a difficult task for most computers. First, the computer must determine the factors of the large number. Then, the computer must determine if the factors are prime numbers. For incredibly large numbers, this is a laborious task. The calculations can take a computer many years to complete.
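To see why, here’s a minimal Python sketch of those two steps using naive trial division; the number being factored is deliberately tiny, because this brute-force approach is exactly what becomes hopeless once a number has hundreds of digits.

```python
# Naive factoring by trial division: find a factor pair, then check each
# factor for primality. Fine for small numbers, hopeless for very large ones.

def smallest_factor(n):
    """Return the smallest factor of n that is greater than 1."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n  # no smaller factor found, so n is prime


def is_prime(n):
    return n > 1 and smallest_factor(n) == n


n = 2021                      # tiny illustrative number; real keys are far larger
p = smallest_factor(n)
q = n // p
print(p, q, is_prime(p), is_prime(q))   # 43 47 True True
```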
Future computers may find such a task relatively simple. A working quantum computer of sufficient power could calculate factors in parallel and then provide the most likely answer in just a few moments. Quantum computers have their own challenges, however, and wouldn’t be suitable for all computing tasks, but they could reshape the way we think about computing power.
2. Smarter Devices
Smart devices are interactive electronic gadgets that understand simple commands sent by users and help in daily activities. Some of the most commonly used smart devices are smartphones, tablets, phablets, smartwatches, smart glasses and other personal electronics. While many smart devices are small, portable personal electronics, they are in fact defined by their ability to connect to a network to share and interact remotely. That is why many TV sets and refrigerators are also considered smart devices.
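As a loose illustration of that defining trait, the sketch below shows a “smart” appliance packaging up a sensor reading to share over a network; the endpoint URL, device name and payload are hypothetical, chosen purely for the example.

```python
# A minimal sketch of the network-sharing behaviour that makes a device "smart".
# The endpoint, device name and payload format are hypothetical.
import json
import urllib.request

reading = {"device": "kitchen-fridge", "temperature_c": 4.2}   # assumed sensor data

request = urllib.request.Request(
    "https://example.com/api/readings",                 # hypothetical endpoint
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# A real device would now send the request and react to the response, e.g.:
# urllib.request.urlopen(request)
print(request.full_url, request.data)
```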
3. Quantum Computing
Quantum computing is a rapidly emerging technology that harnesses the laws of quantum mechanics to solve problems too complex for classical computers.
Today, IBM Quantum makes real quantum hardware — a tool scientists only began to imagine three decades ago — available to thousands of developers. The company’s engineers deliver ever-more-powerful superconducting quantum processors at regular intervals, building toward the quantum computing speed and capacity necessary to change the world.
These machines are very different from the classical computers that have been around for more than half a century. Here’s a primer on this transformative technology.
For some problems, supercomputers aren’t that super.
When scientists and engineers encounter difficult problems, they turn to supercomputers. These are very large classical computers, often with thousands of classical CPU and GPU cores. However, even supercomputers struggle to solve certain kinds of problems.
If a supercomputer gets stumped, that’s probably because the big classical machine was asked to solve a problem with a high degree of complexity. When classical computers fail, it’s often because of that complexity.
Complex problems are problems with lots of variables interacting in complicated ways. Modeling the behavior of individual atoms in a molecule is a complex problem, because of all the different electrons interacting with one another. Sorting out the ideal routes for a few hundred tankers in a global shipping network is complex too.
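One way to get a feel for that complexity is simply to count the possibilities. The short sketch below (using an arbitrary handful of stop counts) shows how the number of possible orderings of shipping stops grows factorially, which is why brute-force search breaks down so quickly.

```python
# Why routing problems overwhelm classical machines: the number of possible
# orderings of stops grows factorially with the number of stops.
from math import factorial

for stops in (5, 10, 20, 100):
    print(f"{stops:>3} stops -> {factorial(stops):.2e} possible orderings")

#   5 stops -> 1.20e+02 possible orderings
#  10 stops -> 3.63e+06 possible orderings
#  20 stops -> 2.43e+18 possible orderings
# 100 stops -> 9.33e+157 possible orderings
```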
4. Datafication
Datafication refers to the collective tools, technologies and processes used to transform an organization into a data-driven enterprise. The buzzword describes an organizational trend of making data and its supporting infrastructure central to core business operations.
The corresponding verb is “datafy”: an organization that implements datafication is said to be datafied.
Organizations require data, and the knowledge and information extracted from it, to perform critical business processes. An organization also uses data for decision making, strategies and other key objectives. Datafication implies that in a modern, data-oriented landscape, an organization’s survival is contingent on total control over the storage, extraction and manipulation of data and the information associated with it.
5. Artificial Intelligence and Machine Learning
As a whole, artificial intelligence contains many subfields, including:
- Machine learning automates analytical model building. It uses methods from neural networks, statistics, operations research and physics to find hidden insights in data without being explicitly programmed where to look or what to conclude.
- A neural network is a kind of machine learning inspired by the workings of the human brain. It’s a computing system made up of interconnected units (like neurons) that processes information by responding to external inputs, relaying information between each unit. The process requires multiple passes at the data to find connections and derive meaning from undefined data. (A minimal sketch of such a network appears after this list.)
- Deep learning uses huge neural networks with many layers of processing units, taking advantage of advances in computing power and improved training techniques to learn complex patterns in large amounts of data. Common applications include image and speech recognition.
- Computer vision relies on pattern recognition and deep learning to recognize what’s in a picture or video. When machines can process, analyze and understand images, they can capture images or videos in real time and interpret their surroundings.
- Natural language processing is the ability of computers to analyze, understand and generate human language, including speech. The next stage of NLP is natural language interaction, which allows humans to communicate with computers using normal, everyday language to perform tasks.
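To make the neural network idea above a little more concrete, here’s a minimal sketch: a tiny two-layer network learning the XOR pattern with NumPy. The layer sizes, learning rate and number of passes are arbitrary illustrative choices, not a recipe for real-world models.

```python
# A tiny neural network: interconnected units relaying signals, adjusted over
# many passes through the data (here, the four rows of the XOR truth table).
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)               # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden (4 units)
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):                        # multiple passes over the data
    hidden = sigmoid(X @ W1 + b1)              # forward pass
    output = sigmoid(hidden @ W2 + b2)

    error = y - output                         # how far off the predictions are
    d_output = error * output * (1 - output)   # backpropagate the error
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    W2 += 0.5 * hidden.T @ d_output            # gradient-style weight updates
    b2 += 0.5 * d_output.sum(axis=0, keepdims=True)
    W1 += 0.5 * X.T @ d_hidden
    b1 += 0.5 * d_hidden.sum(axis=0, keepdims=True)

print(output.round(2))   # should end up close to the XOR targets 0, 1, 1, 0
```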