5 Life-Changing Technological Innovations That Most People Don’t Understand Yet

Technology can be amazing and can change the world in positive ways – take breakthroughs that have been made in medicine that save lives, for example, or new developments in industrial automation that save us from having to risk our lives doing dangerous jobs or just waste them on routine and everyday activities.

However, it can also be scary – whether it’s concerns about the privacy implications of computers and the internet, or more existential fears like robots taking over the world, or the harm technology does to the environment through emissions and pollution.

Sometimes, however, fear and uncertainty are simply caused by a lack of understanding. This is not always our fault, as new technology is often first introduced to us by marketers or salespeople who are more interested in selling it as a solution to our problems than explaining exactly what it is and what it can actually do in real life!

So here’s a look at five groundbreaking developments in technology that have emerged into the mainstream in the last decade or so. In my experience, most of them are still not properly understood, which leads to many misconceptions! So I’m going to try to give a super simple explanation of what each one actually is, as well as clear up some of the common misconceptions I come across!

Artificial Intelligence (AI)

This is perhaps the number one most misunderstood technology and also one that causes a great deal of anxiety! I’m certainly not saying there’s nothing to be concerned about, or that anyone who wants to use it shouldn’t be careful. But it’s not about building robots that will one day take our jobs – or our planet!

The term “artificial intelligence”, as used today in technology and business, usually refers to machine learning (ML). This simply means computer programs (or algorithms) that, rather than being told explicitly what to do by a human operator, get better and better at a specific task as they repeat it over and over and are exposed to more data. Eventually, they may become better than humans at these tasks.

A good example of this is AlphaGo, which became the first computer to beat a human champion at the game of Go. Go is a game with more possible board configurations than there are atoms in the universe, so it would be very difficult to explicitly program a computer to react to every possible move a human player could make – which is essentially how conventional, programmatic game-playing machines, such as chess computers, work. Instead, AlphaGo played Go over and over, trying different strategies and giving higher weighting to the moves and strategies it found had a higher chance of success. In doing so, it effectively “learned” to beat a human.
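To make the idea of a program that improves by up-weighting successful choices concrete, here is a deliberately tiny sketch in Python. It is not AlphaGo’s actual algorithm (that involves deep neural networks and tree search); the three “moves” and their hidden win rates are invented purely for illustration:

```python
import random

# Toy illustration of learning by repetition: the program does NOT know
# these win rates in advance – it discovers the best move by playing.
TRUE_WIN_RATES = {"move_a": 0.2, "move_b": 0.5, "move_c": 0.8}

def learn(trials=5000, seed=0):
    rng = random.Random(seed)
    wins = {m: 0 for m in TRUE_WIN_RATES}
    plays = {m: 0 for m in TRUE_WIN_RATES}
    for _ in range(trials):
        # Mostly pick the move with the best observed win rate so far,
        # but try a random move 10% of the time to keep exploring.
        if rng.random() < 0.1:
            move = rng.choice(list(TRUE_WIN_RATES))
        else:
            move = max(TRUE_WIN_RATES,
                       key=lambda m: wins[m] / plays[m] if plays[m] else 1.0)
        plays[move] += 1
        if rng.random() < TRUE_WIN_RATES[move]:
            wins[move] += 1
    # After many games, report the move with the best observed win rate.
    return max(TRUE_WIN_RATES,
               key=lambda m: wins[m] / plays[m] if plays[m] else 0.0)

print(learn())  # the program converges on the best move without being told it
```

After a few thousand simulated games, the program reliably settles on the move with the highest hidden win rate – nobody ever told it which move was best; it weighted its way there through repetition.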

Until a decade or so ago, most people’s understanding of AI came from science fiction – specifically the robots and smart machines seen in films and TV series such as 2001: A Space Odyssey, The Matrix or Star Trek. These fictional machines were generally shown to be capable of what we call “general AI” – meaning they possess pretty much every facet of natural (human or animal) intelligence, including reasoning, learning, decision-making and creativity, and can turn those abilities to any task they need to do. Today’s real AI (or ML) is almost always what is known as “specialized” (or weak/narrow) AI – only able to perform the specific jobs it was created for. Some common examples are matching customers with items they might want to buy (recommendation engines), understanding human speech (natural language processing), or recognizing objects in camera images (computer vision).
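As an illustration of just how “narrow” a narrow-AI task can be, here is a toy recommendation engine in the spirit of the first example above. The customers and purchases are invented, and a real system would use vastly more data and far more sophisticated statistics:

```python
# Toy "recommendation engine": suggest an item based on what similar
# customers bought. Deliberately tiny – for illustration only.
PURCHASES = {
    "alice": {"laptop", "mouse", "keyboard"},
    "bob":   {"laptop", "mouse", "monitor"},
    "carol": {"novel", "bookmark"},
}

def recommend(customer):
    """Recommend the item most often bought by customers who share at
    least one purchase with `customer`, excluding items they already own."""
    owned = PURCHASES[customer]
    scores = {}
    for other, items in PURCHASES.items():
        if other == customer or not (owned & items):
            continue  # skip the customer themselves and unrelated shoppers
        for item in items - owned:
            scores[item] = scores.get(item, 0) + 1
    return max(scores, key=scores.get) if scores else None

print(recommend("alice"))  # → monitor (bob's baskets overlap with alice's)
```

Notice that this program can do exactly one thing – suggest shopping items. It has no idea how to understand speech or recognize a cat in a photo, which is precisely the point about narrow AI.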

Quantum computing

Most people can be forgiven for this. Gaining a low-level understanding of quantum computing generally requires knowledge of quantum physics that is beyond anyone who has not studied the subject academically!

But at a higher level, there are also many common misconceptions. Quantum computers are not just computers that are much faster than regular “classical” computers. Nor will they replace classical computers, because they are only better at a narrow range of highly specialized jobs – generally solving specialized mathematical problems that do not usually arise as day-to-day business computing requirements. These include simulating quantum (subatomic) systems and solving optimization problems (finding the best route from A to B, for example, when there are many variables that can be changed). One area of everyday computing where quantum computers will have a real impact is encryption – for example, securing communications so that they cannot be hacked. Researchers are already working to develop quantum-safe cryptography because of fears that some of the most advanced cryptographic protections used for government-level security could be trivially defeated by quantum computers in the future. But a quantum computer won’t let you run Windows faster or play Fortnite with better graphics!
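To see why optimization problems like route-finding get out of hand for classical computers, consider a brute-force search that checks every possible ordering of stops. The cities and distances below are made up for illustration; the point is that the number of orderings grows factorially with the number of cities:

```python
import itertools
import math

# Hypothetical distances between four cities (symmetric).
DIST = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}

def dist(a, b):
    # Look up the distance in either direction.
    return DIST.get((a, b)) or DIST[(b, a)]

def shortest_route(cities):
    """Brute-force search: try every ordering of the remaining cities
    after fixing the start, and keep the shortest total route."""
    best = None
    for order in itertools.permutations(cities[1:]):
        route = (cities[0],) + order
        length = sum(dist(route[i], route[i + 1])
                     for i in range(len(route) - 1))
        if best is None or length < best[0]:
            best = (length, route)
    return best

print(shortest_route(["A", "B", "C", "D"]))
# Four cities means only 3! = 6 orderings to check, but the count
# explodes factorially as cities are added:
print(math.factorial(19), "orderings for just 20 cities")
```

Even 20 cities already mean over 100 quadrillion orderings, which is why researchers hope quantum computers can tackle this class of problem in ways classical machines cannot.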


The metaverse

The first place many people would have heard the term “metaverse” is the 1992 dystopian sci-fi novel Snow Crash by Neal Stephenson. And when the concept went mainstream in 2021 after Facebook’s name change to Meta, a number of articles linked it to ideas found in the virtual reality (VR)-focused novel Ready Player One. But in fact, as the concept is used in technology today, it is not necessarily exclusively about VR. And it hopefully doesn’t have to be dystopian!

The fact is that no one yet knows exactly what the metaverse will look like, since it does not exist in its final form yet. Perhaps the best way to think of it is as a collection of somewhat ambiguous ideas about what the internet will evolve into next. Whatever it is, it’s likely to be more immersive, so VR, as well as related technologies like augmented reality (AR), may well play a role. However, many proto-metaverses and metaverse-related applications, such as the digital gaming platform Roblox or the virtual worlds The Sandbox and Decentraland, do not yet involve VR.

It is also likely to be built around the concept of persistence in several ways. For example, users are likely to have a persistent representation of themselves, such as an avatar, as they move between different virtual worlds and activities. Users will also expect to be able to leave a virtual world and return to it later to find that they are still in the same “instance” – which is not the case, for example, in the virtual worlds many people are used to exploring in video games, where the entire world can be reset when a new game is started.

Once it’s part of our lives, we might not even want to call it the metaverse at all – just like nobody really uses the term “world wide web” anymore. This is nicely illustrated by Apple CEO Tim Cook, who has said he doesn’t think the idea will catch on because “the average person” doesn’t really understand what it is. However, he believes that individual technologies associated with the metaverse – such as AR and VR – will be part of the internet’s evolution.


Web3

Web3, as the term is most commonly used today, refers to a different idea for the “next level” of the internet – one linked to concepts involving decentralization, blockchain technology and cryptocurrencies. This is confusing because there is another group of ideas, labeled “web 3.0”, proposed by Tim Berners-Lee – the man often referred to as the father of the World Wide Web. As with the term “metaverse”, both web3 and web 3.0 refer to what the internet could evolve into. And although the ideas are somewhat related and not necessarily mutually exclusive, they describe two different things! Confused? Don’t worry, so is everyone else!

But specifically, web3 looks forward to an internet where power and ownership are not centralized in large companies that ultimately own the servers where data is stored and software is run. For example, many believe that large social networking companies such as Facebook and Twitter have too much control over public debate, as they ultimately get to control who does or does not have a voice. A decentralized web3 social network would, in theory, be controlled by the users and function as a true democracy, without some Mark Zuckerberg or Elon Musk figure with the ability to cut off anyone they didn’t think should have a platform.

A metaverse-oriented internet can be run on web3 principles – decentralized – but does not necessarily have to be. Likewise, a web3 internet can be organized as a metaverse (with immersion and avatars as key features), but again, it doesn’t have to be. The two are compatible visions of what the internet could become, but neither depends on the other.


5G

The arrival of a new generation of mobile internet technology has brought with it its own share of misunderstandings. These include concerns about its possible impact on health: many people were worried that high-powered radio waves emitted by phones or transmitter towers could cause health problems, including cancer. But hundreds of studies conducted around the world by governments and independent research organizations have failed to find evidence that this is true.

There is also a common misconception that 5G is a single piece of technology or standard that has been implemented, and that we are now just waiting to see the results – which will mainly be faster internet on our phones. In fact, 5G is an evolving standard. Most of the infrastructure in place today relies on a slower form of 5G that effectively piggy-backs on the existing 4G LTE infrastructure. True “standalone” 5G is gradually being rolled out, which will allow the technology to reach its full potential in the coming years. This includes letting many more users connect within a limited physical area, such as a shopping mall or sports stadium, in theory eliminating the connectivity issues that often occur in densely populated locations. The real potential of 5G is not just faster data transfer, but a mobile internet that lets us move new kinds of data in new ways, enabling applications that do completely new things.

To stay updated on new and emerging business and technology trends, be sure to subscribe to my newsletter, follow me on Twitter, LinkedIn and YouTube, and check out my books ‘Tech Trends in Practice’ and ‘Business Trends in Practice’, which won the 2022 Business Book of the Year award.
