The world is rapidly moving toward a post-digital era, where leaders will need to set their sights beyond their ongoing digital transformations. With digital capabilities alone no longer serving as a differentiator, future-minded business leaders will need more in their technology arsenals to succeed.[1]
Innovative technologies are catalysts for change, offering businesses extraordinary new capabilities. ‘Distributed ledger technology’, ‘Artificial intelligence’, ‘Extended reality’, and ‘Quantum computing’ (known collectively as DARQ technologies) will be the next set of emerging technologies to spark profound change, letting businesses reimagine entire industries.[2] In fact, individual DARQ technologies are already making a difference across industries today. But collectively, the DARQ technologies will also power the innovation and opportunity uniquely associated with the coming post-digital era. As the business landscape becomes increasingly dominated by digital natives and companies that have undergone successful digital transformations, DARQ is the key that will open unimagined new pathways into the future.[3]
Technology trends
Artificial intelligence (AI) today encompasses a whole host of technologies, spanning data science, computer science, electronics and the social disciplines.[4] It is a very broad field within information technology that is enabling the digital transformation of industry and society by creating computers that can learn to perform tasks normally requiring human intelligence. This includes reasoning, problem-solving, understanding language, making predictions or inferences, and perceiving situations or environments. Essentially, it involves computers leveraging learning algorithms to provide better, deeper, and otherwise practically unachievable insights in an efficient way.
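To make ‘learning from data’ concrete, here is a minimal, purely illustrative sketch (not taken from any of the cited reports): a basic learning algorithm, gradient descent, fits a straight line to noisy observations and improves its predictions with every pass over the data.

```python
# Minimal illustration of machine learning: gradient descent fits y = w*x + b
# to noisy data whose true relationship is y = 2x + 1 (all values invented).
import random

random.seed(0)
data = [(x, 2.0 * x + 1.0 + random.uniform(-0.1, 0.1)) for x in range(10)]

w, b, lr = 0.0, 0.0, 0.01  # start with no knowledge; lr is the learning rate
for _ in range(2000):
    # Gradients of the mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w  # nudge the parameters to reduce the prediction error
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")  # converges towards w=2.00, b=1.00
```

The same principle – adjusting a model’s parameters to reduce prediction error on data – underlies the far larger models behind the applications discussed below.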
This trend is about the rapid adoption of AI technologies whose increased capabilities and applications have the potential to reshape almost every industry and profession as they fundamentally change the ways in which humans interact with machines.[5] This should be considered a megatrend because of the scale and geographic reach of its potential economic and societal impacts.
The projected impacts of AI are significant. Many different projections and estimates exist, but to give a sense of their magnitude, the UN Conference on Trade and Development’s Digital Economy Report[6] estimates that AI has the potential to generate USD 13 trillion of additional global economic output by 2030, contributing an additional 1.2% to annual GDP growth. The Future Today Institute predicts that the global AI market will grow at a CAGR of 42.2% from 2021 to 2027.[7]
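To see what a projection like this implies, the short sketch below compounds a market size at that CAGR; the starting value of 100 is purely hypothetical, chosen only to show the multiplier.

```python
# Hypothetical illustration of a 42.2 % compound annual growth rate (CAGR):
# a market of size `start` growing at rate r for n years reaches start*(1+r)**n.
start = 100.0  # hypothetical market size (arbitrary units)
cagr = 0.422
for year in range(2021, 2028):
    print(f"{year}: {start:.1f}")
    start *= 1 + cagr
# Output rises from 100.0 in 2021 to roughly 827 in 2027: a 42.2 % CAGR
# multiplies the market more than eightfold over six years.
```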
AI enabling next generation applications
Over the coming decades, applications of AI have the potential to change our lives for the better in many ways. Some key examples include:
- Changing the labour market: With AI’s huge potential to boost productivity and economic growth, it will certainly have a significant effect on the labour market. In the short term, automation driven by AI (e.g. ‘Robotics’) could disrupt many current jobs.[8] But in the longer term, AI promises to create a significant number of new jobs, while removing the need for humans to do unsafe and repetitive tasks (see ‘Effects of automation’).[5]
- Better healthcare: AI is already transforming healthcare in areas such as pathology and radiology, by improving the speed and accuracy of diagnosing diseases such as breast cancer.[9] In future, AI will facilitate personalized medicine and drug development (e.g. allowing tailored drug treatments based on an individual’s genetic markers), help to eradicate infectious diseases and even predict future disease outbreaks that may originate in animals.[9,10]
- Personalized education: As with personalized healthcare, AI could be used for personalized learning – targeting teaching to the gaps in an individual student’s knowledge and creating customized learning content.[11]
- More efficient production and consumption: AI is widely predicted to increase industry and worker productivity, which is why so many companies are interested in adopting it in some form or other in pursuit of competitive advantage.[5] McKinsey estimates that 70% of companies may adopt at least one AI technology by 2030.[12] AI will also increasingly be used to identify more efficient delivery routes or supply chains[13] and to maximize efficiency and sustainability in agriculture[10], which could lead to significant economic growth.
- More efficient and effective governance: AI could help formulate and evaluate the effectiveness of government policies and even be used to perform legal tasks that require the sifting and analysis of huge amounts of data.[13]
Challenges and risks
Although the expected benefits of AI are enormous, good governance of the technology will be essential if we are to realize them. The development of appropriate legal and ethical frameworks for AI will be critical to build societal trust and to mitigate the potential risks and challenges. International Standards will have an important role to play as part of such frameworks to ensure the responsible adoption of AI.
- Social implications: Because the ability to adopt and benefit from AI is dependent on the presence of adequate digital infrastructures, relevant technical skills in the workforce, and appropriate regulatory systems, AI has the potential to widen the technology gap between those who have the capabilities to benefit from it and those who do not. This is the case for both countries and companies (large corporations may be able to determine who has access to AI and its benefits).[7] Those with access to the technology could potentially use it for malicious purposes, for example, by creating deepfakes (using AI to alter videos of people) and tailored, online communications (a sort of ‘personalized propaganda’) to radicalize or manipulate people.[14] But even without malicious intentions, the sheer volume of data that AI consumes creates potential privacy issues – “As AI evolves, it magnifies the ability to use personal information in ways that can intrude on privacy interests by raising analysis of personal information to new levels of power and speed”.[15] Indeed, AI is advancing so rapidly that it could one day raise difficult questions about what it means to be ‘human’. For example, AI is learning to do things that even humans often find difficult – reading human expressions, interpreting the emotions behind them, and analyzing a person’s level of emotional engagement. Researchers are even working on teaching AI to convincingly exhibit human emotions.[7,9]
- Legal and ethical implications: Advanced AI that can make autonomous decisions could be applied to medical diagnoses, legal judgements or even used in warfare. This could be problematic because of the risk of bias in AI – if a system is trained on incorrect or skewed data, algorithmic discrimination could be deployed on a large scale (e.g. some facial analysis AI has been shown to be less accurate at identifying minorities and women, because the data it was trained on was not representative).[16,17] AI has the potential to reduce the impact of human biases, but only if humans can identify and adequately address bias in AI – a first step is measuring it, as the sketch after this list illustrates.
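As a concrete illustration of how such bias can be detected, the sketch below compares a classifier’s accuracy across demographic groups; the records and group names are entirely invented for the example.

```python
# Hypothetical sketch: measuring group-level accuracy gaps in a classifier.
# Each record is (group, true_label, predicted_label); all values are invented.
from collections import defaultdict

predictions = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 1),
    ("group_b", 1, 0), ("group_b", 0, 0), ("group_b", 1, 0), ("group_b", 0, 1),
]

hits, totals = defaultdict(int), defaultdict(int)
for group, truth, pred in predictions:
    totals[group] += 1
    hits[group] += int(truth == pred)

accuracy = {g: hits[g] / totals[g] for g in totals}
print(accuracy)  # {'group_a': 0.75, 'group_b': 0.25}

# A large gap between groups is a warning sign that the training data or the
# model is unrepresentative and must be addressed before deployment.
print(f"accuracy gap: {max(accuracy.values()) - min(accuracy.values()):.2f}")
```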
What is on the horizon for AI? (only a few examples of many…)
- Edge computing: This is a system of moving computation nearer to the sources of data, or the ‘Edge’. Moving AI workloads to the Edge (so that AI processing and decision making are performed nearer to the source of data generation, rather than in the Cloud) makes them faster and safer.
- System on a chip: Development of advanced chips with a complex series of components that are designed to work on AI projects and deliver faster and more secure processing.
- Digital twins: The use of AI to significantly improve ‘digital twin’ technology (virtual representations of real-world environments or products; a minimal sketch of the concept follows this list).
- AI to detect AI: New measures to regulate creation and detection of deepfakes will be complemented by AI systems designed to identify deepfakes, whether these counterfeits are text or imagery.
- Emotion AI: Software that can read human vocal and facial expressions, understand human emotions and the cognitive states underlying them. Uses will include telehealth, online learning, and virtual meetings/events.
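As referenced under ‘Digital twins’ above, the hypothetical sketch below shows the core idea in miniature: a virtual object is kept in sync with sensor readings from a real asset and is then queried in place of the asset itself. The class, thresholds and readings are all invented for illustration.

```python
# Hypothetical sketch of a digital twin: a virtual pump mirrors a real one.
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Virtual representation of a physical pump (all numbers invented)."""
    temperature_c: float = 20.0
    vibration_mm_s: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, temperature_c: float, vibration_mm_s: float) -> None:
        # Keep the twin in sync with the latest sensor readings.
        self.temperature_c = temperature_c
        self.vibration_mm_s = vibration_mm_s
        self.history.append((temperature_c, vibration_mm_s))

    def needs_maintenance(self) -> bool:
        # Toy rule: flag the real pump when the twin's state looks abnormal.
        return self.temperature_c > 80.0 or self.vibration_mm_s > 7.0

twin = PumpTwin()
twin.ingest(temperature_c=85.2, vibration_mm_s=3.1)  # simulated sensor feed
print(twin.needs_maintenance())  # True: schedule maintenance on the real pump
```

In the trend described above, AI takes over the ‘toy rule’ part, learning what abnormal looks like from the accumulated history instead of relying on fixed thresholds.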
- ISO/IEC CD 27090 [Under development] Cybersecurity — Artificial intelligence — Guidance for addressing security threats and failures in artificial intelligence systems
- ISO/IEC WD 27091.3 [Under development] Cybersecurity and privacy — Artificial intelligence — Privacy protection
- ISO/IEC CD TR 23888-1 [Under development] Information technology — Artificial intelligence for multimedia — Part 1: Vision and scenarios
- ISO/IEC CD TR 23888-3 [Under development] Information technology — Artificial intelligence for multimedia — Part 3: Optimization of encoders and receiving systems for machine analysis of coded video content
- Information technology — Artificial intelligence — Assessment of machine learning classification performance
- Artificial intelligence — Data quality for analytics and machine learning (ML) — Part 1: Overview, terminology, and examples
- Artificial intelligence — Data quality for analytics and machine learning (ML) — Part 2: Data quality measures
- Artificial intelligence — Data quality for analytics and machine learning (ML) — Part 3: Data quality management requirements and guidelines
- Artificial intelligence — Data quality for analytics and machine learning (ML) — Part 4: Data quality process framework
- ISO/IEC FDIS 5259-5 [Under development] Artificial intelligence — Data quality for analytics and machine learning (ML) — Part 5: Data quality governance framework
- ISO/IEC CD TR 5259-6 [Under development] Artificial intelligence — Data quality for analytics and machine learning (ML) — Part 6: Visualization framework for data quality
- Information technology — Artificial intelligence — AI system life cycle processes
- Information technology — Artificial intelligence — Guidance for AI applications
- Information technology — Artificial intelligence — Reference architecture of knowledge engineering
- ISO/IEC DTS 6254 [Under development] Information technology — Artificial intelligence — Objectives and approaches for explainability and interpretability of ML models and AI systems
- Information technology — Artificial intelligence — Data life cycle framework
- Information technology — Artificial intelligence — Controllability of automated artificial intelligence systems
- ISO/IEC DIS 12792 [Under development] Information technology — Artificial intelligence — Transparency taxonomy of AI systems
- ISO/IEC CD TS 42119-3 [Under development] Artificial intelligence — Testing of AI — Part 3: Verification and validation analysis of AI systems
- Information technology — Artificial intelligence — Overview of machine learning computing devices
- ISO/IEC AWI TR 18988 [Under development] Artificial intelligence — Application of AI technologies in health informatics
- ISO/IEC DTR 20226 [Under development] Information technology — Artificial intelligence — Environmental sustainability aspects of AI systems
- ISO/IEC CD TR 21221 [Under development] Information technology — Artificial intelligence — Beneficial AI systems
- ISO/IEC WD TS 22440 [Deleted] Artificial intelligence — Functional safety and AI systems — Requirements
- ISO/IEC AWI TS 22443 [Under development] Information technology — Artificial intelligence — Guidance on addressing societal concerns and ethical considerations
- Information technology — Artificial intelligence — Artificial intelligence concepts and terminology
- Framework for Artificial Intelligence (AI) Systems Using Machine Learning (ML)
- ISO/IEC AWI 23282 [Under development] Artificial intelligence — Evaluation methods for accurate natural language processing systems
- Information technology — Artificial intelligence — Guidance on risk management
- Information technology — Artificial intelligence (AI) — Bias in AI systems and AI aided decision making
- Information technology — Artificial intelligence — Overview of trustworthiness in artificial intelligence
- Artificial intelligence (AI) — Assessment of the robustness of neural networks — Part 1: Overview
- Artificial intelligence (AI) — Assessment of the robustness of neural networks — Part 2: Methodology for the use of formal methods
- ISO/IEC AWI 24029-3 [Under development] Artificial intelligence (AI) — Assessment of the robustness of neural networks — Part 3: Methodology for the use of statistical methods
- Information technology — Artificial intelligence (AI) — Use cases
- Information technology — Artificial intelligence — Overview of ethical and societal concerns
- Information technology — Artificial intelligence — Process management framework for big data analytics
- Software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Quality model for AI systems
- ISO/IEC CD TS 42119-2 [Under development] Artificial intelligence — Testing of AI — Part 2: Overview of testing AI systems
- Information technology — Governance of IT — Governance implications of the use of artificial intelligence by organizations
- Information technology — Artificial intelligence — Management system
- ISO/IEC FDIS 42005 [Under development] Information technology — Artificial intelligence — AI system impact assessment
- ISO/IEC DIS 42006 [Under development] Information technology — Artificial intelligence — Requirements for bodies providing audit and certification of artificial intelligence management systems
- ISO/IEC AWI 42102 [Under development] Information technology — Artificial intelligence — Taxonomy of AI system methods and capabilities
- ISO/IEC AWI TR 42103 [Under development] Information technology — Artificial intelligence — Overview of synthetic data in the context of AI systems
- ISO/IEC CD 42105 [Under development] Information technology — Artificial intelligence — Guidance for human oversight of AI systems
- ISO/IEC CD TR 42106 [Under development] Information technology — Artificial intelligence — Overview of differentiated benchmarking of AI system quality characteristics
- Road vehicles — Safety and artificial intelligence
- ISO/FDIS 18374 [Under development] Dentistry — Artificial intelligence (AI) and augmented intelligence (AuI) based 2D radiograph analysis — Data generation, data annotation and data processing
- ISO/CD TR 12786.2 [Under development] Intelligent transport systems — Big data and artificial intelligence supporting intelligent transport systems — Use cases
- Health informatics — Applications of machine learning technologies in imaging and other medical applications
- Consumer protection — Privacy by design for consumer goods and services — Part 1: High-level requirements
- Consumer protection — Privacy by design for consumer goods and services — Part 2: Use cases
Extended reality (XR) refers to environments that combine the real and the virtual through the use of computer technology and wearable devices. XR technologies comprise virtual, augmented and mixed reality (VR, AR and MR respectively).[18] Each is a specific route towards XR, and ultimately the Metaverse: VR is fully digital and immersive, AR digitally enhances our view of the real world and, more recently, MR creates a hybrid reality where virtual and real worlds coexist.
XR technologies are transforming the way that people interact, live and work by offering access to a new mode of social interactions within the digital space.[2,8] The endgame is the full development of the Metaverse, an online digital world where people can interact with each other and with the computerized environment to do a variety of activities as an extension of reality.
As part of the emerging DARQ technologies, XR is a building block in many companies’ innovation strategies, with the power to significantly transform industries. With combined global spending on AR and VR expected to reach USD 160 billion by 2023, and repercussions in both the leisure and business sectors, this is a trend of rapidly increasing significance.[2]
The experience economy: from ownership to usership in the digital space
AR and VR immersive technologies have been in use for some time already (especially in online games), but their application is increasingly business-focused, helping the field’s rapid expansion. Epic Games’ Unreal Engine, for example, used in the popular Fortnite game, created an online digital space for users to exchange and participate in a multiplicity of experiences. This gaming engine can also be used for business purposes, with architectural firms using it to showcase their designs to clients, or Finnair using it to build a digital twin of Helsinki Airport for staff training purposes.[2]
The uptake of XR technology in business can be linked back to other societal trends, such as the development of ‘The experience economy’. In the experience economy, which is slowly replacing consumerism, businesses sell experiences rather than products. The customer is fully involved in the customization process, shifting their role from ownership to usership.[19] With XR technologies becoming cheaper and more sophisticated, the opportunities for customization are endless, allowing users to immerse themselves in places or situations, whether to shop, interact, work, or travel.
In addition, the COVID-19 pandemic is accelerating the need to move everyday experiences to the digital space in order to limit physical interactions, and technologies are rapidly evolving to offer virtual access to a multiplicity of experiences in response to mobility restrictions and isolation policies such as teleworking.[19] With such technologies, customers can try on outfits in the virtual space and see themselves from multiple angles, or travel and work from the comfort of their armchairs. XR technologies thus open the door to alternative approaches to address current social needs, from wellness tourism[19] to training opportunities and even criminal rehabilitation, simulating real life scenarios to prepare offenders before their reintegration into society.[13]
Innovation in communication and visual technologies accelerating the uptake of XR technologies
This expansion in XR use is driven largely by innovations in communication and visual technologies that are improving the user experience and making these technologies more popular and accessible to the general public. Key enabling developments include portability, high-speed Internet access, graphic and sound quality, as well as GPS data, all of which increase the potential reach of these technologies.[13] With the evolution of wearable XR technologies such as smart-glasses or contact lenses that include quality sensors, users can experience their surroundings with additional computer-generated inputs that appear real.[8]
In manufacturing, for example, the development of such sensors and AR glasses can help workers with efficiency and safety by giving hands-free access to user manuals and audio instructions, helping them locate items, tracking stock in real time, warning the wearer of equipment needing maintenance or showing defects.[13] The most recent devices, such as the HoloLens 2 (a pair of MR smart-glasses developed and manufactured by Microsoft), can now understand the characteristics of items in their field of vision rather than simply attesting that they exist, which means they can identify and warn the wearer of hazards rather than simply point to the presence of objects.[2] The extensive applications of such technologies lead experts to predict that the use of VR and AR will grow at an annual rate of over 80% over the next few years. In fact, in the next few decades, electronic communication and information sharing using AR and VR, such as livestreams or videos, are expected to take over from traditional text and images.[13]
Risks
The increasing use of cyberspace to perform everyday activities could give more influence and authority to non-traditional actors, possibly leading to the creation of new forms of authority beyond individual countries.[13] Already, such groups can use social media to exert significant societal pressure – one example is how users of Reddit (a social news aggregation, Web content rating and discussion Website) managed to shake the stock market through ‘meme-stocks’ and the coordinated buying of GameStop stock by retail investors in 2021.[20]
In a pessimistic scenario, expanding the competitive space to the digital realm could also provide a new medium for conflict and warfare, which is already seen with the rise in cyber-terrorism. The virtual arena and XR technologies provide increasing opportunities for misinformation and propaganda, as well as avenues for cyber-attacks and hybrid forms of conflict.[13]
Another risk is that XR technologies might complicate pre-existing issues linked with digital technologies and social media, such as those related to the protection of identity and ownership, as well as the risk of misinformation and bias. Examples of fake news and deepfake videos using deep learning technologies highlight the risks of XR innovations and the increasing difficulty of distinguishing what is real from what is digitally constructed.[13,21]
Conclusion
Extended reality technologies such as AR and VR, together with the constant evolution of the digital space towards the Metaverse, form a promising field that can further the user experience in business and leisure alike. Additional trending technologies such as the roll-out of ‘5G’ will further support the development of XR experiences by enabling more people to be connected at the same time and to enjoy a quality experience with minimal latency.[2]
Together with the other DARQ technologies, XR holds the power to radically modify how we behave and interact. Its development will depend directly on innovation in communication and visual technologies, with which it shares similar risks that must be addressed.
- Information technology — Computer graphics, image processing and environmental data representation — Information model for mixed and augmented reality content — Core objects and attributes
- Computer graphics, image processing and environmental data representation — Augmented and virtual reality safety — Guidance on safe immersion, set up and usage
- ISO/IEC DIS 9234 [Under development] Information technology — Information modelling for VR/AR/MR based learning, education and training systems
- Information technology — Computer graphics and image processing — The Virtual Reality Modeling Language — Part 1: Functional specification and UTF-8 encoding
- Information technology — Computer graphics and image processing — The Virtual Reality Modeling Language (VRML) — Part 2: External authoring interface (EAI)
- ISO/IEC CD TR 16088 [Under development] Constructs for visual positioning systems in mixed and augmented reality (MAR)
- Information technology — Computer graphics, image processing and environmental representation — Sensor representation in mixed and augmented reality
- ISO/IEC AWI 18038-2 [Under development] Computer graphics, image processing and environmental representation — Sensor representation in mixed and augmented reality — Part 2: Information model
- Information technology — Computer graphics, image processing and environmental data representation — Mixed and augmented reality (MAR) reference model
- Information technology — Computer graphics, image processing and environmental data representation — Live actor and entity representation in mixed and augmented reality (MAR)
- Information technology — Computer graphics, image processing and environmental data representation — Style representation for mixed and augmented reality
- Information technology — Computer graphics, image processing and environment data representation — Object/environmental representation for image-based rendering in virtual/mixed and augmented reality (VR/MAR)
- Information technology — Multimedia application format (MPEG-A) — Part 13: Augmented reality application format
- Information technology for learning, education and training — Human factor guidelines for virtual reality content — Part 1: Considerations when using VR content
- Information technology for learning, education and training — Human factor guidelines for virtual reality content — Part 2: Considerations when making VR content
- Information technology for learning, education and training — Catalogue model for virtual, augmented and mixed reality content
- Ergonomics of human-system interaction — Part 380: Survey result of HMD (Head-Mounted Displays) characteristics related to human-system interaction
- Ergonomics of human-system interaction — Part 820: Ergonomic guidance on interactions in immersive environments, including augmented reality and virtual reality
Blockchain technology is a form of distributed ledger technology (DLT), which provides unprecedented potential for removing intermediaries by allowing participating parties to exchange not only information but also value (money, contracts, property rights) without necessitating trust in specific, pre-determined intermediaries such as banks or servers.[5,6,22] This is because DLT enables transaction data to be validated within a system wherein control is distributed among multiple, independent participants and stored in a manner that is tamper-evident and immutable by design. By ensuring system-wide agreement about the state of the ledger, DLT can be used to promote privacy, safety, transparency, and integrity of the transaction process.[22,23]
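The tamper-evidence mentioned above comes from cryptographically chaining records: each block stores a hash of its predecessor, so altering any past entry invalidates every later hash. The minimal sketch below illustrates the principle only; it is not the data format of any particular blockchain.

```python
# Minimal sketch of why a hash-chained ledger is tamper-evident: each block
# commits to its predecessor's hash, so editing old data breaks the chain.
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

chain, prev = [], "0" * 64  # "0"*64 stands in for the genesis block
for data in ["alice->bob:5", "bob->carol:2", "carol->dave:1"]:
    prev = block_hash(prev, data)
    chain.append({"data": data, "hash": prev})

def verify(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        if block_hash(prev, block["data"]) != block["hash"]:
            return False  # recomputed hash no longer matches: tampering detected
        prev = block["hash"]
    return True

print(verify(chain))                  # True
chain[0]["data"] = "alice->bob:500"   # tamper with an early record...
print(verify(chain))                  # False: any participant can detect it
```

Because every participant holds a copy of the chain and can re-run this check, no single intermediary has to be trusted.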
Distributed ledgers open up many new possibilities; for example, for monitoring the supply chain or managing digital rights. DLT is therefore regarded as a central enabler for digital, self-executing contracts, so-called smart contracts.[23]
Many industry leaders have already achieved significant business benefits through DLTs, including greater transparency, enhanced security, improved traceability, increased efficiency, faster transactions, and reduced costs.[24] Financial services and banking are the most frequently targeted sectors for DLT service providers. Capital markets are clearly dominating, followed by insurance and trade finance.[25] Research from Gartner says that 300 million blockchain transactions were processed through the end of 2017 and assets worth more than USD 270 billion were being managed using DLT.[23,26]
Blockchain is best known as the technology behind cryptocurrencies (see ‘New business models’)[1,6], but is increasingly known for its role in facilitating the trading of non-fungible tokens (NFTs). While cryptocurrencies (like physical money) are ‘fungible’, meaning they are equal in value and can be traded or exchanged for one another (one dollar is always worth another dollar; one Bitcoin is always equal to another Bitcoin), each NFT has its own digital signature that makes it impossible to exchange one NFT for another as an equal, as no two are alike (hence, non-fungible).[27] NFTs are digital assets with programmed scarcity, and as such are ideal to represent ownership of unique virtual assets and digital identities in Web 3.0 and the Metaverse.[28]
But blockchain can be used for a much wider variety of applications beyond cryptocurrency and the financial services and banking sectors. Although much focus is still put on monetary uses, there is an increasing interest in non-monetary uses and applications, e.g. digital identity, healthcare, supply chain, and energy.[25] For example:
- In West Africa and Kenya, blockchain has enabled the efficient verification of property records and transactions, and expanded access to credit in some previously informal sectors of the economy.[29]
- The London-based start-up Resonance uses blockchain to automate the transfer of product information between brands, manufacturers, and retailers. According to Resonance, over 30% of product data in product catalogues is wrong, with each error costing an average of USD 60 to fix. The innovative technology ensures that only trustworthy information is forwarded, and that this happens anonymously. Recipients first check the data sheets they receive before integrating the information into their internal systems – for example, for material requirements planning.[23]
- In Switzerland, Streamr has developed an anti-theft sticker that protects valuable goods without revealing their location. The sticker is fitted with an array of sensors that record data such as location, acceleration, and temperature. The data collected in this way is managed by Streamr’s blockchain network and governed by smart contracts. The stickers can be used in the transport of goods, for example: customers would only find out where the goods are currently located if the forwarder violates the previously agreed terms and conditions of transportation.[23]
- In Australia, Power Ledger has developed a blockchain-based platform that enables users to invest in major renewable-energy projects. This allows users who want to invest in the expansion of renewable energy to buy small stakes in projects and accelerate their growth. The first offers are parts of a commercial solar park and a grid-connected battery storage project in Australia, which will be offered via cryptocurrencies on the blockchain.[23]
- And CSIRO has explored using blockchain to verify food provenance, so consumers can know exactly where their food came from and what has happened to it at each step of the chain.[30]
According to Gartner’s value forecast for the blockchain business[31], after the first phase of a few high-profile successes in 2018–2021, there will be larger, focused investments and many more successful models in 2022–2026. And these are expected to explode in 2027–2030, reaching more than USD 3 trillion globally.[32] In 2018, China alone accounted for nearly 50% of all patent applications for technology families relating to blockchains, and, together with the United States, represents more than 75% of all such patent applications.[33]
- Internet of Things (IoT) — Integration of IoT and DLT/blockchain: Use cases
- ISO/TR 24332 [Under development] Information and documentation — Blockchain and distributed ledger technology (DLT) in relation to authoritative records, records systems and records management
- Financial services — Security information for PKI in blockchain and DLT implementations
- ISO/DIS 5909 [Under development] Business processes and data interchange of DLT based electronic Bill of Lading
- Application of blockchain-based traceability platform for cold chain food
- ISO/CD TR 19626-3 [Deleted] Processes, data elements and documents in commerce, industry and administration — Trusted communication platforms for electronic documents — Part 3: Blockchain-based implementation guideline
- Data quality — Part 117: Application of ISO 8000-115 to identifiers in distributed ledgers including blockchains
- Blockchain and distributed ledger technologies — Use cases
- Blockchain and distributed ledger technologies — Identifiers of subjects and objects for the design of blockchain systems
- Blockchain and distributed ledger technologies — Data flow models for blockchain and DLT use cases
- ISO/CD 20435 [Under development] A Framework for Representing Physical Assets Using Tokens
- Blockchain and distributed ledger technologies — Vocabulary
- Blockchain and distributed ledger technologies — Privacy and personally identifiable information protection considerations
- Blockchain and distributed ledger technologies — Overview of existing DLT systems for identity management
- Blockchain and distributed ledger technologies — Reference architecture
- ISO/WD TS 23353.2 [Under development] Blockchain and distributed ledger technologies — Auditing guidelines
- ISO/CD TS 23516.3 [Under development] Blockchain and distributed ledger technologies — Interoperability framework
- Blockchain and distributed ledger technologies — Guidelines for governance
- Blockchain and distributed ledger technologies (DLTs) — Overview of trust anchors for DLT-based identity management
Cloud technology allows users to access scalable technology services immediately via the Internet’s existing network, promoting lower costs for infrastructure and inventory, reducing overheads, and creating leaps in computing power and speed, data storage, and bandwidth.[8] However, it has one major problem – the latency (time lag or communication delay over the network) that results from the physical distance between users and the data centres hosting cloud-based services. This problem can be overcome using Edge computing; this is a different technology from cloud computing and its relevance is set to increase as the ‘Internet of Things’ becomes ubiquitous and the sheer amount of data that needs to be moved and processed increases exponentially.[23]
This is because edge computing allows users to overcome the latency issue by performing computations near or at the source of data – data is processed directly on-site using dedicated hardware. Edge thus provides an important advantage when processing time-sensitive data, or when data processing is needed in a remote location where there is limited connectivity. In future, edge computing will be important for health care, automotive and manufacturing applications, because of the increased speed and security of processing data directly on devices (as opposed to sending it into the Cloud).[7,23]
Aside from reducing latency, edge computing has several other advantages, such as saving bandwidth and network costs, and enhancing security and privacy.[34] Microsoft, for example, claims that edge computing enables more industries to safely use the cloud and still meet their compliance requirements.[35] McKinsey finds that the industries with the most edge computing use cases are travel, transportation, and logistics; energy; retail; healthcare; and utilities.[36] Here are just a few examples of applications of edge computing (a sketch of the bandwidth-saving idea follows these examples):
- ‘Autonomous vehicles’ can gather the data produced by vehicle sensors and cameras, process it, analyze it and make decisions in just a few milliseconds to keep vehicles and pedestrians safe.
- Intelligent transportation systems enable passenger information systems, vehicle monitoring and tracking systems, intelligent surveillance of transportation vehicles and stations, intelligent traffic management systems and more. Fleet management allows organizations to intelligently manage their vehicle fleets with a variety of rich information.
- Remote monitoring of oil and gas assets can be deployed in oil and gas fields where process conditions (such as extreme temperature variations) can be effectively and safely monitored and managed offsite.[37]
- Patient conditions can be tracked in real time and treatment can be improved through better patient treatment compliance and early identification of health complications.[36]
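As noted before these examples, the bandwidth saving can be made concrete with a toy sketch: instead of streaming every raw sensor reading to the cloud, an edge device processes the data locally and uploads only a compact summary. All figures below are hypothetical.

```python
# Hypothetical sketch of edge processing: summarize raw sensor data on-site and
# send only a small aggregate upstream, instead of streaming every reading.
import json
import random

random.seed(1)
readings = [random.gauss(70.0, 2.0) for _ in range(10_000)]  # raw on-site data

# The edge device reduces 10,000 readings to one small summary message and
# performs the time-sensitive check locally, with no round trip to the cloud.
summary = {
    "count": len(readings),
    "mean": sum(readings) / len(readings),
    "max": max(readings),
    "alarm": any(r > 80.0 for r in readings),
}

payload = json.dumps(summary)
print(payload)
print(f"uploaded {len(payload)} bytes instead of ~{len(readings) * 8} bytes")
```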
According to a report by Grand View Research, the global edge computing market size is anticipated to reach USD 61.14 billion by 2028, growing at a CAGR of 38.4% over the forecast period.[38]
- Information technology — Cloud computing — Edge computing landscape
- Internet of things (IoT) — Edge computing
Quantum technologies rely on the principles of quantum physics and cover a broad range of applied areas like quantum communication, quantum computing, quantum cryptography, quantum imaging, quantum metrology, quantum sensors, and quantum simulation.
Quantum computing, in particular, could be a game changer and revolutionize the way we perform calculations.[39] Quantum computers are the next generation of computers, which operate based on the laws of quantum mechanics and are made up of quantum circuits. The fundamental building block of the quantum computer is the quantum bit or ‘qubit’, the quantum analogue of the binary digit or classical computing bit. The qubit can exist in two states (analogous to the ‘1’ and ‘0’ of the classical bit) as well as a superposition state (where it is both ‘1’ and ‘0’ at the same time). Because qubits can exist in multiple states at the same time, the quantum computer has the potential to be, for certain problems, a hundred million times faster than a traditional computer. With its help, databases can be searched faster, complex systems such as molecular-level behaviour can be modelled and simulated to make better medicines, and today’s encryption technologies can be strengthened or cracked.[13,23]
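The superposition idea can be stated precisely: a qubit’s state is a pair of amplitudes, one for ‘0’ and one for ‘1’, whose squared magnitudes sum to 1 and give the probabilities of each measurement outcome. The sketch below simulates a single qubit with ordinary arithmetic – a classical simulation for illustration, not a quantum computer.

```python
# Classical simulation of one qubit: the state is (alpha, beta), the amplitudes
# for |0> and |1>, with |alpha|^2 + |beta|^2 = 1 as measurement probabilities.
import math
import random

def hadamard(state):
    # The Hadamard gate turns a basis state into an equal superposition.
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    # Measurement collapses the superposition: returns 0 with probability
    # |alpha|^2, otherwise 1.
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

state = hadamard((1.0, 0.0))  # start in |0>; now 'both 0 and 1 at once'
counts = [0, 0]
for _ in range(10_000):
    counts[measure(state)] += 1
print(counts)  # roughly [5000, 5000]: each outcome with probability 1/2
```

A real quantum computer gains its power not from one qubit but from the exponentially large state space of many entangled qubits, which a classical sketch like this cannot reproduce efficiently.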
In the future, it will be possible to book and obtain quantum computing power via the Cloud from providers such as Amazon and IBM, triggering the era of hypercomputation.[23]
Even though quantum could be considered the most nascent DARQ technology, investment has been growing rapidly and this investment is happening at multiple levels, e.g. from companies through to supranational institutions and countries. For example, China set up the world’s first quantum cryptographic network (Jinan Project) in 2017. Meanwhile, the European Union launched a quantum flagship initiative in 2018 covering quantum communication, quantum simulation, quantum computing, quantum metrology, and sensing as well as the basic science behind quantum technologies. With a budget of at least EUR 1 billion over ten years, the long-term vision of the flagship initiative is to develop a quantum Web in Europe, where quantum computers, simulators and sensors are interconnected via quantum communication networks.[39]
In terms of private sector advancements, major players like Google, Alibaba, IBM, Baidu and Hewlett Packard are all busy doing their own research.[2] In 2021, IBM Quantum unveiled the Eagle chip, delivering 127 qubits on a single IBM quantum processor for the first time with breakthrough packaging technology. Eagle broke the 100-qubit processor barrier and is leading quantum computers into a new era. IBM anticipates that, with Eagle, users will be able to explore uncharted computational territory – and experience a key milestone on the path towards practical quantum computation.[41]
Despite the excitement and investment, however, quantum technologies are in their very early stages, and it will be a long time before they take over the market. For example, the quantum computer market of the future is only predicted to grow to about the size of today’s supercomputer market, worth around USD 50 billion (as compared to today’s market for classical computing devices, which was already worth over USD 1 trillion in 2019) and, even by 2030, none of the smartphones, tablets and computers in use will be quantum powered.[42]
- Information technology — Quantum computing — Vocabulary
- ISO/IEC AWI TR 18157 [Under development] Information technology — Introduction to quantum computing
- Information security — Security requirements, test and evaluation methods for quantum key distribution — Part 1: Requirements
- Information security — Security requirements, test and evaluation methods for quantum key distribution — Part 2: Evaluation and testing methods
References
- Understanding the DNA of DARQ (Accenture, 2020)
- Technology vision 2020. We, the post-digital people (Accenture, 2020)
- Technology vision 2019. The post-digital era is upon us (Accenture, 2019)
- Stanford University launches the institute for human-centered artificial intelligence (Stanford University, 2019)
- Digital megatrends. A perspective on the coming decade of digital disruption (Commonwealth Scientific and Industrial Research Organisation, 2019)
- Digital economy report 2019. Value creation and capture: Implications for developing countries (UN Conference on Trade and Development, 2019)
- 2021 Tech trends report. Strategic trends that will influence business, government, education, media and society in the coming year (Future Today Institute, 2021)
- Beyond the noise. The megatrends of tomorrow's world (Deloitte, 2017)
- Ten trends that will shape science in the 2020s. Medicine gets trippy, solar takes over, and humanity—finally, maybe—goes back to the moon (Smithsonian Magazine, 2020)
- Foresight Africa. Top priorities for the continent 2020-2030 (Brookings Institution, 2020)
- AI in education. Change at the speed of learning (UN Educational, Scientific and Cultural Organization, 2020)
- Global trends 2020. Understanding complexity (Ipsos, 2020)
- Global strategic trends. The future starts today (UK Ministry of Defence, 2018)
- Global risks 2035 update. Decline or new renaissance? (Atlantic Council, 2019)
- Protecting privacy in an AI-driven world (Brookings Institution, 2020)
- The global risks report 2021 (World Economic Forum, 2021)
- What do we do about the biases in AI (Harvard Business Review, 2019)
- A review of extended reality (XR) technologies for manufacturing training (Technologies, 2020)
- Future possibilities report 2020 (UAE Government, 2020)
- The Reddit revolt. GameStop and the impact of social media on institutional investors (The TRADE, 2021)
- When seeing is no longer believing. Inside the Pentagon’s race against deepfake videos (CNN Business, 2019)
- Global connectivity outlook to 2030 (World Bank, 2019)
- AGCS trend compass (Allianz, 2019)
- Top five blockchain benefits transforming your industry (IBM, 2018)
- Global blockchain benchmarking study (University of Cambridge, 2017)
- Will blockchain disrupt financial services (Gartner, 2017)
- What is an NFT? Non-fungible tokens explained (Forbes, 2022)
- Significance of NFTs in Web 3.0 and the Metaverse (Selfkey, 2022)
- Blockchain Opens Up Kenya’s $20 Billion Informal Economy (Bloomberg, 2018)
- Is your honey faking it? (Commonwealth Scientific and Industrial Research Organisation, 2018)
- Forecast. Blockchain business value, Worldwide, 2017-2030 (Gartner, 2017)
- World trade report 2018. The future of world trade: How digital technologies are transforming global commerce (World Trade Organization, 2018)
- Patent analytics report on blockchain innovation (IP Australia, 2018)
- 3 Advantages (and 1 disadvantage) of edge computing (Forbes, 2020)
- Edge computing. What it is and how it's a game-changer (CMS WIRE, 2018)
- New demand, new markets: What edge computing means for hardware companies (McKinsey, 2018)
- Examples of edge computing (Premio, 2021)
- Edge computing market growth & trends (Grand View Research, 2021)
- Future technology for prosperity. Horizon scanning by Europe's technology leaders (European Commission, 2019)
- China set to launch an 'unhackable' internet communication (BBC, 2017)
- IBM unveils breakthrough 127-qubit quantum processor (IBM, 2021)
- Quantum computers. The next supercomputers, but not the next laptops (Deloitte, 2018)