What are all these acronyms anyway?

Digital Twins, Artificial Intelligence/Machine Learning and Analytics technologies are not off-the-shelf or pre-packaged solutions. Their very definition is unclear, ranging from 3D building visualisation through complex simulation to dashboards. Providing clarity is part of our mission.

Our media is now full of technical terms like Artificial Intelligence (AI) and Modelling. This was driven chiefly by the pandemic briefings given by governments trying to reassure the public and plan for an uncertain future. These terms had previously been moving into the public domain through negative coverage of drones, security, and civil liberties.

What are these acronyms & what do they mean to me?

Technical terms have permeated our general conversation for many years as technology and software spread through everyday life, ultimately improving our quality of life. A deep understanding of all the terms is rarely needed to use technology. However, in our work lives we must make decisions and evaluate options, often involving these emerging technologies. Therefore, it's useful to understand what the terms relate to and how they can help improve business performance.

Before we begin, our discussion and glossary relate to software and technologies that deal with data, lots of data. They consume and produce huge amounts of data to provide some level of additional insight. This is enabled by the availability of very high-powered computers that are relatively cheap. It is this convergence of cheap computing, lots of data and advanced algorithms that is driving new technologies from research into work and home.

What are the acronyms?

Analytics

Analytics is the blanket term for all modern software that processes data to find some insights. 

  • Descriptive Analytics looks at data to understand what has happened
  • Diagnostic Analytics looks at data to understand why something has happened
  • Predictive Analytics tries to identify what will happen if we carry on as we are
  • Prescriptive Analytics aims to identify what actions to take to achieve a goal

As the aim of an analytics study progresses down this list, the value of the output increases greatly, as does the complexity. Truly insightful Prescriptive Analytics requires a combination of the others as well as additional techniques such as simulation, optimisation, and dynamic digital twins.
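
As a rough illustration (the dataset, figures and thresholds below are invented for this Python sketch), the same small record of pump failures can be interrogated at each level:

```python
# A toy illustration of the four analytics levels using invented monthly pump failures.
import statistics

failures_per_month = [2, 3, 3, 5, 6, 8]          # months 1..6
temps_per_month    = [10, 12, 13, 18, 21, 25]    # average temperature (degrees C)

# Descriptive: what has happened?
print("Total failures so far:", sum(failures_per_month))

# Diagnostic: why might it have happened? (crude correlation with temperature)
corr = statistics.correlation(failures_per_month, temps_per_month)
print("Correlation between failures and temperature:", round(corr, 2))

# Predictive: what happens if the recent trend continues?
monthly_increase = (failures_per_month[-1] - failures_per_month[0]) / 5
forecast = failures_per_month[-1] + monthly_increase
print("Naive forecast for month 7:", forecast)

# Prescriptive: what should we do? (a simple rule standing in for optimisation/simulation)
if forecast > 8:
    print("Recommend scheduling additional maintenance next month")
```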

A Dashboard is a tool for displaying data and the results of analytics for interpretation by users. Charts, Graphs & Maps are typical methods to display data for users. More advanced tools can also be used such as 3D models, Virtual Reality & Augmented Reality. 

Artificial Intelligence (AI) describes a class of advanced, data-consuming algorithms that is becoming very common in everyday life. Generally, they are 'neural networks' trained to recognise patterns, items, events, etc., replicating the neuron and connection pattern found within a brain. Learning means the neurons and connections are adjusted by inputting categorised information, such as pictures of animals labelled with their type. Once trained, the AI can be used to recognise animals in new pictures. This is where AI has made its biggest impact: the classification & categorisation of all types of data, including the instant facial recognition seen on an iPhone.
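
A minimal sketch of the 'train, then recognise' idea, assuming scikit-learn is installed and using its small handwritten-digits dataset in place of the animal photos described above:

```python
# A minimal neural-network classification sketch; digits stand in for labelled animal photos.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)              # 8x8 images and their labels 0-9
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small neural network: its neurons and connections are adjusted during training
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(X_train, y_train)                      # 'learning' from labelled examples

# Once trained, the model can classify images it has never seen
print("Accuracy on unseen images:", round(model.score(X_test, y_test), 2))
```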

Machine Learning (ML) is a second class of advanced, data-consuming analytic algorithm. Machine Learning is a form of advanced statistical analysis of data to identify trends such as failure rates, probabilities of failure at different ages, or the likelihood of customers leaving a supplier. As the name suggests, the 'machine' is fed with large amounts of data and 'learns' aspects and insights from it. Although there are analytics tools that can find insights automatically from datasets, in reality a Data Scientist will use experience to know which ML algorithms will yield the best results.

Using Regression Analysis on age-based asset inspections it is possible to derive a mathematical formula that describes how the assets deteriorate over time. This is Descriptive and Diagnostic Analytics. Using the formula to calculate the likelihood of failure of the same assets in future years is Predictive Analytics.
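
A hedged sketch of that idea, using invented age and condition inspection figures and a simple straight-line fit (real deterioration curves are rarely this tidy):

```python
# Illustrative only: invented age/condition inspection data for one asset class.
# Fitting a line gives a simple deterioration formula (Descriptive/Diagnostic);
# evaluating it at future ages is Predictive Analytics.
import numpy as np

age_years = np.array([1, 3, 5, 8, 12, 15, 20])
condition = np.array([95, 90, 84, 75, 62, 55, 40])   # 100 = as new, 0 = failed

slope, intercept = np.polyfit(age_years, condition, deg=1)
print(f"Condition ~= {intercept:.1f} + ({slope:.1f}) * age")

# Predictive: projected condition at age 25
print("Projected condition at 25 years:", round(intercept + slope * 25, 1))
```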

AI/ML is the generic term used to represent advanced algorithms used to gain insights from data.

Big Data is a term used to represent very large datasets and the tools used to store and manipulate them, especially for Analytics. Many industries now generate huge amounts of data every minute. Traditionally, databases held important information such as names & addresses or bank account transactions. However, companies like Amazon & eBay measure and store everything, even down to where you place the mouse or click on a screen. This information can be collected and analysed to help design user interfaces that sell goods more effectively!

The Internet of Things (IoT) is the concept that, because sensors and data are cheap, all things can be measured continuously and their data used in Analytics. Things include doors, pumps, pipes, reservoirs, engines, tunnels, almost anything! With the large amount of data collected from the Things, new insights can be gained, such as when they are likely to break or whether they are over/under capacity. Especially useful is recognising patterns of operation that are a precursor to failure, such as an automatic door opening more slowly or the pressure of a pipe dropping over time.
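
A minimal sketch of spotting such a precursor, with invented door-opening times standing in for a real IoT feed and an arbitrary threshold chosen purely for illustration:

```python
# Invented IoT readings: seconds taken for an automatic door to open, one per day.
# A drift in the recent average relative to the healthy baseline is treated as a precursor.
opening_times = [2.0, 2.1, 2.0, 2.1, 2.2, 2.4, 2.6, 2.9, 3.1, 3.4]

baseline = sum(opening_times[:5]) / 5            # typical behaviour when healthy
recent   = sum(opening_times[-3:]) / 3           # behaviour over the last few days

if recent > baseline * 1.2:                      # 20% threshold chosen for illustration
    print("Warning: door is opening noticeably slower - inspect before it fails")
```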

Modelling is a generic term used in the Analytics field to indicate that the descriptive part of the analytics is being used to predict some future state. This term was used during the pandemic to describe the prediction of the spread and impact of infections within the population. The SAGE academic models predicted the number of people that would contract Covid, as well as the numbers becoming seriously ill or dying. The models predicted the daily change in numbers so that different approaches to mitigating the spread could be explored, e.g. a general lockdown versus just wearing masks.

Unlike AI/ML, modelling is a forward-looking approach, aiming to predict future events and/or identify mitigation before events occur. Modelling can range from simply projecting forward the outputs of an ML algorithm to a full simulation of a complex system with multiple predictive models and input scenarios. This latter type of modelling can be called a Simulation or Dynamic Digital Twin.
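
A deliberately simple sketch of that forward-looking idea (not the SAGE models themselves); the starting cases and growth rates are invented scenario assumptions:

```python
# A toy infection model: each day the number of cases grows at a scenario-dependent rate.
def project_cases(start_cases, daily_growth_rate, days):
    cases = [start_cases]
    for _ in range(days):
        cases.append(cases[-1] * (1 + daily_growth_rate))
    return cases

# Two 'what if' scenarios with assumed daily growth rates
lockdown   = project_cases(start_cases=1000, daily_growth_rate=0.02, days=14)
masks_only = project_cases(start_cases=1000, daily_growth_rate=0.08, days=14)

print("Cases after two weeks, lockdown scenario:  ", round(lockdown[-1]))
print("Cases after two weeks, masks-only scenario:", round(masks_only[-1]))
```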

Digital Twin

Digital Twins originated from Computer Aided Design & Manufacture (CAD/CAM), where 3D visualisation of things enabled better and quicker results. A Digital Twin is a digital facsimile of a real-world item, such as a building, bridge, plant or factory, that includes its physical description as a 3D model and a database of its construction and assets. Created to design the real-world item, the Digital Twin can then be used through life to manage and update its description. This is particularly useful for sophisticated factories and plants where sub-assemblies and assets have different lifetimes or could be upgraded with newer, more efficient assets. Within the building industry this version of a Digital Twin is the convergence of Asset Data Management and CAD/CAM, and is known as Building Information Modelling (BIM).

A Dynamic/Operational Digital Twin is an evolution of the digital twin concept whereby the twin aims to replicate the processes and lifecycles of its constituents to better manage operation and maintenance. Data collected on each of the sub-assemblies and assets within the real-world item are subjected to Analytics that inform those processes and lifecycles. The Dynamic Digital Twin can then be run forward to simulate its operation, identify key future events and mitigating actions, or explore the impact of upgrades and changes to the processes and assets before they are implemented. This provides invaluable insight into the choices, big and small, that need to be taken, informing the likely future cost, risk, revenue, and performance of the simulated item.
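
A toy sketch of 'running the twin forward': a handful of pumps, an assumed deterioration relationship and invented costs are simulated over ten years under two maintenance policies:

```python
# A toy 'run forward' of a digital twin of a small pump fleet.
# Ages, failure probabilities and costs are invented purely to illustrate the idea.
import random

random.seed(1)
pumps = [{"id": i, "age": age} for i, age in enumerate([2, 7, 11, 15])]

def annual_failure_probability(age):
    return min(0.02 * age, 0.6)              # assumed deterioration relationship

def simulate(years, replace_at_age=None, repair_cost=50_000, replace_cost=20_000):
    total_cost = 0
    fleet = [dict(p) for p in pumps]         # work on a copy of the twin's state
    for _ in range(years):
        for pump in fleet:
            if replace_at_age and pump["age"] >= replace_at_age:
                total_cost += replace_cost
                pump["age"] = 0              # planned replacement
            elif random.random() < annual_failure_probability(pump["age"]):
                total_cost += repair_cost
                pump["age"] = 0              # unplanned failure and repair
            pump["age"] += 1
    return total_cost

print("10-year cost, run to failure:   ", simulate(10))
print("10-year cost, replace at age 12:", simulate(10, replace_at_age=12))
```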

 

Dynamic Digital Twin

Dynamic Digital Twins can be used to gain insight at any level of the business and for any time frame. Most are used to explore the impact of big strategic decisions, such as large investments with long-term impact. However, increasingly Operational Digital Twins are used to suggest actions within the day or in near real-time. Used in a live environment, an operational digital twin can hold and simulate significant parts of the business so that intra-day decisions have multiple options instantly considered and reported. Ultimately, in most organisations the 'person in the loop' has the final say and responsibility for the action to take. The advent of Operational Digital Twins is likely to release more financial savings, improve performance and lower risk.

Simulation was the term used prior to the advent of Digital Twins to represent any digital facsimile of a real-world process used to explore different configurations or options and identify their likely cost, risk, revenue, and performance. 

Optimisation is an advanced algorithmic approach from the Operational Research field used to search for and identify solutions to Simulation and Dynamic Digital Twin problems. The problem choices are encoded into the language of the solver, the solver calculation is performed, and the result is decoded back into the Simulation and applied. There are many types of Optimisation algorithm; however, they broadly split into linear and non-linear solvers. In general:

  • Linear solvers encode the problem choices as a matrix and solve using advanced matrix mathematics.
  • Non-linear solvers, such as Genetic Algorithms, encode the problem choices like DNA and then mutate and cross them over, generation by generation, in search of the ideal solution.

Optimisation is suited to situations where there are many choices that can be taken, or where choices have many different parameters describing them. As the number of choices increases it becomes impossible for a person to select the best set. This is increasingly hard where there are constraints on time, budget or other resources.

Optimisation can identify the mathematically optimal result and, with additional runs, the next-best and least-worst approaches.
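
As a much-simplified, hedged sketch of the genetic-algorithm idea: choices (here, which projects to fund within a budget) are encoded as a string of 0/1 'genes', then mutated and crossed over generation by generation; all figures and settings are invented:

```python
# Toy genetic algorithm: choose which projects to fund within a budget.
import random

random.seed(0)
costs    = [40, 30, 25, 20, 15, 10]      # assumed cost per project (thousands)
benefits = [60, 35, 40, 22, 18, 9]       # assumed benefit score per project
BUDGET = 80

def fitness(genes):
    cost = sum(c for g, c in zip(genes, costs) if g)
    benefit = sum(b for g, b in zip(genes, benefits) if g)
    return benefit if cost <= BUDGET else 0      # infeasible selections score zero

def mutate(genes):
    i = random.randrange(len(genes))             # flip one gene at random
    return genes[:i] + [1 - genes[i]] + genes[i + 1:]

def crossover(a, b):
    cut = random.randrange(1, len(a))            # splice two parents together
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in costs] for _ in range(20)]
for _ in range(100):                             # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                    # keep the fittest half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print("Best selection found:", best, "benefit:", fitness(best))
```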

'What if' scenario analysis is the process of using a Dynamic Digital Twin to explore the cost, risk, revenue and performance impact on a system of different configuration choices or actions. The Dynamic Digital Twin answers the 'what if' questions before anything is implemented, enabling multiple scenarios to be explored before action is initiated.

Uncertainty & Sensitivity analysis is different from scenario analysis in that the scenario is fixed but the input parameters are systematically varied. It is good governance to calculate the variability or sensitivity of the inputs and outputs of any analytic model or Dynamic Digital Twin. It can often be the case that input data is biased or insufficient, such that predictions made from it are very wide of reality. Systematically testing, say, a cost model with inputs 50%, 25% and 10% above and below the central input should deliver stable outputs. However, if the outputs vary a lot then the model is very sensitive and more care should be taken with the results presented.

Formal techniques to mathematically calculate the uncertainty & sensitivity of analytics models exist, with Monte Carlo Analysis being the most popular and Latin Hypercube analysis a refined version.
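
A minimal Monte Carlo sketch: a toy cost model is run thousands of times with its inputs drawn from assumed ranges, and the spread of the outputs indicates how sensitive the result is to those inputs:

```python
# Toy cost model (cost = day rate x duration) with invented input ranges.
import random
import statistics

random.seed(42)
results = []
for _ in range(10_000):
    rate = random.uniform(400, 600)           # assumed day-rate range
    days = random.uniform(80, 120)            # assumed duration range
    results.append(rate * days)

ordered = sorted(results)
print("Mean cost:                ", round(statistics.mean(results)))
print("Standard deviation:       ", round(statistics.stdev(results)))
print("5th-95th percentile range:", round(ordered[500]), "-", round(ordered[9500]))
```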

When we hear 'AI' in the media, in reality it is one aspect of a sophisticated system of systems that, used together, gives us valuable hindsight, insight and foresight.

But, what do they mean to me?

After 20 years building and deploying advanced analytics we’ve learned that the acronym(s) to employ are completely dependent upon the problem to be solved. And the availability of data. And there are no silver bullets! Even simple problems require a set of data processing steps and potentially multiple AI/ML phases to provide the required outputs. 

  • Artificial Intelligence is particularly suited to data classification and image or pattern recognition. Faces, objects and number plates can be recognised and interpreted from a single image by a suitably trained AI.
  • Text and speech can be transformed into useful information or commands using a Natural Language Processing AI trained on many voices or documents.
  • Predictive analytics relies on Machine Learning techniques to extract mathematical formulas from data and deliver forecasts of the future state of assets.
  • Dynamic Digital Twins draw all the other techniques together to provide sophisticated scenario simulation and sensitivity analysis. Populating these models requires many different streams of data, AI & ML models, and often an optimisation or Monte Carlo analysis to identify the solutions.

In reality, when we hear AI in the media it is one aspect of a system that uses multiple approaches to achieve its aim. For highly sophisticated subjects such as military drones a system of systems approach is often employed.

Some other common technology terms...

Blockchain

A Blockchain is a growing list of records, called blocks, that are linked together using cryptography. Each block contains a cryptographic hash of the previous block, a timestamp, and transaction data. The timestamp proves that the transaction data existed when the block was published, since it is included in the block's hash. Because each block contains information about the block before it, the blocks form a chain, with each additional block reinforcing the ones before it. Therefore, blockchains are resistant to modification: once recorded, the data in any given block cannot be altered retroactively without altering all subsequent blocks. To aid resilience, a Blockchain is hosted in a decentralised model across many servers.
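
A minimal sketch of the hash-linking idea only (no network, consensus or mining), using Python's standard hashlib; the transactions are invented:

```python
# Each block's hash covers the previous block's hash, so changing an old block
# invalidates every block that follows it.
import hashlib
import json
import time

def make_block(transactions, previous_hash):
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block(["Alice pays Bob 5"], previous_hash="0" * 64)
block_2 = make_block(["Bob pays Carol 2"], previous_hash=genesis["hash"])

# Tampering with the first block breaks the link to the second
genesis["transactions"] = ["Alice pays Bob 500"]
recomputed = hashlib.sha256(json.dumps(
    {k: genesis[k] for k in ("timestamp", "transactions", "previous_hash")},
    sort_keys=True).encode()).hexdigest()
print("Chain still valid?", recomputed == block_2["previous_hash"])
```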

Cryptography/Crypto is the study and practice of techniques to secure communication and data such that it is private (encryption) and can only be read by those with the decryption keys.

 

Cloud computing

Cloud computing is the evolution of serviced hosting of compute resources. As organisations like Google and Amazon built ever larger data centres to service their own clients, they realised that they needed full capacity only at peak times. To make a return when not in use, they rented the servers to third parties. This model particularly suited web applications that were built in several layers and could easily be distributed over different servers.

Further evolution in the market enabled elastic resource, where rented server capacity would automatically grow and shrink with demand.

Cloud computing now enables engineers to dispense with the concept of a server altogether, instead offering elastic services such as storage & data tools, analytics tools, and reporting & dashboards that can be configured and assembled into elastic applications.

Cloud computing is a secure, resilient, flexible, elastic, and affordable mechanism to build and deploy modern software systems.

 

Edge computing is a layer of IoT where the data generated by the Thing is analysed at or near the Thing, and only the result is sent back to the centre for further analysis. This approach gives the Thing some autonomy so important actions can be enacted more quickly.
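
A minimal sketch of the edge idea: invented vibration readings are reduced to a small summary on the device, and only that summary, rather than the raw stream, would be sent to the centre:

```python
# The device reduces a batch of raw vibration readings to a small summary locally.
# Readings, the sensor name and the alarm threshold are invented for illustration.
raw_readings = [0.21, 0.22, 0.20, 0.23, 0.35, 0.41, 0.38, 0.22, 0.21, 0.24]

summary = {
    "sensor_id": "pump-07",
    "max_vibration": max(raw_readings),
    "mean_vibration": sum(raw_readings) / len(raw_readings),
    "alarm": max(raw_readings) > 0.4,            # local decision made at the edge
}

print("Message sent to the centre:", summary)    # a few values instead of the raw stream
```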


© Tridyn Consulting LLP. All rights reserved.