Digital Biology: Shaping & Connecting Translation Research & Healthcare

The Future is Digital

As the world becomes more connected, so does the field of biology. In basic research, labs are embracing digital applications and technologies to increase productivity and agility; in manufacturing, Industry 4.0* is driving change; and in the clinic, both diagnosis and medical procedures are benefitting from digital tools. Whatever the application, investment in these digital-centric technologies is making them part of everyday science, and embracing and understanding their impact will be crucial for a successful future.

Digital Lab Applications

In the drive to achieve increased R&D returns, cost efficiency has become, and will continue to be, a driving factor for improvements in the lab. The search for these efficiencies through process, automation, and information management and access has paved the way for digital technology to slot quickly and seamlessly into the transformation of the lab environment. Take the lab pipette as an example: our entry point for basic research techniques, where these simple but compelling devices have evolved from single-channel, to multi-channel, to fully automated liquid handling systems, each step delivering gains in accuracy, robustness and volume efficiency. Simple and effective. Applying the Internet of Things (IoT), however, makes the outlook far more interesting. Automated instrumentation no longer acts as standalone equipment, but is connected to become one part of a greater set-up. This connectivity allows integration with lab management systems, enabling remote access, automated reagent ordering and information exchange. Experiments become computer-designed; simulations are run and optimised before the wet work has even started. Data can be uploaded automatically to the cloud, enabling storage of and access to terabytes of data. This digital revolution, dubbed Industry 4.0, encompasses cyber-physical systems, IoT and cloud computing, and allows control and monitoring through computer-based algorithms, in many cases also using artificial intelligence (AI) or machine learning (ML), transforming research and manufacture through the power of connectivity. We are at the beginning of a journey to build a new generation of 'smart' labs, creating the Lab of the Future (LoF).
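As a rough illustration of the connectivity pattern described above, the sketch below simulates a connected instrument that reports its reagent level to a lab-management layer, which triggers an automated reorder when stock falls below a threshold. All class names, methods and values here are hypothetical, not any vendor's real API.

```python
# Minimal sketch of a 'connected lab' pattern: an instrument exposes
# its reagent level, and a lab-management layer reorders automatically.
# Every name and number here is illustrative only.

class Instrument:
    """A liquid handler that consumes reagent as protocols run."""
    def __init__(self, reagent_ul: float):
        self.reagent_ul = reagent_ul

    def run_protocol(self, volume_ul: float) -> None:
        self.reagent_ul -= volume_ul


class LabManager:
    """Polls connected instruments and triggers reagent reorders."""
    REORDER_THRESHOLD_UL = 500.0

    def __init__(self):
        self.orders = []

    def poll(self, instrument: Instrument) -> None:
        if instrument.reagent_ul < self.REORDER_THRESHOLD_UL:
            self.orders.append("reagent restock")


handler = Instrument(reagent_ul=600.0)
manager = LabManager()

handler.run_protocol(volume_ul=200.0)   # reagent drops to 400 µL
manager.poll(handler)                   # below threshold -> reorder
print(manager.orders)                   # ['reagent restock']
```

In a real deployment the polling and ordering would run over networked services rather than in one process, but the monitor-threshold-act loop is the same.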

Digital Reality and Artificial Intelligence

Digital Reality

Digital reality encompasses both virtual reality (VR) and augmented reality (AR).  VR is an immersive, interactive experience that takes place in a computer-generated environment.  Currently the visual and auditory senses are the most commonly stimulated, although the experience is unlikely to stop there; virtual reality gloves have recently been designed.  AR shares many features with VR, but is more firmly rooted in the actual world, as virtual information is overlaid onto 'live' images.

One of the first VR systems, created in 1929 by Edward Link, was the 'Link Trainer', a commercial flight simulator. The US military was quick to see its value, and training simulators were rapidly implemented across a variety of arenas.  In the 1930s, science fiction writer Stanley G. Weinbaum first described the use of goggles to see a fictional world, and by 1962 the first 'VR' experience was realised by cinematographer Morton Heilig with the Sensorama, a prototype immersive machine that engaged multiple senses across five short films. The first head-mounted display (HMD) was subsequently launched in 1968 by Ivan Sutherland and his team; a far cry from today's headsets, it was so heavy that it had to be suspended from the ceiling.  With the global gaming market estimated to reach nearly $150 billion in 2019, it is hardly surprising that the introduction of VR and HMDs into this industry has revolutionised its products. Technological advances have driven down the size and price of HMDs, and visual and audio improvements continue to evolve; together these have made digital reality (DR) readily accessible, and it is now being transferred into other markets.


The healthcare industry has embraced VR and AR.  One of the more common practices is in education, with VR used as a learning device for surgery.  Indeed, scientists at Johns Hopkins have gone one step further by creating 3D personalised virtual simulations of cardiac patients suffering from arrhythmias.  Using computer modelling to precisely locate the origins of the electrical misfire, and the cells that perpetuate it, they can simulate a range of virtual cardiac ablations to determine the most effective and minimal region to treat.  Back in the clinic, this data has been used to guide surgeons in personalised ablation treatment for their patients.  Surgery has also seen the use of robotics increase over the last few years.  Robotic arms controlled by a surgeon using VR or AR are becoming more widespread, allowing more accurate and precise surgery, with smaller incisions, less blood loss and faster recovery times.  Improved recovery rates have also been seen with the use of VR itself, from stroke rehabilitation to curbing memory loss, pain management and physical therapy, for patients from children through to the elderly.  A key driver of DR's success is its immersive yet non-intrusive nature.

Unsurprisingly for such a visual tool, aiding sight is high on the list of DR's applications.  Caltech scientists have created an AR navigation system for the blind, allowing an object to 'talk' to the user and give them spatial input.  Around 11 million people in the US have macular degeneration, and VR headsets are already being used to enhance their sight, allowing them to regain control of their lives and enjoy the world around them. It is an exciting time for DR in our industry; although in its infancy, it is likely that within the next few years this will become a mainstream technology that we use regularly to augment our lives.

Artificial Intelligence

In essence, AI is an area of computer science in which machines are programmed to demonstrate intelligence or learning abilities (machine learning).  Whilst, in the physical sense, we might imagine the world becoming smaller in terms of habitable space or from an agricultural perspective, from a virtual perspective the world is growing exponentially and the scope for process automation is vast.  Whether in data mining, interpretation or processing, AI can take the heavy workload away from us and simplify things.  It is not difficult to see how these capabilities can be put to great effect in life science: AI systems can trawl and analyse the huge volume of cancer literature with a view to accelerating cancer treatments and discoveries, and AI diagnostic systems have been trained to identify heart disease from patient heart scans, looking set in clinical trials to outperform their human specialist counterparts.  Similar projects are reporting diagnostic success for AI in lung cancer and in predicting coronary artery disease mortality rates.  As we immerse ourselves in the innovation phase of AI, there are still a few obstacles to overcome before the hype matches the reality. Because AI has been engaged so rapidly in life science, several key areas have not evolved at the same rate and may be limiting in the immediate future: regulation and security, especially surrounding patient data, are strict; apprehension still surrounds the technology as a degree of human control is lost; and the expertise required to integrate AI systems into our everyday science may not yet be readily available.  However, its potential is exciting, giving scientists the agility to automate processes and analyse large datasets, whether genomes, literature or samples.

Digital Healthcare


With personalised medicine in demand, genomes being sequenced (and interpreted) more cost-effectively, and digital science becoming a reality in our daily lives, we are embracing digital devices.  Life science and the healthcare industry are constantly evolving new methods to revolutionise our lives, with digital wearables becoming one of the fastest-growing markets.  Worldwide shipments of wearable devices were forecast to hit 225 million units in 2019; while currently skewed towards the watch/wristband market, the life science industry is becoming increasingly interested in the healthcare benefits.  The major companies are currently jostling for position in the watch market, with devices mostly used as fitness trackers and seen as preventative health tools, encouraging changes in behaviour and self-awareness.  However, the potential of wearables as medical tools is quickly being realised.  From devices able to relieve Parkinson's hand tremors, to ultrasound devices able to aid the incontinent, it is going to be a dynamic and interesting market to stay abreast of.


Biosensors are devices used to measure the presence or concentration of a biological analyte, such as a chemical substance, a biological structure, tissue, cell or microorganism. A biosensor comprises three parts: a bio-recognition element or bio-receptor, a signal transducer, and an electronic system or reader device.  The bio-receptor binds to the analyte(s) of interest and produces a signal that the transducer can measure.  Broadly, there are four classes of bio-receptor: nucleic acid/DNA, enzymes, antigens/antibodies and cells.  Transducer types also differ, with thermal, optical, electrochemical and piezoelectric (quartz crystal microbalance (mass-based) and surface acoustic) transducers commonly used.  The ability to rapidly sense changes in the body or the environment makes biosensors extremely powerful digital tools in life science. For example, a recent publication reports the design of a sepsis-diagnosing sensor; this life-threatening illness is complex to diagnose and speed of treatment is crucial, so this new diagnostic tool could potentially save thousands of lives around the world.  Whether it is biosensors able to detect flu, detect water contamination or monitor drug efficacy in animals or humans, these tiny tools are going to make a big impact on the way our science is measured.
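The three-part architecture above (bio-receptor, transducer, reader) can be sketched as a simple signal pipeline. This is a minimal illustration under invented parameters: the binding model, gain and detection threshold are hypothetical, not taken from any real sensor.

```python
# Sketch of the three-part biosensor pipeline described above:
# bio-receptor (binds analyte) -> transducer (binding -> signal)
# -> reader (signal -> readout). All parameters are illustrative.

def bio_receptor(analyte_conc_nM: float, affinity_nM: float = 10.0) -> float:
    """Fraction of receptors bound, using a simple binding isotherm."""
    return analyte_conc_nM / (analyte_conc_nM + affinity_nM)

def transducer(fraction_bound: float, gain_mV: float = 100.0) -> float:
    """Convert the binding event into an electrical signal."""
    return fraction_bound * gain_mV

def reader(signal_mV: float, threshold_mV: float = 50.0) -> str:
    """Electronic reader: report whether the analyte exceeds a cutoff."""
    return "analyte detected" if signal_mV >= threshold_mV else "below detection limit"

signal = transducer(bio_receptor(analyte_conc_nM=30.0))
print(round(signal, 1), "mV ->", reader(signal))  # 75.0 mV -> analyte detected
```

Real transducers (thermal, optical, electrochemical, piezoelectric) differ in how the middle step converts binding into a measurable signal, but the receptor-transducer-reader chain is common to all of them.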

*Industry 4.0 is often referred to as the fourth industrial revolution and describes the current trend of automation and data exchange in manufacturing technologies. It comprises four design principles: interconnection, information transparency, technical assistance and decentralised decisions.
