Our work in pharma

Curative therapeutics: the need for systems-level intelligence in combination with artificial intelligence.

A working thesis on how, as an industry, we can design more effective personalised therapies that significantly extend healthy lifespan.


1. The current rate of therapeutic failure is unsustainable and unnecessary.

2. The reasons are well understood, from poor models to a lack of specificity, but are often simply put down to “science is hard”.

3. In reality, the knowledge and toolsets are available; the problem persists because of entrenched biases at every stage, from publishing papers to marketing to clinicians.

4. The emergence of personalised and precision therapeutics creates the opportunity to address these entrenched and repeated failures.

5. This includes leveraging computation in combination with human ingenuity to unpick complex dynamic systems, moving the computation into the therapeutic itself, addressing root causes directly, and driving the creation of better models and markers.


Areas for venture creation:
Cystic Fibrosis

We’ve partnered with the Cystic Fibrosis Foundation and are currently looking for people with broad technical knowledge in gene therapy, cell therapy, cellular engineering or synthetic biology, an interest in science commercialisation, and the dream of building novel therapeutic solutions that can change lives. View further details about this opportunity and how to apply here.


Oncology

We’ve launched ConcR and ImmTune and, as a result of our partnership with Cancer Research UK, launched Enedra, Neobe and Stratosvir.


Neurodegeneration

We’ve launched Reflection Therapeutics.

Microbiome and AMR

We’ve launched CC Bio and Ancilia.

Considering the systems level, from biology to payer

At DSV we aim to get back to first principles and understand the root causes and bottlenecks behind past failures: not just failures in the science but, perhaps even more importantly, the gaps and biases in the innovation pipeline. Like many industries, drug development is optimised for short-term wins regardless of long-term outcomes, a consequence of fragmentation even within the same organisation. This has led to the current situation in which, for example, only 10% of drugs succeed at Phase 3 clinical trials, and even those that pass work in fewer than a third of patients. Everyone knows that current outcomes aren’t good enough, and that they aren’t simply a consequence of the science being ‘hard’. Yet it is almost impossible to perturb a system with such strong incentives to get the paper out, pass the trial, get the deal done and persuade the doctor.


The elephant in the room

Targeting uniformity in a complex dynamic system

There are only a small number of diseases in which intervention at a single point, common across all patients, can sufficiently and persistently shift the system towards a healthy state. In those cases, medicinal chemistry and antibodies have often been extremely effective. However, most diseases consist of a complex set of dynamic failures, from genetics and epigenetics to neural activity, and these differ across patients, within patients and over time. Here, incorrectly perturbing the system can either achieve net zero or throw it into a dangerous runaway state. The complete lack of therapeutic outcomes from the isolated study of genetics is a sobering reminder of this challenge.
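The risk of mis-perturbing such a system can be made concrete with a toy dynamical model. The sketch below is purely illustrative (the equation, threshold and parameter values are our own minimal example, not a model of any real disease): a system with a healthy steady state, a tipping point, and a runaway state, where an under-shot perturbation relaxes back to net zero and an over-shot one tips the system into runaway.

```python
def simulate(x0, a=0.3, dt=0.01, steps=2000):
    """Euler-integrate the toy bistable system dx/dt = x(x - a)(1 - x).

    x = 0 is the healthy steady state, x = 1 the runaway disease state,
    and x = a the tipping point between them (illustrative values only).
    """
    x = x0
    for _ in range(steps):
        x += dt * x * (x - a) * (1.0 - x)
    return x

# A perturbation that stays below the tipping point relaxes back to
# the healthy state (net zero effect)...
print(simulate(0.2) < 0.05)   # True
# ...while one that crosses it drives the system into the runaway state.
print(simulate(0.4) > 0.95)   # True
```

The point of the sketch is that the two starting points differ only modestly, yet the long-run outcomes are qualitatively opposite, which is exactly what makes dosing a complex dynamic system so unforgiving.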

Unrepresentative models

Trials often fail because the model used to demonstrate effectiveness in preclinical studies reproduces the end point, but not the underlying cause. This is becoming all the more evident as medicines move towards modifying complex systems such as the immune system, microbiome and nervous system, all of which share many of the same components with mice, but behave in a vastly different way at the systems level. This drives further hacks such as testing microbiome and immune products in germ-free mice, even though microbes are clearly essential to the function of those systems. Perhaps the most depressing example is the seemingly endless series of failures of Alzheimer’s drugs derived from models that over-express amyloid rather than accurately represent the disease.

The lack of appropriate biomarkers

We all know that surrogate biomarkers often bear little relation to primary endpoints and vary hugely across diseases and patients. However, there is little incentive to fix this at the local level: trials evaluating new biomarkers appear more likely to fail, and stratification makes it far harder to recruit sufficient numbers of patients. This challenge is clearest in fields such as neurodegeneration, psychiatric disease and longevity, but it is also one of the key reasons behind failures in many of the currently most hyped fields, including immuno-oncology.

Hacking rather than addressing specificity

Targets are rarely unique to the system of interest and rarely respond in a simple on/off fashion, but rather along a distribution. Indeed, the number of potential allosteric molecular interactions is in the hundreds, even for approved drugs. Likewise, antibodies bind all over the place, and the current animal-based production method severely limits the options for avoiding this. In combination, this makes it extremely difficult to find a dose that perturbs the system of interest but not others.


Our approach 

1. Forge a less linear R&D process, identifying analytical and modelling approaches that fully capture complexity and patient specificity.

2. Forge a truly multi-stakeholder creation process with academics, venture capital, charities and industry engaged in the design process from day one.

3. Take a wide, international and interdisciplinary perspective, focused on proving out arrays of potential components to successfully modify a system, rather than aiming to prove the usefulness of a given discovery.

4. Always aim to get as close to the root cause as current understanding and toolsets allow.


Our thesis

Effectively leveraging computational approaches to address complexity

Simply applying deep learning to look for correlations rarely leads to meaningful learning in all but the most simple and fully characterised environments, least of all where those correlations are based on published data of which at least 30% is unreproducible. The magic comes in combination: starting with the most constrained and fully measurable systems, using computation to pick apart highly interdependent and dynamic systems, and using human ingenuity to direct this in meaningful ways. Leading our work on tackling heterogeneity is ConcR, which maps the evolutionary path of a given patient’s tumour resistance. ConcR has already shown over 87% accuracy in predicting the most effective treatment regimen, something that has the potential to prevent over a third of deaths from cancer.



Therapeutics that can compute in vivo

There’s something strange about taking a snapshot of a dynamic system, averaging it across a heterogeneous population and using this to design an intervention, when the system itself contains all the necessary sensing and computational ability. Whether it’s a bacterial quorum deciding when critical mass is achieved, or the immune system mounting an attack, a biological system uses some set of inputs and dynamic models to decide when and how to make an intervention. We intend to drive the change towards conducting analysis and differential therapy directly within the patient. Within the portfolio this already includes conditional dampening of inflammation in the brain with Reflection Therapeutics, and equipping bacteria with the ability to defend against specific phage in IBD, led by Ancilia.
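As a minimal sketch of what "computing within the patient" means, the toy logic below has an engineered cell release its payload only when a sensed signal stays above a threshold for a sustained window, rather than acting on a single noisy snapshot. The function, thresholds and signal values are all hypothetical illustrations, not drawn from any portfolio company's actual design.

```python
def quorum_decision(signal_history, threshold=0.6, window=5):
    """Decide whether to intervene based on a sensed signal over time.

    Fires only when the signal (e.g. a quorum molecule or an
    inflammatory marker) stays at or above `threshold` for `window`
    consecutive readings, so a single noisy spike is not enough.
    """
    run = 0
    for level in signal_history:
        run = run + 1 if level >= threshold else 0
        if run >= window:
            return True   # critical mass sustained: release the payload
    return False          # transient spike or low signal: stay dormant

print(quorum_decision([0.2, 0.7, 0.9, 0.3, 0.2, 0.1]))        # False: transient spike
print(quorum_decision([0.2, 0.5, 0.7, 0.8, 0.9, 0.9, 0.8]))   # True: sustained rise
```

In a real system this decision would be implemented in a genetic circuit rather than software, but the logic is the same: the analysis happens continuously, in place, on the individual patient's own signals.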

Focus on better models and markers

The technology is becoming available to create much more realistic systems-level in vitro and in silico models. This will emerge from rapidly advancing analytics (e.g. single-cell sequencing, proteomics and imaging) combined with stem-cell modification, gene editing and simulation. The challenge is that there’s very little money, yet extreme competition, in commercialising models and testing as a service. It is therefore up to biotech companies to have the discipline to build their own tools, and up to acquirers to have the confidence to bet outside well-accepted paradigms. Note: markers in early diagnostics are covered in our Healthcare Systems thesis.

Getting back to the root cause

Even if cancer were eradicated, the average lifespan would only be extended by 2 years, as people die of heart disease instead. This makes you wonder: why do we spend so much resource addressing the downstream manifestations of disease when there are common upstream causes, universally driven by time-dependent damage? As such, across every disease our approach is to look for opportunities to address all of the upstream factors in concert, not just one or two at a time. This includes fixing and buffering molecular-level damage, fixing broken or unhelpful messaging and signalling pathways, correcting errors at every level of gene expression, and modifying the state of cells to drive regeneration.
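The competing-risks arithmetic behind the cancer figure can be sketched with a cause-deleted life table: because other hazards dominate at the ages when cancer strikes, deleting cancer buys surprisingly few years. The Gompertz-style hazard parameters below are invented purely for illustration, not real epidemiological data.

```python
import math

def life_expectancy(total_hazard, max_age=120):
    """Expected years lived, given an age-indexed annual death hazard."""
    alive, years = 1.0, 0.0
    for age in range(max_age):
        h = min(total_hazard(age), 1.0)
        years += alive * (1 - 0.5 * h)  # midpoint person-years this year
        alive *= 1 - h
    return years

# Illustrative Gompertz-style hazards: both cancer mortality and all
# other mortality rise exponentially with age (made-up parameters).
def other(age):
    return 3e-5 * math.exp(0.095 * age)

def cancer(age):
    return 1e-5 * math.exp(0.090 * age)

with_cancer = life_expectancy(lambda a: other(a) + cancer(a))
without_cancer = life_expectancy(other)
print(round(without_cancer - with_cancer, 1))  # a gain of only ~2 years
```

Under these assumptions, deleting a cause responsible for a meaningful share of deaths still adds only a couple of years of average lifespan, because the survivors immediately face the other exponentially rising hazards; this is the quantitative case for tackling the shared upstream damage rather than one downstream disease at a time.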



Bringing it all together

This combination of a more joined-up approach across the financial and marketing stack, intellectual honesty about the chances of recreating results in patients, and a step towards far smarter, more specific therapeutics will finally put us firmly on the road to curing, rather than patching up, becoming the norm.