Dr. Philip Beer is an experienced hematologist with significant expertise in translational medicine and cancer genomics. We caught up with him to discuss the data-driven approach needed to propel precision oncology and targeted therapies forward, and what the latest developments in the field portend for the war on cancer.

PF: Your life over the last two decades – if not longer – has truly revolved around cancer. You have worked on the disease, and hematological malignancies in particular, from an entire kaleidoscope of perspectives: managing clinical trials, delivering patient care through a translational medicine approach, working with start-ups and driving cancer genomics forward. What motivates this drive and focus on fighting cancer?

PB: Indeed – throughout my career, I have moved across a variety of positions in healthcare, and I have also worked in academic research and in industry. My primary motivation, common throughout these endeavors, has always been to improve outcomes for patients with cancer. What I have seen over time is that each of those three sectors – healthcare, industry, academia – excels at different aspects of care, and each has its deficiencies. That is not to say that one is better than another – they are merely different, and all three are necessary for a holistic approach to fighting cancer. What has truly interested me recently is trying to create an environment where the best of all three can be brought together. I am therefore very keen to work closely with physicians and academics in my current commercial role – to increase collaboration and demolish the silos. The pace of research can be truly accelerated if we unite the disparate sectors that study cancer – we did this for COVID-19, after all. We saw pharma work with academia, for example AstraZeneca with Oxford, to bring about an effective vaccine; subsequent collaboration with the NHS accelerated its rollout. This is one example of the three sectors coming together, and I would very much like to see similar efforts for cancer.

PF: Your current work with Step Pharma truly builds on the foundation of your prior expertise. Your pipeline makes extensive use of genomics to identify new targets for targeted therapeutics. What advantages do you think are inherent in this approach to drug discovery – particularly in comparison with rival approaches such as phenotypic drug discovery (PDD)?

PB: I think, in oncology, the chances of generating a successful drug are much higher when there is an understood mechanism of action motivating drug discovery; PDD can struggle with this, as it is target-agnostic. When that mechanism is underpinned by a well-understood genetic mode of action, the chances rise further – genomics simply makes sense when it plays such a large role in causing the cellular changes that are the foundation of cancer. Our lead candidate at Step Pharma was generated from real data – from mutations observed in humans – which I think provides an extremely strong basis for drug development. Using genetic interactions such as synthetic lethality in cancer to drive development is also a strong method for generating drug candidates. There are also opportunities for leveraging the enormous amounts of publicly available data – for example, the Achilles project, comprising data from over a thousand cancer cell lines in which nearly every gene has been knocked out by CRISPR, along with associated genetic and proteomic datasets. Genomics can turbocharge drug development by enabling the utilization of all these and similar datasets.
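For readers curious how such dependency screens are mined for synthetic-lethal candidates, the sketch below illustrates the general idea on invented toy data. The scores, gene names, grouping and threshold are entirely hypothetical – they are not taken from the Achilles/DepMap release, which spans over a thousand cell lines and roughly 18,000 genes:

```python
import numpy as np

# Toy CRISPR dependency scores (rows: cell lines, columns: genes).
# More negative = stronger growth defect when the gene is knocked out.
genes = ["GENE_A", "GENE_B", "GENE_C"]
dependency = np.array([
    [-1.2, -0.1, -0.3],   # line 1: carries the driver mutation
    [-1.0, -0.2, -0.1],   # line 2: carries the driver mutation
    [-0.1, -0.3, -0.2],   # line 3: wild-type
    [-0.2, -0.1, -0.4],   # line 4: wild-type
])
is_mutant = np.array([True, True, False, False])

# A candidate synthetic-lethal partner is a gene whose knockout hurts
# mutant lines far more than wild-type lines: compare mean dependency
# between the two groups and keep genes with a large negative gap.
effect_gap = dependency[is_mutant].mean(axis=0) - dependency[~is_mutant].mean(axis=0)
candidates = [g for g, gap in zip(genes, effect_gap) if gap < -0.5]
print(candidates)  # only GENE_A shows a mutant-selective dependency
```

In a real analysis the group comparison would be a proper statistical test across hundreds of lines, but the logic – contrasting knockout effects between genetically defined groups – is the same.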

PF: Regardless of whether a target-based or phenotypic approach is adopted to pursue new investigational products, the need for improved chemoproteomics and multi-omics approaches will remain constant in order to identify targets and biomarkers. This is particularly true when dealing with more precision-oriented treatments. However, unique challenges also arise when trying to integrate different, heterogeneous omics layers from which to draw meaningful conclusions. How can these analyses be improved with current technologies – but also future ones, such as advanced AI models?

PB: There is indeed great utility in using multi-omic techniques, as well as spatial techniques – especially for immunotherapies and investigations into the responses they trigger. I think immuno-oncology is a field that can benefit greatly from understanding the architecture of the tumor and how it is altered by different therapeutic approaches. This is in addition to helping us characterize the types of cells in the tumor, as well as the immune cells in the tumor microenvironment. Spatial multi-omics analyses can not only identify these cells – they can localize them within the architecture of the cancer. The utility of multi-omic methods should not be underestimated for targeted therapies based on small molecules, however – particularly in understanding how small molecules interact with the immune system rather than just cancer cells, something we often overlook. These analyses are notorious for the amount of data they can generate – beyond what an individual person could feasibly interpret – so there is obvious scope for AI models to enter this space. There is, however, a misconception that the function of AI is to have all the data thrown at it and then produce the right answer. That is not true: the quality, structure and annotation of the data introduced to the model remain critical. We also need to become better at identifying the strengths of AI – right now we seem to be trying it out on a little bit of everything. So far we have seen much success in pattern recognition – such as the interpretation of imaging scans in radiology, or the diagnosis of melanoma. The hope is that a similar pattern-recognition approach can be applied to genomic profiles, with AI looking for patterns of variables that cluster with clinical endpoints.

PF: The development of new biomarkers has been a hot pursuit since the advent of targeted therapies – particularly in the field of solid tumors, where targeted therapies face more obstacles compared to hematological malignancies. Many even seek to revise what we consider a biomarker, moving from simple quantitative measures to more composite biomarkers – such as the tumor mutational burden (TMB), gut microbiota, the tumor microenvironment, and others. One problem with more complex biomarkers is that they need widespread consensus on how they are quantified and assessed, across the industry and healthcare infrastructure. How do you think this problem should be overcome?

PB: To begin with, I wholeheartedly agree that we need to be looking at more complex biomarkers: cancer is a complex disease. The field of immunotherapy provides good examples of this. Response to immunotherapy predicted by a combination of PD-L1 expression, TMB and T-cell infiltration provides a much more accurate picture of the likely patient response than any one of those biomarkers used separately. Immunotherapies will particularly benefit from composite biomarkers – they are necessary, I would even dare say, for the successful development and deployment of new treatments. We also have a problem in how biomarkers are used in clinical practice: we often treat them as binary events, whereas biomarkers such as TMB are continuous variables. Converting a composite, continuous biomarker to a binary variable loses a great deal of information. This also predicts where the biggest problems with composite biomarkers will lie: deploying them in the clinic. We have yet to establish much precedent for this; the current hope is that the information will be fed to an algorithm that supplies a probability of response. But such an algorithm, as a companion diagnostic, needs to be regulated – and the bar for regulatory approval of such devices and algorithms is really quite high. To conclude, composite biomarkers are a direction we are inexorably heading towards; however, clinical practitioners and regulators will also need to make significant progress to accommodate them.
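To make the contrast between a binary cutoff and a continuous composite score concrete, here is a minimal sketch of the kind of logistic model such an algorithm might use. The weights, intercept and input values are entirely hypothetical, invented for illustration – a real companion diagnostic would fit and validate them on clinical trial data:

```python
import math

def response_probability(pdl1_pct, tmb_mut_per_mb, tcell_infiltration):
    """Combine three immunotherapy biomarkers into one probability.

    All coefficients are hypothetical placeholders for illustration.
    """
    score = (-3.0
             + 0.03 * pdl1_pct             # PD-L1 expression (% tumor cells)
             + 0.10 * tmb_mut_per_mb       # tumor mutational burden (mut/Mb)
             + 2.00 * tcell_infiltration)  # T-cell infiltration (0-1 fraction)
    return 1.0 / (1.0 + math.exp(-score))  # logistic link -> probability

# A continuous probability preserves information that a binary
# "TMB-high vs TMB-low" dichotomy would discard:
p = response_probability(pdl1_pct=50, tmb_mut_per_mb=12, tcell_infiltration=0.4)
print(round(p, 2))
```

Two patients on the same side of a TMB cutoff can receive very different probabilities here, which is precisely the information lost when a composite, continuous biomarker is collapsed to a binary call.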


PF: You have a long history of working with the UK National Health Service – and your focus on translational medicine has given you a holistic perspective on how cancer care operates across British infrastructure. The NHS compares impressively well with other systems across a variety of metrics – but cancer outcomes have traditionally been an area where it underperforms. This is illustrated by the UK lagging behind the cancer survival rates of other countries in the International Cancer Benchmarking Partnership. Why do you think that is – and what can we do better?

PB: That is an interesting question – and one that I myself have often pondered; I imagine others in the National Health Service are also puzzled by it. I do not think anyone has a definitive answer yet – if we did, we would be implementing it. There are some evident contributing factors, however – such as a lower doctor-to-patient ratio compared with other countries of similar income. Cancer care is highly intensive, and each cancer can differ vastly from others, even of the same type – as such, a trained oncologist becomes invaluable. Many of the most common cancers, and perhaps all cancers in part, are also highly influenced by lifestyle factors, although I do not think anyone would say that the UK is a more or less healthy nation than other comparable countries. Yet the effects of lifestyle certainly complicate the picture and our ability to draw direct comparisons. Then there are also genetics – cancer is the result of so many physiological, genetic and societal interactions that providing a simple answer to this question of comparison often becomes impossible. However, we are certainly being ambitious with our use of genomics, such as with the new Genomic Medicine Service, or whole genome sequencing for both solid and hematological malignancies. I think these steps orient us in the right direction for improving outcomes – although realizing their benefits may come with a time delay, as new drugs and biomarkers are developed. Another area where the NHS is placing a lot of emphasis over the next few years is early detection – one of the biggest determinants of disease severity in cancer, which has the potential to improve outcomes dramatically.

PF: You will be speaking at Proventa International’s Bioinformatics Strategy Meeting in London in June – where you will be facilitating a discussion on leveraging multi-omics approaches within the next five years. Obviously, multi-omic integration will remain a crucial topic, not merely in oncology but throughout drug discovery. We have even seen single-cell multi-omics approaches leveraged in assessing immune responses to COVID-19. What other impacts do you foresee?

PB: I think this type of approach will have an impact at many levels throughout the drug development pipeline – from understanding basic cancer biology to identifying and characterizing therapeutic targets, as well as analyzing test samples. All these data can act in a positive feedback loop – with downstream data reinforcing our understanding of a drug's physiological functions, and the improvements made on the basis of that understanding translating into better outcomes. Understanding how drugs work will be critical – particularly as we seek to predict treatment responses and produce better safety profiles. This is especially true in oncology, where toxicity is often a trade-off for effectiveness – but if we do not understand the full physiological activity of an active ingredient, can we ever say that the toxicology is acceptable? Not understanding causative mechanisms is the number one reason why drugs fail in the clinic. It also complicates the design of effective treatment combinations – it is much easier to plan combination studies with fully characterized chemistries, and one active ingredient on its own will not cure cancer. We have already seen great strides in hematology, where drugs with different modes of action have been combined into efficacious treatment cocktails. Multi-omics has the potential to inform these decisions at every stage and deepen our understanding of the drugs we hope to manufacture. Obviously, we still face barriers in integrating datasets, as well as in determining the best way to analyze the data. These challenges must be solved, and solutions are not impossible – as we have seen with AI. These are topics we will certainly explore at the meeting, and I look forward to it being a great learning experience for me as well as for all the attendees.

Tuğçe Freeborough Gerard, Producer, Proventa International

Nick Zoukas, Former Editor, PharmaFEATURES

Philip Beer will be facilitating a discussion on multi-omics at Proventa International's Bioinformatics Strategy Meeting in London this June. Join the event for closed roundtable discussions on the latest topics in bioinformatics, led by leading industry experts and stakeholders!

Proventa International Bioinformatics Strategy Meeting, London, Europe

