Introduction

The 2nd Danube Symposium on advanced biomarker development was held over March 14–17, 2018, in Vienna. This symposium was dedicated to building a framework centred on the philosophy of convergent engagement (http://news.mit.edu/2011/convergence-0104) of multiple stakeholders in the development and implementation of quantitative, data-driven biomarkers. Multiple definitions for biomarkers exist, but it is understood that reproducible, quantitative and visual biomarkers are required for the practice of personalised diagnosis and treatment [1, 2] (Fig. 1).

Fig. 1.
figure 1

Illustration of the use of a biomarker as a tool for the development of a predictive, preventive and patient-centric (aka personalised) model of care.

The 1st Danube Symposium—held in Vienna during September 28–31, 2016—was one of the first of its kind to be specifically devoted to merging complementary expertise in cancer management from molecular pathology, nuclear medicine and clinical pharmacology. At the time, the programme included a “basic track”, which aimed at introducing the partnering specialities (molecular pathology, nuclear medicine and clinical pharmacology), and a “clinical application track” with a focus on prostate cancer. In contrast, the format of the second symposium was modified to provide more in-depth presentation and discussion of different platforms for biomarker identification, development, standardisation and implementation as well as of societal and ethical aspects of biomarker-driven personalised medicine (http://www.applied-diagnostics.eu).

The 2018 programme included a series of key presentations, moderated panel discussions, roundtables and rapid-fire presentations on the current status of biomarker availability and validity. This included the applicability of liquid biopsies and tissue samples, particularly in light of spatio-temporal tumour heterogeneity, as well as access to and sourcing of biobanks. The status of application of existing biomarkers and the development of new compounds for novel therapeutic approaches were also discussed, with a particular focus on agents targeting the immune system and their translation into the clinic. Finally, a third track brought together panelists who highlighted a frequently under-represented topic: socio-economic and ethical challenges in biomarker development and patient care. Here, we summarise the main topics and outcomes of the discussions. We indicate progress in key aspects of biomarker-related research as follows: progress (↑), steady state (↔) and regression (↓) (Table 1).

Table 1 Key to current status of biomarker development

The 2018 meeting started with an invited presentation by Dr. Rodney Hicks from the Peter MacCallum Cancer Centre in Melbourne on the “Challenges and Opportunities for Molecular Imaging and Theranostics in the Era of Precision Medicine”, which was entitled “Lost in Translation”, a reference to a popular clip (https://www.youtube.com/watch?v=yR0lWICH3rY) that is emblematic of the current state of communication between clinicians and researchers in the field of molecular medicine. In essence, similar perspectives were raised by other experts in the field, who argued that imaging physicists, for example, need to move beyond simple system engineering and performance measurements towards understanding biology and building their methodological research on biological pre-conditions and clinical questions [3], or who observed a failure to speak the same language when communicating about the imaging needs of oncology.

Dr. Hicks highlighted how our understanding of cancer has transitioned only recently from the assessment of morphological characteristics on both imaging and histopathology to detailed genomic interrogation. In view of the human cancer genome programme [4], he argued that it has become clear that, rather than classifying cancers by their organ of origin, individual genomic drivers are critical to the behaviour and, hence, the prognosis and treatment of specific cancer subtypes. Accordingly, it has become increasingly important to characterise the biological features, indeed the specific genomic drivers, of cancer in an individual patient in order to select more specifically targeted therapies that will deliver rational and efficient cancer control. This process has been labelled “precision medicine”.

As the oncology community becomes better informed about the hallmarks of cancer, there is an opportunity to leverage these features of biology as both diagnostic and therapeutic targets (http://news.mit.edu/2011/convergence-0104). Through imaging, it becomes feasible to identify variations in target expression throughout the body but not the genomic basis of such variation. Tissue and serum biomarkers, on the other hand, can start to deconvolve these processes but cannot localise the site of disease. Biopsy specimens are also limited to assessment of a small portion of the potential disease burden and may not be representative of all disease sites. However, by combining these techniques, it becomes possible to draw links between phenotype and genotype, which must be inextricably linked [5]. This can be performed prior to the onset of treatment in order to optimise therapeutic selection but also early during treatment to assess target modulation.

However, heterogeneity is ubiquitous in cancer and poses a major challenge to any single therapeutic modality. As a consequence, combinatorial therapies will be required. Molecular imaging, targeted biopsy and genomic evaluation of circulating tumour deoxyribonucleic acid (DNA) already provide complementary methods to characterise cancer for the selection and sequencing of cancer therapies and will become vital tools for monitoring the response of cancer to therapy and its evolution if not eradicated by initial treatment efforts. He concluded by saying that the opportunity molecular imaging provides lies mainly in identifying non-responding sites of disease as sources of relapse and metastasis, which ultimately render cancers lethal. The mechanism of resistance in such disease can then be specifically interrogated by either targeted biopsy or the use of liquid biopsy, providing information that might inform alternative or additional treatment options as well as minimally invasive biomarker candidates for diagnostic applications in parallel to imaging approaches.

Tracks

Building on feedback from the first symposium and ongoing debates in the field, the programme of this second symposium was broken into three serial tracks that highlighted key aspects of biomarker development: disease characterisation (track 1), biomarker handling and treatment (track 2) and, finally, evidence and ethical aspects (track 3).

Track 1: Disease Characterisation

Disease characterisation on a molecular level and the identification of treatable targets as well as clonal evolution and resistance play an increasingly important role. Modern diagnostic and therapeutic biomarkers are frequently identified from large biobanks containing human tissue or liquid samples or are derived from thousands of clinical data sets, applying bioinformatics or other artificial intelligence (AI) approaches. Novel strategies also focus on the generation of patient-derived cell (PDC) culture, three-dimensional spheroid and organoid avatars from human tissue samples in vitro or humanised mice in vivo [6]. This raises multiple ethical and legal questions that have to be resolved before wider adoption.

The overarching goal of any disease characterisation is to collect a sufficient number of reliable and reproducible data points (aka biomarkers) for a personalised approach to each patient in need of a therapy: the right diagnosis at the right time for the right patient, yielding the most efficient treatment (Fig. 1). As of a few years ago, over 150,000 studies had been published documenting thousands of claims for promising biomarkers, but only a hundred or so have been validated for clinical use [7]. The reason for this high attrition rate is seen by many as a failure of researchers to embrace a collaborative, systems-based approach to biomarker development. This track aimed at highlighting biomarker diversity in view of the need to harmonise their sourcing with high quality. General progress indicators for biomarker development in disease characterisation are summarised in Table 2.

Table 2 Track 1: progress indicators for biomarker development in disease characterisation

Liquid Biopsy Versus Tissue Probes Versus Molecular Imaging

Liquid Biopsy

The Issues

Tumour cells may vary in tissue and genetic composition across lesions within a single patient, and they may change their profile during the course of therapy or monitoring [8,9,10,11]. Thus, it has been suggested to analyse circulating tumour cells (CTCs) and tumour-derived products in the context of liquid biopsy [12,13,14]. Unlike physical fine-needle biopsies, liquid biopsies are quick, minimally invasive and comprehensive in tissue profiling, assuming a high sensitivity and reproducibility of the liquid analysis work-up.

Recent Advances

Recent advances allow for the application of liquid biopsy tests for early cancer detection [13]. Cohen et al. [15] reported a novel test enabling the detection and localisation of eight of the most common cancers. Although the sensitivity of the test reached up to 98 % for ovarian cancer, its average sensitivity was only 70 %, with a specificity of 99 % [15, 16].
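The clinical meaning of such figures depends strongly on disease prevalence, which can be made concrete with a short back-of-the-envelope sketch. The sensitivity and specificity below are the averages reported by Cohen et al. [15]; the 1 % screening prevalence is purely an illustrative assumption.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive value via Bayes' rule."""
    tp = sensitivity * prevalence                # true positives per person screened
    fp = (1 - specificity) * (1 - prevalence)    # false positives
    tn = specificity * (1 - prevalence)          # true negatives
    fn = (1 - sensitivity) * prevalence          # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Average sensitivity/specificity from [15]; the 1 % prevalence is a
# hypothetical screening scenario, not a figure from the study.
ppv, npv = predictive_values(sensitivity=0.70, specificity=0.99, prevalence=0.01)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # PPV = 41.4%, NPV = 99.7%
```

Even at 99 % specificity, most positive calls in a low-prevalence screening setting would thus require confirmatory work-up, which is why specificity dominates the design of early detection tests.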

CTCs can be characterised as anoikis-resistant, genetically and phenotypically heterogeneous cells with invasive capacity and metastasis-initiating potential [12]. CTC analysis is a validated prognostic test in metastatic breast, colon and prostate cancers [17,18,19]. Moreover, the presence of CTCs measured at baseline before neoadjuvant therapy is predictive of survival independently of pathological complete response [20]. Besides CTCs, circulating tumour DNA (ctDNA) is another liquid biopsy analyte widely utilised for the monitoring of progression and therapy response [13, 21,22,23,24]. Among circulating non-coding nucleic acids, miRNAs are the most prominent analytes [25,26,27,28].

Future Challenges

Biomarkers discovered and validated in cohorts of late-stage patients might be under-represented in early-stage/high-risk individuals [13]. Furthermore, the aging-associated appearance of mutations in cancer-associated genes represents another hurdle in the application of early detection tests [29]. Challenges in CTC detection are related to the biological properties of the cells [30, 31]. Despite recent advances in the development and application of liquid biopsy tests, one of the major remaining issues is the standardisation of pre-analytical parameters [28].

Tissue Probes and Tissue-Based Genomics

The Issues

In oncology, clinical decision-making is mostly based on laboratory tests and histopathological analyses. Although these techniques are widely used, their total cost amounts to only around 2 % of healthcare costs worldwide; at the same time, their results drive around 80 % of the total healthcare costs incurred for therapies. Laboratory tests are widely available, easily accessible and highly sensitive. On the other hand, in the context of novel concepts of targeted therapies, the implementation of workflows that identify highly specific proof of target structural alterations (mutations) becomes important.

Recent Advances

Over the past decade, molecular diagnostic laboratories have moved from single-gene assays to large-scale genomic analyses, which allow for the more sensitive and simultaneous detection of many gene regions, thus enabling the testing of multiple genetic alterations even with limited amounts of starting DNA/RNA. Most laboratories utilise targeted sequencing panels for lower sequencing costs and shorter turn-around times. In addition, targeted sequencing panels allow for higher sequencing coverage of genomic regions of interest, which is critical for the identification of somatic mutations in clinical samples that are typically composed of both neoplastic and non-neoplastic cells. These approaches have been successfully implemented to evaluate gene mutations relevant for solid tumours and hematologic malignancies using small amounts of starting material, such as fine-needle aspirations [32].
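The coverage argument can be quantified with a simple binomial sketch: in a sample with tumour cell fraction p, a heterozygous somatic mutation is expected at a variant allele fraction of roughly p/2, and the probability of sampling enough variant reads grows with sequencing depth. The purity, depths and read threshold below are illustrative assumptions, not validated laboratory cut-offs.

```python
from math import comb

def detection_probability(depth, purity, min_alt_reads=5):
    """P(at least min_alt_reads variant reads) for a heterozygous somatic
    mutation diluted by non-neoplastic cells, under a binomial read model."""
    vaf = purity / 2  # expected variant allele fraction
    p_below = sum(comb(depth, k) * vaf**k * (1 - vaf)**(depth - k)
                  for k in range(min_alt_reads))
    return 1 - p_below

# A biopsy with 20 % tumour cells gives an expected VAF of 10 %.
for depth in (50, 200, 1000):  # illustrative sequencing depths
    print(depth, round(detection_probability(depth, purity=0.20), 3))
# At 50x the variant is missed in a sizeable fraction of cases, whereas
# targeted panels routinely reach depths where detection is near-certain.
```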

Future Challenges

Many potentially targetable genetic alterations are rare and lack sufficient supportive evidence from well-designed clinical trials. A large inter-institutional database would enable healthcare professionals to collate and share molecular and clinical data, as attested by the efforts of the Molecular Analysis for Therapy Choice (MATCH) trial for adults and children supported by the National Cancer Institute, and the Targeted Agent and Profiling Utilization Registry (TAPUR) trial conducted by the American Society of Clinical Oncology (ASCO) [33].

Furthermore, the role of the tissue pathologist in the implementation and reporting of liquid biopsies needs to be clarified. Figure 2 demonstrates a framework by which various biosamples, molecular testing platforms and their integration may contribute to the clinical decision-making process for patients with lung cancer.

Fig. 2.
figure 2

Framework for multi-modality molecular testing of biospecimens that may be used to identify molecular targets for precision oncology in lung cancer patients. FFPE, formalin-fixed paraffin-embedded; FNA, fine-needle aspiration.

Molecular Imaging: a Non-invasive Tool for Disease Characterisation at the Forefront of Precision Medicine

The Issues

Clinical molecular imaging applications, such as those promoted through positron emission tomography (PET), have been dominated by the relatively non-specific tracer 2-deoxy-2-[18F]fluoro-d-glucose ([18F]FDG), a glucose analogue. However, the intrinsic innovation potential of PET clearly lies in the ability to fully quantify metabolic and signaling pathways non-invasively [2]. More recently, innovation in PET detector technology and system design has been demonstrated through the introduction of novel detector and readout concepts that helped improve the spatial and temporal resolution, as well as through the combination of PET with computed tomography (PET/CT) or with magnetic resonance imaging (PET/MRI) [34,35,36,37]. To date, this innovation path has seemed more convincing than that of disease-specific imaging probes ready for routine clinical use, which attests more to the extreme hurdles of probe validation and market authorisation than to any difficulty in developing probes that target specific metabolic or signaling pathways.

Recent Advances

Over the past decade, radiopharmaceuticals for the diagnosis of Alzheimer's disease as well as of somatostatin receptor–expressing neuroendocrine tumours and prostate cancers have become available [38].

Future Challenges

With the advent of costly yet potent immunotherapies, molecular targeted therapies and combinations thereof, the field of cancer diagnostics, disease monitoring and therapy guidance has to undergo a paradigm shift, as supported by multi-modality PET/CT and PET/MR imaging [36, 37].

However, current clinical diagnostic workflows are trimmed towards simplification and often ignore the wealth of quantitative data yielded by PET, including in-depth information about pharmacokinetics [2]. So far, the cost of non-invasive imaging is marginal compared to the cost of modern combinatorial therapies, which can be as high as US$400,000 per patient. Unfortunately, the imaging field is not yet ready to provide the desired diagnostic or prognostic information for state-of-the-art, clinically applied immunotherapies (involving checkpoint inhibitors, antibodies or cell-based therapies), molecular targeted therapies or combinations thereof.

Innovation in Biobanking

The Issues

To understand the combined effects of genetic and environmental factors on health and disease, a broad spectrum of samples and analytes has to be investigated by a variety of omics technologies. The development of analytical technologies goes hand in hand with the development of quality requirements for biospecimens in order to guarantee proper test performance and reproducible data. Furthermore, biosamples have to be associated with detailed information on the patient or sample donor, sample-related meta-data as well as data generated by the analysis of samples, making data management a core activity of biobanks.

Recent Advances

The importance of pre-analytical standards is attested by key features of the European Committee for Standardization (CEN) Technical Specifications, which, following the Vienna Agreement, became International Organization for Standardization (ISO) standards this year. The European biobanking research infrastructure BBMRI-ERIC has mapped 60 million biosamples in Europe. A series of CEN Technical Specifications on sample pre-analytics has been published to standardise sample quality, which is essential for generating reproducible test results. These standards are also relevant in the context of the European Union Regulation “In vitro Diagnostics and Medical Devices”, since diagnostic developers have to provide data on pre-analytical sample requirements to guarantee assay performance [39].

Digital histological images are a new and important resource provided by biobanks. Here, a national digital pathology infrastructure has been established at Austrian medical universities, employing the latest scanning technology to generate a unique imaging data resource to drive machine learning and imaging biomarker development (Fig. 3). Developments in machine learning have created extensive interest in large-scale tissue imaging data, generating synergies between biobanks, digital pathology and the information and communication technology (ICT) industry to drive innovation in imaging biomarkers.

Fig. 3.
figure 3

Biobanks and machine learning/artificial intelligence (AI): a novel way of linking biosamples, images and disease outcome.
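As a sketch of how such an imaging resource can feed machine learning, the snippet below cross-validates a classifier on pre-extracted, slide-level image features. The file name, column names and feature set are hypothetical placeholders, not the actual pipeline of the infrastructure described above.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per scanned slide, with image-derived
# features (e.g., nuclear density, texture statistics) and an outcome label.
df = pd.read_csv("slide_features.csv")            # placeholder file name
X = df.drop(columns=["slide_id", "outcome"])      # assumed column names
y = df["outcome"]                                 # e.g., responder vs. non-responder

model = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"AUC = {scores.mean():.2f} +/- {scores.std():.2f}")
```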

Future Challenges

There is a need for large-scale and standardised annotation of training data sets and to make results generated by machine learning explicable. Developments in digital pathology and imaging are converging to create a stimulating momentum for innovation in AI and imaging biomarkers. The increasing role of accessing large data sets and the need for international collaboration including data exchange pose new challenges in privacy protection and ethical and legal compliance.

Molecular Imaging and Omics

The Issues

Recent large-scale trials that target exploitable mutations pharmacologically have yielded largely disappointing results. This can be explained in part by the limited number of effective targeted drugs and by the small number of cancers that exhibit single oncogene addiction and are thus susceptible to single-drug treatments. Furthermore, targeting a comparatively large number of mutations requires rationally designed combination therapies. Thus, precision oncology based on genomics remains an elusive goal [6, 39].

Recent Advances

Theranostics is an example of precision medicine. It combines target identification and confirmation with therapies that bind to the same target with high affinity and specificity. Recent reports on targeting fibroblast activation protein (FAP), a highly relevant tumour stroma target, suggest numerous potential theranostic applications across many cancers [40] (Fig. 4). Other examples of image-derived predictive biomarkers include assessments of hormone receptor status (estrogen receptors, androgen receptors) [41, 42], protein expression (human epidermal growth factor receptor 2 (HER2/neu)) and many more [43].

Fig. 4.
figure 4

PET/CT maximum-intensity projections of a patient with metastasised pancreatic cancer (a) and a patient with breast cancer (c). Maximum standardised uptake values (SUVmax) of 68Ga-labelled FAPI in breast cancer lesions; Me = metastasis. Note the favourable biodistribution of the FAPI ligand for use as a theranostic compound, with very high target and very low background activity. Image taken from [40] with the approval of the publisher.

Future Challenges

In light of the significant differences between the results of genomic testing obtained from tissue, liquid biopsies and molecular imaging, a consensus should be developed to integrate the results of these disparate yet complementary diagnostic modalities in a disease-specific manner [44]. This requires new methodological approaches to combining multi-parametric image features (aka “radiomics”) with multi-omics data from tissue or liquid biopsies so as to yield clinically relevant information. Here, artificial intelligence–based approaches have been proposed as one way to derive predictive biomarkers for therapeutic responses in cancer [45, 46].
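One simple way to operationalise such integration is early, feature-level fusion: per-patient radiomic and omics feature tables are joined on a shared identifier and fed to a single model. The sketch below assumes hypothetical file and column names; real radiomics pipelines additionally require harmonised image acquisition and feature extraction upstream.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical per-patient tables keyed by a shared identifier.
radiomics = pd.read_csv("radiomics_features.csv")  # e.g., texture, shape, SUV statistics
omics = pd.read_csv("omics_features.csv")          # e.g., expression or mutation features
merged = radiomics.merge(omics, on="patient_id")

X = merged.drop(columns=["patient_id", "response"])
y = merged["response"]                             # therapy response label

# Standardisation matters when mixing feature families with different scales.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```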

Digital Biomarkers and Patient Data Mining

The Issues

Information technology (IT) companies have transformed our digital footprints into an important commodity. Collections of heterogeneous data from a multitude of sources, such as social network posts and biosensors, are increasingly used for health-related predictions. The challenge presented by the unstructured, noisy and incomplete nature of these data is addressed by a new generation of powerful, neural network–based classification technologies (Fig. 5) [47].

Fig. 5.
figure 5

The triangle between imaging, omics and networks and clinical data.

Recent Advances

The term “digital biomarkers” was coined to denote objective, quantifiable physiological and behavioural data collected and measured by means of portable, wearable or implantable devices [48]. Digital biomarkers are data or data extracts that can be obtained from all kinds of artefacts related to an individual and on which health-related predictions can be grounded. Digital biomarkers can be harvested from mobile appliances as well as from raw textual and other signals, through publicly available sources and electronic health records (EHRs). Likewise, the potential of big data analytics, including digital biomarkers, for healthcare and prevention is well appreciated [49, 50].

Future Challenges

The application of digital biomarkers in clinical decision support systems requires a higher level of standardisation. By nature, data in electronic health records are heterogeneous and are primarily collected for human communication and documentation, not for automated decision support or risk stratification. The digital footprint of EHR data is therefore neither well structured nor complete. Most EHR content is plain text, and structured information is collected only for purposes such as billing, quality assurance and disease reporting.

Adequate secondary use of EHR data, according to the FAIR (Findable, Accessible, Interoperable and Re-usable) data principles [51], is a challenge for natural language processing (NLP) approaches [52], supported by domain terminologies and ontologies [53,54,55]. Due to different levels of complexity and missing contextual information, distinct strategies are required to transform existing EHR data for use in data mining. Once the individual patient data are formatted and prepared, these digital biomarkers will need to be validated in prospective study settings.
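As a toy illustration of that transformation step, the snippet below pulls one structured laboratory observation out of free-text notes with a regular expression. Production systems rely on full NLP pipelines and clinical terminologies; the note text and pattern here are invented.

```python
import re

# Invented example note; real EHR text is far noisier and context-dependent.
note = "Pat. reports fatigue. HbA1c 7.9% on 2018-03-01. Metformin continued."

# Extract (analyte, value, unit) into a structured record.
pattern = re.compile(r"HbA1c\s+(?P<value>\d+(?:\.\d+)?)\s*%")
observations = [
    {"analyte": "HbA1c", "value": float(m.group("value")), "unit": "%"}
    for m in pattern.finditer(note)
]
print(observations)  # [{'analyte': 'HbA1c', 'value': 7.9, 'unit': '%'}]
```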

Interactomics

The Issues

Interactomics is a crossover discipline at the intersection of bioinformatics and molecular biology that concerns the study of physical or functional interactions of proteins and other molecules within a cell and the consequences of those interactions. Networks provide a data-driven, simple yet integrative mathematical framework for the inference of genotype-phenotype relationships; they can handle large-scale molecular data that reflect complex molecular relationships but may be governed by small quantitative differences and may carry methodological biases.

Being a critical component of the genotype-phenotype-function paradigm, networks provide a framework that most directly connects molecular information such as sequence variants to phenotypic traits [56]. However, a gap exists between nucleotide resolution genome data and proteomics data, which typically refer to changes of a protein as a whole [57].
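A schematic of this framework, using an invented toy interactome and invented phenotype annotations, is sketched below: a sequence variant is mapped to its protein, and phenotype-associated proteins within a small interaction neighbourhood are retrieved.

```python
import networkx as nx

# Toy interactome: nodes are proteins, edges are physical interactions.
G = nx.Graph()
G.add_edges_from([("P1", "P2"), ("P2", "P3"), ("P3", "P4"), ("P2", "P5")])
phenotype_proteins = {"P4", "P5"}  # invented disease annotations

# Which phenotype-linked proteins lie within two interaction steps
# of the protein carrying a sequence variant?
variant_protein = "P2"
neighbourhood = nx.single_source_shortest_path_length(G, variant_protein, cutoff=2)
print(phenotype_proteins & set(neighbourhood))  # {'P4', 'P5'}
```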

Recent Advances

The parallel quantification of proteomes from a series of cancer cells recently established a connection between copy number variation [58] or nucleotide variation [59] and proteome changes at the protein interaction network level. Concerted proteome changes can be linked to genetic changes via data integration using interactome network models.

A second new development addressing the resolution gap between proteomics and genomics involves deep mutational scanning approaches [60]. Highly diverse genetic libraries, e.g., representing all single-amino acid substitutions of a protein of interest, are coupled to a phenotypic selection and can be assessed through massively parallel sequencing. Recently, a novel approach was introduced that couples deep scanning mutagenesis to protein interaction networks via a reverse yeast two-hybrid technique [61].
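To convey the scale of such libraries, the snippet below enumerates every single-amino-acid substitution of a made-up three-residue peptide; a real protein of length L yields 19 × L variants, all of which a deep mutational scan couples to selection and sequencing.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def single_substitutions(seq):
    """Yield (label, variant) for every single-amino-acid substitution of seq."""
    for i, wt in enumerate(seq):
        for aa in AMINO_ACIDS:
            if aa != wt:
                yield f"{wt}{i + 1}{aa}", seq[:i] + aa + seq[i + 1:]

peptide = "MKT"  # invented example; any real protein would be far longer
variants = list(single_substitutions(peptide))
print(len(variants))   # 57 = 19 * 3 variants
print(variants[0])     # ('M1A', 'AKT')
```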

Future Challenges

Today, amino acid resolution protein interaction profiles can be recorded on an interactome scale. Interaction perturbation profiles could potentially serve as a sequence-based functional biomarker with the advantage of including rare and newly diagnosed gene variants. Nonetheless, efforts must focus on human interactome mapping that also addresses condition- and tissue-specific network information. Here, computational methods will play a crucial role in transforming current data generation approaches into diagnostic or clinically applicable information.

Track 2: Biomarker and Treatment

Biomarkers can serve as surrogates for the presence of treatment-relevant target structures when selecting and guiding targeted therapies in cancer patients (Table 3). The derivation of biomarkers is considered a cost-intensive process that helps generate very useful and cost-effective databases for local and remote teams working in the field of drug discovery. Therefore, the standardisation of biospecimen and biomarker sourcing procedures is key to efficient biomarker derivation and validation and subsequent data analysis [7].

Table 3 Track 2: progress indicators for biomarkers and treatment planning

The Needs for Standardised Pre-analytics in Biobanking

The Issues

According to Freedman and Inglese, the share of pre-clinical biological research studies that fail to be reproduced ranges from 68 to 89 % [62]. It is estimated that in the USA alone, US$28 billion is wasted annually due to irreproducible results, with over a third of this caused by poor reference materials and biological reagents [63]. Thus, it is not surprising that more than half of the papers reporting biomarker discoveries do not contain any information about the biospecimens used, even though some of them were published in top interdisciplinary journals [64].

Recent Advances

To minimise pre-analytical variability in research biobanking, large biobank consortia, such as the European Research Infrastructure Consortium BBMRI-ERIC and its national nodes, foster the implementation of technical specifications (TSs) during the pre-examination phase [65]. Given that governance aspects, ethical requirements and sample access remain largely undervalued, an international biobank–specific quality management standard (ISO 20387—“General requirements for Biobanking”) was recently released. TSs on the pre-examination process require the validation of the whole process, which starts with the collection of the sample and ends with the interpretation of the analytical results.

Future Challenges

Despite increasing evidence of a higher pre-analytical quality positively affecting research reproducibility and scientific progress, the community must engage in building and adhering to evidence-based international protocols [66,67,68].

Standardised Pre-analytics in Biomarker Research

The Issues

Profiles of biomolecules can change significantly during pre-analytical workflows. Previous studies have demonstrated that changes in RNA expression or in protein and phosphoprotein amounts can occur at every stage of the sample processing workflow, starting with collection and extending to the archival of the processed samples [69,70,71,72]. Such changes introduced in the pre-analytical phase can lead to unreliable results or to misinterpretation of the bioanalytical profiles generated. Of note, diagnostic errors contribute to up to 10 % of all patient deaths and 17 % of adverse events. Pre-analytical workflows are the largest contributors to the error rate, accounting for 46–68 % of clinical laboratory errors [73].

Recent Advances

The new EU In Vitro Diagnostic Device Regulation 2017/746 (IVDR) [74] entered into force in May 2017, thereby replacing the old IVD Directive 98/79/EC (Official Journal of the European Union, Legislation, Vol. 60, May 5, 2017). A key modification in the new IVDR is the introduction of sampling requirements for the analytical performance of an in vitro diagnostic test.

The new Horizon 2020 SPIDIA4P consortium project (2017–2020) intends to broaden the portfolio of CEN and ISO standards for pre-analytical workflows by generating and implementing a comprehensive portfolio of 22 pan-European pre-analytical CEN Technical Specifications and ISO International Standards, addressing important pre-analytical workflows such as those applied in personalised medicine (www.spidia.eu).

Future Challenges

Additional new pre-analytical workflow technologies and IVD products for specimen and sample collection, stabilisation, transport, storage and processing are still required for various new applications, e.g., CTC characterisation, circulating nucleic acid analysis from non-invasive human samples, next-generation sequencing (NGS) and metabolite analysis in human samples. In addition to supporting technology advances, the implementation of the new IVDR and of CEN and ISO standards for pre-analytical workflows will remain a major task for the coming years. Corresponding external quality assurance (EQA) schemes will also be developed and implemented, aiming to survey the resulting quality of samples and diagnostic practice.

Role of Mass Spectrometry in Quality Analysis and Analytics

The Issues

In mass spectrometry, determinants that may compromise sample quality must be minimised [75] in order to obtain samples that assure reproducible and credible results. Despite great efforts to reduce the error rates of the pre-analytical stage of sample handling, pre-analytical errors still account for up to 70 % of all problems occurring in laboratory diagnostics [76,77,78] and include hemolysis, coagulation (clotting), inadequate filling volume of the blood collection vial and insufficient information on the blood collection tube or the sample vial [79,80,81,82].

Recent Advances

The quality of blood samples, for example, can be affected in multiple ways [83]. It was found that 4–19 % of metabolites were increased by mishandling of the samples and 8–12 % of single metabolites were decreased compared to control samples. Another study from the same group demonstrated significant changes in metabolites when the incubation time of blood samples was prolonged [81]. Amino acids and nucleobases in particular were shown to be sensitive to such handling effects.

Future Challenges

Ongoing studies accentuate the importance of high-quality biospecimens in biobanks, as well as the need to identify further biomarkers of sample quality.

Translational Immunology: Tackling Barriers in Science and Bureaucracy

The Issues

Immunotherapy has become a mainstay of oncological treatment. The ability to stimulate Fc receptor (FcR)–bearing effector cells, resulting in lysis of target cells (antibody-dependent cellular cytotoxicity (ADCC)), is central to the immunotherapeutic activity of monoclonal antitumour antibodies, in particular in hematological malignancies. In humans, natural killer (NK) cells constitute the major cell population that mediates this important antibody function [84]. However, the success of rituximab and other antitumour antibodies available so far is limited with regard to efficacy, and compounds are lacking for many other cancer entities.

A major drawback of strategies to mobilise T cells against cancer is that they can cause severe side effects. In the case of checkpoint-blocking antibodies, these arise from the general, undirected activation of the T cell system. In the case of presently available bispecific antibodies (bsAbs) and chimeric antigen receptor (CAR) T cells, side effects are largely due to the fact that the employed target antigens are expressed not only on malignant but also on healthy cells, such as B cells in the case of CD19. A second major drawback is, as mentioned above, that available bsAbs and CAR T cells are far less successful in solid tumours than in hematopoietic malignancies.

Recent Advances

Hofmann and collaborators [85] recently developed an Fc-optimised monoclonal antibody targeting the surface antigen FLT3 (CD135), which is expressed on leukemic cells in nearly all patients with acute myeloid leukaemia (AML), a disease for which no immunotherapy is available to date. An alternative approach to improving the therapeutic efficacy of antitumour antibodies is to develop strategies that allow for antibody-mediated stimulation of T cells, whose effector potential is profoundly higher than that of NK cells [86].

A recent publication confirmed the relevance of targeting vascular antigens for cancer treatment [87]. Accordingly, a bsAb was developed against prostate-specific membrane antigen (PSMA), which is expressed on prostate carcinoma cells as well as on the tumour-associated vasculature of numerous other cancers. This PSMAxCD3 bsAb, termed CC-1, was constructed in a novel, whole IgG-based (IgGsc) format with increased serum half-life. CC-1 contains a proprietary PSMA antibody that displays additional unique reactivity with squamous cell carcinoma (SCC) of the lung, allowing for the desired dual targeting of both tumour cells and neovasculature.

Future Challenges

A plethora of compounds with pre-clinically substantiated antitumour efficacy awaits clinical evaluation and holds great promise to benefit cancer patients. At present, the biggest challenge is the steadily increasing regulatory burden with regard to requirements both for production (good manufacturing practice, GMP) and, lately, also for clinical evaluation (good clinical practice, GCP). Together, they dramatically extend the time from the conceptualisation of a drug to its clinical evaluation; notably, the former also almost completely bars academic institutions from drug development [88]. It is worth considering whether and how, in the particular case of cancer patients with a life-threatening disease who have failed to respond to established/available therapy, novel compounds with pre-clinically proven efficacy can be made available more rapidly. In addition, the costs of drug development and ultimately of treatment, which will become a tremendous socio-economic challenge in the future, must become a focus of our attention.

Tracers to Target the Immune System

The Issues

Immune-modulating therapies not only target the cancer cells but also modulate the cancer-promoting or cancer-inhibiting properties of the complex multi-cellular tumour microenvironment (TME) [89,90,91]. There is still a large number of patients who do not respond to the chosen immune-modulating therapy. Therefore, understanding the dynamic changes in the TME, in particular those involving the immune system, may help advance the development of new strategies for cancer diagnosis, treatment and assessment of therapeutic response.

Thus, there is a need for translational and validated biomarkers for the prediction and monitoring of responders to immune system–modulating drugs and for categorising responsive tumours early after therapy initiation in a non-invasive manner. Unfortunately, an increase in infiltrating immune cells, which exhibit an increased glucose metabolism compared with peripheral non-immune cells, can mimic transient worsening (pseudo-progression) on [18F]FDG PET [92].

Recent Advances

Currently, one of the most interesting targets for therapy and imaging is the immune checkpoint protein PD-1 ligand (PD-L1). PD-L1 is over-expressed by a variety of tumour cells, induced as an adaptive mechanism in response to tumour-infiltrating cytotoxic T cells [93]. Increased PD-L1 expression in cancer cells as well as in the TME promotes immune evasion of tumour cells via binding to PD-1 expressed by the active immune infiltrates [94]. A number of studies have demonstrated anticancer activity of PD-L1–targeting antibodies [95]. For in vivo detection of PD-L1, clinical advances have been made using radiolabelled antibodies such as atezolizumab (ClinicalTrials.gov identifiers NCT02453984 and NCT02478099) [96].

In addition, other key mediators in immune cell activation/inactivation and various types of cells involved in immune regulation offer additional potential targets for imaging response to immunotherapy [97]. Novel, potential targets and targeting agents are currently under investigation [98]. A list of immunotargets and targeting concepts is shown in Table 4.

Table 4 Targets and nuclear targeting concepts and agents that are currently under investigation for monitoring immunotherapies (adapted from [99] and expanded)

Future Challenges

Major challenges on the way to imaging biomarkers in immunotherapies include translational gaps resulting from a lack of, or inadequate, in vitro and animal models for assessing novel radiotracers; the time needed for clinical validation of novel imaging biomarkers; the repetitive exposure of patients to ionising radiation; and the costs of tracer development and clinical studies.

Targeting the Immune System: a Clinical Perspective

The Issues

Tumour cells express or induce molecules that block cytotoxic, tumour-ablative immune cells, such as cytotoxic T cells [114]. Conversely, a rational and efficient tumour immunotherapy is based on the identification and subsequent blockade of those pathways by immune checkpoint inhibitors.

Recent Advances

One of the best predictive markers for PD-1 blockade is microsatellite instability (MSI) [115]. It is anticipated that the MSI-associated increase in mutations, resulting in an increased rate of neo-antigens, underlies the therapeutic effect observed in those patients.

In 2017, the FDA approved PD-1 inhibitors for unresectable or therapy-resistant metastatic MSI-positive solid cancers. An association between mutational burden and the therapeutic effect of a checkpoint blocker has also been demonstrated for the CTLA-4 antibody ipilimumab [116]. The PD-1 inhibitor pembrolizumab revealed a benefit for patients with metastatic non-small cell lung cancer (NSCLC) compared to platinum-based chemotherapy, with overall response rates of 45 % versus 28 %, respectively [117].

PD-L1 or PD-1 blockade has also been highly successful in phase II studies in Hodgkin lymphoma, leading to an overall response rate of more than 60 %. In this disease, expression of the PD-1 ligands is increased due to genetic aberrations, such as copy gain or amplification [118]. Currently, EMA approvals of checkpoint inhibitors cover MSI cancer, NSCLC, Hodgkin lymphoma, melanoma, head and neck cancer, urothelial cancer, kidney cancer and Merkel cell cancer.

Future Challenges

The success of these initial studies promoted not only the development of antibodies against other checkpoint molecules, such as LAG3 or TIM3, but also combination therapies with other agents. Although first publications have shown anti-PD-1 antibodies to be effective in microsatellite-stable cancers when used in combination therapies [99, 119], future studies have to provide additional evidence for such combined approaches.

Oncolytic viruses have been developed for most of the viral families [120], and the combination of an FDA-approved oncolytic herpes virus with an anti-PD-L1 antibody led to a clinical response of around 60 % in stage III and stage IV melanoma. The effect was associated with a high influx of cytotoxic T cells and an increase of PD-1 expression in tumour tissue. However, further evidence is required before these approaches can be implemented in the clinic.

Track 3: Evidence and Ethics

Until recently, drug development followed a clear path, i.e., from in vitro pharmacodynamic studies and experiments in animals to first-in-man studies in healthy volunteers as well as in patients. If successful, this process would be followed by approval through regulators and, finally, reimbursement negotiations with payers. While this one-way pathway, employing interactions between stakeholders at predefined time points, clearly still exists, recent approaches aim at earlier and more flexible interactions of the involved players.

Medical ethics plays a key role during these stages (Table 5), perhaps best defined in the sixth paragraph of the Declaration of Helsinki, the key document for ethics in medical research: “The primary purpose of medical research involving human subjects is to understand the causes, development and effects of diseases and improve preventive, diagnostic and therapeutic interventions (methods, procedures and treatments). Even the best proven interventions must be evaluated continually through research for their safety, effectiveness, efficiency, accessibility and quality.” (https://www.wma.net/policies-post/wma-declaration-of-helsinki-ethical-principles-for-medical-research-involving-human-subjects/). This well illustrates the inseparable bond between ethics (“what can I do?”) and evidence (“what do I know?” and “what do I want to know?”), and it also encourages the community to continuously challenge its “standard of care”.

Table 5 Track 3: progress indicators for ethics and evidence aspects of biomarker development

Cost-Effectiveness Aspects of Biomarker Development

The Issues

Providing value for money, identifying the best course of action based on the available evidence and optimal allocation of resources from the given budget are the central concepts of cost-effectiveness analyses (CEAs). CEAs evaluate the difference in costs relative to the difference in effects between competing alternative activities. They are commonly used decision-making support tools in the context of health economics and health technology assessments (HTAs) to identify the value of new health technologies.
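At its core, a CEA reduces to the incremental cost-effectiveness ratio (ICER), the difference in costs divided by the difference in effects, judged against a willingness-to-pay threshold; equivalently, the incremental net health benefit must be positive. The figures below are invented for illustration only.

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented example: biomarker-guided therapy vs. standard of care,
# with effects measured in quality-adjusted life years (QALYs).
ratio = icer(cost_new=80_000, cost_old=50_000, effect_new=2.0, effect_old=1.4)
threshold = 40_000  # hypothetical willingness-to-pay per QALY

nhb = (2.0 - 1.4) - (80_000 - 50_000) / threshold  # net health benefit in QALYs
print(f"ICER = {ratio:,.0f} per QALY gained, NHB = {nhb:.2f} QALYs")
# ICER = 50,000 per QALY gained, NHB = -0.15 QALYs -> not cost-effective
# at this threshold, despite a real health gain.
```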

The top ten most common medications in the USA help only between 4 and 25 % of the patients who take them [121]. Spending on cancer drugs has increased faster than spending in most other areas of healthcare because of rising prices and increased use. A diminishing rate of return, i.e., smaller benefit gains for the additional costs incurred, is another concern [122].

Recent Advances

Biomarkers may present multi-faceted value: clinical value is achieved by directing effective targeted therapy; informative value, by reducing uncertainty about treatment benefits; financial value, by reducing costs spent on ineffective care; and economic value, by providing greater benefits for the amount paid than the benefits of the potential alternative use of the same resources (the so-called “opportunity cost”) [123].

Traditional HTAs and CEAs focus on population average results to support population-level reimbursement decisions and generally ignore patient variations or broader value aspects leading to potentially ineffective and futile use of healthcare resources. Establishing the different value aspects of biomarker tests requires the application of a broader value framework and the explicit consideration of patient heterogeneity [124].

Future Challenges

Conducting the value evaluation of biomarker tests usually requires more data, time and analytical resources than traditional pharmacoeconomic evaluations. Although procedurally challenging, early inclusion of HTA in the development process can promote the cost-effectiveness of the developed test. Since relevant evidence may not be available at the time of readiness for marketing, new reimbursement policy models (e.g., coverage with evidence development) are also required to allow the earliest possible availability for patient care [125].

Precision Medicine: Mapping the Ethical Challenges

The Issues

One of the central technical challenges of any biomarker-focused precision medicine approach is successful translation from bench to bedside and from lab to lifeworld [126]. On the technical side, and as evidenced at the meeting, significant work is currently being done to ensure advanced biomarkers are developed that are clinically actionable and help improve understanding of disease progression and individual patient care. However, to reach this goal, technology and clinical development need to consider various important ethical and social issues.

Recent Advances

A knowledge gap among doctors regarding the many facets of biomarker development and use has been identified [127]. However, studies on how this gap plays out in clinical decision-making and how it can be managed or even avoided are missing. Likewise, more information on patients’ understanding of advanced biomarker approaches, their expectations, attitudes and values as well as their concerns needs to be collected to assist future patient care and policymaking. From a philosophical perspective, understandings of disease and illness taxonomies are currently shifting [128]. On the sociological side, important discussions focus on whether precision medicine contributes to the medicalisation of pre-disease states and/or social trends, and on whether precision medicine could be a poor choice where a low-tech public health approach is more appropriate and cost-effective.

Future Challenges

Studies mapping the specific ethical and social issues of advanced biomarker approaches, including the studied disease entities, are required.

Value of Diagnostics to Health Systems

The Issues

Research and development need to be mindful of the ultimate aim of healthcare technologies. The last hurdle for these technologies to access the market is reimbursement. Such decisions are often preceded by an HTA, which is mandatory in many countries for pharmaceuticals and is increasingly being undertaken for medical devices and other diagnostic technologies and activities in Europe.

Recent Advances

Clinical value is measured in population health, often using generic metrics that allow cross-condition comparisons. In the UK and many other European countries, the measure used is the quality-adjusted life year (QALY). Because any additional costs imposed on the healthcare system mean that these resources cannot be used in other ways, some health is lost or forgone (aka “health opportunity costs”).

With regard to diagnostic tests, HTA often considers not only which test should be used in clinical practice but also how the test should be used, that is, what is the best way to proceed clinically using the information it provides [124, 129,130,131,132].

Future Challenges

The mechanism of value accrual for diagnostic technologies differs from that of pharmaceuticals. Tests identify the level or magnitude of factors that determine or explain health outcomes. In this regard, tests identify patients who are expected to benefit most from distinct regimens of healthcare. It is only by tailoring treatment decisions to patients that the expected (net) health of the overall population improves and value is generated. This means that, ultimately, the value of a test is bound by the value of available treatments and conditioned by other technologies in the diagnostic pathway (e.g., combinations of tests or sequences of tests). Value also critically depends on prevalence, on the costs or adverse events of the technology and on the level of misclassification (false positives and false negatives).
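How prevalence and misclassification shape this value can be sketched by projecting expected outcomes over a tested population; all inputs below are invented for illustration.

```python
def expected_test_outcomes(n, prevalence, sensitivity, specificity):
    """Expected confusion-matrix counts when testing n people."""
    diseased = n * prevalence
    healthy = n - diseased
    return {
        "true_positives": diseased * sensitivity,
        "false_negatives": diseased * (1 - sensitivity),
        "true_negatives": healthy * specificity,
        "false_positives": healthy * (1 - specificity),
    }

# Invented figures: the same test applied at two prevalence levels.
for prev in (0.20, 0.01):
    counts = expected_test_outcomes(n=10_000, prevalence=prev,
                                    sensitivity=0.90, specificity=0.95)
    print(prev, {k: round(v) for k, v in counts.items()})
# At 1 % prevalence, false positives (~495) outnumber true positives (~90),
# so the value of testing hinges on the benefits and harms of what follows.
```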

Paradigm of Predictive Medicine and Ethical Challenges

The Issues

The possible role of genetic information in a future healthcare system in which “systems medicine” approaches (Fig. 6) are applied is currently intensely debated [133,134,135,136]. However, the question of whether the importance and role of genetics will increase or decrease in such a healthcare system remains unanswered.

Fig. 6.
figure 6

Systems medicine can be understood as a heterogeneous set of methods and approaches connected by an emphasis on information technologies.

Recent Advances

Systems medicine reconciles heterogeneous approaches, not only using modelling methods of systems biology to obtain information about the aetiology of complex diseases but also integrating large amounts of data from various sources (omics data, data from imaging methods, treatment data, etc.) [137,138,139,140]. Because systems medicine commits itself to the paradigm of predictive medicine, the ethical challenges of prediction remain highly relevant to this approach. Depending on the specific disease, systems medicine could contribute to a boost in genetic prediction or could strengthen the role of non-genetic predictors [133, 139, 141, 142].

Future Challenges

The question of how patients and doctors can deal with probabilities and risks must be addressed and communicated across all stakeholders. Here, the future relevance of algorithms and scores for decision-making within the context of medical treatment is a crucial question. The problem of so-called “health-related personal responsibility” remains potentially as relevant for systems medicine as it is for the approaches of precision medicine and individualised medicine. Challenges regarding the clinical processing of incidental or secondary findings could arise from the translation of systems medicine into clinical practice [138].

Round Table Discussion: What Is Needed to Breed the Next Generation of Experts in Molecular Diagnostics?

It is obvious that current educational and training schemes, which support in-depth expertise in individual and autonomous fields of research, are no longer appropriate for the multi-disciplinary approach to (cancer) patient management [143]. This was apparent from the spirited debate as well. The panelists, together with the audience, then carved out requirements for the education of the next generation of biomarker experts, or “experts in molecular diagnostics”.

In view of the existing gap between clinical and basic research expertise, which is a major bottleneck for generating experts in molecular diagnostics, it was reiterated that there are currently too few combined training pathways that provide training in genetics, inherited genetic disorders, oncology and molecular imaging. Moreover, limiting preclinical competence to genomics ignores the critically important role of signaling networks in cancer and other diseases. Efforts should be made to introduce clinicians to animal models, drug development and the biology of disease (in addition to imaging). Such improved mechanistic insights could lead to accelerated translation into clinical benefits. There is a need for compact training curricula, as economic pressure, including large debt volumes, forces trainees to join the work force quickly. It was suggested that there is an inverse relationship between general and speciality training: the sophistication and depth of knowledge and skills needed to practise molecular diagnostics would require 3–4 years of training and, correspondingly, less general training. An alternative approach would be the expansion of postgraduate fellowships for physicians in molecular diagnostics. This should be paired with passion and enthusiasm among the faculty in order to most effectively spread the energy and excitement about molecular methodologies to trainees and colleagues.

Panelists and attendees appreciated the significant differences in career development opportunities between the USA and Europe (e.g., nuclear medicine training lasts 5 years in Europe vs. 3 years in the USA). Nonetheless, research fellowship–like opportunities are available on both continents that could be used to strengthen the basic science component. The audience did not quite agree on whether creating more structured learning opportunities would enhance physicians’ interest in pursuing research. However, it was conceded that clinicians are often poorly educated and informed about omics platforms, such as proteomics and metabolomics, and that such opportunities should be supported and sponsored by academic institutions, foundations and government agencies.

Conclusions and Outlook

Biomarker discovery and validation should become part of a “big science approach” [7], which involves multiple stakeholders from both academia and industry and experts in molecular pathology, genetics, molecular imaging, computer science, statistics, epidemiology, regulation and healthcare economics. However, such an approach requires overcoming obstacles that reside within the existing culture of research organisations and ecosystems. Likewise, opportunities and abilities to communicate across stakeholders must be expanded, and deficiencies in (large-sample) analytical technologies must be addressed.

The 2nd Danube Symposium provided an important forum for individuals involved in biomarker research with an enthusiasm for cross-speciality engagement, such as in the validation of liquid biopsies and circulating tumour cells, where traditional single-feature biomarkers will be replaced by next-generation biomarkers (e.g., those derived from computational models). Cross-speciality engagement will also draw attention to the role of quantitative biomarkers, which is expected to grow [2]. When combining quantitative, visual and non-imaging biomarkers, standardised sourcing procedures as well as harmonisation measures will become essential for merging relevant information and for building models and prediction algorithms that help advance patient management strategies.

In the future, the concept of precision medicine can be supported only if traditional ways of sourcing biomarker information are combined with environmental/lifestyle factors [144] and modern concepts of molecular imaging. While the 1st Danube Symposium in 2016 intended to blend, the 2nd in 2018 aimed to converge. We now look forward to our 3rd Symposium in 2020 to make it real!