Introduction to chemical mixtures

In real life, humans are constantly exposed to a variety of chemicals, most often present as complex mixtures, via multiple routes of exposure. Chemical mixtures range from simple combinations of a few chemicals to a complete, integrated exposure profile known as the exposome, that is, the cumulative exposure to environmental stressors over a lifetime. However, even newly developed approaches such as the grouping of chemicals for cumulative risk assessment do not consider real-life exposure scenarios, as outlined in a recent editorial by Tsatsakis et al. (2019a). Furthermore, particular attention should be given to exposures occurring at critical life periods when assessing the risk from chemical exposures (Sarigiannis and Karakitsios 2018).

According to the World Health Organization (WHO)/International Programme on Chemical Safety (IPCS) framework (Meek 2013), exposure to the same substance from multiple sources and by multiple routes is described as "single chemical, all routes" (aggregate exposure). Exposure to "multiple chemicals by a single route" can be distinguished from exposure to "multiple chemicals by multiple routes" (cumulative exposure). In practice, humans can be exposed to chemical mixtures as a result of sequential exposure to individual chemicals or from concurrent exposure to multiple chemicals through the same route or different routes (i.e., oral, dermal, or inhalation). Chemicals are considered to co-occur if their toxicokinetic/toxicodynamic profiles are such that the chemicals or their biological effects are present within an organism at the same time (Nelms et al. 2018).

Assessing the risk from exposure to multiple chemicals by multiple routes represents a scientific challenge. Interest in this field has intensified over the past decades, as evidenced by the increasing number of articles appearing in the scientific literature (e.g., Feron et al. 1998; Wilkinson et al. 2000; Monosson 2005; Boobis et al. 2011; Sarigiannis and Hansen 2012; Hernández et al. 2013; Kienzler et al. 2016; Hernández and Tsatsakis 2017) and the various meetings, technical reports, programs and documents highlighting the need for additional research into the risk assessment of chemical mixtures (Bopp et al. 2015; Quignot et al. 2015; Rotter et al. 2017; ATSDR 2018).

There is a pressing need to develop methods, in particular with regard to their regulatory application, that are able to assess the health effects that may be produced by exposure to chemical mixtures, with the aim of minimizing or preventing the risk of developing diseases (Kostoff et al. 2018; Webster 2018). In general, the hazard of chemical mixtures can be assessed as a whole, using whole-mixture approaches (Wade et al. 2002; Docea et al. 2018, 2019; Taghizadeh et al. 2019), or predicted from the individual components of the mixture (component-based approach), depending on the available data. The whole-mixture approach is frequently applied to environmental samples, as it has the advantage of assessing the toxicity of mixtures of unknown composition and unknown potential for interactions, and it measures directly the combined effect of the complete mixture. Although this approach is increasingly being used in situations of unknown and varying composition, such as chemical pollutants in surface water and wastewater effluents, whole-mixture effect data are available only for a limited number of mixtures, and the substances driving the overall response, as well as potential interactions amongst mixture components, frequently remain unidentified (Bopp et al. 2019; Hernández et al. 2017). Conversely, the component-based approach requires identification of the chemicals present in the mixture of concern, including their concentration, mode of action (MOA) and toxicity, information that is often lacking, followed by mathematical modeling to predict their joint action based on information from each individual component of the mixture. The missing information can be addressed using novel technologies that enable a mechanistic understanding of the combined effects, or it can be estimated in part using computational models (see "Mechanism-based methodologies for the evaluation of the impact of chemical mixtures on human health"). However, component-based approaches can potentially lead to underestimations of hazard when the composition of a mixture is not fully known, which is usually the case, except for clearly defined, intentionally manufactured products (e.g., pesticide formulations) or chemicals present in foodstuffs (e.g., multiple residues of pesticides) (Bopp et al. 2019; Hernández et al. 2017).

The component-based approach relies on the concept of additivity, which includes dose addition and independent action models. The choice of mathematical approach depends mainly on whether the chemicals act by the same MOA or by independent MOAs (Groten et al. 2001). Generally, dose or concentration addition (the latter often used in ecotoxicology) is applied to substances with a similar MOA, whereas response addition is applied to substances with dissimilar MOAs. Under dose addition, the effects of a mixture of compounds can be predicted from the sum of the doses or concentrations of the similarly acting substances after adjusting for their individual differences in potency. Models assuming dose addition are the most frequently applied because they provide reliable estimates of combined effects and are appropriate for the risk assessment of chemical mixtures when their individual components share the same molecular mechanism of action. A robust body of experimental evidence indicates that the basic assumption of additivity offers a reasonable expectation of mixture toxicity, provided that the components of the mixture do not interact with each other, since such interactions can modify the magnitude and even the nature of the toxic effect (Bopp et al. 2019; Hernández et al. 2017). The European Food Safety Authority (EFSA) Panel on Plant Protection Products and their Residues (2013) recommended the use of methodologies derived from dose addition for the assessment of mixtures of pesticides with dissimilar MOAs, provided they produce a common adverse outcome, as no cumulative risk assessment method has been derived from independent action. Furthermore, other European Union Scientific Committees have recommended the use of the dose/concentration addition approach as a pragmatic and precautionary default assumption when no MOA data are available, since this concept is generally regarded as more protective (Bopp et al. 2018). Recently, refined approaches have been developed for the risk characterization of chemical mixtures, accounting for aggregate exposures and using toxicological reference values that refer to the same critical effect/endpoint produced by the individual components of the mixture (Goumenou and Tsatsakis 2019). This methodology introduced new approaches for the risk characterization of single chemicals and chemical mixtures, namely the source-related Hazard Quotient (HQ) and Hazard Index (HI) and the adversity-specific Hazard Index (HIa). It can be applied to any kind of adverse effect and endpoint, as shown in case studies determining HQs for PCBs in fish (Renieri et al. 2019) and HQs and HIa for pesticides in pistachio (Taghizadeh et al. 2019).
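
In schematic form (a generic sketch using common notation, not a formulation taken from a specific guideline), the two additivity concepts discussed above can be written as:

```latex
% Dose addition expressed through hazard quotients and the hazard index
\mathrm{HQ}_i = \frac{E_i}{\mathrm{RfV}_i}, \qquad
\mathrm{HI} = \sum_{i=1}^{n} \mathrm{HQ}_i

% Independent action (response addition) for effect fractions E_i(d_i)
E_{\mathrm{mix}} = 1 - \prod_{i=1}^{n} \bigl(1 - E_i(d_i)\bigr)
```

where E_i is the exposure to component i, RfV_i its toxicological reference value for the common endpoint, and E_i(d_i) the fraction of the maximal effect elicited by component i alone at dose d_i. Dose addition flags a potential concern when HI exceeds 1, whereas independent action combines the probabilities of effect of dissimilarly acting components.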

The question of whether larger adverse effects occur from a chemical mixture than would be predicted by adding the dose or response of each individual substance is of particular concern (Rotter et al. 2017; SAPEA (Science Advice for Policy by European Academies) 2018; Schäffer et al. 2018), as it indicates that an interaction may be occurring (Hernández and Tsatsakis 2017; Hernández et al. 2017). The term interaction includes all forms of joint action that deviate from the abovementioned additivity concepts, i.e., a greater effect (synergistic, potentiating, supra-additive) or a lesser effect (antagonistic, inhibitive, sub-additive, infra-additive). As some of these concepts describing deviations from additivity are often a matter of controversy, proper and unambiguous definitions are needed. The three main forms of interaction are (a) potentiation, which occurs when the toxicity of a chemical on a certain tissue or organ system is enhanced when it is given together with another chemical that alone does not have toxic effects on the same tissue or organ system; (b) synergism, which can result from chemicals acting simultaneously on different molecular or cellular targets within toxicity pathways, thus magnifying their final toxic effect, with deviations from additivity-based predictions being within a factor of four at environmentally realistic concentrations; and (c) antagonism, a phenomenon that occurs when two or more chemicals in a mixture have an overall effect that is less than the sum of their individual effects, either as a result of toxicokinetic interactions (one chemical may stimulate the metabolism of a second one or somehow interfere with its absorption or distribution) or toxicodynamic interactions (chemicals elicit opposite effects by acting on the same or different molecular targets, such as enzymes and receptors) (Boobis et al. 2011; Bopp et al. 2019; Hernández et al. 2013, 2017).

The early toxicology literature contains many unjustified claims of synergism or antagonism based on inadequate study design, as reviewed by Boobis et al. (2011) and Borgert et al. (2001), who presented five criteria that studies examining toxicological interactions in mixtures should consider (Table 1). These criteria can be applied broadly to interaction studies for drugs, pesticides, industrial chemicals, food additives, and natural products, although compliance with them does not necessarily render the results of an interaction study suitable for every purpose or every risk assessment. A focus on chemical interactions, particularly at environmentally relevant concentrations and using valid toxicological and statistical tests, has been reported in only a few experimental studies (Ćurčić et al. 2012; Buha et al. 2013; Zhu et al. 2014; Chen et al. 2016; Curcic et al. 2017) and represents an important step toward advancing human and environmental risk assessments for chemical mixtures.

Table 1 Criteria for the evaluation of interaction in toxicological tests (based on Borgert et al. 2001)

Another important shortcoming of many mixture studies is that they often characterize the interactions of chemicals at high doses instead of at realistic environmental or dietary exposures. Hence, they do not provide the data necessary to support risk management. It is, therefore, important that mixture studies examine the low-dose region of the dose–response curve, with doses at or below the No Observed Adverse Effect Levels (NOAEL) of the individual components of the mixture. To this end, experiments should have sufficient statistical power to detect effects at these lower doses (Simmons et al. 2018). For this reason, novel methodological approaches should be developed to assess the potential adverse health effects of long-term, low-dose exposure to chemical mixtures that simulate complex real-life human exposures (Tsatsakis et al. 2016, 2017, 2019b; Docea et al. 2018, 2019). Furthermore, special attention should be given to the statistical methodologies used in these studies (Sun et al. 2013; Lazarevic et al. 2019).

One important issue in mixture assessment arises when exposure to toxicologically similar chemicals (i.e., with structural and/or biological outcome similarity) occurs at levels below the toxicological reference values of the individual components of the mixture. In this case, risk assessments performed separately for each chemical would conclude that none of them poses a significant risk. However, if the dose addition approach is applied, the combined exposure might lead to significant adverse effects, which are predictable from the dose–response curves of the individual components. Therefore, it is important to characterize the relevant portions of the dose–response curves of each individual chemical, particularly when the focus is on the low-dose/low-effect region (Kostoff et al. 2018; Tsatsakis et al. 2018).
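
The following minimal numerical sketch (with purely hypothetical exposure levels and reference values) illustrates this point: each hazard quotient is below 1, yet the hazard index of the mixture exceeds 1 under dose addition.

```python
# Minimal sketch of dose addition via hazard quotients (hypothetical values).
exposures = {"chem_A": 0.6, "chem_B": 0.8, "chem_C": 0.5}          # mg/kg bw/day
reference_values = {"chem_A": 1.5, "chem_B": 2.0, "chem_C": 1.0}   # e.g., ADI or RfD

hazard_quotients = {c: exposures[c] / reference_values[c] for c in exposures}
hazard_index = sum(hazard_quotients.values())

for chem, hq in hazard_quotients.items():
    print(f"{chem}: HQ = {hq:.2f}")          # each HQ < 1: no concern chemical-by-chemical
print(f"Hazard index = {hazard_index:.2f}")  # HI > 1: potential concern for the mixture
```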

Regardless of the selected approach for the chemical mixture risk assessment, it is of paramount importance to have data on each individual component of the mixture from different lines of evidence. This paper will review the role of epidemiological, in vivo, in vitro and some new in silico tools in hazard assessment and how these streams of evidence can be combined for mixture risk assessments.

The role and critical assessment of the reliability and relevance of epidemiology in toxicological prediction and risk assessment of chemical mixtures

Epidemiological studies are basically designed to assess whether a certain disease may result from exposure to a certain risk factor, or whether a particular exposure may be associated with a disease in the population. Environmental epidemiological studies rely on observational data obtained from cohort, case–control, or cross-sectional studies. While associations between chemical exposures and biological responses found in these studies are relevant for risk assessment, epidemiological studies cannot predict risks associated with exposures that have not yet occurred. Hence, predictive risk assessment relies primarily on toxicology studies (Brunekreef 2008). Therefore, epidemiology contributes to the risk assessment of chemicals that have already been approved for use and to which human exposure has already occurred. The magnitude of the effect following exposure depends on an accurate exposure characterization using suitable quantitative metrics. The integration of existing human biomonitoring (HBM) data can help address combined and aggregate exposure more realistically. This approach can identify real-life exposure patterns, priorities and drivers of mixture toxicity, thus allowing the potential health risks of mixtures to be assessed (Bopp et al. 2018).

Although information from epidemiological studies provides support for hazard identification and characterization, these studies share common limitations that restrict the use of epidemiological data for risk assessment (Wilks and Tsatsakis 2014; Ockleford et al. 2017). These include exposure and/or outcome misclassification and the impact of confounding factors on either exposure or health effects. If not controlled, confounding can distort the relationship between chemical exposure and health outcomes, either masking a true association or giving rise to a false one. Furthermore, epidemiological studies are also subject to various forms of bias that hinder causal inference. In addition, the associations observed may be unrepresentative as a consequence of random sampling variation, especially when studies include small numbers of individuals. Depending on the circumstances, confounding, bias and chance may lead to under- or overestimation of the health effects of a chemical or mixture. Likewise, classical epidemiology often lacks insight into the pathogenesis of the disease, thus limiting the characterization of the exposure–response relationship (Koureas et al. 2012; Kokkinaki et al. 2014; Barrie and Nichols 2015). However, the use of newer technologies in modern epidemiological studies (e.g., omics) is helping to fill this gap.

In the case of chemical mixtures, it is necessary to know the exposure of each person to the individual components of the mixture (during critical time windows), the outcome of interest, and potential confounders and effect modifiers (Webster 2018). Since human exposure to chemicals is not characterized by regular, uniform events, exposure assessment needs to account for the frequency, duration and magnitude of the exposure. Reliable ascertainment of exposure is particularly challenging when carried out retrospectively, especially when the relevant timing of exposure occurred a long time before the onset of the disease and the collected information relies on the recall of study participants. In turn, the use of biomarkers of exposure in biological fluids or tissues (i.e., HBM data) represents a useful alternative in cases in which the biomarker reflects the relevant metrics of exposure. However, as such data are most often generated at low levels of exposure, their potential role in causing an adverse outcome is not always clear. In such cases, exposure needs to be assessed regularly over many years to obtain a reliable measure of long-term cumulative exposure (SAPEA (Science Advice for Policy by European Academies) 2018).

For mixture risk assessment, it is a priority to identify whether multiple environmental factors, or chemical mixtures, are associated with the disease phenotype in human populations. New methodological developments that improve the scope and quality of epidemiological data on chemicals include the use of (a) hypothesis-free, environment-wide association studies (EWAS); (b) pooled data from multiple existing studies; (c) markers of disease processes as an outcome (SAPEA (Science Advice for Policy by European Academies) 2018). EWAS assess simultaneously the relationships between health outcomes and a wide range of chemicals, thus allowing the identification of chemical mixtures associated with a particular human disease. Obviously, EWAS would greatly benefit from the characterization of each individual’s “exposome”. Exposomes can be determined using personal information (standardized questionnaires, geocodes, etc.), biomonitoring data or environmental measurements and prediction models (geospatial models, air pollution models, etc.) (Patel 2018).
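
As a conceptual illustration of the EWAS idea, the sketch below screens a synthetic set of exposure biomarkers against a single outcome, with one regression per exposure followed by a Benjamini-Hochberg false-discovery-rate correction; the data, variable names and thresholds are assumptions made purely for illustration, not a description of any published EWAS pipeline.

```python
# Illustrative EWAS-style screen on synthetic data: one simple regression per
# exposure biomarker, followed by Benjamini-Hochberg FDR correction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_exposures = 500, 40
exposures = rng.lognormal(size=(n_subjects, n_exposures))              # biomonitoring-like data
outcome = 0.3 * np.log(exposures[:, 0]) + rng.normal(size=n_subjects)  # one true signal

pvals = np.array([stats.linregress(np.log(exposures[:, j]), outcome).pvalue
                  for j in range(n_exposures)])

# Benjamini-Hochberg step-up procedure at a 5% false discovery rate
order = np.argsort(pvals)
thresholds = 0.05 * np.arange(1, n_exposures + 1) / n_exposures
passing = np.where(pvals[order] <= thresholds)[0]
flagged = order[: passing.max() + 1] if passing.size else np.array([], dtype=int)

print("Exposure indices flagged after FDR correction:", sorted(flagged.tolist()))
```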

Regarding health outcomes, these should be clearly and unambiguously defined in epidemiological studies to ensure that cases are consistently diagnosed and thus avoid potential misclassification. Hence, harmonized case definitions are encouraged across investigations. Furthermore, surrogate outcomes (biomarkers of effects or biochemical measures) are generally accepted as substitutes for, or predictors of, specific clinical outcomes as they are on the causal pathway for the clinical outcome. In contrast to overt clinical disease, such biological markers of health allow detection of subtle, subclinical toxicodynamic processes. However, often these intermediate outcome measures are not validated and do not meet the strict definitions of surrogate outcomes. Only validated surrogate outcomes indicating an increased risk of adverse health effects should be used in epidemiological studies (Ockleford et al. 2017; SAPEA (Science Advice for Policy by European Academies) 2018).

A variety of statistical methods and approaches is available for the analysis of complex chemical mixture data in both simulated and real-life data sets (Taylor et al. 2016). Although novel methods have been applied to assess mixtures in epidemiological settings, such as weighted quantile sum regression, Bayesian kernel machine regression, and exposure space smoothing (Webster 2018), there is no consensus on standard methods for studying chemical mixtures in epidemiological studies (Braun et al. 2016). Methodologies need to be developed to determine adequately the adverse health effects resulting from exposure to combined chemicals. Statistical assessment of increasingly complex datasets and bioinformatics will provide greater insight into the contribution of individual components of chemical mixtures to long-term health outcomes (Dennis et al. 2017).
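
To make one of these approaches concrete, the sketch below fits a weighted quantile sum-style index to synthetic data: exposures are quantile-scored, combined into a single index with non-negative weights constrained to sum to 1, and the index is related to the outcome. It is a conceptual toy (synthetic data, no bootstrap or training/validation split as used in standard WQS implementations), not a validated analysis pipeline.

```python
# Minimal weighted quantile sum (WQS)-style sketch on synthetic data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, p = 400, 6
X = rng.lognormal(size=(n, p))
true_w = np.array([0.5, 0.3, 0.2, 0.0, 0.0, 0.0])

# Quantile-score each exposure into quartiles (0-3)
Xq = np.column_stack([np.digitize(X[:, j], np.quantile(X[:, j], [0.25, 0.5, 0.75]))
                      for j in range(p)])
y = 0.8 * (Xq @ true_w) + rng.normal(size=n)

def objective(params):
    b0, b1, w = params[0], params[1], params[2:]
    return np.sum((y - b0 - b1 * (Xq @ w)) ** 2)

x0 = np.concatenate([[0.0, 1.0], np.full(p, 1.0 / p)])
res = minimize(objective, x0, method="SLSQP",
               bounds=[(None, None), (0, None)] + [(0, 1)] * p,
               constraints=[{"type": "eq", "fun": lambda x: x[2:].sum() - 1.0}])

print("Estimated mixture effect (b1):", round(res.x[1], 2))
print("Estimated weights:", np.round(res.x[2:], 2))
```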

Methodologies for the formal synthesis of epidemiological evidence include systematic reviews and meta-analyses, which can be used for risk assessment if they have been rigorously conducted. Systematic reviews and meta-analyses assess the quality and risk of bias of individual studies in a standardized way, as well as the quality of reporting. Meta-analyses, in addition, increase statistical power, improve the precision of effect size estimation, provide an overall summary measure from conflicting results and assess the possibility of publication bias. However, the quality of a meta-analysis depends, to a large extent, on the quality of the individual studies. Even if the separate studies are of high quality, a meta-analysis may not be advisable if there is a lack of compatibility among studies (e.g., the study populations, doses, case definitions, and intensity of surveillance for adverse effects may not be comparable), which may lead to large heterogeneity in the results and make it difficult to draw robust conclusions on causality (Ball et al. 2011). A recent review of case studies on the human and environmental risk assessment of chemical mixtures (Bopp et al. 2016) identified twenty-one case studies covering several compound classes and environmental media; it provided clear evidence that chemicals need to be addressed in risk assessment not only as single substances but also as mixtures, and that data sharing regarding toxicity and exposure needs to be significantly improved.

The role and critical assessment of the reliability and relevance of experimental data in toxicological prediction and risk assessment of chemical mixtures

Evaluation of the potential toxicity of chemicals and pharmaceuticals in experimental studies is a key element of the human safety evaluation. Quantitative risk assessment typically relies on animal data and extrapolation of this information to humans. To this end, regulatory agencies have prescribed guidelines to conduct studies to characterize the potential toxicity of chemicals (Brunekreef 2008).

Regulatory experimental studies for chemicals are usually performed according to harmonized Organisation for Economic Co-operation and Development (OECD) test guidelines (TG) and conducted following Good Laboratory Practice (GLP) principles; regulatory testing for drugs generally follows the experimental protocols developed by the International Council for Harmonisation (ICH) and also follows GLP. However, many of these studies are never published in the peer-reviewed literature since they contain proprietary information that is subject to confidentiality. For this reason, data from peer-reviewed scientific publications, in addition to mandatory regulatory toxicology studies, constitute an important part of the database used for the risk assessment of chemicals previously approved and marketed; however, such information is most often not available for newly developed chemicals (Kaltenhäuser et al. 2017). Regulatory studies typically expose genetically homogeneous, inbred strains of experimental animals to a range of (usually high) doses of a chemical under defined and controlled conditions to establish at which dose level chemicals elicit clear toxic effects or even death (NRC 2007). These effects can be attributed with high certainty to the chemical tested, and confounding is reduced or even avoided by appropriate experimental design. However, such studies require extrapolation from animals to humans, from high to low doses, and from single- to multiple-chemical exposures if a mixture is being investigated. Unless there is evidence to the contrary, findings in animal toxicology studies are generally considered applicable to humans, although the responses of laboratory animals and humans to chemicals may differ qualitatively and/or quantitatively (Hernández and Tsatsakis 2017).

Unlike regulatory studies, toxicological studies published in the peer-reviewed, open literature usually do not adhere to harmonized OECD TGs, ICH TGs or to GLP principles. Therefore, they need to be evaluated for relevance (e.g., whether they are fit-for-purpose) and reliability (e.g., whether they are trustworthy in terms of quality and validity). Their contribution to the overall weight of evidence is influenced by the test organism, study design, statistical methods, documentation, and reporting of methods and results. Criteria for relevance and reliability of toxicological data need to be considered when this type of information is used for regulatory purposes (Roth and Ciffroy 2016; Kaltenhäuser et al. 2017).

The quality and reproducibility of scientific investigations in general have been topics of considerable interest in the scientific community. Moreover, when studies are not conducted according to accepted guideline protocols, it is necessary to consider whether they are sufficiently powered; equally, guideline studies should also include considerations of power (Simmons et al. 2018). Criteria for the evaluation of the reliability of guideline-compliant and non-guideline studies are given in Table 2.

Table 2 Criteria for the evaluation of reliability of guideline-compliant or non-guideline studies (based on Kaltenhäuser et al. 2017)

For guideline-compliant studies, the respective TGs provide harmonized “checklists” to assess the reliability of the study. However, adherence to GLP does not guarantee methodological quality or error-free experimentation and data analysis, nor does it ensure that the study is actually relevant for human health risk assessment. Furthermore, the lack of adherence of published toxicological studies to OECD-TGs does not mean that these studies are not of high quality (Kaltenhäuser et al. 2017).

As previously mentioned, another problem when assessing toxicity in animal studies is to design the study adequately so that the evaluation of interactions is possible. Some of the major challenges in developing an experimental design for assessing the hazard of chemical mixtures are to prioritize the chemicals of concern, to set the number of mixture components, to define the study duration and the relevant dose concept, to identify the critical endpoint, to model the dose–response relationship, and to adequately interpret the obtained results in relation to the effect(s) induced by each single component. In theory, the vast number of globally relevant chemicals proven to have an impact on human health results in an almost unlimited number of possible combinations. Additionally, the experimental design becomes even more complex when considering how many different doses of each chemical, alone or in the mixture, should be administered to ensure not only reliable dose–response modeling but also an adequate basis for the analysis of potential in vivo interactions. The number of mixture components multiplied by the number of doses gives the total number of experimental groups, which should be considered in light of the three Rs principle addressing the importance of animal welfare in research and testing (see the illustrative calculation below). In the past, uncritically high doses were used with the aim of ensuring clear and unambiguous toxicological effects. It is, therefore, of utmost interest to adopt a low-dose concept, since in the majority of cases low exposures are closer to real-life human exposure. Finally, extrapolation of animal data to humans is of particular importance, i.e., the translation of toxic responses induced in animals should be sufficiently reliable to predict potential adverse health effects, as well as safe levels of the chemicals, for the human population.
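
As a back-of-the-envelope illustration of how quickly the number of experimental groups grows (the numbers of chemicals and dose levels below are arbitrary assumptions), compare a full factorial design with single-chemical arms and a fixed-ratio mixture design:

```python
# Illustrative count of experimental groups for a hypothetical mixture study.
n_chemicals = 5
doses_per_chemical = 4   # dose levels per chemical, excluding the control

full_factorial = (doses_per_chemical + 1) ** n_chemicals        # every dose combination
single_chemical_arms = n_chemicals * doses_per_chemical + 1     # each chemical alone + control
fixed_ratio_mixture = doses_per_chemical + 1                    # mixture diluted at a fixed ratio

print("Full factorial groups:       ", full_factorial)          # 3125
print("Single-chemical arm groups:  ", single_chemical_arms)    # 21
print("Fixed-ratio mixture groups:  ", fixed_ratio_mixture)     # 5
```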

Pioneering studies attempting to shift current risk assessment approaches toward the aforementioned real-life risk assessment simulation (RLRS) approach are emerging. These studies use realistic exposure scenarios (multiple chemicals, multiple adverse outcomes and doses around or well below regulatory limits) to derive the respective exposure limits (Tsatsakis et al. 2016, 2017; Docea et al. 2018, 2019).

Mechanism-based methodologies for the evaluation of the impact of chemical mixtures on human health

Novel tools, such as in vitro methods, omics, organs-on-a-chip, quantitative structure–activity relationships (QSARs), read-across, Physiologically Based Toxicokinetic (PBTK) modeling, Threshold of Toxicological Concern (TTC), Adverse Outcome Pathways (AOP), Dynamic Energy Budget (DEB) models and Integrated Approaches to Testing and Assessment (IATA) are being increasingly used in risk assessments of mixtures.

In vitro approaches

Significant efforts are being made to move beyond the traditional empirical assessment of apical toxic endpoints in laboratory animals toward an evaluation of toxicity based on an understanding of toxic mechanisms and pathways, with quantitative determination of the relevant parameters through in vitro studies. In terms of toxicokinetics, a major limitation of many in vitro test systems is that they are usually deficient in the enzymes and transporters involved in the disposition of chemicals. Furthermore, where the toxic moiety is an active metabolite rather than the parent chemical, in vitro models that do not account for toxicokinetics can overlook toxicity. With respect to toxicodynamics, the use of human-based in vitro test systems similar to the target cells in vivo makes it possible to address organ-specific toxicities and often to eliminate inter-species differences (SAPEA (Science Advice for Policy by European Academies) 2018).

Understanding the MOA can be considered a starting point in forecasting adverse effects in humans, and such approaches often rely on in vitro data (Rouquié et al. 2015). Current in vitro systems are suitable for studying and identifying the various biomolecules associated with physiological processes, with hazard assessment tracked by testing cell lines receptive to specific effects (mechanistic assays). The response to individual chemicals or to a combination of chemical compounds may include activation of receptors and/or specific pathways, triggering certain intracellular mechanisms such as lesion repair (Polini et al. 2014; Bopp et al. 2015). The in vitro assessment of chemical mixtures is complex, as it can be approached through top-down methods (overall toxicity caused by complex mixtures) (Tang et al. 2014) and/or bottom-up methods (the activity of individual chemicals checked with a variety of in vitro tests), further combined with chemical analysis in effect-directed analysis (EDA) (Burgess et al. 2013; Curcic et al. 2014).

In chemical and chemical mixture toxicology, it is essential first to evaluate the toxicity profile with in vitro approaches, which will provide important information related to the MOA (toxicodynamics). Such analyses rely mainly on cancer cell lines, normal cell lines or primary cells maintained under standard and controlled conditions, namely a defined time period, certain cell densities, levels of cellular confluency, etc. (Liu et al. 2017). The principal benefit for chemical effect assessment is that the cellular responses observed in in vitro cell cultures can be evaluated directly, whereas in vivo models might be disturbed by "non-chemical stressors" that can interfere with the effects under study (Bopp et al. 2015).

In vitro batteries can play a pivotal role in toxicity testing, although such systems do not necessarily reproduce the precise intercellular interactions or the extracellular environment (Polini et al. 2014). The combination of in vitro with in vivo approaches represents important progress toward replacing unnecessary in vivo tests. This approach is linked to initiatives such as the ToxCast Program of the U.S. Environmental Protection Agency (US EPA) and Tox21 of the National Institutes of Health (NIH) (Bopp et al. 2015; Rouquié et al. 2015). Tox21 is an initiative studying compound activity using in vitro assays focused mainly on metabolic activity, cellular viability and/or the evaluation of intracellular organelles such as mitochondria (Attene-Ramos et al. 2013) and of nuclear receptors (Huang et al. 2011). This innovative approach aims to develop relevant toxicity methods by screening a large number of compounds (currently over 10,000) in quantitative high-throughput screening platforms. These methods allow activity profile patterns to be generated that outline similarities between compounds. On the assumption that chemicals with similar activity profiles have analogous properties, their toxicological properties may be assessed through read-across from a compound's profile (Hur et al. 2018).

Several in vitro assays are accepted by regulatory organizations for the assessment of single compounds and/or mixtures, based on their ability to predict in vivo results. For example, the US EPA has accepted the ToxCast Estrogen Receptor model as a substitute for the in vivo uterotrophic assay (Browne et al. 2015; Kohno et al. 2018).

Omics-related tools

In recent years, omic approaches (transcriptomics or gene expression profiling, proteomics, metabolomics, and so on) have proven to be powerful tools in advancing knowledge of toxicological effects and risk assessment. Due to their high sensitivity, omic methods are advantageous for studying effects occurring at doses relevant for environmental exposure, whether to a single chemical or to a chemical mixture. Since a molecular signature does not always translate into an adverse outcome at the physiological level, proteomic data, although very precise, need to be interpreted carefully (Beyer et al. 2014). Nonetheless, omics are a valuable tool for the toxicity analysis of mixtures, as well as of single compounds, because mechanistic insight into the MOA and the affected pathways can be evaluated and better characterized. Furthermore, in conjunction with standard in vitro methods and computational tools, omic technologies have the potential to offer valuable insight into MOAs obtained in animal models (Brockmeier et al. 2017). Omic methodologies assess whether a certain compound induces changes that could develop into adverse effects and help identify MOAs. The resulting omic data provide class comparisons (which genes/proteins best discriminate the studied groups), class predictions (the pattern of gene/protein expression induced by the test compound, used to predict the MOA and its effects) and class discovery (cases in which unexpected but biologically relevant patterns emerge from the omics data) (Sauer et al. 2017).

Gene expression microarrays have provided improved insights into genetic signatures, with the power to discriminate genotoxic from non-genotoxic carcinogens and/or to assess the functional effects of chemical exposure. Among the omics, transcriptomics has been the most widely used for providing evidence on patterns of cellular alterations and on the pathways and molecular processes affected by chemicals. For the most part, this approach has been limited to single-chemical exposures; recently, however, transcriptomic studies have been used to investigate the effects of chemical mixtures (Van Delft et al. 2005; Thomas et al. 2007). One drawback of this approach is the lack of clear regulations and validated protocols (Bopp et al. 2015). In contrast to classical toxicology methods, transcriptomics can address issues such as the physiological behavior of a chemical mixture as a whole or of a specific component in the mixture (Moiz Mumtaz et al. 2010). Transcriptomic analysis investigates differential gene expression at realistic doses known to induce specific toxicological reactions, a strategy denoted as phenotypic anchoring (Andersen et al. 2018). Moreover, gene expression studies have proven to be valuable tools for assessing MOAs following in-life exposures of experimental animals to various chemical compounds or mixtures. Phenotypic anchoring belongs to an arsenal of tools, including heatmaps, pathway enrichment analysis, benchmark dose estimation and network representations, which assist in the analysis of patterns of differential gene expression across the affected cellular pathways (Andersen et al. 2018).

A novel omics-based strategy in chemical toxicity assessment is to combine in vitro cell-based assays with quantitative high-throughput proteomics technology to identify the molecular mechanisms underlying the expected effects, and thus to predict in vivo responses. In this regard, stem cells are a suitable tool because they possess unique proliferation and differentiation abilities, making them more physiologically relevant than classic standardized cell lines. Such omics-based assays comprising stem cells would provide mechanistic insights into chemical toxicity and may support a more thorough toxicological assessment (Han et al. 2018).

Organs-on-a-chip

In recent years, the development of physiologically relevant experimental models for assessing toxicological effects has become a pivotal topic. In vitro models have thus evolved from 2D to 3D settings, which more authentically mimic the inter- and extracellular interactions of a living organism. In some cases, 3D systems even surpass animal models, which may depict toxicological effects inexactly due to inter-species differences. As a consequence, advances in microfluidic technologies have allowed the design of 3D organs-on-a-chip (OC), expanding their usefulness not only in cancer and drug toxicity research but also in environmental toxicology studies (Pamies et al. 2014; Cho and Yoon 2017; Pamies and Hartung 2017).

OC systems consist of tissue constructs and cultured cells arranged on microfluidic channel networks that attempt to replicate human organ structure. Cells in these 3D systems can perform most in vivo functions (adherence, proliferation, cell-to-cell communication, etc.) thanks to a specific layer covering the channel network that mimics the extracellular matrix. In addition, a fluid flow is applied in OC to ensure a gradient of oxygen and growth factors/nutrients and proper intercellular interactions (Inamdar and Borenstein 2011). Therefore, these physiology-mimetic microfluidic OC are very promising candidates for replacing animal models in drug discovery, screening, and the assessment of the efficacy and safety of chemicals (Polini et al. 2014).

Microfluidic 3D devices have been developed that simulate various human organs (kidney, liver, gut, and lung) as well as combinations of organs, such as lung/liver chip systems. These OC systems can play a significant role in understanding the susceptibility of those organs to environmental toxicant exposure.

Several issues must be considered to obtain accurate results when using OC: the choice of a tissue-compatible chip material, the use of immortalized cell lines versus primary cells, and the selection of suitable analytical methods, as the current techniques (typically based on fluorescence microscopy) are time-consuming and lack real-time analysis capacity (Cho and Yoon 2017; Uto et al. 2017).

In silico methods as a complement to experimental and human observational data

Computer-based (in silico) methods are increasingly used in modern chemical risk assessment, particularly as a complement to experimental and human data, since they are a very cost-effective means of evaluating mechanistic explanatory hypotheses. In silico approaches can thus be used in cases of missing data, i.e., when toxicological data are not available in published databases or in regulatory assessments. This is all the more true for chemical mixtures, given the complexity of assessing actual exposures to mixtures and the associated health effects. The application of in silico methods to chemical mixtures therefore covers the whole spectrum of risk assessment, from external and internal exposure to toxicodynamics and the onset and development of pathology.

Exposure models

A key problem in exposure modeling in support of complex mixture risk assessment is the potentially infinite number of chemical mixtures that are plausible in the environment and/or in consumer goods. Machine learning techniques such as frequent itemset mining (FIM), a clustering method widely used in marketing studies (Borgelt 2012), may support the identification of consumer exposure patterns without losing the detail of the consumer exposure information. Hence, FIM can support the identification of the chemical mixtures most relevant for real-life exposures. To apply the FIM method, the data must first be converted from a concentration matrix into a discretized presence-absence matrix. The discretization threshold per compound would be derived using a toxicologically relevant criterion (e.g., a risk characterization ratio, RCR, > 0.05).
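
The sketch below illustrates this discretization and a brute-force frequent itemset count on synthetic data; the chemical names, reference doses and the RCR cut-off of 0.05 are assumptions used only to show the mechanics of the approach.

```python
# Schematic frequent-itemset step for identifying co-occurring chemicals:
# discretize a concentration matrix into presence/absence using an assumed
# RCR cut-off, then count co-occurrence frequencies of chemical pairs/triples.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(2)
chemicals = ["phthalate", "bisphenol_A", "pyrethroid", "PFOA"]
concentrations = rng.lognormal(mean=-1.0, size=(1000, 4))   # per-subject exposure estimates
reference_doses = np.array([5.0, 4.0, 2.0, 1.0])

rcr = concentrations / reference_doses                      # risk characterization ratio
present = rcr > 0.05                                        # discretized presence/absence matrix

min_support = 0.10  # an itemset must occur in at least 10% of subjects
for size in (2, 3):
    for combo in combinations(range(len(chemicals)), size):
        support = np.mean(present[:, list(combo)].all(axis=1))
        if support >= min_support:
            print([chemicals[i] for i in combo], f"support = {support:.2f}")
```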

The next step in exposure modeling involves the use of multimedia chemical fate and transport models. Mackay's Level III fugacity model is the most widely used and broadly accepted. A number of multimedia and integrated exposure models that describe the transfer of chemicals among the various environmental media are essentially based on the fugacity approach. The EUSES model system (Lijzen and Rikken 2004) aims at the evaluation of exposure and the associated risks for a broad range of compounds by directly associating aggregate exposure with potential adverse outcomes, without accounting for internal dosimetry or temporal exposure dynamics (Fryer et al. 2004). The Calendex™ model is the tool proposed by the US EPA for pesticide exposure assessment, accounting for a broad range of compounds taken up via multiple pathways and routes.
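
In fugacity-based models, compartment concentrations are tied to a common thermodynamic quantity; a schematic steady-state (Level III-type) balance can be written as follows (generic notation in the spirit of Mackay's D-value formulation; this is a sketch, not the exact EUSES equations):

```latex
C_i = Z_i f_i, \qquad
E_i + \sum_{j \neq i} D_{ji} f_j \;=\; f_i \Bigl( \sum_{j \neq i} D_{ij} + D_{R,i} + D_{A,i} \Bigr)
```

where C_i is the chemical concentration in compartment i, Z_i the fugacity capacity, f_i the fugacity, E_i the emission rate into the compartment, D_{ij} the intermedia transfer coefficients, and D_{R,i} and D_{A,i} the reaction and advection loss terms, respectively.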

An alternative to the usual multimedia modeling approach, proposed by Pistocchi et al. (2010), focuses on providing a higher spatial resolution of chemical compounds by replacing the numerical solution of the advection–dispersion equation with local analytical solutions. The Stochastic Human Exposure and Dose Simulation (SHEDS) system developed by the US EPA is a continuously evolving modeling system, which in its current form allows the investigation of complex exposure scenarios as well as associations with HBM data (Zartarian et al. 2002). Recent advances of SHEDS have resulted in the Modeling Environment for Total Risk studies (MENTOR) (Georgopoulos et al. 2005, 2006, 2008; Georgopoulos and Lioy 2006; Lioy et al. 2007), which incorporates internal dosimetry of multiple compounds via Multimedia, Multipathway, Multiroute exposures (4M). Similarly, the INTEGRA model (Sarigiannis et al. 2014) has been designed in Europe to address both far-field exposure resulting from multimedia exchange and near-field exposure through indoor microenvironments and consumer exposure, integrating all pathways and routes into an internal dosimetry model.

In order to represent population exposure starting from individual data, probabilistic modeling techniques such as Maximum Likelihood Estimation (for variables that follow normal distributions), Monte Carlo analysis, or Bayesian modeling have been introduced (Harper 2004; Zidek et al. 2005; Mutshinda et al. 2008; Bogen et al. 2009). Beyond probabilistic modeling, stochastic agent-based models (Brandon et al. 2018) further refine exposure estimates at the individual and population level by encompassing behavioral dynamics and exposure and risk determinants related to sociodemographic characteristics such as socio-economic status, age, gender or educational level. By properly describing these societal dynamics and following the evolution of the virtual agents, a more representative description of the behavioral aspects of exposure can be obtained.
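
A minimal Monte Carlo sketch of probabilistic aggregate exposure is shown below; the distributions, parameter values and route structure are purely illustrative assumptions, not those of any of the model systems cited above.

```python
# Minimal Monte Carlo sketch of probabilistic aggregate exposure (illustrative values).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

dietary = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n)     # mg/kg bw/day
dermal = rng.lognormal(mean=np.log(0.005), sigma=0.8, size=n)
inhalation = rng.lognormal(mean=np.log(0.001), sigma=0.6, size=n)
absorption_dermal = rng.uniform(0.01, 0.10, size=n)               # fraction absorbed

aggregate = dietary + dermal * absorption_dermal + inhalation
p50, p95 = np.percentile(aggregate, [50, 95])
print(f"Median aggregate exposure: {p50:.4f} mg/kg bw/day")
print(f"95th percentile:           {p95:.4f} mg/kg bw/day")
```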

Toxicokinetic (TK) and physiology-based toxicokinetic (PBTK) models

The main aim of TK models is to capture the chemical and biochemical kinetics and the biodistribution of xenobiotics after exposure. A key function of TK/PBTK models is to convert external exposure level to internal dose in biological fluids (e.g., blood, urine) and tissues used in HBM studies, thus permitting the link between external exposure and measured HBM levels (Bois et al. 2010). The need for PBTK models is increasing, as a result of the continuously growing concern regarding exposure to multiple chemicals from multiple pathways and routes (Yang et al. 2010), and how this is translated into temporal dynamics of chemical concentrations at the target tissue (Sarigiannis and Karakitsios 2011; Valcke and Krishnan 2011). Further considerations include early life exposures such as in utero (Beaudouin et al. 2010), through lactation (Verner et al. 2008) and during infancy (Edginton and Ritter 2009). PBTK models are capable of capturing multiple chemical interactions at the level of metabolism; however, the lack of sufficient experimental data to describe the interaction terms has resulted in a limited number of applications so far, such as in the case of volatile organic compounds (VOCs) (Sarigiannis and Gotti 2008) and metals (Sasso et al. 2010). PBTK models can also be used to reconstruct external exposure from HBM data. Several numerical methods have been proposed for this, mainly categorized into Bayesian and non-Bayesian approaches (Georgopoulos et al. 2008). A comprehensive method for the implementation of probabilistic exposure reconstruction is provided by the Bayesian approach. This type of approach has been used in the INTEGRA model, based on the Markov Chain Monte Carlo and Differential Evolution Markov Chain techniques (Andra et al. 2015; Sarigiannis et al. 2016).
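
To illustrate the basic function of such models, the sketch below uses a one-compartment toxicokinetic model (far simpler than a multi-compartment PBTK model) to convert a repeated external daily dose into an internal plasma concentration time course; all parameter values are hypothetical.

```python
# One-compartment toxicokinetic sketch: repeated oral dosing to internal concentration.
# All parameters are hypothetical and chosen only for illustration.
import numpy as np

dose = 0.01            # external dose per event, mg/kg bw
F = 0.8                # oral absorption fraction
Vd = 0.2               # volume of distribution, L/kg bw
half_life_h = 12.0
ke = np.log(2) / half_life_h        # first-order elimination rate constant, 1/h

dt = 0.1                             # time step, h
t = np.arange(0, 14 * 24, dt)        # two weeks of simulated time
steps_per_day = int(round(24 / dt))

conc = np.zeros_like(t)              # plasma concentration, mg/L
for i in range(1, len(t)):
    conc[i] = conc[i - 1] * np.exp(-ke * dt)        # first-order elimination
    if i % steps_per_day == 0:                      # one dose at the start of each day
        conc[i] += F * dose / Vd

print(f"Approximate peak concentration at steady state: {conc[-steps_per_day:].max():.3f} mg/L")
```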

Physiology-based toxicodynamic models (PBTD)

PBTD models capture the interaction of xenobiotics with molecular targets and how these may be affected by biochemical interactions between mixture components when reaching the same or allosterically linked molecular receptors. Thus, they provide a higher level of biological plausibility supporting the derivation of causal dose–response relationships in mixture risk assessment.

Pathology models

A comprehensive method to assess the risks of long-term effects would include the development of a mechanistic approach (subject to the constraint that the necessary data are available) based on a biology-based dose–response (BBDR) model. BBDR models aim to link exposure to internal dose and to describe computationally the complex biological processes that result in clinically observed adverse health outcomes.

There are two major ways to derive a BBDR:

  • The first one is to describe the sequence of biological events that culminate in the clinical observations following an entirely mechanistic procedure; this requires consideration of all relevant toxicological aspects, following the concept of quantitative adverse outcome pathways.

  • The second one is to advance the association of epidemiological data with exposure levels, by linking them not only to external exposure estimates, but to internal dose, accounting thus for differences in exposure pathway, route of administration and genetic variability in metabolism and clearance.

Aiming at the quantification of risks associated with cancer, the multistage approach proposed by Armitage and Doll (2004) can be applied, based on the decomposition of the dose–response relationship into different micro-relations that correspond to different interconnected biological processes (Sarigiannis and Gotti 2008). Constructing dose–response relationships based on internal dose provides a stronger biological basis for extrapolations across different studies, species, exposure magnitudes and routes (Aylward et al. 1996; Benignus et al. 1998; Melnick and Kohn 2000). This type of analysis provides additional insights when co-exposure to chemical mixtures results in alterations of the internal dose that are reflected in the risk calculation, even if the toxic effect is attributed to only one of the mixture components. This is the case for the ubiquitous chemical mixture BTEX (benzene–toluene–ethylbenzene–xylene(s)). Sarigiannis and Gotti (2008) developed a BBDR model that efficiently captures the dose-dependent inhibition of benzene metabolism in humans co-exposed to toluene, ethylbenzene and xylenes, a feature that, although of little consequence at environmental concentration levels, becomes highly significant in occupational settings at exposure levels close to the regulatory threshold limit values.
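
A widely used quantal form in this family of models is the multistage dose–response model, shown here schematically (a generic formulation rather than the specific BBDR of the cited study), with the internal dose delivered by a PBTK model taking the place of the external exposure:

```latex
P(d) = 1 - \exp\!\left[-\left(q_0 + q_1 d + q_2 d^2 + \dots + q_k d^k\right)\right], \qquad q_i \ge 0
```

where P(d) is the lifetime probability of the effect at internal dose d and the q_i are non-negative coefficients associated with the successive stages of the process.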

(Quantitative) structure–activity relationships—(Q)SARs

QSARs are mathematical relations linking molecular attributes related to chemical structure to biological activity and physicochemical properties. When the description of the relationship between chemical structure and molecular activity is based only on qualitative descriptors, the models are referred to as structure–activity relationships (SARs); when this description is given in quantitative terms, they are referred to as QSARs. Two major types of SAR/QSAR models can be distinguished:

  • Structural alerts linking the chemical structure of the molecule analyzed or the presence of specific chemical groups in key positions in the overall molecular configuration with toxicity (e.g., carcinogenicity).

  • Chemical models allowing the prediction of (a) key physicochemical descriptors such as Kow (the octanol–water partition coefficient, a metric of lipophilicity) (Papadaki et al. 2017), which affect toxicokinetic properties; and (b) biochemical parameters (Vmax, Km) that determine the quantitative toxicokinetic and toxicodynamic behavior of chemicals. The latter effectively supports the widespread use of PBTK/TD models in the face of data paucity by enabling complex model parameterization. The performance of this type of QSAR has improved significantly over recent years by coupling advanced QSAR modeling, such as Abraham's solvation equation (see the schematic equation below), with machine learning algorithms (Sarigiannis et al. 2017). The use of (Q)SARs for PBTK/TD model parameter estimation is a key enabling technology for the further development of biology-based dose–response models of chemical mixtures, since it would expand the chemical space covered by models encompassing biochemical interactions among mixture components and between these chemicals and co-exposed biological receptors.
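
As an illustration of this class of relationships, Abraham-type linear free energy models take the general form (shown schematically, with the usual descriptor notation):

```latex
\log SP = c + e\,E + s\,S + a\,A + b\,B + v\,V
```

where SP is the property being predicted (e.g., a partition coefficient), E, S, A, B and V are solute descriptors (excess molar refraction, dipolarity/polarizability, hydrogen-bond acidity, hydrogen-bond basicity and McGowan volume, respectively), and c, e, s, a, b and v are system coefficients fitted for the phase pair or biological system of interest.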

Use of adverse outcome pathways (AOPs) to structure predictions of toxicity testing—toward mechanistic risk assessment

In the last decade, the assessment of chemical hazards has started to move toward mechanistically based approaches. The “adverse outcome pathway” (AOP) framework is conceptually similar to a MOA and was originally described in the ecotoxicology community. This framework has now been adopted within an international initiative coordinated by the OECD and supported by organizations such as the US EPA, the EU’s Joint Research Centre (JRC) and the European Food Safety Authority (EFSA) to assess, document and develop potentially predictive testing strategies for adverse ecological and human health effects. Since 2012, the OECD’s AOP development program has created a knowledge base comprising a set of tools that allow the coordinated development and adoption of AOPs.

An AOP represents a logically ordered sequence of key events (KE), linked through key event relationships (KER), that take place at various levels of a biological system upon exposure to chemicals. Such exposure interacts with specific molecular targets and can trigger a chain of events eventually leading to deleterious effects in humans or other living organisms. As scientific knowledge progresses, updates focusing on practical guidance for AOP developers and assessors are needed (OECD Series).

In an AOP, the molecular initiating event (MIE) is defined as the point where a chemical directly interacts with a biological target(s) to create a perturbation. This perturbation can develop through a dependent succession of intermediate KE and finish with the appearance of an adverse outcome (AO); the latter being relevant for the current risk assessment framework. AOPs should describe the critical steps along the developed path and these steps should be measurable and predictable. Of particular importance are the links between upstream and downstream KE, the KERs. They should be supported by plausibility and evidence and, ideally, a quantitative understanding. The analytical construct of an AOP should include causally interconnected events that can generate deleterious health and/or eco-toxicological effects (OECD 2018).

It is expected that AOPs will lead to a broader identification of testing (e.g., in vivo, in vitro, in chemico) and in silico methods. All of these newly identified experimental approaches can further support regulatory decision-making. This is the focus of another OECD initiative, the Integrated Approaches to Testing and Assessment (IATA) (Tollefsen et al. 2014). An IATA is a pragmatic, science-based approach to chemical hazard characterization that relies on the combined analysis of existing information coupled with the generation of new information using appropriate testing strategies. An IATA follows an iterative approach to answer a defined question in a specific regulatory context, taking into account the acceptable level of uncertainty associated with the decision context.

In 2018, an extended international survey identified major topics that need future development: increased communication and stakeholder involvement in AOP development and knowledge; increased regulatory use and acceptance of the AOP framework; and increased use of the AOP framework in applications (Knapen et al. 2018). The latest updated guideline also makes reference to AOP networks, which are formed by the assembly of two or more AOPs sharing one or more key events. Various scenarios can be envisaged, e.g., a single MIE leading to different AOs, or multiple MIEs converging on a single AO.

On the other hand, although AOPs provide an indication of the connection between a chemical and a particular disturbance in the organism, they do not necessarily define the probability or severity of the AO that can be expected under a specified exposure scenario (Conolly et al. 2017). Hence, most AOPs are inappropriate for quantitative risk assessment. The term quantitative AOP (qAOP) refers to a relatively advanced stage in the progression of AOP development and consists of one or more biologically based computational models describing the KERs linking the MIE to the AO (Conolly et al. 2017). Although their development can be rather resource-intensive, it seems that, owing to their quantitative dose–response and time-course predictions, qAOPs will be critical to future regulatory decision-making.
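
The toy sketch below conveys the basic idea of a qAOP by propagating a chemical concentration through a chain of key event relationships to an adverse-outcome probability; the Hill-type link functions and all parameter values are hypothetical and purely illustrative.

```python
# Toy quantitative AOP chain: concentration -> MIE activation -> intermediate KE
# -> adverse outcome probability. All functions and parameters are hypothetical.
import numpy as np

def hill(x, top, k50, n):
    """Generic saturating key-event relationship."""
    return top * x**n / (k50**n + x**n)

def qaop(concentration_uM):
    mie = hill(concentration_uM, top=1.0, k50=5.0, n=1.5)    # receptor activation
    ke = hill(mie, top=1.0, k50=0.4, n=2.0)                  # downstream key event
    ao_probability = hill(ke, top=0.9, k50=0.6, n=3.0)       # apical adverse outcome
    return ao_probability

for c in (0.1, 1.0, 10.0, 100.0):
    print(f"{c:6.1f} uM -> predicted AO probability: {qaop(c):.3f}")
```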

Integration of separate streams of evidence (epidemiology, toxicology, mechanistic data) for the assessment of chemical mixtures toxicity

The awareness of the limitations of observational epidemiological studies has frequently been used to dismiss or downplay a potentially useful body of information. Conversely, experimental data have been considered a critical component of any well-conducted risk assessment. Human and experimental data should be seen as complementary, i.e., one emanating from controlled exposures using an experimental study design and a relatively homogeneous surrogate population, the other reflecting the changes observed in a heterogeneous target population under mixed (and varying) exposure conditions using a non-experimental study design (ECETOC (European centre for ecotoxicology and toxicology of chemicals) 2009).

Most of the toxicological studies required to support the authorization of regulated chemicals use experimental animals, which are not always predictive of the full range of possible adverse outcomes that can occur in humans. For instance, complex human diseases (e.g., developmental neurotoxicity, neurodegenerative diseases, mental disorders, some cancers and endocrine disturbances) may not be adequately addressed by the standard series of regulatory toxicity tests currently in practice. Human, experimental and in silico data are not able, individually, to capture the pathophysiology of complex apical effects. However, a set of assays can provide insight into the diverse pathways that can lead to end-organ effects (SAPEA (Science Advice for Policy by European Academies) 2018). An emerging approach focuses on the detailed mechanisms by which chemicals interact with the body and on the biochemical pathways that can lead to the adverse outcome or disease. All types of evidence addressing the same exposures and health outcomes can be weighed in a systematic, consistent and transparent way for hazard identification using the AOP concept.

The assessment of mixture toxicity needs to address the actual pattern of co-exposure in the target population and the adverse health effects posed by the mixture. While the former requires valid and reliable methods for exposure assessment (measurement and/or modeling), the latter benefits from the complementary sciences of epidemiology and toxicology, i.e., in vivo, in vitro and in silico data (Webster 2018). The combination of all lines of evidence can contribute to a weight-of-evidence analysis in the characterization of human health risks, with the aim of improving decision-making. Approaches have been proposed by EFSA for the integration of different streams of evidence in the case of pesticides (Ockleford et al. 2017).

Epidemiology plays a special role in the risk assessment of environmental exposures because it is based on direct observations in humans. However, one important limitation of observational epidemiology is the validity of the associations observed, which are not necessarily causal. Hence, information from experimental in vivo and in vitro studies complements epidemiology by providing biological plausibility and, therefore, more confidence in the causal interpretation of epidemiological findings for risk assessment (Brunekreef 2008).

For chemicals already marketed, relevant information may be available from toxicological and/or epidemiological studies in the peer-reviewed scientific literature. Regulatory decision-making for the re-approval of chemicals should not be based solely on the (non-publicly available) regulatory studies, but should also include a systematic review of the published literature to collect and synthesize all evidence from non-standardized toxicological and epidemiological studies relevant for risk assessment (SAPEA (Science Advice for Policy by European Academies) 2018). Data from multiple lines of evidence can then be integrated on the basis of a weight-of-evidence analysis accounting for relevance, consistency and biological plausibility using the modified Bradford Hill criteria (Ockleford et al. 2017). This flexible approach to chemical safety assessment, based on the integration and translation of data derived from multiple sources and methodologies, is known as IATA.

Although there is no established means of reading across health outcomes between epidemiological studies and experimental findings, basic principles have been proposed as the basis for the comparative interpretation of human and experimental findings. If experimental and human data are concordant, it might be assumed that the whole dataset is consistent with the hypothesis that the same processes are occurring in both animals and humans and that, in the event of appropriate exposure, similar toxicities might result. In the case of non-concordance, the reasons for the discordance should be examined. If the reason is found to lie in the underlying biology, then confidence in the risk assessment will increase (ECETOC (European centre for ecotoxicology and toxicology of chemicals) 2009).

Final reflections

The traditional approach of toxicity testing in which a model system is exposed to a single chemical at relatively high concentrations to elicit an effect yields limited information regarding real-life exposures. This is further complicated by the fact that humans are not exposed to a single chemical, but to a complex mixture of chemicals at low doses on a daily basis. Simply trying to ‘add up’ the different adverse outcomes from single chemicals is not sufficient to fully understand the nature of these complex chemical/chemical interactions. Furthermore, exposure during critical developmental periods may lay a foundation for increased toxicity to chemicals later in life, with the sequence of chemical exposures becoming important in determining the adverse outcome. Hence, model systems previously used in toxicity studies may have oversimplified the actions of various chemicals. To develop new or improved model systems, investigators need to find the appropriate balance of rigor, reproducibility and appropriateness of the models used for mixture toxicity studies.

When the toxicity of chemical mixtures is assessed using epidemiological studies, important limitations restrict the use of these data for risk assessment, including exposure and/or outcome misclassification, confounding factors and different types of bias that challenge the interpretation of the findings. The analysis of chemical mixtures follows the same principles as described above, with the timing, duration and concentration of exposure being vital factors in determining or predicting outcomes. Epidemiological studies can take advantage of biomarkers of exposure (e.g., HBM), translational biomarkers of response (e.g., omics) and biomarkers of susceptibility (e.g., toxicogenomics), which contribute to a better understanding of the association between exposure to chemical mixtures and health outcomes, an association that is often difficult to ascertain.

Recent advances in technology have resulted in significant progress in the study of mixture toxicology. Improved technology has added impact to data obtained from new in vitro approaches, omics-related tools, organs-on-a-chip and 3D cell cultures, which facilitate the high-throughput analysis of biomarkers. Improved culture techniques have greatly advanced our understanding of a chemical's action at the cellular level and have helped gain insight into tumor biology. Complex scaffolds or matrices can be used to mimic more fully the function of the tissue at the organ level, facilitating the interpretation of toxicology data from chemical mixture studies. These strategies, together with in silico methods, have improved the understanding of toxicity pathways and allowed the development of AOP-informed IATA to support regulatory decisions.

By understanding the multiple pathways associated with adverse outcomes, an investigator is better equipped to predict the risks associated with exposure to chemical mixtures. As technology and understanding advance, our ability to harness separate streams of evidence and to integrate them into an overall assessment of the outcomes associated with mixture exposure improves. As many national and international organizations currently stress, studies on chemical mixture toxicity are of primary importance; these authorities have highlighted both the need to conduct studies addressing the toxicity of chemical mixtures and the significance of developing appropriate tools and technologies to facilitate these studies and their predictive abilities. The use of AOPs may help our understanding of these intricate pathways, as they may contain multiple branches and intersections. Gaining greater knowledge of the pathways associated with each adverse outcome will improve the predictability of all possible outcomes and will increase our understanding of the toxicity of chemical mixtures.

Clearly the path to understanding the toxicity of chemical mixtures is complicated, convoluted and sometimes intertwined with other systems, which makes risk assessment a complex task. The development and joint use of technologies and methodologies across all of the paths, including new exposure assessment models, in vitro approaches, omics-related tools, organs-on-a-chip and 3-D cell culture, in silico methods, epidemiological and experimental methods and models, will improve risk assessment of chemical mixtures and the prioritization of mixtures of concern. Overall, the combined use of all these lines of evidence will result in a better understanding of AOPs, particularly quantitative ones, thus contributing to a better insight into chemical mixture toxicity.