In Pursuit of Total Exposure Health: Leveraging Exposure Science, the Omics, and Other Emerging Technologies

Sherrod Brown and Dirk Yamamoto

Wright-Patterson AFB

Introduction

Evaluation of harmful exposures that could pose health threats to workers has traditionally focused on environmental and occupational exposures. Considering that the workday accounts for only approximately one-third of a worker’s day, these evaluations are somewhat incomplete: they do not represent the worker’s total exposure. How, then, can the health of a worker be determined from only a fraction of his or her daily exposure?

Hazards such as first- or secondhand cigarette smoke, air pollution, and harmful contaminants in drinking water, as well as lifestyle factors such as stress, diet, and sleep habits, all affect an individual’s health and, subsequently, work performance. Yet these stressors are not accounted for in workplace exposure evaluations and typically are not included in an individual’s clinical assessment. Also unaccounted for are the numerous exposures of a worker’s everyday lifestyle, e.g., foods and drugs consumed, commercial products applied to the skin, and ambient air inhaled away from the workplace. These off-duty exposures contribute to a cumulative, or “total,” exposure that can pose a long-term threat to health. This chapter discusses exposure science and how emerging technologies may be applied to the Total Exposure Health (TEH) framework (Goff and Hartman 2018).

Exposure Science: Key Principles Useful for Total Exposure Health

Defined as “the study of the contact between human and physical, chemical, or biological stressors,” exposure science seeks to understand the nature of this contact for the purpose of protecting ecologic and public health (Teeguarden et al. 2016). Exposure science has two primary goals: (1) to understand how stressors affect human and ecosystem health and (2) to prevent or reduce contact with harmful stressors, or to promote contact with beneficial stressors, to improve public and ecosystem health (NRC 2012). Applying exposure science principles to TEH will require additional tools, such as sensors, biomarkers, and analytics, to improve exposure characterization and to identify and understand variability, susceptibility, and vulnerability within an individual’s exposome. Doing so will, for example, lead to a better understanding of why individuals who share duties in the same workplace can have noticeably different health outcomes after experiencing the same representative occupational exposure.

First introduced by Dr. Christopher Wild (2005), the exposome “encompasses life-course environmental exposures from the prenatal period onwards.” The Centers for Disease Control and Prevention (CDC) further defines the exposome as “the measure of all the exposures of an individual in a lifetime and how those exposures relate to health” (CDC 2014). Accordingly, each person develops a unique exposome from birth onward, which can be thought of as the environmental counterpart to the human genome (Betts and Sawyer 2016; Wild 2005). While the genome is important, particularly in the predisposition to certain diseases, researchers have determined that it does not play as large a role in disease as initially expected (accounting for only approximately 10%). However, the genome, in concert with the highly variable exposome, holds promise for explaining the cause of various diseases. Critical to understanding the causes of disease, and eventually its prevention, is further study of environmental hazards and their relation to disease. The knowledge gained from further research of both the genome and exposome could ultimately result in more effective treatment and improved patient management under the TEH paradigm.

Wild (2012) described three overlapping domains within the exposome: (1) a general external environment (e.g., climate, stress, socioeconomic status), (2) a specific external environment (e.g., diet, occupational and environmental exposures, drugs), and finally, (3) an internal environment (e.g., DNA, metabolism, microbiome). Traditionally, occupational health evaluations have focused only on the specific external environment, with collection of samples from air, water, and other external media to determine potential exposures and provide risk assessments. Relying strictly on the external environment gives exposure scientists only a glimpse of the exposures affecting the overall health of an individual. Studying all three domains relies on the application of unique internal and external exposure assessment methods.

Previously, exposure science relied heavily on external exposure information for a small number of stressors, locations, times, and individuals. Now, the science is shifting to a more systematic assemblage of internal exposures of individuals in entire populations and multiple elements of the ecosystem to multiple stressors (NRC 2012). Over the past 15 years, there has been greater emphasis on the use of internal markers of exposure (i.e., biomarkers) to assist in defining exposure-response relationships. Internal measures of exposures to stressors are closer to the target site of action for biologic effects than are external measures, but the variability in the relationship between sources of stressors and effects can be greater than that when relying on external measures of exposure. Analytical methods enable detection of much lower concentrations of stressors internally and measurement of multiple stressors in single samples, but the correlation between external exposures and resulting internal concentrations is still being investigated. Thus, most biomonitoring data cannot be interpreted without far more information that can only be gained through more research. New analytical methods to quantify human biomarkers of chemical exposure have been developed for many substances, and many newly developed techniques for other chemicals are currently in the validation process (Becker et al. 2003).
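As a toy illustration of defining an exposure-response relationship, the sketch below correlates an external exposure measure (personal air sampling) with an internal biomarker (a urinary metabolite level) across a handful of workers. All names, values, and units are invented for illustration, not real reference data.

```python
# Toy exposure-response check: does an internal biomarker track an
# external exposure measure across workers? Data are made up.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

air_ppm = [0.5, 1.0, 2.0, 4.0, 8.0]            # external measure per worker
urine_ug_per_l = [3.1, 5.8, 11.5, 20.2, 44.0]  # internal biomarker per worker
print(round(pearson_r(air_ppm, urine_ug_per_l), 3))
```

A strong correlation would support using the biomarker as an internal surrogate for external exposure; as noted above, real studies must also account for the variability introduced by timing, metabolism, and co-exposures.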

High-Throughput Metabolomics in Total Exposure Health

The Omics

The past few decades have seen a revolution in medical research with the rise of various omics technologies (see also Chapter 8, “Omics”: An Introduction), and today, many biological research efforts incorporate these high-throughput technologies into their methodologies. The suffix omics describes a field of study in the life sciences that focuses on large-scale data and information. The first of the omics technologies to gain attention, genomics, provided a framework useful for mapping and studying specific genetic variants contributing to various diseases. The identification of these genetic variants has allowed scientists to rationalize the development of additional systems biology technologies that integrate different omics data types, all with the goal of identifying molecular patterns associated with diseases. The triad of omics comprises genomics, which studies cell DNA and genetic information (the genome); proteomics, which studies the structure and functions of proteins (the proteome); and the youngest of the three, metabolomics, which studies the human metabolome.

Metabolomics

In simple terms, metabolomics examines the unique chemical footprint left behind by metabolism, and it is integral to understanding how biological systems function. Because of its great potential for the study of the exposome, this field is emerging as an important way of characterizing human exposures and, therefore, aiding in describing an individual’s total health. This section presents the basics of metabolomics, advances in the field and advantages of high-throughput metabolomics, and how they could benefit TEH.

First introduced in 1998, the metabolome is the collection of low-molecular-weight compounds (metabolites), to include amino acids, lipids, and organic acids, present at any given time in a cell or organism and participating in metabolic reactions (i.e., metabolism). It contains the biological endpoints of genomic, transcriptomic, and proteomic perturbations, and also reflects environmental influences (to include stress, lifestyle, and xenobiotic use) and the gut microbiota (Johnson and Gonzalez 2012). Metabolism comprises the life-sustaining biochemical processes by which living things break down and assimilate nutrients, and from those processes each individual forms a unique metabolic profile. Because metabolism is influenced by all the factors previously mentioned, examining the metabolic profile can provide a better representation of an organism’s or individual’s phenotype than genomics or proteomics alone. Metabolomes become altered when an organism’s biological systems are disturbed by disease, genetic mutations, or environmental factors; even very subtle variations between individuals can result in large perturbations to metabolite concentrations and flux. Recently, experts in the exposure sciences have pushed to narrow the focus of the exposome to include only metabolomics (NIOSH 2019). By monitoring metabolite changes in biofluids (e.g., blood and urine), metabolomics can be used to profile individuals’ responses to drug treatment or other medical therapy. Observing a set of metabolites with different concentration changes is a unique advantage of using metabolites as biomarkers: the changes in concentration can be correlated with a disease state or treatment response.
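The idea of monitoring metabolite concentration changes between biofluid samples can be sketched as below. The metabolite names, concentrations, and the two-fold flagging threshold are illustrative assumptions, not clinical reference values.

```python
# Hypothetical sketch: flag metabolites whose concentration changed
# markedly between two biofluid panels (e.g., baseline vs. follow-up
# urine). All names and numbers are illustrative.

def flag_perturbed(baseline, followup, fold_threshold=2.0):
    """Return metabolites whose concentration changed by >= fold_threshold
    (up or down), mapped to their fold change."""
    flagged = {}
    for name, base in baseline.items():
        after = followup.get(name)
        if after is None or base <= 0:
            continue  # metabolite not measured in both samples
        fold = after / base
        if fold >= fold_threshold or fold <= 1.0 / fold_threshold:
            flagged[name] = round(fold, 2)
    return flagged

baseline = {"lactate": 1.1, "citrate": 0.4, "hippurate": 2.0}
followup = {"lactate": 3.5, "citrate": 0.38, "hippurate": 0.7}
print(flag_perturbed(baseline, followup))
# → {'lactate': 3.18, 'hippurate': 0.35}  (lactate up, hippurate down)
```

In practice such fold changes would then be correlated with a disease state or treatment response, as described above.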

Compared to the other omics technologies that are more narrowly focused, metabolomics must interrogate a wide variety of molecules with very diverse physicochemical properties. This may explain why metabolomics is perhaps not as well developed and is highly dependent on well-defined internal standards to achieve consistent results. Although the genomics revolution brought an unprecedented ability to obtain genetic information across individuals and populations to aid in finding causes of human disease, tools for measuring the exposome have been slower to develop. However, recent technological advancements have enabled metabolomics researchers to detect the small molecules within the metabolome, quantify a large amount of those metabolites, and study their role in disease states. The science has the ability to provide a broad, agnostic assessment of the compounds existing in a biosample, rather than being limited to a chemical or class of chemicals selected in advance (Betts and Sawyer 2016). These assessments can be accomplished via two distinct detection approaches: targeted and untargeted metabolomics.

Targeted Metabolomics

Targeted metabolomics refers to the exact quantification of known, as well as expected, metabolites by employing analytical standards. The targeted approach focuses on a single analyte, a class of chemically similar analytes, or a set of analytes with chemistries similar enough to allow for their measurement in a single analysis (Metz et al. 2017). This is similar to an external exposure assessment, where there is usually a targeted (i.e., known) contaminant of concern. The benefits of the targeted approach include accurate quantification of the contaminant of interest, low limits of quantification, and usually high confidence in the analyte’s identification. With metabolomics capabilities, medical clinicians could periodically perform biomonitoring of workplace employees with the goal of tracking and quantifying biomarker levels. This approach would provide a more complete assessment of workers’ individual internal exposures versus only a representation of their specific external exposure.
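Quantification against analytical standards typically rests on a calibration curve. The sketch below fits a linear curve from standards of known concentration and then inverts it to quantify a target analyte; all concentrations and instrument responses are made-up numbers.

```python
# Illustrative targeted quantification: fit a linear calibration curve
# from analytical standards, then convert an observed instrument
# response into a concentration. Values are invented.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Calibration standards: known concentration (ng/mL) -> instrument response
std_conc = [0.0, 5.0, 10.0, 20.0, 40.0]
std_resp = [0.1, 5.2, 10.1, 19.8, 40.2]
slope, intercept = fit_line(std_conc, std_resp)

def quantify(response):
    """Invert the calibration curve: response -> concentration (ng/mL)."""
    return (response - intercept) / slope

print(round(quantify(15.0), 1))  # concentration for an observed response
```

A production method would also report limits of quantification and confidence in the analyte’s identity, the benefits noted above.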

Untargeted Metabolomics

Untargeted metabolomics simultaneously measures as many metabolites as possible in a biosample without having prior knowledge of the identity of the assessed metabolites. The untargeted approach is increasingly popular among metabolomics experts, as it allows for unknown or emerging exposures of concern to be detected. Although an advantage of the untargeted approach is that collection can be accomplished without preexisting knowledge, sample preparation and analytical methods have a direct impact on the qualitative results obtained. Due to the diversity of the metabolome, sample preparation steps, separation methods, and instrument platforms and parameters will affect the subset of the metabolites detected (Schrimpe-Rutledge et al. 2016). Datasets from untargeted studies are particularly complex, and a number of metabolites remain uncharacterized (Agin et al. 2016). When using mass spectrometry (MS) technology to analyze samples, it is likely impractical to manually inspect and interpret the thousands of peaks detected (Patti et al. 2013).
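Because manual inspection of thousands of peaks is impractical, untargeted workflows automate steps such as noise filtering and feature grouping. The sketch below shows one such simplified step: discarding low-intensity peaks and binning near-identical m/z values that likely represent the same feature. The peak list, intensity threshold, and m/z tolerance are illustrative; real peak-picking software is far more sophisticated.

```python
# Simplified untargeted-MS housekeeping: drop peaks below a noise
# threshold, then merge peaks whose m/z values fall within a tolerance,
# keeping the most intense peak per bin. Numbers are illustrative.

def bin_peaks(peaks, min_intensity=1000.0, mz_tol=0.01):
    """peaks: list of (mz, intensity). Returns one representative
    (mz, intensity) per m/z bin."""
    strong = sorted(p for p in peaks if p[1] >= min_intensity)
    binned = []
    for mz, inten in strong:
        if binned and mz - binned[-1][0] <= mz_tol:
            # Same feature: keep whichever peak is more intense.
            if inten > binned[-1][1]:
                binned[-1] = (mz, inten)
        else:
            binned.append((mz, inten))
    return binned

peaks = [(180.063, 5200.0), (180.065, 800.0), (180.066, 9100.0),
         (255.233, 450.0), (301.141, 2300.0)]
print(bin_peaks(peaks))  # → [(180.066, 9100.0), (301.141, 2300.0)]
```

Note that this greedy single-pass binning anchors each bin on its current representative peak, a simplification; production tools use more robust clustering across samples.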

There is a significant range of platforms that can be employed to conduct metabolomics studies in biosamples. MS and nuclear magnetic resonance (NMR) have emerged as the most common of these available analytical platforms (Dias and Koal 2016). The high reproducibility associated with NMR, and the high sensitivity and selectivity associated with MS, make these tools superior to other analytical techniques (Emwas 2015). Various techniques within both methods of analysis offer multifaceted approaches to detect and identify a variety of metabolites. Because of its detection and quantitation sensitivity, however, MS is the preferred method, as it allows several hundred metabolites to be measured in a single analysis. The very diverse characteristics of small metabolites make chemical separation and detection challenging steps in the application of metabolomics. To separate the components of a mixture, MS is paired with a separation technique. These “hyphenated” analytical techniques (e.g., gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS)), which combine separation technology with MS, have become highly effective tools for small-molecule analysis (Gowda and Djukovic 2014). GC-MS and LC-MS are the most popular separation techniques, as both can be used to detect low-concentration metabolites. Instrument capability has slowly progressed in the number of metabolites detectable versus sensitivity, with NMR able to detect on the order of 10² metabolites at >μM (i.e., micromolar) sensitivity, GC-MS on the order of 10² metabolites at nM sensitivity, and LC-MS more than 10³ metabolites at <nM (i.e., nanomolar) sensitivity (Bradburne et al. 2015).

A relatively new and appealing technique for enhancing single-cell analysis methods is the incorporation of ion mobility spectrometry (IMS) with MS (Metz et al. 2017). This technique offers improved identification of metabolites and provides an alternative to solution-phase methods, since separations are performed on the order of milliseconds in IMS versus minutes to hours in conventional chromatographic methods. However, the potential of this integration has been hampered by the loss of sensitivity and the mismatched duty cycles traditionally associated with IMS/MS combination systems. Scientists at Pacific Northwest National Laboratory (PNNL) have developed an integrated instrument platform that overcomes these issues (PNNL 2020). The platform increases sensitivity and throughput by incorporating three distinct innovations: (1) ion funnel technology, (2) ion funnel trap advancements that provide ion accumulation and precise release, and (3) a multiplexing feature for greater sensitivity with better-aligned duty cycles. The greater sensitivity derives from the technique yielding a higher signal-to-noise ratio than that produced by conventional techniques. The new technique also increases analysis throughput because it allows ion packets to travel simultaneously through the drift region of the instrument (PNNL 2020). Industry has further advanced ion funnel technology by putting it on printed circuit boards, making the technology more economical. Structures for Lossless Ion Manipulations, or SLIM technology, could revolutionize molecular studies and, more specifically, metabolomics. SLIM adds significant length (15, 20, 60 meters and more) to the typical ion path by implementing a serpentine pathway on a compact circuit board. Because the resolution of ion separations depends on the length of the drift path, the longer paths offered by SLIM provide more separation, grouping, and molecular analysis. Its developers think SLIM will “allow for a whole new universe of compounds and materials to be synthesized, purified, and collected” (PNNL 2017).
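The dependence of drift time and separation quality on path length can be sketched with a back-of-envelope calculation. Everything below is a rough assumption, not PNNL’s design math: the field strength and the mobility value (of an order typical for a reduced-pressure drift cell) are illustrative, and the square-root scaling of resolving power with path length is the common diffusion-limited approximation at fixed field.

```python
import math

# Back-of-envelope IMS sketch: drift time t = L / (K * E) and relative
# resolving power for a ~1 m drift tube vs. a 15 m SLIM serpentine path.
# K and E values are illustrative order-of-magnitude assumptions.

def drift_time_ms(length_m, field_v_per_cm=20.0, mobility_cm2_per_vs=285.0):
    """t = L / (K * E), in milliseconds. Default mobility is roughly a
    typical reduced mobility scaled to a few-Torr drift pressure."""
    velocity_cm_s = mobility_cm2_per_vs * field_v_per_cm  # drift velocity
    return (length_m * 100.0) / velocity_cm_s * 1000.0

def relative_resolving_power(length_m, ref_length_m=1.0):
    """Diffusion-limited approximation at fixed field: R ~ sqrt(L)."""
    return math.sqrt(length_m / ref_length_m)

for length in (1.0, 15.0):
    print(f"{length:5.1f} m path: {drift_time_ms(length):7.1f} ms drift, "
          f"{relative_resolving_power(length):.1f}x resolving power")
```

Under these assumptions, a 15 m serpentine path still completes a separation in a few hundred milliseconds while offering roughly 4x the resolving power of a 1 m tube, which is the intuition behind SLIM’s long, compact ion paths.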

Growth of Metabolomics

Ever since its introduction in the late 1990s, metabolomics has increased in popularity and applicability. It has been widely adopted as a novel approach for biomarker discovery and, in tandem with genomics, has the potential for improving the understanding of underlying causes of pathology (Trivedi et al. 2017). However, in the clinical area, the science can no longer be described as novel, as indicated by the thousands of research articles on metabolomics that are available. In 2012, it was estimated that the National Institutes of Health (NIH) would invest approximately $14.3 million, and more than $51.4 million over 5 years, to accelerate the field of research (NIH 2012). Other government bodies have followed suit, supporting metabolomics activities on the international level, which further emphasizes the promise seen in metabolomics.

Metabolomics is the omics field most closely linked to the host phenotype and, thus, can report on the status of diseases as well as the effect and response to external stimuli (e.g., drug therapy, nutrition, exercise). Even miniscule changes made to the host genome, epigenome, and proteome are easily detected in the metabolome. Some researchers are emphasizing that the metabolome can also be used to detect endogenous changes in response to environmental chemicals from one’s job, diet, or other means.

Approximately 90% of deaths and disease in the United States and developed countries can be attributed to some kind of environmental exposure. When it comes to determining how these environmental exposures may contribute to disease, a major obstacle is the scarcity of publicly available information on the 80,000-plus chemicals registered for commercial use. The use of metabolomics can help, as it can provide a broad, agnostic assessment of the compounds in the respective biosample versus being limited to a preselected chemical or class of chemicals (Betts and Sawyer 2016).

Researchers have cataloged over 42,000 metabolites associated with food, drugs, food additives, phytochemicals, and pollutants (Betts and Sawyer 2016). Metabolomic studies have identified environmentally linked biomarkers related to numerous diseases, to include chronic fatigue syndrome and congenital heart defects. Three online metabolomics databases allow free access to the metabolome: DrugBank, the Human Metabolome Database, and the Toxic Exposome Database (Betts and Sawyer 2016). As both a bioinformatics and cheminformatics resource, DrugBank (2020) contains information on drugs and drug targets, with approximately 2,280 drugs and drug metabolites. The Human Metabolome Database (2020) includes information about small-molecule metabolites found in the human body and contains over 114,000 metabolite entries, with 5,702 protein sequences linked to those entries. The Toxic Exposome Database (2020), formerly the Toxin and Toxin Target Database (T3DB), is a bioinformatics resource that combines detailed toxin data with comprehensive toxin target information and contains approximately 3,670 toxins and environmental pollutants.

Although still evolving as a science compared to other more mature omics, metabolomics has found application in disease profiling, personalized medicine (e.g., drug discovery and drug assessment), toxicology, agriculture, and the environment. For its continued maturation, there are a few objectives that need to be met: (1) improvement in the comprehensive coverage of the metabolome, (2) standardization between laboratories and metabolomics experiments, and (3) enhancement of the integration of metabolomics data with other functional genomic information. The NIH (2012) funding mentioned previously was an effort to increase metabolomics research capacity by funding various initiatives in the area, to include training, technology development, standards synthesis, and data sharing capability for the field.

Founded in 2004, the Metabolomics Society (http://metabolomicssociety.org) is dedicated to promoting the growth, use, and understanding of metabolomics in the life sciences. It is a nonprofit organization with more than 1,000 members in over 40 countries, with focus on promoting the international growth and development of metabolomics, collaboration, opportunities to present research, and the publication of meritorious research (Metabolomics Society 2020). The growth of this society indicates the growing popularity of metabolomics around the world.

Whereas the data compiled from the older omics technologies is readily available and analyzed through electronic databases, a significant amount of metabolomics data is still only resident in books, journals, and other paper archives. Additionally, analytical software used in metabolomics studies is different from any software used in genomics, proteomics, and transcriptomics because of its emphasis on chemicals and analytical chemistry. Furthermore, the expanding use of metabolomics has been accompanied by a significant increase in the number of computational tools available to process and analyze the copious amounts of data it generates, with a goal of providing an automated and standardized operational platform.
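One routine step such computational tools automate can be sketched as follows. The normalization scheme here (scaling each sample to its total signal, then log-transforming) is one common, illustrative choice rather than any specific tool’s method, and the intensity matrix is invented.

```python
import math

# Hypothetical preprocessing step for a sample-by-metabolite intensity
# matrix: normalize each sample (row) to its total signal, then apply a
# log transform to stabilize variance. Values are illustrative.

def normalize_and_log(matrix):
    """Row-normalize to total intensity, then natural-log transform
    (log1p avoids log(0) for absent metabolites)."""
    out = []
    for row in matrix:
        total = sum(row)
        out.append([math.log1p(v / total) for v in row])
    return out

samples = [[100.0, 300.0, 600.0],
           [ 50.0, 150.0, 300.0]]  # half the loading, same relative profile
norm = normalize_and_log(samples)
print(norm[0] == norm[1])  # → True: normalization removes the loading difference
```

Automating steps like this across thousands of features and samples is precisely what the growing ecosystem of metabolomics software aims to standardize.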

 