Tuesday, April 2, 2019

Predicting Effects of Environmental Contaminants

1.1. Debunking some chemical substance myths

In October 2008, the Royal Society of Chemistry announced they were offering £1 million to the first member of the public who could bring in a 100% chemical-free material. This attempt to reclaim the word 'chemical' from the advertising and marketing industries that use it as a synonym for 'poison' was a reaction to a decision of the Advertising Standards Authority to defend an advert perpetuating the myth that natural products are chemical free (Edwards 2008). Indeed, no material, regardless of its origin, is chemical free. A common misconception is that chemicals made by nature are intrinsically good and, conversely, those manufactured by man are bad (Ottoboni 1991). There are many examples of harmful substances produced by algae or other micro-organisms, venomous animals and plants, and even examples of environmental harm resulting from the presence of relatively benign natural compounds either in unexpected places or in unexpected quantities. It is therefore of prime importance to define what is meant by 'chemical' when referring to chemical hazards in this chapter and the rest of this book. The correct term to describe a chemical compound an organism may be exposed to, whether of natural or synthetic origin, is xenobiotic, i.e. a substance foreign to an organism (the term has also been applied to transplants). A xenobiotic can be defined as a chemical which is found in an organism but which is not normally produced or expected to be present in it.
It can also cover substances which are present in much higher concentrations than usual.

A grasp of some of the essential principles of the scientific disciplines that underlie the characterisation of effects associated with exposure to a xenobiotic is required in order to understand the potential consequences of the presence of pollutants in the environment and to critically appraise the scientific evidence. This chapter will attempt to briefly summarise some important concepts of basic toxicology and environmental epidemiology relevant in this context.

1.2. Concepts of fundamental toxicology

Toxicology is the science of poisons. A poison is commonly defined as any substance that can cause an adverse effect as a result of a physicochemical interaction with living tissue (Duffus 2006). The use of poisons is as old as the human race, as a method of hunting or warfare as well as murder, suicide or execution. The evolution of this scientific discipline cannot be separated from the evolution of pharmacology, or the science of cures. Theophrastus Phillippus Aureolus Bombastus von Hohenheim, more commonly known as Paracelsus (1493-1541), a physician contemporary of Copernicus, Martin Luther and da Vinci, is widely considered the father of toxicology. He challenged the ancient concepts of medicine based on the balance of the four humours (blood, phlegm, yellow and black bile) associated with the four elements, and believed illness occurred when an organ failed and poisons accumulated. This use of chemistry and chemical analogies was especially offensive to his contemporary medical establishment. He is famously credited with the dictum that the dose makes the poison, which still underlies present-day toxicology. In other words, all substances are potential poisons, since all can cause injury or death following excessive exposure.
Conversely, this statement implies that all chemicals can be used safely if handled with appropriate precautions and exposure is kept below a defined limit, at which risk is considered tolerable (Duffus 2006). The concepts both of tolerable risk and of adverse effect illustrate the value judgements embedded in an otherwise scientific discipline relying on observable, measurable empirical evidence. What is considered abnormal or adverse is dictated by society rather than science. Any change from the normal state is not necessarily an adverse effect, even if statistically significant. An effect may be considered harmful if it causes damage, irreversible change or increased susceptibility to other stresses, including infectious disease. The stage of development or state of health of the organism may also have an influence on the degree of harm.

1.2.1. Routes of exposure

Toxicity will vary depending on the route of exposure. There are three routes via which exposure to environmental contaminants may occur:

1. Ingestion
2. Inhalation
3. Skin adsorption

Direct injection may also be used in environmental toxicity testing. Toxic and pharmaceutical agents generally produce the most rapid response and greatest effect when given intravenously, directly into the bloodstream. A descending order of effectiveness for environmental exposure routes would be inhalation, ingestion and skin adsorption. Oral toxicity is most relevant for substances that might be ingested with food or drinks. Whilst it could be argued that this is generally under an individual's control, there are complex issues regarding information both about the occurrence of substances in food or water and the current state of knowledge about associated harmful effects. Gases, vapours and dusts or other airborne particles are inhaled involuntarily (with the infamous exception of smoking). The inhalation of solid particles depends upon their size and shape.
In general, the smaller the particle, the further into the respiratory tract it can go. A large proportion of airborne particles breathed through the mouth or cleared by the cilia of the lungs can enter the gut. Dermal exposure generally requires direct and prolonged contact with the skin. The skin acts as a very effective barrier against many external toxicants, but because of its great surface area (1.5-2 m2), some of the many diverse substances it comes into contact with may still elicit topical or systemic effects (Williams and Roberts 2000). Whilst dermal exposure is often most relevant in occupational settings, it may nevertheless be pertinent in relation to bathing waters (ingestion is also an important route of exposure in this context). Voluntary dermal exposure related to the use of cosmetics raises the same questions regarding the adequate communication of current knowledge about potential effects as those related to food.

1.2.2. Duration of exposure

The toxic response will also depend on the duration and frequency of exposure. A single dose of a chemical may produce severe effects, whilst the same total dose given at several intervals may have little if any effect. An example would be to compare the effects of drinking four beers in one evening with those of drinking four beers over four days. Exposure duration is generally divided into four broad categories: acute, sub-acute, sub-chronic and chronic. Acute exposure to a chemical usually refers to a single exposure event or repeated exposures over a duration of less than 24 hours. Sub-acute exposure refers to repeated exposures for 1 month or less, sub-chronic exposure to continuous or repeated exposures for 1 to 3 months or approximately 10% of an experimental species' lifetime, and chronic exposure to exposures for more than 3 months, usually 6 months to 2 years in rodents (Eaton and Klaassen 2001).
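The four duration categories above can be encoded as a small helper. The day boundaries below follow the conventions just cited (Eaton and Klaassen 2001), but the function itself is only an illustrative sketch, not a regulatory definition.

```python
# Classify an exposure duration into the four broad toxicological categories.
# Category boundaries (1 day / 1 month / 3 months) follow the conventions
# described in the text; this is an illustrative sketch only.

def classify_exposure(duration_days: float) -> str:
    """Map an exposure duration in days to a toxicological category."""
    if duration_days < 1:        # single event or repeated doses within 24 h
        return "acute"
    elif duration_days <= 30:    # repeated exposures for 1 month or less
        return "sub-acute"
    elif duration_days <= 90:    # roughly 1 to 3 months
        return "sub-chronic"
    else:                        # more than 3 months
        return "chronic"

print(classify_exposure(0.5))   # acute
print(classify_exposure(21))    # sub-acute
print(classify_exposure(60))    # sub-chronic
print(classify_exposure(365))   # chronic
```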
Chronic exposure studies are designed to assess the cumulative toxicity of chemicals with potential lifetime exposure in humans. In real exposure situations, it is generally very difficult to ascertain with any certainty the frequency and duration of exposure, but the same terms are used. For acute effects, the time component of the dose is not important, as a high dose is responsible for these effects. However, even if acute exposure to agents that are rapidly absorbed is likely to induce immediate toxic effects, it does not rule out the possibility of delayed effects that are not necessarily similar to those associated with chronic exposure, e.g. the latency between exposure to a carcinogenic substance and the onset of certain cancers. It is worth mentioning here that the effect of exposure to a toxic agent may be entirely dependent on the timing of exposure; in other words, long-term effects resulting from exposure during a critically sensitive stage of development may differ widely from those seen if an adult organism is exposed to the same substance. Acute effects are almost always the result of accidents. Otherwise, they may result from criminal poisoning or self-poisoning (suicide). Conversely, whilst chronic exposure to a toxic agent is generally associated with long-term low-level chronic effects, this does not preclude the possibility of some immediate (acute) effects after each administration. These concepts are closely related to the mechanisms of metabolic degradation and elimination of ingested substances and are best illustrated by Figure 1.1. Line A: chemical with very slow elimination. Line B: chemical with a rate of elimination equal to the frequency of dosing. Line C: rate of elimination faster than the dosing frequency. The blue-shaded area is representative of the concentration at the target site necessary to elicit a toxic response.
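The accumulation behaviour behind lines A-C can be reproduced numerically. The sketch below assumes instantaneous absorption and simple first-order elimination, with hypothetical half-lives chosen only to mimic the three regimes; it is not a model of any particular substance.

```python
import math

def concentration_after_doses(dose, n_doses, interval_h, half_life_h):
    """Body burden immediately after the n-th dose, assuming instantaneous
    absorption and first-order elimination (superposition of past doses)."""
    k = math.log(2) / half_life_h          # first-order elimination constant
    return sum(dose * math.exp(-k * i * interval_h) for i in range(n_doses))

# Daily unit doses; hypothetical half-lives mimic lines A-C of Figure 1.1.
for label, t_half in [("A: slow elimination", 240),
                      ("B: elimination ~ dosing rate", 24),
                      ("C: fast elimination", 6)]:
    burden = concentration_after_doses(1.0, 10, 24.0, t_half)
    print(f"{label}: burden after 10 daily doses = {burden:.2f}")
```

With a 240-hour half-life the burden keeps climbing dose after dose (accumulation), with a 24-hour half-life it plateaus near twice the single dose, and with a 6-hour half-life each dose is essentially eliminated before the next one arrives.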
1.2.3. Mechanisms of toxicity

The interaction of a foreign compound with a biological system is two-fold: there is the effect of the organism on the compound (toxicokinetics) and the effect of the compound on the organism (toxicodynamics). Toxicokinetics relates to the delivery of the compound to its site of action, including absorption (transfer from the site of administration into the general circulation), distribution (via the general circulation into and out of the tissues), and elimination (from the general circulation by metabolism or excretion). The target tissue refers to the tissue where a toxicant exerts its effect, which is not necessarily where the concentration of the toxic substance is highest. Many halogenated compounds such as polychlorinated biphenyls (PCBs) or flame retardants such as polybrominated diphenyl ethers (PBDEs) are known to bioaccumulate in body fat stores. Whether such sequestration processes are actually protective to the individual organism, i.e. by lowering the concentration of the toxicant at the site of action, is not clear (O'Flaherty 2000). In an ecological context however, such bioaccumulation may serve as an indirect route of exposure for organisms at higher trophic levels, thereby potentially contributing to biomagnification through the food chain. Absorption of any compound that has not been directly injected intravenously will entail transfer across membrane barriers before it reaches the systemic circulation, and the efficiency of absorption processes is highly dependent on the route of exposure. It is also important to note that distribution and elimination, although often considered separately, take place simultaneously. Elimination itself comprises two kinds of processes, excretion and biotransformation, which also take place simultaneously.
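The absorption-distribution-elimination scheme described above is often approximated with a one-compartment model with first-order absorption and elimination (the Bateman function). The sketch below is such an approximation with entirely hypothetical parameter values; it is meant only to show how the competing rate constants shape the concentration profile.

```python
import math

def bateman(t_h, dose_mg=100.0, f_abs=0.8, v_l=40.0, ka=1.0, ke=0.1):
    """Plasma concentration (mg/L) after a single oral dose in a
    one-compartment model with first-order absorption (ka, per hour)
    and elimination (ke, per hour). All parameter values hypothetical."""
    return (f_abs * dose_mg * ka) / (v_l * (ka - ke)) * (
        math.exp(-ke * t_h) - math.exp(-ka * t_h))

# Time of the concentration peak for this model: tmax = ln(ka/ke) / (ka - ke)
tmax = math.log(1.0 / 0.1) / (1.0 - 0.1)
print(f"tmax = {tmax:.2f} h, Cmax = {bateman(tmax):.2f} mg/L")
```

The concentration is zero at the moment of dosing, rises while absorption outpaces elimination, peaks at tmax, and then declines at the elimination rate.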
Elimination and distribution are not independent of each other, as effective elimination of a compound will prevent its distribution in peripheral tissues, whilst conversely, wide distribution of a compound will impede its excretion (O'Flaherty 2000). Kinetic models attempt to predict the concentration of a toxicant at the target site from the administered dose. Though the ultimate toxicant, i.e. the chemical species that induces structural or functional alterations resulting in toxicity, is often the compound administered (the parent compound), it can also be a metabolite of the parent compound generated by biotransformation processes, i.e. toxication rather than detoxication (Timbrell 2000; Gregus and Klaassen 2001). The liver and kidneys are the most important excretory organs for non-volatile substances, whilst the lungs are active in the excretion of volatile compounds and gases. Other routes of excretion include the skin, hair, sweat, nails and milk. Milk may be a major route of excretion for lipophilic chemicals due to its high fat content (O'Flaherty 2000). Toxicodynamics is the study of the toxic response at the site of action, including the reactions with and binding to cell constituents, and the biochemical and physiological consequences of these actions. Such consequences may therefore be manifested and observed at the molecular or cellular level, at the target organ or in the whole organism. Therefore, although toxic responses have a biochemical basis, the study of toxic response is generally subdivided either according to the organ in which toxicity is observed, including hepatotoxicity (liver), nephrotoxicity (kidney), neurotoxicity (nervous system), pulmonotoxicity (lung), or according to the type of toxic response, including teratogenicity (abnormalities of physiological development), immunotoxicity (immune system impairment), mutagenicity (damage to genetic material) and carcinogenicity (cancer causation or promotion).
The choice of the toxicity endpoint to observe in experimental toxicity testing is therefore of critical importance. In recent years, rapid advances in the biochemical sciences and technology have resulted in the development of bioassay techniques that can contribute invaluable information regarding toxicity mechanisms at the cellular and molecular level. However, the extrapolation of such information to predict effects in an intact organism for the purpose of risk assessment is still in its infancy (Gundert-Remy et al. 2005).

1.2.4. Dose-response relationships

The theory of dose-response relationships is based on the assumptions that the activity of a substance is not an inherent quality but depends on the dose an organism is exposed to, i.e. all substances are inactive below a certain threshold and active over that threshold, and that dose-response relationships are monotonic, i.e. the response rises with the dose. Toxicity may be detected either as an all-or-nothing phenomenon, such as the death of the organism, or as a graded response, such as the hypertrophy of a specific organ. The dose-response relationship involves correlating the severity of the response with exposure (the dose). Dose-response relationships for all-or-nothing (quantal) responses are typically S-shaped, and this reflects the fact that the sensitivity of individuals in a population generally exhibits a normal or Gaussian distribution. Biological variation in susceptibility, with fewer individuals being either hypersusceptible or resistant at either end of the curve and the majority responding between these two extremes, gives rise to a bell-shaped normal frequency distribution.
When plotted as a cumulative frequency distribution, a sigmoid dose-response curve is observed (Figure 1.2). Studying dose-response, and developing dose-response models, is central to determining safe and hazardous levels. The simplest measure of toxicity is lethality, and determination of the median lethal dose, the LD50, is usually the first toxicological test performed with new substances. The LD50 is the dose at which a substance is expected to cause the death of half of the experimental animals, and it is derived statistically from dose-response curves (Eaton and Klaassen 2001). LD50 values are the standard for comparison of acute toxicity between chemical compounds and between species. Some values are given in Table 1.1. It is important to note that the higher the LD50, the less toxic the compound. Similarly, the EC50, the median effective dose, is the quantity of the chemical that is estimated to have an effect in 50% of the organisms. However, median doses alone are not very informative, as they do not convey any information on the shape of the dose-response curve. This is best illustrated by Figure 1.3.
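The point that a median dose hides the shape of the curve can be sketched numerically. A log-logistic tolerance model is assumed here, with hypothetical parameters mirroring the two toxicants of Figure 1.3: A has the lower LD50 but a steep slope, B the higher LD50 but a shallow slope.

```python
# Quantal dose-response under an assumed log-logistic tolerance distribution.
# All parameter values are hypothetical, chosen to mirror Figure 1.3.

def quantal_response(dose, ld50, slope):
    """Fraction of a population responding at a given dose."""
    return 1.0 / (1.0 + (ld50 / dose) ** slope)

ld50_a, slope_a = 10.0, 8.0   # toxicant A: lower LD50, steep curve
ld50_b, slope_b = 30.0, 1.5   # toxicant B: higher LD50, shallow curve

print(quantal_response(10.0, ld50_a, slope_a))  # 0.5 by construction
low_dose = 2.0
print(quantal_response(low_dose, ld50_a, slope_a))  # essentially no responders
print(quantal_response(low_dose, ld50_b, slope_b))  # small but non-zero
```

At the low dose, toxicant B already affects some individuals while A affects virtually none, even though A is "more toxic" by its LD50; conversely, a small dose increment near A's LD50 sweeps most of the population from unaffected to affected.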
While toxicant A appears (always) more toxic than toxicant B on the basis of its lower LD50, toxicant B will start affecting organisms at lower doses (lower threshold), whilst the steeper slope of the dose-response curve for toxicant A means that once individuals become overexposed (exceed the threshold dose), the increase in response occurs over much smaller increments in dose.

Low dose responses

The classical paradigm for extrapolating dose-response relationships at low doses is based on the concept of a threshold for non-carcinogens, whereas it assumes that there is no threshold for carcinogenic responses and a linear relationship is hypothesised (Figures 1.4 and 1.5). The NOAEL (No Observed Adverse Effect Level) is the exposure level at which there is no statistically or biologically significant increase in the frequency or severity of adverse effects between the exposed population and its appropriate control. The NOAEL for the most sensitive test species and the most sensitive indicator of toxicity is usually employed for regulatory purposes. The LOAEL (Lowest Observed Adverse Effect Level) is the lowest exposure level at which there is a statistically or biologically significant increase in the frequency or severity of adverse effects between the exposed population and its appropriate control. The main criticism of the NOAEL and LOAEL is that they are dependent on study design, i.e. the dose groups selected and the number of individuals in each group. Statistical methods of deriving the concentration that produces a specific effect (ECx), or a benchmark dose (BMD), the statistical lower confidence limit on the dose that produces a defined response (the benchmark response or BMR), are increasingly preferred. Understanding the risk that environmental contaminants pose to human health requires the extrapolation of limited data from animal experimental studies to the low doses typically encountered in the environment.
Such extrapolation of dose-response relationships at low doses is the source of much controversy. Recent advances in the statistical analysis of very large populations exposed to ambient concentrations of environmental pollutants have however not observed thresholds for cancer or non-cancer outcomes (White et al. 2009). The actions of chemical agents are triggered by complex molecular and cellular events that may lead to cancer and non-cancer outcomes in an organism. These processes may be linear or non-linear at an individual level. A thorough understanding of critical steps in a toxic process may help refine current assumptions about thresholds (Boobis et al. 2009). The dose-response curve however describes the response, or variation in sensitivity, of a population. Biological and statistical attributes such as population variability, or additivity to pre-existing conditions or diseases induced at background exposure, will tend to smooth and linearise the dose-response relationship, obscuring individual thresholds.

Hormesis

Dose-response relationships for substances that are essential for normal physiological function and survival are actually U-shaped. At very low doses, adverse effects are observed due to a deficiency. As the dose of such an essential nutrient is increased, the adverse effect is no longer detected and the organism can function normally in a state of homeostasis. Abnormally high doses, however, can give rise to a toxic response. This response may be qualitatively different, and the toxic endpoint measured at very low and very high doses is not necessarily the same. There is evidence that non-essential substances may also exert an effect at very low doses (Figure 1.6). Some authors have argued that hormesis ought to be the default assumption in the risk assessment of toxic substances (Calabrese and Baldwin 2003). Whether such low dose effects should be considered stimulatory or beneficial is controversial.
Further, the potential implications of the concept of hormesis for the risk management of combinations of the wide variety of environmental contaminants present at low doses, to which individuals with variable sensitivity may be exposed, are at best unclear.

1.2.5. Chemical interactions

In regulatory hazard assessment, chemical hazards are typically considered on a compound-by-compound basis, the possibility of chemical interactions being accounted for by the use of safety or uncertainty factors. Mixture effects still represent a challenge for the risk management of chemicals in the environment, as the presence of one chemical may alter the response to another chemical. The simplest interaction is additivity: the effect of two or more chemicals acting together is equivalent to the sum of the effects of each chemical in the mixture when acting independently. Synergism is more complex and describes a situation where the presence of both chemicals causes an effect that is greater than the sum of their effects when acting alone. In potentiation, a substance that does not produce specific toxicity on its own increases the toxicity of another substance when both are present. Antagonism is the principle upon which antidotes are based, whereby a chemical can reduce the harm caused by a toxicant (James et al. 2000; Duffus 2006). Mathematical illustrations and examples of known chemical interactions are given in Table 1.2.

Table 1.2. Mathematical representations of chemical interactions (reproduced from James et al., 2000)

Effect | Hypothetical mathematical illustration | Example
Additive | 2 + 3 = 5 | Organophosphate pesticides
Synergistic | 2 + 3 = 20 | Cigarette smoking + asbestos
Potentiation | 2 + 0 = 10 | Alcohol + carbon tetrachloride
Antagonism | 6 + 6 = 8, or 5 + (-5) = 0, or 10 + 0 = 2 | Toluene + benzene; Caffeine + alcohol; Dimercaprol + mercury

There are four main ways in which chemicals may interact (James et al. 2000):
1. Functional: both chemicals have an effect on the same physiological function.
2. Chemical: a chemical reaction between the two compounds affects the toxicity of one or both compounds.
3. Dispositional: the absorption, metabolism, distribution or excretion of one substance is increased or decreased by the presence of the other.
4. Receptor-mediated: when two chemicals have differing affinity and activity for the same receptor, competition for the receptor will modify the overall effect.

1.2.6. Relevance of animal models

A further complication in the extrapolation of the results of toxicological experimental studies to humans, or indeed other untested species, is related to the anatomical, physiological and biochemical differences between species. This paradoxically requires some previous knowledge of the mechanism of toxicity of a chemical and of the comparative physiology of the different test species. When adverse effects are detected in screening tests, these should be interpreted with the relevance of the animal model chosen in mind. For the derivation of safe levels, safety or uncertainty factors are again usually applied to account for the uncertainty surrounding inter-species differences (James et al. 2000; Sullivan 2006).

1.2.7. A few words about doses

When discussing dose-response, it is also important to understand which dose is being referred to, and to differentiate between concentrations measured in environmental media and the concentration that will elicit an adverse effect at the target organ or tissue. The exposure dose in a toxicological testing setting is generally known, or can be readily derived or measured from concentrations in media and average uptake (of food or water, for example) (Figure 1.7). Whilst toxicokinetics helps to develop an understanding of the relationship between the internal dose and a known exposure dose, relating concentrations in environmental media to the actual exposure dose, often via multiple pathways, is in the domain of exposure assessment.
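A minimal sketch of the media-to-dose step of exposure assessment just described: converting a measured concentration in drinking water into an average daily dose. The intake rate and body weight below are illustrative assumptions, not reference values, and complete absorption is assumed.

```python
# Relating a concentration in an environmental medium (drinking water) to an
# average daily dose. All parameter values are hypothetical assumptions.

def average_daily_dose(conc_mg_per_l, intake_l_per_day, body_weight_kg):
    """Average daily dose (mg per kg body weight per day) from drinking
    water, assuming complete absorption of the ingested substance."""
    return conc_mg_per_l * intake_l_per_day / body_weight_kg

# 0.005 mg/L contaminant, 2 L/day water intake, 70 kg adult:
add = average_daily_dose(0.005, 2.0, 70.0)
print(f"{add:.6f} mg/kg/day")
```

A full assessment would sum such contributions over all relevant pathways (water, food, air, dermal contact), each with its own concentration and uptake terms.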
1.2.8. Other hazard characterisation criteria

Before continuing further, it is important to clarify the difference between hazard and risk. Hazard is defined as the potential to produce harm; it is therefore an inherent qualitative attribute of a given chemical substance. Risk, on the other hand, is a quantitative measure of the magnitude of the hazard and the probability of it being realised. Hazard assessment is therefore the first step of risk assessment, followed by exposure assessment and finally risk characterisation. Toxicity is not the sole quantity evaluated for hazard characterisation purposes. Some chemicals have been found in the tissues of animals in the Arctic, for example, where these substances of concern have never been used or produced. The realisation that some pollutants were able to travel far distances across national borders because of their persistence, and to bioaccumulate through the food web, led to the consideration of such inherent properties of organic compounds alongside their toxicity for the purpose of hazard characterisation. Persistence is the result of resistance to environmental degradation mechanisms such as hydrolysis, photodegradation and biodegradation. Hydrolysis only occurs in the presence of water, photodegradation in the presence of UV light, and biodegradation is primarily carried out by micro-organisms. Degradation is related to water solubility, itself inversely related to lipid solubility; therefore persistence tends to be correlated with lipid solubility (Francis 1994). The persistence of inorganic substances has proved more difficult to define, as they cannot be degraded to carbon dioxide and water. Chemicals may accumulate in environmental compartments and constitute environmental sinks from which they could be re-mobilised and lead to effects. Further, whilst a substance may accumulate in one species without adverse effects, it may be toxic to its predator(s).
Bioconcentration refers to the accumulation of a chemical from its surrounding environment rather than specifically through food uptake. Conversely, biomagnification refers to uptake from food without consideration of uptake through the body surface. Bioaccumulation integrates both paths, surrounding medium and food. Ecological magnification refers to an increase in concentration through the food web from lower to higher trophic levels. Again, accumulation of organic compounds generally involves transfer from a hydrophilic to a hydrophobic phase and correlates well with the n-octanol/water partition coefficient (Herrchen 2006). The persistence and bioaccumulation of a substance are evaluated by standardised OECD tests. Criteria for the identification of persistent, bioaccumulative and toxic substances (PBT), and very persistent and very bioaccumulative substances (vPvB), as defined in Annex XIII of the European Regulation on the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) (European Union 2006), are given in Table 1.3. To be classified as a PBT or vPvB substance, a given compound must meet all criteria.

Table 1.3. REACH criteria for identifying PBT and vPvB chemicals

Criterion | PBT criteria | vPvB criteria
Persistence | Either: half-life > 60 days in marine water; half-life > 40 days in fresh or estuarine water; half-life > 180 days in marine sediment; half-life > 120 days in fresh or estuarine sediment; or half-life > 120 days in soil | Either: half-life > 60 days in marine, fresh or estuarine water; half-life > 180 days in marine, fresh or estuarine sediment; or half-life > 180 days in soil
Bioaccumulation | Bioconcentration factor (BCF) > 2000 | Bioconcentration factor (BCF) > 5000
Toxicity | Either: chronic no-observed effect concentration (NOEC) < 0.01 mg/L; the substance is classified as carcinogenic (category 1 or 2), mutagenic (category 1 or 2) or toxic for reproduction (category 1, 2 or 3); or there is other evidence of endocrine disrupting effects | (not applicable)
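The screening logic of Table 1.3 lends itself to a simple encoding. The sketch below is deliberately simplified: it checks only the freshwater, sediment and soil half-life criteria and reduces toxicity to the chronic NOEC criterion, ignoring the CMR and endocrine-disruption routes; thresholds follow the REACH Annex XIII values as summarised above.

```python
# Illustrative PBT screening against (a simplified subset of) the REACH
# Annex XIII criteria summarised in Table 1.3. Not a regulatory tool.

def is_persistent(t_half_water_d, t_half_sediment_d, t_half_soil_d):
    """PBT persistence: exceeding any one compartment threshold suffices."""
    return (t_half_water_d > 40 or t_half_sediment_d > 120
            or t_half_soil_d > 120)

def is_bioaccumulative(bcf):
    return bcf > 2000          # bioconcentration factor criterion

def is_toxic(noec_mg_l):
    return noec_mg_l < 0.01    # chronic NOEC criterion only

def is_pbt(t_half_water_d, t_half_sediment_d, t_half_soil_d, bcf, noec_mg_l):
    """All three criteria (P, B and T) must be met."""
    return (is_persistent(t_half_water_d, t_half_sediment_d, t_half_soil_d)
            and is_bioaccumulative(bcf)
            and is_toxic(noec_mg_l))

# Hypothetical substance: persistent in sediment, BCF 5200, NOEC 0.002 mg/L
print(is_pbt(30, 150, 90, 5200, 0.002))   # True
print(is_pbt(30, 100, 90, 5200, 0.002))   # fails persistence -> False
```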
1.3. Some notions of environmental epidemiology

A complementary, observational approach to the study of scientific evidence of associations between environment and disease is epidemiology. Epidemiology can be defined as "the study of how often diseases occur and why, based on the measurement of disease outcome in a study sample in relation to a population at risk" (Coggon et al. 2003). Environmental epidemiology refers to the study of patterns of disease and health related to exposures that are exogenous and involuntary. Such exposures generally occur in the air, water, diet or soil, and include physical, chemical and biological agents. The extent to which environmental epidemiology is considered to include social, political, cultural, and engineering or architectural factors affecting human contact with such agents varies according to authors. In some contexts, the environment can refer to all non-genetic factors, although dietary habits are generally excluded, despite the facts that some deficiency diseases are environmentally determined and that nutritional status may also modify the impact of an environmental exposure (Steenland and Savitz 1997; Hertz-Picciotto 1998). Most of environmental epidemiology is concerned with endemics, in other words acute or chronic disease occurring at relatively low frequency in the general population due partly to a common and often unsuspected exposure, rather than epidemics, or acute outbreaks of disease affecting a limited population shortly after the introduction of an unusual known or unknown agent. Measuring such low-level exposure of the general public may be difficult if not impossible, especially when seeking historical estimates of exposure to predict future disease.
Estimating very small changes in the incidence of health effects of low-level, common, multiple exposures on common diseases with multifactorial etiologies is particularly difficult, because greater variability may often be expected for other reasons, and environmental epidemiology has to rely on natural experiments that, unlike controlled experiments, are subject to confounding by other, often unknown, risk factors. It may still be of importance from a public health perspective, however, as small effects in a large population can have large attributable risks if the disease is common (Steenland and Savitz 1997; Coggon et al. 2003).

1.3.1. Definitions

What is a case?

The definition of a case generally requires a dichotomy, i.e. for a given condition, people can be divided into two discrete classes: the affected and the non-affected. It increasingly appears that diseases exist in a continuum of severity within a population rather than as an all-or-nothing phenomenon. For practical reasons, a cut-off point to divide the diagnostic continuum into cases and non-cases is therefore required. This can be done on a statistical, clinical, prognostic or operational basis. On a statistical basis, the norm is often defined as within two standard deviations of the age-specific mean, thereby arbitrarily fixing the frequency of abnormal values at around 5% in every population. Moreover, it should be noted that what is usual is not necessarily good. A clinical case may be defined by the level of a variable above which symptoms and complications have been found to become more frequent. On a prognostic basis, some clinical findings may carry an adverse prognosis, yet be symptomless. When none of the other approaches is satisfactory, an operational threshold will need to be defined, e.g. based on a threshold for treatment (Coggon et al. 2003).

Incidence, prevalence and mortality

The incidence of a disease is the rate at which new cases occur in a population during a specified period:

Incidence = (number of new cases) / (population at risk x period of observation)

The prevalence of a disease is the proportion of a population that are cases at a given point in time. This measure is appropriate only in relatively stable conditions and is unsuitable for acute disorders. Even in a chronic disease, the manifestations are often intermittent, and a point prevalence will tend to underestimate the frequency of the condition. A better measure, when possible, is the period prevalence, defined as the proportion of a population that are cases at any time within a stated period.
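The incidence and prevalence measures defined above can be illustrated with a toy calculation; all numbers are hypothetical.

```python
# Toy illustration of the incidence and prevalence measures defined above.

def incidence_rate(new_cases, population_at_risk, years):
    """New cases per person-year of observation."""
    return new_cases / (population_at_risk * years)

def point_prevalence(current_cases, population):
    """Proportion of the population that are cases at one point in time."""
    return current_cases / population

# 50 new cases over 2 years in a population at risk of 10,000:
print(incidence_rate(50, 10_000, 2))     # 0.0025 cases per person-year
# 400 existing cases in a population of 10,000:
print(point_prevalence(400, 10_000))     # 0.04
```

Note that incidence carries a time dimension (cases per person-year) while prevalence is a dimensionless proportion, which is why prevalence alone is uninformative for acute, short-lived disorders.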
