Visionary Ideas: Research Methodologies Tools and Resources
New Strategic Plan
- Idea 1: Alternative methodology for derma/photocarcinogenicity studies
- Idea 2: At the bleeding edge: benchmarking next-gen nanotox protocols
- Idea 3: Autism and vaccines: what does the data actually say?
- Idea 4: Build robust environmental data networks
- Idea 5: Collecting data on human exposure to environmental pollutants
- Idea 6: Collecting public health data
- Idea 7: Community based environment research pods
- Idea 8: Confront GLP & OECD chronic tox protocols
- Idea 9: Cumulative impacts research
- Idea 10: Develop models assessing exposures to mixtures
- Idea 11: Enable the exposome
- Idea 12: Exposure biology is important
- Idea 13: Focus toxicity testing on replacement chemicals and newly identified contaminants
- Idea 14: The Human Microbiome Project and environmental health sciences
- Idea 15: The ICCVAM needs new leadership
- Idea 16: Identify early indicators of disease associated with toxic exposure
- Idea 17: Include EDC, PBT, low-dose and immunotoxicity in screening tests
- Idea 18: Incorporating powerful new physiological methods
- Idea 19: Incorporation of informatics in environmental health sciences
- Idea 20: Integrate exposure assessment with mHealth
- Idea 21: Integration of NTP molecular expression, tox/path, and FFPE data
- Idea 22: Names of people who nominate substances should be made public
- Idea 23: New assays are essential to identify environmental toxicants
- Idea 24: Next generation technology for environmental health studies
- Idea 25: Promote research that uses bacteria to clean up toxic spills
- Idea 26: Screening tests for mercury
- Idea 27: Strategy for using predictive toxicity testing for regulations
- Idea 28: Strategy to incorporate EDC, PBT, low dose endpoints in tox test
- Idea 29: Switchable solvent
Idea 1: Alternative methodology for derma/photocarcinogenicity studies
The NIEHS-FDA NTP Center for Phototoxicity (NCP) focuses on the toxic effects of chemicals in combination with sunlight (phototoxicology) and on changes in UV radiation-induced skin cancer caused by chemicals or other applied agents (photocarcinogenicity). The only animal model currently available in compliance with GLP for photocarcinogenesis studies is the SKH-1 albino hairless mouse under simulated solar light with topical application. However, this model has not been validated, and the mechanistic understanding it provides is limited (Forbes 1996). Thus, the predictivity of the rodent photocarcinogenicity model for humans (especially when existing mechanistic, mode-of-action data demonstrate that the mouse is not a relevant species) is unclear.
The NCP methodology for phototoxicity/photocarcinogenesis should incorporate specific mechanistic information about the chemical to be tested. For example, if an extensive mechanistic dataset exists for the chemical of concern, the use of in vitro confirmatory mechanistic studies, including photogenotoxicity, or long-term in vivo studies using a different animal species, should be regarded as 'sufficient' to demonstrate the absence of carcinogenic effects. The in vitro findings would indicate when the chemical is ready for whole-animal studies, if required, with justification provided by comparison with human results.
Mechanistic data may show that a different animal species (other than the mouse) is more relevant to humans; in such cases, the more relevant species should be allowed. In the past, several animal species have served as surrogate models for the human response to topically applied chemicals and solar light, including the South American opossum (Monodelphis domestica), hairless rats and guinea pigs, and shaved conventional or transgenic animals (2-5). Selection of the best animal model should also take gender and age into consideration.
Other factors should also be considered in designing a photocarcinogenesis study. One is determining the characteristics controlling any interaction between the solar spectrum (and its parts) and the chemical in question, including, but not limited to, the time course, the rate of formation of potential metabolites, and the characteristics of those metabolites. Another is determining the inherent skin-binding ability of the chemical, including rate, location, duration, and species differences (including human), followed by changes induced by the solar spectrum. Finally, it would be relevant to determine the likelihood of biologic alteration (e.g., of DNA, RNA, or protein) by the chemical and then the influence of the solar spectrum. Inherent in all of this is selection of a dose or preparation that minimizes the likelihood of skin irritation as a confounder.
This does not remove the possibility that irritation, to some degree, may be wanted as a variable, both in vivo and in the early stages described above.
Additional research from NIEHS might be necessary to evaluate the effect of these factors and to review existing animal studies that may be relevant to expand the use of animal models for the photocarcinogenesis study.
- Forbes, P.D. (1996). Relevance of animal models of photocarcinogenesis to humans. Photochem. Photobiol. 63, 357-362.
- Davies, R.E., and Forbes, P.D. (1986). Effect of UV radiation on survival of non-haired mice. Photochem. Photobiol. 43, 267-274.
- DeGruijl, F.R., and Forbes, P.D. (1995). UV-induced skin cancer in a hairless mouse model. Bioessays 17, 651-660.
- Agarwal, R., and Mukhtar, H. (1996). Chemoprevention of photocarcinogenesis. Photochem. Photobiol. 63, 440-444.
- Ley, R.D. (2002). Animal models of ultraviolet radiation (UVR)-induced cutaneous melanoma. Frontiers in Bioscience 7, 1531-1534.
Idea 2: At the bleeding edge: benchmarking next-gen nanotox protocols
For all the razzmatazz accorded Tox21's recent implementation of a state-of-the-art robotic system at NIH's Chemical Genomics Center, it advances the sorry state of the science in nanotoxicology research by not one nanometre, and it contributes negligibly to elucidating the far more complex interactions of manufactured nanoforms and the emergent nanophysiological dynamics that can give rise to nanotoxicity.
While knowledge of molecular composition has been fundamental to standard, established hazard evaluations, risk assessments, and regulation-setting, the NIEHS ToxCast High-Throughput Screening Initiative's orientation toward basic chemical-composition conventions and comparatively simplistic observations of substance behaviour sidesteps the urgent challenge: refined, relevant fate and exposure studies of engineered nanoscale materials that account for their specific potential toxicological synergies and structures and their peculiar emergent properties, processes, and patterns.
Proposed here are considerations for optimizing the National Toxicology Program's Nanotechnology Safety Initiative and its NanoHealth Enterprise Framework, so that it moves meaningfully and measurably toward structure and device standardisations while keeping pace with state-of-knowledge advances during the five-year Strategic Plan period and beyond.
- Devise a discrete predictive system by creating a NEW PREDICTIVE PROTOTYPE that integrates 3D-QSAR/QSPR modelling, ex vivo procedures, virtual compound-screening methodologies, and 3D pharmacophore mapping, with a view toward scaling out and up to PBPK-type modelling. Tightly tune well-defined nanomaterials domain descriptors to increase hit rates and design new assays [as per Burello and Worth].
- Create a new, definitive NANOPARTICLE-PROTEIN CORONA CLASSIFICATION SYSTEM. Design particle-specific delineations according to morbid, mortal, latent, lethal, desirable and deleterious, acute and chronic, and naturally occurring and fabricated exposure-toxicity differentials. Link corona characterizations to the toxic-causality identifications above and incorporate them into the progressively predictive QSAR/PBPK prototype system. Improve testing correlations and cross-species outcomes. Inform safety-by-design mandates based on particle-protein specifications and nano-component locale.
Focus on corona compartmentalization [as per Faunce, White, and Matthaei] to minimize or eliminate toxicity and to reduce the cacophony of confusion on the matter. The corona is where defect, deformation, dysfunction, disruption, disease, and death will most readily reveal themselves.
- Make revolutionary NANOMETROLOGY the sine qua non.
- Move ever forward towards a TRULY TRANSLATIONAL approach.
- Set the standard for toxic torts, and then RAISE IT; NONE of which can happen without a dedicated NANOINFORMATICS unit devoted strictly to the NTP.
- REFUSE to fail upwards. RENDER UNTO THE SCRAP HEAP all UNWORTHY, UNWORKABLE, UNSCALABLE projects. RETURN that REVENUE and those resources to permanent nanotox line item status.
Idea 3: Autism and vaccines: what does the data actually say?
There are a very large number of studies on the question of whether vaccines, or the mercury in them, cause autism.
These are generally mischaracterized as 'proving' no causation.
In fact, most are in the ambiguous zone where the p-value lies between 0.05 and 0.5, wherein the study suggests it is more likely than not that the hypothesis of causality is true, but not so likely that it is prudent to conclude so.
Most of these studies had inadequate sample sizes for the methodology used, so that the 95% CI included odds ratios consistent both with causality and with no association.
There is also the issue raised by DeSoto and Hitlan that some studies did not accurately report their results.
I suggest that all relevant studies be compiled and a table published showing the p-value, N, and a one-sentence summary of each hypothesis (at most a few person-years of work, suitable for Master's students); then that the data be obtained from the original authors and the statistical calculations verified or corrected, with the original or a subsequent publication recording the status of these efforts (perhaps ten person-years, more suitable for PhD students). Most likely much of the data will not be available, and this fact should be noted and made public. This straightforward and tractable project is well within your budget and would go a long way toward bridging the gap between the public perception that vaccines are dangerous and the vaccine-injury proceedings were a kangaroo court, and the scientific perception that the journal literature proves vaccines are safe.
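The sample-size point above can be made concrete with a small sketch. This uses Woolf's approximate 95% confidence interval for an odds ratio; the counts are invented for illustration, not drawn from any actual vaccine study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Woolf's approximate 95% CI for the odds ratio of a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls. (Illustrative only.)"""
    or_est = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_est) - z * se_log)
    hi = math.exp(math.log(or_est) + z * se_log)
    return or_est, lo, hi

# A small study: the 95% CI spans 1.0, consistent with both causality
# (OR > 1) and no association (OR = 1).
print(odds_ratio_ci(12, 88, 8, 92))
# The same proportions at 10x the sample size: the CI tightens and
# excludes 1.0, showing that N, not the effect estimate, drove the ambiguity.
print(odds_ratio_ci(120, 880, 80, 920))
```

A compiled table of p-value, N, and confidence interval per study, as suggested above, would make this distinction visible at a glance.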
Idea 4: Build robust environmental data networks
Environmental health research requires usable and accessible data from the meteorological, hydrological, and ecological disciplines. In a changing climate, geographically scaled environmental data in particular could support research on emerging conditions triggered by environmental deterioration and toxins. Making the suppliers of environmental data aware of its utility to health research is one step toward ensuring continuity of data collection and the delivery of usable, beneficial data.
I think that making this information available publicly and transparently will go a long way in advancing research efforts.
There is a plethora of open, transparent sources available now. The issue is having a public interested enough in educating itself with such available sources.
Idea 5: Collecting data on human exposure to environmental pollutants
I have an idea for how to collect data from people who are exposed to environmental detriments such as air pollutants (CO, SO2, NOx, etc.).
What if people wore a type of breathing apparatus in their nose that detected which substances were inhaled by that person?
This could be accompanied by follow-up health studies to see how the inhaled substances affect the health of the person.
Then we would know how specific pollutants affect human health, and if preventative measures were taken (e.g. using electric vehicles rather than fossil-fuel ones) we could see if they work to improve human health.
Idea 6: Collecting public health data
I have an idea of how to collect data about what types of electromagnetic radiation people are exposed to in their daily lives.
People could wear electromagnetic sensors (photodetectors) on their heads to find out what types of radiation they're exposed to, and for how long, during their daily travels. For example, when one goes out on the beach on a sunny day in Florida, the types of radiation the person was exposed to would be recorded and for how long.
Idea 7: Community based environment research pods
I would like to see a pilot study that identifies community members, educates them about the needs of the study, and forms environmental research pods to track environmental factors and activities in their communities, including possible environmental causes of cancer. That information could then be used to map environmental-factor activity and cancers in the pods' areas. Once the research pods are developed, they can also be used for community health education and outreach. We need to find more ways to bring community-based approaches into our research, health education, and outreach practices, and to empower our communities with these tools, because they are the eyes and ears of our communities.
Idea 8: Confront GLP & OECD chronic tox protocols
For decades, not one NIH/NIEHS or other government-funded piece of toxicity research has been used in a pre-marketing risk assessment (RA) of a toxic agent (post-marketing, such independent literature is used only slightly, while industry's studies still get too much respect).
The cause of this horrifying waste of billions of dollars and of health is simple: Good Laboratory Practices (GLP). Regulatory agencies have defined high-quality data as GLP data, and since independent researchers neither need nor like GLP, GLP completely shields industry from the inconvenient truth about toxic agents, instead allowing a manufacturer to say how toxic the agent it earns millions or billions of dollars from is!
NIEHS should both sponsor comparative (GLP vs. independent academia) toxicity research and urge regulatory agencies to discuss the elements of good-quality data, and should work to finally get independent research used in RA!
Most aggravating is the classic chronic tox test's killing of the test animals before almost any chronic disease is allowed to develop; even the NTP uses this outrageous protocol. Adopt the Ramazzini Foundation's protocol!
Idea 9: Cumulative impacts research
NIEHS should fund research on better methodologies for cumulative impacts/risk assessment.
Idea 10: Develop models assessing exposures to mixtures
Biologically based dose-response models can be used for trans-species extrapolation of toxic or carcinogenic effects, and can address inter-individual differences in susceptibility as well as the effects of exposures to mixtures. Studies of time dependence should cover the interval between exposure and elimination of the agent under study, at least over a 24-hour cycle (longer for bio-accumulating agents or for agents whose continuous exposure affects their metabolic elimination), and at multiple life stages to capture the effects of age-related changes.
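The 24-hour-cycle point can be illustrated with a minimal one-compartment, first-order elimination sketch. This is a standard textbook kinetics model, not a full biologically based dose-response model, and the doses, half-lives, and units are invented for illustration.

```python
import math

def body_burden(t_hours, dose, half_life_h, interval_h, n_doses):
    """Body burden at time t under repeated dosing with first-order
    elimination (one-compartment sketch; arbitrary units)."""
    k = math.log(2) / half_life_h  # elimination rate constant
    total = 0.0
    for i in range(n_doses):
        t_dose = i * interval_h
        if t_hours >= t_dose:
            total += dose * math.exp(-k * (t_hours - t_dose))
    return total

# One week of daily unit doses. A rapidly cleared agent (half-life 4 h)
# is nearly eliminated within each 24-hour cycle; a bio-accumulating
# agent (half-life 200 h) builds up across doses.
fast = body_burden(24 * 7, dose=1.0, half_life_h=4, interval_h=24, n_doses=7)
slow = body_burden(24 * 7, dose=1.0, half_life_h=200, interval_h=24, n_doses=7)
print(fast, slow)
```

The contrast between the two cases is why sampling only once per 24-hour cycle is sufficient for rapidly cleared agents but not for bio-accumulating ones.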
Idea 11: Enable the exposome
NIEHS needs to develop new tools and technologies to allow measurement of 'who' is exposed to 'how much' of 'what'. This includes tools for 'point of contact' assessment (along the lines of what has been supported in Exposure Biology), traditional biomonitoring, and development of new lab-on-a-chip technologies for biomonitoring.
What is the "exposome"? I think we have enough new terminologies but very little substance. If the exposome is just another type of genome research, I think I have pinpointed exactly the problem.
The exposome is a characterization of all of the components of an individual's environment and allows an investigation of how those exposures influence disease.
There have been a number of publications over the past six years describing the concept and its potential in environmental health; see Rappaport (2011), JESEE 21(1):5-9, and Wild (2005), Cancer Epidemiol Biomarkers Prev 14(8):1847-50. There was also a meeting at the National Academy of Sciences last year on this topic: http://delsold.nas.edu/envirohealth/exposome.shtml. Hope that helps to shed some light.
Idea 12: Exposure biology is important
NIEHS needs to build on its investments in exposure biology and exposure measurement.
In particular we should push towards meaningful Gene-Environment interaction studies building on the Exposure Biology Program's capacity in measuring the personal environment including chemical exposures, diet, activity and stress.
Idea 13: Focus toxicity testing on replacement chemicals and newly identified contaminants
We are in desperate need of research to understand the potential health impacts of more recently recognized contaminants such as polybrominated diphenyl ethers (PBDEs), perfluorinated chemicals (e.g., PFOA and PFOS), halogenated flame retardants, nanomaterials, cyclosiloxanes, and newer plasticizers that are replacements for phthalates and other plasticizers. Research strategies must include studies to determine the effects of in utero and perinatal exposures on the later development of childhood and adult diseases. Improved understanding of the health effects of less-studied chemicals that are still in commerce, and to which many people are exposed, will yield a tangible worker and public health impact.
Idea 14: The Human Microbiome Project and environmental health sciences
I would like to see "Microbiomes" be a major vision of NIEHS. In the last 4-5 years we have discovered over 500 living microbial genomes in our human bodies, thanks to the vision and research of the NIH, which is currently supporting the study of 400 microbial genomes. How many illnesses can we cure by knowing the interactions of hundreds of living organisms in our bodies? What cancers and viruses can we cure with this new knowledge? The first stethoscope was introduced in 1816, and insulin was first used as a medicine for diabetes in 1922. If there was ever a great investment for all of mankind, this is it: microbiomes. What role does the environment play with the over 500 microbiomes we have already discovered?
Enclosed is an article published by the National Institutes of Health on June 23, 2009:
“NIH Expands Human Microbiome Project, Funds Sequencing Centers and Disease Projects
The Human Microbiome Project has awarded more than $42 million to expand its exploration of how the trillions of microscopic organisms that live in or on our bodies affect our health, the National Institutes of Health (NIH) announced today.
The human microbiome is all the microorganisms that reside in or on the human body, as well as all their DNA, or genomes. Launched in 2007 as part of the NIH Common Fund’s Roadmap for Medical Research, the Human Microbiome Project is a $140 million, five-year effort that will produce a resource for researchers who are seeking to use information about the microbiome to improve human health.
"This effort will accelerate our understanding of how our bodies and microorganisms interact to influence health and disease," said Acting NIH Director Raynard S. Kington, M.D., Ph.D. "Examining the differences between the microbiomes of healthy patients and those of patients suffering from a disease promises to change how we diagnose, treat and, ultimately, prevent many health conditions." In the new round of funding, the Human Microbiome Project will support the work of the large-scale DNA sequencing centers that participated in the initial phase of the project. These centers will work together to sequence at least 400 microbial genomes. Another approximately 500 microbial genomes are already completed or in sequencing pipelines and supported by individual NIH institutes and internationally funded projects. These data will then be used to characterize the microbial communities found in samples taken from healthy human volunteers. These samples are currently being collected by the Human Microbiome Project from five areas of the body: the digestive tract, the mouth, the skin, and the nose (one of the five left out).
The Human Microbiome Project’s large-scale sequencing centers, their principal investigators, and approximate funding levels over four years are:”
In 1969 we landed on the Moon. This is another world that will advance all mankind.
Idea 15: The ICCVAM needs new leadership
The Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) Authorization Act of 2000 stipulates that one of the primary purposes of ICCVAM is to "ensure that new and revised test methods are validated to meet the needs of Federal agencies." Yet under the leadership of Dr. William Stokes, ICCVAM has been repeatedly unresponsive to agency needs. For example, ICCVAM:
- failed to review a single non-animal test method under the EPA's Endocrine Disruptor Screening Program;
- failed to appropriately review an industry-initiated and sponsored program to use non-animal methods to test for eye irritation with anti-microbial pesticides;
- ignored EPA and FDA nominations for members to its Scientific Advisory Committee on Alternative Toxicological Methods; and
- is actively campaigning to stop the U.S. from adopting the Globally Harmonized System of labeling put forth by the United Nations and the World Health Organization – a decision which is not within ICCVAM's purview and which, if adopted, would spare animals used in skin irritation tests.
With great strides being made in the development of non-animal test methods and an urgent need to put this science to use to protect workers and the public, the demand for rapid and effective validation of alternative methods has never been greater. ICCVAM could contribute far more effectively than its past record demonstrates; however, it cannot do so under its current leadership. PETA calls for the replacement of Dr. Stokes with someone genuinely committed to meeting the goals of ICCVAM and promoting modern alternatives to animal testing.
Idea 16: Identify early indicators of disease associated with toxic exposure
Research is needed to identify early clinical biomarkers of effect associated with toxic chemical exposure. These biomarkers could be signatures of alterations in toxicity pathways that can be clearly linked to disease endpoints. Ultimately this can be used to identify early indicators of health effects in people that are exposed to toxic chemicals in their workplaces, homes, and communities. These data can potentially be paired with biomonitoring data in cross-sectional surveys. The goal would be to chip away at the long lag time between exposure and disease by identifying markers that occur in closer proximity to the exposure event.
Pliny the Elder described a sickness of the lungs of slaves who wove asbestos into cloth. Given that, and similar associations (scrotal cancer in chimney sweeps, mercury-mediated madness in hatters, et cetera), the time is now to use extant information to bolster our growing set of indicators.
This visionary idea should join the number-one issue on this site, Chemical Sensitivity and Intolerance, begun by Mary Lamielle. I have been studying the effects of chemicals on human beings since I was sprayed with Dursban, an organophosphate pesticide, in 1986. I believe that these ideas run so closely together that the two should meld into one. Thanks.
Idea 17: Include EDC, PBT, low-dose and immunotoxicity in screening tests
When conducting new toxicology tests, predictive toxicology, and rapid or high throughput screening tests, NIEHS should be able to state with certainty how many different endpoints will need to be assayed, how many different cell types will need to be evaluated, and how multi-pathway events involved in disease processes and effects at different stages of multi-step processes will be integrated into overall evaluations of health risks. Specific targets for NIEHS and NTP research and toxicity testing should be expanded to include identifying environmental agents that are endocrine disrupting chemicals, immunotoxic agents, neurotoxic agents, persistent and bioaccumulative toxicants (PBTs), or demonstrate low dose or non-linear toxicity.
Idea 18: Incorporating powerful new physiological methods
Basic research in exposure science disproportionately measures histological markers and biochemical outcomes, but changes in PHYSIOLOGY are the critical determinant of harmful health outcomes. We should increase the emphasis on physiological measures at the cellular, tissue, system, and behavioral levels (as appropriate to the toxicant and disease) so that we can more concretely understand the health consequences of environmental exposure to potentially harmful agents. Consider the example of lead paint exposure, which can produce learning disabilities, fine movement deficits, and other neurocognitive problems at low doses that do not produce any obvious structural changes in the brain. However, by looking at the synaptic physiology in a rodent model it became clear that lead exposure can disrupt synaptic transmission through NMDA receptors in the hippocampus, a system involved in some forms of learning. This physiological work thus provides an explanation for the observed learning disabilities that conventional histology and biochemistry could never have generated. Physiological experimentation has undergone something of a revolution in the last decade, with a wealth of new genetic and optical techniques becoming widely available (especially in the neurosciences), and NIEHS should promote the application of these powerful new tools to traditional and novel questions in environmental science.
Idea 19: Incorporation of informatics in environmental health sciences
In order to better understand and track trends in communities, it would be worthwhile to work toward putting together an online program that can aid in identifying these trends.
Idea 20: Integrate exposure assessment with mHealth
To get a comprehensive characterization of the health impact of exposure, it will be useful to support the development, evaluation, and validation of a mobile health device that can be used in the field to continuously and simultaneously assess exposure to mixtures, physiological response, psychological response (i.e., stress), and various health conditions (from personal health records). Making the entire system unobtrusively wearable in the field can make it suitable for wider adoption by researchers and end users.
Idea 21: Integration of NTP molecular expression, tox/path, and FFPE data
Reference dose-response datasets for known toxicants are invaluable in defining chemical mode of action and disease signatures. These signatures are derived through analysis of changes in molecular expression in exposed rodent and human tissues (or cells in culture), with concomitant measurement of conventional toxicity parameters, or disease biomarkers, and phenotypes. Comprehensive datasets from standardized studies or derived tissues exist for approximately 60 reference chemicals with complete dose-response in rats and mice (and, in some cases, primary human cells) that have been exposed for 1-14 days, 90 days, and 2 years. Thus, existing molecular expression, conventional toxicity, and pathology data could be assembled for the reference compounds that have been studied, and for which data are now available, in DrugMatrix (owned by NTP), NTP-TDMS, possibly Japan's TG-GATEs, and the NTP Archives. The integrated conventional and toxicogenomic data could then be made available (either within NTP/CEBS or online within a reference database framework) to define key events within the sequence of events in a chemical's mode of action, as well as chemically induced pre-disease signatures, ultimately enabling the realization of a predictive toxicology based on high-quality reference data. Additionally, mode/mechanism-of-action, pre-disease, and disease classifiers and signatures could be derived from the 800-900 dose-response gene expression datasets currently available in these same and other public repositories, and from the approximately 560 chemicals for which archival NTP FFPE (formalin-fixed, paraffin-embedded) tissues exist, from which RNA, DNA, and protein could be extracted. This would enable potential studies of the genome, transcriptome, proteome, and epigenome of up to 40 exposed tissues per animal for which histopathology is already available.
These reference datasets and tissue specimens would enable derivation of predictive toxicity and human disease signatures, and biomarkers, far into the future. The resulting data also will support test selection as well as validation of high-throughput screening (HTS) test results from the Tox 21 Program.
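As a toy illustration of deriving mode-of-action classifiers from expression signatures, the sketch below trains a nearest-centroid classifier on made-up three-gene profiles. The class names ("genotoxic", "receptor") and values are hypothetical, not NTP data or the program's actual methodology.

```python
import math

def nearest_centroid_train(profiles, labels):
    """Train a nearest-centroid classifier on expression profiles.
    profiles: equal-length feature vectors (e.g. per-gene log-ratios
    from a dose-response study); labels: mode-of-action class for
    each profile. Returns {label: centroid}."""
    grouped = {}
    for vec, lab in zip(profiles, labels):
        grouped.setdefault(lab, []).append(vec)
    return {lab: [sum(col) / len(col) for col in zip(*vecs)]
            for lab, vecs in grouped.items()}

def nearest_centroid_predict(centroids, vec):
    """Assign vec to the class with the closest centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda lab: dist(centroids[lab], vec))

# Toy 3-gene signatures for two hypothetical modes of action.
train = [[2.0, 0.1, -1.0], [1.8, 0.0, -0.8],   # "genotoxic"-like
         [-0.5, 1.9, 0.2], [-0.4, 2.1, 0.1]]   # "receptor"-like
labels = ["genotoxic", "genotoxic", "receptor", "receptor"]
model = nearest_centroid_train(train, labels)
print(nearest_centroid_predict(model, [1.9, 0.2, -0.9]))  # → genotoxic
```

Real signature derivation from the reference datasets would of course involve far more genes, dose-response structure, and cross-validation; the point here is only the shape of the train/classify step.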
Idea 22: Names of people who nominate substances should be made public
People for the Ethical Treatment of Animals (PETA), the world’s largest animal rights organization with more than two million members and supporters, appreciates the opportunity to submit its Visionary Ideas for strategic planning to the National Institute of Environmental Health Sciences (NIEHS).
The National Toxicology Program (NTP), along with its Center for the Evaluation of Risks to Human Reproduction (CERHR), accepts nominations from private citizens, but it does not release the names of these individuals. These nominations have included the herbal dietary supplements Dong quai and bitter orange extract as well as the pharmaceutical fluoxetine, better known as Prozac. Comprehensive toxicological characterizations – often including carcinogenicity and developmental toxicity testing on hundreds or thousands of animals – are typically proposed for these substances despite extensive human use and experience. Releasing these individuals’ names is in the public interest as it would contribute to the public's understanding of NTP's operations and activities. It is also possible that conflicts of interest exist between the individuals making these nominations and the general public – particularly when a large market exists for these substances. We call upon NTP and CERHR to make public the names of all individuals nominating substances to their programs.
Idea 23: New assays are essential to identify environmental toxicants
To effectively screen all environmental chemicals for risks to human health will require much faster, cheaper and more direct scientific measurements than we have used routinely in the past. Making better use of simpler model genetic organisms than mice and engineering cell lines with fluorescent reporters of the molecular cascades that we are identifying as fundamental to human cell and organ function will be essential to meet this challenge.
Idea 24: Next generation technology for environmental health studies
Better understanding of the links between environmental exposures and diseases, and more effective protection of humans from environmental exposures (e.g., natural and human-caused disasters), will benefit from new wearable sensor technologies in the hands of average citizens that can measure personal exposure levels to multiple analytes. One emerging trend is to develop miniaturized sensors that are wirelessly connected to cell phones and the internet to provide simultaneous monitoring of the chemical exposures of a large population. Such a trend must be supported by government before it reaches the critical mass required for the commercial world.
I agree with this idea. New chemical sensor technologies need to become part of people's lives to help find the origins of devastating diseases, such as asthma, cancer, and autism, that are caused by environmental exposures.
That's a really good idea. The two key words of this idea are "miniature" and "wireless"; these reflect the trend of technology development. Miniature means easy to use and carry. Wireless leverages communication technology, one of the most successful technologies humans have developed, to collect and gather data from a very broad population and many locations.
I think that wearability and critical mass are some of the important keys to successful deployment.
An excellent approach to looking at the environment around populations and keying in on factors that could be potentially harmful.
This is a great idea for the betterment of the next generation. Many studies show that various deadly diseases can be treated when caught at an early stage. With advances in mobile and wireless technologies, pervasive monitoring with miniaturized sensors will help detect diseases caused by environmental exposures in their early stages.
It's a great thing.
It is very cool stuff: just by using a cell phone to monitor environmental pollution, everyone could easily be informed of our air quality.
Portable, wearable real-time environmental sensors can provide a wealth of information on exposure. That said, there should also be support for statistical analyses of these data.
Products like these will help greatly in accurately measuring environmental exposures.
It's very important for consumers to be able to come to their own conclusions based on data they have collected, rather than being told numbers that are sometimes misleading.
Great idea! It could identify exposures in a region, city, workplace, or other area, or support a study spanning school-age children to adults. The possibilities for data are endless.
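The crowd-sensing scheme described in this idea could, in principle, reduce to a simple aggregation step once phone-linked sensors upload their readings. The sketch below is purely illustrative; the region names, analytes, and values are hypothetical and not part of any NIEHS program:

```python
from collections import defaultdict
from statistics import mean

def aggregate_exposures(readings):
    """Group raw (region, analyte, value) readings uploaded by phone-linked
    sensors and return the mean measured level per (region, analyte) pair."""
    grouped = defaultdict(list)
    for region, analyte, value in readings:
        grouped[(region, analyte)].append(value)
    return {key: mean(values) for key, values in grouped.items()}

# Hypothetical readings from three wearable sensors in two cities
readings = [
    ("Durham", "PM2.5", 12.0),
    ("Durham", "PM2.5", 18.0),
    ("Raleigh", "ozone", 0.061),
]

summary = aggregate_exposures(readings)
print(summary[("Durham", "PM2.5")])  # mean of 12.0 and 18.0 -> 15.0
```

In a real deployment this aggregation would run server-side over data streamed from many devices; the point of the sketch is only that population-scale exposure maps fall out of very simple statistics once the sensing and wireless-upload problems are solved.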
Idea 25: Promote research that uses bacteria to clean up toxic spills
Slick Solution: How Microbes Will Clean Up the Deepwater Horizon Oil Spill. Bacteria and other microbes are the only thing that will ultimately clean up the ongoing oil spill in the Gulf of Mexico. http://www.scientificamerican.com/article.cfm?id=how-microbes-clean-up-oil-spills
New bacteria could help clean up oil spill http://www.tgdaily.com/sustainability-features/50193-new-bacteria-could-help-clean-up-oil-spil
Such research has a long history. A great variety of such research is being pursued now.
Idea 26: Screening tests for mercury
Develop screening tests for mercury, and reference ranges for all ages, that reflect body burden more accurately than blood testing does. Blood testing is indicative only of current exposures and does not reflect past cumulative exposures.
Idea 27: Strategy for using predictive toxicity testing for regulations
The new predictive toxicity testing and screening methods will bring their own set of challenges. Regulatory acceptance and the reliability of new predictive toxicology approaches must be a very high priority for NIEHS. The NIEHS must formulate a strategy for how these new data should be used in the decision-making process. It is not good enough to produce huge volumes of toxicity data for the regulatory agencies; the reliability of these data for assessing human health effects must be validated by NIEHS. While promises about the value and utility of predictive toxicology are being promoted, NIEHS needs to describe now how it will measure success.
Idea 28: Strategy to incorporate EDC, PBT, low dose endpoints in tox test
What is the NIEHS strategy that will ensure that potential health effects at all organ sites, as a function of life stage, inter-individual susceptibility, and exposure to multiple agents, will be properly addressed by new toxicology approaches? If such a strategy doesn't exist, then now is the time to develop it.
Idea 29: Switchable solvent
http://www.queensu.ca/news/articles/chemistry-prof-receives-polanyi-award
Idea 30: Use technology to communicate the science of NIEHS
NIEHS and NTP need to use technology (examples: mobile, social media, email, web, data collaboration, data sharing, social networks, crowdsourcing) to better a) collaborate and share data among scientists, b) communicate with the wider scientific community, c) communicate with the regulatory community, and d) communicate with the general public. These areas need more resources (time, staff, funding) in order to support the research science of the NIEHS.
Agree! People are using new ways to communicate and learn about science: mobile apps, Twitter, Facebook, YouTube – these are the new communication vehicles, not traditional media and press releases. NIEHS needs to invest in the technologies and skills to keep up in this new age of communications.
Adoption of a plethora of disparate technologies (that is, mobile, social media, email, web, data collaboration, data sharing, social networks, crowdsourcing) will assure the fragmentation of any message NIEHS aspires to communicate. Make information available in PDF format and let it go at that; people will be forced to READ (rather than multi-task), and likely begin to UNDERSTAND. With the "other" technologies suggested above, people are driven to react rather than assimilate, consider carefully, and understand.