Sunday, January 2, 2011

Budding Research Links Climate Change and Earlier Flowering Plants

University of Cincinnati biologist Denis Conover has done extensive plant studies in Hamilton County Parks and the Oxbow area. Here he studies a specimen at Burnet Woods.

According to research published November 16 by a University of Cincinnati faculty member, native plants in southwestern Ohio are flowering significantly earlier, a finding he attributes, at least in part, to global warming.

UC biologist Denis Conover, field service associate professor, has spent countless hours walking the Shaker Trace Wetlands at Miami Whitewater Forest over the last 18 years to survey hundreds of different plant species.

Conover's results, published in the December issue of Ecological Restoration, reveal that for species that were observed flowering during two distinct multi-year surveys, a significant number of wild plants (39 percent) bloomed earlier from 2005 to 2008 than when he recorded the same species' blooming times from 1992 to 1996. Forty-five percent of the plants bloomed at the same time, and 16 percent bloomed later.

"I was doing a plant survey to see how the wetlands had changed over the years, and I noticed a lot of the plants were blooming earlier than they had in the previous survey," said Conover.

The biologist pointed out that the mean annual temperature during the survey periods increased nearly 2 degrees Fahrenheit, from 53.38 F (11.88 C) to 55.27 F (12.93 C), in roughly a decade's time.

"This is a big change for such a short time period," said Conover. "There is a lot of data coming from all over the world indicating that biological communities are being impacted by warmer temperatures."

To determine the impact of these changes, Conover said scientists would need to look closely at the complete ecological picture, including the impact on insects and birds that interact with the plants.

"If the right insects aren't out at the right time, it could affect some of the cross-pollination that goes on," he said. Or it could affect certain birds that depend on the seeds from those plants. Everything is interrelated. It is hard to say what impact it will have. We could also see things like more invasive species moving in because of the warmer conditions."

Conover worked closely with UC's Steve Pelikan, a math professor, who crunched all the data from the surveys. Pelikan said he found both the number of earlier-flowering plants and the temperature change from one survey to the next to be statistically significant.
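
Pelikan's specific test isn't named in the article, but a standard way to check whether a 39-percent-earlier versus 16-percent-later split could arise by chance is a sign test on the species that shifted. A minimal sketch in Python, using hypothetical species counts taken from the reported percentages (the published analysis may use different numbers and methods):

from math import comb

# Hypothetical counts assumed from the reported percentages;
# the paper's actual species counts and test may differ.
n_earlier, n_later = 39, 16
n = n_earlier + n_later

# Sign test: under the null hypothesis of no systematic shift, a species
# that changed its flowering time is equally likely to shift either way.
p = sum(comb(n, k) for k in range(n_earlier, n + 1)) / 2**n
print(f"{n_earlier} of {n} shifted species bloomed earlier; one-sided p = {p:.4g}")

A p-value this small (well below 0.01) is what "statistically significant" means here: the excess of earlier-blooming species is very unlikely to be random noise.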

Conover's wild-plant research follows a similar pattern of findings from a recent 30-year garden-plant study in southwestern Ohio (McEwan et al.). Pelikan points out that Conover's published research is significant because it is one of the first to document the earlier-flowering phenomenon among plants in a natural habitat as opposed to a more controlled garden setting.

"His is one of the first papers to reach this conclusion when working with native plants in a native setting," said Pelikan.

Further substantiating the work, Conover has found that his observations also aren't unique to the Shaker Trace Wetlands. He's finding similar results as he compares data he collected from a plant survey in 2000 at Oxbow -- a wetland at the confluence of the Great Miami and Ohio Rivers that spans southeastern Indiana and southwestern Ohio -- to data from today.

He's also noticed the presence of new invasive species in the Oxbow area such as Callery pear, Japanese stiltgrass and Japanese chaff flower.

Source: ScienceDaily

Type 1 Diabetes Computer Model's Predictive Success Validated Through Lab Testing

A La Jolla Institute team led by noted type 1 diabetes researcher Matthias von Herrath, M.D., has demonstrated the effectiveness of a recently developed computer model in predicting key information about nasal insulin treatment regimens in type 1 (juvenile) diabetes.

The findings, which also showed the platform's ability to predict critical type 1 diabetes molecular "biomarkers," were published in the December issue of the scientific journal Diabetes and further validate the new model as a valuable research tool in type 1 diabetes. The software is designed to let researchers streamline laboratory work by evaluating alternative therapeutic scenarios and focusing on the strategies that show the most promise for working in humans.

"Since laboratory studies can cost hundreds of thousands of dollars, and early stage human clinical trials can cost $10 million dollars or more, predicting the right conditions to try is important," said Dr. von Herrath, director of the Type 1 Diabetes Research Center at the La Jolla Institute for Allergy & Immunology, where the studies were conducted.

Development of the software, the Type 1 Diabetes PhysioLab® Platform, was funded through the peer-reviewed grant program of the American Diabetes Association.

"We've found that using this in silico (computer analysis) prediction platform can quicken the pace and effectiveness of type 1 diabetes research," he continued. "By allowing us to pre-test our theories in computer models, we can ensure that the more time-intensive and costly process of laboratory testing is focused on the most promising therapeutic strategies, with the greatest chance of success."

The platform, developed by Entelos Inc., a life sciences company specializing in predictive technologies, has previously been shown to successfully predict various data from published type 1 diabetes experiments. Dr. von Herrath's team used a different approach to test the model, asking it to predict the outcome of a hypothetical experiment on nasal insulin dosing frequency in animal models that had not yet been performed. The prediction was then tested in the laboratory, where its results were confirmed.

In addition, he said, the model was able to accurately identify the particular time frame at which key type 1 diabetes "biomarkers" kicked in. Biomarkers are specific cell types or proteins that tell researchers at what point a therapeutic option is working or when it is time to start treatment. In the case of the La Jolla Institute study, the model successfully predicted the onset of biomarkers indicating beta cell protection in the NOD mouse.

"The model accurately predicted that implementing a low frequency nasal insulin dosing regimen in animal models was more beneficial in controlling type 1 diabetes than a high frequency regimen," said Dr. von Herrath, noting that the software's prediction of the biomarkers was key in this process. "These results confirmed our hypotheses on the benefits of low-frequency nasal insulin dosing. But even more importantly, the advantage of applying computer modeling in optimizing the therapeutic efficacy of nasal insulin immunotherapy was confirmed."

The results were reported in the paper "Virtual Optimization of Nasal Insulin Therapy Predicts Immunization Frequency To Be Crucial for Diabetes Protection." Dr. von Herrath was senior author on the paper and La Jolla Institute scientist Georgia Fousteri, Ph.D., and Jason Chan, Ph.D., from Entelos' R&D group, were first co-authors.

The Type 1 Diabetes PhysioLab® Platform is a large-scale mathematical model of disease pathogenesis based on non-obese diabetic (NOD) mice. The platform was developed with input from an independent scientific team of leading type 1 diabetes experts. The American Diabetes Association's research grant program funded the software's development to provide a new scientific tool for enhancing the speed and effectiveness of type 1 diabetes research.

More than 400,000 children worldwide suffer from type 1 diabetes, a chronic disease that can lead to severe complications, such as blindness, cardiovascular disease, renal disease, coma or even death.

The platform, developed over two years, simulates autoimmune processes and subsequent destruction of pancreatic beta cells from birth through frank diabetes onset (hyperglycemia). The destruction of insulin-producing beta cells in the pancreas is the underlying cause of type 1 diabetes.
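
The article doesn't disclose the platform's equations, which are proprietary to Entelos. As a rough illustration of the general idea -- coupled differential equations tracking immune effector cells and the beta-cell mass they destroy, integrated forward from birth until hyperglycemia -- here is a deliberately tiny toy model in Python; every variable and parameter below is an assumption for illustration, not part of the actual PhysioLab platform:

# Toy two-variable model: autoreactive effector cells (E) expand and
# destroy beta-cell mass (B). Purely illustrative; the real platform
# tracks many more cell populations and mediators.
dt, t = 0.1, 0.0        # time step and clock, in days
E, B = 0.01, 1.0        # initial effector level, normalized beta-cell mass
growth = 0.05           # assumed effector expansion rate (1/day)
kill = 0.08             # assumed killing rate (1/day per unit of E)
onset = 0.2             # assumed onset threshold: 20% of beta cells left

while B > onset and t < 1000:
    E += growth * E * (1 - E) * dt    # logistic effector expansion
    B += -kill * E * B * dt           # destruction proportional to E and B
    t += dt

print(f"Simulated hyperglycemia onset around day {t:.0f} (beta-cell mass {B:.2f})")

A therapy simulation would then add terms that suppress E (or boost regulatory cells) on a given dosing schedule and compare onset times across schedules, which is essentially the kind of question the team posed to the real platform.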

Specifically, Dr. von Herrath's team employed the model to investigate the possible mechanisms underlying the effectiveness of nasal insulin therapy, using the B:9-23 peptide. "The experimental aim was to evaluate the impact of dose, frequency of administration and age at treatment on key molecular mechanisms and optimal therapeutic outcome," he said.

Using parameters input by the scientific team, the model accurately predicted that less frequent doses of nasal insulin, started at an early disease stage, would protect more effectively against beta cell destruction than higher frequency doses in NOD mice.

Dr. von Herrath added that the positive results add credence to the idea of creating computer models for analyzing therapeutic interventions in human disease. "These results support the development and application of humanized platforms for the design of clinical trials," he said.


Source: ScienceDaily

Neandertals’ Extinction Not Caused by Deficient Diets, Tooth Analysis Shows

Neandertal teeth from Shanidar cave.

Researchers from George Washington University and the Smithsonian Institution have discovered evidence to debunk the theory that Neandertals' disappearance was caused in part by a deficient diet -- one that lacked variety and was overly reliant on meat. After discovering starch granules from plant food trapped in the dental calculus on 40,000-year-old Neandertal teeth, the scientists believe that Neandertals ate a wide variety of plants, including cooked grains, as part of a sophisticated, diverse diet similar to that of early modern humans.

"Neandertals are often portrayed as very backwards or primitive," said Amanda Henry, lead researcher and a post-doctoral researcher at GW. "Now we are beginning to understand that they had some quite advanced technologies and behaviors."

Dr. Henry made this discovery together with Alison Brooks, professor of anthropology and international affairs at GW, and Dolores Piperno, a GW research professor and senior scientist and curator of archaeobotany and South American archaeology at the Smithsonian National Museum of Natural History, Washington D.C., and Smithsonian Tropical Research Institute, Panama.

The discovery of starch granules in the calculus on Neandertal teeth provides direct evidence that they made sophisticated, thoughtful food choices and ate a range of nutrient-rich plants, for example date palms, legumes and grains such as barley. Until now, anthropologists have hypothesized that Neandertals were outlived by early modern humans due in part to the former's primitive, deficient diet, with some scientists arguing that Neandertals' diets were specialized for meat-eating. On that view, during major climate swings Neandertals could have been outcompeted by early humans who incorporated diverse plant foods available in the local environment into their diets.

Drs. Henry, Brooks and Piperno's discovery suggests otherwise. The researchers discovered starch granules in dental calculus, which forms when plaque buildup hardens, on the fossilized teeth of Neandertal skeletons excavated from Shanidar Cave in Iraq and Spy Cave in Belgium. Starch granules are abundant in most human plant foods, but were not known to survive on fossil teeth this old until this study. The researchers' findings indicate that Neandertals' diets were more similar to those of early humans than originally thought. The researchers also determined from alterations they observed in the starch granules that Neandertals prepared and cooked starch-rich foods to make them taste better and easier to digest.

"Neandertals and early humans did not visit the dentist," said Dr. Brooks. "Therefore, the calculus or tartar remained on their teeth, preserving tiny clues to the previously unknown plant portion of their diets."

Dr. Henry is currently a post-doctoral researcher in the Columbian College of Arts and Sciences Hominid Paleobiology program at the George Washington University, where she also received her Ph.D. in Jan. 2010. Her research focuses on the use of plant foods by human ancestors. In Jan. 2011, Dr. Henry will begin leading an independent research group studying the evolution of the human diet at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Dr. Brooks' research focuses on the evolution of modern human behavior. Dr. Piperno is a pioneer in the detection and study of plant microfossils and the evolution of human diets.

"This significant finding provides new insight on the plight of the Neandertals," said Peg Barratt, dean of GW's Columbian College of Arts and Sciences. "It's also an excellent example of our dynamic partnership with the Smithsonian to further advance learning and discovery."

The research was supported by a National Science Foundation IGERT award, a Wenner Gren Foundation doctoral dissertation award, a Smithsonian Institution pre-doctoral fellowship, a National Science Foundation HOMINID award to the Smithsonian Institution and a selective excellence award from the George Washington University.

Source: ScienceDaily

Calculating Tidal Energy Turbines' Effects on Sediments and Fish

Model results of the water speed around a turbine blade. Currents are slower (blue) behind the blade, and faster (red) at the blade's tip.

The emerging tidal-energy industry is spawning another in its shadow: tidal-energy monitoring. Little is known about tidal turbines' environmental effects, and environmentalists, regulators and turbine manufacturers all need more data to allow the industry to grow.

Engineers at the University of Washington have developed a set of numerical models, solved by computers, to study how changing water pressure and speed around turbines affects sediment accumulation and fish health. They will present their findings this week at the American Geophysical Union's meeting in San Francisco.

The current numerical models look at windmill-style turbines that operate in fast-moving tidal channels. The turbine blade design creates a low-pressure region on one side of the blade, similar to an airplane wing. A small fish swimming past the turbine will be pulled along with the current and so will avoid hitting the blade, but might experience a sudden change in pressure.

Teymour Javaherchi, a UW mechanical engineering doctoral student, says his model shows these pressure changes would occur in less than 0.2 seconds, which could be too fast for the fish to adapt.

If the pressure change happens too quickly the fish would be unable to control their buoyancy and, like an inexperienced scuba diver, would either sink to the bottom or float to the surface. During this time the fish would become disoriented and risk being caught by predators. In a worst-case scenario, severe pressure changes could cause internal hemorrhaging and death.
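
The article gives no numbers for the pressure swing itself, but a back-of-envelope Bernoulli estimate shows why a fraction of a second matters. All values below (current speed, near-blade flow speed, water density) are illustrative assumptions, not figures from the UW model:

# Rough Bernoulli estimate of the pressure drop a fish might experience
# passing the low-pressure side of a blade. Illustrative values only.
rho = 1025.0        # seawater density, kg/m^3
g = 9.81            # gravity, m/s^2
u_channel = 2.0     # assumed tidal current speed, m/s
u_blade = 8.0       # assumed relative flow speed near the blade, m/s

dp = 0.5 * rho * (u_blade**2 - u_channel**2)   # Bernoulli pressure drop, Pa
depth_equiv = dp / (rho * g)                   # equivalent ascent in meters

print(f"Pressure drop ~{dp/1e3:.0f} kPa, like ascending {depth_equiv:.1f} m of water")
print(f"Over 0.2 s, that is an effective ascent rate above {depth_equiv/0.2:.0f} m/s")

For comparison, scuba training recommends ascent rates on the order of 0.15 m/s, which is why such a rapid change could plausibly upset a fish's buoyancy control.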

It's too early to say whether tidal turbines could harm fish in this way, Javaherchi said. The existing model uses the blade geometry from a wind turbine.

"The competition between the companies is very tight and they are hesitant to share the designs," Javaherchi said.

The researchers are open to working with any company that wants to use the technique to assess a particular turbine design.

Another set of numerical models looked at whether changes in water-flow speed could affect the settling of suspended particles in a tidal channel. Slower water speeds behind the turbine would allow more particles to sink to the bottom rather than being carried along by the current.

Javaherchi's modeling work suggests this is the case, especially for mid-sized particles of about a half-centimeter in diameter, about two-tenths of an inch. This would mean that a rocky bottom near a tidal turbine might become sandier, which could affect marine life.
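
A crude way to see the settling effect: a particle carried horizontally at the local flow speed u while sinking at settling velocity w_s travels a distance x = u * z / w_s from height z before reaching the bed, so slower wake flow shortens the deposition distance. The parameters below are assumptions for illustration, not values from Javaherchi's simulations:

# Deposition distance of a suspended particle released at height z:
# it lands after x = u * z / w_s. Illustrative parameters only.
w_s = 0.2    # assumed settling velocity for ~5 mm grains, m/s
z = 5.0      # assumed release height above the seabed, m

for label, u in [("free stream", 2.0), ("turbine wake", 1.0)]:
    x = u * z / w_s
    print(f"{label}: u = {u:.1f} m/s -> particle deposits within ~{x:.0f} m")

Halving the flow speed halves the travel distance, concentrating deposition just downstream of the turbine, which is consistent with the predicted shift from a rocky to a sandier bottom.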

The UW research differs from most renewable energy calculations that seek to maximize the amount of energy generated.

"We are [also] interested in the amount of energy that can be extracted by the turbines, but we are aware that the limiting factor for the development of these technologies is the perception by the public that they might have a big environmental impact," said Alberto Aliseda, a UW assistant professor of mechanical engineering and Javaherchi's thesis adviser.

As to whether any negative effects discovered for tidal turbines would be preventable, Aliseda said, "Absolutely."

"We need to establish what is the lowest pressure that the animals can sustain and the period of time that they need to adjust," Aliseda said. "The blade can be shaped to minimize this effect."

Aliseda says engineers in the wind-turbine industry are already adapting the UW work to look at interactions between wind turbines and bats, since high-frequency pressure changes are now thought to be responsible for the mysterious deaths of bats caused by wind turbines.

"Maybe the best turbine is not the one that extracts the most energy, but the one that extracts a reasonable amount of energy and at the same time minimizes the environmental effects," he said.

The research was funded by a Department of Energy grant to the Northwest National Marine Renewable Energy Center. Joseph Seydel, a Boeing engineer and UW graduate in mechanical engineering, also contributed to the research.

Source: ScienceDaily

New Technology to Speed Cleanup of Nuclear Contaminated Sites

This new spectrometer developed by engineers at Oregon State University will help speed the cleanup of nuclear waste sites and reduce costs.

Members of the engineering faculty at Oregon State University have invented a new type of radiation detection and measurement device that will be particularly useful for cleanup of sites with radioactive contamination, making the process faster, more accurate and less expensive.

A patent has been granted on this new type of radiation spectrometer, and the first production of devices will begin soon. The advance has also led to creation of a Corvallis-based spinoff company, Avicenna Instruments, based on the OSU research. The market for these instruments may ultimately be global, and thousands of them could be built, researchers say.

Hundreds of millions of dollars are spent on cleanup of some major sites contaminated by radioactivity, primarily from the historic production of nuclear weapons during and after World War II. These include the Hanford site in Washington, Savannah River site in South Carolina, and Oak Ridge National Laboratory in Tennessee.

"Unlike other detectors, this spectrometer is more efficient, and able to measure and quantify both gamma and beta radiation at the same time," said David Hamby, an OSU professor of health physics. "Before this two different types of detectors and other chemical tests were needed in a time-consuming process."

"This system will be able to provide accurate results in 15 minutes that previously might have taken half a day," Hamby said. "That saves steps, time and money."

The spectrometer, developed over 10 years by Hamby and Abi Farsoni, an assistant professor in the College of Engineering, can quickly identify the type and amount of radionuclides present in something like a soil sample -- contaminants such as cesium-137 or strontium-90 produced by reactor operations. And it can distinguish between gamma rays and beta particles, which is necessary to determine the level of contamination.
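
The article doesn't explain how a single instrument separates the two radiation types. One widely used approach in layered-scintillator ("phoswich") detectors is digital pulse-shape discrimination: light from a fast scintillator layer decays quickly, light from a slow layer lingers, so each event can be classified by how much of its charge arrives late. A self-contained sketch with synthetic pulses (the decay times and threshold are assumptions, not the OSU design):

import numpy as np

t = np.arange(0.0, 2000.0, 4.0)   # digitizer sample times, ns

def pulse(tau):
    """Idealized exponential scintillation pulse with decay time tau (ns)."""
    return np.exp(-t / tau)

def tail_fraction(p, split=100.0):
    """Fraction of total charge arriving after `split` ns."""
    return p[t > split].sum() / p.sum()

fast = pulse(tau=30.0)    # e.g., a thin layer sensitive to beta particles
slow = pulse(tau=300.0)   # e.g., a thick crystal sensitive to gamma rays

for name, p in [("fast-layer event", fast), ("slow-layer event", slow)]:
    f = tail_fraction(p)
    kind = "gamma-like" if f > 0.3 else "beta-like"
    print(f"{name}: tail fraction {f:.2f} -> classified {kind}")

Real systems add energy measurement and careful calibration on top of this, but the charge-ratio idea is the essence of separating event types from a single readout.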

"Cleaning up radioactive contamination is something we can do, but the process is costly, and often the question when working in the field is how clean is clean enough," Hamby said. "At some point the remaining level of radioactivity is not a concern. So we need the ability to do frequent and accurate testing to protect the environment while also controlling costs."

This system should allow that, Hamby said, and may eventually be used in monitoring processes in the nuclear energy industry, or possibly medical applications in the use of radioactive tracers.

The OSU College of Engineering has contracted with Ludlum Instruments, a Sweetwater, Texas, manufacturer, to produce the first instruments, and the OSU Office of Technology Transfer is seeking a licensee for commercial development. The electronic systems for the spectrometers will be produced in Oregon by Avicenna Instruments, the researchers said.

Source: ScienceDaily

Light Dawns on Dark Gamma-Ray Bursts

This artist's impression shows a dark gamma-ray burst in a star-forming region.

Gamma-ray bursts are among the most energetic events in the Universe, but some appear curiously faint in visible light. The biggest study to date of these so-called dark gamma-ray bursts, using the 2.2-metre MPG/ESO telescope at La Silla in Chile, has found that these explosions don't require exotic explanations. Their faintness is now explained by a combination of causes, the most important of which is the presence of dust between the Earth and the explosion.

Gamma-ray bursts (GRBs), fleeting events that last from less than a second to several minutes, are detected by orbiting observatories that can pick up their high energy radiation. Thirteen years ago, however, astronomers discovered a longer-lasting stream of less energetic radiation coming from these violent outbursts, which can last for weeks or even years after the initial explosion. Astronomers call this the burst's afterglow.
While all gamma-ray bursts [1] have afterglows that give off X-rays, only about half of them were found to give off visible light, with the rest remaining mysteriously dark. Some astronomers suspected that these dark afterglows could be examples of a whole new class of gamma-ray bursts, while others thought that they might all be at very great distances. Previous studies had suggested that obscuring dust between the burst and us might also explain why they were so dim.

"Studying afterglows is vital to further our understanding of the objects that become gamma-ray bursts and what they tell us about star formation in the early Universe," says the study's lead author Jochen Greiner from the Max-Planck Institute for Extraterrestrial Physics in Garching bei München, Germany.

NASA launched the Swift satellite at the end of 2004. From its orbit above the Earth's atmosphere it can detect gamma-ray bursts and immediately relay their positions to other observatories so that the afterglows can be studied. In the new study, astronomers combined Swift data with new observations made using GROND [2] -- a dedicated gamma-ray burst follow-up instrument attached to the 2.2-metre MPG/ESO telescope at La Silla in Chile. In doing so, astronomers have conclusively solved the puzzle of the missing optical afterglows.

What makes GROND exciting for the study of afterglows is its very fast response time -- it can observe a burst within minutes of an alert coming from Swift using a special system called the Rapid Response Mode -- and its ability to observe simultaneously through seven filters covering both the visible and near-infrared parts of the spectrum.

By combining GROND data taken through these seven filters with Swift observations, astronomers were able to accurately determine the amount of light emitted by the afterglow at widely differing wavelengths, all the way from high-energy X-rays to the near-infrared. The astronomers used this information to directly measure the amount of obscuring dust that the light passed through en route to Earth. Previously, astronomers had to rely on rough estimates of the dust content [3].

The team used a range of data, including their own measurements from GROND, in addition to observations made by other large telescopes including the ESO Very Large Telescope, to estimate the distances to nearly all of the bursts in their sample. While they found that a significant proportion of bursts are dimmed to about 60-80 percent of their original intensity by obscuring dust, this effect is exaggerated for the very distant bursts, letting the observer see only 30-50 percent of the light [4]. The astronomers conclude that most dark gamma-ray bursts are therefore simply those that have had their small amount of visible light completely stripped away before it reaches us.

"Compared to many instruments on large telescopes, GROND is a low-cost and relatively simple instrument, yet it has been able to conclusively resolve the mystery surrounding dark gamma-ray bursts," says Greiner.

Notes:

[1] Gamma-ray bursts lasting longer than two seconds are referred to as long bursts and those with a shorter duration are known as short bursts. Long bursts, which were observed in this study, are associated with the supernova explosions of massive young stars in star-forming galaxies. Short bursts are not well understood, but are thought to originate from the merger of two compact objects such as neutron stars.

[2] The Gamma-Ray burst Optical and Near-infrared Detector (GROND) was designed and built at the Max-Planck Institute for Extraterrestrial Physics in collaboration with the Tautenburg Observatory, and has been fully operational since August 2007.

[3] Other studies relating to dark gamma-ray bursts have been released. Early this year, astronomers used the Subaru Telescope to observe a single gamma-ray burst, from which they hypothesised that dark gamma-ray bursts may indeed be a separate sub-class that form through a different mechanism, such as the merger of binary stars. In another study published last year using the Keck Telescope, astronomers studied the host galaxies of 14 dark GRBs, and based on the derived low redshifts they infer dust as the likely mechanism to create the dark bursts. In the new work reported here, 39 GRBs were studied, including nearly 20 dark bursts, and it is the only study in which no prior assumptions have been made and the amount of dust has been directly measured.

[4] Because the afterglow light of very distant bursts is redshifted due to the expansion of the Universe, the light that left the object was originally bluer than the light we detect when it gets to Earth. Since the reduction of light intensity by dust is greater for blue and ultraviolet light than for red, this means that the overall dimming effect of dust is greater for the more distant gamma-ray bursts. This is why GROND's ability to observe near-infrared radiation makes such a difference.
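
Note [4] can be made concrete with a short calculation: dust extinction in magnitudes grows roughly as 1/wavelength, and a fixed observed optical band samples ever bluer rest-frame light as redshift increases. The 1/lambda extinction law and the A_V value below are illustrative assumptions, not the paper's fitted dust model:

# Same dust, different redshifts: fraction of light surviving in one
# observed band. Illustrative 1/lambda extinction law and A_V only.
A_V = 0.3            # assumed rest-frame V-band extinction, magnitudes
lam_obs = 658.0      # observed R-band wavelength, nm

for z in (0.5, 3.0):
    lam_rest = lam_obs / (1 + z)      # wavelength the burst actually emitted
    A = A_V * (550.0 / lam_rest)      # extinction at that rest wavelength, mag
    transmitted = 10 ** (-0.4 * A)    # surviving fraction of the light
    print(f"z = {z}: rest frame {lam_rest:.0f} nm, {transmitted:.0%} transmitted")

With these toy numbers a nearby burst keeps about 70 percent of its light while a distant one keeps about 40 percent, the same qualitative pattern as the study's 60-80 percent versus 30-50 percent figures.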

Source: ScienceDaily

Bizarre Bioluminescent Snail

This image shows examples of the clusterwink snail H. brasiliana emitting bioluminescent light (right) and without light.

Two scientists at Scripps Institution of Oceanography at UC San Diego have provided the first details about the mysterious flashes of dazzling bioluminescent light produced by a little-known sea snail.
 
Dimitri Deheyn and Nerida Wilson of Scripps Oceanography (Wilson is now at the Australian Museum in Sydney) studied a species of "clusterwink snail," a small marine snail typically found in tight clusters or groups at rocky shorelines. These snails were known to produce light, but the researchers discovered that rather than emitting a focused beam of light, the animal uses its shell to scatter and spread bright green bioluminescent light in all directions.

The researchers, who describe their findings in the Dec. 15 online version of Proceedings of the Royal Society B (Biological Sciences), say the luminous displays of Hinea brasiliana could serve as a deterrent, using diffused bioluminescent light to create the illusion of a larger animal and ward off potential predators.

In experiments conducted inside Scripps' Experimental Aquarium facility, Deheyn documented how H. brasiliana set off its glow, which he likens to a burglar alarm going off, when the snail was confronted by a threatening crab or a nearby swimming shrimp.

Wilson collected the snails used in the study in Australia and collaborated with Deheyn to characterize the bioluminescence.

"It's rare for any bottom-dwelling snails to produce bioluminescence," Wilson said. "So its even more amazing that this snail has a shell that maximizes the signal so efficiently."

Discovering how the snail spreads its light came as a surprise to the researchers since this species of clusterwink features opaque, yellowish shells that would seem to stifle light transmission. But in fact when the snail produces green bioluminescence from its body, the shell acts as a mechanism to specifically disperse only that particular color of light. Deheyn says such adaptations are of keen interest in optics and bioengineering research and development industries.

"The light diffusion capacity we see with this snail is much greater than comparative reference material," said Deheyn, of Scripps' Marine Biology Research Division. "Our next focus is to understand what makes the shell have this capacity and that could be important for building materials with better optical performance."
The study was funded by the Air Force Office of Scientific Research and the Mark Mitchell Foundation.
