
How birds master courtship songs: Zebra finches shed light on brain circuits and learning

Written by Unknown on Monday, September 17, 2012 | 15.15

ScienceDaily (Sep. 17, 2012) — By studying how birds master songs used in courtship, scientists at Duke University have found that regions of the brain involved in planning and controlling complex vocal sequences may also be necessary for memorizing sounds that serve as models for vocal imitation.

In a paper appearing in the September 2012 issue of the journal Nature Neuroscience, researchers at Duke and Harvard universities observed the imitative vocal learning habits of male zebra finches to pinpoint which circuits in the birds' brains are necessary for learning their songs.

Knowing which brain circuits are involved in learning by imitation could have broader implications for diagnosing and treating human developmental disorders, the researchers said. The finding shows that the same circuitry used for vocal control also participates in auditory learning, raising the possibility that vocal circuits in our own brain also help encode auditory experience important to speech and language learning.

"Birds learn their songs early in life by listening to and memorizing the song of their parent or other adult bird tutor, in a process similar to how humans learn to speak," said Todd Roberts, Ph.D., the study's first author and postdoctoral associate in neurobiology at Duke University. "They shape their vocalizations to match or copy the tutor's song."

A young male zebra finch, Roberts said, learns his song in two phases -- memorization and practice. He said the pupil can rapidly memorize the song of an adult tutor, but may need to practice singing as many as 100,000 times in a 45-day period in order to accurately imitate the tutor's song.

During the study, voice recognition software was paired with optogenetics, a technology that combines genetics and optics to control the electrical activity of nerve cells, or neurons. Using these tools, the researchers were able to scramble brain signals coordinating small sets of neurons in the young bird's brain for a few hundred milliseconds while he was listening to his teacher, enabling them to test which brain regions were important during the learning process.

The study's results show that a song pre-motor region in the pupil's brain plays two different roles. Not only does it control the execution of learned vocal sequences, it also helps encode information when the pupil is listening to his tutor, Roberts said.

"We learn some of our most interesting behaviors, including language, speech and music, by listening to an appropriate model and then emulating this model through intensive practice," said senior author Richard Mooney, Ph.D., professor of neurobiology and member of the Duke Institute for Brain Sciences. "A traditional view is that this two-step sequence -- listening followed by motor rehearsal -- first involves activation by the model of brain regions important to auditory processing. This is followed days, weeks or even months later by activation of brain regions important to motor control."

"Here we found that a brain region that is essential to the motor control of song also has an essential role in auditory learning of the tutor song," Mooney said. "This finding raises the possibility that the premotor circuits important to planning and controlling speech in our own brains also play an important role in auditory learning of speech sounds during early infancy." In humans, one such premotor region, Broca's area, is located in the frontal lobe of the left hemisphere.

The research has implications for the role of premotor circuits in the brain and suggests that these areas are important targets to consider when assessing developmental disorders that affect speech, language and other imitative behaviors in humans, Roberts said.

In addition to Roberts and Mooney, study authors include Sharon M. H. Gobes of Harvard University and Wellesley College; Malavika Murugan of Duke; and Bence P. Ölveczky of Harvard.

The research was supported by grants from the National Science Foundation and the National Institutes of Health (R01 DC02524) to Richard Mooney; and grants from NIH (R01 NS066408) and the Klingenstein, Sloan and McKnight Foundations to Bence P. Ölveczky; and a Rubicon fellowship from the Netherlands Organization for Scientific Research to Sharon M.H. Gobes.



Story Source:

The above story is reprinted from materials provided by Duke University Medical Center, via EurekAlert!, a service of AAAS.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. Todd F Roberts, Sharon M H Gobes, Malavika Murugan, Bence P Ölveczky, Richard Mooney. Motor circuits are required to encode a sensory model for imitative learning. Nature Neuroscience, 2012; DOI: 10.1038/nn.3206

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.


Most extensive pictures ever of an organism's DNA mutation processes

ScienceDaily (Sep. 17, 2012) — Biologists and informaticists at Indiana University have produced one of the most extensive pictures ever of mutation processes in the DNA sequence of an organism, elucidating important new evolutionary information about the molecular nature of mutations and how fast those heritable changes occur.

By analyzing the exact genomic changes in the model prokaryote Escherichia coli that had undergone over 200,000 generations of growth in the absence of natural selective pressures, the team led by IU College of Arts and Sciences Department of Biology professor Patricia L. Foster found that spontaneous mutation rates in E. coli DNA were actually three times lower than previously thought.

The new research, which appeared September 17 in the early edition of the journal Proceedings of the National Academy of Sciences, also notes that the mismatch repair proteins that survey newly replicated DNA and detect mistakes not only keep mutation rates low but may also maintain the balance of guanine-cytosine content to adenine-thymine content in the genome. Guanine-cytosine and adenine-thymine are the nitrogenous bases that bond between opposing DNA strands to form the rungs of the double helix ladder of DNA.

"We know that even in the absence of natural selection, evolution will proceed because new mutations get fixed at random in the genome," Foster said. "So, if we want to determine whether specific patterns of evolutionary change are driven by selection, knowledge of the expected pattern in the absence of selection is absolutely essential. Here we are defining the rate and molecular spectrum of spontaneous mutations while minimizing the ability of natural selection to promote or eradicate mutations, which allows us to capture essentially all mutations that do not cause the bacterium to die."

In a parallel mutation accumulation experiment using a strain defective in mismatch repair, in which the mutation rate was increased over 100-fold, the researchers analyzed nearly 2,000 mutations and found that these were strongly biased toward changing adenine-thymine base pairs to guanine-cytosine base pairs, the opposite of what is seen in the normal bacteria.
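The direction of such a bias can be tallied with a few lines of code. The sketch below is purely illustrative -- the mutation list is hypothetical and this is not the study's pipeline -- but it shows the classification at work: each base-pair substitution is binned by whether it moves the genome from guanine-cytosine toward adenine-thymine or the reverse.

```python
# Toy illustration: classify base-pair substitutions by direction to
# measure GC<->AT bias. A substitution on one strand implies the
# complementary change on the other, so G->A and C->T are both
# "GC->AT", while A->G and T->C are both "AT->GC".
from collections import Counter

GC_TO_AT = {("G", "A"), ("C", "T")}
AT_TO_GC = {("A", "G"), ("T", "C")}

def classify(ref: str, alt: str) -> str:
    change = (ref.upper(), alt.upper())
    if change in GC_TO_AT:
        return "GC->AT"
    if change in AT_TO_GC:
        return "AT->GC"
    return "other"  # transversions and other changes

# Hypothetical observed substitutions (reference base, mutant base):
mutations = [("G", "A"), ("C", "T"), ("A", "G"), ("G", "A"), ("T", "C")]
spectrum = Counter(classify(r, a) for r, a in mutations)
print(spectrum)  # Counter({'GC->AT': 3, 'AT->GC': 2})
```

Run over a real mutation-accumulation dataset, a tally like this is what reveals whether the spectrum leans toward adenine-thymine (as in normal cells) or toward guanine-cytosine (as in the repair-defective strain).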

[Image: E. coli chromosome]

"The molecular spectrum of spontaneous base-pair substitutions in almost all organisms is dominated by guanine-cytosine to adenine-thymine changes, which tends to drive genomes toward higher adenine-thymine content," Foster noted. "Because the guanine-cytosine content of genomes varies widely, there must be some selective pressure, or some non-adaptive mechanism, that can drive genomes back toward increased guanine-cytosine content."

The new research, co-authored by IU Bloomington School of Informatics and Computing associate professor Haixu Tang, Informatics predoctoral researcher Heewook Lee and Department of Biology postdoctoral researcher Ellen Popodi, demonstrates that mismatch repair is a major factor in the types of mutations that occur and in determining the base composition of the genome. Because the activity of mismatch repair can be influenced by the environment, another implication of this work is that the pattern of mutations could be used in forensics to help determine where a particular bacterial strain originated.

"By establishing baseline parameters for the molecular nature of spontaneous mutational change unbiased by selection, we can begin to achieve a deeper understanding of the factors that determine mutation rates, the mutational spectra, genomic base composition, how these may differ among organisms and how they may be shaped by environmental conditions," Foster said. "Since mutations are the source of variation upon which natural selection acts, understanding the rate at which mutations occur and the molecular nature of spontaneous mutational changes leads us to a fuller understanding of evolution."

The research took nearly two years to complete and was supported by a Multidisciplinary University Research Initiative Award from the U.S. Army Research Office.



Story Source:

The above story is reprinted from materials provided by Indiana University.


Scientists accurately predict connections between neurons

ScienceDaily (Sep. 17, 2012) — One of the greatest challenges in neuroscience is to identify the map of synaptic connections between neurons. Called the "connectome," it is the holy grail that will explain how information flows in the brain. In a landmark paper, published the week of September 17 in PNAS, the EPFL's Blue Brain Project (BBP) has identified key principles that determine synapse-scale connectivity by virtually reconstructing a cortical microcircuit and comparing it to a mammalian sample. These principles now make it possible to predict the locations of synapses in the neocortex.

"This is a major breakthrough, because it would otherwise take decades, if not centuries, to map the location of each synapse in the brain and it also makes it so much easier now to build accurate models," says Henry Markram, head of the BBP.

A longstanding neuroscientific mystery has been whether all neurons grow independently and simply form connections wherever their branches bump into each other, or whether the branches of each neuron are specifically guided by chemical signals to find all their targets. To solve the mystery, the researchers looked in a virtual reconstruction of a cortical microcircuit to see where the branches bumped into each other. To their great surprise, they found that the locations in the model matched those of synapses found in the equivalent real-brain circuit with an accuracy ranging from 75 percent to 95 percent.

This means that neurons grow as independently of each other as physically possible and mostly form synapses at the locations where they randomly bump into each other. A few exceptions were also discovered pointing out special cases where signals are used by neurons to change the statistical connectivity. By taking these exceptions into account, the Blue Brain team can now make a near perfect prediction of the locations of all the synapses formed inside the circuit.
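The core idea -- synapses form where independently grown branches happen to come close -- can be sketched as a simple proximity test. This is a toy illustration under stated assumptions (neurons reduced to sampled branch points, an arbitrary 2-micron contact distance), not the Blue Brain Project's actual algorithm.

```python
# Toy version of the "branches bump into each other" idea: mark a
# candidate synapse wherever sampled points of two neurons' branches
# come within a contact distance of each other.
import numpy as np

def potential_synapses(branch_a, branch_b, touch_dist=2.0):
    """Midpoints of all point pairs closer than touch_dist (microns)."""
    a = np.asarray(branch_a, dtype=float)   # (n, 3) sampled branch points
    b = np.asarray(branch_b, dtype=float)   # (m, 3) sampled branch points
    # Pairwise distances between every point of a and every point of b.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    i, j = np.nonzero(d <= touch_dist)
    return (a[i] + b[j]) / 2.0              # candidate synapse locations

axon = [(0, 0, 0), (10, 0, 0), (20, 0, 0)]
dendrite = [(10, 1, 0), (40, 40, 0)]
print(potential_synapses(axon, dendrite))   # one contact near (10, 0.5, 0)
```

The exceptions the team found would correspond to correcting this purely geometric prediction with neuron-type-specific rules.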

Virtual Reconstruction

The goal of the BBP is to integrate knowledge from all the specialized branches of neuroscience, to derive from it the fundamental principles that govern brain structure and function, and ultimately, to reconstruct the brains of different species -- including the human brain -- in silico. The current paper provides yet another proof-of-concept for the approach, by demonstrating for the first time that the distribution of synapses or neuronal connections in the mammalian cortex can, to a large extent, be predicted.

To achieve these results, a team from the Blue Brain Project set about virtually reconstructing a cortical microcircuit based on unparalleled data about the geometrical and electrical properties of neurons -- data from nearly 20 years of painstaking experimentation on slices of living brain tissue. Each neuron in the circuit was reconstructed into a 3D model on a powerful Blue Gene supercomputer. About 10,000 virtual neurons were packed into a 3D space in random positions according to the density and ratio of morphological types found in corresponding living tissue. The researchers then compared the model back to an equivalent brain circuit from a real mammalian brain.

A Major Step Towards Accurate Models of the Brain

This discovery also explains why the brain can withstand damage and indicates that the positions of synapses in all brains of the same species are more similar than different. "Positioning synapses in this way is very robust," says computational neuroscientist and first author Sean Hill. "We could vary density, position, orientation, and none of that changed the distribution of positions of the synapses."

They went on to discover that the synapse positions are robust only as long as the morphology of each neuron differs slightly from that of its neighbors, explaining another mystery of the brain -- why neurons are not all identical in shape. "It's the diversity in the morphology of neurons that makes brain circuits of a particular species basically the same and highly robust," says Hill.

Overall this work represents a major acceleration in the ability to construct detailed models of the nervous system. The results provide important insights into the basic principles that govern the wiring of the nervous system, throwing light on how robust cortical circuits are constructed from highly diverse populations of neurons -- an essential step towards understanding how the brain functions. They also underscore the value of the BBP's constructivist approach. "Although systematically integrating data across a wide range of scales is slow and painstaking, it allows us to derive fundamental principles of brain structure and hence function," explains Hill.



Story Source:

The above story is reprinted from materials provided by Ecole Polytechnique Fédérale de Lausanne, via EurekAlert!, a service of AAAS.


Precision motion tracking -- thousands of cells at once: Technique could open new windows into protozoan behavior, microbial diseases and fertility

ScienceDaily (Sep. 17, 2012) — Researchers have developed a new way to observe and track large numbers of rapidly moving objects under a microscope, capturing precise motion paths in three dimensions.

Over the course of the study -- reported online Sept. 17, 2012, in the Proceedings of the National Academy of Sciences -- researchers followed an unprecedented 24,000 rapidly moving cells over wide fields of view and through large sample volumes, recording each cell's path for as long as 20 seconds.

"We can very precisely track the motion of small things, more than a thousand of them at the same time, in parallel," says research lead and National Science Foundation CAREER awardee Aydogan Ozcan, an electrical engineering and bioengineering professor at UCLA. "We were able to achieve sub-micron accuracy over a large volume, allowing us to understand, statistically, how thousands of objects move in different ways."

The latest study is an extension of several years of NSF-supported work by Ozcan and his colleagues to develop lens-free, holographic microscopy techniques with applications for field-based detection of blood-borne diseases and other areas of tele-medicine.

For the recent work, Ozcan and his colleagues--Ting-Wei Su, also of UCLA, and Liang Xue, of both UCLA and Nanjing University of Science and Technology in China--used offset beams of red and blue light to create holographic information that, when processed using sophisticated software, accurately reveal the paths of objects moving under a microscope. The researchers tracked several cohorts of more than 1,500 human male gamete cells over a relatively wide field of view (more than 17 square millimeters) and large sample volume (up to 17 cubic millimeters) over several seconds.

The technique, along with a novel software algorithm that the team developed to process observational data, revealed previously unknown statistical pathways for the cells. The researchers found that human male gamete cells travel in a series of twists and turns along a constantly changing path that occasionally follows a tight helix--a spiral that, 90 percent of the time, is in a clockwise (right-handed) direction.

Because only four to five percent of the cells in a given sample traveled in a helical path at any given time, researchers would not have been able to observe the rare behavior without the new high-throughput microscopy technique.
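The handedness of a helical track like those described above can be estimated directly from discrete position data: the scalar triple product of successive displacement vectors has the sign of the local torsion, which is positive for a right-handed helix. A minimal sketch with synthetic data rather than the paper's measurements:

```python
# Toy sketch (not the paper's tracking pipeline): estimate the
# handedness of a 3D track from the sign of the discrete torsion --
# the scalar triple product of successive displacement vectors.
import numpy as np

def handedness(points):
    p = np.asarray(points, dtype=float)
    v = np.diff(p, axis=0)                       # displacement vectors
    triples = [np.dot(np.cross(v[k], v[k + 1]), v[k + 2])
               for k in range(len(v) - 2)]
    s = np.sign(np.sum(triples))
    return "right-handed" if s > 0 else "left-handed" if s < 0 else "flat"

# Synthetic right-handed helix standing in for a measured cell track:
t = np.linspace(0, 4 * np.pi, 50)
helix = np.c_[np.cos(t), np.sin(t), 0.2 * t]
print(handedness(helix))  # right-handed
```

Averaging the triple products over a whole track, as done here, makes the estimate tolerant of the twists and turns between helical stretches.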

"This latest study is an extension of truly novel and creative work," says Leon Esterowitz, the NSF biophotonics program officer who has supported Ozcan's efforts. "The holographic technique could accelerate drug discovery and prove valuable for monitoring pharmaceutical treatments of dangerous microbial diseases."

The PNAS paper reports observations of 24,000 cells over the duration of the experiments. Such a large number of observations provide a statistically significant dataset and a useful methodology for potentially studying a range of subjects, from the impact of pharmaceuticals and other substances on large numbers of cells--in real time--to fertility treatments and drug development.

The same approach may also enable scientists to study quick-moving, single-celled microorganisms. Many of the dangerous protozoa found in unsanitary drinking water and rural bodies of water have only been observed in small samples moving through an area that is roughly two dimensional. The new lens-free holographic imaging technique could potentially reveal unknown elements of protozoan behavior and allow real-time testing of novel drug treatments to combat some of the most deadly forms of those microbes.

Ozcan's research receives support from an NIH Director's New Innovator Award, Office of Naval Research Young Investigator Award and an Army Research Office Young Investigator Award from the Department of Defense.



Story Source:

The above story is reprinted from materials provided by National Science Foundation.


Journal Reference:

  1. Ting-Wei Su, Liang Xue, and Aydogan Ozcan. High-throughput lensfree 3D tracking of human sperms reveals rare statistics of helical trajectories. PNAS, September 17, 2012 DOI: 10.1073/pnas.1212506109


Dry-run experiments verify key aspect of nuclear fusion concept: Scientific 'break-even' or better is near-term goal

ScienceDaily (Sep. 17, 2012) — Magnetically imploded tubes called liners, intended to help produce controlled nuclear fusion at scientific "break-even" energies or better within the next few years, have functioned successfully in preliminary tests, according to a Sandia research paper accepted for publication by Physical Review Letters (PRL).

To exceed scientific break-even is the most hotly sought-after goal of fusion research, in which the energy released by a fusion reaction is greater than the energy put into it -- an achievement that would have extraordinary energy and defense implications.

That the liners survived their electromagnetic drubbing is a key step in stimulating further Sandia testing of a concept called MagLIF (Magnetized Liner Inertial Fusion), which will use magnetic fields and laser pre-heating in the quest for energetic fusion.

In the dry-run experiments just completed, cylindrical beryllium liners remained reasonably intact as they were imploded by the huge magnetic field of Sandia's Z machine, the world's most powerful pulsed-power accelerator. Had they overly distorted, they would have proved themselves incapable of shoveling together nuclear fuel -- deuterium and possibly tritium -- to the point of fusing them. Sandia researchers expect to add deuterium fuel in experiments scheduled for 2013.

"The experimental results -- the degree to which the imploding liner maintained its cylindrical integrity throughout its implosion -- were consistent with results from earlier Sandia computer simulations," said lead researcher Ryan McBride. "These predicted MagLIF will exceed scientific break-even."

A simulation published in a 2010 Physics of Plasmas article by Sandia researcher Steve Slutz showed that a tube enclosing preheated deuterium and tritium, crushed by the large magnetic fields of the 25-million-ampere Z machine, would yield slightly more energy than is inserted into it.

A later simulation, published last January in PRL by Slutz and Sandia researcher Roger Vesey, showed that a more powerful accelerator generating 60 million amperes or more could reach "high-gain" fusion conditions, where the fusion energy released greatly exceeds (by more than 1,000 times) the energy supplied to the fuel.

These goals -- both the near-term goal of scientific break-even on today's Z machine and the long-term goal of high-gain fusion on a future, more powerful machine -- require the metallic liners to maintain sufficient cylindrical integrity while they implode.

The liner is intended to contain fusion fuel like a can holds peanut butter, and push it together in nanoseconds like two semicylindrical shovels compacting snow together.

An element of drama is present because the metallic liner doing the compressing is also being eaten away as it conducts the Z machine's enormous electrical current along its outer surface. This electrical current generates the corresponding magnetic field that crushes the liner, but under the stress of passing that current, the outer surface of the liner begins to vaporize and turn into plasma, in much the same way as a car fuse vaporizes when a short circuit sends too much current through it. As this happens, the surface begins to lose integrity and becomes unstable. This instability works its way inward, toward the liner's inner surface, throughout the course of the implosion.

"You might say: The race is on," said McBride. "The question is, can we start off with a thick enough tube such that we can complete the implosion and burn the fusion fuel before the instability eats its way completely through the liner wall?

"A thicker tube would be more robust in standing up to this instability, but the implosion would be less efficient because Z would have to accelerate more liner mass. On the flip side, a thinner tube could be accelerated to a much higher implosion velocity, but then the instability would rip the liner to shreds and render it useless," he continues. "Our experiments were designed to test a sweet spot predicted by the simulations where a sufficiently robust liner could implode with a sufficiently high velocity."

By following the dimensions proposed by the earlier simulations, the physical test proved successful and the liner walls maintained their integrity throughout the implosion. Radiographs taken at nanosecond intervals depicted the implosion of the initially solid beryllium liner through to stagnation -- the point at which an implosion stops because the liner material has reached the cylinder's central axis. The images show the outer surface of the imploding liner distorting until it resembles threads on a bolt. However, the more crucial inner surface remains reasonably intact all the way through to stagnation.

Said McBride's manager Dan Sinars, "When Magnetized Liner Inertial Fusion was first proposed, our biggest concern was whether the instabilities would disrupt the target before fusion reactions could occur. We had complex computer simulations that suggested things would be OK, but we were not confident in those predictions. Then McBride did his experiments, using liners with the same dimensions as our simulations, and the outcomes matched. We are now confident enough to take the next steps on the Z facility of integrating in the new magnetic field and laser preheat capabilities that will be required to test the full concept. Consequently, we intend to take those first integration steps in 2013."

Slated for December are the first tests of the final two components of the MagLIF concept: laser preheating to put more energy into the fuel before magnetic compression begins, and the testing of two secondary electrical coils placed at the top and bottom of the can. Their magnetic fields are expected to keep charged particles from escaping the hot fuel horizontally. This is crucial because if too many particles escape, the fuel could cool to the point where fusion reactions cease.

Sandia researchers intend to test the fully integrated MagLIF concept by the close of 2013.

"This work is one more step on a long path to possible energy applications," said Sandia senior manager Mark Herrmann.

The liner implosion experiments also served to verify that simulation tools like the popular LASNEX code are accurate within certain parameters, but may diverge when used beyond those limits -- information of importance to other labs that use the same codes. McBride will give an invited talk on his work this fall at the American Physical Society's annual Division of Plasma Physics meeting in Providence, RI. He is also preparing an invited paper for Physics of Plasmas to explain the PRL results in greater depth.

The work was funded by Sandia's Laboratory Directed Research and Development program and the National Nuclear Security Administration.



Story Source:

The above story is reprinted from materials provided by DOE/Sandia National Laboratories.


Sex matters: Men recognize cars and women recognize living things best, psychological analysis finds

ScienceDaily (Sep. 17, 2012) — Women are better than men at recognizing living things and men are better than women at recognizing vehicles.

That is the unanticipated result of an analysis Vanderbilt psychologists performed on data from a series of visual recognition tasks collected in the process of developing a new standard test for expertise in object recognition.

"Our motivation was to assess the role that expertise plays in object recognition with a new test that includes many different categories, so we weren't looking for this result," said Professor of Psychology Isabel Gauthier. She directs the lab where post-doctoral fellow Rankin McGugin conducted the study.

"These results aren't definitive, but they are consistent with the following story," said Gauthier. "Everyone is born with a general ability to recognize objects and the capability to get really good at it. Nearly everyone becomes expert at recognizing faces, because of their importance for social interactions. Most people also develop expertise for recognizing other types of objects due to their jobs, hobbies or interests. Our culture influences which categories we become interested in, which explains the differences between men and women."

The results were published online on Aug. 3 in the journal Vision Research in an article titled "The Vanderbilt Expertise Test Reveals Domain-General and Domain-Specific Sex Effects in Object Recognition."

"This isn't the first time that sex differences have been found in perceptual tasks. For example, previous studies have shown that men have an advantage in mental rotation tasks. In fact, a recent study looking only at car recognition found that men were better than women but attributed this to the male advantage in mental rotation. Our finding that women are better than men at recognizing objects in other categories suggests that this explanation is incorrect."

Discovery of the sex effect in object recognition also casts doubt on several studies that claim an individual's ability to recognize faces is largely independent of his or her ability to recognize objects.

"Face recognition abilities are exciting to study because they have been found to have a clear genetic basis," said Gauthier, "and many studies conclude that abilities in face recognition are not predicted by abilities in object recognition. But this is usually based on comparing faces to only one object category for men and women."

It took the multi-category analysis to reveal that face recognition ability is correlated with the ability to recognize different object categories in men and women. For example, men who are better at recognizing vehicles also tend to be better at recognizing faces, while women who are better at recognizing living things tend to be better at recognizing faces.

The researchers modeled their new test after the well-established Cambridge Face Memory Task, which effectively measures a person's ability to recognize faces. After familiarizing themselves with a number of images, participants are shown three images at a time -- one from the study group and two that they haven't seen before -- and then are asked to pick out the image that they had studied.
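The trial structure described above -- one studied item shown alongside two unseen foils -- is a three-alternative forced-choice (3AFC) design, so chance performance is one in three. A minimal sketch of how such trials could be scored (the item names and response functions are hypothetical, not the Vanderbilt test's code):

```python
# Toy 3AFC scorer: each trial presents one studied item plus two foils
# in shuffled order; `respond` is the participant's choice function.
import random

def run_trials(studied, foils, respond, n_trials=100, seed=0):
    """Fraction of 3AFC trials in which `respond` picks the studied item."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        target = rng.choice(studied)               # item seen during study
        options = [target] + rng.sample(foils, 2)  # plus two unseen foils
        rng.shuffle(options)
        if respond(options) == target:
            correct += 1
    return correct / n_trials

# Hypothetical stimulus names for one category of the test:
studied = ["owl_a", "owl_b", "owl_c"]
foils = ["owl_x", "owl_y", "owl_z", "owl_w"]

# A perfect observer who remembers the studied set scores 1.0;
# blind guessing hovers around chance (1/3).
expert = lambda options: next(o for o in options if o in studied)
print(run_trials(studied, foils, expert))  # 1.0
```

Comparing a participant's fraction correct against the 1/3 chance floor, category by category, is what lets a test like this separate genuine expertise from guessing.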

While one goal of the new study was to compare object and face recognition skills, another goal was to develop a better way to measure who has exceptional skills in one domain: how to find the experts in the recognition of cars or birds or even mushrooms. To do this, the Vanderbilt researchers reasoned that performance on any category of interest needed to be compared to performance on many other categories, to ensure that the self-proclaimed bird expert is not only better with birds than most people, but also better with birds than with most other categories. So they designed the new test with eight categories of visually similar objects: leaves, owls, butterflies, wading birds, mushrooms, cars, planes and motorcycles.

To evaluate the new test, they administered it to 227 subjects -- 75 male and 82 female -- with a mean age of 23. When the results of the entire group were analyzed, the researchers found that increasing the number of categories revealed a large sex difference: Women proved significantly better at recognizing living things while men were better at recognizing vehicles. In addition, the researchers administered a face recognition test to about half of the participants, which allowed them to determine the correlation between vehicle recognition and face recognition in men and the correlation between recognition of living things and faces in women.

Vanderbilt post-doctoral fellow Jennifer Richler, Grit Herzmann, a post-doctoral fellow at the University of Colorado, Boulder, and research assistant Magen Speegle also contributed to the study.

The research was supported by the National Eye Institute grants EYO13441-06A2 and P30-EY008126 and by the National Science Foundation's Temporal Dynamics of Learning Center grant SBE-0542013.



Story Source:

The above story is reprinted from materials provided by Vanderbilt University.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. Rankin W. McGugin, Jennifer J. Richler, Grit Herzmann, Magen Speegle, Isabel Gauthier. The Vanderbilt Expertise Test reveals domain-general and domain-specific sex effects in object recognition. Vision Research, 2012; 69: 10 DOI: 10.1016/j.visres.2012.07.014

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.

18 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/top_news/top_science/~3/GiRG07ulFyE/120917132022.htm

Mobile phones and wireless networks: No evidence of health risk found, Norwegian experts find

ScienceDaily (Sep. 17, 2012) — There is no scientific evidence that low-level electromagnetic field exposure from mobile phones and other transmitting devices causes adverse health effects, according to a report presented by a Norwegian Expert Committee. In addition, the Committee provides advice to authorities about risk management and regulatory practice.

The Committee has assessed the health hazards from low-level electromagnetic fields generated by radio transmitters. These electromagnetic fields are found around mobile phones, wireless phones and networks, mobile phone base stations, broadcasting transmitters and other communications equipment. The Committee has evaluated the power of the fields, whether they pose a health risk, the current regulatory practice, and whether the threshold limit values for exposure are observed.

The report is entitled "Svake høyfrekvente elektromagnetiske felt -- en vurdering av helserisiko og forvaltningspraksis. FHI-rapport 2012:3" (In English: Low-level radiofrequency electromagnetic fields -- an assessment of health risks and evaluation of regulatory practice. NIPH report 2012:3). Published on September 13th, the report contains a Norwegian and English summary.

Studied electromagnetic fields below threshold limit values

The low-level electromagnetic fields generated when antennas in mobile phones and other wireless devices transmit radio signals are referred to as radiofrequency (RF) fields.

The health authorities have determined that the threshold limit values for electromagnetic fields around transmitters in mobile phones and other equipment should be the same as those recommended by the International Commission on Non-ionising Radiation Protection (ICNIRP). The threshold limit values are based on fields above a certain power that can cause harmful heating of tissue. The ICNIRP has not observed other adverse health effects under this level.

The threshold limit values for these fields are set at one-fiftieth of the level that causes heating of human tissue or stimulation of nerve cells. Due to increasing public concern, the government requested the appointment of an Expert Committee to assess whether such low-level electromagnetic fields could cause health effects.

The Norwegian Institute of Public Health was commissioned to appoint the Expert Committee by the Ministry of Health and Care Services and the Ministry of Transport and Communication. The Committee was chaired by Professor Jan Alexander, Assistant Director-General at the Institute.

Research indicates no health risk

The Committee has assessed a number of possible health effects from low-level electromagnetic fields and has evaluated the research in each area.

The group found no evidence that the low-level fields around mobile phones and other transmitters increase the risk of cancer, impair male fertility, cause other reproductive damage or lead to other diseases and adverse health effects, such as changes to the endocrine and immune systems.

No cancer risk found

Most studies concerning cancer have focused on the risk of cancer in the head and neck. The Committee found no scientific evidence for an association between mobile phone use and fast-growing brain tumours. So far, the effect on slow-growing tumours has been studied in people who have used mobile phones for up to 20 years. These studies show no association.

Only limited data exist for the other types of cancer in the head and neck area, as well as for leukaemia and lymphoma, but so far there is no evidence of an increased risk from mobile phone use. Cancer registries have not observed an increase in these tumours in the population since mobile phones were introduced.

Electromagnetic hypersensitivity

The Committee did not find that mobile phones and other equipment can cause health problems such as electromagnetic hypersensitivity.

Does this mean that electromagnetic hypersensitivity is an imaginary problem? "We have no grounds to say that the symptoms are imaginary. But a large number of studies suggest that these symptoms must have causes other than the physical effects of low-level electromagnetic fields around mobile phones, wireless transmitters and other wireless equipment. Research provides no evidence that interventions such as reducing the use of mobile phones or wireless networks help. Our opinion is that patients with these health problems must be taken seriously by the health service and should be treated as other patients are. There is a need for greater expertise in the health service for this group of patients," says Alexander.

Many people have found that holding a mobile phone to the head causes the area around the ear to become hot -- is this due to electromagnetic radiation? "The skin warms up slightly due to heat from the battery and not from the radio transmitter in the phone. The electromagnetic field will have very little or no heating effect. The body will remove the heat through normal blood flow, in the same way as the body otherwise regulates temperature."

Some mobile phone models transmitting at maximum power provide exposure that comes close to the threshold limit values. Even so, any heating due to electromagnetic fields would be negligible.

Advice: Show general caution

Since there are no uncertainties in the health risk assessment of low-level electromagnetic fields that warrant introduction of the precautionary principle, the Committee believes that general caution is sufficient. This means that exposure should not be higher than needed to achieve the intended purpose.

When comparing the power of the fields around different types of equipment, talking on a mobile phone tops the list, whilst wireless internet networks are at the bottom. Base stations and broadcasting transmitters also come low on the list. An example of exercising general caution would be for the authorities to inform the public that hands-free kits significantly reduce exposure from mobile phones.

Furthermore, the field strength around a mobile phone is lower when there is good coverage.
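Both observations, the handset topping the exposure list and base stations ranking low, follow largely from the inverse-square falloff of radiated power with distance. A rough free-space sketch (idealized, with hypothetical but plausible power levels; real exposure also depends on antenna patterns, tissue absorption and adaptive power control):

```python
import math

def power_density(p_watts, r_meters):
    """Free-space power density S = P / (4 * pi * r^2), in W/m^2."""
    return p_watts / (4 * math.pi * r_meters ** 2)

# A 0.25 W handset 2 cm from the head versus a 20 W base-station
# antenna 50 m away (hypothetical magnitudes):
phone = power_density(0.25, 0.02)   # roughly 50 W/m^2 at the ear
mast = power_density(20.0, 50.0)    # well under 0.001 W/m^2 at street level

# A hands-free kit moves the antenna from ~2 cm to tens of centimetres,
# cutting exposure by the square of the distance ratio; with good
# coverage, adaptive power control also transmits far less than 0.25 W.
hands_free = power_density(0.25, 0.30)   # 225x lower than at 2 cm
```

The distance dependence alone explains the ranking: the nearest transmitter dominates exposure, even when a distant one radiates far more total power.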

Little benefit from more research

The Committee has evaluated the assessments previously published by international expert groups, as well as recent individual studies. The material is very extensive. A number of studies were performed on cells and tissues in the laboratory, as well as in animals and humans. In addition, population studies and cancer registry studies were conducted in several countries.

Little uncertainty

There is always an element of uncertainty in all risk assessments. In this case, the Committee considers the uncertainty to be small. Some uncertainty is associated with high exposure over time, such as extensive use of mobile phones over several decades. Until now, this has been impossible to study. Cancer registries should follow the development of cancer incidence in the future and research should not cease. Studies of animals that have been exposed throughout life provide no evidence that low level RF fields cause cancer. It is unlikely that long-term use of mobile phones will cause health risks that are unknown today.

Regarding equipment that provides the lowest exposure, such as base stations, wireless networks, broadcasting transmitters and proximity to other mobile phones, the experts believe that the risk assessment has negligible uncertainty. In other words, it is reasonably certain that such equipment is not associated with health risks.

The report is approximately 200 pages long and includes Norwegian and English summaries. It can be downloaded in PDF format at http://www.fhi.no.



Story Source:

The above story is reprinted from materials provided by Norwegian Institute of Public Health.


17 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/top_news/top_science/~3/Y7TfRQUL5FM/120917085525.htm

World’s most powerful digital camera opens eye, records first images in hunt for dark energy

ScienceDaily (Sep. 17, 2012) — Eight billion years ago, rays of light from distant galaxies began their long journey to Earth. That ancient starlight has now found its way to a mountaintop in Chile, where the newly-constructed Dark Energy Camera, the most powerful sky-mapping machine ever created, has captured and recorded it for the first time.

That light may hold within it the answer to one of the biggest mysteries in physics -- why the expansion of the universe is speeding up.

Scientists in the international Dark Energy Survey collaboration announced this week that the Dark Energy Camera, the product of eight years of planning and construction by scientists, engineers, and technicians on three continents, has achieved first light. The first pictures of the southern sky were taken by the 570-megapixel camera on Sept. 12.

"The achievement of first light through the Dark Energy Camera begins a significant new era in our exploration of the cosmic frontier," said James Siegrist, associate director of science for high energy physics with the U.S. Department of Energy. "The results of this survey will bring us closer to understanding the mystery of dark energy, and what it means for the universe."

The Dark Energy Camera was constructed at the U.S. Department of Energy's (DOE) Fermi National Accelerator Laboratory in Batavia, Illinois, and mounted on the Victor M. Blanco telescope at the National Science Foundation's Cerro Tololo Inter-American Observatory (CTIO) in Chile, which is the southern branch of the U.S. National Optical Astronomy Observatory (NOAO). With this device, roughly the size of a phone booth, astronomers and physicists will probe the mystery of dark energy, the force they believe is causing the universe to expand faster and faster.

"The Dark Energy Survey will help us understand why the expansion of the universe is accelerating, rather than slowing due to gravity," said Brenna Flaugher, project manager and scientist at Fermilab. "It is extremely satisfying to see the efforts of all the people involved in this project finally come together."

The Dark Energy Camera is the most powerful survey instrument of its kind, able to see light from over 100,000 galaxies up to 8 billion light-years away in each snapshot. The camera's array of 62 charge-coupled devices (CCDs) has an unprecedented sensitivity to very red light, and along with the Blanco telescope's large light-gathering mirror (which spans 13 feet across), will allow scientists from around the world to pursue investigations ranging from studies of asteroids in our own Solar System to the understanding of the origins and the fate of the universe.

"We're very excited to bring the Dark Energy Camera online and make it available for the astronomical community through NOAO's open access telescope allocation," said Chris Smith, director of the Cerro-Tololo Inter-American Observatory. "With it, we provide astronomers from all over the world a powerful new tool to explore the outstanding questions of our time, perhaps the most pressing of which is the nature of dark energy."

Scientists in the Dark Energy Survey collaboration will use the new camera to carry out the largest galaxy survey ever undertaken, and will use that data to carry out four probes of dark energy, studying galaxy clusters, supernovae, the large-scale clumping of galaxies and weak gravitational lensing. This will be the first time all four of these methods will be possible in a single experiment.

The Dark Energy Survey is expected to begin in December, after the camera is fully tested, and will take advantage of the excellent atmospheric conditions in the Chilean Andes to deliver pictures with the sharpest resolution seen in such a wide-field astronomy survey. In just its first few nights of testing, the camera has already delivered images with excellent and nearly uniform spatial resolution.

Over five years, the survey will create detailed color images of one-eighth of the sky, or 5,000 square degrees, to discover and measure 300 million galaxies, 100,000 galaxy clusters and 4,000 supernovae.
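The quoted coverage is easy to check: the celestial sphere spans 4*pi steradians, about 41,253 square degrees, so 5,000 square degrees is indeed close to one-eighth of the sky:

```python
import math

# Full-sky solid angle: 4*pi steradians, converted to square degrees
# via (180/pi)^2 square degrees per steradian.
full_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2   # ~41,253 sq deg

survey_sq_deg = 5000
fraction = survey_sq_deg / full_sky_sq_deg   # ~0.121, close to 1/8 (0.125)
```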

The Dark Energy Survey is supported by funding from the U.S. Department of Energy; the National Science Foundation; funding agencies in the United Kingdom, Spain, Brazil, Germany and Switzerland; and the participating DES institutions.

More information about the Dark Energy Survey, including the list of participating institutions, is available at the project website: www.darkenergysurvey.org.



Story Source:

The above story is reprinted from materials provided by DOE/Fermi National Accelerator Laboratory.


17 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/top_news/top_science/~3/hAkNWkmg9bc/120917104651.htm

Skilled hunters 300,000 years ago

Finds from an early Stone Age site in north-central Germany show that human ingenuity is nothing new -- and was probably shared by now-extinct species of humans. Archeologists have found eight extremely well-preserved spears -- an astonishing 300,000 years old, making them the oldest known weapons anywhere. The spears and other artifacts, as well as animal remains found at the site, demonstrate that their users were highly skilled craftsmen and hunters, well adapted to their environment -- with a capacity for abstract thought and complex planning comparable to our own.

17 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/top_news/top_science/~3/dAEO2oAZuxw/120917085535.htm

Most coral reefs are at risk unless climate change is drastically limited, study shows

ScienceDaily (Sep. 16, 2012) — Coral reefs face severe challenges even if global warming is restricted to the 2 degrees Celsius commonly perceived as safe for many natural and human-made systems. Warmer sea surface temperatures are likely to trigger more frequent and more intense mass coral bleaching events. Only under a scenario with strong action on mitigating greenhouse-gas emissions, and assuming that corals can adapt at extremely rapid rates, could two thirds of them be safe, a study now published in Nature Climate Change shows. Otherwise, all coral reefs are expected to suffer severe degradation.

Coral reefs house almost a quarter of the species in the oceans and provide critical services -- including coastal protection, tourism and fishing -- to millions of people worldwide. Global warming and ocean acidification, both driven by human-caused CO2 emissions, pose a major threat to these ecosystems.

"Our findings show that under current assumptions regarding thermal sensitivity, coral reefs might no longer be prominent coastal ecosystems if global mean temperatures actually exceed 2 degrees Celsius above the pre-industrial level," says lead author Katja Frieler from the Potsdam Institute for Climate Impact Research. "Without a yet uncertain process of adaptation or acclimation, however, already about 70% of corals are projected to suffer from long-term degradation by 2030 even under an ambitious mitigation scenario." Thus, the threshold to protect at least half of the coral reefs worldwide is estimated to be below 1.5 degrees Celsius mean temperature increase.

A more comprehensive and robust representation than in previous studies

This study is the first comprehensive global survey of coral bleaching to express results in terms of global mean temperature change. It was conducted by scientists from Potsdam, the University of British Columbia in Canada and the Universities of Melbourne and Queensland in Australia. To project the cumulative heat stress at 2,160 reef locations worldwide, they used an extensive set of 19 global climate models. By applying different emission scenarios covering the 21st century and multiple climate model simulations, a total of more than 32,000 simulation years was analyzed. This allows for a more robust representation of uncertainty than in any previous study.

Corals derive most of their energy, as well as most of their famous color, from a close symbiotic relationship with a special type of microalgae. The vital symbiosis between coral and algae can break down when stressed by warm water temperatures, making the coral "bleach" or turn pale. Though corals can survive this, if the heat stress persists long enough the corals can die in great numbers. "This happened in 1998, when an estimated 16% of corals were lost in a single, prolonged period of warmth worldwide," says Frieler.
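Cumulative heat stress of the kind behind such bleaching events is commonly quantified as degree heating weeks (DHW): the sum of weekly sea-surface-temperature excesses above a local bleaching threshold over a rolling 12-week window. A minimal sketch of that standard NOAA-style metric (not necessarily the study's exact formulation; the temperatures below are hypothetical):

```python
def degree_heating_weeks(weekly_sst, bleaching_threshold):
    """Sum the positive weekly SST anomalies (deg C) above the threshold
    over the most recent 12 weeks. As a rule of thumb, 4 DHW is
    associated with significant bleaching and 8 DHW with mortality."""
    recent = weekly_sst[-12:]
    return sum(max(0.0, t - bleaching_threshold) for t in recent)

# Hypothetical reef with a 29.0 C bleaching threshold and a warm spell:
sst = [28.2, 28.5, 28.9, 29.4, 29.8, 30.1,
       30.0, 29.6, 29.2, 28.8, 28.6, 28.4]
dhw = degree_heating_weeks(sst, 29.0)   # 0.4+0.8+1.1+1.0+0.6+0.2 = 4.1
```

Because the metric accumulates excess heat rather than peak temperature, a long moderate warm spell can stress corals as much as a short extreme one.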

Adaptation is uncertain and ocean acidification means even more stress

To account for a possible acclimation or adaptation of corals to thermal stress, like shifts to symbiont algae with a higher thermal tolerance, rather optimistic assumptions have been included in the study. "However, corals themselves have all the wrong characteristics to be able to rapidly evolve new thermal tolerances," says co-author Ove Hoegh-Guldberg, a marine biologist at the University of Queensland in Australia. "They have long lifecycles of 5-100 years and they show low levels of diversity due to the fact that corals can reproduce by cloning themselves. They are not like fruit flies which can evolve much faster."

Previous analyses estimated the effect of thermal adaptation on bleaching thresholds, but not the possible opposing effect of ocean acidification. Seawater gets more acidic when taking up CO2 from the atmosphere. This is likely to act to the detriment of the calcification processes crucial for the corals' growth and might also reduce their thermal resilience. The new study investigates the potential implications of this ocean acidification effect, finding that, as Hoegh-Guldberg says: "The current assumptions on thermal sensitivity might underestimate, not overestimate, the future impact of climate change on corals."

This comprehensive analysis highlights how close we are to a world without coral reefs as we know them. "The window of opportunity to preserve the majority of coral reefs, part of the world's natural heritage, is small," summarizes Malte Meinshausen, co-author at the Potsdam Institute for Climate Impact Research and the University of Melbourne. "We close this window if we follow another decade of ballooning global greenhouse-gas emissions."



Story Source:

The above story is reprinted from materials provided by Potsdam Institute for Climate Impact Research (PIK).



Journal Reference:

  1. K. Frieler, M. Meinshausen, A. Golly, M. Mengel, K. Lebek, S. D. Donner, O. Hoegh-Guldberg. Limiting global warming to 2 °C is unlikely to save most coral reefs. Nature Climate Change, 2012; DOI: 10.1038/NCLIMATE1674



17 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/top_news/top_science/~3/Q1lxAUZ2UFk/120916160926.htm