Different tonsillectomy techniques may result in fewer complications

Alexandria, VA — In a review of three surgical techniques commonly used for tonsillectomy, the microdebrider technique (in which a rotary cutting tool is used to shave tissue) had the lowest overall complication rate of the three. The findings appear in new research published in the June 2010 issue of Otolaryngology — Head and Neck Surgery.

Tonsillectomy is among the oldest and most commonly performed procedures in the pediatric population. Approximately 530,000 outpatient pediatric adenotonsillectomies (which include removal of the adenoids) are performed annually in U.S. hospitals. Despite the frequency of the procedure, there is no universally accepted “ideal” method. And although generally considered a safe procedure, tonsillectomy has significant potential for complications, especially in the pediatric population. Potential major complications include post-operative hemorrhage, dehydration, and anesthetic and airway risks. Common post-operative complaints like odynophagia (painful swallowing), otalgia (ear pain), fever, and uvular swelling tend to prolong the pediatric patient’s recovery.

The objective of the new study was to determine whether surgical technique is associated with post-surgical complications after adenotonsillectomy, and to identify patients who may be “at risk” for major complications after such surgery.

The case-control study included 4,776 patients aged 1 to 18 years who underwent adenoidectomy, tonsillectomy, or adenotonsillectomy over a 36-month period. The three techniques compared were microdebrider, coblation (a controlled, non-heat-driven process using radiofrequency energy), and electrocautery (destruction of tissue by heat conducted from an electrically heated metal probe).

Study results showed a statistically significant difference in the risk of major complications among tonsillectomies performed with coblation, electrocautery, and microdebrider. The microdebrider had the lowest overall complication rate, 0.7 percent, versus 2.8 percent for the coblator and 3.1 percent for electrocautery. The authors reported that patients who had tonsillectomies via coblation were 3.9 times more likely to have complications than those whose tonsils were removed via microdebrider, and patients who had tonsillectomies via electrocautery were 4.4 times more likely.
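As a back-of-the-envelope check (not the authors’ statistical method, which likely involved adjusted estimates), the reported risk multipliers are close to simple ratios of the raw complication rates:

```python
# Illustrative only: simple ratios of the reported overall complication
# rates, using the microdebrider as the baseline. The published 3.9x and
# 4.4x figures are likely adjusted, but the raw ratios land nearby.
rates = {"microdebrider": 0.7, "coblation": 2.8, "electrocautery": 3.1}  # percent

baseline = rates["microdebrider"]
for technique, rate in rates.items():
    ratio = rate / baseline
    print(f"{technique}: {ratio:.1f}x the microdebrider complication rate")
# coblation works out to 4.0x and electrocautery to 4.4x the baseline rate
```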

“Questions will remain regarding what is the best procedure,” said study author Craig S. Derkay, MD. “However, an important point is that no matter which surgical technique was used for removal of the tonsils in the study, our results demonstrate an acceptable level of safety across all procedures.”

In terms of risk factors for post-surgical complications, patients’ age proved to be a significant factor in their treatment and outcome following tonsillectomy. Comparing the average ages of patients who experienced dehydration and post-tonsillectomy hemorrhage, the authors observed that younger patients had more episodes of dehydration, while older patients tended to have more episodes of post-tonsillectomy hemorrhage.

The authors urge continued study in the area of adenotonsillectomy surgical techniques, but noted that the large cohort of patients in their study adds to the literature supporting the safety of intracapsular tonsillectomy.

Otolaryngology — Head and Neck Surgery is the official scientific journal of the American Academy of Otolaryngology — Head and Neck Surgery Foundation (AAO-HNSF). The study’s authors are Thomas Q. Gallagher, DO, Lyndy Wilcox, BA, Erin McGuire, BS, and Craig S. Derkay, MD.

Reporters who wish to obtain a copy of the study should contact [email protected].

About the AAO-HNS

The American Academy of Otolaryngology — Head and Neck Surgery (www.entnet.org), one of the oldest medical associations in the nation, represents nearly 11,000 physicians and allied health professionals who specialize in the diagnosis and treatment of disorders of the ears, nose, throat, and related structures of the head and neck. The Academy serves its members by facilitating the advancement of the science and art of medicine related to otolaryngology and by representing the specialty in governmental and socioeconomic issues. The organization’s vision: “Empowering otolaryngologist-head and neck surgeons to deliver the best patient care.”

TV food advertisements promote imbalanced diets

St. Louis, MO, June 1, 2010 — Making food choices based on television advertising results in a very imbalanced diet, according to a new study, published in the June issue of the Journal of the American Dietetic Association, that compared the nutritional content of food choices influenced by television with established nutritional guidelines.

Investigators found that a 2,000-calorie diet consisting entirely of advertised foods would contain 25 times the recommended servings of sugars and 20 times the recommended servings of fat, but less than half of the recommended servings of vegetables, dairy, and fruits. In fact, the excess of servings in sugars and fat is so large that, on average, eating just one of the observed food items would provide more than three times the recommended daily servings (RDS) for sugars and two and a half times the RDS for fat for the entire day.

“The results of this study suggest the foods advertised on television tend to oversupply nutrients associated with chronic illness (eg, saturated fat, cholesterol, and sodium) and undersupply nutrients that help protect against illness (eg, fiber, vitamins A, E, and D, calcium, and potassium),” according to lead investigator Michael Mink, PhD, Assistant Professor and MPH Program Coordinator, Armstrong Atlantic State University, Savannah, GA.

Researchers analyzed 84 hours of primetime and 12 hours of Saturday morning broadcast television over a 28-day period in 2004. ABC, CBS, Fox and NBC were sampled on a rotating basis to develop a complete profile of each network. The Saturday-morning cartoon segment (from 8:00 am to 11:00 am) was included to capture food advertisements marketed primarily to children.

All 96 hours of observations were videotaped and reviewed later to identify food advertisements and specific food items being promoted. Only food items that were clearly promoted for sale during an advertisement were recorded. Each food item was then analyzed for nutritional content. Observed portion sizes were converted to the number of servings.

The article indicates that the observed food items fail to comply with Food Guide Pyramid recommendations in every food group except grains. The average observed food item contained excessive servings of sugars, fat, and meat and inadequate servings of dairy, fruit and vegetables. The situation was similar for essential nutrients, with the observed foods oversupplying eight nutrients: protein, selenium, sodium, niacin, total fat, saturated fat, thiamin and cholesterol. These same foods undersupplied 12 nutrients: iron, phosphorus, vitamin A, carbohydrates, calcium, vitamin E, magnesium, copper, potassium, pantothenic acid, fiber, and vitamin D.

The authors advocate nutritional warnings for imbalanced foods similar to those mandated on direct-to-consumer drug advertisements. They recommend investigating health promotion strategies that target consumers, the food industry, public media, and regulation focusing on a three-pronged approach.

“First, the public should be informed about the nature and extent of the bias in televised food advertisements. Educational efforts should identify the specific nutrients that tend to be oversupplied and undersupplied in advertised foods and should specify the single food items that surpass an entire day’s worth of sugar and fat servings. Second, educational efforts should also provide consumers with skills for distinguishing balanced food selections from imbalanced food selections. For example, interactive websites could be developed that test a participant’s ability to identify imbalanced food selections from a list of options. This type of game-based approach would likely appeal to youth and adults. Third, the public should be directed to established nutritional guidelines and other credible resources for making healthful food choices.”

The article is “Nutritional Imbalance Endorsed by Televised Food Advertisements” by Michael Mink, PhD, Alexandra Evans, PhD, Charity G. Moore, PhD, Kristine S. Calderon, PhD, CHES, and Shannon Cosgrove, MPH, CHES. It appears in the Journal of the American Dietetic Association, Volume 110, Issue 6 (June 2010) published by Elsevier.

Flies offer insight into human metabolic disease

Galactosemia is a metabolic disease resulting from an inherited defect that prevents the proper metabolism of galactose, a sugar commonly found in dairy products such as milk. Exposure of affected people to galactose can damage most of their organ systems and can be fatal. The ability to study the disease has been limited by a lack of animal models. New information suggests that similarities between humans and flies may provide scientists with useful clues.

The inability to break down simple sugars in common foods, such as milk, can lead to the accumulation of sugars in the blood, where they become toxic and damage a variety of organ systems. People with galactosemia, either classic galactosemia or epimerase deficiency galactosemia, have genetic mutations that decrease their levels of the key enzymes (GALT and GALE) responsible for the metabolism of a common form of dietary sugar. Without proper levels of these enzymes, these people are unable to process the sugar galactose, which makes up about half of the calories in milk. Both disorders can have severe effects. Patients suffer from liver and brain damage, cataracts, and kidney failure. The disease can be fatal. There is currently no cure, and prognosis and treatment remain ill-defined, partly due to the lack of a good animal model that scientists can use to study the disease and to develop potential treatments.

Discovery of treatments for galactosemia is complicated by the unique sensitivities among different organisms to defects in sugar metabolism. For example, galactose accumulation in mice does not have the same physiological consequences as it does in humans, limiting the applicability of mouse models and slowing advances in this area of research. The fly (Drosophila melanogaster) is a popular laboratory model organism that has been used for many decades and in numerous studies, including those of human metabolic disease.

Scientists at Emory University developed flies that carry genetic changes similar to those found in patients with galactosemia. Like patients with classic galactosemia, flies that are missing GALT survive if they are raised on food that does not contain galactose, but die in development if exposed to high levels of galactose. Flies with impaired GALE function also succumb in greater numbers when exposed to galactose during development, similar to patients with defects in the same area of their metabolic pathway. The Emory scientists also tested the relationship between the timing of galactose exposure with the fly’s outcome, and designed and characterized flies in which they could remove or control the production of GALT or GALE at variable points in the animal’s development to determine when and where the sugar breakdown was most needed. These models can help researchers understand how changes in sugar metabolism lead to disease and open the door to novel drug discovery by serving as a testing ground for candidate therapeutics.

This work is presented in corresponding Research Articles in Volume 3, issue 7/8 of the research journal Disease Models & Mechanisms (DMM), <http://dmm.biologists.org/>, published by The Company of Biologists, a non-profit organisation based in Cambridge, UK. The first article, entitled ‘A Drosophila melanogaster model of classic galactosemia’, was written by Rebekah Kushner, Emily Ryan, Jennifer Sefton, Rebecca Sanders, Patricia Jumbo Lucioni, Kenneth Moberg, and Judith Fridovich-Keil. The second article, entitled ‘UDP-galactose 4′ epimerase (GALE) is essential for development of Drosophila melanogaster’, was written by Rebecca Sanders, Jennifer Sefton, Kenneth Moberg, and Judith Fridovich-Keil.

About Disease Models & Mechanisms:

Disease Models & Mechanisms (DMM) (http://dmm.biologists.org) is a new research journal, launched in 2008, that publishes primary scientific research, as well as review articles, editorials, and research highlights. The journal’s mission is to provide a forum for clinicians and scientists to discuss basic science and clinical research related to human disease, disease detection and novel therapies. DMM is published by the Company of Biologists, a non-profit organization based in Cambridge, UK.

The Company also publishes the international biology research journals Development, Journal of Cell Science, and The Journal of Experimental Biology. In addition to financing these journals, the Company provides grants to scientific societies and supports other activities including travelling fellowships for junior scientists, workshops and conferences. The world’s poorest nations receive free and unrestricted access to the Company’s journals.

Concealed patterns beneath life’s variety

Although the tropics appear to the casual observer to be busily buzzing and blooming with life’s rich variety when compared with temperate and polar regions — a fact that scientists have thoroughly documented — the distribution of species in space and time actually varies around the globe in surprising and subtle ways. So explains Janne Soininen of the University of Helsinki in an article published in the June 2010 issue of BioScience.

Soininen explores a number of recent studies on the topic, synthesizing conclusions from thousands of observations. The studies focus on how the proportion of species that are present in both of two samples varies depending on the distance separating the sites where the samples were taken, or with the time separating two samplings of the species present at one place. Many such studies indicate that, as expected, the species mix turns over more in the tropics than closer to the poles. But it turns out that this is true only for studies that look at small areas — roughly, a square kilometer or less — or at periods of less than about a year. Studies that look at very large areas, or at multi-year changes, often find the opposite effect: turnover of species is higher close to the poles than in the tropics. Soininen suggests that changes in climate over large distances and over multi-year periods explain these paradoxical trends. Moreover, the data Soininen surveys imply that species turnover does not change in a straightforward way over distance and time, perhaps because of different interactions between the species that make up different ecosystems. Soininen suggests further studies to clarify these effects. Such work could shed light on the fundamental processes that assemble ecosystems.

By noon EST on 1 June 2010 and until early July, the full text of the article will be available for free download through the copy of this press release available at www.aibs.org/BioScience-press-releases/.

BioScience, published 11 times per year, is the journal of the American Institute of Biological Sciences (AIBS). BioScience publishes commentary and peer-reviewed articles covering a wide range of biological fields, with a focus on “Organisms from Molecules to the Environment.” The journal has been published since 1964. AIBS is an umbrella organization for professional scientific societies and organizations that are involved with biology. It represents some 200 member societies and organizations with a combined membership of about 250,000.

The complete list of peer-reviewed articles in the June 2010 issue of BioScience is as follows:

A Phylogenomic Perspective on the New Era of Ichthyology by Wei-Jen Chen and Richard L. Mayden

Species Turnover along Abiotic and Biotic Gradients: Patterns in Space Equal Patterns in Time? by Janne Soininen

Using Landscape Limnology to Classify Freshwater Ecosystems for Multi-ecosystem Management and Conservation by Patricia A. Soranno, Kendra Spence Cheruvelil, Katherine E. Webster, Mary T. Bremigan, Tyler Wagner, and Craig A. Stow

Not So Fast: Inflation in Impact Factors Contributes to Apparent Improvements in Journal Quality by Bryan D. Neff and Julian D. Olden

Wildfire and Management of Forests and Native Fishes: Conflict or Opportunity for Convergent Solutions? by Bruce E. Rieman, Paul F. Hessburg, Charles Luce, and Matthew R. Dare

ACR task force makes recommendations for improving relationships between radiologists and hospitals

The American College of Radiology’s (ACR) Task Force on Relationships between Radiology Groups and Hospitals and Other Healthcare Organizations has proposed several steps that can help improve relationships between radiologists and the health care systems that they serve, according to an article in the June issue of the Journal of the American College of Radiology (www.jacr.org).

“The vast majority of U.S. radiologists are affiliated with hospital-based group practices, making professional relationships between radiologists and hospitals one of the most crucial factors in building and maintaining successful and secure practices,” said Cynthia S. Sherry, MD, FACR, lead author of the article. “Yet lately tensions between hospitals and radiologists have been increasing,” she said.

The ACR has assembled a task force on relationships between radiology groups and hospitals and other health care organizations. The task force’s goal was not merely to identify problems but to propose positive steps that would benefit radiologists, hospitals, and the patients and communities they serve.

“Radiologists must re-dedicate themselves to the concept of service and be more visible to patients, referring physicians, and to the hospital administration. It is imperative for the survival of the specialty for radiologists to provide a ‘value added’ to the clinical evaluations and therapies of patients. This can entail expanded hours of onsite coverage, a greater number of available radiologists, more sub-specialization, and/or greater opportunities for consultations with referring physicians and their patients,” she said.

“Hospitals should place a high priority on nurturing a functional relationship with their radiology group. A successful relationship will go a long way toward laying a sound bedrock for a radiology service that is optimal for patients, referring physicians, and the administration. Furthermore, hospitals should recognize the core strategic value of a strong foundational radiology service and the critical importance of on-site involvement by radiologists,” said Sherry.

The June issue of JACR is an important resource for radiology and nuclear medicine professionals as well as students seeking clinical and educational improvement.

For more information about JACR, please visit www.jacr.org.

To receive an electronic copy of an article appearing in JACR or to set up an interview with a JACR author or another ACR member, please contact Heather Curry at 703-390-9822 or [email protected].

Geologist: Fla. ridges’ mystery marine fossils tied to rising land, not seas

GAINESVILLE, Fla. — Sea level has not been as high as the distinctive ridges that run down the length of Florida for millions of years. Yet recently deposited marine fossils abound in the ridges’ sands.

Now, a University of Florida geologist may have helped crack that mystery.

In a paper appearing June 1 in the June edition of the journal Geology, Peter Adams, a UF assistant professor of geological sciences, says his computer models of Florida’s changing land mass support this theory: The land that forms the sandy Trail Ridge running north to south from North Florida through South Georgia, as well as lesser-known ridges, was undersea at the time the fossils were deposited — but rose over time, reaching elevations that exceeded later sea level high stands.

“If you look at the best records, there’s no evidence that global sea level has come close to occupying the elevation of these fossils since the time of their emplacement,” Adams said, referring to Trail Ridge’s elevation today, nearly 230 feet above modern sea level. “The only thing that explains this conundrum is that Trail Ridge was underwater, but later rose to an elevation higher than subsequent sea levels.”

At the heart of the phenomenon are Florida’s unique weather patterns and geology, Adams said.

The state’s abundant rain contains a small amount of carbon dioxide, which forms carbonic acid in lake and river water. This slightly acidic water slowly eats away at Florida’s limestone bedrock, forming the karst topography for which Florida is so well known, replete with pockmarks, underground springs and subterranean caverns. The surface water washes the dissolved limestone out to sea, over time significantly lightening the portion of the Earth’s crust that covers Florida.

A mass of slow-moving mantle rock resides 6 to 18 miles below the crust. As the Florida land mass lightens, this mantle pushes upward to equilibrate the load, forcing Florida skyward, Adams said. The process is known as isostatic rebound, or isostatic uplift.

“It’s just like what happens when you get out of bed in the morning. The mattress springs raise the surface of the bed back up,” Adams said, adding that the uplift is similar to what takes place when glaciers retreat, with Maine and Norway, for example, also gaining elevation.

Glaciers melt off the land surface to drive isostatic uplift. But in Florida, varying rainfall rates during different periods have slowed or quickened the karstification just below the land. This has in turn slowed or quickened the mantle’s push up from below. Additionally, sea level high stands do not always return to the same elevation, which creates a complex history of which beach ridges are preserved and which aren’t, Adams said.

For instance, during periods when sea level rose quickly, some pre-existing ridges were overtaken and wiped out. During other periods, however, when sea level rose slowly or did not reach a certain ridge’s elevation, a beach ridge was preserved. In effect, Trail Ridge, Lake Wales Ridge and other lesser-known ridges are the remains of isostatically uplifted land that was kept out of harm’s way, Adams said. The ridges carry with them the marine fossils that are the evidence of their lowly early beginnings.

Today, the land surface of Florida is rising at a rate of about one-twentieth of a millimeter annually, far more slowly than sea level, which is rising at approximately 3 millimeters annually. Adams noted that Florida’s rise is not nearly rapid enough to counteract sea level rise, and that society should be mindful that low-lying coastal areas are threatened.
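The rates quoted above make the mismatch concrete. A quick comparison (using the article’s round figures, not a hydrodynamic model):

```python
# Back-of-the-envelope comparison of the two rates quoted in the article.
uplift_mm_per_yr = 1 / 20        # Florida's isostatic uplift: ~0.05 mm/yr
sea_level_rise_mm_per_yr = 3.0   # approximate modern sea level rise

net_rise = sea_level_rise_mm_per_yr - uplift_mm_per_yr
ratio = sea_level_rise_mm_per_yr / uplift_mm_per_yr

print(f"Net relative sea level rise: {net_rise:.2f} mm/yr")   # 2.95 mm/yr
print(f"Sea level outpaces the land by a factor of ~{ratio:.0f}")  # ~60
```

At these rates the uplift offsets under 2 percent of the sea level signal, which is why Adams frames the rise as no protection for low-lying coasts.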

Neil Opdyke, a UF professor emeritus and a co-author of the recent paper, first proposed the uplift process in a 1984 paper. Adams tested it using computer models that matched known information about sea levels dating back 1.6 million years with historic rainfall patterns, karstification rates and mantle uplift. The models concluded that Trail Ridge is approximately 1.4 million years old — and has been preserved because of uplift and the fact that sea levels have not reached the ridge’s elevation since its formation. In addition, Florida’s one-twentieth of a millimeter rise is twice as fast as previously thought.

“The neat thing about this paper is, it combines many different systems that people work on. There are people who work on uplift, people who work on erosion of karst, people who work on precipitation and paleoclimate,” Adams said. “And I knew just enough about all these things to be dangerous. So I said, ‘Let’s take what we know from the literature and put it together in a simple mathematical model to see how the whole system responds.’”

Scientists decipher structure of nature’s ‘light switch’

UPTON, NY — When the first warm rays of springtime sunshine trigger a burst of new plant growth, it’s almost as if someone flicked a switch to turn on the greenery and unleash a floral profusion of color. Opening a window into this process, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and collaborators at the University of Wisconsin, Madison, have deciphered the structure of a molecular “switch” much like the one plants use to sense light. Their findings, described online in the Proceedings of the National Academy of Sciences the week of May 31, 2010, help explain how the switch works and could be used to design new ways to modify plant growth.

Previous studies showed that the light-sensing structure, called a phytochrome, exists in two stable states. Each state is sensitive to a slightly different wavelength, or color, of light — from red to “far red,” which is close to the invisible infrared end of the light spectrum. As the phytochrome absorbs photons of one wavelength or the other, it changes shape and sends signals that help plants know when to flower, produce chlorophyll, and grow.

“The phytochrome is almost like nature’s light switch,” said Brookhaven biophysicist Huilin Li, who is also an associate professor at Stony Brook University and a lead author on the study. “Finding out how this switch is flipped on or off by a signal as subtle as a single photon of light is fascinating.”

As with all biological molecules, one key to the phytochrome’s function is its structure. But scientists trying to get a molecular-level picture of a phytochrome have a formidable challenge: The phytochrome molecule is too dynamic to capture in a single image using techniques like x-ray crystallography. So, scientists have studied only the rigid and smaller pieces of the molecule, yielding detailed, but fragmented, information.

Now using additional imaging and computational techniques, the Brookhaven researchers and their collaborators have pieced together for the first time a detailed structure of a whole phytochrome.

Li and his collaborators studied a phytochrome from a common bacterium that is quite similar in biochemistry and function to those found in plants, but easier to isolate. Plant biologist Richard Vierstra of the University of Wisconsin provided the purified samples.

At Brookhaven, Li’s group used two imaging techniques. First, they applied a layer of heavy metal dye to the purified phytochrome molecules to make them more visible, and viewed them using an electron microscope. This produced many two-dimensional images from a variety of angles to give the researchers a rough outline of the phytochrome map.

The scientists also froze the molecules in solution to produce another set of images that would be free of artifacts from the staining technique. For this set of images, the scientists used a cryo-electron microscope.

Using computers to average the data from each technique and then combine the information, the scientists were able to construct a three-dimensional map of the full phytochrome structure. The scientists then fitted the previously determined detailed structures of phytochrome fragments into their newly derived 3-D map to build an atomic model for the whole phytochrome.

Though the scientists knew the phytochrome was composed of two “sister” units, forming a dimer, the new structure revealed a surprisingly long twisted area of contact between the two individual units, with a good deal of flexibility at the untwisted ends. The structure supports the idea that the absorption of light somehow adjusts the strength or orientation of the contact, and through a series of conformation changes, transmits a signal down the length of the molecular interface. The scientists confirmed the proposed structural changes during photo-conversion by mutagenesis and biochemical assay.

The scientists studied only the form of the phytochrome that is sensitive to red light. Next they plan to see how the structure changes after it absorbs red light to become sensitive to “far red” light. Comparing the two structures will help the scientists test their model of how the molecule changes shape to send signals in response to light.

This research was supported by Brookhaven’s Laboratory Directed Research and Development program, the National Institutes of Health, the National Science Foundation, and a grant from the University of Wisconsin College of Agricultural and Life Science.

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by the Research Foundation of State University of New York on behalf of Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

Visit Brookhaven Lab’s electronic newsroom for links, news archives, graphics, and more: http://www.bnl.gov/newsroom

Sugary band-aid may help heal post-operative tissue

NEW YORK (May 31, 2010) — A compound found in sunless tanning spray may help to heal wounds following surgery, according to new results published by plastic surgeons from NewYork-Presbyterian Hospital/Weill Cornell Medical Center in New York City and biomedical engineers at Cornell University in Ithaca, N.Y., where the novel compound was developed.

Results published today in the Proceedings of the National Academy of Sciences show that a sticky gel composed of polyethylene glycol and a polycarbonate of dihydroxyacetone (MPEG-pDHA) may help to seal wounds created by surgery.

Procedures to remove cancerous breast tissue, for example, often leave a hollow space that fills with seroma fluid that must typically be drained by a temporary implanted drain. “This is an unpleasant side effect of surgery that is often unavoidable,” explains Dr. Jason Spector, co-author of the study and plastic surgeon at NewYork-Presbyterian Hospital/Weill Cornell Medical Center.

The gel could potentially be used in all different reconstructive surgeries to prevent seroma formation. “The new substance would act to glue together the hole left behind to prevent seroma buildup,” says Dr. Spector.

DHA is a compound that binds to amines, compounds found in biological tissues. The sticky properties of DHA are what allow sunless tanner to adhere to the skin without being wiped off. However, it is also biodegradable and water soluble, which means that the compound does not stay tacked onto the body’s tissues forever. Currently used “bio-glues” are made from animal products and take a long time to degrade in the body — both factors that raise the risk of infection.

“DHA is a compound that is naturally produced in the body,” explains Dr. David Putnam, the study’s senior author and a biomedical engineer from Cornell University’s Department of Biomedical Engineering and School of Chemical and Biomolecular Engineering. “The glue is broken down, or metabolized, and then safely removed by the body.”

Dr. Putnam’s lab and his collaborators work to create safe, synthetic compounds from chemicals found in nature. DHA is an intermediary compound produced during the metabolism of glucose, a sugar used by the body for fuel.

To create the new compound, MPEG-pDHA, Dr. Putnam and his lab first bound the single molecule monomer of DHA, which is highly reactive, to a protecting group molecule, making it stable enough to manipulate. This allowed the engineers to bind the monomers together to form a polymer, or chain of molecules, along with MPEG. Doing so allows the polymer gel to be injected through a syringe.

“Making a polymer from DHA has eluded chemical engineers for about 20 years,” says Dr. Putnam.

Now in gel form, the compound has the ability to stick tissues together, preventing the pocket from filling with seroma fluid, like an internal Band-Aid, explains Dr. Putnam. The researchers found that the gel prevented or significantly lowered seroma formation or fluid buildup in rats that had breast tissue removed.

“The next step would be to test the gel on larger animals and then in clinical trials in human surgical cases,” says Dr. Spector.

Previous results, published by Drs. Putnam and Spector in the August 2009 issue of the Journal of Biomedical Materials Research, showed that the gel also prevented bleeding in a rat liver.

“This is another aspect of the compound that would be greatly beneficial if proven to be applicable in humans,” says Dr. Spector. “The gel could speed the healing and decrease bleeding within the body.”

This research was supported in part by a National Science Foundation CAREER Award, a grant from the Morgan Tissue Engineering Fund, an Early Career Award from the Wallace H. Coulter Foundation, and the New York State Center for Advanced Technology.

Co-authors of the study include Dr. Peter Zawaneh from Cornell University, Dr. Sunil Singh and Dr. Peter Henderson from Weill Cornell, and Dr. Robert Padera from the Department of Pathology at Brigham and Women’s Hospital.

NewYork-Presbyterian Hospital/Weill Cornell Medical Center

NewYork-Presbyterian Hospital/Weill Cornell Medical Center, located in New York City, is one of the leading academic medical centers in the world, comprising the teaching hospital NewYork-Presbyterian and Weill Cornell Medical College, the medical school of Cornell University. NewYork-Presbyterian/Weill Cornell provides state-of-the-art inpatient, ambulatory and preventive care in all areas of medicine, and is committed to excellence in patient care, education, research and community service. Weill Cornell physician-scientists have been responsible for many medical advances — including the development of the Pap test for cervical cancer; the synthesis of penicillin; the first successful embryo-biopsy pregnancy and birth in the U.S.; the first clinical trial for gene therapy for Parkinson’s disease; the first indication of bone marrow’s critical role in tumor growth; and, most recently, the world’s first successful use of deep brain stimulation to treat a minimally conscious brain-injured patient. NewYork-Presbyterian Hospital also comprises NewYork-Presbyterian Hospital/Columbia University Medical Center, NewYork-Presbyterian/Morgan Stanley Children’s Hospital, NewYork-Presbyterian Hospital/Westchester Division and NewYork-Presbyterian/The Allen Hospital. NewYork-Presbyterian is the #1 hospital in the New York metropolitan area and is consistently ranked among the best academic medical institutions in the nation, according to U.S.News & World Report. Weill Cornell Medical College is the first U.S. medical college to offer a medical degree overseas and maintains a strong global presence in Austria, Brazil, Haiti, Tanzania, Turkey and Qatar. For more information, visit www.nyp.org and www.med.cornell.edu.

Powerful genome barcoding system reveals large-scale variation in human DNA

MADISON — Genetic abnormalities are most often discussed in terms of differences so minuscule they are actually called “snips” — changes in a single unit along the 3 billion that make up the entire string of human DNA.

“There’s a whole world beyond SNPs — single nucleotide polymorphisms — and we’ve stepped into that world,” says Brian Teague, a doctoral student in genetics at the University of Wisconsin-Madison. “There are much bigger changes in there.”

Variation on the order of thousands to hundreds of thousands of DNA’s smallest pieces — large swaths varying in length or location or even showing up in reverse order — appeared 4,205 times in a comparison of DNA from just four people, according to a study published May 31 in the Proceedings of the National Academy of Sciences.

Those structural differences popped into clear view through computer analysis of more than 500 linear feet of DNA molecules analyzed by the powerful genome mapping system developed over nearly two decades by David C. Schwartz, professor of chemistry and genetics at UW-Madison.

“We probably have the most comprehensive view of the human genome ever,” Schwartz says. “And the variation we’re seeing in the human genome is something we’ve known was there and important for many years, but we haven’t been able to fully study it.”

To get a better picture of those structural variations, Schwartz and his team developed the Optical Mapping System, a wholly new type of genome analysis that directly examines millions of individual DNA molecules.

Common systems for analyzing genomes typically chop long DNA molecules into fragments less than a couple thousand base pairs long and multiply them en masse, like a copy machine, to develop a chemical profile of each piece.

Reading such small sections without seeing their place in the larger picture of DNA leaves out critical understanding. To make matters worse, interesting parts of the human genome are often found within DNA’s trickiest stretches.

“Short pieces could really come from so many different locations,” Teague says. “An enormous part of the genome is composed of repeating DNA, and important differences are often associated with areas that have a lot of repeated sections.”

It’s a problem inherent to the method that has irked Schwartz for a long time.

“Our new technology quickly analyzes huge DNA molecules one at a time, which eliminates the copy machine step, reduces the number of DNA jigsaw pieces and increases the unique qualities of each piece,” Schwartz says. “These advantages allow us to discover novel genetic patterns that are otherwise invisible.”

The genome mapping system in Schwartz’s lab takes in much larger pieces, at least millions of base pairs at a time. Sub-millimeter sections of single DNA molecules — thread-like and, in full, 4 to 5 inches long in humans — are coaxed onto treated glass surfaces.

The long strands of DNA straighten out on the glass, and are clipped into sections by enzymes and scanned by automated microscopes. The pattern of these cuts along each molecule thread produces a unique barcode, identifying the DNA molecule and revealing genetic changes it harbors.

The scan results are passed along to databases for storage and retrieval, and handled by software that stitches the collections of barcoded molecules together to reconstitute the entire strand of DNA and quickly pinpoint genetic changes.
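
The barcode idea described above can be illustrated with a toy alignment: assume each molecule is reduced to an ordered list of fragment lengths between enzyme cut sites, and a sliding-window comparison with a sizing tolerance locates that barcode on a reference map. This is a simplified sketch for intuition only, not the lab's actual software; all names, numbers, and the tolerance scheme are invented.

```python
# Toy illustration of optical-map barcode matching (illustrative only):
# a molecule is summarized as fragment lengths (in kb) between cut sites,
# and we locate it on a reference map by sliding-window comparison.

def matches(frag, ref_frag, tol=0.1):
    """Two fragment lengths agree within a relative sizing tolerance."""
    return abs(frag - ref_frag) <= tol * ref_frag

def locate(molecule, reference, tol=0.1):
    """Return the index where the molecule's barcode aligns to the
    reference map, or None if no position matches."""
    n, m = len(molecule), len(reference)
    for start in range(m - n + 1):
        window = reference[start:start + n]
        if all(matches(f, r, tol) for f, r in zip(molecule, window)):
            return start
    return None

reference_map = [12.0, 30.5, 7.2, 18.9, 25.1, 9.8, 14.3]  # reference fragments
molecule_barcode = [7.0, 19.5, 24.0]  # noisy measurements of one molecule

print(locate(molecule_barcode, reference_map))  # prints 2
```

Real optical-map alignment must additionally handle missed and spurious cut sites and fragment sizing error models, which is why the production software is far more involved than this sliding-window sketch.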

“What we have here is a genetic version of Google Earth,” Schwartz says. “I could sit down with you and start at chromosome 1, and we could pan and zoom through each one and actually see the genetic changes across an individual’s genome.”

To Teague, the Optical Mapping System provides access to a new frame of reference on human genetic variation.

“I’ve got a whole folder of papers on diseases that are ascribable to these structural differences,” he says. “If you can see the genetic basis for those diseases, you can figure out the molecular differences in their development and pick drug targets to treat or cure or avoid them altogether. We fit into that storyline right up at the front.”

It’s been a long story.

“We’ve been thinking about these large structural variations for decades,” says Schwartz, whose work is funded by the National Institutes of Health and the National Science Foundation. “The problem was that the system for discerning large structural variants was not available. So we had to build it.”

The integrative building process included studying the behavior of fluids at microscopic scale, manipulating large DNA molecules and placing barcodes on them, automating high-powered microscopes to analyze single molecules, organizing the computing infrastructure to handle the data, developing algorithms to analyze whole human genomes, and more.

And after notable turns analyzing the DNA of corn, parasites, bacteria and even the mold that caused the 19th-century potato famine in Ireland, Schwartz has arrived at the human genome, his original target.

“It’s like you spend years making a telescope, and then one day you point it at the sky and you discover things that no one else could see,” he says. “We’ve integrated so many scientific problems together in a holistic way, which lets us solve very hard problems.”

The result is a 30-day turnaround for one graduate student to analyze one human genome, but that’s just a waypoint. Schwartz’s team isn’t just pointing at the sky. They are aiming for the stars by building new systems for personal genomics.

“This will go even further,” says Konstantinos Potamousis, the lab’s instrumentation innovator and a co-author on the study, which included researchers from UW-Madison, Mississippi State University, the University of Pittsburgh, the University of Southern California and the University of Washington. “Our systems scale nicely into the future because we’ve pioneered single molecule technologies. The newer systems we are building will provide more genetic information in far less time.”

With development complete on new molecular devices, software and analysis, a large piece of the system is already in place.

And the speed of innovation promises to accelerate the pace of genome analysis.

“Our newer genome analysis systems, if commercialized, promise genome analysis in one hour, at under $1,000,” Schwartz says. “And we require that high speed and low cost to power the new field of personal genomics.”

Alcohol consumption in Portugal: The burden of disease

  • Portugal is currently ranked eighth in the world in alcohol consumption.
  • A new study has examined the costs that alcohol consumption has on Portugal’s health system.
  • Findings show that roughly 3.8 percent of deaths are attributable to alcohol.

The World Health Organization has estimated that 3.2 percent of the “burden of disease” around the world is attributable to the consumption of alcohol. Portugal is currently ranked eighth in the world in alcohol consumption. A new study has found alcohol consumption in Portugal represents a heavy economic burden for that country’s health system.

Results will be published in the August 2010 issue of Alcoholism: Clinical & Experimental Research and are currently available at Early View.

“As a gastroenterologist, someone who takes care of patients with liver diseases, I am very aware that alcohol-related liver cirrhosis is a disease with a very high morbidity and mortality,” said Helena Cortez-Pinto, associate professor in the Institute of Molecular Medicine and corresponding author for the study.

“This is a brilliant study,” said Helmut K. Seitz, professor of internal medicine, gastroenterology and alcohol research at the University of Heidelberg, and president of the European Society for Biomedical Research on Alcohol (ESBRA). “It shows very clearly on the basis of real data that alcohol consumption is not only a risk factor for various diseases, especially liver disease, but it also shows that these alcohol-related diseases cost enormous money. We have data from Canada and England and some other countries, but we need more accurate data to convince the public, the doctors and the politicians that we have to do something about it.”

Cortez-Pinto and her colleagues analyzed 2005 demographic and health statistics using Disability-Adjusted Life Years (DALYs). The DALY sums two effects of alcohol-related diseases (ARDs): premature mortality, that is, the years of life lost relative to normal life expectancy, and the reduced quality of the years an individual lives after being diagnosed with an ARD.

“We found that a significant percentage of deaths — 3.8 percent — in Portugal were somehow caused by alcohol consumption,” said Cortez-Pinto, “resulting in a significant burden of disease, roughly 38,370 years of life lost to death or disability due to alcohol. The main source was liver disease at 31.5 percent, followed by traffic accidents at 28.2 percent, and several types of cancer and cardiovascular disease at 19.2 percent. In addition, this collectively represented a total cost of €191.0 million ($239 million USD) in direct costs, representing 0.13 percent of Gross Domestic Product and 1.25 percent of total national health expenditures.”
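
The DALY bookkeeping behind figures like these can be sketched with rough arithmetic: total DALYs are years of life lost to premature death (YLL) plus years lived with disability weighted by severity (YLD). In the sketch below, only the €191.0 million cost and the 0.13 percent GDP share come from the study; the single-condition DALY inputs are invented for illustration.

```python
# Back-of-the-envelope sketch of the DALY calculation described above.
# DALY = YLL (years of life lost to premature death) + YLD (years lived
# with disability, weighted by severity). The condition-level inputs
# below are hypothetical; they do not come from the Portuguese study.

def yld(cases, disability_weight, avg_duration_years):
    """Years lived with disability for one condition."""
    return cases * disability_weight * avg_duration_years

def daly(yll, yld_total):
    return yll + yld_total

# hypothetical single-condition example
years_disabled = yld(cases=5000, disability_weight=0.3, avg_duration_years=4)
print(daly(yll=20000, yld_total=years_disabled))  # prints 26000.0

# direct costs as a share of the economy, using the study's two figures
direct_cost_eur = 191.0e6   # €191.0 million in direct costs
gdp_share = 0.0013          # 0.13 percent of GDP, per the study
implied_gdp_billions = round(direct_cost_eur / gdp_share / 1e9, 1)
print(implied_gdp_billions)  # prints 146.9 (billion euros)
```

The second computation simply inverts the study's reported GDP share to recover the implied size of the economy, a quick consistency check one can run on any burden-of-disease figure.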

She added that they also found heavy drinking was significantly higher in males, inversely correlated with amount of education, and strongly correlated with cigarette smoking.

“I think other countries could do similar studies to ours in order to get a clearer picture of the problem,” said Cortez-Pinto. “It may help to increase consciousness of the severity of the problem and what it represents in economic costs, thus helping to put pressure on governments to take measures to reduce this heavy burden of disease, both by creating and enforcing laws to dissuade people, especially young people, from excessive drinking, and by funding research projects in the alcohol field.”

Cortez-Pinto noted that countries in Eastern Europe currently have the highest alcohol consumption levels in Europe. Seitz added that countries like the United Kingdom and Germany have even greater problems, especially with binge drinking.

“The public has to understand that alcohol is a major health problem, that alcohol is the number one factor for liver disease in Europe, and that if we want to control this problem a variety of measures have to be taken, such as increasing the price of alcoholic beverages and decreasing their availability,” said Seitz. “Clinicians have to be aware of alcoholic liver disease, and that there is a need for early detection and intervention. Furthermore, many clinicians as well as the public do not know that alcohol is a major risk factor for cancer of the mouth, larynx, pharynx, esophagus, liver, female breast, and the colorectum — cancers with a high prevalence in our societies.”

Alcoholism: Clinical & Experimental Research (ACER) is the official journal of the Research Society on Alcoholism and the International Society for Biomedical Research on Alcoholism. Co-authors of the ACER paper, “The Burden of Disease and the Cost of Illness Attributable to Alcohol Drinking — Results of a National Study,” were: Miguel Gouveia of the Centro de Estudos Aplicados in the Faculdade de Ciências Económicas e Empresariais at the Universidade Católica Portuguesa; and Luís dos Santos Pinheiro, João Costa, Margarida Borges, and António Vaz Carneiro of the Centro de Estudos de Medicina Baseada na Evidência in the Faculdade de Medicina de Lisboa.

Animal study reveals new target for antidepressants

Ann Arbor, Mich. — University of Michigan scientists have provided the most detailed picture yet of a key receptor in the brain that influences the effectiveness of serotonin-related antidepressants, such as Prozac.

The findings, which appear online Monday ahead of print in the journal Proceedings of the National Academy of Sciences, open the door to providing a more targeted treatment of depression and anxiety with fewer side effects.

Depressive disorders change a person’s mood, emotions and physical well-being and can co-occur with anxiety disorders and substance abuse.

“There are big drawbacks in the current therapies for depression,” says senior author John Traynor, Ph.D., professor of pharmacology at the U-M Medical School and director of the U-M Substance Abuse Research Center. “Therapeutic benefits are delayed, there are unwanted side effects, and it’s not unusual for depressive symptoms to return.”

Authors say the high relapse rate indicates a need for additional treatment options for the estimated 20.9 million Americans with depression.

The best current treatments for depression are selective serotonin reuptake inhibitors, or SSRIs. These drugs work by flooding the brain’s synapses with serotonin, a neurotransmitter linked with mood, and increasing signaling through the more than 20 serotonin receptors in the brain.

However, the team of researchers showed that one particular pathway, the serotonin 5HT1a receptor, is linked with antidepressive and antianxiety behavior in mice.

“Rather than activating all serotonin receptors as SSRIs do, one could increase signaling through the one critical serotonin receptor that our research shows is important for anti-depressant behavior,” says co-author Richard R. Neubig, M.D., Ph.D., co-director of the U-M Center for Chemical Genomics and professor of pharmacology at the U-M Medical School.

The new research details the complex actions of a family of proteins, known as RGS proteins, that act as brakes on neurotransmitter signaling.

Researchers created a mutant mouse to boost serotonin signaling at the 5HT1a receptor. This was done by genetically inhibiting the activity of braking proteins. Without the normal brake on serotonin signaling, these mutant mice showed antidepressive behavior even without being given antidepressant drugs. The mice were also more responsive to SSRIs.

Authors say that further research could lead to drugs that inhibit the RGS proteins, targeting the antidepressant signal where it is needed: at the critical 5HT1a receptors.

Additional authors of the study are: Jeffrey Talbot, Ph.D., Ohio Northern University, formerly of U-M, who continued parts of this work with Crystal Clemans and Melanie Nicol, Pharm.D. Other U-M authors are Emily Jutkiewicz, Ph.D., Steven Graves, B.S., and Xinyan Huang, Ph.D., from the Department of Pharmacology and Richard Mortensen, M.D., Ph.D., from the Department of Molecular and Integrative Physiology.

Funding: National Institute of General Medical Sciences; National Institute on Drug Abuse.

Reference: Proceedings of the National Academy of Sciences, “RGS inhibition at Gαi2 selectively potentiates 5-HT1A-mediated antidepressant effects,” doi: 10.1073/pnas.1000003107

Resources

University of Michigan Health System

http://www.med.umich.edu/

Substance Abuse Research Center

http://sitemaker.umich.edu/umsarc/home

Center for Chemical Genomics

http://lsi.umich.edu/ccg

National Institute of Mental Health

http://www.nimh.nih.gov/health/publications/the-numbers-count-mental-disorders-in-america/index.shtm

Written by Shantell M. Kirkendoll

Educational researcher devotes May issue to ‘Report of the National Early Literacy Panel’

WASHINGTON, D.C., May 31, 2010 — The May 2010 issue of Educational Researcher (ER) provides a significant scholarly review of Developing Early Literacy: Report of the National Early Literacy Panel (NELP). Educational Researcher is one of six journals published by the American Educational Research Association. In the special issue, NELP Panel members Timothy Shanahan and Christopher J. Lonigan provide a summary of the report followed by nine peer-reviewed commentaries written by literacy scholars who examine the report and offer suggestions for where it illuminates issues and where it is lacking or ambiguous. The commentaries are followed by two responses, written by Shanahan, Lonigan, and Christopher Schatschneider.

The National Early Literacy Panel (NELP) Report, issued in 2009, presents the work of the nine-member panel, convened in 2002 by the National Institute for Literacy, in consultation with the National Institute of Child Health and Human Development, U.S. Department of Education, Head Start Bureau, and U.S. Department of Health and Human Services. The report provides the findings of meta-analyses of approximately 300 studies showing which early literacy measures correlate with later literacy achievement. It also provides a series of meta-analyses on ways of teaching early literacy (preschool and kindergarten) that have been published in refereed journals. These analyses examine the effects of code-based instruction, shared book reading, home/parent interventions, preschool/kindergarten interventions, and early language teaching.

The May ER takes up the topic of early literacy where Developing Early Literacy leaves off, by creating a forum for additional dialogue on future research that needs to be conducted, including translational research and research that will build a sufficient knowledge base concerning early literacy skill development.

“The nine [commentary] contributors to this special issue have a long-standing commitment to the early literacy field; they also have broad-based research expertise, an understanding of early literacy practice, and a grasp of the ways in which policy reports, such as the NELP report, if left unexamined, can influence research and pedagogy with unintended consequences,” writes Anne McGill-Franzen, the issue’s guest editor, in her introduction. “The views of these authors as well as those of the panel are widely respected, and their insight is critical, particularly now as early literacy policy is taking shape on a national level.”

Early literacy — the central focus of NELP — confers a transformative power on individuals, and there now is “a sense of urgency about the need for policy makers, practitioners, and researchers to understand the limitations as well as the strengths of the NELP report,” adds McGill-Franzen, Professor of Teacher Education and Director of the Reading Center at the University of Tennessee, Knoxville, College of Education.

Commentary authors also provide perspectives that look backward and forward, noted McGill-Franzen. As an example, she cited the lead commentary by P. David Pearson and Elfrieda H. Hiebert, which locates the NELP report “within the universe of scientific reports on reading research, spanning more than five decades’ worth of policy contexts.”

In a reflective article, Susan B. Neuman, formerly Assistant Secretary for Elementary and Secondary Education in the George W. Bush administration, writes, “We need to expose children to language-rich and content-rich settings that can help them acquire the broad array of knowledge, skills, and dispositions that build a foundation for literacy and content learning.” Neuman, who conducts research on early childhood policy and early reading instruction in urban settings, is now Professor of Educational Studies at the University of Michigan, School of Education. She argues that effective interventions must mediate a knowledge and technology gap between economically advantaged children and those who are poor.

Data reported by NELP underrepresented the importance of language, a slowly acquired and highly complex ability, David K. Dickinson and his colleagues contend. They expressed concern that schools will target the easier-to-teach code skills, such as letter knowledge and the ability to link sounds to symbols, rather than language and background knowledge, which are harder to teach and may require longer interventions.

Commenting on prekindergarten and kindergarten classroom instructional practice, William H. Teale and his colleagues argue that the NELP report is “both insufficiently clear and overly narrow with respect to what preschool teachers should be focusing on instructionally in early literacy.”

One of the many aspects of early literacy addressed in the ER commentaries is dual-language learners (DLLs). Kris D. Gutiérrez and her colleagues point out that dual-language learners are one of the fastest growing populations in the United States, and yet gaps in knowledge exist. They call for “a more expansive research agenda for young DLLs,” noting that the field would benefit from “longitudinal studies that examine how children exposed to two languages from an early age develop in relation to their specific individual differences and sociocultural contexts, including different types of educational interventions.”

The intent of the ER special issue is to provide an introduction to the NELP report and commentaries that foster continued conversation and inquiry around critical issues in the field of early literacy research and practice.

Educational Researcher, May 2010

Special Issue: The National Early Literacy Panel Report: Summary, Commentary, and Reflections on Policies and Practices to Improve Children’s Early Literacy

  • Guest Editor’s Introduction

    Anne McGill-Franzen, University of Tennessee, Knoxville
  • The National Early Literacy Panel: A Summary of the Process and the Report

    Timothy Shanahan, Chair of NELP and University of Illinois-Chicago and the Center for Literacy, and Christopher J. Lonigan, Florida State University and the Florida Center for Reading Research
  • National Reports in Literacy: Building a Scientific Base for Practice and Policy

    P. David Pearson and Elfrieda H. Hiebert, University of California, Berkeley
  • Recognizing Different Kinds of “Head Starts”

    Marjorie Faulstich Orellana, University of California, Los Angeles, and Jacqueline D’warte, University of California, Irvine
  • Lessons From My Mother: Reflections on the National Early Literacy Panel Report

    Susan B. Neuman, University of Michigan
  • Speaking Out for Language: Why Language Is Central to Reading Development

    David K. Dickinson, Vanderbilt University, Roberta M. Golinkoff, University of Delaware, and Kathy Hirsh-Pasek, Temple University
  • Where Is NELP Leading Preschool Literacy Instruction? Potential Positives and Pitfalls

    William H. Teale, University of Illinois-Chicago, Jessica L. Hoffman, Miami University, and Kathleen A. Paciga, University of Illinois-Chicago
  • Confounded Statistical Analyses Hinder Interpretation of the NELP Report

    Scott G. Paris and Serena Wenshu Luo, National Institute of Education
  • The NELP Report on Shared Story Reading Interventions (Chapter 4): Extending the Story

    Judith A. Schickedanz, Boston University, and Lea M. McGee, The Ohio State University
  • Recasting the Role of Family Involvement in Early Literacy Development: A Response to the NELP Report

    Alanna Rochelle Dail and Rebecca L. Payne, University of Alabama
  • Advancing Early Literacy Learning for All Children: Implications of the NELP Report for Dual-Language Learners

    Kris D. Gutiérrez, University of Colorado, Marlene Zepeda, California State University, Los Angeles, and Dina C. Castro, University of North Carolina at Chapel Hill
  • Developing Early Literacy Skills: Things We Know We Know and Things We Know We Don’t Know

    Christopher J. Lonigan, Florida State University and the Florida Center for Reading Research, and Timothy Shanahan, University of Illinois-Chicago and the Center for Literacy
  • Misunderstood Statistical Assumptions Undermine Criticism of the National Early Literacy Panel’s Report

    Christopher Schatschneider and Christopher J. Lonigan, Florida State University and the Florida Center for Reading Research

Editor’s Note: The full text of this special issue of Educational Researcher, “The National Early Literacy Panel Report: Summary, Commentary, and Reflections on Policies and Practices to Improve Children’s Early Literacy,” is posted on the AERA Web site: www.aera.net (http://www.aera.net/publications/Default.aspx?menu_id=38&id=9962)

The full text of the NELP Report, Developing Early Literacy: Report of the National Early Literacy Panel, A Scientific Synthesis of Early Literacy Development and Implications for Intervention, is posted here: http://www.nifl.gov/earlychildhood/NELP/NELPreport.html

To reach AERA Communications, call (202)238-3200; Helaine Patterson ([email protected]) or Lucy Cunningham ([email protected]).

The American Educational Research Association (AERA) is the national interdisciplinary research association for approximately 25,000 scholars who undertake research in education. Founded in 1916, AERA aims to advance knowledge about education, to encourage scholarly inquiry related to education, and to promote the use of research to improve education and serve the public good.