June 2022

As food costs continue to rise and a global food crisis looms on the horizon, it’s staggering to think that some 30-40 percent of America’s food supply ends up in landfills, mostly due to spoilage. At the same time, the World Health Organization estimates that foodborne illness from microbial contamination causes about 420,000 deaths per year worldwide.

What if there were a way to package fresh foods that could extend their shelf life and eliminate microbial contamination?

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences and the Harvard T.H. Chan School of Public Health have developed a biodegradable, antimicrobial food packaging system that does both.

“One of the biggest challenges in the food supply is the distribution and viability of the food items themselves,” said Kit Parker, the Tarr Family Professor of Bioengineering and Applied Physics at SEAS and senior author of the paper. “We are harnessing advances in materials science and materials processing to increase both the longevity and freshness of the food items and doing so in a sustainable model.”

The research was published in Nature Food.

From the battlefield to the farm

Surprisingly, the new food packaging system has its roots in battlefield medicine. For more than a decade, Parker and his Disease Biophysics Group have been developing antimicrobial fibers for wound dressings. Their fiber manufacturing platform, known as Rotary Jet-Spinning (RJS), was designed specifically for that purpose.

RJS works like a cotton candy machine — a liquid polymer solution is loaded into a reservoir and pushed out through a tiny opening by centrifugal force as the device spins. As the solution leaves the reservoir, the solvent evaporates and the polymers solidify to form fibers, with controlled diameters ranging from microscale to nanoscale.

Rotary Jet-Spinning coats the avocado in thin fibers of pullulan.

Credit: Disease Biophysics Group/Harvard SEAS

The idea to translate the research from wound dressing to food packaging was born of a collaboration with Philip Demokritou, the former co-director of the Center for Nanotechnology and Nanotoxicology (NanoCenter) at Harvard's Chan School. The NanoCenter is a joint initiative between Harvard and Nanyang Technological University of Singapore.

“As it turned out, wound dressings have the same purpose, in some ways, as food packaging — sustaining tissues, protecting them against bacteria and fungi, and controlling moisture,” said Huibin Chang, a postdoctoral fellow at SEAS and first author of the paper.

To make the fibers food-safe, the team turned to a polymer known as pullulan. Pullulan is an edible, tasteless and naturally occurring polysaccharide commonly used in breath fresheners and mints.

The researchers dissolved the pullulan polymer in water and mixed it with a range of naturally derived antimicrobial agents, including thyme oil, nisin, and citric acid. The solution was then spun in an RJS system and the fibers deposited directly on a food item. The researchers demonstrated the technique by wrapping an avocado with pullulan fibers. The result resembles a fruit wrapped in a spiderweb.

The research team compared their RJS wrapping to standard aluminum foil and found a substantial reduction in contamination by microorganisms, including E. coli, L. innocua (a nonpathogenic relative of the bacterium that causes listeriosis), and A. fumigatus (which can cause disease in people who are immunocompromised).

“The high surface-to-volume ratio of the coating makes it much easier to kill dangerous bacteria because more bacteria are coming into contact with the antimicrobial agents than in traditional packaging,” said John Zimmerman, a postdoctoral fellow at SEAS and co-author of the paper.

The team also demonstrated that their fiber wrapping increased the shelf life of avocado, a notoriously finicky fruit that can turn from ripe to rotten in a matter of hours. After seven days on a lab bench, 90 percent of unwrapped avocados were rotten, while only 50 percent of avocados wrapped in antimicrobial pullulan fibers rotted.

The wrapping is also water soluble and biodegradable, rinsing off without any residue on the avocado surface.

Making food more sustainable

This antimicrobial, biodegradable food packaging system is not the Disease Biophysics Group’s first foray into making our food supply system more sustainable.

Parker’s group has used their RJS system to grow animal cells on edible gelatin scaffolds that mimic the texture and consistency of meat. That technology was licensed by Tender Food, a Boston-based startup that aims to combat the enormous environmental impact of the meat industry by developing a new generation of plant-based alternative meat products that have the same texture, taste, and consistency as real meat.

The lab’s latest innovations in food packaging may also soon enter commercial development. Harvard Office of Technology Development has protected the intellectual property relating to this project and is now exploring commercialization opportunities with Parker’s lab.

“One of my research group’s long-range goals is reducing the environmental footprint of food,” said Parker. “We’ve gone from building more sustainable food to now packaging the food in a sustainable way that can reduce food waste.”

This research was co-authored by Jie Xu, Luke A. Macqueen, Zeynep Aytac, Michael M. Peters, Tao Xu and Philip Demokritou.

It was supported by the Nanyang Technological University–Harvard T. H. Chan School of Public Health Initiative for Sustainable Nanotechnology, under project number NTUHSPH 18003; the Harvard Center for Nanoscale Systems (CNS), a member of the National Nanotechnology Coordinated Infrastructure Network (NNCI), which is supported by the National Science Foundation under NSF award number 1541959; and Harvard Materials Research Science and Engineering Center, under grant numbers DMR-1420570 and DMR-2011754.



A summer dilemma worthy of Solomon: how to stay cool in days of high heat and humidity without turning to traditional air conditioning, which consumes vast amounts of electricity and emits potent climate-changing greenhouse gases.

The answer potentially involves a new class of solid-state refrigerants that could enable energy-efficient and emission-free cooling. And now researchers from the Department of Chemistry and Chemical Biology have developed an environmentally friendly mechanism to enable solid-state cooling with two-dimensional perovskites. Their findings are described in a new study in Nature Communications.

“Shifting away from the vapor compression systems that have been in use for a really long time is a crucial part of the overall push toward a more sustainable future,” said Jarad Mason, the paper’s senior author and assistant professor of chemistry and chemical biology. “Our focus is looking deeply at the intrinsic properties of these materials to see what is possible in terms of solid-state cooling as a sustainable alternative.”

Also known as barocaloric materials, the two-dimensional perovskites release and absorb heat in response to pressure changes as they expand and contract. The effect is based on a phenomenon you may be familiar with if you’ve ever stretched a balloon and felt it warm up against your lips. Similarly, these materials release heat when pressurized or stressed. Without releasing any harmful emissions, this mechanism can remove heat in the solid state using low driving pressures.

The work was led by members of Mason’s lab, including Jinyoung Seo, Ryan D. McGillicuddy, Adam H. Slavney, Selena Zhang ’22, Rahil Ukani, and Shao-Liang Zheng, director of the X-ray Laboratory. Advanced tests were also performed in collaboration with scientists at the Argonne National Laboratory in Lemont, Illinois.

This new mechanism for solid-state cooling has the potential to overcome the limitations of traditional vapor-compression cooling technology, which has remained largely unchanged since the early 20th century.

Any kind of refrigeration system runs in a cycle from a low-entropy state, when a material can absorb heat, thereby cooling a space, to a high-entropy state, when that energy can be released in a heat sink, where it dissipates. Vapor-compression air conditioners circulate a volatile fluid refrigerant that evaporates and condenses under varying pressure through metal coils to cool an enclosed space and eject heat outside. Running vapor-compression cycles is energy-intensive, currently responsible for almost 20 percent of electricity use in buildings around the world. In addition, leaking refrigerants are greenhouse gases more than 1,000 times more potent than carbon dioxide.


The new mechanism (pictured) could replace traditional vapor-compression cooling technology, which has remained largely unchanged since the early 20th century.

The team identified two-dimensional perovskites as ideal substitutes because they undergo phase transitions that can be driven reversibly under minimal pressure, all while remaining in a solid state; the more a material can change its entropy, the more effective it can be for running cooling cycles. With organic bi-layers capable of undergoing large changes in entropy when their hydrocarbon chains switch between ordered and disordered states, the team anticipated that two-dimensional perovskites could serve as a highly tunable solid-state cooling material that could operate at lower pressures than thought possible.

The team synthesized the materials in their lab and tested them in a high-pressure calorimeter to measure changes in heat flow in the material under varying pressures and temperatures. These experiments reveal how much heat can be removed in a potential refrigeration cycle, and how much pressure is needed to drive the cycle reversibly.

“As soon as we began testing the material, we realized that we could remove a very large amount of heat with a very small pressure change,” Mason said. “From that point on, we knew that there was going to be something interesting here.”

The researchers also conducted high-pressure powder X-ray diffraction experiments at Argonne to understand phase changes at the molecular level. With the X-ray synchrotron, the teams were able to characterize how the structure of each material changes at varying temperatures and pressures.

“These materials are worth studying beyond their promising performance,” Seo said. “They can also be useful for chemists to understand the fundamental properties that are critical to realizing this technology at scale.”

The Mason Lab next plans to craft prototype barocaloric cooling devices while continuing to explore the potential use of different materials.

“We will likely use next-generation materials for the prototype device,” Seo said. “We are trying to come up with new technologies to address the cooling challenge.”



Why do we sleep? Scientists have debated this question for millennia, but a new study adds fresh clues for solving this mystery.

The findings, published in the Journal of Neuroscience, may help explain how humans form memories and learn, and could eventually aid the development of assistive tools for people affected by neurologic disease or injury. The study was conducted by Massachusetts General Hospital in collaboration with colleagues at Brown University, the Department of Veterans Affairs, and several other institutions.

Scientists studying laboratory animals long ago discovered a phenomenon known as “replay” that occurs during sleep, explains neurologist Daniel Rubin of the MGH Center for Neurotechnology and Neurorecovery, the lead author of the study. Replay is theorized to be a strategy the brain uses to remember new information. If a mouse is trained to find its way through a maze, monitoring devices can show that a specific pattern of brain cells, or neurons, will light up as it traverses the correct route. “Then, later on while the animal is sleeping, you can see that those neurons will fire again in that same order,” says Rubin.

Scientists believe that this replay of neuronal firing during sleep is how the brain practices newly learned information, which allows a memory to be consolidated — that is, converted from a short-term memory to a long-term one.

However, replay has only been convincingly shown in lab animals. “There’s been an open question in the neuroscience community: To what extent is this model for how we learn things true in humans? And is it true for different kinds of learning?” asks neurologist Sydney S. Cash, co-director of the Center for Neurotechnology and Neurorecovery at MGH and co-senior author of the study. Importantly, says Cash, understanding whether replay occurs with the learning of motor skills could help guide the development of new therapies and tools for people with neurologic diseases and injuries.

To study whether replay occurs in the human motor cortex — the brain region that governs movement — Rubin, Cash, and their colleagues enlisted a 36-year-old man with tetraplegia (also called quadriplegia), meaning he is unable to move his upper and lower limbs, in his case due to a spinal cord injury. The man, identified in the study as T11, is a participant in a clinical trial of a brain-computer interface device that allows him to use a computer cursor and keyboard on a screen. The investigational device is being developed by the BrainGate consortium, a collaborative effort involving clinicians, neuroscientists, and engineers at several institutions with the goal of creating technologies to restore communication, mobility, and independence for people with neurologic disease, injury, or limb loss. The consortium is directed by Leigh R. Hochberg of MGH, Brown University, and the Department of Veterans Affairs.

In the study, T11 was asked to perform a memory task similar to the electronic game Simon, in which a player observes a pattern of flashing colored lights, then has to recall and reproduce that sequence. He controlled the cursor on the computer screen simply by thinking about the movement of his own hand. Sensors implanted in T11’s motor cortex measured patterns of neuronal firing, which reflected his intended hand movement, allowing him to move the cursor around on the screen and click it at his desired locations. These brain signals were recorded and wirelessly transmitted to a computer.

That night, while T11 slept at home, activity in his motor cortex was recorded and wirelessly transmitted to a computer. “What we found was pretty incredible,” says Rubin. “He was basically playing the game overnight in his sleep.” On several occasions, says Rubin, T11’s patterns of neuronal firing during sleep exactly matched patterns that occurred while he performed the memory-matching game earlier that day.
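The kind of pattern matching described here — comparing overnight firing activity against the activity recorded during the game — can be illustrated with a stripped-down sketch. This is not the study's actual analysis; the firing-rate values, window size, and correlation threshold below are all invented for illustration.

```python
# Toy replay detection: slide a task-time firing-rate "template" across an
# overnight activity trace and flag windows that correlate strongly with it.

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical firing rates (one value per time bin): a template recorded
# during the task, and a longer overnight trace with an embedded match.
template = [0.1, 0.9, 0.4, 0.8, 0.2]
sleep = [0.3, 0.2, 0.1, 0.9, 0.4, 0.8, 0.2, 0.5, 0.3, 0.1]

def replay_events(trace, tmpl, threshold=0.9):
    """Return start indices of windows that correlate strongly with the template."""
    w = len(tmpl)
    return [i for i in range(len(trace) - w + 1)
            if pearson(trace[i:i + w], tmpl) >= threshold]

print(replay_events(sleep, template))  # → [2]
```

Real analyses work with many neurons at once and must account for time-warped, noisier reactivations, but the core idea — scoring the similarity of sleep activity to a waking template — is the same.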

“This is the most direct evidence of replay from motor cortex that’s ever been seen during sleep in humans,” says Rubin. Most of the replay detected in the study occurred during slow-wave sleep, a phase of deep slumber. Interestingly, replay was much less likely to be detected while T11 was in REM sleep, the phase most commonly associated with dreaming. Rubin and Cash see this work as a foundation for learning more about replay and its role in learning and memory in humans.

“Our hope is that we can leverage this information to help build better brain-computer interfaces and come up with paradigms that help people learn more quickly and efficiently in order to regain control after an injury,” says Cash, noting the significance of moving this line of inquiry from animals to human subjects. “This kind of research benefits enormously from the close interaction we have with our participants,” he adds, with gratitude to T11 and other participants in the BrainGate clinical trial.

Hochberg concurs. “Our incredible BrainGate participants provide not only helpful feedback toward the creation of a system to restore communication and mobility, but they also give us the rare opportunity to advance fundamental human neuroscience — to understand how the human brain works at the level of circuits of individual neurons,” he says, “and to use that information to build next-generation restorative neurotechnologies.”

Rubin is also an instructor in neurology at Harvard Medical School. Cash is an associate professor of neurology at HMS. Hochberg is a senior lecturer on neurology at HMS and professor of engineering at Brown University.

This work was supported by the Department of Veterans Affairs, the National Institute of Neurological Disorders and Stroke, the National Institute of Mental Health, Conquer Paralysis Now, the MGH-Deane Institute, the American Academy of Neurology, and the Howard Hughes Medical Institute at Stanford University.



In the 1970s, as Evelynn Hammonds walked the halls of MIT’s physics department on her way to a Ph.D., faculty, students, and staff kept asking her the same question:

“Hi, Shirley, how’s your work going?”

Her name, of course, was Evelynn, not Shirley. A few years before, Shirley Jackson, the second African American woman to receive a Ph.D. in physics in the United States, had been the only Black woman in the department. But she was gone by the time Hammonds, also the only Black woman in MIT physics, arrived.

“As an African American woman, I carried the double burden,” Hammonds, the chair and professor of the history of science at Harvard, recalled in a recent interview with the Harvard Museums of Science and Culture. “The first two years, I just felt like I was constantly fighting to have people take me seriously.”

After she earned her degree, Hammonds took a leave of absence to grapple with questions physics couldn’t answer: “Where are the women? What’s going on?”

Today, on the 50th anniversary of Title IX, which prohibits sex-based discrimination at educational institutions that receive federal funding, she’s still asking why there aren’t more women in science, technology, engineering, and mathematics. “The attitudes and the culture haven’t changed as much as they absolutely have to,” she said.

Most people associate Title IX with anti-discrimination initiatives in athletics and efforts to reduce sexual harassment. But it was also intended to increase women’s access to higher education. This “big hammer,” as the chemist Debra Rolison calls it, could be used to put universities on alert: Fail to increase the representation of women in STEM, and the government might take away your federal funding.

“No money from the feds is death at the door,” said Rolison, who has been fighting for women’s equality since her male cousins excluded her from a baseball game when she was 5 years old. “I have never, ever appreciated that double standard,” she said. “And every time I can say or do something about it, I do.”

For decades, Rolison, who is head of the Advanced Electrochemical Materials section at the Naval Research Laboratory, has given talks to raise awareness about gender inequities in STEM. When words haven’t been enough, she has sought to force change by using the “big hammer” Title IX represents.

Since the law’s enactment, women have made some gains in science and engineering — they’ve gone from representing just 8 percent of STEM workers in 1970 to 27 percent in 2019. But men still dominate, making up half of all U.S. employees but 73 percent of the STEM workforce. A report published in 2018 made clear that sexual harassment was still an everyday reality for many women in science. Two years later, the American Association of University Women (AAUW) found that women in science earned about $15,000 less than their male colleagues.

“The gender gap that exists in the STEM workforce cannot be blamed on differences in academic preparation,” the National Coalition for Women and Girls in Education wrote in a report released this month. Research has shown that women are no less capable than men in science and mathematics. But, according to the AAUW, external factors, like a lack of role models, cultures that tend to exclude women, and persistent stereotypes about women’s intellectual abilities, reinforce a wide gender gap. Even today, teachers and parents underestimate girls’ STEM abilities from a young age. From a historical perspective, at least in certain fields, we seem to have regressed.

“In 1700, 14 percent of German astronomers were women,” said Londa Schiebinger, who earned her Ph.D. in history from Harvard and is now the John L. Hinds Professor of History of Science at Stanford University. As a student, she couldn’t stop wondering: “What happened to all the women?”

They were pushed out, she found, and men brandished biased science to justify keeping them out. Take the Harvard-trained physician Edward Hammond Clarke, who practiced in the late 1800s.

Clarke, Schiebinger noted, claimed that as a woman’s brain grows, her ovaries shrink — clear-cut evidence, he said, that a woman’s rightful place was in the home, bearing children. The first drawings of the female skeleton, also from the 19th century, depicted (inaccurately) small skulls and large pelvises — more evidence, male scientists thought, that a woman’s body was made for procreation. In the 1920s, men blamed female hormones for their inferior minds. A century later, when Schiebinger was a graduate student at Harvard, she still heard men argue that women were intellectually deficient and ill-suited for science.

The gender gap isn’t just about percentages or fairness; a lack of women (and other minorities) in a field affects the quality of the science, as Schiebinger has discovered.

Women are 47 percent more likely to be injured in a car crash, she said, because crash test dummies are just smaller versions of the male anatomy. “It’s a shrink-it and pink-it sort of thing,” Schiebinger said. Medical devices fail more often for women. Pulse oximeters, for example, which measure blood oxygen levels, tend to miscalculate these levels in women and people of color, putting them at risk of not receiving emergency care (an especially pernicious threat during the pandemic). Large datasets don’t have equal representation of women or different ethnic groups. Search algorithms, for example, show more high-paying jobs to men because that’s what their underlying data tells them: men earn more.

“It’s not like Title IX changed anything,” Schiebinger said. “It’s been the women’s movement, right? It’s the people who have been working nonstop on institutional reform. Some law doesn’t really change anything.”

A male-dominated culture — along with its unconscious biases and rewards for cutthroat pursuit of money and prestige — is still considered one of the biggest and most stubborn barriers women face in STEM, said Schiebinger.

Hammonds would agree.

“You’re in a culture where, on any given day, somebody might think you were a secretary, or a janitor, or anything but a graduate student in physics, so you had to live with that,” she said. “It was made very clear to us by some people that we didn’t fit, that we didn’t belong, that we were only there because of affirmative action, that we could never be successful. We were constantly finding those attitudes.”

Some people argue that women choose to avoid STEM careers because of long hours that can intrude on family responsibilities. It’s true, Rolison noted, that child care and parental leave are critical benefits that ensure women can have a family without sacrificing career ambitions.
But, she said, the particular demands and realities of a life in science are not the primary problem. “There’s plenty of work done to show that, no, it’s the culture. It’s always the culture. It’s not the low pay; it’s not the long hours; it’s the culture.”

To fix the culture, Schiebinger argues for a coordinated approach. Courses should teach awareness of social issues; journals and funding agencies shouldn’t accept research that doesn’t consider gender and sex; big hammers like Title IX should be used as levers to force stubborn institutions to do better.

“These have to be structural solutions. It’s not just getting this woman or that woman,” Schiebinger said. “You gotta do it everywhere. Start in the womb, I always say.”



Humans display a capacity for tolerance and cooperation among social groups that is rare in the animal kingdom, our long history of war and political strife notwithstanding. But how did we get that way?

Scientists believe bonobos might serve as an evolutionary model. The endangered primates share 99 percent of their DNA with humans and have a reputation for generally being peace-loving and sexually active — researchers jokingly refer to them as “hippie apes.” And interactions between their social groups are thought to be much less hostile than among their more violent cousins, the chimpanzees.

Some, however, have challenged this view, citing a lack of detailed data on how these groups work and how they separate themselves. A new study led by Harvard primatologists Liran Samuni and Martin Surbeck on the social structure of bonobos may begin to fill in some of the blanks.

The research, published in PNAS, shows that four neighboring groups of bonobos they studied at the Kokolopori Bonobo Reserve in the Democratic Republic of Congo maintained exclusive and stable social and spatial borders between them, showing they are indeed part of distinct social groups that interact regularly and peacefully with each other.

“It was a very necessary first step,” said Samuni, a postdoctoral fellow in Harvard’s Pan Lab and the paper’s lead author. “Now that we know that despite the fact that they spend so much time together, [neighboring] bonobo populations still have these distinct groups, we can really examine the bonobo model as something that is potentially the building block or the state upon which us humans evolved our way of more complex, multilevel societies and cooperation that extends beyond borders.”

The study is a result of three consecutive years of observing the bonobo community in the Kokolopori reserve from 2017 to 2019. Previous research showed evidence of the 59 bonobos forming four separate groups that routinely crossed paths to interact, groom each other, and share meals. What hasn’t been clear is the extent to which the behavior of these bonobo groups resembles that of chimpanzee subgroups that form within one larger community.

Liran Samuni.

Liran Samuni, a postdoctoral fellow in Harvard’s Pan Lab, is the paper’s lead author.

Kris Snibbe/Harvard Staff Photographer

Primatologists refer to chimp subgroups, which are highly territorial and hostile to those in different communities, as neighborhoods. Essentially, members of these subgroups don’t spend all their time together as part of one large group but are all still part of it, maintaining relationships with each other and (most importantly) not battling each other when they meet.

Bonobos have been far less studied than chimps due to political instability and logistical challenges to setting up research sites in the forests of the Democratic Republic of Congo, the only place where the primates are found. In addition, studying relationships among and between bonobo groups has been further complicated by the fact that subgroups appear to intermingle with some frequency.

“There aren’t really behavioral indications that allow us to distinguish this is group A, this is group B when they meet,” Samuni said. “They behave the same way they behave with their own group members. People are basically asking us, how do we know these are two different groups? Maybe instead of those being two different groups, these groups are just one very large group made up of individuals that just don’t spend all their time together [as we see with chimpanzee neighborhoods].”

To get at the answer, at least two observers from the reserve followed each bonobo group daily from dawn to dusk, recording behavioral and location data that was then analyzed.

The researchers primarily tracked how much time individual bonobos spent together, with whom, and what activities they engaged in. These data fed a statistical method called cluster analysis, which groups data points so that points within the same cluster are more similar to one another than to points in other clusters.

Essentially, they tracked which bonobos shared significant associations with one another, which ones tended to come together for meals more often, which ones tended to stick together when faced with a choice of whom to go with, and which ones interacted more in the same home range. This helped them draw clear distinctions between which bonobos were part of the same group and when members of one group were peacefully interacting with neighboring groups across each other’s borders.
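As a rough illustration of this kind of cluster analysis (the individuals, sighting data, association index, and threshold below are all invented, not from the study), one can compute pairwise association strengths from co-sighting records and merge strongly associated individuals into groups:

```python
from itertools import combinations

# Hypothetical daily "parties": sets of individuals observed together.
parties = [
    {"A1", "A2", "A3"}, {"A1", "A2"}, {"A2", "A3"},
    {"B1", "B2"}, {"B1", "B2", "B3"}, {"B2", "B3"},
    {"A1", "A2", "B1"},  # an occasional peaceful intergroup encounter
]

individuals = sorted(set().union(*parties))

def association_index(a, b):
    """Simple ratio index: fraction of sightings of either animal
    in which the two were seen together."""
    together = sum(1 for p in parties if a in p and b in p)
    either = sum(1 for p in parties if a in p or b in p)
    return together / either if either else 0.0

def cluster(threshold=0.5):
    """Greedy single-linkage clustering via union-find: merge any pair
    whose association index meets the threshold."""
    parent = {i: i for i in individuals}
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for a, b in combinations(individuals, 2):
        if association_index(a, b) >= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for i in individuals:
        groups.setdefault(find(i), set()).add(i)
    return sorted(groups.values(), key=lambda g: sorted(g)[0])

print(cluster())
```

Despite the mixed-group sighting in the data, the occasional encounter is too weak to merge the two clusters — mirroring how stable group boundaries can emerge from association data even when groups regularly meet.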

They compared this to data collected on 104 chimpanzees that lived in the Ngogo community in Uganda’s Kibale National Park between 2011 and 2013.

The researchers found the bonobo clusters were overall more consistent and stable than the subgroups of chimps. This suggests that the bonobos within each cluster had a stronger social preference for one another than was seen within chimpanzee subgroups.

When it comes to the Kokolopori bonobos, this helped the researchers not only confirm the four groups — which they named the Ekalakala, the Kokoalongo, the Fekako, and the Bekako — but also come up with a reliable way to predict which bonobos were most likely to stick together when the different groups of bonobos met and separated.

Samuni and Surbeck, an assistant professor in the Department of Human Evolutionary Biology and the paper’s senior author, say the results show that bonobos, like humans, are capable of complicated relationships outside their immediate core network.

Now that the researchers have firmly established that these bonobos have distinct groups, they want to dig further into what cooperation and trade look like between these groups and whether it can potentially represent what it looked like in our common ancestor. This would help explain how humans, to an extent, overcame antagonism between different groups and developed peaceful cooperation.

Surbeck, who founded and directs the Kokolopori Bonobo Research Project, points out that the window to gain these powerful insights is closing as bonobos near extinction.

“There are very few left,” he said. “We gather here information that potentially will not be available anymore in 50 years if things continue the way they do.”



Scientists looking to measure the biodiversity of wild animals in a nature reserve are taking their lead from leeches.

In a new study led by a team of Harvard researchers, DNA samples extracted from the blood meals of leeches were used to map which animals live in the Ailaoshan National Nature Reserve in Yunnan, China. The findings suggest that the blood-sucking worms may serve as a simpler, much less expensive surveillance instrument for some biodiversity surveys than existing tools such as camera traps and bioacoustic recorders.

“This study shows how leech-derived DNA can be used to estimate biodiversity on a scale that makes it useful as a real-world conservation tool,” said Chris Baker, a postdoctoral fellow in Naomi Pierce’s lab at Harvard and one of the study’s first authors. “We’re offering a way to measure the biodiversity of wild animals and, in particular, a way to measure biodiversity directly.”

The research used DNA extracted from more than 30,000 leeches to survey more than 80 species of vertebrates, including amphibians, mammals, birds, and squamates. The leeches were collected over a three-month period by forest rangers throughout the 260-square-mile nature reserve, which stretches for nearly 80 miles along a mountain ridge in southern China.

The work, published in Nature Communications, addresses a major practical challenge in measuring animal biodiversity over large spaces. Protected areas are often set aside with the goal of conserving wild animal communities, but it is costly and time-consuming to monitor those communities directly.

Ailaoshan National Nature Reserve in Yunnan, China.

Leeches from Ailaoshan National Nature Reserve in Yunnan, China, were used in the study.

“You can set out automated cameras; you can set out acoustic recorders; or you could do it manually with people out into the field to survey things, but it’s difficult to do that on a really large scale,” Baker said. “These surveys tend to be either limited in the spatial scale that they can cover, limited in the frequency with which they can be done, or limited in the resolution that they can provide. We wanted to be able to use environmental DNA as a way to be able to address this problem … instead of having to rely on proxies, like forest cover or the budget of forest rangers.”

Leeches turned out to be perfect for the job.

For starters, they are abundant, at least in tropical environments. They also feed on a broad range of animals, from large bears to small mice. Because leeches don’t travel but lie in wait, their diet represents a kind of record of which animals passed by the spots where they’re found. And the worms digest slowly, so scientists from China’s Kunming Institute of Zoology — who collaborated with the Harvard researchers — could still recover animal blood from the leeches four months after their last feed.

The researchers looked for DNA sequences present only in vertebrates to identify the animals. Past studies have shown this was possible, but this is believed to be the first time that such an analysis has been done at such a large scale.

The Harvard and Kunming Institute researchers coordinated with about 160 volunteer park rangers to do the collections. The team in China extracted DNA from the samples, arranged the sequencing, and investigated to which animals the DNA belonged. Their Harvard counterparts analyzed the locations of the animals using a technique known as multispecies occupancy modeling, which accounts for ecological patterns.
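The article doesn't spell out how multispecies occupancy modeling works, but its core idea can be sketched: each site is either occupied by a species (with probability psi) or not, and an occupied site yields a detection on each survey with probability p, so an all-zero detection history is ambiguous between absence and non-detection. A minimal illustration of the single-site likelihood (the function and parameter names here are illustrative, not from the study):

```python
from math import prod

def detection_history_likelihood(history, psi, p):
    """Likelihood of one species' detection history at one site.

    history: list of 0/1 survey outcomes (1 = species DNA found in that batch)
    psi:     probability the site is occupied by the species
    p:       per-survey detection probability, given occupancy
    """
    # Probability of this exact history if the site is occupied:
    occupied = prod(p if d else (1 - p) for d in history)
    # A history with no detections could also mean the site is unoccupied:
    never_detected = all(d == 0 for d in history)
    return psi * occupied + ((1 - psi) if never_detected else 0.0)

# Three surveys, none detecting the species: absence and repeated misses
# both contribute to the likelihood.
print(detection_history_likelihood([0, 0, 0], psi=0.6, p=0.5))  # 0.475
# Two detections: only the "occupied" branch contributes.
print(detection_history_likelihood([1, 0, 1], psi=0.6, p=0.5))  # 0.075
```

Fitting such a model jointly across many species and sites (and with covariates for elevation, distance to the reserve edge, and so on) is what lets the researchers separate true absence from imperfect detection.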

The team was able to identify 86 vertebrate species. Some of those are listed as near-threatened or threatened by the International Union for Conservation of Nature. These included the Asiatic black bear, the tufted deer, the stump-tailed macaque, several types of frogs, and an antelope-like creature called a serow.

The study also showed encroaching pressures on the reserve from human activity such as farming, livestock management, and poaching. DNA from cows, sheep, and goats, for example, was recovered from leeches collected within the reserve, especially close to its edges. This suggests that animals from surrounding farmland are grazing inside the reserve, competing for resources with the wild animals there or otherwise degrading the habitat.

Researchers are optimistic that the results from their study can be used as a baseline to help track changes in Ailaoshan’s wild animal populations going forward and that the method could grow into a strategy for improving monitoring of wild animals in tropical and subtropical areas where leeches are abundant.

They also see broader applicability in tracking zoonotic disease reservoirs, since leech blood meals can also be screened for the viruses they contain. This is especially relevant considering the virus behind the COVID-19 pandemic may have been harbored by animals, such as bats, before transferring to humans.

“It’s a pretty effective way to sample a great diversity of wild animals, and if we think that [zoonotic disease reservoirs] are really something to worry about and monitor, this is a good way to do it,” said Pierce, Hessel Professor of Biology in OEB and Curator of Lepidoptera in the Museum of Comparative Zoology.

The work was supported by a grant from the Harvard Global Institute.



Late spring is typically prime time for weddings and graduations, but this year a global helium shortage worsened by Russia’s invasion of Ukraine has forced retailers like Dollar Tree and Party City to warn party planners that gas-filled balloons may be in short supply at times.

While that threatens to slightly dampen seasonal festivities, for physicists like Harvard Professor Amir Yacoby, tight supplies of the noble gas in recent months actually threaten to halt research in its tracks. “This is a tremendous blow,” said Yacoby, who estimates about 40 percent of his lab’s activity has been negatively impacted. “This crisis is not going to go away quickly.”

It is one that is affecting researchers everywhere. In physics, engineering, chemistry, biology, and medical research, helium and liquid helium are employed whenever cold environments are needed for experiments, including cooling large magnets, MRI machines, or mass spectrometers, or slowing down atoms in condensed-matter physics research. Some 16 Nobel Prizes have been generated by work done using liquid helium, showing how much of a workhorse it has become in these fields because of distinct characteristics of both the gas and liquid.

At Harvard, researchers may have to shut down pieces of expensive technical equipment that rely on helium and liquid helium, the super cold liquid version of the gas. In some cases, this could cause irreparable damage to the instruments and force some of the scientists to bring lines of research to a halt. Some ripple effects could include graduation delays for students whose thesis work depends on those projects.

“We are already in that worst-case scenario,” said FAS Dean of Science Christopher W. Stubbs. “The supply has been cut in half, so half of the experiments that rely on liquid helium have been shut down as a result. This impedes progress on both the scientific and educational aspects of our division.”

Helium is the second-most-abundant element in the universe, but on Earth it’s relatively rare. It results from the decay of uranium, can’t be artificially created, and is produced as a byproduct of natural gas refinement. Only a limited number of countries produce it, with the U.S. and Russia among top suppliers. Because that’s the case, it only takes a handful of supply disruptions to trigger a crisis — the gas industry refers to the current one as “Helium shortage 4.0,” it being the fourth since 2006.

This latest shortfall began last year and started gaining steam in late winter. It was triggered by a confluence of world events, including global supply-chain problems brought on by the pandemic and worsened by the war in Ukraine, as well as planned and unplanned shutdowns at major producers — such as a mid-January leak in the U.S.’s helium reserve in Texas after a four-month scheduled shutdown and an October fire and a January explosion that closed a major Russian facility.

At Harvard, the shortage has already forced some researchers who rely heavily on the element to make painful decisions to slow down projects.

Philip Kim, a professor of physics and applied physics, says he’s had to shut down about half of his lab’s research activities that rely on cryogenic instruments.

“We use liquid helium to achieve low temperature for our experiment in several different cryostats and in applying high magnetic fields using superconducting magnets [to study the physical properties of low-dimensional quantum materials],” said Kim, whose lab goes through about 3,000 liters of liquid helium per month. “We have six cryostats [we are] actively using and had to shut off three.”

Kim worries about the effect it will have on the research and futures of lab members.

“Graduation of my graduate students might be delayed due to slow-down of their thesis work,” he said. “Postdoctoral research fellows’ research projects are also [slowed] down as they also need to wait to be able to perform their experiments.”

Charles Vidoudez at the Harvard Center for Mass Spectrometry is starting to lose sleep over the shortage. Vidoudez, the center’s principal research scientist, uses helium to keep four of the facility’s mass spectrometers at the extremely low pressures they need to operate. A shutdown would affect dozens of labs that depend on the center to perform a range of analyses using the machines. Vidoudez has spent countless hours calling or emailing just about every supplier that he could find.

“Most just either don’t answer or the ones that do answer say, ‘We don’t take new customers at the moment,’” Vidoudez said. “It’s been a real struggle.”

Administrators are trying to help researchers become more efficient with their use of liquid helium and reduce dependency on it. Stubbs and Sarah Lyn Elwell, the FAS Division of Science’s assistant dean for research, are co-chairing a committee focusing on helium conservation and allocation to manage their way through the crisis. There is also the division’s Helium Recovery Facility at 38 Oxford St., which came online in 2012. The facility recovers liquid helium boil-off from 12 participating labs within the FAS. It then purifies and reliquefies the boiled-off helium and dispenses it to the labs again at reduced prices.

The facility has felt the impacts of the shortage, too, said Markos Hankin, Harvard’s helium liquefier engineer. Since boil-off capture is not 100 percent, the facility must use an outside supply to make up for what is lost. Their supplier allocated them 60 percent of their usual supply. That became 45 percent by the end of March.

For now, the facility has been able to supply normal levels to labs that use small amounts of liquid helium, but it has had to ration quantities for labs like Kim and Yacoby’s. The facility is coordinating with the conservation committee as well as the University’s Strategic Procurement Office to try to find helium and develop alternate solutions.

The Nuclear Magnetic Resonance (NMR) Core Facility, which supports more than 30 research groups and over 200 active researchers in the Harvard community, is one of the labs that the recovery facility has been able to keep running at normal levels because it uses only about 1,500 liters per year.

Anthony Lowe, the electronics technician at the instrumentation center, is responsible for the NMR lab’s liquid helium and helium gas supplies. He is particularly appreciative of the efforts because the liquid helium keeps the large magnets inside the spectrometers at about 450 degrees below zero Fahrenheit. Without it, they would have to ramp down the NMR magnets. The machines, which cost from half a million to $2 million, aren’t meant to be turned off, however, and there’s a possibility that the NMR magnets would be permanently damaged. “It’s a risky proposition,” Lowe said.
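That “about 450 degrees below zero” figure checks out: liquid helium boils near 4.2 K at atmospheric pressure, and a quick back-of-the-envelope conversion (a sketch, not from the article) puts that at roughly minus 452 degrees Fahrenheit:

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature from kelvins to degrees Fahrenheit."""
    return k * 9 / 5 - 459.67

# Liquid helium's boiling point at 1 atm is about 4.2 K.
print(round(kelvin_to_fahrenheit(4.2), 1))  # -452.1
```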

Lowe hopes the shortage gets better soon, but — like many — he worries about the future of helium in general.

“It’s a finite resource,” Lowe said. “In our lifetime it might not run out, but for humanity it has a finite supply. We can’t make any more.”




Here’s what happens when you get an infection: The nervous system talks to the immune system to figure out that the body is under attack and then orchestrates a series of behavioral and physiological alterations that manifest as the unpleasant symptoms of sickness. For neuroscientists, longstanding questions have been: How and where does this happen in the brain?

Harvard researchers from the labs of Catherine Dulac and Xiaowei Zhuang sought the answer in the brains of mice.

In a study published in Nature, the researchers and their collaborators describe finding a small group of neurons near the base of the brain that can induce symptoms of sickness, including fever and appetite loss.

The neurons, which have not been described previously, are found in the hypothalamus, which controls key homeostatic functions that keep the body in a balanced, healthy state. The researchers discovered receptors in the neurons that are capable of detecting molecular signals coming from the immune system, an ability most neurons don’t have.

“It was important for us to establish this general principle that the brain can even sense these immune states,” said Jessica Osterhout, a postdoctoral researcher in the Dulac Lab and the study’s lead author. “This was poorly understood before.”

The researchers found that the key area of the hypothalamus sits right next to a permeable section of the blood-brain barrier, which controls which substances pass from the bloodstream into the brain.

“What’s happening is that the cells of the blood-brain barrier that are in contact with the blood and with the peripheral immune system get activated and these non-neuronal cells secrete cytokines and chemokines that, in turn, activate the population of neurons that we found,” said Dulac, Lee and Ezpeleta Professor of Arts and Sciences and Higgins Professor of Molecular and Cellular Biology.

The hope is that scientists can one day apply the research to humans, reversing the process when it becomes a health threat.

A fever, for instance, is typically a healthy reaction that helps eliminate a pathogen. But when it gets too high, it becomes dangerous. The same can be said for loss of appetite or reduced thirst, either of which can, at first, be beneficial. But a sustained lack of nutrients or hydration can start to impede recovery.

“If we know how it works, perhaps we can help patients who have difficulty with these kinds of symptoms, like chemo patients or cancer patients, for example, who have a very low appetite but there’s really nothing we can do for them,” Osterhout said.

The work started as an effort to examine the “fever effect” in autism patients, a phenomenon in which autism symptoms fade as a patient experiences symptoms of infection. The goal was to find the neurons that generate fever and link them to the neurons that are involved with social behavior.

Instead, Osterhout found many populations of neurons that are activated when an animal is sick. She zeroed in on about 1,000 neurons in the ventral medial preoptic region of the hypothalamus because of their proximity to the blood-brain barrier.

Osterhout injected the mice with pro-inflammatory agents that mimic bacterial or viral infection. She analyzed the areas that lit up in the brain scans, and then used a powerful and precise set of methods, chemogenetics and optogenetics, to control and investigate the connectivity among the different neuronal populations. Using these tools, the researchers were able to activate or silence the neurons in the brains of mice and pin down their function by observing what happened.

The researchers found that they could increase body temperature in the mice, increase warmth-seeking behavior, and decrease appetite. The neurons described in the report project to 12 brain areas, some of which are known to control thirst, pain sensation, and social interactions. This suggests that other sickness behaviors may be affected by the neuron activity in the area.

During the experiments, the scientists also noticed increased activation of this population of neurons when signaling molecules from the immune system were elevated. This suggests that the brain and the immune system were communicating with each other through paracrine signaling at the ventral medial preoptic area and the blood-brain barrier right next to it. Paracrine signaling is when cells produce a signal to trigger changes in nearby cells.

Osterhout said the process expanded her understanding of how neurons work.

“As a neuroscientist, we often think of neurons activating other neurons and not that these other paracrine-type or secretion-type methods are really critical,” she said. “It changed how I thought about the problem.”


