October 2023

Wondering is a series of random questions answered by experts. The Medical School’s Aleksandra Stankovic is an aerospace psychologist and spaceflight biomedical researcher who studies how to optimize human performance and behavioral health in extreme operational environments. We asked her how a person gets ready to travel to space.

The spaceflight environment presents many challenges — technical, physical, and psychological. With more people having access to space travel today than ever before, successful and safe spaceflights require varying levels of preparation before launch day.

Government astronaut candidates undergo a rigorous two-year initial training period before qualifying for flight assignment. This training includes learning about Space Station and flight vehicle systems, studying orbital mechanics, becoming proficient in emergency procedures (such as handling fire, cabin depressurization, or medical issues), conducting flight training in T-38 jets (to build quick decision-making skills in high-performance aircraft), and developing Russian language skills (since international space missions involve collaboration among astronauts from various countries).

To prepare for the microgravity environment of space, astronauts also participate in simulations of weightlessness, including parabolic flights and training in the Neutral Buoyancy Lab, a large swimming pool where astronauts practice conducting spacewalks and learn to perform tasks in their pressurized spacesuits. Astronauts complete survival training and learn to cope with extreme conditions — a crucial skill in case of an emergency landing back on Earth in the water or in very cold locations like Siberia. They are trained to operate the robotic arm that is used for tasks such as capturing cargo spacecraft.

Once they receive a flight assignment, astronauts complete an additional 18 months of mission-specific training. They simulate various mission scenarios — including launch, rendezvous, and docking — and emergency procedures. Additionally, they undergo extensive training on the scientific experiments they’ll be conducting, like how to work with equipment, collect samples, and handle data.

Since maintaining physical fitness is vital for astronauts to counteract the muscle and bone loss experienced in microgravity, they spend a lot of time preflight working out. At the same time, long-duration space missions can be mentally challenging, given the prolonged isolation, confinement, and separation from family and friends. Astronauts learn strategies to manage stress, maintain psychological well-being, and work effectively in close environments with their fellow crewmembers.

Commercial astronaut training is significantly less intensive than the training government-sponsored astronauts receive, since commercial missions are often shorter and focus more on providing safe and enjoyable flying experiences. While commercial crews may stay in space for intervals ranging from a few minutes on a suborbital flight to several days or even weeks on the Space Station, government astronauts typically spend six months or more on the station. (Astronaut Frank Rubio recently set the record for longest American space mission with 371 consecutive days in space; cosmonaut Valeri Polyakov, who logged 437 continuous days in orbit on Russia’s Mir space station between 1994 and 1995, still holds the world record.)

Commercial astronauts often receive more generalized training that covers the basics of space travel and safety/emergency procedures. Anyone who spends prolonged periods in space will need to spend much of the day working out to keep their body strong and healthy for the return home. Everyday activities can be challenging without gravity, and sleeping can be difficult without the normal light cues from the sun that our bodies rely upon on Earth to regulate our circadian rhythms. A combination of technology and training helps space travelers adapt.

As more people travel to space, on an expanding range of flight vehicles and for varying types of missions, spaceflight preparation too will undoubtedly continue to evolve. It’s an exciting time to be studying how to keep humans safe and healthy in space, and researchers like me are thrilled to be a part of enabling this next great wave of human space exploration!

— As told to Anna Lamb/Harvard Staff Writer



By combining noninvasive imaging techniques, investigators have created a comprehensive cellular atlas of a region of the human brain known as Broca’s area — an area critical for producing language.

The new technology will provide insights into the presence and spread of pathologic changes that occur in neurological illnesses — such as epilepsy, autism, and Alzheimer’s disease — as well as psychiatric illnesses.

Until now, scientific advances have not produced the undistorted 3D images of cellular architecture needed to build accurate and detailed models. In new research published in Science Advances, a team led by investigators at Harvard-affiliated Massachusetts General Hospital has overcome this challenge, achieving the detailed resolution needed to study brain function and health.

Using sophisticated imaging techniques — including magnetic resonance imaging, optical coherence tomography, and light-sheet fluorescence microscopy — the researchers were able to prevail over the limitations associated with any single method to create a high-resolution cell census atlas of a specific region of the human cerebral cortex, or the outer layer of the brain’s surface. The team created such an atlas for a human postmortem specimen and integrated it within a whole-brain reference atlas.

Atlas overview

Overview of the new pipeline. Human brain samples are imaged at multiple scales with multiple modalities (MRI, OCT, and light-sheet fluorescence microscopy, or LSFM).

MGH

“We built the technology needed to integrate information across many orders of magnitude in spatial scale from images in which pixels are a few microns to those that image the entire brain,” says co–senior author Bruce Fischl, director of the Computational Core at the Athinoula A. Martinos Center for Biomedical Imaging at MGH and a professor of radiology at Harvard Medical School.
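To make that scale gap concrete: a micron-resolution microscopy tile has to be brought down to the millimeter-scale grid of a whole-brain MRI before the two can be overlaid. The snippet below is a minimal NumPy sketch of that resampling step only, with made-up pixel sizes and array shapes; it illustrates the orders-of-magnitude change in scale and is not the team's actual registration pipeline.

```python
import numpy as np

def block_average(patch: np.ndarray, factor: int) -> np.ndarray:
    """Downsample a 2D image by averaging non-overlapping factor x factor blocks."""
    h, w = patch.shape
    h2, w2 = h - h % factor, w - w % factor          # trim so the grid divides evenly
    trimmed = patch[:h2, :w2]
    return trimmed.reshape(h2 // factor, factor, w2 // factor, factor).mean(axis=(1, 3))

# Hypothetical numbers: a light-sheet tile sampled at ~2 microns/pixel,
# and an MRI reference grid at ~1 mm/voxel, a factor of ~500 in scale.
microscopy_patch = np.random.rand(4000, 4000)        # stand-in for an 8 mm x 8 mm LSFM tile
mri_resolution_patch = block_average(microscopy_patch, 500)

print(microscopy_patch.shape, "->", mri_resolution_patch.shape)  # (4000, 4000) -> (8, 8)
```

In the real pipeline each modality also needs distortion correction and registration before it can be placed in the whole-brain reference atlas; the block averaging above only captures the change in resolution.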

Ultimately, the methods in this study could be used to reconstruct undistorted 3D cellular models of particular brain areas as well as of the whole human brain, enabling investigators to assess variability between individuals and within a single individual over time.

“These advances will help us understand the mesoscopic structure of the human brain, which we know little about: structures that are too large and geometrically complicated to be analyzed by looking at 2D slices on the stage of a standard microscope, but too small to see routinely in living human brains,” says Fischl.

“Currently we don’t have rigorous normative standards for brain structure at this spatial scale, making it difficult to quantify the effects of disorders that may impact it such as epilepsy, autism, and Alzheimer’s disease.”

Additional co-authors include Irene Costantini, Leah Morgan, Jiarui Yang, Yael Balbastre, Divya Varadarajan, Luca Pesce, Marina Scardigli, Giacomo Mazzamuto, Vladislav Gavryusev, Filippo Maria Castelli, Matteo Roffilli, Ludovico Silvestri, Jessie Laffey, Sophia Raia, Merina Varghese, Bridget Wicinski, Shuaibin Chang, Anderson Chen I-Chun, Hui Wang, Devani Cordero, Matthew Vera, Jackson Nolan, Kimberly Nestor, Jocelyn Mora, Juan Eugenio Iglesias, Erendira Garcia Pallares, Kathryn Evancic, Jean Augustinack, Morgan Fogarty, Adrian V. Dalca, Matthew Frosch, Caroline Magnain, Robert Frost, Andre van der Kouwe, Shih-Chi Chen, David A. Boas, Francesco Saverio Pavone, and Patrick R. Hof.

Support for this research was provided in part by the BRAIN Initiative Cell Census Network, the National Institute for Biomedical Imaging and Bioengineering, the National Institute on Aging, the National Institute of Mental Health, the National Institute for Neurological Disorders and Stroke, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the Chan Zuckerberg Initiative DAF (an advised fund of Silicon Valley Community Foundation), Shared Instrumentation, the NIH Blueprint for Neuroscience Research, the European Union’s Horizon 2020 research and innovation Framework Programme, the European Union’s Horizon 2020 Framework Programme for Research and Innovation, Marie Skłodowska-Curie, the Italian Ministry for Education in the framework of the Euro-Bioimaging Italian Node, the European Research Council, Alzheimer’s Research UK, the National Institutes of Health, and “Fondazione CR Firenze” (private foundation) Human Brain Optical Mapping.



Having pulled themselves from the water 360 million years ago, amphibians are our ancient forebears, the first vertebrates to inhabit land. 

Now, this diverse group of animals faces existential threats from climate change, habitat destruction, and disease. Two Harvard-affiliated scientists from India are drawing on decades of study — and an enduring love for the natural world — to sound a call to action to protect amphibians, and in particular, frogs.

Sathyabhama Das Biju, a Harvard Radcliffe Institute fellow and a professor at the University of Delhi, and his former student Sonali Garg, now a biodiversity postdoctoral fellow at Harvard’s Museum of Comparative Zoology, are co-authors of a sobering new study in Nature, featured on the journal’s print cover, that assesses the global status of amphibians. It is a follow-up to a 2004 study about amphibian declines.

Biju and Garg are experts in frog biology who specialize in the discovery and description of new species. Through laborious fieldwork, they have documented more than 100 new frog species across India, Sri Lanka, and other parts of the subcontinent.

According to the Nature study, which evaluated more than 8,000 amphibian species worldwide, two out of every five amphibians are now threatened with extinction. Climate change is one of the main drivers. Habitat destruction and degradation from agriculture, infrastructure, and other industries are the most common threats to these animals.

Biju and Garg are among more than 100 scientists who contributed their data and expertise to the report, which shows that nearly 41 percent of amphibian species are threatened with extinction, compared with 26.5 percent of mammals, 21.4 percent of reptiles, and 12.9 percent of birds.

Frogs, says Biju, are excellent model organisms to study evolution and biogeography because of the extreme diversity of traits they acquired over millennia. They are also very sensitive to abrupt changes in their environment, including droughts, floods, and storms, which makes them a barometer for assessing the health of an ecosystem.

Indian Purple Frog.

The Indian Purple Frog, first described by Sathyabhama Das Biju in 2003.

“But very frankly, what drives me the most is their beauty and diversity in shapes, form, colors, as well as behaviors,” said Biju, who has dedicated 30 years to frog taxonomy across biodiversity hotspots in or near India, rising to fame through his formal description in 2003 of the Indian Purple Frog. He is known as the Frogman of India.

India is home to one of the most diverse frog populations in the world, with more than 460 documented species. Of those about 41 percent are considered threatened, according to Biju. Habitat destruction and degradation from cultivation of tea, coffee, spices, and other products pose the most danger to the animals.

As a Radcliffe Fellow, Biju is focused on “outpacing nameless extinctions” — saving frogs before they go extinct without being classified or even recognized. He is looking to understand key areas within biodiversity hotspots for effective conservation planning. He is also writing a book — filled with fieldwork photos — on amphibians of India.

“Without understanding the species themselves, and properly identifying them and their geographic distributions, no meaningful conservation planning can be undertaken,” Biju said. “Unless we know what we have, we cannot know what we need to conserve, and where we need to conserve.”

Garg remembers a time, during fieldwork in the Western Ghats mountain range, when she held a frog so small it could sit on the tip of her finger. It was a moment of striking contrast with the everyday puddle-hoppers that surrounded her in the small Indian village where she grew up. “I never thought they could be so beautiful,” she said. “There was so much to discover, and they just became a calling.”

She joined Biju’s lab at the University of Delhi as a graduate student to find, identify, name, and better understand these species. She has done extensive fieldwork in India, Sri Lanka, the Western Ghats, the Himalayas, and Indo-Burma. Her research focuses on capturing the diversity of frogs in India using integrative taxonomy, which combines multiple lines of evidence to classify organisms, as well as elucidating their evolutionary histories. She has deepened this work by incorporating DNA sequencing and CT scanning.

Günther’s Shrub Frog
Franky's Narrow-Mouthed Frog

Günther’s shrub frog was recently rediscovered in the wild, 136 years after its original description. Franky’s narrow-mouthed frog is among the threatened species.

Credit: S.D. Biju and Sonali Garg

Both she and Biju are using the vast herpetological collection of Harvard’s Museum of Comparative Zoology to inform their studies and provide benchmarking against potentially new species they uncover. According to MCZbase, the museum’s online specimen database, the Herpetology Department’s permanent research collection has 117,165 frog specimens, with 223 from India. The Vertebrate Paleontology and Special Collections departments hold additional specimens.

The researchers have begun a fruitful collaboration with James Hanken, Harvard’s Alexander Agassiz Professor of Zoology and curator of herpetology at the Museum of Comparative Zoology. Hanken is an expert on amphibian morphology, with a special emphasis on salamanders. He hosts Garg as a postdoctoral fellow in his lab, and recently joined the frog specialists on three field expeditions to India — two to the Himalayas, on the border with Tibet and Nepal, and another to the Western Ghats.

“In terms of amphibians that I saw, it was like going to the moon,” Hanken said. “It’s very exciting as a biologist to be immersed in a group of organisms that are completely new to you.”

Hanken and the Indian scientists plan to publish research describing frogs found in India, including their historical migration patterns, reproductive behavior, and genetic variation.

As for conserving endangered species, and the bleak picture the Nature study depicts, knowledge must lead to action, Biju said.

“Governments, individuals, and organizations need to join efforts to scale up global conservation action for amphibians to make sure they are thriving in nature,” he said. “Otherwise the ongoing amphibian crisis will have devastating effects for ecosystems and the planet.”

In some instances, conservation strategies have worked, added Garg. According to the Nature study, 63 species previously considered endangered have improved their status since 2004 due to concerted conservation efforts.

“There is hope,” Garg said. “Scaled up research and conservation efforts can play an important role in making sure amphibians are not just surviving, but also thriving in nature.”



Climate change is raising sea levels, creating stronger and wetter storms, melting ice sheets, and fostering conditions for more and worse wildfires. But as cities around the world warm, climate change’s complex global picture often comes down to this: Residents say they are just too hot.

Jane Gilbert, one of the nation’s first official “heat officers,” works in Miami-Dade County. She said South Florida may be suffering the effects of sea level rise and is in the crosshairs of stronger and more frequent hurricanes, but residents testifying at 2020 hearings on climate-change impacts on low-income neighborhoods repeatedly said the biggest impact was the heat.

Panelists gathered at the Harvard Graduate School of Education’s Longfellow Hall last Friday for an event on the “Future of Cities” in a warming world said the topic is particularly relevant this year, when global temperatures soared to new records. As Gilbert spoke on the Cambridge campus on a cool fall afternoon, the heat index in Miami was 109 degrees, just the latest of more than 60 days this year that have seen heat indices higher than 105 degrees.

The summer of 2023 was Earth’s hottest since global records began in 1880, according to NASA scientists. This animated map shows monthly temperature changes from summer 1880 to summer 2023.

Credit: NASA

Satchit Balsari conducts research in Gujarat among members of India’s largest labor union for women in the nation’s informal economy, millions of people who are already living with a global climate that has warmed by 1 degree Celsius. While that rise may seem small, the global average is experienced in some areas through much wider daily swings, in the form of longer and hotter heat waves, warmer winters, higher nighttime temperatures, and more extreme weather events such as stronger storms or wildfires.

One thing that has become apparent, said Balsari, an assistant professor of global health and population at the Harvard T.H. Chan School of Public Health, is that when talking about individuals, microenvironments matter much more than global averages, because those environments are what affect people as they live and work.

Balsari shared stories of a street vendor and of a weaver who works in a building whose rooftop temperature was 10 to 15 degrees above that of the surrounding area, and who put up awnings to create shade from the sun, only to have them taken down because they blocked security cameras.

“It’s very hot, and it cools down a little bit at night, but in their work environment, in the lived experience in their homes, there’s this constant experience of ‘It’s too hot,’” said Balsari, who is also an assistant professor of emergency medicine at Harvard Medical School.

As hot as this year has been globally, experts who gathered for the event only expect it to get hotter in the decades to come.

“This is an issue for the long run. Yes, things are bad now. We’re at 1.3, 1.2 (degrees Celsius above preindustrial temperatures) now; we’re going to blow through 1.5. We’re going to probably blow through 2,” said James Stock, vice provost for climate and sustainability and director of the Harvard Salata Institute for Climate and Sustainability. “It gets worse nonlinearly really quickly.”

Stock offered closing remarks at the event, which wrapped up Worldwide Week at Harvard and included lectures, performances, exhibitions, and other events across campus to highlight the ways in which the University interacts and intersects with the world around it through the sciences, arts, culture, politics, and other disciplines.

Joining Stock, Balsari, and Gilbert were Spencer Glendon, founder of the nonprofit Probable Futures; Francesca Dominici, co-director of the Harvard Data Science Initiative; Zoe Davis, climate resilience project manager for the city of Boston; and moderator John Macomber, senior lecturer at Harvard Business School. Harvard Provost Alan Garber and Mark Elliott, vice provost for international affairs, offered opening remarks.

Panelists agreed that better data collection is key to adapting solutions to circumstances that vary widely even across small geographic areas. Interventions such as providing vulnerable populations with air conditioners, for example, may be valuable in low-income communities, but less so in nearby communities with wealthier residents.

In Miami-Dade County, Gilbert said, air conditioners are considered life-saving equipment to the extent that, after Hurricane Irma, the state required nursing homes to have back-up power supplies so that residents could be cooled even in a power outage. ZIP codes with the highest land temperatures — which also tend to be low-income neighborhoods — have four times the rate of hospital admissions during heat waves as other parts of the region.

Gilbert echoed other panelists in calling for better, more granular data through more widespread use of sensors, including wearable sensors that can record heat impact on individuals. With different microclimates affecting different people, different jobs — whether someone is in an office or working at a construction site — also matter, both to public health officials and business leaders. Estimates of the potential economic impact of extreme heat in the Miami metro area are around $10 billion per year in lost productivity.

Nonprofit leader Glendon said we’re entering an unprecedented climate era. Humans were nomadic, regularly moving to where conditions were best, until about 10,000 years ago, when the temperature stabilized to the narrow range that we now consider normal. Centered in the range that humans prefer, climate stability helped foster human settlement and the rise of civilizations.

In the 10,000 years since, Glendon said, everything we’ve created, from building designs to cultural practices, has been made with the unstated assumption that this stable temperature regime — averaging roughly 60 degrees Fahrenheit — will continue. Recent decades’ warming and the projected warming in the decades to come will push heat and humidity in some places beyond the range that the human body can cool itself, with unknown consequences for societies.

“Everything is built on that stability, on the assumption that those ranges are fixed,” Glendon said. “It’s in building codes, grades of asphalt, architecture. … Those ranges are embodied so they became unconscious, but we need to make them conscious, and ideally they motivate us to avoid 2, 2½, or 3 degrees.”



When an algorithm-driven microscopy technique developed in 2021 (and able to run on a fraction of the images earlier techniques required) isn’t fast enough, what do you do?

Dive DEEPer, and square it. At least, that was the solution used by Dushan Wadduwage, John Harvard Distinguished Science Fellow at the FAS Center for Advanced Imaging.

Scientists have worked for decades to image the depths of a living brain. They first tried fluorescence microscopy, a century-old technique that relies on fluorescent molecules and light. However, the wavelengths used weren’t long enough, and the light scattered before it could penetrate to an appreciable depth.

The invention of two-photon microscopy in 1990 allowed longer wavelengths of light to shine onto the tissue, causing fluorescent molecules to absorb not one but two photons. The longer wavelengths used to excite the molecules scattered less and could penetrate farther.

But two-photon microscopy can typically only excite one point on the tissue at a time, which makes for a long process requiring many measurements. A faster way to image would be to illuminate multiple points at once using a wider field of view, but this, too, had its drawbacks.

“If you excite multiple points at the same time, then you can’t resolve them,” Wadduwage said. “When it comes out, all the light is scattered, and you don’t know where it comes from.”

To overcome this difficulty, Wadduwage’s group began using a special type of microscopy, described in Science Advances in 2021. The team excited multiple points on the tissue in a wide-field mode, using different pre-encoded excitation patterns. This technique — called De-scattering with Excitation Patterning, or DEEP — works with the help of a computational algorithm.

“The idea is that we use multiple excitation codes, or multiple patterns to excite, and we detect multiple images,” Wadduwage said. “We can then use the information about the excitation patterns and the detected images and computationally reconstruct a clean image.”
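As a rough illustration of that idea, the sketch below sets up a toy one-dimensional version of patterned excitation: each detected measurement is modeled as the object multiplied by a known binary pattern and then blurred by a scattering kernel, and the clean object is recovered by solving the resulting linear system. The forward model, sizes, and solver here are simplified stand-ins chosen for clarity, not the published DEEP algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_patterns = 64, 16                     # 1D "image" with 64 pixels, 16 excitation patterns

# Unknown object we want to recover (a few bright spots on a dim background).
x_true = 0.1 * rng.random(n)
x_true[rng.choice(n, 5, replace=False)] += 1.0

# Scattering model: detected emission is blurred by a broad Gaussian kernel.
idx = np.arange(n)
blur = np.exp(-((idx[:, None] - idx[None, :]) ** 2) / (2 * 6.0 ** 2))
blur /= blur.sum(axis=1, keepdims=True)

# Known binary excitation patterns; each detected image is blur @ (pattern * object).
patterns = rng.integers(0, 2, size=(n_patterns, n)).astype(float)
measurements = np.stack([blur @ (p * x_true) for p in patterns])   # noiseless, for simplicity

# Reconstruction: stack the per-pattern forward operators and solve a least-squares problem.
A = np.vstack([blur @ np.diag(p) for p in patterns])
b = measurements.ravel()
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The point of the patterning is that each known pattern tags a different subset of pixels, so the combined system of measurements contains enough information to undo the blur even though no single scattered image does.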

The results are comparable in quality to images produced by point-scanning two-photon microscopy. Yet they can be produced with just hundreds of images, rather than the hundreds of thousands typically needed for point-scanning. With the new technique, Wadduwage’s group was able to look as far as 300 microns deep into live mouse brains.

Still not good enough. Wadduwage wondered: Could DEEP produce a clear image with only tens of images?

In a recent paper published in Light: Science and Applications, he turned to machine learning to make the imaging technique even faster. He and his co-authors used AI to train a neural network-driven algorithm on multiple sets of images, eventually teaching it to reconstruct a perfectly resolved image with only 32 scattered images (rather than the 256 reported in their first paper). They named the new method DEEP-squared: Deep learning powered de-scattering with excitation patterning.

The team took images produced by typical two-photon point-scanning microscopy, providing what Wadduwage called the “ground-truth.” They then used a physics-based computational model of the DEEP image-formation process to simulate scattered input images, which served as training data for the DEEP-squared AI model. Once the AI produced reconstructed images that resembled Wadduwage’s ground-truth reference, the researchers used it to capture new images of blood vessels in a mouse brain.
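A schematic version of that training loop might look like the PyTorch sketch below: ground-truth images pass through a stand-in physics simulator to produce stacks of 32 patterned, scattered inputs, and a small network is trained to map those stacks back to the clean image. The network architecture, the simulator, and all sizes are hypothetical placeholders, not the DEEP-squared model itself.

```python
import torch
import torch.nn as nn

class ToyDescatterNet(nn.Module):
    """Toy stand-in for a DEEP-squared-style network:
    maps a stack of 32 patterned, scattered images to one clean image."""
    def __init__(self, n_inputs: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_inputs, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def simulate_deep_measurements(clean: torch.Tensor, n_patterns: int = 32) -> torch.Tensor:
    """Hypothetical physics simulator: patterned excitation followed by a blur
    standing in for tissue scattering (a crude placeholder, not the real model)."""
    b, _, h, w = clean.shape
    patterns = (torch.rand(b, n_patterns, h, w) > 0.5).float()
    excited = patterns * clean                      # pattern-modulated emission
    blur = nn.AvgPool2d(5, stride=1, padding=2)     # stand-in for scattering
    return blur(excited)

model = ToyDescatterNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):                             # toy training loop
    clean = torch.rand(4, 1, 64, 64)                # stand-in for ground-truth 2P images
    scattered = simulate_deep_measurements(clean)   # simulated 32-image DEEP input
    loss = loss_fn(model(scattered), clean)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In the published work the simulated inputs are derived from real point-scanned mouse-brain images rather than random tensors; the sketch only shows the shape of the simulate-then-train pipeline.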

“It is like a step-by-step process,” Wadduwage said. “In the first paper we worked on the optics side and reached a good working state, and in the second paper we worked on the algorithm side and tried to push the boundary all the way and understand the limits. We now have a better understanding that this is probably the best we can do with the current data we acquire.”

Still, Wadduwage has more ideas for boosting the capabilities of DEEP-squared, including improving instrument design to acquire data faster. He said DEEP-squared exemplifies cross-disciplinary cooperation, as will any future innovations on the technology.

“Biologists who did the animal experiments, physicists who built the optics, and computer scientists who developed the algorithms all came together to build one solution,” he said.



Evidence of the clean-energy transition abounds, with solar panels dotting rooftops, parking lots, and open spaces. In Massachusetts, future proliferation of these sunlight-soaking cells will be a high priority: About five times more solar energy will be needed to reach the state’s goal of net-zero greenhouse gas emissions by 2050.

But at what cost? Harvard Forest researchers have co-authored a landmark report detailing how many projects have required the clearing of carbon-absorbing forested areas, unnecessarily harming nature as well as undercutting environmental progress. The report, written with Mass Audubon, says stronger land-use and incentive policies would allow a smoother transition to clean energy sources without sacrificing more forests and farmlands.

“Growing Solar, Protecting Nature” was co-authored by Jonathan Thompson, research director at the Harvard Forest, a 4,000-acre natural laboratory that houses research and education in forest biology, ecology, and conservation. In their analysis, Thompson and collaborator Michelle Manion of Mass Audubon outline scenarios and recommendations for smart, sustainable solar development in Massachusetts.

“We found that we can achieve the commonwealth’s clean energy targets with very little impact on natural and working lands,” said Thompson, whose team specializes in geospatial analysis and land-use impacts to forest ecosystems. Over the past year, Thompson led the land-use and carbon modeling that portrayed different future solar impacts across the state under different scenarios. The team worked closely with Evolved Energy Research, which provided energy and economic consulting.

Since 2010, more than 500 ground-mount solar projects have been developed across the state, covering 8,000 acres, of which about 60 percent are forest acres, according to the report. This illustrates a terrible irony: Deploying solar often means cutting back on tree cover and losing the climate change mitigation it provides through pulling carbon dioxide from the air, storing the carbon, and releasing oxygen into the atmosphere.

As the authors make clear, clean energy sources aren’t enough to meet climate goals. Removing carbon from the atmosphere is just as important, and Massachusetts’ famously abundant forests are a primary means to that end. Beyond the beautiful green canopies they provide, forests are a critical, natural carbon sink, and the clean energy transition can’t happen without them.

“We need to think not only about how many acres we’re using for solar development, but also which acres are being developed,” Thompson said. “Our core forests are incredibly valuable for wildlife habitat, biodiversity, and carbon storage, and we must do everything we can to protect them from further fragmentation.”

By shifting from large, ground-mount solar to more projects on rooftops, parking lots, and already-developed lands, Massachusetts can head off further, unnecessary damage to forests and farmlands while also meeting net-zero emission goals, the report states.

State policymakers expressed support for the new report’s findings. “‘Growing Solar, Protecting Nature’ provides a clear-eyed analysis of the impacts of the commonwealth’s solar policy to date and provides a roadmap for better aligning our goals of rapidly transitioning away from fossil fuels, protecting our forests that help to draw down carbon, and protecting biodiversity,” said Massachusetts climate chief Melissa Hoffer. “The joint crises of climate and biodiversity loss require fresh thinking, and this report offers just that.”

Key policy recommendations include:

  • Eliminating Solar Massachusetts Renewable Target (SMART) incentives for projects sited on core habitat and critical natural landscapes while increasing incentives for solar on rooftops and developed lands.
  • Investing in approaches that will reduce costs of rooftop and canopy solar projects.
  • Prioritizing solar with the lowest impacts to nature.
  • Supporting governmental, institutional, commercial, and industrial landowners in building solar near existing transmission infrastructure to reduce costs of energy distribution.
  • Launching a statewide planning effort to integrate clean energy and transmission infrastructure into the process of land development.
  • Funding permanent protection of Massachusetts’ highest-value natural and working lands.

“One of the goals of this work is to broaden how we think about the costs and benefits of the clean energy transition and what we need to fight climate change,” said Manion, Mass Audubon’s vice president for policy and advocacy. “Our results are clear: When we place real value on nature’s contribution to the fight against climate change and protection of biodiversity, the path forward with the lowest costs is the one that solves for both clean energy and nature. And it’s right in front of us.”



Quantum computers promise to reach speeds and efficiencies impossible for even the fastest supercomputers of today. Yet the technology hasn’t seen much scale-up and commercialization largely due to its inability to self-correct. Quantum computers, unlike classical ones, cannot correct errors by copying encoded data over and over. Scientists had to find another way.

Now, a new paper in Nature illustrates a Harvard quantum computing platform’s potential to solve the longstanding problem known as quantum error correction.

Leading the Harvard team is quantum optics expert Mikhail Lukin, the Joshua and Beth Friedman University Professor in physics and co-director of the Harvard Quantum Initiative. The work reported in Nature was a collaboration among Harvard, MIT, and Boston-based QuEra Computing. Also involved was the group of Markus Greiner, the George Vasmer Leverett Professor of Physics.

An effort spanning the last several years, the Harvard platform is built on an array of very cold, laser-trapped rubidium atoms. Each atom acts as a bit — or a “qubit” as it’s called in the quantum world — which can perform extremely fast calculations.

Markus Greiner and Mikhail Lukin with a quantum simulator.

Harvard physicists Mikhail Lukin (foreground) and Markus Greiner work with a quantum simulator.

File photo by Jon Chase/Harvard Staff Photographer

The team’s chief innovation is configuring their “neutral atom array” to be able to dynamically change its layout by moving and connecting atoms — this is called “entangling” in physics parlance — mid-computation. Operations that entangle pairs of atoms, called two-qubit logic gates, are units of computing power.

Running a complicated algorithm on a quantum computer requires many gates. However, these gate operations are notoriously error-prone, and a buildup of errors renders the algorithm useless.
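A quick back-of-the-envelope calculation shows why. Assuming independent errors at the roughly 0.5 percent per-gate rate reported just below, the chance that a circuit runs without a single gate error collapses as the gate count grows:

```python
# Rough illustration (assumes independent, uncorrelated gate errors):
# probability the whole circuit runs without a single gate error.
per_gate_error = 0.005          # ~0.5 percent per two-qubit gate
for n_gates in (10, 100, 1_000, 10_000):
    p_no_error = (1 - per_gate_error) ** n_gates
    print(f"{n_gates:>6} gates -> {p_no_error:.1%} chance of an error-free run")
```

At a few thousand gates the odds of an error-free run are already well under 1 percent, which is why error correction, rather than ever-better raw gates alone, is considered essential for large algorithms.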

In the new paper, the team reports near-flawless performance of its two-qubit entangling gates with extremely low error rates. For the first time, they demonstrated the ability to entangle atoms with error rates below 0.5 percent. In terms of operation quality, this puts their technology’s performance on par with other leading types of quantum computing platforms, like superconducting qubits and trapped-ion qubits.

However, Harvard’s approach has major advantages over these competitors due to its large system sizes, efficient qubit control, and ability to dynamically reconfigure the layout of atoms.

“We’ve established that this platform has low enough physical errors that you can actually envision large-scale, error-corrected devices based on neutral atoms,” said first author Simon Evered, a Harvard Griffin Graduate School of Arts and Sciences student in Lukin’s group. “Our error rates are low enough now that if we were to group atoms together into logical qubits — where information is stored non-locally among the constituent atoms — these quantum error-corrected logical qubits could have even lower errors than the individual atoms.”
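To see why grouping atoms can help, a common rule of thumb for code-based error correction is that the logical error rate falls roughly as a power of the ratio between the physical error rate and the code's threshold, with the exponent set by the code distance. The constants below are generic illustrative values, not figures from the Nature paper:

```python
# Generic sub-threshold scaling for a distance-d error-correcting code:
#   p_logical ~ A * (p_physical / p_threshold) ** ((d + 1) // 2)
# All constants here are illustrative assumptions, not values from the paper.
p_physical = 0.005       # ~0.5 percent physical two-qubit error
p_threshold = 0.01       # assumed code threshold
A = 0.05                 # assumed prefactor

for d in (3, 7, 11, 15):
    p_logical = A * (p_physical / p_threshold) ** ((d + 1) // 2)
    print(f"distance {d}: logical error ~ {p_logical:.2e}")
```

The key point matches Evered's remark: once the physical error rate sits below the threshold, adding more atoms per logical qubit (larger distance) drives the logical error rate down rather than up.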

The Harvard team’s advances are reported in the same issue of Nature as other innovations led by former Harvard graduate student Jeff Thompson, now at Princeton University, and former Harvard postdoctoral fellow Manuel Endres, now at California Institute of Technology. Taken together, these advances lay the groundwork for quantum error-corrected algorithms and large-scale quantum computing. All of this means quantum computing on neutral atom arrays is showing the full breadth of its promise.

“These contributions open the door for very special opportunities in scalable quantum computing and a truly exciting time for this entire field ahead,” Lukin said.

The research was supported by the U.S. Department of Energy’s Quantum Systems Accelerator Center; the Center for Ultracold Atoms; the National Science Foundation; the Army Research Office Multidisciplinary University Research Initiative; and the DARPA Optimization with Noisy Intermediate-Scale Quantum Devices program.



The COVID-19 pandemic seemed like a never-ending parade of SARS-CoV-2 variants, each equipped with new ways to evade the immune system, leaving the world bracing for what would come next.

But what if there were a way to make predictions about new viral variants before they actually emerge?

A new artificial intelligence tool named EVEscape, developed by researchers at Harvard Medical School and the University of Oxford, can do just that.

The tool has two elements: A model of evolutionary sequences that predicts changes that can occur to a virus, and detailed biological and structural information about the virus. Together, they allow EVEscape to make predictions about the variants most likely to occur as the virus evolves.

In a study published Wednesday in Nature, the researchers show that had it been deployed at the start of the COVID-19 pandemic, EVEscape would have predicted the most frequent mutations and identified the most concerning variants for SARS-CoV-2. The tool also made accurate predictions about other viruses, including HIV and influenza.

The researchers are now using EVEscape to look ahead at SARS-CoV-2 and predict future variants of concern; every two weeks, they release a ranking of new variants. Eventually, this information could help scientists develop more effective vaccines and therapies. The team is also broadening the work to include more viruses.

“We want to know if we can anticipate the variation in viruses and forecast new variants — because if we can, that’s going to be extremely important for designing vaccines and therapies,” said senior author Debora Marks, associate professor of systems biology in the Blavatnik Institute at HMS.

From EVE to EVEscape

The researchers first developed EVE, short for evolutionary model of variant effect, in a different context: gene mutations that cause human diseases. The core of EVE is a generative model that learns to predict the functionality of proteins based on large-scale evolutionary data across species.

In a previous study, EVE allowed researchers to discern disease-causing from benign mutations in genes implicated in various conditions, including cancers and heart rhythm disorders.

“You can use these generative models to learn amazing things from evolutionary information — the data have hidden secrets that you can reveal,” Marks said.

As the COVID-19 pandemic hit and progressed, the world was caught off guard by SARS-CoV-2’s impressive ability to evolve. The virus kept morphing, changing its structure in ways subtle and substantial to slip past vaccines and therapies designed to defeat it.

“We underestimate the ability of things to mutate when they’re under pressure and have a large population in which to do so,” Marks said. “Viruses are flexible — it’s almost like they’ve evolved to evolve.”

Watching the pandemic unfold, Marks and her team saw an opportunity to help: They rebuilt EVE into a new tool called EVEscape for the purpose of predicting viral variants.

They took the generative model from EVE — which can predict mutations in viral proteins that won’t interfere with the virus’s function — and added biological and structural details about the virus, including information about regions most easily targeted by the immune system.

“We’re taking biological information about how the immune system works and layering it on our learnings from the broader evolutionary history of the virus,” explained co-lead author Nicole Thadani, a former research fellow in the Marks lab.

Such an approach, Marks emphasized, means that EVEscape has a flexible framework that can be easily adapted to any virus.
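In spirit, the layering Thadani describes amounts to scoring each candidate mutation on several axes at once and combining the scores into a single escape ranking. The sketch below shows one simple way such a combination could work, with entirely made-up component scores for a few well-known spike mutations; it is a schematic of the idea, not the published EVEscape model or its data.

```python
import numpy as np

# Hypothetical per-mutation scores (all values invented for illustration):
#   fitness  - does the sequence model think the mutant protein stays functional?
#   exposure - is the site accessible / commonly targeted by antibodies?
#   change   - does the substitution alter the local biochemistry enough to evade binding?
candidates = {
    #            fitness  exposure  change
    "E484K":    (0.80,    0.90,     0.85),
    "N501Y":    (0.75,    0.70,     0.60),
    "A222V":    (0.60,    0.30,     0.40),
    "D614G":    (0.90,    0.40,     0.30),
}

def escape_score(fitness, exposure, change):
    # One simple combination rule: sum of logs (i.e., a product), so a mutation
    # must score reasonably on all three axes to rank highly.
    return float(np.sum(np.log([fitness, exposure, change])))

ranked = sorted(candidates, key=lambda m: escape_score(*candidates[m]), reverse=True)
for m in ranked:
    print(m, round(escape_score(*candidates[m]), 2))
```

A multiplicative combination like this is just one plausible design choice; the essential idea from the article is that evolutionary plausibility and immune-system biology are weighed together rather than separately.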

Turning back the clock

 In the new study, the team turned the clock back to January 2020, just before the COVID-19 pandemic started. Then they asked EVEscape to predict what would happen with SARS-CoV-2.

“It’s as if you have a time machine. You go back to day one, and you say, I only have that data, what am I going to say is happening?” Marks said.

EVEscape predicted which SARS-CoV-2 mutations would occur during the pandemic with accuracy similar to that of experimental approaches that test the virus’s ability to bind to antibodies made by the immune system. EVEscape outperformed experimental approaches in predicting which of those mutations would be most prevalent. More importantly, EVEscape could make its predictions more quickly and efficiently than lab-based testing, since it didn’t need to wait for relevant antibodies to arise in the population and become available for testing.

Additionally, EVEscape predicted which antibody-based therapies would lose their efficacy as the pandemic progressed and the virus developed mutations to escape these treatments.

The tool was also able to sift through the tens of thousands of new SARS-CoV-2 variants produced each week and identify the ones most likely to become problematic.

“By rapidly determining the threat level of new variants, we can help inform earlier public health decisions,” said co-lead author Sarah Gurev, a graduate student in the Marks lab from the Electrical Engineering and Computer Science program at MIT.

In a final step, the team demonstrated that EVEscape could be generalized to other common viruses, including HIV and influenza.

Designing mutation-proof vaccines and therapies

The team is now applying EVEscape to SARS-CoV-2 in real time, using all of the information available to make predictions about how it might evolve next.

The researchers publish a biweekly ranking of new SARS-CoV-2 variants on their website and share this information with entities such as the World Health Organization. The complete code for EVEscape is also freely available online.

They are also testing EVEscape on understudied viruses such as Lassa and Nipah, two pathogens of pandemic potential for which relatively little information exists.

Such less-studied viruses can have a huge impact on human health across the globe, the researchers noted.

Another important application of EVEscape would be to evaluate vaccines and therapies against current and future viral variants. The ability to do so can help scientists design treatments that are able to withstand the escape mechanisms a virus acquires.

“Historically, vaccine and therapeutic design has been retrospective, slow, and tied to the exact sequences known about a given virus,” Thadani said.

Noor Youssef, a research fellow in the Marks lab, added, “We want to figure out how we can actually design vaccines and therapies that are future-proof.”

Additional authors: Pascal Notin, Nathan Rollins, Daniel Ritter, Chris Sander, and Yarin Gal.

Disclosures: Marks is an adviser for Dyno Therapeutics, Octant, Jura Bio, Tectonic Therapeutic, and Genentech, and is a co-founder of Seismic Therapeutic. Sander is an adviser for CytoReason Ltd.

Funding for the research was provided by the National Institutes of Health (GM141007-01A1), the Coalition for Epidemic Preparedness Innovations, the Chan Zuckerberg Initiative, GSK, the UK Engineering and Physical Sciences Research Council, and the Alan Turing Institute. 

 

 

 


