Climate change’s toll on human health is becoming more widely appreciated as impacts mount, including recent disasters such as floods in Pakistan and severe drought in the Horn of Africa, according to Harvard physicians who attended recent climate talks in Egypt. They noted an increased presence of health care experts at this year’s event, calls for more studies to prompt policy changes, and a move by health systems to clean up their own emissions.
“We know we can’t have healthy people without a healthy planet,” said Kimberly Humphrey, an emergency room physician and FXB Climate Change and Human Health Fellow at the Harvard Francois-Xavier Bagnoud Center for Health and Human Rights. “The thing that I’m probably most excited about over the next 12 months is cross-sectoral collaborations — how health is being integrated across all different areas — and the solutions that we will be able to come up with working with others in this space who don’t come from the health sector. It’s just incredibly promising.”
Panelists shared their experiences at COP27, the 27th Conference of the Parties to the United Nations’ Framework Convention on Climate Change, which occurred earlier this month. Attendees hailed progress made toward creating a fund financed by the world’s industrialized nations to help the most vulnerable countries deal with climate change’s impacts. But the event has also drawn criticism for a lack of progress toward more ambitious national pledges to reduce greenhouse gas emissions.
The physicians said they were pleasantly surprised at the widespread acknowledgement that health is an important consideration when talking about climate change but said it was not absolute and hasn’t translated to a more prominent role for health experts. Research can help, Wiskel said, by generating specific data on impacts that can provide the foundation for policy changes that are needed.
Just walking around the COP site, in the desert community of Sharm El Sheikh, Egypt, highlighted some of the challenges of climate change, including extreme heat — a major issue in the years to come — and the disparities between rich and poor, as seen in the differences among countries' air-conditioned tented pavilions. Wiskel cited the contrast between the basic pavilion of developing Niger and the well-appointed site of the oil-rich United Arab Emirates, host of next year's COP.
Also visible was a large and energetic youth contingent, which continues to push for greater action to address climate change, panelists said. At the opposite end of the spectrum, Humphrey pointed out, were the 660 lobbyists who reportedly attended representing fossil fuel interests, which were otherwise largely invisible.
A key development during the event, Bernstein said, was the announcement by the U.S. Department of Health and Human Services that more than 100 U.S. health care organizations have pledged to reduce emissions in the coming years. The organizations signed the White House/HHS Health Sector Climate Pledge to cut emissions 50 percent by 2030 and reach net zero by 2050. HHS also announced an agreement with England's National Health Service to collaborate on procurement and supply chain reforms.
While voluntary, panelists said the steps can reduce health care's considerable contribution to planet-warming emissions. Health care and its supply chain account for 8.5 percent of all greenhouse gas emissions in the U.S., for example, Bernstein said. Globally, Humphrey said, if the health care industry were a nation, it would be the world's fifth-largest emitter.
“We’ve all heard the term ‘supply chains’ more in the last couple of years than in the whole previous part of our lives, unless you’re in economics,” Dresser said. “And that’s a really important thing to be learning about, because as a health care system, this is a huge piece of our carbon emissions, and I think the solutions to this have to be large-scale.”
In addition to these steps, Dresser said, it's critical for those in health care not just to continue offering expert opinions but to maintain focus on those being impacted by warming and to keep bringing their personal stories to light at high-level gatherings like COP27, to personalize what's at stake.
“I think we also need to be making an effort to elevate the voices of people, communities, nations that have been impacted by climate change,” Dresser said. “Those individual stories are the single most powerful message we have about why it matters that we take steps to address ongoing carbon emissions and set up such structures and policies that will keep people safe.”
Climate negotiators from around the world recently wrapped up talks in Egypt that were by turns frustrating and hopeful: frustrating because they did little to accelerate the slow pace of action to reduce carbon emissions, and hopeful because of a reawakened dialogue between the world’s biggest emitters and movement to address climate-related damage to the world’s most vulnerable nations. The Gazette spoke with Robert Stavins, the Harvard Kennedy School’s A.J. Meyer Professor of Energy & Economic Development, director of the Harvard Project on Climate Agreements, and a regular attendee at the annual summits, to better understand successes and failures at the 27th Conference of the Parties to the United Nations Framework Convention on Climate Change.
Q&A
Robert Stavins
GAZETTE: What stands out from COP27, this year’s climate change conference?
STAVINS: Two things stand out, one of which has not been talked about and I think is the most important development for long-term climate policy. The second one is the loss and damage issue. The first is that ever since Donald Trump was elected president in November 2016, a major question has been: When would the United States and China return to what had been a highly effective co-leadership during the Obama years? The Paris agreement would not have been achieved had it not been for the cooperation between China and the United States. That cooperation broke off with Trump and then hadn’t returned under Biden because of disagreements on international trade, human rights, South China Sea, Hong Kong, Taiwan, and other non-climate issues.
At COP27, we got the beginning of the answer, although in surprising fashion. The most important development during COP27 took place 6,000 miles away, in Bali, Indonesia, on Nov. 14. U.S. President Joe Biden and China President Xi Jinping met on the sidelines of the G-20 summit. They shook hands and engaged in a three-hour conversation in which they signaled their return to the cooperative stance that had been so crucial for international progress on climate change. That quickly trickled down to the heads of the respective negotiating teams at COP27, John Kerry for the United States and Xie Zhenhua of China. They’re friends but had not had discussions because of the problems between the two governments. Once the meeting in Bali took place, statements came from both John Kerry and Xie Zhenhua indicating that the two countries plan to resume cooperation, and that they had met and talked several times.
GAZETTE: What makes these two countries so potentially powerful when they work together? Is it just size and global influence?
STAVINS: It’s partially size and global influence, but more specifically, it’s that they’re the two largest emitters in terms of greenhouse gases. Also, the United States has traditionally been the leader of the Western world on many topics, and China — through what is called the “G-77 plus China,” which is 134 developing countries — plays a leadership role in the developing world. These are the two most important constituencies for climate negotiations, and they’re the respective leaders.
GAZETTE: Another theme from COP27 is the failure to increase pledges for emissions reductions by different nations, which was expected after Glasgow. Is that something that could happen because of this U.S.-China cooperation?
STAVINS: Essentially, on every element where there is desired progress, which is feasible and reasonable, China and the U.S. can play a crucial role. They were the first ones out of the gate with their NDCs [Nationally Determined Contributions to emission reductions under the Paris climate agreement] that became a wind at the back of the other delegations. It’s one thing for the European Union to get out in front because they always do. That doesn’t pull a lot of other countries along. It’s one thing for Costa Rica to get out in front among developing countries, but that doesn’t pull a lot of countries along. The United States and China bring a multitude of other countries along when they cooperate and play a leadership role.
GAZETTE: What can you tell me about the new fund for loss and damage?
STAVINS: The most dramatic and contentious decision within the halls of COP27 — by the negotiators from 195 countries — was the establishment of a fund for so-called loss and damage. This is an issue that's been kicked down the road for a long time. It was first floated in 1991, when Vanuatu, a small island nation in the Pacific, suggested the creation of such a fund to pay for the consequences of rising sea levels, and for 30 years action has been delayed. At COP27, developing countries pushed to establish an explicit fund for this. China came out in favor of an explicit fund for loss and damage, though they also said they weren't going to put any money into it. The European Union then said they supported it — that was very important — and then a few other developed countries came out in support of it. Then, in the second week of the COP, in a rather dramatic announcement, John Kerry said that the United States was reversing its position; before the COP started, he had said, "We do not support the creation of such a fund."
GAZETTE: How large will the fund be?
STAVINS: On the demand side, such a fund could eventually amount to trillions of dollars per year. The World Bank estimate of the damages caused by this year’s floods in Pakistan alone is $40 billion. So, if you picture climate change increasing in intensity over time in countries around the world, you can see why it could easily amount to hundreds of billions — indeed, trillions — per year. On the supply side, though, there are very few quantitative pledges thus far: tens of millions of dollars — trivial compared to what the demand is going to be. So, to me, the question becomes: Is the new loss and damage fund an empty shell?
China’s announced position at COP27 was that it supports the creation of the loss and damage fund, but as a “developing country” it will not be responsible for any contributions to the fund. You’ll find a lot of quotes in which they’re self-described as a developing country and what they’re referring to is the definitions of “annex one” and “non-annex one” countries in 1992, under the United Nations Framework Convention on Climate Change. Then, China’s per capita GDP was less than $400 per year. They don’t talk about the fact that per capita GDP in China has grown by 3,330 percent since then.
There's actually some convergence on the loss and damage fund issue between China and the United States, though not in regard to China's self-proclaimed exemption from financial contributor status — the U.S. has been outspoken that China should be a contributor. They are no longer a poor, developing country; they're a major contributor to the atmospheric "stock" of greenhouse gases — damages are a function of the stock, not of current emissions — and the largest contributor to annual emissions.
The reason I say there’s convergence is that the United States has a story that puts the United States in a similar place: “We support the loss and damage fund, but due to the new Republican majority in the House of Representatives, it is impossible for us to make any commitment of new funding.” That’s not explicitly stated, but that’s essentially the U.S.’ position. I think that’s why Kerry could make this flip, because there’s no way that there’s going to be any money.
GAZETTE: With the COP over, how do you feel we’re doing? Predictions from the U.N.’s Intergovernmental Panel on Climate Change (IPCC) are getting more dire. People talk about timelines getting shorter. Keeping warming to 1.5 degrees Celsius seems less and less likely.
STAVINS: There have been many statements of disappointment regarding this COP, because the closing statement did not fully embrace the 1.5 degrees C target. But that would have been a nonbinding resolution versus the 2 degrees centigrade target, which is a commitment in the Paris agreement. I may be a glass-half-full guy, but I remember when the IPCC’s business-as-usual estimates of temperature change for this century were as high as 7 degrees centigrade. With the Paris NDCs, they were 3 degrees centigrade. And now, with the new commitments made by countries — including the U.S. — plus the Kigali amendments to the Montreal Protocol, we’re talking about 2.5 degrees C. I recognize that 2.5 degrees is a long way from 1.5, but it’s also a long way from where we were.
GAZETTE: Do you feel relatively certain that the trajectory will continue to improve as we go forward?
STAVINS: I wouldn’t say “relatively certain,” but I’m certainly hopeful. It’s conditional upon political developments in China, in the United States, in the European Union — the other major contributor — but we know where they’re going. China is the big question mark. We have to wait and see. The United States is also a question mark because we flip back and forth between Democratic and Republican administrations and Congresses, which drives the other countries of the world crazy. Russia is also a very large emitter, but has almost removed itself from the discussions, because it has other things to worry about.
Cephalopods are capable of some truly impressive behaviors. They can quickly process information to transform shape, color, and even texture, blending in with their surroundings. They can also communicate with one another, show signs of spatial learning, and use tools to solve problems. They are so smart they even get bored and start making mischief.
It's no secret what makes this all possible: These marine animals, which include octopuses, squid, and their cuttlefish cousins, have the most complex brains of any invertebrates on the planet. What remains something of a mystery, however, is how cephalopods developed those big brains in the first place. A Harvard lab that studies the visual systems of these soft-bodied creatures — where two-thirds of their central processing tissue is focused — believes it has come close to figuring it out.
Researchers from the FAS Center for Systems Biology describe in a new study how they used a novel live-imaging technique to watch neurons being created in squid embryos almost in real time. They were then able to track those cells through the development of the nervous system in the retina.
They were surprised to discover that these neural stem cells behaved much like those in vertebrates during nervous-system development. The results suggest that although vertebrates and cephalopods diverged from one another 500 million years ago, the process by which both developed big brains was similar. In addition, the way the cells act, divide, and are shaped may essentially follow a kind of blueprint required for this kind of nervous system.
“Our conclusions were surprising because a lot of what we know about nervous system development in vertebrates has long been thought to be special to that lineage,” said Kristen Koenig, a John Harvard Distinguished Fellow and senior author of the study. “By observing the fact that the process is very similar, what it suggested to us is that these two independently evolved, very large nervous systems are using the same mechanisms to build them. What that suggests is that those mechanisms — those tools — the animals use during development may be important for building big nervous systems.”
The scientists from the Koenig Lab focused on the retina of a squid called Doryteuthis pealeii, better known as the longfin inshore squid. The squid grow to be about a foot long and are abundant in the northwest Atlantic Ocean. The embryos look like adorable anime characters with big heads and eyes.
The researchers employed techniques similar to those regularly used to study model organisms such as fruit flies and zebrafish. They created special tools and made use of cutting-edge microscopes that can take high-resolution images every 10 minutes for hours on end to show how individual cells behave. The researchers used fluorescent dyes to mark the cells so they could map and track them.
This live-imaging technique allowed the team to observe stem cells called neural progenitor cells and how they are organized. The cells formed a special kind of structure called a pseudostratified epithelium, whose main feature is that the cells are elongated so they can be densely packed. Researchers also saw the nuclei of these cells move up and down before and after dividing. This movement is important for keeping the tissue organized and allowing for continued growth, they said.
This type of structure is universally seen in brain and eye development in vertebrate species. It long has been considered one of the reasons the vertebrate nervous system could grow so large and complex. Scientists have observed examples of this type of neural epithelium in other animals, but the squid tissue was also strikingly similar to that of vertebrates in size, organization, and nucleus movement.
The research was led by Francesca R. Napoli and Christina M. Daly, research assistants in the Koenig Lab.
Next, the lab plans to look at how different cell types in cephalopod brains emerge. Koenig wants to determine whether they’re expressed at different times, how they decide to become one type of neuron versus another, and whether this action is similar across species.
“One of the big takeaways from this type of work is just how valuable it is to study the diversity of life,” Koenig said. “By studying this diversity, you can actually really come back to fundamental ideas about even our own development and our own biomedically relevant questions. You can really speak to those questions.”
He’s taking two highly advanced courses in quantum science and engineering, each of which assigns complex problem sets that take about five hours apiece to complete. In his free time Lopez can usually be found in the lab of Harvard Professor Kang-Kuen Ni, whose chemistry and physics lab designs new experiments to study fundamental chemical reactions and physical dynamics by slowing them down in super-cold environments.
He sits in on three hours of meetings per week at the lab and also works on his own quantum project when time allows. That project is to build a laser that can cool and trap molecules and control their quantum state interactions. The work involves hours of tinkering with wiring and electronics as well as putting the physical parts together and aligning them all.
All in all, Lopez's first semester at Harvard has been a bit of a hustle, but the first-year graduate student from Santa Barbara, California — who dreams of one day being a professor at a research university — says it's worth it. He feels fortunate to be getting a unique background as an inaugural member of Harvard's new Ph.D. program in quantum science and engineering.
“The weeks fill up, but I’ve been learning a lot and really enjoying it,” he said. “I can definitely get [where I want to be].”
Launched in spring 2021, the new quantum program is one of the world’s earliest Ph.D. programs in the subject and is designed to prepare future leaders and innovators in the critical and fast-emerging field.
This semester, 11 students, including Lopez, have been settling in as the first-ever cohort. Since September, they have started making Harvard their home and grappling with their studies in quantum information, systems, materials, and engineering.
The hope is that the extensive research experience they receive — combined with coursework and the mentorship embedded in the program — will give them a broad, well-rounded education for careers in quantum, whether as educators in academia or as researchers developing next-level systems and applications at a university, a national laboratory, or in industry.
“When you have a new intellectual area it’s a good idea to train students in it and to come up with a curriculum that’s really tailored to that area — in this case: an understanding of the engineering and the science behind new quantum technologies,” said John Doyle, Henry B. Silsbee Professor of Physics and co-director of the Harvard Quantum Initiative, of which the new program is a part. “You develop these new ideas into a real firm bedrock on which students can go on to do whatever they want to do.”
Quantum mechanics and technology cut across disciplines. Advances in the field promise to usher in real-world breakthroughs in health care, quantum computing infrastructure, cyber security, drug development, climate-change prediction, machine learning, communication technologies, and financial services. The backgrounds of students who have been accepted into the program reflect that diversity — they range from physics and computer science to chemistry, electrical engineering, and math.
The well-rounded curriculum on offer was one of the driving factors for many of the students enrolling. In fact, Quynh Nguyen, an international student from Vietnam who studied physics and computer science as an undergrad at MIT, said that the interdisciplinary nature of the field is what makes him so passionate about it.
“There are just so many questions to explore,” Nguyen said.
As a part of the program, he hopes to learn more about quantum information and algorithms and to explore the capabilities of quantum systems such as the programmable quantum simulator being developed in the lab of physics professor Mikhail Lukin, work that could eventually lead to a new world of ultra-fast computing.
A major focus of the new program is research experience. Along with rigorous course loads, students begin lab rotations in the first year and continue that through the rest of the program. They are also strongly encouraged to pursue cross-disciplinary research and industry internships. The idea is to give students an understanding of how research is done in different labs.
Some of the students’ research falls on the side of theory, like Nguyen’s work. Other research is more experimental, like Lopez’s work with lasers. Youqi Gang, who’s exploring experimental platforms for quantum simulation and quantum computation, is doing her first rotation in Markus Greiner’s lab studying ultracold quantum gases. Gang is gradually learning to operate the many optics, electronics, and control systems the lab uses to cool and manipulate atoms.
“The equipment is very complicated,” Gang said. “We have many different laser beams and everything needs to be very well aligned … and we have to do some day-to-day alignments and calibrations. People have put in a lot of thought about how to optimize the equipment. It’s a very cool process to be able to kind of get familiar with such a complicated machine and learn how to use it.”
Students in the program will receive their degree from the Graduate School of Arts and Sciences. The faculty for the Ph.D. program are drawn from the departments of Physics and of Chemistry and Chemical Biology in the Division of Science and from the Harvard John A. Paulson School of Engineering and Applied Sciences. Students say the different class options offer them the chance to explore quantum science across the disciplines.
Nazli Ugur Koyluoglu, an international student from Istanbul, for example, is taking two very different classes this semester: Physics 271, which covers topics in quantum information, and Physics 295a, which looks at quantum theory applied to solid-state physics.
When not in class or research labs, students often can be found in the designated office space set up for them on the fifth floor of the Laboratory for Integrated Science and Engineering building. The large area is divided into two shared offices with workstations in each section and a big meeting room.
The meeting space is where students gather for weekly lunches and host weekly journal clubs where they present on different topics in quantum science, whether it’s something in a scientific journal that got their attention, something they themselves are studying, or a theory or experiment someone wants to learn more about.
The efforts have helped them quickly develop into a tight-knit community.
“It’s helped us start creating a culture for the program,” Koyluoglu said. “It’s constantly being up to date about each other’s work, which is really enlightening and helps us find out the different paths and the different questions that people are thinking about.”
HQI administrators for the Ph.D. program anticipate enrolling up to 60 students in the program in the future.
“The first cohort of students in the program are exceptional in their talents, vision, and enthusiasm in embracing a ‘quantum future,’” said Evelyn L. Hu, the Tarr-Coyne Professor of Electrical Engineering and Applied Science at SEAS and co-director of the Harvard Quantum Initiative. “My hopes are that the program and its students continue to build on this strong platform: diverse and multifaceted in its outlook and opportunities, while maintaining a strong sense of community even as the program expands.”
A major new report suggests that with a handful of strategies New England’s 32 million acres of forests, which cover about three-quarters of the region, could eventually come close to absorbing 100 percent of all the carbon produced by the six states.
The report, “New England’s Climate Imperative,” commissioned by the conservation nonprofit the Highstead Foundation and led by a Harvard ecologist, looks at how forests in the region can be better utilized in the fight against climate change.
“Most people have learned that forest or trees in one way or another can be a help to climate, but beyond that there isn’t a lot of clarity about how significant a role they could play or what their role is,” said Jonathan Thompson, a senior ecologist at Harvard Forest who helped lead the research team. “It’s why we felt that there was a need, despite all the many climate reports that come out, for a specific estimate on this role forests could play, especially if you take different activities that are defined by state governments themselves and NGOs.”
According to the report, the region's forested areas already absorb almost 27 million tons of carbon annually through photosynthesis, the process by which plants synthesize food and release oxygen as a byproduct. The report lays out five steps policymakers and conservation NGOs can pursue that could lead to forests absorbing almost 360 million additional tons of carbon dioxide over 30 years. That means New England's forests would be able to absorb virtually all the carbon produced in the region, provided the six states hit their existing emission-reduction goals.
Thompson and collaborators from nine institutions — including the New England Forestry Foundation and the Northeast Wilderness Trust — created their recommendations after interviewing dozens of local lawmakers and conservationists on steps they hope to take or have already started taking to use trees and forests in the region to reduce carbon.
The five strategies include changing development practices to reduce annual rates of forest loss; designating at least 10 percent of existing forests as forever wild, allowing more trees to grow old and store more carbon; improving forest management; replacing concrete and steel with mass timber in half of all new institutional buildings and multifamily homes; and increasing tree canopy and forest cover in cities and suburbs.
The researchers modeled how much each strategy would contribute to reducing carbon dioxide in the atmosphere at different tiers of implementation. In the report, they break the results down by state and then aggregate them for the region.
“Each of these pathways offers a way to pull more carbon out of the atmosphere,” Thompson said. “We think of these pathways very much as all-of-the above-type solutions. There are a lot of forests in New England, and there is a role for multiple different strategies to meet climate goals.”
For example, if even moderately implemented, the strategies would boost the amount of carbon New England forests absorb each year from the equivalent of 14 percent of 2020-level fossil fuel emissions to 20 percent. That increase would eventually jump to 97 percent by 2050 if all individual emission reduction scenarios are met by the states.
The researchers admit that some of their recommendations may seem contradictory, such as promoting policies that avoid deforestation and create more wildland while also promoting an increase in construction using timber. But their analysis shows that the combined carbon benefits make the trade-off worthwhile.
Timber building materials, for instance, are much less carbon-intensive than steel or concrete. They also store carbon through the life of the building. The researchers calculate that if half of six- to 12-story buildings used wooden frames, an additional 15 million U.S. tons of carbon could be stored.
The report, which took two years to compile, seeks to inform legislators and policymakers throughout New England as they pursue state-level climate goals.
With Earth perilously close to exceeding the 1.5-degree Celsius increase in average annual temperatures that climate scientists say will cause irreparable harm to society and nature, the researchers note that while technological approaches to removing carbon from the atmosphere exist, none rivals forests. They hope lawmakers will take heed and act.
“In New England, nature is a major ally in our effort to address the global crises of climate, biodiversity, and human health,” said David Foster, a co-author of the report, Highstead Foundation board member, and director emeritus of Harvard Forest. “If we can conserve forest infrastructure and embrace the pathways outlined in our report, we can increase forest carbon sequestration and help all six states achieve their emissions targets.”
In the 1970s, Sheila Tobias noticed something peculiar going on in mathematics. In one of her early studies, the graduate of Radcliffe College, self-described “scholar activist,” and author of 14 books, including the 1978 bestseller “Overcoming Math Anxiety,” gave elementary school students a sheet of paper, divided in half. On one side, they worked on a math problem; on the other, they wrote down how the problem made them feel.
“I’m finished,” one wrote. “Nobody else is. I must be wrong.”
Another wrote: “I’m not finished. Everybody else is. I must be wrong.”
Many remembered their parents saying something like, “Nobody in our family is good at math.” Others recalled the shame of standing at a blackboard, failing to solve an equation as their classmates heckled and laughed.
“Math anxiety is a serious handicap,” Tobias wrote in a 1976 article in Ms. magazine describing her findings. “It is handed down from mother to daughter with father’s amused indulgence. (‘Your mother never could balance a checkbook,’ he says fondly.)”
Today, math anxiety still smothers students — especially those who belong to groups historically underrepresented in the field — and there’s more at stake than a balanced checkbook. Threats like climate change, pandemics, and gerrymandering cannot be solved without math.
“You can’t begin to grasp those issues,” Tobias said in a talk at West Virginia University just one year before she died, in 2021, at the age of 86. (The death was not widely reported until a New York Times obituary appeared in September of this year.)
Almost 50 years have passed since Tobias, whose papers are held by Radcliffe’s Schlesinger Library, first described math anxiety’s impact on students — especially young girls and women. And yet, not much has changed. According to the cognitive scientist Sian Beilock’s 2019 Harvard Business Review article, “Americans Need to Get Over Their Fear of Math,” nearly half of first- and second-grade students say they are “moderately nervous” or “very, very nervous” about math, and a quarter of college students report moderate or high levels of math anxiety.
“Hating math seems to bring people together,” said Reshma Menon, a preceptor in Harvard’s mathematics department. “This isn’t just about my students. I’ll meet people at the grocery store or I’ll be in an Uber chatting with the driver. When I tell them I teach math, the immediate response is, ‘Oh my God, I used to hate math in school.’ Math anxiety is worldwide and very, very real.”
Math hatred, or what some call “math trauma,” is like the common cold: ubiquitous, tricky to trace, and hard to treat.
“There’s a genius myth in mathematics,” said Brendan Kelly, director of introductory math at Harvard. “There’s often this perception that success requires some natural ability, some unteachable qualities, some immutable traits.”
When students learn to write stories or play the violin, most don’t expect to replicate Toni Morrison or Niccolò Paganini in their first attempts. No one says, “I’m not a writing person.” But in math, said Allechar Serrano López, also a preceptor in mathematics at Harvard, “It gets decided when they’re literally children if they are going to be math people or if they’re not math people.” And because math is a gateway to almost every other field of science, that early stamp can squeeze students out of the STEM pipeline.
But the genius myth isn’t the only barrier.
Students come to college with vastly different educational backgrounds based on which elementary and secondary schools they attended. Some schools don’t even offer calculus, Menon noted. “During the pandemic, these differences became wider; the disparities are much more evident now than they were before,” she said.
Disparities across schools often disproportionately affect low-income students and students of color. “That divide creates lower confidence among students,” said Menon. “But there’s also the problem, generally, of women, students of color, and nonbinary students feeling like they don’t fit in.”
Mathematicians might argue that math is a meritocracy based solely on whether a student can solve a problem or not. But that argument not only ignores imbalances in schooling and opportunities, it also lets teachers off the hook.
“The responsibility really should be mine to create the space where students feel that they can ask questions, share their ideas, and slowly become more confident and overcome their math anxiety,” said Menon.
That means smaller class sizes, collaborative group work, and extra attention paid to students who come from less-privileged schools or whose identity might make them feel less confident in their abilities.
“There’s a cultural change we need in our math education,” said Kelly. “It’s challenging to be vulnerable, right? To raise your hand and be wrong in the classroom is a hard thing. We need to create a space where it’s OK to be confused and to share that confusion.”
Tobias would agree. Back in the 1970s, she created a “Math Anxiety Bill of Rights,” which included: “I have the right to not understand,” “I have the right to dislike math,” and “I have the right not to base my self-worth on my math skills.”
Today’s students might add: “I have the right to be seen as a math person.”
Even Menon said she still struggles with impostor syndrome, despite having taught calculus for 10 years now. Serrano López failed exams and even classes on her way to becoming a mathematician and faculty member. And Michael Hopkins, the George Putnam Professor of Pure and Applied Mathematics and math department chair, admits that he spends much of his time in the dark.
“Most of the time, I’m in a state of ignorance,” he said. “I think the anxiety comes from not knowing what to do when you’re in a state of ignorance, but that’s a state I value greatly.”
Hopkins and other Harvard mathematicians see a pressing need for a cultural shift to make math education more welcoming and inclusive.
“Everything is at stake,” said Serrano López. “I come from a low-income family, and I have seen that access to education is vital for social mobility.” STEM careers not only pay more; they’re crucial in the fight against existential threats like climate change, nuclear war, and global diseases.
“If this planet is going to survive,” Tobias said, “we need more of the population to think like scientists. A majority don’t.”
“It’s not their fault at all, right?” said Menon. “It’s the way we’ve built the system that makes them feel completely unwelcome. It’s a huge problem if we lose so many talented and intelligent people to this kind of divide.”
Harper looked at Experimenter Two as if he were crazy. Alexander Junxiang Chen disregarded the look and started the sequence again: lightly banging a rubber mallet onto the floor around his outstretched hand before pretending to smash it on his hand and crying out in pain.
The first time, Harper, a 6-year-old tan Wheaten terrier, seemed concerned, walking over to the Harvard junior and giving him a lick on the face. The second time, Harper appeared more wary and just studied Chen. The third time, she was over it. She flinched when the mallet came down and let out a mix between a small growl and an annoyed bark, as if asking: “What are you doing?”
A small group of students in the adjoining room in the basement of the Northwest Building watched the events over a camera feed and through a one-way mirror. Their class — HEB 1353, “Dogs: Behavior, Evolution, and Domestication” — looks at a question all canine lovers, especially owners, ask themselves at some point: Why did the dog just do that?
“It’s really hard not to anthropomorphize their expressions,” Harvard neuroscientist Erin Hecht told her students as they tried putting themselves in Harper’s paws and evaluating what she was thinking.
In animated shows like Nickelodeon’s popular “Paw Patrol,” dogs can talk, so they are able to say what they are thinking or feeling. Real dogs have only gestures and vocalizations to communicate: sad puppy eyes, rolling over on their bellies, an impatient growl to show they want to go outside, barking for a treat.
Students in the class examine a range of dog behaviors, how they evolved, and how they relate to human behavior.
“Humans are animals, too,” said Hecht, an assistant professor of human evolutionary biology who heads up the Evolutionary Neuroscience Laboratory and the Canine Brains Project in the Department of Human Evolutionary Biology. “Our behavioral adaptations have evolved in response to selection pressures just like every other species on the planet. It’s often easier for students (and scientists) to think about the evolution of behavior in other animals. In the classroom, dogs provide an accessible and familiar access point for applying an evolutionary perspective to the study of behavior.”
The first part of the course runs like a lecture seminar, while the second part is a hands-on lab where students measure and analyze dog behavior. The animals they study belong to volunteers, usually Harvard faculty and staff.
For lectures, Hecht and Sophie Barton, the graduate assistant for the course, assign readings on topics ranging from domestication theories and dog species variation to how social communication styles and abilities differ across species.
The discussions are often lively, said Isabel Levin ’25.
“This is my first experience being in a small, discussion-based class where people were so much more engaged than in any other class I’ve ever taken before,” Levin said. “Everybody is genuinely interested and wants to participate.”
The class was a hot commodity during enrollment. Forty-five students applied but Hecht capped it at 12 so everyone would get a chance to work closely with the dogs in the lab. She hopes to increase class size in the future.
Chen, who took the lead role in the experiment with Harper, never had a dog growing up but is interested in animal behavior generally and in gaining lab experience.
Emily Ruiz ’25 isn’t shy about why she took the course. “I’m really interested in behavior … but also you get to work with dogs,” Ruiz quipped.
The Canine Brains Project room looks like a standard lab office with a comfortable couch, desk, and computer, and a large screen with live camera feeds. But there are plenty of dog treats on the desk at all times. The experiment room is soundproof and sparse, other than a few pieces of furniture and taped-off sections on the floor to mark spaces.
The tests are meant to replicate experiences most dogs encounter in their daily lives, like meeting a new person, problem-solving, communicating with humans, and being left alone. Owners also complete a survey to fill in some gaps about the dog’s life history, temperament, and behavior at home.
“The main goal of the research is to understand the variation in behavior that happens in dogs and then we’re trying to examine which traits are associated with each other,” Hecht said. “For example, are dogs that are more anxious at home when the owner leaves more likely to show stress in a novel environment when they’re separated from the owner? Are they more likely to make contact with the owner first, as opposed to the experimenter, when both of them come back into the room? We’re looking at how different factors influence dog behavior.”
The first test, called the Impossible Task, looks at whether the dog will look to the owner or experimenter for permission or assistance when a treat is left on the counter and then placed in a sealed jar. In the second test, observers see how the dog responds when the owner and experimenter return after a five-minute absence. The next experiment involves how the dog reacts to someone getting hurt, and the last is to determine which paw the dog favors.
Hecht said there are no right or wrong behaviors on the tests. Harper, a rescue pup from a breeding farm, ignored a treat Chen left on the counter. This tends to happen with dogs that are well behaved and wait for permission from their owners before going for a treat. As it turned out, Harper had been trained to refrain from taking food off counters.
The separation task was also cut short when Harper started barking after being left alone for a few minutes and couldn’t be calmed. When Chen and Harper’s owner, Mallory McCoy, HEB’s academic coordinator, re-entered the room, Harper rushed past Chen straight to McCoy.
“Velcro dog,” McCoy laughed.
The empathy and paw tests were the only ones Harper completed. Her data will become useful when the class combines it with data from the other 21 dogs they will test. The students will then analyze the full set and form hypotheses to test.
“They might investigate whether dogs that have had only a single owner since leaving their mother and littermates are less interactive with the experimenter than with their owners, or whether dogs who score high on a trainability survey completed by owners are also more likely to look at the owner for ‘help’ during the impossible task,” Hecht said.
As for Harper, she appeared happy to be headed home after receiving a few well-earned treats and a stylish red bandana showing she took part in the project.
“You’re a Canine Brains graduate!” McCoy told Harper as the students cheered and took turns petting her.
Today, the Harvard University Data Science Initiative announced the AWS Impact Computing Project at the HDSI, a collaboration with Amazon Web Services (AWS) aimed at reimagining data science to identify potential solutions for society’s most complex challenges.
The alliance, enabled by Harvard’s Office of Technology Development, will support current and future faculty projects by creating new data solutions that will amplify the University’s social impact and transform research capabilities.
Since its launch in 2017, the HDSI has advanced data science methodology and application across Harvard and sought out opportunities presented through new data streams to better understand society. It pulls together experts across the University to work on large and complex evolving issues through the lenses of artificial intelligence, computer science, statistics, and with attention to policy and context. The AWS Impact Computing Project at the HDSI will empower faculty, students, and researchers at Harvard with new funding and new capabilities to accelerate their work and expand its impact, unlocking new possibilities for learning and discovery through data.
“This alliance will bring together researchers from across the University, representing multiple disciplines and areas of focus, with collaborators at AWS. They will address some of the most consequential problems facing humanity and the world. Whether the topic is socioeconomic and racial aspects of health disparities, or the climate crisis, the application of novel data-science approaches will lead to deeper insights and better designed solutions,” said Harvard Provost Alan M. Garber.
The Gazette spoke with HDSI co-directors Francesca Dominici, the Clarence James Gamble Professor of Biostatistics, Population and Data Science, and David Parkes, the George F. Colony Professor of Computer Science, to learn more about the alliance and what it means for the field of data science and how it will help further HDSI’s research and teaching mission.
Q&A
Francesca Dominici and David Parkes
GAZETTE: The AWS Impact Computing Project at the HDSI will be a catalyst for impact computing. How would you describe that field to someone who has never heard of it before?
DOMINICI: Impact computing reimagines data science with the goal of addressing and finding potential solutions to society’s biggest challenges. The idea is that when you’re thinking of large, complex problems in society, they require sophisticated solutions that harness new types of data, new ways of making that data usable, new methods to examine that data, and new ways of leveraging underlying computing power – all with the input of the populations and communities that those solutions are built for.
PARKES: We’re thinking of issues like social determinants of health, mass migration, and economic resilience, to name a few. To study any of these challenges requires working with complex data, both historical and that which is generated in real time. Harvard faculty are already leading advances in extracting scientific understanding from these data, but we’ve heard from many of them that they’re eager to do more and work at new scale. Our vision of impact computing will require building new coalitions between academia, industry, governments, and NGOs to create better outcomes for stakeholders everywhere.
GAZETTE: How will the HDSI and this new project with AWS help advance impact computing and data science more broadly? And what are the gaps in the field that this project will be able to fill because of this collaboration?
DOMINICI: We’re really taking a bottom-up approach by beginning with our faculty and research colleagues and listening to the data bottlenecks that they face, be it access to better data, resolving issues around data ownership, or access to computational scale. AWS is already a major player in data science and high-performance computing, especially in building solutions to real-world challenges. By bringing these groups together, our goal is to help faculty unstick these bottlenecks, for example improving access to data or building data environments that enable new kinds of work.
PARKES: I’d add that those solutions will be designed so that others at Harvard and beyond can incorporate them into their work in the future. This is a new kind of impact. In one way, you can view the cloud computing that this alliance allows us to engage in as a democratizing force — it enables access to data, access to tools, access to the high-performance computing environments that are required for this ambitious work. We hope the project will act as an accelerant for Harvard as a whole and beyond. I think there is an incredible opportunity here to leverage investments that have been made around campus — for example, the Kempner Institute and the Salata Institute — that directly address hard questions such as understanding natural and artificial intelligence or seeking durable climate solutions. The HDSI works side-by-side with these groups to advance the data science that will help realize their goals. And our alliance further supports and catalyzes that work.
GAZETTE: So, what are the top societal challenges that HDSI will tackle in this collaboration?
PARKES: Prioritizing where to begin has been a challenge in itself — there are so many issues Harvard faculty are working on, and all of them are urgent. That being said, there are some opportunities that we can see taking shape as quickly as within the next few months. One challenge that we’re eager to explore is food insecurity related to droughts and climate change. Our colleague Peter Huybers, in Earth and Planetary Sciences, is at the forefront of interpreting satellite data, and wants to use this data to understand how climate change impacts food disparities in places like Madagascar, where the issue is so far-reaching, but the data that can be used to solve the problem — from satellites, from maps of agriculture, from yield outcomes — is incomplete.
We’ve also learned that there are similar efforts being conducted elsewhere in Africa, including in South Africa. We’re now bringing these groups together to share learnings, and hope that in the near term, we’ll be able to build a new community that can drive systematic understanding of the drivers of food insecurity and address crises due to famine.
DOMINICI: I’d also mention Caroline Buckee from epidemiology, who is studying global problems such as crisis-driven mass migration, including people fleeing the war in Ukraine. There are massive data-engineering problems associated with tracking people as they cross borders: the geographic scale, the differing regulatory environments, and the handling of data privacy and important ethical concerns. At the same time, there is an urgent need to use data effectively in responding to humanitarian crises and driving good policy decisions, and part of this is a need to run statistical methods efficiently and at a new scale.
But beyond these two examples, there is huge interest in broad topics like understanding complex economic systems through multi-agent modeling, studying the drivers of trust and mistrust, and finding major social determinants of health, to name a few. Echoing David, there are so many urgent societal problems where we can hope to make positive impacts by enabling access to better technology, better data, and better partners for our faculty through this project. It’s really changing the speed with which Harvard faculty can respond to challenges as they arise.
GAZETTE: What difference will the gift component of the AWS Impact Computing Project at the HDSI make?
PARKES: The gift will allow HDSI to commit to a vision of building the new field of impact computing, and doing so in a way that respects data, respects methods, and respects the challenges themselves. They understood that HDSI needed flexible support to continue building a community beyond what we’ve done so far — for example, investing in undergraduate research programs or supporting graduate, postdoctoral, and faculty work through open funding calls. The data-science community at Harvard is hungry for these kinds of training and education opportunities. We see these activities as symbiotic with the new impact computing projects that we will be undertaking.
DOMINICI: To expand on David’s point, flexibility is key. We know we have only begun to scratch the surface of what it means to “do” impact computing. The alliance’s support enables us to stay open to pursuing new opportunities that build the field of impact computing as they arise: piloting new projects, bringing in dedicated data-science expertise, or convening the community through events that celebrate and further the incredible work Harvard faculty are doing.
GAZETTE: How does this project build upon the work that’s been supported by the Data Science Initiative in its first five years already?
DOMINICI: The HDSI launched in 2017 with the explicit goal of uniting computer scientists, statisticians, and domain experts to derive meaningful and actionable insights that shape the new science of data. We’re proud of the community of faculty that we’ve brought together, who are doing work on a wide range of topics, from understanding and preserving democracy, to improving AI modeling of chest X-rays, to creating new methods that can establish causal effects beyond correlation. But it was the overlapping crises of 2020 — the emergence of COVID-19 and our national reckoning with systemic racism in the aftermath of the murder of George Floyd — that allowed Harvard’s data-science community to use the full spectrum of our ability to amplify the impact of research, moving in real time with community stakeholders. This is the future of data science that we envision building with AWS through impact computing: more faculty involvement, larger projects, faster speeds, and the ability to bring together people who would not otherwise be working together.
On Nov. 15-16, HDSI will host a conference showcasing data science in research and education through panels, keynotes, workshops, and tutorials featuring speakers from across Harvard, academia, and industry. Please go to www.hdsiconference.org for more information.
A new wave of statisticians, including a team at Harvard, has developed tools they think can help police the longstanding problem of gerrymandering, in which parties draw congressional and legislative districts to tip the scales for their candidates.
Gerrymandering has been part of American politics since the 1800s, with results that are at the very least controversial and sometimes illegal, particularly if it’s done to dilute the voting power of communities of color. The battle is renewed every decade in state legislatures when Census figures show which districts need to be rebalanced owing to population shifts. Sometimes violations appear blatant but often they are more subtle, and in either case they can be difficult to prove.
By harnessing the quantitative power of big data computing, these statisticians have developed algorithms that can identify likely gerrymandered maps by putting redrawn districts through hundreds, even thousands of computational tests and simulations. These tools — developed over the past decade often through open sourcing — provide powerful evidence of whether a plan falls outside the norm of a “fair” plan. The tests recently have been gaining traction and increasingly are being used as evidence of illegitimacy in court.
One method, started at Harvard in 2020, has quickly been making an impact. It’s been used by researchers, journalists, and election analysts, and has played significant roles in recent legal cases in which legislators were forced to throw out gerrymandered maps.
Called “redist,” the tool creates a vast pool of alternate nonpartisan plans (upwards of 5,000 to 10,000) that can be compared to a map that’s being proposed or has already been enacted by local legislators or redistricting committees. This pool of nonpartisan baseline maps makes it possible to see whether the new map fairly represents the new shifts shown in the Census, or is an outlier.
“What the algorithm does is that using the geography and distribution of different voters within the state, it shows what kind of partisan outcome we should expect,” said Kosuke Imai, a professor in the departments of statistics and government. “But if we see something very different in comparison to this nonpartisan baseline, favoring one party under the enacted plan, that’s evidence that there are some other factors influencing when the plan was drawn.”
For example, researchers say, consider a case in which the tool runs 5,000 simulations and finds that on average the legislative minority party should win about five to seven seats, but under the map pushed through by the majority party, the rival wins only two. An outcome that rare under a neutral process supports the likelihood of partisan gerrymandering, the researchers said.
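The outlier check in that example can be sketched numerically. This is a toy illustration with made-up seat counts, not the redist package or data from any real case:

```python
# Toy illustration of ensemble-based outlier detection in redistricting.
# The simulated seat counts below are invented to mirror the example in
# the text (minority party usually wins 5-7 seats in neutral plans).
import random

random.seed(42)

# Pretend each of 5,000 nonpartisan simulations yields a minority-party
# seat count drawn from this rough distribution.
simulated_seats = [random.choice([4, 5, 5, 6, 6, 6, 7, 7, 8])
                   for _ in range(5000)]

enacted_seats = 2  # the majority party's map gives its rival only 2 seats

# Fraction of simulated plans at least as unfavorable as the enacted plan;
# a value near zero marks the enacted map as an extreme outlier.
p = sum(s <= enacted_seats for s in simulated_seats) / len(simulated_seats)
print(f"Share of simulations as extreme as the enacted plan: {p:.4f}")
```

In court, this kind of figure is what lets an expert say the enacted plan falls far outside the range a neutral process would produce.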
Imai developed redist with Cory McCartan, a Ph.D. candidate at the Graduate School of Arts and Sciences focusing on statistics. The pair found that traditional methods for evaluating the fairness of redistricting plans weren’t working well because they didn’t provide a neutral baseline for objective comparisons. Fairness often became a subjective call, they said.
“For a long time, people have done gerrymandering and the question is ‘OK, how do I prove it?’” McCartan said. “It’s one thing to say, ‘Hey, I think that map looks unfair because the boundaries are super squiggly.’ But these things get litigated in court, so a judge has to clearly be able to decide: Is this fair or not?”
The redist software uses what’s called the sequential Monte Carlo (SMC) algorithm to run its simulations. The software starts from a blank map and draws one district at a time until a complete plan is formed, then repeats the process again and again. Each alternative map is drawn in parallel and incorporates the state’s population, demographic data, and districting laws. Once these alternative maps are drawn, redist uses visualization tools, with charts and graphs summarizing the simulations, to help users understand the data.
Unlike most similar algorithms, the SMC algorithm doesn’t start from a single map and keep altering that one. Instead, the algorithm starts with a blank map and generates new alternative plans from new blank canvases. This random generation enables the algorithm to efficiently explore more unique alternatives and generate a representative sample of plans. Existing algorithms that don’t do this run the risk of exploring plans that are very similar to the starting map, which may already have a partisan bias, the researchers said.
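The blank-canvas idea can be caricatured in a few lines of code. This is a deliberately naive sketch, not the redist package’s SMC algorithm: it ignores geography, contiguity, and legal constraints entirely, and simply builds each plan fresh by partitioning precincts until districts reach a population target.

```python
# Naive sketch of "draw each plan from a blank canvas" ensemble generation.
# Real samplers (like redist's SMC) work on precinct adjacency graphs and
# enforce contiguity and state law; this toy only balances population.
import random

def draw_plan(precinct_pops, n_districts, rng):
    """Randomly partition precinct indices into districts, one at a time."""
    remaining = list(range(len(precinct_pops)))
    rng.shuffle(remaining)  # a fresh blank-canvas ordering for every plan
    target = sum(precinct_pops) / n_districts
    plan = []
    for _ in range(n_districts - 1):
        district, pop = [], 0
        # Fill the district until it reaches the population target
        while remaining and pop < target:
            p = remaining.pop()
            district.append(p)
            pop += precinct_pops[p]
        plan.append(district)
    plan.append(remaining)  # last district takes whatever is left
    return plan

rng = random.Random(0)
pops = [rng.randint(500, 1500) for _ in range(40)]  # 40 fake precincts
plans = [draw_plan(pops, 4, rng) for _ in range(1000)]  # the ensemble
print(len(plans), "plans drawn")
```

Because every plan starts from its own shuffled blank slate, no plan inherits the shape (or bias) of a previously drawn one, which is the property the researchers highlight.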
“Say the enacted plan is favoring Democrats,” Imai said. “If an algorithm only explores very similar plans [because it starts with this enacted plan], then maybe all the simulated plans based on it also favor Democrats. When this happens, that enacted plan doesn’t look like a gerrymander.”
Redist has been used by plaintiffs in gerrymandering cases, including actions in Alabama, New York, Ohio, and South Carolina. In New York, an elections analyst used the SMC algorithm to produce 10,000 maps to help show the map drawn by New York state’s Democratic legislature was an “‘extreme outlier’ that likely reduced the number of Republican congressional seats from eight to four.” A state Appeals Court ordered the map be redrawn.
In Ohio, Imai was called as an expert witness for plaintiffs accusing the Republican-controlled redistricting commission of gerrymandering. The SMC algorithm generated 5,000 maps for the case, none of which was as favorable to Republicans as the commission’s proposal. The state Supreme Court ordered that body to revise.
The algorithm is also being used in a case brought before the U.S. Supreme Court involving allegations of racial gerrymandering in Alabama. Imai served as an expert witness for the plaintiffs, who argue that the state’s new congressional map intentionally dilutes the Black vote. The case, which relies on protections in the Voting Rights Act, could eliminate one of the few remaining nationwide safeguards against rigged legislative maps.
Redist has become a major focus in Imai’s research group at Harvard called the Algorithm-Assisted Redistricting Methodology (ALARM) Project. The group recently launched the 50-State Redistricting Simulation Project and is using the software to evaluate the redistricting plans across the country by producing 5,000 alternative maps for each state.
The easy-to-use tool lets users select a state and explore the maps. To make the data more accessible and digestible, ALARM provides a breakdown of how many congressional districts exist in that state, its redistricting requirements, its political geography, and a summary of what the computer found, including graphs and tables.
All the data can be downloaded. The source code is provided so it can be used as a template to generate simulated plans under different specifications.
The ALARM group’s members — who include undergraduates, graduate students, and even high school students — say the process is rigorous, since each state has different rules that must be translated into the algorithm, and diagnostics must be run to make sure everything goes smoothly.
Still, they believe the effort is worth it.
Up next for the ALARM Project is a plan to expand redist from examining partisan gerrymandering to examining racial gerrymandering, and to evaluate gerrymandering at more local levels.
“There’s always a question of how different ways of drawing boundaries can benefit some voters and harm others,” Imai said. “It’s important for social scientists to understand the nature of these types of political manipulations and address them.”