February 2022

A future in which food is plant-based or grown in a petri dish won’t necessarily translate to better health or progress against climate change, says the journalist Larissa Zimberoff, whose latest book, “Technically Food: Inside Silicon Valley’s Mission to Change What We Eat,” explores the role of the high-tech sector in what lands on our tables.

On Tuesday, Zimberoff zeroed in on the new food-tech sector — think burgers made from pea protein or cheese grown in the lab — in a talk sponsored by Harvard’s Food Literacy Project. Other topics in the hourlong discussion included government regulation, fast food, industrial versus vertical farming, Wall Street investors, and chicken nuggets in Singapore that are a mix of cultured meats and plant-based products.

New players in food, many based in Silicon Valley and backed by bullish investors, are looking more like technology or science ventures, said Zimberoff. Impossible Foods, famous for its plant-based meat substitutes, names new versions of its burger 1.0 and 2.0 — “like they’re software.” The author detailed one startup’s efforts to find a microbe that can express the proteins needed to make cheese without dairy, and a vintner in California who has installed sonar probes in his wine drums that send out pings to help him identify potential problems with fermentation.

Wine-making is a skill passed on through the centuries. “Yet here we are,” Zimberoff said, “it’s got a new way to be done.”

In her research, Zimberoff has found that players in the new food movement are almost always seeking a better, more environmentally friendly world — but with a rub. They are focused on less pollution or animal protection, she said, but they aren’t always thinking about the healthiest option for the consumer. “They are mission-based but profit-driven,” said Zimberoff, adding, “I worry about this profit being tied to food because it sets us up to be the ones that don’t win out.”

With a background in high tech, a passion for food, and a master’s in creative writing from the New School, Zimberoff began writing about the growing intersection between food and technology around 2015. But the author has been interested in food and its effects on the body since childhood. Diagnosed with Type 1 diabetes at the age of 12, she told the audience that she has X-ray vision when it comes to diet. “I look at macronutrients like fiber and protein and fat and carbohydrates,” she said. “I have to be the computer for my body. … I expect food to be much more than many people do.”

Topping Zimberoff’s list of plant-based products to watch are algae, such as kelp or seaweed, and mycelium, the root structure of mushrooms that can be used as a protein substitute.

Looking ahead, Zimberoff said she is optimistic about a successful new food model, one that is good for people and for the planet. In her view, such a system will require stronger oversight from the FDA and the USDA, solutions to equity and access issues around healthy foods, more collaboration and science-based support from experts and, perhaps most importantly, a deep desire on the part of the public to better understand exactly what they are eating.

“People don’t know about their food, we’ve never known about our food, we’ve never done the work to know about our food,” she said. “You’ve got the nutrition facts panel, read it and look up something you don’t know … that’s how you start learning.”



Wondering is a series of random questions answered by Harvard experts. For the latest installment, we asked the human evolutionary biologist Erin Hecht, whose research focuses in part on neural and behavioral variation in domestic canine breeds, if your dog would care if you dropped dead.
 

Yes. Dogs have behavioral and circulating hormone responses to the presence or absence of their owner — and to interactions with their owner — that parallel what we see when humans interact with other humans with whom they share a bond: close friends, family members, and children. That suggests that biological correlates of the human-animal bond are similar — at least in some ways — to human-human bonds. And we know that humans get very distressed if someone they care about and have a close bond with dies. That’s one line of evidence. Another is that, phenomenologically, we can see examples of dogs that show evidence of distress when they’re separated from their owner even for a short period of time. When the owner goes to work, a lot of dogs have separation anxiety. Or if the owner does die, then dogs commonly exhibit distress that can take the form of destructive behavior or something that might look like reduced activity levels — what depression looks like in humans.

One thing we don’t know is to what extent biological bonding mechanisms within a group of wolves are similar to the mechanisms that support bonding between dogs and humans. Another thing we don’t fully understand is to what degree different dog breeds might have different patterns of bonding with humans. It seems like a reasonable hypothesis. Some breeds of dogs have been selected for cooperative working behaviors with one individual human — really one-on-one cooperation. Examples of that could be border collies, Australian shepherds, and other livestock herding dogs that interact closely with a human handling livestock. Other breeds don’t have that cooperative working arrangement with a single human. A livestock guardian dog seems to form strong social bonds with the livestock rather than with the human. And then there are other breeds of dogs that have been developed for human companionship. It’s possible that they might show different patterns of bonding, maybe less with one person and more with an entire household — but that’s speculation. Aside from genetic differences across breeds, individual dogs’ histories of positive interactions with individual people must also play a big role.

From the basic science perspective, I think dogs are a really unique window on how brains change across generations when there’s selection pressure on behavior. We have all these different breeds that have been selectively bred for different behavioral profiles, different types of cognitive abilities, different skills, and so forth. And there’s nothing else really like that in the animal kingdom. Studying them is a way for us to understand brain evolution in a more precise way than we can get from studying any other species. I think we’re at a point in dog research where we’re just starting to get empirical evidence of things that people who have interacted with dogs take as a given. For example, there were a few papers over the past few years establishing that dogs experience jealousy, and I think anybody who’s ever had more than one dog in their house at once knows that that happens. And this question — whether dogs would care if their human dies — is along the same lines. But it’s still important to get that empirical validation and not just go off of our gut understanding of how their minds are working.

 

 



Between 80,000 and 50,000 years ago, people across eastern and southern Africa left behind traces of an explosion of symbolic expression, reflected in far more intensive use of eggshell beads, pendants, pigments, and other art.

Archaeologists have long hypothesized that these new creations were likely the result of major changes in where and how people lived, traveled, and interacted, but the theory has been difficult to test and prove. The problem is the items alone don’t tell the full story. To get at that missing piece of the puzzle, a team of archaeologists and geneticists turned to the ancient people themselves.

A new study analyzing ancient human remains has produced the earliest DNA from sub-Saharan Africa and gives a more complete look at how people from different regions moved around the continent, a shift that led to the creation of social networks for trade and information-sharing. The analysis also shines fresh light on the ancestral populations of almost all foraging peoples in eastern and southern Africa, including ancient and modern hunter-gatherer populations.

A team of 44 researchers, co-led by Harvard geneticists David Reich and Mark Lipson and archaeologist Mary Prendergast, A.M. ’05, Ph.D. ’08, an associate professor at Rice University and a 2016-17 Radcliffe Fellow, describes the findings in Nature. The study is the latest in a series of papers rooted in the time Prendergast spent at Radcliffe, where she met Reich, which ultimately led to their collaboration using ancient DNA to shed light on questions in African archaeology.

“All humans today descend from ancestors who lived in Africa more than 100,000 years ago, and Africa today is the place on Earth where our species is most diverse,” said Reich, a professor in the Department of Human Evolutionary Biology and a professor of genetics at Harvard Medical School. “But while we all recognize the centrality of Africa, we often overlook Africa in the last 50,000 years.”

The new study helps fill this gap through the analysis of DNA data from six ancient individuals found in areas now part of Malawi, Tanzania, and Zambia. Three date from 14,000 to 20,000 years ago, more than twice as old as the oldest ancient human DNA previously studied from the region. In addition, the genetic material is the first such data examined from the Late Pleistocene (the geological period from about 130,000 to 12,000 years ago).

The researchers looked at these data alongside those from 28 individuals buried at sites across the continent to gain new insight into the genetic population structure of eastern and southern Africa.

The analysis shows that most foragers in eastern and southern Africa derived substantial ancestry from three source populations instead of two, as previously thought. The third came from central Africa and became mixed with eastern and southern groups.

“What we’re seeing is a three-way mixture of deeply divergent modern human lineages,” Reich said. “The central contribution of people related to today’s central African rainforest hunter-gatherers was not previously appreciated.”


Ostrich eggshell beads from Tanzania were found where one of six ancient individuals being studied was buried. Harvard's David Reich (pictured) co-led the study that gives a more complete look at how people from different regions moved around the continent, a shift that led to the creation of social networks for trade and information-sharing.

Photos by Jennifer Miller; Jon Chase/Harvard Staff Photographer

The analysis showed these groups moved thousands of kilometers across Africa between 20,000 and 50,000 years ago and mixed in different proportions.

“Over millennial timescales, migration brought together people whose ancestors had lived in distant parts of Africa and had been largely separated for 200,000 years,” said Prendergast. “This produced a cline of ancestry, with roots in eastern, southern, and central Africa.”

The work also showed that by about 20,000 years ago — roughly the date of the oldest samples analyzed — the ancestry mixes of many forager populations were changing relatively little.

“People had stopped moving around as much — they lived locally,” Prendergast said.

The researchers noted, however, that materials for art and tools were still frequently on the move, signifying a strong trading network. Artifacts began to differ stylistically from one region to the next, perhaps reflecting local ideas about aesthetics.

“Archaeology shows us that by 20,000 years ago, social networks were pretty well developed, and that enabled people to live and interact with people extremely locally while still taking part in wide-ranging cultural exchanges,” Prendergast said.

Researchers say the paper serves as a prime example of how geneticists and archaeologists can work together to open windows into the lives of ancient peoples. This is the third original research paper from the collaboration on ancient DNA in Africa sparked by Prendergast’s year at Radcliffe.

Along with Prendergast, Reich, and members of his lab, the research team included scholars from Canada, Kenya, Malawi, Tanzania, Zambia, and other countries. The project is part of a general push in the field of genetics to diversify what DNA is studied, especially in Africa.

“Only about 3 percent of ancient DNA data has come from the continent of Africa,” Reich said. “That’s an extraordinary imbalance when it comes to learning about some of the most important, complex, and difficult-to-understand moments in our species’ past.”



This article is part of a series introducing new faculty members.

Gabriella “Biella” Coleman took her first anthropology course in high school, a formative experience that led her to study healing rituals and spirit possession in college and grad school. But a year-long illness and a fast internet connection inspired a swerve in focus to the burgeoning hacker world. Today she is considered one of the foremost scholars in the interdisciplinary fields of technology studies, media anthropology, digital activism, and security. Author of “Hacker, Hoaxer, Whistleblower, Spy,” Coleman began teaching in the Department of Anthropology last month and joined the Berkman Klein Center for Internet and Society as a faculty associate. This interview has been edited for length and clarity.

Q&A

Gabriella Coleman

GAZETTE: The world of hacking is constantly evolving. What are you working on now?

COLEMAN: I am deep into two big projects. One, recently published by the Data and Society research nonprofit and titled “Wearing Many Hats: The Rise of the Professional Security Hacker,” traces how former underground hackers came to occupy a privileged spot in the security industry.

Many had formerly been underground hackers, banded together in small groups that broke into computer systems. They did so not to cause harm, but for the intellectual thrill of learning about security and computers at a time when such knowledge was not readily available in books, online, or in the academy.

In the late 1990s, these hackers started to claim they had the expertise for improving computer security. And they did, but they still had to convince others of their trustworthiness given their outlaw status.

When some of these hackers started to seek employment, a few prominent figures cautioned against hiring them. So as they stepped out of the shadows to engage with a nascent security public, hackers had to do all this labor to rehabilitate their image. Many indeed went pro and became employable, highly respected security professionals. Some even became industry leaders, but at the time they first made a claim for their special expertise, it was unclear whether anyone would take them seriously.

This project, which is ongoing, is historical, and so the pace of research has been slower and much kinder compared with my earlier work on Anonymous, the global collective credited with cyberattacks on governments and other institutions, and on the free and open-source software movement. Even as I conducted face-to-face research, much of it was online. And as an anthropologist, the expectation is to be present, to participate. And when you can be present 24-7 by logging into various online forums, the pressure to be there is as immense as it is enticing.

The second project is a book of essays I’m currently calling “Weapons of the Geek,” which is tied to the Henry Morgan Lectures I’m giving this spring at the University of Rochester. Many tend to think hackers are either freedom fighters, sticking it to “The Man,” or are part of re-establishing and fortifying power structures. But it’s both or neither. It all depends on time, place, the groups, and projects. With this book, I hope to make this ethical and political variability far more palpable by juxtaposing different hacker projects and histories that complicate the idea there is an essence to hacking. For instance, one essay covers a radical hacker collective in Spain called Xnet that managed to jail bankers thanks to their whistleblowing efforts. Another essay covers a cloak-and-dagger history whereby the French state infiltrated the hacker underground to enroll them to spy. In sharing very different stories around hackers in different eras and regions, I hope to get readers to think about hacking in more nuanced and less binary terms.

GAZETTE: Why is the history of hacking so important to research now and amplify?

COLEMAN: That’s a great question. First, so much of hacker history exists online but is vanishing before our eyes. My latest project wouldn’t have been possible without the uber internet archive, the Wayback Machine. Other material unavailable there, which I used, has already disappeared. I try to digitally snapshot and store everything offline. We can’t assume what exists today will be here tomorrow.

Second, the history of hacking holds lessons to thinking through contemporary problems and predicaments. Take, for instance, the hackers who became security professionals. These hackers did not play nice. They forced software vendors like Microsoft to care and pay attention to security by being loud, spectacular, and adversarial. Today the need for such antagonistic, independent critique has never been greater, as a means to force social media giants like Facebook or Twitter to fix problems around socio-technical harms and vulnerabilities. For example, being on Twitter as woman journalist of color might mean that you’re going to get really harassed. So what can Twitter do, either at the level of policy or design, to minimize this harm? That’s much tougher to solve than fixing some technical bug, but getting the companies to even care requires the same sort of pressure that hackers had exerted on software vendors.

GAZETTE: How well-educated is the public as to who and what hackers are?

COLEMAN: Certainly 10 or 15 years ago if I told anyone, “I study hackers,” most people would immediately think: criminal-wizard stealing my credit card. Today someone still might think that or Russian nation-state hacker, ransomware hacker, Bitcoin entrepreneur, or hacktivist. Hackers land on headlines so often and for so many different reasons.

One common misperception is that hackers are asocial, loners, misfits. The reality is hacking is quite cooperative and social. For example, with free and open-source projects, some of which boast thousands of collaborators, project members not only develop software, but build institutions guided by complex voting procedures, membership guidelines, and legal philosophies. They are really thoughtful about governance structures — it is one reason that some of these projects are still kicking around 20-plus years after they were first chartered!

Anonymous was also highly social in nature, and norms encouraged the sublimation of ego. If anyone tried to amass personal fame or attention, they were put in their place. There is still a tremendous degree of sharing with little fear of being scooped.

More so, some people may think that hackers are psychologically off-kilter. The reality is far more mundane. While hackers are still overwhelmingly men, they are not deranged misfits. Some are college students; many hold jobs in technology companies doing the work of coding, security research, or administering servers and networks. They love to gather at conferences and other meet-ups, on- and especially offline. They may be obsessed with computers and learning, but academics are similarly obsessed with their work, right? 

GAZETTE: Tell me about your background, and how you got here.

COLEMAN: I grew up in Puerto Rico. Interestingly enough, when I was 16 I took an anthropology course in high school. I fell in love almost right away and knew this was the path I wanted to take. When I went to college and grad school, I settled on religious healing in Guyana, South America. My side interest in free and open-source software already existed but that topic did not even strike as an option. It was too far off the beaten path, at least for graduate students.

But life had other plans for me. I ended up sick and home-bound for a year. My laptop and fast internet connection meant I could continue learning from home. When I got better, I was keen to continue my study of hackers, partly as it had not been studied ethnographically. My supervisor was supportive, but honest about my future. She told me point-blank, “You probably won’t get a job in an anthropology department,” and she was right. I’ve spent most of my career outside anthropology departments [media studies at NYU and McGill], so Harvard is my first bona fide anthropology position!

GAZETTE: Can you talk about doing this work in the anthropology space?

COLEMAN: Back in 1998 when I started grad school, I could count the number of anthropologists working on digital media on one hand. Today there are dozens and dozens of anthropologists who dedicate themselves to the study of contemporary technologies. Digital media is now so vital and for so many spheres of life, that many projects include a media component.

Still, I don’t always find it useful to approach the digital world as a stand-alone research arena. It encompasses tremendous plurality. Even digital activism is a bit of a misleading category. It includes everything from slacktivism, like signing a petition, to risky hacking and leaking. Researchers have to specialize, and yet there is this understandable need and pressure to follow more general trends, around, say, laws that shape online privacy or content, platform politics, the role of artificial intelligence in determining what content we see, and much more.

Even my area of specialization is home to tremendous diversity, and I’ve only specialized in a few arenas: free software, hacktivism, security. There is so much more: like piracy, cryptography, phone phreaking, biohacking, hardware hacking. I do try to follow research in these other arenas. It helps me determine what might be particular to one domain and what needed cross-cuts across field of practice and place.



A team of researchers led by Harvard and Broad Institute scientists has developed a new drug-delivery system using engineered DNA-free virus-like particles (eVLPs) that is able to edit genes associated with high cholesterol and partially restore vision in mice.

Because eVLPs enable safer in vivo delivery of gene-editing agents than some clinical methods with comparable or higher efficiencies, this new platform holds promise of being able to deliver therapeutic macromolecules with less risk of off-target editing or DNA integration.

In the paper, published in Cell, the researchers detail how they developed virus-like particles to deliver base editors, proteins that make programmable single-letter changes in DNA, and CRISPR-Cas9 nuclease, a protein that cuts DNA at targeted sites in the genome. The authors identified factors that influence virus-like-particle delivery efficiency and demonstrated that engineering these particles can overcome multiple structural limits to their potency. The team’s eVLPs are the first virus-like particles to deliver therapeutic levels of base editors to a variety of cell types in adult animals.

“The delivery of therapeutic macromolecules into mammalian cells in animals, and eventually in patients, is one of the most important challenges in life sciences,” said the paper’s senior author, David Liu, the Thomas Dudley Cabot Professor of the Natural Sciences and a core faculty member of the Broad. “There is often a very steep drop-off between in vitro and in vivo delivery, so we made the decision early on that our new delivery technology would need to show good efficacy in animal models.”

This work was led by members of Liu’s lab, including postdoctoral fellow Samagya Banskota, and Aditya Raguram, a chemical biology student in the Graduate School of Arts and Sciences, in collaboration with research teams led by Krzysztof Palczewski at UC Irvine, and Kiran Musunuru at the University of Pennsylvania.

This new delivery system finds a novel use for virus-like particles and builds on the success of base editors, which the Liu Lab developed in 2016 to rewrite individual DNA bases, such as the single-letter mutations that cause thousands of genetic diseases.

Virus-like particles have long been studied as drug-delivery vehicles. Because they can carry molecular cargo and lack viral genetic material, they are able to exploit the efficiency and tissue-targeting advantages of viral delivery without the drawbacks of using actual viruses, which can insert their genetic material into a cell’s genome and potentially cause cancer and other diseases. However, existing VLP-delivery strategies have had limited therapeutic efficacy in vivo.

The team identified delivery limitations and systematically engineered the components of VLPs to overcome cargo packaging, release, and localization bottlenecks. In doing so, they developed fourth-generation eVLPs that packaged 16 times more cargo proteins than previous designs and enabled an eight- to 26-fold increase in editing efficiency in cells and animals.

The team used their optimized eVLP system to deliver base editors to the liver in mice, where they efficiently edited a gene to lower “bad” cholesterol levels. A single injection of eVLPs resulted in an average of 63 percent editing of the target gene and a 78 percent drop in its protein levels, changes that substantially reduce the risk of coronary heart disease.

“The cholesterol target is particularly interesting because it is not only relevant to patients with a rare genetic disease,” Raguram said. “We are hopeful this is one example of genome editing being able to benefit a large population because cholesterol levels impact the health of billions of people.”

The researchers also used a single eVLP injection to correct a disease-causing mutation in mice with a genetic retinal disorder, resulting in the partial restoration of vision.

Going forward, Banskota is optimistic that eVLPs will be utilized by scientists quite easily because of the system’s relative simplicity and versatility.

“Because our system is relatively simple and easily engineered, it allows other scientists to adopt and build upon this technology quickly,” Banskota said. “Beyond carrying gene editors, eVLPs have the ability to transport other macromolecules with lots of therapeutic potential.”

This work was supported by the National Institutes of Health, the Bill & Melinda Gates Foundation, and the Howard Hughes Medical Institute.



A new study by Harvard and MIT researchers, including psychologist Elizabeth Spelke, shows that children as young as 8 to 10 months old infer two people are likely in a close relationship if they see them having interactions that involve a transfer of saliva.

Such activities include kissing, taking bites out of each other’s food, and sharing the same fork or straw. The study indicates that babies understand all of these activities as social cues indicating whether people are on casual terms or share stronger bonds.

The research, published in Science, also suggests babies glean from such gestures whether those involved are likelier to comfort one another if something stressful comes up — say, if one of them starts crying.

Lead researcher Ashley Thomas says the experiments involved fuzzy puppets, orange slices, and saliva-dipped fingers and were performed under the watchful eyes of infants and toddlers, ranging from 8 to 18 months of age.

The children were shown two sets of videos, some of which involved examples of saliva-sharing and some of which didn’t. In one set, a woman took a bite of an orange slice and then placed it in the mouth of a fuzzy blue puppet before taking it back for another nibble. The infants and toddlers were then shown a different woman passing a ball back and forth with the puppet. The last video showed the puppet seated between both women before it started crying and drooped its head.

“The question is: Who do the infants and toddlers expect to respond to the distress of the puppet?” said Thomas, who is now a researcher at MIT but started this work as a researcher in Spelke’s Harvard lab.

The babies consistently looked first and much longer at the woman who’d shared the orange slice than the woman who just passed the ball.

As a control, the same two women were shown with a new puppet to a different group of infants and toddlers. Neither shared an orange slice with the puppet, and when the puppet started crying, the babies spent an equivalent amount of time looking at each of the two women. This indicates that the determining factor in the first experiment was the relationship the babies assumed between the puppet and the woman who shared the orange slice.

Researchers also showed the babies a new set of videos with a purple and a green puppet. In these videos, a woman touched her forehead, touched the forehead of the purple puppet, then her own forehead again. The same woman then put her finger in her mouth, put it in the mouth of a green puppet, and then back in her own mouth.

The woman then sat between the puppets and acted as if in distress. The babies largely turned their focus to the puppet who’d shared saliva with the woman, signaling their expectation that it would most likely be the one to offer help.

Researchers said the study showed that infants and toddlers recognize what are called “thick relationships,” ones characterized by enduring obligations and attachments. The work also highlights what may be an inherent interest among babies in learning to identify social structures and close relationships between people by seeing how they interact.

“The primary feature of a human infant is they can do almost nothing, and they know almost none of the things that we know,” Spelke said. “Most of the things that we spend our time thinking about probably don’t matter to an infant at all. … But infants seem to care about this, and they seem to be spontaneously interested in this. That I think sends up a signal that this may be something that’s actually culturally really important to human beings.”

The experiments were conducted over Zoom with Brandon Woo, a student in psychology at the Harvard Graduate School of Arts and Sciences and a researcher in Spelke’s lab, who played a key role in logistics.

In addition the researchers did some slightly different but related work with 5- to 7-year-olds. They found that when it came to sharing things that could be divided easily (like pieces of candy or toys) and things that couldn’t be divided (like a scooter), the children thought it would be just as likely that someone might share with either friends or family. But they thought interactions that involved saliva-sharing (like eating applesauce with the same utensil) were likelier to involve just family.

The team plans to keep looking at infants, toddlers, and young children’s abilities to recognize thick relationships. They want to know how early in life babies possess this knowledge.

Thomas, who has a toddler at home, has a strong feeling it starts pretty young.

“My husband at this point is really sick of me filming every time that my kid feeds one of us or does a saliva-sharing interaction,” she said.



With its adorable big eyes, this predator seemed more Disney than deadly.

A crab roughly the size of a quarter lived in the waters of what is now Colombia 95 million years ago, stalking its unsuspecting victims from the dark before gracefully chasing down its prey.

Today, most adult crabs are largely known as bottom-dwellers that have to scavenge and scurry across the sea floor because of their tiny, almost useless eyes. But this long-extinct crab, dubbed Callichimaera perplexa, had unique physical features that made it an agile swimmer that likely zoomed through the water to catch whatever was in its sight.


Artistic reconstruction of Callichimaera perplexa swimming after a comma shrimp.

By Masato Hattori

A new study, published in iScience, looks at the signature feature of this ancient crab: its enormous eyes that made up about 16 percent of its body.

The researchers from Harvard and Yale show that, unlike most crabs today, this critter had incredible eyesight, which helped it become an active hunter.

“Even though it’s the cutest, smallest crab, the big eyes of Callichimaera and its overall body form with unusually large, oar-like legs indicate that it might have been an active swimming predator, rather than a bottom-crawler as most crabs are,” said Javier Luque, a postdoctoral researcher in the Harvard Department of Organismic and Evolutionary Biology.

Luque has been trying to understand Callichimaera since initially stumbling upon it on a dig in Colombia in 2005. He fondly calls it the platypus of crabs because of its unique assemblage of body parts that are present in many groups but hardly ever together in one body plan. The assortment includes a long, spider-like body with bent claws; flat, paddle-shaped legs; a strong but exposed tail; and large, stalk-less compound eyes.

Video credit: Images by Daniel Ocampo/Vencejo Films and Javier Luque/Harvard University; animation and 3D reconstruction by Alex Duque/Lynn University

“It is a true chimera,” Luque said.

Based on where the fossils have been found, it lived in what is now Colombia, North Africa, and Wyoming. Luque and his colleagues have collected more than 100 well-preserved specimens. It’s unclear what type of prey the crab hunted, but it was found surrounded by fossilized comma shrimps, which are about as big as a rice grain.

Luque first described Callichimaera in 2019 in the journal Science Advances after years of periodically visiting museums and analyzing their crab fossils before confirming it was a new species.

In this recent paper, Luque and colleagues analyzed seven pairs of exceptionally preserved Callichimaera eyes. They also compared them to the eyes of 15 crab species, both living and extinct, and put together a growth sequence for Callichimaera.

Luque and first author Kelsey Jenkins, a Ph.D. candidate at Yale, found Callichimaera had the fastest-growing eyes of the 1,000-plus crab specimens to which they compared them; the eyes could amount to 16 percent of their entire body. That’s equivalent to humans with eyes the size of soccer balls.
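To get a rough feel for that comparison, here is a minimal back-of-the-envelope sketch in Python. It assumes “16 percent of its body” refers to eye size relative to body length, and the human height and ball diameter are everyday approximations chosen for illustration, not figures from the study:

```python
# Rough scale check for the soccer-ball comparison. All figures here are
# everyday approximations for illustration, not values from the study.

HUMAN_HEIGHT_M = 1.7            # typical adult height (assumption)
EYE_FRACTION = 0.16             # "about 16 percent of its body," per the article
SOCCER_BALL_DIAMETER_M = 0.22   # a regulation ball is ~22 cm across (assumption)

scaled_eye_m = HUMAN_HEIGHT_M * EYE_FRACTION
print(f"Human-scaled eye: {scaled_eye_m * 100:.0f} cm; "
      f"soccer ball: {SOCCER_BALL_DIAMETER_M * 100:.0f} cm")
# Output: roughly 27 cm vs. 22 cm -- the same order of magnitude,
# which is where the soccer-ball analogy comes from.
```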

Usually, crabs have tiny compound eyes located at the end of a long stalk with an orbit to cover and protect them. Callichimaera, on the other hand, had large compound eyes with no protective sockets, meaning they were always exposed and vulnerable. The researchers said that anatomical make-up was almost a giveaway for its visual acumen.

“Plus, eyes that big impose a huge investment of energy and resources to maintain them,” Luque said. “This animal must have relied considerably on vision.”

In their analysis, the researchers found that to be true. The internal soft tissues in the eyes of Callichimaera they examined showed a closer similarity to the eyes of bees and other large-eyed insects than to the stalked eyes of crabs. The researchers showed the crab saw as sharply as dragonflies or mantis shrimp, two efficient hunters with remarkably keen vision.

The researchers first thought Callichimaera was a crab in megalopa, the last larval stage, in which young crabs are free-swimming predators with relatively big eyes. It’s a brief stage, and as the crab matures into a juvenile, the body outgrows the eyes and the animal transforms into the sideways-scurrying scavenger seen today. The analysis showed Callichimaera was not only an adult predator, but that it retained features from its larval stages.

Luque said the researchers next plan to look at the biomechanics of Callichimaera’s swimming. Its name translates to “perplexing beautiful chimera,” and Luque says he’s always surprised by what the team continues to learn about the long-dead crab.

“The animal itself is gorgeous and baffling,” Luque said. “I call it my beautiful nightmare.”



Leonard is desperately trying to find the man who killed his wife and knocked him out, leaving him without the ability to form new memories of anything after that horrendous night.

That is the storyline of filmmaker Christopher Nolan’s 2000 psychological thriller “Memento,” which served as the focus for a lively online panel discussion Monday evening on memory, how it functions and shapes who we are — and how accurate Hollywood typically is in depicting all of it.

Led by moderator Kirk R. Daffner, Harvard Medical School’s J. David and Virginia Wimberly Professor of Neurology, panelists Daniel L. Schacter, the William R. Kenan Jr. Professor of Psychology, and Susanna C. Siegel, Edgar Pierce Professor of Philosophy, opened the event, sponsored by the Mind Brain Behavior Interfaculty Initiative, with a discussion of the plausibility of the acclaimed film’s premise.

Leonard, played by Guy Pearce, attempts to compensate for his inability to remember by leaving himself clues — notes, Polaroid pictures, and tattoos all over his body — that apparently document what he has learned about the rape and murder of his wife. Images of this brutal act are Leonard’s last enduring memory.

Summarizing the film’s complex plot, Daffner praised it for not repeating the erroneous Hollywood trope that amnesia “is often caused by a blow to the head, which causes the loss of memory and personality and identity.” In this mistaken concept, the sufferer is still able to form new memories, and the problem is often cured by another blow to the head. “Although this account may seem silly, it is surprisingly pervasive,” said Daffner.


“There’s a very long tradition in memory research of how what we remember about the past not only reflects what happened, but our own current needs,” said panelist Daniel Schacter (left).

Schacter also praised Nolan’s understanding of anterograde amnesia, in which a person is unable to form new memories, although he noted that most cases of anterograde amnesia also entail some retrograde amnesia, or loss of memories from before the injury. (Schacter also traced the filmmaker’s insight to a Georgetown University psychology class taken by Nolan’s brother Jonathan, who used the idea for a short story.)

“The movie overall does a great job of portraying the severity of Leonard’s anterograde amnesia and clarifying the difference between what this is and the more typical movie depiction,” Schacter said.

“The way in which the movie depicts the severity is part of the genius,” he added. “The reliance on notes and photos, the continued inability to recognize other characters whom we know he knows, illustrates the severity of the deficit.”

Siegel raised the role of memory in the formation of personality. “Another thing that’s realistic and compelling is the picture of what it might be like for your mind to be limited in this way,” she said. For example, she noted, when we see Leonard reliving an argument with his wife, he doesn’t present a full range of emotions. “He’s very flat and has little affect,” she said. “Is he flat like this because of his condition? When he’s remembering that scene is he projecting the affect he has back on the scene?

“That’s a big pitfall we all face,” she said. “When we’re remembering ourselves are we remembering ourselves as we were — or are we projecting our present selves into the past?”

This mutability plays a role in the story and adds to Leonard’s uncertainties. It also, said Schacter, illustrates a key understanding of memory. Noting the unreliability of eyewitness testimony, he pointed out that “memories are an interpretation, not a record.” He called this plot point “a nice summary of our basic understanding of memory.”

Questions about the trustworthiness of memory and its function recurred throughout the panel, even as the participants dismissed some implausibilities of the plot — such as Leonard’s ability to remember and follow directions. As the plot unfolds, with its fragmented shifting of time, the audience has to grapple with the possibility that Leonard is intentionally misleading himself. This misdirection, the experts noted, plays on a well-known aspect of memory.

“There’s a very long tradition in memory research of how what we remember about the past not only reflects what happened, but our own current needs,” said Schacter. This theory of retrospective bias, he explained, “fits with the idea that we use our memory, or shape our memory, to fit our current needs.”

“We try to be nice to our future selves,” added Siegel. As an example, she cited a common cultivated habit of “putting things where we’ll find them.” The severity of Leonard’s case, and his awareness of it, makes this self-kindness optional. “If the future self isn’t going to appreciate this, why help him?”

Addressing one essential function of memory — it allows us to shape our own life narrative — she noted that Leonard has an advantage: Without lasting new memories, she said, “He’s a little bit freed up.”



The argument about whether head or heart is more valuable has raged forever. Would you rather possess the clear rationality of the Greek god Apollo or the wild emotion of Dionysus? The cold logic of Mr. Spock from “Star Trek” or the messy humanity of Dr. McCoy?

The question, according to Leonard Mlodinow, is moot. Why? He argues in his new book, “Emotional: How Feelings Shape Our Thinking,” that the two constitute a kind of false dichotomy because they’re actually inseparable.

“Even if you think you’re applying cold reason, you’re not,” said Mlodinow, who spoke about his book last week in a virtual Harvard Science Book Talk presented by the University’s Division of Science, Cabot Science Library, and Harvard Book Store.

A theoretical physicist by training, Mlodinow spent years on the faculty at the California Institute of Technology but left to write 11 science books, including five best-sellers, plus several episodes of TV shows such as “Star Trek: The Next Generation,” “MacGyver,” and “Night Court,” among others. “Emotional” digs into recent discoveries in neuroscience and psychology — fields entirely different from his own — to explain how feelings, like rage, fear, disgust, and joy, are the unconscious rudders behind all human decision-making.

“Emotions play a hidden role in our behavior,” Mlodinow said. They help the brain choose what sensory information to pay attention to, how to process it, and what other data — such as memories or goals — to weave into decisions. For example, said Mlodinow, when you walk through an unfamiliar neighborhood, fear can amplify the sound of a twig breaking a block away, a sound you might otherwise ignore.

Before jumping into the book’s details, discussion moderator Nick Owchar, executive director of advancement communications at Claremont Graduate University and former deputy book editor of the Los Angeles Times, wanted to know: Why would a theoretical physicist even want to write a book about emotions?

“As a physicist, I’m curious about the universe,” Mlodinow said. “How did we get here? How did it get here? Why is it the way it is? I have the same questions as a human: Why am I the way I am? How did we as a species get here?”

We got here, Mlodinow continued, because of emotions. Thousands of years ago, when wild humans roamed the plains and savannas, they were far from the fastest or strongest animals. Alone, they couldn’t survive, so they banded together. And emotions, like empathy, guilt, and shame, evolved to glue these packs together and encourage cooperation.

But in today’s society, emotions frequently don’t lead to anything as productive as cooperation. Just as human eyes can experience optical illusions, Mlodinow said, emotions can malfunction, revving too high or lingering too long, or flaring up in inappropriate situations. To rein things in, Mlodinow recommends a few simple tactics: meditate; express emotions (suppression has been shown to lower life spans, he said); and change the story. For example, when a driver cuts you off, anger might tell you the person is a selfish jerk. But, if you shift the story, telling yourself the person could be dealing with an emergency, you can shift the emotion. “Put a different spin on it,” Mlodinow said.

 

Emotions can also be contagious, right? Owchar asked.

 

Right, said Mlodinow. Think about yawns in a classroom or audience laughter in TV sitcoms (when these disappeared during the pandemic, he said he could no longer tell if the jokes were funny). But in the age of social media, where one person can interact with thousands of others expressing intense emotions all at once, this contagiousness can become overwhelming. “We’re not necessarily armed and equipped for that,” he said. “Emotional contagion on that scale can get out of hand. Hate and anger and sometimes fear can magnify and amplify in a way that isn’t necessarily healthy for society.”

Three-quarters through the hour-long discussion, Owchar engaged Mlodinow in a game he called “Plead the Fifth,” in which he asked the author to pick one of two options (or neither and choose not to incriminate himself). These included: computer or yellow paper and pencil (“Computer,” Mlodinow said quickly. “I’m an incessant editor.”); Captain Kirk or Captain Picard (he pleaded the Fifth); and Isaac Asimov or Carl Sagan (Asimov).

To conclude the talk, Owchar asked Mlodinow about his work with fellow theoretical physicist and author Stephen Hawking, with whom he co-wrote two books, “A Briefer History of Time” and “The Grand Design.” Their partnership led to a decades-long friendship, and when Hawking died in 2018, The New York Times asked Mlodinow to write a tribute. How did you write while coping with grief? Owchar wanted to know.

“It was easy, in a way,” Mlodinow said. “I just sat there and felt my emotions. I let it all pour in. It came from the heart. I think that’s how you should write.”



You wouldn’t know it to look at Mars now, but rushing waters that crested banks had a hand in sculpting the face of the bone-dry red planet.

New research finds that billions of years ago Mars was beset by catastrophic river flooding that played a massive role in shaping what the planet’s expansive but now dry network of valleys looks like today, with its deep chasms and canyons.

This study builds on previous work looking at the planet’s water systems and was published late last year in Nature. The project team includes Harvard postdoctoral research fellow Gaia Stucky de Quay from the Department of Earth and Planetary Sciences and scientists from the University of Texas at Austin.

Years of evidence show that Mars had streams, ponds, lakes, and maybe even seas and oceans around 4 billion years ago. Large river systems covered about 50 percent of the planet during this time, and large crater lakes that could hold up to a small ocean’s worth of water were common.

Scientists analyzed satellite images and sophisticated topographical mapping of these dried-up water basins on Mars to determine what happened when the lakes that used to fill the planet’s giant craters overflowed, likely due to immense rainstorms. It appears the ensuing floods loosed immense amounts of water, along with sediment.

“Mars, like the moon, is completely filled with lots and lots of craters, so you have this landscape that’s kind of like a golf ball that’s just full of little holes,” said Stucky de Quay. “What ends up happening is that all these rivers are just flowing and then fill up and then they overflow and make these catastrophic floods.”

The impact on the planet was powerful and almost immediate. Landscapes and other geological features were rapidly reshaped in a matter of days or weeks. In fact, the breach floods eroded more than enough sediment to completely fill Lake Superior and Lake Ontario.


Stucky de Quay holds a globe of Mars.

Jon Chase/Harvard Staff Photographer

Although flood evidence has been observed previously on Mars, this is the first time that scientists have quantified how widespread it was, particularly during the planet’s early history. The study results provide scientists with new insights into river formation on Mars and how it differed from the usually slow and gradual process on Earth. It also shows these types of floods were much more common on the red planet than previously thought and were an important geological process.

“This hydrological activity was really active for some reason between 4 billion years and 3.5 billion years ago when it peaked, and we see loads of rivers that are about this age,” said Stucky de Quay. “It changes how we study these rivers because a river that forms in one day and forms a huge valley is very different from a river that formed over millions of years, so we need to reassess how we study the shapes and evolution of these river systems.”

Previous research from the group discovered that some of the crater lakes were prone to rapid flooding as a result of rainstorms and snowmelt that filled them up.

The researchers decided as a next step to look at the impact of this flooding on a global scale. For the study, they looked at images of 262 breached crater lakes that were collected by remote sensing satellites. This is believed to be the first time scientists looked at such a large number of breached Martian lakes at once.

The scientists found that the river valleys formed by these breaches were much deeper than those formed in other ways. For instance, river valleys formed by breach floods were around 560 feet deep, while the depth of valleys formed in other ways was just around 254 feet. The results suggest that these floods carved out rock and sediment rapidly while other valleys were formed more slowly.

One of the most astounding findings for the researchers was that the valley systems formed from breached lakes accounted for only 3 percent of total river valley length on Mars but made up about a quarter of its volume. That means floods from overflowing lakes carved out about 13,675 cubic miles of volume.
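As a quick consistency check on those numbers, here is a short Python sketch. It only rearranges the figures quoted above; the implied totals are back-calculated for illustration and are not values reported by the researchers:

```python
# Back-of-the-envelope check on the length-versus-volume statistic. The
# derived totals below are back-calculated from the article's figures for
# illustration; they are not reported in the study itself.

breach_volume_mi3 = 13_675     # volume carved by lake-breach floods (article)
volume_fraction = 0.25         # "about a quarter" of total valley volume
length_fraction = 0.03         # 3 percent of total river valley length

total_volume_mi3 = breach_volume_mi3 / volume_fraction
print(f"Implied total valley volume: ~{total_volume_mi3:,.0f} cubic miles")

# Carving ~25% of the volume along only 3% of the length means breach-formed
# valleys average roughly 0.25 / 0.03 ≈ 8x the cross-sectional area of other
# valleys -- consistent with the depth contrast described above.
print(f"Average cross-section ratio: ~{volume_fraction / length_fraction:.1f}x")
```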

“This is a bit of a surprising result because they’ve been thought of as one-off anomalies for so long,” Timothy Goudge, an assistant professor at UT’s school of geosciences and lead author on the study, said in a statement.

The researchers believe the chasms and canyons that formed from flooding may have had a lasting effect on the surrounding landscape and influenced the formation of other nearby river valleys. It provides a potential alternative explanation for the unique Martian river valley topography that is usually attributed to climate.

The group hopes additional research, such as new data from the Mars 2020 Perseverance rover, will add to these findings and perhaps help pinpoint exactly how fast the floods carved out the land.

“Hopefully, we can shed some light on the timing of that,” Stucky de Quay said.


