Here you will find the recent daily news from Harvard University.
Harvard Gazette
Official news from Harvard University covering innovation in teaching, learning, and research
Racing against antibiotic resistance
Scientists fear funding cuts will slow momentum in ongoing battle with evolving bacteria
A series exploring how research is rising to major challenges in health and society
In 2023, more than 2.4 million cases of syphilis, gonorrhea, and chlamydia were diagnosed in the U.S. Though that number is high, it’s actually an improvement, according to the Centers for Disease Control and Prevention: The number of sexually transmitted infections, or STIs, decreased 1.8 percent overall from 2022 to 2023, with gonorrhea decreasing the most (7.2 percent).
But the number of STI diagnoses is only one part of the problem.
One treatment for STIs is doxycycline. It has been prescribed as a prophylactic for gonorrhea, recommended as a treatment for chlamydia since 2020, and used to treat syphilis during shortages of the preferred treatment, benzathine penicillin. But bacteria are living organisms, and like all living organisms, they evolve. Over time, they develop resistance mechanisms to the antibiotics we create to kill them. And according to Harvard immunologist Yonatan Grad, resistance to doxycycline is growing rapidly in the bacteria that cause gonorrhea.
“The increased use of doxycycline has, as we might have expected, selected for drug resistance,” Grad said.
The pattern of bacteria evolving to overcome our best treatments is one of medicine’s most fundamental problems. Since the introduction of penicillin in the 1940s, antibiotics have radically transformed what’s possible in medicine, far beyond treatments for STIs. They can knock out the bacteria behind everything from urinary tract infections to meningitis to sepsis from infected wounds. But every antibiotic faces the same fate: As soon as it enters use, bacteria begin evolving to survive it.
The scope of the problem is staggering. Doctors wrote 252 million antibiotic prescriptions in 2023 in the U.S. That’s 756 prescriptions for every 1,000 people, up from 613 per 1,000 people in 2020. According to the CDC, more than 2.8 million antimicrobial-resistant (AMR) infections occur each year in the U.S., and more than 35,000 people die as a result.
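As a quick check, the per-capita rate follows directly from the raw count and a population denominator. The sketch below uses an assumed round U.S. population figure, which is not given in the article, purely to show the arithmetic.

```python
# Illustrative arithmetic only: turning a raw prescription count into a rate
# per 1,000 people. The population value is an assumed round number, not a
# figure from the CDC data cited above.
prescriptions_2023 = 252_000_000
assumed_us_population = 333_000_000

rate_per_1000 = prescriptions_2023 / assumed_us_population * 1_000
print(f"{rate_per_1000:.0f} antibiotic prescriptions per 1,000 people")  # ~757
```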
For researchers like Grad, the endless battle against the clock can be a bit like a game of high-stakes Whac-a-Mole — tracking antibiotic resistance, figuring out how it works, and developing new kinds of drugs before the bacteria can catch up.
“Being able to treat these infections underlies so many aspects of medicine — urinary tract infections, caring for people who are immunocompromised, preventing surgical infections and treating them if they arise, and on and on,” said Grad. “This is foundational for modern clinical medicine and public health. Antibiotics are the support, the scaffolding on which medicine depends.”
Hold or release new drugs?
Grad’s research shows how quickly resistance can develop. In research described in a July letter in the New England Journal of Medicine, Grad and colleagues evaluated more than 14,000 genome sequences from Neisseria gonorrhoeae, the bacteria that causes gonorrhea, and found that carriage of a gene that confers resistance to tetracyclines — the class of antibiotics to which doxycycline belongs — shot up from 10 percent in 2020 to more than 30 percent in 2024.
Fortunately, doxycycline remains effective as a post-exposure prophylaxis for syphilis and chlamydia. It’s an open question why some pathogens are quicker to develop resistance than others. The urgency varies by organism, Grad said, with some, like Mycobacterium tuberculosis, the cause of tuberculosis, and Pseudomonas aeruginosa, showing “extremely drug-resistant or totally drug-resistant strains” that leave doctors facing untreatable infections.
The findings raise alarm bells, or at least questions, in doctors’ offices around the country: As bacteria develop resistance to tried-and-true antibiotics, when should new drugs be introduced for maximal utility before the bacteria inevitably outwit them, too? Traditional stewardship practice has recommended holding back new drugs until the old ones stop working. But 2023 research from Grad’s lab has challenged that approach. In mathematical models evaluating strategies for introducing a new antibiotic for gonorrhea, Grad found that keeping the new antibiotic in reserve saw resistance reach the 5 percent threshold much sooner than introducing it right away or using it in combination with the existing drug.
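The Gazette piece does not reproduce the lab’s model, but the shape of such a strategy comparison can be illustrated with a deliberately crude simulation. The sketch below is an invented toy, not the 2023 study’s model: it assumes a 5 percent resistance threshold, logistic-style selection proportional to how heavily each drug is used, and made-up parameters, and it simply reports how soon resistance to the existing first-line drug crosses the threshold under each introduction strategy.

```python
# Toy comparison of antibiotic-introduction strategies. The model structure and
# every parameter here are invented for illustration; this is not the
# mathematical model from the Grad lab's 2023 study.

def years_until_threshold(strategy, threshold=0.05, horizon=60.0, dt=0.05, k=0.3):
    """Return the simulated year when resistance to the existing drug hits 5%."""
    res_old, res_new = 0.01, 0.001   # starting resistant fractions (assumed)
    t = 0.0
    while t < horizon:
        if strategy == "reserve":
            # New drug used only for infections the old drug can no longer treat.
            sel_old, sel_new = 1.0, res_old
        elif strategy == "immediate":
            # Treatments split evenly between old and new drug from day one.
            sel_old, sel_new = 0.5, 0.5
        else:  # "combination": resistance to one drug only pays off once the
               # partner drug has already failed, so selection is mutual.
            sel_old, sel_new = res_new, res_old

        # Logistic-style growth of each resistant fraction, scaled by drug use.
        res_old += k * sel_old * res_old * (1 - res_old) * dt
        res_new += k * sel_new * res_new * (1 - res_new) * dt
        t += dt
        if res_old >= threshold:
            return round(t, 1)
    return None  # threshold not reached within the horizon

for s in ("reserve", "immediate", "combination"):
    print(f"{s:12s} -> {years_until_threshold(s)} years")
```

In this toy, the reserve strategy burns through the existing drug fastest, mirroring the qualitative finding described above; the published work models gonorrhea transmission in far more detail.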
Lifesaving progress halted
Extra time could be critical for Amory Houghton Professor of Chemistry Andrew Myers, whose lab has been developing new antibiotics, including ones that target gonorrhea, for more than 30 years.
“Most of the antibiotics in our ‘modern’ arsenal are some 50 years old and no longer work against a lot of the pathogens that are emerging in hospitals and even in the community,” Myers said. “It’s a huge problem and it’s not as well appreciated as I think it should be.”
Many antibiotics work by targeting and inhibiting the bacterial ribosome, the central machinery that translates the instructions in RNA into a protein readout. Ribosomes are “fantastically complex” 3D shapes, Myers said. Creating new antibiotics means inventing new chemical compounds that can bind like puzzle pieces into their grooves and protrusions.
“My lab will spend quite a lot of time, sometimes years, to develop the chemistry — to invent the chemistry — that allows us to prepare new members of these classes of antibiotics,” Myers said. “And then we spend years making quite literally thousands of different members of the class, and then we evaluate them. Do they kill bacteria? Do they kill bacteria that are resistant to existing antibiotics? We’ve been incredibly successful with this, one antibiotic class after another. The strategy works.”
But it’s also in danger. The Trump administration ended a National Institutes of Health grant to Myers’ lab for the development of lincosamides, a class of antibiotics whose last approved member, clindamycin, dates to 1970. A second terminated NIH grant may kill a promising new antibiotic on the cusp of further development. Myers’ lab has created a new molecule that has proven effective in killing Klebsiella pneumoniae and E. coli, both identified by the World Health Organization as among the highest-priority pathogens. Without continued funding, the molecule may not make it to the clinical trial phase and may never become an approved drug.
“A delusion among people is that these decisions can simply be reversed and these NIH grants restored,” Myers said. “That’s not true. The damage is real, and it’s irreversible in some cases.”
Carrying on Paul Farmer’s legacy
The funding cuts extend beyond individual labs to a global health infrastructure. Carole Mitnick, a professor of global health and social medicine at Harvard Medical School, studies multidrug-resistant tuberculosis (MDR-TB) and has watched about 79 percent of USAID funding for global TB support get slashed this year.
“In the Democratic Republic of Congo, in Sierra Leone, and no doubt elsewhere, we’ve seen stocks of lifesaving anti-TB drugs sitting in warehouses, expiring, because programs that would have delivered them have been canceled or staff who would have collected them have been abruptly fired,” she said. “Not only is it immediately deadly and cruel not to deliver these lifesaving cures, but it sets the scene for more antimicrobial resistance by not delivering complete treatments. And it very clearly wastes U.S. taxpayer money to invest in the purchase of these drugs and let them sit in warehouses and expire.”
Mitnick’s work on multidrug-resistant TB, a form of antimicrobial resistance, builds on the legacy of Paul Farmer, the late Harvard professor and Partners In Health co-founder who revolutionized MDR-TB treatment by rejecting utilitarian approaches that wrote off the most vulnerable patients.
“Getting to know Paul and having him advise me, initially on my master’s thesis and ultimately on my doctoral dissertation, gave me a new framework,” Mitnick said. “It allowed me the freedom to use a social justice framework and to say that actually our research should be motivated by who’s suffering the greatest. How do we blend the research, which we’re very well placed to do at Harvard, with direct service and trying to reach the populations who are most marginalized? That shape is still very much in place and still informing the choices that several researchers in our department make in Paul’s legacy.”
Globally, about 500,000 people are estimated to develop MDR-TB or its even hardier relative, extensively drug-resistant TB, each year. MDR-TB caused an estimated 150,000 deaths worldwide in 2023. TB is the poster child for pathogen characteristics and social conditions that favor selection for drug-resistant mutants. In a single case of TB, the bacterial population comprises bacteria at different stages of growth and in different environments of the body, requiring distinct drugs that can attack each of these forms. Multidrug treatment regimens are long (measured in months, not days) and toxic, making them difficult for people to complete. And in the absence of any incentives or requirements, there’s a long lag between developing new drugs and developing tests that can detect resistance to those drugs. Consequently, treatment is often delivered without any information about resistance, in turn generating more resistance.
The fight against MDR-TB has an unlikely new ally: Nerdfighters, the fan group of prominent video bloggers John and Hank Green — or, more specifically, a subset of that fandom calling themselves TBFighters. John Green’s 2024 book, “Everything Is Tuberculosis,” raised awareness about the prohibitive cost of TB diagnostic tests.
Mitnick said that in the acknowledgments, Green called his book a sort of love letter to Paul Farmer. “Paul didn’t directly introduce John to TB, but it really is Paul’s legacy that took John Green to Sierra Leone, and then he met this young man named Henry who had multidrug-resistant tuberculosis. It awakened in John the awareness that actually TB was not a disease of the past, but a disease very much of the present.”
The TBFighters energized an existing coalition movement to reduce the cost of testing for TB and other diseases from about $10 per test to about $5 per test, based on estimates that $5 covered the cost of manufacturing plus a profit, even at lower sales volumes.
“It wasn’t until John Green and the TBFighters entered the fray in 2023 that we made any headway: The manufacturer announced a reduction of about 20 percent on the price of one TB test,” Mitnick said. “So not a full win, but a partial win.”
Despite the challenges, researchers remain cautiously optimistic. “In my opinion, we can absolutely win the game — temporarily,” said Myers. “Whatever we develop, bacteria will find a way to outwit us. But I’m optimistic that the molecules that we’re making could have a clinical lifetime of many decades, maybe even as long as 100 years, if they’re used prudently.”
Grad sees his work more like the construction crews that repair the city sidewalk or maintain bridges. “I think of antibiotics as infrastructure,” he said. “These tools that we use to maintain our health require continual investment.”
What makes us sleepy during the day?
Research links by-products of steroid hormone to excessive daytime sleepiness
Jacqueline Mitchell
BIDMC Communications
3 min read
A new study sheds light on the biological underpinnings of excessive daytime sleepiness, a persistent and inappropriate urge to fall asleep during the day — during work, at meals, even mid-conversation — that interferes with daily functioning.
The findings, published in The Lancet eMedicine, open the door to exploring how nutrition, lifestyle, and environmental exposures interact with genetic and biological processes to affect alertness.
“Recent studies identified genetic variants associated with excessive daytime sleepiness, but genetics explains only a small part of the story,” said co-corresponding author Tamar Sofer, director of Biostatistics and Bioinformatics at the Cardiovascular Institute at Beth Israel Deaconess Medical Center, and an associate professor at Harvard T.H. Chan School of Public Health and Harvard Medical School. “We wanted to identify biomarkers that can give stronger insights into the mechanisms of excessive daytime sleepiness and help explain why some people experience persistent sleepiness even when their sleep habits seem healthy.”
Investigators from Harvard-affiliated BIDMC and Brigham and Women’s Hospital turned to metabolite analysis to better understand the biology behind excessive daytime sleepiness. Metabolites are small molecules produced as the body carries out its normal functions, from synthesizing hormones to metabolizing nutrients to clearing environmental toxins. By measuring these metabolites, researchers created a profile of excessive daytime sleepiness.
The scientists analyzed blood levels of 877 metabolites in samples taken from more than 6,000 individuals in the Hispanic Community Health Study/Study of Latinos (HCHS/SOL), a long-running study sponsored by the National Institutes of Health since 2006. When they cross-referenced these data with participants’ self-reported measures of sleepiness on an official survey, investigators identified seven metabolites that were significantly linked with higher levels of excessive daytime sleepiness.
The seven metabolites turned out to be involved in the production of steroids and other biological processes already implicated in excessive daytime sleepiness. When the investigators looked only at data from male participants, an additional three metabolites were identified, suggesting there might be sex-based biological differences in how excessive daytime sleepiness manifests.
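For readers curious how such a metabolite scan is typically set up, the sketch below runs one association test per metabolite against a sleepiness score and applies a false-discovery-rate correction. It uses synthetic data and hypothetical names; it is not the HCHS/SOL pipeline, which also adjusts for covariates and the study’s survey design.

```python
# Schematic metabolome-wide association scan on synthetic data.
# Not the published HCHS/SOL analysis; names and numbers are illustrative.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_people, n_metabolites = 6000, 877
metabolite_levels = rng.normal(size=(n_people, n_metabolites))

# Pretend the first seven metabolites truly influence a sleepiness score.
sleepiness_score = 0.2 * metabolite_levels[:, :7].sum(axis=1) + rng.normal(size=n_people)

pvalues = np.array([
    stats.pearsonr(metabolite_levels[:, j], sleepiness_score)[1]
    for j in range(n_metabolites)
])

# Benjamini-Hochberg keeps the false-discovery rate at 5% across all 877 tests.
passes_fdr, _, _, _ = multipletests(pvalues, alpha=0.05, method="fdr_bh")
print("metabolites passing FDR:", np.flatnonzero(passes_fdr))
```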
The findings add weight to the idea that excessive daytime sleepiness isn’t just the result of too little sleep but can reflect physiological circumstances that might someday be diagnosed through blood tests or treated through targeted interventions.
“As we learn what’s happening biologically, we are beginning to understand how and why EDS occurs, the early signs that someone might have it, and what we can do to help patients,” said lead author Tariq Faquih, a postdoctoral research fellow in Sofer’s lab, the lab of Heming Wang at BWH, and a fellow in medicine at HMS. “These insights could eventually lead to new strategies for preventing or managing sleep disorders that include daytime sleepiness as a major symptom.”
This research was supported in part by the National Institutes of Health and the National Institute on Aging.
Solving evolutionary mystery of how humans came to walk upright
New study identifies genetic, developmental shifts that resculpted pelvis, setting ancestors apart from other primates
Gayani Senevirathne (left) holds the shorter, wider human pelvis, which evolved from the longer upper hipbones of primates, which Terence Capellini is displaying.
Niles Singer/Harvard Staff Photographer
Kermit Pattison
Harvard Staff Writer
6 min read
The pelvis is often called the keystone of upright locomotion. More than any other part of our lower body, it has been radically altered over millions of years, allowing our ancestors to become the bipeds who trekked and settled across the planet.
But just how evolution accomplished this extreme makeover has remained a mystery. Now a new study in the journal Nature led by Harvard scientists reveals two key genetic changes that remodeled the pelvis and enabled our bizarre habit of walking on two legs.
“What we’ve done here is demonstrate that in human evolution there was a complete mechanistic shift,” said Terence Capellini, professor and chair of the Department of Human Evolutionary Biology and senior author of the paper. “There’s no parallel to that in other primates. The evolution of novelty — the transition from fins to limbs or the development of bat wings from fingers — often involves massive shifts in how developmental growth occurs. Here we see humans are doing the same thing, but for their pelves.”
Anatomists have long known that the human pelvis is unique among primates. The upper hipbones, or ilia, of chimpanzees, bonobos, and gorillas — our closest relatives — are tall, narrow, and oriented flat front to back. From the side they look like thin blades. The geometry of the ape pelvis anchors large muscles for climbing.
In humans, the hipbones have rotated to the sides to form a bowl shape (in fact, the word “pelvis” derives from the Latin word for basin). Our flaring hipbones provide attachments for the muscles that allow us to maintain balance as we shift our weight from one leg to the other while walking and running.
In their new paper, the international team of researchers identified some of the key genetic and developmental shifts that radically resculpted the quadrupedal ape pelvis into a bipedal one.
“What we have tried to do is integrate different approaches to get a complete story about how the pelvis developed over time,” said Gayani Senevirathne, a postdoctoral fellow in Capellini’s lab and study lead author.
Senevirathne analyzed 128 samples of embryonic tissues from humans and nearly two dozen other primate species from museums in the U.S. and Europe. These collections included century-old specimens mounted on glass slides or preserved in jars.
The researchers also studied human embryonic tissues collected by the Birth Defects Research Laboratory at the University of Washington. They took CT scans and analyzed histology (the microscopic structure of tissues) to reveal the anatomy of the pelvis during early stages of development.
“The work that Gayani did was a tour de force,” said Capellini. “This was like five projects in one.”
The researchers discovered that evolution reshaped the human pelvis in two major steps. First, it shifted a growth plate by 90 degrees to make the human ilium wide instead of tall. Later, another shift altered the timeline of embryonic bone formation.
Most bones of the lower body take shape through a process that begins when cartilage cells form on growth plates aligned along the long axis of the growing bone. This cartilage later hardens into bone in a process called ossification.
In the early stages of development, the human iliac growth plate formed with growth aligned head-to-tail just as it did in other primates. But by day 53, the growth plates in humans evolved to radically shift perpendicularly from the original axis — thus shortening and broadening the hipbone.
“Looking at the pelvis, that wasn’t on my radar,” said Capellini. “I was expecting a stepwise progression for shortening it and then widening it. But the histology really revealed that it actually flipped 90 degrees — making it short and wide all at the same time.”
The authors suggest that these changes began with reorientation of growth plates around the time that our ancestors branched from the African apes, estimated to be between 5 million and 8 million years ago.
Another major change involved the timeline of bone formation.
Most bones form around a primary ossification center in the middle of the bone shaft.
In humans, however, the ilia do something quite different. Ossification begins in the rear of the sacrum and spreads radially. This mineralization remains restricted to the peripheral layer and ossification of the interior is delayed by 16 weeks compared to other primates — allowing the bone to maintain its shape as it grows and fundamentally changing the geometry.
“Embryonically, at 10 weeks you have a pelvis,” said Capellini as he sketched on a whiteboard. “It looks like this — basin-shaped.”
To identify the molecular forces that drove this shift, Senevirathne employed techniques such as single-cell multiomics and spatial transcriptomics. The team identified more than 300 genes at work, including three with outsized roles — SOX9 and PTH1R (controlling the growth plate shift), and RUNX2 (controlling the change in ossification).
The importance of these genes was underscored in diseases caused by their malfunction. For example, a mutation in SOX9 causes campomelic dysplasia, a disorder that results in hipbones that are abnormally narrow and lack lateral flaring.
Similarly, mutations in PTH1R cause abnormally narrow hipbones and other skeletal diseases.
They believe that the pelvis remained a hotspot of evolutionary change for millions of years.
As brains grew bigger, the pelvis came under another selective pressure known as the “obstetrical dilemma” — the tradeoff between a narrow pelvis (advantageous for efficient locomotion) and a wide one (facilitating the birth of big-brained babies).
They suggest that the delayed ossification probably occurred in the last 2 million years.
The oldest pelvis in the fossil record is the 4.4-million-year-old Ardipithecus from Ethiopia (a hybrid of an upright walker and tree climber with a grasping toe), and it shows hints of humanlike features in the pelvis.
The famous 3.2-million-year-old Lucy skeleton, also from Ethiopia, includes a pelvis that shows further development of bipedal traits such as flaring hip blades for bipedal muscles.
Capellini believes the new study should prompt scientists to rethink some basic assumptions about human evolution.
“All fossil hominids from that point on were growing the pelvis differently from any other primate that came before,” said Capellini. “Brain size increases that happen later should not be interpreted using a growth model based on chimpanzees and other primates. The model should be what happens in humans and hominins. The later growth of fetal head size occurred against the backdrop of a new way of making the pelvis.”
This research was funded in part by the National Institutes of Health.
When global trade is about more than money
Economist’s new tool looks at how China is more effective than U.S. in exerting political power through import, export controls
Christy DeSmith
Harvard Staff Writer
6 min read
International trade can yield far more than imports and exports. According to David Y. Yang, Yvonne P. L. Lui Professor of Economics, trade can be used to wield political power.
Yang watched as China imposed trade restrictions on competitor Taiwan following a 2022 visit to the island by U.S. Speaker of the House Nancy Pelosi. A decade earlier, the arrest of a Chinese fishing boat captain in contested waters culminated with Beijing blocking exports to Japan of certain rare earth minerals, critical components for wind turbines and electric vehicles.
“Another example is China banning the import of Norwegian salmon for nearly a decade as punishment for awarding a Nobel Prize to the dissident Liu Xiaobo,” said Yang, a political economist with expertise in the East Asian superpower.
His latest working paper, co-authored with Princeton’s Ernest Liu, presents a framework for measuring how much geopolitical muscle a country can flex by threatening trade disruptions. Today, the economists find, China exerts outsized influence over trading partners while the United States has less power than expected relative to the size of its economy.
“With the arrival of new data sources and empirical tools, this is something we can now study very rigorously,” Yang emphasized. “Conducting these objective, data-driven analyses feels all the more urgent in today’s global geopolitical climate.”
Their model specifically tests a set of predictions made by mid-20th-century Harvard professor Albert O. Hirschman, a German-born Jew who fled Europe during World War II. His book “National Power and the Structure of Foreign Trade” (1945) offered a theoretical account of how countries might use trade to assert geopolitical dominance.
“Hirschman viewed the issue through a positive lens,” Yang noted. “Rather than bombing each other, countries could just fight economic wars to achieve the same goals.”
Hirschman saw that trade asymmetries could be exploited. But deficits and surpluses weren’t the only relevant variables. Also important was how crucial and easily replaced the goods in question were. Halting the flow of crude oil tends to pack a far bigger punch than withholding textile exports.
“If one country becomes overly reliant on another, it might be economically efficient,” Yang explained. “But it can leave the first country vulnerable by exposing it to unfavorable power dynamics.”
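Hirschman’s point that dependence reflects both how much is traded and how hard the goods are to replace can be made concrete with a toy index. The weighting scheme, the country labels, and all the numbers below are invented for illustration; they are not the measure Liu and Yang construct in their working paper.

```python
# Toy Hirschman-style dependence index: weight each imported good by its share
# of total imports and by how hard it would be to replace. Entirely invented
# numbers and labels; not the framework from the Liu-Yang working paper.

def dependence_index(imports):
    """imports: list of (good, trade value, irreplaceability in [0, 1]),
    where 0 means easily re-sourced and 1 means no practical substitute."""
    total_value = sum(value for _, value, _ in imports)
    return sum((value / total_value) * hard for _, value, hard in imports)

country_a_imports_from_b = [
    ("crude oil",      80.0, 0.90),  # critical input, few alternative suppliers
    ("rare earths",    10.0, 0.95),
    ("textiles",       60.0, 0.10),  # large flow, but easily replaced
]
country_b_imports_from_a = [
    ("soybeans",       40.0, 0.30),
    ("aircraft parts", 25.0, 0.70),
]

print("A's exposure to B:", round(dependence_index(country_a_imports_from_b), 2))
print("B's exposure to A:", round(dependence_index(country_b_imports_from_a), 2))
```

Even this crude index shows how a country can run a large trade volume yet hold little leverage if its exports are easily re-sourced, while a smaller flow of hard-to-replace goods confers outsized power.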
Hirschman’s ideas seemed less relevant in the post-war years, with the widespread desire for increased free trade. But the book feels fresh again today, said Yang, who recently assigned it in an undergraduate economics course.
“I asked students to read the first few chapters and guess when it was written,” he recalled. “Many guessed it was last year.”
Yang and Liu set about formalizing Hirschman’s vision about three years ago, long before the current suite of aggressive U.S. tariffs. “A lot of the anecdotal examples that motivated our work came from China,” Yang said.
Indeed, their model shows China’s trade power rising over the past two decades as it turned key industries into political instruments. Chemical products, medical instruments, and electrical equipment emerged as especially potent. The country’s trade power proved larger than expected given the size of its GDP, which is second to the world’s largest economy, trailing it by many trillions of dollars.
U.S. trading power over China declines
This figure plots the directed power (in all sectors) between the U.S. and a country for each year.
Credit: Ernest Liu and David Y. Yang
“In the early 2000s, the U.S. was able to exert more absolute power over China through trade disruptions,” said Yang, noting that findings on the U.S. were relatively stable over the 20-year period they studied.
“But things have quickly flipped,” he continued. “China now has more trade power over the U.S. and, at the moment, can exert positive power over any other entity in the world.”
China’s trading power on the rise
This figure plots the directed power (in all sectors) between China and a country for each year.
Credit: Ernest Liu and David Y. Yang
Yang and Liu also tested a pair of predictions concerning the consequences of unbalanced power. First, the economists tapped a database of millions of events involving the governments of two trading partners, confirming that negotiations and other forms of engagement increase with the asymmetries Hirschman described.
Another dataset, sourced from international opinion polls, was used to gauge bilateral geopolitical alignment over time and to verify a second predicted consequence. Yang and Liu found national leaders strategizing to build and bank trade power — by limiting imports, for example — when relations with a trading partner turned frosty due to political turnover.
“While many of the examples we give in the paper are from China, we hope to show this is a more general phenomenon,” Yang said. “Trade is a source of power any country can access.”
The paper is threaded with other insights.
“If the European Union acted as one country, it would actually be able to exercise positive power over China,” Yang said. “But individual EU members all have negative power over China. I don’t think it’s a coincidence that China typically engages with EU members bilaterally.”
What’s more, the U.S. and China are weaker against each other. The paper features a pair of maps illustrating their trade power over the rest of the world from 2001 to 2021. U.S. strength appears to peak in North America, while China’s is anchored in the Asia-Pacific region.
“In terms of global power dynamics,” Yang observed, “medium-sized countries are very much the ones that get bullied.”
The results underscore a recent shift in the global trade order. For half a century following World War II, Yang said, the largest economies imported and exported with hopes of maximizing efficiency for the benefit of domestic businesses and consumers.
“What’s worrisome is that we’re starting to see the opposite,” he offered. “Trade is being restructured to take power into consideration. But in contrast with the positive-sum nature of efficiency-enhancing trade as countries produce according to their comparative advantage, power consideration in trade is negative-sum, hurting welfare on both sides.
“As we begin to painfully realize,” Yang added, “it may not be geopolitically feasible to implement efficient trade.”
Analysts highlight a school-sized gap in mental health screening
Less than a third conduct screenings, according to survey of more than 1,000 principals
Alvin Powell
Harvard Staff Writer
4 min read
Hao Yu.
Stephanie Mitchell/Harvard Staff Photographer
As anxiety and depression persist at alarming rates among U.S. teens, less than a third of the nation’s public schools conduct mental health screenings, and a significant number of those that do say it’s hard to meet students’ needs, according to a new survey of principals.
With staffing that includes counselors and nurses, public schools are uniquely positioned to help address the youth mental health crisis declared in 2021 by the U.S. surgeon general, according to Harvard Medical School’s Hao Yu, a co-author of the study.
“Child mental health is a severe public health issue in this country,” he said. “Even before COVID, about a quarter of children had different degrees of mental health problems, and during the pandemic the problem just got worse.”
The study, published last month in JAMA Network Open, is the first since 2016 to poll public school principals on children’s mental health, said Yu, an associate professor of population medicine. The intervening years have included COVID-related disruptions, growing worries about screen time, and a surge of artificial intelligence in everyday life, he noted.
$1B: Cut from previously approved federal funding for school mental health support
One positive finding from the survey, which was funded with a grant from the National Institute of Mental Health, is that the percentage of U.S. public schools that screen for mental health issues has risen significantly in the past nine years, albeit from just 13 percent to 30.5 percent. The survey asked 1,019 principals three questions: Do you screen for student mental health issues? What steps are taken for students identified with anxiety or depression, two of the most common youth mental health issues? And how easy or hard is it to find adequate mental health care for students who need it?
The responses show that the most common step taken for students struggling with anxiety or depression is to notify parents — almost 80 percent of schools did that. Seventy-two percent offer in-person treatment, while about half refer to an outside mental health provider. Less than 20 percent offer telehealth treatment.
Responses to the final question highlight the challenge facing those seeking to address the problem, with 41 percent describing the task of getting care as “hard” or “very hard,” a result that Yu said, while concerning, isn’t surprising given the nationwide shortage of mental health providers.
The survey, conducted with colleagues from the Medical School, the nonpartisan research organization RAND, Brigham and Women’s Hospital, the University of Pittsburgh, the Harvard Pilgrim Health Care Institute, and Brown University, also showed that school-based screening programs are concentrated in larger schools, with 450 students or more, and in districts with larger populations of racial and ethnic minority students.
Helping young people overcome mental health challenges is a multistep process, Yu said.
“We need to make child psychiatry an attractive profession and we need to train more mid-level providers — social workers, school nurses, and counselors — because those middle-level providers play an important gatekeeper role, helping identify children with mental health problems and helping children and their families get into the healthcare system,” he said.
It’s also important, Yu said, to get policy right at all levels of government. For example, he said, even though it’s clear that meeting the challenge will require more resources, the federal government recently slashed $1 billion in previously approved school mental health funding. A potentially positive development, he said, is the nationwide trend toward restrictions on smartphone use.
“I don’t think any other institution can replace the schools in identifying and treating child mental health problems,” Yu said. “If mental health problems are treated, their severity can be greatly reduced. Mental health problems not treated in childhood can have a long-lasting effect into adulthood. That’s not an optimal situation for our society.”
‘We’re so happy to have you here’
Yard brims with voices and motion, excitement and nerves, sweat and tears on move-in day
First-year students and their families criss-cross the Yard on move-in day.
Ryan Zhou was busy moving items into his Weld Hall dorm room on Tuesday with the help of his parents and his new suitemates, Kelvin Cheung and Ronan Pell, when there was a knock on the door.
“Hi, Ryan, my name’s Hopi,” said Hopi Hoekstra, the Edgerley Family Dean of the FAS, coming into the room with some bags she had helped Zhou’s brother carry up from the car downstairs. “Welcome, we’re so happy to have you here.”
“I’m excited,” said Zhou, as he stood in the suite’s common area piled high with duffels, boxes, and bedding. “I’m excited to get started with meeting new people, making new friends, excited for all the professors and the classes.”
Harvard Yard came alive Tuesday morning as first-year students and their families unloaded cars and carried bags and boxes to the dorms in preparation for the start of their time at Harvard.
Dean Hopi Hoekstra chats with first-years Ronan Pell (left) and Kelvin Cheung as they settle in their new home in Weld Hall.
Veasey Conway/Harvard Staff Photographer
Zhou and his family drove up from their home in Ellicott City, Maryland, a few days beforehand. His father, Ning Zhou, said he’s feeling positive about the road ahead.
“I am just extremely proud of him and his years of effort,” he said. “This is his dream school. A lot of Harvard graduates told him the experience was transformative for them, so I hope that he will have a similar experience.”
“I just feel happy for him,” Zhou’s mother, Jun Gui, added. “He found the place he wants to go. I haven’t shed a tear yet.”
Welcoming the new students were President Alan Garber (second from left), joined by his wife, Anne Yahanda (far left); Faculty of Arts and Sciences Dean Hopi Hoekstra; and Dean of Harvard College David Deming.
Stephanie Mitchell/Harvard Staff Photographer
First-year Cate Frerichs with her mother, Desiree Luccio.
Stephanie Mitchell/Harvard Staff Photographer
First-year Jose Garcia helps hoist a box up a Hollis Hall stairway.
Veasey Conway/Harvard Staff Photographer
Senior Lexi Triantis takes a bubble break.
Veasey Conway/Harvard Staff Photographer
Boxes collect in a staging area outside the Science Center.
Veasey Conway/Harvard Staff Photographer
A T-shirt decorated with emblems of Harvard’s first-year Houses.
Stephanie Mitchell/Harvard Staff Photographer
By Johnston Gate, a group of upper-level students from the Crimson Key Society, holding a “Welcome to Harvard” sign, sang and danced along to Nicki Minaj and Bruno Mars songs, waving to the cars that pulled in. Outside each dorm, upper-level Peer Advising Fellows, dressed in red T-shirts, greeted new students and helped show them to their rooms.
“What makes move-in day so special?” Hoekstra said. “Three things: Experiencing the energy that our returning students bring to welcoming new first-years to the Harvard community. Meeting proud, and sometimes nervous, parents who have traveled from around the globe. Watching new friendships form among roommates meeting for the first time — ones that often not only last for four years at Harvard but across lifetimes.”
Leila Holland and her parents, Keisha and Jaime Holland, from Long Beach, California, took it all in as they paused outside the key distribution tent in the center of the green. Leila, who had just picked up her ID and register book, said she was looking forward to seeing her Hollis Hall room.
“I’m a little nervous, but I’m really excited to be part of a new community,” she said.
Jaime Holland said he knows this will be a time of changes.
“Just the discovery process, as she figures out what she wants to do and the kind of person she wants to be,” he said. “This is a great place to do it.”
David Deming, Danoff Dean of Harvard College, made his way between the parked cars, cheerfully accepting a black rolling suitcase and a pink wall sign from a family’s car, and leading the way to Weld.
“Move-in day is one of my very favorite days of the year at Harvard,” Deming said. “There is so much positive energy and excitement and anticipation. I feel that, too, in my first year as dean. It’s great to be able to help new students move in and feel the positive energy with them.”
Outside Grays Hall, Harvard President Alan Garber and his wife, Anne Yahanda, chatted with parents, swapping stories and recalling what it felt like to drop their own children at college.
For most parents, move-in day prompts complicated emotions.
Desiree Luccio couldn’t help tearing up as she spoke about moving her daughter, Cate Frerichs, into Wigglesworth Hall. The two wore matching red Harvard sweatshirts.
“I didn’t cry at graduation, but now it’s hitting me,” Luccio said. “For everyone here, all the hard work, everything they’ve done — it’s just such an accomplishment and dream.”
For her part, Frerichs was particularly looking forward to being a student athlete — she will be a coxswain on the men’s heavyweight rowing team.
“I guess I’m nervous and excited,” Frerichs said. “I’ve met my roommates, and I’m excited to start living with them and to meet everyone.”
Global concerns rising about erosion of academic freedom
New paper suggests threats are more widespread, less obvious than some might think
Christina Pazzanese
Harvard Staff Writer
8 min read
Political and social changes in the U.S. and other Western democracies in the 21st century have triggered growing concerns about possible erosion of academic freedom.
In the past, colleges and universities largely decided whom to admit and hire, what to teach, and which research to support. Increasingly, those prerogatives are being challenged.
In a new working paper, Pippa Norris, the Paul F. McGuire Lecturer in Comparative Politics at Harvard Kennedy School, looked at academic freedom and found it faces two very different but dangerous threats. In this edited conversation, Norris discusses the lasting effects these threats can have on institutions and scholars.
How is academic freedom defined here and how is it being weakened?
Traditional claims of academic freedom suggest that as a profession requiring specialist skills and training like lawyers or physicians, universities and colleges should be run collectively as self-governing bodies.
Thus, on the basis of their knowledge and expertise in their discipline and subfield, scholars should decide which colleagues to hire and promote, what should be taught in the classroom curriculum, which students should be selected and how they should be assessed, and what research should be funded and published.
Constraints on this process from outside authorities, no matter how well-meaning, can be regarded as problematic for the pursuit of knowledge.
Encroachments on academic freedom can arise for many different reasons. For example, the criteria used for state funding of public institutions of higher education commonly prioritize certain types of research programs over others. Personnel policies, determined by laws, set limits on hiring and firing practices in any organization. Donors also prioritize support for certain initiatives. Academic disciplines favor particular methodological techniques and analytical approaches. And so on.
Therefore, even in the most liberal societies, academic institutions and individual scholars are never totally autonomous, especially if colleges are publicly funded.
Nevertheless, the classical argument is that a large part of university and college decision-making should ideally be determined internally, through processes of scholarly peer review, rather than controlled externally by educational authorities in government.
You say academic freedom faces threats on two fronts, external and internal. Can you explain?
Much of the human rights community has been concerned primarily about external threats to academic freedom. Hence, international agencies like UNESCO, Amnesty International, and Scholars at Risk, and domestic organizations like the American Association of University Professors, are always critical of government constraints on higher education like limits to free speech and the persecution of academic dissidents, particularly in the most repressive authoritarian societies.
In America, much recent concern has focused on states such as Florida and Texas, and the way in which lawmakers have intervened in appointments to the board of governors or changed the curriculum through legislation.
But, in fact, the government has always played a role, even in private universities. Think about sex discrimination, think about Title IX, think about all the ways in which we’ve legislated to try to improve, for example, diversity. That wasn’t accidental. That was a liberal attempt to try to make universities more inclusive and have a wider range of people coming in through social mobility.
So, we can’t think this all just happened because of Trump. It hasn’t. It’s a much larger process, and it’s not simply America. In all democracies, official bodies in the federal or state government, whichever party is in power, generally regulate employment conditions, university accreditation, curriculum standards, student grants and loans, and so on and so forth, and so it’s going to do that for colleges and universities in the U.S., as well.
Academic freedom is also at risk from internal processes within higher education, especially informal norms and values embedded in academic culture. Those can exist in any organization.
In academic life, surveys of academics since the 1950s have commonly documented a general liberal bias (broadly defined) amongst the majority of scholars, with conservatives usually forming a heterodox minority.
This bias comes from a variety of different sources: It’s partly self-selection, a matter of who chooses to go into academic life versus private-sector careers. But it is also internally reinforced — a matter of who gets selected, appointed, promoted, and who gets research grants and publications. There are lots of different ways people have to conform to the social norms of the workplace and within their discipline.
Those cultural norms are tacit. The problem is that if you don’t follow the norms, there may be a financial penalty — you don’t get promoted, or you don’t get that extra step in your grant and your award.
But they may also be just informal pressures of collegiality, friendship, and social networks. People don’t want to offend so they seek to fit in with their colleagues, department, or institution. As a result, heterodox minorities may well decide to “self-censor,” to decline from speaking up in dissent with the prevailing community.
The result is to accentuate the liberal bias, since criticisms of prevailing orthodoxies are not even expressed or heard in debate. Thus, many of those holding the orthodox views shared by the majority in departmental meetings, appointment boards, or classroom seminars may believe that discussion is open to all viewpoints, but silence should not be taken as tacit agreement if dissenting minorities feel unable to speak up.
The mere perception that academic freedom is in decline increases people’s tendency to self-censor, according to the paper. Why is that?
Liberals often feel that there is no self-censorship, and there is no problem in academe, that everybody is free to speak their opinion, and that they welcome diversity in the classroom, they welcome diversity in the department, and things like that.
The problem is that if you’re in a minority and in particular right now the conservative minority, then you feel you can’t immediately speak up on a number of issues, which might offend your colleagues or might have material problems for your career.
If you’re a student and you have a heterodox view, you might feel that you won’t be popular, you won’t be invited to the parties, and you won’t have all those social networks which are a really important part of why people go to college. So, there’s this informal penalty.
Liberals don’t sense it because when they are discussing things, they think there are a variety of different views, but they may well be antithetical. They don’t even hear the criticisms of their views because those who are in the minority don’t want to speak up.
The minority can be defined in lots of different ways. It’s not simply one ideology. There are multiple viewpoints in any subject discipline. But there’s a particular way of looking at these things within a discipline, which sets the agenda, which also affects textbooks and affects the classroom, and, in fact, affects the informal culture.
You found that endorsements of strong pro-academic freedom values predict the willingness of scholars to speak out even when it differs from popular opinion. What did you mean?
Think about the people who are standing up for Harvard right now or standing up for any institution or any other unpopular view. A strong liberal is somebody who follows the John Stuart Mill argument, which is that the only way you know your argument is to know the opponent’s and to be able to act like a prosecutor in which you can put the argument on both sides. I try to use this as a pedagogy in my own classes.
People who believe in academic freedom are largely in the more liberal democracies, the Western democracies of the world. In many countries, they don’t have those luxuries.
In China, you’re not going to be speaking up against the Communist Party. It’s about what can you say and when can you say it — being sensitive to the silences and what generates the silence. And how do you ask a question, which is not going to belittle somebody and is not going to make them feel small, but you’re taking them seriously when you don’t agree with them.
The most important finding from my research evidence is that if you’re working and living in a country with more institutional constraints and less legal freedom, you’re also more likely to suppress your own views.
You can think of it as an embedded model like a Russian nesting doll. The internal group is limiting your willingness to speak up; the external is about the punishments you face if you do speak up. The two interact, obviously, but the informal norms are the subtlest things, which will keep you quiet.
Funny or failure? It’s a fine line.
‘Jimmy Kimmel Live!’ writer Will Burke on taking risks in comedy and why getting laughs is worth near-constant rejection
Anna Lamb
Harvard Staff Writer
7 min read
A series exploring how risk shapes our decisions.
Imagine walking a tightrope. Your goal is to get to the other side without falling. Below you — certain death. Well, maybe not death. Maybe there’s a net to catch you, but it’s not a very soft net, and falling into it will certainly not feel good. That, says Will Burke, alumnus of Harvard College and nearly two-decade veteran staff writer, now director, for “Jimmy Kimmel Live!,” is what trying to be funny is like.
“The second you walk out on stage or you start to tell a joke, you’re walking a tightrope,” Burke said. “You’re betting on your timing, your point of view, and sometimes you’re putting your dignity on the line in the hopes that people will laugh.”
Making people laugh, both on stage and off, has been a lifelong pursuit for Burke ’99. His comedy career started as a class clown in the hallways of the New England prep schools where his father was a teacher, and continued on stage at Harvard with the improv group On Thin Ice and the Shakespeare troupe he helped found. Then it blossomed in Los Angeles, practicing with improv groups like The Groundlings and auditioning for acting gigs.
And while a career spent trying to be funny sounds like a dream for many, Burke said it’s actually been quite risky. There’s the risk of putting yourself out there creatively, the risk of crossing a line with a joke, and then, of course, the risk of not “making it” as a funny guy full-time.
Burke (from left) on stage with Zach Galifianakis and Kimmel.
“The biggest risk was taking my Harvard diploma in one hand and trading the ivory towers of Harvard for the dive bars of Hollywood,” Burke said. “I was turning my back on the pedigree and the connections.”
Burke knows a Harvard degree can get you far. But, he said, when he moved to Los Angeles after graduation in 1999, he also knew it wouldn’t get him on TV. He’d have to do the same open mics, auditions, and acting classes the rest of the aspiring comedians in LA were doing. And in the meantime, he’d be a bartender slash tutor slash cater-waiter slash comedian.
“I suppose in some ways, you could say for a Harvard grad it’s less risky to go try to do this thing, because if it doesn’t work out you’ve still got a Harvard diploma, and some doors will open to you in a different field. But once you’re 10 years in, 15 years in, starting over in a totally different career is risky too,” he said.
And 10 years, Burke said, would be all he gave it before accepting defeat and going back to the East Coast.
“As an actor, it took me, like, 150 auditions before I booked my first thing,” Burke said. “And at this point I had become a little jaded. I was like, ‘This is so annoying. I don’t even want this commercial. This is a terrible Taco Bell ad, who cares?’ And when you don’t care, then they’re like, ‘Oh, that guy’s great. He doesn’t care. He doesn’t need this job.’ They feel it. And so that taught me a lot.”
Besides booking some commercials and some small roles on TV over six years of auditioning and being rejected, Burke was offered a job back in Boston, working for a bank. He had a baby on the way, rising rent, and an income stitched together through various odd jobs.
“I essentially, verbally accepted a job — I went down to HR and they photocopied my driver’s license and gave me the 401K package, what it would look like, and that whole thing. And I was like, ‘This feels like the most responsible thing to do. I have mouths to feed.’ And I could still scratch the itch in comedy clubs in Boston on the weekends, if I wanted. I kept trying to give myself a pep talk that I felt good about this — having a steady paycheck and a guaranteed career.”
Fate, said Burke, had other plans.
“Shortly thereafter, I flew back to LA and I got offered a job writing for ‘Jimmy Kimmel Live!’ And thank God I did. That was 19 years ago, and I’ve been there ever since.”
Since landing “Kimmel,” Burke said every day on the job, trying to be funny, is a risk.
“There were stressful days where I was convinced I was getting fired,” he said. “You’d see other writers get fired. I was like, ‘Oh, he’s not pitching stuff. Jimmy doesn’t like his stuff or her stuff,’ and then the next thing you know, that guy’s desk is empty. That’s real-world risk. There’s a lot of pressure to continue to produce stuff that lands and you’re trying to hit this moving target — the stuff that was making Jimmy laugh last week, he’s over it. Now that’s played out. Humor is like that.”
Asked about how he deals with near-constant rejection in the office, Burke said your feelings are always on the line.
“It’s impossible to not take things personally,” he said. But he added, there’s a trick to avoid getting too hurt.
“You walk into the room convinced that you are the absolute only person who could ever play this role, and you do your audition, and as soon as they say, ‘Thank you so much,’ you walk out of that room convinced you will never hear from them again and that you didn’t get it, so that you’re not disappointed. And it’s this weird game you play with yourself. Extrapolating that to the writers’ room as you’re pitching a joke, you stop caring what people think, because your nerve endings get frayed.”
In his personal life, Burke says his approach to humor errs on the risky side.
“Comedy can disarm tension. It can bridge divides. It can humanize a room, especially when you’re an underdog or an outsider,” he said. “Sometimes telling a dirty joke at a fancy dinner party is like, ‘Oh, we’re going there. Everyone loves a dirty joke, and now we’re all sharing dirty jokes, and it’s OK. This is an R-rated dinner.’”
But of course, there’s always the risk of the joke going too far. In a fictionalized scenario that definitely wasn’t him, he lays out the rule of time and place.
“Sometimes, in doing a joke, it goes too far, and you learn from it, but you have to go too far sometimes to know where the line is,” he said. “I know you thought it was super funny to come downstairs wearing a bra on your head at the party, but we’re at my friend’s house, and that’s his girlfriend’s bra, and you don’t know them.”
But overall, the chance of being funny, Burke said, well outweighs the risks of being embarrassed or falling off the tightrope.
“It’s a dream job,” he said. “It’s what I envisioned doing when I was a little kid, and I’d see ‘Saturday Night Live,’ or even ‘The Muppet Show.’ The idea of, there’s a show going on, and there’s insanity backstage, and there’s a Stormtrooper and free chickens and Gonzo and things are crashing and the show must go on.”
Mediterranean diet offsets genetic risk for dementia, study finds
Greatest benefit for those with highest predisposition to Alzheimer’s disease
Mass General Brigham Communications
4 min read
New research suggests that following a Mediterranean-style diet may help offset a person’s genetic risk for developing Alzheimer’s disease.
The study, published in Nature Medicine and led by investigators from Mass General Brigham, Harvard T.H. Chan School of Public Health, and the Broad Institute of MIT and Harvard, found that people at the highest genetic risk for Alzheimer’s disease who followed a Mediterranean diet — rich in vegetables, fruits, nuts, whole grains, and low in red and processed meats — showed slower cognitive decline as well as a greater reduction in dementia risk than those at lower genetic risk.
“One reason we wanted to study the Mediterranean diet is because it is the only dietary pattern that has been causally linked to cognitive benefits in a randomized trial,” said study first author Yuxi Liu, a research fellow in the Department of Medicine at Brigham and Women’s Hospital and a postdoctoral fellow at the Harvard Chan School and the Broad. “We wanted to see whether this benefit might be different in people with varying genetic backgrounds, and to examine the role of blood metabolites, the small molecules that reflect how the body processes food and carries out normal functions.”
“These findings suggest that dietary strategies could help reduce the risk of cognitive decline and stave off dementia by broadly influencing key metabolic pathways.”
Yuxi Liu, study’s first author
Over the last few decades, researchers have learned more about the genetic and metabolic basis of Alzheimer’s disease and related dementias. These are among the most common causes of cognitive decline in older adults. Alzheimer’s disease is known to have a strong genetic component, with heritability estimated at up to 80 percent.
One gene in particular, apolipoprotein E, or APOE, has emerged as the strongest genetic risk factor for sporadic Alzheimer’s disease — the more common form, which develops later in life and is not directly inherited in a predictable pattern. People who carry one copy of the APOE4 variant have a three- to fourfold higher risk of developing Alzheimer’s. People with two copies of the variant have a 12-fold higher risk than those without it.
To explore how the Mediterranean diet may reduce dementia risk and influence blood metabolites linked to cognitive health, the team analyzed data from 4,215 women in the Nurses’ Health Study, following participants from 1989 to 2023 (average age 57 at baseline). To validate their findings, the researchers analyzed similar data from 1,490 men in the Health Professionals Follow-Up Study, followed from 1993 to 2023.
Researchers evaluated long-term dietary patterns using food frequency questionnaires and examined participants’ blood samples for a broad range of metabolites. Genetic data were used to assess each participant’s inherited risk for Alzheimer’s disease. Participants were then followed over time for new cases of dementia. A subset of 1,037 women underwent regular telephone-based cognitive testing.
They found that participants who followed a more Mediterranean-style diet had a lower risk of developing dementia and showed slower cognitive decline. The protective effect of the diet was strongest in the high-risk group with two copies of the APOE4 gene variant, suggesting that diet may help offset genetic risk.
“These findings suggest that dietary strategies, specifically the Mediterranean diet, could help reduce the risk of cognitive decline and stave off dementia by broadly influencing key metabolic pathways,” Liu said. “This recommendation applies broadly, but it may be even more important for individuals at a higher genetic risk, such as those carrying two copies of the APOE4 genetic variant.”
A study limitation was that the cohort consisted of well-educated individuals of European ancestry. More research is needed in diverse populations.
In addition, although the study reveals important associations, genetics and metabolomics are not yet part of most clinical risk prediction models for Alzheimer’s disease. People often don’t know their APOE genetics. More work is needed to translate these findings into routine medical practice.
“In future research, we hope to explore whether targeting specific metabolites through diet or other interventions could provide a more personalized approach to reducing dementia risk,” Liu said.
This study was funded in part by the National Institutes of Health.
Seeding solutions for bipolar disorder
Brain Science grants promote new approaches to treat the condition and discover underlying causes
Human brain organoid showing the integration of excitatory (magenta) and inhibitory neurons (green) of the cerebral cortex.
Credit: Arlotta Lab
Kermit Pattison
Harvard Staff Writer
9 min read
Brain Science grants promote new approaches to treat the condition and discover underlying causes
Paola Arlotta holds up a vial of clear fluid swirling with tiny orbs. When she shakes her wrist, the shapes flutter like the contents of a snow globe.
“Those small spheres swirling around are actually tiny pieces of human cerebral cortex,” said Arlotta, the Golub Family Professor of Stem Cell and Regenerative Biology, “except instead of coming from the brain of a person, they were made in the lab.”
Those minuscule shapes may represent a giant opportunity for breakthroughs into bipolar disorder, a mental health condition that affects about 8 million people in the U.S. These lab-grown “organoids” — brain-like tissue engineered from blood cells of living patients — offer a means to discover more effective drugs and develop more personalized treatments for bipolar patients.
Paola Arlotta.
Harvard file photo
The research effort is just one example of the diverse array of projects funded by the Bipolar Disorder Seed Grant Program of the Harvard Brain Science Initiative, a collaboration between the Faculty of Arts and Sciences (FAS) and Harvard Medical School (HMS). Over the last decade, the program has funded more than 90 projects across the University and affiliated hospitals and hosted five symposia. In some cases, the grants have enabled researchers to develop innovative approaches that subsequently won larger grants from major funding agencies and to publish their findings in prominent journals such as Nature.
“The goal for this grant program has always been to help creative scientists in our community initiate new avenues of research related to bipolar disorder,” said Venkatesh Murthy, co-director of the Harvard Brain Science Initiative and Raymond Leo Erikson Life Sciences Professor of Molecular & Cellular Biology. “New directions, as well as new thinkers, are vital for understanding and eventually curing this damaging disorder.”
The program began in 2015 with the first of a series of gifts from the Dauten Family Foundation and recently expanded thanks to a new gift from Sandra Lee Chen ’85 and Sidney Chen. Kent Dauten, M.B.A. ’79, and his wife, Liz, took up the cause after two of their four children were diagnosed with bipolar disorder despite no known family history of the illness. “The field is terribly underfunded and for too long was a discouraging corner of science because of the complexity of these brain disorders, but in recent years has become an exciting frontier for discovery,” said Kent Dauten. The Chens had similar motivations. “Bipolar disorder has touched our family,” said Sandra Chen. “Our experiences drive our commitment to help advance understanding of what causes this disruptive disorder.”
The program now provides each project with $174,000 spread over two years. The 11 projects funded this year will investigate bipolar disorder causes and treatments from perspectives including genetics, brain circuitry, sleep, immune dysregulation, stress hormones, and gut bacteria.
The seed grants seek to nurture “outside-the-box ideas,” Murthy said. He added, “Many of our grantees have made significant discoveries with this support.”
An unsolved problem
Bipolar disorder usually begins in adolescence, and patients suffer from symptoms for an average of nine years before they are diagnosed. It brings recurrent episodes of mania and depression — most often the latter.
The typical treatment involves mood stabilizer medications such as lithium. Some patients also are prescribed antipsychotic medications, but these can cause weight gain.
The disorder often brings other health challenges such as cardiovascular disease, Type 2 diabetes, metabolic syndrome, and obesity. Patients have a life expectancy 12 to 14 years shorter than average and elevated rates of suicide.
The causes of bipolar disorder remain unknown, but it appears to arise from a complex mix of genetic, epigenetic, neurochemical, and environmental factors.
Basic science: When brain signaling goes awry
Extreme mood swings are a hallmark of bipolar disorder. Patients often veer from manic episodes (characterized by grandiosity, risky behaviors, compulsive talking, distractibility, and reduced need for sleep) to depressive periods (sullen moods, joylessness, weight changes, fatigue, inability to concentrate, indecisiveness, and suicidal thoughts).
Nao Uchida, a professor of molecular and cellular biology, suspects that one driver of this volatility is dopamine, a neurotransmitter that plays a key role in learning, memory, movement, motivation, mood, and attention.
Uchida studies the role of dopamine in animal learning and decision-making. Dopamine often is described as the brain’s “reward system,” but Uchida suggests it is better understood as an arbiter of predictions and their outcomes. Mood often depends not on the result itself, but instead on how much the outcome differs from expectations — what scientists call the reward prediction error (RPE).
A few years ago, Uchida became interested in how dysregulation of the dopamine system might offer insights into the swings of bipolar disorder.
“We had not done research related to these diseases before, so this seed grant really let me enter the field,” said Uchida.
The funds allowed his lab to test how manipulating depressive or manic states altered the responses of dopamine neurons in mice. The team incorporated new insights into how synapses become potentiated or depressed, making certain pathways stronger or weaker. Some of their early findings will soon be published in Nature Communications.
Uchida posits that the disorder may be linked to skewed signaling of the neurotransmitters involved in prediction and learning. When the dopamine baseline is high, the person may become biased to learn from positive outcomes and fail to heed negative ones — and thus become prone to taking dangerous risks or entering manic states. In contrast, when the dopamine baseline is low, people pay too much attention to negative outcomes and ignore positive ones — and this pessimism pushes them toward depression.
“A lot of our future predictions depend on our experiences,” said Uchida. “I think that process might be altered in various diseases, including depression, addiction, and bipolar disorders.”
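One way to picture the idea is with a toy numerical sketch. The code below is offered purely for illustration and is not drawn from Uchida’s study: it treats the reward prediction error as simply the outcome minus what was expected, and uses a made-up “baseline bias” knob to show how overweighting positive or negative surprises can push a learner’s expectations up or down over time. All names and numbers are hypothetical.

```python
# Toy illustration only: hypothetical values, not Uchida's actual model.
# Reward prediction error (RPE): how much an outcome differs from expectation.

def update_expectation(expected, outcome, baseline_bias, lr=0.1):
    """Update an expectation after one outcome.

    baseline_bias > 0 mimics a high dopamine baseline (positive surprises
    are overweighted); baseline_bias < 0 mimics a low baseline (negative
    surprises are overweighted).
    """
    rpe = outcome - expected              # prediction error
    weight = 1 + baseline_bias if rpe > 0 else 1 - baseline_bias
    return expected + lr * weight * rpe   # biased learning step

# Identical mixed outcomes, two different baselines.
expected_high, expected_low = 0.0, 0.0
for outcome in [1.0, -1.0] * 10:
    expected_high = update_expectation(expected_high, outcome, baseline_bias=0.5)
    expected_low = update_expectation(expected_low, outcome, baseline_bias=-0.5)

# The high-baseline learner ends up expecting good outcomes (risk-prone);
# the low-baseline learner ends up expecting bad ones (pessimistic).
print(round(expected_high, 2), round(expected_low, 2))
```

In a real brain the expectation and the bias would be shaped by synaptic plasticity rather than set by hand, but the toy version captures the asymmetry Uchida describes: the same string of events leaves one learner overly optimistic and the other overly pessimistic.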
Nao Uchida (left) and Louisa Sylvia.
Harvard file photo; courtesy photo
Clinical research: Reducing obesity
Louisa Sylvia got an intimate glimpse of bipolar disorder in her first job after college. Working as a clinical research coordinator in a bipolar clinic, she witnessed patients struggling with anxiety, depression, and other symptoms. Again and again, she saw patients gain weight after being prescribed medications.
“I quickly became disappointed by the options that were out there for individuals with bipolar,” recalled Sylvia, now an associate professor in the Department of Psychiatry at Mass General Hospital and HMS. “It was really just medications — medications that can have really bad side effects.”
Sylvia has devoted her career to finding better options. (She also is the author of “The Wellness Workbook for Bipolar Disorder: Your Guide to Getting Healthy and Improving Your Mood.”) Even with the best current medications and psychotherapy, many patients continue to suffer from depression and other side effects. To supplement standard therapies, she has sought to develop interventions involving diet, exercise, and wellness.
One promising strategy is time-restricted eating (TRE). Restricting meals to a limited window — say 8 a.m. to 6 p.m. — can result in weight loss, improved mood and cognition, and better sleep.
With the seed grant, Sylvia plans to conduct a trial to evaluate the effects of TRE on bipolar patients. The study will investigate how the regulation of eating habits affects weight, mood, cognition, quality of life, and sleep patterns. She will work with Leilah Grant, an instructor at HMS and researcher at Brigham and Women’s Hospital who specializes in sleep and circadian physiology.
“For individuals who are depressed or have difficulty with motivation or energy, TRE is actually considered one of the easier lifestyle interventions to adhere to,” said Sylvia, who also is associate director of the Dauten Family Center for Bipolar Treatment Innovation at MGH. “We’re basically just saying, ‘Don’t focus as much on what you eat, but rather when you are eating.’”
The seed grants seek to nurture promising approaches that might not get funded through other channels. Sylvia can attest to the value of this opportunity; she had two TRE grant applications for federal funding rejected.
“I look at it like an innovation grant to try something that’s a little bit different but won’t get funded by the normal channels,” she said.
Translational research: Brain avatars
Despite decades of research, the success rate of drugs for treating bipolar disorder remains frustratingly low. Lithium, the mainstay first-line treatment, fully benefits only about 30 percent of patients — but three-quarters of them also suffer from profound side effects.
Animal models do not always translate to human medicine. Among humans, responses vary greatly; some individuals benefit from drug treatments while others do not.
To address these shortcomings, Arlotta is developing an innovative method to test drugs on brain cells of people with bipolar — without putting the humans themselves at risk.
Her team has spent more than a decade developing human brain organoids. They begin by taking a single sample of blood from a person. Because blood cells carry copies of our DNA, they hold the instruction manuals that guide development from fetus to adult. With a series of biochemical signals, these blood cells are reprogrammed to become stem cells. The team then uses another set of signals to mimic the normal process of cell differentiation to grow human brain cells — except as cell cultures outside the body.
“You can grow thousands and thousands of brain organoids from any one of us,” said Arlotta. “If the blood comes from a patient with a disorder, then every single cell in that organoid carries the genome, and genetic risk, of that patient.”
These “avatars” — each about five millimeters in diameter — contain millions of brain cells and hundreds of different cell types. “That is the only experimental model of our brain that science has today,” she said. “It may not be possible to investigate the brain of a patient with bipolar disorder, but scientists might be able to use their avatars.”
In pilot studies, the Arlotta team created brain organoids from stem cells from two groups of bipolar patients: “lithium responders” who benefit from the drug and “lithium nonresponders” who do not. The researchers will test whether these organoids replicate the differences seen in living patients — and then use them to develop more effective therapeutic drugs.
But Arlotta knows that no single approach is a panacea. Because bipolar disorder remains so mysterious, the seed grant program’s value lies in promoting many promising lines of research across disciplines.
“The program has the modesty of understanding that we know very little about bipolar disorder,” said Arlotta. “Therefore, we need to have multiple shots on goal.”
Physicians embrace AI note-taking technology
‘There is literally no other intervention in our field that impacts burnout to this extent’
‘There is literally no other intervention in our field that impacts burnout to this extent’
AI-driven scribes that record patient visits and draft clinical notes for physician review led to significant reductions in physician burnout and improvements in well-being, according to a Mass General Brigham study of two large healthcare systems.
The findings, published in JAMA Network Open, draw on surveys of more than 1,400 physicians and advanced practice providers at both Harvard-affiliated Mass General Brigham and Atlanta’s Emory Healthcare.
At MGB, use of ambient documentation technologies was associated with a 21.2 percent absolute reduction in burnout prevalence at 84 days, while Emory Healthcare saw a 30.7 percent absolute increase in documentation-related well-being at 60 days.
50%: Physician burnout linked to maintaining electronic patient files
“Ambient documentation technology has been truly transformative in freeing up physicians from their keyboards to have more face-to-face interaction with their patients,” said study co-senior author Rebecca Mishuris, chief medical information officer at MGB, a faculty member at Harvard Medical School, and a primary care physician in the healthcare system. “Our physicians tell us that they have their nights and weekends back and have rediscovered their joy of practicing medicine. There is literally no other intervention in our field that impacts burnout to this extent.”
Physician burnout affects more than 50 percent of U.S. doctors and has been linked to time spent in electronic health records, particularly after hours. There is additional evidence that the burden and anticipation of completing appointment notes also contribute significantly to physician burnout.
“Burnout adversely impacts both providers and their patients who face greater risks to their safety and access to care,” said Lisa Rotenstein, a co-senior study author and director of The Center for Physician Experience and Practice Excellence at Brigham and Women’s Hospital. She is also an assistant clinical professor of medicine at the UCSF School of Medicine. “This is an issue that hospitals nationwide are looking to tackle, and ambient documentation provides a scalable technology worth further study.”
“Our physicians tell us that they have their nights and weekends back and have rediscovered their joy of practicing medicine.”
Rebecca Mishuris, Mass General Brigham
In qualitative feedback, users said ambient documentation allowed more “contact with patients and families” and improved their “joy in practice,” and some recognized its potential to “fundamentally [change] the experience of being a physician.” However, some users felt it added time to their note-writing or had less utility for certain visit types or medical specialties. Since the pilot studies began, the AI technologies have continued to evolve as vendors make changes based on user feedback and as the large language models that power the tools improve through additional training, warranting continued study.
The researchers analyzed survey data from pilot users of ambient documentation technologies at two large health systems. At Mass General Brigham, 873 physicians and advanced practice providers were given surveys before enrolling, then after 42 and 84 days. About 30 percent of users responded to the surveys at 42 days, and 22 percent at 84 days. All 557 Emory pilot users were surveyed before the pilots and then at 60 days of use, with an 11 percent response rate. Researchers analyzed the survey results quantifying different measures of burnout at Mass General Brigham and physician well-being at Emory Healthcare.
The study authors added that given that these were pilot users and there were limited survey response rates, the findings likely represent the experience of more enthusiastic users, and more research is needed to track clinical use of ambient documentation across a broader group of providers.
Mass General Brigham’s ambient documentation program launched in July 2023 as a proof-of-concept pilot study involving 18 physicians. By July 2024, the pilot, which tested two different ambient documentation technologies, expanded to more than 800 providers. As of April 2025, the technologies have been made available to all Mass General Brigham physicians, with more than 3,000 providers routinely using the tools. Later this year, the program will look to expand to other healthcare professionals such as nurses, physical and occupational therapists, and speech-language pathologists.
“Ambient documentation technology offers a step forward in healthcare and new tools that may positively impact our clinical teams,” said Jacqueline You, lead study author and a digital clinical lead and primary care associate physician at Mass General Brigham. “While stories of providers being able to call more patients or go home and play with their kids without having to worry about notes are powerful, we feel the burnout data speak similar volumes of the promise of these technologies, and importance of continuing to study them.”
Ambient documentation’s use will continue to be studied with surveys and other measures tracking burnout rates and time spent on clinical notes inside and outside of working hours. Researchers will evaluate whether burnout rates improve over time as the AI evolves, or if these burnout gains plateau or are reversed.
This project received financial support from the Physician’s Foundation and the National Library of Medicine of the National Institutes of Health.
How to reverse nation’s declining birth rate
Health experts urge policies that buoy families: lower living costs, affordable childcare, help for older parents who want more kids
Health experts urge policies that buoy families: lower living costs, affordable childcare, help for older parents who want more kids
Alvin Powell
Harvard Staff Writer
5 min read
Financial-incentive programs for prospective parents don’t work as a way to reverse falling birth rates, Harvard health experts said on Tuesday about a policy option that has been in the news in recent months.
Instead, they said, a more effective approach would be to target issues that make parenting difficult: the high cost of living, a lack of affordable childcare, and better options for older parents who still want to see their families grow.
The discussion, held at The Studio at Harvard T.H. Chan School of Public Health, came in the wake of a July report from the Centers for Disease Control and Prevention showing that the U.S. fertility rate was down 22 percent since its last peak in 2007.
Ana Langer, professor of the practice of public health, emerita, said the causes of fertility decline are numerous, complex, and difficult to reverse.
Surveys investigating why people might not want children cite things such as the cost of living, negative medical experiences from previous pregnancies, and wariness about major global issues such as climate change. In fact, she said, many survey respondents are surprised that declining fertility is even a problem and say they’re more concerned about overpopulation and its impacts on the planet.
The landscape is complicated by the fact that U.S. society has changed significantly since the 1960s, when expectations were that virtually everyone wanted to raise a family. Today, she said, people feel free to focus on careers rather than families, and there is far greater acceptance of those who decide never to have children.
Margaret Anne McConnell, the Chan School’s Bruce A. Beal, Robert L. Beal and Alexander S. Beal Professor of Global Health Economics, said some of the factors that have contributed to the declining birth rate reflect positive cultural shifts.
Fertility rates are falling fastest, for example, in the youngest demographic, girls ages 15 to 20. Teen pregnancy has long been considered a societal ill and is associated with difficult pregnancies, poor infant health, interrupted education, and poor job prospects.
Other factors include the widespread availability of birth control, which gives women more reproductive choice, as well as the increasing share of women in higher education and the workforce.
Today people feel free to focus on careers rather than families, and there is far greater acceptance of those who decide never to have children.
Margaret Anne McConnell
McConnell said some people stop short of having the number of children they desire due to fertility issues, medical problems, and other barriers. One way to address declining fertility, she said, would be to find ways to enable those parents to have the number of children they wish.
“Any time we see people being able to make fertility choices that suit their family, I think that’s a success,” McConnell said. “I think people choosing to have children later in life is also a success. … To the extent that we can make it possible for people to reach whatever their desired family size is, I think that that would be a societal priority.”
The event, “America’s declining birth rate: A public health perspective,” brought together Langer, McConnell, and Henning Tiemeier, the Chan School’s Sumner and Esther Feldberg Professor of Maternal and Child Health.
Addressing the declining birth rate has become a focus of the current administration — President Trump has floated the idea of a $5,000 “baby bonus” and $1,000 “Trump Accounts” that were part of the “One Big Beautiful Bill” approved this summer.
Panelists at the virtual event pointed out that a declining birth rate is not just a problem in the U.S. It has been declining in many countries around the world, and for many of the same reasons. As people — particularly women — become better educated and wealthier, they tend to choose smaller families than their parents and grandparents.
Tiemeier said that changing societies and cultures have altered the very nature of relationships between men and women. He added sex education to the list of key changes that have fueled the birth-rate decline, particularly for teen pregnancies. The question of whether declining fertility is a problem is too simple for such a complex issue, he said.
In a country with a growing population where women have, on average, three children, a birth rate falling to 2½, still slightly above the replacement level of about 2.1, would be economically beneficial, ensuring enough workers to support the population as it ages.
Countries with a birth rate below 1, whose population is already contracting, risk too few workers to fuel their economy, not to mention the social and societal impacts of a lack of young people.
Tiemeier and McConnell said that other countries have tried simply paying people to have more children, and it doesn’t work. Even if the declining birth rate were considered a catastrophe, McConnell said, governments haven’t yet found levers that can bring it back up.
That doesn’t mean there aren’t things government can do to help parents navigate a difficult and expensive time in life. Programs to lower the cost of childcare have been instituted in some cities and states, and more can be done.
Tiemeier said both Republicans and Democrats are interested in supporting families, though their approaches may be different. So this may be a rare issue on which they could find common ground.
Other areas of associated need include maternal health — a significant part of the population lives in healthcare “deserts” far from medical care. Programs designed to reach those areas, as well as a national parental-leave policy, would help young families navigate that time.
“Any measure that we take will have a modest effect, because there are so many things contributing to this,” Tiemeier said. “To say that we are waiting and looking for a measure that has a big effect is an illusion. There are no big effects in this discussion.”
Dr. Robot will see you now?
Medical robotics expert says coming autonomous devices will augment skills of clinicians (not replace them), extend reach of cutting-edge procedures
Pierre E. Dupont holds a transcatheter valve repair device with a motorized catheter drive system, replacing the traditional manual handle.
Niles Singer/Harvard Staff Photographer
Alvin Powell
Harvard Staff Writer
8 min read
Medical robotics expert says coming autonomous devices will augment skills of clinicians (not replace them), extend reach of cutting-edge procedures
The robot doctor will see you now? Not for the foreseeable future, anyway.
Medical robots today are pretty dumb, typically acting as extensions of a surgeon’s hands rather than taking over for them. Pierre E. Dupont, professor of surgery at Harvard Medical School, co-authored a Viewpoint article in the journal Science Robotics last month saying that autonomous surgical robots that learn as they go are on the way.
But their likely impact will be to augment the skills of clinicians, not replace them, and to extend the reach of cutting-edge advances beyond the urban campuses of academic medical centers where they typically emerge.
In this edited conversation, Dupont, who is also chief of pediatric cardiac bioengineering at Boston Children’s Hospital, spoke with the Gazette about the areas most likely to see surgical robots operating autonomously, and some of the hurdles to their adoption.
You note that robot autonomy and learning system technologies are being used in manufacturing as well as medical settings. How does that work?
Yes, in just about every other field, robots are used as autonomous agents to replace the manpower that would be needed to perform a task. But in many surgical applications, like laparoscopy, they’re used as extensions of the clinician’s hand. They improve ergonomics for the clinician, but there’s still some question as to how much they’re improving the experience for the patient.
Outside of medicine, teleoperation, in which the operator uses a mechanical input device to directly control robot motion, is only used in remote or hostile environments like space or the ocean floor. But it’s how laparoscopic robots are controlled.
The hot extension today, which ties into hospital economics, is telesurgery, where you might have a Boston-based hospital and satellite facilities in the suburbs. Rather than the clinician being with the patient in the operating room, you would have robots at the satellite hospitals, and the clinician could stay at the main hospital and connect remotely to perform procedures. That’s trending today, but it’s not automation.
What would an automated procedure look like?
Some simpler medical procedures are already automated using non-learning methods.
In joint replacement, for example, you need to create a cavity in the bone to place an implant. Historically, the skill of the clinician determined how well the implant fit and whether the joint alignment was appropriate.
But there’s a strong parallel with machining processes, which was the impetus for creating robots to mill cavities in the bone — leading to more accurate and consistent outcomes. That’s a big market today in orthopedics.
The autonomy of the milling robot is possible because it’s a well-defined problem and easy to model. You create a 3D model of the bones and a clinician can sit at a computer interface and use software to define exactly how the implant will be aligned and how much bone will be removed. So everything can be modeled and preplanned — the robot is basically just following the plan. It’s a dumb form of automation.
“Rather than the clinician being with the patient in the operating room, you would have robots at the satellite hospitals, and the clinician could stay at the main hospital and connect remotely to perform procedures. ”
Pierre E. Dupont
That’s because of the nature of the bone and the implant. The dimensions are known. Nothing’s moving like it would if you were operating on a beating heart.
That’s right, although I think transcatheter cardiac procedures and endovascular procedures in general are actually great targets for automation.
The geometry is not as well-defined as orthopedic surgery, but it’s much simpler than in laparoscopy or any type of open surgery where you’re dealing with soft tissue.
In soft tissue surgery, you’re using forceps, scalpel, and suture to grasp, cut, and sew tissue. The clinician, through experience, has a model in their head of how hard they can squeeze the tissue without damaging it, how the tissue will deform when they pull on it and cut it, and how deeply they have to place the needle while suturing.
Those things are much harder to model with classical engineering techniques than milling bone.
How much of the progress in this area is due to the speed of technological development versus acceptance among clinicians and patients?
If you just think about robotics, the amount of acceptance is surprising. A lot of academic clinicians love to play with new toys. Many patients, perhaps incorrectly, assume that the clinician must do a better job with this incredible piece of equipment.
Hospitals want to know about costs. They don’t necessarily care if the clinician’s back is a little less sore at the end of the day because they used a robot. They want to know whether the patient had fewer complications and was discharged sooner — in other words, better care for less money. That’s the tough aspect of this: Robots cost more to make and roll out than most other medical equipment.
When you talk about the acceptance of medical robot automation, clinicians may be a little reluctant because they wonder whether they are going to lose their jobs. But it’s actually like giving them a highly effective tool that can raise their skill level.
There are a lot of clinicians who may only see a particular procedure 10 times a year. If you think of anything that’s complex in life that you only do once a month, you’re not going to do that as well and feel as confident as if you did it every day.
So, if the robot is not replacing them but acting like a highly experienced colleague, one you can communicate with and who can coach you through the procedure, explaining, “Now I’m going to do this,” or asking, “Do you think I should do it this way?” or “Should I put this device a little to the left?” then I think there’ll be acceptance. If you have a system that can bend a clinician’s learning curve down and raise their proficiency level very quickly, every clinician will want one.
How important are recent advances in large language models and other forms of AI in the discussion of autonomy?
These advances are what is going to enable progress in medical robot autonomy. We’re working on transcatheter valve repair procedures that right now are done by hand. Clinicians need to do a lot of these procedures to get good at them — and to stay that way.
We have seen in my lab that adding robotic teleoperation makes them easier. But if we can add learning-based autonomous functionality, we could make it possible for these procedures to be safely offered in low-volume facilities.
That’s important because a significant concern is that you get the best care and the newest treatments in the big urban areas that have academic medical centers. But many people don’t live in those areas and even though they could travel to get treatment, they want to get treated locally.
So, if you can enable community hospitals to offer these services, even though they’re low-volume, that’s an opportunity for a much larger fraction of the population to take advantage of the best medical care.
When we look further out, do you have any doubt that medicine will become more autonomous?
I think there’s a lot of opportunity for increasing levels of autonomy, but it has to be done gradually. You want to make sure that you’re regulating it so that patients are always safe.
There will be unanticipated events, such as unusual anatomical variations, that the system hasn’t been trained for. You need to make sure that the system will catch these problems as they come up — it needs to recognize when it’s out of its depth.
Currently, that’s a research topic in learning systems — there is technology that still needs to be developed. But the revolution over the last few years in foundation models has shown us how much is possible.
Ultimately, will there be a case where there’s no clinician involved? We don’t have to worry about that question yet.
You mentioned that these systems are expensive. Will costs come down the more they’re used?
The challenge is that medical devices are designed and approved for specific procedures. If you want to create a new medical device, you need to look at how many procedures are performed per year, and what the reimbursements are for those procedures.
For any medical device — not a robot — the smallest realistic market size is $100 million in sales per year. And if you want to raise venture capital funding, the market has to be at least a billion dollars.
Since medical robots are so expensive to develop, that means you should have a multibillion-dollar market for a medical robot. Those markets do exist: Laparoscopy and orthopedics are current examples. Endovascular procedures including heart valve repair and replacement are another that I am targeting.
An important factor for each of these three examples is that the robot is a platform. It can be used for a variety of procedures and so has a much larger addressable market than a robot that can only do one thing.
Setback in the fight against pediatric HIV
Funding cut disrupts effort to liberate Botswana patients from antiretroviral regimen
Funding cut disrupts Botswana-based effort to help patients control illness without regular treatments
Liz Mineo
Harvard Staff Writer
4 min read
Roger Shapiro.
Niles Singer/Harvard Staff Photographer
For more than 20 years, Harvard infectious disease specialist Roger Shapiro has fought HIV on the ground in Botswana, where the rate of infection exceeded 30 percent in some areas of the country in the 1990s.
Progress has been steady since then. According to the World Bank, Botswana still has one of the world’s highest rates of infection — over 20 percent of the adult population — but far fewer HIV deaths. The main lifesaver has been antiretroviral treatment (ART).
Shapiro began working in Botswana in 1999 under the mentorship of pioneering AIDS researcher Max Essex, who helped launch the Botswana Harvard Health Partnership (BHP). He has run dozens of studies on HIV/AIDS in Botswana and has become an expert in how HIV affects maternal and child health.
In 2008, pioneering AIDS researcher Professor Max Essex spoke to a group gathered at his lab in Gaborone, Botswana.
Harvard file photos
On the grounds of Princess Marina Hospital in Gaborone, a plaque recognizes the partnership between Harvard School of Public Health and the Botswana Ministry of Health.
Among Shapiro’s current studies is a trial with the potential to help some children control HIV without the need for regular treatment. Efforts to create a vaccine have so far failed, but there are exciting new developments with products known as broadly neutralizing antibodies, or bNAbs, he says.
The trial aims to find a new treatment option by examining the effects of a combination of three broadly neutralizing HIV antibodies. It builds upon previous studies suggesting that bNAbs might help the immune system clear the virus better than standard ART, and may offer a promising avenue for getting to post-treatment viral control, Shapiro says.
“It is the only study in pediatrics looking at three antibodies as combination treatment for HIV and ultimately as a path toward HIV cure,” he said. “It’s really exciting science, since we are testing whether some children can go off all treatment and control HIV on their own.”
“Botswana probably has the best program to prevent HIV transmission to children on the continent.”
Roger Shapiro
In May, the five-year grant supporting the study was slashed as part of the Trump administration’s mass cancellation of Harvard research funds. Four other grants for Botswana-based projects led by Shapiro were also canceled. The cuts have not only dealt a serious blow to the participants in the trial and their families, said Shapiro, but imperiled progress toward a cure for pediatric HIV.
“This was one of the largest funded studies to begin making inroads in this field,” he said. “Now all this science is up in the air.”
Funded by the National Institutes of Health and the National Institute of Allergy and Infectious Diseases, the trial is following 12 children, ages 2 to 9, who are living with HIV. The study is in its second year, and researchers have been gearing up to have the children pause standard ART and start using antibodies alone as treatment.
The team had planned to scale up to 41 children, but due to the cuts, they are now aiming for 30. They were able to secure donations to continue with the project until March, but it’s unclear what will happen after that.
According to the Centers for Disease Control, Botswana is a leader in global HIV efforts, having exceeded the UNAIDS 95-95-95 targets: “95 percent of people living with HIV in Botswana know their status, 98 percent of people who know their status receive treatment, and 98 percent of people on treatment are virally suppressed.”
“Botswana probably has the best program to prevent HIV transmission to children on the continent,” said Shapiro. “Now less than half a percent of the children become infected because most women access free drug treatment during pregnancy, which effectively turns off transmission. It’s a tiny percentage, but it still leads to more pediatric HIV infections than we see in the United States.”
Giving treatment to children infected with HIV every day for the rest of their lives is a daunting prospect for many families, said Shapiro. Parents were excited about the possibility that regular infusions of antibodies could liberate their children from daily treatment.
The grant’s termination is yet another blow to Botswana’s fight against HIV/AIDS. In February, assistance through three U.S. programs — USAID, the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR), and the Centers for Disease Control and Prevention — was cut. Botswana’s government pays for medication, but it relied on those funds to provide services around HIV, said Shapiro.
“HIV/AIDS is essentially a chronic problem in Botswana, and a chronic problem needs ongoing treatment,” he said. “If treatment lapses … We worry about HIV transmission going back up again, not only in Botswana but throughout all of Africa.”
Why was Pacific Northwest home to so many serial killers?
In ‘Murderland,’ alum explores lead-crime theory through lens of her own memories growing up there
Jacob Sweet
Harvard Staff Writer
5 min read
In Caroline Fraser’s 2025 book “Murderland,” the air is always thick with smog, and sinister beings lie around every corner.
Fraser, Ph.D. ’87, in her first book since “Prairie Fires,” her Pulitzer Prize-winning biography of “Little House on the Prairie” author Laura Ingalls Wilder, explores the proliferation of serial killers in the 1970s — weaving together ecological and social history, memoir, and disturbing scenes of predation and violence. The resulting narrative shifts the conventional focus on the psychology of serial killers to the environment around them. As the Pacific Northwest reels from a slew of serial murderers, Fraser turns toward the nearby smelters that shoot plumes of lead, arsenic, and cadmium into the air and the companies, government officials, and even citizens who are happy to overlook the pollution.
Of the Pacific Northwest’s most notorious killers, Fraser ties many to these smokestacks. Ted Bundy, whose crimes and background are discussed more than any other character, grew up in the shadows of the ASARCO copper smelter in Tacoma, Washington. Gary Ridgway grew up in Tacoma, too, and Charles Manson spent 10 years at a nearby prison, where lead has seeped into the soil. Richard Ramirez, known as the Night Stalker, grew up next to a different ASARCO smokestack in El Paso, Texas, long before committing murders in Los Angeles.
Fraser’s own experiences growing up in Mercer Island, Washington, add another eerie dimension. A classmate’s father blows up his home with the family inside. Another classmate becomes a serial killer. Her Christian Scientist father is menacing and abusive, and Fraser, as a child, considers ways to get rid of him, possibly by pushing him off a boat. The darkness is unrelenting; something is in the air.
Fraser leaves it to readers to decide to what extent environmental degradation directly led to the killings described in the book. “There are many things that probably contribute to somebody who commits these kinds of crimes,” she said in an interview. “I did not conceive of it as a work of criminology or an academic treatise on the lead-crime hypothesis. I really just wanted to tell a history about the history of the area — what I remember of it — and create a narrative that took all these things into account.”
“I did not conceive of it as a work of criminology or an academic treatise on the lead-crime hypothesis. I really just wanted to tell a history about the history of the area — what I remember of it — and create a narrative that took all these things into account.”
Fraser has been thinking about these ideas for decades. Before “Prairie Fires” was published, she had already written some of the memoir portions of the book, recalling the crimes and unusual occurrences near her family’s home. She was long interested in why there were so many serial killers in the Pacific Northwest and whether the answer was simply happenstance.
Though she had some knowledge of the pollution in Tacoma as a kid — the area’s smell was referred to as the “Aroma of Tacoma” due to sulfur emissions from a local factory — it wasn’t until decades later that she learned the full scope of industrial production and pollution.
Some revelations came by chance. When looking at one property on Vashon Island, across the Puget Sound from West Seattle, she came across a listing with the ominous warning — “arsenic remediation needed.”
“That just leapt out at me,” she said. “How can there be arsenic on Vashon Island?” After more research, she discovered that arsenic had come from the ASARCO smelter, on the south end of the same body of water. The damage reached much farther; the Washington State Department of Ecology says that air pollutants — mostly arsenic and lead — from the smelter settled on the surface soil of more than 1,000 square miles of the Puget Sound Basin.
“Much of Tacoma, with a population approaching 150,000, will record high lead levels in neighborhood soils,” Fraser wrote in the book, “but the Bundy family lives near a string of astonishingly high measurements of 280, 340, and 620 parts per million.”
The connection made Fraser focus more on the physical environment in which these serial killers lived and less on other factors — like a history of abuse — on which true-crime writers have historically placed greater emphasis.
In this ecological pursuit, Fraser points readers toward once-ubiquitous sources of pollution like leaded gas and the industry forces that popularized them against advice from public-health experts.
In Fraser’s telling, American physicians raise concerns that lead particulates will blanket the nation’s roads and highways, poisoning neighborhoods slowly and “insidiously.” They call it “the greatest single question in the field of public health that has ever faced the American public.” Their concerns are swept aside, however, and Frank Howard, a vice president of the Ethyl Corporation, a joint venture between General Motors and Standard Oil, calls leaded gasoline a “gift of God.”
Though Fraser doesn’t explicitly support the lead-crime hypothesis, the core of the idea — that greater exposure to lead results in higher rates of crime — remains central. In the book’s final chapter, Fraser cites the work of economist Jessica Wolpaw Reyes, Ph.D. ’02, who concluded in her dissertation that lead exposure correlates with higher adult crime rates.
Whether or not the hypothesis can ever be proven conclusively, Fraser thinks the connections between unapologetic, unfettered pollution and violent crime warrant scrutiny. In “Murderland,” she gives the idea, and an era of crime, a nimble, haunting narrative.
For some, the heart attack is just the beginning Harvard clinic uses mindfulness techniques to treat medically induced PTSD
Harvard clinic uses mindfulness techniques to treat medically induced PTSD
Heart attacks are life-changing events, but one type can be particularly distressing.
Spontaneous coronary artery dissection primarily strikes women under 50. Often, they are physically fit nonsmokers with good cholesterol and normal blood pressure — in other words, the very people who least expect a cardiac emergency. The shock of such an event may help explain why as many as 30 percent of survivors develop symptoms of medically induced post-traumatic stress disorder.
“Medically induced PTSD is basically PTSD that results from a sudden, catastrophic, life-threatening medical condition,” said Christina Luberto, a clinical health psychologist in the Department of Psychiatry at Mass General Hospital/Harvard Medical School. “It actually accounts for about 7 percent of all PTSD cases.”
Luberto is the founding director of the Mindful Living Center, a mental health service embedded with the Mass General Women’s Heart Health Program. The Mindful Living Center is one of the few programs in the country to integrate psychological services directly into cardiovascular care for women.
Christina Luberto.
Stephanie Mitchell/Harvard Staff Photographer
“We treat survivors whose primary presenting problem is the fear of recurrence,” she said. “They’re terrified by the uncertainty and possibility that it is going to happen again.”
Despite its prevalence, medically induced PTSD wasn’t formally recognized until the 1990s, when the Diagnostic and Statistical Manual of Mental Disorders expanded the definition to include trauma from medical events. It later tightened the criteria to sudden conditions, excluding chronic conditions like cancer or HIV. Research has shown that patients with medically induced PTSD tend to have worse recoveries and a higher risk of death than those without.
Medically induced PTSD symptoms mirror the symptoms of PTSD from external traumas, Luberto said: intrusive memories, hyperarousal, negative changes in mood or belief, and avoidance. But there are key differences.
“People often think of PTSD that results from external events like serving in combat. People may have flashbacks and intrusive memories. They’re thinking about what happened in the past. They might avoid things like celebrations with fireworks and loud noises, friends from that time, and they’re sort of able to do that,” she said. “With medically induced PTSD, the threat is not left in the past. You can’t escape the source of the ongoing threat, because the source of the threat is your own body.”
That reality makes survivors hyper-aware of physical sensations. Sweat or an elevated heart rate can trigger panic. Because exercise can mimic the sensations patients experienced during their heart attack, they may avoid working out — paradoxically, the very thing that could aid recovery and prevent future events. Others may skip medication, avoid medical follow-ups, or, conversely, over-engage with the healthcare system, frequently calling or messaging their providers.
“It’s a vicious cycle. What I hear is the future-oriented worry: ‘Is this going to happen again?’”
Christina Luberto
“There’s what we call cognitive reactivity in response to physical symptoms. ‘Why am I sweating? Why is my heart beating? Maybe it’s the coffee, but maybe it’s not. Should I go to the hospital?’ And then all of this thinking creates more physical symptoms of anxiety,” Luberto said. “It’s a vicious cycle. What I hear is the future-oriented worry: ‘Is this going to happen again?’”
Her research shows how the distressing thoughts can escalate. “Survivors start to believe different things about their body, and on some level, about the world. They believe, you know, ‘My body betrayed me. This is going to happen again. I’m not safe.’”
The Mindful Living Center, which opened in October 2023, employs an adapted Mindfulness-Based Cognitive Therapy method based on Luberto’s prior NIH-funded research. In online group therapy sessions, patients confront the source of their distress: their bodies.
“Mindfulness meditation brings you into the body, noticing the body without judgment, feeling sensations, noticing where the body can still feel safe or can still feel comfortable, and being able to regulate your attention to move it out of the body if the anxiety gets too much.”
The results are encouraging. Since it opened, the Mindful Living Center has received 181 referrals and treated 86 patients. Ninety percent of patients in the Mindfulness-Based Cognitive Therapy sessions reported improved emotional health, and 75 percent reported improved cardiac health.
“Stress and anxiety can have significant negative consequences for patients, from how they experience medical care to their ability to empower themselves to take steps to reduce future events,” said Amy Sarma, Cathy E. Minehan Endowed Chair in Cardiology at MGH and an assistant professor of medicine at Harvard Medical School. “However, most cardiologists do not have access to the resources to help their patients as we do at Mass General Brigham. Our partnership with Dr. Luberto in this unique program enables us to significantly advance the care of our patients.”
Nandita Scott, Ellertson Family Endowed Chair in Cardiovascular Medicine and the director of the Women’s Heart Health Program, highlighted the “exceptional support” the mindfulness program has received from the cardiology leadership at Mass General Brigham. “It’s well-established that mental health and cardiovascular outcomes are closely linked, yet few divisions would have had the vision or resources to fund such an initiative,” she said.
Luberto, who is also an executive faculty member in the MGH Health Promotion Resiliency Intervention Research Center and the MGH Benson-Henry Institute for Mind-Body Medicine, hopes to expand the Mindful Living Center’s offerings to include other research-backed methods for managing medically induced PTSD. In a recent study led by UCLA doctoral student Corinne Meinhausen, with Luberto as a co-author, researchers reviewed therapies ranging from traditional cognitive behavioral therapy to written exposure therapy, a short five-session program in which patients write detailed accounts of the traumatic event. Written exposure therapy’s lower dropout rates and strong early results make it an appealing option, especially for patients reluctant to commit to longer, more intensive therapies.
Luberto said doctors can be on the lookout for PTSD symptoms resulting from traumatic medical events. The American Heart Association recommends screening for depression; she suggests adding PTSD screening for spontaneous coronary artery dissection patients, along with a clear treatment pathway. There is little research on risk factors or prevention of medically induced PTSD, but compassionate care during hospitalization couldn’t hurt, she said.
“There are trauma-informed care principles in mental healthcare in general that include giving patients choice. Being transparent. Considering cultural and identity factors. It’s an important research question to see if that can prevent risk, but even if it can’t, it’s just good care.”
Reading like it’s 1989
Report on classroom literature shows staying power for ‘Gatsby,’ ‘Of Mice and Men,’ other classics. Time to move on?
Report on classroom literature shows staying power for ‘Gatsby,’ ‘Of Mice and Men,’ other classics. Time to move on?
Look back 40 years and you’ll see a lot of seismic change. The rise of the Internet, the smartphone revolution, and now AI everywhere. The end of the Cold War and the dawn of many messier conflicts. The overturning of paradigms of gender and sexuality, and then the backlash.
What are young people reading to help them make sense of their world? According to a recent report, pretty much the same things their parents read.
That report — compiled by researchers Kyungae Chae and Ricki Ginsberg for the National Council of Teachers of English — queried more than 4,000 public school teachers in the U.S. about what they assign students in grades six through 12.
It found little movement at the top of the English curriculum. F. Scott Fitzgerald’s “The Great Gatsby,” John Steinbeck’s “Of Mice and Men,” and a few Shakespeare tragedies occupy half of the top 10 most-assigned spots — just as they did in 1989. Even back in 1964, the top 10 was remarkably similar: If two Dickens novels have been dropped, “Hamlet” and “Macbeth” have not.
Classics are “classic” for a reason, of course. But that English-class inertia coincides with a trend that troubles educators, authors, and many parents: a long-term slide in the habit of reading among young Americans.
Some worry that — in a diverse and polarized nation — books that once felt accessible now feel remote or impenetrable, or that cultural conservatism or education bureaucracies have kept the curriculum from a healthy evolution.
With their many avid readers, Harvard’s classrooms contain almost as many views of the problem, if it is one, of curricular stagnation.
Stephanie Burt, the Donald P. and Katherine B. Loker Professor in the Department of English, made headlines last year as she launched a course on Taylor Swift. It was, in part, a self-conscious bid to use the world’s most popular songwriter as a gateway drug to Wordsworth and hermeneutics.
But Burt — also a working poet — said that her embrace of Swift is no sign that she has moved beyond, say, John Donne. To teach Shakespeare to young people, she said, is “not conservatism — it’s conservation, like the protection of old-growth forests.”
Rosette Cirillo, too, sees pedagogical value in true classics from the top of the English-language pantheon — though for a different reason.
Today, Cirillo is a lecturer and a teacher educator at the Harvard Graduate School of Education. But not so long ago, she was teaching eighth-grade English in Chelsea, Massachusetts, a largely Latin-American enclave where nearly half the students are classed as English learners.
“If I had an eighth-grader who went on to Harvard after he graduated Chelsea High, and he had never read Shakespeare, he would be at a serious disadvantage,” Cirillo said.
And, she stresses, she’s arguing less in terms of assimilation than of challenge.
“If I don’t understand ‘The Great Gatsby’ — this story of the American dream — and the idea of a masculine reinvention in order to achieve something, then I don’t understand the mythology of America enough that I could critique it, that I can say, ‘I don’t want that,’” Cirillo said. “We’re thinking about building a language and culture of power and building access for our students.”
“Better readers are better at understanding the multiple points of view that might be held about a civic or a moral issue. They’re less likely to think that if you disagree with them, it’s because you’re stupid.”
Catherine Snow
The teachers and researchers who spoke to the Gazette were divided on whether Steinbeck, Fitzgerald, and Harper Lee still deserve their ubiquity.
“To Kill a Mockingbird,” which Lee published in 1960, could be considered the foundational American text of the ‘white savior’ archetype, Burt said. And, yes, Steinbeck was a Nobel laureate in literature, but with “Of Mice and Men” — “the point is that somebody cognitively disabled is probably going to commit a murder … the high school curriculum would be better off without that,” she said.
And while Burt praised “Gatsby” as a great option for many teens, Catherine Snow was less charitable.
“I always hated that book,” she said.
Snow, a legendary literacy researcher, recently retired from the Harvard Graduate School of Education. She argued that hard evidence still shows real benefits that come from building readers.
Not only do well-read people perform better on tests of general knowledge — but as early as elementary school, Snow said, “better readers are better at understanding the multiple points of view that might be held about a civic or a moral issue. They’re less likely to think that if you disagree with them, it’s because you’re stupid … I think that’s pretty important.”
Digesting a text, analyzing tone and symbolism, understanding meaning and perspective — it’s all still useful. But, Snow said, some older books may no longer be ideal teaching tools.
“You can make all of those hoary texts relevant to students today,” Snow said. (True even of “Gatsby,” she joked: “Here’s a chance to learn about some really boring, worthless people, and how badly they’ve screwed up their lives.”)
“But,” Snow added, “an easier and perhaps more efficient approach would be to try to think about a selection of texts which are more automatically relevant that can be used to develop the same very important cognitive and linguistic and analytic skills.”
“Harry Potter” and “The Hunger Games” traffic, too, in “big, inherent, cultural themes and memes,” she said, and neither is “particularly easy reading.”
The cultural phenomena around those two series defied a decadeslong slump in pleasure reading among youth. In light of that trend, Cirillo and others see room to renovate the curriculum in the margins.
For Cirillo, stories by writers of color — from Toni Morrison to Junot Díaz — should by now be standard fare, part of a “new canon” to be read alongside the old one.
Burt’s chief concern, meanwhile, is the smartphone and its iron grip on our attention. “We’re living through a change in media that comes from a change in technology that is — unfortunately — at least half as consequential as the printing press,” Burt said. “I hate it; it makes me sad. But it’s not something we can wish away.”
Burt proposed shelving “Of Mice and Men” in favor of Frederick Douglass’ first autobiography, as “one piece of American prose literally everyone should have to read.”
Whether or not it can be neatly quantified, teachers of English still believe that there is something irreplaceable about profound immersion in the world of a book. Joining their number is M.G. Prezioso, a 2024 Ed School grad now conducting postdoctoral research on that very phenomenon.
In a recent journal article, Prezioso found a cyclical relationship between frequent reading and “story-world absorption” — a virtuous cycle of joy in reading that might lessen the need for external motivators.
And her ongoing research on grade-school students in Massachusetts and Pennsylvania has yielded early but promising correlations between that kind of absorption and reading comprehension skill of the kind measured by standardized tests.
But that doesn’t mean abandoning what is already taught, Prezioso said. “There tends to be this dichotomy, first of all, between classic, canonical books versus books that are fun, as if canonical books can’t be engaging or dramatic or enjoyable to read.”
Prezioso was reminded of that in her surveys of high school students. What did they find most engrossing? “Harry Potter,” “The Hunger Games,” Edgar Allan Poe — and “Of Mice and Men.”
Why Malcolm X matters even more 60 years after his killing
New book by Mark Whitaker examines growth of artistic, political, cultural influence of controversial Civil Rights icon
Christina Pazzanese
Harvard Staff Writer
8 min read
Malcolm X was the provocative yet charismatic face of Black Nationalism and spokesman for the Nation of Islam before he was gunned down at an event in New York City on Feb. 21, 1965, after breaking with the group.
In a new book, “The Afterlife of Malcolm X: An Outcast Turned Icon’s Enduring Impact on America” (2025), journalist Mark Whitaker ’79 explores how the controversial Civil Rights figure’s stature and cultural legacy have only grown since his death.
With dazzling verbal flair, Malcolm X advocated for Black self-determination and racial pride, stirring contemporaries such as Muhammad Ali, John Coltrane, Maya Angelou, and the founders of the Black Panther Party, and helping spur the Black Arts Movement and the experimental genre known as “Free Jazz.”
Whitaker notes that even decades later Malcolm X’s words and ideas have continued to influence new generations of artists and activists, including NBA Hall of Famer Kareem Abdul-Jabbar, playwright August Wilson, filmmaker Spike Lee, pop star Beyoncé, and rappers Tupac Shakur and Kendrick Lamar, among others.
Whitaker recently spoke with the Gazette about why Malcolm X continues to shape American culture. The conversation has been edited for clarity and length.
You say Malcolm X’s cultural influence is even greater than when he was alive. Why is that?
You have to start with “The Autobiography of Malcolm X” [co-authored by Alex Haley]. Many more people, even in the ’60s but certainly subsequently, have gotten to know him through “The Autobiography” than anything else. It’s an extraordinary book. There’s a reason why it’s one of the most read and influential books of the last half century. There are few books by public figures of his stature where you experience this extraordinary personal journey he underwent, from losing his parents at a young age to becoming a street hustler and going to prison, and then turning his life around through the Nation of Islam, becoming a national figure, but then becoming disenchanted with the Nation and with Elijah Muhammad, going out on his own, making a pilgrimage to Mecca, traveling the world, reassessing all of his thoughts and beliefs about white people and separatism and so forth. So that’s extraordinary.
“One of the things that’s interesting is he keeps getting rediscovered generation after generation by young people.”
One of the things that’s interesting is he keeps getting rediscovered generation after generation by young people. I think he spoke to young people for a variety of reasons. One is the reality of race that he described was closer to what they were witnessing than the “I Have a Dream” speech.
There was a hard-headed realism about his analysis of race relations that spoke to young people. Even before you get to politics, his emphasis was on psychology, on pride, and on self-belief and on culture. The belief that Black folks had to start with celebrating themselves and their own culture and their own history — that was extremely appealing to subsequent generations.
I also think there was just something about the way he communicated. There’s a reason that the pioneers of hip-hop thought that you could take snippets of his speeches and put them in the middle of raps, and it would still sound like it belonged. There was something incredibly direct and pithy and honest about the way he communicated.
You put those elements together — his hard-headed analysis, his emphasis on culture and self-belief and pride, and his extraordinary communication — generation after generation of people rediscover that and feel that all of those things are still very powerful.
So many important Black artists, writers, musicians, and activists of that period had either a personal relationship with Malcolm X or said they had an epiphany of sorts after listening to him speak. Why do you think that was?
Part of it was that he did believe, very strongly, that politics is downstream from culture. That was something that he very much believed and preached.
It was interesting because his parents were Black nationalists of the Marcus Garvey generation. And so followers of Marcus Garvey of their generation basically said, “Things are so bad for Black people in America that they have to go someplace else, whether it be someplace in Africa or the Caribbean.” There was this idea of a Black homeland, someplace else that everybody would get on ships and go to.
“In his view, the way Black folks should practice nationalism is by staying in America but demanding their own culture, which began with studying their own history.”
Malcolm explicitly said, “We are a nation, but we belong here.” In his view, the way Black folks should practice nationalism is by staying in America but demanding their own culture, which began with studying their own history. In his separatist era, it was literally we have our own networks of support. He was a big believer in Black business by and for Black people. That was a cultural project as much as a political project.
He lived in an era when a lot of Black culture, even though it was separate from white culture, sought to emulate white culture. A lot of the societies and the rituals were Black versions of white rituals. And he said, “That’s a form of brainwashing. We shouldn’t seek to be like white people. We should have our own culture.”
So, starting with the Black Arts Movement and the “Free Jazz” movement in the ’60s, and then later, the hip-hop generation and today’s artists like Kendrick Lamar, Beyoncé, all the great artists who still invoke him, that’s the message they’re picking up on as much as his political message.
There’s also something just so supremely confident about him that people relate to. He was unapologetically who he was. He’s preaching Black pride and so forth with such supreme elegance and confidence and humor. That’s always appealing.
One chapter looks at Malcolm X as a hero to the political left and right. President Barack Obama has talked about how influential the autobiography was on him as a teenager, and Supreme Court Justice Clarence Thomas has also spoken about his attraction to Malcolm X and his message of self-determination when he was in college. Few political or cultural figures today have that kind of appeal. What do you attribute that to?
There are people on the left who revere Malcolm X who were appalled that Clarence Thomas would say he’s also a hero to him, and feel like Clarence Thomas just cherry-picked the parts of his message that are convenient to him — the emphasis on Black business, the skepticism about integration and so forth. I spent a lot of time researching that chapter and talking not to Thomas himself, but to his clerks and people who had written about his interest in Malcolm X, and I think it was sincere.
Malcolm X was a truth teller. I don’t think he was interested in being a hero to white people. He would go around saying things like, “I prefer the white racist who at least has his cards on the table to the white liberal who can’t be trusted.” And as we see today, people embrace people who attack the people who they oppose.
“Malcolm X came to Harvard in 1961 and then twice in 1964 to talk with Harvard Law School students and to debate faculty. He was known for his willingness to speak in all sorts of settings, whether a college campus, a street corner, or a TV talk show.”
Would Malcolm X be surprised to find that he’s still so influential?
It’s a tricky thing for biographers to say what he would have thought. It’s presumptuous, but one of the things that is clear is that people who were followers of his at the time said his message and his influence would outlive him. Actor Ossie Davis said that in his eulogy. He said, “What we put in the ground now is only a seed which will rise up to meet us.”
When sociologist Harry Edwards was organizing a Malcolm X day at San Jose State — this was a year after King’s assassination — people said, “Why all this fuss about Malcolm X and not about King?” And Edwards said the thing about Malcolm X is it’s not so much what he did during his lifetime, it’s what he inspired in others, which will continue. There’s something about Malcolm that is still alive in the influence that he’s having on all these other people.
Brain implants that don’t leave scarsHarvard startup is developing a softer device to monitor head injuries
Harvard startup is developing a softer device to monitor head injuries
Traumatic brain injuries vary in severity from mild to life-threatening, but neurologists have limited tools to assess the damage. While examinations and external imaging can help, neural probes — devices that create brain-computer interfaces — are even better. The problem? They are made of rigid materials that scar the brain.
Axoft, a startup launched out of Harvard in 2021, is developing a softer alternative, one the company’s researchers say could be inserted into the brain without disturbing its gel-like consistency but is durable enough to deliver accurate neural data.
“With a brain-computer interface, we can determine very precisely what’s happening in the brains of the patients — if they are conscious, if they are not conscious, if they are vegetative, if they are recovering, or if their state is degrading,” said Paul Le Floch, co-founder and CEO of Axoft, who received his Ph.D. in materials science from Harvard.
Clinicians have used neural probes for decades. When inserted into the brain, they measure electrical activity with much more accuracy than external neural imaging. But traditionally, neural probes have been made of rigid materials, which damage the surrounding, highly flexible brain tissue — like razor blades in gel, said Le Floch. Damage to the brain makes neural probes less effective, because the brain responds by surrounding them with scar tissue. Encapsulated in that more rigid tissue, the probes cannot communicate as readily with the neurons around them. Plus, rigid devices can only stay implanted for a short time before they significantly scar the brain. As more sensors are added to a neural probe — an essential element of gathering as much brain activity data as possible — the probes become even more rigid.
Traditionally, neural probes have been made of rigid materials, which damage the surrounding, highly flexible brain tissue — like razor blades in gel.
Le Floch and his collaborators understood that they needed a softer alternative to existing neural probes. “The problem is: Soft materials are not very high-performance,” he said.
During a Ph.D. focused on materials science and polymers at Harvard, Le Floch began working as a graduate student in the lab of Jia Liu, an assistant professor of bioengineering at the John A. Paulson School of Engineering and Applied Sciences. Le Floch and Liu focused on an intractable problem: engineering neural probes that worked better for the brain.
Le Floch and Liu collaborated with Tianyang Ye, Ph.D. ’20, a graduate student and then postdoctoral scholar at Harvard specializing in nanoelectronics, as well as a fellow at the Office of Technology Development (OTD), where he worked on commercialization strategies for academic innovations. Ye is now Axoft’s chief technology officer, as well as a co-founder. While Le Floch engineered a higher-performance, soft material that could be inserted into the brain without harming it, Ye designed the electronics that could transmit the data for analysis.
The resulting neural probe is “very biocompatible, because it’s so small, but also very soft,” said Le Floch. “It creates less damage within tissues over time.”
Paul Le Floch (left) and Jia Liu.
Axoft’s novel material, Fleuron, is thousands to millions of times softer and more flexible than the materials used in modern neural probes. At the same time, Fleuron is a photoresist, meaning it can be patterned using standard chip-fabrication processes. As a result, the probe can easily fit more than 1,000 sensors, delivering precise brain-signal data to clinicians.
“In the last few decades, we’ve gone from measuring one neuron, to 10 neurons, to hundreds of neurons — now we’re getting into thousands,” said Le Floch. Those greater multitudes allow researchers to “learn more about the brain and develop new diagnoses and therapies.”
Axoft is working to double the number of electrodes its probe can host every year as it continues to develop the technology. “This will significantly increase the number of neurons Axoft’s probes can measure and stimulate,” said Liu, who helped co-found the company and joined as a scientific adviser.
Brain implants are not necessary for every patient with neurological damage, Le Floch says, but the company has already experienced significant interest from neurologists who have struggled to measure the brain activity of unresponsive patients with acute and traumatic brain injuries.
“We see a big need from a patient perspective.”
Paul Le Floch, Axoft CEO
The impact of the startup’s work was clear from the beginning, according to Christopher Petty, OTD director of business development in physical sciences. “From our point of view, we’re always talking about this mission of taking academically generated knowledge and making a difference in the world with it. This is that in spades,” he said. “That’s the point of everything we’re doing.”
OTD safeguarded the intellectual property of the core discoveries, connected Axoft’s team with potential investors and structured the startup’s license to further develop the technology, while helping its founding researchers think about real-world applications and the journey from testing to commercialization.
Helping a medical device startup flourish, says Petty, differs from the process for a software startup. The clinical trials necessary for approval can take a significant amount of time and cost a lot of money, but there’s also a much more clearly defined path to market. “There’s a clear set of milestones,” Petty said.
Since its founding, Axoft has been working to clear those milestones. The company has raised more than $18 million in funding thus far. In 2025, it completed its first human trial at the Panama Clinic in Panama, which demonstrated that the implants were safe to insert and remove and didn’t create additional risks for the brain in the process. The team also determined that the probe could distinguish whether patients were conscious or unconscious (due to anesthesia), the latter of which mimics a coma-like state. Within a few minutes, the team was able to measure brain states in the way a functional MRI might over several hours.
Now, in order to generate more preclinical data, Axoft is working with clinicians at Massachusetts General Hospital on porcine models of traumatic brain injury. Le Floch expects Axoft will be able to begin another in-human study with the hospital in the next year.
In 2027, Axoft is targeting an FDA-managed clinical trial focused on individuals with traumatic brain injuries, in whom the device can measure recovery and consciousness. If all goes well, the devices could be available to physicians by 2028. Le Floch believes the implants could quickly scale to hundreds of patients.
“We see a big need from a patient perspective, and there is already an ecosystem in hospitals for using neuromonitoring devices,” he said.
This research received federal funding from the National Science Foundation.
In touch with our emotions, finally
Insights at intersection of gender, anger, and risk are just one example of shift in science of decision making
Sy Boles
Harvard Staff Writer
5 min read
Jennifer Lerner is the Thornton F. Bradshaw Professor of Public Policy, Decision Science, and Management at Harvard Kennedy School.
Niles Singer/Harvard Staff Photographer
A series exploring how risk shapes our decisions.
Letting raw emotion drive financial decisions sounds like a recipe for disaster. But Jennifer Lerner, the Thornton F. Bradshaw Professor of Public Policy, Decision Science, and Management at the Kennedy School, found that anger turned out well, at least for men, in a computerized gambling game.
Lerner and National Institutes of Health scientist Rebecca Ferrer (a former student) co-led a set of experiments using the Balloon Analog Risk Task, in which participants earn more money each time they add air to a virtual balloon, but lose it all if they go too far and burst the balloon.
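For readers unfamiliar with the paradigm, the incentive structure of a task like this can be sketched in a few lines of Python. This is a toy simulation only: the per-pump payout, the uniform burst distribution, and the trial counts are illustrative assumptions, not the parameters used in the Lerner and Ferrer experiments.

```python
import random

def bart_trial(pumps_chosen, payout_per_pump=0.05, max_capacity=64, rng=random):
    """Simulate one Balloon Analog Risk Task trial (toy version).

    The balloon's burst point is drawn uniformly from 1..max_capacity,
    an illustrative assumption. Each successful pump adds payout_per_pump
    to a temporary bank; pumping past the burst point loses it all.
    """
    burst_point = rng.randint(1, max_capacity)
    if pumps_chosen >= burst_point:
        return 0.0                          # balloon burst: trial earnings lost
    return pumps_chosen * payout_per_pump   # banked earnings for this trial

def average_earnings(pumps_chosen, n_trials=1000, seed=0):
    """Average earnings across many trials for a fixed pumping strategy."""
    rng = random.Random(seed)
    total = sum(bart_trial(pumps_chosen, rng=rng) for _ in range(n_trials))
    return total / n_trials

if __name__ == "__main__":
    # Up to a point, bolder strategies earn more on average in this toy setup,
    # which is the sense in which such a task "rewards risk-taking."
    for pumps in (5, 15, 30, 45):
        print(f"{pumps:2d} pumps -> avg ${average_earnings(pumps):.2f} per trial")
```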
When males were primed for anger, they took bigger risks and walked away with fatter wallets than did neutral-emotion males or angry females. Unlike previous studies that demonstrated causal effects of anger on lowering risk perceptions and reducing the likelihood of taking protective actions among males and females, these experiments focused on actual risk-taking behavior — revealing that anger drove bolder bets, primarily among men.
Correlating gender and emotion is always slippery, Lerner noted. After all, you can’t randomly assign adults to the categories of male or female, never mind tease out whether differences are due to biology, socialization, culture, or something else entirely. But the findings raise interesting questions about how gender, emotions, and risk intersect in high-stakes environments like entrepreneurship or the stock market, she said.
“There’s a large and interesting debate about whether males are more risk-taking in general.”
“There’s a large and interesting debate about whether males are more risk-taking in general, which our work only partially addresses,” Lerner said. “It looks only at the role of anger in financial risk-taking and the gender differences there.”
Wanting the results to be understood in broader context, she stressed two points.
“Whether risk-taking turns out to be good or bad depends entirely on the situation,” Lerner said. “We designed our studies to reward risk-taking, but there are many real-world situations where caution would be a better strategy.
“Also, while males and females may differ on average in how anger influences their financial risk-taking, across most decisions there’s more variation within each gender than between genders. So, knowing someone’s gender will tell you less about their decision-making than will understanding their individual traits or social/cultural context.”
Emotion has been massively understudied as a factor in decision-making, according to Lerner. “Even though,” she said, “if you ask people on the street, ‘What’s important to understand in decision-making,’ they often say ‘emotion.’”
Her work has played a role in a recent shift: Studies examining how emotion affects decision-making have increased dramatically, and the field now recognizes that emotion can be adaptive or maladaptive. More generally, emotion now appears prominently in emerging models of brain, mind, and behavior.
Science is catching up to the widespread use of emotion to shape behavior in marketing campaigns.
In other words, science is catching up to the widespread use of emotion to shape behavior in marketing campaigns. In student-led studies, Lerner’s lab has examined emotionally evocative public health campaigns designed to communicate the risks associated with tobacco use. One study, led by Charlie Dorison, Ph.D. ’20, found that inducing certain kinds of sadness can backfire, inadvertently increasing smoking. Another, led by Ke Wang, Ph.D. ’24, found that gratitude can play a powerful role in encouraging smoking cessation. Both studies come from a broader stream of work in Lerner’s lab examining ways in which emotion influences appetitive risk behaviors (e.g., smoking, vaping, gambling).
In Lerner’s own life, being well-informed about risk gave her something very valuable: her daughter. As a child, Lerner was diagnosed with lupus. Among other effects of the autoimmune disease, she was told she should never have biological children: The risk of miscarriage was high, and her health could be in serious jeopardy. She considered adoption, but soon learned that having lupus would significantly lower the odds that she would ever be selected as an adoptive parent.
Lerner could have let generalized fear guide her decision about having a child. Instead, she and her husband dug into the medical research.
Their plan was to examine feelings shaped by doctors’ warnings and common beliefs before taking a hard look at the actual risks. Was their fear incidental or integral? How much uncertainty could they comfortably tolerate?
“We just analyzed everything, all the scientific studies we could find,” Lerner said. “And we decided, given where my health was at the time and the medications I was on, we could accept the risks.” Two months ago, that baby — now grown up — graduated from college.
Professionally, Lerner studies risk across a variety of domains, including health, economics, national security, and (most recently) climate change. She serves on several councils and boards, including the board of the Forecasting Research Institute, a nonprofit attempting to develop methodologies for quantifying the risk of existential threats (e.g., AI takeover). She believes decision-making skills are essential life skills for everyone, and should be taught at a young age. For that reason, she also volunteers as an ambassador for the Alliance for Decision Education, a nonprofit providing free access to decision-making curricula for K-12 schools. Much of her own time is spent with professionals from around the world who work in decision-intensive roles — “from financial analysts to firefighters,” as she puts it — in an executive education course she teaches.
“In today’s world, the ability to leverage information effectively is a crucial skill,” she said. “That means being clear on how to estimate uncertainty, how to judge your confidence in those estimates, and how to recognize the myriad — helpful or unhelpful — ways emotion may shape judgments. These aren’t just tools for leaders or analysts — they’re for all of us.”
Researchers uncover surprising limit on human imagination
Humans can track a handful of objects visually, but their imaginations can only handle one
Christy DeSmith
Harvard Staff Writer
4 min read
Human beings can juggle up to 10 balls at once. But how many can they move through the air with their imaginations?
The answer, published last month in Nature Communications, astonished even the researchers pursuing the question. The cognitive psychologists found people could easily imagine the trajectory of a single ball after it disappeared. But the imagination couldn’t simultaneously keep tabs on two moving balls that fell from view.
“We set out to test the capacity limits of the imagination, and we found that it was one,” said co-author Tomer D. Ullman, associate professor in the Department of Psychology. “I found this surprising, so I can understand if others do, too.”
Ullman, who heads Harvard’s Computation, Cognition, and Development lab, has a long-time interest in what is known as intuitive physics. Think of the brain conjuring a ball as it rolls downhill, or sounding the alarm over two objects on a sure-fire collision course.
“How do we interact with the physical world around us?” wondered Ullman, who is also affiliated with the Kempner Institute for the Study of Natural and Artificial Intelligence. “I subscribe to the theory that the brain may be running mental simulations, kind of like a video game.”
These couldn’t be perfect simulations of physical environments, right down to the level of atoms and molecules. So Ullman’s lab has worked to understand what kinds of hacks and workarounds make mental simulations possible.
“The human imagination is just really cool, and we find a lot of people are quite interested in how it works,” he offered.
A sizable body of research has explored the capacity limits of human perception, or how many objects the brain can track in a visual scene. “Maybe you’re a parent watching multiple kids, or maybe you’re a lifeguard on duty,” Ullman said. “Obviously you can’t keep track of everything.”
Neuroscientists, psychologists, and computational modelers have found visual tracking is limited to just a handful of moving objects. But few have explored the imagination’s capacity limits.
In the new study, online participants were shown an animation of a bouncing ball, as if on a racquetball court, before it vanished. Others saw two balls ricocheting at completely different cadences before both disappeared. Designing the experiments with Ullman was lead author Halely Balaban, an assistant professor of cognitive psychology at the Open University of Israel.
The researchers also devised two computational models to explain how the imagination might follow these invisible balls to their moment of impact. The first model posited that multiple objects are moved in parallel; the second envisioned moving each ball independently, in more of a serial fashion.
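The study’s actual models are not reproduced here, but the basic contrast can be sketched roughly in Python: a parallel simulator advances every hidden ball on each time step, while a serial one rolls each ball forward to completion, one at a time. The simple physics, the step counts, and the starting positions below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Ball:
    height: float    # position above the floor (arbitrary units)
    velocity: float  # change in height per step; negative means falling

GRAVITY = -0.5       # illustrative per-step acceleration

def step(ball: Ball) -> Ball:
    """Advance one ball by a single simulated time step."""
    v = ball.velocity + GRAVITY
    return Ball(height=ball.height + v, velocity=v)

def parallel_rollout(balls, n_steps):
    """Parallel model: every hidden ball is advanced on every step."""
    states = list(balls)
    for _ in range(n_steps):
        states = [step(b) for b in states]   # all balls updated together
    return states

def serial_rollout(balls, n_steps):
    """Serial model: each ball is rolled forward to completion, in turn."""
    finished = []
    for b in balls:                          # one ball at a time
        for _ in range(n_steps):
            b = step(b)
        finished.append(b)
    return finished

if __name__ == "__main__":
    hidden = [Ball(height=10.0, velocity=0.0), Ball(height=6.0, velocity=1.0)]
    print(parallel_rollout(hidden, n_steps=4))
    print(serial_rollout(hidden, n_steps=4))
```

Because the balls do not interact, both schedules reach the same end state in this toy version; the question the researchers asked is which schedule the imagination actually uses, and what each one costs.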
Ullman and Balaban found their online recruits were pretty good at predicting when a single invisible ball would have hit the ground. But people fumbled at tracking two.
“It was harder than any of us expected,” said Ullman, noting how reliably the exercise produced laughs.
Based on past findings, the co-authors originally thought the imagination could probably track about three or four objects. There were also intuitive reasons to think the mind’s eye could move multiple objects in parallel.
“If I close my eyes right now, I can see a tower of blocks falling down,” Ullman noted. “It doesn’t feel limited. People feel like they should be able to move more than one.”
In fact, a follow-up experiment found people were slightly better at tracking two balls that moved in tandem before disappearing. But performances still paled next to yet another follow-up, in which study participants tracked two balls that remained visible until impact.
When it comes to tracking objects that have disappeared, the researchers found, the human imagination relies largely on a serial model, moving each piece one after the other.
A separate follow-up tested whether people might be conserving mental energy by employing a serial model. After all, running a simulation via the parallel model would require more effort. Imagine a computer running multiple simulations at once.
“We offered participants a bunch of money if they could get this right,” Ullman explained. “But that didn’t seem to matter.”
For Ullman, the findings open an exciting frontier. “There has been decades and decades of work on how the mind uses clever tricks to keep track of what’s in front of you,” he said. “But there’s been so little on the tricks and limitations of the mind’s eye. I could imagine a lot more work to do here.”
Keeping kids safe in extreme heat
Experts outline threats to childhood development, school challenges, play-time risks
Anna Lamb
Harvard Staff Writer
4 min read
With heat waves becoming more intense and frequent across the U.S., experts gathered for a Harvard webinar on how to protect children’s health amid soaring temperatures.
“Extreme heat is really one of the most dangerous but also one of the least recognized threats to healthy development,” said Lindsey Burghardt, chief science officer at Harvard’s Center on the Developing Child, which hosted the talk.
According to Burghardt, extreme heat has been linked to premature birth, low birth weight, disruptions in sleep and learning, and negative effects on mental health.
Outdoor playgrounds can turn into miniature heat islands — areas that increase dangerous heat even more.
Jennifer Vanos
“These outcomes are really important for us to understand,” she said. “Because they have immediate effects in childhood, but they also have the ability to have effects and impacts across children’s lifetimes. This makes intervention just so important.”
The Environmental Protection Agency defines extreme heat days as those in which outside temperatures exceed 95 degrees Fahrenheit. The definition also encompasses periods in which temperatures fail to drop, even at night. According to EPA statistics, these types of heat waves are becoming more frequent and severe.
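As a rough illustration of that threshold, the snippet below flags days whose highs top 95 degrees Fahrenheit. The sample readings and the simple cutoff check are assumptions for demonstration, not EPA methodology.

```python
# Flag "extreme heat" days from a list of (date, daily_high_F) pairs.
# The 95 F cutoff follows the definition cited above; the sample data
# are made up for demonstration.
THRESHOLD_F = 95.0

daily_highs = [
    ("2024-07-01", 93.1),
    ("2024-07-02", 96.4),
    ("2024-07-03", 98.0),
    ("2024-07-04", 97.2),
    ("2024-07-05", 91.5),
]

extreme_days = [date for date, high in daily_highs if high > THRESHOLD_F]
print("Extreme heat days:", extreme_days)
# -> Extreme heat days: ['2024-07-02', '2024-07-03', '2024-07-04']
```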
Joining Burghardt in the talk was Michelle Kang, chief executive officer for the National Association for the Education of Young Children. She said her team has worked with Harvard’s Graduate School of Education to analyze how the changing climate affects learning and childcare practices across the country.
“Ideally, you’re able to take children out at times to have that gross motor play — that important time where they’re able to get those zoomies out,” she said. “But if you don’t have adequate shade, and it’s hot and getting hotter, then you actually can’t take your children outside. It changes what the learning environment looks like for the day to day and creates more stress on educators to ensure that they have what they need to keep their children safe.”
According to listening sessions with members of her organization, Kang said educators across the country are dealing with vastly different resources to keep kids cool during increasingly hot spring semesters and summer sessions. Many school buildings lack adequate air conditioning, for example, or fail to insulate against the extreme heat.
Another speaker, Jennifer Vanos, associate professor in the School of Sustainability and the College of Global Futures at Arizona State University, added that many schools also lack sufficient indoor play space.
Outdoor playgrounds, said Vanos, can turn into miniature heat islands — areas that increase dangerous heat even more.
“It really comes down to an individual school, and what their environment is like. What are their indoor conditions like? What are their outdoor conditions like? Some schools have really great shaded designs that are still OK to be playing in under slightly hotter weather,” she said. “Some schools can handle it better than others.”
During times of extreme heat, she said, it’s important for parents and educators to monitor play time.
“Kids want to keep playing,” she said. “What can happen if we don’t stop that soon enough is you’ll start to see the rise in the heart rate, because they’re trying to pump blood to the skin to lose heat from the body, and then you’ll also start to see a rise in sweat rate.”
Because kids have fewer sweat glands than adults, they aren’t able to release heat at the same rate, Vanos said. At the point when they’re getting sweaty, their core temperature is dangerously rising.
“If you see that rise above 104 degrees Fahrenheit or so, that’s when you can get into this very high risk of heat stroke occurring, or, if you’re playing, it’s more exertional heat stroke or a heat exhaustion,” she said.
In the direst cases of extreme heat exposure, the body can experience multiple organ failure and need hospitalization. And while it happens quickly, Vanos said, there are signs that indicate it’s time to cool off.
“One child is very different from another child, and we have to know which kids have potential pre-existing factors to account for certain medications, certain illnesses that they might have that make them higher risk than average,” she said. “We have to figure out the intervention points there — what are they, and how can we keep kids safe.”
Possible clue into movement disorders like Parkinson’s, others
Rodent study suggests different signaling ‘languages’ in parts of brain for learned skills, natural behaviors
Kiah Hardcastle.
Stephanie Mitchell/Harvard Staff Photographer
Kermit Pattison
Harvard Staff Writer
4 min read
Among the many wonders of the brain is its ability to master movements through practice — a dance step, piano sonata, or tying our shoes.
For decades, neuroscientists have known that these tasks require a cluster of brain areas known as the basal ganglia.
According to a new study in Nature Neuroscience led by Harvard researchers, this so-called “learning machine” speaks in two different codes — one for recently acquired learned movements and another for innate “natural” behaviors.
These surprising findings with lab animals may shed light on human movement disorders such as Parkinson’s disease.
“When we compared the codes across these two behavioral domains, we found that they were very different,” said Bence Ölveczky, professor of organismic and evolutionary biology (OEB). “They had nothing to do with each other. They were both faithfully reflecting the animal’s movements, but the language was profoundly different.”
“When we compared the codes across these two behavioral domains, we found that they were very different.”
Bence P. Ölveczky
Located in the midbrain below the cerebral cortex, the basal ganglia are involved in reward, emotion, and motor control. This region also is the site of some of our most concerning movement disorders: Huntington’s disease, Tourette’s syndrome, and Parkinson’s all arise from different defects of the basal ganglia.
Although it has long been known that the basal ganglia play a central role in motor control among mammals, it remains unclear whether this part of the brain directs all movements or just those for specialized tasks.
Some researchers posit that the basal ganglia act as a learning locus for movements acquired through practice, but not for other routine behaviors. Others argue that they play a role in all movements.
To shed light on this mystery, the researchers scrutinized one particular part of the basal ganglia in rats — the dorsolateral striatum (DLS), which plays a role in learned behaviors.
The team studied rats during two different activities: free exploration and a learned task in which they were trained to press a lever twice within a specific time interval to obtain a reward. To track their movements, the team used a system of six cameras around the enclosure plus a software system that categorized behaviors.
In earlier studies, the team removed the DLS of rats, who afterward showed no differences in free exploration, demonstrating that it played no role in natural behaviors such as walking or grooming.
But the same animals were profoundly impaired when performing learned tasks, revealing that the DLS was essential for the newly acquired skills.
“There was a massive change, like night and day,” said Kiah Hardcastle, a postdoctoral fellow in the Ölveczky lab and lead author of the new study. “The animal could do a task super well, performing a stereotyped movement repeatedly, like 30,000 times. Then you lesion the DLS, and they never do that movement again.”
In the new study, the investigators sought to understand the neural activity during these behaviors, implanting tiny electrodes into the brains of rats and recording the electrical firing of neurons as they engaged in free exploration and the learned task.
To their surprise, they discovered the basal ganglia used two distinct “kinematic codes” — or patterns of neuronal electrical activity — during the learned task and natural movements.
“It’s as if the basal ganglia ‘speak’ different languages when the animal performs learned versus innate movements,” said Ölveczky. “Brain areas downstream that control movement only know one of these languages — the one spoken during learned behaviors.”
“It’s as if the basal ganglia ‘speak’ different languages when the animal performs learned versus innate movements.”
Bence P. Ölveczky
The researchers concluded in the paper that the basal ganglia switch back and forth “between being an essential actor and a mere observer.”
Hardcastle speculated that the basal ganglia may be unable to completely turn off electrical signaling when not directing behavior, so it shifts to a harmless “null code.”
Ölveczky said the findings may well be informative about humans because the structures below the cerebral cortex are believed to have remained largely conserved through evolutionary time. He believes the study demonstrates that the basal ganglia play essential roles in learned movements — but not necessarily in routine motor control.
He also thinks the findings offer hints about what may go wrong in some human movement disorders.
“Our research suggests that the pathology associated with Parkinson’s can be understood as the diseased basal ganglia speaking gibberish, but in a very loud and forceful way,” said Ölveczky. “Thus, it inserts itself, in a nonsensical way, into behaviors it would otherwise not control.”
Federal funding for the research was provided by the National Institutes of Health.
‘Turning information into something physical’
Houghton exhibit looks at how punched cards — invented 300 years ago to streamline weaving — led to modern computing
The punched card, a paper instrument invented 300 years ago to automate looms, helped create a technology that most of us today can’t live without: computers.
A new Houghton Library exhibition — “The Punched Card from the Industrial Revolution to the Information Age” — on view in the library’s lobby through the end of the summer, traces the technology’s history through three works: a book from 1886 woven entirely with a punched card loom; the writing of mathematician Ada Lovelace on the punched card’s computer capabilities; and a 1940s manual on using a punched card computer.
“Computers now permeate almost every aspect of our society,” said the exhibition’s curator, John Overholt. “It’s interesting to learn more about the roots of things that feel very commonplace and widespread these days … to learn how those things evolved over time can provide new insights.”
Punched cards, or punch cards as they are often called, are thought to have originated in 1725, when French silk weaver Basile Bouchon invented the use of a paper tape with punched holes to automate the work of a loom. But perhaps the best-known early example comes from French inventor Joseph Marie Jacquard, who in the early 19th century used a series of punched cards to create intricate brocade patterns. Each card had holes threaded to create a single row of design.
Historians think the first time the technology was used for data collection and analysis was in the late 1880s, when American engineer Herman Hollerith created punched cards for gathering statistical information for the U.S. Census.
“That’s the thing computer historians are most likely to fight about — what the cutoff is,” said Marc Aidinoff, who teaches the history of technology at Harvard. “You get some people who say, ‘Well actually, programming a loom is not that different from computing. It’s putting in directions.’”
Aidinoff added that there is one thing that all tech historians can agree on: “There is no computing without punch cards. When you think of what a semiconductor is doing, it’s really a very similar system to a punch card, just at a vastly more complex scale.”
The earliest use of punched cards to process Census data drastically sped up the time to count results, marking a milestone on the path to modern computing. Hollerith’s company — which started as the Tabulating Machine Company based out of Washington, D.C. — would go on to become computer giant IBM.
At Harvard, graduate student Howard Aiken designed the Mark I in 1937 — a first-of-its-kind computer able to make a wide array of calculations using punched paper tape.
Aiken partnered with IBM engineers to develop the machine, and after five years Mark I was delivered to Harvard, where it was operated by the U.S. Navy Bureau of Ships for military purposes over the next decade.
Punched card computing continued throughout the next several decades — improving alongside evolving microprocessing and memory capabilities.
Overholt, curator of Early Books and Manuscripts at Houghton, remembers the discarded punched cards his mom would bring home from her job at IBM throughout the 1960s.
Exhibit curator John Overholt used to play with the discarded punched cards his mom would bring home from her job at IBM throughout the 1960s.
Photo courtesy of John Overholt
“She would bring home punch cards that had been used to program computers for us to play with and build little card houses out of,” he said.
Today, Harvard is home to supercomputers that make punch card computers look like an abacus. But at Houghton, you can see the seeds of innovation that started it all.
“Present-day computer technology has moved in new directions but encoding in ones and zeros and bits and bytes is still pretty fundamental to the way computers work,” Overholt said. “It’s hard for me to put myself in the position of what somebody 300 years ago would have imagined about computers, but I’m sure it was clear right away that it was a very powerful tool for turning information into something physical.”
Carving a place in outer space for the humanities
The cosmos ‘is as weird and astonishing as any great work of art,’ argues Jennifer Roberts, and navigating it requires ‘a new kind of ethics’
Jennifer Roberts is an art historian whose work orbits an unexpected subject: outer space. Fascinated by images that are created as a way of understanding the unknown, she builds alliances between scientists and humanists — work she finds even more urgent as we enter an age of commercial space travel.
“Astronomers and art scholars should be working together whenever we can,” said Roberts, X.D. and Nancy Yang Professor of Arts and Sciences and Drew Gilpin Faust Professor of the Humanities. “We both know that images are not just illustrations; they are tools for understanding and interpretation, and they have a powerful role in shaping what humanity will do with the revelations about the universe that science is delivering.”
Roberts will publish a study later this year on the first image transmitted from Mars, paradoxically drawn in pastel on paper. In 1965, the 21 images captured by the Mariner 4 probe in its flyby of Mars were being transmitted too slowly for scientists at the Pasadena Jet Propulsion Lab: Each took eight hours to process. Desperate for the first glimpse of the then-mysterious planet, they bought a box of Rembrandt soft pastels from a nearby art store, pinned the incoming numerical data to a wall, and colored by number each pixel, using a color-code system with brown representing the darkest sections of the image and yellow the brightest.
“This is a really interesting story to me because it indicates one of the many ways in which scientists rely upon visualization,” said Roberts. “They needed to create an image in order to understand and interpret the data. And it’s not irrelevant that they used the fugitive, dusty medium of pastel to do it — artists have long used pastel as a visual technology for perceiving hidden or transient realities.”
A real-time data translator machine converted Mariner 4 digital image data into numbers printed on strips of paper. The team colored in the strips by hand with pastels, making this both a work of art and the first digital image from space.
NASA/JPL-Caltech
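The hand-coloring process the JPL team followed can be approximated in code: map each transmitted brightness number onto a ramp running from brown (darkest) to yellow (brightest) and lay the colored values out as pixels. The tiny data grid and the specific RGB endpoints below are illustrative assumptions, not the actual Mariner 4 values or colors.

```python
# A minimal color-by-number sketch: interpolate each data value between
# brown (darkest) and yellow (brightest), roughly as the JPL team did by hand.
BROWN = (101, 67, 33)     # assumed RGB for the darkest value
YELLOW = (255, 221, 51)   # assumed RGB for the brightest value

def value_to_color(value, vmin, vmax):
    """Linearly interpolate a data value onto the brown-to-yellow ramp."""
    t = (value - vmin) / (vmax - vmin) if vmax > vmin else 0.0
    return tuple(round(lo + t * (hi - lo)) for lo, hi in zip(BROWN, YELLOW))

# A made-up 4x4 strip of "transmitted" brightness numbers.
data = [
    [12, 30, 45, 60],
    [18, 42, 55, 70],
    [25, 50, 63, 80],
    [33, 58, 72, 90],
]

flat = [v for row in data for v in row]
vmin, vmax = min(flat), max(flat)
image = [[value_to_color(v, vmin, vmax) for v in row] for row in data]

for row in image:
    print(row)  # each RGB tuple stands in for one hand-colored "pixel"
```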
Roberts, who attributes her interest in science and the humanities to watching Carl Sagan’s “Cosmos: A Personal Voyage” on PBS as a child, is also currently working on a book about the Voyager Golden Record, which she calls the “most distant work of art ever created.” “The Heartbeat at the Edge of the Solar System: Science, Emotion, and the Golden Record,” a collaboration with artist and writer Dario Robleto, will be published by Scribner in 2026.
Images of space determine how we think about it, Roberts explained, especially the images typically published by NASA such as those taken by the Webb and Hubble telescopes, which are not raw snapshots but carefully constructed visuals made from data that is often captured beyond the visible spectrum. The images are colored, cropped, rotated, and edited to help viewers make sense of something fundamentally unfamiliar, she said. These aesthetic choices are necessary to make the images visible at all, but they can shift how we perceive outer space, often making it feel closer and more comprehensible than it really is.
Roberts pointed to research by Stanford scholar Elizabeth Kessler, who found that Hubble visualization scientists often styled space imagery to resemble 19th-century paintings of the American West — incidentally framing the cosmos as something desirable, traversable, and ripe for exploration.
Roberts says she admires the expertise and imagination that went into these images. “But there are so many other ways to render the same data, and it’s important that people understand that,” she said. “You could have taken the famous ‘Cosmic Cliffs’ image in which a nebula is cropped to look like a rock face and turned it upside down, and it would have been equally scientifically valid. You could have used any number of other colors. It could have been made to look much, much stranger.”
She worries about this when it comes to commercial space ventures depicting outer space as “there for the taking.” That narrative, she feels, is all too similar to the one behind Earth’s most destructive colonial pursuits.
“We’re about to step off the planet and I’m worried that we’re going to repeat all the same mistakes that we’ve made before,” Roberts said. “We are talking about space as a ‘frontier,’ as something to be colonized or occupied. But we should be listening to what the science tells us: Space is as weird and astonishing as any great work of art. It does not support the status quo.”
This is one reason Roberts believes humanists need a stronger presence in conversations about outer space. She’s noticed a tendency for some humanities scholars to dismiss space as escapist or eccentric, and a distraction from Earth’s real problems, but she disagrees.
“We’ve ceded the heavens, to some extent, to the tech sector, to scientists, to commercial ventures,” Roberts said. “It doesn’t seem to be a place where we can exercise our skills. But while we haven’t been paying attention, we have come to the brink of a new space age that is now upon us. Our move into space is going to require a totally new kind of ethics and a totally new philosophy and we aren’t going to be able to access that if we don’t have the arts and humanities involved in close collaboration with scientists.”
“Our move into space is going to require a totally new kind of ethics and a totally new philosophy and we aren’t going to be able to access that if we don’t have the arts and humanities involved in close collaboration with scientists.”
Jennifer L. Roberts.
To put this idea into action, Roberts has begun teaching “Art and Science of the Moon” in the Department of History of Art and Architecture. The experimental seminar focuses on the world history of artistic engagement with the moon, including the response of photographers and conceptual artists to the Apollo program in the 1960s and ’70s. She hopes to teach a similar seminar on Mars.
She’s also starting a seminar at the Mahindra Humanities Center this fall titled “Celestial Spheres,” which will bring scientists and humanists together to talk about what’s happening outside planet Earth.
Roberts wants to think about outer space as something more like an ocean in which we are immersed than a void filled with image targets.
“What would it mean if we didn’t think about it as a frontier that we had to cross and conquer?” Roberts said. “What if we thought about it as an ecosystem, something that we are already part of?”
‘Hopeful message’ on brain disease
Researcher Sanjula Singh has looked at stroke, dementia, late-life depression for years, finds lifestyle changes make big difference
Sanjula Singh wants people to know that stroke, dementia, and depression are much more preventable than they might think.
“The most common misconception that a lot of people have is that Alzheimer’s or depression or stroke is like a train coming down the tracks,” said Singh, a principal investigator at Massachusetts General Hospital and Harvard Medical School’s Brain Care Labs who has been studying brain disease for years.
Though genetics plays a role in developing these illnesses, Singh’s research has helped show that up to 80 percent of strokes, 45 percent of dementia cases, and 35 percent of late-life depression cases can be addressed through behavioral changes.
One of the most potent risk factors for dementia, Singh explained, is high blood pressure. Instead of focusing on treating diseases, Singh has aimed to help people avoid them in the first place.
“I think what I communicate is a very hopeful message,” she said. “There’s so much you have in your own hands that you can do to remain healthy and happy. … It’s so simple, but I think that’s what makes it so powerful.”
Singh, born to a family of doctors in the Netherlands, originally planned to be a singer-songwriter upon graduating from high school. But after studying at the Codarts Conservatory in Rotterdam, she felt drawn back toward science.
“I loved the creative process, and I also loved solving complex problems. I realized I didn’t have to choose — I wanted a life that made room for both.”
“There’s so much you have in your own hands that you can do to remain healthy and happy. … It’s so simple, but I think that’s what makes it so powerful.”
After traveling around the world, she began medical school the next year, at first aiming to become a neurosurgeon.
Hoping to make an impression on Bart Brouwers, a neurosurgeon who she thought might have room in his lab, she spent a full night during her first year of medical school trying to memorize his dissertation. He referred her to Gabriël Rinkel, a professor of neurology at University Medical Centre Utrecht. Though Singh hadn’t yet taken a course on the brain, Rinkel said she could start working on research that would later become her Ph.D. thesis.
As she navigated medical school in Utrecht, it was the research side that fascinated her most. She spent much of her neurosurgery Ph.D. studying cerebellar intracerebral hemorrhage, a deadly subtype of stroke in the cerebellum.
Her findings would eventually lead to changes in international treatment guidelines for the condition. Though that was a rewarding outcome, she also came to realize that, while her work could help a small group of people, it wouldn’t stop strokes from happening.
“I wanted to be on the forefront,” she said. “I wanted to prevent the suffering.”
She got her first major exposure to some of these modifiable risk factors in grad school while working in the lab of Josh Goldstein, professor of emergency medicine at Harvard Medical School and a co-supervisor along with Rinkel. Her work covered specific neurosurgical topics, but she started seeing how influential modifiable risk factors were — even for those who had already suffered a stroke.
To learn how to develop questions and conduct analytical research about these risk factors, she took a year to study epidemiology and statistics during a master’s degree at the University of Oxford.
“I came across so many great datasets in which I just saw how much brain disease could be prevented,” she said, “but I wasn’t sure who was truly leading that work.”
80% of strokes are attributable to modifiable risk factors, according to Singh’s research
Singh hadn’t planned to return to the U.S. after completing her Ph.D., until Jonathan Rosand, a Harvard professor of neurology and member of her Ph.D. dissertation committee, changed her mind.
During a walk-and-talk, Rosand shared his vision for a new lab focused on preventing brain disease, which would go on to take the name Brain Care Labs.
“I believed in him — and in what he was building,” Singh said. “I told him, ‘I want you to be my mentor. Wherever you go, I’ll follow.’”
At the Brain Care Labs, Singh began spending her time exploring brain health and the factors controlling it. In 2022, she was the lead author of “Brain health begins with brain care,” an article in The Lancet that called for a rapid response to what major health organizations have called a global brain-health crisis.
“Although prevention of brain disease is yet to be a focus of primary care medicine,” she wrote, “a crucial opportunity exists to leverage the global acceptance that more than 40 percent of dementia, stroke, and depression cases are attributable to modifiable risk factors.”
With her new colleagues, Singh helped develop the Brain Care Score, a tool for people to gauge how their habits affect their brain health, backed by data collected from hundreds of thousands of adults followed for more than a decade.
Instead of simply predicting disease, the score was designed to help people modify risk factors that can increase the chance of stroke, dementia, and depression. Those risk factors span three domains: physical (e.g., blood pressure, blood sugar, cholesterol), lifestyle (diet, exercise, sleep), and social-emotional (stress, relationships, purpose in life).
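The article does not spell out how the score is tallied, but the three domains lend themselves to a simple checklist. Purely as a rough sketch (the factor names, point values, and nine-point total below are invented for illustration and are not the published Brain Care Score), a composite could be computed along these lines:

# Hypothetical, simplified tally inspired by the three domains described above.
# Factor names, point values, and the nine-point total are invented for illustration.
PHYSICAL = {"blood_pressure_ok": 1, "blood_sugar_ok": 1, "cholesterol_ok": 1}
LIFESTYLE = {"healthy_diet": 1, "regular_exercise": 1, "good_sleep": 1}
SOCIAL_EMOTIONAL = {"low_stress": 1, "strong_relationships": 1, "sense_of_purpose": 1}

def toy_brain_care_score(answers: dict) -> int:
    """Sum one point for each favorable answer across the three domains."""
    score = 0
    for domain in (PHYSICAL, LIFESTYLE, SOCIAL_EMOTIONAL):
        for factor, points in domain.items():
            if answers.get(factor, False):
                score += points
    return score

# Example: controlled blood pressure, a healthy diet, and strong relationships,
# with nothing else favorable, scores 3 out of a possible 9 in this toy version.
print(toy_brain_care_score({
    "blood_pressure_ok": True,
    "healthy_diet": True,
    "strong_relationships": True,
}))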
“It doesn’t matter where you’re starting. What matters is that you begin. Improving — even just a little — is the way forward.”
Singh continues to build upon her research to strengthen the scientific link between modifiable risk factors and brain diseases.
Recently, she and her team identified 17 overlapping factors that affect one’s risk of stroke, dementia, and late-life depression. By knowing and adjusting even one of these factors, people can reduce their risk of suffering from brain diseases long thought to be intractable.
“Start with something small and doable,” Singh said. “Those first steps can create momentum — and over time, they can lead to powerful change.”
As Singh figures out what causes brain diseases, she’s also working on helping people adjust their lifestyles. “We know behavior change is really hard,” she said, “and, amongst other things, we know that individual health coaching can actually work.”
She and her labmates are approaching implementation from a few levels. Through the Global Brain Care Coalition Rosand founded in 2024, Singh and her colleagues have developed community-specific Brain Care Scores to make sure adjustments to factors such as diet are relevant and applicable to different cultural groups across the world.
They’ve also recently applied for a grant for an AI Avatar that can help coach people toward small changes in their daily life.
They’re building physical tools, too, like a product to improve medication adherence that they’re now testing in a clinical trial. Singh imagines creating a whole suite of products that can help people manage their health in easy, accessible ways. She wants to make products that can blend right into a living room — unobtrusive ways for people to improve their health.
This impulse has brought her back to school again — this time to an M.B.A. program at Columbia, where she’s trying to turn her ideas into products.
“I want to make sure people have easy tools that can be integrated into their households that are fun, that are artsy, and that will actually have impact.”
Singh believes brain health deserves the same level of awareness and action as heart health.
“The major papers are out,” she said. “We’re getting the signs out there.” The real impact, she knows, will come when people incorporate the research into their lives. “It doesn’t matter where you’re starting,” she said. “What matters is that you begin. Improving — even just a little — is the way forward.”
Funding cuts upend projects piecing together saga of human history
Ancient DNA expert Christina Warinner notes losses come just as innovations are driving major advances in field
Christy DeSmith
Harvard Staff Writer
6 min read
In February, Christina Warinner, M.A. ’08, Ph.D. ’10, was accepting an award from the American Association for the Advancement of Science when she learned that one of her projects was on a list of targeted federal research grants circulating in Washington. A couple of months later, within the span of two weeks, she lost two National Science Foundation grants and appeared at a Nobel symposium in Stockholm.
Warinner, Landon T. Clay Professor of Scientific Archaeology, is well-known in the field of ancient DNA, with her pioneering methods cracking several mysteries concerning early human diets and health. Hers were among the more than 1,600 NSF grants for active projects that were terminated in the spring.
“I recognize it can be hard to compare this work with medical research, which has such obvious applications for saving lives,” Warinner said. “But people also have a deep curiosity about who we are and where we come from. Our work is important because it uses our most powerful technologies to reveal how we, as humans, lived thousands of years ago so that we may better understand our world today.”
The cuts come at a critical time for practitioners of ancient DNA science, a discipline in rapid ascent due to recent advances in lab techniques and computing power. The multidisciplinary field got its start in the mid-1980s in the United States, but support here for the work has lagged behind Northern Europe during the 21st century.
“It’s just really sad,” Warinner said. “American archaeologists have been leaders in telling the stories of humankind. But if our funding is removed, we won’t be leaders anymore.”
“American archaeologists have been leaders in telling the stories of humankind. But if our funding is removed, we won’t be leaders anymore.”
At the Boston reception, a fellow researcher told Warinner one of her major projects was on a database of recommended research cuts. She and her team have been in the thick of a three-year inquiry into the diplomatic role of marriage and extended kin networks in connecting ancient Maya kingdoms along a major river valley in Belize.
It’s one of the most intensely studied corners of the ancient Maya world, yielding more than a century’s worth of archaeological discovery.
Cracking the civilization’s elaborate hieroglyphic script, with key breakthroughs made at Harvard in the 1950s, clarified the importance of intermarriage to maintaining inter-kingdom relations. Recent innovations in remote sensing helped researchers uncover a string of previously unknown settlements in a densely forested area known over thousands of years for its cacao harvests.
Was the Belize River Valley more tightly knit with cross-community relations than previously thought? Warinner and her collaborators were on the cusp of finding out.
“The genetic data would really help us tie it all together, to really understand how the ancient Maya political system worked,” she said.
Only in the last five or six years has such a revelation become possible, thanks to advances in sequencing ancient genomes from hot, humid climates, where DNA deteriorates far more quickly.
Researchers in Belize and at Harvard extracted genetic data from 400 individuals who inhabited the valley over more than a millennium, between 300 B.C.E. and 1000 C.E. To Warinner’s surprise, nearly all, sourced from newly identified sites as well as decades-old excavations, generated at least partial genomes.
“We never anticipated such a high success rate,” shared Warinner, a native Midwesterner who has been studying ancient Maya since her undergraduate years at the University of Kansas. “It’s wonderful. But it also makes our project more expensive than we originally budgeted.”
A May 15 letter canceling the project’s NSF funding dealt an unexpected second blow. Also lost was support for newer research on the practice of horse milking, with recent findings suggesting its origins may be close in age to horse domestication itself.
“Modern society was literally built on the backs of horses,” Warinner said. “But many people are surprised to learn that early domesticated horses were milked. We still don’t know where or when this practice began — that’s something we wanted to trace, to better understand these very earliest human-horse relationships.”
As an ancient DNA expert and also group leader at Germany’s Max Planck Institute, she had been invited by the Nobel Committee to present her work on ancient microbes at a Nobel Symposium on Paleogenomics. Warinner presented May 28 on the archaeology of infectious diseases, the history of fermented foods, and the evolution of the human microbiome.
The topic of horse milking fits squarely with this research focus. Of longstanding interest to Warinner is how milk and dairy products became dietary staples in a world where most are lactose-intolerant. “They are some of our oldest — and least-understood — manufactured foods,” she marveled.
Koumiss, a fermented beverage still popular in Central Asia, makes for a particularly fascinating case study. Made from horse milk, it hails from the very region where horse domestication is believed to have started more than 4,000 years ago. In fact, the mildly alcoholic drink is known to have fueled some of the great Eurasian nomadic empires, including the Mongols and the Xiongnu.
“The whole reason we have undertaken this project is because we believe it is important for understanding human history.”
Christina Warinner
Warinner and her collaborators proposed a novel approach to identifying when, and where, these grassland dwellers got their first sips of koumiss. As a postdoctoral researcher at the University of Oklahoma, she was among the first to recognize that dental tartar could be a goldmine for archaeological scientists. The calcified buildup, she found, entraps and preserves biomolecules like DNA as well as proteins, providing unique insights into ancient diets.
Learning about the emergence of koumiss — or raw horse milk, for that matter — meant collaborating with researchers across Central Asia to perform dental cleanings on their archaeological collections.
“The whole reason we have undertaken this project is because we believe it is important for understanding human history,” Warinner offered. “Our grant proposal was successful because a panel of peer reviewers agreed, deeming our research vital science of high priority.
“It’s such an honor,” she added, “to receive funding this way.”
A setback to research that offered hope for fibrous dysplasia patients
Halt to federal funding disrupts study of rare skeletal disease
Heather Denny
HSDM Communications
3 min read
In 2023, the Harvard School of Dental Medicine was awarded a U.S. Department of Defense grant to fund a four-year study of fibrous dysplasia (FD), a severe skeletal disease in which benign tumors cause bone deformities, fractures, and pain. The award aimed to investigate the cellular and molecular underpinnings of the disease, which affects an estimated 1 in 15,000 to 30,000 people and currently has no cure. The research had promise not only for treating FD, but also for finding treatments for conditions affecting military personnel, including blast-induced heterotopic ossification and chronic bone pain.
At the time, the funding was applauded by patients and patient advocacy groups such as FD/MAS Alliance, a nonprofit dedicated to finding evidence-based treatments for fibrous dysplasia and McCune-Albright syndrome.
“This funding was more than just a financial award—it was a crucial investment in understanding and eventually treating a devastating disease.”
Adrienne McBride
“This funding was more than just a financial award—it was a crucial investment in understanding and eventually treating a devastating disease,” said Adrienne McBride, executive director of the Alliance. “Advancing research in FD/MAS benefits those living with this rare disease and holds great potential for broader medical applications.”
The mechanisms investigated in FD research have the potential to yield insights relevant for many other diseases causing bone fragility, pain, and fractures. With federal research funding to Harvard now frozen, these insights may never be realized.
“FD patients and their families had been closely following research advances, hoping for novel, effective interventions. The termination of leading-edge projects like this erodes this hope and sends a discouraging signal to those living with an already-overlooked disease,” said Yingzi Yang, professor of Developmental Biology at HSDM and principal investigator on the grant.
Yingzi Yang.
Photo by Steve Gilbert
Yang and her partners at Massachusetts General Hospital (MGH) had been making progress in the few years since the funding was awarded. While some work continues at MGH, the research based in the Yang Lab at HSDM, which was critical to providing a greater understanding of the disease mutation, has stopped.
“We had made substantial progress in terms of identifying potential treatment targets of this devastating disease based on getting a better understanding of the molecular mechanisms,” said Yang. “Cutting off our study disrupts the holistic understanding of the FD disease and reduces the research rigor and impacts.”
“Cutting off our study disrupts the holistic understanding of the FD disease and reduces the research rigor and impacts.”
Yingzi Yang
“The cancellation of this grant is a significant setback for FD/MAS research and for patients, including military personnel, who rely on scientific progress for hope and support,” said McBride.
FD/MAS can affect every bone in the body, but the largest subpopulation of those with the disease are affected by FD lesions in their craniofacial bones, leading to severe facial deformities.
HSDM alumnus Christopher H. Fox, DMD ’87, DMSc ’91, who leads the American Association for Dental, Oral, and Craniofacial Research (AADOCR), also expressed deep concerns over the implications.
“This funding cut of such promising research is a tragedy for the FD/MAS community and indeed for our country. Through our advocacy efforts, AADOCR is doing everything we can to reverse these ill-advised decisions,” said Fox.
Could lithium explain — and treat — Alzheimer’s?
In a mouse model of Alzheimer’s disease, lithium deficiency (right) dramatically increased amyloid beta deposits in the brain compared with mice that had normal physiological levels of lithium (left). Bottom row: The same was true for the Alzheimer’s neurofibrillary tangle protein tau.
Yankner Lab
Stephanie Dutchen
HMS Communications
9 min read
Study offers new theory of disease and strategy for fighting it
What is the earliest spark that ignites the memory-robbing march of Alzheimer’s disease? Why do some people with Alzheimer’s-like changes in the brain never go on to develop dementia? These questions have bedeviled neuroscientists for decades.
Now, a team of researchers at Harvard Medical School may have found an answer: lithium deficiency in the brain.
The work, published Wednesday in Nature, shows for the first time that lithium occurs naturally in the brain, shields it from neurodegeneration, and maintains the normal function of all major brain cell types. The findings — 10 years in the making — are based on a series of experiments in mice and on analyses of human brain tissue and blood samples from individuals in various stages of cognitive health.
The scientists found that lithium loss in the human brain is one of the earliest changes leading to Alzheimer’s, while in mice, similar lithium depletion accelerated brain pathology and memory decline. The team further found that reduced lithium levels stemmed from lithium binding to amyloid plaques and from impaired uptake in the brain. In a final set of experiments, the team found that a novel lithium compound that avoids capture by amyloid plaques restored memory in mice.
The results unify decades-long observations in patients, providing a new theory of the disease and a new strategy for early diagnosis, prevention, and treatment.
Lithium screening through routine blood tests may one day offer a way to identify at-risk individuals who would benefit from treatment to prevent or delay Alzheimer’s onset.
Affecting an estimated 400 million people worldwide, Alzheimer’s disease involves an array of brain abnormalities — such as clumps of the protein amyloid-beta, neurofibrillary tangles of the protein tau, and loss of a protective protein called REST — but these never explained the full story of the disease. For instance, some people with such abnormalities show no signs of cognitive decline. And recently developed treatments that target amyloid-beta typically don’t reverse memory loss and only modestly reduce the rate of decline.
It’s also clear that genetic and environmental factors affect risk of Alzheimer’s, but scientists haven’t figured out why some people with the same risk factors develop the disease while others don’t.
Lithium, the study authors said, may be a critical missing link.
“The idea that lithium deficiency could be a cause of Alzheimer’s disease is new and suggests a different therapeutic approach,” said senior author Bruce Yankner, professor of genetics and neurology in the Blavatnik Institute at HMS, who in the 1990s was the first to demonstrate that amyloid-beta is toxic.
The study raises hopes that researchers could one day use lithium to treat the disease in its entirety rather than focusing on a single facet such as amyloid-beta or tau, he said.
One of the main discoveries in the study is that as amyloid-beta begins to form deposits in the early stages of dementia in both humans and mouse models, it binds to lithium, reducing lithium’s function in the brain. The lower lithium levels affect all major brain-cell types and, in mice, give rise to changes recapitulating Alzheimer’s disease, including memory loss.
The authors identified a class of lithium compounds that can evade capture by amyloid-beta. Treating mice with the most potent amyloid-evading compound, called lithium orotate, reversed Alzheimer’s disease pathology, prevented brain-cell damage, and restored memory.
Treating mice with the amyloid-evading lithium orotate (top) reduced amyloid beta (left) and tau (right) much more effectively than lithium carbonate (bottom).
Yankner Lab
Although the findings need to be confirmed in humans through clinical trials, they suggest that measuring lithium levels could help screen for early Alzheimer’s. Moreover, the findings point to the importance of testing amyloid-evading lithium compounds for treatment or prevention.
Other lithium compounds are already used to treat bipolar disorder and major depressive disorder, but they are given at much higher concentrations that can be toxic, especially to older people. Yankner’s team found that lithium orotate is effective at one-thousandth that dose — enough to mimic the natural level of lithium in the brain. Mice treated for nearly their entire adult lives showed no evidence of toxicity.
“You have to be careful about extrapolating from mouse models, and you never know until you try it in a controlled human clinical trial,” Yankner said. “But so far the results are very encouraging.”
Lithium depletion is an early sign of Alzheimer’s
Yankner became interested in lithium while using it to study the neuroprotective protein REST. Discovering whether lithium is found in the human brain and whether its levels change as neurodegeneration develops and progresses, however, required brain tissue, which generally can’t be obtained from living people.
So the lab partnered with the Rush Memory and Aging Project in Chicago, which has a bank of postmortem brain tissue donated by thousands of study participants across the full spectrum of cognitive health and disease.
Having that range was critical because trying to study the brain in the late stages of Alzheimer’s is like looking at a battlefield after a war, said Yankner; there’s a lot of damage and it’s hard to tell how it all started. But in the early stages, “before the brain is badly damaged, you can get important clues,” he said.
Led by first author Liviu Aron, senior research associate in the Yankner Lab, the team used an advanced form of mass spectrometry to measure trace levels of about 30 different metals in the brains and blood of cognitively healthy people, those in an early stage of dementia called mild cognitive impairment, and those with advanced Alzheimer’s.
Lithium was the only metal that had markedly different levels across groups and changed at the earliest stages of memory loss. Its levels were high in the cognitively healthy donors but greatly diminished in those with mild impairment or full-blown Alzheimer’s.
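As a loose illustration of this kind of group comparison, and not the team’s actual analysis pipeline (the column names and the choice of test here are assumptions), one could tabulate each metal by cognitive group and test for differences:

# Illustrative sketch only: compares trace-metal levels across cognitive groups.
# Column names and the statistical test are assumptions, not the study's methods.
import pandas as pd
from scipy import stats

def compare_metals(df: pd.DataFrame, metals: list[str]) -> pd.DataFrame:
    """df: one row per donor, a 'group' column ('healthy', 'MCI', 'AD'),
    and one column of measured levels per metal (e.g., 'lithium', 'iron')."""
    rows = []
    for metal in metals:
        samples = [grp[metal].dropna() for _, grp in df.groupby("group")]
        stat, p = stats.kruskal(*samples)  # does this metal differ across groups?
        rows.append({"metal": metal, "statistic": stat, "p_value": p})
    return pd.DataFrame(rows).sort_values("p_value")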
Lithium (upper left) was the only metal that differed significantly between people with and without mild cognitive impairment, often a precursor to Alzheimer’s.
The team replicated the findings in samples obtained from multiple brain banks nationwide.
The observation aligned with previous population studies showing that higher lithium levels in the environment, including in drinking water, tracked with lower rates of dementia.
But the new study went beyond by directly observing lithium in the brains of people who hadn’t received lithium as a treatment, establishing a range that constitutes normal levels, and demonstrating that lithium plays an essential role in brain physiology.
“Lithium turns out to be like other nutrients we get from the environment, such as iron and vitamin C,” Yankner said. “It’s the first time anyone’s shown that lithium exists at a natural level that’s biologically meaningful without giving it as a drug.”
Then Yankner and colleagues took things a step further. They demonstrated in mice that lithium depletion isn’t merely linked to Alzheimer’s disease — it helps drive it.
Loss of lithium causes the range of Alzheimer’s-related changes
The researchers found that feeding healthy mice a lithium-restricted diet brought their brain lithium levels down to a level similar to that in patients with Alzheimer’s disease. This appeared to accelerate the aging process, giving rise to brain inflammation, loss of synaptic connections between neurons, and cognitive decline.
In Alzheimer’s mouse models, depleted lithium dramatically accelerated the formation of amyloid-beta plaques and structures that resemble neurofibrillary tangles. Lithium depletion also activated inflammatory cells in the brain called microglia, impairing their ability to degrade amyloid; caused the loss of synapses, axons, and neuron-protecting myelin; and accelerated cognitive decline and memory loss — all hallmarks of Alzheimer’s disease.
The mouse experiments further revealed that lithium altered the activity of genes known to raise or lower the risk of Alzheimer’s, including the best-known, APOE.
Lithium deficiency thinned the myelin that coats neurons (right) compared to normal mice (left).
Yankner Lab
Replenishing lithium by giving the mice lithium orotate in their water reversed the disease-related damage and restored memory function, even in older mice with advanced disease. Notably, maintaining stable lithium levels in early life prevented Alzheimer’s onset — a finding that confirmed that lithium depletion helps drive the disease process.
“What impresses me the most about lithium is the widespread effect it has on the various manifestations of Alzheimer’s. I really have not seen anything quite like it all my years of working on this disease,” said Yankner.
A promising avenue for Alzheimer’s treatment
A few limited clinical trials of lithium for Alzheimer’s disease have shown some efficacy, but the lithium compounds they used — such as the clinical standard, lithium carbonate — can be toxic to aging people at the high doses normally used in the clinic.
The new research explains why: Amyloid-beta was sequestering these other lithium compounds before they could work. Yankner and colleagues found lithium orotate by developing a screening platform that searches a library of compounds for those that might bypass amyloid-beta. Other researchers can now use the platform to seek additional amyloid-evading lithium compounds that might be even more effective.
“One of the most galvanizing findings for us was that there were profound effects at this exquisitely low dose,” Yankner said.
If replicated in further studies, the researchers say lithium screening through routine blood tests may one day offer a way to identify at-risk individuals who would benefit from treatment to prevent or delay Alzheimer’s onset.
Studying lithium levels in people who are resistant to Alzheimer’s as they age might help scientists establish a target level that they could help patients maintain to prevent onset of the disease, Yankner said.
Since lithium has not yet been shown to be safe or effective in protecting against neurodegeneration in humans, Yankner emphasizes that people should not take lithium compounds on their own. But he expressed cautious optimism that lithium orotate or a similar compound will move forward into clinical trials in the near future and could ultimately change the story of Alzheimer’s treatment.
“My hope is that lithium will do something more fundamental than anti-amyloid or anti-tau therapies, not just lessening but reversing cognitive decline and improving patients’ lives,” he said.
This research was supported by the National Institutes of Health.
What your credit score says about how, where you were raised
Study looks at national disparities, finds bill-paying habits emerge by early adulthood, influence upward mobility
Christy DeSmith
Harvard Staff Writer
6 min read
A person’s credit report tells a story about their childhood.
New research, released last month by Harvard’s Opportunity Insights, shows that a strong predictor of an adult’s bill-paying habits — the main determinant of credit scores — is the environment in which they grew up. The study, based on a sample of more than 25 million Americans, reveals lifelong differences in repayment behavior emerging by early adulthood according to race, hometown, and socioeconomic class.
These habits proved surprisingly stubborn as individuals moved up and down the socioeconomic ladder.
“It turns out the credit bureaus are able to learn something about us by age 25 that is extremely persistent,” said co-author Jamie Fogel, a research scientist at Opportunity Insights.
“It turns out the credit bureaus are able to learn something about us by age 25 that is extremely persistent.”
Jamie Fogel
A strong credit score, frequently defined as 661 or higher, is a key tool for economic advancement. It means greater access to loans at lower interest rates for education, cars, homes, or starting businesses.
A solid rating can also open other doors.
“Credit scores are also used to screen job applicants, renters, and even people looking to buy insurance,” Fogel noted. “So lacking a good score can shut down multiple opportunities all at once.”
Fogel and his co-authors set out to take an ambitious, population-wide look at disparities in access to credit and the financial management skills that make affordable borrowing possible. Anonymized records from a major credit bureau were linked with U.S. Census and tax data on roughly 1 percent of U.S. residents.
“We were able to get a representative sample while simultaneously zooming in on particular cohorts,” Fogel explained.
For people born between 1978 and 1985, parental data was also incorporated. “That means we were able to look at people’s parents’ income as well as where they grew up,” Fogel said. “Both turned out to be pretty important.”
Credit bureaus’ scoring algorithms, designed to predict the likelihood of default, are based solely on recent repayment history. The bureaus are legally prohibited from incorporating information on race, age, income, and location. But a growing body of evidence finds that demographic disparities persist.
OI’s new study, with its big-data approach, yields powerful new insights. By age 25, the researchers found, Americans whose parents were in the lowest 20 percent of earners have an average credit score of 615. Those whose parents were in the top 20 percent averaged 725.
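As a minimal sketch of the kind of tabulation behind figures like these, assuming a linked table with one row per person and hypothetical column names (this is not Opportunity Insights’ code or data), the group averages might be computed like this:

# Illustrative sketch only; column names are hypothetical, not Opportunity Insights'.
import pandas as pd

def mean_score_by_parent_income(df: pd.DataFrame) -> pd.Series:
    """Average credit score at age 25, grouped by parental-income quintile."""
    df = df.copy()
    # Rank parents into five equal-sized income bins (1 = lowest 20%, 5 = highest).
    df["parent_income_quintile"] = pd.qcut(df["parent_income"], q=5, labels=[1, 2, 3, 4, 5])
    return df.groupby("parent_income_quintile", observed=True)["credit_score_at_25"].mean()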
“Your parents’ credit score is extremely predictive of your own repayment,” Fogel noted.
“Your parents’ credit score is extremely predictive of your own repayment.”
Jamie Fogel
Also at 25, Black Americans’ average credit scores are nearly 100 points lower than those of white Americans and 140 points lower than those of Asian Americans.
What’s more, these disparities looked “almost identical” at age 65, Fogel said.
Controlling for income by looking only at those whose parental earnings fell in the bottom 25 percent revealed a still-prominent 69-point gap between Black and white individuals. And the average credit score for Black Americans whose parental earnings were in the top 10 percent is similar to that of white Americans from low-income backgrounds.
There are almost certainly racial disparities in job stability, he added. “But we can restrict to people who are continuously employed at the same firm, with not too much income volatility. These gaps persist even then.”
Geographic patterns proved equally striking, suggesting that children absorb personal financial lessons from their broader community as well as from parents.
Those from the Upper Midwest, New England, and certain areas of the western U.S. average the highest credit scores and therefore benefit from lower interest rates. People from Appalachia and certain parts of the South have lower scores, with unmet borrowing needs.
A set of more granular analyses revealed hyper-local differences. The country’s highest overall credit scores (an average of 724) were found in Bergen County, New Jersey, just across the Hudson River from New York City. Baltimore, the locale with the country’s lowest scores, averaged nearly 100 points lower.
A separate analysis, focusing exclusively on Americans who grew up in low-income families, confirmed the influence of place on repayment behaviors. In Brooklyn, white Americans from low-income families had the highest average scores (719) whereas individuals with similar backgrounds in the Indianapolis area saw the lowest averages (629).
Also illuminating were patterns observed in people who moved from a place like Brooklyn to a place like Indianapolis, or vice versa. Those who relocated in early childhood appeared more likely to absorb the debt-paying habits of their adopted community. But moving as a teenager meant retaining more influences from the birthplace.
“We don’t know exactly what it is,” Fogel said, “but there really is something you’re getting from your community that has a strong effect on your repayment behavior.”
In the paper, the co-authors also review possible explanations. Previous research, for example, documents the long-term behavioral effects of historic economic traumas such as the 1921 Tulsa Race Massacre.
OI’s data also show that Black Americans and those from low-repayment areas are more likely to float cash to family and friends, with Black Americans also less likely than white Americans to receive assistance from parents. In fact, Black Americans are more likely to be the ones helping their elders.
Correlations with previous OI findings are especially suggestive. The geographic patterns of repayment, newly incorporated into OI’s online Opportunity Atlas, mirror previous work documenting regional and racial variation in access to the American Dream.
“Places that promote repayment are the exact same places that promote upward mobility.”
Jamie Fogel
“Places that promote repayment are the exact same places that promote upward mobility,” Fogel observed. “We can see these places promoting repayment even when controlling for income.”
The co-authors don’t see an easy fix, noting that the current credit-scoring system understates repayment gaps by race, geography, and class. More accurate measures would likely exacerbate disparities, they wrote.
Instead, the OI team called for more social scientists to examine how race and childhood environment shape financial management skills for life.
“If we want to improve access to credit,” Fogel said, “we really need to understand what’s happening before people’s 25th birthday.”
Foundation for U.S. breakthroughs feels shakier to researchers
Max Larkin
Harvard Staff Writer
6 min read
Funding cuts seen as threat to nation’s status as driver of scientific progress
With each dollar of its grants, the National Institutes of Health — the world’s largest funder of biomedical research — generates, on average, $2.56 worth of economic activity across all 50 states.
The awards yield new drugs, like the naloxone spray used to prevent opioid overdoses, and breakthroughs in basic science, like the link between cholesterol and heart health.
But NIH grants also support more than 400,000 U.S. jobs, and have been a central force in establishing the country’s dominance in medical research. A recent survey by Nature found that, in health sciences, American research output is larger than that of the next 10 leading countries combined.
And that’s in large part due to federal government support of research conducted by universities. According to data from the Organisation for Economic Co-operation and Development, over the past three decades those universities have become, as a sector, the largest hub of nonbusiness research in the world.
Waves of grant terminations under the second Trump administration have thrown that relationship into doubt — and posed particular threats to certain kinds of research. Harvard has challenged the terminations in federal court. And in July officials confirmed they will provide 80 percent of expected expenses so that most defunded research inside the University can continue temporarily.
But that doesn’t protect researchers from the anxiety that comes with what could be a life-altering jolt. Another concern is lost time. Most of the affected grants support projects that touch many human lives. Disruptions have consequences.
Walter Willett, the Fredrick John Stare Professor of Epidemiology and Nutrition and — by one count — Harvard’s single most-cited scholar, worries about the maintenance of biobanks whose samples can date back 45 years.
The longitudinal studies behind these samples, including research conducted at Harvard and in Washington, generate insights by following populations over long periods of time. So an ill-timed loss of funding can leave an irremediable gap in the dataset or a question mark in place of a finding.
As his grants dried up in May, Willett and his team started “scrambling to try to protect the samples and the data we have”: freezers full of blood samples, DNA, and other biological material. Willett confirmed this summer that those samples are safe, thanks to the University’s stopgap funding. “But we still don’t have long-term solutions,” he added.
Of her four canceled grants, Molly Franke, an epidemiologist and professor of global health and social medicine at Harvard Medical School, worried most about a five-year randomized trial following roughly 160 teens and young adults living with HIV in Peru. The study tests a community-based support intervention that includes mental health support and healthcare liaisons who help participants sign up for insurance, get government IDs, and enter treatment.
After the grant was canceled, that network of support was at risk of disappearing. “It was devastating,” she said. “These young people are often in very precarious social situations: Sometimes they don’t have adults in their lives; they’re struggling with mental health issues, substance abuse, or extreme poverty.”
Once University administrators committed to maintaining funding on a temporary basis, researchers breathed a small sigh of relief.
But Franke will still have to look for other backers to make sure the Peru study can be brought to a satisfactory conclusion. Her team tries to lighten the toll of disease in far-flung places because they believe “it’s the right thing to do,” she said. But the work is far from irrelevant to Americans, she noted.
“Infectious diseases know no borders,” Franke said. “And when we get drug-resistant tuberculosis in this country, we know how to treat it because of studies conducted elsewhere.”
In the spring of 2024, Kelsey Tyssowski — a research associate in organismic and evolutionary biology — received a grant of $130,255 through the NIH’s BRAIN Initiative for her work on the nervous systems of deer mice, in the hopes that it might shed light on ALS and other neurodegenerative disorders. (That may sound like a stretch, Tyssowski acknowledged, before pointing out that “skilled movement is the thing that people lose first with a lot of diseases.”)
But, as with nearly all other government grants to Harvard, those funds were finally revoked in early May.
Across 15 years in labs, Tyssowski said she’s been funded by government money “more often than not.” Her latest grant was supposed to serve as a bridge between her postdoc in the lab of Hopi Hoekstra and a tenure-track job with a dedicated lab, probably on another campus.
“I may be the only person studying skilled movement, from this angle, right?” she said with a laugh. “I’d like to start my own lab, and train other people to do this. And if I can’t do that, all of the money and time and energy that’s gone into getting me to this point will have been almost completely wasted.”
Similar stories are playing out across Greater Boston and elsewhere in the nation’s research hubs. Grant data from the NIH shows that affected researchers at Harvard were working across a variety of medical frontiers, from cancer immunotherapy and stem cells to environmental health.
But researchers also stress that their work is not limited to labs on campus or in local hospitals.
At Harvard Medical School, the termination of 350 grants — totaling $230 million in annual funding — has also entailed the cancellation of over 100 “sub-awards.” Those are funds that pass through to partner institutions — in Harvard’s case, in 23 states and Washington, D.C. — that might have better access to animal species or lab resources.
Jonathan Abraham, associate professor of microbiology at HMS, won a grant to analyze mosquitoes en route to a better understanding of Eastern equine encephalitis, or EEE. And it came with a sub-award for the University of Texas Medical Branch, home to the world’s largest repository of insect-borne viruses.
Meanwhile, Stephanie Mohr won a similar sub-award for a team at the University of Maryland School of Medicine, for a study of tick biology aimed at shedding light on Lyme disease. They were just a few months into a five-year grant when the termination hit.
The same goes for Franke’s study of HIV in youth, which involved a sub-award to the Peruvian branch of Partners In Health.
That study involved, she said, a commitment not just to patients but to the staff paid to care for them and to Peru’s Ministry of Health. The collapse of one grant had ripples of risk, even thousands of miles away.
“It affects the care, people’s livelihoods … and a trust that had taken 20 years to build,” Franke said. “That was what kept me up at night.”
Working through pain? You’re not alone.
Researchers use Dutch tool to pursue full scale of functional limitations in U.S. labor force
Alvin Powell
Harvard Staff
3 min read
A new study of functional abilities in the U.S. labor market reveals a workforce both vulnerable and resilient, with a large majority of workers reporting multiple limitations even as they fulfill their job duties, according to researchers at Harvard Medical School.
“Prior research finds that people in midlife are less healthy than people who are older now were at midlife,” said Nicole Maestas, the John D. MacArthur Professor of Economics and Health Care Policy. “And it’s even true that younger people are less healthy than the midlife people were when they were younger.”
The study, published in June in the Proceedings of the National Academy of Sciences, employed a tool developed in the Netherlands to assess disability claims. The Dutch tool measures 97 job-related functional abilities, providing a far more granular picture of American workers than the U.S. government’s disability measure, which considers six domains.
“We haven’t seen a detailed portrait like this of the American workforce,” Maestas said. “It’s not that we’re measuring it better, it’s that we’re measuring it for the first time.”
“We haven’t seen a detailed portrait like this of the American workforce.”
Nicole Maestas
The study of 3,396 working adults age 22 and older found that three-fourths faced at least one functional limitation. It also indicated that U.S. workers average more than five functional limitations each. The most prevalent limitations involve upper-body strength and torso range of motion. Also common are limitations related to sensitivity to the ambient environment — hot weather, for example — and to knee function. Other limitations include problems linked to the immune system, head and neck movements, emotional regulation, and cognition.
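As an illustration only, assuming a simple 0/1 response matrix over the 97 abilities (fabricated here, not the study’s data), the headline figures above could be computed like this:

# Illustrative sketch only: one row per worker, one 0/1 column per possible limitation.
import numpy as np

def summarize_limitations(limits: np.ndarray) -> dict:
    per_worker = limits.sum(axis=1)  # number of limitations each worker reports
    return {
        "share_with_any": float((per_worker >= 1).mean()),  # the study reports about 0.75
        "mean_per_worker": float(per_worker.mean()),        # the study reports more than 5
    }

# Tiny fabricated example: 4 workers, 97 possible limitations.
rng = np.random.default_rng(0)
demo = (rng.random((4, 97)) < 0.06).astype(int)
print(summarize_limitations(demo))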
The researchers also asked workers about underlying medical issues. The conditions that cause the greatest number of functional limitations are mental illness, joint conditions such as arthritis, substance use disorder, and asthma and chronic obstructive pulmonary disease.
The data was collected in 2019. The National Institute on Aging grant supporting the project has been canceled, but Maestas said that researchers managed to collect additional data early this year for a follow-up study that she hopes will identify targets for intervention.
While the employment of people with functional limitations is a success of the U.S. labor market, the new paper highlights the vulnerability of the workforce and, by extension, the national economy, Maestas said. The highest levels of functional limitations were seen in jobs that involve constant physical labor, as well as clerical, service, and sales positions. Many of these roles are essential. The upshot is a workforce less equipped for the impact of a pandemic or some other major disruption.
“The fact that so many people with functional limitations are working is a success,” Maestas said. “It also reveals points of vulnerability when you’re thinking about the country’s broader economic performance. The backdrop of this study is the fact that the U.S. population is aging at its most rapid clip ever. We knew this was coming but you have more people retiring than are coming into the workforce. We need workers in order to keep our economy growing.”
Slavery researchers seek more detailed picture of pre-Civil War Harvard
Gabriel Raeburn and Christine Bachman-Sanders review documents.
Photo courtesy of Claire Vail at American Ancestors
Jacob Sweet
Harvard Staff Writer
9 min read
Careful effort to identify leaders, faculty, and staff is key to descendants probe: ‘This work takes time to do well’
In their efforts to trace the descendants of enslaved people connected to Harvard, researchers with American Ancestors first had to tackle a surprisingly difficult question: Who were the University’s pre-Civil War leaders, faculty, and staff?
Now, a once-scattered record is steadily coming into focus.
The work started soon after the University accepted the recommendation of the Presidential Committee on Harvard & the Legacy of Slavery to identify, engage, and support direct descendants of slavery. In their report, released in 2022, the committee identified several Harvard leaders, faculty, and staff who enslaved people. Among them were philanthropist Benjamin Bussey, who built his wealth through the trans-Atlantic trade in goods produced by enslaved people and later donated his estate to Harvard College, and steward Andrew Bordman, who owned and relied on eight enslaved people to feed Harvard students and complete his job duties.
Efforts to identify pre-Civil War leaders, faculty, and staff have been underway since 2023, in parallel with research to identify direct descendants of enslaved individuals. Researchers with the Harvard Slavery Remembrance Program led this aspect of the work, while American Ancestors advanced the direct descendant research. In January, American Ancestors also took the lead on the research to identify Harvard officials.
“At first glance, it seems like a straightforward task to ‘identify leadership, faculty, and staff,’” said Lindsay Fulton, chief research officer at American Ancestors. “But that’s a modern perspective that’s shaped by access to yearbooks, alumni directories, and carefully maintained records. Those tools didn’t always exist, so our researchers had to get creative in locating where, and how, these names were documented. In our experience, this work takes time to do well.”
“Those tools didn’t always exist, so our researchers had to get creative in locating where, and how, these names were documented. In our experience, this work takes time to do well.”
Lindsay Fulton
For well over 200 years — from Harvard’s founding in 1636 to the end of the Civil War in 1865 — the University operated while slavery was legal in at least parts of the U.S. Even after 1783, when slavery was effectively banned in Massachusetts, leaders, faculty, and staff could still come into ownership of enslaved people, often referred to as servants, through relatives, or could run businesses closely tied to the labor of enslaved people.
While positions like the University’s president and treasurer are easy to trace through hundreds of years of history, others are not.
Gabriel Raeburn reviews documents with fellow American Ancestors researcher Christine Bachman-Sanders.
Photo courtesy of Claire Vail at American Ancestors
“The University doesn’t have its own compiled digital staff directory before the modern era,” said Gabriel Raeburn, senior research project manager at American Ancestors. “The first step, even for finding enslaved people, was to go through thousands and thousands of pages of dense archival records in 17th- and 18th-century cursive to work out who the people are who worked at the University.”
To identify leaders, faculty, and staff, researchers continue to comb through handwritten notes from University meetings, as well as stewards’ books, faculty records, colonial and state legislative charters, church rosters, city archives, and a variety of other sources to recreate a roster from the ground up. Through this work, researchers at the Harvard Slavery Remembrance Program and American Ancestors have verified more than 3,000 members of leadership, faculty, and staff from this period.
A foundation for deeper knowledge
Figuring out who worked at pre-Civil War Harvard often begins with fragments of information: a brief mention in centuries-old meeting notes, a class registry. For Harvard’s early history, identifying the makeup of the leadership, staff, and faculty requires a deep understanding of the University’s connections to colonial government and local church leadership, and knowing where to look. Unearthing this information relies on proven genealogical methods that the researchers at American Ancestors are skilled at applying.
For example, for two centuries, certain members of the colonial government and ministers of local towns and cities were automatically granted positions on Harvard’s Board of Overseers. Therefore, researchers looked to legislative acts and Harvard’s colonial charters to see which roles were automatically granted leadership positions — like the Congregational ministers of Boston, Cambridge, Charlestown, Watertown, Dorchester, and Roxbury — and are using church documents to determine which individuals were on Harvard’s board.
These contextual methods are particularly important to try to bridge gaps in Harvard’s own archives. One such gap owes to a 1764 fire that destroyed much of the University’s collections.
At the bottom of these handwritten notes from a 1737 meeting between president and faculty, six new waiters are identified by last name.
Harvard University, Harvard University Archives, UAI5_5_B08_V12-METS
Additionally, for those without extensive experience in records-based genealogy, the records that the University has in its possession can be difficult to decode. Most are written in script of variable clarity and consistency.
Researchers must also know where to look. For example, in earlier years of the University’s existence, it was during Harvard Corporation meetings that leaders appointed paid staff members — often current or recently graduated students. In most cases, notetakers did not list the new staff members by full name. Instead, most are referred to by last name, and in cases where there are multiple students at Harvard with that same name, by a mark of seniority. Someone with the last name Smith who was appointed as head cook, for example, might be referred to as Smith Jr.
Genealogists at American Ancestors now have a system for categorizing people with the same last name. For certain periods of University history, figuring out which Smith was appointed for a new position means going through records and establishing which Smith was the youngest at the time. In other periods, whether a person was referred to as senior, junior, III, or IV depended on their relative social standing. Both require researchers to peruse contemporary records and identify the proper Smith. At times, distinguishing between family members requires researchers to search through birth and death records held both in Massachusetts and across the country.
Understanding who worked at the University allows researchers to then explore whether these individuals owned enslaved people. It also gives researchers a bird’s-eye view of the interconnected names, families, and communities that shaped Harvard.
“These individuals did not operate in isolation,” said Fulton. “They studied together, taught together, published together, worshipped together, and often their children married one another. Understanding this complex, living network makes our conclusions more comprehensive, more accurate, and more reflective of the institution’s true historical landscape.”
“Understanding this complex, living network makes our conclusions more comprehensive, more accurate, and more reflective of the institution’s true historical landscape.”
Lindsay Fulton
During the period being studied by researchers, the size of the University — both students and staff — increased greatly. Harvard’s first graduating class, in 1642, included just nine students. Throughout the 17th century, there were five years in which no students graduated at all. Precise documentation of staff and faculty was sometimes hard to come by. Over time, the number of Harvard faculty, staff, and students grew and documentation improved. In 1860, Harvard awarded more than 200 degrees across the College, Medical School, and Law School.
As the University expanded, the number of individuals to sort through increased, but documents produced during these times help simplify the process. For instance, entries in the Massachusetts Register, published annually by the state beginning in 1767, recorded each new appointment to the University. Researchers can use the list and verify it with primary sources.
‘Marathon of research’
This work to establish a robust list of Harvard leaders, faculty, and staff is enabling American Ancestors not only to more accurately identify individuals who enslaved people, but also to begin uncovering the names of those who were enslaved — and, ultimately, to trace their living descendants.
The researchers emphasize that the different components of this work continue simultaneously. In addition to identifying former University leaders, faculty, and staff, researchers contributing to the Harvard Slavery Remembrance Program are working to identify those who were enslaved and their living descendants. To date, 964 formerly enslaved people and 591 living descendants of these individuals have been identified.
After pinpointing members of Harvard’s faculty, staff, and leadership, researchers from American Ancestors turn to more historical documents, like tax lists, to identify individuals who enslaved people.
Photo courtesy of Claire Vail at American Ancestors
Records-based genealogy is meticulous, slow work, and its scope can be hard to predict. On TV shows like “Finding Your Roots,” hosted by Alphonse Fletcher University Professor Henry Louis Gates Jr., guests learn about their genealogy in a single episode. In reality, the genealogical work behind each episode of “Finding Your Roots,” which is fact-checked at American Ancestors and focuses on a single person, takes about six months.
“Genealogical research is painstaking work — poring over centuries-old records, tracing forgotten names, and piecing together histories that have often been lost or obscured,” said Gates. “It demands not just patience and rigor, but a passion for discovery. That’s why American Ancestors is the perfect organization to do this work for Harvard. Their deep expertise, meticulous attention to detail, and unwavering commitment to uncovering the stories of our past make them uniquely qualified to take on this vital work.”
“Genealogical research is painstaking work — poring over centuries-old records, tracing forgotten names, and piecing together histories that have often been lost or obscured.”
Henry Louis Gates Jr.
The identification of a clear list of pre-Civil War leaders, faculty, and staff, according to American Ancestors researchers, will lead to a much fuller picture of the University’s ties to slavery — and create a useful foundation for future research and engagement with living direct descendants. Fulton said that the 3,000 individuals they’ve identified as leaders, faculty, and staff far exceeded their initial estimate and gave the group a more accurate — and expansive — view of their work.
“Getting this right is critical — it’s the starting line for what will be a marathon of research,” Fulton said. “And in a marathon, you don’t want to head off in the wrong direction and realize halfway through that you need to double back.”
Is dirty air driving up dementia rates?
Federal funding cuts halt 3 studies exploring how pollution and heat affect the brain and heart
Liz Mineo
Harvard Staff Writer
4 min read
Antonella Zanobetti was conducting groundbreaking research to examine links between exposure to environmental factors, such as pollution and heat, and deadly neurological and cardiovascular diseases. But three of her studies came to a halt with the Trump administration’s mass cancellation of Harvard research grants in May.
Preliminary evidence suggests air pollution harms the brain, said Zanobetti, an environmental epidemiologist and principal research scientist at the T.H. Chan School of Public Health. She had hoped that her studies would raise awareness of potential links between exposure and increased risk of dementia, as well as explore the protective effects of modifiable risk factors such as green space.
“It’s crucial to finish all the work that we are doing,” said Zanobetti, who led a team of researchers in 2020 to conduct the first national study on air pollution’s effect on Alzheimer’s and Parkinson’s. “We need to understand the factors that can impact hospitalization for neurological disorders. The high prevalence of neurodegenerative diseases is a matter of public health.”
Fueled by aging and industrialization, neurological disorders are surging around the country and the world. Alzheimer’s disease is the sixth leading cause of death in the U.S., and the death rates for Parkinson’s are rising fast. The number of people globally with Parkinson’s is projected to reach more than 12 million by 2040.
“It’s important to understand the role of environmental exposures on neurological disorders to help develop public health policies.”
For one of Zanobetti’s halted studies, her team was analyzing Medicare and Medicaid claims to estimate how long-term exposure to air pollution may increase hospitalizations for Alzheimer’s and related dementias. “We wanted to assess whether air pollution exposure increases risk of mortality and/or hastens rehospitalization,” she said.
Collecting the data was challenging because when patients with Alzheimer’s or Parkinson’s are hospitalized, their neurodegenerative disease is often not the main reason. “It could be a stroke or a fall,” said Zanobetti. “We were in the middle of developing methods to overcome statistical challenges, including outcome misclassification, in addition to studying the impact of heat on hospitalizations.”
Another study, co-led by Danielle Braun, examining the effect of heat and other environmental exposures on hospitalizations for Parkinson’s was supposed to have two more years of funding when it was canceled.
“We were in the middle of looking at the effects of high temperature and other air pollutants on Parkinson’s hospitalization,” said Zanobetti. “We wanted to estimate the chronic and acute effects of multiple environmental exposures to understand the impact of air pollution, heat, or other exposure on Parkinson’s hospitalizations.”
Zanobetti had a third grant terminated. Co-led by Petros Koutrakis, the study was to be the first to provide evidence of the effects of particle radioactivity on heart disease, which is the leading cause of death in the U.S.
Particulate matter, or tiny particles of air pollutants, can be inhaled and reach the lungs, the heart, and the brain, said Zanobetti. Particle radioactivity is caused by radionuclides in the air that attach to ambient fine-particle pollution and, after inhalation, release ionizing radiation inside the body.
The Environmental Protection Agency has used previous research by Zanobetti and her team on particulate matter’s impact on health to lower the National Ambient Air Quality Standards for fine particulate matter in order to reduce health risks linked to air pollution. Last year, her work and that of other T.H. Chan School of Public Health researchers helped establish more rigorous federal regulations on particulate air pollution.
Overall, Zanobetti’s three canceled grants sought to provide scientific evidence of the links between environmental factors and Alzheimer’s, Parkinson’s, and heart disease to inform the development of policies that would improve air quality and protect public health, she said.
“It’s important to understand the role of environmental exposures on neurological disorders to help develop public health policies,” said Zanobetti. “It’s really heartbreaking to see that everything we worked for has been stopped. There is so much to discover, so much to learn, and we cannot do it.”
‘By mid-March, corpses littered the street like newspapers’
long read
Young Ukrainian mother and her toddler left to fend for themselves after husband joins soldiers defending Mariupol
Excerpted from “By the Second Spring: Seven Lives and One Year of the War in Ukraine” by Danielle Leavitt, Ph.D. ’23.
By the end of February, Leonid had begun taking food and supplies to the Ukrainian soldiers at the front lines of Mariupol’s defense. He talked about them constantly — he called them “his guys” — and he worried about them, regaling Maria with how their positions were changing and they weren’t getting the help they needed. He bought carton upon carton of cigarettes and as many jugs of water as he could find, then drove through the shelling to deliver them. He was eager to help, and even as the barrages intensified and Maria said she didn’t want him to go anymore, he still went several more times.
On March 1, Maria and Leonid decided that staying in their apartment for any length of time during the daylight hours was no longer an option. They would shelter in the basement. For the time being, they would still sleep in the apartment — mainly for comfort — but if things got even worse, they’d begin sleeping in the cellar, too. Explosions, shelling, and shock waves were so frequent that darting from the basement to do anything — grab an item from the apartment, get some fresh air, cook food — risked sudden death.
Maria’s older sister, her husband, and their toddler son had also joined Leonid, Maria, and David by the beginning of March, and they stayed in the cellar for 12 hours at a time, trying to keep everyone warm and fed and entertain the two babies. In their courtyard, Leonid broke down the crates and old furniture they found in the basement to build a fire. He melted snow to boil and cooked soup and dried pasta.
Photo by Carolyn Moffat
On March 3, Leonid began preparing his military clothes. He had received some ribbons from those he visited on the front lines — ribbons that suggested a specific group or unit — and she saw him sew them on the chest of his uniform. He was enlisting, and she was watching it happen. Before the full-scale invasion, young men in Ukraine were required to serve 12 to 18 months in the army, but as Russia invaded, Ukraine did away with that policy. The state instead implemented new conscription practices, allowing the government to summon for service any able-bodied man between the ages of 27 and 60, including those without former military experience. Later, Ukraine would lower this age to 25 years. Men would often receive a summons to report to a recruitment center, after which they would be medically examined and sent off for a short stint in training. Early on, many men and women volunteered without a summons, a surge that sustained the army in the first months of the war.
Leonid had completed his compulsory military service in the previous years. Though he was not summoned, seeing the situation deteriorate so rapidly in his hometown compelled him to rejoin the ranks.
On March 5, Leonid drove across town to wish his mother a happy birthday. It confused Maria that he’d risked exposure during an air raid simply to see her, but he insisted on going there in person.
Early the next morning, Leonid gently woke Maria. “We need to say goodbye,” he whispered. Still groggy, she shrugged him off. “Maria, it’s time to say goodbye,” he insisted. He had already been out that morning on a reconnaissance mission. She didn’t understand. “What are you talking about?” She yawned.
“Let’s say goodbye. I need to go.”
“Let’s say goodbye. I need to go.”
She pushed her eyes open, and he looked at her with a seriousness that scared her. He did not look away.
“No, no, Leonid,” she whispered. She would have to talk sense into him, beg him to stay. “No, Lyonya,” she said, using his familiar shortened name. “You can’t leave me,” she pleaded, “David, our life. What goes on there is not for you. Let’s leave together, we can try to get out through the humanitarian corridor, we can go as a family.”
He cast his eyes down. “I have to go, Maria.” Watching him carefully, she knew he was serious — she had never seen him this resolute, as though his face had turned to stone, as though nothing she did, no threats, no pleading, no weeping, could keep him there. He tried to embrace her, and she stiffened, flaring with anger and grief. He turned to walk out into the stairwell.
“I was in a stupor, I just lay there, stuck. I didn’t understand,” Maria said.
Her parents then told her to go chase after him, talk to him. Following him into the stairwell, Maria caught up with him. Leonid was upset. He twitched with agitation and emotion.
“Maybe you’ll at least hug me?” he said, and she did, and the pain sliced through them. Before he could change his mind or she could say anything, he turned and jogged down the stairwell.
Telling me the story several months later, her voice wavered with emotion: “I truly did not think he would go. But I watched him leave.”
The next day, Leonid’s father came to check on them and bring some food.
“Where’s Leonid?” he asked.
Maria realized that Leonid had not told anyone else, not even his parents when he’d gone to see them.
“Where is Leonid?” her father-in-law asked again.
“He went to fight,” Maria said.
With Leonid gone, Maria knew she would need to fortify herself. Despite her stubbornness and resilience, she had come to rely on him in their relationship. Without him, Maria knew she could not expect anyone to help her anymore.
By the time Leonid left, her parents and sister, along with her sister’s husband and young son, were staying with her in the same basement. Their basement was large, and to get to the part of it where they could sit down, where they had built a small encampment, they had to walk through dark tunnels, feeling their way along the cold stone walls. Her mother did not hear well and her father did not walk well, and Maria’s days quickly evolved into the singular pursuit of food, water, and heat. I will do everything now, she told herself constantly, like a mantra. I can do everything now. I will be the strong one. Later that day Leonid’s colleague came and brought Maria a letter from Leonid. It was a short note, but he wrote that everything was OK with him, he was safe and healthy, he was thinking about them, he loved them. She knew he felt guilty for leaving, she could hear it in his note. If he would just come back, she thought, they could have a long talk and sort it all out. But with every passing day, he didn’t. She got very little concrete information from him — only an occasional check-in to say that he was OK and he loved them — and she was furious.
Though they never took off their coats or shoes in case they had to run, the children screamed constantly from cold. Maria and her family tried occupying them in the basement by playing games, telling stories, and rocking them to sleep. But explosions roared outside relentlessly, frightening and waking the children. They could not let the kids watch TV or play on tablets or phones because any battery life they had on their devices was a precious commodity reserved exclusively for communication.
They became dirty quickly, and there was no water to wash themselves. Maria crawled out of the basement a couple of times a day to make a fire in the courtyard and prepare soup with potatoes and canned fish. They also boiled pasta and fried it with tomatoes and onions. Sick to their stomachs with anxiety and constantly cold, Maria and her sister couldn’t bring themselves to eat much. They were both breastfeeding and started to lose their milk supply, which further distressed the children, who batted at their breasts begging for milk that was not coming.
Every day was the same: They were awakened by the sounds of shelling, a distinct metallic whir followed by concussive blasts at impact, then a couple of hours of silence. They waited every moment for it to begin again, wondering if the shelling would be closer this time. When the bombing began once more, she’d go so rigid that the edges of all her body’s muscles would ache. Taking a deep breath, she’d run to David, pick him up, hold him close, sing him songs, and rock him gently, a meditative motion she did as much for her own comfort as for his.
Periodically, at her own risk, she took David to the apartment to run around for 10 minutes or so. “It drove me crazy that I was sitting there in the basement,” she said. “It was so dark, my eyes couldn’t see at all when I came out into the light.”
When a bar of service appeared on her phone, she’d receive a handful of messages — from her sister in Kharkiv, from friends who had already evacuated, from Leonid. He would not say where he was fighting, but she knew he was in the city. Witnessing the daily carnage, he urged her and the family to leave Mariupol as soon as they could.
After he left on March 6, Leonid came back in person three times: once on March 8 to wish her a happy International Women’s Day — a major holiday in former Soviet countries — then on March 11, and finally on March 13. Each time it was, as Maria writes, “for literally one minute,” except the last visit, when he was able to stay for five. He met her outside the basement, hugged her, and ran quickly to the cellar to see David, swooping in, picking David up, and hugging him tight, trying to make him laugh. The last time Leonid came, he ran to the basement, where David was sleeping, and laid his face near his son’s for a moment.
The last time Leonid came, he ran to the basement, where David was sleeping, and laid his face near his son’s for a moment.
What kind of conversation can two people have in one minute? She told him that she had been making a fire in the courtyard, what they were eating, if they’d had any news from her sister. He told her to leave the city immediately, as soon as they could arrange an evacuation vehicle. He’d meet them wherever they went as soon as it was over. As he shifted to leave, they hugged, and she looked away so that she didn’t fall apart and cling to his clothes, begging him to stay like a woman possessed. Then he ran off.
“I didn’t know what he had become,” she wrote later. “I didn’t understand at all. I didn’t understand the essence of the disaster.”
Because the city was constantly, indiscriminately shelled, leaving it posed enormous risk. People who tried to escape were killed every day, hit by shells or shrapnel or snipers. At checkpoints, Russian soldiers often forced evacuees to undress and examined their tattoos. They confiscated phones and searched texts, emails, and photos for any indication of Ukrainian patriotism. Maria was 23 years old, and small. She worried that at a checkpoint she would have no capacity to defend herself against rape, assault, or abduction, especially because she would travel with her parents, both of whom were in poor health and could do little to protect her. They decided they’d wait a few more days to see if things calmed down. “How long could this unending bombardment possibly continue?” she wondered. But Leonid insisted that they must get out — that things would never return to normal, that there was no life left to be had in Mariupol. By then the police force in Mariupol had collapsed, and the next day, the Mariupol Drama Theater was bombed. A thousand civilians had sheltered underneath the building and several hundred were killed.
By mid-March, corpses littered the street like newspapers, victims of violence, hunger, or untreated infections. People were scared to look too closely. What if you recognized them? Eventually the Russian troops occupying certain parts of the city began collecting the bodies in trucks and depositing them in the city square.
Maria occasionally returned to the apartment to retrieve toys or secure the windows and doors, trying to keep it pristine. She still held out hope that eventually they’d return to that apartment and resume their life. From there, she caught broader views of the city. “I had a view from the window, I saw absolutely everything,” she wrote. “The whole city was burning.” She could see, in the distance, one of the large steel factories in town, Azovstal, glowing. Smoke rose in a continuous black cloud over the horizon. At night, the sky glowed pink, and buildings crackled in flames or smoldered, collapsing piece by piece.
By March 19, Maria decided they needed to leave. They had no more candles or matches. “We were just walking by inertia, in the darkness. I was trying to feel my way to the doors to get out of the basement.” She and her sister gathered their possessions in the apartment, letting the children get a better sleep in the beds a final time before departing in the morning. Through the middle of the night, Maria and her sister pumped breast milk for the journey to ensure that they would not need to lift their shirts and could calm the babies with bottles in a pinch. As they pumped in silence, they heard a whistle and planes roaring overhead. Somewhere near them an air assault was underway, and when the bomb dropped, they felt their building sway, the furniture sliding across the floor.
For most an evacuation ride was extremely difficult to secure. Though drivers came with their cars and buses from throughout Ukraine to help in the effort, the route was dangerous, and drivers began charging high prices — several hundred dollars — for rides just beyond the city limits. Maria and Leonid’s car had been damaged by a shell, so it was not reliable, but Leonid’s father agreed to take them. Only part of their group — Maria, David, and Maria’s dad — could fit on the first trip; the others, Maria’s sister, nephew, brother-in-law, and mom, would need to wait until Leonid’s father got back and was ready for another trip. It would be, they hoped, just a day or two. With a white ribbon tied to the car to indicate they were civilians, they inched through the city toward the checkpoint. They lived on the outskirts, and it was a short drive to the edge of the city. “As we pulled out onto the main street, I saw that every house was burned down. There were tanks lying around on the roads, buses overturned, people were digging graves at every step — every step, wherever there was a free spot.” Their city was gone, replaced with ghosts. She went on: “Where there had been trees, or in the fields, where there used to be just gardens, now bodies are just lying there. And people walk, people walk on them.”
Their city was gone, replaced with ghosts.
They crossed through 15 checkpoints to leave the city. Russian soldiers rifled through her bags, patted her body, looked at her son. They made men strip naked and stole food and belongings. After hours of waiting, Maria’s party crossed the city limits into a village on their way to Zaporizhzhya, the closest major city under Ukrainian control, 120 miles away. She had never been to Zaporizhzhya. In fact, she had never been much of anywhere at all. Except for a few short trips to neighboring cities and one to Kyiv, she’d spent her whole life in the city behind her, just like her parents, just like her grandmother Vera before her.
“She is nearby,” Maria said, “I know this for certain.”
Harvard aligns resources for combating bias, harassment
Peggy Newell (left) and Nicole Merhill.
Harvard file photos
Nicole Rura
Harvard Correspondent
8 min read
Office for Community Support, Non-Discrimination, Rights and Responsibilities targets discrimination, bullying, sexual harassment, and other misconduct
Harvard on Monday announced the establishment of the new Office for Community Support, Non-Discrimination, Rights and Responsibilities (CSNDR), a move that aligns resources, supports, and policy implementation previously housed across the Office for Community Conduct (OCC) and the Office for Gender Equity (OGE).
Nicole Merhill, the director of CSNDR and the University’s Title IX coordinator, and Peggy Newell, vice president and deputy to the president, spoke with the Gazette about this new alignment of resources and supports available to all members of the community, the laws and policies the new office upholds, and the shared responsibility for creating a safer and more inclusive community.
What is the Office for Community Support, Non-Discrimination, Rights and Responsibilities (CSNDR)?
Newell: This new office brings together all of the important work happening under the Office for Community Conduct and the Office for Gender Equity and continues it in one place, with the aim of making it easier for members of our community to know what resources and supports are available to them and where they can go in order to access them.
Merhill: Under the newly formed CSNDR umbrella, we will have further aligned these resources and supports — the confidential SHARE team, the prevention team, and the NDAB [Non-Discrimination and Anti-Bullying] and Title IX compliance team. Both the prevention team and the compliance team have expanded their portfolios to cover Title IX, other sexual misconduct, non-discrimination, and anti-bullying. The SHARE team remains dedicated to serving community members who may have experienced sexual harassment, sexual assault, stalking, abusive relationships, or discrimination on the basis of gender or sexual orientation.
The CSNDR office works to provide accessible information on discrimination, including antisemitism and Islamophobia, sexual harassment, other sexual misconduct, and bullying. That work is grounded in the commitment to ensuring that every member of our community has the opportunity to learn, conduct research, and work in an environment free from discrimination, harassment, and other forms of harm. Before this merger, OCC focused on implementing the University’s policies and procedures for non-discrimination and anti-bullying, while the Title IX team within OGE focused on implementing the University’s policies and procedures addressing sexual harassment and other sexual misconduct.
Newell: Nicole came to Harvard from the federal agency that oversees both Title IX and Title VI as well as other federal civil rights laws. During her nearly 10 years here as the director of OGE and as the University Title IX coordinator, she has built strong relationships across Harvard’s Schools and in our community. We’re very fortunate to have her — a civil rights attorney who knows both what is required by these regulations and how to navigate Harvard systems to increase access to support — leading this new CSNDR office.
Title VI
Prohibits discrimination on the basis of race, color, and national origin in programs and activities receiving federal financial assistance.
Title IX
Prohibits sex-based discrimination in education programs and activities that receive federal financial assistance.
Why were OCC and OGE combined?
Newell: We believe the new structure will improve access to the supports and resources available to members of our community, make clearer the expectations built into our policies, and strengthen our ability to respond appropriately to policy violations when they happen.
Merhill: Yes, OCC and the Title IX team within OGE had parallel missions — to provide information about their respective policies and procedures, support community members regarding those policies, review concerns under the policies (including examining systemic impact), and handle formal complaints, informal resolutions, appeals, and hearings under those policies.
Newell: We recognized that our community was confused by different offices handling concerns that touched on issues of discrimination. Now, the NDAB and Title IX compliance team within CSNDR can support individuals in response to issues of discrimination, bullying, sexual harassment and other sexual misconduct, which is more convenient and efficient, and responsive to what we have heard from community members. Also, many of the School-based staff who serve as local designated resources for non-discrimination and anti-bullying also serve as local Title IX resource coordinators. With all of those considerations in mind, combining OCC and the Title IX team within OGE into one compliance team under CSNDR is a better way to serve our community.
What can community members expect from this change?
Merhill: All previous resources, including the good work of the prevention team and our confidential SHARE team, will continue.
The prevention team’s mission will expand to look at how we strengthen capacity across our community to combat forms of harm broadly, whether it’s in the realm of discrimination based on a protected class, sexual harassment, or bullying. It’s a nice alignment, because our prevention team would often lead bystander training and be asked to incorporate content on race-based or other protected-class discrimination.
The invaluable SHARE team remains dedicated to providing individual and community-level support to those who may have experienced sexual harassment, sexual assault, stalking, abusive relationships, or discrimination on the basis of gender or sexual orientation. Additionally, the SHARE team will continue to offer confidential accountability support for individuals and communities who may have caused harm. These critically supportive resources have not changed.
And the new NDAB and Title IX compliance team allows us to be more efficient by being able to address policy-related issues in one space, under the University’s policies addressing non-discrimination, anti-bullying, sexual harassment, and other sexual misconduct, without a potential hurdle of separate or duplicate outreach and engagement that could emerge in the previous structure.
In addition to bringing together existing staff from OCC and the Title IX team within OGE, over the summer we hired a new staff member who serves as the University’s Title VI coordinator and deputy for compliance. We are also in the process of hiring two additional staff members — a deputy for Title VI and Title IX compliance, who will support our network of local Title IX resource coordinators and local designated resources and serve as a facilitator of informal resolutions, and a deputy Title VI coordinator and case manager, who will consult on complaints of discrimination, including all complaints of antisemitism. Each of these roles will bring additional support and expertise to the NDAB and Title IX compliance team.
You mentioned that you expanded the resources in the compliance team. Can you tell us more about those changes?
Merhill: On the compliance side, as I mentioned earlier, over the summer we filled a new position, the Title VI coordinator and deputy for compliance, who oversees the formal complaint side of the work and who has already been working with our Schools and community members on these issues. Our newest staff member has extensive experience addressing concerns of Title VI and Title IX discrimination at the federal level, including investigating and resolving concerns of sexual harassment, racial harassment, and discrimination on the basis of shared ancestry, including antisemitism, Islamophobia, and other forms of harm.
When we were assessing the new NDAB and Title IX compliance team’s needs, we also heard from the community a desire for the processes related to reviewing and responding to complaints to proceed more quickly. Based on that feedback, we created and are actively recruiting two new positions to provide additional support for the community and make those processes more efficient: a deputy for Title VI and Title IX compliance and a deputy Title VI coordinator and case manager.
CSNDR is responsible for providing essential trainings on non-discrimination, sexual harassment, and other misconduct. Do you anticipate any changes to training that the University offers?
Merhill: Today we rolled out an eLearning module to all incoming and returning students across the University. All students are required to complete the module in order to enroll in courses at Harvard.
On Sept. 8, the module will be assigned to all staff, faculty, and postdoctoral fellows. This module will be substantially similar to the module provided to students, but it also includes information for faculty, staff, and postdoctoral fellows on their role as “responsible employees” for matters under the University’s Title IX and other sexual misconduct policies.
The course takes about an hour to complete.
Newell: We really appreciate our community members taking the time, and we welcome everyone’s feedback as this is our first iteration where we combine non-discrimination and harassment, including antisemitism and Islamophobia, sexual harassment, and other sexual misconduct into one module.
Merhill: It’s an hour of time spent on issues that are exceedingly important for all of us to recognize, understand, and actively address. The module includes information on our community expectations as reflected in our policies, our own individual responsibilities in our work to meet those expectations, what the University’s responsibilities are, and what resources are available if someone encounters one of these concerns.
Where can Harvard community members learn more about the resources CSNDR provides?
Merhill: In addition to the information in the eLearning initiative, we rolled out a new website at csndr.harvard.edu today. We encourage everyone to visit the website and also provide feedback. The website is organized according to each team’s services and resources, and you are invited to visit each of the teams to learn more about their work in each of those spaces. We look forward to continuing and deepening our work in this important space.
Harvard appoints Rabbi Getzel Davis as inaugural director of interfaith engagement
Rabbi Getzel Davis.
Niles Singer/Harvard Staff Photographer
Jacob Sweet
Harvard Staff Writer
7 min read
Presidential initiative will promote religious literacy and dialogue across faith and non-faith traditions
Among Harvard’s chaplains, Rabbi Getzel Davis has long been known as a bridge builder. From his internship at Harvard Hillel in 2012 to his service as a member of the executive committee of Harvard chaplains, Davis has created lasting relationships across religious, spiritual, and ethical organizations on campus.
Davis will now join the University staff as inaugural director of interfaith engagement, where he will lead programs to foster respect for diverse identities, build relationships among communities, and encourage cooperation for the common good. He sees the post as a natural continuation of his tenure at Harvard.
“I spent 12 years as a Harvard chaplain, and I learned a lot about all these other communities,” Davis said. “Not only did I build deeper relationships with them and run programming together, but I learned a lot about what they were struggling with and was often surprised that, in fact, we had a lot in common.”
In the new role, part of a presidential initiative on interfaith engagement, Davis will oversee projects that promote religious literacy and meaningful dialogue across diverse faith and non-faith traditions, and collaborate with University offices to advocate for the needs of religious and spiritual communities.
“Creating a community in which every person at Harvard can thrive means expanding opportunities for individuals to know, understand, and appreciate one another,” said President Alan M. Garber. “Rabbi Davis is a good listener and a great collaborator. His capacities for curiosity and compassion will shape our efforts to ensure that Harvard is a place where people can be themselves, express their views, and pursue their dreams both individually and collectively.”
Imam Khalil Abdur-Rashid (left), Harvard’s Muslim chaplain, called the appointment of Davis to his new role “a win for Harvard, a win for the chaplains, and a win for our students.”
File photo by Veasey Conway/Harvard Staff Photographer
Davis brings with him deep relationships with many of Harvard’s chaplains, including Imam Khalil Abdur-Rashid, Harvard’s Muslim chaplain, who expressed excitement about Davis’ appointment and the new role. “To have someone in the Office of the President that is devoted to fostering interfaith programming is innovative, strategic, and forward-looking,” he said. “I think his presence as director of interfaith engagement is a win for Harvard, a win for the chaplains, and a win for our students.”
The work has already begun. In the coming semester, Davis will launch the First-Year Religious Ethical and Spiritual Life Fellowship, a paid 10-session program that helps students develop the skills to navigate complex differences and combat religious prejudice, antisemitism, and Islamophobia. At the end of the program, students will have the opportunity to apply for grants to foster their own interfaith initiatives on campus.
Davis is also collaborating with the office of the College dean of students to provide programming for pre-orientation and orientation to help promote pluralism and mutual understanding.
These new projects will run alongside existing programming, including Interfaith PhotoVoice — an exhibit of photos and stories that reflect student perspectives on religion, ethics, and spirituality — and Pluralism Passports, a series of interfaith events and programs that help Harvard community members learn about religious, ethical, and spiritual communities outside their own. Additional programs, administered by Davis and multifaith engagement fellow Abby McElroy, will begin throughout the academic year.
Other chaplains joined Abdur-Rashid in praising Davis as the right leader for the role.
“Getzel is a leader of deep humanity who has already spent years working hard to build closer, more mutually respectful relationships at Harvard, between religious groups that would undoubtedly have been more at odds with one another if not for his presence,” said Harvard Humanist Chaplain Greg Epstein. “In my particular case, I can say he has also been a wonderful champion of friendship and understanding between religious and nonreligious communities.”
Tammy McLeod, president of the Harvard Chaplains and a staff member of the interdenominational Christian organization Cru, also spoke to Davis’s ability to lead across difference. “Within the Harvard Chaplains, he has been a dedicated advocate for cultivating genuine relationships across diverse belief systems,” said McLeod. “Warm, personable, and deeply committed to life’s enduring questions, Getzel brings a unique presence to Harvard’s spiritual and ethical landscape. Students will find great value in engaging with him. His new position is not only timely — it is vital.”
Rabbi Jason Rubenstein, executive director of Harvard Hillel, echoed that sentiment.
“Of the many people I have worked with and observed in higher education, none is a better exemplar of assiduously cultivating relationships with colleagues across difference. … I cannot imagine a better fit, or more urgent work, than his new role of stitching together the different strands of Harvard’s communal tapestry into a more unified, humane, and interconnected whole.”
Davis lives in Cambridge with his wife, Leah Rosenberg, a physician at Massachusetts General Hospital and an assistant professor of medicine at Harvard Medical School, and their three children. At Brandeis University, he majored in Near Eastern and Judaic studies, with a minor in comparative religion, before attending Hebrew College, a pluralistic rabbinical school in Newton. He first joined Harvard Hillel as an intern, advising the Reform and Conservative minyans on campus. In 2015, he became Harvard Hillel’s director of graduate programming and chair of University Programs for Harvard Chaplains.
In the latter role, he aimed to strengthen relationships among more than 40 chaplains from more than 30 religious and ethical traditions. Davis recalls meeting in a different chaplaincy every month, giving different groups opportunities to share their triumphs and struggles.
Aside from formal programming, Davis and other chaplains hosted meals open to students to discuss essential questions of faith, meaning, and collaboration on campus. He also changed the format of chaplain meetings to build time for one-on-one conversations and in-person gatherings.
“I find a lot of the way I encounter the sacred is to be in relationship with other people,” said Davis, who became campus rabbi in 2023. “And some of that has been by developing deep and trusting relationships with the other chaplains.”
The deep bonds with other religious leaders, including Abdur-Rashid, led to joint events between Harvard Hillel and other groups like the Harvard Islamic Society. Davis cited the “Sukkat Salaam” dinner as one of many successful collaborations — an event that celebrated the start of the Jewish holiday Sukkot and the close of Ramadan, the Islamic month of fasting.
The relationship between Davis and Abdur-Rashid proved valuable following the events of Oct. 7, 2023, as Jewish and Muslim students navigated complex emotional and community responses to the attack on Israel and the Gaza war.
In December 2023, the two held their first of three vigils, praying together for peace for all those affected by the conflict. “They felt very important, symbolically, to be done on campus,” Davis said. “It felt like a very big deal.”
This experience of bringing communities together during a particularly challenging time reinforced Davis’ belief in a more structured approach to interfaith work on campus. After leaving Hillel in March 2025 to regroup and spend time with his family, Davis continued thinking about the connections he had formed with other chaplains, imagining a new role that would allow him to establish programming for an even wider and more diverse community.
“That time of reflection gave me the clarity to see that the bridge-building work we did at Hillel was precisely what the entire campus needed,” Davis said. “I used that period to meet with chaplains, administrators, and students to develop a concrete vision for how Harvard could foster true pluralism. This collective vision is what the University has now entrusted me to advance.”
After more than a decade at the University, Davis is thrilled to be stepping into the inaugural role and an initiative that he expects to grow in years to come. “This new role feels like the culmination of my entire career here,” he said. “I am honored and energized to answer this call to serve the whole Harvard community.”
From tragedy to ‘Ecstasy’
Ivy Pochoda’s feminist retelling of ‘The Bacchae’ examines freedom from inhibition with Electronic Dance Music beat
Anna Lamb
Harvard Staff Writer
5 min read
King Pentheus of Thebes and his mother, Agave, become targets of the god Dionysus’ wrath for rejecting his sybaritic cult in the ancient Greek tragedy “The Bacchae.”
In “Ecstasy,” Ivy Pochoda’s new feminist retelling, Dionysus is an international DJ with a cult following in the Electronic Dance Music, or EDM, and rave scene. Pentheus and Agave become Drew and his mother, Lena — heir and widow to a deceased hotel magnate opening a new luxury resort on a Greek island.
It’s a bloody story in the old and new, rife with decadence and depravity — one with timeless appeal judging from the multitude of stagings and adaptations over the centuries.
For Pochoda, the new project additionally marks a return to an early love — and an earlier self.
“I did Latin and Greek in middle and high school, and I was really good at it,” said Pochoda, a 1998 graduate in classics and literature. “And one of the reasons I wanted to go to Harvard was because of their classics department.”
Raised in Brooklyn, Pochoda attended high school at St. Ann’s — a private school with no grades, no set curriculum, and a philosophy of being “systematically asystematic.” One year, her teachers led the class through a translation of Ovid’s “Metamorphoses.” During another, they spent the entire year translating Euripides.
“I spent my senior year in high school translating ‘The Bacchae,’” Pochoda said. “We did it start to finish, and it was really a cool experience for a 17-year-old to get that immersed in a text. And it was never really far from my brain.”
But in College, Pochoda said, it was hard to immerse herself in ancient stories in the same way.
“I found out in College that being interested in classics and being interested in mythology are not the same thing,” she said. “When I was in high school, it sort of was — we were able to overlap.”
Pochoda said it seemed to her that having a concentration in classics meant translating — like, all the time.
She wanted to spend more time discussing meaning and themes, the part of ancient storytelling that brought her joy. That’s why, halfway through her undergraduate study, Pochoda decided she would switch concentrations.
“And there was this concentration called classics and secondary fields, which was not meant to be combined with English. But I did it, and I combined it through the study of dramatic literature, which brought me back to ‘The Bacchae’ and plays that I love.”
“It took me back to where I started from, which is being academic, but also creative, and applying that academia to performance and to things that are just a little off the beaten path.”
To fulfill the unusual requirements created by combining literature and classics, Pochoda began taking classes at the American Repertory Theater, alongside creative writing courses. She reminisces fondly about her classes with Professor Emeritus Robert Brustein and associate Robert Scanlan.
“It took me back to where I started from, which is being academic, but also creative, and applying that academia to performance and to things that are just a little off the beaten path,” she said. “Being an undergraduate and taking classes with students in the arts and working with the art professors and actually thinking about why I was studying Greek and why I was studying English literature through a dramatic focus, was a really interesting tunnel.”
The setting of “Ecstasy” is far from the ivy-covered buildings of Cambridge, or even the metropolis of Los Angeles, where she now lives with her 10-year-old daughter, but Pochoda said there is real-life inspiration at play.
“Ecstasy” is set largely on the island of Naxos — a destination to which she took a trip in 2018 when working on the “Epoca” series with Kobe Bryant.
In addition to her real-life island retreat, Pochoda has also dabbled in the world of EDM. In her previous life as captain of the women’s squash team at Harvard, followed by nine years playing professionally in Europe, Pochoda went out her fair share.
“I’m not some super hardcore EDM person, but I do know about it. I mean, I’ve been to some raves and parties, which was a problem for me academically,” she said, laughing. “I will talk about it openly,” she added.
As for the decision to transpose this culture onto that of the ancient Greek god known for his love of wine and sex and revelry, Pochoda said that was easy.
“When I was thinking about what’s going on in that play, those women are raving for all intents and purposes.”
“When I was thinking about what’s going on in that play, those women are raving for all intents and purposes,” she said. “In the early EDM, early trance parties, early underground music, there was a lot of suspicion of what was going on and a lot of worry that the music was making you crazy and the drugs were making you crazy. So in the book, I try to use the idea of a beat, or beats, and the build-ups of EDM.”
But to be clear, Pochoda said, this is not quite a cautionary tale.
“The main characters, they want to go to the beach and party their faces off and reconnect with their youthful exuberance and the permissiveness of youth — the permissiveness of women being allowed to do what they want to do without men telling them what they want to do, what they can’t do,” she said. “But there is a dark side to that.”
Getting to the root of teen distracted driving
7 in 10 young people use cellphones while behind the wheel, finds a new study that also takes a look at why
Every year, hundreds of people die in automobile accidents involving distracted teen drivers. A new study zeroes in on one of the most common forms of distraction, cellphone use, exploring how often young people engage in the risky behavior and why.
A team of public health researchers led by Rebecca Robbins, an assistant professor at Harvard Medical School and a scientist at Brigham and Women’s Hospital, surveyed teens across the country to find out the ways in which they use their phones while driving and how that behavior might be curbed.
They found that seven in 10 high school students reported using their phones or making long glances toward them while driving — many lasting two seconds or longer — for about 20 percent of each trip.
“That’s a huge proportion — putting themselves and the traveling public around them at risk,” said Robbins.
The time it would take to read or send a text message, activate maps, or check social media, she added, is associated with a 5.5 times greater likelihood of a crash.
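To make the two-second figure concrete, here is a quick back-of-envelope calculation, not drawn from the study, of how much roadway a car covers while a driver’s eyes are on a phone; the speeds used below are illustrative assumptions, and the only inputs are the standard mph-to-feet-per-second conversion and the glance length.

```python
# Back-of-envelope illustration (not from the study): roadway covered while a
# driver's eyes are off the road for a two-second glance at a phone.
FEET_PER_SEC_PER_MPH = 5280 / 3600  # 1 mph = about 1.467 feet per second

def distance_during_glance(speed_mph: float, glance_seconds: float = 2.0) -> float:
    """Return feet traveled during a glance of the given length."""
    return speed_mph * FEET_PER_SEC_PER_MPH * glance_seconds

for mph in (25, 45, 65):  # illustrative city, arterial, and highway speeds
    print(f"{mph} mph: about {distance_during_glance(mph):.0f} feet in 2 seconds")
# Prints roughly 73, 132, and 191 feet, respectively.
```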
Most teens in the study said they believed their peers engaged in distracted driving. Robbins said there is a strong association between teens’ beliefs about what their peers are doing and their own behavior, so many think it’s normal to check their phones while driving, despite the risks.
“Young people harbor beliefs that looking at their phone offers benefits.”
Rebecca Robbins
“Young people harbor beliefs that looking at their phone offers benefits,” she said. “It allows them to be entertained. It allows them to get where they’re going. That is what we call a maladaptive belief that would need to be corrected with behavioral intervention.”
Among participants who reported using their phones while driving, the most common reasons were entertainment (65 percent), followed by texting (40 percent) and navigation (30 percent).
Yet Robbins emphasized that three in 10 respondents reported practicing focused driving.
“Young people had bright spots around them, role models who were practicing safe driving behaviors such as avoiding phone use while driving, and that was inversely associated with reports of young people distracted driving themselves,” she said.
Additionally, Robbins said, teens’ attitudes toward their own ability to make educated choices played a role.
“We also found a significant association between self-efficacy and distracted driving, such that stronger self-efficacy beliefs, or beliefs that they could avoid distracted driving, avoid the temptation, put their phone in the backseat, turn on ‘Do Not Disturb’ mode, any number of those in the constellation of safe driving practices, were inversely associated with distracted driving,” she said.
Robbins said information gleaned through the study could be used to craft public health messaging campaigns and behavioral interventions like those that have promoted seat belt use. “This research suggested a number of promising avenues for future research, such as a campaign that would emphasize the benefits of using ‘Do Not Disturb’ mode and empowering young people to turn that mode on, or have it automatically turn on, while they’re driving.”
A popular TV show, cathartic commute, and dance that requires teamwork
Education lecturer finds leadership lessons in unlikely places
Uche Amaechi is the chair of the Leading Change Foundations and a lecturer on leadership at the Graduate School of Education.
TV show
“Severance” on Apple TV+
“Severance” is a great story and it’s great storytelling, and I highly recommend it. So many rich conversations about organizations and leadership can come out of it: It gets into the idea of multiple versions of yourself, and which versions may come to the fore in different contexts. It asks the question: Why would a company want its employees to be severed? Is it about risk management? Is it about control? Is it about blind allegiance to the mission?
Escape
Cycling
I’ve always been a cycling commuter, but I didn’t start cycling for fun until COVID. The reason I recommend cycling is that you get emotional and mental benefits as well as physical benefits. Mentally, cycling gives me time to really think things through, to work through what’s in my head. Emotionally, it’s a great way to release stress I didn’t even know I had; it’s very cathartic. And it’s a great way to get to know what’s in your neighborhood: You become more aware of your surroundings.
In leadership, we always talk about the importance of work-life balance. You can’t be a good leader if you’re not taking care of yourself. Cycling is a great way to get time on your own to focus, clear your mind, and find your center so you can be a better leader and a better team member.
Dance
Argentine tango
In my line of work, we often encourage leaders to be on the balcony as opposed to on the dance floor: When you’re on the dance floor, you’re part of the system, but when you’re on the balcony you’ve removed yourself from the system a little bit so you can make decisions for the benefit of the system. That’s important for strong, empathetic leadership. But there’s a lot of value to being on the dance floor.
I dance — and teach — Argentine tango. It’s you, your partner, and the music. The leader has to pay attention to how their partner is interpreting their lead, and they have to adjust in real time. The follower has to interpret what the leader is asking them to do. In a way, the leader has to know how to follow, and the follower has to know how to lead. Both people have to really pay attention to each other, and to the rest of the dance floor: What are the other couples doing? It’s personal, it’s interpersonal, and it’s systems-thinking.
— As told to Sy Boles/Harvard Staff Writer
Will your job survive AI?
Expert on future of work says it’s a little early for dire predictions, but there are signs significant change may be coming
In recent weeks, several prominent executives at big employers such as Ford and J.P. Morgan Chase have been offering predictions that AI will result in large white-collar job losses.
Some tech leaders, including those at Amazon, OpenAI, and Meta, have acknowledged that the latest wave of AI, called agentic AI, is much closer to radically transforming the workplace than even they had previously anticipated.
Dario Amodei, chief executive of AI firm Anthropic, said nearly half of all entry-level white-collar jobs in tech, finance, law, and consulting could be replaced or eliminated by AI.
Christopher Stanton, Marvin Bower Associate Professor of Business Administration at Harvard Business School, studies AI in the workplace and teaches an MBA course, “Managing the Future of Work.” In this edited conversation, Stanton explains why the latest generation of AI is evolving so rapidly and how it may shake up white-collar work.
Several top executives are now predicting AI will eliminate large numbers of white-collar jobs far sooner than previously expected. Does that sound accurate?
I think it’s too early to tell. If you’re pessimistic, in the sense that you’re worried about labor market disruption and the depreciation of skills and human capital, and you look at the tasks that white-collar workers do and what we think AI is capable of, the overlap covers about 35 percent of the tasks that we see in labor market data.
“My personal inclination — this is not necessarily based on a deep analytical model — is that policymakers will have a very limited ability to do anything here unless it’s through subsidies or tax policy.”
The optimistic case is that if you think a machine can do some tasks but not all, the tasks the machine can automate or do will free up people to concentrate on different aspects of a job. It might be that you would see 20 percent or 30 percent of the tasks that a professor could do being done by AI, but the other 80 percent or 70 percent are things that might be complementary to what an AI might produce. Those are the two extremes.
In practice, it’s probably still too early to tell how this is going to shake out, but we’ve seen at least three or four things that might lead you to suspect that the view that AI is going to have a more disruptive effect on the labor market might be reasonable.
One of those is that computer-science graduates and STEM graduates in general are having more trouble finding jobs today than in the past, which might be consistent with the view that AI is doing a lot of work that, say, software engineers used to do.
If you look at reports out of, say, Y Combinator or other tech-sector-focused places, it looks like a lot of the code for early-stage startups is now being written by AI. Four or five years ago, that wouldn’t have been true at all. So, we are starting to see the uptake of these tools consistent with the narrative from these CEOs. So that’s one piece of it.
The second piece is that even if you don’t necessarily think of displacement, you can potentially think that AI is going to have an impact on wages.
There are two competing ways of thinking about where this is going to go. Some of the early evidence that looks at AI rollouts in contact centers, frontline work, and the like suggests that AI reduces inequality between people by lifting the lower tail of performers.
Some of the best papers on this look at the randomized rollout of conversational AI tools or chatbots in frontline call-center work and show that lower-performing workers, or workers who are at the bottom of the productivity distribution, disproportionately benefit from that AI rollout. If these workers have knowledge gaps, the AIs fill in for the knowledge gaps.
What’s driving the accelerated speed at which this generation of AI is evolving and being used by businesses?
There are a couple of things. I have a paper with some researchers at Microsoft that looks at AI adoption in the workplace and the effects of AI rollout. Our tentative conclusion was that it took a lot of coordination to really see some of the productivity effects of AI, but it had an immediate impact on individual tasks like email.
“Our tentative conclusion was that it took a lot of coordination to really see some of the productivity effects of AI, but it had an immediate impact on individual tasks like email.”
One of the messages in that paper that has not necessarily been widely diffused is that this is probably some of the fastest-diffusing technology around.
In our sample, half of the participants who got access to this tool from Microsoft were using it. And so, the take-up has been tremendous.
My guess is that one of the reasons why the executives … didn’t forecast this is that this is an extraordinarily fast-diffusing technology. You’re seeing different people in different teams running their own experiments to figure out how to use it, and some of those experiments are going to generate insights that weren’t anticipated.
The second thing that has accelerated the usefulness of these models is a type of model called a chain-of-thought model. The earliest versions of generative AI tools were prone to hallucinate and to provide answers that were inaccurate. The chain-of-thought type of reasoning is meant to do error correction on the fly.
And so, rather than provide an answer that could be subject to error or hallucinations, the model itself will provide a prompt to say, “Are you sure about that? Double check.” Models with chain-of-thought reasoning are much, much more accurate and less subject to hallucinations, especially for quantitative tasks or tasks that involve programming.
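As a rough illustration of that “double check” pattern, here is a minimal sketch in Python. The query_model helper is a hypothetical stand-in for whatever API a given provider exposes; it is not any vendor’s actual interface.

```python
# Minimal sketch of the "answer, then double-check" loop described above.
# `query_model` is a hypothetical placeholder for an LLM API call, not a real library.

def query_model(prompt: str) -> str:
    """Placeholder: send `prompt` to a language model and return its reply."""
    raise NotImplementedError("Wire this up to a model provider of your choice.")

def answer_with_self_check(question: str) -> str:
    # First pass: ask the model to reason step by step before answering.
    draft = query_model(
        f"Question: {question}\n"
        "Think through the problem step by step, then give a final answer."
    )
    # Second pass: ask the model to re-derive the result and correct any errors.
    return query_model(
        f"Question: {question}\n"
        f"Proposed solution:\n{draft}\n"
        "Are you sure about that? Double-check each step and return a corrected final answer."
    )
```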
As a result, you are seeing quite a lot of penetration among early-stage startups that are coding with natural-language queries, or what is now called “vibe coding.” These vibe-coding tools have some built-in error correction, so you can actually write usable code as a result of the feedback mechanisms that model designers have built in.
The third thing driving major adoption, especially in the tech world, is that model providers have built tools to deploy code. Anthropic has a tool that will allow you to write code just based on queries or natural language, and then you can deploy that with Anthropic tools.
There are other tools like Cursor or Replit that ultimately let someone with a limited technical background instruct a machine to write pieces of technical software. You don’t necessarily need specialized technical skills, and that has made deployment much, much easier.
This feeds back into the thing that I was telling you earlier, which is that you’ve seen lots of experiments and you’ve seen enormous diffusion. And one of the reasons that you’ve seen enormous diffusion is that you now have these tools and these models that allow people without domain expertise to build things and figure out what they can build and how they can do it.
Which types of work are most likely to see change first, and in what way? You mentioned writing code, but are there others?
I have not seen any immediate data suggesting employment losses, but you could easily imagine that in any kind of knowledge work you might see some employment effects, at least in theory.
In practice, if you look back at the history of predictions about AI and job loss, making those predictions is extraordinarily hard.
We had lots of discussion in 2017, 2018, 2019 around whether we should stop training radiologists. But radiologists are as busy as ever, and we didn’t stop training them. They’re doing more, and one of the reasons is that the cost of imaging has fallen. And at least some of them have AI tools at their fingertips.
And so, in some sense, these tools are going to take over some tasks that humans were doing but also lower the cost of doing new things. The net-net of that is very hard to predict, because if the technology is complementary to what humans in those occupations are doing, you may need more humans doing slightly different tasks.
And so, I think it’s too early to say that we’re going to necessarily see a net displacement in any one industry or overall.
If AI suddenly puts a large portion of middle-class Americans out of work or makes their education and skills far less valuable, that could have catastrophic effects on the U.S. economy, on politics, and on quality of life generally. Are there any policy solutions lawmakers should be thinking about today to get ahead of this sea change?
My personal inclination — this is not necessarily based on a deep analytical model — is that policymakers will have a very limited ability to do anything here unless it’s through subsidies or tax policy. For anything you would do to prop up employment, you’ll likely see a more nimble, lower-cost competitor that doesn’t have that same legacy labor stack out-compete those firms over time.
It’s not so clear that there should be any policy intervention when we don’t necessarily understand the technology at this point. My guess is that the policymakers’ remedy is going to be an ex-post one rather than an ex-ante one. My suspicion is better safety-net policies and better retraining policies will be the tools at play rather than trying to prevent the adoption of the technology.
2 new initiatives strengthen Harvard’s academic engagement with Israel
Opportunities for undergraduate study abroad and research exchange in biomedicine
2 new initiatives strengthen Harvard’s academic engagement with Israel
Stephanie Mitchell/Harvard Staff Photographer
7 min read
Opportunities for undergraduate study abroad and research exchange in biomedicine
Harvard has launched two new initiatives that promise to bolster the University’s academic engagement with Israeli institutions and create greater opportunities for students and researchers. A collaboration announced this week with Ben-Gurion University of the Negev (BGU) will offer study abroad opportunities for undergraduates during the academic year and the summer. Additionally, earlier this month, Harvard Medical School opened applications for the Kalaniyot Postdoctoral Fellowships for Israeli researchers.
Undergraduate study abroad: Ben-Gurion University of the Negev
Harvard’s agreement with BGU will offer Harvard College students year-round opportunities to study and earn credit toward their degree in Israel beginning in spring 2026.
BGU, whose main campus is in Be’er-Sheva, was founded in 1969 as the first campus in southern Israel’s Negev desert. Today, it has expanded to three campuses, which are home to 20,000 students and 4,000 faculty members. Its community is engaged in cutting-edge research and academics in the sciences, history and the humanities, and business and management, and the school is a regional leader in research on climate change and desert studies. In addition to its three campuses, BGU is home to several multi-disciplinary research institutes specializing in biotechnology, solar energy, desert research, and Jewish and Israeli culture, among other areas. Its more than 100,000 alumni hold leading roles in research and development, healthcare, industry, and culture across Israel and the world. One noteworthy aspect of BGU’s mission is its commitment to social and environmental responsibility. The university is actively engaged in developing the Negev, Israel, and the region.
Current and past opportunities at BGU include an archaeology course that sends students to help on active excavations; marine science courses on the Mediterranean; and a sustainable agricultural practices course focused on preventing desertification and conserving resources. All courses are taught in English, with visiting students sharing classrooms with Israeli students to facilitate conversation and cross-cultural exchange.
Professor Michal Bar-Asher Siegal, BGU’s vice president for global engagement, highlighted the university’s commitment to impact and exchange: “Ben-Gurion University strives for excellence in research and teaching, as well as innovation and applied research that impact people’s lives wherever they are. We look forward to embarking on this collaboration with Harvard University and fostering the collaborative relationships so necessary to training our next generation of leaders.”
“We are thrilled to work with BGU to provide this new opportunity for undergraduate study abroad,” said Mark Elliott, Harvard’s vice provost for international affairs. “The collaboration with BGU is the latest in Harvard’s long and rich history of engagement with institutions of higher education across Israel, and I have no doubt that it will contribute both to transformative experiences for students and to increased academic collaboration across the region in the coming years.”
Amanda Claybaugh, dean of undergraduate education at Harvard College, said: “I’m delighted that we’re adding BGU to the list of Israeli universities where our students can study abroad, because BGU offers opportunities that aren’t available here at Harvard: learning about archaeology at a dig site, about marine biology in the Mediterranean, about climate and sustainability from the world’s leaders in desert agriculture.”
College students can study abroad with BGU beginning in spring 2026, with opportunities available for spring, full-year, or summer study. For information on term-time studies, visit the Office for International Education’s (OIE) list of approved programs. The deadline for spring 2026 semester study will be Oct. 1.
In addition to BGU, OIE also offers undergraduate study abroad opportunities with Tel Aviv University; the Hebrew University of Jerusalem; Technion – Israel Institute of Technology; and the University of Haifa. Harvard’s Center for Jewish Studies and Harvard Divinity School also offer opportunities for graduate student exchange.
For information on summer study, see OIE’s list of summer programs. The deadline for summer 2026 study abroad applications will be Jan. 29 (for funding and credit, programs 6+ weeks) or April 1.
Students interested in term-time or summer options can email oie@fas.harvard.edu to schedule a meeting, or attend drop-in sessions beginning in September, from 2 to 4 p.m. Monday through Thursday.
Postgraduate research exchange: Kalaniyot Fellowships at HMS
The University is also working to strengthen its academic ties to Israel via scholarly exchange. Harvard Medical School recently announced the opening of the Kalaniyot Postdoctoral Fellowships at Harvard Medical School, which will welcome scientists from Israel to conduct postdoctoral training in basic biomedical research at HMS.
The Kalaniyot Postdoctoral Fellowships are open to residents of Israel who have completed a Ph.D. and wish to perform biomedical research in a laboratory on the HMS campus or at an affiliated hospital (Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana-Farber Cancer Institute, Joslin Diabetes Center, Massachusetts Eye and Ear, or Massachusetts General Hospital). Successful applicants will be awarded a fellowship of two to three years, beginning in January, with the possibility of extension. The HMS branch of the Kalaniyot chapter at Harvard administers the Blavatnik Fellowship in Life Sciences established by the Blavatnik Family Foundation, and the Dorot Fellowship established by the Dorot Foundation.
This opportunity is coordinated by the Harvard Medical School branch of the Kalaniyot chapter at Harvard. HMS led the way in establishing Kalaniyot at Harvard, where the local chapter is supported by the Kalaniyot Foundation, a national organization that seeks to deepen ties between American and Israeli researchers and to contribute to academic exchange and excellence in both Israel and the U.S. The University is currently exploring expanding the initiative to include other Schools at Harvard.
“The aim of the Kalaniyot Postdoctoral Fellowships is to enhance scientific excellence and expertise by bringing the most promising research talent from Israel to Harvard Medical School and our affiliated hospitals,” said Naama Kanarek, HMS assistant professor of pathology at Boston Children’s Hospital, who serves as a faculty leader of the HMS branch of the Kalaniyot chapter at Harvard alongside Matthew Meyerson, HMS professor of genetics and medicine at Dana-Farber Cancer Institute, and Mark Poznansky, director of the Vaccine and Immunotherapy Center at Mass General Hospital and HMS professor of medicine. “We look forward to the benefits of academic exchange with these researchers, as well as the strengthened ties between HMS and researchers across Israel that will result.”
The new initiative builds on other successful collaborations between HMS and Israeli institutions, such as the Ivan and Francesca Berkowitz Family Living Laboratory Collaboration, established in 2021 to bring together researchers from HMS and Clalit Research Institute to investigate critical questions in precision medicine and predictive health. Since its inception, the Berkowitz Clinic for Undiagnosed Cases has successfully resolved dozens of complex genetic mysteries, enabling prenatal diagnosis and disease prevention while identifying novel disease-causing genes and risk factors that have been published for global use. In parallel, Clalit researchers developed innovative analytic models to enhance genetic interpretation, benefiting both Israeli and global patient populations.
This type of collaboration is not unique to HMS; faculty across Harvard are widely engaged in scholarly work in and about Israel. Several centers and programs across the University — including the Center for Jewish Studies, Harvard Law School’s Julis-Rabinowitz Program on Israeli and Jewish Law, and the HKS Belfer Center’s Middle East Initiative — host Israeli fellows, visiting scholars, and speakers each year.
‘Learning without a net’
Here are 5 students doing summer research with faculty in topics from heat mortality to epigenetics, Legionnaires’ disease to anorexia
Here are 5 students doing summer research with faculty in topics from heat mortality to epigenetics, Legionnaires’ disease to anorexia
Summer break offers a time for a different kind of learning in labs and research centers across campus. Hundreds of Harvard College students are conducting hands-on research with faculty and making discoveries — about the material and themselves.
There are 350 undergraduates participating in the Harvard Summer Undergraduate Research Village, and another 150 are enrolled in the Undergraduate Research and Fellowships Summer Scholars program. These programs house students on campus all summer while they work alongside faculty mentors on cutting-edge research across a range of disciplines.
“We are so excited to see our students ‘learning without a net’ and looking to answer questions with no known answers,” said Jonna Iacono, director of the Office of Undergraduate Research and Fellowships.
Sam Capehart ’28
Niles Singer/Harvard Staff Photographer
A native of Virginia, Capehart is assisting Sophia Wiesenfeld, a Ph.D. student at the Kenneth C. Griffin Graduate School of Arts and Sciences, in Michael Baym’s lab at Harvard Medical School. They are working on a project exploring the role plasmids play in the spread of antibiotic resistance.
Plasmids are mobile genetic elements that can transfer between bacterial cells. The DNA molecules can carry genes that make bacteria resistant to antibiotics and can pass those genes between different species, which is a major concern among public health experts.
“Thinking toward the future, especially for our generation, antibiotic resistance is something we have to contend with,” Capehart said. “The bacteria will always be one step ahead of us. So I think any research that we can be doing now that’s even tangentially related to antibiotic resistance could potentially save millions of lives in the coming years.”
The Baym lab, which is part of the Departments of Biomedical Informatics and Microbiology, investigates whether it might be possible to combat antibiotic resistance by outcompeting it.
Since the most “fit” plasmids replicate the most and dominate within bacterial cells, the researchers are working to design a highly fit plasmid that can dominate and displace plasmids carrying antibiotic resistance genes.
In the lab, Capehart has been doing “competition experiments” to identify which plasmids come out on top when placed in the same bacterial environment.
Using samples from the Deer Island Wastewater Treatment Plant, she isolates bacterial plasmids and introduces two different ones into cells to observe how they behave over a nine-day period and see which takes over. She compared it to making a March Madness bracket.
“We’re hoping to determine whether plasmid hierarchy exists,” Capehart said. “Does plasmid A always win over the other plasmids, or is it more of a rock-paper-scissors system? Developing some sort of probiotic plasmid would depend pretty heavily on its ability to defeat all other plasmids. So if we can find a ‘king plasmid,’ that could point us toward mutations that we could then use in the future.”
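The hierarchy question Capehart describes can be framed as a simple check on pairwise outcomes: do the winners order the plasmids transitively, or do they form a rock-paper-scissors cycle? A toy sketch in Python, with invented plasmid names and results rather than the lab’s data:

```python
# Toy check of whether pairwise competition outcomes form a strict hierarchy
# or contain a rock-paper-scissors-style cycle. The plasmid names and results
# below are invented for illustration only.

from itertools import permutations

# outcomes[(a, b)] records which plasmid displaced the other in co-culture.
outcomes = {
    ("pA", "pB"): "pA",
    ("pB", "pC"): "pB",
    ("pA", "pC"): "pC",   # change to "pA" to make the ranking transitive
}

def beats(a, b):
    """True if plasmid `a` won its competition against plasmid `b`."""
    if (a, b) in outcomes:
        return outcomes[(a, b)] == a
    return outcomes[(b, a)] == a

def has_cycle(plasmids):
    # A three-way cycle (a beats b, b beats c, c beats a) rules out a strict hierarchy.
    return any(beats(a, b) and beats(b, c) and beats(c, a)
               for a, b, c in permutations(plasmids, 3))

print("rock-paper-scissors cycle found" if has_cycle(["pA", "pB", "pC"])
      else "results are consistent with a strict hierarchy")
```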
Capehart said her experience in a wet lab environment is the perfect complement to her coursework. While she hasn’t declared a concentration yet, she’s leaning toward chemical and physical biology.
“Having the hands-on skills that I’ve been learning these past couple of weeks is invaluable,” Capehart said. “I’m a nerd. I love reading books as much as the next person, but there’s nothing quite like actually getting your hands dirty with wastewater to understand the subject.”
Nouraldeen Ibrahim ’26
Veasey Conway/Harvard Staff Photographer
Ibrahim is a chemical and physical biology concentrator who has worked in Philip Cole’s lab at the Medical School since he was a first-year. He is studying the function of an enzyme released by Legionella pneumophila, the bacteria responsible for Legionnaires’ disease, a severe form of pneumonia.
Specifically, he is looking at how the enzyme functions at the molecular level. The enzyme has a role in modifying DNA, leading to a reduced response in immune-response genes, which allows the pneumonia to develop.
“I have access to this enzyme, which is fairly new and not much work has been done on it,” Ibrahim said. “I’m using some tools in our lab to better understand the function of this protein. What is its shape? What does it like to interact with? Which metal ions does it contain? The goal is finding ways to curtail this enzyme and making sure that maybe in the future we could have a way to prevent the spread of this disease.”
Ibrahim first became interested in epigenetics in high school after visiting his grandmother in Egypt while she was undergoing chemotherapy for lymphoma.
“I began thinking about how we can look in a more targeted way at how these diseases or bacteria are able to modify your DNA before looking at the outcomes,” said Ibrahim. “Chemotherapy looks at the outcome and then tries to kill those cells. But maybe if we could look at the start and what happens in the first place to lead to these downstream effects, it could be useful.”
Ibrahim, who hopes to attend medical school in the future, said being able to work in the Cole lab as an undergraduate has been transformational.
“Getting to go hands-on in the lab, having one-on-one conversations with one of the top professors at Harvard and Brigham and Women’s Hospital, and being able to gain from his expertise has been crucial for me,” Ibrahim said. “Just pipetting things, working through my own experiments, designing my experiment from scratch, having an idea that I conceptualized, and seeing the outcome and data is super powerful.”
Eunice Kim ’26
Niles Singer/Harvard Staff Photographer
Kim’s research focuses on the history of heat mortality in Los Angeles County, particularly in the mid-19th to mid-20th centuries before air conditioning became common.
Kim has been assisting David S. Jones, A. Bernard Ackerman Professor of the Culture of Medicine, with research for his forthcoming book on how heat waves came to be seen as public health threats. The work has required her to play detective, scouring online newspaper archives — including the 1870s Los Angeles Daily Star newspaper — to find records of major heat waves that impacted the region, and how residents responded to them.
“We’ve known about heat waves for a long time,” said Kim, who is earning a double concentration in the history of science and human developmental and regenerative biology. “It was mentioned in the Bible — people have been writing about it since basically the beginning of time. But it wasn’t until the 1980s, interestingly enough, that people realized heat waves were going to be a recurring issue, and that something had to be done in order to create better structures so people can live through heat waves and actually survive.”
For Kim, who was born and raised in the Koreatown neighborhood of Los Angeles, it’s a research topic close to home, literally. She grew up experiencing heat waves in the county — sustained high temperatures lasting for two days or more — but began thinking about them more critically as a public health issue in the classroom.
“As global warming is continuing to affect the world and temperatures are continuing to rise, this will continue to become a persistent issue,” said Kim, adding that the research skills she is acquiring will help prepare her to write a senior thesis.
“I’ve really enjoyed getting to know Harvard’s archives, resources, and librarians better,” Kim said. “Hands-on research has taught me a lot about curiosity and patience. I’ve gone into this research with a spirit of inquiry and a hope to uncover unique LA heat wave narratives.”
Charlotte Paley ’26
Veasey Conway/Harvard Staff Photographer
Paley is spending the summer researching eating disorders in the lab of Kristin Javaras, assistant professor of psychology at the Medical School. Based at McLean Hospital, Paley’s position is part of McLean’s Student Visitor Program and is funded through Harvard’s BLISS Program.
Paley, a Florida native who is concentrating in psychology with a secondary in global health and health policy, is assisting with a project aimed at investigating the accuracy of eating disorder diagnoses.
She is working under the supervision of Javaras; Jennifer Sneider, assistant director of the Javaras laboratory and assistant professor at the Medical School; and research assistant Lily Suh.
Paley’s role on the project involves reviewing descriptions of patient symptoms (with personal details carefully edited to ensure anonymity) that Javaras’ team has collected to see how well they match up against the formal criteria used to diagnose anorexia nervosa.
The goal is to evaluate the accuracy of the diagnoses in practice, to inform future assessment in both research and treatment.
“Something I’m particularly interested in is looking at exercise behavior in the data,” Paley said. “Hopefully by the end of the summer I’ll have some good qualitative findings regarding exercise behavior and the ways these are manifesting across patients of different ages and genders.”
Paley has also been helping with a neuroimaging study on binge eating that explores how social stress affects food-related decision-making in women. That work has included some data entry and putting up fliers to recruit study participants.
“Eating disorders are pretty misunderstood and very stigmatized,” Paley said. “There’s so much shame surrounding them. Research is so important to improving outcomes and potential treatments for eating disorders, so I’m really excited about this research and hope that it makes a meaningful impact.”
This summer offers Paley real-world experience that may contribute to her senior thesis on how weight discrimination contributes to various forms of psychopathology, including anxiety, depression, and disordered eating. She hopes to pursue medicine or public health after graduation.
“Getting to do research this summer is an amazing opportunity,” she said. “In this current climate where research funding is being cut, it’s very meaningful that I’m getting to do this now. I’m very grateful for this opportunity.”
Jeffrey Shi ’26
Photo by Grace DuVal
Shi is researching acoustic metamaterials in the lab of Jenny Hoffman, Clowes Professor of Science, a topic that has fascinated him since he first joined the lab as a Massachusetts high schooler.
Shi, a double concentrator in physics and English, has used acoustic metamaterials to help design and simulate a broadband, high-Q resonator. Such devices trap energy, like sound vibrations, and traditionally are either high-Q (high quality factor), meaning they resonate for a long time, or broadband, meaning they resonate at multiple frequencies, but not both.
But this new design breaks that barrier, maintaining energy efficiently across a wide range of frequencies. Some potential real-world applications include energy harvesting: capturing energy from the environment and converting it to electricity.
“If you place our metamaterials under train tracks, say, and the train barrels across and the tracks shake, there’s actually a very straightforward way of harvesting the energy using our acoustic materials,” explained Shi, who was first author on a paper on the topic and presented his work at several conferences. “Because our material is both broadband and high-Q, the tracks can vibrate at different frequencies, and we can harvest that energy with high efficiency.”
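For readers unfamiliar with the jargon, the quality factor has a standard textbook definition (not specific to this work) that makes the usual trade-off clear:

```latex
Q \;=\; 2\pi \,\frac{\text{energy stored}}{\text{energy dissipated per cycle}}
  \;\approx\; \frac{f_0}{\Delta f}
```

Here f_0 is the resonant frequency and Δf the bandwidth over which the resonator responds strongly, so a large Q usually means a narrow band. That is the broadband-versus-high-Q tension the new design is said to sidestep.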
Acoustic metamaterials are engineered structures designed to manipulate sound waves. Since their properties come from their geometry rather than the materials they’re composed of (they can be made of steel, plastic, or even a trash bag), they are highly tunable and scalable, according to Shi.
They can also be 3D printed quickly, which makes them ideal stand-ins for studying quantum materials, which are notoriously expensive and time-consuming to create.
“Part of the beauty is that there is easy tunability for whatever purpose that you need these materials for,” Shi said. “Is my layer going to be steel here, or is it going to be a sort of polymer? You can scale it so that you can hold it in your hand or so that it stretches across your entire wingspan. Either way, it’s macroscopic, and it’s easy to build.”
When he isn’t in the Hoffman lab, Shi is an undergraduate researcher in the lab of Kang-Kuen Ni, Theodore William Richards Professor of Chemistry and Professor of Physics.
There he assists with improving a component in a complex laser system used for experiments. His group uses these highly focused lasers to trap and manipulate individual atoms, bringing isolated atoms of different species together to study their interactions at the single-particle level.
“I feel very fortunate and grateful to have had a research experience so early, and so many resources and support and guidance from the people around me,” said Shi, who plans to pursue physics at the graduate level. “Physics research has been helpful in terms of knowing what kinds of physics I care about and what kinds of academic work I want to do in the future. I think I’ve learned a lot about myself through my research.”
How do math, reading skills overlap? Researchers were closing in on answers.
Grant terminated at critical point of ambitious study following students for five years
How do math, reading skills overlap? Researchers were closing in on answers.
Grant terminated at critical point of ambitious study following students for five years
Liz Mineo
Harvard Staff Writer
6 min read
For cognitive neuroscientist Nadine Gaab, the termination of a five-year grant one year before it was scheduled to end couldn’t have come at a worse moment. As part of a study aimed at understanding the co-development of math and reading skills over time from preschool through elementary school, Gaab and her team of researchers had followed 163 students for up to four years. In May, before the study’s final year, they were preparing to test the participants to see which children were on the trajectory to develop math and reading problems.
“This was the most important year because we were going to see who of these kids developed typical reading and math skills versus atypical reading and math skills.”
Nadine Gaab
But since Gaab’s research was terminated as part of the federal funding cuts the Trump administration announced in May, which froze more than $2.2 billion in federal grant money in its ongoing clash with Harvard, the research couldn’t be completed. Recently, it received bridge funding from the Harvard Graduate School of Education, which will cover minimal research. But Gaab’s team will not be able to reassess participants’ brain development, a crucial part of the study.
An associate professor of education at the Ed School, Gaab said she cannot overstate the impact of the grant termination.
“This was the most important year because we were going to see who of these kids developed typical reading and math skills versus atypical reading and math skills,” said Gaab, principal investigator of the Gaab Lab. “It’s like if you’re trying to prevent heart disease, and you’re examining a number of protective and risk factors for four years, and at the end, you want to see who developed heart disease and who didn’t. Now, with the grant being terminated, we can’t determine who, of all the kids, has math or reading problems. It is just devastating.”
Although researchers primarily recruited preschoolers from the New England area, the significance of the termination extended beyond the region. Each year, several families — some from as far as California and Alaska — traveled across the country to Cambridge to participate in the groundbreaking study that also tracks the children’s brain development. For many families, the chance to receive annual reports on their child’s math and reading development was reason enough to engage in the journey.
Called the Children’s Arithmetic, Language, and Cognition (CALC) study, it was designed to explore how math and reading skills develop and interact over time, using a comprehensive testing battery of language and cognitive abilities, measures of brain structure and function, as well as reports of the home learning environment.
“We wanted to see the role of the environment or having an older sibling or a parent with a reading disability in shaping these trajectories.”
Nadine Gaab
Through community engagement efforts, the researchers at the Gaab Lab managed to recruit a unique sample for this study, including kids with family histories of reading difficulties, math difficulties, or both. It is known that children coming from these families have a higher risk of developing a learning difficulty themselves. The goal was to examine the trajectories of math and reading skills development in these groups to identify when and how they diverge from typically developing children. “We wanted to see the role of the environment or having an older sibling or a parent with a reading disability in shaping these trajectories,” said Gaab.
Researchers were hoping that the $4.1 million grant would also shed light on a phenomenon education experts have noticed: the high co-occurrence of math and reading disabilities in some students. Experts hypothesize that if students struggle with language or reading, those difficulties could potentially disrupt the understanding of mathematical concepts.
“There is a lot of language involved when we teach math,” said Gaab. “But there are other aspects that can play a role, such as working memory or executive functioning that are needed for both reading and math skills, and we are interested in overlapping brain regions that could explain this high co-occurrence.”
Beyond understanding the interaction between language and math development, the study’s findings could also have had serious repercussions for how math is taught during the first few years of formal education and further influence the development of early screening instruments, said Gaab.
“An implication of this work was not only to develop early screening instruments to find kids at risk, but also to see whether we should change the way we teach math,” said Gaab. “And that involves maybe teaching math a little bit differently or paying attention to kids who struggle with language when you teach math.”
“An implication of this work was not only to develop early screening instruments to find kids at risk, but also to see whether we should change the way we teach math.”
Nadine Gaab
Due to the grant’s interruption, Gaab had to let go of several team members and terminate a subcontract to a university in Canada that included a postdoctoral fellow. The necessary training for research staff can be long and intensive, with abrupt funding cuts potentially disrupting Gaab’s research well beyond the immediate future, even if the grant were to be reinstated.
Gaab is grateful that her research was selected for bridge funding from the University that at least allows the researchers to test some of the students’ reading and math skills. Conducting neuroimaging research via MRI on the participants will be too expensive, said Gaab, but examining their math and reading outcomes after four years of formal instruction will bring valuable lessons.
“Knowing how math and reading skills develop over time in typical and atypical populations could help us develop early screening tools,” said Gaab. “We could see early on who may struggle or be more likely to struggle. It can also help in developing intervention tools, so we know how we can best help those struggling students, and it can lead to better curriculum design for teaching reading and math. This is a study that can help any child and educators in the long run.”
AI leaps from math dunce to whiz
Experts describe how rapid advances are transforming field and classroom and expanding idea of what’s possible — ‘sky’s the limit’
Experts describe how rapid advances are transforming field and classroom and expanding idea of what’s possible — ‘sky’s the limit’
When Michael Brenner taught the graduate-level class “Applied Mathematics 201” in fall 2023, the course’s nonlinear partial differential equations were too tough for artificial intelligence. AI managed to solve just 30 to 50 percent of the problems in the first three weeks of the class.
“It was fine, but it wasn’t that great,” said Brenner, the Catalyst Professor of Applied Mathematics and Applied Physics and of Physics at the John A. Paulson School of Engineering and Applied Sciences.
But when he taught the same course this past spring, everything had changed. The same AI models that had stumbled on the easiest problems now aced the hardest ones. Brenner was shocked. “This actually calls into question the entire way the class is taught,” he said.
The course taken by many graduate students has a reputation for being tough. Brenner has taught it for more than two decades, and he’d always given students take-home exams; he wanted them to have time to wrestle with difficult questions without the stress of a ticking clock. But if ChatGPT could take the exam for his students, could he trust what they turned in?
He had two choices: Ban AI completely, or embrace it and redesign the class that has been taught at Harvard since before he was born.
He redesigned the class.
Forms of artificial intelligence have been used in mathematics for decades. But in the last few years, advances in machine learning and the exponential improvements of publicly available large language models have begun to reshape the discipline. While some mathematicians expect modest tools that can automate unglamorous parts of the job, others see a wholesale reimagining of the discipline and a rapid acceleration of what’s possible.
Michael Brenner.
Stephanie Mitchell/Harvard Staff Photographer
And the notion that AI is uniquely bad at math? Brenner says: Simply not true.
“If you take your favorite large language model from two years ago and you ask it to please add 372 and 476, and it gives you the wrong number, then you say it’s bad at math,” Brenner said. “But what I would say is that’s the wrong use of a large language model. Obviously what you should do if you’re given those two numbers is you should open up a calculator.”
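Brenner’s point maps onto what practitioners call tool use: route the arithmetic to exact code instead of asking the model to predict digits. A deliberately simple sketch in Python, where llm_answer is a hypothetical placeholder rather than any real product’s API:

```python
# Toy illustration of the "open up a calculator" point: route simple arithmetic
# to exact code instead of asking a language model to guess the digits.
# `llm_answer` is a hypothetical placeholder, not a real API.

import re

def llm_answer(prompt: str) -> str:
    """Placeholder for a free-form language-model reply (illustrative only)."""
    return "I think the answer might be around 850."  # plausible but imprecise

def answer(prompt: str) -> str:
    # If the prompt is a simple addition request, compute it exactly.
    match = re.search(r"add\s+(\d+)\s+and\s+(\d+)", prompt, re.IGNORECASE)
    if match:
        a, b = int(match.group(1)), int(match.group(2))
        return str(a + b)          # the calculator path
    return llm_answer(prompt)      # everything else goes to the model

print(answer("Please add 372 and 476"))   # -> 848
```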
Evidence of AI’s mathematical capabilities is mounting. In 2024, two AI models from Google DeepMind earned a silver medal in the International Mathematical Olympiad, the largest and most prestigious competition for young mathematicians. Also in 2024, Demis Hassabis and John Jumper of DeepMind won the Nobel Prize in Chemistry for their AI model AlphaFold2, which predicted the structure of almost all 200 million known proteins.
“It is now possible to make a computational model that leads scientifically to the extent that, within years of publication, it wins a Nobel Prize,” Brenner said. “That’s unprecedented.”
Knots and murmurations
While some mathematicians remain wary of AI’s tendency to hallucinate, specialized machine learning systems are accelerating mathematical discoveries in multiple fields.
“Computers can deal with data sets that are too large for people to search through, and they can find patterns that people find interesting and significant,” said Michael Douglas, senior research scientist at the Center for Mathematical Sciences and Applications at Harvard. “That’s been very helpful for mathematicians whose objects of study can be assembled into data sets.”
“Computers can deal with data sets that are too large for people to search through, and they can find patterns that people find interesting and significant.”
Michael Douglas
One example: knot theory, a discipline that has applications in physics, biology, and chemistry.
There are an infinite number of possible knots; researchers comb through databases of hundreds of millions of mathematical knots looking for relationships between them. In 2021, researchers at DeepMind, some of whom had ties to Harvard, used AI to discover new relationships between knot invariants, the numerical characteristics that define each knot’s properties. The discovery could have taken human mathematicians years to uncover through traditional methods.
Another breakthrough came in research into elliptic curves, deceptively simple structures with big implications in both pure mathematics and cryptography. When Harvard researchers fed curve data into machine learning systems, they found that the curves’ behavior resembled murmurations — those swirling, coordinated movements of flocks of birds.
“No mathematician ever thought to look for that before, and they were quite surprised to see it,” Douglas said. “They are now busy trying to prove it.”
But perhaps the form of AI that’s generating the most buzz in the math world is in automated theorem proving. For decades, mathematicians have used computer systems to check that theorems are logically sound. It’s an effective process, but translating human-written proofs into computer-readable formats is time-consuming, and the proofs generated by the automated provers are often very large.
Enter generative AI.
“The hope, and what people are starting to do now, is that we’ll use the large language models to write the proof in this way that the computer can check,” Douglas said.
It’s solving one problem with another, Douglas said. Generative AI can almost instantaneously translate proofs into formats that automated systems can verify, while the verification process catches any AI-generated errors or hallucinations.
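For a sense of what a machine-checkable statement looks like at its simplest (a generic example, not drawn from the research described here), here is a one-line theorem in Lean 4; a proof assistant accepts it only if every step checks:

```lean
-- A trivial machine-checked statement: addition of natural numbers commutes.
-- `Nat.add_comm` is a lemma in Lean 4's core library.
theorem add_commutes (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

Real formalizations can run to thousands of such lines, which is why having a language model draft them and a verifier check them is so attractive.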
Melanie Weber.
Niles Singer/Harvard Staff Photographer
Of course, just as AI is influencing mathematics, math is also influencing AI. Melanie Weber, assistant professor of applied mathematics and of computer science at SEAS, uses classical tools from geometry to build “geometric” AI models that are more efficient and transparent.
“Artificial intelligence is already revolutionizing science. However, the current models require vast data and computing resources, which can be limited in the sciences and raise sustainability concerns. Encoding geometric structures, such as symmetries arising from laws of physics, can increase the models’ efficiency by narrowing their focus to physically plausible conditions,” she said. “What that means is that the possible instances that we have to consider during training can be dramatically reduced by incorporating such structure. And that essentially means that we need less data and less computing resources to train a good model if we hard-code that structure into the model.”
“Artificial intelligence is already revolutionizing science. However, the current models require vast data and computing resources, which can be limited in the sciences and raise sustainability concerns.”
Melanie Weber
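One way to see the data-efficiency point in miniature, as a deliberately simplified sketch and not a description of Weber’s models: if a prediction should not change when an input is mirrored, the symmetry can be built into the model by averaging over the transformed inputs, so it never has to be learned from extra training data.

```python
# Deliberately simplified illustration of hard-coding a symmetry into a model:
# wrap any predictor so its output cannot change when the input is mirrored.
# A generic construction for illustration, not the geometric models discussed above.

import numpy as np

def mirror(x: np.ndarray) -> np.ndarray:
    """The symmetry operation: reverse the feature order (a stand-in for a physical symmetry)."""
    return x[::-1]

def make_mirror_invariant(predict):
    """Return a predictor whose output is unchanged when the input is mirrored."""
    def invariant_predict(x: np.ndarray) -> float:
        # Averaging over {identity, mirror} guarantees invariance by construction,
        # so the model does not need extra data to learn the symmetry.
        return 0.5 * (predict(x) + predict(mirror(x)))
    return invariant_predict

# Example with an arbitrary (non-invariant) base predictor.
rng = np.random.default_rng(0)
w = rng.normal(size=5)
f = make_mirror_invariant(lambda x: float(w @ x))

x = rng.normal(size=5)
assert np.isclose(f(x), f(mirror(x)))   # invariance holds exactly
```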
The future of math
Although the speed at which AI will change mathematics is still unclear, there’s no doubt the transformation is happening. Weber sees the technology serving both as a kind of research assistant, handling literature reviews and proof verification, and as a sounding board for ideas, helping mathematicians solve problems faster. Brenner said that kind of acceleration could be transformative.
“My hope is that we can solve problems faster and we can get more work done,” he said. “Science is infinite. There’s no limit.”
“My hope is that we can solve problems faster and we can get more work done. Science is infinite. There’s no limit.”
Michael Brenner
Teaching, Brenner added, is also infinite. In the 2025 version of “Applied Mathematics 201,” he did away with traditional homework problems. Instead, students had to create their own problems, have a classmate verify them, and see if they could outsmart an AI. (“One of the good ones,” Brenner specified. “If the not-very-good models can solve your problem, it doesn’t count.”)
By the end of the semester, the students had created nearly 700 math problems of increasing difficulty. The data on whether AI could solve them could prove useful for researchers.
“It’s just amazing, because I’m teaching math, right? And we’ve got a situation where the students are inventing math problems that are harder and harder and harder, and trying to solve them. That is the dream.”
Brenner added, “There are problems everywhere. How to better predict the climate. How to understand what manipulations one could do to help the Earth be a more habitable place. Models that would help discover drugs. … The sky’s the limit on this, and we don’t know what is possible. But it is not a boring time to be doing this.”
Taking a second look at executive function
New study suggests what has long been considered innate aspect of human cognition may be more a matter of schooling
New study suggests what has long been considered innate aspect of human cognition may be more a matter of schooling
Clea Simon
Harvard Correspondent
4 min read
Executive function — the set of top-down processes by which the human mind controls behavior, regulating thoughts and actions — has long been studied using a standard set of tools, with these assessments included in national and international child development norms.
A new study of children in schooled and unschooled environments, published in Proceedings of the National Academy of Sciences, raises questions about some of the assumptions underlying the way psychologists and scholars of cognitive science think about these processes.
Instead of capturing an innate, basic feature of human cognition, the executive functions supposedly measured by these assessments appear to depend largely on the influence of formal schooling.
The study, “The cultural construction of ‘executive function,’” tested children in the Kunene region of Africa, which spans the countries of Namibia and Angola, as well as children in the U.K. and Bolivia. Children in rural areas of Kunene who received limited or no formal schooling differed profoundly on so-called executive function tests from their schooled peers and from a “typical” Western schooled sample.
“Almost all developmental research is done on children who live in a schooled world,” explained Joseph Henrich, Ruth Moore Professor of Evolutionary Biology, whose Culture, Cognition, and Coevolution Lab in the Department of Human Evolutionary Biology oversaw the study. Referring to Kunene, he said, “We went to a place where we have a kind of natural experiment, where we have some communities with no schools and some with schools. That allows us to compare the cognitive development of the kids. And what we see is we only get the usual executive function development in the places with schools rather than in the places without. That suggests that it’s really about schooling.
“What has been taken as a very generalized thing called ‘executive function’ is actually really specific to a set of skills you need to navigate school and schooled worlds.”
Testing executive function, continued Henrich, often involves such exercises as memorizing lists of unconnected words. But children with little or no formal schooling might not recognize these words because such lists do not occur in their environment.
Joseph Henrich, Ruth Moore Professor of Evolutionary Biology.
Harvard file photo
However, the researchers argued, the innate cognitive functions of children who were not formally schooled were not impaired — they were simply applied differently.
“In the populations we work in, people are super good at remembering cows,” he said. “They can look at the herd, they can tell you how many cows are there, they can name the cows. If you showed them faces of cows, they can tell you who the owner is. And I bet if I did this with kids around here in Boston, they would be terrible at differentiating cows.”
It’s not that executive function doesn’t exist, explained the researchers. Instead, we need to recognize that what we have been measuring is not that overall control.
“We need to rethink how we approach human psychology,” said Henrich, adding that a lot of what is regarded as regular cognitive development is actually a product of formal education.
Ivan Kroupin, the paper’s lead author and a former postdoc in Henrich’s lab, elaborated: “The term ‘executive function’ refers to a set of capacities and dispositions that are, in large part, culture-specific.” Kroupin, who is currently at the London School of Economics and co-directed the field studies with Helen Elizabeth Davis of Arizona State University, said, “Our study suggests that the capacities these tasks require are in part universal, but also in part culture-specific, potentially tied to formal schooling or other institutions and experiences in urbanized societies.”
The findings suggest a re-examination of terms such as “executive functions” and a more accurate understanding of what these are.
“We can use the term ‘executive functions’ to refer to underlying universal capacities,” said Kroupin. However, “If that is the case, then we need a different term for the suite of universal and culture-specific capacities which typical EF tasks are measuring.”
You’re a deer mouse, and a bird is diving at you. What to do? Depends.
Neural study shows how evolution prepared two species to adopt different survival strategies to take advantage of native habitats
The Peromyscus maniculatus lives in densely vegetated prairies.
You’re a deer mouse, and a bird is diving at you. What to do? Depends.
Neural study shows how evolution prepared two species to adopt different survival strategies to take advantage of native habitats
Kermit Pattison
Harvard Staff Writer
6 min read
For a mouse, survival in the wild often boils down to one urgent question: flee or freeze?
The best strategy depends on which mouse you are asking. A new study by Harvard biologists has found that two closely related species of deer mice have evolved very different reactions to aerial predators thanks to tweaks in brain circuitry. One species that dwells in densely vegetated areas instinctively darts for cover while a cousin living in open areas goes still to avoid being spotted.
“In this case, we were able to pinpoint where evolution acted to make species from different environments have different responses to the same stimulus,” said Felix Baier, who conducted the study in Hopi Hoekstra’s lab in the Department of Organismic and Evolutionary Biology while he was a Ph.D. student in the Kenneth C. Griffin Graduate School of Arts and Sciences.
The deer mouse Peromyscus maniculatus frantically runs for cover when shown a simulation of a fast-approaching predatory bird.
Credit: Felix Baier
“The paper shows that evolution can act anywhere, including in more central brain regions,” added Baier, now a postdoctoral fellow at the Max Planck Institute for Brain Research.
The findings, published in the journal Nature, provide new insights into a group of animals that have become iconic examples of evolutionary adaptation.
Deer mice of the genus Peromyscus include more than 50 species occupying virtually every habitat from desert to mountains and are the most abundant mammals in North America. They are prime examples of an adaptive radiation — the process by which an evolutionary lineage rapidly diversifies into multiple species, each occupying specialized ecological niches.
Because they have been intensely studied in the wild and in the lab, deer mice are sometimes called the fruit flies of mammal biology.
In the rodent family tree, deer mice separated from the ancestors of house mice and rats about 25 million years ago. By some accounts, Mickey Mouse was inspired by the Peromyscus field mice that scurried through the animation studio of Walt Disney.
“In this case, we were able to pinpoint where evolution acted to make species from different environments have different responses to the same stimulus.”
Felix Baier
The lab of evolutionary biologist Hoekstra, the Edgerley Family Dean of the Faculty of Arts and Sciences and the C.Y. Chan Professor of Arts and Sciences, has spent decades studying how different species of deer mice have adapted their biology and behavior. The lab has shown in previous studies how species evolved specializations such as fur colors, mating habits, and burrowing behaviors.
In this new study, the team sought to understand why two sister species respond very differently to predators. Because deer mice are frequently hunted by hawks and owls, their escape behaviors are shaped by intense natural selection. “It’s life or death!” said Hoekstra.
The species Peromyscus maniculatus — which lives in densely vegetated prairies and is the most widespread of all deer mice — is quick to dash for cover after sensing the approach of a bird of prey. In contrast, the Peromyscus polionotus — which lives in open areas such as sand dunes or bare farm fields — tends to freeze.
To better understand these differences, the investigators placed the deer mice in an enclosure furnished with a small shelter. They mounted a computer screen atop the cage and showed images of small dark dots floating on a light screen (which simulated birds soaring high overhead) and dots that suddenly loomed larger (which mimicked predatory birds diving in for the kill).
When they sensed the looming threat of an approaching bird, the prairie deer mice scrambled for shelter, but the open field mice froze in place.
The investigators sought to uncover the neural basis for these differences. Playing a frightening sound triggered similar reactions, revealing that the difference lay not in vision or other peripheral senses but in some kind of central processing in the brain.
Next they conducted immunohistochemical and electrophysiological studies of the mouse brains and located the key junction — a portion of the brain called the dorsal periaqueductal gray (dPAG). Activation of this region was about 1.5 times higher in the species that sought cover.
With a technique known as optogenetics, the scientists introduced proteins that act as light-sensitive ion channels into the dPAG of both species and then stimulated the neurons with lasers. This stimulation triggered the same responses they had witnessed in the earlier experiments — even when no images were shown.
In another experiment, they suppressed activity in the same region and induced one species to behave just like the other.
The study was conducted in collaboration with colleagues at KU Leuven, a research university in Belgium.
Previous studies by the Hoekstra lab have documented other differences between the same two species, such as mating (P. polionotus is monogamous while P. maniculatus is promiscuous) and burrowing (P. polionotus makes long complex tunnels, P. maniculatus makes short, simple ones).
The new study adds yet another example of how evolution has tailored each species to its unique environment since the two lineages separated between 1 million and 2 million years ago.
The authors theorize that the different escape responses evolved to maximize chances of survival in their respective habitats. Deer mice that live in vegetated areas usually can find cover nearby, so they flee; those that live on open ground have fewer places to hide and would only attract attention by running.
But no species would survive if it never took flight. The scientists found that the open field mice could be induced to flee, but they required twice the amount of threat.
Both species share the same basic neural machinery, but evolution apparently has adjusted the knobs to fine tune each species for its ecology.
Hoekstra, who is also the Xiaomeng Tong and Yu Chen Professor of Life Sciences and holds an appointment in molecular and cellular biology, said those findings echo a common theme in evolutionary biology: “Natural selection often tweaks existing neural circuits rather than constructing entirely new pathways.”
A step toward solving central mystery of life on Earth
Experiment with synthetic self-assembling materials suggests how it all might have begun
Juan Pérez-Mercader, a senior research fellow in the Department of Earth and Planetary Sciences.
A step toward solving central mystery of life on Earth
Experiment with synthetic self-assembling materials suggests how it all might have begun
Kermit Pattison
Harvard Staff Writer
6 min read
It is the ultimate mystery of biology: How did life begin?
A team of Harvard scientists has brought us closer to an answer by creating artificial cell-like chemical systems that simulate metabolism, reproduction, and evolution — the essential features of life. The results were published recently in the Proceedings of the National Academy of Sciences.
“This is the first time, as far as I know, that anybody has done anything like this — generate a structure that has the properties of life from something which is completely homogeneous at the chemical level and devoid of any similarity to natural life,” said Juan Pérez-Mercader, a senior research fellow in the Department of Earth and Planetary Sciences and the Origins of Life Initiative, and the senior author of the study. “I am super, super excited about this.”
According to Dimitar Sasselov, director of the Origins of Life Initiative and Phillips Professor of Astronomy, the paper marks an important advance by demonstrating how a simple, self-creating system can be constructed from non-biochemical molecules.
“As it mimics key aspects of life, it allows us insight into the origins and early evolution of living cells,” said Sasselov, who was not involved in the new study.
The team sought to demonstrate how life might “boot up” from materials similar to those available in the interstellar medium.
The earliest known evidence of life consists of tiny fossils of ancient microbes about 3.8 billion years old. But their discovery hardly solved the mystery of just how or when life began. What simple biological molecules gave rise to complex cells? Was there a single origin or multiple events? Did life begin on Earth or on another planet?
These questions have puzzled biologists for centuries. Charles Darwin speculated that life began in a “warm little pond” and then diversified into varied forms.
In the 1950s, Stanley Miller and Nobel laureate Harold Urey conducted experiments at the University of Chicago in which they simulated the conditions of primordial Earth — an atmosphere of methane, ammonia, hydrogen, and water with electric arcs of lightning — and produced amino acids, the organic molecules that form the building blocks of proteins.
Into this debate stepped Pérez-Mercader, an energetic scientist who describes himself as a “77-year-old kid.” Trained as a theoretical physicist, he spent his earlier career investigating grand unified theories, supersymmetry, supergravity, and superstrings.
In the 1990s, he shifted into astrobiology and founded the Centro de Astrobiología in Madrid in collaboration with NASA, and oversaw Spain’s participation in NASA’s Mars Science Laboratory.
In 2010, he came to Harvard with another grand undertaking. “I’m trying to understand why life exists here,” he said.
Chenyu Lin, who works on Pérez-Mercader’s research team, adjusts settings on an experiment.
Pérez-Mercader works with Lin in the lab.
Pérez-Mercader’s office whiteboard.
All forms of life share a few basic attributes: They handle chemical information, metabolize some form of energy (such as consuming food or performing photosynthesis) to sustain themselves and build body parts, reproduce, and evolve in response to the environment.
Pérez-Mercader worked out mathematical equations for the basic physics and chemistry of biology and used their solutions as guidance to synthesize artificial life in a test tube.
For years, these efforts remained theoretical explorations without an experimental demonstration. Then came a laboratory breakthrough with the advent of polymerization-induced self-assembly, a process in which disordered nanoparticles are engineered to spontaneously emerge, self-organize, and assemble themselves into structured objects at scales of millionths or billionths of a meter.
At last, these tools enabled Pérez-Mercader and his colleagues to bring their theories to life — literally.
“The paper demonstrates that lifelike behavior can be observed from simple chemicals that aren’t relevant to biology more or less spontaneously when light energy is provided.”
Stephen P. Fletcher, University of Oxford
In the new study, the team sought to demonstrate how life might “boot up” from materials similar to those available in the interstellar medium — the clouds of gasses and solid particles left over from the evolution of stars in a galaxy — plus light energy from stars. A test tube served as the lab version of Darwin’s “warm little pond.”
The team mixed four non-biochemical (but carbon-based) molecules with water inside glass vials surrounded by green LED bulbs, similar to holiday lights. When the lights flashed on, the mixture reacted and formed amphiphiles, or molecules with hydrophobic (water-averse) and hydrophilic (water-loving) parts.
The molecules self-assembled into ball-like structures called micelles. These structures trapped fluid inside, where it developed a different chemical composition and turned into cell-like “vesicles,” or fluid-filled sacs.
Eventually, the vesicles ejected more amphiphiles like spores, or they simply burst open, and the loose components formed new generations of cell-like structures. The expelled spores, whose numbers kept growing, differed slightly from one another, and some proved more likely to survive and reproduce, modeling what the researchers called “a mechanism of loose heritable variation,” the basis of Darwinian evolution.
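To make the logic of that last step concrete, here is a toy Python sketch written for this summary rather than taken from the study: a population of hypothetical "vesicles" each carries one numeric trait, passes it on with small random changes, and survives in proportion to how close the trait sits to an arbitrary optimum. The point is only that loose inheritance plus differential survival is enough to shift a population over generations; the trait, numbers, and survival rule are invented and do not represent the team's chemistry or equations.

```python
import random

random.seed(0)
OPTIMUM = 0.7                                        # arbitrary "best" trait value for the toy
population = [random.random() for _ in range(50)]    # starting traits, uniform in [0, 1]

for generation in range(20):
    # Survival: the closer a trait is to the optimum, the likelier it persists.
    survivors = [t for t in population if random.random() < 1.0 - abs(t - OPTIMUM)]
    # Reproduction with loose inheritance: two offspring per survivor,
    # each resembling its parent plus a small random deviation.
    population = [min(1.0, max(0.0, t + random.gauss(0, 0.05)))
                  for t in survivors for _ in range(2)][:50]
    if not population:                               # toy guard: re-seed if everything dies out
        population = [random.random() for _ in range(50)]

print(f"mean trait after selection: {sum(population) / len(population):.2f} (optimum {OPTIMUM})")
```

Across the 20 toy generations the mean trait drifts toward the arbitrary optimum, which is the qualitative behavior, selection acting on loosely inherited variation, that the paragraph above describes at the chemical level.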
Stephen P. Fletcher, a professor of chemistry at the University of Oxford who was not involved in the new study but pursues similar research, said the PNAS study opens a new pathway for engineering synthetic, self-reproducing systems — an achievement that past experiments attained only with more complex methods.
“The paper demonstrates that lifelike behavior can be observed from simple chemicals that aren’t relevant to biology more or less spontaneously when light energy is provided,” he said.
Pérez-Mercader characterizes the experiment in more animated terms. He thinks it provides a model for how life might have begun around 4 billion years ago. By his reckoning, such a system could have evolved chemically and given rise to the last universal common ancestor — the primordial form that begat all subsequent life.
“What we’re seeing in this scenario is that you can easily start with molecules which are nothing special — not like the complex biochemical molecules associated today with living natural systems,” he said. “That simple system is the best to start this business of life.”
Going to bed earlier may help you hit fitness goals
New study finds link between sleep curfew, higher levels of moderate-to-vigorous physical activity
Alvin Powell
Harvard Staff Writer
6 min read
The proverb says, early to bed and early to rise makes a person healthy, wealthy, and wise.
Wealth and wisdom may still be a question mark, but sleep experts at Harvard, Brigham and Women’s Hospital, and Monash University say there’s strong evidence that hitting the sack earlier than usual and getting a good night’s rest can help with a key factor in good health: getting enough heart-pumping physical activity.
The study, conducted from 2021 to 2022, showed that people who got the most moderate-to-vigorous physical activity the next day went to sleep earlier than usual but slept about as much as they usually did. The biggest difference in next-day activity was between people who averaged 5 hours of sleep a night and those who averaged 9 hours: The short sleepers got 41.5 more minutes of moderate-to-vigorous physical activity the following day.
“In general, individuals who went to bed earlier engaged in more frequent and longer physical activity per day than those who habitually went to bed later.”
Mark Czeisler
“In general, individuals who went to bed earlier engaged in more frequent and longer physical activity per day than those who habitually went to bed later,” said Mark Czeisler, a clinical fellow in medicine at Harvard Medical School, resident physician at Brigham and Women’s Hospital, and an author of the paper.
Czeisler, who graduated from Harvard College in 2019 and HMS in May, said it may be that those people were better rested and more inclined to exercise, but it could also be that going to bed earlier meant waking earlier than usual, simply giving them more time in their day. Untangling specific causes and effects, he said, would be a goal of future work.
U.S. health guidelines suggest that adults get 150 to 300 minutes of moderate-to-vigorous physical activity weekly. Moderate activities are those that cause you to break a sweat and increase breathing and heart rate, such as walking quickly, riding a bike, or doing yard work. Vigorous activities, which make it hard to hold a conversation while doing them, include running, swimming laps, and playing basketball.
Mark Czeisler.
Veasey Conway/Harvard Staff Photographer
“The biggest takeaway is that sleep and physical activity may be more closely related than we previously thought,” said Josh Leota, adjunct researcher with the Brigham’s Division of Sleep and Circadian Disorders, research fellow at Monash University in Australia, and the paper’s first author. “Even small changes in when you go to bed may be linked to how active you are the next day. So, rather than viewing sleep and exercise as competing for time, we should think about how they can support each other.”
“Even small changes in when you go to bed may be linked to how active you are the next day. So, rather than viewing sleep and exercise as competing for time, we should think about how they can support each other.”
Josh Leota
The research, published in June in the Proceedings of the National Academy of Sciences, takes advantage of the evolution of wearable fitness trackers, which provided daily sleep and activity data for nearly 20,000 Americans who logged about 6 million person-nights over the course of a year.
Researchers used anonymized data provided by WHOOP Inc., a Boston health tracker technology company with roots at Harvard. The WHOOP results were verified against a second study, All of Us, run by the National Institutes of Health, in which a cohort designed to be demographically representative received free Fitbit devices in order to participate.
Czeisler said the All of Us study showed similar patterns between sleep and physical activity, but the effect’s magnitude was smaller. That is likely due to differences between the study populations, he said, with the WHOOP population more likely to be self-selected for interest in fitness and athletic performance.
The work, which did not receive outside funding, helps bring clarity to an area where previous studies were mixed.
Some failed to show any connection between sleep patterns and levels of moderate-to-vigorous physical activity. Others did show a connection but pointed in different directions: Experimental studies that controlled how much sleep subjects got found lower next-day physical activity in those who slept less than usual, while epidemiological studies, often conducted via questionnaires of people living freely under normal circumstances, indicated the opposite.
One advantage of the current study is that collecting longitudinal data from two large-scale samples across several months and up to a year allows for both between-participant comparisons and within-participant analysis of an individual’s tracker data under different circumstances.
It also helps, Czeisler said, that the tracker data is objective, reducing problems of bias or difficulty with recall that may be present in questionnaire-based studies.
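For readers unfamiliar with the distinction, the following minimal pandas sketch, using made-up numbers rather than anything from WHOOP or All of Us, illustrates the difference between a between-participant comparison (average each person's nights, then compare people) and a within-participant analysis (ask whether the same person was more active after nights when they went to bed earlier than their own norm). The column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical nightly records for two participants: bedtime (24h+ clock hours)
# and next-day minutes of moderate-to-vigorous physical activity (MVPA).
df = pd.DataFrame({
    "participant": ["a", "a", "a", "b", "b", "b"],
    "bedtime_hr":  [21.5, 23.0, 22.0, 24.0, 25.0, 24.5],
    "mvpa_min":    [45, 30, 40, 20, 15, 18],
})

# Between-participant view: one summary row per person, then compare people.
between = df.groupby("participant")[["bedtime_hr", "mvpa_min"]].mean()
print(between)

# Within-participant view: center each night on that person's own averages, so
# "earlier" means earlier than usual for this person, not earlier than other people.
df["bedtime_dev"] = df["bedtime_hr"] - df.groupby("participant")["bedtime_hr"].transform("mean")
df["mvpa_dev"] = df["mvpa_min"] - df.groupby("participant")["mvpa_min"].transform("mean")
print(df[["bedtime_dev", "mvpa_dev"]].corr())
```

The within-participant view is what lets researchers say something about changes in an individual's own bedtime, rather than only about differences between habitual early and late sleepers.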
Though both sleep and physical activity have been studied previously, Czeisler said the current work is among the largest in sample size and longest in duration to examine the relationship between the two in the setting of everyday life. That’s important because physicians and public health officials often make separate recommendations about how much sleep is optimum and how much physical activity is ideal for good health, but there’s little public health messaging about how one might influence the other.
Busy adults might choose, for example, to get less sleep, rising early to work out. Or they may choose to stay up late with friends on Friday and Saturday nights, which may impact weekend workouts.
“If one of the takeaways is that people are sacrificing sleep for exercise or exercise for sleep, the question becomes what amount of each behavior maximizes health span and lifespan?” Czeisler said. “There are only 24 hours in a day; what is the optimal balance?”
Leota said an important next step is to use the findings to design experiments to determine cause and effect, with the aim of providing a solid foundation for future public health recommendations.
“We would like to test whether encouraging earlier bedtimes directly leads to more physical activity the next day, within an experimental paradigm,” Leota said. “This would provide strong evidence for updating public health messaging to improve population physical activity levels.”
Harvard seeks restoration of research funds
University argues Trump administration violated free speech rights, ignored procedural provisions in federal court hearing
Protesters gathered outside the Moakley Federal Courthouse in Boston on Monday, where Harvard challenged the Trump administration’s termination of billions in research funding.
Charles Krupa/AP Photo
Alvin Powell
Harvard Staff Writer
3 min read
Harvard argued in federal court in Boston on Monday that the Trump administration’s move to terminate billions of dollars in research funding to the University was unconstitutional and violated procedural provisions in civil rights and administrative laws.
The 2½-hour hearing, before U.S. District Judge Allison Burroughs, brought together attorneys from Harvard, the Justice Department, and the American Association of University Professors, which has also sued the government on behalf of Harvard faculty members over its abrupt cancellation of research funding.
During the proceeding, Burroughs spent significant time questioning Justice Department senior attorney Michael Velchik about several topics, including the administration’s contention that contractual terms give it the right to halt funding at any time, and probing the link between that claimed authority and free speech issues.
“If you can make decisions for reasons oriented around free speech, the consequences of that are staggering to me,” Burroughs said.
The Trump administration has cited campus antisemitism in its actions against Harvard. The University’s attorney, Steven Lehotsky, argued that the government has sought to coerce Harvard to give up its autonomy through a series of demands that extend beyond fighting antisemitism.
He noted that the demands include audits of viewpoint diversity among students and faculty and changes to admissions and hiring practices. Those demands amount to a violation of academic freedom and the University’s First Amendment guarantees of free speech, he argued.
“This is a blatant, unrepentant violation of the First Amendment,” Lehotsky said.
In addition, Lehotsky argued that the government’s actions violate procedural provisions in Title VI of the Civil Rights Act of 1964, which require that an investigation be conducted, a hearing held, and findings released before funding is withdrawn.
The government did not engage in a reasoned decision-making process that took into account the interests of those who stand to benefit from academic and medical research, the general public, and others affected by research funding cutoffs, he said.
Making the case for the Trump administration, Velchik shifted the focus of the government’s argument, which had previously centered on claims of an inadequate response to incidents of antisemitism. Rather, Velchik said, the dispute is about money: Harvard wants billions in research dollars restored, and the government is within its rights to decide where it wants those funds to go.
The government’s disagreement with Harvard is at its core a contract dispute, he argued, and government contracts contain language that says funding can be withdrawn at any time.
All sides requested summary judgment in the case, which would avoid a lengthy trial. Burroughs said she would work as quickly as possible but did not set a deadline for a ruling.