Why Expanding Access to Algebra is a Matter of Civil Rights

Bob Moses, who helped register Black residents to vote in Mississippi during the Civil Rights Movement, believed civil rights went beyond the ballot box. To Moses, who was a teacher as well as an activist, math literacy was a civil right: a requirement for earning a living wage in modern society. In 1982, he founded the Algebra Project to ensure that “students at the bottom get the math literacy they need.”

As a researcher who studies ways to improve the math experiences of students, I believe a new approach that expands access to algebra may help more students get the math literacy Moses, who died in 2021, viewed as so important. It’s a goal districts have long struggled to meet.

Efforts to improve student achievement in algebra have been taking place for decades. Unfortunately, the math pipeline in the United States is fraught with persistent opportunity gaps. According to the Nation’s Report Card—a congressionally mandated project administered by the Department of Education—in 2022 only 29% of U.S. fourth graders and 20% of U.S. eighth graders were proficient in math. Low-income students, students of colour and multilingual learners, who tend to have lower scores on math assessments, often do not have the same access as others to qualified teachers, high-quality curriculum and well-resourced classrooms.

A new approach

The Dallas Independent School District—or Dallas ISD—is gaining national attention for increasing opportunities to learn by raising expectations for all students. Following in the footsteps of more than 60 districts in the state of Washington, the Dallas ISD in 2019 adopted an innovative approach: students are automatically enrolled in honours math in middle school rather than having to opt in.

Under an opt-in policy, students need a parent or teacher recommendation to take honours math in middle school and Algebra 1 in eighth grade. That policy led to both low enrolment and very little diversity in honours math. Some parents, especially those who are Black or Latino, were not aware of how to enroll their children in advanced classes because many districts communicated poorly about the option.

In addition, implicit bias, which exists in all demographic groups, may influence teachers’ perceptions of the behaviour and academic potential of students, and therefore their subsequent recommendations. Public school teachers in the U.S. are far less racially and ethnically diverse than the students they serve.

Dallas ISD’s policy overhaul aimed to foster inclusivity and bridge educational gaps among students. Through this initiative, every middle school student, regardless of background, was enrolled in honours math, the pathway that leads to taking Algebra 1 in eighth grade, unless they opted out.

Flipping the switch from opt-in to opt-out led to a dramatic increase in honours math enrolment among Black and Latino learners, who constitute the majority of Dallas students. And the district’s overall math scores remained steady. About 60% of Dallas ISD eighth graders are now taking Algebra 1, triple the prior level. Moreover, more than 90% are passing the state exam.

Efforts spread

Other cities are taking notice of the effects of Dallas ISD’s shifting policy. The San Francisco Unified School District, for example, announced plans in February 2024 to implement Algebra 1 in eighth grade in all schools by the 2026-27 school year.

In fall 2024, the district will pilot three programs to offer Algebra 1 in eighth grade. The pilots range from an opt-out program for all eighth graders—with extra support for students who are not proficient—to a program that automatically enrolls proficient students in Algebra 1, offered as an extra math class during the school day. Students who are not proficient can choose to opt in. Nationwide, however, districts that enroll all students in Algebra 1 and allow them to opt out are still in the minority. And some districts have stopped offering eighth grade Algebra 1 entirely, leaving students with only pre-algebra classes. Cambridge, Massachusetts—the city in which Bob Moses founded the Algebra Project—is among them.

Equity concerns linger

Between 2017 and 2019, district leaders in the Cambridge Public Schools phased out the practice of placing middle school students into “accelerated” or “grade-level” math classes. Few middle schools in the district now offer Algebra 1 in eighth grade.

The policy shift, designed to improve overall educational outcomes, was driven by concerns over significant racial disparities in advanced math enrollment in high school. Completion of Algebra 1 in eighth grade allows students to climb the math ladder to more difficult classes, like calculus, in high school. In Cambridge, the students who took eighth grade Algebra 1 were primarily white and Asian; Black and Latino students enrolled, for the most part, in grade-level math.

Some families and educators contend that the district’s decision made access to advanced math classes even more inequitable. Now, advanced math in high school is more likely to be restricted to students whose parents can afford to help them prepare with private lessons, after-school programs or private schooling, they said.

While the district has tried to improve access to advanced math in high school by offering a free online summer program for incoming ninth graders, achievement gaps have remained persistently wide.

Perhaps striking a balance between top-down policy and bottom-up support will help schools across the U.S. realize the vision Moses dreamed of in 1982 when he founded the Algebra Project: “That in the 21st century every child has a civil right to secure math literacy—the ability to read, write and reason with the symbol systems of mathematics.”

For more insights like this, visit our website at www.international-maths-challenge.com.

Credit of the article given to Liza Bondurant, The Conversation


Mathematical model reveals commonality within the diversity of leaf decay

The colorful leaves piling up in your backyard this fall can be thought of as natural stores of carbon. In the springtime, leaves soak up carbon dioxide from the atmosphere, converting the gas into organic carbon compounds. Come autumn, trees shed their leaves, leaving them to decompose in the soil as they are eaten by microbes. Over time, decaying leaves release carbon back into the atmosphere as carbon dioxide.

In fact, the natural decay of organic carbon contributes more than 90 percent of the yearly carbon dioxide released into Earth’s atmosphere and oceans. Understanding the rate at which leaves decay can help scientists predict this global flux of carbon dioxide, and develop better models for climate change. But this is a thorny problem: A single leaf may decay at different rates depending on a number of variables, including local climate, soil, microbes and the leaf’s composition. Differentiating the decay rates among various species, let alone forests, is a monumental task.

Instead, MIT researchers have analysed data from a variety of forests and ecosystems across North America, and discovered general trends in decay rates among all leaves. The scientists devised a mathematical procedure to transform observations of decay into distributions of rates. They found that the shape of the resulting curve is independent of climate, location and leaf composition. However, the details of that shape—the range of rates that it spans, and the mean rate—vary with climatic conditions and plant composition. In general, the scientists found that plant composition determines the range of rates, and that as temperatures increase, all plant matter decays faster.

“There is a debate in the literature: If the climate warms, do all rates become faster by the same factor, or will some become much faster while some are not affected?” says Daniel Rothman, a co-founder of MIT’s Lorenz Center, and professor of geophysics in the Department of Earth, Atmospheric and Planetary Sciences. “The conclusion is that all rates scale uniformly as the temperature increases.”

Rothman and co-author David Forney, a PhD graduate in the Department of Mechanical Engineering, have published the results of their study, based largely on Forney’s PhD thesis, in the Journal of the Royal Society Interface.

Litter delivery

The team obtained data from an independent 10-year analysis of North American forests called the Long-term Intersite Decomposition Experiment Team (LIDET) study. For this study, researchers collected leaf litter—including grass, roots, leaves and needles—from 27 locations throughout North and Central America, ranging from Alaskan tundra to Panamanian rainforests.

The LIDET researchers separated and weighed each litter type, and identified litter composition and nutrient content. They then stored the samples in porous bags and buried the bags, each filled with a different litter type, in each of the 27 geographic locations; the samples were then dug up annually and reweighed. The data collected represented the mass of litter, of different composition, remaining over time in different environments.

Forney and Rothman accessed the LIDET study’s publicly available data online, and analysed each dataset: the litter originating at one location, subsequently divided and distributed at 27 different locations, and weighed over 10 years.

The team developed a mathematical model to convert each dataset’s hundreds of mass measurements into rates of decay—a “numerically delicate” task, Rothman says. They then plotted the converted data points on a graph, yielding a surprising result: The distribution of decay rates for each dataset looked roughly the same, forming a bell curve when plotted as a function of the order of magnitude of the rates—a surprisingly tidy pattern, given the complexity of parameters affecting decay rates.
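
To make the conversion concrete, here is a minimal sketch in Python of one way to turn mass-versus-time measurements into a distribution of decay rates: treat the remaining mass as a superposition of simple exponential decays and fit the centre and spread of a lognormal rate distribution. The measurements and fitting choices below are illustrative assumptions, not the team’s actual procedure or the LIDET data.

```python
# Minimal sketch (illustrative assumptions, not the authors' exact method):
# model the remaining litter mass as a mixture of exponential decays whose
# rates k follow a lognormal distribution, then fit that distribution.
import numpy as np
from scipy.optimize import curve_fit

def mass_remaining(t, mu, sigma, n_rates=200):
    """Fraction of mass left at times t for a lognormal distribution of
    decay rates with centre mu and spread sigma in log k."""
    log_k = np.linspace(mu - 4 * sigma, mu + 4 * sigma, n_rates)
    weights = np.exp(-0.5 * ((log_k - mu) / sigma) ** 2)
    weights /= weights.sum()                       # discrete lognormal weights
    return np.exp(-np.outer(np.asarray(t), np.exp(log_k))) @ weights

# Hypothetical annual data: years since burial, fraction of mass remaining.
t_obs = np.arange(11)
m_obs = np.array([1.00, 0.78, 0.63, 0.52, 0.44, 0.38, 0.33, 0.30, 0.27, 0.25, 0.23])

(mu_hat, sigma_hat), _ = curve_fit(mass_remaining, t_obs, m_obs,
                                   p0=(-1.0, 1.0), bounds=([-5, 0.05], [5, 5]))
print(f"centre of log-rate distribution: {mu_hat:.2f}, spread: {sigma_hat:.2f}")
```

Plotting the fitted weights against the logarithm of the rate gives the kind of bell-shaped curve described above; repeating the fit for different litter types and sites is what allows the shapes to be compared across datasets.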

“Not only are there different environments like grasslands and tundra and rainforest, there are different environments at the microscale too,” Forney says. “Each plant is made up of different tissues … and these all have different degradation pathways. So there’s heterogeneity at many different scales … and we’re trying to figure out if there’s some sort of commonality.”

Common curves

Going a step further, Forney and Rothman looked for parameters that affect leaf decay rates. While each dataset resembled a bell curve, there were slight variations among them. For example, some curves had higher peaks, while others were flatter; some curves shifted to the left of a graph, while others lay more to the right. The team looked for explanations for these slight variations and discovered the two parameters that most affected the details of a dataset’s curve: climate and leaf composition.

In general, the researchers observed, warmer climates tended to speed the decay of all plants, whereas colder climates slowed plant decay uniformly. The implication is that as temperatures increase, all plant matter, regardless of composition, will decay more quickly, with the same relative speedup in rate.
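
As a small numerical illustration of what uniform scaling means (the factor of two below is an arbitrary assumption, not a measured speedup): multiplying every decay rate by the same factor shifts the whole bell curve of log rates sideways without changing its width or shape.

```python
# Toy illustration: a uniform speedup multiplies every rate k by the same
# factor, which only shifts the distribution of log10(k); its width is unchanged.
import numpy as np

rng = np.random.default_rng(0)
log_rates_cold = rng.normal(loc=-1.0, scale=0.5, size=100_000)  # log10(k), cooler site
speedup = 2.0                                                   # assumed uniform factor
log_rates_warm = log_rates_cold + np.log10(speedup)

print(f"spread cold: {log_rates_cold.std():.3f}, warm: {log_rates_warm.std():.3f}")
print(f"shift of the mean: {log_rates_warm.mean() - log_rates_cold.mean():.3f}")  # ~0.301
```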

The team also found that plant matter such as needles, which contains more lignin—a sturdy building block—has a smaller range of decay rates than leafier plants that contain less lignin and more nutrients that attract microbes. “This is an interesting ecological finding,” Forney says. “Lignin tends to shield organic compounds, which may otherwise degrade at a faster rate.”

Mark Harmon, principal investigator for the LIDET study and a professor of forest science at Oregon State University, says the team’s results add evidence to a long-held debate over rising temperature’s effect on organic decay: As temperatures rise, decomposition will likely speed up, releasing more carbon dioxide into the atmosphere, which in turn creates warmer temperatures, further speeding decay in a positive feedback loop.

“There is a wide range of results on temperature response,” says Harmon, who was not involved in the study. “Some have proposed that materials that are hard to decompose will respond more to temperature increases, and others have proposed the opposite. The current study indicates they may be the same,” meaning the positive feedback from rising temperatures may not be as strong as others have predicted.

Rothman adds that in the future, the team may use the model to predict the turnover times of various ecosystems—a finding that may improve climate change models, and help scientists understand the flux of carbon dioxide around the globe.

“It’s a really messy problem,” Rothman says. “It’s as messy as the pile of leaves in your backyard. You would think that each pile of leaves is different, depending on which tree it’s from, where the pile is in your backyard and what the climate is like. What we’re showing is that there’s a mathematical sense in which all of these piles of leaves behave in the same way.”

For more such insights, log into our website https://international-maths-challenge.com

Credit of the article given to Jennifer Chu, Massachusetts Institute of Technology


‘Models of everything’ created to improve accuracy of long-term weather forecasting

People love to complain about the weather – and especially about weather forecasters. But real, accurate forecasting beyond five to seven days is immensely complicated, due to the sheer volume of atmospheric processes and factors. Fortunately for us, advances in computing are making it possible for mathematicians, atmospheric scientists and statisticians to create “models of everything,” which may lead to accurate long-range weather forecasts.

NC State mathematician John Harlim is working on one such “model of everything,” specifically for longer-range weather and climate prediction. He’s part of a five-year project led by NYU’s Andrew Majda that is creating simpler, less expensive stochastic models (models that include random variables) for extended range weather and climate prediction.

One major stumbling block to extending and improving weather predictions beyond seven-day forecasts is a lack of understanding of the tropical weather dynamics that drive global weather patterns. The mix of factors in these patterns is amazingly complex. According to Harlim, “The dynamics in the tropics involve hierarchies of processes on both huge scales – like, 10,000 km – and much smaller scales over many months.  Physical processes in individual clouds can affect these larger processes in the long run.

“In terms of a model, then, you would have to resolve the entire globe in one-kilometer chunks, look at every possible weather pattern that could possibly occur over every moment given all sorts of variables, and then scale it up,” Harlim adds. Since this approach is very expensive, computationally speaking, Harlim and his colleagues hope to develop simpler, cheaper models that can capture tropical dynamics and understand their interactions with extratropical weather patterns.
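
The toy model below sketches the general idea of a stochastic model, under simplifying assumptions of my own rather than anything from the Majda-led project: a cheap random process stands in for fast, unresolved small-scale dynamics and forces a slowly evolving large-scale variable, instead of resolving the fast process explicitly.

```python
# Toy stochastic parameterization (illustrative only): the slow variable x is
# forced by an Ornstein-Uhlenbeck process u that stands in for unresolved,
# expensive-to-simulate fast dynamics.
import numpy as np

rng = np.random.default_rng(42)
dt, n_steps = 0.01, 10_000
tau, noise = 0.1, 1.0          # assumed memory and strength of the fast process
x, u = 0.0, 0.0

xs = np.empty(n_steps)
for i in range(n_steps):
    x += dt * (-x + u)                                                  # slow, resolved dynamics
    u += dt * (-u / tau) + noise * np.sqrt(dt) * rng.standard_normal()  # random stand-in
    xs[i] = x

print(f"long-run mean of x: {xs.mean():.3f}, variability: {xs.std():.3f}")
```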

Says Harlim, “Understanding tropical dynamics is the Holy Grail of atmospheric modeling, and if we’re successful, you’ll be able to get accurate weather forecasting for months, not just days, in advance.”

Atmospheric scientist Sukanta Basu is part of a team working on a “model of everything” for atmospheric turbulence by studying airflow over complex terrain, including islands. The team wants to understand how atmospheric turbulence affects laser propagation, but their work could have other applications as well – such as predicting microbursts for aircraft safety or estimating evaporation rates for water management in agriculture. And just like Harlim’s, Basu’s model will have to take a huge number of factors into account.

“We’ll be looking at 10-meter terrain maps, finding out every spatial location and time and what the atmospheric field may look like,” Basu says. “The amount of computational power needed is huge – one simulation can fill up a terabyte disk – so we’re looking at petascale computing, which can do a quadrillion operations per second. We didn’t have computing on this scale ten years ago, so projects like this were impossible.”

For more such insights, log into our website https://international-maths-challenge.com

Credit of the article given to Tracey Peake, North Carolina State University


Math algorithm tracks crime, rumours, epidemics to source

A team of EPFL scientists has developed an algorithm that can identify the source of an epidemic or information circulating within a network, a method that could also be used to help with criminal investigations.

Investigators are well aware of how difficult it is to trace an unlawful act to its source. The job was arguably easier with old, Mafia-style criminal organizations, as their hierarchical structures more or less resembled predictable family trees.

In the Internet age, however, the networks used by organized criminals have changed. Innumerable nodes and connections escalate the complexity of these networks, making it ever more difficult to root out the guilty party. EPFL researcher Pedro Pinto of the Audiovisual Communications Laboratory and his colleagues have developed an algorithm that could become a valuable ally for investigators, criminal or otherwise, as long as a network is involved.

“Using our method, we can find the source of all kinds of things circulating in a network just by ‘listening’ to a limited number of members of that network,” explains Pinto. Suppose you come across a rumor about yourself that has spread on Facebook and been sent to 500 people: your friends, or even friends of your friends. How do you find the person who started the rumor? “By looking at the messages received by just 15 to 20 of your friends, and taking into account the time factor, our algorithm can trace the path of that information back and find the source,” Pinto adds. This method can also be used to identify the origin of a spam message or a computer virus using only a limited number of sensors within the network.
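
The sketch below is a deliberately simplified version of the idea, not Pinto’s published estimator: given the times at which a few monitored “observer” nodes first received a message, score every candidate source by how consistently shortest-path distances explain those arrival times. The network, observer nodes and arrival times are hypothetical.

```python
# Simplified source localization from a few observers (illustrative only).
import networkx as nx
import numpy as np

G = nx.erdos_renyi_graph(100, 0.05, seed=1)              # hypothetical network
observers = [3, 17, 42, 58, 71]                          # the few nodes we "listen" to
arrival = {3: 5.1, 17: 3.9, 42: 6.2, 58: 4.8, 71: 5.5}   # hypothetical first-arrival times

def score(source):
    """Lower is better: variance of the start times implied by each observer,
    assuming the message travels one hop per unit time along shortest paths."""
    dist = nx.single_source_shortest_path_length(G, source)
    if any(o not in dist for o in observers):
        return float("inf")                              # source cannot reach every observer
    return np.var([arrival[o] - dist[o] for o in observers])

best_guess = min(G.nodes, key=score)
print("most consistent source:", best_guess)
```

The published method is more sophisticated, in particular treating transmission delays as random rather than fixed, but the core intuition of matching observed timing against network distances is the same.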

Out in the real world, the algorithm can be employed to find the primary source of an infectious disease, such as cholera. “We tested our method with data on an epidemic in South Africa provided by EPFL professor Andrea Rinaldo’s Ecohydrology Laboratory,” says Pinto. “By modeling water networks, river networks, and human transport networks, we were able to find the spot where the first cases of infection appeared by monitoring only a small fraction of the villages.”

The method would also be useful in responding to terrorist attacks, such as the 1995 sarin gas attack in the Tokyo subway, in which poisonous gas released in the city’s subterranean tunnels killed 13 people and injured nearly 1,000 more. “Using this algorithm, it wouldn’t be necessary to equip every station with detectors. A sample would be sufficient to rapidly identify the origin of the attack, and action could be taken before it spreads too far,” says Pinto.

Computer simulations of the telephone conversations that could have occurred during the terrorist attacks on September 11, 2001, were used to test Pinto’s system. “By reconstructing the message exchange inside the 9/11 terrorist network extracted from publicly released news, our system spit out the names of three potential suspects, one of whom was found to be the mastermind of the attacks, according to the official enquiry.”

The validity of this method thus has been proven a posteriori. But according to Pinto, it could also be used preventatively; for example, to understand an outbreak before it gets out of control. “By carefully selecting points in the network to test, we could more rapidly detect the spread of an epidemic,” he points out. It could also be a valuable tool for advertisers who use viral marketing strategies, leveraging the Internet and social networks to reach customers. For example, the algorithm could help them identify the specific online sources that are most influential for their target audience and understand how information from those sources spreads through the online community.

For more such insights, log into our website https://international-maths-challenge.com

Credit of the article given to Ecole Polytechnique Federale de Lausanne