
This week we learned about freshwater snails that could help AI engineers design brains for robots, scientists who are seeking to sequence synthetic human DNA and software that turns computer cameras into eye-tracking devices that can gather information about what content grabs our attention online. Take a look.
Will Snails Get Us To AI Faster?

Freshwater snail (Bithynia) on a pond bottom. Research on the snail’s neurons “will eventually help us design the ‘brains’ of robots based on the principle of using the fewest possible components necessary to perform complex tasks.” Image credit: Getty Images
Scientists studying snails at the University of Sussex in England described how just a pair of neurons can give rise to “complex behavioral decisions” in the brain. The team studied the brain activity of freshwater snails and monitored their behavior while they were searching for food. They observed that when one neuron registered the potential for food, a second neuron told the brain whether the snail was hungry or not. The team wrote that the system “enabled the snails to save energy by reducing brain activity when food is not found.” George Kemenes, the Sussex professor who led the study, said it “reveal[ed] for the first time how just two neurons can create a mechanism in an animal’s brain which drives and optimizes complex decision-making tasks. It also shows how this system helps to manage how much energy they use once they have made a decision.”
“Our findings can help scientists to identify other core neuronal systems which underlie similar decision-making processes,” Kemenes added. “This will eventually help us design the ‘brains’ of robots based on the principle of using the fewest possible components necessary to perform complex tasks.” The findings were reported in the journal Nature Communications.
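To make the two-neuron idea concrete, here is a minimal sketch in TypeScript of a decision gate in the spirit of what the researchers describe: one node signals whether potential food has been registered, a second signals hunger, and the circuit stays quiet when no food is found. The names and logic here are illustrative assumptions, not the Sussex team’s actual model.

```typescript
// Toy two-node decision circuit (illustrative only, not the published model).
type SnailState = { foodDetected: boolean; hungry: boolean };

function decideToFeed(state: SnailState): { feed: boolean; activity: number } {
  // "Sensory" node: fires only when potential food is registered.
  const sensory = state.foodDetected ? 1 : 0;
  // "Motivational" node: reports whether the animal is hungry.
  const motivational = state.hungry ? 1 : 0;

  // No food signal: nothing to gate, so overall activity stays minimal,
  // mirroring the energy-saving behavior described in the study.
  if (sensory === 0) {
    return { feed: false, activity: 0 };
  }
  // Food present: the hunger signal decides whether to commit to feeding.
  return { feed: motivational === 1, activity: sensory + motivational };
}

console.log(decideToFeed({ foodDetected: true, hungry: true }));  // { feed: true, activity: 2 }
console.log(decideToFeed({ foodDetected: false, hungry: true })); // { feed: false, activity: 0 }
```

The point of the sketch is the design principle Kemenes highlights: seemingly complex behavior can emerge from very few components when one of them gates the other.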
Leading Geneticists Plan To Make Human DNA In The Lab

One of the goals of the project will be “to synthetically create an entire human genome,” according to the New York Times.
Geneticists working at America’s leading universities, nonprofits and the company Autodesk announced a $100 million project to create an artificial human genome. The effort, called The Human Genome Project-write (HGP-write), “will use the cellular machinery provided by nature to ‘write’ code, constructing vast DNA chains,” the team said in a news release. “This grand challenge is more ambitious and more focused on understanding the practical applications than the original Human Genome Project, which aimed to ‘read’ a human genome,” George Church, genetics professor at Harvard Medical School, said in a statement. “Exponential improvements in genome engineering technologies and functional testing provide an opportunity to deepen the understanding of our genetic blueprint and use this knowledge to address many of the global problems facing humanity.” The team’s goal is to launch the project this year.
The Universe May Be Expanding Faster Than We Thought
Two decades ago, a group of astronomers made the startling discovery that the expansion of the universe is accelerating, blowing itself apart ever faster. Now their colleagues, using the Hubble Space Telescope, have followed up with a new set of measurements suggesting that the universe is expanding as much as 9 percent faster than previously thought. The team, led by Nobel Prize winner Adam Riess, used the telescope to measure the distance to about 300 special stars in a number of distant galaxies. They then used the data to recalculate the rate of expansion of the universe, known as the Hubble constant. “If we know the initial amounts of stuff in the universe, such as dark energy and dark matter, and we have the physics correct, then you can go from a measurement at the time shortly after the Big Bang and use that understanding to predict how fast the universe should be expanding today,” Riess said in a statement. “However, if this discrepancy holds up, it appears we may not have the right understanding, and it changes how big the Hubble constant should be today.” The study was published in the Astrophysical Journal.
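The quantity at stake, the Hubble constant H0, is the proportionality factor in Hubble’s law, which relates a galaxy’s recession velocity v to its distance d as v = H0 × d. The TypeScript sketch below uses hypothetical numbers for a single galaxy, plus the roughly 73 and 67 km/s per megaparsec figures commonly quoted for the local and early-universe estimates, to show how a distance and a velocity yield an H0 value and where the roughly 9 percent gap comes from.

```typescript
// Back-of-the-envelope Hubble-constant illustration (hypothetical galaxy values).
const distanceMpc = 20;        // assumed distance from a standard-candle calibration, in megaparsecs
const recessionKmPerS = 1460;  // assumed measured recession velocity, in km/s

// Hubble's law: v = H0 * d, so H0 = v / d.
const h0Local = recessionKmPerS / distanceMpc;
console.log(`Inferred H0 = ${h0Local.toFixed(1)} km/s per Mpc`); // 73.0

// The tension described above: the locally measured value sits well above
// the value predicted from early-universe observations.
const h0Measured = 73;   // rough local measurement
const h0Predicted = 67;  // rough early-universe prediction
const gap = (h0Measured - h0Predicted) / h0Predicted;
console.log(`Discrepancy = ${(gap * 100).toFixed(0)} percent`);  // about 9
```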
Supercomputers Simulate Supernova Explosions To Help Us Understand How We Are Made

Top image and above: Researchers from Michigan State University are using Mira to perform 3-D simulations of the final moments of a core-collapse supernova’s life cycle. This visualization is a volume rendering of a massive star’s radial velocity. In previous 1-D simulations, none of the structure seen here would have been present. Image credit: Sean Couch, Michigan State University
When Dmitri Mendeleev classified nature’s elements in his periodic table in 1869, he knew their properties but didn’t know how the building blocks of planets and life came to be. Since then, we’ve figured out that chemical elements are cooked up by stars, with the exception of hydrogen and helium, which were created during the Big Bang. But we are still working out exactly how. That’s why researchers at Michigan State University are now using America’s most powerful supercomputers to simulate supernova explosions. When stars “go supernova,” they essentially blow themselves to pieces, creating and releasing the chemical elements that make up the visible stuff in the universe. However, the mechanisms that drive supernova explosions are still not well understood. “If we want to understand the chemical evolution of the entire universe and how the stuff that we’re made of was processed and distributed throughout the universe, we have to understand the supernova mechanism,” Sean Couch, assistant professor of physics and astronomy at Michigan State University, said in a statement. Couch and his colleagues are using Mira, the 10-petaflops supercomputer located at the U.S. Department of Energy’s Argonne Leadership Computing Facility, “to carry out some of the largest and most detailed three-dimensional … simulations ever performed of core-collapse supernovas.”
New Software Turns Computer Cameras Into Eye Trackers
Scientists at Brown University have developed software that turns computer cameras into eye trackers “that can infer where on a webpage a user is looking. The software can be added to any website with just a few lines of code and runs on the user’s browser,” the team said in a statement. “We see this as a democratization of eye-tracking,” said Alexandra Papoutsaki, a Brown University graduate student who led the development of the software. “Anyone can add WebGazer to their site and get a much richer set of analytics compared to just tracking clicks or cursor movements.” The team will present a paper describing the software in July at the International Joint Conference on Artificial Intelligence. The WebGazer code is freely available online.
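For readers wondering what “just a few lines of code” looks like in practice, here is a small TypeScript sketch that follows the integration pattern shown in WebGazer’s public documentation: load the webgazer.js script, register a gaze listener, and start the tracker. Method names and behavior may vary between versions, so treat this as an outline rather than a drop-in snippet.

```typescript
// Minimal WebGazer integration sketch. webgazer.js is normally loaded with a
// <script> tag, so it is declared here as a browser global.
interface WebGazer {
  setGazeListener(
    listener: (data: { x: number; y: number } | null, elapsedMs: number) => void
  ): WebGazer;
  begin(): void;
}
declare const webgazer: WebGazer;

webgazer
  .setGazeListener((data, elapsedMs) => {
    if (data === null) return; // no gaze prediction for this video frame
    // data.x and data.y are the predicted gaze coordinates in page pixels,
    // which a site could aggregate into attention analytics.
    console.log(`gaze near (${Math.round(data.x)}, ${Math.round(data.y)}) at ${elapsedMs} ms`);
  })
  .begin(); // start webcam capture and gaze prediction in the browser
```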