
The 62-Year-Old Child Genius

Joseph Bates gestures towards his laptop screen during a recent interview in his home in Newton, Massachusetts.

In 1969, a very smart 13-year-old began his undergraduate college education—a move that would come to influence how we think about gifted children for the next four decades.
A double exposure portrait of Bates and a circuit board.
Bates removes a circuit board containing his S1 computing chips.

WHEN JOSEPH BATES TOOK THE SAT, the old standby of standardized tests, nearly fifty years ago, he became the so-called "gifted" kid who sparked one of the longest-running and most influential studies of mathematically precocious youth. By the time Doogie Howser, M.D. appeared on television in 1989, Bates had already enrolled in college at 13, graduated with a bachelor’s and master's degree at 17, and attained a doctorate in computer science. The New York Times called Bates a prodigy. Researchers labeled him the most gifted of the gifted, ranked among those who scored in the top 1 percent of 1 percent nationwide—a label reserved for those who scored in the 700s on the SAT before they turned 13. A nerd, you might say, just as nerds were truly coming to rule the world. To Bates, these distinctions are a footnote rather than a badge of pride. Now a 62-year-old computer scientist, he's tackling a different kind of learning: machine learning. Bates recently designed a computing chip to be deliberately imprecise.

If this sounds somewhat counterintuitive, that’s because it is. (Ask one of these chips to add one and one, and it might spit out 1.98.) Bates believes that the chip's “good enough” approximations, rather than exacting mathematical computations, are necessary to accelerate the bid to build artificial intelligence. Running deep neural networks—the software algorithms trained to recognize patterns—requires tens of thousands of mathematical operations per second; to make machines that can learn to act more intelligently and actualize the dreams of AI, computational speed must take precedence over accuracy. Bates calls his design the S1 approximate computing chip.
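To make the tradeoff concrete, here is a minimal sketch in Python with NumPy—a toy simulation of approximate arithmetic, not the S1’s actual circuitry—showing that the dot products at the heart of a neural network tolerate multiplies that are only good to a couple of significant digits:

```python
import numpy as np

rng = np.random.default_rng(0)

def approx(x, digits=2):
    # Simulate "lousy arithmetic" by injecting relative error, so each
    # value is only good to roughly `digits` significant digits.
    # (A hypothetical model of approximate hardware, not the S1 itself.)
    noise = 1.0 + rng.uniform(-0.5, 0.5, np.shape(x)) * 10.0 ** -digits
    return x * noise

# With ~1 good digit, "one plus one" can come out as something like 1.98.
print(approx(1.0, digits=1) + approx(1.0, digits=1))

# The core neural-network operation: a large dot product of weights and
# activations. Every multiply below is slightly wrong, yet the
# accumulated result typically agrees with the exact answer to a couple
# of digits -- roughly the accuracy Bates ascribes to neurons.
w = rng.standard_normal(10_000)  # stand-in for learned weights
x = rng.standard_normal(10_000)  # stand-in for input activations

print(f"exact:       {np.dot(w, x):.3f}")
print(f"approximate: {np.sum(approx(w * x)):.3f}")
```

Because the many small errors are independent, they largely cancel in the accumulation—which is why trading precision for speed can pay off in this kind of workload.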

The morning we meet at Bates’s blue clapboard Victorian on a leafy, suburban street in Newton, Massachusetts, his wife, Kristin Loeffler, and their two kids are downstairs. Bates occasionally works from home in a carpeted, third-floor office that overlooks the street. On his desk, he stores one of his S1 chips like a precious jewel, nestled inside a small, black ring box. He opens the box to show me the chip: it’s about the size of a postage stamp, the color of a putting green, and layered with gold circuitry. Making the first S1 prototype (a batch of several hundred chips) cost over $1 million and proved far more difficult and time-intensive than he’d imagined. Bates mocked up his initial designs for the S1 on Post-It notes, sticking them on his dresser. He thought an approximate computing chip would take two years to build. It took 14.

Bates runs a program on his laptop at home.

Bates swivels around in his chair and pulls out a demo system. In the center of a general-purpose circuit board about the size of his MacBook sits a single S1 chip, surrounded by supporting components—“life support,” he calls it. Bates plugs in the board, connects it to his laptop, and begins running an algorithm he wrote with a student at MIT in 2016 that implements what’s known as optical flow. Facing the laptop camera, he appears onscreen overlaid with red and green arrows, depending on the direction he’s moving. Bates says it’s a bit like the quality of dinosaurs’ vision in Jurassic Park: “They say, ‘Stand still, and they won’t be able to see you.’” When he stops talking, all the little arrows disappear.
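The effect is easy to approximate at home on stock hardware. Below is a hedged sketch using OpenCV’s off-the-shelf Farneback optical-flow routine—an assumed stand-in, not the algorithm Bates and his student wrote for the S1—that overlays red or green arrows on a webcam feed depending on the direction of motion:

```python
import cv2

# Dense optical flow from a webcam, in the spirit of the demo described
# above. This is an ordinary CPU implementation, not the S1's version.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    prev_gray = gray

    # Draw an arrow on a sparse grid: green for rightward motion, red
    # for leftward. The arrows vanish when the scene stops moving --
    # the "Jurassic Park" effect Bates describes.
    step = 24
    for y in range(step // 2, frame.shape[0], step):
        for x in range(step // 2, frame.shape[1], step):
            dx, dy = flow[y, x]
            if dx * dx + dy * dy < 1.0:  # ignore near-still pixels
                continue
            color = (0, 255, 0) if dx > 0 else (0, 0, 255)
            cv2.arrowedLine(frame, (x, y),
                            (int(x + dx), int(y + dy)), color, 1)

    cv2.imshow("optical flow", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()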

Bates lays several objects on his desk and plugs in an external webcam. He runs an algorithm referred to as a neural network, loosely modeled after the human brain. The neural network was previously trained to recognize familiar statistical patterns in each frame and identify objects captured by the camera, even when the image initially looks like nothing more than digital noise. The words “NEMATODE STINGRAY VELVET TICK” pop up onscreen. “A lot of weird categories,” Bates says. He picks up the camera and points it at a white ruler. “RULER,” the screen says. “SLIDE RULE.” He points the camera at a pen. “BALLPOINT PEN.” The novelty is not that a machine can learn to readily see objects in a deluge of video data. “What’s interesting are these numbers,” Bates says, pointing to the screen, where it indicates how much power is being used. “A third of a watt.” The S1 is considered very low-power; the chip feels cool to the touch even when in use, yet its processing power lies about halfway between that of a high-end laptop and the human brain, which performs somewhere on the order of 10 to the power of 17, or 100 quadrillion, events per second.
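The recognition side of the demo has an accessible analogue, too. The sketch below uses torchvision’s pretrained ResNet-18 on an ordinary CPU—an assumption for illustration, since the network Bates runs on the S1 is his own—to label a single webcam frame with the same kind of “weird” ImageNet categories:

```python
import cv2
import torch
from torchvision import models
from torchvision.models import ResNet18_Weights

# Label one webcam frame with a pretrained ImageNet classifier (an
# off-the-shelf stand-in for the S1 demo's network).
weights = ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).eval()
preprocess = weights.transforms()    # resize, center-crop, normalize
labels = weights.meta["categories"]  # the 1,000 ImageNet class names

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
assert ok, "could not read from the webcam"

# OpenCV delivers BGR uint8 HxWxC; the model expects an RGB CxHxW tensor.
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
batch = preprocess(torch.from_numpy(rgb).permute(2, 0, 1)).unsqueeze(0)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)

top = probs.topk(4)  # a handful of "weird categories," as onscreen
for p, i in zip(top.values[0].tolist(), top.indices[0].tolist()):
    print(f"{labels[i]:20s} {p:.2f}")  # e.g. "ballpoint  0.62"
```

On a laptop this draws orders of magnitude more than a third of a watt, which is the point of Bates’s comparison.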

“Neurons, synapses—they get three digits of accuracy,” Bates says. “Maybe it’s more like two. Yet somehow the architecture of the brain can do amazing things using such lousy computers.” For the past half century or so, the semiconductor industry has been stuffing more transistors onto smaller chips, which is why your phone has more computing power than early, room-sized mainframes. But it’s become increasingly difficult to boost speeds while simultaneously shrinking chip size. “So I said, ‘Well, what would happen if you made computers do lousy arithmetic?’” Bates says.

Bates connects a one-chip demo board (at left) to his laptop via two adaptors.

The S1 is tiny, a testament to Bates’s concentration and more than a decade of persistence. He thinks it’ll be possible to shrink the S1 onto a board smaller than a child’s pinky nail and draw out its microarchitecture at 10 nanometers—thinner than a sheet of printer paper sliced horizontally 10,000 times. The S1 is among the tools being developed to help reach what is commonly known as artificial intelligence, at a time when AI has become shorthand for the algorithmic magic poised to reshape the fabric of society. By some estimates, fewer than 10,000 people have genuine expertise in the subject, yet seemingly intelligent machines are widely expected to speed up the discovery of pharmaceutical drugs, take the form of self-driving cars, and help researchers make sense of everything from genomics to quantum physics. Put the computing power of today’s data centers inside a pair of glasses, and you could run seamless gesture and object recognition without a scalding-hot processor burning your temple. (Overheating was an engineering pitfall of Google Glass, the computerized face-wear that debuted in 2012.) Imagine hearing aids, Bates says, that whisper in your ear, “Remember this guy? You met him last year at this dinner,” or, “Remember, the keys are in the door.” Bates, the grown-up whiz kid, wants to increase the scope of our experience and intelligence.


Bates remembers the first day of classes in 1969, when he became a freshman at Johns Hopkins University. A shy 13-year-old, he lived in Pikesville, a suburb of Baltimore, and his parents drove him into the city and dropped him off on campus. To his mind, there was nothing extraordinary about taking college classes along with hundreds of other incoming freshmen, nearly all of whom were male, and he saw no reason to flaunt his youthfulness. He took 13 credits of classes, earning a 3.69 grade point average. The following year, one of his new best friends told him he looked a little young. After Bates patiently explained that he was a little young, the two resumed calculus class as if nothing had happened. It wasn’t that Bates particularly loved math or found some spectacular meaning in linear algebra or number theory. It was more that he simply loved machines, and math is what makes machines possible. He remembers going to the zoo with his mother and trying to figure out how the pipes conveyed water out of the ground and through a series of long, cylindrical tubes. At home, he coiled wire around nails, creating electromagnets. His parents encouraged him, but not—as he remembers it—in an overbearing or controlling way. Bates later told a reporter from the New York Times that he became bored with school, but was passive and simply endured. In seventh grade, an idealistic young math teacher at his public school let him join the eighth-grade math team, then agreed to tutor him after school. Together, Bates and the teacher enrolled in a summer programming class at Johns Hopkins University. That’s where Professor Doris Lidtke noticed that Bates was teaching FORTRAN, the first high-level programming language, to grad students. Lidtke called her colleague Julian Stanley and told him, “We’ve got to do something with this kid.”

Bates in 1973. Photo permission from The Baltimore Sun.

Julian Stanley worked in Hopkins’s education department, which later became the psychology department. He had a faint Georgia accent and was abnormally tall and bright. High school had bored Stanley, but, according to one of his former colleagues, what really cured him of intellectual laziness was the monotony of being stationed abroad with the Army Air Corps’ chemical-warfare service during WWII. When Stanley returned to the U.S. at the age of 32, he threw himself into his studies, attending Harvard on the GI Bill and developing an obsession with statistics—particularly psychometrics, the quantitative study of cognitive performance. (Stanley had a way with numbers, remembering people’s test scores for years; he also had a way with words, saying things like, “There are more ways to kill a cat than to choke it on butter.”) “I used to think that IQ almost guaranteed success,” Stanley told The New Yorker in 2004, one year before his death, in a story about “nerd camps.” “But I found with bitter experience that it’s not true. It can almost be a burden to you.” Prodigious intelligence is special, but, as Andrew Solomon writes in Far From the Tree, his 2012 book on exceptional children, the burden of being gifted or being labeled as such can sometimes feel more like a disability. (The word prodigy derives from the Latin prodigium, which means “omen” or “portent,” but can also mean “monster.”) Stanley personally recognized the challenge of being understimulated by an educational system designed for normalcy, and he knew that a small percentage of school-aged children struggled with tedium and frustration.

Stanley did not immediately jump at the opportunity to meet with Bates. But when he did, as one of Stanley’s former colleagues put it, something “sparked in him.” Over the course of several Saturdays, Bates took a battery of intelligence tests. Then, in what was considered a radical decision, Stanley gave the seventh grader the SAT, the college-entrance exam usually reserved for high-school juniors and seniors. (In 1995, the College Board recalibrated the average score on each test to be 500, out of a maximum 800, but Bates’s scores, in the 700s, would not have changed much.) Bates remembers thinking the test was fun, like a puzzle; his math score was exceptional, on par with incoming college freshmen. Stanley tried to convince a local high school to let Bates take 11th-grade courses but was rebuffed; as Stanley put it, he “reluctantly” agreed to let Bates into Hopkins, expecting him to fare poorly. Four years later, Bates graduated with a bachelor’s and master’s degree in quantitative studies and computer science.

A computer chip rests in a jewelry case in Bates’s home.
Using his proprietary chip, Bates’s computer runs an algorithm that shows the direction his face is moving.

Word spread locally, and Stanley began guiding other gifted teenagers, helping them select classes at Hopkins that favored their particular strengths. This mentoring reset the course of his career, and he went looking for other students who weren’t being challenged. In 1972, with a grant from Chicago’s Spencer Foundation, Stanley administered the math portion of the SAT to 450 Baltimore-area seventh and eighth graders, then selected the highest-scoring students for participation in what became known as the Study of Mathematically Precocious Youth (SMPY). In 1979, Stanley created the Johns Hopkins Center for Talented Youth, which hosts summer camps for gifted school-age children. Since then, tens of thousands of students have participated at Hopkins as well as in similar talent searches—hosted by colleges including Northwestern, Duke, the University of Iowa, and the University of Denver—inspired by Stanley’s work.

From the beginning, the concept of admitting school-age students into college has aroused suspicion. After Stanley accepted Jonathan Edwards—the second teen to enter Hopkins, one year after Bates—some faculty members objected. As Bates recalls, one professor said, “‘These kids can’t do this. It’s too hard.’ And [Stanley] asked, ‘So, how’s Jonathan doing?’ ‘What about him?’ ‘Well, he’s 17.’” (The professor had apparently failed to recognize the exceptionally young student in his midst.) To Bates, putting talented students on an accelerated track was a nonissue. He recognized it was not for everyone, but he felt lucky. And if he achieved a kind of modest fame for being “student zero,” Bates took that attention as an opportunity to share his opinion: everyone ought to have the freedom and flexibility to get the education that best suits their ability.


In 1973, Bates—17 years old, with two degrees under his belt—enrolled at Cornell as a doctoral student in computer science. As he told a reporter from the New York Times in 1975, he found it easier to make friends at his new school. At Hopkins, Bates said, “I didn’t date much, but I wasn’t ready for it. Now, I have more time for dating.” The Los Angeles Times reported that while at Cornell, he also became an expert figure skater. The burdens Bates recalls came not from parents, teachers, and the occasional reporter, but from his own unrealistic expectations—a common refrain, he learned, among the "gifted." At Cornell, he set his sights on building intelligent machines; he’d go on to work as a professor at Carnegie Mellon throughout the 1990s, exploring the applications of artificial intelligence in the arts. He’s best known for Oz, a computer system for presenting interactive dramas, and for inventing a virtual world inhabited by “Woggles,” autonomous creatures intended to generate lifelike emotional reactions. Since then, Bates has been affiliated on and off with MIT.

A double exposure portrait of Bates and his S1 chip, which took 14 years to build.
Bates holds a news clip from 1993, which featured a photograph of him while he was a professor at Carnegie Mellon University.

His career arc is not that unusual, at least among the SMPY cohort. We know that because Julian Stanley and his protégée, Camilla Benbow—an educational psychologist at Vanderbilt University who now coordinates the SMPY with her husband, psychologist David Lubinski, and Harrison Kell of the nonprofit Educational Testing Service—peppered thousands of SMPY alumni with questions. A 40-year update on the first two cohorts breaks down their career paths, and the data is revealing: these 1,650 students, who scored in the top 1 percent as adolescents, were twice as likely to earn PhDs compared to a control group. Few became mathematicians, per se; men tended to go into computer science, physics, or electrical engineering, while many women gravitated toward health care. Not everyone involved valued the experience, though, and some said that being part of the SMPY was a source of embarrassment.

Along with the stigma attached to prodigious ability, there’s the perception of prejudice. While accelerated programs for the gifted, intentionally or not, can maintain a social and economic caste system in schools, the SMPY data suggest that mental acuity early in life correlates with adult achievement more than other environmental and socioeconomic variables do. Standardized tests do have limits; researchers also found that the standard measures of ability and intelligence do not adequately account for exceptional spatial reasoning. Even so, analytical reasoning appears to be a remarkable proxy for future success in science, technology, and math, as long as the analytical is combined with the critical. With any scientific or creative activity, as Bates said in a 1992 interview published in the Journal for the Education of the Gifted, “You have to be able to invent new, strange stuff, and you have to be able to throw out most of it.”


Bates now runs a startup called Singular Computing, which has offices in Kendall Square in Cambridge, Massachusetts, as well as some hardware residing at a contractor off I-495, Boston’s outer beltway. Over the years, he has received funding from the Office of Naval Research, the science-and-technology research arm of the U.S. Navy and Marine Corps, which hopes to outfit autonomous vehicles with on-board, very-low-power approximate chips to avoid the delays and distortions introduced by transmitting data underwater. The U.S. Department of Defense’s Defense Advanced Research Projects Agency (DARPA) also funded his work on approximate computing chips, as part of a program focused on processing drone footage for gesture and object recognition. (Bates is also working with at least two commercial firms, which he declined to name.) “Maybe this year somebody will take it and scale it up from our prototypes to kinds of machines that would enable a big jump forward in Computation X—where X is, you know, biology or medicine or chemistry or robotics or machine learning,” Bates told me. He sounded optimistic.

The list of potential applications is seemingly endless, but not without its downsides. Bates worries about what will happen when artificial intelligence—which he sees no way of stopping—ends up in “countries with political systems that make me nervous.” Plus, the millions of dollars that stand to be made have sucked many of the people most capable of understanding artificial intelligence out of academia, favoring the development of commercial applications for companies such as Google, Facebook, Amazon, and Microsoft. Bates has an egalitarian vision for the future of AI: by putting the most advanced machines in the hands of artists and scientists, he suspects, they’ll do all sorts of things that are currently unimaginable. “If you’re a grad student who just wants to wake up at 2 in the morning with some weird idea: ‘I don’t want to have to wait or pay; I just want to try it right now on my massive, super-huge computer—my ten-to-the-seventeenth-per-second computer.’”

Bates talks with the writer at his desk in his home office.

If the ultimate purpose of intelligence research is to enhance intelligence, Bates has spent much of his life to that end. His S1 chip was designed to run software algorithms capable of solving complex problems, ultimately enabling computers and systems that seemingly understand something about the content of the material being learned. Bates moves about with a calm serenity, speaking patiently and deliberately in an almost meditative manner—an outward demeanor that seems to balance his blazing, single-minded determination with regard to all things computational. His latest project is unique, but it’s also a culmination of everything he’s learned. In a way, Bates is explicitly seeking ways to give as many people as possible the kind of educational opportunity and flexibility he was given—a genius that isn’t rarefied.

Artificial intelligence is often described as imitating human intelligence; Bates sees AI as a set of technologies to augment it. “I’m getting old. I’ve seen the founders of AI die before their dreams were reached,” he said. “My attempt to avoid that fate is to see if I can get the computing power of a Google data center into the back room at Stanford Computer Science, or MIT, or [Carnegie Mellon], or elsewhere, and then give thousands of grad students a decade to work with that much computing power … and see what they come up with. It’s building a tool that I hope will enable real or next-generation AI.” Seen this way, Bates’s most lasting accomplishment would be machines that could enable the kind of experiment he took part in—if you want to use the best machines to learn, or to solve whatever problem you see, you should have the opportunity, whether you’re 13, or 31, or 103. It’s a gift from the gifted, if you will.

Downstairs, Bates’s daughter, 14, is sitting at the dining-room table doing calculus homework. His son, 9, is sprawled on the floor, holding an iPhone and asking about a playdate. I ask Bates’s wife, Kristin Loeffler, about her husband’s work ethic. “Like a dog with a bone,” she says. On a doorjamb near the kitchen, the couple has a chart showing a screen-time budget for their kids. All the time kids spend on devices, she says, is “a giant experiment we are perpetuating on a generation, and we really don’t know what we’re getting into.” Bates chimes in, “It’s only going to get worse.”

Outside, the skies are overcast. Leaves blow across the street. What does the man who went to college at 13 have planned for his children today? Ordinary Sunday things, Bates tells me; maybe they’ll go to the Charles River Museum of Industry, or to the hardware store. He says his daughter is happy in her high school and feels no desire to leave it early. The calculation is pretty simple, he says. “I want people to have an education that excites them and teaches them and helps them feel knowledgeable.”

This article was originally published as part of the Off Topic newsletter.
