The following paragraph is from Billion-Dollar Brain by Len Deighton, published in 1966 (almost 60 years ago).
“I don’t want to bore you,” Harvey said, “but you should understand that these heaps of wire can practically think—linear programming—which means that instead of going through all the alternatives they have a hunch which is the right one. What’s more, almost none of them work by binary notation—the normal method for computers—because that’s just yes/no stuff. (If you can only store yesses and noes it takes seven punch holes to record the number ninety-nine.) These machines use tiny chips of ceramic which store electricity. They store any amount from one to nine. That’s why—for what it does—this whole setup is so small.”
ChatGPT: This paragraph is a remarkable little time capsule. In 1966, when Billion-Dollar Brain came out, the general public had only the vaguest idea of what computers actually were, let alone what they could do (see comment below). And yet here, Len Deighton—writing a spy thriller, not a science text—manages to tap into several ideas that were way ahead of their time:
1. “Practically think — linear programming…”
That’s a hell of a phrase. Deighton wasn’t describing AI as we know it today, and linear programming isn’t really “thinking,” but his gloss is a surprisingly fair lay description of the simplex method: rather than grinding through every alternative, it hops from one candidate solution to a better neighboring one until nothing improves (see the sketch below). From there he’s hinting at heuristics, machines that shortcut brute-force searching and “guess” the right path. That’s exactly the trajectory we’ve taken, from rigid rule-based systems to probabilistic models that “intuit” answers. Today’s large language models don’t literally think, but what they do feels close to it.
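A minimal sketch of that idea in Python, assuming SciPy is available (the toy objective and constraints are invented purely for illustration):

```python
# Toy linear program: maximize 3x + 2y
# subject to x + y <= 4, x <= 3, x >= 0, y >= 0.
# SciPy's linprog minimizes, so we negate the objective.
from scipy.optimize import linprog

c = [-3, -2]                      # minimize -(3x + 2y)
A_ub = [[1, 1], [1, 0]]           # x + y <= 4;  x <= 3
b_ub = [4, 3]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)            # optimum at x=3, y=1, objective 11
```

The solver inspects only a handful of corner points of the feasible region instead of the infinitely many points inside it, which is roughly the “hunch” Harvey is reaching for.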
2. “Almost none of them work by binary notation…”
Deighton actually had this one backwards: by 1966 virtually every computer worked in binary or binary-coded decimal, and binary is still the underlying system for modern computing. But fiction takes liberties, and the idea of moving beyond binary (ternary logic, analog computing, quantum bits, neuromorphic chips) has remained a tantalizing frontier. So while the prediction didn’t pan out literally, it captured the sense that computing would evolve past the early constraints of punched cards and “yes/no” logic. His parenthetical arithmetic is sound, though: ninety-nine in binary really does take seven yes/no marks.
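That footnote is easy to check with a couple of lines of plain Python (standard library only):

```python
# Ninety-nine in binary: exactly the seven "punch holes" Harvey mentions.
n = 99
bits = format(n, 'b')     # '1100011'
print(bits, len(bits))    # 1100011 7
print(n.bit_length())     # 7
```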
3. “Tiny chips of ceramic which store electricity…”
He’s gesturing at solid-state memory before it was widespread. In 1966, commercial machines still relied on magnetic-core memory and discrete transistors (vacuum tubes were already on their way out). A cell that “stores electricity” is a fair lay description of the capacitor-based DRAM cell that Robert Dennard was inventing at IBM at almost exactly that moment, and “tiny chips of ceramic” is almost a premonition of microchips.
4. “Store any amount from one to nine…”
This is the most interesting miss, because decimal hardware did exist: ENIAC counted in base ten, and machines like the IBM 650 and 1401 stored numbers as binary-coded decimal (sketched below). What never caught on is a mainstream memory cell holding ten distinct charge levels, at least not until multi-level flash cells started packing several bits per cell decades later. But the metaphor is doing real work: he’s trying to convey density, the idea that storage is getting more compact and nuanced. That’s dead-on: from rooms full of hardware to chips on your wrist.
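For contrast, here’s a small sketch of binary-coded decimal, the scheme those real decimal machines used: one four-bit group per base-ten digit (the to_bcd helper is invented for illustration, not any particular machine’s format):

```python
def to_bcd(n: int) -> int:
    """Pack a non-negative integer as binary-coded decimal,
    one 4-bit nibble per base-ten digit."""
    bcd, shift = 0, 0
    while True:
        bcd |= (n % 10) << shift   # low decimal digit -> low nibble
        n //= 10
        shift += 4
        if n == 0:
            return bcd

print(format(to_bcd(99), 'b'))  # 10011001 -> 8 bits as BCD
print(format(99, 'b'))          # 1100011  -> 7 bits in pure binary
```

Each nibble uses only ten of its sixteen possible patterns, which is exactly the density penalty that helped binary win out.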
Big Picture:
Deighton wrote this in the era of Cold War paranoia and James Bond gadgets, but what’s striking is his instinct that computers would get smarter, smaller, and stranger. That’s precisely what’s happened. The paragraph reflects a world on the brink of computing revolution, grasping at metaphors to describe something barely understood—and it still resonates.
In 1966, computer science was still young but moving fast. Here’s a snapshot of the state of the art around that time, across key areas: https://www.smays.com/wp-content/uploads/2025/05/computer-science-1966.pdf