If the incredible rise of computing is one of the biggest stories of the twentieth century, then the failure of the nation that invented the electronic computer to capitalize on it is undoubtedly one of history's most important cautionary tales.
In 1944, Britain led the world in electronic computing. The top-secret codebreaking computers that the British deployed at Bletchley Park worked round the clock to ensure the success of D-Day and the Allied victory in Europe. At a time when the best electronic computing technology in the United States was still only in the testing phase, British computers literally changed the world.
After the war, British computing breakthroughs continued, and British computers seemed poised to succeed across the board, competing with US technology on a global scale. But by the 1970s, a mere thirty years later, the country's computing industry was all but dead.
What happened? The traditional history of computing would have you understand this change through the biographies of great men and the machines they designed. It would gesture toward corporations' grand global strategies, and the marketing those companies pushed to define what computers were for an entire generation of workers. It would not, however, focus on the workers themselves. And by ignoring them, it would miss the reasons for this catastrophic failure—a failure that remains a cautionary tale for many other countries today, particularly the United States.