10 Years after “Academic Revolution and Regional Innovation: The Case of Computer Science at Stanford 1957-1970”

The sleepy suburban streets of Silicon Valley, with Stanford at its epicenter, were once well-tended farmlands, a lush region of physical toil and abundant agricultural exports that fed the frontiers and the fronts of world wars. Over the course of the last century, that rustic life was transformed first into the postwar hub of the West Coast defense industry during the 1950s, then into the fabrication center of the nascent and soon ubiquitous semiconductor industry, before the region all but died out with a surprise whimper in the 1980s, decimated by the offshoring of hardware manufacturing to Japan and the Asian Tigers of South Korea, Singapore and Taiwan.

From that economic nadir came the longest bull run Silicon Valley had ever witnessed: the sequential invention and rise of the consumer internet, mobile, cloud, artificial intelligence and more, not to mention the coursing rivers of biotech burbling between the corporate complexes of software giants.

Silicon Valley has engraved itself into the economic chronicles of the globe, becoming an elemental metonym for innovation the world over. Despite its riches and influence though, the region’s origins — and Stanford’s unique place in that story — remain a strangely under-studied phenomenon. A passel of works, particularly over the past decade, have attempted to narrate that story, some better and some worse, but none have definitively answered: why there and why then?

It was a research problem that I was intensely curious about as an undergraduate at Stanford, and one that I wanted to solve through the assiduous work of the historical scholar. So for two years — and thanks to an Undergraduate Research Fellowship — I perused the Stanford historical archives for all the evidence that I could find that connected the university into the wider region that it inhabited.

What I discovered surprised me. Stanford, something of a runner-up university in the 1950s and 1960s, was desperate for grants and cash that could accelerate its ambitions to move to the top of the academic league tables dominated by East Coast Ivies. The nascent field of “computer science” offered the university’s administration a unique opportunity to bring in money simultaneously from government grants via ARPA (now DARPA) as well as industry sources from the new computer companies springing up in the region to solve problems from accounting to energy modeling.

There was just one problem: influential members of the Stanford faculty despised the emerging field. Computing at the time wasn’t an “intellectual” field, but rather a mechanical one: less theory and more punch cards, grease, gears and the giant, floor-spanning whirring machines that calculated numbers and processed handwritten code. The results of science and humanities fields were inscribed in journal articles, but computer science — which was evolving rapidly then just as it is today — placed the highest weight on academic conference papers, in what seemed like superfluous work to researchers in traditional disciplines.

Stanford’s Computer Science division (later upgraded to department) would have to navigate a dual identity: offering enough intellectual cover to its own faculty to avoid internecine warfare with other academic departments, while being pragmatic enough to be useful to industry and bring in the lucre that would subsidize its ambitious growth. George Forsythe, a numerical analyst who played the leading role in developing Stanford Computer Science and ultimately its connections to Silicon Valley writ large, decided alongside John Herriot and later John McCarthy to place algorithms at the center of the enterprise, in contrast to universities like MIT, Carnegie Institute of Technology (which transformed into Carnegie Mellon in 1967) and Berkeley, which focused on systems engineering.

There would be casualties. One of computer science’s most famed researchers, Marvin Minsky, barely received an offer from Stanford amid the ferocious fight to establish computer science as a discipline there, and ultimately headed to MIT. For more than two decades until 1986, the department wouldn’t let undergrads major in the field, diverting them instead into a major known as Mathematical and Computer Sciences (my own major, which was rebranded in 2022 as Data Science). All the while, leaders of computer science felt an impending sense of financial doom, given hefty student enrollments and low relative university support.

As is so often the case in history though, fortuitous decisions, administrative favor and sheer luck allowed the department and its researchers to persevere, and the white-hot forge of the academic fights at the time forever forced the department to find a diverse range of revenue sources — a model that continues to the present day at Stanford and is now considered a standard for operating a modern engineering department at a university.

As I look back on my thesis a decade on, what I am most surprised by is how little meticulous historical research has really been conducted on Silicon Valley and its institutions. Everyone has a theory, but few writers have done the painstaking work of tracking down thousands and thousands of letters, budgets, memos and miscellanea that represent how economic history comes to be embodied in the larger world.

The downside to rigor is the demise of a grand narrative — and there really isn’t one when it comes to the origins of Silicon Valley. Stanford’s Computer Science department — much like the startups that now inhabit its environs — was never well funded historically, but it always had just enough to keep its researchers going, and hungry for more. That scarcity forced a level of entrepreneurialism and drive that could easily have been squelched by more sumptuous budgets.

Giving the right people just enough to watch their ambitions form but never enough to see them satiated can produce magic at just the right time. Stanford’s central role in the history of the internet and computing was neither serendipitous nor foreordained. Rather, it represented a culmination of thoughtful leadership, flexible thinking, and a drive to bring a new field into existence and establish its legitimacy even among the most supercilious of traditional academics. Sixty years on, the department’s culture continues to reflect those early origins, as does Silicon Valley itself.

I’ve just published the full text of this thesis to the website. You can also read it as a full PDF, and there is also a shorter conference paper available.