This is one of several parts of my undergraduate thesis at Stanford entitled “Academic Revolution and Regional Innovation: The Case of Computer Science at Stanford 1957-1970”. It was submitted on May 17, 2011, and the text here remains unchanged and unedited since then.
On April 16th, 2010, Stanford University hosted Angela Merkel, Chancellor of Germany and leader of the fourth largest economy in the world. While her only public speech focused on Afghanistan and the global financial crisis, the primary goal of her visit was to observe the newly built Volkswagen Automotive Innovation Lab.1 The lab is designed for interdisciplinary teams of Stanford faculty members and their global industrial partners to conduct joint research projects. The center is the embodiment of the kind of university-industry partnerships desired by Germany and countries across the world.
Germany’s leader was not the only head of state to visit the region that spring. Just a few weeks later, Stanford would welcome Russian President Dmitri Medvedev, who was touring Silicon Valley — perhaps the best example of the possibilities of academic-government-industrial networks. The president’s goal was to duplicate the success of the region in the city of Skolkovo outside Moscow.2 During his visit, “Medvedev said he wanted to create an atmosphere that mirrors the relationship between Stanford and Silicon Valley, and acknowledged a brain drain that’s costing his country bright young scientists and business leaders.”3
Medvedev’s mission is hardly unique. Countries across the world are developing plans and investing substantial resources in the pursuit of creating the next Silicon Valley — a regional innovation hub with a strong network of research universities, entrepreneurial companies and professional service firms. South Korea has embarked on a plan to enhance its human capital, forming the Ministry of Knowledge Economy in 2008 and developing plans for a massive new international campus outside Seoul.4 The well-endowed King Abdullah University of Science and Technology, established in Saudi Arabia in 2009, is spearheading the creation of a new center of innovative science and technology in the Middle East.5
Understanding the Role of Universities in Regional Innovation
This incredible worldwide interest in regional innovation hubs has led to significant interest in their historical development. As will be seen later in this chapter, scholars have developed several methodologies to analyze their origins, and this study connects with three of these approaches. Historical institutional approaches take as their subject an organization within a region’s research system and analyze the economic, social, political and cultural factors that shape it and how it shapes other entities in the system. Two other approaches look at regions as a whole from the bottom-up and top-down. Historical cultural approaches develop theories of innovation from the bottom-up by focusing on groups of people with sociologically similar characteristics — for instance, members of the countercultural movement in San Francisco. Finally, network analysis approaches use “relationships” such as patents or publications to investigate the development of patterns of innovation from the top-down.
All of these methodologies treat universities as a core element in the rise of regional innovation hubs, and particularly in the development of computing and Silicon Valley. Despite the substantial research conducted on the latter two areas, however, there has been comparatively little work on the institutional factors that assisted and hindered the development of academic computer science programs. Such an historical question may not seem pertinent at a time when universities place innovation on par with teaching and research as institutional priorities, and governments are increasingly demanding that universities assist with economic growth.6
As this study will show, universities are divergent in their abilities to engage with regional innovation, and analyzing these factors can better explain the rise of Silicon Valley as well as the wide variance of success of different regional innovation hubs and their constituent universities. Furthermore, such research provides a new perspective on models of science and technology development, which form some of the core theories in research policymaking.
This study analyzes the rise of the Computer Science department at Stanford University, starting from 1957 with the hiring of mathematician George Forsythe to around 1970. It takes as its primary lens an historical institutional approach, focusing on the development of an academic department and its related discipline within the milieu of a research university. However, this study also uses historical cultural and network analysis lenses to analyze specific institutional factors that favored Stanford’s engagement with the development of computing in Silicon Valley. This research is based on in-depth archival work with eight different collections, and it provides the first extensive history of the department.
This study has several potential audiences. It is most immediately directed toward scholars of regional innovation hubs, a diverse group that includes economists, political scientists, historians and anthropologists. In addition, this study provides the first archival-based analysis of the rise of computer science at Stanford, a top department that has shaped the field since its inception. This perspective will be of interest to the growing community of scholars investigating the history of computation and information technology. This study also develops new perspectives on how university administrators manage the development of new disciplines, which will be of interest to scholars of higher education. Finally, this study provides an empirical application of some of the core theories of science, technology and society, particularly those theories related to the social construction of science and technology.
This chapter begins with a brief history of theoretical models of science and technology. Next, it will develop a fuller understanding of the three approaches to understanding regional innovation outlined above, with particular attention on the historical institutional approach. An important component to this study is the rise of computer science as a discipline within the academy, and a discussion of recent work analyzing this history will follow. Finally, this chapter will end with a brief history of Stanford and the growing literature of research analyzing its research model in the Cold War context.
Theoretical Models of Science and Technology
Theoretical models of science and technology related to computation and regional innovation hubs can be placed into three overlapping but intellectually coherent groups. Each group emphasizes different directions of influence between and among the three major components — science, technology, and society. The first group tends to treat technology and resulting social change as a product of science. The second group of models emphasizes the opposite kind of influence, the ways that social factors influence both scientific knowledge and technical innovation. The third and final group is contextual, emphasizing the full web of bidirectional influence among science, technology, and society.
Technology as “Applied Science”
Vannevar Bush’s linear model of science and technology research remains the most influential theory for understanding the creation of new knowledge. Bush had a long and storied career, with important connections to computing. During World War II, he led the Office of Scientific Research and Development, where he pioneered the basic research funding system that remains largely intact today.7
In mid-1945, Bush wrote a policy paper on the state of scientific research and development.8 The essay makes a vigorous defense for the funding of scientific research by the U.S. federal government, both to fight disease and to protect the nation’s security. However, it is the development of Bush’s conceptual understanding of science and technology that is perhaps the most important contribution of his essay.
Bush divides research into two categories. Basic research is “performed without thought of practical ends.” The other category is applied research, and it encompasses all other research that provides “complete answers” to important practical problems. Bush believed that basic research was crucial for a nation, since “it creates the fund from which the practical applications of knowledge must be drawn.” Thus, he argues that the possible range of applied science is dependent upon the fundamental knowledge available. In other words, technology emerges as a product of “applied science,” and then contributes to social change in a linear succession.9
The linear model provides a clean heuristic for understanding the development of science and technology, and held particular sway in the 1950s and 1960s. However, scholars have complicated the picture of the development of science and technology since Bush’s original publication. Analysis of the linear model led to the development of theories of hard technological determinism. In this theory, technology is an autonomous agent and develops independently of social and political forces. Instead, technology itself exclusively creates the social structures and patterns of organization for human behavior; humans adapt to the changing technology around them without influencing it.
This exclusive role of technology was not entirely accepted by scholars. A more general investigation of the forces that directed the development of basic research and innovation was taken up by economists, led by Jacob Schmookler, starting in the mid-1960s.10 He developed two possible notions of the directionality within the linear model, which are today referred to as “technology push” and “market pull.” In the former, developments in technology create a “supply” of possible solutions that are developed before determining needs within the marketplace, a concept essentially similar to the theories of technological determinism. The latter idea takes a demand-side view, arguing that market needs provide signals to researchers and inventors, who develop their basic research programs accordingly. Schmookler takes a decidedly market-pull approach in his work, arguing that essentially all basic science is merely a response to market forces.11
Other economic historians criticized such a strong demand-side view, most notably Nathan Rosenberg. He critiques the market-pull theory by exploring the differential development of inventions in industry, arguing that it is not just demand forces but also the stock of available knowledge that affects the rate of invention. Rosenberg briefly writes about Charles Babbage and the development of the first computer, which was not a commercial success. Rather than explaining the failure as a consequence of low demand, Rosenberg argues that the “failure to complete this ingenious scheme was due to the inability of the technology of his day to deliver the components which were essential to the machine’s success,” and thus, “society’s technical competence at any point in time constitutes a basic determinant of the kinds of inventions which can be successfully undertaken.” Therefore, a social stock of technical knowledge gives rise to new technological innovations, in accordance with the basic linear paradigm.12
Social Shaping of Science and Technology
As scholars further probed the interactions between science, technology, and society, there was a growing accumulation of examples that did not fit the linear models. Scholars began to argue that science and technology were shaped and even constructed by social forces, reversing the directionality proposed by Bush and the adherents of the linear paradigm. These models remain hotly debated today, particularly over the issue of scientific relativism.
Thomas Kuhn developed the first major study of society’s interaction with science. He demythologizes the notion of the objective search for truth in science, arguing that scientists are engaged in “normal science” for the majority of their time. Occasionally, there is an accumulation of examples that do not fit the reigning paradigm in a field, and there is consequently a battle between the keepers of the old model and the vanguard of the new one. The politics over these paradigm shifts thus form society’s influence over the evolving body of scientific facts.13
Society’s interaction with technology has created a rich body of research, of which the theory of the social construction of technology remains very influential. The theory deconstructs the notion that technologies are designed exclusively through technical decisions, arguing instead that societal factors often play a paramount role. This methodology was first developed by Wiebe E. Bijker and Trevor J. Pinch, who looked at the societal factors that shaped the development of the bicycle.14 Another prominent example is the development of missile targeting systems analyzed by Donald MacKenzie. He shows how different groups within the defense community had varying levels of influence over the construction of the ballistic missile targeting program. Thus, the final product was less about the fitness of different technical solutions and more about the changing web of politics surrounding the project.15
The increased understanding of different factors affecting innovation has led to the development of more sophisticated models of research that take account of all the directions between the three major components. These “contextual” models place science, technology, and society in a network of mutual influence, thus establishing the importance of all six possible directions. As opposed to hard technological determinism, the theory of soft technological determinism argues that technology is a primary but not exclusive agent of social change, and it generally fits within this contextual framework.
More usefully for this study, Etzkowitz and Leydesdorff have developed the theory of the “triple helix” to describe the relations between universities, industry and the government. While the three types of institutions are generally described as being part of a triangle, the triple helix model takes as a basis the differential approaches of the three groups and adds elements of co-evolution (generating the ever-evolving helix). Thus, developments in one of the three institutions change the trajectory of all three, and it is the constant adaptation of the system to these new developments that explains regional and national innovation systems.16
Etzkowitz has further developed these notions in analyzing the development of “entrepreneurial science” at MIT and Stanford. He argues that universities are increasingly adding the capitalization of knowledge to their missions, complementing research and teaching and representing the next stage in the development of these institutions. He traces the development of this model to MIT, which pioneered the industry-facing university and the concept of venture capital that is an important component for innovative regions. MIT’s model was later transferred to Stanford through Frederick Terman, who had received his PhD from MIT and later became Stanford’s provost.17
Approaches to Regional Innovation and Computing
Before exploring the major research approaches to computing and regional innovation, some historical context is necessary. Computing has fundamental connections with mathematics, but the notion of a computer as a calculating machine is generally attributed to Charles Babbage. He was a mathematician at Cambridge who developed the idea of a “difference engine” in 1821 that could build mathematical tables with fewer errors than humans. In the same era, the mathematician George Boole developed a logical calculus for binary values that today forms the basis of digital logic in computers.18
Mathematicians continued to hold a crucial role in the development of computing in the years before World War II. Among the most important figures in computing is Alan Turing, a mathematician at the University of Cambridge who developed the notion of a Turing machine, a hypothetical computational device. Turing proved that these machines could represent all calculations possible on a computer, and thus, they provided a theoretical limit on the power of computation. Independently of Turing, Alonzo Church, a mathematician at Princeton University, developed a similar limit through the development of lambda calculus. The combined Church-Turing Thesis provides the means of converting between these different notions of computing, and it continues to represent the core of computability theory.19
These theoretical developments took place just as the growth of the region today known as Silicon Valley was beginning. During the early years of the twentieth century, the peninsula south of San Francisco was perhaps more notable for its lack of industry than for scientific innovation. The development of university-industry relations at Stanford in the 1930s, however, began a process of industrialization, particularly in radio. The computing industry was generally concentrated in New England, and its effects on the area known as Silicon Valley would not become significant until the 1950s and 1960s. Since then, the region has been one of the preeminent innovation hubs in the world.
Historical institutional approaches take as their subject an organization within a region’s research system and analyze its web of influences. Such approaches provide an important perspective by allowing a high degree of synthesis and integration. However, the method can provide a fragmented picture of the relations between science, technology, and society since organizations are often products of local forces, and translating findings to other institutions can be difficult. This section looks at several major studies in this area, saving those analyzing Stanford as an institution for a later section.
Developing a history of Silicon Valley has proven difficult due to its diversity, but Christophe Lécuyer has written a technically sophisticated and nuanced account of the changing composition of companies and industries that underpinned the economy of Silicon Valley. He follows the development of each new industry by analyzing prominent companies, including Eitel-McCullough, Varian Associates and Shockley Semiconductor. He finds that the firms benefited from a close collaboration with Stanford, a culture well-adapted to the needs of innovative enterprises, strong connections between manufacturing and research programs, and defense procurement policies that benefited the Valley’s firms over their competitors in the East.20
Margaret O’Mara analyzes federal policies to compare the different trajectories of Silicon Valley, Philadelphia and Atlanta as regional innovation hubs. One of her major areas of focus is the use of dispersal policies by the Pentagon in response to the atomic threat from the Soviet Union. O’Mara argues that the military, through its funding policies, encouraged the deconcentration of urban centers by supporting the development of more diffuse industrial regions that could withstand nuclear attack. The theory is novel, and its emphasis on the importance of geography in political economic studies of regional innovation hubs is worthy of further study. However, the theory suffers from a level of reductionism that was handled far more deftly in Lécuyer’s analysis.21
The effects of U.S. government policies on the development of computing and Silicon Valley are of obvious interest, and several scholars have analyzed government agencies and the politics surrounding their policies. William Aspray and Bernard O. Williams studied the National Science Foundation and its programs to support the development of scientific computing. In the three decades following the war, the foundation sponsored grants for universities to buy computers, totaling millions of dollars. By the end of the 1960s, though, the foundation increasingly desired to focus on the development of a theoretical discipline of computer science, and eventually ended its computational facilities program in 1970.22
The foundation’s desire to support pure science is heavily analyzed by Daniel Lee Kleinman, who explores the politics surrounding the agency’s establishment. One model was developed by Vannevar Bush, who believed that the agency should focus exclusively on basic science and create an elite, meritocratic system of funded research. The other approach was most vigorously argued by Harley Kilgore, a Democratic senator from West Virginia. He desired a system with more applied science and a greater geographical distribution of research funds. Kleinman demonstrates convincingly that Bush and top industrialists at the time worked together to secure their vision for the organization, and thus social and political factors held a tremendous role in the development of the new agency, and by extension, the nature of science in the postwar period.23
Despite the impact of the National Science Foundation, it was the Department of Defense that likely had the largest impact on the growing use of the computer. Arthur L. Norberg and Judy E. O’Neill have shown that the Defense Department’s Advanced Research Projects Agency and its Information Processing Techniques Office (IPTO) played crucial roles in the transformation of computing. Under leadership with a strong vision of the potential of computing, IPTO transformed the development of time-sharing and graphics, which led to a fundamental change in industry’s approach to the development of computing systems. They explore the relations between the Pentagon’s needs and those of academia, and how specific funding and research policies shaped the course of computing in academia.24
One final strand of historical institutional research that is relevant to this study is the theme of big science. The history of science has classically been one of the independent scientist developing and testing theories individually, with perhaps a few assistants. Starting in the years prior to World War II though, there was a growing trend toward large research labs with dozens if not hundreds of personnel. Peter Galison and Bruce Hevly have edited a volume that explores the implications of these changes as well as the policies that led to this concentration. The diverse essays provide multiple perspectives on the rise of big science as an institutional characteristic.25
Historical cultural analysis takes as its subject a group of people with sociologically similar characteristics and explores how a particular cultural background affects the direction of a region or institution. Within the literature on Silicon Valley, AnnaLee Saxenian conducted one of the first and most celebrated comprehensive ethnographic studies, comparing the region and its dynamics to those of Route 128, the high-tech corridor near MIT. She argues that differences in firm formation and structure are instrumental in the varying levels of success of the two regions. Firms in Silicon Valley are smaller and less vertically integrated compared to their Eastern competition, and there is more camaraderie among engineers, which intensifies competition and increases the velocity of information across firms.26
However, Saxenian’s work suffers from several methodological problems that limit its utility in understanding the region. The emphasis on interviews with business executives and engineers provides an interesting perspective on corporate culture, and few will contest that the culture in the Bay Area is different from that in Massachusetts. However, Saxenian’s evidence is insufficient to place firm culture and structure as the major basis for regional development. The primary problem is one of causality: did the culture change the industries or did the industries create the culture? Saxenian reduces the relationship to an almost linear level, but further evidence suggests the two co-evolved, complicating the history far more than she addresses.
Lécuyer takes a more nuanced cultural approach, integrating biographical details of the business executives and engineers of the region’s most notable companies with the corporate structures that formed within them. He finds that the engineers of the first major companies shared similar stories: a middle class upbringing, a desire to participate in the rise of radio, and a social ethic that emphasized community and libertarianism. He argues that this shared background is an important factor in the development of these firms’ cultures, but not a sufficient one.27
Outside of regional development, significant work has been done on analyzing the connection between the counterculture movement and the rise of computation in Silicon Valley. Fred Turner has written a definitive account of this group, focusing on the story of Stewart Brand and a group of people he calls the “new communalists.” He argues persuasively that much of the culture of computation, such as decentralization, libertarianism, and optimism toward technology, is merely a manifestation of the culture of people like Brand. Furthermore, this culture helped to facilitate the creation of the networks that today underpin the organizational structures found in so many computer firms.28
Network analysis is a newer approach to studying the rise of regional innovation hubs and focuses on a defined unit of relationship between entities in the system, which is then systematically tracked over a period of time. Given the right dataset, it can offer persuasive evidence of how and when networks develop within regional innovation systems. However, the approach often suffers from treating relationships as equivalent (for instance, counting each patent as one unit regardless of its actual quality or economic worth).
This approach developed from the work of Walter W. Powell, who found that the classic market-hierarchy spectrum of economic organization does not fully fit firms where tacit knowledge and experience form important sources of capital. Powell argues that network forms of organization can transfer knowledge into action more quickly and allow for sustained cooperation between firms. He applies this approach to a host of different industries, finding that high tech start-ups in areas like Silicon Valley are very similar in their networked organization to craft firms in Italy.29
Jeanette A. Colyvas and Powell applied network analysis to the case of academic entrepreneurship in the life sciences at Stanford. They find that entrepreneurship at the university grew incrementally over the course of three decades. In the early years, only senior faculty with tenure were willing to engage in industrial activities, their reputations having already been secured. As others in the life sciences witnessed their success, they too began to engage in academic entrepreneurship. The two scholars find that network effects were particularly important factors in an individual’s likelihood of engaging with industry. Those with more publications were significantly more likely to secure patents, and students and younger faculty were more likely to be entrepreneurial if they were working with senior faculty.30
Saxenian used network analysis as part of her comparative ethnographic study, but scholars have also taken a more quantitative approach to network analysis. Lee Fleming has applied a network approach in studying the development of regional economies by looking at data of patent authorship and citations. Olav Sorenson and Fleming analyzed the value of academic scientific research in regional networks by looking at different groups of patents. Analyzing the data, they find that patents which cite any publication — whether a journal article or a press release — increase their future citation counts. Thus, the increased value of a patent citing the academic literature can be mostly attributed to increased communication rather than increased quality.31
Historical Development of Computer Science
A core part of this study analyzes the academic politics of the Stanford faculty and their approaches to the developing discipline of computer science. Throughout its early years, computer science was a hybrid construction. One side provided computing resources for other disciplines in the university, while another side directed the theoretical developments of computer science as a field. Scholars in recent years have increasingly focused on the development of the discipline of computer science, although coverage of it remains limited.
Atsushi Akera has developed a theoretically rigorous synthesis of the rise of academic computer science as part of his study on the pluralism of computation in the Cold War era. He develops the notion of an ecology of knowledge pioneered by Charles Rosenberg to show how the tension between military applications, commercial goals and academic desires shaped the direction of computing. On university campuses, this tension was manifest between the academic staff of the discipline and the service staff of university computational facilities. Akera shows the struggle between these two at MIT and the University of Michigan over the development and deployment of time-sharing computers, a debate that eventually led to their “disintegration.”32
The importance of the military is not absolute, and Paul Ceruzzi has explored the connections of computing to science and engineering businesses. He looks at the evolution of different components of a computer system, including at the hardware level with core memory and at the software level with operating systems. He investigates the role that computers have taken in information processing, attempting to define what a computer is and how it has changed over the post-war period, providing a wide-ranging perspective on the rise of computer science.33
Ultimately though, the development of academic computer science was led and constructed by faculty at major research universities. Much of the development of computing in the 1960s can be traced to research at MIT centered around Project Whirlwind, which developed one of the first computers with real-time displays. Kent C. Redmond and Thomas M. Smith studied the tensions between the project’s administration at MIT, notably Jay Forrester, and the Office of Naval Research. Forrester pushed the project hard, ignoring budget projections and focusing exclusively on expanding scientists’ knowledge of computers. While the work relies perhaps too heavily on oral histories with the project leadership, it provides an insightful account of the different goals of universities and the military.34
Recently, Nathan Ensmenger has looked into the development of the discipline of computer science, analyzing the qualities of people who entered the field. He writes that the need for academic legitimacy was a crucial element in the direction of computer science departments, and this concern caused departments to focus on theoretical concepts (especially the algorithm) as a means of building a defined field of inquiry with open problems and clear research directions. He argues that this increasing theoretical basis assisted academic departments, but led to a widening gap between the science and the applications of computer science.35
Ensmenger’s study provides interesting evidence on the people who worked on computers during the rise of computer science, but it lacks a more encompassing geographical approach. The focus is primarily on the East Coast schools — most heavily MIT — and this limited range hinders the work’s broader utility. In fact, the story at Stanford was quite different, as this study will show — the department simultaneously increased its vigor in the theoretical fields while building important and quickly-growing connections to industry.
History of Stanford
Stanford University was founded by Jane and Leland Stanford as a memorial for their son, who died of typhoid fever, and opened its doors in 1891. Leland was a railroad magnate and a former governor of California, and the couple donated 10,000 acres of land on the peninsula south of San Francisco as a permanent home for the university. Stanford’s first president, David Starr Jordan, served for more than twenty years and oversaw the university’s rebound from the devastation of the 1906 San Francisco earthquake. The disaster ushered in an austere period, and the university’s struggle for financial security would continue for several decades.
After the stock market crash of 1929 and the depression that followed, Stanford’s income fell precipitously, putting the school on even more precarious financial ground than before. Ongoing financial difficulties had hampered its ability to recruit faculty, and by the 1930s college rankings did not place Stanford among the top ten schools nationwide. President Ray Lyman Wilbur accepted that the university faced deep problems in its basic operations. To counter the falling incomes of universities, the federal government created programs to increase research funding. These programs, however, were widely rejected by universities, including Stanford, for fear of government intrusion into private institutions.36
This environment created a proving ground for Frederick E. Terman, a professor of electrical engineering who by the 1930s chaired the department. Terman was the son of Lewis Terman, the Stanford psychologist who developed the Stanford-Binet IQ test, and he had studied under Vannevar Bush at MIT in the 1920s. Observing the school’s financial situation, Terman grew deeply concerned about the university’s direction. As chair of Electrical Engineering, he spearheaded the creation of industrial partnerships, seeing an opportunity to secure additional funding. Thus the department began a long and vital relationship with local industry. It was also around this time that Terman helped to train William Hewlett and David Packard, perhaps the most notable example of the kind of university-industry partnership desired today.37
Bush’s impact on science policy is certainly important, but it is his interest in computing that makes him particularly relevant to the story of computer science. By 1931 he had developed a “differential analyzer,” a mechanical computer that could produce numerical solutions to differential equations. The device captured the public imagination, and leading scientists were optimistic about the future of mechanical calculators. Bush’s interests in computing were vast, and his vision of the “memex,” a machine for storing and retrieving information, created metaphors that still apply to the internet today.38
Terman remained close to Bush throughout his career, and in the postwar years he used expanded federal funding to subsidize the growth of new and powerful departments at Stanford. The funding derived largely from increased defense spending following the Korean War, and much of it was applied in nature. Terman’s approach, which he called building “steeples of excellence,” had several components. First, the university should emphasize areas of research that enjoyed strong federal funding and wide contacts in industry. Second, he often encouraged hiring several faculty members in the same academic area, with the goal of building a national center in a sub-field critical to the discipline’s future development.39
It is in this period of growth that scholars often analyze the institutional development of Stanford and Silicon Valley. In his work on the subject, Stuart W. Leslie explores the changing nature of research at Stanford and the relationship between the university and military research grants. He is relatively pessimistic about these developments, lamenting the transformation of universities from independent basic-science organizations into directed applied agencies of the government. Through its large grant programs, the military was able to direct research activities, and largely “defined what scientists and engineers studied, what they designed and built, where they went to work, and what they did when they got there.”40 His criticism is valid, although perhaps overstated, particularly in the context of computer science, where defense funding led to significant civilian applications.
Lowen’s work is the closest to this study in approach and sensibility. As she emphasizes, the military played a crucial role in the development of engineering departments at Stanford, including Electrical Engineering. While her work does not extend to the growth of computer science, many of the same issues arose there, including the balance between basic and applied research. Terman’s approach to computer science was in keeping with these developments, and much of the core debate over whether computer science was a service to the university or an academic discipline derives from these problems.
Stanford Computer Science: Study Outline and Source Notes
The first notion of computer science at Stanford developed in the Mathematics department under the direction of George E. Forsythe and John G. Herriot, both numerical analysts. Forsythe was a mathematician with a deep interest in solving problems numerically. That interest began during the war, when he served as a meteorologist, but it truly developed at UCLA, where he worked at the National Bureau of Standards’ Institute for Numerical Analysis. With access to a computer in that position, he began a life-long mission to use the power of computing to solve important mathematical problems.
After joining the department in 1957, Forsythe worked with Herriot to quickly develop computer science into a discipline of its own. Computer science was soon given its own division within the department, affording Forsythe a measure of independence for the burgeoning area of study. These early years were difficult for the division. Faculty billets were shared with the Mathematics department, ensuring constant friction over staffing. Furthermore, as computer science developed as a discipline, its research program moved away from the work of other mathematicians, generating important discussions about the utility and legitimacy of the new field. Eventually these disagreements would lead Forsythe and the Computer Science division to leave the Mathematics department and form an independent division at the end of 1963.
Despite these issues of academic legitimacy, the new division grew rapidly, adding staff as Forsythe built alliances with other departments on campus through joint appointments. The expansion of the division’s graduate program ensured strong incoming classes of doctoral students, among the best in the field. The division’s growth led the university administration to grant it full department status on January 1, 1965. However, financial pressures continued to grow as high inflation and university budget cuts constrained the department’s expansion. Through a range of programs with industry, Forsythe developed new sources of income that allowed for faculty growth and greater prominence. He would lead the department until his death from cancer in 1972, at age 55.
This study is divided into three thematically grouped chapters that together complicate and enhance our understanding of the pathways universities take in engaging with regional innovation hubs. Chapter two considers the politics of computer science, and particularly artificial intelligence, within the academy. It focuses on the debates surrounding the tenure cases of artificial intelligence researchers John McCarthy and Marvin Minsky as well as William F. Miller, a physicist with interests in computing. This chapter complicates our understanding of the history of Stanford’s links to innovation, showing that constituencies within the university varied widely in their desire to engage with a developing academic discipline and industry. The archival material in this area is particularly rich, and allows for a close dialogue between the actors.
Despite these protests against the development of the Computer Science division, it was granted department status and became one of the top programs in the nation. Chapter three analyzes the university environment to determine what institutional factors allowed the department to overcome this resistance. It argues that cultural factors within the faculty and administration played an instrumental role in the university’s continued support for growth in the field. The financial insecurity the department experienced throughout the decade forced a bureaucratic creativity and efficiency very much in the style of a twenty-first-century start-up company. The administration itself did little to constrain Forsythe, even when he spent more than budgeted. Its active and passive facilitation ensured that the department faced few bureaucratic hurdles to its development.
These cultural factors fostered further growth in the department, but they do not completely explain its desire to engage with industry. Chapter four explores the mutual relationship between the Computer Science department and industry. Major computing corporations, particularly IBM, helped subsidize the costs of developing the department in exchange for early access to research and to new tools for their computer systems. The need for further funds encouraged the development of new venues for engaging industry, and the chapter concludes with an exploration of the Honors Co-Op program and the Computer Forum, which formed the first networks between the department and industry.
Archival materials for this study come from the Stanford University Archives and consist primarily of documents from the George Forsythe collection, as well as the William F. Miller, Frederick E. Terman, Richard Lyman, J.E. Wallace Sterling, Edward A. Feigenbaum, Joshua Lederberg, and School of Humanities and Sciences collections. No published study of the history of computer science has used these archival materials. Footnotes index these documents beginning with the collection number and ending with the folder number.
- Tyler Brown, “Merkel addresses Afghanistan, climate change,” The Stanford Daily, 16 Apr. 2010.↩︎
- Andrew Clark, “Dmitry Medvedev picks Silicon Valley’s brains,” The Guardian, 23 Jun. 2010.↩︎
- Adam Gorlick, “ ‘I wanted to see with my own eyes the origin of success,’ Russian president tells Stanford audience,” Stanford News Report, 23 Jun. 2010.↩︎
- According to the promotional pamphlet for the complex, the goal is to become an Asian education and research hub through the creation of a new campus for the prestigious Yonsei University and a nearby R&D park that will connect the university to industry.↩︎
- Its initial endowment of $10 billion is larger than that of MIT. Charles Q. Choi, “Arabian Brainpower,” Scientific American, 17 Jan. 2008.↩︎
- The change in attitude, particularly since the passage of the Bayh-Dole Act in 1980, has led to significant dissent within the academic community. For one critical albeit uneven account, see Daniel S. Greenberg, Science for Sale: The Perils, Rewards and Delusions of Campus Capitalism, University of Chicago Press, 2007.↩︎
- G. Pascal Zachary, Endless Frontier: Vannevar Bush, Engineer of the American Century, The Free Press, 1997.↩︎
- Vannevar Bush, “Science: The Endless Frontier,” United States Government Printing Office, 1945, http://www.nsf.gov/od/lpa/nsf50/vbush1945.htm#summary↩︎
- Schumpeter developed the first studies of technology and economic growth, positing innovation as the crucial element of the capitalist system. While his work mostly predates the post-war developments outlined here, it provides an important intellectual basis for studies of innovation and economic growth.↩︎
- Jacob Schmookler, Invention and Economic Growth, Harvard University Press, 1966.↩︎
- Nathan Rosenberg, “Science, Invention and Economic Growth,” The Economic Journal, Vol. 84, No. 333 (Mar. 1974), pg. 105.↩︎
- Thomas Kuhn, The Structure of Scientific Revolutions, University of Chicago Press, 1962.↩︎
- Pinch and Bijker, “The Social Construction of Facts and Artefacts: or How the Sociology of Science and the Sociology of Technology might Benefit Each Other,” Social Studies of Science, Vol. 14, No. 3 (Aug. 1984), pg. 399-441.↩︎
- Donald MacKenzie, Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance, MIT Press, 1990.↩︎
- Universities and the Global Knowledge Economy: A Triple Helix of University-Industry-Government Relations, ed. Henry Etzkowitz and Loet Leydesdorff, Pinter, 1997.↩︎
- Henry Etzkowitz, MIT and the Rise of Entrepreneurial Science, Routledge, 2002.↩︎
- There are numerous references available on the early era of computing; the Charles Babbage Institute at the University of Minnesota provides immense bibliographic resources. The overview in this section draws on Gerard O’Regan, A Brief History of Computing, Springer, 2008.↩︎
- Christophe Lécuyer, Making Silicon Valley, MIT Press, 2005.↩︎
- Margaret P. O’Mara, Cities of Knowledge: Cold War Science and the Search for the next Silicon Valley, Princeton University Press, 2005.↩︎
- William Aspray and Bernard O. Williams. “Arming American scientists: NSF and the provision of scientific computing facilities for universities, 1950-1973.” Annals of the History of Computing, Vol. 16, No. 4 (Winter 1994), pg. 60-74.↩︎
- Kleinman, “Layers of Interests, Layers of Influence: Business and the Genesis of the National Science Foundation.” Science, Technology, and Human Values, Vol. 19, No. 3 (Summer 1994), pg. 259-282 and Kleinman, Politics on the Endless Frontier, Duke University Press, 1995.↩︎
- Norberg and O’Neill, Transforming Computer Technology: Information Processing for the Pentagon, 1962-1986, Johns Hopkins University Press, 1996.↩︎
- Big Science: The Growth of Large-Scale Research, ed. Galison and Hevly, Stanford University Press, 1992.↩︎
- AnnaLee Saxenian, Regional Advantage, Harvard University Press, 1994.↩︎
- Christophe Lécuyer, Making Silicon Valley, MIT Press, 2005.↩︎
- Fred Turner. From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism, University of Chicago Press, 2006.↩︎
- Walter W. Powell, “Neither Market Nor Hierarchy: Network Forms of Organization.” Research in Organizational Sociology, Vol. 12 (1990), pg. 295-336.↩︎
- Colyvas and Powell, “From Vulnerable to Venerated: The Institutionalization of Academic Entrepreneurship in the Life Sciences.” Research in the Sociology of Organizations, Vol. 25 (2007), pg. 219-259.↩︎
- Sorenson and Fleming, “Science and the Diffusion of Knowledge.” Research Policy, Vol. 33, No. 10 (Dec. 2004), pg. 1615-1634.↩︎
- Atsushi Akera, Calculating a Natural World, MIT Press, 2008.↩︎
- Paul E. Ceruzzi, A History of Modern Computing, 2nd ed., MIT Press, 2003.↩︎
- Redmond and Smith, Project Whirlwind: The History of a Pioneer Computer, Digital Press, 1980.↩︎
- Nathan Ensmenger, The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise, MIT Press, 2010.↩︎
- One of Stanford’s trustees at the time was Herbert Hoover, who opposed much of the expansion of the federal government during this period. Rebecca S. Lowen, Creating the Cold War University, University of California Press, 1997.↩︎
- C. Stewart Gillmor, Fred Terman at Stanford, Stanford University Press, 2004.↩︎
- G. Pascal Zachary, Endless Frontier: Vannevar Bush, Engineer of the American Century, The Free Press, 1997.↩︎
- Ibid. and Rebecca S. Lowen, Creating the Cold War University, University of California Press, 1997.↩︎
- Stuart W. Leslie, The Cold War and American Science, Columbia University Press, 1993, pg. 9.↩︎