Emerging Technology

How to Make the NSTC a Moonshot Success

The new National Semiconductor Technology Center can reshape America's semiconductor industry
April 22, 2024

Executive summary

As the semiconductor industry has specialized in recent years, its ability to organize and cooperate has atrophied. But it faces an increasing number of challenges that will require deep, long-term cooperation. The United States has a chance to ensure an increasingly siloed industry can unite in pursuit of such moonshots on American soil. That chance lies in the National Semiconductor Technology Center (NSTC), funded up to $11 billion by the CHIPS and Science Act of 2022. 

The NSTC should not be relegated to the role of a low-risk R&D subsidy, aligned with existing industry agendas. Rather, the consortium should independently tackle hard-to-solve, long-term challenges that no existing entities have the incentive to pursue. In this paper, we highlight market failures that challenge research and investment in the semiconductor industry, and propose an organizational infrastructure that would allow the NSTC to plug these holes as a unique market actor.

The chip industry was born in America, and a sufficiently ambitious NSTC can ensure the next generation of computing technology is continued here. If the NSTC’s ambitious research and investment agendas complement those of private industry, the once-in-a-generation consortium can make a strong argument to Congress for continued funding, becoming an institution with staying power. 

Don’t sell the CHIPS moonshot short

A year and a half after passage of the CHIPS and Science Act of 2022, a nonprofit, Natcast, has finally been established to govern the CHIPS Act’s underappreciated gem: the National Semiconductor Technology Center (NSTC). The CHIPS Act appropriated over $52 billion to revitalize the American semiconductor industry, and its $39 billion reserved for chip manufacturing subsidies received the most attention. More important to America’s long-term technological leadership, however, is another $11 billion set aside to fund a series of research initiatives.

These research initiatives, especially the NSTC, have the potential to turbocharge American innovation. The NSTC consortium is the first American attempt since 1987 to create a public-private partnership for semiconductor technology. Done right, it could lead to paradigm shifts in areas such as computing for machine learning, advanced computing security, and in the environmental sustainability of chipmaking itself. 

However, the world is full of research consortia that only incrementally advance technology frontiers. If the NSTC is insufficiently ambitious, it will likely not improve on existing efforts in industry, academia, and foreign consortia. Moreover, it may not produce sufficient value to survive beyond its five years of CHIPS Act funding.

The CHIPS manufacturing grants serve short-term goals: restoring America’s share of global semiconductor manufacturing and securing critical supply chains. The NSTC ought to target a long-term goal: ensuring America remains the birthplace of paradigm-shifting technologies. Commerce Department officials should not relegate the NSTC to a low-risk R&D subsidy aligned with existing industry agendas. Rather, the NSTC’s research and investment agendas should pursue paradigm-shifting moonshots.

While Congress laid out clear goals for the manufacturing grants in the CHIPS and Science Act, it left its instructions for the NSTC vague. This ambiguity gives the Commerce Department a unique opportunity to dream ambitiously. Based on dozens of interviews with chip startup founders and venture capitalists,1 this paper articulates a vision for a moonshot-focused NSTC, and proposes the organizational steps for getting there:

Correcting market failures in research and manufacturing

Major players across the chip industry face incentives that create particular shortfalls in their research agendas. Startups struggle to plan for real-world manufacturing environments when developing their early products. Incumbents and startups alike underexplore new materials and chemicals, because established supply chains are difficult to disrupt. Siloed firms lack forums for cooperation on research missions that target the computation needs spurred by downstream technological trends, and that no individual semiconductor company is incentivized to pursue. In each of these cases, existing market forces lead to a less dynamic chip industry. In some of these cases, the problem isn’t a market failure per se; rather, the market externalizes risk to things we care about, like supply chain durability and national security. This externalized risk leads actors to invest less than they would in a more robust ecosystem. We propose an NSTC research agenda that addresses these market failures and externalities.

The chip industry’s investment agenda leaves similar gaps that further hamstring these players. Startups benefit from discounted electronic design automation (EDA) tools in their early stages, but often cannot survive the leap to commercialization after these tools expire. Startups operating outside the latest investment trends struggle to attract financing. Lastly, large industry funds, generalist venture capitalists, and institutional investors lack the combination of technical knowledge and financial muscle to organize investing syndicates for promising upstarts.

A research and investment agenda laser-focused on addressing market failures could allow this new American research consortium to orchestrate generational steps forward in computing.

Independent, centralized infrastructure producing shared IP

After explaining our vision for the NSTC’s leadership role in research and investment, we proceed to the how. Our recommendations for the consortium’s IP sharing model, geographic organization, and financial plan will allow the NSTC to solve key market failures.

Natcast leaders should design an NSTC that shares its IP widely in the first decade. As many members as possible should have nonexclusive, royalty-free licenses to IP as the consortium develops. Only after the NSTC’s cross-sectoral research agenda solidifies should the consortium narrow the number of companies with which it shares IP.

The NSTC’s geography should be as centralized as possible. Building facilities around a flagship campus would give Natcast leadership more influence over the consortium’s agenda than it would have with many annexes to existing facilities. This strong leadership will allow the NSTC’s research and investment agenda to avoid industry capture.

Lastly, the NSTC’s financial plan should mimic that of its closest historical peer, Belgium’s Imec. The CHIPS Act gives the consortium an initial five years of relative financial independence, which it should use to build centralized infrastructure. In the second five years, the NSTC should rely more heavily on participation fees for joint efforts, but still use public dollars for infrastructure expansions. After a decade, the NSTC can confidently engage in more direct fee-for-service projects without fear that it will become over-reliant on a small number of private-industry partners.

A unique orientation for the NSTC 

The chip industry is rife with grand visions for new paradigms in materials, architectures, and other foundational technologies, but realizing these goals requires tighter cooperation among industry players. In 1987, TSMC’s decision to only fabricate chips that other firms design kicked off the fabless-foundry bifurcation. Today, heterogeneous integration further divides the semiconductor world into engineers thinking about fragmentary chiplets and those thinking about the entire chip. Just as the industry has become more siloed, however, downstream trends like autonomous driving demand increasingly seamless communication between sensing and computing systems, technologies developed by different companies in distinct segments of the industry. Each time chipmaking reaches a new level of complexity, it needs new institutional arrangements to orchestrate cooperation among increasingly siloed companies, universities, and financiers.

Imec played such a gathering role in the late nineties and aughts, when it brought together upstream optics makers, toolmakers, university labs, and downstream chipmakers, and it continues to facilitate cross-industry collaboration toward a limited set of moonshots today. The value-add of gathering places like Imec lies in their ability to incentivize industry players, who would otherwise be unwilling or unable to share knowledge, to pool intellectual property (IP) and engineering talent in pursuit of joint research projects. For example, the consortium has linked toolmakers, integrated device manufacturers (IDMs), and foundries over fifteen years in a gallium nitride on silicon (GaN-on-Si) effort that no industry player alone has the incentive to coordinate. Imec’s cooperative efforts bore fruit in December 2023, when its GaN-on-Si transistors outperformed industry alternatives.

This alternate development route shows the value of consortia with different incentive structures than private industry. Imec’s diverse research participants and longer time horizons allowed the chipmaking world to bet on two GaN transistor technologies at once, and the best ultimately won out.

Aside from Imec, chipmaking countries across the world boast public-private partnerships seeking to connect subsets of players in pursuit of narrow goals. However, reaching semiconductor technologists’ loftiest moonshots requires gathering competitors, siloed industry segments, and investors in ways that no existing consortium addresses.

This section analyzes how the NSTC could set its research agenda and investment agenda to help correct those market failures.

Correcting market failures in research

The consortium should set its research agenda in line with three high-level goals: proactive design planning for manufacturing, de-risking underinvested areas, and organizing cross-sectoral missions. Such ambitious planning might not immediately yield overwhelming support from all industry players, but the fear of disagreement should not hinder farsighted investment. If the consortium can demonstrate early success, initially skeptical actors may come on board later.

Design for manufacturing

Chip startups often prefer to use laboratory equipment and controlled settings to develop prototypes, as opposed to the less precise tools used in commercial environments. This preference is understandable, as most are pursuing early-stage investment and have limited bandwidth to consider how their prototypes will ultimately reach production scale. However, a venture fund co-founder we interviewed lamented that relying on nearly perfect laboratory conditions delays important design and testing questions. Startups meet messy commercial fabrication environments at the Series B stage, at which point their early funding runs dry. This stage, when startups have outgrown early-stage venture funding but have not yet proven their products with commercial equipment, is referred to as the valley of death.

Figure adapted from the CHIPS Research and Development Office.

Insisting that startups participating in the NSTC ecosystem use production-style tools (and providing access to such facilities) will give early-stage innovators both the means and the incentive to consider crucial questions about the transition to commercial-scale manufacturing earlier in their development. The same venture fund co-founder highlighted the long-standing investment by Applied Materials, an American semiconductor manufacturing equipment (SME) champion, in Adesto Technologies as an example of such an approach. From Adesto’s Series A through the startup’s IPO, Applied Materials (AMAT) provided its own lab space and engineering support to help Adesto prove its designs could scale with commercial equipment. AMAT’s interest in doing so was to improve its own tools and process flows, suggesting that the NSTC could similarly expose startups to commercial environments by incorporating them into its workflow.

An NSTC commitment to “design for manufacturing” in its research agenda could look like Taiwan Semiconductor Manufacturing Company’s (TSMC’s) enormously successful Design-Technology Co-Optimization (DTCO) methodology. DTCO, and its broader counterpart, System-Technology Co-Optimization (STCO), are methods under which traditionally sequential steps of the chip value chain are undertaken in parallel. This parallelism lets engineers from different steps of the value chain more easily learn about and implement changes that lead to efficiency gains for their upstream and downstream colleagues. DTCO notably led to the transition from planar to FinFET transistors, and to more accurate chip layout design using lithography and etching tools.

TSMC succinctly described this approach in a 2022 media post:

“TSMC process R&D needs to work with TSMC design R&D on DTCO from day one of next-generation technology definition. Both teams must maintain an open mind as they explore what’s possible in terms of design innovation and process capability. A lot of innovative ideas are proposed at this stage. Some of them could be too aggressive to be realized by current technology. Some of them may look promising initially but turn out to not be so useful.”

Just as TSMC coordinates its various research teams, the NSTC should require all startups it supports for early-stage research to plan for the limits of production-stage capabilities. In the worst case, such planning would lead to the early scrapping of unrealistic proposals; in the best case, it would push startups to find creative solutions that circumvent the constraints of commercial environments.

Testing grounds for new materials and new chemicals 

The chip industry invests far too little in certain high-potential technology segments. New substrates and new chemicals exemplify how capex barriers and process inertia impede progress.

Semiconductor substrate startups working with wide bandgap (WBG) materials such as silicon carbide (SiC) and gallium nitride (GaN) face a unique set of capex barriers tied to their infrastructure needs. WBG materials are important for computing in contexts like transportation, energy grids, and outer space because they can operate at higher voltages, temperatures, and frequencies than conventional semiconductor materials. Expanding commercial-level production of WBG materials would increase both their supply and the number of facilities that can process them. Such growth in materials supply and infrastructure would in turn lower the cost of testing novel WBG chip technology.

Unfortunately, commercial fabs often hesitate to adopt new materials like SiC due to the risk of contaminating their existing production lines, and substrate startups need early adopters to demonstrate the viability of their products. As a result, these startups have limited avenues to test novel substrate technologies. X-FAB’s testing lines in a retired Texas Instruments facility offer SiC and GaN capacity, and ARPA-E regularly invests in SiC testing; however, these facilities and funds are not nearly enough to support a fertile substrate ecosystem in the U.S. Publicly financed U.S. programs devoted to new materials research have in fact declined over the past decade. Founded in 2005 with $20 million in annual public-private funding, the Nanoelectronics Research Initiative (NRI) spent a decade focused on non-conventional materials and architectures that sought to outperform CMOS. It was later merged with the STARnet program, which received $50 million in annual funding for six facilities, three of which engaged in beyond-CMOS research. Neither NRI nor STARnet is still active, leaving ARPA-E to fund, but not provide additional facilities for, the demonstration of new materials.

In contrast, Chinese companies investing in SiC and GaN technologies benefit from government support for the infrastructure required to bring such products to market, helping the PRC catch up to foreign leaders in substrate technology. By incorporating novel substrate technologies into its full-stack research projects, the NSTC could help provide testing grounds for startups otherwise unable to get such opportunities from industry.

Regarding new chemicals, the World Semiconductor Council (a global gathering of regional semiconductor industry associations) annually reminds regulators that certain toxic substances remain irreplaceable in the chipmaking process. Per- and polyfluoroalkyl substances (PFAS) are ubiquitous in fabs, but since 2000 the industry has fully eliminated only one particular PFAS compound (perfluorooctane sulfonic acid) and has committed to eliminating just one other (perfluorooctanoic acid) by 2025. Though the technical challenges of finding alternative chemicals and transitioning process steps are formidable, one industry observer laments that chipmakers are dragging their feet due to their belief that regulators will continue allowing them to use PFAS. Making PFAS alternatives a research priority for the NSTC could catalyze such chemicals’ discovery and implementation while regulators still allow their limited use in chip production.

As with new materials, semiconductor researchers lack testing grounds for potential PFAS-alternative chemicals. The University of Arizona Engineering Research Center (ERC) for Environmentally Benign Semiconductor Manufacturing, an NSF-SRC program that received approximately $1 million in annual public funding beginning in 1997, has not been active since its last NSF grant ended in 2017. Additionally, the University of California, Los Angeles (UCLA) Center for the Environmental Implications of Nanotechnology (CEIN) received approximately $6 million in initial public funding from the NSF and EPA from 2008 to 2013, and NSF extended this funding through 2017. However, CEIN primarily focuses on measuring the chip industry’s environmental impact, not on developing alternatives to hazardous chemicals.

New materials and new chemicals would confer benefits across the semiconductor industry and the larger U.S. technology ecosystem. This broad benefit, however, has to date limited industry investment. The NSTC should not only invest in these new paradigms, but also fashion itself into a testing ground for the materials and chemicals that industry facilities will not risk trying. Doing so will define the NSTC’s physical infrastructure as a public good that helps solve some of chipmakers’ most consequential free-rider dilemmas.

Cross-sectoral missions

Cross-sectoral missions are research initiatives requiring participation from companies that usually pursue their own siloed research agendas. Rather than following corporate research agendas driven by firms’ quarterly financial cycles, these missions should be oriented toward downstream use cases in the public interest like machine learning, security, and sustainability. A financially viable NSTC could help coordinate chip technology developments needed for these downstream trends.

A research agenda targeting cross-sectoral missions would be different in kind from the Decadal Plan for Semiconductors, a research agenda laid out by industry players through the Semiconductor Industry Association (SIA) and the Semiconductor Research Corporation (SRC). The Decadal Plan delineates five categories of breakthroughs in memory, analog, communications, security, and sustainability. The first three categories of this plan are important and largely align with the research agendas of well-established chipmakers and designers, but the latter two categories convey the priorities of actors outside the core industry. 

The NSTC, however, should draw inspiration from the Decadal Plan’s latter two categories, security and sustainability, to deliver the chip technology needed for world-changing downstream uses of advanced computation. The following are three examples of use-case driven research agendas:

Risky AI hardware bets

“The Hardware Lottery” describes how a research direction can win not because it is necessarily the most promising, but because it is best suited to the capabilities of the underlying computer hardware. Nowhere is this more evident than in hindsight from the current AI boom. The fundamental software ingredients for AI computing, namely neural networks and backpropagation, were developed decades ago. But their promise wasn’t realized until GPUs were applied to the problem – a fortunate spillover from hardware developed for the gaming industry.

Unfortunately, no single hardware paradigm works ideally for every software paradigm, and the current approach to AI hardware also has its limitations. For example, Bayesian deep learning is regarded by many researchers as a promising approach, but has high computational costs and is difficult to parallelize over multiple GPUs. 

Today’s AI hardware paradigm suffers from performance bottlenecks due to poor data movement: advances in how quickly chips can crunch numbers are far outpacing improvements in bandwidth and latency.

As companies race to specialize within the current AI hardware/software paradigm, the NSTC could support higher-risk work to explore the next paradigm. Beyond potentially leading to breakthroughs in areas such as optical computing and in-memory computing, risky hardware bets could allow software startups whose ideas do not fit into the current paradigm to achieve proof-of-concept.

System-level security

“Heterogeneous integration” (HI) has become chipmakers’ greatest hope to advance computing beyond the limits of Moore’s Law. At the same time, however, HI also makes chips more complicated, introducing more weak points that bad actors can exploit to hack devices. Importantly, no single developer has insight into all the potential security risks on a single heterogeneously integrated chip because siloed actors individually develop many of the final product’s “chiplet” subparts.

One key area of concern is advanced AI development, which often relies on highly complex GPU-based systems, and where IP presents an especially juicy target for well-resourced hackers. Independent researchers recently discovered a vulnerability in chips from four major GPU vendors that allowed adversaries to spy on outputs from other users’ LLM conversation sessions. The White House highlighted similar concerns in a report calling for chips of sensitive systems to deeply embed memory protections.

As chip development becomes more siloed due to HI, the NSTC could coordinate its members to identify and standardize solutions to system-level vulnerabilities. Specifically, it could help industry and academia jointly unravel structural problems in software and hardware stacks to ensure chip security is not solely dependent on individual companies’ foresight and due diligence. As we will elaborate in later sections, the NSTC could then use its role as a publicly subsidized consortium to make IP that is relevant to chip security as widely and cheaply available as possible.

PFAS in chip fabrication

Even before chips begin sucking power in massive data centers or on laptops, semiconductor fabs rely on electricity, water, and chemicals to run machines etching the designs on paper-thin sheets of silicon. The Decadal Plan envisions computing paradigms that could enable a more than million-fold improvement in operational efficiency. However, it omits any mention of sustainability in fabrication. 

Creating and implementing safer process chemicals could be a prime area for the NSTC to gather chipmakers, toolmakers, chip designers, and upstream chemicals providers in pursuit of a stubborn challenge shared across chip segments and manufacturing steps. While the theorized causal link between PFAS chemicals and harm to the human endocrine system has not been definitively proven, chipmakers have nonetheless faced ever tighter restrictions on PFAS use, as well as heavy pressure from environmental advocates for a blanket ban. The semiconductor industry has to date succeeded in delaying such bans by arguing that, without PFAS, chipmaking in America would grind to a halt. However, the U.S. Environmental Protection Agency’s 2023 proposal of a complete ban suggests there may soon come a day when U.S. chipmakers must remove PFAS from their manufacturing processes entirely, regardless of definitive scientific evidence.

As mentioned previously, alternatives to PFAS compounds have eluded the industry for over two decades. Breakthroughs in this field are challenging for individual companies because a) developing PFAS alternatives is undeniably hard and thus requires expensive, risky upfront research, b) implementing PFAS alternatives would require significant manufacturing and design process changes, and c) the benefits would accrue to all industry actors and to society more broadly (in the form of more sustainable fabrication). The NSTC should therefore organize cross-industry contributions to developing PFAS alternatives as a public good.

The NSTC’s leadership should identify overlap points between traditionally siloed technology verticals as early as possible. Once found, these overlaps should form the nexuses of the consortium’s research agenda, as opposed to planning it around analog, memory, logic, and the other verticals that serve as the industry’s existing foci. Such a cross-sectoral orientation will make the NSTC a unique gathering spot for players across the industry to plan breakthrough goals.

Correcting market failures in investment

Congress provided limited instruction regarding how the NSTC’s investment fund should function, and the Commerce Department’s 2023 “A Vision and Strategy for the NSTC” document does not specify what particular financing mechanisms the consortium plans to leverage. U.S. Code simply states that a function of the NSTC shall be:

“to establish and capitalize an investment fund, in partnership with the private sector, to support startups and collaborations between startups, academia, established companies, and new ventures, with the goal of commercializing innovations that contribute to the domestic semiconductor ecosystem, including

(i) advanced metrology and characterization for manufacturing of microchips using 3 nanometer transistor processes or more advanced processes; and

(ii) metrology for security and supply chain verification.”

Although the NSTC’s empowering legislation enumerates the two responsibilities above, the law does not limit the consortium to only metrology-related tasks. NSTC leadership can act across industry segments, and it should use this flexibility to align its investment agenda with its research agenda. Such alignment would form a powerful signal to capital providers that semiconductor startups are ripe for funding.

Investment signaling from a well-funded and technically competent institution like the NSTC would greatly benefit the chip industry. Venture capital firms like a16z, as well as larger institutional investors, generally do not have staff with deep experience in the chip industry, making them unlikely to lead funding rounds for startups whose markets they do not understand. In contrast, NIST regularly publishes roadmaps for even niche corners of the chip industry, like die-level failure analysis, but investors lament that NIST’s signals are weak without the government also putting financial skin in the game.

The NSTC’s investment arm will not have perfect insight into the industry, but a clear research agenda can help it evangelize a clear investment agenda. In-Q-Tel stands as an extreme case study: as the U.S. intelligence community’s sole investment arm, its funding decisions are analyzed by third-party investors as the key signal of what technologies a major government customer cares about. The U.S. government has not been a large enough customer to set the semiconductor industry’s agenda since the mid-20th century, but an investment agenda well aligned with the NSTC’s research agenda can give the government a meaningful role in assembling industry stakeholders and setting long-term moonshots.

Bridging the “valley of death”

The NSTC should focus its investment on Series A and Series B startups to help these firms think about commercialization, not just survival. In their scramble to survive through seed and Series A, chip startups use equipment that differs from what runs under commercial conditions, and often accept short-term deals from EDA providers like Cadence and Synopsys. These deals generously subsidize tool use until Series A, but once a startup raises a threshold amount of funds, EDA license payments suddenly kick in. Some startups simply cannot afford the license payments that raising a Series A would trigger, leaving their products far from ever being commercialized.

Though the chip industry has climbed from receiving only the 25th most venture funding in 2018 to the 11th most in 2023, chip startups still lack strategic investors with both large funds and deep technical insight. The top semiconductor seed investors are either generalist funds, which lack chip expertise, or chip firms’ in-house funds, which adhere to existing industry research agendas. Silicon Catalyst is the world’s only incubator focused solely on developing ambitious semiconductor solutions. Though it does great work, our interviewees repeatedly bemoaned that its limited resources and administrative structure force Silicon Catalyst to be selective to the point of excluding many promising upstarts. With an annual revenue of roughly two million dollars and a dozen team members, it has admitted under 10% of the over 1,000 startups it has evaluated.

Though such selectivity from strategic investors is commonplace across industries, it risks pushing otherwise worthy chip startups to look abroad for less conservative sources of funding. In a representative 2022 month, 58 Chinese startups seeking venture funding succeeded, raising $1.27 billion, or 56.27% of that month’s global chip startup funding. In contrast, only 17 U.S. startups raised money that month, totaling $710 million, or 31.26% of the global total. Though these statistics cannot predict the relative success rate of either country’s startups, a well-funded, technically informed institution like the NSTC could help the U.S. catch up to China in the number of entrepreneurs it helps bridge the valley of death.

By leveraging the U.S. government’s and industry participants’ technical expertise alongside its significant funding, the NSTC can offer startups the advisory support and subsidized EDA licenses that only a small share of worthy entrepreneurs currently enjoy. As the chip world’s newest investor and incubator, the NSTC could help semiconductor startups survive the valley of death dividing prototyping and commercialization.

Building on- and off-ramps for underdogs

To avoid duplicating the efforts of foreign consortia and to fill gaps left by the private sector, the NSTC should avoid chip industry segments that are already saturated. Due to their lower capital costs, chip design startups receive the lion’s share of venture capital (VC) funding in the semiconductor industry. Of the 1,343 chip startups worldwide (including China) that one study tracked as receiving VC funding, over half (750) were design startups. Although that study could not accurately capture the funding breakdown by sector, 18 of the 25 startups receiving the most funds engage in design activity. This relative abundance of design funding is partially due to the explosive growth in AI-related chip design. According to PitchBook, design startups targeting AI use cases accounted for $6.5 billion of the $14.4 billion in raised capital held by private chip companies outside China as of July 2023.

Instead of focusing on areas that already receive major private sector investment, the NSTC should support technology breakthroughs in fields that operate more like public goods, often where high capital and infrastructure barriers discourage individual companies from taking the lead. For instance, it could pursue alternatives to PFAS chemicals, or solutions to deeply ingrained vulnerabilities in heterogeneously integrated chips. Some research into on-chip governance mechanisms could also be appropriate, especially where these tools need to cohere with software and IP designed by different firms for the same chip. The pay-offs of such research streams may come in the form of prevented chemical run-off, enhanced national security, and assured large language model (LLM) end-user privacy. In other words, the NSTC should use its unique position to advance high-leverage technologies whose costs and pay-offs may not be neatly internalized by the firms typically making investment decisions.

Leading rounds as a strategic investor

As the traditional free-market paradigm goes, the private sector can allocate capital more efficiently than the government. However, this aphorism misses how semiconductor companies with venture capital arms actually organize investment rounds. Different firms in different parts of the industry will have unique insight into limited sets of technology verticals, and the NSTC’s investment fund could serve to organize funding syndicates as an external moderator.

For example, Intel may know precisely what future equipment could serve its processes well, but it may not know what software design tools Qualcomm would find valuable. Furthermore, neither company’s venture arm will necessarily be focused on risky, potentially paradigm-shifting technologies, for the simple reason that such efforts could either fail or successfully upend their existing business models. At $2.2 billion in funding, SRC comes close to the sort of technically informed investment organizer the NSTC ought to become. However, SRC’s research and investment agenda, the previously mentioned Decadal Plan, largely focuses on existing industry niches. To fill a gap not yet filled by in-house venture funds and SRC, the NSTC’s venture arm should think about semiconductor breakthroughs needed by downstream technologies, like machine learning, heterogeneous integration, and sustainability.

Successfully orchestrating funding syndicates will often require the NSTC to take a leading position. The amount it would need to invest to lead a given round depends on the stage of the startup, but private entities leading chip startup rounds usually put up between $10 million and $20 million, while non-leading participants usually contribute only around $5 million. These large sums are themselves a testament to the capital barriers semiconductor startups face. For example, a U.S.-based startup with 50 employees could face an initial annual burn rate of $10 million, rapidly growing to $30 million to $40 million as the startup’s IP license dues grow. Because of these rising operational costs, the anchor investment in each successive round would need to keep pace with the total round size.

Different niches for different consortia

Though the NSTC should be the broadest chip industry gathering place in a generation, existing semiconductor public-private partnerships around the world offer valuable case studies on how research consortia gather different players in pursuit of overlapping research goals. The legal and financial building blocks we describe in the next section for the NSTC derive from these consortia’s operational models.

Foreign consortia generally fall under two categories: those focusing on “More Moore,” continuing node size shrinkage in line with Moore’s Law, and those pursuing “More than Moore.” Most institutions do not fit neatly in one category or another, and the NSTC’s research agenda will feature characteristics of both. 

As examples of “More Moore,” Belgium’s Imec and Taiwan’s ITRI have long pushed the boundaries of semiconductor manufacturing equipment and process technology. Because generational advancements in node size are such major efforts, the role of startups in these consortia’s flagship research projects is limited. The NSTC will have to consider whether its “More Moore” goals have too fixed a trajectory for smaller players to play meaningful roles.

Imec

The Interuniversity Microelectronics Center (Imec) secured its spot in industry history books for its role in developing extreme ultraviolet (EUV) lithography tools, and it continues to work closely with industry players to push the boundaries of semiconductor lithography, deposition, and etching equipment. Through its system-technology co-optimization (STCO) sandbox, the center offers researchers across the industry access to facilities for developing new, full-stack paradigms for CMOS, the primary fabrication process for memory, logic, and analog chips. In addition to pushing the boundaries of CMOS, Imec’s facilities also offer pilot wafer runs for narrower innovations by startups in sensors and telecommunications targeting specific industry applications.

ITRI

On the other side of the planet, Taiwan’s Industrial Technology Research Institute (ITRI) similarly supports companies trying to push the boundaries of Moore’s Law, as well as those trying to get around it. Programs to develop advanced metrology tools and techniques support firms like TSMC seeking to manufacture chips with ever-smaller node sizes. Like Imec, however, ITRI also supports logic and memory chipmakers’ “More than Moore” efforts to increase the efficiency of existing node-generations through heterogeneous integration.

By contrast, the “More than Moore” consortia are often located in regions without advanced-node semiconductor manufacturing or design capabilities. Seeking to support the strengths of local chipmakers and downstream clients, they instead focus on increasing the efficiency and applicability of legacy-node chips. Because these consortia seek new applications within existing technology generations, they tend to aggressively search out and support local startups.

Leti

France’s Leti (Laboratoire d’électronique des technologies de l’information) de-risks the development of new chip technologies for industrial applications by conducting applied research on behalf of industrial partners. It works with legacy chipmakers like STMicroelectronics and GlobalFoundries to develop more energy-efficient materials and architectures for sensors, telecommunications, power electronics, and optics. Its related investment arm, CEA Investissement, also seeks out startups in these fields to develop the consortium’s patent portfolio for industrial applications. These semiconductor innovations then feed into the mostly European original equipment manufacturer (OEM) customers that buy from Leti’s upstream chipmaking partners.

A*Star IME

Singapore’s Agency for Science, Technology and Research (A*Star) operates the country’s Institute of Microelectronics (IME), which focuses its research on advanced packaging, microelectromechanical systems (MEMS), silicon carbide (SiC), millimeter-wave gallium nitride (mmWave GaN), and photonics and sensors. On the advanced packaging side, IME offers a heterogeneous integration pilot line for companies to integrate different chiplets, with approaches for both dissimilar technologies and dissimilar wafer materials. The institute conducts both self-initiated research and research partnerships focused on applying SiC, a wide-bandgap semiconductor, to power electronics. On the optics and sensors side, it conducts R&D prototyping and small-scale manufacturing of MEMS devices that serve medical applications. A*Star IME’s technical niches overlap with those of Leti, leading some of Leti’s key French partner firms, like Soitec, to also collaborate with the Singaporean institute.

Bureaucratic building blocks 

Companies and universities participate in research consortia to access intellectual property that they are not capable of developing alone. They may be willing to contribute technology they have already developed in the form of “background IP” to help consortium partners conduct joint research into breakthroughs shared with all participants in the form of “foreground IP.” This legal instrument is the linchpin of any research consortium’s operational model and long-term financial sustainability.

Semiconductor consortia have historically taken a range of approaches to sharing IP with their partners, from making all findings public to exclusively licensing findings to commercial customers. These IP sharing models should match the structural features of the consortium. For example, a consortium receiving a large share of its funding from member companies cannot share its IP as widely as a primarily publicly funded consortium could. America’s last major semiconductor research consortium, SEMATECH, infamously unraveled when member companies stopped paying contributions after an IP policy change: the consortium dropped the two-year exclusivity window during which only member companies could license consortium IP, leaving fee-paying members feeling shortchanged. In other words, the heavily private-sector-funded consortium adopted a policy befitting a primarily publicly funded one.

This section identifies approaches leaders in Natcast could take to ensure the NSTC’s IP sharing model coheres with its level of subsidization, membership, and research agenda. If these factors are not internally consistent, the consortium may become financially unsustainable after only five years. 

Picking the right IP model

Policymakers must pick IP models that facilitate commercialization of the NSTC’s research, but some IP policies will simply not be compatible with the NSTC’s membership and research output. Critically, if the center does not receive enough funding from the government, it may need to leverage IP tools to generate revenue for itself.

A 2021 paper by the Institute for Defense Analyses lays out the four most prominent IP models for public-private partnerships (PPPs). The NSTC’s IP sharing model should progress along this spectrum, from a combination of models #1 and #2 in its initial five years toward models #3 and #4 after a decade.

  1. Open Access: At the least restrictive end of the spectrum, consortia could make all IP open to the public by publishing results and not requiring licensing fees to implement findings. These are most common for PPPs focused on basic research.
  2. Shared, Limited to Partners Only: Here, members have free access to any IP developed by the PPP through nonexclusive royalty-free licenses. Use may be limited, though, to “evaluative work” that does not solely benefit an individual firm’s priorities.
  3. Shared, Limited to R&D Collaborators: Moving further along the spectrum, only members who participated in a specific technology’s development own the resulting IP. Access for other consortium members varies.
  4. Exclusive Model: Some consortia license technology on an exclusive basis to third parties. These tend to occur in PPPs focusing on late-stage research, and such licenses are a significant source of income for those consortia.

In addition to differing between each other in their IP sharing models, many consortia change their models over time. NSTC leaders will have to dynamically adjust their IP model as the answers to the following questions about the consortium change: 

Does the consortium receive a large or small share of its funding from the government? Does it have a homogenous or diverse membership? Does it focus on one stage of technological development or research the full stack? These factors all require different allocations of IP rights.

Levels of federal subsidization

Consortia receiving a relatively large share of their annual budget from governments have much more flexibility to share their IP among members and the public, pushing them toward the public/open access camp. In contrast, consortia receiving a relatively small share of their funding from governments either have to own their IP for the associated royalty revenue or alternatively rely on membership fees.

If a consortium decides to own its IP, then it is likely to rely on the exclusive model of licensing IP to third parties. Meanwhile, relying heavily on membership fees obliges a consortium to share limited IP rights with its members only, as these companies would be loath to see outsiders free-ride on their contributions.

As we elaborate in the next section, Natcast should follow Imec’s financial model, letting government funding from the CHIPS Act form the majority of the NSTC’s budget in its first five years. Such a financial plan would allow the NSTC to share its IP as a public good in its early years, enabling small startups, universities, and large firms to benefit equally from the consortium’s foundational research.

Homogenous vs. diverse membership

Four types of stakeholders often join research consortia: start-ups, big industry, academia, and government. Each has different contributions to make to the consortium, as well as different needs regarding IP. Critically, the more diverse a consortium’s membership is, the more flexible it will need to be in its IP policy.

Start-ups often have the least to contribute to consortia, but the greatest ability to leverage a PPP to benefit society at large. Since most of a start-up’s value is based on its IP, founders are naturally hesitant to share their background IP with other consortium members. Any leakage of core IP to well-established competitors could jeopardize their chances of receiving the next round of funding. Nonetheless, if consortia accommodate start-ups by requiring less background IP sharing from them, these small players can leverage the consortium’s subsidization of EDA tools to commercialize their ideas more rapidly.

Big industry stakeholders bring significant IP to the table, but want to ensure competitors do not free ride off their contributions. To that end, they often call for multiple tiers of membership that match IP rights to contribution levels. Though big companies are more willing than start-ups to share background IP, they demand robust licensing frameworks to ensure their technology is not misused by competitors.

Academia differs from private sector members in that university IP often comes with limitations. Their background inventions are often funded by government grants that regulate licensing terms, end-uses, and ownership transfers. A consortium must prevent members from inadvertently violating the rules around any background IP, but caution is especially warranted for universities’ contributions.

At the time of this report’s writing, the White House has announced an interagency agreement that “outlines the goals and processes for determining the strategy and membership structure of the NSTC Consortium,” but the contents of the agreement are not yet public. As they move forward in this membership selection process, Natcast officials will have to carefully consider the IP interests of the members they admit. The consortium may need to subsidize startup and academic members who cannot contribute background IP as readily as large firms.

Setting an agenda

Research consortia can focus on basic research, applied research, prototyping, or scaling. More often, though, they adopt an agenda spanning the research stack.

Basic research pursues fundamental advancements in core scientific disciplines, like physics and chemistry. This usually calls for a public/open access model, where people in and out of the consortium can apply research results to starkly different fields. Often, only consortia with large government funding or broad memberships can afford to make their findings free.

Applied research leverages existing science to address immediate commercial needs. As a result, members worry about non-contributors free riding off the consortium’s work, and call for a shared-limited model. These models often involve nonexclusive royalty-free licenses (NERFs) granting consortium members free but limited use of jointly developed IP.

Prototyping and scaling focus on how to manufacture and commercialize an innovation. Here, consortia help companies prepare their already well-developed ideas for production and commercialization, so there is no question of sharing this IP with competitors. Consortia engaging in this end of the stack must use exclusive models that compensate the consortium via revenue shares and IP royalties.

In consortia that research more than one piece of the technology stack, it becomes important to categorize technology properly. Whether a specific research project qualifies as basic research, applied research, prototyping, or scaling determines who has access to the resulting IP. The Department of Energy has faced challenges in the past where proprietary user-facility agreements prevented its laboratories from accessing information about industry partners’ research results. Though facility users had to report what patentable technologies resulted from their activities, they did not have to report these technologies’ commercial applications, which in turn limited the DOE’s ability to determine the technology readiness level (TRL) and ultimate value of its research products. 

Having incomplete information about its facilities’ research products is not fatal for the DOE labs. Because they do not function as a research consortium, DOE labs are not at risk of improperly allocating IP rights if they misunderstand the use cases of clients’ IP. The NSTC, however, must ensure it retains insight into the commercial applications of its research results. If it fails to, it will not be able to properly categorize the TRL of the research it produces and so may improperly allocate IP rights to its members.


Centralized vs. distributed infrastructure

As responsible stewards of taxpayer dollars, Commerce officials are right to pursue efficiency while implementing the CHIPS Act. The Department noted in its September 2022 strategy paper that achieving its resilient chip supply chain goals “requires co-location of resources in particular geographic regions to achieve economies of scale and spillover benefits.” Officials need to be careful, however, that co-location for financial reasons does not relinquish control of the NSTC to the next-door companies with whom it cooperates.

In theory, a program like the NSTC could be fully centralized, fully distributed, or follow a hybrid model. Most stakeholders suggest the latter because existing academic and corporate facilities already have parts of the infrastructure needed for the NSTC. In these models, there is a core NSTC facility, as well as several satellite “Centers of Excellence” (CoEs) serving specific functions. 

Industry stakeholders, namely large semiconductor manufacturers, advocate a decentralized approach. One prominent response to a Commerce Department request for information on the NSTC said that a single NSTC location with full-flow capabilities is not necessary. Rather, the NSTC could function as a “distributed full flow” connecting nationwide hubs. Chipmakers clearly prefer decentralized hubs pursuing specific technology verticals because these hubs could advance their preexisting research agendas. Intel has proposed setting up an Advanced Packaging Lithography Center and assisting with an Advanced Packaging Manufacturing Center, both of which would operate as standalone physical facilities tied to the broader NSTC ecosystem. Micron has urged the NSTC to create a Memory Center of Excellence built adjacent to existing leading-edge facilities, namely Micron’s own in Boise, Idaho. The manufacturers’ philosophy is captured by TSMC’s position, which calls for the NSTC’s Centers of Excellence to feature both specific focus areas and economically self-sustainable business models. In other words, such a vision would empower CoEs to operate more or less independently of each other, the overall NSTC, and Washington.

On the other hand, academics, federal labs, and the most R&D-intensive semiconductor companies (namely, fabless design firms) advocate a more centralized version of the “hybrid” NSTC model. Evolving semiconductor technology and supply chains at a fundamental level requires early orchestration at the R&D stage, and these groups are concerned that siloed R&D at one stage of the technology “stack” could cause incompatibility with advancements further up or down. To that end, centralization advocates call for the NSTC to be led by a strong core, with Centers of Excellence (CoEs) playing only a secondary role. Composed of early-stage research organizations, equipment suppliers, chip customers (OEMs), or chip manufacturers, the CoEs would not pursue independent technology verticals, but rather research different stages of a centrally determined technology goal.

We support a hybrid model, leaning toward greater centralization of facilities in pursuit of a unified research agenda. Whether Congress ultimately decides to renew the NSTC’s funding after the five years of CHIPS Act support dry up may hinge on the consortium’s track record at moonshots. If it merely becomes a subsidy for existing industry R&D plans, members of Congress may not view the NSTC as an improvement on dedicated CHIPS manufacturing subsidies. When debates about a “CHIPS Act Part II” commence around 2027, Congress may even choose to follow long-standing industry calls to subsidize R&D directly through tax credits, in lieu of extending an administratively complex consortium. However, if the NSTC leads to paradigm shifts in computing technology that benefit more than just the largest industry incumbents, it may become an American institution with staying power.

Funding trajectory

To understand how infrastructure and funding sources are interrelated, it helps to recall that Imec took a long path to its low level of government funding (17%). In its first year, 1984, the consortium was 90% publicly funded; that share fell to 50% by 1992 and approached its present-day level, hitting 20% by 2004. Because a wide range of companies pay membership fees, no one firm commands its long-term research agenda.

To establish a centralized model and control over its long-term agenda, the NSTC should follow a path similar to Imec, as outlined in a MITRE Engenuity February 2023 paper:

  1. In the first five years, the NSTC should build out its infrastructure primarily with public CHIPS Act funds, setting the consortium’s foundation without controlling pressure by any single company. The independence brought by funding infrastructure investments with public dollars would reduce the NSTC’s need to make large concessions to companies next to whose facilities it may need to build annexes. Such concessions could include requiring that the annex pursue existing research priorities of the host company or even committing fab time to some of the host company’s individual competitive research.
  2. In the second five years, the NSTC should rely more heavily on participation fees for joint projects, but public dollars should continue funding significant investments in infrastructure to ensure new facility plans do not only serve narrow private interests.
  3. Finally, with most infrastructure having been built and its research agenda largely set, the NSTC can confidently engage in more direct fee-for-service projects with companies without fear that it will become over-reliant on a small number of players.

What the semiconductor industry needs

As the semiconductor industry grows ever more complex, it needs greater cross-industry collaboration. Throughout chipmaking’s history, public-private partnerships have facilitated cooperation between siloed competitors, but the industry’s current research and investment agendas suffer from market failures that existing consortia cannot solve.

The chip industry was born in America, and a sufficiently ambitious NSTC can ensure the next generation of computing technology is continued here. The consortium must help players of different sizes across the chip industry address important shortfalls in their research agendas: startups need commercial-grade facilities to test their designs’ production viability, incumbents and newcomers both need a partner committed to developing new materials and process chemicals, and large siloed firms need forums for cooperation on cross-sectoral missions. 

Similarly, the NSTC must fill gaps in the industry’s investment agenda. Mid-stage startups need targeted support from technically informed investors to survive until commercialization. Startups with offerings outside the latest trends need a sharp-eyed investor thinking about the industry’s long-term needs. And other investors need a player with both technical knowledge and financial muscle to organize funding syndicates for promising upstarts.

If the NSTC’s ambitious research and investment agendas complement those of private industry, the once-in-a-generation consortium can make a strong argument to Congress for continued funding in its later years. Legislators must remain convinced that the consortium thinks broadly about American innovation and can produce moonshots that direct industry subsidies could not. With Natcast leadership appointed and ready to build the NSTC from the ground up, we hope these recommendations help the U.S. “drive the pace of innovation, set standards, and re-establish global leadership in semiconductor design and manufacturing.”

Appendix: IP-sharing models

Public-private partnerships around the world feature different funding sources, memberships, and research agendas, offering fascinating illustrations of IP sharing in different settings. As Natcast leadership becomes clearer on the proportion of the center’s annual budget that comes from the government, its membership, and its research agenda, these precedents offer guidance on which IP sharing model would be most sustainable for the NSTC.

Imec

Although Belgium’s Imec also only receives a small share – 17% – of its funding from the Flemish government, other structural characteristics allow it to operate under a more open shared limited model.

On the funding side, readers will recall that consortia receiving limited funding from their governments have two options: seek IP revenue or lean on membership fees. Imec does the latter, relying on fees to fund its facilities. As a result, Imec neither requires members to share their background IP nor takes sole ownership of the center’s IP.

A diverse membership further encourages Imec’s shared limited model. Through the Imec Industrial Affiliation Program (IIAP), research partners in specific technical areas establish shared limited ecosystems where members receive rights to foreground IP proportionate to the background IP they contribute. This allows start-ups to participate despite having less IP to offer than established firms, whereas they would be repelled by the rigidity of Germany’s Fraunhofer or China’s NHSTTC.

Beyond its diverse membership, Imec’s wide research agenda requires different IP approaches depending on the stage of research. Early-stage research is less sensitive, so Imec shares these results with all fee-paying members. However, as technology climbs up the stack toward applied research, access is tightened to only members directly collaborating on the project. Imec rarely offers exclusive licenses as those would shortchange fee-paying members.

ITRI

Receiving 65% of its nearly $700 million annual budget from the Taiwanese government, ITRI has a public orientation similar to Japan’s AIST, but it also has exclusive arrangements for later-stage R&D.

As most of its R&D projects use funds from the Ministry of Economic Affairs, ITRI’s IP model is obliged by Taiwan’s Basic Law of Science and Technology to be fair, open-access, and non-exclusive. Like Germany’s Fraunhofer and China’s NHSTTC, ITRI itself owns the IP it develops. In contrast to those exclusive models, however, partners collaborating with ITRI on basic research projects are not expected to commit significant background IP or fees, and in return for participation they receive shared limited use of the R&D results.

In addition to its government-sponsored basic research work, ITRI also pursues later-stage projects and commissioned work requiring exclusive IP models. When companies commission ITRI to solve specific problems, the consortium must offer exclusive licenses giving clients full competitive advantage over the technology they purchased.

Leti

Public announcements by Leti often mention that subsidies from the French government account for only 20% of its annual budget of 330 million euros. It is true that, over the past decade, direct public funding has made up only about a third of Leti’s budget – with most of these public funds allocated to academic research and education. However, another third of its budget comes from public funding through research contracts, and only the final third is purely funded by industry.

This segmentation of public funding allows the French state to support its industrial champions without violating EU regulations. CEA, Leti’s governing body, remains the owner of the results obtained by Leti researchers; it also retains the “industrial building blocks” (basic patents) of bilateral projects. These “industrial building blocks” help disseminate know-how to startups, but they also serve a strategic purpose: as a former director of CEA’s investment fund and head of Leti’s microelectronics program noted, “Leti’s objective is not to do research for research’s sake but to help our industrial champions grow.” To that end, Leti not only keeps its patents in France, but also grants preferential licensing terms to French and European companies over their non-European counterparts. As a result, Leti’s IP policy bolsters France’s technological sovereignty by preventing innovations – and innovators – from leaking out of the country.

Leti’s private sector partners certainly also influence its program, since they contribute a third of its budget. However, the levers of the French state – Leti’s public contracts, as well as other administrative checks – preserve enough public influence to set the laboratory’s research agenda.

AIST

Receiving nearly its entire budget from the government and working with a broad range of research partners, Japan’s AIST functions most like a public research organization.

About 75% of AIST’s over $700 million annual budget comes from government funding, and a large share of these funds is commissioned specifically to explore basic science. The results of these projects are often public, requiring no license, and Japan’s Science and Technology Agency has underscored the importance of open access to research in the information and communication technology (ICT) sector.

Beyond its public research, AIST does conduct joint research with university and company partners, calling for some shared limited IP arrangements. In these partnerships, AIST follows a tack similar to Imec’s, welcoming partners to leverage background IP in return for joint ownership of foreground IP that can be non-exclusively licensed for limited purposes.

  1. Thank you to Joe Pasetti, Travis Mosier, Daniel Armbrust, Todd Younkin, Dan Patt, John Cole, Yan Zheng, Eileen Tanghal, David Henshall, Sam Folk, Gary Ignatin, and others for their contributions to this paper.