It’s hard to imagine a time when AI wasn’t a part of the silicon chip design flow. Since intelligence has now been integrated into design, verification, test, and other key phases, engineers are experiencing productivity advantages along with outcomes that humans alone wouldn’t be able to accomplish under typical project timelines.
How did we get here? And where do we go from here?
These were just a couple of the questions pondered by a panel of Synopsys AI architects at this year’s SNUG Silicon Valley 2023 conference in Santa Clara. The panel, “Rise of AI for Design—Journey Thus Far and the Road Ahead,” brought together experts from different areas of the business to share overviews of AI enhancements in their areas so far and thoughts on what might be coming up next. Geetha Rangarajan, senior manager from the Synopsys AI Strategy and Systems team and AI track lead at SNUG Silicon Valley, shared that the main objective for the panel was to discuss how AI can help us rethink ‘hard’ problems in multiple areas of system design and inspire attendees to think creatively about possibilities for leveraging AI-driven solutions. Read on for highlights of the discussion.
Over time, AI has simplified complex chip design workflows, optimizing increasingly large and complex search spaces. Indeed, solutions such as Synopsys DSO.ai use reinforcement learning to massively scale the exploration of design workflow options, reducing design time while enhancing power, performance, and area (PPA). Unveiled in 2020, DSO.ai was the market’s first AI application for chip design.
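The kind of search loop described above can be pictured with a toy example: treat each combination of flow knobs as a candidate, score it, and keep mutations that improve the score. The sketch below is a minimal greedy random search, not DSO.ai's actual algorithm; the knob names (`util`, `clk_ns`, `effort`) and the `ppa_cost` function are invented stand-ins for a real flow run.

```python
import random

def ppa_cost(config):
    """Hypothetical stand-in for a full flow run that returns a
    combined power/performance/area cost (lower is better)."""
    target = {"util": 0.70, "clk_ns": 1.2, "effort": 3}
    return sum((config[k] - target[k]) ** 2 for k in config)

def explore(space, iters=200, seed=0):
    """Greedy random search: mutate one knob at a time, keep improvements."""
    rng = random.Random(seed)
    best = {k: rng.choice(v) for k, v in space.items()}
    best_cost = ppa_cost(best)
    for _ in range(iters):
        cand = dict(best)
        knob = rng.choice(list(space))          # pick one knob to perturb
        cand[knob] = rng.choice(space[knob])    # try another legal value
        cost = ppa_cost(cand)
        if cost < best_cost:                    # accept only improvements
            best, best_cost = cand, cost
    return best, best_cost

space = {
    "util":   [0.60, 0.65, 0.70, 0.75],  # placement utilization
    "clk_ns": [1.0, 1.2, 1.5],           # target clock period
    "effort": [1, 2, 3],                 # optimization effort level
}
best, cost = explore(space)
```

A reinforcement-learning formulation replaces this greedy acceptance rule with a learned policy over the same kind of configuration space, which is what lets the exploration scale to spaces far too large to enumerate.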
Joe Walston, a distinguished architect from the Synopsys AI Strategy and Systems team, raised the question: could AI help engineers solve complex problems across the system stack? From computing devices to appliances to planes, modern systems cover a gamut of applications. The system stack includes software and hardware (physical) components that can potentially benefit from AI. There’s the workload-driven software, which is designed to handle the communication, data processing, and interface with the user, external environment, and other components within the system. Then there’s the software-driven architecture: hardware subsystems that are architected to deliver what the software needs (whether it’s mechanical, optical, semiconductor, power, or sensors).
Each of the layers of the system stack comes with many consequential questions to answer. For example, in the semiconductor subsystem, state-of-the-art SoCs include several processors, complex interface IP, digital/analog logic, memories, etc. Designers are faced with multiple workflow optimization challenges, from microarchitecture selection to floorplan optimization and choices around physical design, testing, and manufacturing. Similarly, for an optical subsystem, the designer must make decisions around the embedded circuit, lens, detector, and light source. AI, said Walston, could help with the repetitive exploration work, enabling engineers to meet their targets faster.
Badri Gopalan, a Synopsys scientist in the company’s EDA Group, provided perspectives from the functional verification side. For background, he noted that verification complexity is growing faster than Moore’s law. In today’s SoCs, there’s a lot to verify: all of the logic, and all of the combinations across multiple dimensions such as performance and power. How can verification engineers find more bugs, which also happen to be more complex, and do so faster than ever before, all while maintaining the desired quality of results and cost of results?
Applying traditional verification solutions at RTL will typically get engineers close to 100% coverage, as Gopalan explained. Static verification might find about 10% of the bugs, though the results may be extremely noisy and the process quite manual and laborious. Formal verification could detect another 20% of the bugs, and simulation might find another 65%. Simulation also involves a lot of manual effort, with a substantial amount of time spent tweaking testbench constraints and writing manual tests.
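Taking the rough figures above at face value, the contributions can be tallied to show the residual gap that traditional techniques leave behind. The percentages are the illustrative ones quoted in the talk, not measured data:

```python
# Illustrative bug-detection contributions quoted above (percent of bugs found).
contributions = {"static": 10, "formal": 20, "simulation": 65}

found = sum(contributions.values())
gap = 100 - found
print(f"found: {found}%, remaining gap: {gap}%")  # found: 95%, remaining gap: 5%
```

That last few percent of bugs, the hardest and most expensive to reach, is where AI-driven coverage closure is aimed.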
The new Synopsys VSO.ai AI-driven verification solution, which was announced at SNUG Silicon Valley 2023, helps verification teams achieve closure faster and with higher quality by identifying and eliminating redundancies in regressions, automating coverage root-cause analysis, and inferring coverage from RTL and stimulus to identify coverage gaps and provide coverage guidance.
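One way to see what "eliminating redundancies in regressions" can mean is a subsumption check: a test whose coverage points are entirely contained in what the remaining tests already cover adds nothing and can be dropped. The sketch below illustrates the idea only; the test names and coverage points are invented, and real tools weigh runtime, coverage inference, and more.

```python
def prune_redundant(tests):
    """Drop tests whose covered points are a subset of what the
    remaining tests already cover. `tests` maps name -> set of points."""
    kept = dict(tests)
    # Try to remove tests smallest-first so larger tests subsume small ones.
    for name in sorted(tests, key=lambda n: len(tests[n])):
        others = set().union(*(pts for n, pts in kept.items() if n != name))
        if kept[name] <= others:
            del kept[name]          # fully redundant: everything it hits
    return kept                     # is already hit elsewhere

regression = {
    "smoke":      {"reset", "boot"},
    "full_boot":  {"reset", "boot", "irq"},
    "dma_stress": {"dma", "irq"},
}
kept = prune_redundant(regression)
# "smoke" is covered entirely by "full_boot", so only two tests remain.
```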
Semiconductor testing is another area that is benefiting from AI. During the silicon manufacturing process, test engineers must verify that the design is defect-free and will work as intended. Memory built-in self-test (BIST), compression IP, and logic test fabric are among the solutions available to test the logic. Sensors also play an important role. Typically, the data collected is analyzed and looped back through the design cycle for improvements.
Each step of the way, decisions need to be made on parameters to optimize. The goal is to test as much as possible with as few test patterns as possible to manage costs, explained Fadi Maamari, vice president of engineering for hardware analytics and test at Synopsys. At SNUG Silicon Valley 2023, Synopsys announced a new product that optimizes test pattern generation using AI: Synopsys TSO.ai. Designed to reduce the number of test patterns needed while increasing coverage and shortening automatic test pattern generation (ATPG) turnaround time, TSO.ai intelligently automates ATPG parameter tuning, drives consistent, design-specific quality-of-results optimization, and reduces test costs dramatically.
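The pattern-count objective Maamari describes is essentially a set-cover problem: choose the fewest patterns whose detected-fault sets cover all faults. A classic greedy heuristic, pick the pattern that detects the most still-undetected faults, makes a reasonable toy illustration; the pattern and fault names are invented, and production ATPG tools use far more sophisticated methods than this.

```python
def greedy_compact(patterns, faults):
    """Greedy set cover: repeatedly pick the pattern that detects the
    most not-yet-detected faults. `patterns` maps name -> set of faults."""
    undetected = set(faults)
    chosen = []
    while undetected:
        best = max(patterns, key=lambda p: len(patterns[p] & undetected))
        gain = patterns[best] & undetected
        if not gain:
            break  # remaining faults are undetectable by these patterns
        chosen.append(best)
        undetected -= gain
    return chosen, undetected

patterns = {
    "p1": {"f1", "f2", "f3"},
    "p2": {"f3", "f4"},
    "p3": {"f1", "f4", "f5"},
    "p4": {"f2"},
}
chosen, missed = greedy_compact(patterns, {"f1", "f2", "f3", "f4", "f5"})
# Two patterns ("p1" and "p3") cover all five faults.
```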
AI is currently just scratching the surface of the impact that’s possible in the electronic design space. The panelists at SNUG Silicon Valley 2023 agreed that with the rise of natural language models, such as those behind AI chatbots like ChatGPT, along with other opportunities presented by AI, now is an exciting time to be in the industry. There’s more to be done to advance autonomous design, verification, and test, as well as more areas to enhance. Strong electronic design automation (EDA) technologies with a tightly integrated, machine learning-driven loop can be a powerful force, enabling engineers to accomplish more than was ever possible before.
“With the move to FinFET nodes, new problems are emerging,” said Vuk Borich, distinguished architect, Synopsys Circuit Design & TCAD Solutions. “While chips are denser and smaller and there’s more of them, there’s some regularity and some patterns and some things that are amenable to artificial intelligence. So we foresee a great deal of innovation.”
Just looking at analog design, Borich highlighted, one can pinpoint areas that can benefit from an infusion of intelligence.
Beyond electronic systems, another area where AI can accelerate convergence with less designer intervention is in optical design. Optical design is a key enabling technology for applications such as imaging, automotive illumination, and photonic ICs. These applications are highly complex, with a large number of variables and tolerances to consider, which were historically handled with special tools. AI has the potential to unlock new opportunities to co-optimize specialized algorithms, explained William Cassarly, a Synopsys Scientist on the optical solutions team. AI allows exploration of a large portion of the design space, provides new starting points for existing algorithms, and reduces the effort involved in handling discrete cases. In addition, AI offers the potential to enable knowledge transfer between completely different use cases, allowing less experienced designers to produce results that might have only been considered by a designer with significant experience.
As we approach the system level, siloed knowledge across hardware and software teams makes bring-up a complex and costly effort. System-level visibility and automated root-cause analysis are key to faster time to market. Rachana Srivastava, a senior staff R&D engineer in the Synopsys Systems Design Group, noted that AI can enable automated system-level root-cause analysis. Mapping data in an event-based knowledge graph can provide visibility across the system. Applying machine-learning models on this data can generate predictions and a feedback loop for information mining to generate better silicon results.
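The event-based knowledge graph described above can be pictured as a directed graph of causal edges between observed events; one simple form of root-cause analysis then walks backward from a failure to events with no recorded cause of their own. A minimal sketch, with invented event names and edges, and nothing like the ML-driven analysis a real system would layer on top:

```python
from collections import deque

def root_causes(causes, failure):
    """Walk cause edges backward from `failure`; return events with no
    recorded cause. `causes` maps event -> list of events that caused it."""
    seen, roots = set(), set()
    queue = deque([failure])
    while queue:
        event = queue.popleft()
        if event in seen:
            continue
        seen.add(event)
        parents = causes.get(event, [])
        if not parents:
            roots.add(event)        # no known cause: candidate root
        queue.extend(parents)
    return roots

causes = {
    "app_crash":      ["driver_timeout"],
    "driver_timeout": ["dma_stall", "irq_lost"],
    "dma_stall":      ["clock_glitch"],
    # "irq_lost" and "clock_glitch" have no recorded causes.
}
roots = root_causes(causes, "app_crash")
```

A machine-learning layer would then rank such candidate roots by likelihood using historical data, rather than reporting every sourceless event equally.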
Exciting times indeed lie ahead as engineers devise new ways to apply AI and machine learning to workflows across the system stack. Designs that meet PPA and time-to-market targets for next-generation applications will only grow more complex. AI can provide the productivity boost that engineering teams need, while helping them achieve outcomes that were previously unimaginable.