The human condition of innate curiosity—that insatiable thirst for knowledge and answers—lies behind so many of the advances that surround us in our daily lives. Answering big questions and bringing technology solutions to our evolving wants and needs is the fuel that powered the Moore’s Law scaling era. It is also the reason we now find ourselves hurtling headlong into a new era, one defined by even greater scale and systemic complexity.
Designing today’s systems-on-chip (SoCs)—and today’s and tomorrow’s systems of chips—is becoming ever more complex. Methodologies that mitigate increasingly challenging laws of physics; new domain-specific architectures that deliver the needed exponential growth in processing power; cost reductions that enable “smarter everything;” the evolution from 2D to multi-die design—all of these areas demand greater levels of innovation, innovation that can only be delivered through new ways of thinking and new ways of designing.
It turns out that our curiosity and thirst for knowledge and answers is, indeed, the answer itself. A greater understanding of the design flow, and our ability to improve it, stems directly from that curiosity: tapping into the very data that underpins the design process to change it for the better, and using recent advances in big data, analytics, and machine learning to enable more effective, more efficient—indeed, smarter—design.
It’s helpful to recognize the different scenarios that call for this comprehensive approach to enhancing productivity and optimizing the end-to-end design workflow—an area with which Mark Richards, a senior staff product marketing manager in the 草榴社区 Silicon Realization Group, is very familiar.
He recently sat down with some of the top journalists at Forbes, SemiWiki, DesignNews, eeNews Europe, and SemiEngineering, among others, to share his thoughts on the underleveraged wealth of information that can benefit designers today and how teams must leverage data richness in their complete digital design flow.
Ever-shrinking time-to-market windows, geographically dispersed teams, and a wide range of design variables have made the design workflow difficult to track and optimize. Teams are left with limited visibility into the design process, which saps productivity and introduces systemic risk throughout.
As Karl Freund noted at Forbes, “Every chip design team has probably watched this movie before: after months of hard work designing a new product, the team has suddenly lost the recipe, and the latest simulation run veers from the steady progress the team has been making towards tape-out. What went wrong? And how can the team quickly recover to stay on schedule?”
Another issue that organizations face is the loss of knowledge when people and teams inevitably change. In a conversation with Kalar Rajendiran at SemiWiki, Richards noted that capturing observational learnings from one project is not only helpful at the start of the next, but also valuable when debugging problems. This is where, and why, institutional knowledge developed over time plays such an important role.
“Everything is fine until one or more team members leave and/or when many fresh engineers start working on a project. One of the biggest gripes at many companies is the loss of talent along with institutional learnings,” writes Rajendiran.
Speaking to Spencer Chin at DesignNews, Richards examined how design engineers have tried to work around the problem of methodically understanding complex data by resorting to time-consuming, manual ways of tracking it. With limited insight into the depth and breadth of the data at hand, this approach only hurts productivity in the long run.
“We want to accelerate the design debug and optimization process. The design process has long been opaque, with designers and managers typically having a limited view of the entire design process, making it difficult to track and improve. There are a number of gauges inside a design, including power, clock rate, etc. All of these variables are interconnected and affect one another.”
To keep up with the pace of changing customer requirements amid the growing need for actionable, data-driven insights, the semiconductor industry requires new tool flows that enable more efficient design and aid decision-making as we move into the SysMoore era.
Richards believes that offering a 360-degree view of all design activities will help teams make faster decisions by giving them a deeper understanding of run-to-run, design-to-design, and project-to-project trends. In a recent blog post, he called out the old, manual ways of piecing data together:
“A live, 360-degree view of all design activities allows standardization to drive project-wide efficiencies — based on the fact that everything is now being measured. Engineers no longer need to extract data, enter it into spreadsheets, format it (with maybe somewhat arbitrary colors), and paste it into slides to report their daily or weekly progress. How many hours are your team currently spending on this type of task? Too many!”
Preempting this potential loss of information and leveraging data-led insights to enable more efficient workflows and bridge productivity gaps is what led us to develop 草榴社区 DesignDash, a data-visibility and machine-intelligence-guided design optimization and signoff-closure solution.
As a complementary product to the 草榴社区 Digital Design Family and the 草榴社区 DSO.ai™ AI application for chip design, the cloud-optimized DesignDash solution combines engineering R&D with the data that underpins design to serve as an always-on, always-ready design-data companion. Designed with a comprehensive, 360-degree view, it captures every part of the design flow so that a designer can understand the “what,” “why,” and “how” when they need to solve a design problem.
Making the “why” readily identifiable clearly carries particular value for a designer who needs to fix an issue. In his conversation with Nick Flaherty at eeNews Europe, Richards described how a comprehensive view of the design flow allows teams to understand why things happened as they did and to identify causes systematically.
“Our R&D team has developed a number of models on their own data, including connectivity, and use machine learning (ML) to predict what might happen in the tool flow. We can then create apps that, for example, apply suggested constraints on timing and the models continue to train over time. You can also add in your own ML models with a Python interface for custom workflows and there are rule-based workflows that work out of the box and are shareable. This means I can start to find trends, do data mining, and look at how to improve the design flow.”
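To make that idea concrete, here is a minimal sketch of the kind of custom workflow Richards describes: training a model on per-run flow metrics to predict final timing slack. The metric names, the "flow_run_metrics.csv" export, and the use of scikit-learn are illustrative assumptions, not the actual DesignDash Python interface.

```python
# Illustrative sketch only: the column names and CSV export below are
# hypothetical stand-ins for metrics a flow-data platform might expose;
# this is not the DesignDash API.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Each row represents one place-and-route run; columns hold metrics
# collected during the flow (utilization, target clock, congestion, ...).
runs = pd.read_csv("flow_run_metrics.csv")
features = runs[["utilization", "target_clock_ns", "congestion_overflow"]]
target = runs["final_worst_negative_slack_ns"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=0
)

# Train a simple regressor; in practice such a model would keep
# retraining as new run data lands.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Predict final slack for held-out runs so risky configurations can be
# flagged, and constraints adjusted, before a full run is wasted.
print(model.predict(X_test)[:5])
```

A rule-based workflow of the sort the quote mentions could then act on such predictions, for example by suggesting tighter timing constraints for runs predicted to miss their slack targets.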
Keeping the industry’s widening talent gap in mind, Richards continued the conversation with journalist Marie C. Baca, elaborating on how the labor crunch has finally convinced chip designers to bet serious money on big data and realize productivity gains through intelligent use of data:
“It’s like compound interest in your bank account. Even small efficiencies can compound over time, especially if you can make every engineer at the company more efficient. Then you can start to address those holes around not having enough engineers or not having enough experienced engineers. A lot of the macroeconomic changes are going to take 20 years to come out of the pipeline. This kind of technology can make everyone more effective today and help the shortfall that’s happening now.”
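To put rough numbers on the compounding analogy, here is a small illustrative calculation; the 1%-per-iteration figure is an assumption chosen for the example, not a 草榴社区 claim:

```python
# If each project iteration makes the team 1% more efficient, the gains
# compound multiplicatively, just like interest in a bank account.
gain_per_iteration = 0.01   # assumed 1% efficiency gain per iteration
iterations = 50             # assumed number of iterations

throughput = (1 + gain_per_iteration) ** iterations
print(f"Relative throughput after {iterations} iterations: {throughput:.2f}x")
# -> Relative throughput after 50 iterations: 1.64x
```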
Thanks to knowledge-led design, actionable insights, and effective debugging and optimization, designers can now unlock the potential of design-flow data to make better decisions, increase the number of “implementable” designs, and explore a broader architectural solution space than ever before.