The term “common data model” is used in many aspects of software engineering. At a high level, software systems can be thought of as having two parts: the unique algorithms and code that implement the application itself, and the logical infrastructure that stores, accesses, and operates on the data the application uses.
A common data model aims to standardize that logical infrastructure so that many related applications can operate on and share the same data.
When a common data model is applied to electronic design automation (EDA) software, the goal is to facilitate sharing of information between related parts of a design flow that typically operate sequentially. One example is a logical/physical implementation flow that can include operations such as logic synthesis, floorplanning, placement, clock tree synthesis, routing, and timing and power optimization.
Another example is a simulation and verification flow that can include operations such as testbench development, functional simulation, formal and static verification, coverage analysis, and debug.
A common data model contains a uniform set of metadata, allowing data and its meaning to be shared across applications. In addition to this uniform metadata, a common data model defines a set of standardized, extensible data schemas covering items such as entities, attributes, semantic metadata, and relationships.
Once all the elements of the common data model are defined, methods to access and operate on the data are developed so that every application can use the same standardized procedures.
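To make these ideas concrete, the sketch below shows, in Python, what a small common data model might look like: standardized entities (cells and nets), attributes carrying semantic metadata, relationships between objects, and a shared set of access methods. It is a minimal illustration only; all class and attribute names are hypothetical and do not correspond to any vendor's actual API.

```python
# Hypothetical sketch of a common data model: standardized entities,
# attributes with semantic metadata, relationships, and shared access
# methods that every tool in the flow would use.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class Attribute:
    """An attribute value plus its semantic metadata (units, producing stage)."""
    value: object
    units: Optional[str] = None
    produced_by: Optional[str] = None       # e.g. "synthesis", "placement"


@dataclass
class Cell:
    """An entity: one instance in the design."""
    name: str
    attrs: Dict[str, Attribute] = field(default_factory=dict)


@dataclass
class Net:
    """An entity whose driver/loads fields capture relationships to cells."""
    name: str
    driver: str
    loads: List[str] = field(default_factory=list)
    attrs: Dict[str, Attribute] = field(default_factory=dict)


class DesignModel:
    """Standard access methods shared by every application in the flow."""

    def __init__(self) -> None:
        self.cells: Dict[str, Cell] = {}
        self.nets: Dict[str, Net] = {}

    def add_cell(self, name: str) -> Cell:
        self.cells[name] = Cell(name)
        return self.cells[name]

    def add_net(self, name: str, driver: str, loads: List[str]) -> Net:
        self.nets[name] = Net(name, driver, loads)
        return self.nets[name]

    def set_attr(self, obj, key: str, value, units=None, produced_by=None) -> None:
        obj.attrs[key] = Attribute(value, units, produced_by)

    def get_attr(self, obj, key: str, default=None):
        attr = obj.attrs.get(key)
        return attr.value if attr is not None else default
```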
A common data model opens many opportunities to increase the efficiency of the software development process. New algorithms can be developed without defining bespoke data structures to store and operate on the required information: the common data model already provides standard structures for storing that information and standard access methods for operating on it.
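Continuing the hypothetical DesignModel sketch above, a new analysis algorithm can be written purely against the shared access methods, with no data structures of its own:

```python
def report_high_fanout_nets(design: DesignModel, limit: int = 32) -> List[str]:
    """A new analysis pass written only against the shared accessors."""
    return [net.name for net in design.nets.values() if len(net.loads) > limit]


# Build a tiny design through the standard access methods.
design = DesignModel()
design.add_cell("u_clk_buf")
design.add_net("clk_int", driver="u_clk_buf",
               loads=[f"reg_{i}" for i in range(64)])

print(report_high_fanout_nets(design))   # -> ['clk_int']
```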
There is another very important aspect of design flow integration that is difficult to implement without a common data model. It is often useful for early-stage tools in an EDA flow to access estimates from late-stage tools to improve the accuracy of the models used in the early stage. One example is accessing more accurate floorplans and routing delays from back-end placement and routing tools while performing logic synthesis.
A common data model allows this data sharing to occur. The process of using late-stage information to inform early-stage decisions is called a shift-left approach.
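As a rough illustration of how such shift-left sharing might look, and again continuing the hypothetical DesignModel sketch with an illustrative routed_delay_ps attribute, an early-stage delay estimator can prefer a value written back by a later routing stage and fall back to a crude wireload guess when no late-stage data exists yet:

```python
def estimated_net_delay_ps(design: DesignModel, net_name: str) -> float:
    """Early-stage estimate that prefers back-annotated late-stage data."""
    net = design.nets[net_name]
    routed = design.get_attr(net, "routed_delay_ps")   # written back by routing
    if routed is not None:
        return routed                                  # late-stage data available
    return 5.0 + 2.0 * len(net.loads)                  # crude early-stage wireload guess


print(estimated_net_delay_ps(design, "clk_int"))       # no routing data yet -> 133.0

# A later routing run annotates the shared model; the early stage re-reads it.
design.set_attr(design.nets["clk_int"], "routed_delay_ps", 87.0,
                units="ps", produced_by="routing")
print(estimated_net_delay_ps(design, "clk_int"))       # -> 87.0
```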
A common data model also substantially improves the efficiency of the design flow. Without one, data sharing becomes very difficult: design databases must be exported from internal models to standard readable formats and passed to the next tool, which then compiles them into its own data model. This process increases runtime and can result in loss of information. It also consumes far more disk space due to data redundancy, which can create challenges for NFS file systems.
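The contrast can be sketched as two handoff styles, using the hypothetical model above and a made-up JSON interchange format: one path exports, re-parses, and rebuilds the data (duplicating it on disk and dropping attributes along the way), while the other simply hands the next stage the same in-memory object:

```python
import json
import os
import tempfile


def handoff_via_files(design: DesignModel) -> DesignModel:
    """No common model: export to an interchange file, then re-parse it."""
    dump = {name: {"driver": net.driver, "loads": net.loads}
            for name, net in design.nets.items()}
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(dump, f)                   # duplicate copy written to disk
    rebuilt = DesignModel()                  # next tool rebuilds its own model
    with open(path) as f:
        for name, d in json.load(f).items():
            rebuilt.add_net(name, d["driver"], d["loads"])
    os.remove(path)
    return rebuilt                           # cell data and attributes were lost


def handoff_in_memory(design: DesignModel) -> DesignModel:
    """Common model: the next stage operates on the same object directly."""
    return design
```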
A common data model allows early- and late-stage tools in an EDA flow to share information. This facilitates a shift-left approach to design, which results in a more convergent, predictable design process since early-stage results can now take late-stage effects into account. Physical synthesis, where placement information and routing delays are modeled more accurately during logic synthesis, is one example of this approach. Propagating RTL design intent throughout the flow has significant benefits as well. For example, RTL design intent can make multi-bit re-banking at the late place-and-route stage much easier; without this intent, the tool must search a random sea of logic for registers to bank, as shown in the sketch below. A common data model also enables a “design memory” effect, where approaches that did not work well can be recorded and avoided in similar future cases.
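Here is a small sketch of intent propagation, continuing the hypothetical example with an illustrative rtl_bus attribute: synthesis tags each register with the RTL bus it came from, and a later multi-bit banking step groups registers by that tag instead of searching an undifferentiated sea of flops:

```python
from collections import defaultdict


def tag_rtl_bus(design: DesignModel, cell_name: str, bus: str) -> None:
    """Synthesis records which RTL bus a register belongs to."""
    design.set_attr(design.cells[cell_name], "rtl_bus", bus,
                    produced_by="synthesis")


def multibit_bank_candidates(design: DesignModel) -> Dict[str, List[str]]:
    """Place-and-route groups registers by the recorded RTL intent."""
    groups = defaultdict(list)
    for cell in design.cells.values():
        bus = design.get_attr(cell, "rtl_bus")
        if bus is not None:
            groups[bus].append(cell.name)
    return dict(groups)


for i in range(4):
    design.add_cell(f"data_reg_{i}")
    tag_rtl_bus(design, f"data_reg_{i}", "data_reg[3:0]")

print(multibit_bank_candidates(design))
# -> {'data_reg[3:0]': ['data_reg_0', 'data_reg_1', 'data_reg_2', 'data_reg_3']}
```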
Convergent design flows result in a more efficient design process with less re-work and a shorter time-to-market. This has a significant positive impact on the lifetime profitability of the system using the chip being designed. Pervasive use of a well-developed common data model offers another significant benefit: every chip design project is different, and every design group has its own way of addressing those unique challenges. If the team uses tools built on a well-developed common data model, the design group can develop its own flow and its own strategy for sharing early- and late-stage data.
The result is a fully targeted and customized design flow. We call this a customer-defined, hyper-convergent design flow. It represents one of the most potent tools available to target the precise needs of a complex chip design project.
Synopsys Fusion Compiler delivers a singular RTL-to-GDSII digital implementation solution with a common data model.
Fusion Compiler provides an innovative RTL-to-GDSII flow that enables a new era in digital design implementation. The solution offers new levels of predictable quality-of-results to address the challenges presented by the industry’s most advanced designs. Its unified architecture shares technologies across the RTL-to-GDSII flow to enable a hyper-convergent design flow that delivers 20% better quality-of-results and 2x faster time-to-results.
Benefits of this technology include: