Building a Performance Validation Continuum

We had another great Synopsys Virtual Prototyping Day 2025, with many insightful presentations from our virtual prototyping and systems modeling user community. One of the highlights in the System Architecture Design & Exploration track was the presentation by Vikrant Kapila, Chief Architect for the Agilex product line at Altera.

Vikrant presented a flow for seamless tracking of SoC KPIs from early architecture design, via simulation and emulation, to silicon validation. His key takeaways are:

  1. The performance validation continuum is a multi-stage, collaborative process that benefits from early, model-based exploration and tight feedback loops.
  2. Investing in test portability and harmonized KPI tracking can yield significant efficiency gains.
  3. Collaboration with tool vendors and internal cross-functional teams is essential for managing increasing SoC complexity.

Vikrant covered a lot of ground, starting with the challenges of building a scalable SoC portfolio for a broad range of application domains. He identified gaps in the current performance validation strategy, where the individual steps are not well connected today. Vikrant is building a seamless validation methodology in which results from early analysis are confirmed in later verification and validation steps. In more detail, his presentation covered the following topics:

1. The Importance & Breadth of FPGAs:

  • FPGAs are used across a wide range of industries, from industrial and broadcast to medical, military, wireless, and data centers.
  • This diversity necessitates a broad portfolio of solutions (e.g., Altera Agilex 3, 5, 7, and 9) to meet varying requirements for performance, power, logic capacity, IO density, and cost.

2. The Challenge:

  • Building a unified, scalable, and verifiable architecture is crucial to efficiently serving these diverse markets without incurring prohibitive costs.
  • Modular, plug-and-play architectures help optimize development and support market needs.

3. Performance Validation Continuum:

  • IP Level: Early validation of complex IPs, like network-on-chip, before full SoC integration.
  • SoC Level: Benchmarking performance when multiple IPs are integrated, including running bare metal and OS-level benchmarks.
  • Solution Stack Level: Validating performance for specific end-use cases (e.g., robotics, automotive, AI).

4. Methodology:

  • Use of task graphs to capture diverse use cases and make them executable in Synopsys Platform Architect.
  • Early performance analysis uses system performance models, many of which are available out-of-the-box and some of which are custom-built.
  • Correlation between performance model results and RTL simulations to ensure model fidelity.
  • Emulation is used for full stack validation, correlating findings, and feeding back insights to improve models.
  • Post-silicon validation leverages performance models for root cause analysis and optimization.
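
To make the task-graph idea concrete, here is a minimal Python sketch of a task graph as an executable workload model. This is an illustrative abstraction only, not Platform Architect's actual API; the task names and durations are invented, and a real model would also account for resource contention and mapping.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    duration_us: float          # hypothetical per-task execution cost
    deps: list = field(default_factory=list)

def end_to_end_latency(tasks):
    """Compute the critical-path latency of a dependency-ordered task graph."""
    finish = {}
    for t in tasks:  # assumes tasks are listed in topological order
        start = max((finish[d] for d in t.deps), default=0.0)
        finish[t.name] = start + t.duration_us
    return max(finish.values())

# Toy use case: a capture stage feeding two parallel processing tasks,
# then a merge stage -- the kind of pattern a task graph captures.
graph = [
    Task("capture", 10.0),
    Task("filter",  25.0, deps=["capture"]),
    Task("detect",  40.0, deps=["capture"]),
    Task("merge",    5.0, deps=["filter", "detect"]),
]
print(end_to_end_latency(graph))  # 55.0: capture -> detect -> merge
```

Even at this level of abstraction, the graph exposes the critical path and the available parallelism, which is what early architecture exploration trades off against interconnect and memory resources.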

5. Evolution of Validation Approaches:

  • Initial use of Platform Architect focused on interconnect and memory subsystem exploration.
  • As CPU subsystems were integrated (e.g., ARM cores in Agilex 5), complexity increased, requiring co-simulation of RTL and SystemC models.
  • Synopsys’ co-simulation framework enabled integrating RTL of CPU subsystems with SystemC models for other IPs, improving early detection of system-level issues.

6. Practical Considerations:

  • Importing RTL into Platform Architect is straightforward, but co-simulation (RTL + SystemC) often reveals mismatches or abstraction gaps that require effort to resolve.
  • Co-simulation is valuable for left-shifting validation but comes with trade-offs in simulation speed and complexity.

7. Test Portability & Correlation:

  • Emphasis on ensuring test patterns and configurations are portable across validation stages (performance model, RTL, emulation, post-silicon) to avoid redundant work and improve debug efficiency.
  • Example: Using traces from RTL simulations to create equivalent task graphs for performance models to ensure apples-to-apples correlation.
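
The trace-to-task-graph step can be sketched as follows. This is a hedged illustration with an invented trace format (task name plus start/end timestamps); the actual RTL trace schema and import flow are tool-specific. The point is that durations observed in RTL parameterize equivalent task-graph nodes, so both sides run the same workload.

```python
# Hypothetical RTL trace records: (task_name, start_ns, end_ns).
trace = [
    ("dma_read",  100, 240),
    ("compute",   240, 910),
    ("dma_write", 910, 1020),
]

def trace_to_task_durations(records):
    """Derive per-task durations from observed trace timestamps, so the
    performance model replays the same workload the RTL executed."""
    return {name: end - start for name, start, end in records}

durations = trace_to_task_durations(trace)
print(durations)  # {'dma_read': 140, 'compute': 670, 'dma_write': 110}
```

With both sides driven by the same durations and ordering, any remaining KPI delta points at a genuine modeling gap rather than a workload mismatch.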

8. Next Steps and Ongoing Work:

  • We are striving for equivalence of test patterns and KPI definitions across platforms.
  • We are building dashboards for tracking and harmonizing data across teams and validation stages.
  • We are expanding our focus from performance alone to power and performance co-optimization.
  • We are enabling customers and partners with access to performance models for early design exploration.
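
As a minimal sketch of what harmonized KPI tracking could look like, the snippet below normalizes KPI records from different validation stages into one schema and reports each stage's deviation from the early model prediction. The record format, KPI name, and values are all invented for illustration; a real dashboard would pull from the respective tool databases.

```python
# Hypothetical harmonized KPI records from different validation stages.
kpis = [
    {"stage": "model",     "kpi": "ddr_bw_gbps", "value": 38.4},
    {"stage": "emulation", "kpi": "ddr_bw_gbps", "value": 36.1},
    {"stage": "silicon",   "kpi": "ddr_bw_gbps", "value": 35.7},
]

def deviation_vs_model(records, kpi):
    """Report each stage's percent deviation from the model prediction --
    the kind of cross-stage view a KPI dashboard would surface."""
    baseline = next(r["value"] for r in records
                    if r["stage"] == "model" and r["kpi"] == kpi)
    return {r["stage"]: round(100.0 * (r["value"] - baseline) / baseline, 1)
            for r in records if r["kpi"] == kpi}

print(deviation_vs_model(kpis, "ddr_bw_gbps"))
```

Tracking one KPI definition across model, emulation, and silicon is what turns the separate validation steps into a continuum: a deviation that grows from stage to stage flags exactly where a model needs refinement.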

In case you missed it, Vikrant's presentation is available on the Synopsys Virtual Prototyping Day 2025 event website.

More articles by Tim Kogel