SV out, Python in. Do you dare?
With the advent of Verisity Specman (and later SystemVerilog), verification came out of the Middle Ages. It took its place as a discipline in its own right.
Taken seriously at last, verification quality improved; my guess is productivity increased too. That said, it would be a mistake to assume that before Specman all we used were hammers and chisels - islands of high-tech verification have always existed.
But now we face another crisis - a shortage of verificators. As stated in a previous post, it is partly or mostly due to the bad tools we use today, most notably SystemVerilog and UVM. These bad tools hamper productivity and innovation.
It is pretty clear that the verification language should be a real, powerful, high-level, general-purpose language. And it doesn't have to look like Verilog. C and C++ are too low-level, and building environments with them is too time-consuming and complex an endeavour. They tend to "Bus Error" a lot.
Specman has many fans, but it is a closed language and lacks the features and breadth of a general-purpose one. Made-up languages (like Specman and SV) have a tendency to "help" us with tailor-made structures, yet a true high-level language can do all of these hands down.
So what do I want to bring to the table? Let's start by agreeing that the language in which the design itself is expressed doesn't have to be the one the verification is built in (see e). Moreover, if the languages are different, DUT recompilation is cheap: plain RTL Verilog compiles very fast, and there is no test-bench compilation overhead on top.
My language of choice for building a verification test bench is Python. Combining Python with Verilog opens up an opportunity to use free tools throughout the whole process. It also allows an easy switch between commercial simulators, because the simulator sees just the DUT, which for the most part is plain RTL. All of this may save money and some grey hair, and let you watch your kids grow up.
Today we have a connection between Python and all commercial and free simulators, implemented using the VPI. Over ten years of experience also shows that Python can do the job (and do it well). You can easily combine, in a single language, on-the-fly generation and checkers with offline generation, coverage, and checkers.
Python is a rich, powerful, succinct and clear language. It is very stable and has practically no bugs. Once connected to Verilog, powerful math libraries are at your disposal, as good as or better than Matlab (and free) - for all your OFDM algorithms. You can use graphical libraries to plot results on the fly. You can even ask it to send an SMS when a bug is found. Lastly, debugging in a high-level language is so much more productive.
As an example, if your project is about encryption or compression, there are "golden models" of all the popular protocols. Database access, format parsers, and web access are among the features of a full-fledged language. All of these and more make for efficient test-bench development, and all of them run without compilation, right out of the box.
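To make the "golden models for free" point concrete, here is a minimal sketch that uses the standard zlib module as a reference model for a hypothetical compression block. The function names and the round-trip checking policy are mine, for illustration only:

```python
import zlib

def golden_compress(payload: bytes, level: int = 6) -> bytes:
    """Reference model: what a (hypothetical) compression DUT should
    be equivalent to, behaviour-wise."""
    return zlib.compress(payload, level)

def check_roundtrip(payload: bytes, dut_output: bytes) -> bool:
    """Checker: the DUT output must decompress back to the original
    payload. We compare behaviour, not byte-exact encoder output,
    since two valid encoders may legally differ."""
    try:
        return zlib.decompress(dut_output) == payload
    except zlib.error:
        return False          # corrupt stream fails the check

# Stand-in for a DUT response (here the golden model plays the DUT):
stimulus = b"OFDM" * 100
assert check_roundtrip(stimulus, golden_compress(stimulus))
```

The same pair of functions can drive an on-the-fly checker against live simulation traffic or chew through dumped transactions offline, which is the whole point.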
Another feature of a good language is that the same piece of code can work in conjunction with running Verilog, in post-processing, or in pre-processing - thus saving licenses and splitting the work among servers. For example, a golden model of a CPU can be used to verify the RTL of that processor, but can also serve as a software-development tool.
Python code has the distinct advantage of being readable by other people, not only by its developer (which is why I like it better than Perl). Of course, any language can be written poorly, but in Python it is so much easier to do the right thing.
Lately I did a few experiments with writing VIPs for complex protocols (Interlaken, a proprietary SerDes, and one of the Ethernet varieties). For me it answered one objection to using an open-source language for verification, namely "What about VIPs?". It is easy, because you focus on understanding the protocol, not on the ways to implement it in some grotesque language.
There are several cheap, free, or open-source simulators (Icarus, CVC64, ...). Thus the bulk of the verification can be done without any expensive licenses at all.
Adding Python to a Verilog simulator does not harm performance significantly. Of course, it also depends on how much processing the Python code performs. But once you take into account the time saved on recompilations, the total is a net reduction in run time.
To make Python serve as a verification test bench, we need to define how to do:
1. coverage
2. constraint solver, randomizer
3. UVM-like conventions.
Coverage.
For starters, the coverage features of SV can be used as is. However, given the clumsiness of SV coverage definitions, we can easily replace them with Python. The key is to know what you are doing, but that again is true in every technical situation. Databases, such as a lightweight SQL engine like SQLite, can be used to manage coverage data across runs.
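As an illustration of how little machinery hand-rolled functional coverage needs, here is a minimal sketch; the class and bin names are hypothetical, not taken from any existing package:

```python
from collections import Counter

class CoverPoint:
    """Minimal functional-coverage point: named bins, each a predicate."""
    def __init__(self, name, bins):
        self.name = name
        self.bins = bins            # {bin_name: predicate over the sample}
        self.hits = Counter()

    def sample(self, value):
        for bin_name, pred in self.bins.items():
            if pred(value):
                self.hits[bin_name] += 1

    def holes(self):
        """Bins never hit - the interesting part of the report."""
        return [b for b in self.bins if self.hits[b] == 0]

# Hypothetical packet-length coverage:
pkt_len = CoverPoint("pkt_len", {
    "small":  lambda n: n < 64,
    "medium": lambda n: 64 <= n <= 1500,
    "jumbo":  lambda n: n > 1500,
})
for n in (40, 512, 1024):
    pkt_len.sample(n)
print(pkt_len.holes())     # "jumbo" was never exercised
```

Dumping `hits` into an SQLite table per run and merging with a `GROUP BY` would cover the multi-run case mentioned above.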
Constraint solver.
My assertion is that a constraint solver is not needed; a simple random package is enough, at least in 99% of cases. It is not that the constraint solver / randomizer goes unused - it is used just because it is there. In the majority of cases, however, there is no real need for a full-scale constraint solver. I have challenged many verificators to present me with a real case. None appeared. The best I got was an eight-queens chessboard solver. Didn't see many tapeouts of that one.
Sure, there are cases where a real constraint solver is the best option - I just never saw one. And if you do hit one, there are solver packages in Python. For everything else, "import random" has enough algorithms and power to last many chips.
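Here is what "import random" looks like in practice: legal values are constructed directly instead of being solved for. The packet fields below are illustrative, not from any real protocol:

```python
import random

def random_packet(rng: random.Random) -> dict:
    """Constrained-random packet without a constraint solver: each
    field's legal range is constructed directly rather than solved for."""
    # Weighted type distribution, SV "dist"-style:
    kind = rng.choices(["data", "ctrl", "error"], weights=[8, 2, 1])[0]
    # "length is a multiple of 4 in [64, 256]": construct, don't solve.
    length = 4 * rng.randint(16, 64)
    payload = bytes(rng.getrandbits(8) for _ in range(length))
    return {"kind": kind, "length": length, "payload": payload}

rng = random.Random(1234)      # seeded for reproducible regressions
pkt = random_packet(rng)
assert pkt["length"] % 4 == 0 and 64 <= pkt["length"] <= 256
```

Seeding a private `random.Random` instance per generator keeps runs reproducible from the command-line seed, the same way a simulator seed does.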
Conventions.
There are common structures in verification: scoreboards, randomizers, queues, generators and what not. All of these are easily implemented in Python, thanks to its object-oriented design and built-in data structures. There is no need for deep inheritance chains, because the language doesn't call for them.
What is needed is a set of guidelines on how to implement each of these structures, so they will be easily recognizable by anybody. My guess is that a good cookbook will do the trick.
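As an example of such a guideline, a scoreboard fits in a couple of dozen lines. This is a sketch of one possible convention, not code from the library mentioned below:

```python
from collections import deque

class Scoreboard:
    """In-order scoreboard: the golden model pushes expected transactions,
    the monitor pushes actual ones, and mismatches are counted."""
    def __init__(self):
        self.expected = deque()
        self.errors = 0

    def push_expected(self, item):
        self.expected.append(item)

    def push_actual(self, item):
        if not self.expected:
            self.errors += 1            # unexpected transaction
        elif self.expected.popleft() != item:
            self.errors += 1            # data mismatch

    def drain_check(self):
        """End-of-test check: no errors, nothing left outstanding."""
        return self.errors == 0 and not self.expected

sb = Scoreboard()
sb.push_expected(0xCAFE)
sb.push_actual(0xCAFE)
assert sb.drain_check()
```

No base classes, no factories, no phasing: a cookbook entry saying "a scoreboard exposes push_expected / push_actual / drain_check" is enough for anybody to recognize it.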
So if you are fed up with the SV bullshit, ping me to get a complimentary copy of the connection document and library.
It may save your startup.
Nobody ever got fired for choosing SystemVerilog and UVM.
Ever since this post was published, I have been on a mini-quest to find out why people don't ditch SV and switch to something more humane. I gently probe managers and verificators, and patterns start to emerge.
Here are the most prominent claims:
1. SV+UVM is the industry standard, we cannot fight that.
For managers it is an easy choice: follow the herd, take the easy route - why get blamed later for being inventive? "SV+UVM is the industry standard" is true, if you want to stay at or below the industry average. And as always with being a manager: "Anything looks easy if you don't have to do it yourself."
2. UVM+SV was invented by smart people. How can we top that?
True, but the ones who invented it do not use it much themselves. And there is a market they profit from - fixing the problems and adding complexity. It is like big pharma preferring keep-you-barely-alive drugs over cures.
3. The available verificators know UVM. That is what's on the market.
For one, judging by the feedback I get, it is pretty hard to recruit verification crews whatever the language. For two, some of the candidates are Specman-e people, who are also hard to come by.
My take is that a good worker can learn anything and, given better tools, will increase his output. A mediocre one will be mediocre in any setting.
4. It is complicated therefore it must be sophisticated and be the future.
A claim made especially by the career-aware younger guys and girls. Bad for them. You should always peek outside the box; the alternative is to remain part of the faceless.
5. Verification manager: it makes me look "professional".
Enough said.
6. In Bulgaria/Romania/India they all know only UVM+SV.
My impression is that, given a chance, they (the ones I talked to) would prefer a better way. But sitting mostly behind service providers, they don't get to choose.
7. We cannot advertise "we do it differently".
Local service groups cannot, of course, advertise anything but the perceived industry standard.
So we are doomed to be stuck with SV+UVM. Just like GB will forever be part of the EU.
One more Python advantage: very long simulations produce huge log files. Using the standard logging package, it was easy to reset these log files every now and then, whenever they contained no useful info - no errors, warnings, or irregular behaviour. Of course, this advantage is shared by any general-purpose language used as a verification test bench.
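A sketch of that trick with the standard logging module: an observing filter remembers the worst level seen, and a periodic checkpoint truncates the file when nothing interesting happened. The class and function names are mine:

```python
import logging

class WorstLevelFilter(logging.Filter):
    """Tracks the worst (highest) log level seen since the last checkpoint."""
    def __init__(self):
        super().__init__()
        self.worst = logging.NOTSET

    def filter(self, record):
        self.worst = max(self.worst, record.levelno)
        return True          # never suppress anything, just observe

def checkpoint(handler, tracker):
    """Reset the log file if nothing above INFO was logged since the last
    checkpoint - e.g. call this every N simulated milliseconds."""
    if tracker.worst < logging.WARNING:
        handler.acquire()
        try:
            handler.stream.seek(0)
            handler.stream.truncate()   # throw away the boring traffic
        finally:
            handler.release()
    tracker.worst = logging.NOTSET
```

Attach the filter to the logger and the handler to a `logging.FileHandler`; if a run stays boring, the file never grows past one checkpoint interval.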
I'd love to try this out.
Could you please provide a pointer to the Verilog-Python package and the command line used with the simulator?
We have built our flow purely on Python! We even do the design in Python! Send me a note and let's talk!
A fresh concept for digital verification - I would like to learn more about it. However, the weak spot that immediately pops out is the VPI: there aren't a lot of people in the industry with enough VPI/API knowledge.