Verification 101: What is Verification all about?
We often assume that everyone knows the basics, but let’s face it: we have all, at some point in our careers, been handed a task we have never performed before, and for a while we stand (or sit) there feeling lost while everyone else in the meeting knows what the acronyms stand for, knows the basics, and sounds really intelligent. After a short while you begin to pick up the basics, and once you are up to speed you realize that many of those people didn’t really understand it either; they were just muddling through, hoping that nobody would notice.
This series of short articles will present the basics for someone just coming into the functional verification field. It will attempt to give you a quick boost that will send you in the right direction.
So let’s get down to it. First of all – what is verification?
The cost of creating a new design is large, very large. While it was possible in the very early days of chip design to get it right the first time, design size and complexity soon grew to the point where an engineer was very likely to make some mistakes. In order to locate those mistakes we have to exercise the design and see if it responds in the correct way. This is the act of verification. When a problem is found, we move into a debug phase where we work out what caused the problem and decide how to fix the design. Then we go back to verification until we gain enough confidence that there are no more bugs left in the design. Today, system complexity is such that it is not possible to exercise every aspect of a design, so we have to be very selective about the verification that is performed. It is not unusual today for more time, people and money to be spent on verification than on the design itself, and even with this large expenditure, most designs are first fabricated with several bugs still in them. With any luck these bugs do not completely stop the design from working, and workarounds can be found so that a first release of the product is possible.
In the early days of verification things were very informal. It normally involved exercising a model and looking at the waveforms produced. The designer would decide whether they were right or wrong and, if right, move on to the next experiment. This is no longer practical because of the number of tests and because it is too difficult to rerun the experiments to see if anything has changed. Today, verification has become a lot more structured.
Fundamentally, verification is the comparison of two models. The argument goes that if those two models were derived independently of each other, and they functionally match, there is a good chance that they are both right. So what do we mean by independently? We assume two different people who each read the specification and write their model without directly discussing it with each other. Those two people generally come from the design team and the verification team. So why do I say “not directly discussing it”? If there is a problem with the specification, such as an ambiguity in the way something is defined, you want that problem to become apparent during the verification process. You do not want the way each of them writes their model to be influenced by a side discussion, unless it directly results in a clarification of the specification. If they find a problem in the specification, discussing it and correcting the specification so that the ambiguity is removed is perfectly fine. In fact, that is itself an act of verification: an informal review of the specification.
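To make this concrete, here is a tiny sketch in Python rather than in a hardware description language; the specification, the function names and the bit-counting behavior are all invented purely for illustration. The point is simply that two people, working from the same spec, write two models in different ways:

```python
# Two independently written models of the same (invented) spec:
# "the output is the number of 1 bits in an 8-bit input".

def design_model(data: int) -> int:
    """The designer's version: loop over the bits, much as the RTL might."""
    count = 0
    for bit in range(8):
        count += (data >> bit) & 1
    return count

def reference_model(data: int) -> int:
    """The verification engineer's version: deliberately written a different way."""
    return bin(data & 0xFF).count("1")
```

Because the two models were written differently, a bug in one is unlikely to be reproduced in the other, which is exactly what makes comparing them useful.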
So, now we have two models. How do we compare them?
There are two primary ways, generally called static and dynamic verification. Let’s start with dynamic, since it is the most commonly used. Dynamic verification is based on a simulator, emulator, or prototype (we will look at these in detail in another posting). These methods exercise the model by sending sample data into it and checking the outputs to see what the model did. If we send in enough input data, our confidence grows that the model always does the right thing. The input data stream, usually called stimulus, is constructed so that it takes the model into various parts of the code, and we can compare the output data, usually called the response, between the two models. When a difference is found, it means either that the design model is incorrect, that the verification model is incorrect, or, as we have already implied, that there is a problem with the specification. Each set of data that is sent into the models is called a test, and the comparison between the two models is done by a checker. The stimulus can be constructed to take the design into a specific area of functionality, normally called a directed test, or it can be based on randomized data streams. We don’t just send in any random data; that is unlikely to fully exercise the model. Instead we use controlled randomization, usually referred to as pseudo-random, constrained-random or directed-random testing.
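Continuing the invented example from the sketch above, a constrained-random test and its checker might look something like this; the particular corner values and the 20% bias towards them are assumptions made purely for illustration, not a recommendation:

```python
import random

# design_model and reference_model are the two functions from the earlier sketch.

def constrained_random_stimulus(n_tests: int):
    """Generate 8-bit stimulus values, biased so that corner cases appear often."""
    corners = [0x00, 0x01, 0x80, 0xFF]
    for _ in range(n_tests):
        if random.random() < 0.2:        # constrained: 20% of the time pick a corner case
            yield random.choice(corners)
        else:
            yield random.randrange(256)  # otherwise any legal 8-bit value

def run_tests(n_tests: int = 1000) -> None:
    """Drive the same stimulus into both models and compare the responses."""
    for data in constrained_random_stimulus(n_tests):
        expected = reference_model(data)   # response of the verification model
        actual = design_model(data)        # response of the design model
        if actual != expected:             # the checker
            raise AssertionError(f"Mismatch for stimulus {data:#04x}: {actual} != {expected}")

run_tests()
```

A directed test would simply replace the random generator with a fixed list of inputs chosen to exercise one specific piece of functionality.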
Static verification, often called formal verification, is a mathematical proof that the two models are identical under all conditions. No stimulus is necessary, because all possible stimuli are considered. The second model (the verification model) is usually constructed in a different way, using what are called properties. A property defines a behavior that the design must always exhibit; alternatively, it may define something that must never happen.
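For a model as small as the toy example above, “all conditions” can literally be enumerated, which is only a crude stand-in for what a real formal tool proves mathematically rather than by brute force. The property checked here (the count never exceeds 8) is, again, just an invented example:

```python
def prove_exhaustively() -> None:
    """Check every possible 8-bit input - a toy stand-in for a formal proof."""
    for data in range(256):                       # every possible stimulus
        # Equivalence: the two models must agree for every input, not just sampled ones.
        assert design_model(data) == reference_model(data)
        # A property: something that must always be true of the design.
        assert design_model(data) <= 8            # the count can never exceed 8
        # A property can equally state something that must never happen,
        # e.g. the count is never negative.

prove_exhaustively()
```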
Verification is an attempt to maximize the quality of the design in the most effective manner, and the way this is done will differ from company to company and sometimes from product to product. For example, we do not need the same level of quality for a cheap child’s toy as for an implantable medical device. This means there is no one right way to do verification, and many people have called it an art rather than a science. In the past, verification engineers were seen as the poor cousins of the design team, whereas today a good verification engineer is worth their weight in gold, even with the price of gold skyrocketing to levels we have never seen before!
Brought to you by Brian Bailey
Verification 101-2 Directed and Random testing