---
title: Verification and Test Talk
date: November 8th, 2023
author:
---
Intel released the Pentium lineup in 1993; in 1994, Professor Thomas Nicely discovered a bug in its FDIV (floating-point division) unit. This bug cost Intel about 475 million dollars. The incident showed how important verification is: verification is used to find bugs like this, to ensure that the hardware works as intended, and to reduce the number of possible remaining bugs.
In hardware development in general we want to verify that components adhere to their specification, how they react to various inputs, and whether they are correct with respect to specific standards. The difference between verification and validation: verification ensures that the design is built right, i.e. that it meets the specification; validation checks whether it is the right design at all. In both cases the specification must be interpreted correctly and implemented correctly.
A fault is the cause of an error, an error is an incorrect state that may lead to a failure, and a failure is a deviation from the specified or desired behaviour.
In general there are several methods to check whether a system works. These range from a simple run of the software, observing whether it behaves correctly, up to functional verification and formal verification, the latter working on a model using mathematical abstraction.
In software it is only possible to show the presence of failures, never their complete absence. An important thing to keep in mind in software verification is coverage of all aspects, such as input space partitioning, graph coverage, logic coverage, and syntax-based coverage. A big help for verification is the V-model, commonly used in larger projects (slide 11).
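Input space partitioning can be sketched in a few lines. This is a toy example with an invented function: the input domain is split into equivalence classes, and one representative per class is tested instead of every possible value.

```python
# Toy sketch of input space partitioning (the function and the
# partitions are illustrative, not from the talk).

def classify_age(age: int) -> str:
    """Hypothetical function under test."""
    if age < 0:
        raise ValueError("age must be non-negative")
    if age < 18:
        return "minor"
    return "adult"

# One representative per partition exercises every behaviour class
# with three tests instead of testing millions of integers.
representatives = {
    "invalid": -1,   # age < 0
    "minor": 10,     # 0 <= age < 18
    "adult": 42,     # age >= 18
}

assert classify_age(representatives["minor"]) == "minor"
assert classify_age(representatives["adult"]) == "adult"
try:
    classify_age(representatives["invalid"])
except ValueError:
    pass  # expected: the invalid partition is rejected
```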
Current trends in software verification are test-driven development, where the test is written first and the implementation afterwards; automated testing, where tools and scripts are used to create tests for the system; and machine-learning-based approaches, which are a hot research topic right now.
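The test-driven workflow can be illustrated with a minimal sketch (the function under test is invented): the test is written before the implementation and fails until the code exists.

```python
# Minimal test-driven development sketch. In TDD the test below is
# written first and fails ("red") until gcd() is implemented ("green").

def test_gcd():
    assert gcd(12, 18) == 6
    assert gcd(7, 13) == 1
    assert gcd(0, 5) == 5

# Implementation written *after* the test above was in place.
def gcd(a: int, b: int) -> int:
    while b:
        a, b = b, a % b
    return a

test_gcd()  # passes once the implementation is complete
```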
In hardware verification we want the design to adhere to the specification, including any safety properties the specification requires. The goal is to show that the design behaves as specified, where the behaviour is the desired output for given inputs. A problem we run into here is the state explosion problem: in larger systems, not every input and output combination can be checked, because there are far too many of them.
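A quick back-of-the-envelope calculation makes the explosion concrete. The figures here are generic, not from the talk: exhaustively simulating a 32-bit by 32-bit multiplier already requires 2^64 input vectors.

```python
# Illustration of input explosion: a 32x32-bit multiplier has 2**64
# possible input combinations.
combinations = 2 ** 64

# Even at a billion test vectors per second, exhaustive simulation
# would run for centuries.
vectors_per_second = 10 ** 9
seconds = combinations / vectors_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{combinations} inputs, ~{years:.0f} years at 1e9 vectors/s")
```

The number of reachable *states* of a sequential design grows the same way (up to 2^n for n flip-flops), which is why exhaustive approaches break down.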
Another approach is formal verification, where a mathematical model is used to guarantee that the system works for every possible input/output without explicit testing. This works for smaller circuits, but it still does not solve the state explosion problem.
Model checking is an effective technique for hardware with a finite state space. It involves translating system specifications into temporal logic (e.g., CTL, LTL) to systematically explore and verify properties in all states. While state space explosion remains a challenge, smart exploration techniques help mitigate it.
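The core of model checking an invariant (roughly, a CTL property of the form "AG not error") can be sketched as explicit-state exploration. The transition system and property below are made up for illustration: a breadth-first search visits every reachable state and checks that no bad state is ever reached.

```python
# Tiny explicit-state model checker sketch: exhaustively explore a
# finite state space and check a safety property in every state.
from collections import deque

def successors(state):
    """Toy transition system: a counter that wraps at 6."""
    yield (state + 1) % 6

def check_invariant(initial, bad):
    """Return True iff no state in `bad` is reachable from `initial`."""
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        if s in bad:
            return False  # counterexample found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True  # every reachable state satisfies the property

print(check_invariant(0, bad={7}))  # True: 7 is never reached
print(check_invariant(0, bad={3}))  # False: 3 is reachable
```

Real model checkers use symbolic representations and smart exploration precisely because this explicit enumeration blows up on realistic designs.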
Bounded Model Checking (BMC) efficiently examines a limited number of clock cycles (k) using Boolean logic and SAT transformation. It's excellent for quickly detecting bugs in smaller hardware blocks over a limited timeframe, but it doesn't address state space explosion.
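The bounded idea can be sketched with a toy circuit, using brute-force enumeration as a stand-in for the SAT solver: the transition relation is unrolled for up to k cycles and searched for an input sequence that reaches a bad state within that bound.

```python
# Bounded model checking sketch (brute force instead of SAT, for
# illustration only): search all input sequences of length <= k.
from itertools import product

def step(state, inp):
    """Toy 2-bit shift register: shift in one input bit per cycle."""
    return ((state << 1) | inp) & 0b11

def bmc(initial, bad, k):
    """Return a violating input sequence of length <= k, or None."""
    for length in range(1, k + 1):
        for inputs in product((0, 1), repeat=length):
            state = initial
            for bit in inputs:
                state = step(state, bit)
            if state == bad:
                return inputs  # counterexample within the bound
    return None  # no violation found within k cycles (says nothing beyond k)

print(bmc(0b00, bad=0b11, k=3))  # (1, 1): two cycles suffice
print(bmc(0b00, bad=0b11, k=1))  # None: one cycle is not enough
```

Note that a `None` result only means the property holds up to the bound k, which is exactly why BMC is a bug-hunting technique rather than a full proof.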
Simulation offers a practical alternative to exhaustive testing, relying on random testing with tools, testbenches, and high-level HDL concepts. However, manually verifying waveforms for correctness can be labor-intensive for complex systems.
Reference models automatically check hardware results for correctness by providing a higher-level implementation of the system specification. They are treated as correct (a golden model), can simplify the verification process, and can potentially be generated from the specification.
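The combination of random simulation and a reference model can be sketched in Python (the bit-level "DUT" here is a stand-in for an HDL design): random stimuli drive both the design under test and the higher-level reference implementation, and the outputs are compared automatically instead of inspecting waveforms by hand.

```python
# Random simulation against a reference model (illustrative sketch).
import random

def dut_add(a, b):
    """Stand-in for the design under test: a ripple-carry 8-bit adder."""
    result, carry = 0, 0
    for i in range(8):
        x, y = (a >> i) & 1, (b >> i) & 1
        result |= (x ^ y ^ carry) << i
        carry = (x & y) | (carry & (x ^ y))
    return result

def reference_add(a, b):
    """Reference model: the specification at a higher abstraction level."""
    return (a + b) % 256

random.seed(0)
for _ in range(1000):
    a, b = random.randrange(256), random.randrange(256)
    assert dut_add(a, b) == reference_add(a, b), (a, b)
print("1000 random vectors matched the reference model")
```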
Comparing waveform output to a reference model is simplified by transforming the waveform to the abstraction level of the reference model (e.g., RTL to Transaction Level Modeling).
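Lifting a cycle-level trace to the transaction level can be sketched as follows. The signal names and the valid-flag protocol are invented for illustration: cycles where `valid` is high are collapsed into transactions, making the trace directly comparable to a transaction-level reference model.

```python
# Sketch of abstracting a cycle-accurate trace to transaction level.

# Cycle-accurate trace: (cycle, valid, data) samples from a waveform.
trace = [
    (0, 0, 0x00),
    (1, 1, 0xAB),  # transaction
    (2, 0, 0x00),
    (3, 1, 0xCD),  # transaction
    (4, 1, 0xEF),  # transaction
]

def to_transactions(samples):
    """Keep only the data words transferred on valid cycles."""
    return [data for _, valid, data in samples if valid]

transactions = to_transactions(trace)
print(transactions)  # [0xAB, 0xCD, 0xEF] -- comparable to a TLM reference
```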
\pagebreak