MSc Dissertation: Grant Carter

Citation:

Carter, Grant. System Level Simulation of Digital Designs: A Case Study. MSc Dissertation. Department of Electrical Engineering, University of Cape Town, 1998.

 

Abstract:

The Very High Speed Integrated Circuit Hardware Description Language (VHDL) is gaining popularity among digital designers in South Africa because it serves as both a synthesis and a simulation language. Many designers make use of its synthesis capability but hardly tap into the power of its simulation capability. This dissertation primarily investigated the feasibility of VHDL simulation during the design process. Secondary goals were to document the design methodology as well as the state of the art of the tools required for FPGA design and simulation. As a case study, a digital preprocessor for a synthetic aperture radar (SAR) was designed and simulated. The design was targeted at an FPGA in an attempt to determine the level of algorithmic complexity that an FPGA can accommodate. This was a hardware solution to the design requirement; a purely software solution implemented on a DSP was attempted by Yann Tréméac [19].

In July 1993, the US Department of Defense initiated a program known as Rapid Prototyping of Application-Specific Signal Processors (RASSP). The purpose of this program was to review the process used in creating embedded digital signal processors, in an attempt to decrease the time taken to produce a prototype by a factor of four. The methods proposed by RASSP for achieving this goal included the reuse of existing modules, concurrent design and virtual prototyping.

The virtual prototyping that the RASSP initiative refers to includes a process of writing VHDL models to represent the system being designed. These models are first written at an abstract level, where the mathematical equations that describe the processing are tested. Test data can be input to the model, which performs the required processing; the output can then be verified to ensure that the equations are correct. At this stage the model contains no structural information about how the processing is achieved, nor even the numerical method used to implement the equations.

The level of abstraction of these models decreases with every model that is written. The number and type of models written naturally depend upon the design. Examples of the models which could be written are a mathematical model and an algorithm model, which captures the numerical methods used to implement the mathematical equations. A behavioural or functional model can then be written to break the system into a number of sub-components. The sub-components are modelled so that their interfaces are correct, but the internals contain no information on the structure used to implement the algorithms. These models can then be further refined to include implementation details until a final design is produced. At each stage, the test data used in the more abstract models can still be used for verification. This system of testing requires that testbenches be written: simply pieces of VHDL code that can read and write data files as well as provide known stimuli to the unit under test.
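The testbench pattern described above can be sketched in Python rather than VHDL, purely to illustrate the idea of reusing one stimulus set against models at different abstraction levels; all names here are illustrative, not from the dissertation.

```python
def golden_reference(samples):
    # Abstract model of the processing (here simply a gain of 2).
    return [2 * s for s in samples]

def unit_under_test(samples):
    # Stand-in for a more detailed model with the same interface.
    return [s + s for s in samples]

def run_testbench(stimulus):
    # In a real VHDL testbench the stimulus and expected results would be
    # read from and written to data files; Python lists stand in here.
    expected = golden_reference(stimulus)
    actual = unit_under_test(stimulus)
    # Report the indices of any mismatching output samples.
    return [i for i, (e, a) in enumerate(zip(expected, actual)) if e != a]
```

An empty mismatch list means the detailed model agrees with the golden reference on that stimulus, which is exactly the check repeated at every refinement step.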

To investigate the feasibility of VHDL modelling, a preprocessor for the South African Synthetic Aperture Radar (SAR) was designed and modelled. This preprocessor was required to low-pass filter the data received by the radar and then sub-sample it, reducing the rate of the data to be stored without violating the sampling constraints. Three methods were considered for implementing this data reduction: a presummer, an FIR filter, or a combination of the two. The last option was chosen, since it produced the highest azimuth resolution after SAR processing and required the fewest filter taps. The method used a presummer which summed three PRIs, followed by a 32-tap FIR filter incorporating a “skip” factor of 4. This method did not violate any constraints set by the SAR processing regarding the sampling rate of the data, and it was feasible to implement.

Since the processing was divided into the presummer and the prefilter, it was logical that the hardware be similarly divided. One of the first design issues to be overcome was how these two entities should interact; both required external RAM for temporary data storage. The first option was to give each entity its own memory: the presummer would then output a presummed range line to the prefilter for processing. The greatest disadvantage of this method was that the prefilter would first have to store this data in its own memory, and during that time it could not process. The second option, the one implemented, made use of dual-ported RAM, with the presummer connected to one port and the prefilter to the other. The advantage of this method was that the prefilter did not have to perform any data storage, which increased the amount of time it could spend processing data.
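The dual-port arrangement can be modelled as two independent ports over one shared storage array, which is why no copy step is needed between the entities; this is a toy sketch with hypothetical port names, not the actual RAM device model from the dissertation.

```python
class DualPortRAM:
    # Two independent ports share one storage array, so the prefilter can
    # read what the presummer wrote without an intermediate copy.
    def __init__(self, size):
        self.mem = [0] * size

    def write_port_a(self, addr, value):
        # Presummer side: store a presummed sample.
        self.mem[addr] = value

    def read_port_b(self, addr):
        # Prefilter side: fetch the sample directly for filtering.
        return self.mem[addr]
```

In the real device both ports also support simultaneous access arbitration and timing constraints, which a functional VHDL model would have to capture.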

An algorithm model was written for the presummer and prefilter operations to verify the mathematical validity of the process and the effects of the precision of the stored data and the filter tap weights. Test data was produced and read into the model; the processed data was output and the results analysed. This data set was then used to verify the operation of the other, more detailed models.
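One precision check of the kind an algorithm model performs is comparing ideal filter output against output computed with fixed-point-quantised taps; the tap values and the 8-bit word length below are illustrative assumptions, not the dissertation's figures.

```python
def quantise(x, frac_bits):
    # Round x onto a fixed-point grid with `frac_bits` fractional bits,
    # mimicking the limited precision of stored data and tap weights.
    scale = 1 << frac_bits
    return round(x * scale) / scale

def fir(samples, taps):
    # Direct-form FIR filter over the valid region.
    n = len(taps)
    return [sum(t * s for t, s in zip(taps, samples[i:i + n]))
            for i in range(len(samples) - n + 1)]

# Compare ideal taps against 8-bit-quantised taps on the same test data.
taps = [0.1, 0.2, 0.4, 0.2, 0.1]            # illustrative 5-tap low-pass
qtaps = [quantise(t, 8) for t in taps]
data = [float(i % 7) for i in range(50)]
worst_error = max(abs(a - b)
                  for a, b in zip(fir(data, taps), fir(data, qtaps)))
```

If `worst_error` stays below the tolerable output error, the chosen word lengths are adequate; this is exactly the kind of question that is cheap to answer in an abstract model and expensive to answer in hardware.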

The second model written was an abstract functional model. This modelled the interfaces of the presummer and prefilter but contained no details of the internal implementation or timing. The abstract functional model was, however, able to process data, and the test data from the algorithm simulation was used to verify its operation. A model of the RAM had to be written to allow the presummer and prefilter to store data; this functional model contained no timing information but captured the full functionality of the device being modelled.

Finally, the presummer and prefilter descriptions were written to allow synthesis. A VHDL synthesiser was used to specify the logic required to implement the devices. FPGA design software was then used to place and route the logic, and finally an FPGA configuration file was produced. The FPGA design software also produced back-annotated VHDL source code: a gate-level VHDL model of the device, including timing information which reflected the internal delays of the FPGA. Since this model had the same I/O ports as the functional model, it could be driven by the same testbench. The same test data was again used and the results compared to the functional simulation for verification.

In conclusion, the modelling provided a method of verification that would normally only be achievable with a physical prototype. The largest problem encountered with the virtual prototyping was the simulation time of the gate-level models: a full run would have taken up to 60 days on an Intel PII-300MHz processor with 196MB RAM, longer than the time required to build and debug a physical prototype. The second problem was the availability of VHDL models. Without simulation models of all the components used, system-level simulation is a pointless exercise. Some web sites offer a number of free models, but the majority of available models are commercial and therefore expensive. For companies starting out in the field of VHDL modelling, the cost of a VHDL simulator package can also be prohibitive. If the required models are available, together with software to simulate and synthesise them, the goals of RASSP can be achieved.
