LDRAunit® - A Standalone Unit Testing Tool




The most common approach to unit testing requires drivers and stubs to be written. The driver simulates a calling unit and the stub simulates a called unit. Because writing them costs developer time, unit testing is sometimes demoted to a lower priority, and that is almost always a mistake. Even though drivers and stubs cost time and money, unit testing provides undeniable advantages: it allows the testing process to be automated, it reduces the difficulty of discovering errors buried in the more complex parts of the application, and test coverage is often enhanced because attention is given to each unit individually.

The primary goal of unit testing is to take the smallest piece of testable software in the application, isolate it from the remainder of the code, and determine whether it behaves exactly as you expect. Each unit is tested separately before being integrated into a module, at which point the interfaces between modules are tested. Unit testing has proven its value in that a large percentage of defects are identified during its use.



LDRAunit, LDRA’s class-leading unit test tool, provides a complete integrated framework for the automated generation and management of unit tests. This solution maximises test throughput and repeatability to significantly increase overall test benefit. In turn, this frees developers to focus on ensuring that the functionality of the software under development has been implemented correctly, including the appropriate error handling. Software development managers seeking to develop the highest-quality code are turning to unit testing to avoid the potential delays caused by postponing the discovery and correction of defects until the system test cycle.


Unit Testing with LDRA


Making use of comprehensive control and data flow analysis, LDRAunit extracts details of the unit interface: parameters, globals (input and output), return values, variable types and usage, and procedure calls. Traditionally this level of information could only have been specified by a developer with expert knowledge of the unit under test. By automating this process, LDRAunit frees up highly qualified staff, who may then be re-assigned to other modelling, design and development tasks.

LDRAunit Test Manager Report

LDRAunit facilitates several test scenarios:


- Single procedures, functions, methods (Unit test)

- Files containing many functions, classes (Module test)

- Complete programs (Sub system & system test)


LDRA has revolutionised the traditional unit testing activity, typically performed on host and/or target systems, with its automatic testing capability, eXtreme Testing. This high degree of test automation saves both time and resources, enabling a quicker time to market. The LDRA tool suite’s ability to work in a highly distributed environment provides complete visibility into the overall development process, even when development teams are distributed globally.


Key Features


The key benefit of the LDRAunit-supported unit testing process is its high degree of automation, which saves both time and resources, thereby enabling a quicker time to market.


LDRAunit Callgraph

LDRA's Unit Testing Features:


- Automated test driver / harness generation with no manual scripting requirement

- High levels of test throughput via the intuitive graphical and command line interface options

- Sophisticated automated analysis facilities which reduce test effort, freeing up developers and empowering testers

- Storage and maintenance of test data and results for fully automated regression analysis

- Automated detection and documentation of source code changes

- Tool driven test vector generation

- Facilitates execution of tests in host, target and simulator environments

- Automated generation of test case documentation including pass/fail and regression analysis reports


LDRAunit Testcase Pass/Fail Report

Additional Automatically Handled Language Features:


- Abstract Class testing

- Automatic Creation of Compound Objects in Test

- Access to Private and Protected Data

- Re-use of tests through Class Hierarchy

- Polymorphism

- Inheritance

- Templates

- Structure/Arrays/Unions

- Automated Resolution of Templated Types

- Classes

- Automatic Creation & Object "Re-Use" (Through Attachment)

- Access Methods & Attributes through the entire hierarchy

- Exceptions

- Pointers

- Generics (Ada)

- In/Out Parameters (Ada)

- Records (Ada)


Features & Benefits


Automatically Generated Driver Program/Test Harness


LDRAunit Host Target Options

LDRAunit utilises sophisticated control flow and data flow analysis techniques to document the interface to the unit under test in full. This information then enables LDRAunit to automatically generate test drivers, removing the need for manual scripting. There are no limitations on the automatically generated driver: it is pure C/C++, Ada 83/95 or Java, depending on the application code, and can be executed in the host or target environment.

Exception Handling

Exceptions can be automatically caught and test cases can be passed or failed dependent on whether such an exception has been raised. The exception handling method is configurable. The exception handlers themselves can also be subject to unit tests. Such tests can be applied irrespective of whether the exceptions are raised, allowing coverage to be achieved even when the raising of an exception would be impractical.

Stub Creation


LDRAunit Regression Report

Stubs can be written by hand or generated automatically for functions, methods, constructors, system calls, packages, generics, etc. The automatically generated "managed stubs" are sufficiently complete to allow the test harness to build and execute.

The default behaviour of managed stubs can be modified via an intuitive graphical user interface to tune such items as return and global parameter values. For instance, it is possible to vary return values depending on the number of occasions on which the stubbed function has been called, whilst passed parameter values can become pass/fail criteria for the unit tests themselves.

Test Case Files / Test Case Management / Storage

LDRAunit stores groups of test cases as sequences. Users can then export a sequence to a Test Case File (TCF), which contains all of the information required to re-run the test cases. TCFs can be grouped with regression reports and stored for regression verification, either saved alongside the source file via a software configuration management (SCM) system or used as an annotation. Requirements-based testing documentation, including why particular values were chosen and tags that map to a requirements management system, can be added for storage. When used as an SCM annotation, these files allow managers to determine directly from the SCM system that developers are testing their code on check-in. TCFs can also be re-run from the command line and in batch mode so that, as the source code changes, module interfaces and output can be verified.

LDRAunit has access to the full range of coverage metrics available in the LDRA tool suite. These include Procedure Call, Statement, Branch/Decision, MC/DC and LCSAJ (Test Path). Users can choose an appropriate metric or set of metrics based on their safety and program constraints. For example, MC/DC coverage is essential to verify that results are not masked by other condition inputs, and LCSAJ coverage provides a comprehensive metric for evaluating loops. All of these metrics are available graphically, via flow graph displays, call graph displays and the file view of the LDRAunit GUI. Users can directly access compliance reports giving overall pass/fail metrics for standards such as DO-178B. These reports also show, line by line, which statements, branches and conditions have been executed.

eXtreme Testing

eXtreme Testing builds on the LDRAunit ability to automatically populate unit test cases, extending this to the generation of the test cases themselves. It automates the unit/module/integration testing processes and, by encompassing test harness and test vector production, it eliminates almost all of the overheads associated with bottom-up testing.

Features include the ability to automatically fine-tune the processes used to create the test vectors, optimising the level of coverage achieved. Vectors generated by eXtreme Testing can then be complemented by manually generated test cases.