Vector Software White Paper: Using VectorCAST for Software Verification and Validation of Railway Applications


Introduction

This document is intended to serve as a reference for using VectorCAST products with the European standard EN 50128 ("Software for Railway Control and Protection Systems", March 2001), itself derived from the IEC 61508 standard. It is not an exhaustive review of the standard; rather, it provides a high-level view of the different testing requirements that can be satisfied using the VectorCAST testing products.

Throughout this document, the following legend is used:

R   Recommended activity
HR  Highly recommended activity
M   Mandatory activity

VectorCAST during Software Design and Implementation

In Clause 10, which is summarized in Table A.4 (page 49), the following activities can be automated by VectorCAST:

Table A.4                                               SIL1 SIL2 SIL3 SIL4
13. Library of Trusted/Verified Modules and Components   R    R    R    R
14. Functional/Black Box Testing (Design)                HR   HR   M    M
15. Performance Testing                                  HR   HR   HR   HR
16. Interface Testing                                    R    R    R    R

13. Library of Trusted/Verified Modules and Components

According to B.40 (pages 84-85), a library of trusted or verified modules and components avoids the need for software modules and hardware components to be extensively revalidated or redesigned for each new application.

Previously tested software modules can certainly be reused, but what happens if you want to use a library from a third-party vendor that (a) you have not used before (and thus have no significant operational history with) and (b) for which you do not have source code available? VectorCAST/C++ can be used to test such libraries (library testing mode, available since VectorCAST version 5.1) to ensure the data coming out of these libraries and into your software is predictable and correct. Access to the source code is not necessary.

14. Functional/Black Box Testing (Design)

Testing during software design and implementation is highly recommended for SIL1 and SIL2, and mandatory at SIL3 and SIL4. Suitable testing strategies are described in Table A.14. The challenge here is that during implementation, engineers are unlikely to have access to the entire source code: by definition, it is still under development. This calls for a unit- or module-based testing approach. To test the software as it is implemented, software stubs need to be provided. These are pieces of code that take the place of regular code (for instance, code that has yet to be implemented) so that compilation closure can be obtained on a specific file, class, or even module. Without compilation closure, the code cannot be successfully tested.

VectorCAST/C++ and VectorCAST/Ada are well suited to this task. Engineers can use these tools to quickly generate a complete test environment (stubs included) without writing any test code or scripts. All that is needed is the code to be tested and the path to the includes. With this minimal amount of information, both tools automatically generate all of the test harness code (test driver, stubs, etc.) required for unit or module testing. What would normally take hours of test code development is done in a matter of minutes.
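The stubbing idea can be sketched in a few lines. The following is a hypothetical Python illustration (VectorCAST generates the equivalent harness automatically for C, C++, and Ada); the function names are invented for the example:

```python
# Illustrative sketch: a stub takes the place of an unimplemented
# dependency so the unit under test can be exercised in isolation.
# (In C terms, this is what gives you compilation closure.)

def read_track_sensor():
    # Real implementation not yet available during development.
    raise NotImplementedError

def brake_command(speed_limit):
    """Unit under test: decides whether to brake, based on the sensor."""
    speed = read_track_sensor()
    return speed > speed_limit

def run_with_stub(stubbed_speed, speed_limit):
    """Install a stub returning a canned value, run the unit, clean up."""
    global read_track_sensor
    original = read_track_sensor
    read_track_sensor = lambda: stubbed_speed   # install the stub
    try:
        return brake_command(speed_limit)
    finally:
        read_track_sensor = original            # remove the stub

print(run_with_stub(120, 100))  # True: over the limit, brake
print(run_with_stub(80, 100))   # False: within the limit
```

The stub lets the unit be tested before its dependencies exist, which is exactly the situation during design and implementation.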
The VectorCAST tools for unit and integration testing mentioned above, when used in conjunction with the VectorCAST Runtime Support Package (RSP) for a specific target environment, allow test execution on a host, on a simulator, or directly on the target board. All of the facilities to execute tests and capture test artifacts are controlled by the RSP.

15. Performance Testing

According to Table A.17, Avalanche/Stress Testing, Response Timing and Memory Constraints, and Performance Requirements are recommended or highly recommended by EN 50128. These can be partially automated using VectorCAST/C++ and VectorCAST/Ada. For instance, the time a function takes to execute can be measured at the unit test level. Another example involves running the same test case several times, perhaps alongside system-level threads, which is also possible with VectorCAST.
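Measuring a function's execution time at the unit level can be sketched generically. This hypothetical Python example (the function names are invented; on a real target you would use a hardware timer or the debugger, as the paper notes) shows the shape of such a test:

```python
# Illustrative sketch: timing one unit's execution as part of a
# unit-level performance test.
import time

def checksum(data):
    # Example unit whose execution time we want to bound.
    total = 0
    for b in data:
        total = (total + b) % 255
    return total

def timed_call(fn, *args):
    """Run fn once and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Functional result and a (generous, illustrative) time budget checked
# in the same test case.
result, elapsed = timed_call(checksum, bytes(range(256)) * 100)
print(result, elapsed < 1.0)
```

The same pattern, repeated over many runs or alongside other threads, gives the stress/response-timing measurements the standard asks for.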

The particular activities to undertake depend on the specifics of your environment. Please contact a Vector Software representative to discuss the specific activities you wish to perform.

16. Interface Testing

During software design and implementation, it is recommended to test the interface of the code (at the function call level). As explained in B.27 (page 83), several levels of detail or completeness of testing are feasible. The most important levels require testing all function parameters using extreme values, either at once or one at a time (while using normal values for the other parameters); testing all values for all parameters, including in combination; and testing through specific test conditions (1). All of these activities can easily be performed in VectorCAST/C++ and VectorCAST/Ada. Extreme values can be specified according to functional range or type, and illegal values can also be tested. Combinational testing is also integrated into the product.

(1) Testing all parameter values, especially in combination mode, may not be feasible for large interfaces, as noted in B.37.

VectorCAST during Verification and Testing

In Clause 11, which is summarized in Table A.5 (page 50), the following activities can be automated by VectorCAST:

Table A.5                            SIL1 SIL2 SIL3 SIL4
4. Dynamic Analysis and Testing       HR   HR   HR   HR
5. Metrics
6. Traceability Matrix                R    R    HR   HR
7. Software Error Effect Analysis     R    R    HR   HR

Even though all of these methods are Recommended or Highly Recommended, a minimum combination of them must be used to achieve the different SIL levels. At SIL3 or SIL4, Dynamic Analysis and Testing must be performed, in addition to one of the following:

- Formal Proof
- Static Analysis
- Traceability Matrix and Software Error Effect Analysis
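The two interface-testing strategies described above (extreme values one parameter at a time with the others held at normal values, then all combinations of extremes) can be sketched as a test vector generator. This is a hypothetical Python illustration, not VectorCAST's actual mechanism:

```python
# Illustrative sketch of interface test vector generation:
# extremes one-at-a-time, then in combination.
from itertools import product

def interface_cases(params):
    """params maps a parameter name to (normal_value, [extreme_values]).
    Yields one vector per extreme applied one parameter at a time,
    then every combination of extremes."""
    names = list(params)
    # One at a time: each extreme, with other parameters at normal values.
    for name in names:
        _, extremes = params[name]
        for x in extremes:
            vector = {n: params[n][0] for n in names}
            vector[name] = x
            yield vector
    # Combinational mode: all extremes together.
    for combo in product(*(params[n][1] for n in names)):
        yield dict(zip(names, combo))

# Hypothetical two-parameter interface.
params = {"speed": (50, [0, 300]), "load": (10, [-1, 255])}
cases = list(interface_cases(params))
print(len(cases))  # 4 one-at-a-time + 4 combinations = 8
```

The footnoted caveat is visible here: the combinational part grows multiplicatively with the number of parameters, which is why B.37 warns it may be infeasible for large interfaces.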

It should be noted that VectorCAST provides facilities to produce a Traceability Matrix easily, so the combination of Dynamic Analysis and Testing + Traceability Matrix + Software Error Effect Analysis is achievable without another tool. The combination of Dynamic Analysis and Testing + Static Analysis is also achievable if a static analysis tool is used (for instance, QAC, with which VectorCAST provides a complete integration).

4. Dynamic Analysis and Testing

This section covers a variety of tests, many of which can be achieved through a mix of testing on the entire software build (hereafter referred to as system testing) and unit/module testing. Most of these activities are described in Table A.13:

Table A.13                                           SIL1 SIL2 SIL3 SIL4
1. Test Case Execution from Boundary Value Analysis   HR   HR   HR   HR
2. Test Case Execution from Error Guessing            R    R    R    R
3. Test Case Execution from Error Seeding             R    R    R    R
4. Performance Modeling                               R    R    HR   HR

Although these activities can technically be done at the system level, EN 50128 specifies that during the Dynamic Analysis and Testing phase they must be performed at the sub-system level (meaning unit or module level) and must be based on the specification of the software and/or on the specification and the code. VectorCAST can automate all of these techniques, as explained below.

1. Test Case Execution from Boundary Value Analysis

These tests aim at removing potential software errors at parameter limits or boundaries, based on the parameter's type. The best way to achieve this goal is to unit test the software so as to input these limit values directly to functions. Section B.4 mentions special cases that should be duly tested.
These include:

- Setting parameters to zero if they happen to divide another variable within the function
- Testing blank ASCII characters
- Testing an empty stack or list element
- Testing a null matrix
- Testing a zero table entry
- Testing the maximum and minimum values of the type, and potentially the functional limits
- Testing values outside the boundaries
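A minimal boundary-value test built on these ideas might look like the following hypothetical Python sketch (the range and function are invented; VectorCAST derives the type limits automatically for C/C++/Ada):

```python
# Illustrative sketch: minimum/median/maximum values of a functional
# range, plus the first values just outside it.

def min_mid_max(lo, hi):
    """Boundary vectors for an integer range: inside values
    (min, median, max) and the nearest out-of-range values."""
    mid = (lo + hi) // 2
    inside = [lo, mid, hi]
    outside = [lo - 1, hi + 1]
    return inside, outside

def speed_class(speed):
    # Hypothetical unit under test: valid speeds are 0..300 km/h.
    if not 0 <= speed <= 300:
        raise ValueError("speed out of range")
    return "fast" if speed > 150 else "slow"

inside, outside = min_mid_max(0, 300)
print([speed_class(s) for s in inside])   # ['slow', 'slow', 'fast']
for s in outside:
    try:
        speed_class(s)
    except ValueError:
        print("rejected", s)              # out-of-range values are rejected
```

This is the manual form of what the MIN-MID-MAX generation described below automates.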

All of these can be done within VectorCAST/C++ and VectorCAST/Ada. In fact, these two tools push the automation even further by providing a way to automatically generate test cases that set all input values to their minimum, maximum, and median values. We refer to these as MIN-MID-MAX test cases. The minimum and maximum values are determined by testing the range of every type present in the program on the target board or simulator. Thus, using the tool on either the board or a simulator guarantees that the range of boundary values exercised by the automatically generated MIN-MID-MAX tests is valid in your system. These two tools can further test the special values listed above. They can even test values not directly mentioned, such as Not-a-Number (NaN) and positive and negative infinity on floating-point variables.

2. Test Case Execution from Error Guessing

According to section B.21, these test cases are based on testing experience and intuition, combined with knowledge of and curiosity about the system under test. This may lead to the creation of additional test cases designed to provoke errors within the software. This activity can be performed at the system level, but there is also value in performing it at the unit level (as required by Table A.13), thus ensuring that the components of the software are as error-proof as possible. In fact, the experience gained from testing at implementation time (as introduced earlier in this document) may lead to additional test cases that can be recycled during later testing phases. Even if the environment changes (for example, a new target processor or a different code version), test cases can easily be exported from and imported into test environments using the VectorCAST unit test tools. The test case data is kept separate from the test harness in a text file, and can be re-imported into test harnesses focusing on specific units at the click of a button.
This not only saves time but also provides assistance during Error Guessing activities.

3. Test Case Execution from Error Seeding

The goal of this test strategy is to intentionally provoke errors to see whether the test cases will identify their presence. If some errors are not detected, additional test cases must be provided. Error Seeding can easily be done at the unit level. Although test cases are provided in an intuitive table, a test case in VectorCAST can also be set to (1) input error conditions and, in many cases, (2) confirm that these error conditions were detected. For instance, pointers may be left intentionally null, or exceptions may be intentionally raised, to see how the system copes with such situations.
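Injecting an error condition and confirming it was detected can be sketched as follows. This hypothetical Python example uses None as the analog of an intentionally null pointer; the function names are invented:

```python
# Illustrative sketch: seed an error condition (a "null pointer") and
# confirm the unit detects it rather than propagating the fault.

def lookup_signal(table, key):
    """Unit under test: must detect and report an invalid (None) table
    instead of crashing."""
    if table is None:
        return ("error", "null table")
    return ("ok", table.get(key, "unknown"))

# Seeded error condition: the table is intentionally None.
status, detail = lookup_signal(None, "S1")
print(status)                                 # 'error': the fault was detected
print(lookup_signal({"S1": "green"}, "S1"))   # ('ok', 'green') on the happy path
```

If the seeded condition had slipped through undetected, that would signal a missing check and drive the creation of an additional test case, as the standard requires.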

4. Performance Modeling

These activities aim at calculating performance figures such as processor time, communications bandwidth, storage-device utilization, and so on. VectorCAST tools can be used to inject test vectors for these activities; however, the calculation of processor time, bandwidth, throughput, etc., must be done with the help of other tools, such as a debugger. An interesting feature here is VectorCAST's capability to execute under the control of a debugger, which is available for all supported compiler configurations.

5. Equivalence Classes and Input Partition Testing

This strategy aims at testing the software adequately by determining the partitions of the input domain necessary to exercise the software. The resulting test cases aim at testing the program sufficiently, and can be based on the software specifications, on the internal structure of the program, or on both. These tests can obviously be conducted at the system testing level, especially when testing high-level requirements. However, engineers may also want to test low-level requirements and/or base some of their partition test cases on the internal structure of the program (as required by Table A.13). These activities become practical when using automated tools like VectorCAST. The VectorCAST unit test tools can test ranges of values and lists of values, in either combination or non-combination mode. Execution of these complex test cases is done with the click of a button. VectorCAST also features a partition test case generator to automatically create additional test cases over a given domain.

6. Structure-Based Testing

This activity aims at compiling a metric called code coverage. Code coverage indicates how much of the application source code was exercised during the different testing activities.
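The equivalence-class strategy described above can be sketched with one representative value per partition. This hypothetical Python example (partitions and function invented for illustration) shows the idea:

```python
# Illustrative sketch of input partition testing: divide the input
# domain into equivalence classes and test one representative of each.

def partition_cases(partitions):
    """partitions: list of (lo, hi) ranges covering the input domain.
    One representative per class (here, the midpoint) is typically
    enough to exercise each distinct behavior once."""
    return [(lo + hi) // 2 for lo, hi in partitions]

def fare(age):
    # Hypothetical unit under test: behavior differs per class.
    if age < 18:
        return "child"
    if age < 65:
        return "adult"
    return "senior"

# Partitions derived from the specification of fare().
cases = partition_cases([(0, 17), (18, 64), (65, 120)])
print([fare(a) for a in cases])  # ['child', 'adult', 'senior']
```

Combining this with the boundary-value cases from the previous section (the partition edges) gives the usual partition-plus-boundary test set.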
The EN 50128 standard lists a number of code coverage criteria (Section B.58), including Statement, Branch, Compound Conditions, LCSAJ, and Entire Path, without specifying which criteria should be used. The standard also notes that accomplishing LCSAJ and Entire Path coverage is often infeasible. Our experience working with clients in the industry shows that, for that reason, the LCSAJ and Entire Path criteria are seldom used. Statement, Branch, and Compound Conditions are frequently used, and the combination of these criteria is chosen as a function of the SIL level to be satisfied.

Statement
The goal here is to execute each line of code. This is usually done for all SIL levels.

Branch
For a line containing a branch (for instance, an IF statement), both the FALSE and the TRUE outcomes of the branch are tested. This implies at least two test cases per branch. This criterion is commonly applied in SIL2, SIL3, and SIL4 projects.

Compound Conditions
This refers to MC/DC, or Modified Condition/Decision Coverage, where every sub-condition in a compound branch (for instance, a branch with several sub-conditions, such as if (a && b || c)) is demonstrated to independently affect the result of the entire branch. This criterion requires a minimum of n + 1 test cases, where n is the number of sub-conditions in the branch. It is therefore usually reserved for SIL3 and SIL4.

Within VectorCAST, test cases developed to perform other activities (such as demonstrating requirements, testing software boundaries, etc.) can automatically generate code coverage information.
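The n + 1 property of MC/DC can be made concrete with a small worked example. This Python sketch (an invented three-condition decision, not a VectorCAST artifact) shows four vectors achieving MC/DC over three sub-conditions:

```python
# Illustrative MC/DC example: n = 3 sub-conditions, n + 1 = 4 vectors.
# Each condition is shown to independently flip the decision's outcome.

def decision(a, b, c):
    # Compound branch under test.
    return a and (b or c)

vectors = [
    (True,  True,  False),  # baseline: decision is True
    (False, True,  False),  # differs from baseline only in a -> False
    (True,  False, False),  # differs from baseline only in b -> False
    (True,  False, True),   # differs from previous only in c -> True
]
# Pairs (1,2), (1,3), and (3,4) each differ in exactly one condition
# and produce different outcomes, demonstrating independence of a, b, c.
print([decision(*v) for v in vectors])  # [True, False, False, True]
```

Exhaustive testing of this branch would need 2^3 = 8 vectors; MC/DC reaches the same per-condition evidence with 4, which is why it scales to safety-critical code.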

Also, code coverage can be generated using VectorCAST/Cover while performing system testing (for instance, Probabilistic Testing, an activity described in Table A.5). The use of VectorCAST/Cover can therefore bring the following benefits:

1. Ensure that the system tests undertaken for a particular activity exercise a significant part of the code (which usually will not reach 100%).

2. Generate code coverage that can later be combined with the code coverage achieved at the unit level to reach 100% code coverage on the software. Although 100% code coverage is achievable at the unit level alone, many clients find it beneficial to measure coverage at both the unit and system levels, as this usually shortens the time needed to reach complete coverage. VectorCAST offers easy pooling of code coverage data between the system, unit, and integration test levels.

3. If all Statement/Branch or MC/DC coverage is done at the unit level, VectorCAST/Cover can ensure that during system testing all function calls are exercised at least once, which standards such as ISO 26262 consider sufficient to guarantee complete code coverage at the system level. Call coverage is also mentioned by EN 50128 in section B.58.

VectorCAST offers a uniquely automated code coverage browser that indicates, through different colors, which lines of code were covered and which were not, shows which test cases or coverage results covered specific lines of code, and even automatically generates the truth tables required for MC/DC coverage.

Metrics

The goal of these metrics, according to section B.42, is to predict the attributes of programs from properties of the software itself rather than from its development or test history. Both VectorCAST/C++ and VectorCAST/Cover include cyclomatic complexity, which is mentioned by EN 50128, as part of their standard analysis. For additional metrics from static analysis, VectorCAST can be used in conjunction with several different static analysis tools, such as QAC (with which VectorCAST has an advanced integration).

Traceability Matrix

According to section B.69, the objective of traceability is to ensure that all requirements can be shown to have been properly met and that no untraceable material has been introduced. This includes, but is not limited to, traceability between software requirements and test cases. Through the VectorCAST/Requirements Gateway module, VectorCAST/C++ integrates with third-party tools whose function is to maintain software requirements, such as DOORS. Using this integration, users can download specific software requirements, link them to test cases, and then upload that information back to the requirements management system together with the status of each test case (PASS/FAIL). This makes drafting the traceability matrix much easier.

Software Error Effect Analysis (SEEA)

Software Error Effect Analysis aims to detect software errors, enhance software robustness, and evaluate the amount of validation needed for the different software components. The VectorCAST unit test tools play a role here as a test vector generator, making it easy to draw up and run what-if scenarios quickly during SEEA.

Additional Considerations

All of the testing described above can be done on a host, on a simulator, or directly on a target board. For more accurate results, Vector Software recommends that these tests be performed at least on an appropriate simulator.
Whenever possible, they should be performed on the board, so as to (1) guarantee that the results correspond to the specifics of the environment, and (2) accelerate the software/hardware integration testing activities detailed in Table A.6 and explained in the next section of this document.

Test Activities during Software/Hardware Integration

Clause 12 specifies a software/hardware integration phase meant to demonstrate that the software and the hardware interact correctly to perform their intended functions. Table A.6 lists the testing activities that should be undertaken during this phase:

Table A.6                                      SIL1 SIL2 SIL3 SIL4
1. Functional and Black Box Testing (Design)    HR   HR   HR   HR
2. Performance Testing                          R    R    HR   HR

For SIL1, SIL2, SIL3, and SIL4, both techniques must be used.

1. Functional and Black Box Testing

Functional and Black Box Testing during software/hardware integration may take the following forms (Table A.14):

Table A.14                                              SIL1 SIL2 SIL3 SIL4
1. Test Case Execution from Cause Consequence Diagrams   HR   HR   HR   HR
2. Prototyping/Animation                                 R    R    HR   HR
3. Boundary Value Analysis                               HR   HR   HR   HR
4. Equivalence Classes and Input Partition Testing       HR   HR   HR   HR
5. Process Simulation                                    R    R    R    R

Boundary Value Analysis and Equivalence Classes/Input Partition Testing were described earlier in this document (under Verification and Testing). If these tests were performed directly on the board, they should not have to be modified at this point. If they were performed on a host or a simulator, the associated test cases can easily be reused within the VectorCAST unit test tools. At this point, additional test cases at the system testing level may also be advisable.

The purpose of Cause Consequence Diagrams is to model, in diagrammatic form, the sequence of events that can develop in a system as a consequence of combinations of basic events. Prototyping and Animation check the feasibility of implementing the system against the given constraints. Process Simulation tests the function of a software system without actually putting the complete system into action.

All of these additional activities should be performed at the system testing level. Of course, for critical systems it is advisable to measure how complete these tests are. One way to do this is to measure code coverage during the execution of these system-level tests. VectorCAST/Cover can integrate directly with the target board, or even with the complete system, and provide that code coverage information. Note that only a subset of these activities must be used in order to comply with EN 50128.

2. Performance Testing

As previously mentioned, the VectorCAST unit testing tools can be used to generate test vectors to partially automate this form of testing, but other tools must be used in conjunction to measure performance, memory constraints, etc. As with VectorCAST/Cover, both VectorCAST/C++ and VectorCAST/Ada can be used directly on the target board, making both tools suitable for use during software/hardware integration.

Software Validation

Software validation's objective is to analyze and test the integrated system to ensure compliance with the Software Requirements Specification, with particular emphasis on the functional and safety aspects according to the software safety integrity level (Clause 13). The activities to be performed during software validation are as follows (Table A.7):

Table A.7                               SIL1 SIL2 SIL3 SIL4
1. Probabilistic Testing                 R    R    HR   HR
2. Performance Testing                   HR   HR   M    M
3. Functional and Black-Box Testing      HR   HR   M    M
4. Modeling                              R    R    R    R

As can be seen from the table, Performance Testing and Functional/Black-Box Testing are either highly recommended or mandatory. Both can be automated and/or facilitated by the use of VectorCAST tools, as described in earlier sections of this document. If the VectorCAST tools were systematically used in earlier phases, their test cases and output can easily be reused during software validation as regression tests, saving time and resources.

Software Maintenance

Clause 16 discusses software maintenance, which EN 50128 considers very important. The techniques to be used to ensure software maintainability are as follows:

                                  SIL1 SIL2 SIL3 SIL4
1. Impact Analysis                 HR   HR   M    M
2. Data Recording and Analysis     HR   HR   M    M

Impact analysis aims at identifying the effect that a change or an enhancement to a software system will have on other modules in that system, as well as on other systems. Depending on the outcome of the impact analysis, this entails re-verifying the changed module, all the affected modules, or the complete system. The VectorCAST unit testing tools have complete regression testing capabilities. This means that the test cases designed for earlier versions of the code can be re-run seamlessly against the new code. If a test case becomes outdated (because of a change in the number of parameters or their types, for example), that test case is simply ignored or flagged for review. Thanks to this characteristic of the VectorCAST tools, it becomes possible to re-run the whole set of test cases against updated code in a matter of minutes, which streamlines the amount of maintenance that needs to be done. These tests can also be re-executed overnight, as the regression testing process can be fully automated. Data Recording and Analysis is supported by VectorCAST's ability to append notes to specific test cases.

Reporting

EN 50128 highly recommends the production of a number of documents, such as the Software Test Reports, the Software and Hardware Integration Test Report, the Software Validation Report, and the Software Maintenance Records (Table A.1). VectorCAST/C++, VectorCAST/Ada, and VectorCAST/Cover can produce a variety of reports based on the whole environment/software or any part of it. These reports can be generated in either text or HTML format.
They can be saved outside VectorCAST, and have been used for years as artifacts to comply with a variety of standards such as EN 50128, DO-178B, IEC 61508, ISO 26262, and IEC 62304.

Conclusion

The VectorCAST/C++, VectorCAST/Ada, VectorCAST/Cover, and VectorCAST/Requirements Gateway products can automate testing and code coverage activities in a way that makes EN 50128 compliance much easier to attain. All of these tools can export their individual reports in HTML or text format; such reports have been used successfully in the past to demonstrate compliance with EN 50128 and other demanding industry standards. VectorCAST/RSP is the module that enables the execution of a test harness on a simulator or a board. The process is entirely automated, so test cases can be executed individually or as a group, with a simple mouse click or from the command line. The execution itself requires no user input.

In addition to these tools, Vector Software also provides VectorCAST/Manage to its clients. This tool federates the results from the other tools to provide a bird's-eye view of the status of the entire project and to generate project-wide metrics. In addition, VectorCAST/Manage extends the reporting capability of the other VectorCAST tools by providing a way to customize reports and export them in a variety of formats, including XML.

Successes in Railway Projects

VectorCAST tools have been used successfully by a growing list of premier clients, such as ELIN EBG (a division of Siemens), Bombardier Transportation, GE Transportation, the London Tube, and Union Switch and Signal (a division of Ansaldo Systems). Our clients appreciate the automation and flexibility of the VectorCAST products, which radically decrease the time spent complying with the software verification and validation requirements specified in standards such as EN 50128 and EN 50129.

About Vector Software

Vector Software, Inc., is the leading independent provider of automated software testing tools for developers of safety-critical embedded applications. Vector Software's VectorCAST line of products automates and manages the complex tasks associated with unit, integration, and system-level testing.
VectorCAST products support the C, C++, and Ada programming languages.

Vector Software, Inc.
1351 South County Trail, Suite 310
East Greenwich, RI 02818 USA
T: 401 398 7185  F: 401 398 7186
E: info@vectorcast.com

Vector Software
Golden Cross House, 8 Duncannon Street
London WC2N 4JF, UK
T: +44 203 178 6149  F: +44 20 7022 1651
E: info@vectorcast.com

Vector Software
St. Töniser Str. 2A
47906 Kempen, Germany
T: +49 2152 8088808  F: +49 2152 8088888
E: info@vectorcast.com