The ECSS (European Cooperation for Space Standardization) set of standards is intended for use in all European space activities and is designed to be coherent and user friendly. Many of the standards have an impact on the software development process, including two that are software specific:
These two standards are complementary and essentially view the same recommended best practices from different perspectives. That can lead to some confusing overlap between the two for the unwary.
Many of the ECSS standards have a bearing on software development, including, for example, the project management series. The article “Introducing ECSS software engineering standards within ESA” highlights two that deal specifically with software development and software quality. They are shown below in context. The diagram is an adaptation of an illustration from the article.

ECSS-E-ST-40C “Space engineering – software standard” details topics related to space software engineering, including the software development lifecycle, review milestones, requirements elicitation, software design, coding, verification and validation. It concerns product software that is part of a space system product tree and developed as part of a space project.
First published in 1999 and based on ISO/IEC 12207, the standard defines principles and requirements applicable to space software engineering processes. It also establishes interfaces with the management and product assurance streams. It is applicable to any software product developed as a part of any space project, including those developed for the:
ECSS-E-ST-40C covers all phases of space software engineering (requirements definition, design, production, verification and validation, transfer, operations, and maintenance), including the definition of reviews and appropriate documentation for each phase. The standard supports tailoring based on software criticality.
The ECSS-Q-ST-80C Rev1 “Space product assurance – Software product assurance” defines a set of software product assurance requirements pertinent to the development and maintenance of software for space systems. It covers software product assurance topics, including safety and dependability, process assessment and improvement, configuration management, software reuse, software problem resolution management, and software metrication.
Published in 2017 and based on ISO/IEC 12207, revision 1 consists of three parts:
The standard defines software product assurance requirements designed to establish confidence that software will achieve its mission objectives. It is equally applicable to development and to repurposed or procured software, and supports tailoring based on software criticality.
The ECSS online glossary defines “criticality” as the “classification of a function or of a software, hardware or operation according to the severity of the consequences of its potential failures”. This abbreviated version of table D1 from ECSS-Q-ST-80C Rev1 shows that there are four such software classification categories.

These categories are allocated according to a process detailed in ECSS-Q-ST-40C. Categories have a significant impact on the level of rigor that is required throughout the software development lifecycle.
There are many ECSS standards, and all of them are arguably associated with ECSS-E-ST-40C and ECSS-Q-ST-80C Rev 1 to at least some degree. The following selection of ECSS standards is not a comprehensive list, and more information about other ECSS standards can be found on the ECSS website.
ECSS-S-ST-00C Rev.1 “Description, implementation and general requirements” is the ECSS top-level document for ECSS users. It gives a general introduction to ECSS and the use of ECSS documents in space programmes and projects.
Its purpose is to provide users with an overview of the ECSS System, including an introduction to its four branches – Management, Product Assurance, Engineering and Space Sustainability. The diagram above illustrates the branches most relevant to software development.
ECSS-D-00B defines the processes that apply to the lifecycle of ECSS documents, including the activities performed by the various actors.
These processes and activities were applied to the development of both ECSS-E-ST-40C and ECSS-Q-ST-80C Rev 1.
The ECSS-Q-HB-80-04A handbook describes software metrication in the context of a space project. It provides recommendations, methods and procedures that can be used for the selection and application of appropriate metrics for the development of projects in accordance with ECSS-E-ST-40C and ECSS-Q-ST-80C. However, it does not include new requirements with respect to those provided by Revision 1 of the ECSS-Q-ST-80C standard.
The ECSS-Q-HB-80-02 handbook defines methods for process assessment and improvement. These are suggestions and recommendations on the achievement of the requirements defined in ECSS-Q-ST-80C §5.7.
The ECSS-E-HB-40A handbook provides advice on software engineering best practices for the implementation of the requirements specified in ECSS-E-ST-40C.
ESA’s Board for Software Standardisation and Control (BSSC) was established in 1977 in recognition of the importance of software standards in complex and/or critical space software projects.
The BSSC’s ESA PSS-05 was a highly successful software engineering standard, first published in 1984. Ten years later, the ESA Council adopted a resolution that confirmed the Agency’s commitment to transferring the existing system of ESA space standards to a new set of standards that were to be prepared by the European Cooperation for Space Standardization (ECSS). The ECSS standards therefore superseded ESA PSS-05.
In 1995, ISO published a new international software engineering standard, ISO/IEC 12207 (Information technology – Software life cycle processes, 1995). The timing of that release was ideal for ECSS, which used it as the basis for its software engineering standard ECSS-E-40.
ECSS-Q-ST-80C Rev 1 §6.2.3.2 a requires that “The supplier shall define, justify, and apply measures to assure the dependability and safety of critical software.”
ECSS-E-ST-40C illustrates the software lifecycle process that puts those aspirations into practice, as shown below:

Most of the software practices pertinent to this process are referenced in both standards. However, the most detailed explanation of what is required can come from either, depending on the topic and the extent to which it is generic. The following breakdown therefore identifies relevant sections from each.
There are five “clauses” (sections) in ECSS-E-ST-40C. These are:
The first four clauses provide the “building blocks” for the software lifecycle process described in clause 5. The LDRA tool suite can contribute by automating validation and verification activities throughout this process.
Requirements traceability is key to compliance with ECSS-E-ST-40C. Establishing requirements at the outset and confirming that those requirements are completely fulfilled (forward traceability) and uniquely fulfilled (backward traceability) is highly important.
Requirements traceability can seem easy when things are going well. But changed requirements or failed tests can quickly throw things into disarray and cause a project management nightmare.
The ideal tools for requirements management depend largely on the scale of the development. If there are only a few developers in a local office, a simple spreadsheet or Microsoft Word document may suffice. Bigger projects, perhaps with contributors in geographically diverse locations, are likely to benefit from an Application Lifecycle Management (ALM) tool.
The software requirements phase details how lower-level requirements evolve as they relate specifically to the software system. That demands traceability between the system and software-specific requirements artefacts to demonstrate that all system requirements have been accounted for.
This notion of traceability between phases and back to requirements is referenced throughout the standards. It would be easy to assume that if the requirements are all shown to be catered for in design documentation and implemented and tested in the code, then the objective of the standard is satisfied. However, it is just as important to show that there is no code present that is surplus to the requirements. To achieve that requires “bidirectional traceability”.
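As a miniature illustration, the bidirectional check reduces to a pair of set comparisons: every requirement must be fulfilled by at least one artefact, and every artefact must trace back to at least one known requirement. This is a hypothetical Python sketch, not part of any ECSS standard or the LDRA tool suite; the requirement IDs and file names are invented.

```python
# Hypothetical sketch of a bidirectional traceability check.
# 'forward' maps each requirement ID to the artefacts that fulfil it;
# 'backward' maps each artefact to the requirements it claims to satisfy.

def check_traceability(req_to_artefacts, artefact_to_reqs):
    """Return (unfulfilled requirements, unjustified artefacts)."""
    # Forward traceability: every requirement maps to at least one artefact.
    unfulfilled = {r for r, arts in req_to_artefacts.items() if not arts}
    # Backward traceability: every artefact traces to a known requirement,
    # i.e. no code is present that is surplus to the requirements.
    known = set(req_to_artefacts)
    unjustified = {a for a, reqs in artefact_to_reqs.items()
                   if not (set(reqs) & known)}
    return unfulfilled, unjustified

forward = {"SRS-001": ["uart.c"], "SRS-002": []}          # invented IDs
backward = {"uart.c": ["SRS-001"], "debug_hook.c": []}    # invented files
print(check_traceability(forward, backward))
```

Here `SRS-002` fails forward traceability (nothing implements it) and `debug_hook.c` fails backward traceability (surplus code), which is exactly the pair of gaps a traceability tool must surface.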

Challenges arising during a project mean that traceability between project phases can become a major project management headache. The TBmanager component of the LDRA tool suite automates traceability to project requirements, and to the objectives of ECSS‑Q‑ST‑80C Rev 1.

The detailed design stage requires the developer to ensure that appropriate design criteria are in place to limit problems later. It is important to not only design out such issues, but also to ensure that the good intentions are met.
The TBmanager component of the LDRA tool suite automatically tracks the resulting product requirements to demonstrate that they are being fulfilled. The TBvision component of the LDRA tool suite includes control and data flow analysis facilities, which aid verification of the correct interpretation of that design.
§6.3.4.1 a requires that “Coding standards (including consistent naming conventions and adequate commentary rules) shall be specified and observed.”
The LDRA tool suite provides both static and dynamic analysis capabilities. Static analysis involves an automated “examination” of the source code. Code review ensures the correct application of coding rules, which are designed to ensure the safety and security of the source code.
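One class of such rule-based check can be shown in miniature: enforcing a naming convention on variable declarations. This is a hedged Python sketch using a crude regex rather than a real parser; the convention (snake_case) and the code fragment are invented for illustration, and real tools parse the target language properly.

```python
import re

# Illustrative only: a toy naming-convention check over C-like declarations.
# An assumed rule: identifiers must be snake_case (lowercase, digits, '_').
SNAKE_CASE = re.compile(r"^[a-z][a-z0-9_]*$")
# Crude regex to pull identifiers out of simple declarations.
DECL = re.compile(r"\b(?:int|float|double|char)\s+(\w+)")

def check_names(source):
    """Return identifiers that violate the assumed naming convention."""
    return [name for name in DECL.findall(source)
            if not SNAKE_CASE.match(name)]

print(check_names("int rxCount; double bias_term; char Mode;"))
```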
The TBvision component of the LDRA tool suite calculates quality metrics to measure the complexity, loop nesting depth, procedure count, and other parameters, and compares them to configurable threshold values. It is also used to verify adherence to the coding rules specified by the nominated coding standard, style guide, and/or language subset.
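The idea of comparing a computed metric against a configurable threshold can be sketched for one such metric, maximum loop/branch nesting depth. This is an illustrative Python example (analysing Python source via the `ast` module), not LDRA’s implementation; the threshold value and function name are invented, and real metric tools target languages such as C or Ada.

```python
import ast

# Node types that are assumed to open a new nesting level.
BRANCHING = (ast.If, ast.For, ast.While, ast.Try, ast.With)

def max_nesting(node, depth=0):
    """Return the deepest branch/loop nesting level under 'node'."""
    worst = depth
    for child in ast.iter_child_nodes(node):
        nxt = depth + 1 if isinstance(child, BRANCHING) else depth
        worst = max(worst, max_nesting(child, nxt))
    return worst

SRC = """
def telemetry_filter(samples):
    for s in samples:
        if s > 0:
            while s:
                s -= 1
"""
tree = ast.parse(SRC)
NESTING_THRESHOLD = 4  # assumed, configurable limit
for fn in [n for n in tree.body if isinstance(n, ast.FunctionDef)]:
    depth = max_nesting(fn)
    print(fn.name, depth, "OK" if depth <= NESTING_THRESHOLD else "EXCEEDS THRESHOLD")
```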

§5.3.3.2 a requires that “the supplier shall develop and document the test procedures and data for testing each software unit.”
The LDRA tool suite provides both static and dynamic analysis capabilities. Static analysis involves “examining” source code. Dynamic analysis involves executing programs, whether in part or as complete applications, on a real or virtual processor.
The TBrun unit test component of the LDRA tool suite provides boundary value analysis to detect software errors occurring around parameter limits, while structure-based testing and coverage analysis expose the completeness of testing. The level of coverage required depends on the applicable software criticality category – statement, branch/decision, MC/DC (Modified Condition/Decision Coverage), etc.
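The principle of boundary value analysis can be sketched briefly: for a parameter with a valid range, test just below, at, and just above each limit, plus a nominal value. This Python sketch is illustrative only; the example parameter (a 0–100 percent duty cycle) is invented.

```python
# Hedged sketch of boundary value selection for a parameter with an
# assumed valid range [lo, hi]: values just below, at, and just above
# each limit, plus a mid-range nominal value. 'step' would need
# adjusting for floating-point parameters.

def boundary_values(lo, hi, step=1):
    return sorted({lo - step, lo, lo + step, (lo + hi) // 2,
                   hi - step, hi, hi + step})

# E.g. a hypothetical thruster command accepting duty cycles of 0..100%:
print(boundary_values(0, 100))
```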
The optional TBjustify module can be specified to manage the documentation of justifications concerning coverage for certification and compliance purposes.

TBrun automatically generates test drivers and harnesses (wrapper code), executes tests easily and efficiently, and stores both test data and results. These tests can be automatically regressed. When coupled with the TBmanager component of the LDRA tool suite, the test data maintenance process is streamlined through the automatic detection of changes in source code, prompting repeats of tests as necessary. The TBextreme option supplements TBrun, offering the option to automatically generate test cases whose nature is user configurable.
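The underlying change-detection idea can be sketched simply: keep a digest per source file and flag any file whose digest differs from the stored baseline, prompting re-execution of the associated tests. This Python sketch is an assumption about the general technique, not a description of TBmanager’s internals; the file name and contents are invented.

```python
import hashlib

def digest(text):
    """SHA-256 digest of a file's contents (contents passed as a string here)."""
    return hashlib.sha256(text.encode()).hexdigest()

# Baseline digests recorded at the last successful test run (invented file).
baseline = {"uart.c": digest("int tx(void){return 0;}")}
# Digests of the current working tree: uart.c has changed.
current  = {"uart.c": digest("int tx(void){return 1;}")}

# Any file whose digest differs (or is new) needs its tests re-run.
needs_retest = [f for f in current if baseline.get(f) != current[f]]
print(needs_retest)
```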
§5.5.3.2 b requires that “the supplier shall test each software unit ensuring that it satisfies its requirements and document the test results”.
The TBmanager component of the LDRA tool suite provides for requirement-based testing and traceability as discussed above. Applying unit test techniques in conjunction with TBmanager provides a mechanism to automate the fulfilment of this requirement.
§5.5.4.2 a requires that “The supplier shall integrate the software units and software components, and test them, as the aggregates are developed”.
Integration testing with the TBrun component of the LDRA tool suite uses the same mechanisms and approach as unit testing. The difference is that more than one unit (function, procedure, or similar) is under test as part of a compiled executable. The same mechanisms are available for creating stubs, handling global variables, and so on.
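The role of a stub in that progression can be shown in a few lines: at unit level the dependency is replaced with a canned stand-in, while at integration level the same test exercises the real dependency. This Python sketch is purely illustrative; the sensor functions and the threshold are invented.

```python
# Unit-versus-integration testing with a stub (all names hypothetical).

def read_sensor_stub():
    return 21.5          # canned value standing in for the real driver

def read_sensor_real():
    return 21.5          # imagine this talked to actual hardware

def overheating(read_sensor, limit=85.0):
    """Unit under test: flags a reading above the assumed limit."""
    return read_sensor() > limit

# Unit test: exercise overheating() in isolation via the stub.
assert overheating(read_sensor_stub) is False
# Integration test: same mechanism, but with the real dependency in place.
assert overheating(read_sensor_real) is False
print("unit and integration checks passed")
```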
The software validation process demands the construction of a set of appropriate test and validation activities…
… and compliance requires the provision of evidence that those activities have been completed successfully.
The TBmanager component of the LDRA tool suite helps achieve requirements traceability between different levels of requirements, source code, test cases, and test design. It also helps identify software requirements that are not traced to system requirements, and vice versa.

§5.8.3.5 a requires that “The supplier shall verify the technical specification … ensuring that … the code implements … correct data and control flow”. The LDRA tool suite automates data and control flow analysis, making it easy to identify any anomalies.
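One simple class of data-flow anomaly is a variable read before any value has been assigned to it. The following hedged Python sketch flags such reads in a function body; it is deliberately naive (no control-flow awareness, statement-level ordering only) and only gestures at what a genuine data-flow analyser does. The function and variable names are invented.

```python
import ast

def use_before_set(fn):
    """Naively flag names read before any assignment (a 'use-before-set'
    anomaly). Parameters count as assigned; control flow is ignored."""
    assigned = {a.arg for a in fn.args.args}
    anomalies = []
    for stmt in fn.body:
        for node in ast.walk(stmt):
            if isinstance(node, ast.Name):
                if isinstance(node.ctx, ast.Load) and node.id not in assigned:
                    anomalies.append(node.id)
                elif isinstance(node.ctx, ast.Store):
                    assigned.add(node.id)
    return anomalies

# 'z' is read without ever being assigned or passed in:
SRC = "def f(x):\n    y = x + z\n    return y\n"
fn = ast.parse(SRC).body[0]
print(use_before_set(fn))
```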
§5.8.3.5 a requires that “The supplier shall verify the technical specification … ensuring that … the effects of run-time errors are controlled”. §5.8.3.5 f also requires that “The supplier shall verify source code robustness….”
The LDRA tool suite supports coding standards including MISRA, JPL, and CERT, all of which include rules to prevent run-time errors. Automated code review checks for compliance with these rules, helping to ensure the safety and security of the source code.
Robustness tests can also be performed dynamically using the TBrun component of the LDRA tool suite.
These sections detail a requirement for code coverage analysis, with ECSS‑E‑ST‑40C § 5.8.3.5 b stating that “The supplier shall verify that the following code coverage is achieved”. The standards include a table similar to that shown, which specifies which code coverage metric is to be applied.
There is a degree of discretion allowed for agreement between supplier and customer for the less critical categories, shown as “AM” in the table.
| Code coverage versus criticality category | A | B | C | D |
| --- | --- | --- | --- | --- |
| Statement coverage | 100% | 100% | AM | AM |
| Decision coverage | 100% | 100% | AM | AM |
| MC/DC coverage | 100% | AM | AM | AM |
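What distinguishes MC/DC from plain decision coverage is the demonstration that each condition independently affects the decision’s outcome. That requirement can be sketched mechanically: for each condition, find a pair of test vectors that differ only in that condition yet produce different decision outcomes. This Python sketch is illustrative only; the example decision `(a and b) or c` is invented.

```python
from itertools import product

def decision(a, b, c):
    # Example decision with three conditions (invented for illustration).
    return (a and b) or c

def mcdc_pairs(n_conditions, fn):
    """For each condition index, find a pair of vectors differing only in
    that condition whose decision outcomes differ: the independent-effect
    evidence that MC/DC demands."""
    vectors = list(product([False, True], repeat=n_conditions))
    pairs = {}
    for i in range(n_conditions):
        for v in vectors:
            w = list(v); w[i] = not w[i]; w = tuple(w)
            if fn(*v) != fn(*w):
                pairs[i] = (v, w)
                break
    return pairs

pairs = mcdc_pairs(3, decision)
print(sorted(pairs))   # condition indices with demonstrated independent effect
```

A minimal MC/DC test set is then the union of the vectors in those pairs, which for n conditions needs only on the order of n+1 tests rather than all 2^n combinations.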
The LDRA tool suite supports all coverage metrics specified by the standard and offers flexibility in the presentation of coverage analysis results. The optional TBjustify module can be specified to manage the documentation of coverage justifications for certification and compliance purposes.


These sections detail a requirement for code coverage analysis. For example, ECSS‑E‑ST‑40C § 5.8.3.5 e states that for software of criticality category A, “In case the traceability between source code and object code cannot be verified… the supplier shall perform additional code coverage analysis on object code level”
The Object Code Verification (OCV) capabilities offered by LDRA tools automate this process. By leveraging the one-to-one relationship between object code and assembler code, the tool suite exposes which parts of the object code are unexercised, prompting the tester to devise additional tests and achieve complete assembler code coverage – and hence achieve object code verification.
The LDRA tool suite provides the unit test facilities demanded by the standard, as discussed elsewhere.
LDRA offers training in a variety of formats, and free resources are available online. Training courses cover the use of the LDRA tool suite and many of the standards it supports. More information is available at https://ldra.com/ldra-tools-methodology-training/
§5.6.1.2 requires that “The choice of development methods and tools shall be justified by demonstrating through testing or documented assessment”. The suitability of LDRA tools can be verified by reference to TÜV certification, or by means of tool verification in conjunction with a tool qualification support pack.
§6.3.5.11 a requires that “The supplier shall ensure that… the tests are properly documented, [and] the test reports are up to date and valid”. The LDRA tool suite provides a range of appropriate reporting mechanisms and formats.

§7.1.5 suggests several basic metrics to be used (code size, complexity, fault and failure intensity, test coverage, number of failures), all of which are supported by the LDRA tool suite.
§7.3.6 requires that “where the components developed for reuse are developed to be reusable on different platforms, the testing of the software shall be performed on all those platforms.”
The target testing capabilities of the LDRA tool suite facilitate testing embedded software on hardware similar to that specified for the completed system. This allows for an automated approach to testing the same software on different target hardware.
The ECSS standards look to apply development best practices with a functional safety focus. ECSS-E-ST-40C and ECSS-Q-ST-80C are complementary, viewing these best practices from different perspectives, which can lead to some confusing overlap between the two.
The LDRA tool suite is proven to ease compliance with such standards, both in space and elsewhere. In particular, the TBmanager component of the tool suite helps by ensuring that requirements traceability information is never out of date.
LDRA’s Object Code Verification has been shown to provide an efficient mechanism for demonstrating compliance for category A applications, just as it does for the similarly demanding DO‑178 DAL A systems.
Email: info@ldra.com
EMEA: +44 (0)151 649 9300
USA: +1 (855) 855 5372
INDIA: +91 80 4080 8707