[Figure 1 | Digital I/O channels presenting potential attack surfaces on a modern car.]
From the perspective of the customer—the driver—the onus for assuring security and safety falls on the auto manufacturer. In turn, these OEMs depend on Tier-1 and Tier-2 suppliers for a range of electronic control units (ECUs), vehicle-to-everything (V2X) communications, advanced driver assistance systems (ADAS), and infotainment systems. With each of these systems now connected to a common vehicle network (Figure 1), each contributes to an ever-expanding attack surface. Security approaches such as separation kernels and hypervisors can mitigate the problem to some degree by offering run-time separation and isolation, but they can’t offer a guarantee of security—merely a line of defense.
Best practices dictate that security, like functional safety, can’t be an afterthought. It must be part of the software development life cycle as a whole.
Requirements are part of a secure software development life cycle
A secure development life cycle starts with specific security requirements presented by the OEM to its Tier-1 and Tier-2 suppliers. Those requirements must then be applied to the development process and to the software it produces. The results, both the process and the resulting software, must be demonstrated to the OEM so that it can confirm that the entire vehicle and its systems are secure.
Requirements are only meaningful if they can be traced and verified, and implementing those requirements relies on adherence to practices and standards that must be constantly checked and verified as well. Secure coding practices and standards might be compared to strategy (the overall approach) and tactics (the detailed execution). Failure to adhere to either can result in errors that compromise security. Fortunately, these errors can be detected using the proper automated testing tools and methodologies.
The Computer Emergency Readiness Team (CERT) website lists 12 secure coding practices that can be thought of as strategic coding methods or security requirements. They include such advice as “validate inputs” and “heed compiler warnings” while using the compiler’s highest warning level. Another piece of advice is to “keep it simple,” which can be tested by means of metrics such as cyclomatic complexity. This metric highlights functions that are more complex than expected, so that any complexity value outside specified bounds can be justified, with the goal of creating code that is clearer, more maintainable, and more testable. Another key piece of advice among the 12 practices is to adopt a secure coding standard.
Adopt secure coding standards
A coding standard sets out specific rules and guidelines—the tactics—for writing secure code using a specific language, such as C or C++. One of these coding standards is CERT C, which has 98 rules for developing safe, reliable, and secure systems. The rules are grouped in sections covering such things as strings, memory, and expressions. MISRA C and MISRA C++ are two more popular coding standards widely used in the automotive industry. The MISRA standards have always been directed at “critical” code, making them appropriate for both safety- and security-critical systems. MISRA C:2012 Amendment 1 emphasized this further by introducing 14 new guidelines aimed specifically at security.
Coding standards deal with the details of correctly using high-level languages for secure systems, restricting coding constructs to minimize potential vulnerabilities. A benefit of using standards is that the source code can be checked for compliance with the selected standard using static analysis tools, preferably throughout the development process.
Structural coverage offers confidence
Securing the connected automobile and its wide and varied attack surface is a daunting undertaking. Access is possible, for example, through the infotainment system, GPS, the OBD-II diagnostic port, or the software/firmware update process. Once the in-car network has been breached, vulnerabilities could be exposed and exploited in safety-critical systems such as air bags, braking, steering, transmission, and collision avoidance. Coding errors in any of these systems can lead to catastrophe, so developers must test security and then measure the effectiveness of that testing.
When it comes to executing and testing the compiled code, structural coverage analysis helps to measure the effectiveness of the testing process by identifying and highlighting what code has and has not been tested. Color-coded formatted source code and flow graphs make it easy to identify what more needs to be done (Figure 2). The greater the coverage achieved, the greater the confidence that no untested code harbors vulnerabilities. The more safety- or security-critical the software component, the more demanding the level of coverage analysis that should be applied, ranging from simple statement coverage to modified condition/decision coverage (MC/DC).
[Figure 2 | Structural coverage using dynamic analysis reveals functions and paths not yet taken.]
Structural coverage may be derived from the execution of the whole running system, or during unit testing. Unit testing utilizes a test harness that provides an executable mechanism to call functions or subsystems with specified inputs, so that outputs can be verified and traced to requirements. Together, these approaches build confidence in the functionality of both the individual software components and the application as a whole.
A suite of integrated, coordinated, and configurable tools is indispensable for dealing with the size and complexity of these projects. These tools automate tracing, analysis, and testing, and maintain the data needed for proof and qualification or certification of vital, complex systems. They provide the foundation for a safe, secure software development life cycle, and a mechanism through which all team members can communicate and coordinate their efforts.
Furthermore, once cars are on the road, the tool suite enables an effective and efficient response whenever new vulnerabilities are exposed or additional security requirements are introduced.
Mark Pitchford is a Technical Specialist with LDRA Ltd.