SA-11 - Developer Testing and Evaluation
Control requirements
SA-11 (a)
Require the developer of the system, system component, or system service, at all post-design stages of the system development life cycle, to: Develop and implement a plan for ongoing security and privacy assessments.
SA-11 (b)
Require the developer of the system, system component, or system service, at all post-design stages of the system development life cycle, to: Perform [IBM Assignment: unit; integration; system; regression] testing/evaluation [Assignment: organization-defined frequency] at [Assignment: organization-defined depth and coverage].
SA-11 (c)
Require the developer of the system, system component, or system service, at all post-design stages of the system development life cycle, to: Produce evidence of the execution of the assessment plan and the results of the testing and evaluation.
SA-11 (d)
Require the developer of the system, system component, or system service, at all post-design stages of the system development life cycle, to: Implement a verifiable flaw remediation process.
SA-11 (e)
Require the developer of the system, system component, or system service, at all post-design stages of the system development life cycle, to: Correct flaws identified during testing and evaluation.
Additional IBM Cloud for Financial Services specifications
Responsible personnel or roles must be designated and identified as approvers for each stage of testing.
Implementation guidance
See the resources that follow to learn more about how to implement this control.
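For example, the testing and evidence requirements in SA-11 (b) and SA-11 (c) can be automated in a build pipeline. The following is a minimal sketch, assuming a pytest-based project laid out with tests/unit, tests/integration, and tests/regression directories and the pytest-cov plugin installed; the suite names, paths, and tooling are illustrative assumptions, not choices prescribed by the control.

```python
"""Sketch: run organization-defined test suites and record evidence of the run
(SA-11 (b)/(c)). Suite paths, tooling, and the evidence file name are assumptions."""
import datetime
import json
import pathlib
import subprocess

# Organization-defined test suites (assumed repository layout).
SUITES = {
    "unit": "tests/unit",
    "integration": "tests/integration",
    "regression": "tests/regression",
}

def run_suite(name: str, path: str) -> dict:
    """Run one suite under coverage and return a result record."""
    result = subprocess.run(
        ["python", "-m", "pytest", path, "--cov=src", "--cov-report=term"],
        capture_output=True, text=True,
    )
    return {
        "suite": name,
        "passed": result.returncode == 0,
        "output_tail": result.stdout[-2000:],  # bounded excerpt kept as evidence
    }

def main() -> None:
    evidence = {
        "control": "SA-11 (b)/(c)",
        "executed_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "results": [run_suite(name, path) for name, path in SUITES.items()],
    }
    out = pathlib.Path("evidence") / "sa-11-test-run.json"
    out.parent.mkdir(exist_ok=True)
    out.write_text(json.dumps(evidence, indent=2))
    print(f"Evidence written to {out}")

if __name__ == "__main__":
    main()
```

A pipeline step can then archive the generated evidence file alongside the build record, which gives assessors a dated artifact showing that the assessment plan was executed and what the results were.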
NIST supplemental guidance
Developmental testing and evaluation confirms that the required controls are implemented correctly, operating as intended, enforcing the desired security and privacy policies, and meeting established security and privacy requirements. Security properties of systems and the privacy of individuals may be affected by the interconnection of system components or changes to those components. The interconnections or changes, including upgrading or replacing applications, operating systems, and firmware, may adversely affect previously implemented controls. Ongoing assessment during development allows for additional types of testing and evaluation that developers can conduct to reduce or eliminate potential flaws.

Testing custom software applications may require approaches such as manual code review, security architecture review, and penetration testing, as well as static analysis, dynamic analysis, binary analysis, or a hybrid of the three analysis approaches. Developers can use the analysis approaches, along with security instrumentation and fuzzing, in a variety of tools and in source code reviews. The security and privacy assessment plans include the specific activities that developers plan to carry out, including the types of analyses, testing, evaluation, and reviews of software and firmware components; the degree of rigor to be applied; the frequency of the ongoing testing and evaluation; and the types of artifacts produced during those processes.

The depth of testing and evaluation refers to the rigor and level of detail associated with the assessment process. The coverage of testing and evaluation refers to the scope (i.e., number and type) of the artifacts included in the assessment process. Contracts specify the acceptance criteria for security and privacy assessment plans, flaw remediation processes, and the evidence that the plans and processes have been diligently applied. Methods for reviewing and protecting assessment plans, evidence, and documentation are commensurate with the security category or classification level of the system. Contracts may specify protection requirements for documentation.
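A verifiable flaw remediation process (SA-11 (d) and (e)) can likewise be enforced as a release gate that refuses to ship while any flaw identified during testing or evaluation lacks a closed remediation record. The sketch below assumes hypothetical JSON exports (evidence/flaws-found.json from scanners or test runs, evidence/remediations.json from the issue tracker); the file names and record fields are illustrative only.

```python
"""Sketch: release gate for a verifiable flaw remediation process (SA-11 (d)/(e)).
Input files and record fields are hypothetical examples."""
import json
import pathlib
import sys

def load(path: str) -> list[dict]:
    """Read a JSON list of records from disk."""
    return json.loads(pathlib.Path(path).read_text())

def main() -> int:
    flaws = load("evidence/flaws-found.json")          # findings from testing/evaluation
    remediations = load("evidence/remediations.json")  # remediation records from the tracker
    closed = {r["flaw_id"] for r in remediations if r.get("status") == "closed"}

    unresolved = [f for f in flaws if f["id"] not in closed]
    if unresolved:
        for f in unresolved:
            print(f"UNRESOLVED: {f['id']} - {f.get('summary', '')}")
        return 1  # non-zero exit fails the pipeline gate
    print("All identified flaws have closed remediation records.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Because the gate reads the same exported records that assessors review, the remediation process remains verifiable: the evidence that flaws were corrected is produced by the pipeline itself rather than asserted after the fact.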