A four-phase approach to independent verification and validation
Through IV&V, developers and owners can mitigate the risk of implementing systems that don’t serve the purpose for which they were built or don't deliver as expected.
Government agencies face numerous challenges in managing their myriad information systems and projects. Beyond their complexity, IT systems often feature a mix of custom and commercial components that must interoperate with systems implemented by other organizations. The fast pace of technology, coupled with a rigid set of compliance criteria, adds even more complexity to the mix.
Moreover, major IT projects often take a long time to complete, and initial assumptions about the operating environment may evolve or stakeholder expectations may change. As a result, products, services or systems as initially conceived may no longer meet the stated requirements or deliver on their intended purpose. This is where independent verification and validation (IV&V) can be invaluable.
Simply stated, IV&V is an evaluation conducted by an unbiased third party. Its purpose is twofold:
- Verification – Check whether the product meets the stated requirements, specifications or constraints.
- Validation – Determine whether the product achieves its intended purpose and meets the needs of the stakeholder community.
Through IV&V, developers and owners can mitigate the risk of implementing systems that don’t serve the purpose for which they were built or don't deliver as expected. Further, IV&V testing results support program management decisions.
The optimal IV&V evaluation comprises four phases: plan, review, assess and report. By design, it fully addresses all aspects of an information system or project, including cost, schedule, technical, management, program, process and quality.
A team approach to an IV&V engagement is optimal. Team members should possess not just the specific skills and experience relevant to the business and the technology under review, but also domain expertise in high-risk project areas.
Phase 1: Plan
As shown in Figure 1, the planning phase focuses on identifying the objectives of the IV&V effort and gathering requirements for the target system. Here, stakeholders help develop and validate a requirements matrix that identifies each IV&V requirement, verification method (test, demonstration, simulation, analysis, etc.), verification and validation (V&V) event, and V&V condition. This matrix facilitates IV&V report generation by providing a framework for correlating requirements and supporting data.
Figure 1 – A four-phase IV&V approach helps produce consistent and objective results.
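To make the requirements matrix concrete, the sketch below shows one way its rows might be captured; the field names and the sample entry are illustrative assumptions rather than a prescribed schema.

```python
from dataclasses import dataclass
from enum import Enum


class VerificationMethod(Enum):
    TEST = "test"
    DEMONSTRATION = "demonstration"
    SIMULATION = "simulation"
    ANALYSIS = "analysis"


@dataclass
class MatrixEntry:
    """One row of the IV&V requirements matrix."""
    requirement_id: str          # identifier for the IV&V requirement
    requirement_text: str        # the stated requirement
    method: VerificationMethod   # how the requirement will be verified
    vv_event: str                # the V&V event that covers it
    vv_condition: str            # conditions under which it is checked


# Hypothetical entry; the identifier and wording are invented for illustration.
matrix = [
    MatrixEntry("REQ-042", "System shall export case records as PDF",
                VerificationMethod.TEST, "System test cycle 2",
                "nominal load, production-like data"),
]
```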
The requirements matrix drives the development of the test analysis and evaluation approach, which, in turn, drives the preparation of the IV&V activity plan. The plan focuses on high-risk areas, drawing on existing test results where available and augmenting them with additional testing methods as needed. In developing test plans and cases that provide adequate test coverage, it is critical to ask the following questions (a simple coverage check along these lines is sketched after the list):
- Is every requirement addressed by at least one test?
- Have test suites been selected for an "average" situation as well as for "boundary" situations such as minimum and maximum values?
- Have "stress" cases been selected, to account for out-of-bounds values?
- Have meaningful combinations of inputs been selected?
- Are the functions and objectives accurately reflected in the test cases?
- Have all the system interfaces been exercised in all conditions?
- Are there sufficient end-to-end tests?
- Are tests conducted with sufficient and representative data?
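As one example, the first question in the list can be answered mechanically from traceability data. The sketch below assumes each test case records the requirement IDs it covers; the record format and identifiers are hypothetical.

```python
def uncovered_requirements(requirement_ids, test_cases):
    """Return requirement IDs that no test case claims to cover."""
    covered = set()
    for case in test_cases:
        covered.update(case.get("covers", []))
    return [rid for rid in requirement_ids if rid not in covered]


# Hypothetical traceability data linking test cases to requirements.
requirements = ["REQ-041", "REQ-042", "REQ-043"]
test_cases = [
    {"id": "TC-001", "covers": ["REQ-041", "REQ-042"]},
    {"id": "TC-002", "covers": ["REQ-042"]},
]
print(uncovered_requirements(requirements, test_cases))  # -> ['REQ-043']
```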
Phase 2: Review
During the review phase, the IV&V team evaluates the current documentation and assesses the perspectives of stakeholders across the organization, allowing the team to monitor and trace the impact of changes and dependencies throughout the development effort. The team also assesses the impact of those changes from both an operational and a maintenance perspective.
Verification procedures should involve performing tests, as shown in Figure 1, that exercise part or all of the system, followed by review and analysis of the test results. If the system is modified, verification procedures should regularly repeat tests (regression tests) devised specifically to ensure the system continues to meet the initial design requirements, specifications and regulations.
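A regression test in this sense is simply an automated check, retained from the original verification effort and re-run after every change. The sketch below uses Python's standard unittest module against a hypothetical business rule; the function and its expected values are invented for illustration.

```python
import unittest


def compute_benefit(base_amount: float, dependents: int) -> float:
    """Hypothetical business rule under test: $25 per dependent added to the base."""
    return round(base_amount + 25.0 * dependents, 2)


class BenefitRegressionTest(unittest.TestCase):
    """Re-run after each modification to confirm the original requirement still holds."""

    def test_nominal_case(self):
        self.assertEqual(compute_benefit(100.0, 2), 150.0)

    def test_boundary_no_dependents(self):
        self.assertEqual(compute_benefit(100.0, 0), 100.0)


if __name__ == "__main__":
    unittest.main()
```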
In addition to the test techniques described above, verification processes should include demonstration, inspection and analysis to evaluate and confirm whether the delivered or proposed capability meets the intent of the stated requirement. For some projects, virtual or operational environments might be required for testing and evaluation. For acquired services, the evaluation focuses on how well the product vendor conducted testing, through a review and assessment of vendor test processes and results.
Validation procedures can include modeling and simulations to predict faults or gaps that might lead to invalid or incomplete verification or development of a product. Also, user-defined validation requirements, specifications and regulations can serve as a basis for qualifying system development. Additional resources can include available operational requirements, concepts of operations, use cases, user stories, operations and maintenance processes and documentation, among others. Due to its subjective nature, validation may require test repetition, multiple users or both to obtain sufficient samples for the desired level of confidence in the results.
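One way to reason about how much repetition is enough is to treat each repeated run as a pass/fail trial and attach a confidence interval to the observed pass rate. The sketch below uses a normal-approximation interval; the figures are hypothetical, and a real engagement would set its own confidence target.

```python
import math


def pass_rate_interval(passes: int, trials: int, z: float = 1.96):
    """Approximate 95% confidence interval for an observed pass rate (normal approximation)."""
    p = passes / trials
    margin = z * math.sqrt(p * (1 - p) / trials)
    return max(0.0, p - margin), min(1.0, p + margin)


# Hypothetical result: 27 of 30 repeated validation runs judged acceptable.
low, high = pass_rate_interval(27, 30)
print(f"observed pass rate 0.90, 95% CI roughly [{low:.2f}, {high:.2f}]")
```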
Phase 3: Assess
After completing the IV&V tests, simulations and reviews, the data must be synthesized into as coherent a result as possible. Any disparate conclusions (e.g., “The system delivers the required function” versus “The required function is not possible with the system”) must be resolved to determine the correctness and accuracy of the results. If necessary, additional tests must be performed to resolve these disparities.
Actual results also must be compared with predicted results and expected outcomes. Anomalies must be resolved or explained. Any negative results must be fully characterized with details and context so that the team can determine the overall risk to performance.
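A simple way to organize that comparison is to record predicted and actual values for each metric and flag anything outside an agreed tolerance. The metrics, values and tolerance below are hypothetical.

```python
def flag_anomalies(predicted: dict, actual: dict, tolerance: float = 0.05) -> dict:
    """Flag metrics whose actual results differ from predictions by more than the tolerance."""
    anomalies = {}
    for metric, expected in predicted.items():
        observed = actual.get(metric)
        if observed is None:
            anomalies[metric] = "no result recorded"
        elif expected and abs(observed - expected) / abs(expected) > tolerance:
            anomalies[metric] = f"predicted {expected}, observed {observed}"
    return anomalies


# Hypothetical performance metrics from an IV&V test cycle.
predicted = {"mean_response_s": 2.0, "throughput_tps": 500}
actual = {"mean_response_s": 2.6, "throughput_tps": 495}
print(flag_anomalies(predicted, actual))
# -> {'mean_response_s': 'predicted 2.0, observed 2.6'}
```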
Phase 4: Report
Output from IV&V testing activities, at each stage of project acquisition and development, consists of reports documenting compliance or noncompliance with requirements, evidence supporting the conclusions and recommendations on deployment. Reports should also include recommendations for rectifying issues or instances of noncompliance.
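Because the findings trace back to the requirements matrix, much of this reporting can be assembled mechanically. The sketch below rolls hypothetical per-requirement findings into a compliance summary; the record fields are assumptions, not a mandated report format.

```python
from collections import Counter


def compliance_summary(findings: list) -> str:
    """Roll per-requirement findings up into a simple compliance report."""
    counts = Counter(f["status"] for f in findings)
    lines = [
        f"Requirements evaluated: {len(findings)}",
        f"Compliant: {counts.get('compliant', 0)}",
        f"Noncompliant: {counts.get('noncompliant', 0)}",
    ]
    for f in findings:
        if f["status"] == "noncompliant":
            lines.append(f"  {f['requirement_id']}: {f['evidence']}"
                         f" -- recommendation: {f['recommendation']}")
    return "\n".join(lines)


# Hypothetical findings assembled from the review and assess phases.
findings = [
    {"requirement_id": "REQ-041", "status": "compliant",
     "evidence": "TC-001 passed", "recommendation": ""},
    {"requirement_id": "REQ-043", "status": "noncompliant",
     "evidence": "no end-to-end test coverage", "recommendation": "add end-to-end test"},
]
print(compliance_summary(findings))
```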
Strengths of a four-phase approach
The recommended approach offers several benefits:
- Produces consistent, cost-effective results that consider security at every phase.
- Proactively identifies issues and facilitates speedy implementation of measures to avoid or mitigate the associated risks.
- Provides a perspective on tests and results and allows assessment of the technical and operational impact of any identified noncompliance.