Quality Assurance of Evolving Systems
As evolving systems are deployed in industries such as transportation, financial markets, medicine, and energy, developing rigorous, comprehensive, and trustworthy quality assurance methods is critical both before and after deployment. Because of the size and complexity of these systems, the high pace of innovation, and their ability to learn and adapt at runtime, the approach to testing and quality assurance must be equally advanced.
The methods and procedures previously used for testing software no longer apply to evolving systems. Before the advent of Artificial Intelligence, software testing consisted of creating a test plan from the application requirements, developing manual test scenarios from the end user's perspective, automating those scenarios with scripts, and finally running functional tests to verify that everything worked as intended. As a system evolves, however (through ML, AI, or very rapid development), the requirements for its behavior change constantly, making it almost impossible to test against them. When the requirements are no longer fixed and complete, testing against them is impractical.
Given this complexity, developing new methods for testing AI and ML solutions in evolving systems requires a collaborative effort, which led to the launch of an international project called IVVES.
Testing Evolving Systems: IVVES Project
IVVES (Industrial-Grade Verification and Validation of Evolving Systems) is an ITEA project involving 26 partners from 5 countries and running for 3 years (2019-2022). The technical work of the project focuses on the following three topics:
Validation techniques for ML, including model quality, training data quality, and testing techniques for ML.
Validation techniques for complex evolving systems, including ML-driven testing, testing with uncertainties, and online testing and monitoring.
Data-driven engineering, including data collection techniques, instrumentation, and smart probes, pattern recognition for predictive maintenance and fault analysis, and data analytics in engineering and operation.
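The topics above include training data quality as a validation target. The project materials here do not prescribe a specific algorithm, so the following is only a minimal illustrative sketch of what an automated training-data quality check can look like; the function name and the chosen checks (missing values, duplicate rows, constant columns) are hypothetical, not taken from the IVVES tooling.

```python
def data_quality_report(rows, columns):
    """Return simple quality findings for tabular training data.

    rows: list of dicts mapping column name -> value
    columns: list of column names to inspect
    """
    report = {"missing": {}, "duplicate_rows": 0, "constant_columns": []}

    # Count exact duplicate rows, which can bias a trained model.
    seen = set()
    for row in rows:
        key = tuple(row.get(c) for c in columns)
        if key in seen:
            report["duplicate_rows"] += 1
        seen.add(key)

    for c in columns:
        values = [row.get(c) for row in rows]
        # Missing entries reduce the usable signal per column.
        report["missing"][c] = sum(1 for v in values if v in (None, ""))
        # A column with at most one distinct non-missing value carries no information.
        non_missing = {v for v in values if v not in (None, "")}
        if len(non_missing) <= 1:
            report["constant_columns"].append(c)

    return report
```

In practice such a report would gate the training pipeline: data that exceeds agreed thresholds for duplicates or missing values is rejected or flagged before a model is retrained on it.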
Over the project's three years, participants have developed methods and tools for quality assurance of incoming data. Researchers have also developed validation methods and techniques for evolving systems, such as test generation and test prioritization for fault detection. In the area of data-driven engineering, they have developed methods and tools for data collection, e.g., using customer program resources in production, simulating operational-technology networks, and collecting data during automated exploration of graphical user interfaces.
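Test prioritization for fault detection, mentioned above, generally means ordering a test suite so that the tests most likely to expose faults run first. The exact strategies developed in IVVES are not detailed here, so this is a hedged, generic sketch of one common heuristic (history-based prioritization); the field names and tie-breaking rule are assumptions for illustration only.

```python
def prioritize(tests):
    """Order tests so likely fault-finders run first.

    tests: list of dicts with keys 'name', 'runs', 'failures', 'runtime_s'.
    Returns test names sorted by historical failure rate (highest first),
    breaking ties by runtime (cheapest first), so faults surface early.
    """
    def score(t):
        rate = t["failures"] / t["runs"] if t["runs"] else 0.0
        # Negate the rate so Python's ascending sort puts high rates first.
        return (-rate, t["runtime_s"])

    return [t["name"] for t in sorted(tests, key=score)]
```

A test with no execution history gets a failure rate of zero here; a real scheduler for an evolving system would more likely treat unseen tests as high-priority, since their behavior under the current model version is unknown.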
Through this collaboration, researchers have been able to develop methods for testing ML and AI solutions and evolving systems, and to use AI and ML to improve and automate development and testing.
Most of the research and tools developed in the IVVES project are made available to the public on the IVVES learning platform.