Machine learning-assisted performance testing
2019 (English) In: ESEC/FSE 2019 - Proceedings of the 2019 27th ACM Joint Meeting European Software Engineering Conference and Symposium on the Foundations of Software Engineering, Association for Computing Machinery, Inc., 2019, p. 1187-1189. Conference paper, Published paper (Refereed)
Abstract [en]
Automated testing activities such as automated test case generation reduce human effort and cost and can positively impact test coverage. If the optimal policy, i.e., the course of actions to adopt, for performing the intended test activity could be learnt by the testing system, i.e., a smart tester agent, then the learnt policy could be reused in analogous situations, leading to even greater efficiency in terms of required effort. Performance testing under stress execution conditions, i.e., stress testing, which involves applying extreme test conditions to find the performance breaking points, remains a challenge, particularly for complex software systems. Common approaches to generating stress test conditions rely on source code analysis, system model analysis, or use-case-based design. However, source code or precise system models might not be readily available for testing. Moreover, building a precise performance model is often difficult, particularly for complex systems. In this research, I have used model-free reinforcement learning to build a self-adaptive autonomous stress testing framework that is able to learn the optimal policy for stress test case generation without having a model of the system under test. The conducted experimental analysis shows that the proposed smart framework is able to generate stress test conditions for different software systems efficiently and adaptively, without access to performance models.
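For illustration only (not part of the original record), the following is a minimal sketch of the kind of model-free, Q-learning-style loop the abstract alludes to: an agent selects stress actions (here, CPU and memory load levels), observes the resulting response time of the system under test, and is rewarded for approaching an assumed breaking-point threshold. All function names, thresholds, and the simulated load model below are hypothetical assumptions, not the paper's actual framework or API.

```python
# Hypothetical Q-learning loop for stress test condition generation.
# apply_stress, measure semantics, and RESPONSE_TIME_SLO are illustrative
# placeholders; the actual framework in the paper is not reproduced here.
import random
from collections import defaultdict

ACTIONS = [(cpu, mem) for cpu in (0.25, 0.5, 0.75, 1.0)
                       for mem in (0.25, 0.5, 0.75, 1.0)]
RESPONSE_TIME_SLO = 2.0          # assumed breaking-point threshold (seconds)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

q_table = defaultdict(float)     # (state, action) -> estimated value

def apply_stress(cpu_load, mem_load):
    """Placeholder for driving a load generator against the SUT;
    returns a simulated response time."""
    return random.uniform(0.1, 1.5) * (0.5 + cpu_load + mem_load)

def discretize(response_time):
    """Map a continuous response time onto a coarse state bucket."""
    return min(int(response_time / 0.5), 10)

def reward(response_time):
    """Higher reward the closer the SUT is pushed to the assumed SLO."""
    return -abs(RESPONSE_TIME_SLO - response_time)

def choose_action(state):
    """Epsilon-greedy selection over the stress actions."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table[(state, a)])

state = discretize(apply_stress(0.25, 0.25))
for episode in range(500):
    action = choose_action(state)
    rt = apply_stress(*action)
    next_state = discretize(rt)
    # Standard model-free Q-learning update: no model of the SUT is required.
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    q_table[(state, action)] += ALPHA * (reward(rt) + GAMMA * best_next
                                         - q_table[(state, action)])
    state = next_state
```

In such a formulation, the learnt Q-table (or, more generally, the learnt policy) can be reused when testing analogous systems, which is the efficiency argument made in the abstract.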
Place, publisher, year, edition, pages: Association for Computing Machinery, Inc., 2019. p. 1187-1189
Keywords [en]
Autonomous testing, Performance testing, Reinforcement learning, Stress testing, Test case generation, Machine learning, Automated test case generation, Automated testing, Case based design, Complex software systems, Experimental analysis, Software testing
National Category
Natural Sciences
Identifiers URN: urn:nbn:se:ri:diva-39924 DOI: 10.1145/3338906.3342484 Scopus ID: 2-s2.0-85071935558 ISBN: 9781450355728 (print) OAI: oai:DiVA.org:ri-39924 DiVA, id: diva2:1361872
Conference 27th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering, ESEC/FSE 2019, 26 August 2019 through 30 August 2019
Available from: 2019-10-17 Created: 2019-10-17 Last updated: 2021-01-21 Bibliographically approved