DOI: 10.1145/3350768.3351798

Runtime Monitoring of Behavioral Properties in Dynamically Adaptive Systems

Published: 23 September 2019

ABSTRACT

A Dynamically Adaptive System (DAS) enables adaptations at runtime based on context information. A DAS can be developed following the same approach used in Dynamic Software Product Lines (DSPL): software engineers design the behavioral adaptations of the DAS by modeling context-aware features, which can be activated or deactivated at runtime. In our previous work, we proposed a model checking technique to verify behavioral properties in the specification of a DAS adaptation at design time. However, since this kind of system deals with reconfiguration at runtime, the inherent dynamism of context information and defects in the adaptation mechanism may cause unexpected behaviors, such as the incorrect activation of adaptation rules. Runtime monitoring activities are therefore necessary to ensure that properties remain satisfied during system execution. In this paper, we address this issue by proposing an approach that verifies behavioral properties with a monitor framework during system operation. To evaluate the approach, we performed a proof of concept with two mobile DAS in which faults were previously injected into the source code. The injected faults were successfully detected when they turned into failures at runtime.
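The idea of checking a behavioral property over context changes at runtime can be sketched as follows. This is an illustrative sketch only: the property, the feature names, and the injected fault below are hypothetical examples, not the monitor framework or the subject systems used in the paper.

```python
# Minimal sketch of a runtime property monitor for a DAS (illustrative;
# the property, features, and adaptation rule are hypothetical).

class PropertyMonitor:
    """Checks a behavioral property after every context change."""

    def __init__(self, prop, description):
        self.prop = prop              # predicate over (context, active_features)
        self.description = description
        self.violations = []

    def observe(self, context, active_features):
        # Record a violation instead of raising, so that monitoring
        # does not disturb the running system.
        if not self.prop(context, active_features):
            self.violations.append((dict(context), set(active_features)))

# Hypothetical property: the "HighAccuracyGPS" feature must be
# deactivated whenever the battery level drops below 20%.
monitor = PropertyMonitor(
    prop=lambda ctx, feats: ctx["battery"] >= 20 or "HighAccuracyGPS" not in feats,
    description="HighAccuracyGPS must be off on low battery",
)

def faulty_adaptation(context, active_features):
    # Injected fault: the rule checks the wrong threshold (2 instead of 20),
    # so the feature wrongly stays active on low battery.
    if context["battery"] < 2:
        active_features.discard("HighAccuracyGPS")

features = {"HighAccuracyGPS", "MapView"}
for battery in (80, 50, 15):          # simulated context changes at runtime
    context = {"battery": battery}
    faulty_adaptation(context, features)
    monitor.observe(context, features)

print(len(monitor.violations))        # the low-battery step is flagged
```

The monitor only observes the running system: the injected defect stays silent under normal contexts (battery at 80% and 50%) and is detected only when it turns into a failure, here when the battery reaches 15% and the property is violated.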


Published in
      SBES '19: Proceedings of the XXXIII Brazilian Symposium on Software Engineering
      September 2019
      583 pages
      ISBN:9781450376518
      DOI:10.1145/3350768

Copyright © 2019 ACM

Publisher: Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article
      • Research
      • Refereed limited

      Acceptance Rates

SBES '19 paper acceptance rate: 67 of 153 submissions, 44%. Overall acceptance rate: 147 of 427 submissions, 34%.
