Canopus: A Domain-Specific Modeling Language for Performance Testing

  • Maicon Bernardino PUCRS
  • Avelino Francisco Zorzo PUCRS
DOI: https://doi.org/10.5753/sbqs.2017.15120

Abstract


Despite all the efforts to reduce the cost of the testing phase in software development, it remains one of the most expensive phases. To further minimize those costs, in this paper we propose a Domain-Specific Language (DSL), built on top of the MetaEdit+ language workbench, to model performance testing for Web applications. Our DSL, called Canopus, was developed in the context of a collaboration between our university and a Technology Development Laboratory of an Information Technology (IT) company. This paper presents an overview of Canopus, including its metamodels, its domain analysis, and a process that integrates Canopus into Model-Based Testing, and reports its application to an industrial case study. Furthermore, we carried out a controlled empirical experiment to evaluate the effort (time spent) of Canopus compared with another approach widely used by industry: UML.

References

Bernardino, M., Rodrigues, E., and Zorzo, A. (2016a). Performance Testing Modeling: an empirical evaluation of DSL and UML-based approaches. In 31st ACM Symposium on Applied Computing, pages 1660–1665, New York, NY, USA. ACM.

Bernardino, M., Rodrigues, E., Zorzo, A., and Marchezan, L. (2017). A Systematic Mapping Study on Model-Based Testing: Tools and Models. IET Software.

Bernardino, M., Zorzo, A., and Rodrigues, E. (2016b). Canopus: A Domain-Specific Language for Modeling Performance Testing. In 9th International Conference on Software Testing, Verification and Validation, pages 157–167, Washington, DC, USA. IEEE.

Bernardino, M., Zorzo, A. F., Rodrigues, E., de Oliveira, F. M., and Saad, R. (2014). A Domain-Specific Language for Modeling Performance Testing: Requirements Analysis and Design Decisions. In 9th International Conference on Software Engineering Advances, pages 609–614, Wilmington, DE, USA. IARIA.

Bertolino, A. (2007). Software Testing Research: Achievements, Challenges, Dreams. In Future of Software Engineering, pages 85–103, Washington, DC, USA. IEEE.

El Ariss, O., Xu, D., Dandey, S., Vender, B., McClean, P., and Slator, B. (2010). A Systematic Capture and Replay Strategy for Testing Complex GUI Based Java Applications. In 7th International Conference on Information Technology: New Generations, pages 1038–1043, Washington, DC, USA. IEEE.

Erdweg, S., et al. (2013). The State of the Art in Language Workbenches. In Erwig, M., Paige, R., and Wyk, E., editors, Software Language Engineering, volume 8225, pages 197–217. Springer International Publishing.

Kelly, S. and Tolvanen, J.-P. (2007). Domain-Specific Modeling: Enabling Full Code Generation. John Wiley & Sons, New York, NY, USA.

Oates, B. J. (2006). Researching Information Systems and Computing. SAGE Publications, London, UK.

Utting, M. and Legeard, B. (2006). Practical Model-Based Testing: A Tools Approach. Morgan Kaufmann, San Francisco, CA, USA.

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., and Regnell, B. (2012). Experimentation in Software Engineering. Springer-Verlag, Berlin, Germany, 1st edition.

Woodside, M., Franks, G., and Petriu, D. C. (2007). The Future of Software Performance Engineering. In Future of Software Engineering, pages 171–187, Washington, DC, USA. IEEE.

Wynne, M. and Hellesøy, A. (2012). The Cucumber Book: Behaviour-Driven Development for Testers and Developers. The Pragmatic Bookshelf.

Yang, Y., He, M., Li, M., Wang, Q., and Boehm, B. (2008). Phase Distribution of Software Development Effort. In 2nd ACM-IEEE International Symposium on Empirical Software Engineering and Measurement, pages 61–69, New York, NY, USA. ACM.

Yin, R. (2013). Case Study Research: Design and Methods. SAGE Publications, London, UK, 5th edition.
Published
28/08/2017
BERNARDINO, Maicon; ZORZO, Avelino Francisco. Canopus: A Domain-Specific Modeling Language for Performance Testing. In: PROCEEDINGS OF THE BRAZILIAN SYMPOSIUM ON SOFTWARE QUALITY (SBQS), 16., 2017, Rio de Janeiro. Porto Alegre: Sociedade Brasileira de Computação, 2017. p. 400–414. DOI: https://doi.org/10.5753/sbqs.2017.15120.