Covering User-Defined Data-flow Test Requirements Using Symbolic Execution

  • Marcelo Medeiros Eler (USP)
  • André Takeshi Endo (UTFPR)
  • Vinícius Durelli (USP)

Abstract

Symbolic execution has been used in software testing as an effective technique for generating test data automatically. Most approaches consider only control-flow criteria and aim to test the whole function or program. Testers, however, may want to use data-flow criteria and cover only specific requirements. This paper presents an approach for generating test data that covers user-defined test requirements derived from both data-flow and control-flow criteria. A prototype was implemented to generate test data for Java programs and to carry out a preliminary evaluation of the approach. The results, although obtained in a limited context, are encouraging and motivate further experiments.
Keywords: Data-flow, Symbolic, Requirements
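
As an illustration of the kind of requirement the approach targets, the sketch below shows a hypothetical Java method (not taken from the paper) with a definition-use association for the variable max and a concrete input that covers it; under a data-flow criterion such as all-uses, a symbolic executor would derive such an input by solving the path condition of a path linking the definition to the use.

    // Hypothetical illustration (not from the paper): the all-uses criterion
    // requires covering each definition-use association, e.g., the definition
    // of "max" inside the first branch and its use in the return statement.
    public class DefUseExample {

        static int maxOfThree(int a, int b, int c) {
            int max = c;             // definition of max
            if (a > b && a > c) {
                max = a;             // redefinition of max
            } else if (b > c) {
                max = b;             // redefinition of max
            }
            return max;              // use of max
        }

        public static void main(String[] args) {
            // Input chosen (e.g., by solving the path condition a > b && a > c)
            // to cover the association between "max = a" and "return max".
            System.out.println(maxOfThree(5, 3, 1)); // prints 5
        }
    }

Restricting generation to a user-selected subset of such associations, rather than exploring every program path, is the scenario the abstract describes.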

Published
04/08/2014
How to Cite

ELER, Marcelo Medeiros; ENDO, André Takeshi; DURELLI, Vinícius. Covering User-Defined Data-flow Test Requirements Using Symbolic Execution. In: SIMPÓSIO BRASILEIRO DE QUALIDADE DE SOFTWARE (SBQS), 13., 2014, Blumenau. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2014. p. 16-30. DOI: https://doi.org/10.5753/sbqs.2014.15241.