A Method to Determine the Complexity of a Microtask in Crowdsourcing Environments

Authors

  • William Simão de Deus
  • José Augusto Fabri
  • Alexandre L'Erario

DOI:

https://doi.org/10.5753/isys.2018.375

Keywords:

Microtasks, Complexity, Crowdsourcing Software Development, Software Complexity

Abstract

Crowdsourcing (CS) software development outsources design, development, and testing tasks, known as microtasks, to an anonymous group of participants. The use of microtasks in CS environments has been shown to reduce project costs and duration, which underscores their importance. Despite this, the literature still lacks an effective way to determine how difficult a microtask is to execute. The objective of this work is therefore to present a method capable of evaluating the complexity of a microtask. To this end, a method for determining complexity based on the characteristics of a microtask was developed. The method was evaluated through a case study on a CS software development platform. The results showed that the method was effective, reaching close to 90% accuracy in classifying the microtasks of the analyzed sample as simple or complex. In addition, the method showed that simpler microtasks attained higher registration and submission rates than complex microtasks.
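
To make the idea of characteristic-based classification concrete, here is a minimal sketch (in Python) of how such a scheme could look. It is purely illustrative: the characteristics, weights, and threshold below are assumptions chosen for the example, not the method proposed and evaluated in the paper.

from dataclasses import dataclass

@dataclass
class Microtask:
    description_words: int     # length of the task description
    required_skills: int       # number of distinct skills requested
    estimated_hours: float     # effort estimate published with the task
    deliverables: int          # number of artifacts to deliver (code, tests, docs)

def complexity_score(task: Microtask) -> float:
    # Weighted sum of normalized characteristics (weights are illustrative).
    return (0.2 * min(task.description_words / 500, 1.0)
            + 0.3 * min(task.required_skills / 5, 1.0)
            + 0.3 * min(task.estimated_hours / 40, 1.0)
            + 0.2 * min(task.deliverables / 4, 1.0))

def classify(task: Microtask, threshold: float = 0.5) -> str:
    # Label a microtask 'complex' once its score crosses the cut-off.
    return "complex" if complexity_score(task) >= threshold else "simple"

example = Microtask(description_words=320, required_skills=4,
                    estimated_hours=24.0, deliverables=2)
print(classify(example))  # prints 'complex' under these illustrative weights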

References

Abhinav, K., Dubey, A., Jain, S., Bhatia, G. K., McCartin, B., and Bhardwaj, N. (2018). Crowdassistant: A virtual buddy for crowd worker. In 2018 IEEE/ACM 5th International Workshop on Crowd Sourcing in Software Engineering (CSI-SE), pages 17–20. doi: https://doi.org/10.1145/3195863.3195865

Aipe, A. and Gadiraju, U. (2018). SimilarHITs: Revealing the role of task similarity in microtask crowdsourcing. In Proceedings of the 29th ACM Conference on Hypertext and Social Media (HT ’18).

Chandra, G., Gupta, D. L., and Malviya, K. (2012). Some observations based on comparison of MOOD and CK software metrics suites for object oriented systems. International Journal of Computer Science and Technology, 3(3).

Deus, W. S., Barros, R. M., and L’Erario, A. (2016). Um modelo para o gerenciamento do crowdsourcing em projetos de software [A model for managing crowdsourcing in software projects]. In I Workshop sobre Aspectos Sociais, Humanos e Econômicos de Software (WASHES’2016).

Deus, W. S., Fabri, J. A., and L’Erario, A. (2017). The management of crowdsourcing software projects: A systematic mapping. In 2017 12th Iberian Conference on Information Systems and Technologies (CISTI), pages 1–7. doi: https://doi.org/10.23919/CISTI.2017.7975711

de Freitas Junior, M., Fantinato, M., and Sun, V. (2015). Improvements to the function point analysis method: A systematic literature review. IEEE Transactions on Engineering Management, 62(4):495–506. doi: https://doi.org/10.1109/TEM.2015.2453354

Dubey, A., Abhinav, K., Taneja, S., Virdi, G., Dwarakanath, A., Kass, A., and Kuriakose, M. S. (2016). Dynamics of software development crowdsourcing. In 2016 IEEE 11th International Conference on Global Software Engineering (ICGSE), pages 49–58. doi: https://doi.org/10.1109/ICGSE.2016.13

Dwarakanath, A., Chintala, U., Shrikanth, N. C., Virdi, G., Kass, A., Chandran, A., Sengupta, S., and Paul, S. (2015). Crowd build: A methodology for enterprise software development using crowdsourcing. In 2015 IEEE/ACM 2nd International Workshop on CrowdSourcing in Software Engineering, pages 8–14. doi: https://doi.org/10.1109/CSI-SE.2015.9

Göke, N. and Freitag, E. (2014). Microtask platforms: a win/win/win situation. In Collective Intelligence.

Greicius, T. (2018). Multi-planet system found through crowdsourcing. NASA. Available at: https://www.nasa.gov/feature/jpl/multi-planet-system-found-through-crowdsourcing.

Hosseini, M., Phalp, K., Taylor, J., and Ali, R. (2014). The four pillars of crowdsourcing: A reference model. In 2014 IEEE Eighth International Conference on Research Challenges in Information Science (RCIS), pages 1–12. doi: https://doi.org/10.1109/RCIS.2014.6861072

Howe, J. (2006a). Crowdsourcing: A definition. https://crowdsourcing.typepad.com/cs/2006/06/crowdsourcing_a.html. Accessed: January 2019.

Howe, J. (2006b). The rise of crowdsourcing. Wired Magazine, 14(6):1–4. https://www.wired.com/2006/06/crowds/. Accessed: January 2019.

Jacques, J. T. (2018). Microtask Design: Value, Engagement, Context, and Complexity. PhD thesis, University of Cambridge. doi: http://dx.doi.org/10.17863/CAM.18777

Karim, M. R., Yang, Y., Messinger, D., and Ruhe, G. (2018). Learn or earn? Intelligent task recommendation for competitive crowdsourced software development. In Proceedings of the 51st Hawaii International Conference on System Sciences (HICSS). doi: http://dx.doi.org/10.24251/HICSS.2018.700

Kittur, A., Smus, B., Khamkar, S., and Kraut, R. E. (2011). Crowdforge: Crowdsourcing complex work. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11, pages 43–52, New York, NY, USA. ACM. doi: https://doi.org/10.1145/2047196.2047202

Kurve, A., Miller, D. J., and Kesidis, G. (2015). Multicategory crowdsourcing accounting for variable task difficulty, worker skill, and worker intention. IEEE Transactions on Knowledge and Data Engineering, 27(3):794–809. doi: https://doi.org/10.1109/TKDE.2014.2327026

LaToza, T. D., Towne, W. B., Adriano, C. M., and van der Hoek, A. (2014). Microtask programming: Building software with a crowd. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, UIST ’14, pages 43–54, New York, NY, USA. ACM. doi: https://doi.org/10.1145/2642918.2647349

LaToza, T. D. and van der Hoek, A. (2015). A vision of crowd development. In 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering, volume 2, pages 563–566. doi: https://doi.org/10.1109/ICSE.2015.194

Mao, K., Capra, L., Harman, M., and Jia, Y. (2017). A survey of the use of crowdsourcing in software engineering. Journal of Systems and Software, 126:57 – 84. doi: https://doi.org/10.1016/j.jss.2016.09.015

Mao, K., Yang, Y., Li, M., and Harman, M. (2013). Pricing crowdsourcing-based software development tasks. In 2013 35th International Conference on Software Engineering (ICSE), pages 1205–1208. doi: https://doi.org/10.1109/ICSE.2013.6606679

Naik, N. (2016). Crowdsourcing, open-sourcing, outsourcing and insourcing software development: A comparative analysis. In 2016 IEEE Symposium on Service-Oriented System Engineering (SOSE), pages 380–385. doi: https://doi.org/10.1109/SOSE.2016.68

Nakatsu, R. T., Grossman, E. B., and Iacovou, C. L. (2014). A taxonomy of crowdsourcing based on task complexity. Journal of Information Science, 40(6):823–834. doi: https://doi.org/10.1177/0165551514550140

Sui, D., Elwood, S., and Goodchild, M. (2012). Crowdsourcing Geographic Knowledge: Volunteered Geographic Information (VGI) in Theory and Practice. Springer Publishing Company, Incorporated. doi: https://doi.org/10.1007/978-94-007-4587-2

TopCoder (2017). Projects - topcoder crowdsourcing.

Tranquillini, S., Daniel, F., Kucherbaev, P., and Casati, F. (2015). Modeling, enacting, and integrating custom crowdsourcing processes. ACM Trans. Web, 9(2). doi: https://doi.org/10.1145/2746353

Li, W., Huhns, M. N., Tsai, W.-T., and Wu, W. (2015). Crowdsourcing: Cloud-Based Software Development. Springer, first edition. doi: https://doi.org/10.1007/978-3-662-47011-4

Winkler, D., Sabou, M., Petrovic, S., Carneiro, G., Kalinowski, M., and Biffl, S. (2017). Improving model inspection with crowdsourcing. In 2017 IEEE/ACM 4th International Workshop on Crowdsourcing in Software Engineering (CSI-SE), pages 30–40. doi: https://doi.org/10.1109/CSI-SE.2017.2

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., and Wesslén, A. (2012). Experimentation in Software Engineering. Springer Science & Business Media. doi: https://doi.org/10.1007/978-3-642-29044-2

Xiao, L. and Paik, H. Y. (2014). Supporting complex work in crowdsourcing platforms: A view from service-oriented computing. In 2014 23rd Australian Software Engineering Conference, pages 11–14. doi: https://doi.org/10.1109/ASWEC.2014.11

Yang, J., Redi, J., Demartini, G., and Bozzon, A. (2016). Modeling task complexity in crowdsourcing. In Proceedings of the Fourth AAAI Conference on Human Computation and Crowdsourcing (HCOMP).

Yin, R. K. (2015). Estudo de Caso: Planejamento e Métodos [Case Study Research: Design and Methods]. Bookman Editora.

Published

2018-12-21

How to Cite

Deus, W. S. de, Fabri, J. A., & L’Erario, A. (2018). A Method to Determine the Complexity of a Microtask in Crowdsourcing Environments. ISys - Brazilian Journal of Information Systems, 11(4), 05–30. https://doi.org/10.5753/isys.2018.375

Issue

Section

Special issues articles