The Design of a Proposed Strategy for Online Crowdsourcing in Tests and Its Effect on Final Achievement, the Quality of Crowd Test Items, and Female Teachers' and Students' Perceptions towards the Proposed Strategy

Document Type: Original Article

Author

Department of Instructional Technology, Faculty of Women for Arts, Science & Education, Ain Shams University, Egypt

Abstract

The aim of the present research is to design a proposed strategy for online crowdsourcing in tests and to reveal its effect on final achievement and the quality of crowd test items, as well as to explore female students' perceptions towards the proposed strategy. The study's sample consisted of (118) third-year female students at the College of Women, Ain Shams University, in the academic year 2021-2022. The students were divided into two groups: an experimental group that studied with the online crowdsourcing-in-tests strategy and a control group that studied without crowdsourcing. The results revealed an increase in the experimental group's post-test final achievement and in its academic achievement gain. Compared with the control group, the experimental group reached the (90/80) mastery level, and the proposed strategy had a significant effect size in raising the experimental group's post-test achievement. Regarding the quality of the crowd test items, (74%) and (97%) of the items met the accepted statistical standards for difficulty and discrimination respectively. The results also showed that (56%) of the female students in the experimental group held positive perceptions towards crowdsourcing in tests, whereas (39%) held positive perceptions towards teacher-created tests; (100%) held positive perceptions towards the individual activity of writing questions and towards discussing questions within each group; (81%) held positive perceptions towards reviewing crowd questions among groups, while (19%) held negative perceptions; (71%) held positive perceptions towards the group review conducted by all crowd members, while (29%) held negative perceptions; and (100%) held positive perceptions towards the inclusion of questions in the crowd test.
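For context on the item-quality criteria: the abstract does not state the exact cutoffs applied, so the following is a sketch of the standard item-analysis conventions, not the study's stated method. The difficulty index of item $i$ is the proportion of the $N$ examinees who answer it correctly,

$$P_i = \frac{R_i}{N},$$

where $R_i$ is the number of correct responses to item $i$; values roughly between $0.30$ and $0.70$ are commonly treated as acceptable. The discrimination index is the difference in difficulty between the upper- and lower-scoring groups (typically the top and bottom 27% of examinees ranked by total score),

$$D_i = P_i^{U} - P_i^{L},$$

with $D_i \ge 0.20$ commonly treated as acceptable.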
