Title: Learning Analytics and High School Adoption
Author Name: Vicki Griffin
1. Introduction – The concept of learning analytics has been in use for some time. Many industries, such as banking, health, insurance, aviation, telecommunications, and entertainment, have enjoyed the advantages of leveraging and gaining insights from the analysis of large-scale data (Kiron, Shockley, Kruschwitz, Finch & Haydock, 2012; Manyika et al., 2011; Siemens, 2013). The use of big data has transformed practice across these sectors, from optimizing flight plans to creating predictive health insurance models (Joksimović, Kovanović, & Dawson, 2019). The education sector, however, has been slow to catch up in utilizing the vast array of data generated while students progress through grades K-12. This lack of adoption of data collection and analysis holds particularly true for the high school grades (Joksimović, Kovanović, & Dawson, 2019; Cechinel et al., 2020; Ifenthaler, 2021).
In 2011, a group of educational researchers hosted the First International Learning Analytics and Knowledge (LAK'11) Conference in Banff, Canada. A goal of this group was to define and scope the emergent research that focused on understanding student learning using machine learning, data mining, and data visualization methods. The Society for Learning Analytics Research (SoLAR) was formed, and learning analytics was defined as the “measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens, Long, Gašević, & Conole, 2011, para. 4).
In an educational setting, learning analytics uses student learning data that is already available to establish early indicators of academic performance and student attrition. This, in turn, gives instructors an opportunity to provide early interventions to retain students and improve the quality of their work (Joksimović, Kovanović, & Dawson, 2019).
2. Overview of the Case – Over the last few years, technology has not only become an essential tool in helping to create a more effective and supportive educational experience for students; online learning environments (e.g., learning management systems, student diaries, library systems, digital repositories, and academic systems) have also created a vast repository of data that instructors and researchers can access to learn more about the educational experience (Gaftandzhieva et al., 2021). Digital footprints from these sources can provide invaluable informational solutions to teaching and learning practices in order to achieve better student success (Varanasi et al., 2018) and lend support to teachers’ practices (Jivet et al., 2018).
The use of learning analytics can provide information that benefits students, teachers, administrators, and institutions. The breadth of student data that is stored, such as demographic information, grades, and student behavior, increases the possibilities for strategies framed around student retention and increased student success, moving away from the status quo and meeting each student in an individualized and data-driven way (Tan et al., 2016; Aguerrebere et al., 2017).
Learning analytics has been widely researched and adopted in early childhood classrooms (Agus and Samuri, 2018) and higher education (Leitner et al., 2017; Waheed et al., 2018; Charitopoulos et al., 2020), and it shows promising results for all educational levels; however, high schools have not adopted learning analytics in the same way as other levels of K-12 (Cechinel et al., 2020; Ifenthaler, 2021). Many challenges facing stakeholders are particular to teaching and learning at the high school level (Gaftandzhieva et al., 2021). Among the issues that learning analytics can address are high school dropout (Khalil and Ebner, 2015), difficulty with peer collaboration (Berland et al., 2015), development of scientific writing and argumentation skills (Lee et al., 2019; Palermo and Wilson, 2020), and the development of computational thinking, an emerging ability in this age group (Grover et al., 2017). Learning analytics can also provide teachers with support in understanding student practices and student motivational levels (Quigley et al., 2017; Aluja-Banet et al., 2019). Furthermore, administrators can use learning analytics to identify students who are at risk of not graduating on time (Aguiar et al., 2015; Jimenez-Gomez et al., 2015) and to develop curricula that meet the needs and expectations of students (Monroy et al., 2013).
Bruno et al. (2021) conducted a systematic literature review focusing on the application of learning analytics in high schools, with the specific aim of providing a broad description of the main approaches, educational goals, techniques, and challenges related to learning analytics in high schools. For this e-book chapter, one research question will be analyzed: “What evidence, if any, shows that learning analytics improves the performance of students in high school?”
Table 1 indicates the number of articles per country, as derived by the first author of the article (Bruno). Table 2 presents the evidence from the articles that learning analytics improves high school student performance. Table 3 indicates that analyses of students’ learning outcomes and students’ learning processes were the most important goals in this context.
Table 1. Number of articles per country
Country | Articles
United States | 18
Netherlands, Brazil | 2
Austria, Bulgaria, China, Estonia, Germany, India, Indonesia, Italy, Jordan, Pakistan, Romania, Singapore, South Africa, Spain, Taiwan, Turkey, United Kingdom, and Uruguay | 1
Total | 42
(Bruno et al., 2021)
Table 2. Evidence that learning analytics improves high school student performance
Evidence | Number of articles (%)
No evidence | 25 (59.52%)
Positive with empirical evidence | 8 (19.05%)
Positive without empirical evidence | 9 (21.43%)
Negative with empirical evidence | 0 (0%)
Negative without empirical evidence | 0 (0%)
Total | 42 (100%)
(Bruno et al., 2021)
Table 3. Main educational goals in using learning analytics in high schools
Goal | Number of articles (%)
Predict and enhance students’ learning outcomes | 18 (42.85%)
Analyze students’ learning processes | 11 (26.19%)
Support teachers’ decisions and reflection | 5 (11.91%)
Support writing activities | 3 (7.14%)
Other | 5 (11.91%)
Total | 42 (100%)
(Bruno et al., 2021)
3. Solutions Implemented – The main goals of applying learning analytics in high school were to predict student dropout and to predict and enhance student learning or success outcomes. One aim was to predict students’ grades in an effort to provide timely and individualized support (Blasi, 2017).
Another commonly cited educational goal was to analyze the learning processes of students. Most works aligned to this goal investigated students’ participation in assessments and various educational games using log data.
A third common goal was the support of writing activities (Palermo and Wilson, 2020). The iSTART tool, for example, provides formative feedback on written assessments (Allen et al., 2017). Allen et al. suggested that dynamic visualization and analyses can serve as a step toward more adaptive educational technologies for literacy. This method provides a firm initial foundation because it demonstrates the feasibility of measures for modeling student performance (Allen et al., 2017).
4. Outcomes – Wandera et al. (2019) used several models to predict school pass rates to support administrative decision-making. Similarly, performance prediction models developed by Aguiar et al. (2015) helped schools allocate limited resources more efficiently to those students most in need of help and target intervention programs to match those students’ particular needs.
Additionally, learning analytics have been used to predict high school dropout (Lakkaraju et al., 2015; Filho and Adeodato, 2019; Baker et al., 2020), to enable early interventions with middle school students at risk of failing (Jimenez-Gomez et al., 2015), and to assist education department administrators and policymakers in predicting the numbers of students graduating from and dropping out of high school (Yousafzai et al., 2020).
Student self-reports and activity logs were collected from 25 classes at a high school in northern Taiwan, with the aim of applying supervised and unsupervised lag sequential analysis (LSA) to examine students’ learning processes (Wen et al., 2018). Log data were also used by Grover et al. (2017) and Manske and Hoppe (2016) to evaluate students’ computational thinking skills and the Go-Lab portal, respectively. The primary goal in both studies was to assist students in reflecting on personal knowledge building by visualizing log data. In a novel approach, Ruiperez-Valiente and Kim (2020) analyzed the behavior of solo and collaborative student groups engaged in educational games, using log data captured inside the game to evaluate the differences between students’ interactions in these two profiles.
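The core of lag sequential analysis of the kind Wen et al. applied can be illustrated with a minimal sketch: count lag-1 transitions between coded behaviors and compute adjusted residuals (z-scores) to flag transitions that occur more often than chance. The event codes and log below are invented for illustration, not Wen et al.'s data.

```python
from collections import Counter
import math

def lag_sequential_analysis(events):
    """Lag-1 sequential analysis: for each observed pair of behavior codes
    (a, b), compute the adjusted residual (z-score) testing whether b
    follows a more often than expected by chance."""
    transitions = Counter(zip(events, events[1:]))
    n = len(events) - 1  # total number of lag-1 transitions
    given = Counter(a for a, _ in transitions.elements())   # row totals
    target = Counter(b for _, b in transitions.elements())  # column totals
    z = {}
    for (a, b), observed in transitions.items():
        expected = given[a] * target[b] / n
        variance = expected * (1 - given[a] / n) * (1 - target[b] / n)
        z[(a, b)] = (observed - expected) / math.sqrt(variance) if variance > 0 else 0.0
    return z

# Hypothetical coded log: R = read, S = simulate, T = test, H = help-seek
log = list("RSTRSTRSHRSTRSTHRST")
zscores = lag_sequential_analysis(log)
# Transitions with z > 1.96 occur significantly more often than chance
significant = sorted(k for k, v in zscores.items() if v > 1.96)
```

In practice each transition diagram is read as a behavioral pattern (here, "read" reliably followed by "simulate"), which is the basis for characterizing learning processes from logs.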
Bruno et al. (2021) also identified articles related to real-time adjustable feedback (Lee et al., 2019), the analysis and classification of student sentiments towards educational processes (Marcu and Danubianu, 2020), and a direct mapping between the learning traces usually gathered for learning analytics and a theoretically grounded model of cognition (Seitlinger et al., 2020).
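The sentiment-classification idea can be conveyed with a toy lexicon-based scorer; this is only a sketch of the concept, not the pipeline Marcu and Danubianu used, and the word lists and comments are invented:

```python
# Tiny, hypothetical polarity lexicons; real systems use trained classifiers
POSITIVE = {"helpful", "clear", "engaging", "useful", "fun"}
NEGATIVE = {"boring", "confusing", "difficult", "slow", "unfair"}

def sentiment_score(comment):
    """Count positive minus negative lexicon hits in one feedback comment."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

comments = [
    "the lessons were clear and engaging",
    "homework felt boring and confusing",
]
labels = ["positive" if sentiment_score(c) > 0 else "negative" for c in comments]
```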
Table 4 shows the data analysis approaches high schools used in the adoption of learning analytics. Most of the applications relate to visualization (the distillation of data for human judgment), prediction, and relationship mining.
Table 4. Main data analysis approaches used in studies on learning analytics in high schools
Approach | Number of articles (%)
Distillation of data for human judgment | 15 (35.71%)
Prediction | 11 (26.19%)
Relationship mining | 9 (21.43%)
Discovery with models | 4 (9.53%)
Clustering | 3 (7.14%)
Total | 42 (100%)
(Bruno et al., 2021)
Visualization is one of the main topics of learning analytics and of research in general. In high school, there are several categories of visualization to support different stakeholders. Chen (2020) suggested the Visual Learning Analytics (VLA) approach, which combines the perspectives of learning analytics and visual analytics to understand education. This approach was applied to a video-based professional development program for teachers, comparing how conventional knowledge-based workshops and hands-on VLA-based workshops influenced teachers’ beliefs about the usefulness of classroom talk (based on the Academically Productive Talk approach), their self-efficacy in guiding classroom talk, and their actual enactment of dialog in the classroom. Results of this study showed that the hands-on VLA-based approach was more effective and improved teacher methodology in developing dialogic teaching (Chen, 2020).
Teachers found support in a proposed dashboard visualization for adapting their lesson plans and instruction to improve student performance (Admiraal et al., 2020). Similar dashboard visualizations were proposed by Papamitsiou and Economides (2015) for increasing student awareness during assessment through temporal learning analytics. Visualizations generated from student log data during learning activities help students and instructors interpret these data intuitively and perceive their hidden aspects. Learning analytics dashboards also provide a means of supporting the feedback process (Lee et al., 2019).
Predictions from learning analytics primarily focus on two areas: predicting at-risk students and predicting student learning outcomes. For example, Baker et al. (2020) analyzed data that included student attendance, grades (and their changes), courses taken, and disciplinary records. Using a regression algorithm, they were able to predict dropout of high school students; the model achieved an area under the ROC curve (AUC) of 0.76. The authors found that the most predictive features included the number of dress code violations, the number of in-school suspensions, and the standard deviation of grades in the current semester. Similarly, Aguiar et al. (2015) used random forest and logistic regression models for early prediction of high school dropout and stated that the same models can be used to intervene with these students for a more successful outcome (Aguiar et al., 2015).
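The shape of such a prediction pipeline can be sketched in standard-library Python: fit a logistic regression by gradient descent and score it with the rank-sum formula for AUC. Everything below is illustrative; the synthetic features merely echo the kinds of predictors Baker et al. identified and do not reproduce their data, model, or reported AUC.

```python
import math
import random

def sigmoid(x):
    # Overflow-safe logistic function
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    ex = math.exp(x)
    return ex / (1.0 + ex)

def train_logistic(X, y, lr=0.1, epochs=200):
    """Plain stochastic-gradient-descent logistic regression."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank-sum formula."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic students: (dress-code violations, in-school suspensions, grade SD)
random.seed(0)
def make_student(dropout):
    base = 2.0 if dropout else 0.5
    return [random.gauss(base, 1.0), random.gauss(base, 1.0), random.gauss(base, 0.5)]

X = [make_student(0) for _ in range(100)] + [make_student(1) for _ in range(100)]
y = [0] * 100 + [1] * 100
w, b = train_logistic(X, y)
scores = [sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) for xi in X]
model_auc = auc(scores, y)
```

An AUC of 0.5 means chance-level ranking of dropouts versus non-dropouts and 1.0 means perfect ranking, which is why Baker et al.'s 0.76 on real, messier data counts as a usable early-warning signal.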
5. Implications – It is important to note that there are significant contextual differences between high school, other K-12 levels, higher education, and MOOCs. Another consideration is that high school students are usually below the age of 18 (unlike students in higher education and, in many cases, MOOCs and professional learning environments). This raises ethical questions, concerns, and data-protection needs associated with learning analytics. Additionally, high school instructors’ technology backgrounds are usually not the same as those of university professors. Finally, data collected from high schools do not involve many interactions with learning management systems or MOOC platforms, both of which are primary sources of data for learning analytics (Bruno et al., 2021).
Other considerations for learning analytics are Internet connectivity (Berland et al., 2015) and issues related to the number of devices connected in a classroom (Monroy et al., 2013).
In general, learning analytics applications should focus on the learning process and not only on the learning outcome (Joksimovic et al., 2019). For example, to use learning analytics to promote feedback at the process level (Hattie and Timperley, 2007), it is necessary to identify learning processes, not just outcomes, from the data available in schools. Therefore, studies in high schools should adopt techniques such as social network analysis (Knoke and Yang, 2019), epistemic network analysis (Shaffer et al., 2009), and process mining (Van Der Aalst, 2012), rather than relying on machine learning algorithms alone.
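Process-mining techniques typically start from a directly-follows graph built out of event logs, which a minimal sketch can make concrete (the activity names and traces below are hypothetical):

```python
from collections import Counter

def directly_follows(traces):
    """Build the directly-follows graph that many process-mining algorithms
    start from: edge (a, b) counts how often activity b immediately
    follows activity a within a single learner's trace."""
    dfg = Counter()
    for trace in traces:
        dfg.update(zip(trace, trace[1:]))
    return dfg

# Hypothetical clickstream traces, one list of activities per learner session
traces = [
    ["login", "read", "quiz", "feedback", "quiz"],
    ["login", "read", "quiz", "feedback"],
    ["login", "quiz", "read", "quiz", "feedback"],
]
dfg = directly_follows(traces)
```

Edges with high counts (here, every quiz attempt followed by feedback) describe the learning process itself, which is exactly the information outcome-only grade data cannot provide.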
References
Admiraal, W., Vermeulen, J., & Bulterman-Bos, J. (2020). Teaching with learning analytics: How to connect computer-based assessment data with classroom instruction? Technology, Pedagogy and Education, 29(5), 577-591.
Aguerrebere, C., Cobo, C., Gomez, M., & Mateu, M. (2017). Strategies for data and learning analytics informed national education policies: The case for Uruguay. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 449-453).
Aguiar, E., Lakkaraju, H., Bhanpuri, N., Miller, D., Yuhas, B., & Addison K. (2015). Who, when, and why. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge Conference (pp.93-102).
Agus, R., & Samuri, S. M. (2018). Learning analytics contribution in education and child development: A review on learning analytics. Asian journal of assessment in teaching and learning, 8, 36-47.
Allen, L., Perret, C., Likens, A., & McNamara, D. (2017). What’d you say again? In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 373-382).
Aluja-Banet, T., Sancho, M., & Vukic, I. (2019). Measuring motivation from the virtual learning environment in secondary education. Journal of Computational Science, 36, 100629.
Baker, R. S., Berning, A. W., Gowda, S. M., Zhang, S., & Hawn, A. (2020). Predicting K-12 dropout. Journal of Education for Students Placed at Risk (JESPAR), 25(1), 28-54.
Berland, M., Davis, D., & Smith, C. P. (2015). AMOEBA: Designing for collaboration in computer science classrooms through live learning analytics. International Journal of Computer-Supported Collaborative Learning, 10(4), 425-447.
Blasi, A. (2017). Performance increment of high school students using ANN model and SA algorithm. Journal of Theoretical & Applied Information Technology, 95(11).
Bruno, E., Alexandre, B., Ferreira Mello, R., Falcão, T. P., Vesin, B., & Gašević, D. (2021). Applications of learning analytics in high schools: A systematic literature review. Frontiers in Artificial Intelligence, 132.
Cechinel, C., Ochoa, X., Lemos dos Santos, H., Carvalho Nunes, J. B., Rodés, V., & Marques Queiroga, E. (2020). Mapping learning analytics initiatives in Latin America. British Journal of Educational Technology, 51(4), 892-914.
Charitopoulos, A., Rangoussi, M., & Koulouriotis, D. (2020). On the use of soft computing methods in educational data mining and learning analytics research: A review of years 2010–2018. International Journal of Artificial Intelligence in Education, 30(3), 371-430.
Chen, G. (2020). A visual learning analytics (VLA) approach to video-based teacher professional development: Impact on teachers’ beliefs, self-efficacy, and classroom talk practice. Computers & Education, 144, 103670.
Silva Filho, R. L., & Adeodato, P. J. (2019). Data mining solution for assessing the secondary school students of Brazilian federal institutes. In 2019 8th Brazilian Conference on Intelligent Systems (BRACIS) (pp. 574-579). IEEE.
Gaftandzhieva, S., Docheva, M., & Doneva, R. (2021). A comprehensive approach to learning analytics in Bulgarian school education. Education and Information Technologies, 26(1), 145-163.
Grover, S., Basu, S., Bienkowski, M., Eagle, M., Diana, N., & Stamper, J. (2017). A framework for using hypothesis-driven approaches to support data-driven learning analytics in measuring computational thinking in block-based programming environments. ACM Transactions on Computing Education (TOCE), 17(3), 1-25.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of educational research, 77(1), 81-112.
Ifenthaler, D. (2021). Learning analytics for school and system management. OECD Digital Education Outlook 2021 Pushing the Frontiers with Artificial Intelligence, Blockchain and Robots: Pushing the Frontiers with Artificial Intelligence, Blockchain and Robots, 161.
Jiménez-Gómez, M. Á., Luna, J. M., Romero, C., & Ventura, S. (2015, March). Discovering clues to avoid middle school failure at early stages. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 300-304).
Jivet, I., Scheffel, M., Specht, M., & Drachsler, H. (2018, March). License to evaluate: Preparing learning analytics dashboards for educational practice. In Proceedings of the 8th international conference on learning analytics and knowledge (pp. 31-40).
Joksimović, S., Kovanović, V., & Dawson, S. (2019). The journey of learning analytics. HERDSA Review of Higher Education, 6, 27-63.
Khalil, M., & Ebner, M. (2015, September). A STEM MOOC for school children—What does learning analytics tell us?. In 2015 International Conference on Interactive Collaborative Learning (ICL) (pp. 1217-1221). IEEE.
Kiron, D., Shockley, R., Kruschwitz, N., Finch, G., & Haydock, M. (2012). Analytics: The widening divide. MIT Sloan Management Review, 53(2), 1.
Knoke, D. & Yang, S. (2019). Social Network Analysis. New York, NY: Sage Publications.
Lakkaraju, H., Aguiar, E., Shan, C., Miller, D., Bhanpuri, N., Ghani, R., & Addison, K. L. (2015, August). A machine learning framework to identify students at risk of adverse academic outcomes. In Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 1909-1918).
Lee, H. S., Pallant, A., Pryputniewicz, S., Lord, T., Mulholland, M., & Liu, O. L. (2019). Automated text scoring and real‐time adjustable feedback: Supporting revision of scientific arguments involving uncertainty. Science Education, 103(3), 590-622.
Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education—a literature review. Learning analytics: Fundaments, applications, and trends, 1-23.
Manske, S., & Hoppe, H. U. (2016, July). The “concept cloud”: Supporting collaborative knowledge construction based on semantic extraction from learner-generated artefacts. In 2016 IEEE 16th International Conference on Advanced Learning Technologies (ICALT) (pp. 302-306). IEEE.
Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Hung Byers, A. (2011). Big data: The next frontier for innovation, competition, and productivity. McKinsey Global Institute.
Marcu, D., & Danubianu, M. (2020). Sentiment analysis from students’ feedback: a Romanian high school case study. In 2020 International Conference on Development and Application Systems (DAS) (pp. 204-209). IEEE.
Monroy, C., Rangel, V. S., & Whitaker, R. (2013). STEMscopes: contextualizing learning analytics in a K-12 science curriculum. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 210-219).
Palermo, C., & Wilson, J. (2020). Implementing automated writing evaluation in different instructional contexts: A mixed-methods study. Journal of Writing Research, 12(1).
Papamitsiou, Z., & Economides, A. A. (2015). Temporal learning analytics visualizations for increasing awareness during assessment. International Journal of Educational Technology in Higher Education, 12(3), 129-147.
Quigley, D., Ostwald, J., & Sumner, T. (2017, March). Scientific modeling: using learning analytics to examine student practices and classroom variation. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 329-338).
Ruipérez-Valiente, J. A., & Kim, Y. J. (2020). Effects of solo vs. collaborative play in a digital learning game on geometry: Results from a K12 experiment. Computers & Education, 159, 104008.
Seitlinger, P., Bibi, A., Uus, Õ., & Ley, T. (2020, March). How working memory capacity limits success in self-directed learning: a cognitive model of search and concept formation. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 53-62).
Shaffer, D. W., Hatfield, D., Svarovsky, G. N., Nash, P., Nulty, A., Bagley, E., … & Mislevy, R. (2009). Epistemic network analysis: A prototype for 21st-century assessment of learning. International Journal of Learning and Media, 1(2).
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380-1400.
Siemens, G., Gasevic, D., & Conole, G. (2011). Call for papers, 1st International Conference on Learning Analytics & Knowledge (LAK 2011).
Tan, J. P. L., Yang, S., Koh, E., & Jonathan, C. (2016, April). Fostering 21st century literacies through a collaborative critical reading and learning analytics environment: user-perceived benefits and problematics. In Proceedings of the sixth international conference on learning analytics & knowledge (pp. 430-434).
Van Der Aalst, W. (2012). Process mining: Overview and opportunities. ACM Transactions on Management Information Systems (TMIS), 3(2), 1-17.
Varanasi, M. R., Fischetti, J. C., & Smith, M. W. (2018). Analytics framework for K-12 school systems. In Data leadership for K-12 schools in a time of accountability (pp. 206-233). IGI Global.
Waheed, H., Hassan, S. U., Aljohani, N. R., & Wasif, M. (2018). A bibliometric perspective of learning analytics research landscape. Behaviour & Information Technology, 37(10-11), 941-957.
Wandera, H., Marivate, V., & Sengeh, M. D. (2019, November). Predicting National School Performance for Policy Making in South Africa. In 2019 6th International Conference on Soft Computing & Machine Intelligence (ISCMI) (pp. 23-28). IEEE.
Wen, C. T., Chang, C. J., Chang, M. H., Fan Chiang, S. H., Liu, C. C., Hwang, F. K., & Tsai, C. C. (2018). The learning analytics of model-based learning facilitated by a problem-solving simulation game. Instructional Science, 46(6), 847-867.
Yousafzai, B. K., Hayat, M., & Afzal, S. (2020). Application of machine learning and data mining in predicting the performance of intermediate and secondary education level student. Education and Information Technologies, 25(6), 4677-4697.