The process of using analytic data to inform instructional decision-making is acknowledged to be complex; however, the details of how it occurs in authentic teaching contexts have not been fully unpacked. This study investigated five university instructors' use of a learning analytics dashboard to inform their teaching. The existing literature was synthesized to create a template for inquiry that guided interviews, and inductive qualitative analysis was used to identify salient emergent themes in how instructors 1) asked questions, 2) interpreted data, 3) took action, and 4) checked impact. Findings showed that instructors did not always come to analytics use with specific questions, but rather with general areas of curiosity; questions additionally emerged and were refined through interaction with the analytics. Data interpretation involved two distinct activities, often accompanied by affective reactions to the data: reading the data to identify noteworthy patterns and explaining their importance in the course using contextual knowledge. Pedagogical responses to the analytics included whole-class scaffolding, targeted scaffolding, and revising course design, as well as two new non-action responses: adopting a wait-and-see posture and engaging in deep reflection on pedagogy. Findings were synthesized into a model of instructor analytics use that offers useful categories of activities for future study and support.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.