This article summarizes some emerging concerns as learning analytics are implemented throughout education. The article takes a sociotechnical perspective — positioning learning analytics as shaped by a range of social, cultural, political, and economic factors. In this manner, various concerns are outlined regarding the propensity of learning analytics to entrench and deepen the status quo, disempower and disenfranchise vulnerable groups, and further subjugate public education to the profit-led machinations of the burgeoning “data economy.” In light of these charges, the article briefly considers some possible areas of change. These include the design of analytics applications that are more open and accessible, that offer genuine control and oversight to users, and that better reflect students’ lived reality. The article also considers ways of rethinking the political economy of the learning analytics industry. Above all, learning analytics researchers need to begin talking more openly about the values and politics of data-driven analytics technologies as they are implemented along mass lines throughout school and university contexts.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0) license that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).