Data-Driven Decision Making in Educational Institutions: From Digital Dashboards to Social Theory and Institutional Change
Author: Zarina Akhmetova
Affiliation: Independent Researcher
Abstract
Data-driven decision making (DDDM) has evolved from a technical aspiration into a routine expectation for schools, colleges, and universities, which face growing pressure to demonstrate fairness, efficiency, and effectiveness in supporting student learning. Digital systems now generate vast amounts of data, including admissions profiles, assessment records, learning management system activity, student support usage, and finance and staffing information, opening new possibilities for decisions grounded in evidence rather than intuition. Yet DDDM is not a neutral upgrade: it redistributes power, redefines what counts as legitimate knowledge, and, depending on how it is governed, can either reduce or deepen inequality. This article provides a publishable, theory-based examination of DDDM in education, organised in a Scopus-style format. Drawing on Bourdieu's concepts of field, habitus, and capital, on world-systems theory, and on institutional isomorphism, it explains why institutions adopt similar analytics practices, why implementation outcomes diverge, and how "dashboard compliance" can substitute for genuine improvement. A pragmatic methodology is proposed: a hybrid, decision-oriented framework that integrates quantitative metrics, qualitative analysis, fairness assessments, and ethical governance. The analysis identifies seven decision domains: student success, teaching and learning, equity, operations, staffing, research and innovation, and institutional reputation, and discusses the typical uses, risks, and conditions for success in each. The findings show that DDDM performs best when it is goal-oriented, people-centred, and transparent, supported by high-quality data, data literacy, and protections for privacy and fairness. The conclusion proposes a "Responsible DDDM Maturity Model" that institutions can use to progress from simple reporting to decision-making systems that are ethically sound and focused on learning.
Keywords: data-driven decision making, learning analytics, educational governance, institutional change, equity, evidence-informed leadership, digital transformation
Introduction
Educational institutions have always made decisions based on evidence: test scores, teacher observations, student feedback, budgets, and community expectations. What is changing today is the scale, speed, and visibility of data. Digital education platforms, student information systems, online assessments, and administrative software generate information continuously. Leaders can now view dashboards showing trends in enrolment, course completion, attendance, student engagement, and costs, sometimes in real time.
This change isn't happening in a vacuum. Many organisations are dealing with tighter budgets, more demands for accountability, and more competition. Families and students want to know what they will get out of school. Governments and quality agencies want proof that things work and are fair. Employers want graduates who have the right skills, and schools are under pressure to keep track of how many students get jobs and how well they do. In higher education, rankings and reputation can affect applications, funding, and partnerships. Standardised accountability frameworks can affect curriculum choices and how resources are used in primary and secondary education.
In this situation, data-driven decision making (DDDM) is being talked about more and more as a solution. The promise is clear: use evidence to find out what works, give help to those who need it, cut down on waste, and get better results. Organisations are putting money into learning analytics platforms, business intelligence tools, early warning systems, and sometimes AI-based predictive models.
The reality, however, is mixed. Some institutions report better student retention, clearer resource planning, and more focused student support. Others report "dashboard fatigue," distrust among staff and students, and decisions that narrow learning to whatever is easiest to measure. Many institutions also struggle with data silos, inconsistent definitions of terms like "engagement" or "success," and ethical issues around privacy and fairness.
This article contends that DDDM in education is best understood as simultaneously a technical and a social phenomenon. It alters how problems are framed, how performance is assessed, and how authority is allocated. For DDDM to work, therefore, it needs more than technology: it also requires sound governance, a supportive culture, and a strong ethical foundation.
This article aims to present a publishable academic summary of DDDM in educational institutions, including:
A clear conceptual definition of DDDM and its principal forms.
A theoretical explanation of why DDDM spreads and why its outcomes vary.
A method institutions can use to assess and plan DDDM initiatives.
A domain-by-domain analysis of how DDDM shapes institutional decisions.
Actionable findings and a maturity model for responsible implementation.
Theoretical Framework and Background
1) What DDDM means in educational institutions
People often say that DDDM means "making decisions based on data." This phrase is simple but not very accurate because institutions don't usually make decisions based on data alone. In practice, DDDM means using both quantitative and qualitative evidence in a structured way to make decisions, keep an eye on actions, and learn from the results.
The following steps are usually part of DDDM:
Decision framing: Make the decision very clear (for example, "How can we help first-year students do better?").
Choosing indicators: Select evidence relevant to the decision, such as course performance, attendance, or how often students use advising.
Data collection and quality assurance: Make sure that the data are correct, consistent, and understood in the right way.
Analysis and sense-making: Find patterns, test explanations, and draw on contextual knowledge from staff and students.
Designing actions: Choose interventions based on what works and what is possible (for example, tutoring, redesigning the curriculum, or reaching out for support).
Monitoring and evaluation: Track results, compare them to baseline data, and adjust as needed (a minimal sketch of this step follows the list).
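To make the monitoring step concrete, here is a minimal sketch in Python, assuming an invented tutoring intervention and hypothetical pass/fail records; a real evaluation would also control for cohort differences and triangulate with qualitative evidence:

```python
# Minimal sketch of monitoring and evaluation: compare an intervention
# cohort's pass rate against a pre-intervention baseline.
# All cohorts and figures below are hypothetical.

from statistics import mean

baseline_pass = [0, 1, 1, 0, 1, 1, 0, 1]      # pre-intervention cohort (1 = passed)
intervention_pass = [1, 1, 1, 0, 1, 1, 1, 1]  # cohort after a tutoring outreach

baseline_rate = mean(baseline_pass)
intervention_rate = mean(intervention_pass)

print(f"Baseline pass rate:     {baseline_rate:.0%}")
print(f"Intervention pass rate: {intervention_rate:.0%}")
print(f"Change vs. baseline:    {intervention_rate - baseline_rate:+.0%}")
```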
DDDM is related to other ideas, such as learning analytics (data about courses and students), institutional research (data about organisations), educational data mining (finding patterns), and performance management (setting goals and being accountable). These approaches overlap, but DDDM focusses more on the connection between data and real decisions than on reporting for its own sake.
2) Bourdieu: how data changes power and legitimacy
Bourdieu's sociology helps explain why DDDM generates tension. Schools and colleges operate in a field: a social space where actors compete for power and recognition. Within this field, different groups hold different kinds of capital:
Cultural capital includes knowledge of a subject, teaching skills, and research credentials.
Social capital is made up of networks, alliances, and relationships with leaders and people outside the organisation.
Economic capital: the power to set budgets, control resources, and attract funding.
Symbolic capital includes things like prestige, reputation, status, and being recognised.
DDDM can change these capitals. When performance indicators are the most important thing, being able to define and understand metrics gives you power. Analytics units, quality offices, and senior leadership may gain power by deciding what to measure and how to show success. Teachers and professors may think that their professional judgement, which is a valuable form of cultural capital, is being reduced to numbers.
Bourdieu's idea of habitus is important here because staff have learnt how to think about education, quality, and fairness through training and experience. If the habitus values deep learning and professional discretion, staff might think that dashboards are too simple. DDDM may spread quickly if the habitus values efficiency and standardisation, even if it makes educational practice less rich.
To put it simply, DDDM isn't just a way to do things. It is also a fight over what is considered valid knowledge in school.
3) World-systems theory: global forces and unequal ability
World-systems theory provides a broad perspective. Education is becoming more globalised through things like international mobility, quality frameworks that work across borders, global rankings, and partnerships between countries. In this setting, institutions are pushed to use methods that show "modernity" and "quality," such as analytics and evidence-based governance.
But capacity is unevenly distributed. Well-resourced institutions can build internal data teams, integrate systems, and establish ethical governance. Institutions with fewer resources may have to rely on imported platforms and external benchmarks, which cannot always be adapted to their own missions. This can create dependency and erode institutional autonomy, because the logic of the tools and indicators may reflect external priorities more than local educational goals.
World-systems theory also helps us understand why some metrics are more important than others around the world, especially those that have to do with market reputation (rankings, employability indicators, research counts). DDDM might unintentionally make institutions focus on what the world rewards instead of what their communities need.
4) Institutional isomorphism: why DDDM looks the same in a lot of places
Institutional isomorphism explains why organisations come to resemble one another. DDDM spreads through three main channels:
Coercive pressures: regulation, accreditation, and funding requirements compel institutions to report data and demonstrate results.
Normative pressures: professional groups push analytics as the best way to do things, and training and consulting help spread common models.
Mimetic pressures: when things are uncertain, institutions copy their peers, especially those with high status, to lower risk and gain legitimacy.
This explains a common pattern: institutions quickly adopt dashboards and analytics platforms, but they don't change their data culture, ethics, or decision-making routines very much. In these situations, DDDM turns into a show instead of a way to learn.
5) Putting the theories together
These viewpoints collectively demonstrate that DDDM is influenced by:
Professional identities and internal power dynamics (Bourdieu).
Global structures of unequal power and capacity (world-systems theory).
Legitimacy pressures and the tendency to copy others (isomorphism).
So, "good DDDM" isn't just good analytics. It is a way of designing institutions that brings together evidence, values, governance, and culture.
Method
This article employs a decision-centred, conceptual-applied methodology suited to research in educational governance and management, encompassing both conceptual and applied dimensions. It is not a case study of a single institution; rather, it synthesises existing research patterns and develops a framework that institutions can apply.
The method proceeds in four steps:
Conceptual synthesis: Describe DDDM and its common workflows in education; pinpoint persistent challenges (data quality, trust, ethics).
Theoretical integration: Use Bourdieu, world-systems theory, and isomorphism to explain adoption dynamics and implementation outcomes.
Decision-domain analysis: Look at how DDDM works in important areas of the institution, such as student success, teaching, equity, operations, staffing, research, and reputation.
Framework building: Suggest a maturity model and rules for how to use DDDM responsibly.
Template for practical evaluation (for institutions)
Institutions can assess their DDDM readiness by asking four questions (a simple scoring sketch follows the list):
Clarity of decisions: Which decisions are getting better, and who is responsible?
Data quality: Are the data accurate, relevant, and interpreted in context?
Human capability: Do employees know how to read data and have time to use evidence well?
Ethical governance: Are privacy, fairness, and openness protected?
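As a simple illustration, the four questions can be turned into a rough self-scoring exercise. This is a hypothetical sketch, not a validated instrument; the dimensions mirror the list above, and the 0-3 scores are invented:

```python
# Rough readiness self-assessment across the four dimensions (scores 0-3).
readiness = {
    "decision_clarity": 2,    # which decisions improve, and who owns them?
    "data_quality": 1,        # accurate, relevant, context-aware data?
    "human_capability": 1,    # data literacy and time to use evidence?
    "ethical_governance": 2,  # privacy, fairness, transparency safeguards?
}

average = sum(readiness.values()) / len(readiness)
weakest = min(readiness, key=readiness.get)

print(f"Average readiness: {average:.1f} / 3")
print(f"Priority area:     {weakest}")
# The weakest dimension, not the average, should set the agenda: strong
# analytics cannot compensate for weak governance or weak capability.
```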
Types of data considered
DDDM in education usually comes from:
Student lifecycle data: admissions, progression, and completion.
Learning data: assessment signals, attendance, and LMS interactions.
Support services data: advising, tutoring, and wellness services.
Operational data: finance, procurement, facilities, and scheduling.
Outcomes data: graduate destinations, satisfaction, and progression to further study.
The method assumes that these data are used under purpose limitation and minimum-necessary access.
Ethical position
This article treats ethics as a methodological requirement. DDDM should serve students, fairness, and the quality of education, not surveillance, punishment, or mere protection of the institution's reputation.
Analysis
1) The "data-to-decision gap": why dashboards don't always make things better
Many institutions struggle to turn the data they hold into better outcomes. This gap arises for predictable reasons:
Data fragmentation: Different systems for students, learning platforms, HR, and finance often use different names and definitions.
Confusion over indicators: metrics like "engagement" can be operationalised as logins or clicks, which can be misleading (see the sketch at the end of this subsection).
Unclear decisions: dashboards show trends but don't say what to do or who should do it.
Cultural resistance: staff may not trust data if they think it will be used to punish them or if metrics don't take into account what happens in the classroom.
Closing the gap requires decision routines: regular forums where data are reviewed, hypotheses are tested, and interventions are designed and evaluated. DDDM is a governance practice as much as a technology.
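The indicator-confusion problem is easy to demonstrate. In the hypothetical sketch below, the same LMS log supports two opposite conclusions depending on whether "engagement" is operationalised as logins or as active task minutes; the students and figures are invented:

```python
# Hypothetical LMS log: (student, logins_per_week, active_task_minutes_per_week)
lms_log = [
    ("A", 25, 30),   # logs in often but does little active work
    ("B", 4, 240),   # logs in rarely but works in long, focused sessions
]

most_engaged_by_logins = max(lms_log, key=lambda row: row[1])[0]
most_engaged_by_minutes = max(lms_log, key=lambda row: row[2])[0]

print(f"Most 'engaged' by logins:       student {most_engaged_by_logins}")
print(f"Most 'engaged' by task minutes: student {most_engaged_by_minutes}")
# The two operationalisations disagree, so a dashboard must state which
# definition it uses and why it is relevant to the decision at hand.
```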
2) Data as a type of institutional language
DDDM changes how organisations talk about quality. Numbers turn into a language that can be understood by committees, boards, and people outside the organisation. This is helpful for coordination, but it also has some risks:
Risk of oversimplification: complicated learning processes are boiled down to a few signs.
Priority distortion: things that can be measured get more attention than things that are important but hard to measure.
Symbolic pressure: leaders may prefer "good-looking" metrics to a real diagnosis.
Bourdieu's idea of symbolic capital helps us understand why institutions might try to improve indicators that show prestige, even if they don't add much to the learning process.
3) The moral implications of predictive analytics and early warning systems
Predictive analytics is among the most discussed elements of modern DDDM. Early warning systems can detect patterns associated with dropout or poor academic performance. Used well, they enable earlier and better-targeted support. But the ethical risks are real:
Historical bias: predictions may show past unfairness instead of how well students can do.
Labelling effects: students flagged as "high risk" may be stigmatised.
Opaque models: complicated AI models can make things less clear and less accountable.
Privacy issues: keeping an eye on behavioural signals can feel like an invasion of privacy.
A responsible approach uses predictions as support triggers, not labels; incorporates fairness checks; and tells students clearly what data are used and why (the sketch below illustrates the idea).
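A minimal sketch of this stance, using a transparent rule instead of an opaque model and a simple flag-rate fairness check; the rule, thresholds, groups, and records are all hypothetical and would need local validation:

```python
# Hypothetical early-warning check used as a support trigger, with a
# simple fairness audit comparing flag rates across groups.

students = [
    # (id, group, attendance_rate, assignments_submitted_out_of_10)
    ("s1", "group_x", 0.55, 4),
    ("s2", "group_x", 0.92, 9),
    ("s3", "group_y", 0.60, 5),
    ("s4", "group_y", 0.88, 10),
]

def needs_outreach(attendance: float, submitted: int) -> bool:
    # A transparent, inspectable rule rather than an opaque model. A positive
    # result triggers an advising conversation; it is never used as a label.
    return attendance < 0.7 or submitted < 6

# Fairness audit: review any large gap in flag rates between groups.
for group in ("group_x", "group_y"):
    members = [s for s in students if s[1] == group]
    flagged = sum(needs_outreach(att, sub) for _, _, att, sub in members)
    print(f"{group}: {flagged / len(members):.0%} flagged for outreach")
```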
4) The risk of "metric gaming" and other perverse effects
When metrics become targets, behaviour shifts towards the metric, whether deliberately or inadvertently, especially when policies change. Common examples include:
Raising retention by reducing academic challenge.
Raising satisfaction scores by lowering standards or inflating grades.
Reducing reported dropout by reclassifying students.
These are problems of system design as much as of morality. DDDM needs multiple indicators and qualitative checks so that improvements reflect real learning rather than better performance on the metrics.
5) Governance: making data use a social contract
DDDM depends on trust, and trust grows when governance is clear, fair, and consistent. Governance comprises:
Data ownership and stewardship: who is in charge of making sure the data is correct and who can see it?
Access controls include role-based access and the "minimum necessary" principle.
Rules for transparency: what data is gathered, how it is used, and what choices it affects.
Bias and fairness audits: regular checks to see if different groups are affected in different ways.
Accountability mechanisms: decision logs and evaluation plans.
Good governance prevents DDDM from sliding into surveillance or political control. The sketch below illustrates two of these mechanisms.
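The following is a hypothetical sketch of role-based "minimum necessary" access and a decision log entry; the roles, fields, and example decision are assumptions, not a prescribed schema:

```python
# Illustrative governance mechanisms: minimum-necessary access by role,
# and a decision log that makes data-informed choices auditable.

from datetime import date

# Each role sees only the fields it needs for its own decisions.
ACCESS_POLICY = {
    "advisor": {"attendance", "assignment_progress"},
    "registrar": {"enrolment_status"},
    "institutional_research": {"attendance", "assignment_progress", "enrolment_status"},
}

def can_access(role: str, field: str) -> bool:
    return field in ACCESS_POLICY.get(role, set())

decision_log = [{
    "date": date(2024, 9, 1),            # hypothetical
    "decision": "expand first-year tutoring",
    "evidence": ["attendance trend", "advising usage"],
    "owner": "dean_of_students",
    "evaluation_due": date(2025, 2, 1),  # when the effect will be reviewed
}]

print(can_access("advisor", "enrolment_status"))  # False: not needed for advising
```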
6) Capacity and inequality: why DDDM maturity varies across institutions
World-systems theory helps explain differences between institutions. Some can invest in integrated systems, privacy offices, internal analytics expertise, and staff training. Others cannot. Where capacity is lacking, institutions take shortcuts: relying on vendors, copying external indicators, and deploying analytics without local interpretation.
The risk is that there will be a two-tier system. Institutions with strong DDDM capacity use data to improve learning and fairness, while those with weaker capacity use data for compliance reporting and reputation management, which can hurt the quality of education.
7) Institutional isomorphism and "dashboard compliance"
Isomorphism explains why many organisations adopt DDDM as a signal of modern governance: dashboards in leadership meetings, annual KPI reports, performance scorecards. These tools can be useful, but they can also produce "dashboard compliance," where the institution focusses on producing reports rather than improving learning.
Genuine DDDM requires moving beyond reporting to learning: experimenting with new approaches, listening to staff and students, and revising indicators when they fail to capture what is actually happening in classrooms.
Results
Finding 1: DDDM helps students succeed when it is paired with support capacity and human outreach.
Institutions get the most out of DDDM when it is linked to real student support services like advising, tutoring, mentoring, financial advice, and health and wellness services. Data should help find needs early on, but people need to be able to respond. Analytics becomes a diagnosis without treatment if there is no support capacity.
Implication: Institutions should include student support capacity in their analytics budgets, not as an afterthought.
Finding 2: Course-level learning analytics helps improve teaching quality when teachers work together on it.
Learning analytics can help teachers figure out where students are having trouble, what resources they are using, and how the timing of tests affects results. The most useful analytics are those that are used by teachers to improve their teaching, not to keep an eye on students.
Implication: Work with teachers to design dashboards and protect academic freedom. Don't use analytics to punish people; use them to make things better.
Finding 3: Equity-focused DDDM needs careful disaggregation, fairness checks, and design that fights stigma.
Institutions frequently monitor aggregate averages, which obscure disparities. Equity-focused DDDM examines outcome gaps and asks whether institutional policies create barriers. But it must avoid framing students through a deficit lens. The goal is structural improvement: accessible curricula, inclusive teaching, financial support, and a sense of belonging.
Implication: Combine quantitative gap analysis with qualitative inquiry (student voices, focus groups, staff reflection); a gap-analysis sketch follows.
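A minimal gap-analysis sketch, disaggregating a completion rate by student group rather than reporting only the aggregate; the groups and figures are invented:

```python
# Disaggregate completion by group to surface gaps the average hides.
from collections import defaultdict

records = [("group_a", 1), ("group_a", 1), ("group_a", 0),
           ("group_b", 1), ("group_b", 0), ("group_b", 0)]  # (group, completed)

totals = defaultdict(lambda: [0, 0])  # group -> [completed, total]
for group, completed in records:
    totals[group][0] += completed
    totals[group][1] += 1

overall = sum(c for _, c in records) / len(records)
print(f"Overall completion: {overall:.0%}")  # the aggregate hides the gap
for group, (done, n) in sorted(totals.items()):
    rate = done / n
    print(f"{group}: {rate:.0%} (gap vs. overall: {rate - overall:+.0%})")
# The numbers locate the gap; qualitative inquiry (student voices, focus
# groups, staff reflection) is still needed to explain and address it.
```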
Finding 4: When educational values guide the optimisation, DDDM makes operational and financial decisions stronger.
Data can help with planning budgets, using space, making schedules, buying things, and making predictions. But optimising for money alone can hurt learning. Institutions need decision-making frameworks that take into account more than just cost, such as educational impact and fairness.
Implication: Do not optimise on a single metric. Use balanced scorecards that make their educational and equity criteria explicit (see the sketch below).
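An illustrative balanced-scorecard sketch follows, scoring two hypothetical timetabling options on cost, educational impact, and equity rather than on cost alone. The weights and scores are assumptions an institution would set through its own governance:

```python
# Balanced scorecard: weight cost against educational impact and equity.
WEIGHTS = {"cost_saving": 0.3, "educational_impact": 0.4, "equity": 0.3}

options = {
    # scores on a 0-1 scale, higher is better (invented values)
    "larger_evening_classes": {"cost_saving": 0.9, "educational_impact": 0.3, "equity": 0.4},
    "blended_small_groups":   {"cost_saving": 0.5, "educational_impact": 0.8, "equity": 0.7},
}

def scorecard(scores: dict) -> float:
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

for name, scores in options.items():
    print(f"{name}: {scorecard(scores):.2f}")
# Cost alone would pick the first option; the balanced score favours the
# second once educational impact and equity carry explicit weight.
```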
Finding 5: Evidence-informed leadership is better than data-driven leadership.
The best practice is "evidence informs," not "data decides." Leaders examine data, weigh context, and test different ways to help. Professional judgement remains essential, especially in complex educational settings where cause-and-effect relationships are unclear.
Implication: Teach leaders and committees not just how to use tools, but also how to interpret, think about causes, and make moral choices.
Finding 6: DDDM redistributes power within an organisation; to work, its legitimacy must be accepted.
DDDM can strengthen analytics offices and central management, which can threaten teachers and other frontline staff. Institutions do well when roles are clear: who sets the indicators, who interprets them, and how disagreements are resolved.
Implication: Establish collaborative governance regarding metrics and guarantee that both educators and students participate in the selection of indicators.
Finding 7: Institutions get real value when they switch from isomorphic adoption to analytics that are in line with their mission.
A lot of schools use the same KPIs because other schools do. Value arises when institutions establish success criteria grounded in their mission, such as access, community development, depth of student learning, employability, research impact, or the quality of professional training.
Implication: Tailor indicators to the mission and review them annually to confirm they measure what the institution actually values (a simple registry sketch follows).
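One lightweight way to operationalise this is an indicator registry that records why each metric exists and when it was last reviewed against the mission. The entries below are hypothetical examples, not a recommended set:

```python
# Mission-aligned indicator registry with an annual review check.
from datetime import date

indicators = [
    {"name": "first_generation_completion_rate",
     "mission_goal": "access",
     "last_reviewed": date(2024, 6, 1)},
    {"name": "community_partnership_projects",
     "mission_goal": "community development",
     "last_reviewed": date(2023, 5, 1)},
]

REVIEW_INTERVAL_DAYS = 365  # the annual check proposed above
today = date(2024, 9, 1)    # hypothetical audit date

for ind in indicators:
    overdue = (today - ind["last_reviewed"]).days > REVIEW_INTERVAL_DAYS
    status = "REVIEW OVERDUE" if overdue else "ok"
    print(f"{ind['name']} ({ind['mission_goal']}): {status}")
```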
Discussion: A Responsible DDDM Maturity Model for Educational Institutions
To translate these findings into actionable guidance, this article proposes a four-level maturity model. The model is not judgemental; it helps institutions locate their current position and identify their next steps (a self-placement sketch follows the level descriptions).
Level 1: Reporting and Compliance
Features:
Basic KPIs and yearly reports
Broken-up systems
Weak data literacy
Data used mostly for reporting to the outside world
Risks:
"Dashboard compliance" with no change
Misinterpretation due to weak definitions
Next steps:
Standardise definitions and how data is managed
Find the most important decisions that data can help with.
Level 2: Local Improvements and Diagnostic Analytics
Features:
Checking on students' progress and course performance on a regular basis
Analytics projects at the department level
Some training for staff and data champions
Risks:
Pockets of success without institution-wide learning
Uneven adoption across departments
Next steps:
Set up processes for checking the quality of data across the whole institution
Set up rules for making decisions and acting ethically.
Level 3: Decision-Centred Evidence Systems
Features:
Data used in the cycles of governance
Written records of decisions and plans for evaluations
Mixed-method interpretation (quantitative and qualitative)
Early interventions associated with support services
Risks:
Over-reliance on particular indicators
Political disagreement over who owns the metrics
Next steps:
Make shared governance of indicators official
Add more ways to check for fairness and openness
Level 4: Responsible, Ethical, and Equity-Promoting Analytics
Features:
Strong rules for privacy and fairness
Clear communication with students about how their data is used
Regular assessment of interventions and model bias
Indicators that are in line with the mission and a culture of constant improvement
Risks:
Resource-intensive; needs long-term commitment from leaders
Next steps:
Maintain internal accountability to staff and students
Check from time to time to see if the metrics match the educational values.
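As a closing illustration, the maturity model can be read as a cumulative checklist: an institution's level is the highest one whose practices are all in place. The sketch below condenses the features above into hypothetical practice labels and is a simplification, not a validated instrument:

```python
# Self-placement against the four-level maturity model.
PRACTICES_BY_LEVEL = {
    1: {"basic_kpis", "annual_reports"},
    2: {"standard_definitions", "departmental_analytics", "staff_training"},
    3: {"decision_logs", "mixed_method_interpretation", "support_linked_interventions"},
    4: {"fairness_audits", "student_data_transparency", "mission_aligned_indicators"},
}

def maturity_level(practices_in_place: set) -> int:
    # The level is the highest one whose prerequisites are all met.
    level = 0
    for lvl in sorted(PRACTICES_BY_LEVEL):
        if PRACTICES_BY_LEVEL[lvl] <= practices_in_place:
            level = lvl
        else:
            break
    return level

example = {"basic_kpis", "annual_reports", "standard_definitions",
           "departmental_analytics", "staff_training", "decision_logs"}
print(maturity_level(example))  # 2: level 3 practices are still incomplete
```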
Conclusion
Data-driven decision making has become a core element of modern educational governance. It responds to real needs: institutions must support students from diverse backgrounds, use limited resources well, and demonstrate effectiveness. When DDDM is used to detect problems early, test solutions, and learn from evidence, it can improve educational outcomes.
But DDDM also reshapes institutional relationships. It changes what "quality" means, who holds authority, and which results are made visible. Bourdieu's theory demonstrates that data practices redistribute symbolic power and can undermine professional autonomy. World-systems theory shows how global forces and uneven resources shape which indicators matter most and who benefits. Institutional isomorphism explains why so many institutions adopt similar dashboards and KPIs even when these are misaligned with local missions.
The main point is clear: DDDM works best when it is responsible, goal-oriented, and focused on people. Institutions should treat analytics as an instrument for improving education, not for surveillance or reputation management. This requires data quality, data literacy, collaborative governance, privacy protection, equity assessment, and meaningful engagement of educators and students.
A realistic way to move forward is:
Define educational objectives before selecting metrics.
Combine quantitative and qualitative evidence in analysis.
Build ethical governance grounded in fairness, transparency, and privacy.
Invest in people's skills and support services, not only in technology.
Evaluate interventions and adjust indicators based on what is learned.
Under these conditions, DDDM can really help improve the quality and fairness of education by helping schools not only measure performance but also find ways to improve it that are true to the mission of education.
Hashtags
#DataDrivenDecisionMaking #EducationalAnalytics #HigherEducationLeadership #LearningAnalytics #EvidenceInformedPolicy #EquityInEducation #DigitalTransformation
References (Harvard style)
Ahmed, S., 2012. On Being Included: Racism and Diversity in Institutional Life. Durham, NC: Duke University Press.
Baker, R.S. and Inventado, P.S., 2014. Educational data mining and learning analytics. In: J.A. Larusson and B. White, eds. Learning Analytics: From Research to Practice. New York, NY: Springer, pp. 61–75.
Bichsel, J., 2012. Analytics in Higher Education: Benefits, Barriers, Progress, and Recommendations. Louisville, CO: EDUCAUSE Center for Applied Research.
Bourdieu, P., 1986. The forms of capital. In: J.G. Richardson, ed. Handbook of Theory and Research for the Sociology of Education. New York, NY: Greenwood Press, pp. 241–258.
Bourdieu, P., 1988. Homo Academicus. Stanford, CA: Stanford University Press.
Bourdieu, P. and Wacquant, L.J.D., 1992. An Invitation to Reflexive Sociology. Chicago, IL: University of Chicago Press.
Bromley, P. and Powell, W.W., 2012. From smoke and mirrors to walking the talk: Decoupling in the contemporary world. Academy of Management Annals, 6(1), pp. 483–530. https://doi.org/10.5465/19416520.2012.684462
Brown, M., McCormack, M., Reeves, J., Brooks, D.C. and Grajek, S., 2020. 2020 EDUCAUSE Horizon Report: Teaching and Learning Edition. Louisville, CO: EDUCAUSE.
Cotton, D.R.E., Joyner, M. and George, R., 2023. Data-informed decision making in higher education: opportunities and ethical tensions. Assessment & Evaluation in Higher Education, 48(8), pp. 1150–1166. https://doi.org/10.1080/02602938.2022.2146057
D’Ignazio, C. and Klein, L.F., 2020. Data Feminism. Cambridge, MA: MIT Press.
DiMaggio, P.J. and Powell, W.W., 1983. The iron cage revisited: Institutional isomorphism and collective rationality in organizational fields. American Sociological Review, 48(2), pp. 147–160. https://doi.org/10.2307/2095101
Elias, M.J., Leverett, L. and Wyman, P.A., 2022. Social-Emotional Learning and School Climate in the Era of Data. New York, NY: Teachers College Press.
European Commission, 2022. Ethics Guidelines on the Use of Artificial Intelligence and Data in Teaching and Learning for Educators. Luxembourg: Publications Office of the European Union.
Espeland, W.N. and Sauder, M., 2007. Rankings and reactivity: How public measures recreate social worlds. American Journal of Sociology, 113(1), pp. 1–40. https://doi.org/10.1086/517897
Ferguson, R. and Clow, D., 2017. Learning analytics: Avoiding failure. In: C. Lang, G. Siemens, A. Wise and D. Gašević, eds. Handbook of Learning Analytics. Beaumont, AB: Society for Learning Analytics Research (SoLAR), pp. 93–102.
Gašević, D., Dawson, S. and Siemens, G., 2015. Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), pp. 64–71. https://doi.org/10.1007/s11528-014-0822-x
Gibson, A. and Ifenthaler, D., 2021. Perceptions of learning analytics stakeholders: implications for the design of responsible analytics. British Journal of Educational Technology, 52(5), pp. 1897–1913. https://doi.org/10.1111/bjet.13137
Gorur, R., 2016. Seeing Like a PISA: How the OECD and ILSA Shape Education Policy. Sydney: UNSW Press.
Ifenthaler, D. and Yau, J.Y.K., 2020. Utilising learning analytics to support study success in higher education: a systematic review. Educational Technology Research and Development, 68(4), pp. 1961–1990. https://doi.org/10.1007/s11423-020-09788-z
Kitchin, R., 2014. The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. London: SAGE Publications.
Mandinach, E.B. and Gummer, E.S., 2016. Data Literacy for Educators: Making It Count in Teacher Preparation and Practice. New York, NY: Teachers College Press.
Meyer, J.W. and Rowan, B., 1977. Institutionalized organizations: Formal structure as myth and ceremony. American Journal of Sociology, 83(2), pp. 340–363. https://doi.org/10.1086/226550
OECD, 2021. OECD Digital Education Outlook 2021: Pushing the Frontiers with AI, Blockchain and Robots. Paris: OECD Publishing. https://doi.org/10.1787/589b283f-en
OECD, 2023. Education at a Glance 2023: OECD Indicators. Paris: OECD Publishing. https://doi.org/10.1787/eag-2023-en
Ozga, J., 2009. Governing education through data in England: from regulation to self-evaluation. Journal of Education Policy, 24(2), pp. 149–162. https://doi.org/10.1080/02680930902733121
Selwyn, N., 2019. Should Robots Replace Teachers? AI and the Future of Education. Cambridge: Polity Press.
SoLAR, 2021. The Handbook of Learning Analytics. 2nd ed. Beaumont, AB: Society for Learning Analytics Research (SoLAR).
Slaughter, S. and Rhoades, G., 2004. Academic Capitalism and the New Economy: Markets, State, and Higher Education. Baltimore, MD: Johns Hopkins University Press.
UNESCO, 2021. AI and Education: Guidance for Policy-makers. Paris: UNESCO Publishing. Available at: https://unesdoc.unesco.org/
Wallerstein, I., 2004. World-Systems Analysis: An Introduction. Durham, NC: Duke University Press.
Williamson, B., 2017. Big Data in Education: The Digital Future of Learning, Policy and Practice. London: SAGE Publications.
Zhang, Y. and Rangwala, H., 2021. Interpretable models for early warning systems in education: balancing accuracy and transparency. Journal of Learning Analytics, 8(2), pp. 1–18. https://doi.org/10.18608/jla.2021.1