I’ve been using the learning cycle as a framework for a strategic approach to technology in schools. This is the third post of the series, the previous two having focused on access (mobile) and action (cloud). The next stage is that of reflection. The manifestation of this aspect in my proposed strategy is analytics.
In the basic learning cycle, reflection is the all-important point in the process when we widen our awareness, take a breath and open our senses to some objective evidence of the efficacy of our efforts. Reflections may be fluid and continuous (usually resulting in micro adjustments) or periodic (usually resulting in more macro or strategic reflections). We may self-reflect (internal validation) or we may seek out reflection in the observations of others or in data (external validation). In our journey to becoming more effective learners, an important part of the process is calibrating our self-reflections to more closely match external validation. This is a lifelong process in which external validation remains important, but we learn to learn more effectively as our internal validations prove increasingly accurate.
The calibration of internal and external validation is essential to the teaching and learning process. Without it, it's quite possible for individuals to badly misjudge their progress and consequently focus on the wrong things to generate improvement. I'm reminded of the contestants in singing contests on TV who are convinced they are superstars in the making but who can barely sing. That is an extreme (perhaps delusional) point on the spectrum, but the underlying issue is the same: a lack of calibration between internal and external validation of effective learning.
Of course, this is (in part) precisely the purpose of the teacher. The challenge is that, being human, we’re not only capable of a little self-delusion at times but we can also project our delusions. In other words, the teacher as an instrument of reflection for learners also needs to be calibrated. Teacher calibration might come through the formative assessment process, summative assessment, experience and professional development. The challenge is to effectively and objectively benchmark our internal assessments.
This is the point at which I introduce the concept of data, analytics and learning intelligence (the educational analogue of business intelligence). Before you start telling me about the shortcomings of data in the learning and teaching process, hear me out. I know that human relationships underpin learning. What I also know is that human nature is such that we are simply not objective in our evaluations, nor are we calculating machines. It is possible for us to miss patterns, to be ‘mis-calibrated’ or simply to be overwhelmed by too much data. We’re fallible.
‘Big Data’ and analytics are 21st-century phenomena emerging from the already enormous, and still rapidly increasing, speed and scale that technology affords us in capturing, aggregating, storing and analysing data. There is more data available about human behaviour than ever before, and a great deal of value is locked up in that data. The promise of analytics is that new insights can be gained from analysis of the data trails left by individuals in their interactions with each other and the world, most particularly when they’re using technology.
The rapid evolution of big data methodologies and tools has, to date, been driven by the business world, which recognises in them the potential for unlocking value for customers and shareholders. In this context the term ‘business intelligence’ is often used to describe the intersection of data and insight. When applied to education, analytics may be sub-divided into two categories: learning and academic. The following table describes that categorisation:
| Category | Focus |
| --- | --- |
| Learning analytics | The measurement, collection, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimising learning and the environments in which it occurs. |
| Academic analytics | The improvement of organisational processes, workflows, resource allocation and measurement through the use of learner, academic and organisational data. Akin to business analytics, academic analytics are concerned with improving organisational effectiveness. |

In the same way that ‘business intelligence’ informs business decisions in order to drive success, so learning analytics is the basis of ‘learning intelligence’ that is focused on improving learner success.
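To make the "measurement, collection, analysis and reporting" definition concrete, here is a minimal sketch of a learning-analytics calculation. The data and the `flag_disengagement` function are entirely illustrative assumptions of mine, not anything from a real platform: the idea is simply that a learner's recent activity, compared objectively against their own baseline, can surface a reflection signal a busy teacher might miss.

```python
from statistics import mean

# Illustrative only: weekly counts of learning-platform actions per (anonymised) learner.
# A sharp drop against the learner's own baseline is one simple, objective signal
# that could prompt a teacher-level intervention.
weekly_actions = {
    "learner_a": [34, 31, 29, 30],
    "learner_b": [28, 25, 12, 5],   # tailing off week by week
}

def flag_disengagement(history, threshold=0.5):
    """Flag a learner whose latest week falls below half their earlier average."""
    baseline = mean(history[:-1])
    return history[-1] < baseline * threshold

flags = {learner: flag_disengagement(h) for learner, h in weekly_actions.items()}
print(flags)  # {'learner_a': False, 'learner_b': True}
```

The point is not the arithmetic, which is trivial, but that the rule is explicit, consistent and applied to every learner equally; that is the objectivity that calibrates our internal validations.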
Learning analytics are not the goal in themselves; learning intelligence is. Learning intelligence is the actionable information, arising from learning analytics, that has the potential to deliver improved learner success. The evidence from analytics in business is that there is deep value to be mined in the data, and the objectivity and rigour that learning analytics represent provide an empirical basis for everything from learner-level interventions to national policy making.

The Society for Learning Analytics Research (SoLAR) is an inter-disciplinary network of leading international researchers who are exploring the role and impact of analytics on teaching, learning, training and development. Their mission as an organisation is to:
- Pursue research opportunities in learning analytics and educational data mining,
- Increase the profile of learning analytics in educational contexts, and
- Serve as an advocate for learning analytics to policy makers
Significant potential exists for analytics to guide learners, educators, administrators, and funders in making learning-related decisions. Learning analytics represents the application of “big data” and analytics in education. SoLAR is an organisation focused on building a planned and integrated approach to developing insightful and easy-to-use learning analytics tools. Three key beliefs underpin their proposal:
- Openness of process, algorithms, and technologies is important for innovation and meeting the varying contexts of implementation.
- Modularised integration: core analytic tools (or engines) include adaptation, learning, interventions and dashboards. The learning analytics platform is an open architecture, enabling researchers to develop their own tools and methods to be integrated with the platform.
- Reduction of inevitable fragmentation by providing an integrated, expandable, open technology that researchers and content producers can use in data mining, analytics, and adaptive content development.
From my experience talking to educators, it’s clear they usually know that the data is available, and they know how to act on learning intelligence once they have it, but they’re much less sure about the analytics phase in between. Whilst working on a national procurement for a learning management system last year, I realised we knew very little about the utilisation of key technology assets in the schools we were trying to build systems for. As it turned out, this data was sitting, untouched, in log files on servers within these schools. I approached three of the schools and asked their permission to copy this data for the purposes of analysis. They knew it existed and were happy for me to analyse the anonymised data.
I was able to analyse the utilisation of technology assets (software and hardware) across these schools over a period of months in order to understand exactly how technology was used. This enabled me to show where the investment in technology was being dramatically underused, and how it could be re-shaped to maximise utilisation of the investment and so improve the chances of learning gains. I didn’t have time to do so, but I could also have mapped this data against timetable and assessment data to explore how technology use related to attainment. That would have allowed me to correlate technology utilisation by different teachers, departments and schools against the performance of their pupils.
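The first aggregation step of an analysis like this can be sketched in a few lines. The record format and application names below are my own invented placeholders (real server logs would first need parsing into something like this), but the shape of the utilisation question is the same: for each asset, how many sessions, and by how many distinct users?

```python
from collections import defaultdict

# Hypothetical anonymised log records: (timestamp, application, anonymised user id).
# Invented for illustration; a real analysis would parse these out of server log files.
records = [
    ("2013-03-04T09:15:00", "office_suite", "u001"),
    ("2013-03-04T09:20:00", "office_suite", "u002"),
    ("2013-03-04T10:05:00", "maths_tutor", "u001"),
    ("2013-03-05T11:30:00", "office_suite", "u003"),
]

def utilisation_by_asset(records):
    """Count total sessions and distinct users per software asset."""
    sessions = defaultdict(int)
    users = defaultdict(set)
    for timestamp, app, user in records:
        sessions[app] += 1
        users[app].add(user)
    return {app: {"sessions": sessions[app], "distinct_users": len(users[app])}
            for app in sessions}

summary = utilisation_by_asset(records)
print(summary["office_suite"])  # {'sessions': 3, 'distinct_users': 3}
```

Run over months of real logs, a summary like this is exactly what exposed the under-used assets in the schools above; joining it against timetable or assessment data would be a further merge on learner or class identifiers.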
This example is the tip of the iceberg in terms of analytics and big data in education. In terms of my technology strategy, identifying and analysing key data in your school to produce learning intelligence will maximise the learning bang for your technology buck in an objective manner. It is a critical part of your strategy because without the analysis, you may well be making unnecessary or ineffective investments in technology. Don’t be driven by technology; be driven by learning outcomes.