
How To Get Your #EdTech Business Off The Ground And Keep It There


I’ve worked in education technology (#EdTech) for many years now and over the last 7 years I’ve co-founded and run a successful company (Airhead Education) delivering a web desktop to schools (Airhead). We won a Bett award for ‘Innovation in ICT’ in 2015 and achieved it without investment from any external source. We’re debt-free and we’ve made a profit in every year of operation. No, we’re not Google yet but we’re making our mark in the education sector and we continue to listen and grow. It’s certainly not been easy, but it has been a lot of fun and as we start 2017, I’ve been reflecting on a few of my lessons learned.

I’ve seen technology products and services designed for education come and go (and usually turn up again, reinvented). Often I’ve seen the merit in the idea but the execution has been poor. Occasionally, both the idea and the execution have appeared to be flawed. The thing is, I don’t have an issue with either scenario. Ideas don’t just spring into life fully formed; they need to be shaped in the fire of trial, error and reflection. And of course the same applies to the execution of ideas in the form of products and services. The process of releasing, reviewing and revising is a basic principle underpinning continuous improvement. And I’m not even perturbed if individuals without education experience try their hand at EdTech. Sometimes the education crowd can’t see the wood for the trees. But there’s one thing you must do: survive long enough to learn the lessons you need to learn in order to build a successful business.

So if you’re going to invest your time, energy and creativity in developing technology for the education sector, you should be sensitive to the characteristics of the technology and education markets and what they mean for your business. For me, there are three particular EdTech business challenges:

    1. Rapid lifecycles – The pace of change in technology is rapid and the lifecycle of most technologies is therefore short. Whether it’s software, hardware or the services that support them, rapid evolution of technology means a requirement to make changes just to stay functional and relevant, let alone to evolve with your customers’ needs. Developing, delivering, maintaining and scaling products and services is a costly endeavour which requires unerring financial and technological vigilance.
    2. Tight budgets – The majority of educational establishments are under constant budgetary pressure and, rightly, there is a tension between competing requirements for investment. Educational establishments should not be taking excessive risks with the deployment of technology because they simply cannot afford to squander their budget. The consequence of this is an ever higher bar for the effectiveness of education technology versus the price paid.
    3. Risk Aversion – Education establishments are intrinsically risk averse for a variety of reasons. Limited budget is one of those reasons but so too is the price of failure beyond just money. Organisational failure in an educational establishment ultimately hits the learner and so an ‘if it ain’t broke don’t fix it’ culture often emerges to protect the learner from excessive educational experimentation, including with technology.

So let’s be clear about what this means for the prospective EdTech entrepreneur:

  1. Deep pockets – You’re going to need deep pockets in order to get your business off the ground and keep it running when technology is changing apace. You will need to factor in the cost of ongoing technological development because without it, you run the risk of becoming irrelevant just as you’re gaining traction in your market.
  2. Realistic assumptions – Take a long hard look at your business plan in terms of market size, adoption rate and price point. Make sure that enough customers will actually pay the price you need them to pay in order to survive; a rough break-even sketch follows this list. Be pessimistic and enjoy a nice surprise. There’s no point in creating something that users love but which they don’t value enough to buy.
  3. Solid evidence – Test your product or service in MVP form (Minimum Viable Product) from the beginning and never stop soliciting opinions and analytics about its performance in order to create an evidence base for the efficacy of your creation. Whilst you may think you know how to solve a relevant problem, ultimately your customers need to agree with you and be prepared to recommend you.
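As a rough illustration of the arithmetic behind point 2, here is a minimal sketch in Python. Every figure in it is a hypothetical placeholder (not Airhead’s numbers); the point is simply that an assumed market size, adoption rate and price need to cover fixed and per-customer costs with room to spare.

```python
# Rough break-even sketch for an EdTech business plan.
# All figures below are hypothetical placeholders.

def annual_position(market_size, adoption_rate, annual_price,
                    fixed_costs, cost_per_customer):
    """Return (customers, revenue, total_costs, surplus) for one year."""
    customers = int(market_size * adoption_rate)
    revenue = customers * annual_price
    total_costs = fixed_costs + customers * cost_per_customer
    return customers, revenue, total_costs, revenue - total_costs

# Pessimistic assumptions: 24,000 addressable schools, 1% adoption,
# £1,500 per school per year, £250k fixed costs (salaries, development),
# £400 per school in support and hosting.
customers, revenue, costs, surplus = annual_position(
    market_size=24_000, adoption_rate=0.01, annual_price=1_500,
    fixed_costs=250_000, cost_per_customer=400)

print(f"{customers} customers, revenue £{revenue:,}, "
      f"costs £{costs:,}, surplus £{surplus:,}")
```

If the surplus only turns positive under optimistic adoption rates, the plan needs rethinking before any product gets built.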

Yes, I’ve learned a lot of lessons in the course of co-founding Airhead Education and no doubt there are many more to come. The truth is that I love what I’m doing and so it’s easy to get up in the morning and consistently spend time working out how to make Airhead better. As long as I can say that, I have the most important ingredient for success. Good luck in 2017!

You and You and You are the Weakest Links (in the information security chain)

Over the last twenty years or so as an Educational Technologist, I’ve visited literally thousands of schools. When I first started, my point of contact was the ICT (Information and Communication Technology) Network Manager. Nowadays, it’s almost always a member of senior leadership. I don’t flatter myself that I’m more important than I used to be. It’s simply that technology in most schools is now integrated in teaching, learning and operations from top to bottom. It’s strategically important.

Of course, with strategic importance comes a sharpened focus, not only on the benefits of technology, but on the issues and threats it introduces. Barely a week goes by without a story about the effects of screen time on children or the destruction wreaked by the latest malware. Where once upon a time, I could guarantee I’d find an administrator password on a sticky note in the office, initiatives such as Safeguarding and Prevent have ramped up the focus on safety and security in schools.

And yes, senior leaders are nervous. Apart from an unwelcome appearance in the media, if a school’s Safeguarding or Prevent arrangements do not meet requirements, then Ofsted is likely to place them in special measures.

As if that wasn’t enough, against a background of growing threat, hardening sanctions and shrinking budgets, the replacement of the Data Protection Act (DPA) with the EU’s General Data Protection Regulation (GDPR) is going to hit (mostly unwary) schools hard on the 25th May 2018. As of April 2017, only 43% of organisations were actively preparing for GDPR.

Whilst it’s true that the GDPR will bring more clarity and rigour to the discipline of information security, schools may well have more of a mountain to climb than most because they are Data Controllers with sensitive personal data on minors. It’s not clear from the legislation whether the appointment of a Data Protection Officer (DPO) will be mandatory for schools, but it would certainly seem to be sensible advice.

However, the main purpose of this post is not to bemoan the plight of schools but rather to point out an emergent weakness in this layered process of security hardening. It’s mandatory for schools to designate a member of senior management as a Safeguarding Lead. It’s also mandatory to appoint a Prevent Lead. With the advent of the GDPR, it seems there will be a DPO as well. To perform these roles effectively will require:

  • An understanding of the relevant regulatory environment
  • Experience of practical application in a school
  • A grasp of the technology landscape across the school and its supply chain

In the good old days (ahem), when I used to roll up to meet the Network Manager, usually I wouldn’t need to speak to anyone else. They were the Kings and Queens of their IT domains. Perhaps they lacked a strategic perspective on occasion, but at least there was one person who understood every piece of technology in the organisation and the implications of every change that was made.

I’m certainly not advocating a return to the past, but, going forwards, I think the increasing regulatory load is already leading to fragmentation in the security chain. In a world where one IoT device can become a gateway for a serious network incursion, it’s easy for knowledge to exist in silos which lead to Donald Rumsfeld’s infamous unknown unknowns.

My conclusion is that people are usually the weakest link in the security chain and, in this case, the weakness is exacerbated by an approach to safety and security in schools that is evolving in silos. I would simply advocate that domain experts with overlapping interests come together on a regular basis to educate each other and review their mutual challenges. Every school – every organisation – should have a Safety & Security Working Group that aligns and coordinates the work of all stakeholders.

Part 3: Data, analytics and learning intelligence

I’ve been using the learning cycle as a framework for a strategic approach to technology in schools. This is the third post of the series, the previous two having focused on access (mobile) and action (cloud). The next stage is that of reflection. The manifestation of this aspect in my proposed strategy is analytics.

In the basic learning cycle, reflection is the all-important point in the process when we widen our awareness, take a breath and open our senses to some objective evidence of the efficacy of our efforts. Reflections may be fluid and continuous (usually resulting in micro adjustments) or periodic (usually resulting in more macro or strategic reflections). We may self-reflect (internal validation) or we may seek out reflection in the observations of others or in data (external validation). In our journey to becoming more effective learners, an important part of the process is calibrating our self-reflections to more closely match external validation. This is a lifelong process in which external validation continues to be important but we learn to learn more effectively because our internal validations are proved to be getting more accurate.

The calibration of internal and external validation is essential to the teaching and learning process. Without it, it’s quite possible for individuals to entirely miscalculate their progress and consequently focus on the wrong things to generate improvement. I’m reminded of the contestants in singing contests on TV who are convinced they are superstars in the making but who can barely sing. This is an extreme (perhaps delusional) example, but the underlying issue is a lack of calibration between internal and external validation of effective learning.

Of course, this is (in part) precisely the purpose of the teacher. The challenge is that, being human, we’re not only capable of a little self-delusion at times but we can also project our delusions. In other words, the teacher as an instrument of reflection for learners also needs to be calibrated. Teacher calibration might come through the formative assessment process, summative assessment, experience and professional development. The challenge is to effectively and objectively benchmark our internal assessments.
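To make ‘calibration’ a little more concrete, here is a purely illustrative sketch (in Python, with invented scores) that compares a learner’s self-assessed marks with externally assessed ones. The mean absolute gap and a simple correlation stand in, very crudely, for how well internal and external validation agree.

```python
# Illustrative only: invented scores and crude measures of calibration.
from statistics import mean

self_assessed = [72, 65, 80, 58, 90]   # learner's own predicted marks (invented)
external      = [60, 62, 75, 55, 70]   # marks from external assessment (invented)

# Mean absolute gap: smaller means internal and external validation agree more closely.
gap = mean(abs(s - e) for s, e in zip(self_assessed, external))

# Pearson correlation: do the self-assessments at least rank performance correctly?
def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"Mean absolute gap: {gap:.1f} marks")
print(f"Rank agreement (correlation): {pearson(self_assessed, external):.2f}")
```

Tracked over time, a shrinking gap would be one small piece of evidence that a learner’s self-reflection is becoming better calibrated.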

This is the point at which I introduce the concept of data, analytics and learning intelligence (the educational equivalent of business intelligence). Before you start telling me about the shortcomings of data in the learning and teaching process, hear me out. I know that human relationships underpin learning. What I also know is that human nature is such that we are simply not objective in our evaluations, nor are we calculating machines. It is possible for us to miss patterns, to be ‘mis-calibrated’ or simply to be overwhelmed by too much data. We’re fallible.

‘Big Data’ and analytics are 21st Century phenomena emerging from the already enormous, and still rapidly increasing, speed and scale that technology affords us in capturing, aggregating, storing and analysing data. There is more data available about human behaviour than ever before and a great deal of value is locked up in that data. The promise of analytics is that new insights can be gained from analysis of the data trails left by individuals in their interactions with each other and the world, most particularly when they’re using technology.

The rapid evolution of big data methodologies and tools has, to date, been driven by the business world, which recognises in them the potential for unlocking value for their customers and shareholders. In this context the term ‘business intelligence’ is often used to describe the intersection of data and insight. When applied to education, analytics may be sub-divided into two categories: academic and learning. The two can be characterised as follows:

Academic analytics are the improvement of organisational processes, workflows, resource allocation and measurement through the use of learner, academic, and organisational data. Academic analytics, akin to business analytics, are concerned with improving organisational effectiveness.

We can define learning analytics as the measurement, collection, analysis and reporting of data about learners and their contexts for the purposes of understanding and optimising learning and the environments in which it occurs. In the same way that ‘business intelligence’ informs business decisions in order to drive success, so learning analytics is the basis of ‘learning intelligence’ that is focused on improving learner success.

Learning analytics are not the goal in themselves. Learning intelligence is the goal. Learning intelligence is the actionable information arising from learning analytics that has the potential to deliver improved learner success. The evidence from analytics in business is that there is deep value to be mined in the data. The objectivity and rigour that is represented by learning analytics provides an empirical basis for everything from learner-level interventions to national policy making.

The Society for Learning Analytics Research (SoLAR) is an inter-disciplinary network of leading international researchers who are exploring the role and impact of analytics on teaching, learning, training and development. Their mission as an organisation is to:

  1. Pursue research opportunities in learning analytics and educational data mining,
  2. Increase the profile of learning analytics in educational contexts, and
  3. Serve as an advocate for learning analytics to policy makers

Significant potential exists for analytics to guide learners, educators, administrators, and funders in making learning-related decisions. Learning analytics represents the application of “big data” and analytics in education. SoLAR is an organisation that is focused on building a planned and integrated approach to developing insightful and easy-to-use learning analytics tools. Three key beliefs underpin their proposal:

  1. Openness of process, algorithms, and technologies is important for innovation and meeting the varying contexts of implementation.
  2. Modularised integration: core analytic tools (or engines) include adaptation, learning, interventions and dashboards. The learning analytics platform is an open architecture, enabling researchers to develop their own tools and methods to be integrated with the platform.
  3. Reduction of inevitable fragmentation by providing an integrated, expandable, open technology that researchers and content producers can use in data mining, analytics, and adaptive content development.

From my experience talking to educators, it’s clear they usually know that there is data available and they know how to act on learning intelligence when they have it, but they’re much less sure about the analytics phase. Whilst working on a national procurement for a learning management system last year, I realised we really knew very little about the utilisation of key technology assets in the schools we were trying to build systems for. As it turned out, this data was sitting, untouched, in log files on servers within these schools. I approached three of the schools and asked their permission to copy this data for the purposes of analysis. They knew it existed and were happy for me to analyse the anonymised data.

I was able to analyse the utilisation of technology assets (software and hardware) across these schools over a period of months in order to understand exactly how technology was used. This enabled me to show where the investment in technology was being dramatically underused and how it could be re-shaped to maximise utilisation of the investment in order to improve the chances of learning gains. I didn’t have time to do so, but I could have mapped this data against timetable and assessment data to explore how technology use mapped against attainment. This would have allowed me to correlate technology utilisation by different teachers, departments and schools against the performance of their pupils.
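For what it’s worth, the mechanics of that kind of analysis need not be sophisticated once the logs are exported. The sketch below is a minimal illustration only: it assumes an anonymised CSV export named usage_log.csv with date, device_id, application and minutes columns, all of which are hypothetical names; real server logs vary enormously between schools and systems.

```python
# Minimal sketch of a utilisation analysis over anonymised usage logs.
# File name and column names (date, device_id, application, minutes) are
# hypothetical; real log formats will differ and need their own parsing.
import csv
from collections import defaultdict

minutes_by_app = defaultdict(int)
active_days_by_device = defaultdict(set)

with open("usage_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        minutes_by_app[row["application"]] += int(row["minutes"])
        active_days_by_device[row["device_id"]].add(row["date"])

# Which software assets actually get used?
for app, minutes in sorted(minutes_by_app.items(), key=lambda kv: -kv[1]):
    print(f"{app}: {minutes / 60:.1f} hours of recorded use")

# Which hardware assets sit idle? Devices active on very few days look underused.
idle = [d for d, days in active_days_by_device.items() if len(days) < 10]
print(f"{len(idle)} of {len(active_days_by_device)} devices active on fewer than 10 days")
```

Joining the same data to timetable and assessment exports, as suggested above, would be a matter of adding the relevant keys (class, teacher, pupil cohort) to the aggregation.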

This example is the tip of the iceberg in terms of analytics and big data in education. In terms of my technology strategy, identifying and analysing key data in your school to produce learning intelligence will maximise the learning bang for your technology buck in an objective manner. It is a critical part of your strategy because without the analysis, you may well be making unnecessary or ineffective investments in technology. Don’t be driven by technology; be driven by learning outcomes.

Personal Data Protection in the Cloud

A few weeks ago I was contacted by a student asking me to complete a questionnaire on cloud security issues as part of a dissertation for her degree. At the time I thought I should probably post my answers here but I was overtaken by events (or in plain speak, I plain forgot).

However, I was reminded this morning by an article published yesterday on the very same topic. The article is built around a joint statement issued by European Commission Vice-President Viviane Reding and US Secretary of Commerce John Bryson on the 19th March. The statement frames a high level conference on Privacy and Protection of Personal Data, held simultaneously in Washington and Brussels and, in their words, “represents an important opportunity to deepen our transatlantic dialogue on commercial data privacy issues.” This is an excerpt from the statement:

“The European Union is following new privacy developments in the United States closely. Both parties are committed to working together and with other international partners to create mutual recognition frameworks that protect privacy. Both parties consider that standards in the area of personal data protection should facilitate the free flow of information, goods and services across borders. Both parties recognize that while regulatory regimes may differ between the U.S. and Europe, the common principles at the heart of both systems, now re-affirmed by the developments in the U.S., provide a basis for advancing their dialog to resolve shared privacy challenges. This mutual interest shows there is added value for the enhanced E.U.-U.S. dialogue launched with today’s data protection conference.”

The thrust of the student’s questioning was that the uptake of cloud technology was being slowed by businesses’ concerns about data security and privacy. I’m not so sure that’s at the heart of the issue as you can probably tell from my answers:

Question: Despite its promises very few businesses have actually moved their operations to the Cloud. Why has the real application of Cloud computing not yet reached momentum among businesses?

Answer: I think the premise of the question is wrong, i.e. that very few businesses have moved operations to the cloud. To explain what I mean, we need to agree terms first. Cloud just means stuff hosted off premises. Web is cloud. Virtualisation is cloud. Streaming is cloud. If cloud means stuff hosted off premises, then a critical limiting factor is the pipe between the client and the host. Even with diversely routed connectivity, this is a business risk in terms of resilience and performance. Business risks need to be balanced against costs and benefits. The second issue for cloud services is that it is more difficult to integrate disparate systems – potentially from different vendors – to meet business-specific requirements. There are not yet standards that facilitate this type of integration between cloud vendors (although discussions are in progress).

The combination of issues I describe means that cloud services are not suitable for all business functions, business types and business sizes. For example, some businesses may be willing to sacrifice performance and resilience to achieve lower price or greater agility. A business whose main channel is the Web may already have the internal processes and culture to embrace more cloud services. When I said the premise of the question was wrong, I meant that I think most companies do take cloud services, albeit in a limited way. It’s true that most businesses haven’t embraced cloud for the full scope of their technology requirement, but I’m not sure this is possible for most businesses given the present limitations of the technology. So really what we’re talking about is a hybrid scenario with a progressive shift to cloud services as bandwidth costs reduce, standards for integration emerge and the business case, taking account of the risks, gradually shifts in favour of cloud.

This is part of the picture. There are also cultural and practical issues in terms of change management. On-premises IT departments have traditionally kept a tight control over their networks and data. Releasing control is difficult for them. It’s only when competition becomes extreme that the old paradigms become unsettled and eventually unseated. I’ve deliberately left the wider data security issue out of this response because there are lots more questions about it later!

Question: A study by LSE has revealed that the top two issues in the way of adopting the Cloud are fears of data security and privacy, and of data being offshored. In your opinion, have these two issues been the main concern for your users/clients?

Answer: I have some sympathy with this view although when issues are complex, respondents often migrate to shrink-wrapped answers. My view is that the issues of data security and privacy are the go-to issues for cloud ditherers. They’re a form of displacement behaviour. In my experience, it’s rare that data security and privacy are truly critical factors in the decision to use (or not) a cloud service. They are of course critically important issues, but as a technology, ‘cloud’ usually has reasonable answers, at least relative to the security and privacy challenges that already exist due to human and system frailty. My experience is that the objection regarding data security and privacy is often the first provided objection but that a little digging usually reveals a more complex set of concerns, some technical, some practical and some cultural.

Question: Steve Ballmer, CEO of Microsoft, believes that security is a personal responsibility of everyone in the chain (employees, managers, end users). How important is the human factor in ensuring security at all levels?

Answer: Steve Ballmer’s comment highlights the absurdity of the data protection and privacy issue in the context of most businesses. That is to say, people are most commonly the weakest link in the security chain, closely followed by the systems and processes they devise. For example, in schools across the land you’ll still find passwords and user names written on post-it notes attached to the monitors of administrators with access to sensitive data about pupils. In the next breath, they will resist a cloud technology solution because they’re not sure where the data is located. There’s a significant lack of perspective about the relative significance of the human factor in most security breaches.

Question: Do you believe security is a two way responsibility for both users and providers?

Answer: In order to create a secure technology chain, people, processes and technology need to work together in a seamless way. This means reciprocal responsibilities between users and providers.

Question: Cloud providers are increasingly trying to convince users that, because of their heavy investments in hardware, software and staff, security in the Cloud may be better. Would you say that security on average is better in the Cloud compared to in-house security?

Answer: For small and medium-sized businesses in particular, I’d say that this is true as long as you believe the cloud providers themselves have robust and resilient systems. The reality of most SMEs is that pressure to compete and grow creates budgetary pressure, and that privacy and security are easy victims of this pressure. We still see many businesses which do not store and control data effectively and where staff are inadequately trained in the security systems. Aggregating demand through cloud removes part of this problem from the premises and frees up resources to focus on the ‘edge’ issues, i.e. people (and their systems).

Question: What legislation are you currently guided by in the Cloud industry? Do you believe it is sufficient for users’ security?

Answer: The UK’s Data Protection Act 1998, the US Patriot Act and the European Union’s Data Protection Directive all have something to say on this issue. In truth they’re all out of date in the context of cloud and there are various reviews of the legislation happening at present in order to stimulate the cloud industry. One of the issues is at what point permission is required from the data subject. At the moment, the legal view is that the data subject may need to provide permission even if a non-EU company stores data temporarily on an EU device, e.g. through a cookie as part of a social networking service. Moving personal data outside the EU therefore presents potential issues. Currently some cloud companies have circumvented this problem by basing data centres in the EU, e.g. Microsoft. Others have resisted making absolute statements about data location (such as Google) because their data is so widely replicated (data sharding) around their system for the very purposes of resilience, redundancy and security. So the legal landscape is somewhat at odds with the technical landscape.

Question: Some scholars have suggested we create an auditing board/authority to monitor activities of the providers. Do you think it is a good idea?

Answer: Issues of data security and privacy are very important issues. It may not seem so until something goes wrong and you are directly affected. Luckily most of us never experience the effects of a meaningful breach of our personal data. We may be irritated by it, for example if our credit card information is hijacked. However, there is a system of restitution in place and so it’s usually an irritation rather than a catastrophe. However identity theft (as another example) is potentially a very significant issue and one that is growing. So, in order to build confidence in the cloud, there inevitably needs to be some regulation and control. In the same way as integration standards between cloud providers will enhance take-up of cloud technologies, so regulation and legal harmonisation will enhance confidence and take-up.

Question: What are your predictions for Cloud computing security in the future?

Answer: As I said earlier, I think the shift to cloud is underway for most businesses. Whether it is as simple as web-based email or a web store front, or as complex as an entire company built on cloud computing, businesses are on the journey. To paraphrase Anais Nin, cloud adoption progresses when the risk it takes to remain tight in the bud is more painful than the risk it takes to blossom. Cloud leverages scale to deliver more for less. If it really does this well, then the business ecosystem will naturally select it. In my view, security and privacy are real issues that need to be tackled. The cloud providers are the guardians of valuable personal assets: our personal data. They are the data ‘banks’. Data is a valuable asset and therefore as vulnerable to abuse as the banking and financial systems. I would argue therefore that we need consistent and robust regulation and legislation in order to protect our interests. It is clear from the banking crisis that trust and best intentions rarely work out well for the individual. My prediction would be that ‘big data’ and the ‘cloud’ will be a very important trend over the coming decades and that a robust legal and regulatory framework will emerge, along with standards for multi-vendor cloud integration.

So that’s my take. What would your answers have been?

It’s all about the (validated) learning

This week I’m listening to The Lean Startup by Eric Ries. If you prefer visual consumption, check out Eric’s presentation to Google. If you like listening to books (like me), check out the Audible version. Whatever your preferred medium for consumption, and whether or not you’re interested in starting a business, I recommend that you engage with his thinking a little if you’re interested in learning in the context of change.

I should also state my personal interest. I’m currently a one third partner in a startup that’s sailing into uncharted waters (Airhead Education). We have a great idea (built around the concept of a cloud desktop for schools), great people (check out our technical guru, Jason Dixon’s blog) and, I think, the zeitgeist is in our favour. But we’re trying to bring a new technology paradigm to schools and that means change. We know about 10% of what we need to know to even build a meaningful business plan! There’s 90% (probably a lot more) to learn.

Eric begins his book by defining a startup as “a human institution designed to create a new product or service under conditions of extreme uncertainty.” This emphasis on ‘uncertainty’ is important. If I was to start a traditional grocery shop, I’d be walking a well-trodden path. There’s loads of explicit learning which I can access in order to understand how to make it successful. Most management theory is focused on this type of business where the idea and the customers are well understood. The keys to success are effective planning and efficient execution. But what if your idea is disruptive and visionary and you have no idea how your ideas and products will be received by customers, or even who your customers are? Traditional management theory falls down.

This is where Eric steps in. His thrust is that a new type of management theory is required under conditions of extreme uncertainty (sounds like Quantum Management Theory to me). He’s also keen to point out that, although this is the sea where entrepreneurs swim, it’s also vital for there to be a similar management theory for intrapreneurs (those who behave like entrepreneurs but in the context of a mature business). In fact, mature businesses are often very poor at innovating because the negative impact of failure is magnified. Even minor failures can reflect badly on the brand. Mature businesses are usually conservative for this reason.

I’m not going to provide a complete synopsis of Eric’s book but I wanted to pull out a couple of key points. In conditions of extreme uncertainty, one thing is for sure: you need to learn and fast. But is all learning equal? Most entrepreneurs fail a lot before they find success. You’ll hear them say things like, “Well, it was tough but I learned a lot.” What was it that they learned and was it worth learning? Perhaps at a personal level it was, but at the level of business, Eric argues that what would’ve been much more useful and timely to their venture was validated learning.

Validated learning is achieved using the scientific method. That is to say, you build a minimum viable product (MVP) or even just a mock up, set yourself a hypothesis to test, and then get out there and start testing it with customers immediately. Don’t wait until you have a great product. Don’t guess who your customers are and what they need. Build it. Measure it. Learn from it. Refine it. Go back around the loop. But faster this time (time and money are running out, remember?).

The problem with most entrepreneurs is that they’re passionately attached to their vision and find it hard to pivot (pivot = a strategic change of direction with one foot firmly planted in validated learning). The problem with engineers and designers is that they’re perfectionists and feel that they will be judged by the quality of their output. A minimum viable product is a scary idea for them. The problem with investors is that while you have zero revenue and a great idea, you’re exciting. As soon as you make a penny in revenue, the questions start coming: why so little? The clock is ticking.

So as sensible as validated learning is, it’s quite a tough management philosophy for participants in the startup to embrace. There’s actually quite a lot of momentum in a startup. The potential for agility, yes, but the appetite for it? Not so much. You may have to accept a potential pivot (major strategic change) for each cycle of the process. You will certainly be constantly tuning (tactical change) based on new data. You will also be asking your customers to accept (and pay for) something less than perfect. But how else can you systematically and meaningfully evolve your product unless it’s by validated learning? OK, you may be lucky and come up with the perfect formula first time. Unlikely. More than likely the market is changing almost as fast as your product. It’s a race!

This management theory is particularly challenging for mature companies who are good at planning and execution but for whom innovation has become an aspiration rather than a reality. The idea of putting an MVP in the hands of their valued customers is very scary. The idea of pivoting every other day (read ‘acknowledge failure and learn from it’) is even more scary. But this is what learning looks like. Hard graft and lots of mistakes. Why would it be different for a big company than a startup?

Personally I think there are important lessons in here for organisational change as well as entrepreneurs and intrapreneurs. As was pointed out to me the other day, I talk a good game in terms of advocacy for educational change, but what about the ‘how’? The problem is, I think, that many education leaders (like entrepreneurs) become victims of a grand plan when in fact what they need is constant evolutionary change based on validated learning. I call this the paradox of incremental transformation. What is called for in schools is not one grand plan. In fact the grand plan creates an unhelpful momentum of its own. It is not about unleashing massive transformation but rather a constant series of micro-experiments to test hypotheses that form the granularity of the plan and could change the plan. The key is to become agile at validated learning. Perhaps it’s important to point out at this point that learners need to be participants in their learning. This is the reason why change imposed from above (or externally) is often met with resistance.

To achieve evolutionary growth as an organisation, leaders need to build a culture of support for experimentation, failure, and in particular, advocacy for measurement and reliance on data to validate results. They need to be willing to react to validated learning quickly and implement change when it is proven to make a difference, even if the results are contrary to their expectations or wishes. The cycle of build, measure and learn is every bit as important to a school as to a startup.

Education fails technology?

As I’ve been blogging about the development of a School Technology Strategy, I’ve also been reading a recently published book called The Learning Edge by Bain and Weston. It’s a stimulating read in this context because it positions education as failing technology rather than the traditional reverse. That might not immediately chime with readers but bear with me. A few days ago I also read an interesting blog post by Wes Miller in which he explored the concept of ‘Premature Innovation’ in the context of Microsoft. The combination of these two sources has got me thinking…

Bain & Weston take the reader back to the work of Benjamin Bloom, the famous Educational Psychologist who in 1984 published ‘The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring’. In short, Bloom argued that one-to-one tutoring was the most efficient paradigm for learning but that, at scale, it is not practical or economical. He went on to say that optimising a relatively small number of significant variables may in fact allow group instruction to approach the efficiency of one-to-one tutoring. In this context, of particular interest is whether technology might simulate one-to-one tutoring effects such as reinforcement, the feedback-corrective loop and collaborative learning.

The promise of technology in education to date has almost always exceeded delivery and the blame has usually been attributed to technology. But is it really all the fault of technology? Well, Bain & Weston make a very interesting point in the context of Bloom’s research: although Bloom gave us a very useful framework for educational reform, there has been little systematic change in classroom practice for decades. The didactic model is still the beating heart of most schools. The practical implementation of research-based enhancements to pedagogy and curricula in schools has been painfully slow. In a very real sense, technology is the gifted student, sitting at the front with a straight back and bright eyes, full of enthusiasm, and being studiously ignored by educators. Education is failing technology.

Is this the whole story? Well, I certainly think it’s impossible to divorce a school technology strategy from an educational strategy with associated pedagogical and curricular implications. They go hand in hand. For example, a 1:1 ratio of devices to students is not going to make much of a dent in learning in a school if the underlying pedagogy is predominantly teacher-led. Technology will only ever leverage the benefits of a sound educational strategy and its practical manifestation. The biggest challenge for school leaders is therefore to construct a rigorous educational strategy and drive the change required to manifest it, using research and data to drive continuous improvement. I see limited evidence of this in most schools.

If I’ve convincingly shifted the blame away from technology, perhaps it’s time to balance the scales a little. When reading Bain & Weston’s book, I was struck by the fact that a lot of the research focused on technology that I think fundamentally fails education, regardless of the education strategy. I think bright eyed, bushy tailed technologists sometimes suffer from premature innovation. This is where a seemingly great idea isn’t adopted or fails to fulfil its promise. A startling example from Wes Miller’s blog is the tablet. Tablets have been around for quite a while with very limited adoption before Apple stepped into the market. They launched the iPad and now tablet numbers are burgeoning and 1:1 iPad models for schools seem to fill every other blog post I read. Why?

As Steve Jobs was well aware, technology does not get used unless it does what it is designed to do really well, and certainly better than a manual option. In a classroom, technology needs to work at the pace of the learner and/or the teacher. Even a 5 second delay can interrupt the pace and rhythm of a lesson. It also needs to be intuitive. It is just not fair to expect every teacher to be a technology expert and there isn’t time for endless training. Taking the iPad as an example, it’s hugely popular because a two year old can use it, it’s personal and mobile, wireless technology and the Internet have matured sufficiently to fill it up with engaging content, and it is reliable. It’s a turbo-charged book. The time is right.

Another example of a significant product failure in education due to premature innovation is the Virtual Learning Environment (or Managed Learning Environment or Learning Platform or Learning Management System). In the UK a Government agency called Becta was responsible for creating a functional specification for this product category. They then used this specification to put in place a framework off which schools might procure. The problem was that Becta tried to create an all-singing, all-dancing specification and it was just far too detailed. The resulting software created by the market to meet the requirement was therefore horribly over-engineered. The outcome? A very significant number of VLE products languishing in schools, not being used because they’re too difficult. A very big waste of money.

Again, in the VLE space we’re beginning to see disaggregation of the functional components into bite-size and usable chunks rather than a monolith with all the agility of a supertanker. Platforms are beginning to emerge which re-aggregate these simple elements into a manageable whole, retaining and enhancing usability in the process. The result? I’m beginning to see some interesting products in the VLE space.

Let’s not ever lose sight of the fact that technology is a tool and that my School Technology Strategy blog posts are implicitly (and now hopefully explicitly) intended to sit within the context of an educational strategy that attacks the 2 Sigma challenge with energy and evidence. Without educational change, the impact of technology on learning will be a placebo effect [placebo in the sense that there’s nothing fundamentally changing but leaders feel better for ticking the technology box]. It is also the case that, even with a sound educational strategy, technology will only make a difference if it adheres to some very basic principles of usability and usefulness, a test that most technology in schools still fails.

Leading technology

We have a lodger staying at the moment – a primary school teacher. While chatting I discovered that the laptop she was using was a school-supplied unit from the Laptops for Teachers (LfT) initiative, a programme kicked off by the DfES and Becta in 2002. “Of course I can’t do anything useful with it,” she said. “Huh?” I replied (in my usual articulate fashion). “They don’t like me to put any of my own stuff on it.” I’ll admit this floored me. One of two things was possible:

a)  Working on national projects with aspirations at the cutting interface of education and technology has unhitched me from the reality of technology in schools at the coalface, or
b)  My lodger’s school is at the end of a (no doubt long) trailing technology tail.

I think it’s probably a bit of both. I won’t go into the conversation that ensued, but it became clear to me that the technology in her school was being managed, not to enhance learning and teaching, but to minimise technical issues. Even now, it seems this is far too common.

I’ve been very lucky in my career so far to have visited many hundreds of education organisations. I’ve engaged with all manner of staff from leaders to technicians. What’s become clear to me over time – and please accept that this is a generalisation to which there are notable exceptions – is that the majority of education leaders built their education experience in a pre-digital age. They are not digital natives and regard technology as something between an expensive distraction and an interesting diversion. They don’t intuitively ‘get’ technology and they certainly don’t trust it to make a significant difference to learning outcomes or life chances. Their perception is that budget allocated to ICT is displacing spend on things they do understand, like teachers, and this is uncomfortable and so unwelcome. Furthermore, technology is evolving rapidly and so the knowledge they do have is constantly challenged and there’s relentless pressure on them to refresh their investment in terms of stuff and skills.

As a general rule, leaders are not very good at being out of control and I think technology is one of those areas where many leaders feel exactly that. I’ve met many heads who’ve been proud to tell me they don’t even own a computer, yet their organisation’s raison d’être is to prepare young people for a digital age. It’s also not uncommon to see a head wielding his or her iPad as evidence of a progressive attitude to ICT while their school languishes in the middle ground of technology adoption. It is one thing to be a user of technology and appreciate its merits, but quite another to develop and drive an ICT strategy for an organisation.

So technology is often perceived by leaders as a threat rather than a valuable ally in achieving successful outcomes. The usual responses to a threat are either to marginalise it or dominate it. Given that the former is becoming more and more difficult in a digital age, the latter is the usual course of action. The most common way of dominating technology is to regulate it into submission by creating ring-fenced, in-house control structures, both curricular and technical.

An internal structure is far less likely to expose or challenge than an external one. Better the devil you know. The technology manager in a secondary school usually becomes the trusted source of technical advice, despite the fact that he/she is probably under-qualified to be making learning-focused, strategic decisions about technology adoption. Yes, I know there may be another member of the SMT with the portfolio for technology, but I’m as wary of technology enthusiasts as I am of Luddites. I can count on the fingers of one hand the number of technology leaders I’ve met in schools who have any significant professional technology experience outside of their school. They usually mean well but lack perspective.

My contention is that in-house technology management is almost always inefficient and a distraction from the core organisational mission. In my opinion, the necessity for an ICT department has become a self-perpetuating myth in most schools and colleges. To change would involve asking the turkeys to vote for Christmas. This is of course why leaders need to get to grips with technology and lead their organisations from the front, not by becoming experts, but by taking expert advice.

To be clear, this is not a gratuitous critique of education leaders. The reason for making these observations is to shed light on the current state of technology in education organisations. In general, we see a very conservative landscape, with significant tracts of technology experience out of bounds for learners, let alone staff. We see tragic waste through under-utilisation of technology assets. We see technology managed to reduce support rather than to enhance learning and teaching. We see inefficient procurement. Mobile phones are a threat. Social networking is a threat. Parental access to school data is a threat. Data is a threat!

I see the proliferation of Interactive Whiteboards as a symptom of this malaise. They are a comfortable choice of technology because they simply perpetuate the same didactic techniques as before, just delivered with elevated anxiety. Do they improve learning outcomes? Where is the evidence? Yet the idea of engaging young people through their mobile phones in social learning is almost non-existent in schools. Did you know that 1 in every 5 minutes of Internet time was spent using Facebook in 2011? Where does the opportunity really lie?

My intention over the coming few weeks is to challenge the status quo and blog about how technology in schools can be different and better while costing less. I want to engage education leaders in a dialogue that’s about relinquishing technology control and focusing all their effort on their organisations’ core mission. The trend is already well underway in business, with many SMEs letting their CIOs go and outsourcing their ICT. They see they get better advice, better value, a more agile organisation and better outcomes. I think the education sector is ripe for a revolution and I’m delighted to be one of those waving a red flag.

2012 and beyond (part 1)

Around this time of year I’m always interested to see what the technology pundits predict for the following year. The advantage of being in a sector for a while is that you work out who’s worth listening to. Richard Holway from Techmarketview is a seasoned oracle with a great track record. Education is not his thing but it’s worth reading what he predicts for technology in 2012. Just replace "business" with "school" and "2012" with "2015+"! I jest, but it is true that trends in business often feed into education in time. I’ve added a few lines after each item offering my education-speak version of his prediction.

1. “It’s the economy, stupid” – Although what happens in the general economy – UK, Europe, US and globally – has always had some impact on UK SITS [software and IT services], it has often been minimal. Indeed, UK SITS has often thrived in downturns – indeed, growth has been spurred by the need to cut costs/change business models etc. But the UK faces the possibility of an unprecedented downturn which is just bound to affect the UK SITS sector and, indeed, consumer tech too. The Governor of the Bank of England was recently asked “What will happen to the economy in 2012?” and replied “I don’t know what will happen tomorrow, let alone next year”. So, the greatest driver for our markets in 2012 will be the economy. The greatest problem facing the executives in the UK SITS sector will be uncertainty. Nobody – not the Governor of the BoE or any TMV [TechMarketView] analyst – can accurately predict what will happen.

The impact on the education sector of the economic downturn in the UK is already being felt, and the termination of the Building Schools for the Future programme was an early indication of this. Traditionally education is fairly resistant to economic cycles but I think the situation is sufficiently dire to send waves out across the entire public sector, and indeed the country, with increasing pressure to do more for less.

2. Consumerisation of Enterprise IT – Consumerisation of Enterprise IT is already an established trend but will become mainstream from 2012 providing huge threats and similarly huge opportunities. This will particularly apply to mobile, social and tablets.

Consumerisation of Enterprise IT means users finding and using their own technology tools to meet their day to day requirements in work rather than being enterprise driven. Thus the power-base of enterprise IT companies is being diluted as users create their own ecosystem of technology to meet their needs, e.g. iPhone, Facebook, LinkedIn, Twitter etc. This trend is inevitable and desirable in the education sector too. It promotes a self-personalised experience as well as rapid innovation and diversity while reducing the management overhead for organisations.

3. Bring Your Own Tech – Similarly, BYOT will also go mainstream. Enterprises supplying tech items such as mobiles, laptops etc to employees will become as uncommon as the supply of company cars. BYOT will spur major growth in security systems and in desktop virtualisation. However, supply and support channels will be adversely affected by the BYOT trend in much the same way as manufacturers and suppliers of company cars were affected in the last decade or so.

BYOT is an extension of the consumerisation of enterprise IT. Again it’s about self-service, self-personalised technology solutions that meet the specific needs of the individual and remove the management and control of edge devices away from the organisation. It not only removes the management overhead from organisations but it also increases personal responsibility and utilisation. This is an important step to take in education in order to improve utilisation of, and therefore access to, technology.

4. Social media bubble bursts – Consumer social networks have already peaked. The winners are in place. Valuations were always in bubble territory and that bubble has also burst. However, just like the Internet bubble of 1999/2000, the world has changed. Social networks will have a huge effect on the next 10 years just as the Internet has had on the last decade. The real opportunities are now the adoption of social networks in the Enterprise.

The key point here for education is in the last sentence. There is a huge amount of potential tied up in social learning, both formal and informal, and most education organisations have resisted the integration of these platforms into their cultures. Adoption of social networks in and across education organisations will revolutionise when, where and how learning happens. It may not be 2012 for education organisations but young people are there already – in their tens of millions. We should take notice of them.

5. IT as a utility. “It’s business not IT, stupid” – For as long as I can remember, pundits have suggested that IT will become a utility – like the supply of electricity. They have made the point that nowadays nobody has an “Electricity Supply Director”. So, why do we have IT directors? Or even why do we have CIOs [Chief Information Officers]? The acceptance of BPS means that in many companies that day has already arrived. Much of the previous IT budget is now controlled by user departments. Decisions are taken for business not IT reasons. CIOs are probably a dying race. The same might well apply to some SITS companies. The need to supply a business solution already supersedes the need to supply an IT solution. I remember Paul Pindar has long objected to me ever referring to Capita as an IT company. I suggest most other companies will object in similar fashion in the future.

This is a key one for schools in particular. Even now, all round the world (and especially in secondary education and above) network managers and technicians have created stand-alone IT empires that are mostly about the technology. They are usually characterised by being non-standard, difficult to support, lacking in scalability and dreadfully inefficient. Technology paradigms such as the ‘cloud’ are offering great opportunities to deliver IT services from an off-premises location, increasing standardisation, availability, consistency and driving down cost. The consequence? More focus on learning in learning organisations. “It’s education not IT, stupid.”