I have had the opportunity to work with some great people in Futures and from the sector. I did start to list them and realised that there had been so many I was bound to miss someone out. Thanks to everyone.
As Jisc’s Head of higher education and student experience I coordinate Jisc’s overall strategy for HE learning, teaching and student experience, and have lead responsibility for promoting the value and impact of the full programme of HE learning, teaching and student experience products and services delivered by Jisc.
I lead the ongoing review of Jisc’s HE learning and teaching strategy, positioning this work within the organisation’s overall strategy. I ensure that Jisc’s portfolio of activity in this area remains in line with Jisc’s HE learning and teaching priorities, and work closely with colleagues to develop Jisc’s understanding of the value and impact of all of our HE learning, teaching and student experience activities.
As Head of higher education and student experience I am also responsible for framing how current and future challenges in this area can be resolved by technological innovation and translating the key insights into actionable innovation pipelines that deliver real impact.
I manage the monitoring of national and regional HE learning, teaching and student experience customer and funder priorities, and work with Jisc account managers to examine the value customers ascribe to Jisc products and services in this area, to join up intelligence from funders and customers, and to share that intelligence internally as appropriate.
I also manage the process of directorates identifying and mapping operational activities to our HE learning, teaching and student experience priorities, and the tracking and measuring of impact: highlighting gaps, challenging work that is not aligned to priorities, and identifying emerging opportunities as they materialise.
If you are going to Jisc’s Digifest next week, come and say hello.
It was with a little trepidation that I stood on the stage at the 48th ADBU Congrès to deliver a keynote on the intelligent campus and the student experience. The audience were all French library professionals attending the Congress.
I delivered my presentation in English, and for those who needed it a translation service was available. The presentation covered the background to the Intelligent Campus project and how it builds on the existing Jisc analytics service. I briefly covered the service and what it enables for the universities and colleges using it. I also spoke about how the service can provide data and visualisations to students to improve their own performance.
I described the plan for the technical infrastructure behind the intelligent campus and how the data hub can be used to deliver data to different presentation layers. These presentation layers covered a range of possibilities.
Talking about tracking students and gathering other data about them brings the legal and ethical issues to the fore. It is important to think about these issues before moving ahead with analytics. We also considered the technical challenges: can we actually measure some of the things that would provide a useful insight? Are these insights even valid? It was this last point that was picked up in subsequent discussions and presentations at the Congress. Do certain kinds of activities actually help students to achieve and succeed? More research in this space is needed.
Many of the questions at the end of the presentation were similar to questions we’ve had at events in the UK.
Overall my keynote provided an insight into the work Jisc is undertaking in the Intelligent Campus space and how far we have come in the realm of learning analytics.
This is an updated version of this blog post from 2016. It now includes details of the 2016 and 2017 conferences.
Reading Maren Deepwell’s recent post about her #altc journey, it reminded me of the many conferences I have attended and like her the impact that they had on my life and professional practice. Going back to my experiences of my first ALT-C I was surprised I even went again!
So have you ever been tasked with writing a digital strategy? Do you know where to start? Do you know what is going to ensure it will work and be successful?
So if you are tasked with writing a digital strategy, you could write it in isolation, but prepare for it to be a low priority for people higher up. Also expect people in other directorates or departments to ignore it as they focus on their own strategies.
Jisc have recently published a leadership briefing written by Lawrie Phipps and myself. A key aspect is aimed at those tasked with writing strategies, where we argue that in order to get stronger “buy-in” there is a need to apply a digital lens to all strategies.
The paper proposes the concept of using a digital lens when approaching strategy, practice and process. The lens is made up of different aspects that need to be considered when applying digital to existing and intended structures.
It is necessary to identify which element will be looked at in digital contexts – for example, a particular teaching practice. Different digital options should then be explored to gain a thorough understanding of the range of possibilities. The benefits and risks of each possibility should be carefully weighed before deciding to deploy. As with all change, it is important to reflect and evaluate the nature and impact of the changes caused by the incorporation of digital.
There is a history of people talking about applying a lens to things, to look at them differently and give a different perspective on what has been written or talked about. These are sometimes called strategic lenses and can cover different areas such as design, customer focus, resources and culture, amongst others.
In this blog post I want to reflect on my own experiences in designing, developing and writing my own digital strategies. My initial experience was frustrating: a strategy that took a lot of my time and was then ignored, or certainly felt like it was ignored. It was almost a tick-box exercise, and the end result was that the strategy was put into a lever arch file and left on a shelf until the following year, when it would be reviewed, revised and published again.
As a TEL Manager in a college I was asked to deliver a digital learning strategy, though back then it was called the Information and Learning Technology, or ILT, strategy. Historically it had come about because funding from Becta was given to colleges on the basis of their writing an ILT strategy. This was often distinct from the IT strategy. The IT strategy was usually focused on the technical infrastructure to support the college business, whereas the ILT strategy was focused on embedding technology into teaching and learning. What often happened, though, was that the two strategies weren’t linked together and weren’t always linked to the corporate strategy, or if they were, those linkages weren’t always clear.
The end result was that sometimes these strategies were at odds with each other. You had an ILT strategy advocating a student BYOD policy, while the IT strategy was clear that non-organisation devices could not be connected to the wireless network.
It wasn’t just the IT strategy, I am aware of heated discussions between managers, where the ILT strategy was advocating a student BYOD policy and the Estates strategy was clear that non-organisation devices could not be plugged into the power sockets.
On top of all this was the core corporate strategy that was focused on something completely different.
I remember my ILT strategy talking about the use of the VLE by students and stating that all courses would have a presence on the VLE. Sounds fine, but academic staff didn’t see that as a priority, because the corporate strategy was talking about widening participation, improving teaching and learning, and better student outcomes. Staff saw the improvement of teaching and learning as a priority; they saw using the VLE as something extra, more work, so they a) didn’t use it, or b) would often say they didn’t have the time (which we now know means they didn’t consider it a priority). So a lot of my time was taken up “selling” the use of edtech. What I didn’t realise at the time was that what I was often doing was applying a digital lens to the existing strategy in order to “sell” the VLE or other edtech to academics. I would talk about how the VLE would enable them to “improve” teaching and learning, and could be used to “widen participation”. I started to realise that a strategy focused on tools was never going to be successful, whereas one focused on outcomes would be easily understood by managers and staff, and more easily achieved.
In a later role I had to write a combined IT, Libraries and Learning Technology strategy. We were being supported by an external consultant and I do remember one of the key things she said was that anything in our departmental strategies had to stem from the core corporate strategy.
A typical IT strategy will often say something like this:
Enable a secure, robust and stable network.
What this does is focus the minds of the IT and network teams to ensure that the network has high resilience, low downtime and is secure. As a result when academics and learning technologists want to try something new, they are “refused” because it could impact on the security and reliability of the network. Over the years I remember many times being told we couldn’t use this tool, access this service, because of the “importance” of enabling a secure, robust and stable network.
The problem was that the corporate strategy said
We will develop and deliver high quality teaching and learning, across a wide range of subjects and qualifications.
This meant developing new ways of teaching and improving learning. Academics wanted to try new and innovative practices, but the IT strategy was acting as a barrier.
If the IT strategy was linked to the corporate strategy and said:
Enable a secure, robust and stable network to allow high quality teaching and learning through the use of technology.
What this does is focus the minds of the IT and network teams to ensure that the primary focus and use of the network is on allowing innovative use of technology for teaching and learning. Yes they still need to ensure that the network is secure, resilient and stable, but their primary focus will be on ensuring that teaching and learning can make effective use of technology.
Any departmental or methodology strategy should always link back to the organisational strategy and how the objectives and actions will support the organisational strategic aims.
So how do you do that then?
Well that’s where the lens comes in.
So if you are tasked with writing a digital strategy, you could write it in isolation, but prepare for it to be a low priority for people.
If you apply a digital lens to the corporate strategy, you can demonstrate how digital technologies can enable that strategy. So rather than talk about how you are going to increase the use of digital technologies, the strategy talks about how the use of digital technologies will enable the strategic aims.
The leadership briefing we published provides a mechanism for doing just that. The next stage will be to distil the strategy into an operational plan; again, applying a digital lens will demonstrate how digital technologies can be an enabler and not a barrier.
Over the last three years I have been developing and delivering the Jisc Digital Leaders Programme, part of a wider team including Lawrie Phipps and Donna Lanclos.
Those two were recently interviewed by Chris Rowell. Though the focus of the podcast was supposed to be a chapter of a book that Lawrie and Donna had written, in the end it was mainly about the leaders programme.
DELcast #4 Interview with Lawrie Phipps & Donna Lanclos about Digital Leadership and Social Media
I really enjoyed listening to these two talk about the Jisc Digital Leaders Programme. The conversation reminded me how much the programme has changed since the initial pilots back in 2015, and what improvements and changes we have made to the programme.
People are not fearful of algorithms, they’re fearful of agendas that the algorithms represent.
Over the last few weeks I have been discussing and listening to people’s views on the intelligent campus.
One topic which has resulted in a fair bit of controversy is the concept of using artificial intelligence to support teaching and learning. This isn’t some kind of HAL 9000 computer running the campus and refusing to open the library doors until Dave the learner has finished their essay. This is more about a campus system being able to learn from the users, take that data, do some analysis and make suggestions to the user on potential ideas for improvement and useful interventions.
Imagine a learner arriving at campus with the intention of writing an essay, needing a quiet place in which to do this. They check their Campus App on their smartphone and it recommends a location based on the ambient noise levels and the type of environment the learner has used before. It could take into account the distance from the coffee shop, depending on whether coffee is a distraction or supports the learner in writing their essay. The learner can of course ignore all this and just go where they want; the app provides informed guidance and learns as the learner undertakes more learning activities and uses different spaces.
Another scenario is a teacher planning a session with some relatively interactive and engaging learning activities. They ask the intelligent campus where the best place is for this to happen! The system takes on board the preferences of the teacher, the availability of rooms, information from previously successful similar sessions and any feedback from learners. The teacher can then make an informed choice about the best space for this session. After the learning, the system asks for feedback so that it can learn from and improve the decisions it makes.
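As a rough sketch of how the recommendation in these two scenarios might work under the hood: a weighted scoring function over candidate spaces. Everything here is invented for illustration (the space names, the weights, the attributes); it is not drawn from any actual Jisc or campus system.

```python
from dataclasses import dataclass

@dataclass
class Space:
    name: str
    noise_db: float              # current ambient noise level in decibels
    distance_to_coffee_m: float  # walking distance to the nearest coffee shop
    past_sessions: int           # times this learner has worked here before

def score_space(space, prefers_quiet=True, coffee_is_distraction=False):
    """Return a score for a space; higher means a stronger recommendation.
    The weights are illustrative guesses, not tuned values."""
    score = 0.0
    if prefers_quiet:
        score -= space.noise_db * 0.5             # quieter spaces score higher
    score += space.past_sessions * 2.0            # favour spaces the learner already uses
    if coffee_is_distraction:
        score += space.distance_to_coffee_m * 0.01  # reward distance from the cafe
    else:
        score -= space.distance_to_coffee_m * 0.01  # reward proximity to the cafe
    return score

spaces = [
    Space("Library quiet zone", noise_db=35, distance_to_coffee_m=200, past_sessions=5),
    Space("Atrium", noise_db=65, distance_to_coffee_m=20, past_sessions=1),
]

# The app would surface the top-scoring space as a suggestion the
# learner is free to ignore.
best = max(spaces, key=lambda s: score_space(s, coffee_is_distraction=True))
```

The same pattern extends to the teacher's scenario by swapping in attributes such as room availability and learner feedback; the point is that the ranking is just an explicit, inspectable function.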
I think some of the issues (or should we call them problems and fears) that people have with a concept such as this is that they feel any such algorithm is secret and hidden and will have a built-in bias.
So if we are to use algorithms to manage the next generation of learning environments, the intelligent campus, support for future apprenticeships and data driven learning gains, how can we ensure that we recognise that there is bias? If we do recognise the bias, how do we mitigate it? Using biased algorithms, can we make people aware that any results have a bias, and what it might mean?
People are not fearful of algorithms, they’re fearful of agendas that the algorithms represent. But if we make these algorithms open and accessible, we could mitigate some of those concerns.
So if we are to use algorithms to support teaching and learning, could we, by making the algorithms open, remove some of those fears and frustrations people have with a data approach? By making the algorithms open, could we ensure that staff and learners could see how they work, how they learn and why they produce the results they do?
This does bring up another issue that people have mentioned: the bias any algorithm has, bias which comes from the people who write it, sometimes consciously, sometimes unconsciously. There is an assumption that these algorithms are static and unchanging, written by people who have an agenda. As we know from using Google and other services, these algorithms are constantly changing and being tweaked.
Could we go one step further and allow people to edit or even create their own algorithms? Allowing them to make suggestions on how they could be improved, creating new analytics that could benefit the wider community.
We need to embrace the benefits of a smart campus, because the technology is already here, but we need one which learns from the people who use it; we need to ensure that those people are the ones who inform and guide the development of that learning. They can decide which intelligent campus decisions to benefit from and which to ignore. By making the whole process open and accessible, we can provide confidence in the decision making, and feel we can trust those decisions. We mustn’t forget that giving them that literacy in this area is perhaps the most important thing of all.
Remember that in both scenarios above the learner and teacher ultimately have the choice to ignore the intelligent campus decisions; they can decide themselves to close the pod bay doors.
Do you remember when the Google algorithm wasn’t that good? Well, it was good, but today it’s better!
Many years ago, if you searched for a hotel on Google, so you could find out if there was car parking, or find the website for the restaurant menu, the search results most of the time were not the hotel website, but hotel booking sites offering cheap rooms. Pointless if you already had a room and all you wanted to know was whether you had to pay for car parking, or what time you could check out. The problem was that the hotel booking sites had worked out how the Google search algorithm ranked sites and “gamed” Google search.
Today, the experience is very different, the algorithm usually results in the actual hotel website being the top hit on any search for a specific hotel.
Google had worked on the algorithm and ensured that what they saw as the correct search result was the one at the top.
One thing that many people don’t realise is that Google not only worked on the software behind the algorithm, but also used human intervention to check that the algorithm was providing the search results they thought it should. If you wonder why Google search is better than the search functions on your intranet and VLE, this is probably why: Google uses people both to write the algorithms and to tweak the search results. Using people can result in bias.
So if we are to use algorithms to manage the next generation of learning environments, the intelligent campus, support for future apprenticeships and data driven learning gains, how can we ensure that we recognise that there is bias? If we do recognise the bias, how do we mitigate it? Using biased algorithms, can we make people aware that any results have a bias, and what it might mean? If we are to, like Google, use human intervention, how is that managed?
The one aspect of Google’s search algorithm that some people find frustrating is that the whole process is secret and closed. No one, apart from the engineers at Google really knows how the algorithms were written and how they work, and what level of human intervention there is.
So if we are to use algorithms to support teaching and learning, could we, by making the algorithms open, remove some of those fears and frustrations people have with a data approach? By making the algorithms open, could we ensure that staff and learners could see how they work, how they learn and why they produce the results they do?
Could we go one step further and allow people to edit or even create their own algorithms? Allowing them to make suggestions on how they could be improved, creating new analytics that could benefit the wider community.
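One way to picture what “open analytics” might look like in practice is a registry of plain, readable scoring rules that staff and learners can inspect, question and replace. This is a hypothetical sketch; the registry, the rule name and the 75% threshold are all invented for illustration, not part of any real system.

```python
# A minimal sketch of "open analytics": each analytic is a plain,
# readable function registered under a name, so anyone can see exactly
# how a result was produced -- and contribute an alternative rule.
analytics_registry = {}

def register(name):
    """Decorator that adds a rule function to the open registry."""
    def wrap(fn):
        analytics_registry[name] = fn
        return fn
    return wrap

@register("attendance_flag")
def attendance_flag(record):
    """Flag learners whose attendance has dropped below 75%.
    The threshold is visible here, so it can be debated and changed."""
    return record["attended"] / record["scheduled"] < 0.75

# Anyone can read the rule above, question its threshold, or register
# an improved version under a new name for the community to use.
record = {"attended": 7, "scheduled": 10}
flagged = analytics_registry["attendance_flag"](record)
```

The design choice is the point: because the rule is ordinary code rather than a black box, the question “why was I flagged?” always has an inspectable answer.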
Is it time for open analytics?
Thank you to Lawrie Phipps for the conversations we had after the Digital Pedagogy Lab: Prince Edward Island conference and this blog post.
At the recent AELP Autumn Conference I was having a discussion about the challenges that online delivery can be for training providers and employers. Often they have excellent instructors and trainers who work well in face to face classroom situations.
There is often an assumption that because someone is excellent in face to face learning scenarios, they will be able to easily transfer those skills into an online environment, as the scenarios are very similar.
This is quite a risky assumption to make: though there are similarities between delivering learning in classrooms and online, they are not the same.
It was, and can be, challenging for radio personalities to move into television; even though both are broadcast media, and there are similar programmes on both (think the News Quiz and Have I Got News For You), the skills for the different media are quite different.
In a similar vein, many stars of the silent cinema were unable to make the move to the talkies. Those that did, certainly thrived, those that couldn’t, didn’t!
If we are to make the move to a truly digital apprenticeship experience for all apprentices, then we need to ensure that the instructors and trainers involved in the delivery of learning have the right capabilities and skills to deliver effectively online.
Having the digital confidence, capacity and capability is something that often needs to be built in those staff who may already have excellent skills in delivering learning in face to face scenarios. Certainly there are many things which are transferable, but the skills in facilitating a classroom discussion are different to those in running a debate in an online forum.
So the question is, how do we build that digital capability? How are you building digital and online skills? What are you doing to ensure the successful transition to online delivery?
My phone knows where I’ve parked, why couldn’t my phone know where I am on my course?
I recently upgraded my phone to the most recent iOS and have been interested to note that my phone, which can connect to my car over Bluetooth, has now been giving me updates on where I parked. I usually know where I have parked and am able to find my car quite easily.
There are times where I think this could be useful, such as parking at the airport (but then I usually use my cameraphone to record the bay details).
Or when I am attending a meeting, an event or conference in a city I don’t know and then afterwards don’t recall the way back to the car park. I can recall at least twice in the last twelve months when that may have been useful, once was in Wolverhampton. Though most of the time I really need to know which level I am on in the car park.
So the notification of where I have parked my car isn’t as useful as my phone thinks it is. Maybe the day when I really need it I will think more highly of it.
The technology behind this though is somewhat clever. My phone has GPS so knows where it is and where it has been. It has Bluetooth which connects to my car (mainly for audio streaming, but also occasional hands free phone calls). There is importantly a software layer that enables the recording of the information and the notification.
This is only a simple aspect of what is a quite complex software layer. The software often tells me how long it will take to get home, and what roads to take (it hasn’t quite worked out that I usually use the train). My phone knows where I am and will suggest apps based on my location.
So, from an educational perspective, if my phone knows where I’ve parked, why couldn’t my phone know where I am on my course and provide contextual information about where I need to go and what I could be doing?
It would need a software layer that uses the same processes as it does for parking and travelling. The software layer would need to know who I was as a learner, where I was studying, what subject I was doing and where I was on my course. It would need access to a detailed learning plan (scheme of work) and would also need algorithms and access to data, so that it can direct advice and content appropriately. It would also need to overcome the annoyance factor we get with the “I know where you parked your car!”
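A minimal sketch of what that software layer could do with just a scheme of work and today's date: work out which week of the course the learner is in and surface a contextual suggestion, staying silent when it has nothing useful to say. The scheme-of-work entries, the week arithmetic and the message wording are all hypothetical.

```python
from datetime import date

# Hypothetical scheme of work: course week number -> (topic, suggested activity).
scheme_of_work = {
    1: ("Introduction", "Read chapter 1 before the seminar"),
    2: ("Research methods", "Draft your essay plan in the library"),
}

def course_week(start, today):
    """Which week of the course a date falls in (week 1 = first week)."""
    return (today - start).days // 7 + 1

def contextual_prompt(start, today, on_campus):
    """Return a suggestion for the learner, or None.
    Returning None when there is nothing relevant is deliberate: it
    avoids the "I know where you parked your car!" annoyance factor."""
    week = course_week(start, today)
    entry = scheme_of_work.get(week)
    if entry is None:
        return None
    topic, activity = entry
    if on_campus:
        return f"Week {week}: {topic}. Suggestion: {activity}"
    return f"Week {week}: {topic}."

# A learner on campus in the second week of a course starting 5 Sept 2016.
prompt = contextual_prompt(date(2016, 9, 5), date(2016, 9, 13), on_campus=True)
```

In a real system the scheme of work, location and learner identity would come from institutional data sources rather than a hard-coded dictionary, and the advice logic would be far richer; the sketch only shows the shape of the idea.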
Jisc are currently running their Co-Design 2016 challenge and this concept fits into the Intelligent Campus topic. You can find out more about that on this page on the intelligent campus blog.
So do you think this is something that would be useful, or would it be too complex and expensive to build?