Monday was a lovely day, as it was the first day that the new Jisc offices on Portwall Lane in Bristol were open. The new office wasn't quite finished, but the main working areas were open for business and it was a fantastic working environment. Lots of choice of different places to work: open plan, quiet spaces (though not library-silent working spaces), small meeting rooms ideal for silent working, social working spaces, all light, bright and airy. Loved it.
I spent time putting together a presentation I was delivering later in the week. Though it was mainly images, I did put together some explanatory graphics.
The talk was only ten minutes, so I hoped the graphics would explain the concepts more clearly and quickly. I also wrote a blog post about the presentation, which helped me formulate what I wanted to say in the talk. I wrote this in advance and scheduled it for publication later on the same day as the presentation.
As I had an early start in London on Wednesday, I travelled up on the Tuesday so I wouldn't be late for the event. In the end, due to some travel issues, I missed the start of the event, but I was on time to present my piece and to listen to some of the other talks in the first half.
I enjoyed delivering the presentation and, due to a last-minute illness, I was asked to chair the second half of the event.
My overall takeaway was that developing an intelligent campus (or even a smart campus) involves a range of stakeholders and users, and that all departments across a university need to be involved.
Thursday I was on leave, so Friday was more about catching up on missed e-mail from the week and doing some preparation for the Data Matters 2020 conference, ready for meetings next week.
Over the last few years I have been working on a project at Jisc called The Intelligent Campus. Though I left the project in March, in my new role and as part of Jisc's work on Education 4.0, I still have an interest in that space, in what is possible, and in what benefits it could bring to universities and colleges.
People are not fearful of algorithms, they’re fearful of agendas that the algorithms represent.
Over the last few weeks I have been discussing and listening to people’s views on the intelligent campus.
One topic which has resulted in a fair bit of controversy is the concept of using artificial intelligence to support teaching and learning. This isn’t some kind of HAL 9000 computer running the campus and refusing to open the library doors until Dave the learner has finished their essay. This is more about a campus system being able to learn from the users, take that data, do some analysis and make suggestions to the user on potential ideas for improvement and useful interventions.
Imagine a learner arriving at campus with the intention of writing an essay, needing a quiet place in which to do this. They check their Campus App on their smartphone and it recommends a location based on the ambient noise levels and the type of environment the learner has used before. It could take into account the distance from the coffee shop, depending on whether coffee is a distraction or something that supports the learner in writing their essay. The learner can of course ignore all this and just go where they want; the app simply provides informed guidance, and it learns as the learner undertakes more learning activities and uses different spaces.
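To make the idea concrete, here is a minimal, entirely hypothetical sketch of how such a recommendation might work under the hood. The data structures, weights and sensor readings are all assumptions for illustration; a real campus system would learn these from usage data rather than hard-code them.

```python
# Hypothetical sketch of a campus app ranking quiet study spaces.
# All fields, weights and values below are illustrative assumptions.

def recommend_space(spaces, preferred_type, max_noise_db):
    """Rank candidate spaces: quieter spaces, and spaces matching the
    environment type the learner has used before, score higher."""
    def score(space):
        s = 0.0
        # Reward low ambient noise (assumed live sensor reading).
        s += max(0.0, max_noise_db - space["noise_db"])
        # Reward the type of space the learner has chosen before.
        if space["type"] == preferred_type:
            s += 10.0
        # Treat distance from the coffee shop as a small bonus here,
        # i.e. assuming coffee is a distraction for this learner.
        s += space["coffee_distance_m"] * 0.05
        return s
    return max(spaces, key=score)

spaces = [
    {"name": "Library silent room", "noise_db": 35, "type": "silent",
     "coffee_distance_m": 200},
    {"name": "Atrium pods", "noise_db": 55, "type": "social",
     "coffee_distance_m": 20},
]
best = recommend_space(spaces, preferred_type="silent", max_noise_db=60)
```

The point of the sketch is that the recommendation is just a transparent weighted score; the learner remains free to ignore it, and the weights could be adjusted (or even inverted, for the learner whose essay runs on coffee) as the system learns.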
Another scenario is a teacher planning a session with some relatively interactive and engaging learning activities. They ask the intelligent campus where the best place is for this to happen. The system takes on board the preferences of the teacher, the availability of rooms, information from previously successful similar sessions and any feedback from learners. The teacher can then make an informed choice about the best space for this session. After the session, the system asks for feedback so that it can learn from and improve the decisions it makes.
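The feedback loop in this second scenario could be as simple as keeping a running average of learner ratings per room. The sketch below is a toy illustration of that idea; the class name, ratings scale and fallback score are my own assumptions, not a description of any real Jisc system.

```python
# Illustrative sketch of learning from session feedback: the system
# keeps a running average rating per room and folds new feedback in.
# Names, the 1-5 rating scale and the neutral fallback are assumptions.

class RoomFeedback:
    def __init__(self):
        self.ratings = {}  # room name -> (rating total, rating count)

    def record(self, room, rating):
        total, count = self.ratings.get(room, (0.0, 0))
        self.ratings[room] = (total + rating, count + 1)

    def average(self, room):
        total, count = self.ratings.get(room, (0.0, 0))
        return total / count if count else None

    def best_room(self, candidates):
        # Prefer the highest average learner rating; rooms with no
        # feedback yet fall back to a neutral midpoint score of 3.0.
        return max(candidates, key=lambda r: self.average(r) or 3.0)

fb = RoomFeedback()
fb.record("Seminar Room B", 4.5)
fb.record("Seminar Room B", 4.0)
fb.record("Lecture Hall 1", 2.5)
best = fb.best_room(["Seminar Room B", "Lecture Hall 1"])
```

Because every rating and every average is inspectable, a teacher can see exactly why a room was suggested, which is precisely the kind of openness argued for below.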
I think one of the issues (or should we call them problems and fears) that people have with a concept such as this is that they feel any such algorithm is secret and hidden and will have a built-in bias.
So if we are to use algorithms to manage the next generation of learning environments, the intelligent campus, support for future apprenticeships and data-driven learning gains, how can we ensure that we recognise that there is bias? If we do recognise the bias, how do we mitigate it? And if we do use biased algorithms, can we make people aware that the results carry that bias, and what it might mean?
People are not fearful of algorithms, they’re fearful of agendas that the algorithms represent. But if we make these algorithms open and accessible, we could mitigate some of those concerns.
So if we are to use algorithms to support teaching and learning, could we, by making the algorithms open, remove some of those fears and frustrations people have with a data-driven approach? By making the algorithms open, could we ensure that staff and learners can see how they work, how they learn and why they produce the results that they do?
This does bring up another issue that people have mentioned, which is the bias any algorithm has: bias which comes from the people who write it, sometimes consciously, sometimes unconsciously. There is an assumption that these algorithms are static and unchanging, written by people who have an agenda. But as we know from using Google and other services, these algorithms are constantly changing and being tweaked.
Could we go one step further and allow people to edit or even create their own algorithms? We could allow them to make suggestions on how the algorithms could be improved, creating new analytics that could benefit the wider community.
We need to embrace the benefits of a smart campus, because the technology is already here, but we need one which learns from the people who use it, and we need to ensure that those people are the ones who inform and guide the development of that learning. They can decide which intelligent campus decisions to benefit from and which to ignore. By making the whole process open and accessible, we can provide confidence in the decision making, and we can feel we can trust those decisions. We mustn't forget that giving people that literacy in this area is perhaps the most important thing of all.
Remember that in both scenarios above, the learner and the teacher both, ultimately, have the option to ignore the intelligent campus decisions; they can decide themselves to close the pod bay doors.