Tag Archives: artificial intelligence

I am working, I may be at home, but I really am working – Weeknote #79 – 4th September 2020

Shorter week due to a Bank Holiday in England, the weather wasn’t up to much.

I wrote a piece about the reality of robots. The premise of the article was that:

When we mention robots we often think of the rabbit robots and the Pepper robot that we have seen at events. As a result, when we talk about robots and education, we think of robots standing at the front of a class teaching. However, the real impact that robotics will have on learning and teaching will come from the work being undertaken with robots in manufacturing and logistics.

The draft of the article was based on conversations and research I had done over the last few years. This was an attempt to draw those things together, as well as to move the discussion about robots in education away from toy robots, which are great for teaching robotics, towards how robots could and may impact the future of learning and teaching.

I remember in one job when we bought a Pepper robot to support the teaching of robotics. One of my learning technologists asked if the team could get one too. We then had a (too) long discussion on why we would need a robot and how it would enhance learning and teaching in subjects other than robotics. The end consensus was more that it was cool. This was a real example of the tech getting in the way of the pedagogy.

Pepper

It’s September, so schools and colleges are back this week, operating in a totally different way to what they were doing just six months ago.

At my children’s secondary school, the students will now remain in the same room throughout the day and it will be the teachers who move from room to room. Each child will have a designated desk at which they will sit each day for at least the first term, if not the rest of the academic year. It won’t be like this at colleges and universities, but restrictions will still need to be in place to mitigate the risk of infection.

There has been quite a bit of discussion online and in the press about people returning to the workplace. Sometimes the talk is of returning to work. Hello? Hello? Some of us have never stopped working, we have been working from home! The crux of the issue appears to be the impact of people not commuting to the workplace, and the effect this is having on the economy of the city centre and the businesses that are there.

Personally I think that if we can use this opportunity to move the work landscape from one where large portions of the population scramble to get to a single location by train or car, to one where people work locally (not necessarily from home), then this could have a really positive impact on local economies, as well as flattening the skewed effects that a commute-to-the-office working culture has on house prices, transport, pollution and so on.

I wrote more thoughts on this on my tech and productivity blog.

video chat
Photo by Dylan Ferreira on Unsplash

I read an article on The Verge this week which sparked my interest.

These students figured out their tests were graded by AI — and the easy way to cheat

I posted the link to the article to the Twitter (as I often do with links) and it generated quite a response.

It didn’t go viral or cause a Twitterstorm, but the article got people thinking about the nature of assessment and marking, and the involvement of AI. I wrote a blog post about this article, my tweet and the responses to it.

There was a new publication from Jisc that may be of interest to those looking at digital learning, Digital learning rebooted.

This report highlights a range of responses from UK universities, ranging from trailblazing efforts at University of Northampton with its embedded ‘active blended learning’ approach, to innovation at Coventry University which is transforming each module in partnership with learning experience platform Aula. The University of Leeds, with its use of student buddies, and University of Lincoln’s long-standing co-creation work are notable for their supportive student-staff approaches. University of York, however, focused on simplicity in the short term and redesign longer-term. The University of the West of Scotland is also focusing on developing a community-based hybrid learning approach for the new year.

I am going to teach, was a blog post I wrote about the nature of teaching in this new landscape.

The Office for Students are reviewing the challenges the sector faced during the Covid-19 pandemic and are calling for evidence.

This call for evidence is seeking a wide breadth of sector input and experience to understand the challenges faced, and lessons learned from remote teaching and learning delivery since the start of the coronavirus (COVID-19) pandemic in March 2020.

The OfS are looking to see what worked and what did not, what will work in the future, and where the student experience fits in all of this.

I was quoted a few times in this article, How digital transformation in education will help all children.

As many teachers and learners have discovered recently, Zoom fatigue is real, and it needs to be accounted for when designing curriculums. “You need to design an effective online curriculum or blended curriculum that takes advantage of the technology and opportunities it offers, but likewise doesn’t just bombard people with screentime that actually results in a negative impact on their wellbeing,” says Clay.

I also mentioned connectivity.

“As soon as you took away the kind of connectivity and resources you find on campus, it became a real challenge to be able to connect and stay connected,” says James Clay, head of higher education and student experience at Jisc.

This was something that was echoed in a recent survey on digital poverty from the OfS.

During the coronavirus (COVID-19) lockdown, 52 per cent of students said their learning was impacted by slow or unreliable internet connection, with 8 per cent ‘severely’ affected.

The survey also found that the lack of a quiet study space was impacting on the student experience.

71 per cent reported lack of access to a quiet study space, with 22 per cent ‘severely’ impacted

Friday was full of meetings, which made for a busy day.

My top tweet this week was this one.

It wasn’t cheating!

using a laptop
Image by Free-Photos from Pixabay

I read an article on The Verge this morning which sparked my interest.

These students figured out their tests were graded by AI — and the easy way to cheat

The student in the article was undertaking an online test; it wasn’t multiple choice, but short-form answers to questions.

…he’d received his grade less than a second after submitting his answers. A teacher couldn’t have read his response in that time, Simmons knew — her son was being graded by an algorithm.

What the parent found was that by using a mix of keywords, or “word salad”, the system would mark the answer as correct. So the student could “cheat” the system!

The article itself stemmed from a Twitter thread.

I posted the link to the article to the Twitter (as I often do with links) and it generated quite a response. It didn’t go viral or cause a Twitterstorm, but it got people thinking about the nature of assessment and marking, and the involvement of AI.

There was quite a bit of feedback that this wasn’t cheating, but actually providing an answer to the question that the AI would mark as correct.

Others felt that this wasn’t AI.

I would agree: this was being called AI, but it was more a system which matched keywords in the answers given by students against a list provided by a teacher. The system wasn’t analysing what and how an answer was written; it was a text-matching process.
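To make the flaw concrete, here is a minimal sketch of how such a keyword-matching "grader" might work, and why a word salad defeats it. The function name, keyword list and threshold are all invented for illustration; the actual product's algorithm is not public.

```python
# Hypothetical sketch of a keyword-matching "auto-grader" like the one
# described in the article -- not the actual product's algorithm.

def grade_answer(answer: str, keywords: list[str], threshold: float = 0.5) -> bool:
    """Mark an answer correct if enough of the expected keywords appear."""
    words = set(answer.lower().split())
    hits = sum(1 for kw in keywords if kw.lower() in words)
    return hits / len(keywords) >= threshold

keywords = ["photosynthesis", "chlorophyll", "sunlight", "glucose"]

# A genuine short answer passes...
print(grade_answer("Plants use sunlight and chlorophyll to make glucose", keywords))

# ...but a meaningless "word salad" of the same keywords passes too.
print(grade_answer("glucose chlorophyll sunlight photosynthesis", keywords))
```

Because the system never checks that the keywords form a coherent sentence, any answer stuffed with the right terms scores full marks.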

A flawed approach to testing, which resulted in students being able to “game” the system to get 100%.

The lesson here, for anyone looking at automated online assessment, is that if there is a way in which the system can be manipulated, then it probably will be.

All together now – Weeknote #32 – 11th October 2019

Birmingham
Birmingham

A busy week with travels to Bristol, Reading and Birmingham this week.

Monday I was in Bristol for a meeting with the Office for Students, one of the funders of Jisc. Following that I was back in the main office for further meetings.

There was an interesting long read on the Guardian website.

‘The way universities are run is making us ill’: inside the student mental health crisis

A surge in anxiety and stress is sweeping UK campuses. What is troubling students, and is it the universities’ job to fix it?

We know that there is a student mental health crisis, though the reasons for this aren’t necessarily clear. We know there has been an increase in the demand for mental health services at universities. The article notes that there has been research into the causes, but lays the blame for the crisis on the way in which universities are managed and run, leaving students not in control of what they do and saddled with debt.

Image by Karolina Grabowska from Pixabay
Image by Karolina Grabowska from Pixabay

Continue reading All together now – Weeknote #32 – 11th October 2019

Future of Teaching – Weeknote #23 – 9th August 2019

Doctor Johnson's House
Doctor Johnson’s House in Gough Square

Monday was another trip to London. I had been expecting to participate in a workshop, but this was cancelled late last week; as I already had train tickets and another meeting in the diary, I decided to head up anyhow. The weather was changeable, raining whilst on the train, but this cleared up by the time I arrived in London.

I saw this link in my news feed and it did make me think more about how we could use AI to support learning, but also reflect on some of the real challenges in making this happen. Also, do we want this to happen?

China has started a grand experiment in AI education. It could reshape how the world learns. – MIT Technology Review

I wrote a blog post about some thoughts I had on this.

Is this the future of “teaching”?

In the afternoon in the office we were discussing Education 4.0 and how we are going to move this forward in terms of expert thinking and messages.

Tuesday was a busy day: first a meeting in the Bristol office, before heading up to Cheltenham for a meeting at the HESA office.

CrossCountry train at Cheltenham Spa Railway Station
CrossCountry train at Cheltenham Spa Railway Station

I haven’t been on a CrossCountry train for a while now, so travelling to Cheltenham Spa from Bristol Temple Meads I was interested to see whether the 3G connectivity issues I’ve always had on that route had improved, especially as I now have 4G with Three. Well, same old problems: dipping in and out from 4G to 3G, as well as periods of No Service. I would like to blame the train, but the reality is that there is poor phone signal on that route, and there is no incentive for mobile network providers to improve connectivity.

If I do go to Cheltenham again, I think I will take a book!

We discussed the Data Matters 2020 Conference, which is now in my portfolio. It is still a work in progress and the proposal needs to be signed off by key stakeholders.

Pub in Cheltenham
The Vine Pub in Cheltenham

Whilst I was in Cheltenham I bumped into my old colleague Deborah from Gloucestershire College and we had a chat. What was nice to hear was how many of my former team and colleagues had started there in learning technology and were now doing new and more exciting jobs at universities across the UK.

Wednesday there was rain. I spent today preparing for a meeting in the afternoon and tidying up my inbox. Though I did find time for a coffee.

Flat White
Flat White from Hart’s Bakery

Thanks to Lawrie for the link, I read this report on the iPASS system, which uses data and analytics to identify students at risk.

The three institutions increased the emphasis on providing timely support, boosted their use of advising technologies, and used administrative and communication strategies to increase student contact with advisers.

This report shows that the enhancements generally produced only a modestly different experience for students in the program group compared with students in the control group, although at one college, the enhancements did substantially increase the number of students who had contact with an adviser. Consequently, it is not surprising that the enhancements have so far had no discernible positive effects on students’ academic performance.

Looks like it didn’t have the impact that they thought it might.

In a couple of weeks I am recording a podcast, and I met with the organiser today to discuss content and format. Without giving too much away, we will be covering the importance of people in any digital transformation programme: ensuring that they are part of the process and the consultation, and are given appropriate training in the wider context of their overall skills and capabilities. You can’t just give people new digital systems and expect them to be able to use them from day one, even with specific training. Familiarity with digital in its wider context is often critical, but is equally often forgotten.

Whilst writing a blog post about online learning I wrote the following:

Conversations are really hard to follow in e-mail, mainly because people don’t respond in a linear manner; they add their comment to the top of their reply.

When I first started using e-mail in 1997, well actually I first started using e-mail in 1987, but then got flamed by the e-mail administrator at Brunel University, so stopped using it for ten years….

When I re-started using e-mail in 1997, there was an expectation when replying to e-mail that you would respond by writing your reply underneath the original e-mail, bottom posting, which really was something that I got from using usenet newsgroups. This from RFC 1855.

If you are sending a reply to a message or a posting be sure you summarize the original at the top of the message, or include just enough text of the original to give a context. This will make sure readers understand when they start to read your response. Since NetNews, especially, is proliferated by distributing the postings from one host to another, it is possible to see a response to a message before seeing the original. Giving context helps everyone. But do not include the entire original!

By the early 2000s lots more people were using e-mail and most of the time they were replying at the start of the e-mail, top-posting. There were quite a few people in my circles who continued to bottom post their replies, which made sense when reading a threaded conversation, but confused the hell out of people who didn’t understand why someone replied to a conversation, and from what they could see, hadn’t written anything!

Today top-posting appears to be the norm and I can’t recall when I last saw someone responding to an e-mail by replying at the end of the quoted reply.

Here is the blog post I wrote, about how online learning doesn’t just happen.

Online learning doesn’t just happen

Friday was about planning, planning and even some forward planning. One thing that has puzzled me for a long time was the difference between forward planning and planning. Thanks to Google I have a better idea now.

Forward planning is being pro-active, predicting the future and then planning to achieve that prediction.

The opposite is backward planning, which is more reactive, you wait until you get a request or management decision then create a plan to achieve it.

So what is plain and simple planning then?

Wikipedia says that planning is the process of thinking about the activities required to achieve a desired goal.

So some of what I am doing in my planning is responding to both requested goals and planning for some predicted goals.

We had our weekly meeting about the Technical Career Pathways we are developing at Jisc. I am responsible for the Learning and Research Technical Career Pathway.

My top tweet this week was this one.

Is this the future of “teaching”?

learning
Image by Pexels from Pixabay

Today I saw this link in my news feed and it did make me think about how we could use AI to support learning, but also some of the real challenges in making this happen.

China has started a grand experiment in AI education. It could reshape how the world learns. – MIT Technology Review

An interesting read on how China is using its technological might (and economies of scale) to utilise AI to respond to the demand for tutoring, both to catch up and for the college entrance exam, the gaokao.

The article says that three factors have driven AI education in China: first, a $1bn investment, supported by tax breaks; second, intense demand for tutoring and people (read: parents) willing to pay for it; and third, masses of data to refine algorithms, from a population not as concerned about data privacy as we are in the West.

This AI teaching model is gaining traction in China, but could the UK do something similar? Well, we certainly couldn’t match the investment, people don’t like paying for stuff, and we have big ethical concerns about data and privacy.

Could we learn from China?

Well, one aspect of the tutoring system is how it adapts to the needs of the learner, and as a result personalises the learning that is made available to them. Could we use a similar system to support learners in their learning journey? Not necessarily fulfilling the entire journey, but aspects and parts of it.

Well, the system doesn’t necessarily remove the human function within learning and teaching, in the same way that books didn’t, and in reality neither did the VLE. What it does do is free up time for teachers to focus on those aspects of learning which technology and AI can’t do well, or at all.

Group working
Image by StockSnap from Pixabay

I don’t see AI systems replacing teachers, in the same way that text books and workbooks don’t replace teachers. What I can see is how such systems could enhance the learning journey: rather than a blanket list of resources and links, a more focused and personalised approach to meeting the learner’s needs.

I can see a system also working intelligently to understand the context the learner is learning in. Are they travelling, on campus, or at home? What is their connectivity like? Are they on their own, or with peers from their cohort? What assessments do they have coming up? Have they been working for a while; do they need a break?

Before we get to that, there is a lot of work to be done on how we measure these aspects of learner context. Importantly we also need to consider the ethical aspects of any such system.

Open the pod bay doors…

People are not fearful of algorithms, they’re fearful of agendas that the algorithms represent.

2001: A Space Odyssey

Over the last few weeks I have been discussing and listening to people’s views on the intelligent campus.

One topic which has resulted in a fair bit of controversy is the concept of using artificial intelligence to support teaching and learning. This isn’t some kind of HAL 9000 computer running the campus and refusing to open the library doors until Dave the learner has finished their essay. This is more about a campus system being able to learn from the users, take that data, do some analysis and make suggestions to the user on potential ideas for improvement and useful interventions.

Imagine a learner arriving at campus with the intention of writing an essay, needing a quiet place in which to do this. They check their Campus App on their smartphone and it recommends a location based on the ambient noise levels and the type of environment the learner has used before. It could take into account the distance from the coffee shop, depending on whether coffee is used as a distraction or supports the learner in writing their essay. The learner can of course ignore all this and just go where they want; the app provides informed guidance, and learns as the learner completes more learning activities and uses different spaces.

Another scenario: a teacher is planning a session with some relatively interactive and engaging learning activities. They ask the intelligent campus where the best place is for this to happen. The system takes on board the preferences of the teacher, the availability of rooms, information from previously successful similar sessions, and any feedback from learners. The teacher can then make an informed choice about the best space for the session. After the learning, the system asks for feedback so that it can learn from and improve the decisions it makes.
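The first scenario could be sketched as a simple scoring function over candidate spaces. Everything below, the class, the weights, the data, is invented for illustration; a real intelligent campus system would learn these weightings from the learner’s history rather than hard-code them.

```python
# Hypothetical sketch of how a campus app might rank study spaces.
# All names, weights and data are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Space:
    name: str
    noise_db: float         # current ambient noise level
    visits: int             # how often this learner has worked here before
    coffee_distance_m: int  # distance to the nearest coffee shop

def score(space: Space, prefers_quiet: bool = True, coffee_helps: bool = False) -> float:
    """Higher score means a stronger recommendation for this learner."""
    s = 0.0
    if prefers_quiet:
        s -= space.noise_db                  # quieter spaces rank higher
    s += 5 * space.visits                    # familiar spaces rank higher
    if coffee_helps:
        s -= 0.05 * space.coffee_distance_m  # closer coffee ranks higher
    return s

spaces = [
    Space("Library silent floor", noise_db=35, visits=4, coffee_distance_m=300),
    Space("Atrium cafe", noise_db=65, visits=1, coffee_distance_m=10),
]

best = max(spaces, key=score)
print(best.name)  # Library silent floor
```

The recommendation stays advisory: the app only orders the options, and the learner is free to ignore the ranking entirely, which is exactly the point made above.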

I think some of the issues (or should we call them problems and fears) that people have with a concept such as this is that they feel any such algorithm is secret and hidden, and will have a built-in bias.

HAL 9000 reflecting Dave’s entry in Stanley Kubrick’s 2001: A Space Odyssey

As I wrote in my previous blog post on open analytics:

So if we are to use algorithms to manage the next generation of learning environments, the intelligent campus, support for future apprenticeships and data driven learning gains, how can we ensure that we recognise that there is bias? If we do recognise the bias, how do we mitigate it? Using biased algorithms, can we make people aware that any results have a bias, and what it might mean?

People are not fearful of algorithms, they’re fearful of agendas that the algorithms represent. But if we make these algorithms open and accessible, we could mitigate some of those concerns.

So if we are to use algorithms to support teaching and learning, could we, by making the algorithms open, remove some of those fears and frustrations people have with a data-driven approach? By making the algorithms open, could we ensure that staff and learners can see how they work, how they learn, and why they produce the results that they do?

This does bring up another issue that people have mentioned: the bias any algorithm has, bias which comes from the people who write it, sometimes consciously, sometimes unconsciously. There is an assumption that these algorithms are static and unchanging, written by people who have an agenda. As we know from Google and others, these algorithms are constantly changing and being tweaked.

Could we go one step further and allow people to edit or even create their own algorithms? Allowing them to make suggestions on how they could be improved, creating new analytics that could benefit the wider community.

We need to embrace the benefits of a smart campus, because the technology is already here, but we need one which learns from the people who use it; we need to ensure that those people are the ones who inform and guide the development of that learning. They can decide which intelligent campus suggestions to benefit from and which to ignore. By making the whole process open and accessible, we can provide confidence in the decision making and feel we can trust those decisions. We mustn’t forget that giving people that literacy in this area is perhaps the most important thing of all.

Remember that in both scenarios above the learner and the teacher, ultimately, can choose to ignore the intelligent campus decisions; they can decide themselves to close the pod bay doors.