Category Archives: video

Transforming the television watching experience

old television
Image by Free-Photos from Pixabay

When we start talking about digital transformation, I often see people focusing on the digital aspect and expecting the transformation to follow. Where we see true transformation, the focus is on the change, with digital enabling that change.

In a previous blog post I wrote about the changes digital brought to the music industry, with a specific focus on the retail aspect. In this similar post I want to think about the impact digital has had on television and, probably more importantly, on the ways in which we watch television.

Television has been around for a while now. The early 1950s saw an explosion of television ownership in the UK. These were analogue devices that enabled broadcast television in the home. Originally all television was live; it was the development of videotape that allowed television to be recorded in advance and broadcast later.

1998 saw the launch of digital terrestrial television in the UK with ONdigital. However, digital terrestrial television really took off in the UK when Freeview was launched in October 2002. In order to view the digital signal you needed either a set-top box or a television with an integrated digital tuner. This was very much a digitisation of the television.

Over the next few years we saw televisions become smarter (and flatter and larger).

However, what really transformed television wasn’t the digitisation or digitalisation of the hardware.

Remote control
Image by tookapic from Pixabay

If we separate the physical television hardware from the experience of watching television, then we are now seeing the digital transformation of the television watching experience.

When we look at digital transformation, it becomes obvious that focusing on the hardware or technology is actually quite limiting.

If we go through the story of television again, but rather than look at the hardware to watch television, we focus on the experience of watching television, we can start to see how digital enhanced, enabled and transformed the experience of watching television.

When television was broadcast, you had no choice but to watch what was on when it was on. Yes you had a choice of channels, but not a huge choice.

The VCR (video cassette recorder) did transform the way in which we could watch television: we could now time-shift our viewing, and video rental shops allowed us to watch things that weren’t on television. I remember travelling by train on the 4th July 1990, when someone in our carriage was watching the World Cup semi-final between Germany and England on a small portable television. It was a tiny screen, it was black and white, and every time we went through a tunnel they lost the signal. The poor bloke was also surrounded by people who wanted to watch the game. I remember the running commentary when the match went to penalties. The portable television and the VCR were both technologies that transformed the television watching experience, but this was not a digital transformation of that experience. However, this change would influence how we wanted to watch television and, as digital enabled further technological change, it would eventually result in a digital transformation of the television watching experience.

The launch of digital terrestrial television of course changed the watching experience: now we had access to lots of channels, and the EPG (electronic programme guide) enhanced that experience. Though for some it meant more time scrolling through those channels.

What really transformed the watching experience was when digital technologies detached that experience from the physical television.

Back in the early 2000s I had a Compaq iPAQ handheld PDA with a jacket that could be used with a CompactFlash memory card. I do remember, as an experiment, ripping a DVD, compressing the resulting video file, copying it over to a 1GB IBM MicroDrive CF memory card and watching video on the move. It was challenging to do and not something the average consumer would do. However the concept was there of watching television on a small screen, regardless of my location.
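That experiment is trivial today. A minimal sketch of the modern equivalent, assuming ffmpeg is available (the filenames, target width and bitrates are all hypothetical, chosen to suit a small handheld screen); the command is printed rather than executed, since the input file doesn’t exist:

```shell
# Hypothetical modern equivalent of the iPAQ experiment: shrink a video
# to 320 pixels wide (height follows automatically, kept even with -2)
# and re-encode at low bitrates so it fits a small memory card.
cmd="ffmpeg -i input.mp4 -vf scale=320:-2 -b:v 400k -b:a 64k handheld.mp4"

# Print the command rather than run it, as input.mp4 is a placeholder:
echo "$cmd"
```

To actually run it, replace `input.mp4` with a real file and execute the command directly rather than echoing it.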

There followed the development of small handheld devices, be they phones or tablets, which now have sufficient processing power to deliver high-quality video. Video content is also now much more easily available. Connectivity has changed as well, with 4G (and now 5G) allowing high-quality video to be streamed over the internet regardless of location.

Mobile video
Image by Claudia Dewald from Pixabay

Suddenly we could watch television when we wanted, where we wanted and how we wanted. Services exploited this transformation of the television watching experience: subscription services such as Netflix, on-demand services such as BBC iPlayer, and downloadable content from services such as Google Play or iTunes enabled people to have a very different television watching experience. In many ways the digital transformation of watching television has resulted in whole box sets being released at once (as opposed to an episode being released weekly, though that still happens). We’ve also seen a huge explosion in short videos, through services such as YouTube and TikTok.

One example of the impact of this transformation of the watching experience is how television episodes are no longer constrained by the artificial construct of the broadcast schedule. Until recently, most US series had episodes that were 45 minutes long, so that with adverts they would fit into the one-hour slot allocated to them. With services such as Netflix and other on-demand platforms, the removal of this constraint has enabled producers to make episodes of different lengths to suit the story of each episode; some will be longer and some will be shorter. The fourth season of Stranger Things is a case in point: only one episode is an hour, five are between 74 and 78 minutes, and the finale is one hour 38 minutes. Okay, these are all longer…

So, to remind us: when we look at digital transformation, it becomes obvious that focusing on the hardware or technology is actually quite limiting. When looking at the digital transformation of education, we really want to focus on the transformation of education, and on how digital can enable and enhance that transformation.

The VLE is not dead – Weeknote #167 – 13th May 2022

Image by drippycat from Pixabay

Monday morning, I was off to Queen Mary University of London for their VLE Expo. This was very much a QMUL-focussed event, though they had invited a range of VLE vendors. I liked how the focus of the event was on the questions: what do we want to do to achieve our strategic aspirations, how will the VLE help us do that, and which platform (or platforms) will enable us to do that?

There were some excellent presentations from the academic staff on the different ways in which they were using technology including virtual reality, mixed reality and H5P. I sat on the final panel session answering questions from the floor on a range of issues. A lot of the questions were more about the use of technology for learning and teaching, than VLE specific topics. However, I did get into a few discussions about the VLE on the Twitter as a result of attending the event.

I posted another blog post in my Lost in Translation series this time with a focus on the technical aspects of recording videos or audio files.

Most institutions will (probably) have equipment which staff can use, but if there is a strategic approach to building a sustainable approach to the use of video and audio, then universities will need to reflect if they have sufficient resources to support the increased demand for cameras and microphones.

video recording
Image by StockSnap from Pixabay

Tuesday, I was still in London for a briefing session; as it happened it was cancelled, so I worked in the office.

Apple have announced that they are going to stop selling the iPod once the current stocks of iPod touch run out. So did you have an iPod and if so which one?

iPod
Photo by Cartoons Plural on Unsplash

Wednesday, I did two all-staff briefings for two directorates on the Jisc HE sector strategy. From the feedback I got they seemed to be well received.

I was reminded on the Twitter about when I took my bike to work. I made a video back then.

Mike Sharples posted an excellent Twitter thread on how AI can be used to write essays. I agree with Mike, if we are setting students assignments that can be answered by AI, are we really helping students learn?

I enjoyed the #LTHEchat on images in presentations in the evening.

These two blog posts from 2005 (and 2007) were very influential on my presentation style: Gates, Jobs, & the Zen aesthetic and Learning from Bill Gates & Steve Jobs. I also posted a link to a presentation from an internal TEDx event about delivering presentations – A duck goes quack.

Thursday, I made my way to Harwell for a drop-in session I was running at the Jisc offices there. Alas, an accident and the closure of the M4 meant I spent nearly four hours sitting in the car rather than sitting in a room talking to Jisc staff. In the end I had to abandon my visit to the office.

Friday, I had a scoping call about learning spaces in higher education. I am interested in the kinds of learning spaces higher education is using, their flexibility, the technology, and the kinds of activities the spaces are being used for.

I found this WonkHE article interesting – Learning design is the key to assuring the quality of modular provision in which Nick Mount talks about building quality assurance into the design of modular programmes and micro-credentials.

Traditional providers can expect to find themselves facing the difficult job of rethinking existing assurance processes that are designed for coherent, longitudinal programmes of study, so that they can accommodate a new pick-and-mix landscape of highly portable and stackable micro-credential learning.

My top tweet this week was this one.

Lost in translation: cameras and microphones

video recording
Image by StockSnap from Pixabay

As part of my work looking at the challenges of delivering teaching remotely during this crisis period, I have been reflecting on how teaching staff can translate their existing practice into new models of delivery that could result in better learning, but also have less of a detrimental impact on staff and students.

One of the things we noticed when the pandemic struck and lockdown happened, as the education sector moved rapidly to remote delivery, was the different models that people used. However, what we did see was many people translating their usual practice to an online version.

In my post on translating the lecture I discussed the challenges of translating your 60-minute lecture into a 60-minute online video presentation.

There are some problems with this as you are not providing an online video version of the lecture. You are using a platform like Teams or Zoom to deliver the lecture via a webcam. You will not be able to “read” the room as you can in a face to face environment. Video presentations also lose much of the energy that a physical presentation has. It can flatten the experience and people will disengage quite rapidly.

In a couple of posts in this series I discussed how you could reflect on the format of the lecture by looking at how content is produced and delivered for television and radio. 

One aspect I didn’t discuss in too much detail was the technical aspects of recording videos or audio files.

webcam
Photo by Waldemar Brandt on Unsplash

Back in the day, most laptops didn’t have webcams, and I remember buying external iSight cameras to use with my G5 Power Mac. Today you would be hard pressed to buy a laptop without a built-in webcam, the iPad comes with two cameras (front and back). It’s the same with microphones, the G5 Power Mac had an audio-in mini-jack for an external microphone, though I went out and got a USB Blue Snowball.

So today most people using a computer will have the technical capability to record video and audio easily. However, there is more to creating high-quality content than the ability to turn on a webcam or speak into the laptop microphone. These tools are fine for video conferencing, but aren’t necessarily ideal for creating videos or audio recordings.

microphone
Image by StockSnap from Pixabay

Using external cameras and microphones is one way to achieve better quality recordings than the built-in hardware on your laptop allows.

During the pandemic lockdowns, using your laptop was acceptable. Moving forward and creating new recordings, it makes sense to have better equipment. It’s not just about cameras, but also decent microphones for those cameras.

Most institutions will (probably) have equipment which staff can use, but if there is a strategic approach to building a sustainable approach to the use of video and audio, then universities will need to reflect if they have sufficient resources to support the increased demand for cameras and microphones.

video recording
Image by Pexels from Pixabay

Going forward, maybe decent cameras and microphones will become staple academic kit, in the same way that laptops are now provided.

In a future post I will talk about creating an ideal environment for recording television style and radio content.

Pausing for thought

Just a thought, a one hour lecture paused six times, isn’t this the same as six ten minute lectures?

Gravity Assist

Yesterday saw the publication of the Office for Students’ report Gravity assist: propelling higher education towards a brighter future. It is their review of the shift toward digital teaching and learning in English higher education since the start of the coronavirus (COVID-19) pandemic.

It is a 159 page report that attempts to capture the lessons from an extraordinary phase of change.

I have been reading the document and, overall, yes, I do welcome the report; I think it covers the background and the response to the pandemic well.

However moving forward, it misses many opportunities to fully exploit the affordances of digital.

An example, from page 42:

We heard that to create high-quality digital teaching and learning, it is important not simply to replicate what happens in an in-person setting or transpose materials designed for in-person delivery to a digital environment. For example, an hour-long in-person lecture should not simply be recorded; rather, it needs to be broken down into more manageable chunks. In other words, teachers need to reconsider how they approach teaching in a digital environment.

Why does it need to be broken down? Has no one heard of the pause button?

A one hour lecture paused six times, is the same as six ten minute lectures.

How is six ten minute lectures better quality than a one hour lecture paused six times?

video camera
Image by Robert Lischka from Pixabay

I wrote this back in September following a conversation about this very issue with a lecturer who, like me, couldn’t work out why they needed to chunk their lectures.

Doing a 60 minute (physical) face to face lecture is one thing; generally most people will pause during their lecture to ask questions, respond to questions, show a video clip, take a poll, etc…

Doing a 60 minute live online lecture, using a tool like Zoom or Teams, is another thing. Along with pauses and polls as with a physical face to face lecture you can also have live chat alongside the lecture, though it can help to have someone else to review the chat and help with responses.

Doing a 60 minute recording of a lecture is not quite the same thing as a physical face to face lecture, nor is it a live online lecture.

There is a school of thought which says that listening to a live lecture is not the same thing as watching a recording of a lecture and as a result that rather than record a 60 minute lecture, you should break it down into three 20 minute or four 15 minute recordings. This will make it better for the students.

For me this assumes that students are all similar in their attention span and motivation, and as a result would not sit through a 60 minute lecture recording. Some will relish sitting down for an hour and watching the lecture, making notes, etc… For those that don’t, well there is something called the pause button.

With a recording you don’t need to break it down into shorter recordings, as students can press the pause button.

remote control
Image by tookapic from Pixabay

Though I think a 60 minute monologue is actually something you can do, why do it all the time? For other sessions you could do different things, such as recording a lecture in the style of a television broadcast or a radio programme.

If you were starting afresh, there is something to be said for breaking an online lecture down into more sections and interspersing them with questions, chat and polls, just as you would within a 60 minute physical face to face lecture. If you have the lecture recordings already, or have the lecture materials prepared, then I would record the 60 minutes and let the students choose when to use the pause button.

The reality is that if you want to create high quality digital teaching and learning, you need to start from the beginning with what the learning outcomes will be. Chunking your existing processes doesn’t result in a higher quality experience; it merely translates what you did before into a digital format that is not as good. It loses all the nuances of what made that physical in-person session so good and has none of the affordances that digital could bring to the student experience.

Probably the best way of thinking about this, is this is a first stage, the next stage is making it happen.

DVD pedagogy in a time of digital poverty

DVD
Photo by Phil Hearing on Unsplash

The challenges of digital poverty are making the news, with demands to ensure students have access to devices and connections. What isn’t making the news so much is demands to rethink the curriculum design and delivery so that it is less reliant on high end devices and good broadband!

Could we deliver content and learning via a USB stick or even on DVD?

This tweet by Donald Clark, about a suggestion by Leon Cych to use USB flash drives, reminded me of a presentation I delivered fifteen years ago.

Back in 2006 I was looking at how learners could access learning content despite not having a fancy laptop (or desktop) or even internet connectivity.

I was intrigued about how consumer devices used for entertainment, information and gaming could be used to access learning.

I also did a fair amount of work reflecting on how to convert learning content (from the VLE) to work on a range of devices from the PlayStation Portable (PSP), iPods, mp3 players, as well as devices that usually sat under the television, such as DVD players and media streaming devices.

So for an online conference I prepared a presentation on this subject.

Continue reading DVD pedagogy in a time of digital poverty

Lost in translation: the television programme

old television
Image by Free-Photos from Pixabay

I have been working on a series of blog posts about translating existing teaching practices into online models of delivery. In previous posts I looked at the lecture and the seminar, in this one I want to focus on the conversation.

One of the things I noticed as the education sector moved rapidly to remote delivery was the different models that people used. However, what we did see was many people translating their usual practice to an online version; some have called this practice mirroring.

As part of my work looking at the challenges of delivering teaching remotely during this crisis period, I have been reflecting on how teaching staff can translate their existing practice into new models of delivery that could result in better learning, but also have less of a detrimental impact on staff and students.

In my post on translating the lecture I discussed the challenges of translating your 60-minute lecture into a 60-minute online video presentation. There are some problems with this, as you are not providing an online video version of the lecture. You are using a platform like Teams or Zoom to deliver the lecture via a webcam. You will not be able to “read” the room as you can in a face to face environment. Video presentations also lose much of the energy that a physical presentation has. It can flatten the experience, and people will disengage quite rapidly.

So at a simple level, you could create a 60 minute video or audio recording to replace the physical lecture or live zoom session. However simply recording yourself misses a real opportunity to create an effective learning experience for your students. If you have watched a 60 minute TV programme, you will realise few if any have a talking head for 60 minutes. Few of us have the time or the skills to create a 60 minute documentary style programme to replace the lecture, and where would you go to film it?

So if you change the monologue to a conversation then you can create something which is more engaging for the viewer (the student) and hopefully a better learning experience.

It could be an interview between two people, but why not make it a real conversation and have three or four people involved? When involving people, do think about diversity: are all the people involved old white men? If they are, it’s time to think differently about who is involved. It could be a debate: a heated discussion on a topic between two opposing views.

When thinking about what is going to happen, it makes sense to plan the conversation. This isn’t about scripting, but about deciding what topics you are going to cover and how long you will spend on each of them. What are the objectives of the video (or session)? What are the learning outcomes the students should achieve by watching the video? You may want to consider writing some ideas or prompts for the students to think about and make notes on as they watch the video.

Image by InspiredImages from Pixabay

You will probably need to use some kind of video tool such as Zoom or Teams to do this.

From a recording perspective, you want to try and keep all the people on screen at the same time (gallery mode) or, if not, at least use selective muting to avoid the focus jumping from person to person. Also try and use decent microphones to get decent audio. People are much more forgiving of poor-quality pictures than of poor-quality audio when it comes to internet-based video.

Image by Free-Photos from Pixabay

Another method is to use Zoom or Teams to have the discussion, but use an actual camera and decent microphone to record the video and then combine them in post (as in post production) using a tool like Final Cut or even iMovie, to create a better quality video. 
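The “combine them in post” step can even be done without a full editing suite. A minimal sketch using ffmpeg instead of Final Cut or iMovie (the filenames are hypothetical): it takes the picture from the camera recording and the sound from the external microphone, copying the video stream so nothing is re-encoded. The command is printed rather than executed, since the input files are placeholders:

```shell
# Mux the camera picture with the external microphone audio:
#   -map 0:v:0   take the video stream from the first input (camera)
#   -map 1:a:0   take the audio stream from the second input (microphone)
#   -c:v copy    keep the video as-is (no quality loss, fast)
#   -shortest    stop at the end of the shorter of the two streams
cmd="ffmpeg -i camera.mp4 -i mic.wav -map 0:v:0 -map 1:a:0 -c:v copy -c:a aac -shortest lecture.mp4"

# Print the command rather than run it, as the input files are placeholders:
echo "$cmd"
```

Note this only swaps the audio track; you would still need to sync the two recordings first (a clap at the start of the session makes lining them up much easier).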

Reflecting from an audience perspective, it might be better to create two or three shorter video recordings rather than one big one. It might result in a fresher, better recording than one which tires itself out. Also think about the student: will they want to watch a 60 minute video? You may think you are an amazing and engaging presenter and raconteur, but the reality is that it may be better to do something shorter and to the point. It may also be more accessible for those who have other pressures on their time, or who are unable to find a space to watch a video for a whole hour.

Though you could present all three recorded conversations in one hour, another option would be to spread them out over the week and support them with an asynchronous online discussion chat.

Translating what we do in our physical buildings into an online remote version is relatively simple; however, it may not be effective. Thinking about what you want that learning experience to achieve, and what you want the students to learn, means you can do different things.

The future of learning… ten years later!

FOTE09

On the 2nd October 2009 I was at the ULCC Event, The Future of Technology in Education.

Little did I know the impact that this presentation would have on me, my future career and education in general.

I felt a little intimidated to be invited to talk at the event; we wouldn’t have called it imposter syndrome back then, but I did wonder if I was the right person to talk at such an interesting conference. It certainly had a TED talk feel to it. I must thank Frank Steiner and Tim Bush from ULCC for their support and help, and for inviting me to talk at this and future FOTE events.

2009 was quite a year for me, I had won the ALT Learning Technologist of the Year award that year. It was also the year of “The VLE is Dead” debate at the ALT Conference.

The event took place at the Royal Geographical Society in Kensington, which I remember wasn’t the easiest place to get to via the underground. Knowing London better now, I think I would probably have just walked across Hyde Park from Paddington to get there. From about 2001 I started going to London a lot for work, well a few times a year, which was considerably more than when I was a lecturer in Bristol. I used to arrive at Paddington, take the underground, pop up somewhere, go to a meeting or an event, before popping back down into the underground on my way home. These days I visit London a lot more and have spent a lot more time walking around, so have a much better grasp of the geography of the place. I remember being quite impressed with the venue, and that you could see the nearby Albert Hall.

Albert Hall

I spent a fair bit of time putting my presentation together, in the end it comprised 82 slides… and I only had twenty minutes to deliver my talk. A challenge that took some doing.

My presentation was entitled The future of learning… The aim of my presentation was to discuss how learning would and could change with the affordances of technological change.

So what of my predictions?

Well we know predicting the future is hard and generally most people get it wrong.

You will no doubt not be surprised that I got a lot of things wrong…

One thing I feel I did get right was that mobile was going to be big and important. I said how I felt mobile was the future. The audience did have a range of mobile devices themselves, but most phones were nothing more than phones that could do SMS and play the Snake game. There were a few smartphones out there but, if my experience was anything to go by, they were clunky and difficult to use. We had the iPhone, but it hadn’t yet had the impact it has had since.

We didn’t have the iPad; that would arrive the following year. So it is no surprise that in my talk at FOTE I didn’t mention tablets.

My talk actually started off talking about the past, and how we are still impacted by and embedded in the past, which makes change challenging and difficult.

I then talked about the present and some of the issues and problems that technology was causing in classrooms and lecture theatres. PAT testing was a real concern for many back then; we don’t hear much about it these days in relation to BYOD or learner devices.

One of the challenges I saw back then was how academics and educationalists wanted to categorise learning, so we had e-learning, m-learning, mobile learning, online learning, digital learning, etc….

I said that I thought categorising learning and putting it into different boxes was restricting and that really we should focus on learning and blur the boxes, blur the boundaries.

Boxes

It was fine to talk about the “boxes” at conferences and in papers, but experience has shown that categorising learning into boxes caused confusion for teachers and academics, who focussed on the word before “learning” as a problem to be solved, and then found it challenging.

However, back then I said, and I still stand by this today, that learners and academics need to understand the potential of technology and digital, to better understand the affordances and opportunities it can provide for learning. You don’t need to be able to do the technology, but you do need to know what it can do.

I also brought in scepticism about technological advances, something I would draw upon in future talks and presentations.

Nokia N95

Video (and film) had been used for learning for years, but people were sceptical and convinced that video (i.e. lecture capture) would stop traditional learning activities. However, we know that television didn’t destroy radio, radio didn’t kill newspapers, and books didn’t replace folk stories. When we have a new technological development, there is often some negative impact on existing technologies, but the new technology also brings its own affordances, enabling access that otherwise wouldn’t be possible.

I also talked about the potential of video on mobile devices. Video cameras were getting smaller and cheaper, and the quality was getting better as well. You could buy video cameras which could record HD video, even if it was a challenge to capture and edit it on standard computers of the time. This was before the concept of streaming became mainstream. I showed a Sanyo Xacti camera which was waterproof, and dropped it in a jug of water. These cameras could be used in dirty and dusty environments and then washed under the tap!

James Clay presenting at FOTE09

Mobile phone video has become so much better now. I am still impressed that my iPhone can record 4K video… If only we could get people to record video in landscape!

GPS was usually an option on devices back then; today it is prevalent in the devices we buy. I saw this as an opportunity: the concept of geo-location based learning felt quite magical at the time. Your device knows where you are, so it personalises the learning based on your location. What I missed was how location tracking would become a very big issue for people.

There was a bit of a backlash against e-books back in 2009, as people felt that they weren’t as good as “real” books. For me they weren’t a replacement for books; they enabled different ways of reading. For many people, e-books and e-book readers enabled a new way to access books and content that they otherwise wouldn’t have had access to. I presented on the future of reading at #FOTE10 the following year, and became a bit of an expert on e-books as a result. I presented on e-books at many different events and conferences, as well as writing a chapter in a book, and finally a book, Preparing for Effective Adoption and Use of Ebooks in Education, in 2012.

Today e-books are part and parcel of education, with easier access to books by students from academic libraries. As I predicted, we didn’t see the end of physical books; we still have bookstores and people still buy physical books.

reading a Kindle
Image by Pexels from Pixabay

Back then in 2009 connectivity was either slightly haphazard, or expensive, or both. We had 3G, but it wasn’t widespread, it would be another three years before we saw 4G.

WiFi was there, but it didn’t always work, and network congestion would often cause the WiFi to fail. This happened regularly at events and conferences I attended back then, as delegates killed the WiFi with too many connections.

In the future I felt connectivity wouldn’t just be important, it would be critical for the future of learning.

Today we have really good (and cheap) mobile data; 4G is widely available and 5G is starting to appear. Ubiquitous WiFi is certainly there compared to ten years ago: Eduroam has made it easier for people in education to connect when travelling, and WiFi is easily found in most places. This has allowed users to do so much more when travelling and moving about, or just when drinking coffee. I certainly notice how many people are streaming video, having video chats, and doing so much more, because they have the connection and the bandwidth to do so.

Mobile often means battery power, and access to charging. Everyone remembers how their Nokia phone would last days on a single charge; today, most people seem to complain that their smartphone battery doesn’t last the day. Batteries may not seem to have got better, but they have; it is just that we demand far more power for our complex devices. Streaming video requires much more power than reading an e-mail. One thing that has taken time to filter through is the importance of being able to charge devices. Since 2009 we have seen trains and buses add power sockets, and USB ports for charging as well. Hotels have added similar sockets, and some lecture theatres now have plug sockets too.

In my 2009 presentation I talked about the technological penknife.

Image by Karolina Grabowska from Pixabay

This is one thing I got very wrong: I thought that a device that did everything would do everything badly. A penknife has multiple tools, but most of them aren’t very good at the jobs they are designed to do. People, I thought, would prefer specialist devices for specific activities. Why would you put up with rubbish video from a phone when you could have a decent HD video camera? Why would you use the rubbish microphone on a phone when a specialist recording device would do it so much better? Well, that didn’t happen. In reality devices have become so much better that we don’t need multiple devices. We have the penknife, but it’s a really good penknife, really good at everything.

I then went on to talk about change and the importance of managing change. I talked about how change can be a series of small steps, but warned about missing steps, endless steps and steps that trip you up.

These slides were really where I started to understand strategy, and writing strategies, much more. This certainly helped me in future roles and heavily influenced the design of certain aspects of the Jisc Digital Leaders Programme, in which I was part of the research and development team led by Lawrie Phipps.

I talked about activity: technology should never be about the technology; it needs to be about how it can enhance or improve activities, or where its affordances create new opportunities for different activities. We still have a perception that we shouldn’t talk about technology first, though sometimes I think we should.

Technology allows for flexibility: flexible curriculum, flexible approaches to delivery, flexible learning. I think we have made a little progress here, but so much more is possible these days. The technology enables flexibility, but that doesn’t mean it will just happen; much more needs to be done to make flexibility a reality.

Back then I felt sharing was important, not just sharing content (as in open) but also sharing ideas, concepts and approaches. Not that this didn’t happen, but it was difficult to do. Today it is much easier to share than it was back then, so much so that I think we have forgotten the time when it didn’t happen.

I talked about the importance of working collaboratively. Since the talk online tools have made it so much easier to collaborate. Collaboration across institutions (and countries) is so much easier these days. Tools such as Slack enable groups to talk and work together.

I talked about innovation, celebrating ideas. Innovation doesn’t always mean better, it means different or new. Following on from that I talked about experimentation and encouraging it within our institutions.

If you want innovation, then it needs to be embedded into the strategy, rewarded and not penalised when things go wrong. It needs to be done in collaboration with learners not done to them. I think we are seeing much more innovation and collaboration these days, and the student voice is helping to inform developments and ideas.

I said we needed to re-think assessment, as technology was going to have an impact. I think it has, but not in the way we thought it would. We try to use technology to “fix” assessment today, rather than re-imagine how we assess.

I talked about culture, and how culture can enable change but also frustrate it. Culture is about what and who we are; it’s the sum of the people within an organisation. This was something we covered years later in the Jisc Digital Leaders Programme.

I have written about the importance of culture and strategy in this blog post on writing strategies.

I have always seen technology as a solution to a problem. Technology in itself is not the problem needing to be solved. This was something that I wrote about in 2018.

I finished the presentation by talking about the future, and how the future was about the learner, the student: how they wanted to learn, where they wanted to learn, what they wanted to learn and with whom they wanted to learn. Why did we need to think about the future? Because we needed to think about the learners: then, now and in the future.

So did I predict the future?

No.

It did, though, have a huge impact on my own future, some of which I have outlined above. As a result of this talk I was invited to speak on the future of learning at a range of events and conferences, including a range of mobile learning events. I spoke the following year at FOTE 10 about the future of reading, which resulted in me doing much more in the e-book space.

There is also a video of me (looking much younger) presenting, if you want to watch what happened…

Which video and audio standard should I use?

People like standardisation: it makes life easier, makes it simpler to provide advice and guidance, and removes barriers. The implication is that if everyone used the same devices, the same formats and the same delivery mechanisms, then it would be easier to deliver video and audio to learners, and for those learners to create audio and video themselves.

Audio is a little easier to standardise: most people have heard of mp3 and most devices can play mp3 files. That should make it much easier to roll out. However, mp3 is covered by patents and licensing fees, which means that some devices that can play mp3s cannot record directly to mp3. These devices use other audio codecs that play fine on the device in question, but not necessarily on other devices or through a browser. The solution at Gloucestershire College was to standardise on the Edirol R09H, which records natively to mp3 onto an SD memory card. Yes, it is expensive, but it records straight to mp3 and the quality of the recordings is excellent. The SD card also makes it very easy to transfer recordings to a laptop or computer and then share them on the VLE.

Though the mp3 format is ubiquitous, recording to mp3 is not necessarily the best choice, especially if you are going to do any kind of editing on the recording. If, for example, you are going to use GarageBand to edit the recording and create a podcast, then you do not want to start from an mp3. The reason is that mp3 uses lossy compression: information is thrown away when the file is compressed. If you edit an mp3 and then export to mp3 again, you are compressing an already compressed recording, even more information is lost, and you can end up with a less than satisfactory result. For those who prefer higher quality, the Edirol R09H can also record direct to uncompressed WAV. This is CD quality and uses no lossy compression, so there is no loss of quality when editing the file. The problem with WAV is that the files are large, which makes distribution difficult, so the final edited audio file can then be compressed to mp3 once, at the end.
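A rough back-of-envelope calculation shows why distributing WAV files is a problem. The sketch below assumes CD-quality audio (44.1 kHz, 16-bit, stereo) and a typical 128 kbps mp3; the figures are approximate:

```python
# Estimate file size per minute of audio: uncompressed CD-quality
# WAV versus a typical 128 kbps mp3.

# WAV: 44,100 samples/second x 2 bytes per sample x 2 channels x 60 seconds
wav_bytes_per_minute = 44100 * 2 * 2 * 60

# mp3 at 128 kbps: 128,000 bits per second / 8 bits per byte x 60 seconds
mp3_bytes_per_minute = 128000 // 8 * 60

print(f"WAV: {wav_bytes_per_minute / 1_000_000:.1f} MB per minute")  # ~10.6 MB
print(f"mp3: {mp3_bytes_per_minute / 1_000_000:.1f} MB per minute")  # ~1.0 MB
```

So an hour-long lecture comes in at over 600 MB as WAV but under 60 MB as mp3, which is why that final compression step matters so much for distribution.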

It’s one thing to record audio; delivery is something else. Placing the audio file on the VLE makes it very easy for learners to access and download the recording in their browser. However, for regular recordings it makes more sense to use RSS and podcast the recordings. This allows learners to subscribe to the series through software such as iTunes and then transfer the recordings to a portable device such as an iPhone or iPod. The challenge here is not just technical, but also getting practitioners to recognise the importance of a regular series of recordings or podcasts.
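A podcast feed is really just an RSS 2.0 file with an enclosure element pointing at each mp3, which software such as iTunes can subscribe to. The fragment below is a sketch only; the titles and URLs are made up, not a real college feed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Weekly Lecture Recordings</title>
    <link>http://vle.example.ac.uk/podcasts/</link>
    <description>Audio recordings from the weekly lectures.</description>
    <item>
      <title>Week 1 – Introduction</title>
      <!-- The enclosure is what makes this a podcast: the mp3 URL,
           its size in bytes, and its MIME type. -->
      <enclosure url="http://vle.example.ac.uk/podcasts/week1.mp3"
                 length="9600000" type="audio/mpeg"/>
      <pubDate>Mon, 13 Sep 2010 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Each new recording becomes a new item in the feed, and subscribers receive it automatically.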

If audio is difficult, video is much more challenging. Different cameras and devices record video in different formats, and even when they use the same container format they may use different codecs. Modern operating systems generally have few problems with MP4 video files: both Windows Media Player on Windows 7 and QuickTime X on OS X can play modern MP4 files, as can many mobile devices and smartphones. However, one of the challenges facing FE colleges is that many are still using Windows XP, which doesn’t natively support MP4, and adding that functionality across a network isn’t easy.

One solution is to upload the video to sites such as Vimeo or YouTube and use Flash, which is often the answer on networks where native MP4 playback isn’t possible. The issue with Flash video is how you deliver it to mobile devices, many of which don’t support Flash, and how you deliver video to mobile devices that are not network connected at all. Another issue is privacy: the problem with sites such as YouTube is that they are public, and it may not be possible, or desirable, for the videos to be public.
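One way to serve both camps when hosting video yourself is to publish the MP4 and offer a Flash player as a fallback, so that mobile devices get native playback and Flash-only desktops still see the video. The snippet below sketches that approach; the file names and player file are hypothetical:

```html
<!-- Browsers that understand <video> play the MP4 natively;
     older browsers (e.g. on Windows XP) ignore the video tags
     and render the Flash fallback instead.
     "lecture.mp4" and "player.swf" are illustrative names only. -->
<video controls width="640" height="360">
  <source src="lecture.mp4" type="video/mp4" />
  <object type="application/x-shockwave-flash" data="player.swf"
          width="640" height="360">
    <param name="movie" value="player.swf" />
    <param name="flashvars" value="file=lecture.mp4" />
  </object>
</video>
```

Hosting the file on the VLE rather than YouTube also deals with the privacy issue, at the cost of managing the bandwidth yourself.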

A key technological challenge for institutions is to solve all these problems without the solutions becoming a barrier to learners and practitioners.

4oD CatchUp – iPad App of the Week

This is a regular feature of the blog looking at various Apps available. Some of the apps will be useful for those involved in learning technologies, others will be useful in improving the way in which you work, whilst a few will be just plain fun! Some will be free, others will cost a little and one or two will be what some will think is quite expensive.

This week’s App is 4oD CatchUp.

Watch Channel 4 on your iPad for free with our new 4oD iPad app

– free to download for a limited time only
– the only way to watch 4oD on your iPad
– wide selection of shows from Channel 4, E4 and More4
– huge 30-day catch up window
– free and unlimited access to content

The 4oD iPad app is free for a limited time and is the only way you can watch 4oD on your iPad. It enables you to catch up on a wide selection of programmes recently broadcast on Channel 4, E4 and More4, for up to 30 days after transmission. Once you have downloaded the app, the content is available to watch free of charge with no limits to the amount of content you can play.

Our archive 4oD content (which includes both classic shows and more recent programmes that have dropped out of the 30-day catch up ‘window’) is not currently available on the iPad, but we are looking to bring these programmes to you soon.

This application does not work outside the UK. Please note that US shows are not currently available on this app. The 4oD application is only available using wi-fi, and only supported on iOS 4.

Free (for a limited time)

One of the features I liked about the iPad was the ability to play BBC iPlayer content; however, without Flash support, services such as Channel 4’s 4oD weren’t available to iPad users. The 4oD CatchUp App now allows some 4oD programmes to be watched on the iPad.

Alas this is not the full 4oD service, but then neither does the BBC iPlayer iPad App offer the full iPlayer experience. What the 4oD CatchUp App does allow you to do is to see some of the Channel 4 programmes from the last 30 days from your iPad.

The App only works over wifi, which does restrict using it on the move, but this is the same as the BBC iPlayer App.

For me, the importance of this App is the move by content providers, who previously relied only on Flash for delivering video, to providing that same content on the iPad. I expect to see more content providers move in this direction, which means the lack of Flash on the iPad becomes less and less of an issue for consumers.

4oD CatchUp is a simple App and it works reasonably well, though it did crash on me a few times; I suspect this will be fixed in a future version. Hopefully in the future we will also see more 4oD content.

Update: This app is now called All 4.