Tag Archives: voice assistant

Record temperatures – Weeknote #177 – 22nd July 2022

This week saw record temperatures as a red warning heatwave hit the UK. I spent the week working from home, as trains were cancelled or delayed and there were problems on the roads.

I wrote a blog post on how I can teach anywhere.

I used to say things like “I can teach anywhere”. What I meant by this wasn’t that the environment or space I was using didn’t matter, but that I could overcome the disadvantages of the different spaces I had to play with and still deliver an effective session.

So though I might be able to teach anywhere, the reality is that all those challenges and issues I face in an inappropriate space may well result in poor quality learning, despite the quality of my teaching.

Big news this week was that the QAA was to step away from its designated role in England. Over on Wonkhe, David Kernohan tries to make sense of it all.

The Quality Assurance Agency (QAA) will no longer consent to be the Designated Quality Body (DQB) in England, as of the end of the current year in office (March 2023). The reasoning is straightforward – the work that QAA does in England, on behalf of the OfS, is no longer compliant with recognised quality standards – namely the European Standards and Guidelines (ESG) as monitored by the European Quality Assurance Register for Higher Education (EQAR). For this reason, the QAA registration with EQAR was recently suspended – a decision that highlights international concerns about procedures in England but has an impact in the many other nations (including Scotland and Wales) where QAA needs that EQAR registration in order to fulfil a statutory quality assurance role.

Once again we are seeing further divergence across the UK in higher education.

Alexa
Image by finnhart from Pixabay

I revisited and revised a blog post on voice assistants I had written back in 2018.

Hey Siri, what’s my day like today? Alexa, when’s my next lesson? Okay Google, where are my library books?

Voice assistants have become widespread and are proving useful for a range of tasks. The cost has fallen over the years and the services have expanded.

The use of voice assistants and smart hubs has certainly continued, and they have become embedded into many digital ecosystems. Their use in education though is still limited and I will be looking at that in a later blog post.

Attended a session on impact this week, which was interesting, but not necessarily that useful. How do you evidence the impact of what you do? I wonder, for example, if the 1,828 blog posts published on this blog have had any impact on the way in which people work, support others or plan their work. One of the most popular posts on the blog, though written in 2011, is still regularly viewed: 100 ways to use a VLE – #89 Embedding a Comic Strip, one of a series of blog posts on improving or enhancing the use of the VLE.

One use of graphics that can enhance the look of a VLE course, or act as a mechanism to engage learners, is to embed a comic strip into the VLE course.

What has been the impact of this? Has it changed practice? Has it improved the student experience? Has it improved student outcomes? How would I know?

I don’t think I can evidence the impact of this. For other work I have done I can sometimes see the evidence, however I don’t know if there has been actual impact.

I quite liked these tweets from August 2021, from people who had attended the digital leadership consultancy I delivered for Leeds.

As part of the programme I had delivered a session on e-mail, which incorporated much of what is in this blog post on Inbox Zero and this follow-up post. It is always nice to see the impact that your training has had on the way that people work: they didn’t just attend and engage with the training, but are now acting on what they saw and learnt.

However, what I don’t know is: has the change had a positive impact? And what was that impact?

I spent some of the week reviewing our new guide to the Intelligent Campus, and the revamped guide to the Intelligent Library. The library guide was never published but has been updated for 2022. I also reviewed our updated use cases, as well as drafting plans for some additional use cases. I am aiming for publication of these in the autumn.

letters
Image by Gerhard G. from Pixabay

If you are going on leave over the summer, you may want to look at this blog post on managing your summer e-mail.

My top tweet this week was this one.

Hey Siri, what’s my day like today? Alexa, when’s my next lesson? Okay Google, where are my library books?

Microphone
Image by rafabendo from Pixabay

Voice assistants have become widespread and are proving useful for a range of tasks. The cost has fallen over the years and the services have expanded.

Google report that 27% of the global online population is using voice search on mobile.

Alexa was announced by Amazon in November 2014 alongside the Echo devices, which act as connected hubs for voice-controlled devices, complete with speakers and, in some cases, small screens. Amazon continues to innovate and develop their Alexa devices, including car adapters and headphones.

Alexa
Image by finnhart from Pixabay

Cortana from Microsoft was demonstrated in April 2014 and released as part of Windows 10 in 2015. In March 2021, Microsoft shut down the Cortana apps entirely for iOS and Android and removed them from the corresponding app stores. Microsoft has also reduced the emphasis on Cortana in Windows 11: Cortana is not used during the new device setup process and isn’t pinned to the taskbar by default.

Bixby from Samsung was announced in March 2017. Unlike other voice assistants, Samsung is building Bixby into a range of consumer goods, such as refrigerators and TVs, which they manufacture.

Google have Google Nest, originally released as Google Home, which was announced in May 2016 and released in the UK the following year. In May 2019, Google rebranded Google Home devices under the Google Nest banner and unveiled the Nest Hub Max, a larger smart display.

Google Home
Image by antonbe from Pixabay

Google Nest speakers enable users to speak voice commands to interact with services through Google’s intelligent personal assistant called Google Assistant.

And of course there is Siri from Apple. Siri was originally released as a stand-alone application for the iOS operating system in February 2010, but after being bought by Apple it was released as part of the operating system in October 2011. It wasn’t until February 2018 that Apple released their own connected speaker hub, the HomePod, which was replaced by the HomePod Mini in November 2020.

Many of these voice assistants started their journey on mobile devices, but over the last few years we have seen connected voice controlled hubs appearing on the market.

An online poll in May 2017 found the most widely used in the US were Apple’s Siri (34%), Google Assistant (19%), Amazon Alexa (6%), and Microsoft Cortana (4%).

Though we might want to see how we can embed these into the classroom or education, they are not aimed at this market; they are consumer devices aimed at individuals. Our students are certainly the type of consumers who may purchase these devices, and they will want to be able to connect them to the university or college services they use.

group
Photo by Annie Spratt on Unsplash

All the voice assistants require some kind of link to information and, in some cases, data.

If I ask Alexa to play a particular song, she delves not just into my personal music collection on the Amazon Music app but also what is available through my Prime subscription. If the song isn’t available I could either subscribe to the Amazon Music streaming service, or purchase the song. The Alexa ecosystem is built around my Amazon account and the services available to me as a Prime subscriber.

With Google Nest I have connected my free Spotify account to it. This is one of the key features of these devices: you can connect services you already subscribe to, so you can control them via voice. Of course the reason I have a free Spotify account is that Google Nest would much prefer I was connected to Google Music, and it certainly won’t let me connect to either my home iTunes library (where virtually all my music is) or to Amazon Music. So when I ask Google Nest to play a particular music track, she gets annoyed and says that she can’t, as that is only available on Spotify Premium.

This is one of the challenges of these devices: they are quite reliant on subscriptions to other services. Apple’s HomePod only really works if you have an Apple Music subscription.

When it comes to connecting services to voice assistants there are two key challenges: can you get the right data out to the right people, and can you do this for the range of voice assistants available, especially when you remember that there is no de facto standard for voice assistants?

It would be useful to know and understand what sorts of questions would be asked of these assistants. There are the known ones, such as: where is my next lesson? What books would be useful for this topic? When is my tutor free for a quick chat about my assignment? Do I need to come into college today? Even simple questions could result in a complicated route through multiple online systems.

Imagine asking the question: where and when is my next lecture, what resources are available, and are there any relevant books in the library on this subject? The module design or course information system (or, more likely, a dumb document) would have the information on what comes next. Timetabling systems would be able to tell the learner where and when the lesson was, with the extra layer of last-minute changes because of staff sickness, or building work resulting in a room change. The resources may be on the VLE or another platform, while additional resources could be on the library systems. How would the voice assistant know what to do with all this information? Could it push the links to a mobile device? Add in a social platform, say a closed Facebook group, or a collaborative tool such as Slack, and you start to see how a simple question about what I am doing next and where it is becomes rather complicated.
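
To make that concrete, below is a minimal sketch of the kind of back-end handler a voice assistant skill might call to answer “where and when is my next lecture, and what resources go with it?”. Every URL, field name and credential here is a hypothetical placeholder rather than a real institutional or assistant API; it simply illustrates how many systems one simple question has to touch.

```python
# Hypothetical sketch only: the endpoints, field names and token below are
# invented placeholders, not real timetabling, VLE or library APIs.

from dataclasses import dataclass
from datetime import datetime

import requests

API_TOKEN = "replace-with-a-real-token"  # assumed credential for the campus APIs
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}


@dataclass
class NextLecture:
    module: str
    room: str
    starts_at: datetime
    vle_resources: list[str]
    library_books: list[str]


def fetch_next_lecture(student_id: str) -> NextLecture:
    """Aggregate timetable, VLE and library data to answer one spoken question."""
    # 1. Timetabling system: which space, and when (including last-minute room changes).
    timetable = requests.get(
        f"https://timetable.example.ac.uk/api/students/{student_id}/next",
        headers=HEADERS, timeout=5,
    ).json()

    # 2. VLE: resources attached to the module for that lecture.
    vle = requests.get(
        f"https://vle.example.ac.uk/api/modules/{timetable['module_code']}/resources",
        headers=HEADERS, timeout=5,
    ).json()

    # 3. Library system: relevant books on the module's reading list.
    library = requests.get(
        f"https://library.example.ac.uk/api/reading-lists/{timetable['module_code']}",
        headers=HEADERS, timeout=5,
    ).json()

    return NextLecture(
        module=timetable["module_title"],
        room=timetable["room"],
        starts_at=datetime.fromisoformat(timetable["start_time"]),
        vle_resources=[r["url"] for r in vle["resources"]],
        library_books=[b["title"] for b in library["items"]],
    )


def build_spoken_answer(lecture: NextLecture) -> str:
    """Turn the aggregated record into something the assistant could say aloud."""
    when = lecture.starts_at.strftime("%H:%M")
    return (
        f"Your next lecture is {lecture.module} in {lecture.room} at {when}. "
        f"I have sent {len(lecture.vle_resources)} resources and "
        f"{len(lecture.library_books)} suggested library books to your phone."
    )
```

Even this simplified version assumes three clean, authenticated APIs that agree on a module code; in practice the “dumb document” problem and last-minute changes make the plumbing far messier.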

student
Image by Karolina Grabowska from Pixabay

There is though something to be said for ensuring services work with voice assistants, as the same data and information could also be used with chatbot interfaces (i.e. textual assistants) and with campus-bound services such as kiosks or web portals. Get the data right and then it’s simply a matter of ensuring the interface, whether voice, text or screen, is working. Learning analytics services rely on a hub where academic and engagement data is collected, stored and processed. Could we use a similar data structure to build the back-end system for chatbots, kiosks and voice assistants?
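
One rough way to picture that shared back end is a single campus data hub that the voice, chatbot and kiosk front ends all query, so the data is normalised once and only the presentation differs. The sketch below is purely illustrative; all the class and method names, and the placeholder event, are invented.

```python
# Illustrative sketch of a shared campus data hub serving several interfaces.
# All names are invented; a real hub would query timetabling, VLE and library
# systems of record and cache the normalised results.

from dataclasses import dataclass


@dataclass
class Event:
    title: str
    location: str
    time: str


class CampusDataHub:
    """Single place where campus data is collected, stored and normalised."""

    def next_event(self, student_id: str) -> Event:
        # Placeholder data standing in for a real query to the systems of record.
        return Event(title="Introduction to Databases", location="Room 2.14", time="11:00")


class VoiceInterface:
    """Voice assistant front end: conversational phrasing."""

    def __init__(self, hub: CampusDataHub) -> None:
        self.hub = hub

    def respond(self, student_id: str) -> str:
        e = self.hub.next_event(student_id)
        return f"Your next session is {e.title} at {e.time} in {e.location}."


class ChatbotInterface:
    """Text chatbot or kiosk front end: terse on-screen phrasing."""

    def __init__(self, hub: CampusDataHub) -> None:
        self.hub = hub

    def respond(self, student_id: str) -> str:
        e = self.hub.next_event(student_id)
        return f"Next: {e.title} | {e.time} | {e.location}"


if __name__ == "__main__":
    hub = CampusDataHub()
    print(VoiceInterface(hub).respond("s1234567"))    # spoken answer
    print(ChatbotInterface(hub).respond("s1234567"))  # on-screen answer
```

The point is the same one learning analytics makes: invest in getting the hub and its data right, and the voice, text and screen interfaces become relatively thin layers on top.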

Could we Siri? Could we?

This is an updated version of a blog post I wrote in August 2018 on the Jisc Intelligent Campus project blog.

All together now – Weeknote #32 – 11th October 2019

Birmingham

A busy week, with travels to Bristol, Reading and Birmingham.

Monday I was in Bristol for a meeting with the Office for Students, one of the funders of Jisc. Following that I was back in the main office for further meetings.

There was an interesting long read on the Guardian website.

‘The way universities are run is making us ill’: inside the student mental health crisis

A surge in anxiety and stress is sweeping UK campuses. What is troubling students, and is it the universities’ job to fix it?

We know that there is a student mental health crisis, though the reasons for this aren’t necessarily clear, and we know there has been an increase in the demand for mental health services at universities. The article notes that there has been research into the causes, but lays the blame for the crisis on the way in which universities are managed and run, leaving students not in control of what they do and saddled with debt.

Image by Karolina Grabowska from Pixabay

Continue reading All together now – Weeknote #32 – 11th October 2019

Hey Siri, are you real?

Following on from my recent blog post about installing voice assistants on campus, I read an article, Giving human touch to Alexa or Siri can backfire, on how trying to make voice assistants appear to be human, or have human touches, may not give you the results you were looking for.

A team has found that giving a human touch to chatbots like Apple Siri or Amazon Alexa may actually disappoint users.

Just giving a chatbot human name or adding human-like features to its avatar might not be enough to win over a user if the device fails to maintain a conversational back-and-forth with that person…

This reminded me of a conversation I had at an Intelligent Campus workshop, where we discussed how trying to make chatbots appear human was probably not a good idea, and that maybe institutions should intentionally make their chatbot non-human.

There are potential challenges as Microsoft found out with their paper clip assistant, but was that because it was a paper clip or because it was annoying?

Clippy

In many ways Clippy was the ancestor of Siri, Cortana and other modern day assistants.

A non-human chatbot could also avoid some of the gender issues that occur when deciding if your chatbot is female or male.

This Guardian article from March discusses this contentious issue of gender for voice assistants.

Providing assistance has long been considered a woman’s role, whether virtual or physical, fictional or real. The robots that men voice, meanwhile, tend to be in positions of power – often dangerously so. Think Hal 9000, or the Terminator: when a robot needs to be scary, it sounds like a man.

Patriarchy tells us that women serve, while men order, and technology firms seem content to play into stereotypes, rather than risk the potentially jarring results of challenging them.

The article talks about EqualAI:

EqualAI, an initiative dedicated to correcting gender bias in AI, has backed the creation of Q, what it says is the first genderless voice.

So if you do have a non-human chatbot and you want to extend it to be a voice assistant, at least soon you will be able to have a genderless voice behind it.

So what (rather than who) should be your chatbot? Well it could be an anthropomorphic animal or maybe something else that is special to your university or college.

So what would your chatbot be?

Student Experience 2030 – Weeknote #09 – 3rd May 2019

Telephone boxes in London

The week started off in London with a day looking at and thinking about next generation learning environments. Before I got there, as I sat on the train, I thought and reflected about what we even mean when we say next generation learning environment. Are there generational changes in learning environments, as in big changes from one generation to the next? Or do they merely evolve gradually over time? Could we enable these big shifts? Do we even want big shifts?

In London we discussed a range of challenges and issues in relation to next generation learning environments. What is the future of education? Will teaching be transformed? How can we create personalised adaptive learning? How do we re-imagine assessment? How do you enable a merged digital and physical learning environment? What are the foundations that need to be put into place before you can start building the infrastructure, the design and the staff development required to meet those future challenges?

In order to understand what needs to happen, we framed some questions independently, to see what commonality there was and what differences there were. This is a useful exercise when deciding which questions need to be answered.

Some of my questions included:

  • What does adaptive learning look like from the view of students and staff?
  • Is personalised learning possible? Is it desirable?
  • How do you enable a merged digital and physical learning environment?
  • What are the differences between the student experience of 2020 and that of 2030?

coffee

Wednesday morning, after some coffee, I was in the office and had an initial discussion about possible themes for Digifest 2020. Though this event won’t be happening until March 2020, like most big events the planning starts almost as soon as the last one finishes (if not just a bit before).

The afternoon saw me off to the University of Bristol for a meeting about the One City Bristol project, elements of which are very much in the realm of the smart city. In my previous role I did some initial research into the various smart city initiatives across the UK, and we published a smart city use case on the Intelligent Campus blog.

In January 2019 Bristol published its first ever One City Plan.

The interdependent challenges of growing an inclusive, sustainable city that both breaks down our social fractures and inequalities and reaches carbon neutrality sit at the heart of the future we must deliver. They are stitched throughout the plan.

In the plan there are six themes, one of which is connectivity.

The lifeblood of Bristol is connectivity. Our connectivity is considered the template for contemporary city living. Whether our people connect in person or in virtual spaces, whether they connect in their physical communities or their global communities, our city infrastructure helps bring them together. Bristol connectivity means multimodal connectivity – we designed our infrastructure around the human condition. Anchored yet free, our people are able to draw on the experience of others in their communities and peer groups, and live independently and spontaneously.

Connectivity is synonymous with productivity and Bristol is the regional epicentre of productivity. The South West Economic Region grew on the back of investment in transport and digital connectivity.

The Bristol-Cardiff high speed, high frequency rail link benefits both cities equally – time and travel no longer impinge productivity as they once did. Talent, ideas, energy and enthusiasm flow between the cities and across the region. High-speed rail links connect Bristol with other cities and when the mass transit system was completed in the 2030s, connections between Bristol, Bath, Bristol airport and North Fringe and East Fringe were complete. Our traffic management has cut congestion times and many of our deliveries are made by driverless freight vehicles.

Throughout the 2020s ultrafast broadband was rolled out without exception to social housing, businesses, in public spaces and through city Wi-Fi services. Tactile and immersive virtual and augmented realities reduce the need to travel and are commonplace at work and at home. They also bring together like-minded communities for shared social activities and entertainment.

Our city has managed bus lanes, cycle lanes, congestion controls and programmes to educate school children about safe travel. More than half the city cycles and active travel is the preferred mode of transport for many commuters. Domestic deliveries often arrive by drone. Nobody has been killed or seriously injured as a result of an avoidable road traffic accident in Bristol for years.

We strategically removed the obstacles and barriers to people connecting. The city moves on renewable energy, our people are free to create their own pathways, connected in person or virtually. Our lifeblood flows locally, regionally and globally.

This certainly is an aspiration that hopefully will come to fruition.

Following a request, based on my experience of working on the Jisc Digital Apprenticeships project, Thursday saw me working on some desk research on the current provision of Digital & Technology Solutions Degree Apprenticeships across the UK.

One thing that was apparent was how “popular” this degree apprenticeship is.

Chartered manager and digital and technology solutions are the two most implemented standards across each English region, with at least 43 and 33 institutions, respectively, providing them.

 Source

You can find more information about this specific degree apprenticeship on the government’s apprenticeships website.

One of the key requirements of my role is engaging with the Office for Students and the funding they provide Jisc to support higher education. As a result I attend and participate in various meetings that enable us to demonstrate value for money to the OfS, as well as how Jisc is supporting their strategic aims.

I spent some time this week reviewing the Office for Students Strategy 2018 to 2021 and their business plan for 2019-20 in preparation for a meeting on Friday morning.

One thing that I noticed was the target to Launch and oversee a ‘what works’ centre, Transforming Access and Student Outcomes in Higher Education.

The Centre for Transforming Access and Student Outcomes in Higher Education (TASO) will use evidence and evaluation to understand and show how higher education contributes to social justice and mobility. TASO will exist as an independent hub for higher education professionals to access leading research, toolkits, evaluation techniques and more, to help widen participation and improve equality across the student lifecycle.

It made me think about how this could be done, how it will probably be done and what the actual impact will be.

I tweeted out about the Jisc Futures R&D quarterly learnings webinar for summer 2019.

This is the third in a series of update webinars for Jisc members to discuss the progress of our R&D work and share what we are learning during our projects.

 Each webinar will consist of two presentations on recent lessons learned, followed by an open Q&A session which will offer an opportunity to question or discuss any of our R&D work.

    • Launch of the step up programme – making it easier and less risky to work with edtech start-ups
    • Building digital capability service demo including new features for library and information professionals
    • Q&A on all of our R&D projects

Details of the event and registration are here.

We had a debrief about the Agile Implementation Workshop I helped run last week. One outcome from this workshop was to run a knowledge call or workshop on using Jira for projects and business processes.

I spent part of Friday clearing the inbox, reviewing my scrum boards and planning work for next week.

Following my post about Alexa last week, I read this blog post, Maybe Universities Shouldn’t Be Putting Amazon Echos in Student Dorms, from Inside Higher Ed, which explores the problems and concerns some have about “listening” devices in students’ rooms.

Amazon Echo
Photo by Jan Antonin Kolar on Unsplash

In a tragic twist to the end of the week, someone at Castlepark office broke my Cambridge University Press Academic mug that I got at the UKSG 2011 conference in Harrogate ten years ago.

broken mug

I’ve had that mug ten years, through four jobs, three employers, three cities and numerous mugs of coffee. I liked that mug, but it holds no sentimental value.

So who broke it and why?

My top tweet this week was this one.

Student Journey – Weeknote #08 – 26th April 2019

Mews in London

A shorter week this week, due to the Easter holiday weekend, so the week started on a Tuesday.

I spent some time reflecting and reviewing some initial discovery work that Lawrie was doing on the emergence of new communication tools and platforms. He is looking at the impact on teaching practices and the student experience, as well as what Jisc could do in this space.

I attended an update on the project that is looking at how Jisc can influence the influencers.

Though I have left the Intelligent Campus project I still have some interest in that space. One aspect is voice assistants, tools such as Siri, Google Now, Cortana and Alexa. I noticed some recent news articles in this space and wrote a blog post about one of the articles.

Back in January I presented at the Data Matters conference about the Intelligent Campus. I found out on Tuesday that I had been allocated an action: James Clay (Head of higher education and student experience) should work with the relevant members of the M5 group to prepare a proposal for the next Data Matters Conference. So I spent some time reviewing what this entails. The next Data Matters will take place in January 2020.

Wednesday saw me attending a debrief on, and reviewing, the workshop I led last week looking at Jisc’s work in the Education 4.0 space and what others are doing in this area. We reviewed what worked well and what we would improve. We aim to run further workshops in a similar vein.

On Thursday I was in London for an Agile Implementation Workshop I am helping run.

On the train to London I skimmed Educause’s Horizon Report; the end result was I realised how much I needed to read it in more detail. So when I got to London I printed it out, so I could annotate it.

At the workshop, I talked about reporting and also did an introductory demo of JIRA. Sometimes the value of a tool such as JIRA is not the value it adds to the individual using the tool, but the combined and added value you get when everyone in a team uses that tool. Reporting is something else that often is seen as a process between two people, but aggregated reports are valuable to a range of stakeholders in an organisation. It was a great workshop and it was nice to work with a wide range of people from across Jisc.

Agile Workshop

Friday I spent time discussing and reading about the “student journey”, the “student experience” and the “student lifecycle”. These are all terms used by different people and organisations and mean different things to different groups. I do wonder if they are similar or different things.

Lancaster use the term student journey and have mapped it out in a diagram.

Lancaster Student Journey

This map outlines the student journey from deciding to attend Lancaster University right through to graduation.

Manchester in their Student Lifecycle Project have used a map to describe what they also call the student journey.

UoM Student Journey

Westminster have described in text their vision of the student journey.

So is the journey where the student is going to go, and the student experience is what happens when they get there?

So what of the student digital experience?

At Jisc we are developing a world-leading and holistic understanding of the student digital experience. What is the role of digital in students’ journeys into, through and out of study and into employment, as well as in their interaction with a range of systems throughout the day? This understanding should include student wellbeing issues and the experiences of learners across different backgrounds, modes and levels of study, subjects, types of learning provider, locations, family and work commitments, and disabilities.

What is clear is the need for a shared understanding across the organisation of the student digital experience.

Finally the EdTech Podcast episode #146 was released and I am on that podcast.

I’ve not listened to it yet, mainly as I don’t really like listening to myself, so I’ll leave that choice up to you. However, it has reminded me that I may want to record some new episodes of my own podcast.

My top tweet this week was this one.

Alexa, what’s on my timetable today?

Amazon Echo
Photo by Jan Antonin Kolar on Unsplash

Though I have left the Intelligent Campus project I still have some interest in that space. One aspect is voice assistants, tools such as Siri, Google Now, Cortana and Alexa. I noticed some recent news articles in this space, such as this one: Amazon Echos to be installed in dorms.

The program will oversee the installation of third-generation Echo Dot units in all of the suites, classrooms and study rooms.

There have been many exploratory programs in this space, and it reminds me of the early days of the iPad.

There are challenges connecting devices to university networks, mainly because consumer devices aren’t always able to connect to a WPA2 Enterprise wireless network (such as eduroam), but as with the early days of any device, this functionality is something that is on the roadmap and can often be found as beta software.

Alexa for Business now allows organizations to connect select Echo devices managed by Alexa for Business to their corporate WPA2 Enterprise Wi-Fi network.

What we don’t know is, are these devices a fad, or are voice assistants here to stay?