A student voice in the DEO: What have the Student Digital Champions found so far?

Since the 2019 Digital Experience Insight Survey, which revealed so much about students’ experiences of the digital learning environment at Bristol even before the pandemic, the DEO have been keen to channel student voices straight into our work. With 2020 turning out the way that it did, it was even more crucial to make that a reality, and so we worked with Bristol SU to recruit 12 Student Digital Champions (SDCs) from across all faculties in the University. They don’t have a lot of time each week with us, but they’ve definitely been making the most of that time so far!  

You can get to know them a little better by viewing this introduction video, also found on the DEO Student Digital Champion project page. 

What have they been up to? 

Since joining the DEO team, the SDCs have been actively getting out into their faculties, going to course rep meetings and faculty meetings, and talking to staff Digital Champions and other key staff. They’re reporting that even just being in meetings with their ‘digital champion’ hat on has been sparking interesting conversations with course reps and students about the student experience of digital learning in 2020 so far. 

They’ve already worked actively with us on two new DEO guides, prompted by student feedback in the Pulse surveys: the guides on Interactivity in large sessions and Breakout rooms. They’ve also worked to co-create and give feedback on the Assessment Checklist and Troubleshooting guide, and other areas of the new Digitally Ready online space on assessment, which launched on 5th January to support students during this assessment period.  

What have they found?

The remit of the SDCs is to look for patterns emerging in the student experience across faculties and schools, and work together on the key themes of student engagement in learning and community building. They’ve been tasked with getting students to talk about solutions to their problems too: we want to hear ideas for what could be done differently, or what is working really well and how that could be expanded.

So far they’ve noticed…

Some of the common themes which seem to be emerging across the student experience include:

The cohort conundrum  

Students are feeling disconnected, lacking a sense of belonging and of a shared experience. Many report that this is partly due to other students not being active and engaged in online sessions, particularly not turning on their videos. On the other hand, students also said they feel anxious themselves about being in online sessions, particularly breakout sessions, and about turning on their own mics and video. In the Engineering faculty, students actually felt there was an increase in engagement between students when using the general discussion forum to ask questions. Students seem to be asking more questions and sharing information with each other.  

‘I don’t wanna be just a guy on the screen. I want us to be more like a cohort.’  [Year 1 Student, Centre for Innovation] 

Clarity and simplicity make good online course spaces 

Echoing student feedback in previous years, students are now more than ever keen on things being concise, clear, and easy to navigate. Videos around 20-30 minutes seem to be the maximum that students feel they can engage with, with most preferring 10-15 minutes. Our SDCs are also reporting that a messy Blackboard course space can be pretty discouraging, especially to first year students! 

Group work online is brilliant/impossible (delete as appropriate)  

We’re hearing loud and clear that the tools of online learning – shared documents, MS Teams, BB Collaborate and BB journals – are potentially great for making group work easier to manage and coordinate. Students are getting to grips with what these systems can offer, and love the flexibility (when the technology allows – internet connection problems are frequently mentioned too!). But they would like more guidance on these tools and how to use them effectively. At the same time, the lack of group identity, and the fact that they may not have actually met their peers in person, is making things difficult.  

‘Only a few people are turning up. How can I trust someone to do their work when we’ve never met?’ [Year 1 Student, Arts] 

And they’ve suggested…

There are already several projects in the pipeline, ideas for what might be possible, and pilots in progress. A snapshot of these include: 

A Breakout Room toolkit – A toolkit for staff, made by students, on how to plan and deliver the best breakout room experience. This is broken down by year, recognising that first years have different needs and situations than returning students. It includes ideas for group sizes and permanence (3-5 week rotations for groups seem popular), and establishing group identity, as well as how to encourage students to actively participate. More on this soon… 

‘Online mingle’ pilot – In partnership with the Centre for Innovation, creating a template for how to run ‘speed dating’ style welcome sessions for students, where they can get to know each other and practise speaking online in a safe and fun environment.  

Motivation Panels – Here, more experienced students support first years involved in team/group work, helping them get a sense of what their degree is about and feel motivated by the subject. Led by course reps and students, this is a way to feel part of something bigger than your own unit or programme.  

Shared spaces – using tools like MS Teams to explore ways for students to meet regularly and informally. This could include news and inspiration, notices of events, a ‘Help me out’ forum, and introductions to different people within their programme or school. 

Groupwork toolkits – Resources to help students choose the best tools for group work, and how to use them, as well as how to maximise group work as a way to meet people and gain the sense of social interaction that is often missing online.  

School assemblies – Regular school-wide live sessions, to give a sense of belonging and motivation across a school, rather than just within a unit or programme. These are already being run in the School of Psychological Science, and the SDCs are working to find out what it is about them that is so engaging, and how that might be replicated across the university.  


 

Exploring Curriculum Design Approaches

Recently at the University of Bristol, we’ve all been thinking a lot about learning design, developing curricula and ways of assessing. BILT’s focus on TESTA for transforming assessment is one way you can see this in action. In higher education, learning design can quickly get complicated – for example when redesigning a whole programme – and increasingly involves new and exciting elements, such as online or blended aspects, new assessment methods or innovative pedagogies. A clear method of working when approaching curriculum, programme or learning design can speed up the process and make it much more enjoyable for everyone involved. Helpfully, there are several working methods based on storyboarding which provide a way to navigate this process, and which focus on a team approach to designing learning.

The Digital Education Office have mainly used an approach based on UCL’s ABC: you can read more about our use of this method in a series of blog posts by Suzi Wells and me on a previous ABC conference held at UCL. 

Such curriculum design approaches all facilitate discussion and evaluation of current and future learning designs by bringing together relevant stakeholders, learning design specialists and support staff. In the Sway presentation embedded here, we’ll have a quick look at a few, in order to get a taste of what these approaches involve, and how they’ve been used by others. Follow this link to open the Sway in a new tab or window.

Thoughts from a recent GW4 meeting at University of Bath

 

On Friday 23rd March, Mike, Naomi, Robyn, Han and I headed over to Bath for the latest GW4 meeting of minds. As decided in the previous meeting, the main topics for discussion were e-assessment and portfolios, but we also discussed MOOC development and learning analytics. Unfortunately, no one from Exeter could make it up this time, so it was us from Bristol, along with colleagues from Bath and Cardiff. As before, we used Padlets to pool ideas and discussion points as we talked in smaller groups.

Portfolios 

Portfolios seem to be a common focus (dare I even say, headache). Bath and Cardiff have been using Mahara, and have been trying to overcome some of its limitations in-house. There was a strong feeling that none of us have found a portfolio which delivers what we need, and that if we ganged up on the providers they might be able to find a solution. The next step is to try to define what it is we do need from a portfolio, which tools we use (or have already investigated), and what we can do to find a common solution. Some immediate themes were e-portfolios as assessment tools (and how they integrate with current systems), GDPR implications, students being able to share parts of portfolios externally and internally, and how long students can have access to their portfolio.

MOOCs

As something we all have experience of, to a greater or lesser degree, there was inevitably quite a bit of discussion around MOOCs. We talked about the processes we follow to develop MOOCs, and the different support we provide to academics. For example, Gavin from Bath showed us how he uses Camtasia to produce videos in-house; in fact, he was able to knock up an example of such a video in 20 minutes during the session, with mini interviews and shots from the day. We also discussed the data we get from FutureLearn, and how we all find it difficult to do anything with that data. With so much information, and not much time, it tends to become something we’d all like to do more with but never quite find the time for.

The discussion also returned to an idea we’ve been kicking around GW4 for a while: that of a collaborative MOOC. We discussed the idea of perhaps making courses for pre-entry undergraduates, or students embarking on PhDs, or perhaps staff development and CPD courses for new academics (Cardiff are already building a bank of these in FutureLearn). The idea of creating small modular courses or MOOCs, where each of us could provide a section based on our own expertise and interests, was also popular… let’s see how this develops!

E-assessment

Tools and systems around e-assessment were also a common theme. As well as thinking about Blackboard assignments and the use of Turnitin and QMP, there was also talk about peer assessment tools and practices, and about adopting a BYOD approach. It seemed that our experiences of e-assessment were all very mixed, with huge disparity in adoption and approach within our institutions. We’re all working on e-assessment, it seems: for example, our EMA project is quite similar to Bath’s. However, other trials are also going ahead, such as Cardiff’s trial of ‘Inspera‘. I think we’re all keen to see what their experiences of that project are, as the Scandinavian approach to e-exams has often been heralded as the future!

What next?

For the future, we discussed more of a ‘show and tell’ approach, where we could get a closer look at some of the things we’re up to. There was also talk of upping our use of communication channels between in-person meetings, particularly using the Yammer group more frequently, and perhaps having smaller virtual meetings on specific topics.

It wasn’t decided who would host the next session, particularly as Exeter weren’t represented, although we did tentatively offer to host here at Bristol. But, seeing as Bath really did set the bar high for lunch expectations – with special mention to the excellent pies and homemade cake – if we do host I think we’d better start planning the food already…!

 

 

Reflections on the ABC mini-conference from Suzanne

Heading to London for the ABC mini event on Friday 9 February at UCL, I was a tiny bit apprehensive. This curriculum development tool was something I had used, in various forms, but without ever actually seeing how it should be ‘properly’ done, or receiving any training from Clive and Natasha, who came up with it. What I soon found was that our renegade use of the tool wasn’t in fact that renegade.

The morning session, where I got to actually try to develop a course using the tool, was pretty reassuring. It turns out I had actually been running the sessions ‘properly’ after all, which I would say is testament to how straightforward and logical the tools are to use.

After being on the other side of the table during a session, I learned how enjoyable it is to make such visible progress in such a short time. I also realised how much you have to remember if you end up talking through a whole sequence of learning without noting down the detail (i.e. before you ‘flip the cards’). By the time we came to adding detail, we all had to try to remember what we’d had in mind. This is definitely something I’ll bear in mind the next time I run a session.

 

 

As well as the hands-on session, hearing about what others have been using the method for, and what they had learned from it, was inspiring. The main things that stuck in my mind were:

  • How useful the method is as a review tool (I had previously only used it to design new courses). It helps people visualise and recognise all the great things they already do, before thinking about how they might want to develop their course for the future. The act of discussing it with others surfaces long-held beliefs and assumptions which might no longer apply. When redesigning a course, unit or programme, I can see how helpful this might be.
  • How effective this tool is at a programme level. The evaluation of individual courses or units seems to take on a new dimension when done in a room where all the units and courses in the programme are being evaluated at the same time. Without asking people to do this explicitly, connections between units can be spotted and developed, duplication can be discussed, and people involved across the whole programme can start to get a real sense of what the students’ experience of the whole programme actually is. A ‘ground-up’ programme development seems to happen, which is more holistic and sustainable than a ‘top-down’ directive.

For our purposes, this certainly seems like a useful tool for two big projects that the University of Bristol is tackling: programme level assessment, and embedding the Bristol Futures themes into the core curriculum. Being able to quickly map where things already happen, and then talk about it in an open and positive environment, could be a really engaging way to get these conversations started. Let’s see where learning our ABCs can get us…