Thoughts from a recent GW4 meeting at University of Bath


On Friday 23rd March, Mike, Naomi, Robyn, Han and I headed over to Bath for the latest GW4 meeting of minds. As decided at the previous meeting, the main topics for discussion were e-assessment and portfolios, but we also covered MOOC development and learning analytics. Unfortunately, no one from Exeter could make it up this time, so it was us from Bristol, along with colleagues from Bath and Cardiff. As before, we used Padlets to pool ideas and discussion points as we talked in smaller groups.

Portfolios 

Portfolios seem to be a common focus (dare I even say, headache). Bath and Cardiff have been using Mahara, and have been trying to overcome some of its limitations in-house. There was a strong feeling that none of us has found a portfolio which delivers what we need, and that if we ganged up on the providers they might be able to find a solution. The next step is to try to define what it is we actually need from a portfolio, which tools we use (or have already investigated), and what we can do to find a common solution. Some immediate themes were e-portfolios as assessment tools (and how they integrate with current systems), GDPR implications, students being able to share parts of their portfolios internally and externally, and how long students can have access to their portfolios.

MOOCs

As something we all have experience of, to a greater or lesser degree, there was inevitably quite a bit of discussion around MOOCs. We talked about the processes we follow to develop MOOCs, and the different support we provide to academics. For example, Gavin from Bath showed us how he uses Camtasia to produce videos in-house; in fact, he was able to knock up an example of such a video in 20 minutes during the session, with mini interviews and shots from the day. We also discussed the data we get from FutureLearn, and how we all find it difficult to do anything with it. With so much information, and not much time, it tends to be something we'd all like to do more with but never quite get round to.

The discussion also returned to an idea we've been kicking around within GW4 for a while: a collaborative MOOC. We discussed perhaps making courses for pre-entry undergraduates, or students embarking on PhDs, or perhaps staff development and CPD courses for new academics (of which Cardiff are already building a bank in FutureLearn). The idea of creating small modular courses or MOOCs, where each of us could provide a section based on our own expertise and interests, was also popular…let's see how this develops!

E-assessment 

Tools and systems around e-assessment were also a common theme. As well as thinking about Blackboard assignments and the use of Turnitin and QMP, there was talk about peer assessment tools and practice, and about adopting a BYOD approach. It seemed that our experiences of e-assessment were all very mixed, with huge disparity in adoption and approach within our institutions. We're all working on e-assessment, it seems: our EMA project, for example, is quite similar to Bath's. Other trials are also going ahead, such as Cardiff's trial of 'Inspera'. I think we're all keen to hear about their experiences of that project, as the Scandinavian approach to e-exams has often been heralded as the future!

What next? 

For the future, we discussed more of a 'show and tell' approach, where we could get a closer look at some of the things we're each up to. There was also talk of upping our use of communication channels between in-person meetings, particularly using the Yammer group more frequently, and perhaps holding smaller virtual meetings on specific topics.

It wasn’t decided who would host the next session, particularly as Exeter weren’t represented, although we did tentatively offer to host here at Bristol. But, seeing as Bath really did set the bar high for lunch expectations – with special mention to the excellent pies and homemade cake – if we do host I think we’d better start planning the food already…!  


Reflections on the ABC mini-conference from Suzanne

Heading to London for the ABC mini event on Friday 9 February at UCL, I was a tiny bit apprehensive. This curriculum development tool was something I had used, in various forms, but without ever actually seeing how it should be 'properly' done, or receiving any training from Clive and Natasha, who came up with it. What I soon found was that our renegade use of the tool wasn't in fact that renegade.

The morning session, where I got to actually try to develop a course using the tool, was pretty reassuring. It turns out I had been running the sessions 'properly' after all, which I would say is testament to how straightforward and logical the tool is to use.

After being on the other side of the table during a session, I learned how enjoyable it is to make such visible progress in such a short time. I also realised how much you have to remember if you end up talking through a whole sequence of learning without noting down the detail (i.e. before you 'flip the cards'). By the time we came to adding detail, we all had to try to remember what we'd had in mind. This is definitely something I'll bear in mind the next time I run a session.


As well as the hands-on session, hearing about what others have been using the method for, and what they had learned from it, was inspiring. The main things that stuck in my mind were:

  • How useful the method is as a review tool (I had previously only used it to design new courses). It helps people visualise and recognise all the great things they already do, before thinking about how they might want to develop their course for the future. The act of discussing it with others surfaces long-held beliefs and assumptions which might no longer apply. When redesigning a course, unit or programme, I can see how helpful this could be.
  • How effective the tool is at programme level. Evaluating individual courses or units takes on a new dimension when it's done in a room where all the units and courses in the programme are being evaluated at the same time. Without asking people to do this explicitly, connections between units can be spotted and developed, duplication can be discussed, and people involved across the whole programme can start to get a real sense of what the students' experience of it actually is. A 'ground-up' approach to programme development seems to emerge, which is more holistic and sustainable than a 'top-down' directive.

For our purposes, this certainly seems like a useful tool for two big projects that the University of Bristol is tackling: programme-level assessment, and embedding the Bristol Futures themes into the core curriculum. Being able to quickly map where things already happen, and then talk about them in an open and positive environment, could be a really engaging way to get these conversations started. Let's see where learning our ABCs can get us…