Accessibility, inclusivity, universal design – notes from the reading group

Naomi looked at the OU course Accessibility of e-learning and read the 10 key points from the UCL blog.

The summary comments written by Jessica Gramp gave a good overview of what the OU course covered, as well as a sense of how wide the disability scope is. It was an interesting read for someone whose knowledge of accessibility in e-learning is quite limited.

The post explains that there are two views of disability. The medical model describes ‘the problem of disability as stemming from the person’s physical or mental limitation’, while the social model ‘sees disability as society restricting those with impairments in the form of prejudice, inaccessible design, or policies of exclusion.’

The idea of society restricting those with impairments through inaccessible design was interesting, as it is something most of us have done while giving it little thought. We often like to design things to look ‘pretty’ but rarely consider those using screen readers, or think about how we would describe an image, for example. The post also notes that accessibility is about both technical and usable access for people with disabilities. Jessica gives the example of a table of data: although it may be technically accessible to someone who is blind, the meaning of the data would be lost when read out by a screen reader, making it unusable. The post and course both talk about evaluating accessibility, but for me it’s something that needs to come right at the beginning of the design. There is no point designing something around spreadsheets, for example, if screen readers won’t convey the data and its meaning correctly to users.
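As a concrete illustration of the table example (my own sketch, not from the post or course), the snippet below linearises a data table so that each value is announced together with its row label and column header – roughly what well-marked-up HTML tables let screen readers do, and exactly what a bare grid of numbers loses:

```python
# Hypothetical sketch: pair every cell with its row label and column header,
# so the table's meaning survives being read aloud one cell at a time.

def linearise_table(headers, rows):
    """Turn a table into sentences a screen reader could speak in order.

    headers: column names, the first naming the row-label column.
    rows: lists whose first element is the row label.
    """
    sentences = []
    for row in rows:
        label, *values = row
        for header, value in zip(headers[1:], values):
            sentences.append(f"{label}, {header}: {value}")
    return sentences


print(linearise_table(["Module", "2017", "2018"],
                      [["Maths", 40, 45], ["Physics", 38, 52]]))
```

Spoken in that order (“Maths, 2017: 40” and so on), each number keeps its context – the context that a purely visual layout carries for sighted readers.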

The last point Jessica makes, which I really liked, was that accessible learning environments help everyone, not just those with disabilities.

“This last point reflects my own preference for listening to academic papers while running or walking to work, when I would be otherwise unable to “read” the paper. As a student and full-time employee, being able to use this time to study enables me to manage my time effectively and merge my fitness routine, with study time. This is only possible because my lecturers, and many journals these days too, provide accessible documents that can be read out loud using my mobile smartphone.” – Jessica Gramp

A thought-provoking blog post that gave me a lot to think about and made me put more thought into the work I create online.

Whilst reading this I also came across an article on Twitter from Durham’s student paper, The Palatinate. It discusses how Durham University has introduced lecture capture to its lectures. However, the English department has opted out, citing changes to teaching relationships and a ‘lack of credible evidence that lecture capture improves academic attainment.’ In the department’s email, they talk about the ‘danger of falling attendance, and the potential compromise of the classroom as a safe place, where controversial material can be discussed.’

These are all good points, but the writer of the article argues that accessibility needs may outweigh these factors. With such a wide range of disabilities, lecture capture could provide help in lectures to those who need it. The question also needs answering: if the department isn’t going to use lecture capture, what is it doing to help its students with disabilities?

It was an interesting article that makes us think about how much weight accessibility carries within teaching and learning. It should be at the front of our minds when we first start designing how we are going to teach or present data. But there is often a stigma, and it can cause tensions and challenges. Going forward, these need to be addressed rather than ignored.

Suzi read Applying Universal Design for Learning (UDL) Principles to VLE design from the UCL blog. A short, but very thorough and clear post, written as part of UCL’s Accessible Moodle project. For the main part this is, reassuringly enough, a re-framing of things we know make for good accessible web design (resizing text, designing for screen readers, etc). However, it did include the following:

“The VLE should also offer the ability to customise the interface, in terms of re-ordering frequently accessed items, placement of menus and temporarily hiding extraneous information that may distract from the task at hand.”

Not suggestions I have seen before in an accessibility context, possibly because they are more difficult to implement. In particular, the idea of limiting distracting information – that being an accessibility issue – seems obvious once it’s been said. It’s something that would be welcome for a wide range of our students and staff.

Suzi also read Advice for making events and presentations accessible from GOV.UK. Again this is very clear, straightforward advice, well worth being aware of. The advice is for face-to-face events but covers points on supporting a partially remote audience. Some of the points that I had not thought of included:

  • Ask your participants an open question about their requirements well before the event. Their wording is “Is there anything we can do to enable you to be able to fully participate in this event?”
  • Don’t use white slide backgrounds because of the glare. For example, GOV.UK slide decks use black text on grey or white text on dark blue.
  • Give audio or text descriptions of any video in your presentation.
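The slide-background advice ties in with how contrast is usually quantified. As a hedged aside (the formula below is the standard WCAG 2.x definition, not something from the GOV.UK guidance itself), this sketch computes the contrast ratio between two colours, which is how you would check that a lower-glare palette still meets the usual 4.5:1 threshold for body text:

```python
# WCAG 2.x relative luminance and contrast ratio (sketch; colours are
# 0-255 sRGB triples).

def relative_luminance(rgb):
    """Relative luminance of an sRGB colour, per the WCAG definition."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio from 1:1 (identical colours) up to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)


# Black on white: maximum contrast, but also the glare the advice warns about.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A softer background can cut glare while keeping text comfortably above 4.5:1, which is presumably why the GOV.UK decks use black on grey rather than black on white.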

There are also some interesting suggestions in the comments. I found the comments particularly interesting as they seem to be individuals speaking directly about their own needs (or possibly those of people they work with) and what they would find most useful. Suggestions include ensuring there is good 3G or 4G coverage, as wifi might not be enough to support assistive technologies, and opening with a roll call (because as a blind person you can’t just glance around the room to see who is there). One commenter suggests you should always sing the key points from your presentation (to an existing tune, no need to compose especially) – an idea I love but not one I’m up to implementing.

Chrysanthi watched 2 videos from the list 15 inspiring inclusive design talks:

When we design for disability, we all benefit | Elise Roy

In this talk, Elise Roy gives examples of inventions that were initially inspired by, or designed for, people with disabilities, but turned out to be useful for people without disabilities as well. These include:

  1. Safety glasses that visually alert the user to changes in pitch coming from a tool (which can mean the tool will kick back) before the human ear can pick them up (her own invention).
  2. A potato peeler that was designed for people with arthritis but was so comfortable that others used it.
  3. Text messaging, which was originally conceived for deaf people.

Her suggestion is to design for people with disabilities first, rather than for the norm. The solution may then be not only inclusive but potentially better than if it had been designed for the norm. So rather than ‘accommodating’ people with disabilities, use that energy to come up with innovative solutions that benefit everyone.

Derek Featherstone: Accessibility is a Design Tool

Derek Featherstone makes a similar point to Elise Roy, that designing for accessibility can help everyone. Looking at how outliers/ people at the ends of a spectrum will be influenced by a design decision can also help understand how the average person will be affected. “If we look at the extremes, everybody else is going to be somewhere in the middle”. Between no vision and perfect vision, between no hearing and perfect hearing etc.

The main points to consider for accessibility as a design tool:

  1. People with disabilities may need specific types of content, on top of the content everyone else gets, in order to make decisions. For example, to choose a health provider they don’t just need to know how far away the provider is, but perhaps where the wheelchair ramp is at the practice, as that might affect whether they choose this one or a different one. Designers should find out what kind of extra content they need. Another example: are there captions for this film I am considering watching?
  2. When trying to make something accessible, it is important to consider why it is included in the first place, rather than just what it is. That could be the difference between providing a confusing textual description of an element, and a clear one of how the information the element portrays affects the people accessing it. E.g. instead of trying to textually describe a change of boundaries on a map, give someone the ability to look up their post code and see if they are affected by that change.
  3. Proximity: this well-known design principle of grouping related items together (e.g. images with their textual explanations, instructions with the parts they refer to) is even more important for people with certain types of disability, like low vision. It is much easier for them to lose context, as they see much less of the interface at a time. Derek suggests getting a feel for this by examining an interface through your fist, as if looking through a straw. Actions, buttons and so on should be placed where the person would expect them according to established patterns of use; if, so far, an action has appeared on a specific part of the screen, moving it will be confusing. Buttons should also be distinguishable from each other without reading: if ‘previous’ and ‘next’ buttons use exactly the same colours, font and size, the user has to read them to tell them apart.

Finally, it is important to not get so caught up in the technical requirements of making something accessible on paper, that we forget what it is we are trying to achieve.

Suzanne read New regulations for online learning accessibility (WonkHE, 23 Sept 2018)

Published in WonkHE in September 2018, this article by Robert McLaren outlines the new regulations for online learning accessibility. McLaren works for the think-tank Policy Connect, which published a report in collaboration with Blackboard Ally after the government ratified the EU Web Accessibility Directive on the 23rd of September. This directive clarifies the position of HE institutions as public sector bodies and thus includes them in the requirements for web accessibility. This means that VLEs, online documents, video recordings etc are all counted as web content, and need to meet the four principles of accessible web design: that it is perceivable, operable, understandable, and robust. Additionally, VLEs will also have to include an accessibility statement outlining the accessibility compliance of the content, directing students to tools to help them get the most from the content (such as browser plugins), and explaining how students can flag any inaccessible content. As McLaren notes, this has long been considered good practice, and isn’t really anything new, but is now a legal duty.

The article then outlines several areas which may still need addressing in VLE content. The first is ensuring content is usable. The example he uses is the prevalence of scanned PDFs, which are hard or impossible to work with (as they appear as images rather than text) not only for disabled students but also for non-disabled students and those working on mobile devices. From this point, McLaren moves on to briefly discuss the idea of universal design, which he defines as “educational practice that removes the barriers faced by disabled students and thereby benefits all students.” He claims that the rise of universal design has in part been fuelled by cuts to Disabled Students’ Allowances and an increasing shift in focus towards universities removing barriers for disabled students, rather than DSA and other measures which mitigate those barriers once they are in place.

The article then suggests a model for achieving the change required to meet these needs: “We recommended a cascading approach. Government should work with sector organisations to provide training for key staff such as learning technologists, who can in turn train and produce guidance for teaching staff.” As the report was sponsored by Blackboard Ally, it is perhaps not surprising that another part of the proposed solution is providing a range of usable and flexible resources, something Ally helps users ensure they are doing. The final remarks, however, surely stand true however they are achieved (through Ally or other means): “An inclusive approach allows all students to learn in the ways that suit them best. If the sector can respond effectively to these regulations, all students, disabled and non-disabled, will benefit from a better learning experience.”

Suggested reading

Online communities – notes from the reading group

Amy read Professors share ideas for building community in online courses. The over-arching narrative of this piece was that ‘humanizing learning’ is the most effective way to build online learning communities, which happens when students connect on an emotional and social level while engaging with the community. The author, Sharon O’Malley, suggests six methods for achieving this:

  1. Let students get to know you – instructors need to present themselves as ‘real people’ – this can be done by appearing goofy or telling genuine anecdotes in videos, for example. Students should also be encouraged to reveal their non-academic lives, in order for others to feel more like they know them personally, rather than just in the learning context
  2. Incorporate video and audio resources and feedback
  3. Meet in real time – students can talk to each other in real time and make instant connections
  4. Work in small groups – students get connected with others in their group – instead of feeling like they’re in a class of fifty, they feel they are in a class of 5, 10 etc.
  5. Require constant interaction – group projects and collaborative writing assignments force students to engage with each other out of the session
  6. Rise to the challenge – building community takes time – it takes planning and experimentation. Stick with it if it doesn’t immediately work!

Roger introduced a Building learning communities card activity. This is an activity from QAA Scotland, designed to stimulate discussion about what helps an effective learning community. The activity cards suggest the following factors:

  • Clearly defined and inclusive values
  • A clearly articulated and shared purpose
  • Clearly articulated and shared goals
  • Active and vibrant interaction
  • Owned and managed by its people
  • Dedicated structure
  • Collaboration
  • Adequate and appropriate support
  • Understood and respected expectations
  • Adequate and appropriate resources
  • Built in evaluation

The instructions ask the group to consider which of these are essential and which are “nice to haves”. The activity was certainly effective in stimulating discussion in the reading group.

Suzi watched Building Community: A Conversation with Dave Cormier, a video recording of an edX webinar from 2014. Here Cormier, who coined the term MOOC, talks to edX about how they could and should use online learning communities.

Cormier talks about four models of learning that you could scale up online:

  • One-to-one (adaptive learning, tutoring on Skype?)
  • One-to-many (video lectures on MOOCs)
  • Cooperative learning: many-to-many, all working on the same thing
  • Collaborative learning: many-to-many, shared interest but each with own project

Collaborative learning is the one which he thinks is particularly – perhaps only – served by online communities. The real life equivalent being chaos, or maybe conferences (which, arguably, don’t work well for learning).

He draws the distinction between mastery learning (where skills can be ticked off a list as you progress) and complexity. Communities are not a particularly useful tool for mastery, or for checking who has learnt what. They are much better suited for complexity. This seemed to echo discussions we’d had about the difference between using gamification and using playfulness in learning – gamification being more for mastery, playfulness for complexity.

Cormier offers some tips on building a successful community.

  • A community should have (or should move people towards building) shared values and a shared language.
  • Drive participation by having a famous person (but this can become one-to-many) or by asking annoying questions that people can’t resist engaging with (eg “how do we recognise cheating as a valuable part of education?”).
  • Shape participation by assigning roles to people and having course leader presence to set the tone.
  • Give people ways to get to know each other and make connections: recognising who people are and recognising aspects of yourself in them.

His view on evaluation and measuring success might be more specific to the MOOC context. He suggests borrowing techniques from advertising to demonstrate their value (but he doesn’t give details). The outcomes he suggests you might hope for are things like building more interest in your research area, or building the brand of an academic / department / institution.

He also asks some interesting questions. About the authenticity of the work we give to students: how will their work persist? Can it be right that so much of students’ work is destined to be thrown away? About life beyond the community: how will the community persist? Communities are emotional; you shouldn’t just pull the plug at the end.

Lots of this is challenging in an educational context. For instance, communities take time to build but we generally work with units that last for a teaching block at most. Our online Bristol Futures courses only last four weeks. I wonder if this is to do with setting expectations. Perhaps we need thin and thick communities: the thin communities being time-bound but with much more scaffolding and a narrower purpose, the thick communities being more what Cormier is talking about here.

I also read The year we wanted the internet to be smaller (on the growth of niche communities in 2017) and 11 tips for building an engaged online community (practical advice aimed at NGOs). Both are interesting in their own right and worth a read. In both, the idea of shared values, shared language and a sense of purpose came up. They also talk about recognition: communities as a place where you find “your people”. This resonates with my positive experiences of online communities but is, again, challenging in an education context. As Suzanne pointed out, I think: if the tone and being among “your people” are important, you must be able to walk out and find something different if you don’t feel comfortable. And it may be far better to work with people who aren’t just “your people”, or at least who don’t start that way.

Suggested reading

Online communities in education

From other sectors

Education communities – articles that are 10+ years old

Suggested listening

Miscellany – notes from the reading group

No theme this month – just free choice. Here’s what we read (full notes below):

Naomi read Stakeholders perspectives on graphical tools for visualising student assessment and feedback data.

This paper from the University of Plymouth looks at the development and progression of learning analytics within Higher Education. Luciana Dalla Valle, Julian Stander, Karen Gretsey, John Eales, and Yinghui Wei all contributed. It covers how four graphical visualisation methods can be used by different stakeholders to interpret assessment and feedback data, the stakeholders being external examiners, learning developers, industrialists (employers), academics and students.

The paper discusses how it is often difficult to pull information from assessments and feedback, as there can be a lot of data to cover. Graphical visualisations mean information can be shared and disseminated quickly, as there is one focal point to concentrate on. It’s mentioned that some can include ‘too much information that can be difficult for teachers to analyse when limited time is available.’ It is therefore important to evaluate the visualisations from the point of view of the different stakeholders who may be using them.

The paper looks at how learning analytics can be seen as a way to optimise learning and allow stakeholders to fully understand and take on board the information they are provided with. For students it was seen as a way to get the most out of their learning, while also flagging students facing difficulties. The paper also talks about the benefits it brings to students described as the ‘overlooked middle’. Students are able to easily compare their assessments, attainment and feedback to see their progression. Students agreed that the visualisations could assist with study organisation and module choice, and it’s also suggested that taking these analytics into account can improve social and cultural skills. For external examiners, analytics was seen as a real step forward in their learning and development: a quick way to assimilate information and improve their ‘knowledge, skills and judgement in Higher Education Assessment’. Having to judge and compare academic standards over a diverse range of assessment types is difficult, and visual graphics bring a certain simplicity to this. For learning developers too, these images and graphics are suggested to help in ‘disseminating good practice’.

The paper goes on to explain how the visualisations improve each stakeholder’s evaluation of assessment. It goes into a lot of detail about the different visualisations suggested, commenting on the benefits and drawbacks of each, which is worth a more detailed look. It should also be noted that the paper suggests there could be confidentiality or data protection issues involved in sharing or releasing data like this, as in most cases such data is only seen at faculty or school level. Student demoralisation is also mentioned near the end of the paper as a contributing factor to why these graphics may not always work well. It finishes by suggesting it would be interesting to study changes in students’ confidence and self-esteem due to assessment data sharing. It’s an interesting idea that needs to be carefully thought through and analysed to ensure it produces a positive and constructive result for all involved.

Suzanne read: Social media as a student response system: new evidence on learning impact

This paper begins with the assertion that social media is potentially a “powerful tool in higher education” due to its ubiquitous nature in today’s society, while also recognising that to date the role of social media in education has been a difficult one to pin down. There have been studies showing that it can both enhance learning and teaching and be a severe distraction for students in the classroom.

The study sets out to answer these two questions:

  • What encourages students to actively utilise social media in their learning process?
  • What pedagogical advantages are offered by social media in enhancing students’ learning experiences?

To look at these questions, the researchers used Twitter in a lecture-based setting with 150 accounting undergraduates at an Australian university. In the lectures, Twitter could be used in two ways: as a ‘backchannel’ during the lecture, and as a quiz tool. As a quiz tool, the students used a specific hashtag to tweet their answers to questions posed by the lecturer at regular intervals during the session, related to the content that had just been covered. These lectures were also recorded, and a proportion of the students only watched the recorded lecture as they were unable to attend in person. Twitter was used for two main reasons. First, the researchers assumed that many students would already be familiar and comfortable with it. Second, using Twitter wouldn’t need any additional tools, such as clickers, or software (assuming that students already had it on their devices).

Relatively early on, several drawbacks to using Twitter were noted. There was an immediate (and perhaps not surprising?) tension between the students’ and lecturers’ public and private personas on Twitter. Some students weren’t comfortable tweeting from their own personal accounts, and the researchers actually recommended that lecturers make new accounts to keep their ‘teaching’ life separate from their private lives. There was also a concern about the unpredictability of tapping into students’ social media, in that the lecturer had no control over what the students wrote in such a public setting. It also turned out (again, perhaps not surprisingly?) that not all students liked or used Twitter, and some were quite against it. Finally, it was noted that once students were on Twitter, it was extremely easy for them to get distracted.

In short, the main findings were that the students on the whole liked and used Twitter for the quiz breaks during the lecture. Students self-reported being more focused, and that the quiz breaks made the lecture more active and helped with their learning as they could check their understanding as they went. This was true for students who actively used Twitter in the lecture, those who didn’t use Twitter but were still in the lecture in person, and those who watched the online recording only. During the study, very few students used Twitter as a backchannel tool, instead preferring to ask questions by raising a hand, or in breaks or after the lecture.

Overall, I feel that this supports the idea that active learning in lectures is enhanced when students are able to interact with the material presented and the lecturer. Breaking up content and allowing students to check their understanding is a well-known and pedagogically sound approach. However, this study doesn’t really demonstrate any benefit from using Twitter, or social media, specifically. The fact that students saw the same benefit regardless of whether they used Twitter to participate or just watched the recording (pausing it to answer the questions themselves before continuing to the answers) seems to back this up. In fact, in not using Twitter in any kind of ‘social’ way, and trying to hive off a private space for lecturers and students to interact in such a public environment, the study seems to miss the point of social media altogether. For me, the initial research questions therefore remain unanswered!

Suzi read Getting things done in large organisations

I ended up with a lot to say about this so I’ve put it in a separate blog post: What can an ed techie learn from the US civil service?. Key points for me were:

  • “Influence without authority as a job description”
  • Having more of a personal agenda, and preparing for what I would say if I got 15 minutes with the VC.
  • Various pieces of good advice for working effectively with other people.

Chrysanthi read Gamification in e-mental health: Development of a digital intervention addressing severe mental illness and metabolic syndrome (2017). This paper talks about the design of a gamified mobile app that aims to help people with severe chronic mental illness in combination with metabolic syndrome. While the target group is quite niche, I love the fact that gamification is used in a context that considers the complexity of the wellbeing domain and the interaction between mental and physical wellbeing. The resulting application, MetaMood, is essentially the digital version of an existing 8-week long paper-based program with the addition of game elements. The gamification aims to increase participation, motivation and engagement with the intervention. It is designed to be used as part of a blended care approach, combined with face to face consultations. The game elements include a storyline, a helpful character, achievements, coins and a chat room, for the social element. Gamification techniques (tutorial, quest, action) were mapped to traditional techniques (lesson, task, question) to create the app.

The specific needs of the target group required the contributions of an interdisciplinary team, as well as relevant game features; e.g. the chat room includes not only a profanity filter, but also automatic intervention when keywords like ‘suicide’ are used (informing the player of various resources available to help in these cases). Scenarios, situations and names were evaluated for their potential to trigger patients, and changes were made accordingly; e.g. the religious-sounding name of a village was changed, as it could have triggered delusions.

The four clinicians who reviewed the app said it could proceed to clinical trial with no further revision required. Most would recommend it to at least some of their clients. Most viewed the content as acceptable and well targeted, and the app as interesting, fun and easy to use. I wish the results of the clinical trial had been included, but it looks like that is the next step.

Roger read “Analytics for learning design: A layered framework and tools”, an article from the British Journal of Educational Technology.

This paper explores the role analytics can play in supporting learning design. The authors propose a framework called the “Analytics layers for learning design (AL4LD)”, which has three layers: learner, design and community analytics.

Types of learner metrics include engagement, progression and student satisfaction while experiencing a learning design. Examples of data sources are VLEs or other digital learning environments, student information systems, sensor-based information collected from physical spaces, and “Institutional student information and evaluation (assessment and satisfaction) systems”. The article doesn’t go into detail about the latter, for example to explore and address the generic nature of many evaluations (e.g. the NSS), which are unlikely to provide meaningful data about the impact of specific learning designs.

Design metrics capture design decisions prior to the implementation of the design. Examples of data include learning outcomes, activities and tools used to support these. The article emphasises that “Data collection in this layer is greatly simplified when the design tools are software systems”. I would go further and suggest that it is pretty much impossible to collect this data without such a system, not least as it requires practitioners to be explicit about these decisions, which otherwise often remain hidden.

Community metrics are around “patterns of design activity within a community of teachers and related stakeholders”, which could be within or across institutions. Examples of data include types of learning design tools used and popular designs in certain contexts. These may be shared in virtual or physical spaces to raise awareness and encourage reflection.

The layers inter-connect, e.g. learner analytics could contribute to community analytics by providing evidence for the effectiveness of a design. The article goes on to describe four examples. I was particularly interested in the third one, which describes the “experimental Educational Design Studio” from the University of Technology Sydney. It is a physical space where teachers can go to explore and make designs, i.e. it also addresses the community analytics layer in a shared physical space.
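To make the layering concrete, here is a toy sketch (my own, not from the article) of AL4LD as a plain data structure, pairing each layer with the example metrics and data sources the paper names:

```python
# Hypothetical sketch of the AL4LD framework's three layers; the metric and
# source names are examples drawn from the article, not an official schema.

AL4LD = {
    "learner": {
        "metrics": ["engagement", "progression", "satisfaction"],
        "sources": ["VLE logs", "student information systems",
                    "physical-space sensors"],
    },
    "design": {
        "metrics": ["learning outcomes", "activities", "supporting tools"],
        "sources": ["design tools (ideally software systems)"],
    },
    "community": {
        "metrics": ["design tools used", "popular designs per context"],
        "sources": ["shared virtual or physical spaces"],
    },
}

def metrics_for(layer):
    """Look up the example metrics recorded for one layer."""
    return AL4LD[layer]["metrics"]


print(metrics_for("design"))
```

Even in this toy form, the design layer makes the point from the article visible: its "sources" only exist if design decisions are captured in a software system rather than left implicit.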

This was an interesting read, but in general I think the main challenge is the collection of data for the design and community aspects. For example, Diana Laurillard has been working on systems to do this for many years, e.g. the Learning Design Support Environment and the Pedagogical Patterns Collector, but there seems to have been little traction.

Amy read: Addressing cheating in e-assessment using student authentication and authorship checking systems: teachers’ perspectives. Student authentication and authorship systems are becoming increasingly widely used in universities across the world, with many believing that cheating is on the rise across a range of assessments. This paper looks at two universities (University A in Turkey and University B in Bulgaria) which have implemented the TeSLA system (an Adaptive Trust-based eAssessment System for Learning). The paper doesn’t review the effectiveness of the TeSLA system, but rather the views of the teachers on whether the system will affect the amount of cheating taking place.

The research’s main aim is to explore the basic rationale for the use of student authentication and authorship systems, and within that, to look at four specific issues:

  1. How concerned are teachers about the issue of cheating and plagiarism in their courses?
  2. What cheating and plagiarism have teachers observed?
  3. If eAssessment were introduced in their courses, what impact do teachers think it might have on cheating and plagiarism?
  4. How do teachers view the possible use of student authentication and authorship checking systems, and how well would such systems fit with their present and potential future assessment practices?

Data was collected across three different teaching environments: face-to-face teaching, distance learning and blended learning. Data was collected via questionnaires and interviews with staff and students.

The findings, for the most part, were not hugely surprising: the main type of cheating that took place at both universities was plagiarism, followed by ghost-writing (or the use of ‘essay mills’). These were the most common methods of cheating in both exam rooms and online. The reasons staff believed students cheated also varied widely from the reasons students themselves gave. Both teachers and students believed that:

  • Students wanted to get higher grades
  • The internet encourages cheating and plagiarism, and makes it easy to do so
  • There would not be any serious consequences if cheating and plagiarism was discovered

However, teachers also believed that students were lazy and wanted to take the easy way out, whereas students blamed pressure from their parents, and the fact that they had jobs as well as studying, as reasons.

Overall, staff were concerned about cheating, and believed it was a widespread and serious problem. The most common and widespread problems were plagiarism and ghost-writing, followed by copying and communicating with others during assessments. When asked about ways of preventing cheating and plagiarism, teachers were most likely to recommend changes to educational approaches, followed by assessment design, technology and sanctions. Teachers across the different teaching environments (face-to-face, blended and distance learning) were all concerned about the increase in cheating that may take place with online eAssessments. This was especially the case for staff who taught on distance learning courses, where students currently take an exam under strict conditions. Finally, all staff believed that the use of student authentication and authorship tools enabled greater flexibility in access for those who found it difficult to travel, as well as in forms of assessment. However, staff believed that cheating could still take place regardless of these systems, but that technology could be used in conjunction with other tools and methods to reduce cheating in online assessments.

Wellbeing – notes from the reading group

Roger read: Curriculum design for wellbeing.

This is part of an online professional development course for academics, produced by a project run by a number of Australian universities and co-ordinated by the University of Melbourne. It aims to build the capacity of university educators to design curriculum and create teaching and learning environments that enhance student mental wellbeing. There are five modules: student wellbeing, curriculum design, teaching strategies, difficult conversations and your wellbeing.

I focussed on module 2, which is on curriculum design. It starts by stressing the importance of students experiencing, through the curriculum, autonomous motivation, a sense of belonging, positive relationships, and feelings of autonomy and competence (M-BRAC). All of these are aspects of good practice in curriculum design.

It goes on to consider how elements of curriculum design support student mental wellbeing, covering alignment, organisation and sequencing of content, engaging learning activities and a focus on assessment for learning.

For example, aligning ILOs with assessment and learning activities helps student autonomy, as students understand how what they are doing contributes to their goals, and they develop their knowledge and skills, including self-regulation, to achieve the ILOs. Assessment for learning plays a key role here. Clear organisation and sequencing of content contribute to effective learning. Both alignment and structure help to build students’ sense of competence. Engaging and meaningful learning activities can increase student motivation and encourage peer interaction, which can contribute to building relationships and a sense of belonging.

It suggests that when reviewing the curriculum one should ask:

  • How will the curriculum be experienced by my (diverse) students eg international students, mature students, “first in family” students?
  • Will the curriculum foster or thwart experiences of M-BRAC?

The module then has a number of FAQs you are asked to consider, with suggested answers. These were really useful as they tease out some of the complexities, for example “Is setting group assignments in the first year a good way of helping students develop positive peer relationships, and feel a sense of connectedness or belonging?”  Here they recognise that if not well-designed or if students are not supported to develop group work skills it can have a negative impact.

The module ends with a set of case studies illustrating how curricula have been re-designed to better support M-BRAC.

Amy read: Approach your laptop mindfully to avoid digital overload.

This was a short article recognising the ever-increasing belief that we are being constantly bombarded with masses of new information, which, in turn, means that many are suffering from stress-related diseases, anxiety and depression. Our reliance on digital devices to provide constant streams of information in the form of news articles, social media feeds and messages means that we feel lost without them. A full digital detox is suggested at the beginning of the article, though this may be a short-term solution and often an impractical one.

The authors of this article suggest introducing the practice of mindfulness into our lives to combat this. They describe mindfulness as ‘a moment-to-moment attention to present experience with a stance of open curiosity’. Mindfulness has been studied extensively by the medical community and has been shown to help with stress, anxiety and depression. One can ‘reprogramme’ the mind to deal with stresses more easily by training it to be more present. The authors suggest two key ways to introduce mindfulness into our use of digital devices to reduce the pressures they can put on us.

One of the methods they suggest is ‘mindful emailing’, which includes practices such as taking three breaths before responding to a stressful email, and considering the psychological effect that the email will have on its recipients.

The second method they suggest is the mindful use of social media, citing ‘checking our intentions before uploading a feed (post?), being authentic in our communications and  choosing the time we spend on social media, rather than falling into it.’

If you haven’t tried mindfulness before take a look at these tips and short audio meditations.

Chrysanthi read: Designing a product with mental health issues in mind,

This article – true to its title – talks about including technological features that aim to help vulnerable users. While the examples given are taken from a banking application context, the suggestions can be applied to other contexts. More specifically, the article mentions positive friction and pre-emptive action. Positive friction goes against developers’ usual desire to make everything easier and faster, and instead aims to put some necessary obstacles in the way for users who need them. The example used is allowing certain users with somewhat “impulsive” behaviour to check their recent purchases and confirm that they indeed want them. This would help, for example, people with bipolar disorder, who may overspend in their manic phase, often at night, and slip into depression in the morning because of their irreversible mistake. In the specific app, this is still a speculative feature. Pre-emptive action aims to prevent trouble by anticipating certain events and acting on them, e.g. perceiving a halt in income and sending a well-timed notification to start a conversation so the person doesn’t end up in debt (and therefore create more stress for themselves). It also includes allowing vulnerable customers to choose their preferred time and form of communication (e.g. phone might be anxiety-inducing, or email might seem complex).

In an education context, positive friction could be relevant in cases where students are repeatedly doing things they no longer need to do. This would help when – under the illusion they are still learning – students are focusing on redoing exercises they already know how to do – which might help them feel accomplished but doesn’t add value from some point on – or on consuming more and more content, even when they haven’t actually digested what they have learned so far. It isn’t very clear how this could be applied during exam periods, though. Pre-emptive action is perhaps easier to translate into an educational context. Any action (or inaction) that is either outside the student’s usual pattern or outside a successful pattern might be a conversation starting point, or a trigger for suggestions for alternative ways to handle their studies. It could also mean allowing them different options to learn and communicate with their professors and peers.

Chrysanthi also read: E-mail at work: A cause for concern? The implications of the new communication technologies for health, wellbeing and productivity at work.

This paper explores the potential negative implications of using email at work. The email features they consider as potentially problematic are: speed, multiple addressability, recordability, processing and routing. Essentially, email is a message that can be instantly transmitted to several people at once, automatically stored, easily altered, and different versions of it sent to various people, not all of whom are necessarily visible on the recipient list. According to the authors, emails may increase stress by increasing workload and interruptions, adding difficulty to interpretation of the message and tone, increasing misunderstandings and groupthink, reducing helpful argumentation and social support, increasing isolation and surveillance – which increases discontent, offering a new ground for bullying and harassment, or hindering the processing of negative feelings. Having established these potential negative implications, the authors point out that more research is needed to understand the optimal ways to use email at work, for effective communication and humane workplaces.

Naomi read: Did a five-day camp without digital devices really boost children’s interpersonal skills?

This article was about a brief study led by Yalda Uhls in Southern California. It studied two groups of pupils who on average spent 4.5 hours a day ‘texting, watching TV, and video gaming’. Half of the children were sent on a five-day educational camp in the countryside where all technical devices were banned. The other half stayed at school as usual. Emotional and psychological tests were carried out on the students before and after the five days were completed.

There was a small amount of evidence to suggest the children who had spent time away from devices improved psychologically over the five days. However, because there were several small problems with the study, no firm answers can be taken from it. It’s suggested the children who went away for the five days only looked like they improved on the tests because they started at a lower level than the children who stayed at school. It was also suggested that the test scores of the children who stayed at school deteriorated because they were tired from doing a week’s work.

As it suggests in the article, the results of this study weren’t particularly hard-hitting, but it does raise the question of how much the younger generations are using their devices throughout the day. Uhls does admit in the article that there were shortcomings to the study, but suggests that these findings relate to the ‘wider context of technology fears’ and hopes the paper will be ‘a call to action for research that thoroughly and systematically examines the effects of digital media on children’s social development.’ Although the study needed a more comprehensive approach, the ideas behind it are interesting and relate to several issues that we see in everyday life – is it good for us to spend so much time on our devices, or is it integral to how we live now?

Suzi read: Learning in the age of digital distraction

This interview with Adam Gazzaley, a neurologist, is a plug for a book called The Distracted Mind, in which he and research psychologist Larry D Rosen discuss how the way our brains are wired influences how we use technology.

They suggest that information-foraging is a development of our evolved food-foraging mechanisms, and so is to some extent driven by our very basic drive to survive. Because of this it is hard to prevent it from distracting us from our ability to set and pursue higher-level goals.

Information-foraging doesn’t just impact people’s ability to focus; it can cause anxiety and stress, and affect mood.

Suggestions for possible ways to combat this include:

  • accepting that we need to (re-)learn to focus (they are also developing brain-training video games)
  • using play in education (but this was only very briefly mentioned; I wasn’t clear if this was playfulness or gamification)
  • physical exercise
  • mindfulness
  • sitting down to dinner as a family, or otherwise building in device-free interaction time

Suggested reading

Playful learning – notes from the reading group

Suzi read Playful learning: tools, techniques, and tactics by Nicola Whitton

This is a useful scene-setting article, suggesting ways of framing discussions on playful learning and pointing the way to unexplored territory suitable for future research.

There are three ideas about how to talk about playful learning:

  • The magic circle – a socially constructed space in which play can happen
  • A mapping of aspects of games onto playful learning: surface structures of playful learning <> mechanics of games, deep structures <> activities of play, implicit structure <> philosophy of playfulness
  • Tools / techniques / tactics of playful learning: objects artefacts & technologies / approaches / mechanics and attributes – these could serve as prompts for getting playfulness into teaching

Whitton suggests three characteristics of the magic circle which make it pedagogically useful: “the positive construction of failure; support for learners to immerse themselves in the spirit of play; and the development of intrinsic motivation to engage with learning activities.” In playful activities, failure will be framed positively, participants suspend disbelief which can encourage creativity, and participation is voluntary so there is intrinsic motivation (the difficulty of this last in particular in a formal education setting is acknowledged).

There’s a lot of acknowledgement that playfulness may not be an easy fit in higher education. Obstacles include: the inescapability of real-world power relationships, confusing gamification with true playfulness, the need for things to be mandatory and assessed, existing attitudes to failure, prejudice about play being for children, and lack of time, confidence and social capital.

I wasn’t certain about the point about play being a privilege. While certain types of play might attract a relatively slender demographic (escape rooms, real-world games) and it’s important not to assume that everyone would want to engage in these, adults seem to play to learn in a range of contexts. I thought about the kinds of spaces where you would see playful learning: cooking, karaoke, parkour, getting dressed up, new social media platforms (when FB started, everyone was poking each other, biting each other and throwing bananas; hashtags came from playing with the way Twitter worked), and other adult pursuits. There is playfulness in higher education too, although it’s often not explicitly described as such. Maybe there is a danger of rarefying play and almost making it by definition a domain for geeks alone, not recognising play that has not been made explicit.

This got me thinking about why we play, and why we might want to play in HE, and about one of my favourite quotes:

“The things we want are transformative, and we don’t know or only think we know what is on the other side of that transformation. Love, wisdom, grace, inspiration – how do you go about finding these things that are in some ways about extending the boundaries of the self into unknown territory, about becoming someone else?”

— Rebecca Solnit, A field guide to getting lost

This sums up a lot of what university could and should be. Playfulness has to have a key role in that: a place to play with possible selves, both academic and social.

Chrysanthi read Gamifying education: what is known, what is believed and what remains uncertain: a critical review by Christo Dichev and Darina Dicheva.

This is a review aiming to establish what is known about gamification in educational contexts based on empirical evidence, rather than beliefs. The authors seem to find that much more is believed or uncertain than is known. Their main findings are that a. gamification has started being used at a pace much faster than researchers can understand how it works, b. there is very little knowledge about how to effectively apply it in specific contexts, and c. there is not enough evidence about its long-term benefit.

While the understanding of how to engage, motivate and aid learning through gamification is inadequate, researchers are still praising the practice, thus inflating expectations about its effectiveness. The frequent use of performance-centric game elements like points, levels, badges and leaderboards is noteworthy; in the absence of justification from the researchers implementing them, the authors hypothesise that this happens because they are similar to traditional classroom practices and easy to implement. But this leaves other major game elements out; the authors note – among others – role play, narrative, choice and low-risk failure. These narrow elements are then expected to affect broad concepts like motivation, with researchers often concluding that that is the case, without enough evidence to support the claim.

This implies a somewhat blind application of the easiest-to-implement elements of gamification, with the belief that it will be enough to motivate students to perform better. But how are points different to marks and levels different to grades and chapters?

Perhaps gamification can’t be a canned, one-size-fits-all-learning-contexts solution. Perhaps researchers and practitioners need to put in time and at least a bare minimum of imagination to create something that will be engaging enough for students, so that the evidence supporting it is not stamped “inconclusive” when under scrutiny.

David read Playful learning in Higher Education: developing a signature pedagogy by Rikke Toft Nørgård, Claus Toft-Nielsen & Nicola Whitton (2017)

This paper starts off with a bit of a rant about the commercialisation of higher education and the focus on metrics to measure performance, and how this creates an assessment-driven environment focused on goal-oriented behaviours characterised by avoidance of risk and fear of failure. The authors see recent gameful approaches in higher education as a response to this, but warn that while gamification may increase motivation, games often focus on extrinsic motivational drivers and the results may be short-lived. They also cite research which points to issues around perceived appropriateness and students manipulating points-based incentive systems (and my colleagues and I have encountered examples of this in our teaching).

In contrast to gamification, they regard playful learning as something which encourages intrinsic and longer-term motivation by offering the chance to explore and experiment without fear of being judged for failure, and therefore being able to learn from it. They use the ideas of the ‘magic circle’ and ‘lusory attitude’ to describe the environment in which this can occur. The concept of the magic circle is used a lot in gaming and is a metaphor for the ‘space’ we enter into when we fully engage with a game, accepting the different norms and codes of practice (or actively constructing them with other ‘players’). This space can be physical (e.g. sports), virtual/imaginary (e.g. computer games) or a combination of both (e.g. a child’s tea party). For this to work, we need to assume a ‘lusory attitude’. This gives participants a shared mindset in which they are free to play, experiment and fail in a safe place.


The authors then turn to the question of how to implement such an approach. Using the results of two studies about what students report (a) makes their learning enjoyable and (b) disengages them, they develop a ‘signature pedagogy’ for playful learning in higher education. The notion of ‘signature pedagogy’ they adopt is split into three levels:

  • The foundation is formed by Implicit (playful) structures, which are the necessary assumptions and attitudes (values, habits, ethics)
    • Lusory attitude
    • Democratic values and openness
    • Acceptance of risk-taking and failure
    • Intrinsic motivation
  • Deep (play) structures represent the nature of the activities which the implicit structures facilitate
    • Active and physical engagement
    • Collaboration with diverse others
    • Imagining possibilities
    • Novelty and surprise
  • Surface (game) structures are the ‘mechanics’ of an activity, including the materials, tools and actions involved
    • Ease of entry and explicit progression
    • Appropriate and flexible levels of challenge
    • Engaging game mechanics
    • Physical or digital artefacts

The authors see the implicit (playful) structures as the necessary starting point for their ‘signature pedagogy’ but do not say how students get to this point. Indeed they acknowledge the inherent paradox in their model:

“…for many students to view learning as valuable then it must be valued by the system (assessed), yet it is simultaneously this assessment that makes learning stressful and undermines the creation of a safe and comfortable environment.”

For me, then, this article leaves three interrelated questions to be discussed:

  1. For playful learning to be successful, do students need to have the implicit structures already in place or can students acquire these through the playful activity itself?
  2. If these implicit structures are prerequisite, how do we get students to acquire them?
  3. As this involves a change in students’ attitudes which the authors argue are reinforced by the current assessment-driven environment, does this pedagogical approach have any chance of success without change at the programme or institutional level?

Suzanne read Unhappy families: using tabletop games as a technology to understand play in education by John Lean, Sam Illingworth, Paul Wake, published in the ALT Journal special issue

In this article, the authors decide to take a step back when considering the ‘future’ of digital technologies in relation to playful learning, by considering traditional tabletop games as a form of technology. They aimed to better understand the affordances of digital game tools by looking at tabletop games as an analogue, in order to reflect critically on the pedagogical uses of games and playful learning. Their hypothesis was that tabletop games (see the article for a full definition of how they classify a game as ‘tabletop’) are successful because: 1) they provide an immediate and accessible shared space, which is also social; 2) this space and the game are both easily modified by players and educators; and 3) they provide a tactile, sensory experience. So, in essence, that they are social, modifiable and tactile, which are all things that could be transferred into digital games in education, but which are often overlooked.

To explore this hypothesis, they used a specific game, ‘Gloom‘, which was played by participants at the 2017 ALT Playful Learning Conference. In relation to the first hypothesis, they found that the game encouraged a lot of social interaction. Firstly, the game encourages players to talk about their recent lived experiences as a means of deciding who gets to go first (i.e. who has had the worst day thus far). Additionally, there is a storytelling element of the game, which also encourages an empathetic interaction between the game and the players, as well as between players.

Regarding how modifiable the game is, the players found it was easy to change and adapt the game, even during play. They also explored the ways of playing around with and stretching the rules, to create different rules or games within the game play. The authors note that this is often not as easily achieved in digital games, where rules can be more fixed and more difficult to circumvent. Thirdly, the players did undoubtedly find the game tactile, as the cards provide a physical element, further enhanced by the way the cards are played. The cards themselves have transparent elements, so as you stack cards you create different versions of them, allowing for the storytelling element.

In conclusion, the authors used this game play experience to revisit some preconceptions about what ‘play’ or ‘playfulness’ is in a game context. They felt that the ‘true’ play seemed to happen when the players had modified the rules to the point where the game itself was almost no longer required. The players were exploring and testing their new game playfully, in the way that they were interacting with each other and the environment. In terms of education, they felt that this playfulness had great potential for learning. The process of negotiating the play, and working out how to play with others who might have different ideas to you (for example, either wanting to stick to the rules or wanting to break them) is potentially a powerful social learning opportunity.

However, they also noted that this very character of playful learning – that it is negotiated and created by the context and participants – makes it extremely difficult to categorise or understand pedagogically. If we need to allow for such variety of outcomes in playful learning, it can be difficult to work out how we can situate it within other educational structures, like lesson plans or learning objectives.

Suggested reading

Digital and physical spaces – notes from the reading group

Amy read – Institutional to Individual: realising the postdigital VLE? by Lawrie Phipps.

Lawrie starts this article by quoting himself – ‘Digital is about people’. He believes that learning is effective when we are connecting in conversations and in groups – this has been proven many times over – but that these conversations should not be confined. The ‘confinement’ he talks of is the attempt by unnamed institutions to restrict their teaching staff by controlling the access to and provision of alternative tools, which, Lawrie argues, often don’t align with their everyday activities. He mentions two projects taking place at universities – the Personalised User Learning & Social Environments (PULSE) project at Leeds Beckett (difficult to find anything about this online) and the Aula team, who have created a ‘conversational layer’ to run alongside a VLE and provide an ‘ecosystem for a range of other tools’.

The article moves on to discuss the emerging trend of disaggregation as being an indicator of ‘post digital academic practice’… I’d be interested to know what he means by this, but the article does not shed any light on it. If I were to guess at its meaning, I would think that the digital is becoming so integrated into our lives that it can no longer be considered a practice – it is seamless, and therefore doesn’t need to be recognised. He reminds us to be mindful of the other emerging themes of digital spaces: control, surveillance and ‘weaponised’ metrics used by corporate bodies, and the use of algorithms to control our feeds.

Lawrie finishes by letting us know that ‘the report’ (I presume the ‘Next Generation Digital Learning Environments’ report, mentioned earlier in the article) is coming together nicely, and urges the reader to get in touch if they have any relevant cases of disaggregation for practical purposes.

 

Chrysanthi read Digital sanctuary and anonymity on campus by Sian Bayne.

The article is trying to make a case for anonymity in online social exchange in the context of higher education.

The author points out that since part of the point of higher education is to help students own and defend their knowledge, anonymity is unusual, barring exceptions like peer review. But this works better for those with privileges than those without, and it doesn’t work for every topic that a student might be interested in. In their view, anonymity offers 1. social value and 2. a way to resist digital surveillance. By looking at the use of an anonymous social media app called Yik Yak – which was popular for two or three years, but then removed anonymity and subsequently closed – they realised that it was often used to facilitate anonymous peer support, which was very helpful to students on topics like social difficulties or isolation, relationships, health (sexual and emotional) or teaching-related issues.

Anonymity also serves to resist the ubiquitous surveillance that occurs in large part through social media platforms, which record everything individuals do and like for their own financial benefit. But there can be online social networks where students don’t need to hand over their data to be able to use them.

They argue that the absence of an app like this reduces students’ opportunities for peer support, and that the counter-argument usually put forward – that anonymous spaces facilitate abuse – is weak, considering abuse can and does happen everywhere, including on non-anonymous social media like Facebook. They are concerned about where the supportive conversations that people would previously have had anonymously are happening now, for topics like mental health or relationships. Overall, they believe universities need such anonymous spaces and should figure out how to implement them, balancing data, trust and safety.

I think the author makes good points. Regarding where the conversations are happening now, I am assuming:

1. other anonymous but not higher education specific spaces, such as reddit, which means people will get support, although from a broader population that is not coming from the same context, with all the challenges this implies.

2. non anonymous spaces, like Facebook, which means people are essentially broadcasting their issues on platforms that a. may use this information for their benefit and the student’s detriment, b. store and display the data with the user’s name for a long time, with no guarantees for who can/ cannot see it. This makes abuse easier, as well as enabling people looking up the individual (e.g. future employers) to see information they should otherwise not have access to.

3. nowhere – people are not getting support, which could lead to isolation.

Overall, I do see the point of universities implementing anonymous digital spaces for their students.

 

Naomi read: The SCALE-UP Project: A Student-Centred Active Learning Environment for Undergraduate Programmes by Robert J. Beichner.

The author starts by describing SCALE-UP classrooms as places where ‘student teams are given interesting things to investigate, while their instructor roams.’ This is one of the few places where we hear about how the physical space itself is designed to improve learning and collaboration. The purpose of these teaching spaces is to encourage discussion between students and their peers. By working in small groups at separate tables within the classroom, students can work on separate activities and use a shared laptop or whiteboard to research or note their findings. They can then discuss these with other groups.

The main point of the paper revolves around the idea of social interaction between students and their teachers being the ‘active ingredient’ that makes this approach to teaching work. Beichner talks about how students in these classes gain a better conceptual understanding than students taking traditional lecture-based classes. Studies saw a marked rise in students’ confidence and problem-solving skills, as well as in teamwork and communication. There is some concern about whether this approach means less content is being delivered to the students, but Beichner argues the content is being developed and created by the students themselves.

Discussion-led learning is always going to be popular, but we need to think about the physical space too, and whether it is needed or not. The size of these classes needs to be considered as well – what can be classed as too big? Beichner’s study was interesting, but not surprising, and it would have been good to know how the design of the space and tables aided the learning too.

 

Suzi read The Educause NDingle and an API of one’s own by Michael Feldstein (which is a rebuttal to a rebuttal of the Educause Next Generation Digital Learning Environment report)

This is a clear and interesting article discussing where learning management systems (LMSs) could/should go – as digital spaces for learning. The perspective on this is relatively technical, discussing the underlying architecture of the system, but the key ideas are very approachable:

LMSs could move from being one application that tries to do everything, to being more like an operating system on a mobile phone – hosting apps and managing the ways they can communicate with each other.

Lego is also used as a metaphor for this more adaptable LMS, but Feldstein discusses the tension between having fairly generic blocks that don’t build anything in particular but allow you to be very creative (Lego from my childhood), and having sets which are intended to build a particular thing but which are then less adaptable (more typical of modern Lego). I found this a harder idea to apply, though I can appreciate that just because something comes in blocks and can be taken apart, doesn’t mean it is genuinely flexible and adaptable.

Personal ownership of data is discussed – the idea of students even hosting their own work and having a personal API via which they grant the institution’s LMS (and hence teachers) access to read and provide feedback on their work (“an API of one’s own”). This seems to me an attractive idea, in a purist origins-of-the-web way. People have suggested similar approaches in various domains, social media in particular, and I don’t know of any that have worked.
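As a toy illustration (entirely my own, not something from Feldstein’s article), the “API of one’s own” idea might look something like the sketch below: a student-owned data store issues scoped tokens, and the institution’s LMS can only read work or leave feedback through those tokens, without ever owning the data. All class and method names here are hypothetical.

```python
import secrets

class PersonalDataStore:
    """Toy sketch of a student-owned store with a token-scoped API.
    Hypothetical names throughout; the article describes the idea,
    not an implementation."""

    def __init__(self):
        self._work = {}       # assignment id -> submission text
        self._feedback = {}   # assignment id -> list of feedback notes
        self._tokens = {}     # token -> set of granted scopes

    def submit(self, assignment_id, text):
        self._work[assignment_id] = text

    def grant(self, scopes):
        """The student issues a token granting the LMS limited access."""
        token = secrets.token_hex(8)
        self._tokens[token] = set(scopes)
        return token

    def read_work(self, token, assignment_id):
        if "read" not in self._tokens.get(token, set()):
            raise PermissionError("token lacks 'read' scope")
        return self._work[assignment_id]

    def add_feedback(self, token, assignment_id, note):
        if "feedback" not in self._tokens.get(token, set()):
            raise PermissionError("token lacks 'feedback' scope")
        self._feedback.setdefault(assignment_id, []).append(note)

# The student submits work and grants the LMS a read+feedback token
store = PersonalDataStore()
store.submit("essay-1", "Draft of my essay...")
token = store.grant(["read", "feedback"])

# The LMS (acting for the teacher) reads the work and leaves feedback
print(store.read_work(token, "essay-1"))
store.add_feedback(token, "essay-1", "Good start; expand section 2.")
```

The point of the sketch is the inversion of ownership: the student’s store is the system of record, and the LMS is just one client whose access can be revoked.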

 

Suzanne read Semiotic Social Spaces and Affinity Spaces: From The Age of Mythology to Today’s Schools by James Paul Gee. The premise of this text is to reconsider the idea of a community of practice, to think about it as related to the space in which people interact (and in what way), rather than membership of the community (particularly membership given to people by others, or through arbitrary groupings). Gee argues that thinking about community in this way is more useful, as membership means so many different things to different people, so trying to decide who is ‘in’ or ‘out’ of a group is problematic. He explains his ‘alternative’ to thinking about a ‘community of practice’ as an ‘affinity space’ in quite a lot of detail, using the analogy of a real-time computer game as an example, which I won’t try to explain fully here. However, some key ideas around what makes an ‘affinity space’ are that there needs to be some kind of content, generated by the community around a common endeavour. The people who interact with this content do so with an agreed set of ‘signs’ with their own particular ‘grammar’ or rules. This grammar can be internal (signs decided on within the group), or external (eg the way that people’s beliefs and identities are formed around these signs, and their relationship with them), and the external grammar can influence the internal grammar. Another interesting aspect is the idea of portals. An affinity space will have a number of ways that people can interact with it. To take the game example, the game itself could be a portal, but so could a website about game strategy, or a forum discussing the game. Importantly, the content, signs and grammar of the space can be changed by those interacting through those portals, so the content is not fixed. The final points are that people interacting in the space are both ‘expert’ and ‘novice’, and both intensive and extensive knowledge is valued.
Individuals with specific skills, or who have a great amount of knowledge about a specific thing, are as valued by the space as those who work to build a more distributed community of knowledge, and there are many different ways people can participate. Gee’s text presents quite an in-depth concept, which seems quite theoretical. However, thinking about something like the Bristol Futures themes (Global Citizenship, Innovation and Enterprise or Sustainable Futures), we discussed how it might be applied, and how it might help us to think about things like reward and recognition, or success measures, in a very different way.

Suggested reading

Programme level assessment – notes from the reading group

Suzi read Handbook from UMass – PROGRAM-Based Review and Assessment: Tools and Techniques for Program Improvement

A really clear and useful guide to the process of setting up programme level assessment. The guide contains well-pitched explanations, along with activities, worksheets, and concrete examples for each stage of the process: understanding assessment, defining programme goals and objectives, designing the assessment, selecting assessment methods, analysing and reporting. Even the “how to use this guide” section struck me as helpful, which is unheard of.

The proviso is that your understanding of what assessment is for would need to align with theirs, or you would need to be mindful of where it doesn’t. As others do, they talk about assessment to improve, to inform, and to prove and they do also nod to external requirements (QAA, TEF, etc in our context). However, their focus is on assessment as part of the project of continual (action) research into, and improvement of, education in the context of the department’s broader mission. This is a more holistic approach that might bring in a wide range of measures including student evaluations of the units, data about attendance, and input from employers. I like this focus but it might not be what people are expecting.

During the group we discussed the idea of combining some of the ideas from this, and the approach Suzanne read about (see below). A central team would collaborate with academic staff within the department in what is essentially a research project, supporting conversations between staff on a project, bringing in the student voice and leaving them with the evidence-base and tools to drive conversations about education in their context – empowering staff.

(Side note – on reflection I’m pretty sure this is the reason this particular reading appealed to me.)

Chrysanthi read Characterising programme‐level assessment environments that support learning by Graham Gibbs & Harriet Dunbar‐Goddet.

The authors propose a methodology for characterising programme-level assessment environments, so that they can later be studied along with the students’ learning.

In a nutshell, they selected 9 characteristics that are considered important either in quality assessment or for learning (e.g. variety and volume of assessment). Some of these were similar to the TESTA methodology Suzanne described. They selected 3 institutions that differed in terms of structure (e.g. more or less fixed, with less or more choice of modules, traditional assessment or a variety of assessment methods, etc.). They selected 3 subject areas, the same in all institutions. They then collected data about the assessment in these and coded each characteristic into 3 categories: low, medium, high. Finally, they classified each characteristic for each subject in each institution according to this coding. They found that the characteristics were generally consistent within each institution, suggesting a cultural approach to assessment rather than a subject-related one. They also identified patterns, e.g. that assessment aligned well with goals correlates with variety in methods. While the methodology is useful, their coding of characteristics as low-medium-high is arbitrary and their sample small, so the stated quantities in the 3 categories are not necessarily good guidelines.

Chrysanthi also watched a video by the same author whose work Suzanne read: Tansy Jessop: Improving student learning from assessment and feedback – a programme-level view (video, 30 mins).

There was a comparison of 2 contradictory case studies: 1 that seemed like a “model” assessment environment, but where the students did not put in much effort, were unclear about the goals and were unhappy, and 1 that seemed problematic in terms of assessment, but where students knew the goals and were satisfied. The conclusion was that rather than having a teacher plan a course perfectly and transmit large amounts of feedback to each student, it might be worth encouraging students to construct their learning themselves in a “messy” context, expanding constructivism to assessment as well.

Additionally, as students are more motivated by summative assessment, the speaker suggested staged assessment, where students are required to complete some formative assessment that feeds into their summative assessment. Amy & Chris suggested that this has already started happening in some courses.

Finally, the speaker noted that making formative assessment publicly available, such as in blog posts, motivates students; that it would be better if assessment encouraged working steadily throughout the term, rather than mainly at peak times around examinations; and that feedback is important for goal clarity and overall satisfaction.

Both paper and video emphasised the wide variety in assessment characteristics between different programs. In the paper’s authors’ words, “one wonders what the variation might have been in the absence of a quality assurance system”.

The discussion went into the marking system and the importance students give to the numbers, even though these are often irrelevant to the big picture and their future jobs.

Amy shared a summary she had created after attending a Chris Rust Assessment Workshop at the University. The workshop focussed on the benefits of programme-level assessment, looking at the current problems with assessment in universities and offering practical solutions and advice on creating programme-level assessments. The workshop started by looking at curriculum sequencing – its benefits and drawbacks – and illustrated this with examples where it had been successful.

Chris then discussed ‘capstone and cornerstone’ modules as a model for programme-level assessment, and explained where it had been a success in other universities. He discussed the pseudo-currency of marks and looked at ways we can alter our marking systems to improve students’ attitudes to assessment and feedback. He ended the session by looking at the ways you can engage students with feedback effectively, and workshop attendees shared their advice with colleagues on how they engage their students with feedback. You can find the summary here.

Suzanne read Transforming assessment through the TESTA project by Tansy Jessop (who will be the next Education Excellence speaker) and Yaz El Hakim, which briefly describes the TESTA project, the methods they use and the outcomes they have noted so far. There are also references within the text to more detailed publications on specific areas of the methods, or on specific outcomes, if you want to find out more detail.

In brief, the TESTA project started in 2009, and has now expanded to 20 universities in the UK, Australia and the Netherlands, with 70 programmes having used TESTA to develop their assessment. The article begins by giving a pretty comprehensive overview of the reasons why programme assessment is so high on the agenda, including the recognition that assessment affects student study behaviours, and that assessment demonstrates what we value in learning, so we should make sure it really is focused on the right things. There was also a discussion about how the ‘modularisation’ of university study has left us with a situation of very separated assessments, which make it difficult to really see the impact of assessment practices across a programme, particularly for students who take a slower approach to learning. Ultimately the TESTA project is about getting people to talk about their practices on a ‘big picture’ level, identify areas which could be improved, and then work from a base of evidence to make those improvements. There is a detailed system of auditing current courses, including sampling, interviews with teaching staff and programme directors, student questionnaires, and focus groups. The information from this is then used as a catalyst for discussion and change, which will manifest differently in each programme and context.

The final paragraph of the report sums it up quite well: “The value of TESTA seems to lie in getting whole programmes to discuss evidence and work together at addressing assessment and feedback issues as a team, with their disciplinary knowledge, experience of students, and understanding of resource implications. The voice of students, corroborated by statistics and programme evidence has a powerful and particular effect on programme teams, especially as discussion usually raises awareness of how students learn best.”

Suggested reading

Role and future of universities – notes from the reading group

Maggie read Artificial intelligence will transform universities. Here’s how – World Economic Forum
The article presents the idea that universities need to innovate and evolve to meet the changing needs caused by the upsurge in AI. Already, the marking of student papers is becoming a thing of the past, as AI is able to assess and even “flag up” issues with ethics. Students are less able to distinguish between teacher marking and that of a “bot”. Teaching is additionally being impacted, as students are able to undertake statistics courses using AI, massively reducing learning (and human teacher) time, with apparently equal learning and application outcomes. The author argues that universities will need to up their game regarding employability, and indeed attractive employment (remuneration). The paper is an easy-to-read item and clearly outlines the range of benefits and subsequent issues in relation to AI. All pertinent.

Suzi read three short opinion pieces: What are universities for and how do they work? by Keith Devlin, Everything must be measured: how mimicking business taints universities by Jonathan Wolff, and Universities are broke. So let’s cut the pointless admin and get back to teaching by André Spicer.

Devlin focused largely on the role of research within maths departments. The most interesting part, for me, came at the end when he talked about universities as communities and learning occurring “primarily by way of interpersonal interaction in a community”. Even without thinking about research outputs, there is value then in having a rich and varied community with faculty who have deep love and enthusiasm for their subject.

Wolff provided a clear and compelling dissection of how current educational policy is creating incentives that work against community-mindedness (both within and between universities). This is detrimental to the education sector, which is such a significant part of the UK economy.

Spicer provides an insight into how this feels as an academic. He talks about how “In the UK, two thirds of universities now have more administrators than they do faculty staff”, and describes academics as “drowning in shit” (pointless admin).

For me, Spicer’s solutions for what universities could do to change this weren’t so compelling. If I could change one thing I would look at how we cost (or fail to cost) academic staff time. Academics can feel that they are expected to just do any amount of work they are given, or at least they often have no clear divide between work and not-work and have to constantly negotiate their time.

Amy didn’t read, but watched Why mayors should rule the world – Benjamin Barber – and would highly recommend it. Our modern democracy revolves around ancient institutions – we elect leaders most of us never meet and feel like we have very little input into the democratic process. This isn’t the case in cities – the leaders of cities, mayors, are seldom from anywhere other than the city they look after. They went to the local schools, they use the public transport and hospitals – they’ve watched their city grow. They have a vested interest in improving it. Positive changes towards existential issues such as climate change and terrorism are happening in cities (he gives the example of the LA port, which, after a clean-up initiative, reduced the city’s overall emissions by 20%), and something can be learnt from the way that they operate. There are networks of mayors across the world, with a sense of competitiveness between them as to who can be the best city. Mayors from different cities meet up and share their practices, helping other cities implement changes using best practice, without the bureaucracy of central government slowing change down.

Suzanne watched What are universities for?, the RSA talk by Professor Stefan Collini and Professor Paul O’Prey. The second half of the video was more focused on the way that higher tuition fees have changed the nature of the relationship between universities and students, but the introduction to the talk was much more on the topic we were discussing today. Stefan Collini began by saying that he believes universities are partially protected spaces which prioritise ‘deepening human understanding’, and that there are few if any other places where this happens. He compared them to other organisations which do research, such as R&D departments in industry, or teams working in politics, but said the difference was that universities were able to follow second order enquiries, and look at the boundaries of topics and knowledge, as they didn’t have a primary purpose of furthering one particular thing or ideal. So, although there are many benefits, such as increased GDP, from the kind of enquiry universities do (the ‘deepening of human understanding’ he started out with), that isn’t their aim or goal. He also went on to say that although we tend to see universities as primarily for the benefit of the individual students (furthering their careers, developing their own skills and knowledge) they should be seen as providing public good as well, for the reasons outlined above. In the group we discussed his basic premise, that universities are ‘protected spaces’, and decided that we aren’t sure that is really the case (especially with so much research being funded by grants from industry). However, it did lead to an interesting discussion about what we feel universities are actually for, if they aren’t what Collini outlined.

Suggested reading

Video – notes from the reading group

Hannah read ‘Motivation and Cognitive Strategies in the Choice to Attend Lectures or Watch them Online‘ by John N. Bassili. It was quite an in-depth study, but the main points were:

  • The notion of watching lectures online gets a positive reaction from those who enjoy the course and find it important, but also from those who don’t want to learn in interaction with peers and aren’t inclined to monitor their learning.
  • From the above groups, the first group is likely to watch lectures online in addition to attending them face-to-face, whereas the second group are likely to replace face-to-face interaction with online study.
  • The attitude towards watching lectures online is related to motivation (ie. those who are motivated to do the course anyway are enthusiastic about extra learning opportunities), whereas the actual choice to watch them is related to cognitive strategies.
  • There is no demonstrable relation between online lecture capture and exam performance, but often the locus of control felt by students is marginally higher if they have the option to access lectures online.

Amy recommended Lifesaver (Flash required) as an amazing example of how interactive video can be used to teach.

Suzi read three short items which led me to think about what video is good for. Themes that came up repeatedly were:

  • People look to video to provide something more like personal interaction and (maybe for that reason) to motivate students.
  • Videos cannot be skimmed – an important (and overlooked) difference compared to text.

The first two items were case studies in the use of video to boost learning, both in the proceedings of ASCILITE 2016.

Learning through video production – an instructional strategy for promoting active learning in a biology course, Jinlu Wu, National University of Singapore. Aim: enhance intrinsic motivation by ensuring autonomy, competence, and relatedness. Student video project in (theoretically) cross-disciplinary teams. Select a cell / aspect of a cell, build a 3D model, make a short video using the model and other materials, write a report on the scientific content and rationale for the video production. Students did well, enjoyed it, felt they were learning and seem to have learned more. Interesting points:

  • Students spent much longer on it than they were required to
  • Nearly 400 students on the module (I would like to have heard more about how they handled the marking)

Video-based feedback: path toward student-centred learning, Cedomir Gladovic, Holmesglen Institute. Aim: increase the student’s own motivation and enhance the possibility for self-assessment and reflection. They want to promote the idea of feedback as a conversation. Tutor talking over the student’s online submission (main image) with webcam (corner image). Students like it but a drawback is that they can’t skim feedback. Interesting points:

  • How would tutors feel about this?
  • Has anyone compared webcam / no webcam?
  • Suggested video length <5 mins if viewed on smartphone, <10 mins if viewed on monitor

Here’s a simple way to boost your learning from videos: the “prequestion” looks at whether students remember more – both about the specific questions asked and more generally – when they are given prequestions before a short video. The answer seems to be yes on both counts. The authors thought that prequestions were particularly useful for videos because students can’t easily skim through to just those topics.

Roger read “Using video in pedagogy”, an article from the Columbia University Center for Teaching and Learning.

The article primarily focuses on the use of video as a tool for teacher reflection. The lecturer in question teaches Russian and was being observed. As she teaches in the target language, which her observer didn’t speak, her original motivation was to make the recording and then talk the observer through what was happening. In actual fact she discovered additional benefits she had not envisaged. For example, she was able to quantify how much time she was speaking compared to the students (an important objective is to get students speaking as much as possible in the target language, and the teacher less). Secondly, she could analyse and reflect on student use of the vocabulary and structures they had been taught. Thirdly, it helped her to reflect on her own “quirks and mannerisms” and how these affected students. Finally, the video provided evidence that actually contradicted her impressions of how an activity had gone. At the time she had felt it didn’t go well, but on reviewing the video afterwards she saw that it had been effective.

Suggested reading

Evidence – notes from the reading group

Suzi read Real geek: Measuring indirect beneficiaries – attempting to square the circle? from the Oxfam Policy & Practice blog. I was interested in the parallels with our work:

  • They seek to measure indirect beneficiaries of their work
  • Evaluation is used to improve programme quality (rather than organisational accountability)
  • In both cases there’s a pressure for “vanity metrics”
  • The approaches they talk about sound like an application of “agile” to fundamentally non-technological processes

The paper is written at an early point in the process of redesigning their measurement and evaluation of influencing. Their aim is to improve the measurement of indirect beneficiaries at different stages of the chain, adjust plans, and “test our theory of change and the assumptions we make”. Evaluation is different when you are a direct service provider than when you are a “convenor, broker or catalyst”. They are designing an evaluation approach that will be integrated into the day-to-day running of any initiative – there’s a balance between rigour and the amount of work needed to make it happen.

The approach they are looking at – which is something that came up in a number of the papers other people read – is sampling: identifying groups of people who they expect their intervention to benefit and evaluating it for them.

Linked to from this paper was Adopt adapt expand respond – a framework for managing and measuring systemic change processes. This paper presents a set of reflection questions (and gives some suggested measures) which I can see being adapted for an educational perspective:

  • Adopt – If you left now, would partners return to their previous way of working?
  • Adapt – If you left now, would partners build upon the changes they’ve adopted without us?
  • Expand – If you left now, would pro-poor outcomes depend on too few people, firms, or organisations?
  • Respond – If you left now, would the system be supportive of the changes introduced (allowing them to be upheld, grow, and evolve)?

Roger read “Technology and the TEF” from the 2017 Higher Education Policy Institute (HEPI) report “Rebooting learning for the digital age: What next for technology-enhanced higher education?”.

This looks at how TEL can support the three TEF components, which evidence teaching excellence.

For the first TEF component, teaching quality, the report highlights the potential of TEL in increasing active learning, employability (especially the development of digital capabilities), formative assessment, different forms of feedback and EMA generally, and personalisation. In terms of evidence for how TEL is making an impact in these areas, HEPI emphasises the role of learning analytics.

For the second component, learning environment, the report focusses on access to online resources, the role of digital technologies in disciplinary research-informed teaching, and again learning analytics as a means to provide targeted and timely support for learning. In terms of how to gather reliable evidence it mentions the JISC student digital experience tracker, a survey which is currently being used by 45 HE institutions.

For the third component, student outcomes and learning gain, the report once again highlights student digital capabilities development whilst emphasising the need to support development of digitally skilled staff to enable this. It also mentions the potential of TEL in developing authentic learning experiences, linking and networking with employers and showcasing student skills.

The final part of this section of the report covers innovation in relation to the TEF. It warns that “It would be a disaster” if the TEF stifled innovation and increased risk-averse approaches in institutions. It describes the inclusion of ‘impact and effectiveness of innovative approaches, new technology or educational research’ in the list of possible examples of additional evidence as a “welcome step” (see Year 2 TEF specification, Table 8).

Mike read Sue Watling – TEL-ing tales: where is the evidence of impact and In defence of technology by Kerry Pinny. These blog posts reflect on an email thread started by Sue Watling, in which she asked for evidence of the effectiveness of TEL. The evidence is needed if we are to persuade academics of the need to change practice. In response, she received lots of discussion, including what she perceived to be some highly defensive posts. The responses contained very little by way of well-researched evidence. Watling, after Jenkins, ascribes ‘Cinderella status’ to TEL research, which I take to mean based on stories rather than fact. She acknowledges the challenges of reward, time and space for academics engaged with TEL. She nevertheless makes a plea that we be reflective in our practice and look to gather a body of evidence we can use in support of the impact of TEL. Watling describes some fairly defensive responses to her original post (including the blog post from James Clay that Hannah read for this reading group). By contrast, Kerry Pinny’s post responds to some of the defensiveness, agreeing with Watling – if we can’t defend what we do with evidence, then this in itself is evidence that something is wrong.

The problem is clear; how we get the evidence is less clear. One point from Watling that I think is pertinent is that it is not just TEL research, but HE pedagogic research as a whole, that lacks evidence and has ‘Cinderella status’. Is it then surprising that TEL HE research, as a subset of HE pedagogic research, reflects the lack of proof and rigour? This may in part be down to the lack of research funding. As Pinny points out, it is often the case that the school or academic has little time to evaluate their work with rigour. I think it also relates to the nature of TEL as a set of tools or enablers of pedagogy, rather than a singular approach or set of approaches. You can use TEL to support a range of pedagogies, both effective and non-effective, and a variety of factors will affect its impact. Additionally, I think it relates to the way Higher Education works – the practice, and what evidence results from it, tends to be very localised, for example to a course, teacher or school. Drawing broader conclusions is much, much harder. A lot of the evidence is at best anecdotal. That said, in my experience, anecdotes (particularly from peers) can be as persuasive as research evidence in persuading colleagues to change practice (though I have no rigorous research to prove that).

Suzanne read Mandernach, J. (2015), “Assessment of Student Engagement in Higher Education: A Synthesis of Literature and Assessment Tools”, International Journal of Learning, Teaching and Educational Research, Vol. 12, No. 2, pp. 1-14, June 2015.

This text was slightly tangential, as it didn’t discuss the ideas behind evidence in TEL specifically, but was a good example of an area in which we often find it difficult to find or produce meaningful evidence to support practice. The paper begins by recognising the difficulties in gauging, monitoring and assessing engagement as part of the overall learning experience, despite the fact that engagement is often discussed within HE. Mandernach goes back to the idea of ‘cognitive’, ‘behavioural’ and ‘affective’ criteria for assessing engagement, particularly related to Bowen’s ideas that engagement happens with the learning process, the object of study, the context of study, and the human condition (or service learning). Interestingly for our current context of building MOOC-based courses, a lot of the suggestions for how these engagement types can be assessed are mainly classroom-based – for example, the teacher noticing the preparedness of the student at the start of a lesson, or the investment they put into their learning. On a MOOC platform, where there is little meaningful interaction on an individual level between the ‘educator’ and the learner, this clearly becomes more difficult to monitor, and self-reporting becomes increasingly important. In terms of how to go about measuring and assessing engagement, student surveys are discussed – such as the Student Engagement Questionnaire and the Student Course Engagement Questionnaire. The idea of experience sampling – where a selection of students are asked at intervals to rate their engagement at that specific time – is also discussed as a way of measuring overall flow of engagement across a course, which may also be an interesting idea to discuss for our context.

Suggested reading