Tool for creating ‘Deep Links’ to a Blackboard Course/Organisation

Blackboard Deep Link Generator

A frequent question we’re asked at this time of year is ‘How can I link directly to a Blackboard Course?’

It’s something that can be done, but simply sending out a course’s URL will only work if the recipient is already logged in/authenticated. Often the use case for sending out direct links means it’s really unlikely the user will be logged in, so they need to be prompted. Sending out an email that says ‘log into Blackboard and then come back and click this link’ is obviously a nonsense.

This is where ‘deep linking’ comes into play. By amending the URL and adding a bit of extra text at the beginning you can create a link that forces the user to sign in/authenticate and then redirect them to the desired course, bypassing the Blackboard Home page. Unfortunately processing a Bb URL in this way can feel like you’re dabbling in the dark arts.

We were asked about this yesterday, so I’ve re-purposed one of the tools I used to use to do it all for you. Click the link above to launch.

Note: This will only work for University of Bristol URLs, so external visitors will need to view source and amend it themselves – you should only need to change the ‘prefix’ variable to point at your own institution’s login page.
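For anyone amending the tool, the general shape of the trick is simple: prepend your institution’s Blackboard login URL and pass the course address as a URL-encoded redirect parameter. The sketch below is illustrative rather than the tool’s exact code – the `new_loc` parameter name and the Bristol prefix are assumptions based on common Blackboard setups, so check the address bar of your own institution’s login page:

```javascript
// Sketch of deep-link construction, assuming a typical Blackboard login
// endpoint that accepts a "new_loc" redirect parameter. Both the prefix
// and the parameter name may differ at your institution.
function makeDeepLink(courseUrl) {
  // Login page prefix – change this to your own institution's login page
  const prefix = "https://www.ole.bris.ac.uk/webapps/login/?new_loc=";
  // Strip the scheme and host so only the path (plus query string) remains,
  // then URL-encode it so it survives as a query parameter value.
  const path = courseUrl.replace(/^https?:\/\/[^\/]+/, "");
  return prefix + encodeURIComponent(path);
}
```

Pasting a course URL in and mailing out the result means the recipient is prompted to log in first, then redirected straight to the course.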

Designing and Evaluating Accessible Learning Experiences 

I attended an Accessibility in Education workshop in London last Wednesday (24th May).  Lisa Taylor-Sayles and Dr Eric Jensen presented two strands – design and evaluation.

Lisa’s presentations covered designing for accessibility and inclusivity. Her focus was on Universal Design for Learning (UDL): designing learning experiences fit for everyone, regardless of their needs.

She began with a potted history: WW2 veterans returning with disabling injuries prompted changes in approaches to town planning, infrastructure, and assistive technology. The principles of Universal Design that emerged later developed into the Three Principles for UDL.

Read more about the Seven Principles for Universal Design here.

Read more about the Three Principles for Universal Design for Learning here.

Lisa gave practical advice on small changes that build up to improve learning experiences. Simple steps, such as using a tool to check colours, benefit everyone. Rather than assuming you’ll retrofit for accessibility if needed, you build inclusivity in from the start.

This approach dovetailed well with Eric’s strand, which focussed on effective evaluation. He covered formative evaluation, surveys, and qualitative methods such as Empowerment Evaluation. Eric gave real-life examples of where he’s used these approaches in his work (and what works and what doesn’t).

Sessions and workshops alternated between the two strands, keeping it fresh. For me this also helped cement the idea that evaluation needs to be core to any learning design. I would definitely recommend the event if it’s repeated.

https://www.methodsforchange.org/designing-evaluating-accessible-learning-experiences/

Video – notes from the reading group

Hannah read ‘Motivation and Cognitive Strategies in the Choice to Attend Lectures or Watch them Online‘ by John N. Bassili. It was quite an in-depth study but the main points were:

  • The notion of watching lectures online gets a positive reaction from those who enjoy the course and find it important, but also from those who don’t want to learn in interaction with peers and aren’t inclined to monitor their learning.
  • From the above groups, the first group is likely to watch lectures online in addition to attending them face-to-face, whereas the second group are likely to replace face-to-face interaction with online study.
  • The attitude towards watching lectures online is related to motivation (ie. those who are motivated to do the course anyway are enthusiastic about extra learning opportunities), whereas the actual choice to watch them is related to cognitive strategies.
  • There is no demonstrable relation between online lecture capture and exam performance, but often the locus of control felt by students is marginally higher if they have the option to access lectures online.

Amy recommended Lifesaver (Flash required) as an amazing example of how interactive video can be used to teach.

Suzi read three short items which led me to think about what video is good for. Themes that came up repeatedly were:

  • People look to video to provide something more like personal interaction and (maybe for that reason) to motivate students.
  • Videos cannot be skimmed – an important (and overlooked) difference compared to text.

The first two items were case studies in the use of video to boost learning, both in the proceedings of ASCILITE 2016.

Learning through video production – an instructional strategy for promoting active learning in a biology course, Jinlu Wu, National University of Singapore. Aim: enhance intrinsic motivation by ensuring autonomy, competence, and relatedness. Students worked on a video project in (theoretically) cross-disciplinary teams: select a cell or aspect of a cell, build a 3D model, make a short video using the model and other materials, and write a report on the scientific content and the rationale for the video production. Students did well, enjoyed it, felt they were learning, and seem to have learned more. Interesting points:

  • Students spent much longer on it than they were required to
  • Nearly 400 students on the module (I would like to have heard more about how they handled the marking)

Video-based feedback: path toward student-centred learning, Cedomir Gladovic, Holmesglen Institute. Aim: increase students’ own motivation and enhance the possibility for self-assessment and reflection. They want to promote the idea of feedback as a conversation. The tutor talks over the student’s online submission (main image) with a webcam view (corner image). Students like it, but a drawback is that they can’t skim feedback. Interesting points:

  • How would tutors feel about this?
  • Has anyone compared webcam / no webcam?
  • Suggested video length <5 mins if viewed on smartphone, <10 mins if viewed on monitor

Here’s a simple way to boost your learning from videos: the “prequestion” tests whether students given prequestions before a short video remember more, both about those specific questions and more generally. The answer seems to be yes on both counts. The authors thought prequestions were particularly useful for videos because students can’t easily skim through to just those topics.

Roger read “Using video in pedagogy”, an article from the Columbia University Center for Teaching and Learning.

The article primarily focuses on the use of video as a tool for teacher reflection. The lecturer in question teaches Russian and was being observed. As she teaches in the target language, which her observer didn’t speak, her original motivation was to make the recording and then talk the observer through what was happening. In actual fact she discovered additional benefits she had not envisaged. For example, she was able to quantify how much time she was speaking compared to the students (as an important objective is to get students speaking as much as possible in the target language, and the teacher less). Secondly, she could analyse and reflect on student use of the vocabulary and structures they had been taught. Thirdly, it helped her to reflect on her own “quirks and mannerisms” and how these affected students. Finally, the video provided evidence that actually contradicted her impressions of how an activity had gone. At the time she had felt it didn’t go well, but on reviewing the video afterwards she saw that it had been effective.

Suggested reading

Evidence – notes from the reading group

Suzi read Real geek: Measuring indirect beneficiaries – attempting to square the circle? from the Oxfam Policy & Practice blog. I was interested in the parallels with our work:

  • They seek to measure indirect beneficiaries of their work
  • Evaluation is used to improve programme quality (rather than organisational accountability)
  • In both cases there’s a pressure for “vanity metrics”
  • The approaches they talk about sound like an application of “agile” to fundamentally non-technological processes

The paper is written at an early point in the process of redesigning their measurement and evaluation of influencing. Their aim is to improve the measurement of indirect beneficiaries at different stages of the chain, adjust plans, and “test our theory of change and the assumptions we make”. Evaluation is different when you are a direct service provider than when you are a “convenor, broker or catalyst”. They are designing an evaluation approach that will be integrated into the day-to-day running of any initiative – there’s a balance between rigour and the amount of work needed to make it happen.

The approach they are looking at – which is something that came up in a number of the papers other people read – is sampling: identifying groups of people who they expect their intervention to benefit and evaluating it for them.

Linked to from this paper was Adopt adapt expand respond – a framework for managing and measuring systemic change processes. This paper presents a set of reflection questions (and gives some suggested measures) which I can see being adapted for an educational perspective:

  • Adopt – If you left now, would partners return to their previous way of working?
  • Adapt – If you left now, would partners build upon the changes they’ve adopted without you?
  • Expand – If you left now, would pro-poor outcomes depend on too few people, firms, or organisations?
  • Respond – If you left now, would the system be supportive of the changes introduced (allowing them to be upheld, grow, and evolve)?

Roger read “Technology and the TEF” from the 2017 Higher Education Policy Institute (HEPI) report “Rebooting learning for the digital age: What next for technology-enhanced higher education?”.

This looks at how TEL can support the three TEF components, which evidence teaching excellence.

For the first TEF component, teaching quality, the report highlights the potential of TEL in increasing active learning, employability especially digital capabilities development, formative assessment, different forms of feedback and EMA generally, and personalisation. In terms of evidence for knowing how TEL is making an impact in these areas HEPI emphasises the role of learning analytics.

For the second component, learning environment, the report focusses on access to online resources, the role of digital technologies in disciplinary research-informed teaching, and again learning analytics as a means to provide targeted and timely support for learning. In terms of how to gather reliable evidence it mentions the JISC student digital experience tracker, a survey which is currently being used by 45 HE institutions.

For the third component, student outcomes and learning gain, the report once again highlights student digital capabilities development whilst emphasising the need to support development of digitally skilled staff to enable this. It also mentions the potential of TEL in developing authentic learning experiences, linking and networking with employers and showcasing student skills.

The final part of this section of the report covers innovation in relation to the TEF. It warns that “it would be a disaster” if the TEF stifled innovation and increased risk-averse approaches in institutions. It describes the inclusion of ’impact and effectiveness of innovative approaches, new technology or educational research’ in the list of possible examples of additional evidence as a “welcome step” (see Year 2 TEF specification, Table 8).

Mike read Sue Watling – TEL-ing tales: where is the evidence of impact? and In defence of technology by Kerry Pinny. These blog posts reflect on an email thread started by Sue Watling in which she asked for evidence of the effectiveness of TEL. The evidence is needed if we are to persuade academics of the need to change practice. In response, she received lots of discussion, including what she perceived to be some highly defensive posts. The responses contained very little by way of well-researched evidence. Watling, after Jenkins, ascribes ‘Cinderella status’ to TEL research, which I take to mean based on stories rather than fact. She acknowledges the challenges of reward, time and space for academics engaged with TEL. She nevertheless makes a plea that we are reflective in our practice and look to gather a body of evidence we can use in support of the impact of TEL. Watling describes some fairly defensive responses to her original post (including the blog post from James Clay that Hannah read for this reading group). By contrast, Kerry Pinny’s post responds to some of the defensiveness, agreeing with Watling – if we can’t defend what we do with evidence, then this in itself is evidence that something is wrong.

The problem is clear; how we get the evidence is less clear. One point from Watling that I think is pertinent is that it is not just TEL research, but HE pedagogic research as a whole, that lacks evidence and has ‘Cinderella status’. Is it then surprising that TEL HE research, as a subset of HE pedagogic research, reflects the lack of proof and rigour? This may in part be down to the lack of research funding. As Pinny points out, it is often the case that the school or academic has little time to evaluate their work with rigour. I think it also relates to the nature of TEL as a set of tools or enablers of pedagogy, rather than a singular approach or set of approaches. You can use TEL to support a range of pedagogies, both effective and ineffective, and a variety of factors will affect its impact. Additionally, I think it relates to the way Higher Education works – practice, and the evidence that results from it, tends to be very localised, for example to a course, teacher or school. Drawing broader conclusions is much, much harder. A lot of the evidence is at best anecdotal. That said, in my experience, anecdotes (particularly from peers) can be as persuasive as research evidence in persuading colleagues to change practice (though I have no rigorous research to prove that).

Suzanne read Mandernach, J. (2015), “Assessment of Student Engagement in Higher Education: A Synthesis of Literature and Assessment Tools“, International Journal of Learning, Teaching and Educational Research, Vol. 12, No. 2, pp. 1-14, June 2015.

This text was slightly tangential, as it didn’t discuss the ideas behind evidence in TEL specifically, but was a good example of an area in which we often find it difficult to find or produce meaningful evidence to support practice. The paper begins by recognising the difficulties in gauging, monitoring and assessing engagement as part of the overall learning experience, despite the fact that engagement is often discussed within HE. Mandernach goes back to the idea of ‘cognitive’, ‘behavioural’ and ‘affective’ criteria for assessing engagement, particularly related to Bowen’s ideas that engagement happens with the learning process, the object of study, the context of study, and the human condition (or service learning). Interestingly for our current context of building MOOC-based courses, a lot of the suggestions for how these engagement types can be assessed are mainly classroom-based – for example the teacher noticing the preparedness of the student at the start of a lesson, or the investment they put into their learning. On a MOOC platform, where there is little meaningful interaction on an individual level between the ‘educator’ and the learner, this clearly becomes more difficult to monitor, and self-reporting becomes increasingly important. In terms of how to go about measuring and assessing engagement, student surveys are discussed – such as the Student Engagement Questionnaire and the Student Course Engagement Questionnaire. The idea of experience sampling – where a selection of students are asked at intervals to rate their engagement at that specific time – is also discussed as a way of measuring the overall flow of engagement across a course, which may also be an interesting idea to discuss for our context.

Suggested reading

Horizon Report 2017 – notes from the reading group

This time we looked at the NMC Horizon Report 2017 and related documents.

Amy read ‘Redesigning learning spaces’ from the 2017 Horizon report and ‘Makerspaces’ from the ELI ‘Things you should know about’ series. The key points were:

  • Educational institutions are increasingly adopting flexible and inclusive learning design and this is extending to physical environments.
  • Flexible workspaces, with access to peers from other disciplines, experts and equipment, reflect real-world work and create social environments that foster cross-discipline problem-solving.
  • For projects created in flexible environments to be successful, the facilitator must allow the learners to shape the experience – much of the value of a makerspace lies in its informal nature, with learning being shaped by the participants rather than the facilitator.
  • There are endless opportunities for collaboration with makerspaces, but investment – both financial and strategic – is essential for successful projects across faculties.

Roger read ‘Blended Learning Designs’. This is listed as a short-term key trend, driving ed tech adoption in HE for the next one to two years. It claims that the potential of blended learning is now well understood, that blended approaches are widely used, and that the focus has moved to evaluating impact on learners. It suggests that the most effective uses of blended learning are in contexts where students can do something which they would not otherwise be able to do, for example via VR. In spite of highlighting this change in focus, it provides little detailed evidence of impact in the examples mentioned.

Suzi read the sections on Managing Knowledge Obsolescence (which seemed to be around how we in education can make the most of / cope with rapidly changing technology) and Rethinking the Role of Educators. Interesting points were:

  • Educators as guides / curators / facilitators of learning experiences
  • Educators need time, money & space to experiment with new technology (and gather evidence), as well as people with the skills and time to support them
  • HE leaders need to engage with the developing technology landscape and build infrastructure that supports technology transitions

Nothing very new, and I wasn’t sure about the rather business-led examples of how the role of university might change, but still a good provocation for discussion.

Hannah read ‘Achievement Gap’ from the 2017 Horizon Report. It aimed to talk about the disparity in enrolment and performance between student groups, as defined by socioeconomic status, race, ethnicity and gender, but only really tackled some of these issues. The main points were:

  • Overwhelming tuition costs and a ‘one size fits all’ approach of Higher Education is a problem, with more flexible degree plans being needed. The challenge here is catering to all learners’ needs, as well as aligning programmes with deeper learning outcomes and 21st century problems.
  • A degree is becoming increasingly vital for liveable wages across the world, with even manufacturing jobs increasingly requiring post secondary training and skills.
  • There has been a growth in non-traditional students, with online or blended offerings and personalised and adaptive learning strategies being implemented as a retention solution.
  • Some universities across the world have taken steps towards developing more inclusive offerings: Western Governors University offers competency-based education where students develop concrete skills relating to specific career goals; Norway, Germany and Slovenia offer free post-secondary education; under the Obama administration, students were enabled to secure financial aid three months earlier to help them make smarter enrolment decisions; in Scandinavian countries, there is a lot of flexibility in transferring to different subjects, something that isn’t widely accepted in the UK but could help to limit the drop-out rate.
  • Some countries are offering different routes to enrolment in higher education. An example of this is Australia’s Fast Forward programme, which provides early information to prospective students about alternative pathways to tertiary education, even if they have not performed well in high school. Some of these alternative pathways include online courses to bridge gaps in knowledge, as well as the submission of e-portfolios to demonstrate skills gained through non-formal learning.
  • One thing I thought the article didn’t touch on was the issue of home learning spaces for students. Some students will share rooms and IT equipment, or may not have access to the same facilities as others.

Learning Design Cross Institutional Network #5

On Friday 24th February, my colleague Hannah and I ventured up to the University of Northampton to attend the fifth Learning Design Cross Institutional Network (LDCIN) event. The LDCIN was formed in 2015, with colleagues from a number of institutions across the world taking part in discussions about learning design in education.

The event began with an introduction from Simon Walker, who heads up the Educational Development team at the University of Greenwich. He discussed the future of learning design; the increased interest with the introduction of the TEF, and the impact big data will have on how we design our courses, briefly touching on the report the Open University have recently published on data analytics and learning design (see below for more information).

Participants who had offered to give a ten-minute overview of their work were then invited to deliver. This session started with Nataša Perović, from UCL, who discussed the ABC model for learning design – a process my colleague Hannah had successfully used the day before for her Bristol Futures enrichment course! This model is becoming widely adopted as a method to develop course design, with participants only having to attend a 90-minute to two-hour session for a complete overhaul of their module. More on this method can be found here.

We then had three further presentations from colleagues across the country. Fiona Hale from the University of Edinburgh presented their new model for learning design, which (she admits) very closely resembles the CAIeRO model, created at Northampton. Adele Gordon from Falmouth discussed their development as a learning design team, and how their focus was on employability above anything else – a method that will hopefully be increasingly adopted across the sector.

Finally, Tom Olney and Jitse van Ameijde from the Open University talked about their work on data analytics, retention and satisfaction. They have created a model for designing activities that support high retention and success (the ‘ICEBERG’ model) and have discovered some interesting trends. For example, students report higher satisfaction on courses with fewer collaborative activities, yet their ‘success’ (in terms of retention, meeting learning outcomes and grades) is lower; conversely, more collaborative activities meant lower satisfaction but much higher success rates. The report on designing for retention can be found here. This report will have the biggest impact where universities offer online-only courses, where retention is lower than on traditional courses.

These sessions were followed by a tea break (no biscuits provided!) and then a session from Jisc’s Ruth Drysdale, who posed the ‘wicked’ question of how to evidence the impact of TEL – a question that was best answered by Jitse van Ameijde, who simply said ‘You can’t – and shouldn’t.’ The impact of TEL should only be measured by learning success as a whole, rather than how technology has impacted on learning. This focus on successful learning rather than the impact of various technologies was a key theme throughout the morning, and potentially the focus of the next LDCIN meeting.

Next on the agenda was a session from our hosts, which asked us to answer the question ‘How do you solve a problem like Waterside?’ Waterside is a new university campus being built in the heart of Northampton but, unlike a traditional campus, Waterside will have no lecture theatres – teaching will take place online and via small-group or one-to-one tutorials. All course programmes (over 2,000 of them!) have to be redesigned to fit the new teaching style, which also means that the minds of all academics will have to be won over to face this new and radical change. We were tasked with deciding on the best way to motivate staff to engage with this strategy, thinking about five key areas: grassroots campaigns, community, strategic, faculty-level and research-based.

Our group created three models to engage staff with the new teaching strategy. The first, and least desirable, was a ‘top-down’ model, where senior management forced staff to engage with workshops to redesign their courses. However, this would not be a positive change, and would leave staff feeling demotivated and uninspired. The second was a ‘hand-holding’ approach, where a great deal of resource was added to the learning design team to ensure each academic had a bespoke session to redesign their course, with a number of community groups set up to support staff and provide on-hand advice whenever it was needed. The final approach was champion-led, where each faculty had a self-elected ‘champion’ of learning design, who could create a buzz inside their faculty and be available to support staff at short notice. Technology Enhanced Learning has been engaged with most in departments that have appointed learning technologists, and we believe this model is key to success, especially when it comes to changing culture and mindset.

Unfortunately, we had to leave after this session, but judging from the Twitter feed the afternoon was also a success, with a workshop from Edinburgh’s Fiona Hale on mapping learning activities and another session from the Learning Design team at Northampton on evaluating learning design support.

In an ever-changing sector, it is essential colleagues working towards similar goals come together to share their experiences, methods and thoughts. I was especially inspired by colleagues at Northampton, who are leading the way in terms of a blended approach to education – I can’t wait to see how the challenge of Waterside works out. The LDCIN will be meeting again in the summer to discuss this and other projects taking place across the network. To keep up to date with the latest LDCIN updates, click here.

Flexible and inclusive learning – notes from reading group

Amy read: Why are we still using LMSs?, which discusses the reasons LMSs have not advanced dramatically since they came onto the market. The key points were:

  • There are five core features that all major LMSs share: they’re convenient; they offer a one-stop shop for all university materials, assessments and grades; they have many accessibility features built in; they’re well integrated with other institutional systems; and there is a great deal of training available for them.
  • Until a new system with all these features comes onto the market, the status quo with regard to LMS systems will prevail.
  • Instructors should look to use their current LMS in a more creative way.

Mike read: Flexible pedagogies: technology-enhanced learning HEA report

This paper provided a useful overview of flexible learning, including explanations of what it might mean and the dilemmas and challenges it poses for HE. The paper is interesting to consider alongside Bristol’s Flexible and Inclusive Learning paper. For the authors, flexible learning gives students choice in the pace, place and mode of their learning. This is achieved through pedagogical practice, with TEL positioned as an enabler or way of enhancing this practice. Pace is about schedules (faster or slower), or allowing students to work at their own pace. Place is about physical location and distance. Mode includes notions of distance and blended learning.

Pedagogies covered include personalised learning, flexible learning (which it suggests is similar to adaptive learning, in which materials adapt to individual progress), gamification, and fully online and blended approaches. The paper considers the implications of offering choice to students, for example over what kind of assessment. An idealised form would offer a very individualised choice of learning pathway, but with huge implications for stakeholders.

In the reading group, we had an interesting discussion as to whether students are always best equipped to understand and make such choices. We also wondered how we would resource the provision of numerous pathways. Other risks include the potential for information overload for students, and the challenge of ensuring systems and approaches work with quality assurance processes. Barriers include interpretations of KIS data, which favour contact time.

We would have a long way to go in achieving the idealised model set out here. Would a first step be to change the overall diet of learning approaches across a programme, rather than offering choice at each stage? Could we then introduce some elements of flexibility in certain areas of programmes, perhaps a bit like the Medical School’s Self Selected Components, giving students choice in a more manageable space within the curriculum?

Suzanne read:  Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. The main points were:

  • Self-regulated learning is something which happens naturally in HE, as students will assess their own work and give themselves feedback internally. This paper suggests this should be harnessed and built on in feedback strategies in HE.
  • Shift in focus to see students having a proactive rather than reactive role in feedback practices, particularly focused on deciphering, negotiating and acting on feedback.
  • The paper suggests seven principles for good feedback practice which encourage this self-regulation: 1. clarifying what good performance is; 2. facilitating self-assessment; 3. delivering high quality feedback information; 4. encouraging dialogue; 5. encouraging self-esteem and motivation; 6. giving opportunities to close the gap between where the student is now and where they need/want to be; 7. using feedback to improve teaching.
  • For our context, this gives some food for thought in terms of the limitations of a MOOC environment for establishing effective feedback practices (dialogue with every student is difficult if not impossible, for example), and emphasises the importance of scaffolding or training effective peer and self-assessment, to give students the confidence and ability to ‘close the gap’ for themselves.

Suzanne also read: Professional Development Through MOOCs in Higher Education Institutions: Challenges and Opportunities for PhD Students Working as Mentors

This paper reports on a small-scale (20 participants), qualitative study into the challenges and opportunities for PhD students acting as mentors in the FutureLearn MOOC environment. As a follow-on from the above reading, using mentors can be a way to help students with peer and self-assessment practices, which is why I decided to read it in parallel. However, it also focuses on the learning experiences of the PhD students themselves as they perform the mentor role, giving these students a different (potentially more flexible and inclusive) platform on which to develop skills.

Overall, the paper is positive about the experiences of PhD MOOC mentors, claiming that they can develop skills in various areas, including:

  • confidence in sharing their knowledge and interacting with people outside their own field (especially for early career researchers, who may not yet have established themselves as ‘expert’ in their field);
  • teaching skills, particularly related to online communication, the need for empathy and patience, and tailoring the message to a diverse audience of learners. It’s noteworthy here that many of these mentors had little or no teaching experience, so this is also about giving them teaching experience generally, not teaching in MOOCs specifically;
  • subject knowledge, as having to discuss with the diverse learning community (of expert and not expert learners) helped them consolidate their understanding, and in some cases pushed them to find answers to questions they had not previously considered.

Roger read Authentic and Differentiated Assessments

This is a guide aimed at School teachers. Differentiated assessment involves students being active in setting goals, including the topic, how and when they want to be evaluated. It also involves teachers continuously assessing student readiness in order to provide support and evaluate when students are ready to move on in the curriculum.

The first part of the article describes authentic assessment, which it defines as asking students to apply knowledge and skills to real world settings, which can be a powerful motivator for them. A four stage process to design authentic assessment is outlined.

The second part of the article focuses on differentiated assessment. We all have different strengths and weaknesses in how we best demonstrate our learning, and multiple and varied assessments can help accommodate these. The article stresses that choice is key, including of learning activity as well as assessment. Project and problem based learning are particularly useful.  Learning activities should always consider multiple intelligences and the range of students’ preferred ways of learning, and there should be opportunities for individual and group tasks as some students will perform better in one or the other.

Hannah read: Research into digital inclusion and learning helps empower people to make the best choices, a blog post by the Association for Learning Technology about bridging the gap between digital inclusion and learning technology. The main points were:

  • Britain is failing to exploit opportunities to give everyone fair and equal access to learning technology through not doing enough research into identifying the best way to tackle the problem of digital exclusion
  • Learning technology will become much more inclusive a way of learning once the digital divide is addressed
  • More must be done to ensure effective intervention; lack of human support and lack of access to digital technology are cited as two main barriers to using learning technology in a meaningful way
  • We need to broaden understanding of the opportunities for inclusion, look into how to overcome obstacles, develop a better understanding of the experiences felt by the excluded and understand why technological opportunities are often not taken up

Suzi read: Disabled Students in higher education: Experiences and outcomes, which discusses the experience of disabled students, based on surveys, analysis of results, interviews, and case studies at four relatively varied UK universities. Key points for me were:

  • Disability covers a wide range of types and severity of issues, but adjustments tend to be formulaic, particularly for assessment (e.g. 25% extra time in exams)
  • Disability is a problematic label; not all students who could identify as disabled will choose to do so
  • Universal design is the approach they would advocate where possible

Suzi also read: Creating Better Tests for Everyone Through Universally Designed Assessments, a paper written for the context of large-scale tests for US school students, which nonetheless contains interesting background and useful (if not earth-shattering) advice. The key messages are:

  • Be clear about what you want to assess
  • Only assess that – be careful not to include barriers (cognitive, sensory, emotional, or physical) in the assessment that mean other things are being measured
  • Apply basic good design and writing approaches – clear instructions, legible fonts, plain language


Assessment and Feedback: Transforming Curricula and Assessment in HE

On Thursday 2nd February I attended an event at the University of Bath entitled “Assessment and Feedback: Transforming Curricula and Assessment in HE.” There were many interesting sessions, of which the following were some of my personal highlights.

Dr Alex Buckley from the University of Strathclyde spoke about their use of the TESTA approach to reviewing assessment at a programme level. Funded by the HEA, the project, which stands for Transforming the Experience of Students Through Assessment, originally involved four partner institutions: Bath Spa, Chichester, Winchester and Worcester. It is now used in over 50 universities in an attempt to address deep challenges of assessment which, Alex stressed, need to be considered at programme level.

The TESTA website contains further information and resources on this approach, including a manual on how to implement it. Alex explained that it involves triangulating data from a programme audit, an assessment experience questionnaire for students, and focus groups. At Strathclyde, those programmes who have engaged with TESTA have found it an extremely useful diagnostic tool, as well as one that helps colleagues to think differently about assessment. After the process, programme teams have a workshop with educational developers where they consider practical changes that can be made to address the TESTA findings. The TESTA website contains case studies and best practice guides with concrete suggestions. An example is reducing reliance on formal documentation to communicate standards, and instead putting greater effort into providing exemplars in order to make explicit, and open to discussion, the meaning of assessment criteria, enabling students to internalise these through marking exercises and self- and peer assessment in relation to the exemplars.

Kay Sambell from Edinburgh Napier University expanded in the afternoon on Alex’s point that we need to facilitate student engagement with feedback rather than simply flagging up when feedback is being provided. However, both Kay and Jane Rand recognised that this can be easier said than done. The literature provided evidence of the effectiveness of this a decade ago, Jane said, but much practice hasn’t changed.

Kay went on to show some practical strategies that can be used. Her work is based on the Assessment Standards Knowledge Exchange at Oxford Brookes,  which provides a range of useful resources. She also referred to the work of Winstone, Nash, Parker & Rowntree (2016) entitled Supporting Learners’ Agentic Engagement With Feedback: A Systematic Review and a Taxonomy of Recipience Processes, which emphasises the importance of “proactive recipience” of feedback.

Kay talked about dialogic use of exemplars, which can take different forms; complete or part of an assignment, authentic or re-created, annotated with feedback or not. She went on to give an example of a peer review workshop from her own practice, the process for which is outlined in the photo on the right. Students really valued this opportunity for extended dialogue around assessment criteria.

Kay also referred to the work of Nicol, Thomson and Breslin on feedback production being recognised as just as valuable for learning as receipt of feedback. She recognised that students are sometimes reluctant to engage with engagement activities (such as peer review)! However, when they do engage they find them extremely useful, and she has found that exemplar assignments are highly effective as “vicarious peer assessment”. Kay mentioned the work of Carless and Chan on managing dialogic use of exemplars, which contains analysis of how teachers can orchestrate dialogue around exemplars. They suggest in the paper that “the dialogic use of exemplars should be a core aspect of teachers’ repertoire of assessment for learning strategies, in that the development of student skills in making academic judgements is fundamental to the university experience.” This is a point often made by D. Royce Sadler, well known for his work on the conditions necessary for students to benefit from feedback (Sadler, 1989). In his own teaching Sadler makes use of a version of exemplars in the peer review of formative writing his students do: he puts his own attempt at the writing task in with his students’, which are distributed and peer reviewed. Sadler describes this in more detail in “Opening up feedback: Teaching learners to see”.

References

Carless, D. & Chan, K. K. H. (2016). Managing dialogic use of exemplars. Assessment & Evaluation in Higher Education.

Nicol, D., Thomson, A. & Breslin, C. (2014). Rethinking feedback practices in higher education: a peer review perspective. Assessment & Evaluation in Higher Education, 39(1), 102-122.

Sadler, D. R. (2013). Opening up feedback: Teaching learners to see. In Merry, S., Price, M., Carless, D. & Taras, M. (Eds.), Reconceptualising Feedback in Higher Education: developing dialogue with students (pp. 54-63). London: Routledge.

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-144.

Winstone, N. E., Nash, R. A., Parker, M. & Rowntree, J. (2016). Supporting Learners’ Agentic Engagement With Feedback: A Systematic Review and a Taxonomy of Recipience Processes. Educational Psychologist.


How can we use ‘unconferencing’ to enrich the Bristol Futures experience?

I went to ‘a conference on unconferencing’ (which actually turned out to be an unconference on unconferencing) in Birmingham on 20th January. Having no preconceptions as to what an unconference was, I went with the aim of gathering some ideas around enrichment activities for Bristol Futures Workstream 5.

The day started with a presentation from one of the organisers, Daniel King, on why conferences matter. He identified three main reasons:

  • Career development; networking; status
  • Field configuring; knowledge exchange; gaining a common perspective
  • Cultural management; learning how to act within a certain field

Conferences often work around a set agenda and hierarchy; the few talking to the many. The audiences take a passive role and the conference is a ‘man in suit’ affair, reinforcing existing power relations.

The idea of an unconference is to challenge visible hierarchies in conventional conferences, encouraging participation and inclusivity. The participants set the agenda, and little involvement or facilitation is needed from the organisers.

So how did it work?

We sat around in a circle, with post-it notes and pens in the centre. There was an empty timetable on the wall, with locations on one side (‘middle bit’, ‘by the plant pot’, etc.). Everyone was given the opportunity to ‘pitch’ an idea for a session, which involved writing a short description, question or discussion point on the paper, reading it out to the group and posting it onto the timetable (there was a bit of negotiation involved here, particularly where ideas crossed over and became one session rather than two). The person who suggested the session ‘owns’ it, and is responsible for kickstarting the discussion as well as typing up and sharing the notes afterwards. I pitched a session on how we can tackle the issue of invisible hierarchies within unconferences and how, despite the focus on inclusion, a lack of structure can invite certain forms of elite, such as those with social confidence taking over the discussion.

The sessions began, and everything went really smoothly! Before starting, the organisers had let us know that unconferences operate on a ‘rule of two feet’, which means that if you feel that you are no longer contributing or getting anything from a conversation, you’re entitled to leave it and join another session whenever you like.

How can this work for Bristol Futures?

The unconference format could work really well as a Bristol Futures enrichment activity for all three of the online courses. The easy-going, non-hierarchical structure made for really interesting, balanced debates and conversations. Unconferences are designed to facilitate peer-to-peer learning, encourage less separation between different points of the hierarchy (from undergraduates through to academics), and have a focus on experience and views rather than status.

We could hold an unconference (maybe calling it something other than ‘unconference’) for each of the three pathways, encouraging students to pitch topics for discussion. This could be anything related to the content of a course (e.g. ‘how can an individual make an impact on a global level?’), the course design (e.g. ‘why I didn’t think week two worked well’), or something related to the overall theme that the course has not covered. The students would experience the unconference as an enrichment activity and an opportunity to connect with each other and collaborate in a meaningful way. For us as lead educators and learning designers, the unconference format could be used not only as an enrichment activity, but also as a way of using student insight to inform future iterations of the courses and make changes where needed.

An unconference would require little organisation beyond booking a room and providing stationery and simple guidance, and little resource in terms of facilitation, as students become facilitators through the unconference format. These events could be held to kick off the course run, as a ‘touch base’ point during the middle of the course run, to round off the courses in a meaningful and useful way, or all three.

The aim of the Bristol Futures enrichment courses is to equip students with the skills they need to be happy, well rounded, resourceful adults. Through participating in an unconference, students will develop many of the Bristol Attributes including:

  • Intellectual risk, through participating in discussions potentially outside one’s comfort zone
  • Active and self aware learning, through pitching suggestions and taking ownership of a session
  • Inquisitiveness and initiative, through discussing a topic and trying to find a solution
  • Collaboration, through working with others in the group to define and run the unconference
  • Influence and leadership, through motivating and directing others to invite effective contributions
  • Responsibility, through managing the sessions without too much input from the organisers

Unconferencing encourages students to interact in a respectful, innovative and democratic way and could make a really effective enrichment activity across all three pathways.

‘Formative language activities using technology’, a free event organised by the School of Modern Languages and the Centre for English and Foundation Studies

As a language teacher I am always interested in what other colleagues do around assessment and feedback practice so on the 19th of January I attended a free seminar organised by CELFS (Centre for English and Foundation Studies) and SML (School of Modern Languages) on ‘Formative language activities using technology’.

The seminar focused on strategies for engaging students with formative and summative feedback using a range of technologies both in and outside the language classroom.

I took away lots of good ideas but also a couple of questions that remain unanswered. First, are we now more inclined to the idea that best practice may require the use of multiple technologies rather than one solution for all? Second, how can we make the environment seamless to our students? And what about accessibility requirements?

My notes on the event

Engaging students with feedback. ‘I know I did not come to our feedback appointment but could you tell me what my mark is?’ Emilie Poletto’s first slide, showing a teacher snowed under a huge mountain of paper, is a great illustration of the issue; most of the time students tend to concentrate on the end product rather than on their learning process, but it is up to us to change this, says Emilie: ‘we need to change the role of the student from a consumer approach to a partnership’.
So the big question is: ‘What strategies can we use to rethink the way we give students formative feedback?’ It clearly requires more than a new shiny piece of technology. Maggie Boswell says the change must be driven by the learning process, not the means of delivery: ‘Some might argue for the use of technology to mark student work while others might argue for traditional methodologies. How students engage with their feedback and make subsequent progress is at the heart of my student-driven ongoing enquiry.’

Here are a few tips that teachers shared with the audience:

  • work with students on assessment criteria and engage them in collaborative learning activities. Give them the opportunity to identify their strengths and weaknesses and to own a plan for improving their competences.
  • ask students to identify specific features for formative feedback so that you can target both the quality and the amount of feedback you provide
  • use personalised feedback, eg video through Mediasite or any screencasting solution
  • use a variety of feedback formats, written, audio and video
  • provide student support throughout the whole process; they may not need help with using the technology, but with orientation, for example finding where they have to go to look at the feedback

A bit more from some of the individual presentations

Maggie Boswell uses a combination of different feedback formats, such as drop-in corrective written and voice comments, and a range of technologies like Turnitin Grademark and Mediasite. Turnitin Grademark allows her to annotate essays using both a reusable comment bank and voice recording features, while Mediasite desktop recorder allows her to create screencasts and add audio feedback.

With this combination of methods she provides feedback during TB1 over a twelve-week period, on an essay redraft and final draft. A couple of tips from Maggie on voice feedback: first, students engage more with this type of feedback because they hear a familiar voice; second, it is important to use the right tone and elaborate on some of the negative comments so that students don’t worry too much about a mistake that may be less serious than they think. ‘I really like the video feedback. At first when I saw “omit” (a Grademark drop-in written comment) I thought it was really bad, but when I watched the video, I realised it was not such a bad error because of the intonation.’ (from a student survey collected via Google forms).

Emilie Poletto’s presentation, ‘“Thanks for the feedback, but what is my mark?” How to help students engage with feedback’, was the one I liked most as it goes straight to the point: we spend lots of time providing formative feedback and then realise that students completely ignore it and only focus on the final mark. What can we do about this?

Emilie’s approach, inspired by the work of Alex Forsythe & Sophie Johnson as well as the work of other colleagues in the SML, focuses on ‘feedback action plans’ and student motivation. Each student gets an individual action plan to record specific areas of their learning, which are routinely discussed with the teacher during individual tutorials. The action plan puts the onus on the students to devise their own strategies, critically evaluate the feedback they are given, build on their strengths and address their weaknesses. Students may not be used to doing all of this at first, but they are more likely to engage if they feel they are in charge of the process and get good support from their teacher. Grades are only discussed at a later stage; in fact, Emilie doesn’t give students their marks until they have completed the action plan, which means students really have to focus on their learning first.

In terms of working with multiple technologies, I liked Jana Nahodilová’s presentation about using the best parts of Blackboard, Quizlet and Xerte to support assessment and feedback. Her approach to providing formative assessment is built on three main areas: an ongoing multi-phase daily process that takes place through teacher-pupil interaction; providing feedback for immediate action (for both student and teacher); and reflecting on how to modify teaching activities to improve learning (motivation) and results.
For each of these tools Jana has identified both advantages and considerations from a teacher’s perspective. Advantages include ‘easy to use and interactive’, ‘great for monitoring’, and ‘wide range of possible activities’, while some considerations are ‘little flexibility’, ‘complex set up’ and ‘lack of the functionalities required’.

More on the range of technologies on show

Blackboard: assessment engine available within Blackboard and fully supported at UoB

Xerte: online tutorial tool with a range of functionalities for assessment and feedback, fully supported at UoB

Quizlet: free online learning tool, particularly used for flashcards to support vocabulary learning

Mediasite: fully supported UoB lecture capture tool with a range of functionalities for editing videos and screencasting

Turnitin Grademark: grading tool fully supported at UoB, with a variety of functionalities for automated written feedback and voice feedback

Google forms: free and easy-to-use quiz tool available from individual Google accounts

Sonocent: audio note-taking software with a wide array of functionalities for feedback and assessment, such as visual annotations of text and audio

Many thanks to the presenters for sharing their work:

Maggie Boswell, English teacher (CELFS)
Emilie Poletto, French teacher (SML)
Jana Nahodilová, Czech teacher (SML)