Cutting a long story short – notes from a talk as part of BBC Digital Bristol Week

This was a panel discussion with Rowan Kerek Robertson (Taylor Kerek) chairing, Sam Bailey (online/video for BBC Radio 1), and Stephen Follows (Catsnake, a production house specialising in short videos often for campaigning charities).

There was discussion of using different platforms. For SB, for a content idea to be good it must be able to lead to something for all platforms: iPlayer, radio, social (Twitter, Facebook), and YouTube. SF and SB talked about the difference between video content on different platforms based on audience expectations:

  • iPlayer – generally about 30 minutes long, people sitting down to watch telly
  • YouTube – shorter, grabbier, but people are geared up to be watching something
  • Facebook – autoplay without sound, people who just want to see what’s going on

There was discussion about social sharing of content. Shares are often used as a metric, but should be used with caution. If you really want people to watch to the end, or to take action, you need to measure that. SF recommended the book Contagious which, among other things, lists the 5 emotions that cause sharing as: anger, anxiety, awe, excitement, and humour (in Radio 1 parlance – WTF, OMG, LOL). SF said that they’ve found the most successful way to get meaningful shares is to target people “who already care” via blogs. Sites like Buzzfeed might give you lots of people loading your video, but will they actually watch it?

There was interesting detail from SF on their production process. They start with an understanding of what their clients want: “who do you want to do what?”. From this they write a brief (eg “This film will get women aged 25-30 to share X because it will make them feel like Y”). Key performance indicators need to go in the brief and need to really reflect what the client is trying to achieve. They then have an ideas session with this brief visible. They don’t have a maximum length for videos (their greatest hit is 8 minutes). Digital allows you to be flexible: embrace that.

Testing has 3 stages.

  1. Informal focus group (friends, friends-of-friends) – just to get the feel of the demographic, not to test out ideas.
  2. Show the video to a few people from that group.
  3. Seeding (targeted YouTube views) to around 1-2k people.

This made it sound relatively light-touch and low-cost – great for higher education.

SF believes storytelling is a key way humans have passed on knowledge, so it is a fundamental driver. Knowledge sharing leads to a joy in storytelling (just as the need for food leads to appreciation of cuisine, and reproduction leads to sex being pleasurable). A storytelling technique is the “curiosity gap” – something that isn’t fulfilled until the end (not by tricking people, more like stringing out a joke so it gets more enjoyable the longer it goes on … and you know when to stop). Kony 2012 is an example of a video that uses this technique.

Relatedly, recent research suggests that men who tell good stories are seen as more attractive.

Games and gamification – notes from the reading group

Suzi read Do points, levels and leaderboards harm intrinsic motivation?

This study attempted to shed light on when and why common gamification techniques (points, levels, leaderboards) harm intrinsic motivation, as measured by the intrinsic motivation inventory (IMI). They found that, for this image-tagging task, intrinsic motivation was not harmed and the number of tags increased with all three interventions. They conclude that these techniques could be useful for some tasks. There are limitations, which the authors acknowledge: in this situation the points, levels and leaderboards don’t mean anything and don’t create stress, whereas in other situations they might well.

Suzi watched FOTE12: Nicola Whitton ‘What is the Future of Digital Games and Learning’. This was an interesting short talk, covering some good examples.

Whitton argues that a key idea from games that’s overlooked is play. She talks about the idea of creating a “magic circle” – a safe space to practice, have fun, and make mistakes. Her suggestions for considering gamification include: implement some mystery, do something unexpected, be playful, and create a safe space to make mistakes.

Chris read about the Reading Game from Macquarie. This is basically exactly the same as PeerWise, and appears to be defunct – probably because PeerWise has cornered the market. So, I then talked about my recent experiences of PeerWise. We’ve just used it with our first years, with mixed results: they didn’t engage as much as I would have liked, and many people only did the minimum required for credit. However, PeerWise contains a scoring system that rewards students for various kinds of participation, and some people have reported that using this to introduce an element of competition can motivate students to participate. So next year, rather than asking students to do a certain amount of work for credit, they will be asked to achieve a certain score. Watch this space…

Mike looked at Evoke, an online multiplayer game with grand ambitions to help people ‘change the world’ by collectively addressing problems. Elements seem relevant to HE and Bristol Futures in particular, whilst parts of the approach would (I suspect) alienate some potential participants. The idea of coming up with ‘Evokations’ (grand challenges people can respond to) has been used successfully elsewhere. The use of mentors to facilitate and prizes to incentivise seems sound. Evoke had a time-based (weekly) structure with people being drip-fed the stages, which reminded me of the Twelve Days of Twitter course. The thing that might be off-putting to some is the suggestion that people take on a superhero-like persona. The point-scoring part looked complicated, but may have worked to motivate some.

Roger read Lameras (2015) Essential Features of Serious Games Design in Higher Education. This paper provides some useful scaffolding for teachers thinking about using games or gamification techniques. Particularly useful were:

  • the game design planner, which provides some prompts for teachers considering using games, eg around learning outcomes, feedback, and the teacher’s role, as well as which types and characteristics of games might be most appropriate in the context, eg types of player choice and challenge, nature of any collaboration or competition, and rules
  • the mapping of learning attributes to game attributes, eg ways in which games can support information transmission, collaboration, and discussion. Key game attributes include rules, goals, choices, tasks, challenges, competition, collaboration, and feedback, which are evidenced in game features such as missions, puzzles, scoring, progress indicators, leaderboards, branching tasks, gaining/losing lives and team activities

It is evident from reading the paper that there is a strong overlap between game design and good learning design in general, for example in the importance of feedback, challenge, choice and social learning.

MOOCs: what have we learnt? – notes from the reading group

Steve read HEA: Liberating learning: experiences of MOOCs

MOOCs are increasing in popularity. Will this continue? Registrations, drop outs, completions. Will they disrupt HE?

10-person sample size, people who completed Southampton MOOC. Want to understand motivations, opportunities, problems. Discussed findings with five academics who taught/led it. Aware of small scale, so no recommendations – but reflections and suggestions.

Themes from findings:
1 Flexible, fascinating and free – can fit into lives, customise pace, no financial commitment.
2 Feeling part of something – social & international aspect, even for passive ‘lurkers’
3 Ways of learning – prefer sequential over dipping in/out.
4 A bit of proof? – cost sensitivity to purchasing accreditation. Only 1 wanted this.

Four-quadrant model for MOOC engagement, suggests stuff to include. Two axes:

  • personal enjoyment vs work/education
  • studying alone vs social learning

Steve also read What are MOOCs Good For?

MOOC boom and bust? High-profile implementation at San Jose failed, inc backlash from academics. Completion/dropout rates in general (SB: do we care about drop outs? Most are window shoppers). Experiments and options/opportunities are still expanding. In summary, more data needed but need to moderate expectations – still a place for innovation, also integrating with traditional teaching – take the best bits of both?

Roger read: Practical Guidance from MOOC Research: Students Learn by Doing

This is one of a series of blog posts by Justin Reich, who is Executive Director of the Teaching Systems Lab at MIT, which “investigates the complex, technology-mediated classrooms of the future and the systems we need to develop to prepare teachers for those classrooms.”
In this post from July 2015, Justin’s main point is that when developing MOOCs it is better for student learning to focus on the development of interactive activities as opposed to high-production videos. He mentions particularly the value of formative peer assessment, synchronous online discussion and simulations “that create learning experiences that students may not have in other contexts”.
If making videos, focus on the early parts of the course, as watching tends to drop off later in courses. There is some evidence that students prefer Khan Academy-type screencasts with pen animations rather than talking over slides.

Suzi read Why there are so many video lectures in online learning, and why there probably shouldn’t be

The article argues that video is expensive, particularly if you aim for higher production values (which many people do). Their methodology was a literature review, interviews with experts, and studying the use of video in over 20 MOOCs. There’s no evidence that video does (or doesn’t) work as a learning tool, and little or none that high production values add much. Learners wrongly self-report that they learn well from video (cf the study of physics videos – Saying the wrong thing: improving learning with multimedia by including misconceptions).

They argue that people should:

  • think twice before using video
  • use video where it really does add value (virtual field trips, creating rapport, manipulating time and space, telling stories, motivating learners, showcasing historical footage, conducting demonstrations, visual juxtaposition)
  • focus on media-literacy for the content experts and DIY approaches (eg filming on mobile phones)

Suzi also read 10 ways MOOCs have forced universities into a rethink

Broadly an argument that MOOCs are changing HE. MOOCs have given universities the impetus to experiment with pedagogy (notably, fewer lectures), assessment, accreditation, and course structure. They have made it more common to think in terms of a digital education strategy. They are also disrupting universities: HEIs are no longer the only providers of HE and cheaper degrees are becoming available. They’ve highlighted an unmet demand (for something like evening classes?), particularly in vocational and practical subjects. Clark talks about global networks of universities being like airline consortia – the passenger buys one ticket but makes their journey over several airlines.

Mike read ‘7 ways to make MOOCs Sticky’, a blog post by Donald Clark, and also ‘Bringing the Social back to MOOCs’ by Todd Bryant in an EDUCAUSE Review article.

The former looked at design to keep a MOOC audience coming back. The latter looked at how MOOCs can encompass social learning (rather than just provide content). A point of contention between the two is the importance of social learning – overemphasised if you believe Clark, and missing from many MOOCs if you believe Bryant.

Clark, drawing on data from Derby’s Dementia MOOC, listed 7 ways to retain learners. For me, his seven points divide into three related areas: audience, structure and the value of social. He framed the discussion in the recognition that we cannot apply metrics from campus courses to courses that are free, open and massive. Clark is often a provocative commentator though, and his downplaying of the social is interesting.

An overarching theme of Clark’s post is audience sensitivity, though at times the audience he is most sensitive to seems to be himself. In my experience, this is a tough challenge for MOOCs. To Clark this is about not treating MOOC learners like undergraduates who are ‘physically and psychologically at University’. He rightly states they have different needs and interests. As someone who has helped design MOOCs, I know it is hard to make something that is all things to all people, and often it is about providing a range of activities, levels and opportunities for learners to engage.

Related to audience sensitivity, Clark sees value in keeping MOOCs shorter (definitely wise) and modular (allowing people to dip into bits), with less reliance on a weekly structure and a coherent whole. This is maybe less about keeping learners, and more about allowing them to get what they want from parts of a course. It would be great to come up with ways to evaluate MOOCs for learners who want to take bits of courses. Post-course surveys are self-selecting and largely made up of completers. It is also a tough design challenge to appeal to such learners whilst also trying to deliver depth and growth through a course. Clark is involved in some companies that develop adaptive learning systems, perhaps reflecting a similar philosophy. Adaptive approaches may provide some answers in the future.

Clark is also not a fan of the weekly structure, at least in terms of following through with a cohort. I think many learners like both the structure and the social, and these are the main differentiating factors that mean MOOCs are not just a set of online materials. Many learners find the event-driven, weekly structure motivating, and it is evident many enjoy and learn from the social element of MOOCs more than the content. I was always keen to draw out the social elements, to give learners the chance to contribute to the course and learn from each other. Clark is somewhat scathing of social constructivism and the kind of learning emphasised in c-MOOCs.

This is in contrast to Bryant’s article. For Bryant, too many MOOCs are ‘x-MOOCs’ – largely about content and neglecting the social. Interestingly, he does cite features of EdX and Coursera that have the potential to change this by allowing learners to work in groups and buddy up during courses. We would have really valued such features when I was working on a MOOC about Enterprise. FutureLearn is not currently well equipped in this area. He goes on to explore other ways of helping people collaborate off-platform through user groups and crowdsourcing/knowledge-building tools. This would work well for some, but doubtless exclude others. He considers simulations, virtual worlds and ‘alternate reality games’ – simulations played in the real world. These could all play a role, but for me, alongside a core MOOC structure. Bryant sees MOOCs as a potential ‘bridge between open content and collaborative learning’. I suspect Bryant and Clark would value very different kinds of MOOC. Should we try to appeal to both extremes (and all in between) or pitch the MOOC at a particular audience? Probably the latter, but it still isn’t easy.

Psychology and education – notes from the reading group

Chris read Is it time to rethink the way university lectures are delivered?, a short article about a Science paper from 2011. A class of Canadian physics-major freshmen was split into two, and one week of material was delivered differently to the two halves of the class. The first half stuck to the tried and tested lecture-using-PowerPoint format, whilst the other half used a more ‘interactive’ approach termed ‘deliberate practice’: discussion groups, preclass reading assignments, in-class clicker questions, online quizzes. Lo and behold, in a test the following week the second cohort scored 74% on a test about the material and the other half only got 41%, thus illustrating that three days later they could remember the material better. The study has come in for a lot of criticism about methodology – only 211 of 271 students actually took the test (how would the others have altered the results?), and the people who designed it were also the ones who delivered the intervention, so may well have been ‘teaching to the test’. However, the general feeling seems to be that though the study is flawed, the conclusions are broadly correct. It also illustrates that having a Nobel Prize allows you to publish anything you like anywhere you want.

Chris also read A better way to practice, 2012. Written by Noa Kageyama, a Juilliard School of Music violinist turned performance psychologist. His argument is that it is better to practice smart than practice hard – take-home aphorisms from this article are Practice makes permanent and Perfect practice makes perfect, the implication being that unless you practice correctly you can reinforce bad habits. That seems logical enough. He also argues that more thoughtful study can reduce the time needed for practice and increase the likelihood of successful performance, but I (and many of the commenters below the fold) disagree with him about this. Whilst this might be true at the highest levels, at lower levels when it’s all about training muscle memory there’s simply no substitute for doing it over and over again.

Steve watched The key to success? Grit and read True Grit, Angela Lee Duckworth & Lauren Eskreis-Winkler, 2013. I’d phrase ‘grit’ as perseverance – effort and stamina to achieve something difficult over an extended period of time. In the Tortoise and the Hare, the hare has talent, but the tortoise has grit and achieves more in the end. This summary indicates that talent and grit are often orthogonal, or negatively correlated. In the past persistence was assessed against physical challenges, but this may not relate to long-term mental grit. Modern assessment is by questioning against traits, e.g. ‘I finish whatever I begin’. ((to complete))

Suzi read Stereotype threat and women’s math performance and Mindsets and Math/Science Achievement

Both papers discuss how mindset might affect learning.

Stereotype threat is a stress-induced threat of self-fulfilling a negative and well-known stereotype. For example, an elderly man looking for his keys may worry about looking senile, become stressed, and so find it harder to find his keys. The paper puts forward evidence that women’s performance in difficult maths tests can be affected by the threat of fulfilling a negative stereotype: that maths is not a girls’ subject. Other studies have looked at stereotype threat in relation to racial stereotypes.

Growth mindset is the belief that intelligence can be improved. Not everyone has it; others have a “fixed mindset” – many people will tell you that they are just not a maths person. The paper states that mindsets can predict maths/science performance over time, and can mitigate negative effects such as stereotype threat.

Both are interesting and seem plausible. Some of the suggested strategies for reducing stereotype threat and/or increasing growth mindset are:

  • feedback should emphasise the high standards of the test, and that the student has the potential to meet them
  • frame high-stakes tests as “assessing current skills and not long-term potential to learn”
  • praise effort and process, not intelligence
  • describe great mathematicians and scientists as people who loved and devoted themselves to the subject (not born geniuses)

Evidence in teaching – notes from the reading group

Suzi read Why “what works” won’t work: evidence-based practice and the democratic deficit in educational research (Biesta, G, 2007) and a chapter by Alberto Masala from the forthcoming book From Personality to Virtue: Essays in the Philosophy of Character, ed. Alberto Masala and Jonathan Webber, OUP, 2015

Biesta gives what is broadly an argument against deprofessionalisation, in the context of government literacy and numeracy initiatives at primary school level. I found the main argument somewhat unclear. It was most convincing on the difficulty of defining what education is for, which makes it difficult to test whether an intervention has worked. Biesta talks at length about John Dewey and his description of education as a moral practice and learning as reflective experimental problem solving.

“A democratic society is precisely one in which the purpose of education is not given but is a constant topic for discussion and deliberation.”

Masala’s paper is on virtue/character education but is of wider interest as it talks very clearly about educational theory. I found particularly useful in this context the distinction between skill as a competence (defined by performance, so easily testable) and skill as mastery (defined by a search for superior understanding and less easily tested), and the danger of emphasising competence.

Hilary read Version Two: Revising a MOOC on Undergraduate STEM Teaching, which briefly outlined some key approaches and intended developments in a Coursera MOOC aimed at STEM graduates and post docs interested in developing their teaching.

The author of the blog post is Derek Bruff (director of the Vanderbilt University Center for Teaching, and senior lecturer in the Vanderbilt Department of Mathematics with interests in agile learning, social media and SRS – amongst other things: see http://derekbruff.org/)

Two key points:

  1. MOOC-centred learning communities – the MOOC adopted a facilitated blended approach, building on the physical groupings of graduate student participants by facilitating 42 learning communities across the US, UK and Australia to use face-to-face activities to augment the course materials and improve completion rates.
  2. Red Pill: Blue Pill – adopting the metaphor used by George Siemens in the Data, Learning and Analytics MOOC to give two ways to complete the course – either an instructor-led approach which was more didactic and focussed on the ability to understand and apply a broad spectrum of knowledge OR a student-directed approach which used peer graded assignments and gave the students the opportunity to pick the materials which most interested them, and so focus on gaining a deeper but less comprehensive understanding of the topic.

Final take away – networked learning is hard, as would be the logistics of offering staff / student development opportunities as online and face-to-face modules, with different pathways through the materials, but interesting …

Steve read Building evidence into education, 2013 report by Ben Goldacre for the UK government

Very accessible summary of the case for evidence-based pedagogy in the form of large-scale randomised controlled trials. Compares current ‘anecdote/authority’ edu research with past medical work – lots of interesting analogies. Focused on primary/secondary education but some ideas could transfer to higher education – although it would be more challenging.

Presents counterarguments to a number of common arguments against the RCT approach – it IS ethical if comparing methods where you don’t know which is best (and if you do know, why bother trialling?!). Difficulty in measuring is not a reason to discount, RCTs are a way to remove noise. Talks about importance of being aware of context and applicability. Uses some good medical examples to illustrate points.

Sketches out an initial framework – teachers don’t need to be research experts (doctors aren’t); there should be research-focused teams leading and guiding, with stats/trials experts etc.

Got me thinking – definitely worth a read.

Roger read “Using technology for teaching and learning in higher education: a critical review of the role of evidence in informing practice” (2014) by Price and Kirkwood

This study explores the extent to which evidence informs teachers’ use of TEL in Higher Education. It involved a literature review, online questionnaire and focus groups. The authors found that there are differing views on what constitutes evidence which reflect differing views on learning and may be characteristic of particular disciplines. As an example they suggest a preference for large-scale quantitative studies in medical education.
In general, evidence is under-used by teachers in HE, with staff influenced more by their colleagues and more concerned with what works than why. Educational development teams have an important role as mediators of evidence.

This was a very readable and engaging piece, although the conclusions didn’t come as much of a surprise! The evidence framework they used (page 6) was interesting, with impact categorised as micro (e.g. individual teacher), meso (e.g. within a department) or macro (across multiple institutions).

Mike read Evidence-based education: is it really that straightforward?, 2013, Marc Smith, Guardian Education response to Ben Goldacre

This is a thoughtful and well argued response to Goldacre’s call for educational research to learn from medical research, particularly in the form of randomised controlled trials. Smith is not against RCTs, but suggests they are not a silver bullet.

Smith applauds the idea that we need teachers to drive the research agenda and that we do need more evidence. His argument that it will be challenging to change the culture of teaching to achieve this seems valid, but is not necessarily a reason not to try. The thrust of his argument is that RCTs, whilst effective in medicine, are harder to apply to education due to the complexity of teaching and learning. He believes (and I tend to agree) that cause and effect are harder to determine in the educational context. Smith argues that in medicine there is a specific problem (an illness or condition) and a predefined intended outcome (a change to that condition). This can be problematic in the medical context, but is even harder to measure in education. I would add that the environment as a whole is harder to control and interventions more difficult to replicate. Different teachers could attempt to deliver the same set of interventions, but actually deliver radically different sessions to learners who will interact with the learning in a variety of ways. Can education be thought of as a change of state caused by an intervention, in the same way we would prescribe a drug for a specific ailment?

All this is not to say that RCTs cannot play a role, but that you have to think about what you are trying to research before choosing your methodology (some of the interventions Goldacre addressed related to specific quantitative measurable things like teenage pregnancy rates or criminal activity). Perhaps it is my social scientist bias, but I would still want to triangulate using a range of methods (quantitative and qualitative).

From a personal perspective, I sometimes think that ideas translated from science to a more social-scientific context can lose some scientific validity in the process (though this is maybe more true at the level of theory than of scientific practice). For example, Dawkins translated selfish genes into the concept of cultural memes, suggesting cultural traits are transmitted in the same way as genetic code. Malcolm Gladwell’s tipping point is a metaphor from epidemiology which he applies to the spreading of ideas, bringing much metaphorical baggage in the process. Perhaps randomised controlled trials could provide better evidence for the validity of these theories too?

53 powerful ideas (well, 4 of them at least) – notes from the reading group

This month we picked articles from SEDA’s 53 powerful ideas all teachers should know about blog.

Mike read Students’ marks are often determined as much by the way assessment is configured as by how much students have learnt

Many of the points made in this article are hard to dispute. Institutions and subject areas vary so widely that the way marks are determined differs not only between, say, Fine Art and Medicine, but also between similar subjects at the same institution, and between the same subject at different institutions. This may reflect policy or process (eg dropping the lowest mark before calculating the final grade). In particular, Gibbs argues that coursework tends to encourage students to focus on certain areas of the curriculum, rather than testing knowledge of the whole curriculum. Gibbs also feels these things are not always clear to external examiners. He does not feel that the QAA emphasis on learning outcomes addresses these shortcomings.

The article (perhaps not surprisingly) does not come up with a perfect answer to what is a complex problem. Would we expect Fine Artists to be assessed in the same way as doctors? How can we ensure qualifications from different institutions are comparable? Some ideas are explored, such as asking students to write more coursework essays to cover the curriculum and then marking a sample. This is, however, rejected as something students would not tolerate. The main thing I can take from this is that thinking carefully about what you really need to assess when designing the assessment is important (nothing new really). For example, is it important that students take away a breadth of knowledge of the curriculum, or develop a sophistication of argument? Design the assessment to reflect the need.

Suzi read Standards applied to teaching are lower than standards applied to research and You can measure and judge teaching

The first article looks at the differences between the way academics receive training for teaching and for research, and the way teaching and research are evaluated and accredited. Teaching, as you might imagine, comes off worse in all cases. There aren’t any solutions proposed, though the author muses on what would happen if the two were treated in the same way:

“Imagine a situation in which the bottom 75% of academics, in terms of teaching quality, were labelled ‘inactive’ as teachers and so didn’t do it (and so were not paid for it).”

The second argues that students can evaluate courses well if you ask them the right things: to comment on behaviours which are known to affect learning. There didn’t seem to be enough evidence in the article to really evaluate his conclusions.

The argument put at the end seemed sensible: that evaluating for student engagement works well (while evaluating for satisfaction, as we do in the UK, doesn’t).

The SEEQ, a standardised (if long) list of questions for evaluating teaching by engagement, looks like a useful resource.

Roger read Students do not necessarily know what is good for them.

This describes three examples where students and/or the NUS have demanded or expressed a preference for certain things, which may not actually be to their benefit in the longer term. He believes that these cases can be due to a lack of sophistication of learners (“unsophisticated learners want unsophisticated teaching”) or a lack of awareness of what the consequences of their demands might be (in policy or practice). The first example is class contact hours. Gibbs asserts that there is a strong link between total study hours (including independent study) and learning gain, but no such link between class contact hours and learning gain. Increasing contact hours often means increasing class sizes, which generally means a dip in student performance levels. Secondly he looks at assessment criteria, saying that students are demanding “ever more detailed specification of criteria for marking”, which he states are ineffective in themselves for helping students get good marks, as people interpret criteria differently. A more effective mechanism would be discussion of a range of examples where students have approached a task in different ways, and how these meet the criteria. Thirdly he says that students want marks for everything, but evidence suggests that they learn more when receiving formative feedback with no marks, as otherwise they can focus more on the mark than the feedback itself.

The solution, he suggests, is to make evidence-based judgements which take into account student views but are not entirely driven by them, to try to help students develop their sophistication as learners, and to explain why you are taking a certain approach. This article resonated with me in a number of ways, especially with regard to assessment criteria and feedback. There is an excellent example of practice in the Graduate School of Education where the lecturer provides a screencast in which she goes through an example of a top-level assignment, explaining what makes it so good. She has found that this has greatly reduced the number of student queries along the lines of “What do I need to do to get a first / meet the criteria?”. I also strongly agree with his point about explaining to students the rationale for taking a particular pedagogic approach. Sometimes we can assume that students know why a certain teaching method is educationally beneficial in a particular context, but in reality they don’t. And sometimes students resist particular approaches (peer review, anyone?) without necessarily having the insight into how they may be helpful for their learning.

6 very good things about MIT’s #medialabcourse MOOC

I started taking MIT’s Media Lab’s Learning Creative Learning MOOC (often referred to as #medialabcourse or LCL) at the beginning of February. It’s something I’ve done in my spare time rather than directly for work but it’s been a great experience and I wanted to reflect on what has worked so well for me.

1. Google+ communities. Google+ turns out to be really rather good for groups and group discussions. The combination of threaded discussion (with email notifications of responses) and a microblogging-type front page (making it easy to scan through new posts) has certainly promoted impressively engaging and lively discussion. It’s even (and I can’t believe I’m saying this about a Google product) nice to look at.

2. Small groups. People who enrolled in time were placed into small groups, each with its own email list, and each encouraged to set up its own Google+ group. These small groups (my own included) have largely petered out – but others have survived, often by picking up refugees from the less active groups, and I joined one of those. They provide a safer, less public arena for discussion – especially for those people who are perhaps less confident, or for material that doesn’t seem important / relevant / polished enough to share with the world.

3. Openness. LCL was designed to be almost entirely open, based on P2PU’s Mechanical MOOC. Course reading is published on a public website and the main community is an open Google+ group. Weekly emails are sent out to remind people about this week’s activity and reading. Even with the small groups, I get the impression it’s those who left their Google+ communities open that have survived, because they could pick up new members. As well as being a Good Thing, this openness helps to make it easier to navigate the course, and to access the materials from a range of computers and devices.

4. Variety. Each week there are suggested readings, an activity, and further resources. There’s also a video panel discussion, and of course there’s continuous activity and discussion on the Google+ community. Early on in the course, the course leaders stated explicitly that people should engage with what they can / what interests them and not feel they have to do everything. The variety of tasks and materials (some of the “readings” are short videos) makes it possible to stay engaged even when you have little time to spare.

5. Events. There are live-broadcast panel discussions each week, directly relating to the week’s reading and activity. The video stream for these is embedded within a chat forum so that you can chat with your fellow students while you watch, and submit questions for the Q&A section at the end. These broadcasts feel very personal and inclusive; they are relaxed and conversational in tone. Course moderators join the chat rooms – providing helpful information, support with technical issues, and (maybe more than anything else) a real sense that the online participants do matter. As a teaching device, I’m not sure how well they work – I find myself picking up fragments of the video and fragments of the chat and not properly engaging with either. But they can be a useful place to reflect on and refine my ideas, and they help give the course a nice pace.

6. Enthusiasm. Mitch Resnick, Natalie Rusk, and the rest of the course team exude enthusiasm for their subject, excitement about the course, and an openness that makes you feel like a real student. They seem friendly and genuinely interested in what online participants are saying. I think their attitude sets the tone for the community as a whole.

Active learning – notes from reading group

Active learning might be an unhelpfully broad topic but there are some very helpful ideas in these papers.

  • Bonwell, C. (1991), Active learning: creating excitement in the classroom, Eric Digest – The article starts by defining what AL is, the key factor being that students must do more than just listen, e.g. read, write, discuss, problem-solve. It identifies the main barrier to use of AL as risk, for example that students will not participate, or that the teacher loses control. It suggests ways to address this, for example by trying low-risk strategies such as short, structured, well-planned activities.
  • Prince, M. (2004), Does Active Learning Work? A Review of the Research, Journal of Engineering Education, 93(3), 223-232. Splits active learning into constituent parts and looks at the evidence for (often relatively minor) interventions covering each of these parts, in an attempt to identify what really works. A useful reference for anyone looking for quantitative evidence for active learning type interventions and a useful discussion of what leads to successful (or unsuccessful) problem-based-learning.
  • Jenkins, M. (2010), Active Learning Typology: a case study of the University of Gloucestershire. The paper describes how an ‘active learning’ strategy has been implemented at the University of Gloucestershire. In the first paragraph Jenkins provides some references on active learning to unpack its meaning, which helped us to better understand the term and put it into context, for example: “…the role of the teacher is not to transmit knowledge to a passive recipient, but to structure the learner’s engagement with the knowledge, practising the high-level cognitive skills that enable them to make that knowledge their own” (Laurillard, 2008, p527). This is compared with the understanding of ‘active learning’ among staff at the university, who were asked in a survey to identify their conceptions of active learning. The results identified three ‘families’ of categories: 1) external (students are active when they learn by doing), 2) internal (students are active when they are engaged in cognitive processes) and 3) holistic (a composite of the two, where active learning is generally investigative, developmental and creative). An interesting perspective is the distinction over whether the emphasis is placed on the student or the teacher: is active learning what the teacher gets the students to do, or what learning is done by students? The outcome of the project is a framework for staff to work with, which is very useful and identifies common elements of active learning in five categories: co-learning opportunities, authenticity, reflection, skills development and student support. The data showed that there is a split between some staff practising ‘active teaching’ and others practising ‘active learning’.