Authentic Learning Activities for when simulations are not quite enough

On 2nd September I presented at the International Enterprise Educators Conference which this year was held in Cardiff.

The title of my talk was “Alternate Reality Games for Enterprise Education: Bridging the reality gap between simulation and authentic experience”.

I don’t have much time to expand at the moment but you can view the slides, which I think are fairly self-explanatory, on Slideshare, here:


Using Audio for Summative Student Feedback

At a conference earlier this year I watched an excellent presentation from a colleague who was using audio as a mechanism for feeding back to her students.  Since then I have been dying to give it a go, my prime motivations being:

  1. To provide my students with higher quality, richer feedback
  2. To save time while assessing work

So after doing some research, speaking to other colleagues here at the University and figuring out the best way to achieve this technically, I finally had a go.  This is a short summary of my experience (not an exhaustive study and analysis by any stretch of the imagination!).

Unit Title: Study Skills (Level One)

Number of Students: 12

Assessment artefact: 1500 word essay

Traditional marking/feedback method

Historically I have read through the essays, correcting and commenting in red pen directly onto the original script as I go.  I would then mark the work using an assessment rubric sheet which I complete as a Word document.  This document (Example here [MS Word File]) has a space for comments where I would type feedback for each student.  This would then be printed and attached to the original essay script for collection by the student.

Looking back at historical examples, the amount of feedback I would provide for each student using this method would be somewhere between four and six bullet-pointed statements, which included things the student did well and things they might do to improve.

The number of words used for this typed feedback varied between 61 and 374 with an average of 163 words.

[Example of previous feedback – MS Word File]

Audio feedback methodology

At the University of Portsmouth we use the Blackboard VLE which I believe does not normally have an audio feedback facility.  However, we have purchased a suite of tools produced by Wimba called Wimba Voice, one of which is called Voice Email.  This is the tool I used for producing the audio feedback.

Apart from the Blackboard VLE, Wimba Voice Email and a computer (of course!) you will need a microphone.  My laptop has a built-in microphone which works perfectly well, but you can buy USB microphones/headsets (Madonna style!) for next to nothing these days.

The new method was as follows:

  1. I read through a piece of work, correcting and commenting in red pen as before.
  2. As in previous years I then marked the work (in Word) against the assessment rubric sheet.
  3. At this point I would switch to Blackboard and select the student from the unit list within the Voice Email application.
  4. The process of recording the feedback could not be easier.  It is a simple case of clicking the record button and talking.  The process is no different from leaving a voicemail message on a mobile phone or telephone answering machine.
  5. Once the feedback was recorded I added a subject line and a short, typed message to the Voice Email (cut and pasted because it was the same for each student) and then clicked send (Fig. 1.)
  6. An email is then sent to the individual student’s email address which contains my message and a link to the audio feedback.
  7. When a student clicks the link they are taken to a webpage where they can listen to the feedback using a simple audio player.  They also have the ability to download the audio file as a .wav should they choose to. (Fig. 2.)
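Wimba handles the delivery in steps 5–7 for you, but the underlying mechanism is simply an email carrying the audio feedback (or a link to it).  Purely as an illustration of that idea, and emphatically not Wimba’s actual API, here is a minimal Python sketch that assembles such an email with a .wav attachment using only the standard library; the addresses, subject line and server name are hypothetical placeholders:

```python
# Illustrative sketch only: Wimba Voice Email does all of this for you.
# It shows the underlying idea of steps 5-7 (an email carrying audio
# feedback). Addresses, subject and server are hypothetical placeholders.
import smtplib
from email.message import EmailMessage

def build_feedback_email(sender, recipient, wav_bytes):
    """Assemble a feedback email with the recording attached as a .wav."""
    msg = EmailMessage()
    msg["Subject"] = "Feedback on your essay"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("Hi - your audio feedback is attached as a .wav file.")
    msg.add_attachment(wav_bytes, maintype="audio",
                       subtype="wav", filename="feedback.wav")
    return msg

def send_feedback(smtp_host, msg):
    """Send the assembled message via the institution's SMTP server."""
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

The advantage of a dedicated tool like Wimba over a DIY approach like this is, of course, the built-in archive for audit purposes and the player webpage the student sees.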
Wimba Screenshot

Figure 1. The Wimba Voice interface

Wimba Interface

Figure 2. What the student sees

My Impressions

Talking to my laptop did seem a little odd to begin with but after a couple of test-runs I got into the swing of things and can honestly say that I really enjoyed the process.  It actually felt very natural to leave feedback in this way and why shouldn’t it?  After all, we provide formative, oral feedback like this to our students in class all the time.

Students tell me that written feedback can be open to misinterpretation.  This is not surprising, particularly as it is often written in a form of shorthand (i.e. summary points), in unintelligible handwriting and, let’s be honest, more often than not in a bad mood and in a hurry.  The spoken word is far less likely to be misinterpreted in this way.  For example, if I were to write “Well done, you are a genius!” you might, after interpreting this literally, feel rather smug but then wonder why I only gave you a mark of 35% for your work.  Of course, I actually meant this statement to be read with a heavy dousing of sarcasm: “Well done, you are a genius – NOT!!!”.  If I had spoken this to you, you would be under no illusion at all as to what I was saying, but the written word does not convey this.  Yes, this is a silly example, and of course I would never use sarcasm in my student feedback, but it illustrates my point.

If you think about it, we should all be familiar with the shortfalls of written communication.  Only this week I sent a seemingly innocuous email to someone, only to receive a short, sharp, venomous “what did you mean by that?” type response.  I subsequently phoned the person to explain what I had really meant.  If only I had done that in the first place!  Audio feedback aids clarity of understanding.

On a related note, another clear advantage of using audio feedback over the written equivalent is the ability to express emotion in one’s comments.  Using my voice I can clearly convey a whole spectrum of emotions, such as disappointment, excitement, concern, delight and empathy, which would just not be possible written down.  Audio allows me to provide emotionally intelligent feedback.

You can listen to an example of my audio feedback here.

Does providing audio feedback save time?

Honestly, I don’t think it does!  Once I had done a few, it took about the same amount of time as my old method.  However, the quantity of feedback I could provide in that time was much greater.  My audio recordings lasted between 5 and 7 minutes; at a typical speaking rate of around 150 words per minute, that is somewhere in the region of 750–1,050 words, considerably more than the average of 163 words I was providing in the written versions.  Yes, some of it is a little rambling and not as concise as it would be written down, but I still believe the majority of the feedback was of a higher quality.


I had no problems using the Wimba tools – it was very simple indeed.  The system keeps an archive of all recorded feedback for audit purposes and, as a belt and braces approach, I also downloaded the files as a .zip archive which I burnt to a CD for the external examiner – neat!

So far only one student has reported not receiving the email containing the link to the feedback.  This may have been an email client issue but was easily remedied by sending the .wav file directly to his inbox.

What did the students think?

I sent a simple survey to all of the students in order to elicit their thoughts.  You can view the survey here – Audio Feedback Survey [MS Word File].  I realise that this is far from scientific so if you are a social scientist, well versed in the ways of qualitative and quantitative research methods please don’t shout.

Three out of a cohort of 12 students have responded so far (it gets worse!).  Overall, from the feedback received, it seems that the students liked it.  The following is a summary of their responses.

1. On the following scale how would you rate audio feedback?

Excellent   Good   Neutral   Poor   Very Poor
    2         1       0        0        0

2. What did you like about audio feedback?  What in your opinion are the benefits?

“It took me through each bit of my essay and pointed out where I went wrong and what was good”

“It’s more personal, and it gives a clearer idea of what is meant by the feedback.”

“I liked the audio feedback as it was very personalized and more intimate than the usual written feedback. Due to this I feel that I clearly understood the feedback you were trying to portray, which can often get misconstrued in written text.”

3. What did you dislike about audio feedback?  What criticisms do you have?

“I had no dislikes in regards to the audio feedback.”

“I didn’t notice it was there. You had to tell us that it was sent.”


4. Did you encounter any problems receiving and listening to the feedback?  If your answer is yes, what were they?

“I didn’t encounter any problems.”



5. Students were asked to place a mark on a scale to show how they rated audio feedback compared to written feedback.  Two students rated audio feedback as “much better” while the other placed the mark midway between “much better” and “the same”.

6. Why did you place your cross at that location?

“In general I thought the audio feedback is a great way to provide feedback to students. Personally I much prefer it to written feedback, it is a lot more personalized and messages can be portrayed across effectively.   I would happily receive feedback through this method for the rest of my units.”

“I feel that it is better than written feedback but not by that much, largely due to needing to be reminded of it.”

“I find written feedback can be wrongly translated often and also seem to sincere”

7. How important is it for you to receive feedback on your work?  Why?

“Extremely important, helps with future essays and a possible retake”

“Quite. It helps me to know where and what I need to improve.”

“I believe receiving feedback is essential for tracking progress and presenting areas of student improvement. Without feedback a student wouldn’t know at what level they are achieving, and it would be unclear how they could approach improving their work.”

Summary and Considerations

Even though this process did not save me any time I actually thoroughly enjoyed it.  My prime motivation for trying this was to provide better quality feedback for my students.  Although “better quality” is a term that could be picked apart and analysed in great detail, this is not within the scope of this little experiment (there is quite a lot of existing literature on this actually).  I can only make an assessment based upon my experience, the feedback from my students and, to a certain extent, on what common sense tells me.  Taking these factors into consideration I have no doubt that the audio feedback I provided is much better than the written equivalent I have provided in previous years.  I will certainly be doing more of this for my other units in the future.  A few things to consider/lessons learnt:

  • Find a quiet place to record where you won’t be disturbed.
  • Don’t forget to switch off any software that makes a noise.  You will hear my Twitter client “pinging” away in the background on some of my feedback!
  • Structuring your feedback makes it much simpler.  Think of the running order as you would for a presentation.  Although I didn’t use a script (kind of defeats the object and detracts somewhat from the richness that comes from spontaneity) I did pretty much say the same introduction and concluding remarks for each student.  I used the assessment rubric as the framework for discussing the work so that the student, once they have collected the work, will be able to follow this through logically.
  • I included the mark for the assessment within the audio feedback.  I gave this mark at the beginning but I think, with the benefit of hindsight, it would have been a better idea to tell the student at the start that the mark will be given at the end of the feedback.  This might encourage students to listen through the whole feedback recording.
  • Audio feedback is of no use to students with hearing difficulties.  It is obviously important to provide a written alternative for these students.
  • For large cohorts of students I would record generic feedback for the whole group.  Individual students would still receive useful feedback from the handwritten comments on their work and the detailed marking rubric.

Thanks to Alex Mosely (Uni. of Leicester) for some Wave based discussion and to Carol Ekinsmyth (UoP) for providing me with insights from a similar study.

    Using an Alternate Reality Game to Teach Enterprise

    This article was originally submitted as a poster presentation at the International Entrepreneurship Educators Conference, Edinburgh, 2009.  I was pleased to win the poster prize for this contribution.

    You can download a printable pdf of the original poster here: IEEC 2009 Poster.


    Entrepreneurial learning is enhanced when learners:

    • are immersed in authentic or near-real experiences, transformed into knowledge and understanding [1]
    • are able to reflect on these experiences
    • are able to synthesise key concepts by constructing links between theory and practice (a deep approach to learning) [2]
    • are motivated and engaged
    • are allowed to imitate and experiment (play) in a safe (low-risk) environment

    Simulations, case-studies, role-plays and, more recently, computer-based simulations (hereafter collectively referred to as “simulation(s)”) are the commonly used experiential teaching “tools” employed to approximate realistic experiences.

    Although becoming more sophisticated, these interventions are generally unconvincing proxies for real-life experiences, often requiring a large leap of imagination on the part of the participating students.

    The best experiential teaching techniques shrink the “reality gap” (figure 1) that exists between classroom “simulation” and authentic life experience (the wider the gap – the harder the participant has to work to suspend their disbelief).

    A functional hallmark of Alternate Reality Game (ARG) design is the use of immersive techniques to blur the boundary between what is story and what is reality (ARGs are often referred to as immersive games).

    These same techniques (see “What is an Alternate Reality Game (ARG)?”) are to be employed in a taught enterprise unit in an attempt to create what might be termed a hyperreal simulation or hyperreality: an experience that will produce a sense of realism far in excess of that of traditional “simulations”.

    A well designed ARG based unit also provides the opportunity to enhance student motivation and engagement through the use of techniques employed by traditional game designers (competition, reward etc).

    Figure 1. The reality gap

    What is an Alternate Reality Game (ARG)?

    “Alternate Reality Games take the substance of everyday life and weave it into narratives that layer additional meaning, depth, and interaction upon the real world.” [3]

    Alternate Reality Games (ARGs) tell rich, interactive stories through narrative elements that are distributed across various platforms.

    Pieces of the story are distributed, online or off, across multiple mediums.  These may include websites, blog posts, email, video clips (perhaps on YouTube, television or cinema), audio clips (podcasts, voicemail messages, radio broadcasts), print ads in magazines and newspapers, billboards, posters in shops, payphones (cards or real-time calls), people with placards on streets and packages sent through the post.

    The stories are carefully concealed from players until appropriate moments determined by the game designer(s) or ‘puppetmasters’.

    Game play involves players working collaboratively (often globally) using email, phone/sms contact, real-time interactions and extensive online engagement to solve problems (puzzles) revealed through the narrative.

    Players of ARGs often develop strong emotional connections with the story characters, who are designed to appear as real people.  Devices such as real-time communication with the players (using blogs, social networks, video, VoIP, SMS, telephone, online chat etc.) and the development of realistic, rich character histories make this possible (Figure 2).


    Instead of requiring the player to enter a fictional game world, ARG designers attempt to enmesh the game within the fabric of the player’s real world by harnessing as many media technologies and interfaces as possible.  By doing so, ARGs expand the frame of the game beyond the computer monitor or television screen, effectively making the entire world the “game board” (modified after [4]).

    ARG players report unprecedented levels of immersion in the presented narratives as well as high levels of engagement.

    Why ARGs for Enterprise Education? (after [5])

    Realistic, interactive narrative (characters/story)

    • Not just a “normal” simulation/roleplay – extends the simulation paradigm towards authentic experience
    • Contextualised – the narrative can be modified to fit ANY real-life situation
    • Experiential – application of theory in a low-threat, realistic environment

    Self directed play – influence on outcomes

    • Ownership / responsibility for learning
    • Enquiry/research based
    • Facilitates critical (deep) academic thinking
    • Motivating

    Progress and rewards (leaderboard and prizes)

    • Provides incentive, motivation, competition, sense of achievement, fun!(feedback/reflection)
    • Regular delivery of new problems/events
    • Key to maintaining engagement/interest
    • Large, active communities
    • All the benefits of learning in groups (collaborative, self-supportive, peer feedback, vicarious learning)
    • Development of teamworking skills

    Novel method of delivery

    • Engagement, Fun, Motivation

    Utilises Existing technologies

    • Removes barriers = Buy-in
    • Promotes engagement

    The Unit: Enterprise in Context (Level One, 10 Credits, One Semester)

    The Scenario

    Students will be participating in an Undergraduate Training Programme (UTP) run, in collaboration with Portsmouth Centre for Enterprise, by a company who specialise in turning around failing businesses.  The company is called Phoenix UK Ltd.


    Phoenix UK Ltd has directors, employees, a history, a web presence, email addresses, Skype and Twitter accounts.  The managing director is a character called Simon Brookes!  Simon will be running the UTP.

    Phoenix UK Ltd is based in the City of Porthampton, which has a council, a newspaper (The Porthampton Bugle) and a football team (non-league!).


    In small groups the students will be working with a “real” business that is suffering financially.  Their task is to turn around the fortunes of this failing business.

    The failing company, Salter & Son, is a rather old-fashioned, high street, gentlemen’s outfitter whose “brand” and product range leave much to be desired!


    Salter & Son, also based in Porthampton, has a new Managing Director (Craig Salter) and several key staff who deal with different areas of the business: Christine Parker – Marketing; Barry Scott – Finance.

    The scene is set!


    Learning is mostly self-directed (or guided self-research), usually in response to triggers from Salter & Son employees.

    The students will need to communicate and work closely with each of Salter & Son’s key personnel in order to get the information they need to make their final recommendations.

    Communication will be through email, conference calls, online chat, video and traditional mail (students will receive packages through the post sent from Salter & Son employees).

    As the unit progresses the students will learn about the basics of marketing (retail slant), business finance, business management and personnel issues (the unit learning outcomes).

    Several workshops or master-classes, delivered by “external consultants” (in reality academics role-playing consultants), will supplement the students’ own research.

    Students will need to keep on their toes and be ready to respond to the client’s requests, moods and whims.  There will be some surprises!


    Assessment one – small group presentation to the client (in this case Craig Salter) on the findings of a “best practice” field trip the students will have undertaken in a local high street (students also have the opportunity to spend some time interviewing the owners of several local clothes retailers).

    Assessment two – individual short answer tests.  At five points throughout the unit each student will receive an email from a Salter & Son employee which will contain questions pertaining to that employee’s area of responsibility (marketing, finance etc.).  The answers emailed back to the client will be assessed.  These marks also contribute towards the Phoenix UK Leaderboard™ score (see “The Leaderboard”).


    Assessment three is designed to encourage the students to reflect on their performance as a member of a team.  Every fortnight the students will self- and peer-assess each other’s teamwork performance against a number of pre-selected criteria.  An individual 500 word essay reflecting on their overall performance will be assessed.  The total mark at each assessment also contributes towards the Phoenix UK Leaderboard™ score (see The Leaderboard).

    Assessment four is a group submitted, 2000 word report, written for the benefit of the failing company, Salter & Son.  This report will include all of the students’ recommendations (marketing, financial, personnel etc.).

    The Leaderboard

    A bespoke, online leaderboard has been developed for this unit.

    Throughout the duration of the unit students are given marks for various tasks completed (answers submitted to client’s questions and also for teamwork contribution).

    These marks convert to scores which will be added to the Phoenix UK leaderboard.

    Students will be able to view their scores and those of their colleagues online, throughout the programme.

    At the end, prizes will be awarded to the highest scoring individual and also the highest scoring team.

    The leaderboard was included as a way of introducing an element of competition (this is a game!) and fun to the unit.
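The scoring mechanics described above can be sketched in a few lines.  This is a hypothetical illustration, not the actual Phoenix UK Leaderboard™ implementation; the mark-to-points conversion rule, the weight and all the names are my own assumptions:

```python
# Hypothetical sketch of the leaderboard mechanics described above;
# the conversion rule (marks out of 100 scaled to 10 points) and all
# names are assumptions, not the real Phoenix UK Leaderboard code.
from collections import defaultdict

def add_marks(scores, student, team, mark, weight=10):
    """Convert an assessment mark (0-100) into leaderboard points
    credited to both the student and their team."""
    points = round(mark / 100 * weight)
    scores["individual"][student] += points
    scores["team"][team] += points
    return points

scores = {"individual": defaultdict(int), "team": defaultdict(int)}
add_marks(scores, "Alice", "Group A", 72)
add_marks(scores, "Bob", "Group A", 55)
add_marks(scores, "Carol", "Group B", 90)

# Rank for display; prizes go to the top individual and the top team.
individual_ranking = sorted(scores["individual"].items(),
                            key=lambda kv: -kv[1])
team_ranking = sorted(scores["team"].items(), key=lambda kv: -kv[1])
```

Keeping individual and team tallies separate like this makes it straightforward to award both of the prizes mentioned above from the same stream of marks.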


    1.    Stokes, D. (2009, March 26). Action learning tools for building self-efficacy and entrepreneurial skills. Paper presented at Essential Tools for Entrepreneurship Education, University of Reading. Retrieved August 25, 2009.

    2.    Biggs, J. (1999). What the Student Does: teaching for Enhanced Learning. Higher Education Research & Development, 18(1), 57–75.

    3.    Martin, A., Thompson, B. & Chatfield, T. (2006). Alternate Reality Games White Paper – IGDA ARG SIG. Retrieved August 26, 2009.

    4.    What is an ARG? (2008). Retrieved August 26, 2009, from the website.

    5.    Mosely, A. (2008). An Alternative Reality for Higher Education? Lessons to be learned from online reality games. Retrieved January 28, 2009.

    6.    Harvey Dent demo photo. Retrieved August 29, 2009.

    7.    Phillips, Andrea. “Deep Water.” (July 26, 2001). Accessed May 6, 2002.