Posted on Sep 10, 2010

On Wednesday I participated in a repeat of last week’s remote amplification experiment when Brian Kelly invited me to get involved with his presentation University 2.0 at UIMP 2.0. This time, although I was again providing a live commentary via Twitter approximately 600 miles from the venue, I also took on a slightly different role, which I had not expected… I became Brian’s assistant.

This began with a magic trick. In order to judge the time lag on the video stream, Brian asked me to read his mind and name the playing card he was thinking of via a tweet in response to a pre-arranged cue. My response appeared on a Twitter wall for the live audience to see, whilst the remote audience could follow it via the event hashtag. Brian used this “magic” trick to introduce the concept of gathering audience responses and conversations about a presentation and contextualising them within a recording of that presentation using the iTitle tool, which provides Twitter captioning of video. He argued that this is not only useful to the speaker – who can search the video based on the tweets to find out what triggered a particular response – but could also be useful to future learners, who can pick up additional ideas, arguments and links to resources whilst watching the recording. The magic trick also served to demonstrate how an audience outside the room can be actively engaged using tools like video and Twitter, enabling us to move away from the model of the university as a walled garden where learning can only take place at a specific time and location.

That particular instance of assistance was obviously a pre-planned double act designed to make a specific point. However, later in the talk I was able to assist further when Brian asked me to help him out by tweeting a link that he needed for a demonstration. He asked the question verbally, and I could hear it via a live video stream kindly provided on the hoof by Steve Hargadon using an Elluminate session, as the official event live stream was dubbed into Spanish. I tweeted the resource link to Brian, thus freeing him up to continue talking rather than searching for the link and losing his flow.

Anyone who has watched a speaker flounder when they find they don’t have a resource immediately to hand to illustrate their point will appreciate how this type of support could be useful. However, the speaker would need to be monitoring – and occasionally engaging with – a Twitter client to take advantage of such support. Brian was obviously very aware that I was “there” and commentating, which made it convenient for him to direct a comment to me as soon as he realised he didn’t have the link that he wanted bookmarked, and then move on, knowing that I would find the link and tweet it back for him to use. He also took time to respond to some of the questions raised on Twitter in the course of his talk.

Whilst many speakers will not be so comfortable working collaboratively with an official event amplifier and/or their remote audience, it was interesting to see how such sessions could potentially be managed.

Taken to its logical conclusion, a speaker could have a dedicated stream in part of the screen in front of them which the event amplifier could populate with messages, links and questions, thus providing some level of filtration for the speaker and helping them to engage in real time with their remote audience. Engaging with a physical audience in front of you can often be hard enough, so maybe part of an event amplifier’s role needs to be assisting the speaker so that they can engage effectively with their full audience, not just the ones immediately in front of them.