
After an event is over, we take stock. We measure the number of tweets, the number of people who could possibly have seen those tweets, the number of blog posts, the number of discussion threads in the LinkedIn group, and so on.

We take these numbers to indicate the level of engagement surrounding an event. But is this what they are actually telling us?

A recent post by Ann Priestley challenged me to think about the ways we measure the online engagement with a conference. She presented a graph from Socious, who use the high peak of activity during an event and sharp tapering of this activity after the event as part of their argument to sell their product. Their implication is that unless your event has a long tail of post-event activity, it is not as successful at long-term engagement.

My concern here is that it is easy to confuse “activity” and “engagement”. I admit that I have been as guilty of this as anyone else. The number of tweets generated during an event can sound impressive and help event organisers to see the value of supporting the use of Twitter by the audience. However, the number of tweets on a hash tag only represents the level of current activity, not necessarily the level of ongoing engagement. At the end of the day, all event-related activity tapers off sharply after the event itself, moving aside for more natural discourse and engagement with the topics, without the constraints of the event identity and structure.
 

Activity vs Engagement

 
Engagement is very much a soft issue and difficult, if not impossible, to measure accurately. Activity is easy. You can see outputs of activity very clearly. They can be tagged and identified and measured and visualised. Engagement is much more subjective and dispersed, along with the audience.

Tweeting prolifically during a conference may demonstrate how actively engaged the audience are with the topic at that point in time, when there is a collected audience sharing that experience with them in real time. However, after the event the ideas discussed and the connections made may not be associated with the conference hash tag, making them difficult to trace. We can’t currently record new follows that result from use of a conference hash tag (to my knowledge!), and we can’t automatically determine whether a conversation a month afterwards is an extension of ideas raised by the conference unless the participants have tagged it. Once the flush of public activity around the event is over, tagging may no longer feel appropriate for either party, so the traceable link diminishes, but the engagement between the participants on related issues remains.

Equally, a reflective blog post after the event demonstrates that one individual was deeply engaged with the issues raised at that point in time. If it is tagged, we can find it. However, unless we measure the number of people who read that post over time, commented on it or linked to it (both literally and thematically), then we have no idea whether it has led to further engagement beyond this point.
 

Problems with Long-Term Reporting

 
Conference producers may be interested in connecting everything back to the original event, but the delegates themselves are usually more interested in the topics than in the event identity. This makes it especially difficult to attribute the results of engagement to the event the further you get from it. With an amplified event, the person engaging may have only a very general awareness of the original event that triggered the content or conversation in the first place, significantly reducing the likelihood that they will tag their response to make it traceable.

The level of research and the amount of time required to get a clear picture of long-term engagement with event outputs is beyond the scope of most conferences. This results in an approach that favours generating maximum activity around the event itself, getting as much content out there as possible, then trusting to the quality of that content, or simply hoping for the best, in terms of long-term engagement. This assumption is rarely reflected in post-event reports, however, which give the impression that once the initial activity has died down the engagement is over, and that there is therefore a problem to be solved in order to increase long-term engagement.
 

Attention or Response?

 
It also depends how you define engagement on a fundamental level: is it about attention or is it about responding?

People can engage with an event just by listening: in a lecture, an audience can be engaged without speaking, or creating any measurable activity at all (except perhaps a lot of noise over the following coffee break). We have very little information about engaged listeners at amplified events. We know how many are watching a live video stream, but we can’t tell how many have left it running whilst making a cup of tea or are only half listening. We know how many people could possibly see tweets relating to the event, but we have no idea if or when they might read that content, or how deeply they will engage with those tweets. With more people learning how to temporarily screen prolific conference Twitterers, we cannot even guarantee that they have seen them. We may get better at this, but we can really only know “how many” not “how deeply”.

Another word for engagement might also be “impact”. This is a popular word at the moment, particularly in HE circles, where my event amplification services are often used to increase the impact of an event by disseminating information as far as possible beyond the confines of the conference hall. Impact, in this case, is quantified by measuring the number of people a message reaches.

Brian Kelly has made this argument before in relation to Slideshare as a way of extending the life of a presentation – increasing its impact by increasing its visibility over time. But again, whilst we know how many people have looked at the slides, we do not know how deeply those people engaged with those slides except where their engagement led to what we could call an extreme response, such as embedding the slides into a blog post to reflect on them in more detail. This is highly engaged activity. But what about all the people who merely linked to the slides? This more casual response is not currently reported by Slideshare, so it can be difficult to trace. What about the people who showed them to a colleague or used them in a class? There is no way to get a complete picture, any more than there’s a way to judge the way a paper handout is used post-conference. The only difference is that no-one questions the value of the paper handout! 😉
 

Conclusion

 
How we define engagement and impact will affect the types of metrics we attempt to collect to demonstrate the success of an amplified event over time. However, accepting that engagement with the event will not necessarily lead to clearly definable, traceable digital objects may be the first step in rethinking not just how we measure success, but what we are trying to achieve through the event in the first place.
 
 