Can “Reading” An Evaluation Be Fun? It Should Be.

Guest Post: R. Christine Hershey

In an increasingly visual culture, the old ways of communicating lessons learned are startlingly out of touch with how we want and expect to get information. That’s why the John S. and James L. Knight Foundation decided to try a new approach to sharing lessons they were learning from the evaluation of two social impact games. One game, Battlestorm, was a youth-based game to improve hurricane preparation awareness and habits on the Gulf Coast. The other, Macon Money, used an alternative form of local currency to connect residents to each other and to attract and expose people to local businesses in Macon, Georgia. For both evaluations, the foundation wanted to communicate its findings in ways that were just as appealing, interactive, and forward-thinking as the games themselves.

Hershey Cause Communications led the evaluation team and produced a suite of interactive data visualizations, infographics, PowerPoints and publications to communicate the results. The idea was to communicate the findings in real time over the course of the games and again when the final analysis was complete. For more details, check out what popular philanthropy blogger Lucy Bernholz recently called “… the most fun evaluation report you’ve ever seen.” Or, in keeping with the idea that visuals can sometimes say more than words, you can experience the website here and via the links mentioned below.

We learned two key lessons from this process of combining evaluation and communications to share valuable information:

1. Let your data paint the picture.

If you are evaluating a program that involves thousands of people, like Macon Money did, that means you have huge data sets. Using a series of data visualizations to communicate the results of Macon Money allowed us to plug our actual data into a graphical interface that would make our findings visual, understandable and interactive.

We partnered with Stamen Design, a San Francisco-based data visualization firm, to develop compelling and fun ways to visualize the multitude of game activity data points—everything from interactions between players to community events, to Facebook posts, to economic impact.

For the game activity timeline, a simple interface lets users look at specific dates or zoom in to examine the game’s activity in a hyper-local context. Another set of graphics allows users to explore the types of social connections made by playing the game. But it’s important to remember that not all data sets lend themselves to visualizations.

Battlestorm had a much smaller, more qualitative data set than Macon Money, so we felt we could best illustrate the key finding – that youth can be “superconductors” of information in their communities – with a simple but powerful infographic.

2. Make your data usable and shareable—by all of your audiences.

Evaluations can be extremely useful to funders, to other evaluators and to the organizations on the ground that may want to highlight their work to the broader community. The visualizations and infographics designed for each game reflect the games’ designs—which in turn reflected local partners and design contexts. And they were designed to be easily shareable. They can be linked to from any website, shared on any Facebook page or Twitter account, or dropped into any PowerPoint.

For those who wanted a high-level view of what Knight learned about funding and evaluating both games, our eight-page summary used simple, creative graphic design to distill our findings into key principles without oversimplifying them.

In sum, the increased presence of online gaming and other interactive digital technologies in our culture reminds us not to underestimate the power of play or the power of graphic design to move our causes forward. Good design for good causes never goes out of style.

 


Communications Network member R. Christine Hershey is founder and president of Hershey Cause Communications. 

 

4 Comments

  1. Susan Parker, 07-24-2012

    Hi Christine,

    Great post. You and your team did an amazing job of bringing an evaluation to life and making it interactive. I hope that many foundations learn from your approach.

    I have one question. Did you or the Knight Foundation have a plan to evaluate how well this approach worked in terms of engaging key audiences (or whatever goal that Knight had for taking a more interactive approach)? If you did, what did you measure–time on the website, downloads, other actions? I’m really curious about how this type of interactive evaluation report translates into greater engagement, etc. and what you learned.

  2. Will B. Sullivan, 08-02-2012

    Hi, Susan. I agree – I’d love to learn how this product – which is very well done – translated to engagement. And, what type of engagement metrics were established during the development of the product. Increasingly, we’re focused on the new SEO – Social Engagement Optimization – but the metrics for measuring that sometimes seem a little broad. Downloads, social shares and the like are good to measure – and you have to be sure to set up your measurement tools to capture these in advance of launching your product – but I like to see metrics that measure the conversation that occurs around products, and also information on how these products empower people to take your issue and make it their own.

  3. David Crowley, 08-15-2012

    Good post & work on the evaluations. I’ve been reviewing the Macon Money evaluation on the Knight website a few times, getting some good ideas about the program and measurement approach from it. Thanks!

  4. Elizabeth Miller, 08-15-2012

    Great comments all, and thanks for the interest in Knight’s thinking behind these content objects.

    First, about our approach – it was very much centered on creating compelling, shareable content that was a true derivative of the project we were evaluating. This was an evaluation of games, so we felt there should be some playfulness to it, while still telling a very serious and objective story. We found this particular report to be one of the most liked and retweeted pieces of content we’ve ever put out. The high number of likes, shares and retweets we saw didn’t necessarily drive record traffic, but it did reflect record engagement as measured by those likes, shares, comments and retweets. At the very least, these are anecdotal indicators of a highly engaged social community interested in this package of information.

    Most of Knight’s most trafficked pages (and many other foundations’) are the ‘housekeeping’ pages of our organization website – the strategy pages, the staff pages, the apply pages, etc. So when we are looking for trends in traffic or social engagement (or both) to better understand the performance of a piece of content, and we notice something spike its way into the top 10-20 most viewed pages in a specific time period, we intentionally track it with Google Analytics. There are other (also playful) evaluations we’ve released that may have driven higher traffic numbers, but lower numbers of likes, shares and tweets. This could have been due to the content of those reports targeting a more Twitter-driven audience, while this games piece was very much interesting to the Facebook community as well as the tweeps.

    There are so many layers to all of this that it is difficult to sum up succinctly how this worked based purely on engagement numbers. But I can certainly say that our assumption for this particular package played out as expected: create a way to make an evaluation about games more like an actual game itself in order to better engage an online audience. In this regard, I think we went a long way toward achieving that goal. We’d love to keep the conversation going. And look for more work from our fantastic evaluation team soon at knightfoundation.org.
