
Survey: Evaluating [Digital] Communications

This post has been cross-posted from Forum One’s website. 

Introduction

In the for-profit world, measuring the impact of communications and marketing efforts is by no means a trivial task; however, such companies have the benefit of clearly defined target outcomes to measure against. All of the common metrics (e.g., leads generated, products sold, customer retention) are ultimately in service of maximizing revenue and profit.

In the nonprofit realm, knowing exactly what you are trying to achieve, and reliably measuring your performance against those goals, is often much more challenging. How are your communications activities generating meaningful impact? Which of these activities are delivering the greatest success?

The Survey

In March 2016, Forum One released a 10-question survey to shed light on how nonprofit professionals define and measure the success of their communications efforts. In all, 92 staff across research institutions, advocacy organizations, associations, foundations, academic institutions, and the federal government responded to the survey. Additionally, we conducted follow-up interviews with six nonprofit professionals to provide more context around their responses. The goal is for the survey’s findings to serve as a practical reference tool for nonprofits as they continue to develop and refine their communications strategies and measurement plans.

Key Findings

Most Communications Strategies are Aligned with Mission, But Not in a Measurable Way

On a scale of 1 to 5, we asked respondents to score how closely they connect their communications activities to their organizational mission and goals. Two-thirds graded themselves as a 4 or 5. About one-quarter thought they did an average job of aligning their activities with their organization’s mission, and the remaining 9% thought there was considerable room for improvement.

Slightly more than 50% of respondents overall, including 42% of those who said their communications activities were closely connected to their organization’s mission, said that it was difficult to align those activities with the mission in a measurable way.

Many respondents echoed this challenge when asked how they would improve their communications metrics definition and reporting. For example:

“Leadership not communicative on organizational goals and how best to measure if our efforts are effective in reaching them.” – Digital Communications Associate at an Association

“[We need to] define success for research/information initiatives in a measurable way.” – Monitoring & Evaluation Associate at a Nonprofit Research Institution

“I would like to tie communications efforts to practical achievements in a more tangible way.” – Communications Director at a Foundation

“[We need to] have a clearer way to link our social media/communications numbers with actual impact—know better how each number translates to actual success.” – Communications Director at a Nonprofit Research Institution

“Finding better ways to measure and directly attribute impact and outcomes to specific communication interventions.” – Communications Associate at a Nonprofit Research Institution

“Being more strategic and focused in the metrics that we choose to report and analyze.” – Digital Communications Coordinator at an Issue Advocacy Nonprofit

There is clearly a demand among nonprofit communicators, and among those to whom they report, for guidance on measuring and communicating the impact of their work. Not a single survey respondent or interviewee said anything along the lines of, “Oh, measuring the impact of communications? We have that figured out.” Instead, I noticed among those I interviewed a humble and hopeful sense that at least one of their peers was getting it right.

Research Institutions Are More Likely to Track Communications Impact than Other Nonprofits

In all, 31% of respondents said they were actively tracking “impact metrics.” In the survey, we provided examples to guide this response: the number of policies referencing your research, the number of lives positively impacted through online donations, or a measurable public opinion shift on your issue. In other words, we wanted to know who was going beyond metrics such as unique website visitors, event attendee numbers, or media mentions of their organization and attempting to measure their influence on a positive change in the real world.

So who were these 31% of respondents who were measuring impact? I was surprised to find that the largest share worked for research-oriented nonprofits. Approximately 40% of respondents from such groups reported measuring the impact of their communications activities, compared to 26% from issue advocacy organizations and 13% from associations and foundations. In our experience, research institutions have a difficult time defining impact in a measurable way. And in fact, even among those who said they were capturing impact data, most reported that aligning communications activities with the organization’s mission in a measurable way was a challenge. Researchers strive to publish and disseminate work that is timely and valuable, but their stated end goal is most often “to inform” rather than “to influence” a debate, a perspective, or an issue. This wall of separation between research and advocacy makes it difficult to define desired outcomes. As one interviewee from a research institution put it:

“Our communications shop does a good job of identifying target audiences and using web analytics for reach and engagement measures. But it still begs the question, ‘is this information valued by the target audiences; are they the right target audiences.’ It’s tough to set measurable goals that matter.” – Monitoring & Evaluation Associate at a Nonprofit Research Institution

Equally surprising was that issue advocacy nonprofits reported more difficulty in measuring the impact of communications than research institutions did. We had assumed, incorrectly, that since success for these groups is easier to define (e.g., policy victories, lives saved, conditions improved, public opinions shifted), it would be easier to connect communications activities to progress against these goals. But perhaps movement on the issues these groups address happens too slowly, and involves too many contributing factors, to isolate the contribution of communications.

Most respondents described circumstances similar to the following:

“We’re struggling to find measurable things we can track that directly connect the work we do (media, emails, website, speaking, lobbying, research reports, etc) to the impact of our work (policies passed, opinions changed). We can track visibility (web traffic, media hits, social media views, etc) and engagement (actions taken, email sign ups, social shares, etc) but metrics that connect those things to campaign impact is hard.” – Communications Director at an Issue Advocacy Nonprofit

Most Nonprofits Track a Combination of Output, Reach & Engagement Metrics

More than 80% of respondents said they track “reach metrics” (e.g., number of press mentions of your organization, follower growth on social media accounts, search engine ranking for keywords) and “output metrics” (e.g., number of events hosted, number of tweets posted, number of policy briefs published). About 70% track “engagement metrics” (e.g., number of event attendees, number of petition signatures, number of blog comments). To collect this data, respondents use a long list of tools.

Web Analytics, Social Media, and Email Marketing Software Are the Most Popular Tools for Collecting Communications Data

Most respondents reported using a combination of web analytics, social media, and email marketing software, along with spreadsheets, to collect data. Among interviewees, Google Analytics was consistently named as a primary tool for communications data collection and reporting.

In terms of analytics tools, feedback included the following:

“I use Google Analytics as the primary way of figuring out where people are going, what is the [user] flow, how did they get to the site.” – Digital Media Director at an Association

“I’m learning the advanced features of Google Analytics; instead of just looking at traffic, looking at engagement.” – Webmaster at an Association
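For teams ready to move beyond manually checking dashboards, even a short script can pull basic reach and engagement figures on a schedule. Below is a minimal sketch using the Python client for the Google Analytics Data API (GA4); the property ID, date range, and metric choices are placeholders rather than recommendations, and older Universal Analytics properties use a different reporting API.

```python
# Minimal sketch: pull page views and engagement time per page from
# Google Analytics via the GA4 Data API. The property ID and metric
# choices below are placeholders for illustration.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()  # reads credentials from the environment

request = RunReportRequest(
    property="properties/000000000",            # hypothetical GA4 property ID
    dimensions=[Dimension(name="pagePath")],
    metrics=[
        Metric(name="screenPageViews"),         # a reach-style metric
        Metric(name="userEngagementDuration"),  # an engagement-style metric
    ],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
)

response = client.run_report(request)
for row in response.rows:
    path = row.dimension_values[0].value
    views = row.metric_values[0].value
    engaged_seconds = row.metric_values[1].value
    print(f"{path}: {views} views, {engaged_seconds}s engaged")
```

Even if the numbers still end up in a spreadsheet, automating the pull keeps data collection consistent from one reporting period to the next.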

About one-third of respondents collect data using their CRM or AMS, a press clipping service, and audience surveys and interviews. Twenty percent or fewer used event management, fundraising/advocacy, or SEO software, or a direct mail service. Several interviewees mentioned interesting and creative approaches to gathering information:

“We have started tracking event attendance and unsolicited requests for information as an indicator of influence.” – Monitoring & Evaluation Associate at a Nonprofit Research Institution

“I call policy makers and ask them, ‘do you use what we provide?’” – Executive at an Issue Advocacy Nonprofit

Nonprofit Leaders Crave Communications Data, But Are They Asking the Right Questions?

The largest share of respondents reported on communications metrics to leadership on a quarterly or bi-annual basis. Fewer than 25% reported monthly, and about 20% reported on an ad hoc basis.

“My boss is really into impact – ‘these numbers are great. What did they translate into?’” – Communications Director at a Nonprofit Research Institution

“There’s an appetite among leadership for numbers, whether those numbers matter — web hits, Twitter followers, articles in the NY Times…We’re stepping back and asking ‘what actually matters for you achieving your desired goals?’ Not clear that there is alignment between what is being tracked and utility.” – Monitoring & Evaluation Associate at Nonprofit Research Institution

A significant majority of respondents said they produced these reports manually using a combination of tools. About 16% solely used web analytics dashboards. Only 3 respondents said they imported data from a variety of sources into an analytical software tool.
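For teams assembling these reports by hand, a small script that merges the exports from each tool into one table is a reasonable middle ground between manual copy-and-paste and a dedicated analytics platform. Here is a rough sketch using pandas; the file names and column names are assumptions for illustration, not any particular vendor’s export format.

```python
# Rough sketch: combine monthly exports from several tools into one report.
# File names and column names are hypothetical.
import pandas as pd

web = pd.read_csv("web_analytics_monthly.csv")      # month, pageviews, avg_time_on_page
email = pd.read_csv("email_marketing_monthly.csv")  # month, subscribers, opens, clicks
social = pd.read_csv("social_media_monthly.csv")    # month, followers, shares, replies

report = web.merge(email, on="month", how="outer").merge(social, on="month", how="outer")

# Roll a few raw numbers up into the reach and engagement categories used above.
report["reach"] = report[["pageviews", "subscribers", "followers"]].sum(axis=1)
report["engagement"] = report[["clicks", "shares", "replies"]].sum(axis=1)

report.to_csv("communications_report.csv", index=False)
print(report[["month", "reach", "engagement"]])
```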

A Majority of Respondents Lack the Time and/or Skills to Report on the Effectiveness of Their Communications Efforts

Respondents identified “capacity” as the largest challenge in communicating about the effectiveness of their work. Interviewees from comparatively small organizations said they felt a tension between “measuring and doing” communications work. They need guidance on what is most meaningful to track and report on, how to do so, and at what frequency. Some view the options available in a tool such as Google Analytics as overwhelming, and without proper training and direction, data capture and reporting initiatives can seem aimless and risk petering out.

“We don’t do this and we know we need to! We don’t have the funds to hire someone to focus on this, or the skills to be able to do it alongside something else, at the moment.” – CEO at a Nonprofit Research Institution

“None of us are trained on using Google Analytics correctly. I took a course on GA a few months ago, and it was eye opening how much I didn’t know.” – Webmaster at an Association

“Even though we are a single issue organization, we are still more reactive than proactive.” – Executive at an Issue Advocacy Nonprofit

“Communications is under-funded in some sense, so being able to measure the impact of communications is imperative.” – Digital Media Director at an Association

Demographics

As mentioned, 92 individuals responded to the survey. They represented a range of nonprofit and government organizations, roles, and levels of seniority, as shown in the charts below.

Conclusion & Recommendations

Survey respondents and interviewees are eager to improve the ways in which they measure and report on the impact of their communications activities. In their own words, they’d like to:

“Have the capacity to just start tracking our communications efforts in any kind of meaningful and, importantly, consistent way!” – Operations Director at an Association

“Identify a strong way to measure impact.” – Communications Director at an Association

“[Understand] How to connect communications to strategic impact.” – Communications Executive at a Foundation

“Need to identify impact metrics which would show how successful we are.” – Digital Communications Director at a Nonprofit Research Institution

However, this work can often be challenging, time-consuming, costly, and unproductive. I’ve heard anecdotally of million-dollar impact evaluations for communications-related work with findings that were, at best, inconclusive. Given these challenges, what guidance can we provide for nonprofit communicators?

1. Define Measurable Organizational Goals For Your Communications Strategy

Several interviewees from smaller organizations reported that they were currently operating without a communications strategy. Others were taking steps this year to develop a plan for aligning communications activities with their organization’s mission. Those with communications strategies already in place wanted guidance on how to measure their impact.

The first step is to clearly define what you are trying to achieve. Here’s a summary of some common goals for different types of nonprofit organizations:

  • Nonprofit Research Institution
    • Goal: Reach, inform, and influence policy makers and/or practitioners with your research and analysis.
  • Nonprofit Issue Advocacy Organization
    • Goal: Educate and inspire people to change policies, perspectives, and/or behaviors, or deliver a product or service to improve outcomes related to your cause.
  • Association
    • Goal: Increase the capacity and effectiveness of your members and their work.
  • Philanthropy
    • Goal: Improve outcomes in the focus areas to which you direct funding.

Mapping communications strategies and tactics to organizational goals is a requirement for an effective plan.

2. Identify “Right-Fit” Impact Metrics

You might consider several of the above goals difficult to measure, and you wouldn’t be wrong. But that doesn’t mean you shouldn’t try. In considering this challenge, we were inspired by the new “Goldilocks” initiative at Innovations for Poverty Action (IPA), which is intended to recommend “right-fit” monitoring and evaluation solutions for organizations that are not good candidates for randomized evaluations of their programs. IPA states that “the desire to measure impact often leads organizations in one of two dangerous directions: collecting mountains of data that cannot be used to measure impact or collecting insufficient data to demonstrate accountability and to learn what to do in the future.” Similarly, I would encourage nonprofit communicators to consider their goals and activities and identify a limited number of impact or proxy impact metrics to target.

Here are some example “right-fit” impact metrics for the goals above:

  • Reach, inform, and influence policy makers and/or practitioners with your research and analysis.
    • Impact Metric: Citations of your research in policies.
    • Proxy Impact Metrics: Mentions of your research during policy debates, share of voice on research topics, interview requests with experts.
  • Educate and inspire people to change policies, perspectives, and/or behaviors, or deliver a product or service to improve outcomes related to your cause.
    • Impact Metric: Improved outcomes linked to your campaign or intervention.
    • Proxy Impact Metrics: Public opinion shift on your issue, evidence-supported actions taken to address issue (before true impact can be measured), proposed legislation in support of your cause.
  • Increase the capacity and effectiveness of your members and their work.
    • Impact Metric: Increased productivity and influence across your sector.
    • Proxy Impact Metrics: Advancements or awards for your members, share of voice among members on relevant topics, increased funding/revenue across membership.
  • Improve outcomes in the focus areas to which you direct funding.
    • Impact Metric: Improved outcomes linked to funded work.
    • Proxy Impact Metrics: Key evidence-supported actions or policies informed by funded work, increased awareness around focus areas and work of grantees, share of voice among your experts and grantees.
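One way to keep the metric list deliberately small, in the Goldilocks spirit, is to write the plan down as structured data that both your reporting scripts and your leadership conversations refer back to. The sketch below is purely illustrative; the goal names and metric labels are assumptions adapted from the examples above, not prescribed categories.

```python
# Illustrative sketch of a "right-fit" metrics plan: each goal carries one
# impact metric and a short list of proxy metrics. Names are hypothetical,
# adapted from the examples above.
metrics_plan = {
    "Reach, inform, and influence policy makers and practitioners": {
        "impact": "citations of our research in policies",
        "proxies": [
            "mentions of our research in policy debates",
            "share of voice on our research topics",
            "interview requests for our experts",
        ],
    },
    "Educate and inspire people to change policies or behaviors": {
        "impact": "improved outcomes linked to our campaign",
        "proxies": [
            "public opinion shift on our issue",
            "proposed legislation supporting our cause",
        ],
    },
}

# Keep the plan "right-fit": flag any goal that has accumulated too many metrics.
for goal, plan in metrics_plan.items():
    tracked = 1 + len(plan["proxies"])
    if tracked > 4:
        print(f"Consider trimming the metrics for: {goal} ({tracked} tracked)")
```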

3. Consider Your Specific Contributions (But Don’t Sweat Them)

Any discussion of impact metrics naturally leads to the question, “how much of this impact can we take credit for?” Often there are countless factors that influence an outcome and its impact, and it would be disingenuous and inaccurate for an organization to claim sole responsibility for a victory. We recommend considering your organization’s specific contributions, but not agonizing over them.

You can hire external evaluators to assess the contribution of your communications work, or you can make an educated guess. For example, if you are the only organization focused on a niche issue and a policy for which you advocated was instituted or changed, your work could be considered decisive. If public opinion shifted on an issue on which you and other organizations have actively campaigned or conducted research, you could consider your contribution important. While you should tread cautiously in using an unscientific self-assessment in external communications about the impact of your work, a “decisive” or “important” rating could inspire your colleagues internally to continue their influential work.

4. Use Impact Metrics to Advocate For Your Work

Demonstrating impact that you can connect to communications activities can also help make the case to leadership to prioritize your work. Several survey respondents expressed frustration with, or described challenges involving, their superiors and colleagues:

“My supervisor does not listen” – Communications Director at an Association

“Leadership considers work a check-off item” – Digital Communications Associate at an Issue Advocacy Nonprofit

“Currently, the biggest challenge is collecting content, which I hope will go better as people realize the importance of such communication.” – Digital Communications Associate at an Issue Advocacy Nonprofit

Others reported that leadership fixates on metrics such as unique website visitors, which alone can have little meaning or consequence. It’s critical for nonprofit communications professionals to educate their colleagues about defining meaningful metrics, and about how doing so can help the organization make better decisions about how to invest in communications activities. Be transparent about the imperfect science of communications measurement, but don’t shy away from using relevant data to advocate for your work.

Final Thoughts

Reporting on communications metrics should not be about volume. By all means, capture as much data as you can; you may find some use for it. But just because you’ve captured it doesn’t mean you need to analyze and report on it. Focus on what’s meaningful and actionable.

If your overall website page views are down 5% this quarter, what does it mean and what are you supposed to do about it? These questions are nearly impossible to answer because “page views” in isolation doesn’t mean anything. However, if you are a think tank and fewer visitors are accessing your publications and/or spending less time reading them, you should dig deeper and consider improvements to your email or social campaigns, search engine optimization, site architecture, or content strategy. Targeted and tested improvements can lead to increased reach and engagement, and ultimately to impact.
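As a concrete illustration of that kind of digging, the sketch below compares page views by content section across two quarters to see whether a decline is concentrated in, say, publications. The input file, column names, and quarter labels are hypothetical.

```python
# Hypothetical sketch: find which content sections drove a quarter-over-quarter
# decline in page views. The file, columns, and quarter labels are assumptions.
import pandas as pd

df = pd.read_csv("pageviews_by_page.csv")  # columns: quarter, section, pageviews

by_section = df.pivot_table(index="section", columns="quarter",
                            values="pageviews", aggfunc="sum")

by_section["pct_change"] = (
    (by_section["2016-Q2"] - by_section["2016-Q1"]) / by_section["2016-Q1"] * 100
)

# Sections with the steepest declines come first; if publications top the list,
# that is where email, SEO, and content-strategy improvements should focus.
print(by_section.sort_values("pct_change").head())
```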

This survey was conducted by Forum One. The analysis was authored by Brian Pagels. It has been cross-posted from Forum One’s website. 
