One of the biggest pushes in PR measurement is the use of primary survey research. It makes sense: in order to determine how effective a program or messaging implementation has been, the best thing a PR practitioner can do is ask.
Survey forms are now easier than ever to develop and send, giving communicators a powerful tool to see how their programs are resonating with target audiences. SurveyGizmo, SurveyMonkey, and Zoho Survey are just a few of the survey software products available to PR practitioners and communicators.
Conducting surveys is one of the very few methods by which a PR program can determine whether communications directed at target audiences have been successful. For example, if you are using AMEC’s framework to focus measurement efforts, the next-to-last step is listing Outcomes. The framework lists the following ways to assess outcomes:
Outcomes are the effects that your communication had on your target audiences that align to your objectives. Examples of outcomes of communication can include:
Learning/knowledge – e.g., through survey or interview data, quizzes, tests
Trust – e.g., increased trust ratings in surveys
Preference – e.g., stated preference in surveys, social media comments
Intention – e.g., through inquiries, registrations, trialing, survey data
Attitude change – e.g., through survey or interview data
Complying behaviour – e.g., sales, donations, driving safely; voting, etc.
Advocacy – e.g., endorsements in online comments
Of the seven different examples listed, five cite using surveys as a data collection method. Clearly, conducting surveys is important to the process of assessing the effectiveness of communications.
The problem previously has been cost. Statistically valid surveys can be expensive to run, and if a target audience is small, the expense can be hard to justify: with a small sample size, accuracy suffers, so fielding a survey rigorous enough to produce reliable results for a small program could consume a significant (and potentially outsized) portion of the overall budget.
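The relationship between sample size and accuracy can be sketched with the standard margin-of-error formula for a simple random sample. The figures below are purely illustrative (they are not from the article), and the function name is my own:

```python
import math

def margin_of_error(sample_size, z=1.96, p=0.5):
    """Approximate margin of error for a simple random sample at 95%
    confidence (z = 1.96), using the conservative p = 0.5 assumption."""
    return z * math.sqrt(p * (1 - p) / sample_size)

# Accuracy degrades quickly as the sample shrinks:
for n in (1000, 400, 100, 50):
    print(f"n={n}: ±{margin_of_error(n) * 100:.1f}%")
# n=1000: ±3.1%
# n=400:  ±4.9%
# n=100:  ±9.8%
# n=50:   ±13.9%
```

This is why a small-audience program faces an awkward trade-off: halving the sample does not halve the cost of a reliable result, because the margin of error grows with the square root of the shrinkage.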
The introduction of inexpensive and easy to use software that streamlines the survey process is a boon to PR practitioners, marketers, and really anyone interested in capturing customer or audience reactions.
How much is too much?
My husband and I recently went on vacation; by the end of the week, we’d received survey links from the airline, the rental car company, and several of the B&Bs we stayed at during the trip. I frequently get requests from clothing retailers, our internet provider, and our bank (both online survey requests and an automated phone prompt whenever I call for any reason). Over the summer, I received a five-page survey from the medical group my doctor’s office belongs to. Surveys also arrive, by email or phone call, after every car appointment, customer service request, or product return.
I used to dutifully answer each one, long or short, because as someone on “the other side” I genuinely wanted to help and provide feedback.
Not anymore. Now I rarely respond to a survey, and only when a special case warrants the time: either a very good experience with exceptional service, or a bad experience that feedback might help correct.
Survey structure is important
Survey response rates vary depending on the audience being surveyed; an internal audience, such as employees, will respond at a higher rate than an external consumer audience. With so many surveys circulating, survey fatigue can set in, and that could cost you important insights into your audience.
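Response rates matter when planning how widely to distribute a survey. As a rough sketch (the rates below are illustrative assumptions, not figures from the article), you can work backward from the number of completed responses you need:

```python
import math

def invitations_needed(target_completes, expected_response_rate):
    """How many survey invitations to send to reach a target number
    of completed responses, given an expected response rate."""
    return math.ceil(target_completes / expected_response_rate)

# Illustrative: an internal employee survey responding at 40%
# vs. a consumer survey responding at 5%, both targeting 400 completes.
print(invitations_needed(400, 0.40))  # 1000
print(invitations_needed(400, 0.05))  # 8000
```

The gap between those two numbers is one reason consumer surveys are where fatigue bites hardest: reaching a usable sample means contacting far more people.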
There are some recommended practices worth keeping in mind, and SurveyMonkey has a detailed Surveys 101 guide that walks you through good survey design. In addition to what it covers, here are some other tips:
- Keep surveys short. Figure out what is most important to know or understand about your audience, and stick as closely as possible to questions that will surface that information.
- Use clear and unbiased language if you want legitimate responses. This sounds like a no-brainer, but particularly if you are sending a survey on behalf of a client program, be very aware of the language you use in questions to make certain you are eliciting an honest response—even if the response might be negative or not reflect well on your client or program.
- Be honest about how long it will take to complete. Don’t say in your email that it will “take just a few minutes of your time” if it’s a multi-question survey with 1-to-10 rating scales, additional open-ended questions, and so on. People will be annoyed, your survey will be abandoned midway through, and you’ll be left with insufficient data.
- If you do have a long survey, make it worth the person’s time to complete it. I haven’t once won any of the gift cards I was promised I’d be in the running for, but the chance did compel me to finish. High-value coupons or discounts are great if you’re able to offer them; bonus points when the gift card or prize is logically connected to your target audience in a meaningful way.
You can also learn a lot from the pros—Pew Research has a terrific explainer on their website about survey questions and design practices.
The ready availability of survey software can seem like a gift after years of avoiding surveys on smaller or mid-sized programs because of the cost. But before you start sending out surveys of your own, take the time to learn about good survey design, think carefully about what you hope to learn, and be mindful of how much you are asking of the survey respondent.