Best Practices for Gathering Feedback with an Event App [VIDEO]

When it comes to launching surveys to collect feedback at your event, the faster you can launch them, typically the better the response! While event apps have made the process of launching surveys and collecting feedback faster (remember how painful paper surveys were?), the actual quality of the feedback is still dependent on the questions you ask.

That’s why we’ve put together the following best practices for gathering feedback using your event app. With your EventMobi event app, you can launch 3 different types of surveys: live polls, session surveys and event surveys. This video explains how to use the right kind of survey, at the right time, in the right way to improve the number and quality of your submissions. We hope it helps!

Getting Started (Correctly)

At EventMobi we think about survey feedback in three different ways: live polls, session surveys and event surveys. Determining what you’re trying to accomplish with the poll or survey will inform which method of feedback you use, as well as what you’re asking, when you’re asking and how you’re asking questions. We could be here all day going through which questions you should or shouldn’t ask at every stage of the event. However, nailing down your objectives for the survey or poll and working back (with the advice below in mind) will result in a polished poll or survey your attendees will be happy to complete. For example, if your objective is to gauge the quality of speakers, you should create simple star rating questions in a session survey and send it out directly following the session.

Live Polls

Usually shorter than surveys and tied directly to sessions in the event app, live polls should be treated as part of the session and driven by the speaker. They're particularly useful for real-time objectives, like taking the audience's temperature throughout the event, measuring interest in the current topic of discussion, or driving conversation toward a consensus. Live polls require advocacy and adoption from speakers, as they will be implementing them during their session.

Session Surveys

Also tied directly to sessions in the event app, session surveys should be sent out either near the end of the session or directly after it. Since attendees might be on a break and certainly on the move, the questions should be short and easy to answer (e.g. star rating, yes/no, multiple choice.) Session surveys should focus primarily on the session they're linked with, as that will be top of mind for attendees. Leading survey vendors offer content-specific questions so speakers can tailor the feedback to the topics discussed (e.g. a multiple choice question asking what the key takeaway of the session was.) It's also a great idea to encourage speakers to draw attention to the survey and to budget enough time in the corresponding session for attendees to fill it out before the next activity.

Event Surveys

Not tied to a specific session, event surveys should be used to gather more detailed information about the event as a whole, primarily before and after the event. Event surveys are usually the longest of the three types of feedback and the most appropriate place for open-text response questions. Since attendees will be receiving these surveys before and after the event, they will likely have more time to fill them out. However, filling out a survey may not be a priority for attendees at those times, so bringing their attention to it will go a long way. With EventMobi you can attach a survey to an Alert that's pushed through the event app to each attendee's phone.

Building a Better Survey or Poll

Technology improves the distribution and collection of surveys and polls, but it also introduces a constraint: attendees are viewing them primarily on mobile devices, which means smaller screens and more distractions. It's been said before but we'll say it again: the number of questions, the length of the questions and the detail in each question all have an impact on the quality of the feedback you're gathering. If the survey is too long, attendees are less likely to complete it; if the questions are too long, attendees are less likely to read them; and if there isn't enough detail in a question, the answers won't be useful anyway.

  • Keep to one topic per survey or poll.
  • Keep surveys short and to the point.
  • Keep questions as concise and specific as possible without excluding anyone.  

Bad Event Survey Questions

The responses your attendees are willing to submit freely are a valuable commodity, so getting the most out of every question is very important. In addition to avoiding long and vague questions, you should also avoid:

  • Poor phrasing of the question or using jargon (e.g. In the session earlier today were you excited about the POTUS video clip?)
  • Leading questions (e.g. Was the keynote speech amazing or unforgettable?)
  • Asking questions that don’t matter. Understanding your audience is integral to understanding their feedback (e.g. If you’re asking a conference full of engineers which sessions they enjoyed most, it’s reasonable to assume their answers will lean towards engineering sessions.)

Good Event Survey Questions

In addition to keeping questions simple and limited to one subject, your questions should also:

  • Focus on things that you have control of at the event (e.g. a multiple choice question asking if a particular session was an appropriate length.)
  • Give attendees the chance to answer fully by letting them articulate why they did or didn't feel a certain way (e.g. a simple follow-up question with an open-text response asking why they answered the way they did.)
  • Have a balanced and full range of responses so nobody feels forced into an answer that doesn't accurately represent their feelings (which will ruin your data.) If it's possible attendees simply don't know, let that be an answer option.

Strategic Tips

  1. Introduce submitting feedback early with a ‘how-to’ as part of the opening session and explain, or better yet show, exactly how to access the surveys and polls. Don’t just hit your audience with a 10-question open-text survey at lunch, as nobody will be interested in that.
  2. Include a test live poll or survey with your ‘how-to’ to get the audience familiar with the process and treat it like a soft introduction. Pique your audience’s interest with an entertaining or contentious question about a relevant or fun topic (e.g. asking a graphic design conference which typeface they enjoy using most.) Intersperse fun or thought-provoking questions throughout your surveys to keep attendees intrigued and wondering what the next question will be.
  3. If you’re trying to measure the effectiveness of an education session, simply asking attendees if they learned something isn’t going to reveal the quality of the session. Asking the audience to rate, on a scale of 1-10, how well they understand the topic both before and after the learning experience will give you a much more accurate gauge of how much learning actually took place.
  4. Introduce your event technology early to speakers who may be using it during their session and position it as a valuable addition to their session. Offer to collaborate on questions and involve the speaker as much as they’re comfortable with. Speakers need to have an interest in using the technology or they won’t care to implement it correctly, negating any impact. If speakers are just blowing through live poll questions without giving the audience enough time to reply, the audience will see submitting feedback as unimportant. If it seems annoying to the speaker, it’ll seem annoying to the audience.
  5. Whether the comments you receive are happy or unhappy, consider replying to feedback. It may seem daunting depending on the size of your event, and we wouldn’t suggest replying to every comment, but responding to genuine feedback will make your event stand out by ensuring your attendees feel heard.

P.S. For more information on how you can get creative with polls, surveys and feedback, visit https://www.eventmobi.com/live-polls-surveys/

Editor’s Note: This post has been updated from the original version published in January 2016.