
Engaging the public in the many interconnected democratic processes that affect their lives is far from simple. There can be no one-size-fits-all method of meaningful engagement because there is no “one-size” citizen.
In recent years, as more governments and public bodies have come to recognise the benefits of engaging the wider public in their work, the activities used to engage and consult citizens have developed significantly. However, despite these innovations, consulting the public via online surveys remains one of the most useful and widely used forms of public engagement.
Almost every consultation will at some point ask participants to take part in a survey. The data gathered in these surveys can be vital to the development of relevant policy, the response of public bodies, and the future of affected communities. That’s why it is so important that consultation planners get this step right.
Survey Design and Public Engagement

Effective survey design can improve both the quantity and quality of the responses received by a consultation. It is often the first point of contact the public has with a proposed project, plan or policy. Surveys therefore can be an important first impression for participants and the community at large. Most surveys are now conducted online, using a purpose-built GovTech platform that can ensure security and accessibility.
Surveys have become a vital tool for policy-makers, allowing them to gain a deeper understanding of the priorities and issues of the public. However, if well designed, a survey can be more than a one-way delivery of public sentiment. A good survey can be the start of a conversation.
When it comes to citizen engagement, surveys can serve a number of purposes:
- Providing information on a new project, plan or policy.
- Gathering views on a proposal from the public.
- Allowing policy-makers to easily see whether certain positions are more or less common in particular groups (e.g. are home-owners more opposed to a certain planning announcement? Are parents more in favour of certain policies?).
- Understanding the impact a planned change may have, for example if surveying the use of a certain public service.
- Giving decision-makers access to lived experience to sit alongside what may be a more professional or academic understanding of a given issue.
Usually, the more people who engage with a survey, the more representative the data will be. Beyond the raw number of responses, a survey that reaches participants from all affected communities and demographics will provide response data that is “wide” as well as deep. Engaging people from all affected communities may require additional effort and, where possible, a comprehensive community engagement strategy.
A survey that is poorly structured, inaccessible or otherwise difficult to engage with is likely to deter people from taking part. Even when people do take part, the responses are likely to be less clear, less relevant, or more difficult to analyse meaningfully. A well-designed survey, however, is likely to encourage greater participation.
How to Design a Survey: Best Practices for Better Public Engagement

Designing a survey that is both engaging for respondents and useful for decision-makers can be tricky. Surveys may seem like a relatively simple method of citizen engagement, but there are many common ways that governments and public bodies can go wrong when it comes to survey design. The following tips can help make sure your survey is engaging, inclusive, and gathers the data you really need.
Designing Good Questions
Perhaps unsurprisingly, the first step in creating an effective survey is designing good questions. A poorly designed question may produce vague, unhelpful or irrelevant answers, while a well-designed question can help collect meaningful, in-depth ones.
There are some common, basic mistakes that people tend to make when designing survey questions. These include:
- Using leading or loaded language that may influence how a participant answers a question. For example, “how beneficial will this transformative new policy be?”
- Using language that is vague or unclear. Note that this is not necessarily the same as asking open questions, which intentionally allow some individual interpretation.
- Asking about more than one issue in a single question. Example: “How affordable and useful do you think this service will be?”
- Not allowing for undecided, apathetic or ‘don’t know’ responses. If you only give people options that assume a pre-existing position, you may miss an issue with public understanding.
- Forcing participants to respond to all questions, especially if a survey is long.
Instead, when designing questions for a survey, it is important to ensure questions are clear and concise. They should use neutral language in plain English. They should give respondents the option to skip questions and to indicate where they lack existing knowledge. Survey questions should be accessible to everyone whose input you need for your survey to be meaningful.
Including Quantitative and Qualitative Questions
As well as thinking about the kind of language used when designing questions, it is also important to think about the type of question. Questions can generally be separated into one of two categories: quantitative and qualitative.
Quantitative questions are good at generating structured data. They can be used to measure self-reported sentiment, compare preferences, and easily calculate trends and differences between groups. Examples of quantitative questions include:
- Radio buttons or dropdown menus, for questions where you only want one choice selected.
- Checkboxes, for questions that may have multiple selections.
- Matrix questions for comparing and rating several factors.
- Numeric input fields, to capture things like quantities, ages or time living in an area.
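To make the distinction concrete, here is a hypothetical sketch of how those four question types might be represented as structured data. The field names are invented for illustration only and are not Citizen Space’s (or any platform’s) real schema.

```python
# Hypothetical sketch: quantitative question types modelled as structured data.
# Field names are illustrative only, not a real platform's schema.
from dataclasses import dataclass, field

@dataclass
class Question:
    prompt: str
    kind: str                    # "radio", "checkbox", "matrix" or "numeric"
    options: list[str] = field(default_factory=list)
    rows: list[str] = field(default_factory=list)  # only used by matrix questions
    required: bool = False       # avoid forcing answers, especially on long surveys

survey = [
    Question("Do you support the proposed cycle lane?", "radio",
             options=["Yes", "No", "Undecided", "Don't know"]),
    Question("Which local services do you use?", "checkbox",
             options=["Library", "Bus routes", "Parks", "None of these"]),
    Question("Rate each factor from 1 (poor) to 5 (excellent)", "matrix",
             options=["1", "2", "3", "4", "5"],
             rows=["Affordability", "Accessibility", "Usefulness"]),
    Question("How many years have you lived in the area?", "numeric"),
]
```

Because each question produces a predictable, structured answer, responses like these are straightforward to count, compare and chart across groups.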
Qualitative questions are more open-ended, and give respondents the ability to explain and describe in their answers. They are important for getting in-depth answers, particularly when engaging with complex topics or when fact-finding. Qualitative answers can also inform decision-makers of considerations that may otherwise not be captured with set responses.
While qualitative answers may be harder to turn into easily understood data, software like Citizen Space can now simplify this analysis by turning qualitative answers into quantifiable data, for example by identifying common words, performing sentiment analysis, or sorting responses thematically.
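As a simplified, self-contained illustration of what that kind of processing involves, the sketch below counts common words in free-text answers and tags each response against hypothetical keyword-based themes. It is not Citizen Space’s actual analysis engine; the responses, themes and keywords are made up.

```python
# Simplified illustration of turning free-text answers into countable data:
# word frequency plus keyword-based theme tagging. Themes and keywords are
# invented for this example.
from collections import Counter
import re

responses = [
    "The bus service is too expensive and unreliable.",
    "More frequent buses would help parents doing the school run.",
    "Ticket prices are fine but the timetable is unreliable.",
]

# Common-word frequency (ignoring very short words)
words = Counter(
    w for r in responses
    for w in re.findall(r"[a-z']+", r.lower())
    if len(w) > 3
)
print(words.most_common(5))

# Thematic sorting with hypothetical keyword lists
themes = {
    "cost": {"expensive", "prices", "ticket", "fare"},
    "reliability": {"unreliable", "timetable", "frequent"},
}
for r in responses:
    tokens = set(re.findall(r"[a-z']+", r.lower()))
    tags = [name for name, keywords in themes.items() if tokens & keywords]
    print(tags, "-", r)
```

Real platforms go well beyond simple keyword matching, but the principle is the same: free text becomes themes and counts that can sit alongside the quantitative results.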
Most good surveys contain both qualitative and quantitative questions. For the majority of consultations, a balanced approach can offer a richer understanding of participants and their communities than either type alone.
Linear vs Non-Linear Survey Structures
If you haven’t designed a survey before, linear and non-linear survey structures may not be terms you’ve come across. Fortunately, the definitions are quite simple.
Linear survey structures take all respondents through the same pages. Although not all participants in a linear survey structure necessarily see all the same questions (addressed in the next section), they do go through all pages in the same order. Linear survey structures require the respondent to click ‘next’ as they progress through sections, and any required questions must be answered before moving on to the next set.
Non-linear structures give respondents more choices about what questions they answer and in what order. A contents page is generated, allowing respondents to select the sections they wish to answer themselves. This is useful for long, detailed surveys where respondents may only want to interact with certain sections.
Both types of survey have their uses. A linear survey is useful where you want consistent, uniform data from as many participants as possible. A non-linear survey is useful for encouraging responses only on the issues participants are actually invested in, and it helps prevent ‘click-through’ responses that can skew data.
Making the Most of Skip Logic
Skip logic (also known as survey routing or branching) moves respondents through a survey depending on how they answered previous questions. Skip logic shows or hides certain questions or sections according to their relevance to the participant or the needs of the consultation.
A survey using skip logic will contain a mix of conditional and unconditional questions. Unconditional questions appear to everyone, regardless of any previous answer. Conditional questions will appear as a result of previous answers.
Example 1: A survey aimed at understanding crime in a certain area wants a specific understanding of how women feel walking alone at night. Questions relating to this will therefore only appear if respondents selected ‘female’ at the start of the survey.
Example 2: A survey wants to understand why some people feel negatively about a certain planning proposal. A text box asking respondents to explain their feelings will therefore only appear if they rated the development 1-2 out of 5 in a previous question.
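A minimal sketch of how this kind of routing might work behind the scenes is below. The question IDs and the `show_if` convention are invented for illustration; they are not how Citizen Space or any other platform actually implements branching.

```python
# Minimal sketch of skip logic: each conditional question carries a predicate
# over earlier answers and is only shown when that predicate is true.
# Question IDs and the show_if convention are hypothetical.

questions = [
    {"id": "gender", "text": "How would you describe your gender?"},
    {"id": "night_safety",
     "text": "How safe do you feel walking alone at night in the area?",
     "show_if": lambda a: a.get("gender") == "female"},            # Example 1
    {"id": "proposal_rating",
     "text": "How do you feel about the proposal? (1-5)"},
    {"id": "negative_reason",
     "text": "Please explain why you feel negatively about the proposal.",
     "show_if": lambda a: a.get("proposal_rating") in (1, 2)},      # Example 2
]

def next_questions(answers: dict) -> list[str]:
    """Return the questions a respondent should currently see."""
    return [
        q["text"] for q in questions
        if q["id"] not in answers                       # not yet answered
        and q.get("show_if", lambda a: True)(answers)   # unconditional, or condition met
    ]

# A respondent who selected 'female' and rated the proposal 2 out of 5
# would be shown both conditional follow-up questions.
print(next_questions({"gender": "female", "proposal_rating": 2}))
```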
Skip logic is a very useful feature, offered by digital engagement platforms like Citizen Space, for improving the respondent experience. Making sure people are only presented with questions that are relevant to them is a key part of good survey design.
Accessibility in Survey Design
Accessibility in public engagement is everybody’s responsibility. If an activity is aimed at the public, it must both understand and accommodate differences. That means everyone should be able to access and complete the survey regardless of their background, literacy level or need for assistive technology.
When designing a public engagement survey, it is good to keep accessibility and inclusivity in mind at all points. This means:
- Writing in plain language wherever possible.
- Supporting assistive technology like screen-readers and avoiding embedded media without alt-text.
- Offering alternative formats, including language translation.
- Making sure surveys work across devices, particularly mobiles.
When conducting a survey online, it is important that it is WCAG compliant. WCAG (the Web Content Accessibility Guidelines) is an internationally recognised set of guidelines aimed at improving web accessibility. Responsibility for adhering to this guidance lies not just with those designing a survey, but also with the platform hosting it. That is why it is important to use a recognised, purpose-built platform like Citizen Space to ensure accessibility needs are consistently met.
Closing the Feedback Loop

Once a survey has closed, there are still several important steps that need to be followed.
First of all, the responses need to be sorted and analysed. Too often, surveys are conducted without a proper plan for how the data will be used. Not only is this a waste of time and resources, it also diminishes trust in the process. Digital engagement platforms like Citizen Space can help, as they can do much of the work of analysing data and providing easily visualised results.
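As a rough, made-up example of the kind of aggregation that analysis might involve, the snippet below cross-tabulates a rating question by a demographic field so differences between groups are easy to see and to report back. The field names and data are hypothetical.

```python
# Rough sketch of cross-tabulating a rating by a demographic field,
# the sort of summary an analysis stage might produce. Data is invented.
from collections import defaultdict

responses = [
    {"homeowner": True,  "support_level": 2},
    {"homeowner": True,  "support_level": 1},
    {"homeowner": False, "support_level": 4},
    {"homeowner": False, "support_level": 5},
    {"homeowner": False, "support_level": 3},
]

totals = defaultdict(list)
for r in responses:
    group = "home-owners" if r["homeowner"] else "non-home-owners"
    totals[group].append(r["support_level"])

for group, ratings in totals.items():
    print(f"{group}: n={len(ratings)}, average support={sum(ratings)/len(ratings):.1f}")
```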
A survey that delivers useful results to decision-makers is a success in itself. However, wherever possible, it is always helpful to communicate the anonymised findings back to participants themselves, and particularly to demonstrate how the data is being used to inform future policy. Showing participants that their input mattered can make a great deal of difference to their willingness to participate in future research.
Citizen Space is the go-to platform for connecting governments, developers, and citizens. If you’d like to learn more about how our software supports survey design and the consultation process, book a free demo and we’ll walk you through it.
Sign up for the Delib newsletter here to get relevant updates posted to your email inbox.