Principles of good content design
Good questionnaire design involves:
- Background information, including the purpose of the research, the sponsor, the means of returning the questionnaire, reassurances of confidentiality (if appropriate), the time it will take to complete the questionnaire and details of how to contact the researcher;
- Clear instructions for completing each question, with examples if necessary;
- Allocation of adequate time for producing and piloting an initial draft, redesigning and producing the revised version and for pursuing non-respondents;
- Care with questions to avoid duplicate or unnecessary questions; avoid questions that will upset or irritate the respondent; avoid questions that respondents will not have the knowledge or experience to answer; avoid leading, biased or ambiguous questions; use non-sexist, non-racist language;
- Use of a variety of question types, including open and closed questions, ranking and rating questions, open-ended responses and multiple-selection questions (but bear in mind that for online questionnaires open-ended questions are less likely to be completed and may result in more item non-response);
- Use of plain, easy to understand language and avoidance of technical terms, jargon and acronyms;
- Organisation of questions into logical groups, with important questions asked first and questions sub-divided into sections;
- Use of a simple style (but if colour is essential, use of opponent colours such as yellow and blue) and a 12- or 14-point font;
- Finishing the questionnaire with a place for respondents to add comments, with a thank-you for assistance and with details of how and when participants can get a copy of the completed survey.
Some general content design principles are noted below.
Welcome screen
No matter how respondents are directed to or locate the online questionnaire, it is important to ensure that there is a welcome message that informs them that they have come to the correct place. This should outline the purpose of the questionnaire, make respondents feel that their contribution is important, and emphasise how easy it is for them to participate. It should also include the information necessary to comply with ethical guidelines (see example). The welcome message should be as simple as possible and avoid unnecessarily complex information such as lists of instructions on how to complete the questionnaire. The welcome message/page is also the appropriate place to locate login facilities for online questionnaires where access is to be restricted in this way. Where necessary, the welcome screen should also explain the ideal screen configuration for viewing the questionnaire and provide details of any other requirements (e.g. a PDF viewer or Flash player).
Example
The following welcome page is taken from a psychology study run at the Universities of Cambridge and Cardiff early in 2005. It was carried out by Martin Bruder under the supervision of Prof Tony Manstead. The content of the page was designed to be in line with the ethics guidelines of the American Psychological Association and the British Psychological Society, the US government guidelines on the use of human subjects (informed consent) and the additional requirements of the Social Psychology Network for listing online studies. The main characteristics include:
- Short description of the content of the study;
- Information on the Institutional Review Board that has approved the study;
- Contact addresses for pertinent questions about the research and research participants' rights;
- Expected duration of study;
- Statement concerning the voluntary character of participation and the right to withdraw from the study at any stage;
- Minimum age for participation;
- Assurance of anonymity of the data;
- Note of caution about data security;
- Description of foreseeable risks and benefits;
- Contact information of principal researcher and supervisor.
The study itself investigated what participants imagine themselves feeling and thinking in short written scenarios. It was particularly concerned with how far these responses are influenced by the emotional behaviour of a second person described in each scenario and by the supposed relationship with this person.
[Screenshot of the study's welcome page]
Question type
The highly flexible and graphical nature in which materials can be delivered over the web means that it is tempting to experiment with novel formats for presenting questions in online questionnaires. For example, Dillman et al. (1998, 8) detail some of the unconventional ways in which they have seen questions formatted in web surveys. However, respondents are more likely to find the questionnaire easy to navigate and complete if its layout and question formats are similar to those used in onsite questionnaires. This is because respondents can concentrate on answering the actual questions rather than getting to grips with an unfamiliar question or response mode. Questions should be presented in the most appropriate formats rather than being driven by the technology. Additionally, the choice of question type will depend on the aims of the research and the characteristics of the respondents. For example, highly computer-savvy respondents will have fewer problems with more complex question formats, since such users are likely to be familiar with these functions from other applications.
For email questionnaires, question types are generally limited to open and closed types, but for web-based questionnaires a greater variety of question types can be used: open, closed, pull-down menus, buttons and slider bars. Generally, open-ended questions are less likely to be completed and result in more item non-response. Closed questions must be employed with care in email questionnaires, as the researcher cannot place any limits on the nature of the response, so respondents may provide multiple responses to a question that requires a single response. In web-based questionnaires closed questions are less problematic, as text input fields can be programmed to accept only appropriate responses. Pull-down menus can also be used to allow multiple responses (usually by holding the 'shift' or 'control' key while clicking the mouse) and reduce the space required for the questionnaire, but this can decrease usability and increase completion time (Dillman 2000). Click tags can be either radio buttons or check boxes. Radio buttons are circular tags that fill in when one option is selected. Although radio buttons do not allow multiple responses, they do produce fast and reliable responses (Best and Krueger 2004). Check boxes are square tags that display a mark when selected and can allow multiple responses, which is useful for rating questions. Click tags are easy to understand and use, but take up a lot of space and require good hand/eye coordination. Slider bars align response options along a track containing a pointer bar. They are particularly useful for rating scales because they offer a continuum (Arnau et al. 2001), but they require good technical skills, may not appear identically in different browsers, and make non-response difficult to determine. When considering ways to present 'one answer from many' questions, it is vital to consider usability and measurement error. The use of drop-down boxes, for example, is particularly problematic with a scroll-wheel mouse, as using the scroll wheel while the drop-down list is selected can lead to participants inadvertently changing their response (Healey, 2007).
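By way of a rough illustration, a minimal HTML sketch of these common web question formats is given below. The question wording, field names and submission script name are hypothetical, not taken from any particular survey:

    <!-- Minimal sketch of common web question formats (hypothetical content) -->
    <form action="submit.cgi" method="post">
      <!-- Radio buttons: only one answer can be selected -->
      <p>Do you study full-time?</p>
      <label><input type="radio" name="fulltime" value="yes"> Yes</label>
      <label><input type="radio" name="fulltime" value="no"> No</label>

      <!-- Check boxes: multiple answers allowed -->
      <p>Which of the following do you use? (tick all that apply)</p>
      <label><input type="checkbox" name="uses" value="email"> E-mail</label>
      <label><input type="checkbox" name="uses" value="web"> The web</label>

      <!-- Pull-down menu: saves space, one answer from a long list -->
      <p>Year of study:</p>
      <select name="year">
        <option value="">Please choose...</option>
        <option value="1">First year</option>
        <option value="2">Second year</option>
        <option value="3">Third year</option>
      </select>

      <!-- Text input restricted in length so only short, appropriate entries fit -->
      <p>Age: <input type="text" name="age" size="3" maxlength="3"></p>
    </form>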
Instructions
When giving instructions for various sections of an online questionnaire, it is useful to divide the information into sections to maximise the chance that users will be able to obtain guidance when it is required. If a user experiences difficulty in using the questionnaire or is left unsure how to proceed, the chance of non-response increases.
Care should be taken with each of the following key sections of instructions:
General completion instructions
These should include adequate detail to allow the user to begin. However, it is important not to overload the respondent in the initial stages of completing a questionnaire.
It may be beneficial to highlight key instructions using a particular background or a border to distinguish them from the questions and provide continuity.
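As a sketch of one way to achieve this, a simple style rule (the class name here is hypothetical) can give every instruction box the same background and border throughout the questionnaire:

    <style type="text/css">
      /* Hypothetical class used to set instructions apart from questions */
      .instructions {
        background-color: #ffffcc;  /* pale background behind the instruction text */
        border: 1px solid #999999;  /* light border, repeated on every page for continuity */
        padding: 10px;
      }
    </style>

    <div class="instructions">
      Please answer every question in this section before selecting 'next'.
    </div>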
Specific instructions for different question types
These instructions are provided to make the method of answering a particular question clear, e.g.
Please answer the following question by selecting the 'Yes' or 'No' button.
They may need to be more or less explicit depending on how conventional the question type is and on the likely technical skill of the target group. Usability testing is likely to help identify points of potential confusion or inadequate instruction.
It may be beneficial to provide these as optional instructions which can be accessed by the user when required or delivered where a user experiences difficulty. This can be done by offering 'alert-box' instructions [example], or by providing instructions in 'pop-up' windows [example]. If using pop-up windows, it is important to be aware that many users will block them to prevent pop-up ads being displayed on their computers. If you plan to use pop-up windows, it is a good idea to inform users that they are essential to the working of the survey and that they will need to disable any pop-up blocking software while working on it.
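The sketch below suggests how both approaches might be implemented; the function names, the instruction wording and the file name are illustrative only:

    <script type="text/javascript">
      // Show a brief instruction in an alert box when the user asks for help
      function showHint() {
        alert("Select one button on each row, then choose 'next' to continue.");
      }

      // Open fuller instructions in a small pop-up window
      // (this will be suppressed if pop-up blocking software is active)
      function showInstructions() {
        window.open("instructions.html", "instructions",
                    "width=400,height=300,scrollbars=yes");
      }
    </script>

    <a href="#" onclick="showHint(); return false;">How do I answer this question?</a>
    <a href="#" onclick="showInstructions(); return false;">Full instructions</a>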
See 'Key design issues' section of the 'Technical guide' module for further details of how these instructions can be implemented.
Submission instructions
Clear instructions at the final stage of a questionnaire can improve response rates and reduce the incidence of multiple submissions or the submission of invalid data.
Before submission, it is important to inform the user whether the submit button should be pressed only once and whether sections of the questionnaire have been missed or answered incorrectly. Appropriate validation routines prior to the point where the user attempts to submit the form are vital in maximising success. After submission, it is also important to confirm to the user that the form has been submitted and to express gratitude.
See the 'Form validation' section of the 'Technical guide' module for further information about validation and submission.
It is also essential to reassure the user by informing them exactly what information will be saved, reminding them of the purposes to which it will be put, and providing opportunities to limit the submission of personal details if required.
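A minimal sketch of how this might be handled on the client side is given below. The field names and the script the form submits to are hypothetical, and any such checks would still need to be repeated on the server:

    <script type="text/javascript">
      function checkAndSubmit(form) {
        // Validation before submission: make sure a required question has been answered
        if (form.elements["age"].value == "") {
          alert("Please enter your age before submitting the questionnaire.");
          return false;                       // stop the submission
        }
        // Guard against multiple submissions: disable the button after the first click
        form.elements["submitButton"].disabled = true;
        form.elements["submitButton"].value = "Sending, please wait...";
        return true;                          // allow the form to be submitted once
      }
    </script>

    <form action="save.cgi" method="post" onsubmit="return checkAndSubmit(this);">
      Age: <input type="text" name="age" size="3" maxlength="3">
      <input type="submit" name="submitButton" value="Submit">
    </form>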
Examples
Instructions on how to submit the data, given at the end of a questionnaire:
You have now completed the questionnaire. Press the 'submit' button to anonymously send the data.
Please only press the submit button once and wait until the last page of the study appears.
Your data will be saved which might take up to 30 seconds...
Final page which appears after the submission:
Thank you very much for your help in our research.
The data that has been collected as a result of your participation will be held anonymously.
If you chose to supply your e-mail address for further contact, this will be stored separately from your data. It will only be used to contact you for this particular purpose and will not be shared with any third party.
In case you have any comments on this study, we would be grateful, if you would share them with us by completing the textbox below or by contacting the authors via e-mail:
Once again, thank you very much for supporting our research.
Length
Keep the questionnaire as short as possible. Long, time-consuming questionnaires are likely to result in fewer people wanting to participate and fewer people completing it. To overcome these problems, ensure that the invitation to participate and the welcome message provide realistic details about how long the questionnaire will take respondents to complete. It is usually better to provide this information as a time rather than as a number of questions, because some online question formats and structures allow a surprisingly large number of questions to be completed in a relatively short period of time. For example, respondents typically took less than 13 minutes to answer 119 questions in a web-based survey about Student Experiences at the University of Leicester. As a general rule, Crawford et al (2001) recommend that online questionnaires should take respondents no more than 10 minutes to complete.

Providing respondents with a regular indication of how far through the questionnaire they are can also help participant retention. This can be achieved in various ways. A progress indicator can be used with little technical difficulty, though it may affect download speed and there is a danger that some users will find it distracting. Simpler approaches include structuring the questionnaire into different sections (each of which may contain several questions) and indicating this structure to respondents both in the welcome message and at relevant points through the questionnaire.
Example
The following is an example of extracts from instructions designed to highlight the structure of the questionnaire and make respondents aware of progress.
Instructions in the welcome message:
“This questionnaire contains three sections. The first asks you about your experiences as a student during this academic year, the second about your overall university experience and the third about your plans for when you graduate. Completing these three sections will take around 10 minutes.”
At the beginning of the first section: “Section 1 of 3: Your experiences as a student during this academic year.”
Progress bar
The following is an example of a progress indicator, to give an indication of what is involved. Select 'next' to simulate progression from page to page of an online questionnaire.
See the 'Technical guide' module for details of how to implement a progress bar.
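As a very rough sketch of the idea (the page numbers and the element name below are hypothetical), a simple textual indicator can be updated from each page of the questionnaire:

    <div>
      Progress: <span id="progressText"></span>
    </div>

    <script type="text/javascript">
      // Hypothetical position: this would be page 2 of a 6-page questionnaire
      var currentPage = 2;
      var totalPages = 6;
      var percent = Math.round((currentPage / totalPages) * 100);
      document.getElementById("progressText").innerHTML =
          "page " + currentPage + " of " + totalPages + " (" + percent + "% complete)";
    </script>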
Multi-media stimuli
Multi-media stimuli can be added to web-based questionnaires to test certain reactions, to improve design or to clarify instructions. However, they can produce the most technical difficulties, in terms of download time, differences in appearance between browsers and the skills required of respondents, and they can also have a major influence on the responses given (Dillman and Smyth, 2007). It is therefore best to use multi-media stimuli only when they are essential to the research project.
Static formats include pictures, photos and graphics. As a minimum, an image of the institution the researcher is affiliated to will probably be necessary for research legitimacy, but graphics may also be necessary as an essential feature of the research. In all cases, it is important to ensure that images are provided in an appropriate format (in the vast majority of cases, GIF or JPEG), and to be aware of the effect that size and compression rates will have on file size, and hence on download time. Keeping download times to a minimum is an important factor in reducing non-response.
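For instance, stating an image's dimensions and providing alternative text helps the page lay out predictably while the file downloads; the file name and sizes below are purely illustrative:

    <!-- A compressed GIF or JPEG of known dimensions keeps download time short -->
    <img src="logo.gif" width="150" height="60" alt="University logo">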
Example
The following examples illustrate the time needed to download two different questions. The first consists of text only, while the second also includes graphics. The radio buttons are shown for illustration only and are not active.
Static format (Text-only)
Approx file size: 0.05 KB
Approx time to download:
56KB Modem: 0.1 secs
Look at the following words, and give each a score according to how positive you feel about it, by selecting the radio buttons underneath.
[Rating question: the words Aluminium, Tin, Silver, Magnesium, Copper and Brass, each with a row of radio buttons underneath]
Static format (Graphics)
Approx file size: 9.4KB
Approx time to download:
56KB Modem: 2 secs | Basic broadband connection: 0.4 secs | University connection: 0.2 secs
Look at the following shapes, and give each a score according to how positive you feel about it, by selecting the radio buttons underneath each.
[Rating question: six shapes presented as graphics, each with a row of radio buttons underneath]
Streaming formats include audio, video and animation. In addition to issues of file size, which can be particularly problematic in the case of video, there are a number of further aspects to consider. The use of sound and video may require access to computers with relatively high minimum specifications or with particular equipment such as speakers or specific software. It may also be extremely difficult to predict how the information will appear on the respondent's computer. Given this, multi-media should only be considered if it is essential to the research and, in certain cases, it may be necessary to carry out the research in a controlled environment such as a university computer lab to ensure validity of results (though, of course, this is likely to negate many of the benefits of carrying out research online).
Where multi-media is used, it is important to use formats that are likely to be as widely distributed as possible, simple to download and install, and available free-of-charge, such as 'Macromedia Flash'. It is also a good idea to attempt to maximise the chances of the user having a trouble-free experience. Methods of achieving this include adding an easily accessible link to any plug-in software required, clearly specifying minimum specifications required for the questionnaire, and using 'browser sniffing' to test whether or not the user has certain technology installed. These methods can make the inclusion of multi-media realistic in cases where absolute control over how it appears is less important to the research.
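The sketch below suggests one way such a check might look; it tests whether the browser reports the Flash plug-in via navigator.mimeTypes (this works in Mozilla- and Opera-type browsers, while Internet Explorer of this period needs a separate ActiveX-based test), and the page names are hypothetical:

    <script type="text/javascript">
      // Test whether the browser reports the Flash plug-in before loading the stimulus
      var hasFlash = false;
      var flashMime = navigator.mimeTypes &&
                      navigator.mimeTypes["application/x-shockwave-flash"];
      if (flashMime && flashMime.enabledPlugin) {
        hasFlash = true;
      }

      if (hasFlash) {
        window.location = "question_flash.html";   // multimedia version of the question
      } else {
        window.location = "question_plain.html";   // text-only fallback with a link to the plug-in
      }
    </script>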
Example
The following example illustrates the time needed to download a question which includes Macromedia Flash streaming multimedia content. The radio buttons are shown for illustration only and are not active.
Streaming format
Approx file size: 131 KB
Approx time to download:
56KB Modem: 29 secs | Basic broadband connection: 4 secs | University connection: 1.5 secs
Listen to the following sounds by selecting the buttons. Give each a score according to how positive you feel about it, by selecting the radio buttons underneath.
[Rating question: six sound clips played by selecting buttons, each with a row of radio buttons underneath]
Good design for online questionnaires: a checklist
- Ensure consistency of appearance with different browsers and computers;
- Include a welcome screen;
- Provide clear and precise instruction on how to complete and submit the questionnaire;
- Keep the questionnaire as short as possible (10 minutes maximum);
- Include a progress indicator to reduce drop-outs;
- Use colour sparingly (yellow and blue are most effective);
- Use a 12- or 14-point font such as Times New Roman, Arial or Verdana;
- Use a variety of question types appropriate to the research question and type of respondent;
- Only use multi-media stimuli when essential to the research project;
- Pilot and do usability testing (see the 'Implementation: Piloting, evaluation and analysis' section).