Computer-assisted web interviewing
Computer-assisted web interviewing (CAWI) is an Internet surveying technique in which the interviewee follows a script provided on a website. The questionnaires are made in a program for creating web interviews, which allows the questionnaire to contain pictures, audio and video clips, links to different web pages, and so on. The website can customize the flow of the questionnaire based on the answers provided, as well as on information already known about the participant. CAWI is considered a cheaper way of surveying because, unlike computer-assisted telephone interviewing, it does not require interviewers to administer the survey. With the increasing use of the Internet, online questionnaires have become a popular way of collecting information. The design of an online questionnaire has a dramatic effect on the quality of the data gathered. There are many factors in designing an online questionnaire; guidelines, available question formats, administration, quality, and ethical issues should all be reviewed. Online questionnaires should be seen as a subset of a wider range of online research methods.
Using online questionnaires
Advantages
- The administrator has greater flexibility in displaying questions. Questions can be displayed with:
- Check boxes
- Pull-down menus
- Pop-up menus
- Help screens
- Sub menus
- The online format allows responses to be received more quickly from subjects.
- This method is also cheaper to administer, as there are no costs for paper, printing, or postage.
- Since data are collected into a central database, analysis time is reduced.
- It is easier to correct errors on an online questionnaire, since the administrator does not have to reprint all the questionnaires for distribution.
Disadvantages
- Not everyone has access to the Internet, so the pool of potential respondents is limited.
- Many people are not receptive to completing questionnaires online.
- Studies indicate that the demographic that responds to online questionnaire invitations is generally biased toward younger people.
An online questionnaire needs to be carefully thought through before it is launched. There are several important considerations that should be kept in mind when creating an online questionnaire.
Collection and prioritization of data
- The objectives of the initial inquiry need to be reviewed to determine what information needs to be gathered.
- The required information should be ranked in order of significance in an unbiased manner. Topics should not lead respondents toward particular conclusions.
Online questionnaire format
- The questionnaire should begin with a short introduction that informs the subject why the questionnaire is being conducted.
- Questions should be written in the format that best facilitates understanding.
- In creating the layout of the online questionnaire, “smart branching” should be utilized to lessen complexity. For example, if a subject selects “yes” to a question, the questionnaire automatically jumps to the relevant follow-up question; if the subject selects “no”, the irrelevant follow-up questions are skipped.
- A brief “thank you” note should be included at the end of the questionnaire.
- As a rule of thumb, the questionnaire should take no more than five minutes to complete.
- A sample of the questionnaire should be distributed to at least five people, prior to publication on the web. Upon their completion of the questionnaire, feedback from the participants should be obtained.
- Information relating to whether they understood the main point of the questionnaire should be gathered.
- It is important to distinguish if participants had any difficulties with any of the questions.
- The feedback of the subjects should be utilized to make any necessary changes to the questionnaire.
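The “smart branching” guideline above can be sketched as a simple skip map. The following Python sketch is illustrative only; the question IDs and branching rules are hypothetical, not taken from any particular survey tool:

```python
# Minimal sketch of "smart branching": each question may define a skip
# rule mapping a given answer to the next relevant question.
# All question IDs and rules here are hypothetical examples.

QUESTIONS = {
    "owns_car": {
        "text": "Do you own a car?",
        "next": {"yes": "car_brand", "no": "commute_mode"},
    },
    "car_brand": {"text": "What brand is your car?", "next": {}},
    "commute_mode": {"text": "How do you usually commute?", "next": {}},
}

def next_question(current_id, answer, default=None):
    """Return the ID of the next relevant question for this answer."""
    rules = QUESTIONS[current_id]["next"]
    return rules.get(answer, default)
```

A respondent who answers "yes" to `owns_car` is routed to `car_brand`, while a "no" answer skips straight to `commute_mode`, so no respondent ever sees an irrelevant question.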
Response formats
In designing a questionnaire, the evaluation method should be kept in mind when choosing the response format. The following response formats can be used in online questionnaires.
Radio buttons
The respondent clicks on the circle that corresponds to the desired answer. A dot appears in the middle of the circle once an answer is chosen. Only one answer can be selected.
- Recommended when the answer choices are mutually exclusive.
- No default answer should be provided. If a default answer is provided, it may be mistaken as an answer if the respondent chooses to skip the question.
- Require precision in clicking.
Check boxes
The respondent clicks on the box next to the desired answer. A checkmark appears in the box once an answer is chosen. More than one answer can be selected.
- If there are many options, a simple matrix is recommended.
- If more than one answer can be checked, the instructions should say so explicitly.
- If a “none of the above” option is required, implement it with a radio button so that it cannot erroneously be checked alongside another answer.
Drop-down menus
The respondent clicks on the arrow at the far right side of the box, which displays a list of answers. A scroll bar may appear on the right-hand side if the list is long. The respondent clicks on an entry in the list to select it, and that answer then appears in the box. Only one answer can be selected for this type of question.
- Good option for long lists such as state/country of residence.
- Should be avoided for items where typing is faster, such as year of birth.
- In designing drop-down boxes, do not display a real answer as the default visible option; a preselected answer can mislead respondents and may be recorded even when no answer was actually chosen.
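To make the closed-ended formats concrete, the sketch below renders radio buttons, check boxes, and a drop-down menu as HTML fragments. The function name and parameters are hypothetical, not any survey library's API; the sketch simply illustrates how the widgets differ (a shared name for radio buttons, multiple selection for check boxes, a neutral first entry for drop-downs):

```python
# Illustrative renderer for the three closed-ended formats discussed
# above. Function and field names are assumptions, not a real API.

def render_question(name, options, fmt="radio", prompt="Select one"):
    if fmt in ("radio", "checkbox"):
        # Radio inputs share a name so only one can be selected;
        # check boxes with the same markup allow multiple selections.
        return "\n".join(
            f'<label><input type="{fmt}" name="{name}" value="{o}"> {o}</label>'
            for o in options
        )
    if fmt == "select":
        # The first visible entry is a neutral prompt, not a real
        # answer, so a skipped question is not recorded as an answer.
        items = [f'<option value="">{prompt}</option>'] + [
            f'<option value="{o}">{o}</option>' for o in options
        ]
        return f'<select name="{name}">\n' + "\n".join(items) + "\n</select>"
    raise ValueError(f"unknown format: {fmt}")
```

For a long list such as state of residence, `render_question("state", states, fmt="select")` keeps the page compact, while mutually exclusive choices would use `fmt="radio"`.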
Text boxes
Open-ended questions are those that allow respondents to answer in their own words. In an online survey, text boxes are provided with the question prompt so respondents can type in their answer. Open-ended questions seek a free response and aim to determine what is at the top of the respondent's mind. They are good to use when asking about attitudes or feelings, likes and dislikes, memory recall, opinions, or additional comments.
The respondent clicks inside the text box to place the cursor in it; once the cursor is blinking, the answer can be typed in.
- Size the text box according to the amount of information desired and required from the respondent.
- Provide concise and clear input instructions.
Rating scales
The respondent selects one value from a scale of possible options, for example: poor, fair, good, or excellent. Rating scales allow the person conducting the survey to measure and compare sets of variables.
- If using rating scales, be consistent throughout the survey. Use the same number of points on the scale and make sure meanings of high and low stay consistent throughout the survey.
- Use an odd number of points in the rating scale to make data analysis easier. Switching the rating scales around will confuse survey takers, leading to untrustworthy responses.
- Limit the number of items in ranking or rating scale questions to fewer than ten. These questions can become difficult to read after ten options. Longer rating or ranking questions can also cause display issues in some environments.
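The scale-consistency rules above can be checked mechanically before a survey is published. The helpers below are a hypothetical sketch, not part of any survey package:

```python
# Sketch of checks for the rating-scale guidelines above: a consistent,
# odd number of points across all scales, and fewer than ten items per
# ranking/rating question. Function names are illustrative assumptions.

def validate_scales(scales):
    """scales: one list of scale points per rating question."""
    lengths = {len(points) for points in scales}
    if len(lengths) != 1:
        raise ValueError("use the same number of points on every scale")
    if lengths.pop() % 2 == 0:
        raise ValueError("use an odd number of points")
    return True

def validate_item_count(items):
    """items: the rows of a single ranking/rating question."""
    if len(items) >= 10:
        raise ValueError("limit ranking/rating questions to fewer than ten items")
    return True
```

Running such checks in a pretest catches, for example, a survey that mixes a 4-point scale with 5-point scales elsewhere.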
Response rates
Response rates are frequently quite low, and there is a danger that they will continue to drop due to over-surveying of web users.
Jon Krosnick argues that the following three factors determine the success of a questionnaire and the likelihood of achieving decent response levels:
- Respondent ability
- Respondent motivation
- Task difficulty/questionnaire design
Bosnjak and Tuten argue that there are at least seven ways in which online surveys are responded to, and establish the following typology:
1. Complete responders view all questions and answer all of them.
2. Unit nonresponders are individuals who do not participate in the survey at all. There are two variations: an individual may be technically hindered from participating, or may purposefully withdraw after the welcome screen is displayed but before viewing any questions.
3. Answering drop-outs provide answers to the questions displayed, but quit before completing the survey.
4. Lurkers view all of the questions in the survey, but do not answer any of them.
5. Lurking drop-outs combine 3 and 4: such participants view some of the questions without answering, and quit before reaching the end.
6. Item non-responders view the entire questionnaire, but answer only some of the questions.
7. Item non-responding drop-outs combine 3 and 6: individuals displaying this behavior view some of the questions, answer some but not all of those viewed, and quit before the end.
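Given per-respondent counts of questions viewed and answered, the typology can be expressed as a small classification function. The sketch below is illustrative: the category names follow the definitions above, but the function itself is an assumption, not Bosnjak and Tuten's own operationalization:

```python
# Hypothetical sketch: classify a respondent into the Bosnjak and Tuten
# typology from how many questions were viewed and answered, and
# whether the respondent reached the end of the survey.

def classify(total, viewed, answered, reached_end):
    if viewed == 0:
        return "unit nonresponder"          # never saw a question
    if reached_end and viewed == total:
        if answered == total:
            return "complete responder"     # viewed and answered all
        if answered == 0:
            return "lurker"                 # viewed all, answered none
        return "item non-responder"         # viewed all, answered some
    # Quit before the end of the survey:
    if answered == 0:
        return "lurking drop-out"           # viewed some, answered none
    if answered == viewed:
        return "answering drop-out"         # answered all viewed, then quit
    return "item non-responding drop-out"   # answered some viewed, then quit
```

Applying such a function to paradata (page views and submitted answers) lets an administrator report, for instance, how many invitees were lurkers versus answering drop-outs.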
Administration
Once the questionnaire is designed, it must be administered to the appropriate sample population for data collection. Attracting the appropriate target audience often requires advertisement. Various methods are used to attract participants:
- bulletin boards
- mass emails
- advertisements in commercial areas
- monetary incentives
- discounts on company products
These methods usually help attract willing participants, who ultimately provide better-quality data than reluctant participants.
The location of administration may be a factor if a specific environment is required. A quiet environment may be needed for questions that require a certain amount of concentration, and a secluded environment may be needed to protect sensitive information provided by the participant; security measures may also need to be added to the software in these cases. In contrast, online questionnaires may also be very informal and relaxed, and can be completed in the comfort of someone’s home.
Quality
Questionnaire quality can be measured through the value of the data obtained and participant satisfaction. To maintain high quality, questionnaire length, conciseness, and question sequence should be considered. First, questionnaires should only be as long as they need to be. Second, conciseness can be achieved by removing redundant and irrelevant questions, which add frustration for the participant but no value to the research. Finally, placing questions in a logical sequence gives participants a better mental map as they fill out the questionnaire; moving randomly between subjects or ordering answers non-intuitively can confuse them.
Ethical issues
Ethical issues should be considered when gathering data from a target audience. Below are common points to keep in mind concerning the rights and interests of the participant.
- Participants should not be obliged to answer any of the questions.
- Incentives to take a survey should be used sparingly.
- Questionnaires should have the option to be anonymous.
- Confidentiality must be maintained on certain questionnaires. Identification may be required on questionnaires that need follow-up, though in this case the administrator may choose to use identifying numbers rather than names. The participant should then fully understand what the number is used for and why it is there.
- Questions should offer an “I don’t know” option or one that denotes neutrality, so that a participant who is ignorant of, or neutral on, a topic is not forced to provide inaccurate data.
- Questions should not trick the participant. They should be worded clearly; the participant should feel comfortable and know exactly what he or she is responding to.
- Participants should, in most cases, know why the questionnaire is being conducted and what the information will be used for.
- In some cases, the questionnaire should be reviewed by an ethics committee or outside party. This is particularly important if the questionnaire involves sensitive information or a topic that may make some participants uncomfortable.
See also
- Computer-assisted personal interviewing
- Official statistics
- Online interview
- Paid survey
- Questionnaire construction
References
- Sharp, H., Rogers, Y., Preece, J., Interaction Design: Beyond Human-Computer Interaction. John Wiley & Sons, Inc. 2002
- Reips, U.-D. (2000). The Web Experiment Method: Advantages, disadvantages, and solutions. In M. H. Birnbaum (Ed.), Psychological experiments on the Internet (pp. 89-118). San Diego, CA: Academic Press.
- Bradburn, Norman M., Sudman, Seymour, Wansink, Brian, Asking Questions: The Definitive Guide to Questionnaire Design – For Market Research, Political Polls, and Social and Health Questionnaires. Jossey-Bass. 2004
- Shatz, Itamar (2017). "Fast, free, and targeted: Reddit as a source for recruiting participants online" (PDF). Social Science Computer Review. 35 (4): 537–549.
- Online Questionnaire Design Guide, "Web Based Questionnaires" [cited Mar 10, 2007]. Available HTTP
- StatPac, "Questionnaire Design - General Considerations" [cited Feb 24, 2007]. Available HTTP
- Presser, Stanley, Rothgeb, Jennifer M., Couper, Mick P., Lessler, Judith T., Martin, Elizabeth, Martin, Jean, Singer, Eleanor, Methods for Testing and Evaluating Survey Questionnaires. John Wiley & Sons, Inc. 2004
- Groves, Robert M., Fowler, Floyd J., Couper, Mick P., Lepkowski, James M., Singer, Eleanor, Tourangeau, Roger, Survey Methodology. John Wiley & Sons, Inc. 2004
- National Research Council of Canada, "Online Questionnaire Design" [cited Mar 10, 2007]. Available HTTP Archived 2007-05-15 at the Wayback Machine
- QuestionPro, "Preparing an Online Questionnaire - How to Conduct an Online Questionnaire" [cited Mar 10, 2007]. Available HTTP
- Survey Monkey, Smart Survey Design (2007), http://s3.amazonaws.com/SurveyMonkeyFiles/SmartSurvey.pdf
- See the archived list of Krosnick's publications (archived from the original on 2007-08-19; retrieved 2008-02-20).
- Bosnjak, M. and Tuten, T. L. (2001) Classifying response behaviors in web-based surveys, Journal of Computer-Mediated Communication, 6, 3. http://jcmc.indiana.edu/vol6/issue3/boznjak.html
- Couper, Mick P., Baker, Reginald P., Clark, Cynthia Z. F., Martin, Jean, Nicholls, William L., O'Reilly, James M., Computer Assisted Survey Information Collection. John Wiley & Sons, Inc. 1998
- Norman, Donald A. The Design of Everyday Things. Basic Books. 2002