Survey data are collected in a variety of ways: face to face, self-administered, by telephone, by mail, and over the Internet. Each mode has its own strengths and limitations. Choosing a method begins with understanding how the final data will be used, how members of the responding population prefer to reply, and how they can be reached. The choice must also weigh time, money, and personnel. Understanding these factors allows a researcher to choose the method best suited to the task.
The most successful data collection efforts combine methods, so that respondents can participate in ways most comfortable to themselves. Research methods should provide:
- A standard method for contacting informants
- A standard method of administering the survey
- Repeated attempts to make contact with informants
- An appropriate data collection period
- A standard set of questions and response options
- Neutral word choice in all materials
- Training and oversight for data collection personnel
- Protection of informant confidentiality
- Protection of any data files from loss, theft, or damage
- Respect for the dignity of each individual.
There is a substantial literature on the willingness of various groups to cooperate with in-person, mail, phone, and Internet methods. For example, Elizabeth Tighe and colleagues studied Jewish respondents, Jo Lindsay looked specifically at the challenges of reaching young adults, and D. A. Ashe and colleagues report on the challenges of obtaining physician cooperation. The mode will affect some responses, and some methods are not available to all potential respondents. Moreover, the mode will affect administration costs, which matter because most survey budgets are limited. Researchers should familiarize themselves with the research literature and with the intended respondent population’s access to the Internet or telephones before deciding which combination of methods will work best. Testing can help anticipate how the mode of data collection will affect responses and cooperation.
Before choosing a method, it is important to understand the population of interest; a survey of senior citizens, for example, calls for different administration choices than a survey of children. Research involving physicians indicates that, when given a choice, they are more likely to respond by mail than by other modes.
Mail
Don Dillman’s tailored design method (2000) is the standard procedure for mail surveys. This method involves:
- An advance letter introducing the study
- A second mailing containing the survey
- A reminder postcard or phone call
- A final version of the survey mailed in a special package, like overnight express
- A token of appreciation, such as a cash incentive
- A postage-paid return envelope.
When possible, mailings should be addressed to a particular person; word processors make it easy to personalize materials. A cover letter should describe the nature and sponsorship of the survey, assure confidentiality, and suggest a deadline. It is good practice to provide a way to contact the research team. Mail is helpful in surveying businesses and other institutions, though there is a risk that mail may be opened by someone other than the intended respondent; phone follow-up helps ensure delivery.
The primary costs of a mail survey are postage (or express service) and printing, especially in color. The overall cost depends on the number of pages to be printed and delivered, the number of attempts to reach each respondent, the amount of data to enter, and whether the researcher is willing to use premium delivery services. Optical scanning systems can reduce data-entry costs while limiting the nature of the data collected: scanning usually requires bubble sheets, whereas clerks can key-enter longer alphanumeric responses.
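The cost factors above can be combined into a rough budgeting sketch. All unit prices and the response-rate assumption below are hypothetical placeholders for illustration, not figures from the text.

```python
# Rough, illustrative mail-survey cost model. Every unit price and the
# expected response rate are invented assumptions for demonstration.
def mail_survey_cost(sample_size, pages, mail_attempts,
                     print_cost_per_page=0.10,
                     postage_per_piece=0.75,
                     return_postage=0.75,
                     incentive=2.00,
                     data_entry_per_case=1.50,
                     expected_response_rate=0.40):
    """Estimate the total cost of a mail survey from its main drivers:
    pages printed, contact attempts, data entry, and incentives."""
    # Printing and outbound postage scale with pages and attempts.
    outbound = sample_size * mail_attempts * (
        pages * print_cost_per_page + postage_per_piece)
    completes = sample_size * expected_response_rate
    returns = completes * return_postage           # postage-paid envelopes used
    incentives = sample_size * incentive           # token sent to each sample member
    entry = completes * data_entry_per_case        # keying completed questionnaires
    return outbound + returns + incentives + entry

total = mail_survey_cost(sample_size=1000, pages=8, mail_attempts=3)
```

A model like this makes it easy to see, for instance, how adding a fourth mailing attempt or four more pages changes the budget before committing to a design.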
In-Person
In-person interviewing is effective for very sensitive subject matter and for surveys that may last more than thirty minutes. It ensures that the correct respondent is interviewed, allows interaction with the respondent, and lets the interviewer code some observations without having to ask. In-person interviews suffer fewer skipped questions and broken-off interviews. The method also reduces data-entry errors, as trained staff enter most information directly into a laptop computer. Skilled interviewers can collect bio-specimens, abstract records, perform assessments, or conduct air or water sampling at the time of an interview. Modern computer devices allow recordings to be played, so that respondents can hear sensitive questions without feeling embarrassed in front of an interviewer.
In-person interviewing is labor intensive, however, and can easily cost more than $750 per completed interview. Consider the costs of travel, wages, interviewer training, the duration of interviews, and the number of visits needed to complete each interview. Costs can be controlled by interviewing at busy locations (train stations or shopping malls), but if considering this type of intercept interview, the loss of statistical precision should be weighed.
Telephone
Computer-assisted telephone interviewing (CATI) has been in wide use since the 1980s. The method is inexpensive for general-public surveys; costs are determined by the complexity of the programming and the number of attempts made to reach each sample member. Additional costs are associated with sampling. More sophisticated firms can generate their own random-digit-dial lists; others purchase lists from a vendor who generates the numbers and usually tries to eliminate cell numbers and other unwanted numbers, such as business numbers for residential surveys.
The benefit of CATI is that it is easy to program complex skip patterns for tailored follow-up questioning and to build in error checks, and it allows for constant monitoring of staff. Because telephone use is widespread, coverage error is in most cases small. CATI technology can also be enhanced with automated voice and touch-tone response features to help minimize respondents’ nervousness about answering sensitive questions. Because answers in CATI are entered directly into a database, researchers have almost instant results.
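The skip-pattern and error-check logic described above can be sketched in a few lines. The question names, answer codes, and valid ranges here are invented for illustration; a real CATI system would be far more elaborate.

```python
# Minimal sketch of CATI-style skip logic and answer validation.
# Question identifiers and valid answer ranges are hypothetical.
QUESTIONS = {
    "own_phone": {"text": "Do you have a landline? (1=yes, 2=no)",
                  "valid": {1, 2}},
    "calls_per_week": {"text": "About how many calls per week?",
                       "valid": range(0, 201)},
}

def validate(question, answer):
    """Error check: reject out-of-range answers before they enter
    the database, so illogical values never reach the data file."""
    return answer in QUESTIONS[question]["valid"]

def next_question(question, answer):
    """Skip pattern: ask the call-frequency follow-up only of
    respondents who report owning a landline."""
    if question == "own_phone":
        return "calls_per_week" if answer == 1 else None  # None = end
    return None
```

Because validation runs at entry time, an interviewer who mistypes an answer is prompted immediately, which is one reason CATI data are analyzable almost as soon as interviewing ends.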
With the proliferation of telemarketing phone banks (which sometimes take contract work from low-budget survey firms), CATI surveys have suffered dropping response rates. State and federal do-not-call (DNC) lists have reduced unwanted solicitation while making exceptions for research, but there is little evidence that DNC lists have improved response rates.
CATI surveys run into other barriers. Some sample vendors do not screen out unwanted phone numbers (e.g., disconnected lines, data lines). Caller ID and answering devices make it difficult to contact respondents who screen calls. Researchers should think about coverage and the precise unit of analysis for CATI surveys of the general population and then decide whether inclusion of cell numbers is appropriate.
Internet
Like CATI, Internet surveys allow for programmed skip commands, error checking for illogical or missed answers, and quick access to analyzable data. Graphics and hyperlinks can be added, and voice programs can be used to speak the questions aloud. The risks of Internet interviewing include hackers, fraudulent entries, and data interception. Vendors like SurveyMonkey make Web-based survey research relatively cheap for researchers who want an “off-the-shelf” survey. This type of interviewing is useful only when the population of interest is computer literate, when the researcher does not need to do upfront work such as developing and pretesting questions tailored to specific research needs, or when the researcher is willing to give up much of the day-to-day control of monitoring and supervising the research. Some associations have good e-mail lists, but in general e-mail samples should be treated with greater caution than random-digit telephone samples. Invitations to participate may also be delivered by mail with log-on instructions. The costs of Web-enabled surveys are about the same as those of other programmed data systems, and there will be maintenance costs. Web researchers should also be aware of security features that add to the protection of data and respondent confidentiality and prevent the spread of viruses, but that also affect overall Web survey costs.
Tracking And Disposition
The status of each case should be monitored so the case can be coded as completed, refused, ineligible, or given other codes that help the research team. This allows for appropriate follow-up or termination of data collection on individual cases. At the end of the data collection period, each case should be given a final disposition such as complete, incomplete, or ineligible. From these dispositions, a transparent coding classification and a response rate should be reported. The American Association for Public Opinion Research (AAPOR) provides standard disposition codes and formulas for calculating response rates.
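A disposition tally and response rate can be sketched as follows. The calculation below is patterned on AAPOR’s RR1 formula (completed interviews divided by all eligible and unknown-eligibility cases); the case counts themselves are invented for illustration.

```python
# Sketch of a final-disposition tally and a simple response rate in the
# spirit of AAPOR's RR1. All counts are hypothetical.
dispositions = {
    "complete": 620,       # completed interviews (I)
    "partial": 40,         # partial interviews (P)
    "refusal": 180,        # refusals and break-offs (R)
    "noncontact": 110,     # never reached (NC)
    "other_eligible": 10,  # other eligible nonresponse (O)
    "unknown": 90,         # eligibility never determined (UH/UO)
    "ineligible": 50,      # excluded from the denominator entirely
}

def response_rate_rr1(d):
    """Completes over everything except known-ineligible cases."""
    denom = (d["complete"] + d["partial"] + d["refusal"]
             + d["noncontact"] + d["other_eligible"] + d["unknown"])
    return d["complete"] / denom

rate = response_rate_rr1(dispositions)  # 620 / 1050
```

Note that ineligible cases are dropped from the denominator while unknown-eligibility cases are kept, which is what makes the rate conservative and the reporting transparent.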
Testing
Testing involves surveying a small number of respondents prior to the full data collection. It helps identify errors in questionnaire design and delivery mode, as well as problems with the sample, and can help predict the number of bad addresses or phone numbers in a sample. Testing is particularly helpful when respondents may be hard to find or might have difficulty with the questionnaire. Field tests generally run all survey procedures on a small subsample of the survey population; in some cases respondents are contacted afterward to discuss their experience with the staff. Quality control measures taken before going live identify nonworking phone numbers, duplicate cases, and bad addresses, and can help locate respondents who have relocated.
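Using a field test to project bad addresses in the full sample amounts to a simple proportion estimate, as in the sketch below. The pilot and sample sizes are invented for illustration.

```python
# Sketch: projecting bad addresses in the full sample from a small
# field test. All numbers are hypothetical.
pilot_size = 150
pilot_bad_addresses = 12

bad_rate = pilot_bad_addresses / pilot_size    # observed bad-address rate
full_sample = 5000
projected_bad = round(full_sample * bad_rate)  # expected bad addresses
```

An estimate like this tells the researcher roughly how much larger the initial sample must be drawn to end the field period with the intended number of deliverable cases.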
Bibliography:
- Dillman, Don A. Mail and Internet Surveys: The Tailored Design Method. New York: Wiley, 2000.
- Dillman, Don A., Glenn Phelps, Robert Tortora, Karen Swift, Julie Kohrell, and Jodi Berck. “Response Rate and Measurement Differences in Mixed Mode Surveys: Using Mail, Telephone, Interactive Voice Response, and the Internet.” Draft paper, n.d., www.sesrc.wsu.edu/dillman/papers/Mixed%20Mode%20ppr%20_with%20Gallup_%20POQ.pdf.
- Groves, Robert M. Survey Errors and Survey Costs. New York: Wiley, 1989.
- Shih, Tse-Hua, and Xitao Fan. “Comparing Response Rates from Web and Mail Surveys: A Meta-analysis.” Field Methods 20 (2008): 249–71.
- Sudman, Seymour, and Norman M. Bradburn. Asking Questions: A Practical Guide to Questionnaire Design. San Francisco: Jossey-Bass, 1982.