¿Cómo influye el tamaño del campo de texto en las respuestas a preguntas abiertas? Evidencias de un diseño experimental [How does the size of the text field influence responses to open-ended questions? Evidence from an experimental design]

  1. León, Carmen María
  2. Aizpurúa González, Eva
  3. Vázquez, David
Journal: REMA

ISSN: 1135-6855

Year of publication: 2018

Volume: 23

Issue: 2

Pages: 1-18

Type: Article

DOI: 10.17811/REMA.23.2.2018.1-18

Abstract

The visual design of questionnaires can affect the quality of the data obtained, especially when asking open-ended questions that respondents answer in their own words. In this paper, we analyze the effects of manipulating the size of the text boxes provided for answering a set of open-ended questions in a self-administered questionnaire about opinions of the Criminal Justice system in Spain. To this end, a split-ballot experiment was conducted, dividing the sample (N = 100) into two equivalent halves. One half received questionnaires with small answer boxes for the 16 open-ended questions, while the other half received questionnaires with larger boxes; the content of the questionnaires was otherwise identical. The results showed that participants who received larger text boxes provided longer answers. However, the manipulation of the text box did not influence (1) the number of issues addressed or (2) response times. The results and their implications for questionnaire design are discussed.
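
To make the split-ballot design concrete, here is a minimal, self-contained sketch of the kind of comparison the abstract describes: 100 simulated respondents are randomly split into two equivalent halves (small vs. large answer boxes) and mean answer lengths are compared with a Welch t statistic. All data, group sizes, and variable names are hypothetical illustrations; this is not the authors' instrument or analysis code.

```python
# Illustrative sketch only (hypothetical data): random split-ballot assignment
# and a Welch t statistic comparing answer lengths between the two conditions.
import math
import random
import statistics

random.seed(42)

# Randomly divide 100 respondents into two equivalent halves.
respondents = list(range(100))
random.shuffle(respondents)
small_box_ids, large_box_ids = respondents[:50], respondents[50:]

# Hypothetical answer lengths in words (stand-ins for real open-ended answers).
lengths_small = [max(0.0, random.gauss(12, 4)) for _ in small_box_ids]
lengths_large = [max(0.0, random.gauss(16, 5)) for _ in large_box_ids]

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

print("mean words, small boxes:", round(statistics.mean(lengths_small), 1))
print("mean words, large boxes:", round(statistics.mean(lengths_large), 1))
print("Welch t (small vs. large):", round(welch_t(lengths_small, lengths_large), 2))
```

In an actual analysis one would use the observed word counts per question and a full significance test (e.g., a t-test with its p-value or a non-parametric alternative); the sketch only illustrates the random assignment and the direction of the comparison.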

Bibliographic References

  • Barrios, M., Villarroya, A., Borrego, A., & Ollé, C. (2010). Response rates and data quality in web and mail surveys administered to PhD holders. Social Science Computer Review, 29, 208–220. doi: 10.1177/0894439310368031
  • Chaudhary, A. K. & Israel, G. D. (2016). Influence of importance statements and box size on response rate and response quality of open-ended questions in web/mail mixed-mode surveys. Journal of Rural Social Sciences, 31, 140–159.
  • Christian, L. M. & Dillman, D. A. (2004). The influence of graphical and symbolic language manipulations on responses to self-administered questions. Public Opinion Quarterly, 68, 57–80. doi: 10.1093/poq/nfh004
  • Christian, L. M., Dillman, D. A., & Smyth, J. D. (2007). Helping the respondents get it right the first time: the influence of words, symbols, and graphics in Web surveys. Public Opinion Quarterly, 71, 113–125. doi: 10.1093/poq/nfl039
  • Couper, M. P., Conrad, F. G., & Tourangeau, R. (2007). Visual context effects in web surveys. Public Opinion Quarterly, 71, 623–634. doi: 10.1093/poq/nfm044
  • Couper, M. P., Kennedy, C., Conrad, F. G., & Tourangeau, R. (2011). Designing input fields for non-narrative open-ended responses in web surveys. Journal of Official Statistics, 27, 65–85.
  • Denscombe, M. (2008). The length of responses to open-ended questions. A comparison of online and paper questionnaires in terms of a mode effect. Social Science Computer Review, 26, 359–368. doi: 10.1177/0894439307309671
  • Deutskens, E., de Ruyter, K., & Wetzels, M. (2006). An assessment of equivalence between online and mail surveys in service research. Journal of Service Research, 8, 345–355. doi: 10.1177/1094670506286323
  • Díez-Ripollés, J. L. & García-España, E. (2009). Encuesta a víctimas en España. Málaga: Instituto Andaluz Interuniversitario de Criminología.
  • Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys. The tailored design method (3rd Ed.). Hoboken, NJ: Wiley.
  • Emde, M. & Fuchs, M. (2012). Using adaptive questionnaire design in open-ended questions: A field experiment. Paper presented at the American Association for Public Opinion Research (AAPOR) 67th Annual Conference, San Diego, USA.
  • Fuchs, M. (2009). Differences in the visual design language of paper-and-pencil surveys versus web surveys: A field experimental study on the length of response fields in open-ended frequency questions. Social Science Computer Review, 27, 213–227. doi: 10.1177/0894439308325201
  • Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey methodology (2nd Ed.). Hoboken, New Jersey: Wiley.
  • Hedberg, E. C., Wallace, D., & Cesar, G. (2013). The effect of survey mode on socially undesirable responses to open-ended questions: online vs. paper instruments. Paper presented at the Annual Conference of the American Association for Public Opinion Research, Boston, MA.
  • Holland, J. L. & Christian, L. M. (2009). The influence of topic interest and interactive probing on responses to open-ended questions in web surveys. Social Science Computer Review, 27, 197–212. doi: 10.1177/0894439308327481
  • Israel, G. D. (2006). Visual cues and response format effects in mail surveys. Paper presented at the Annual Meeting of the Southern Rural Sociological Association, Orlando, FL.
  • Israel, G. D. (2010). Effects of answer space size on responses to open-ended questions in mail surveys. Journal of Official Statistics, 26, 271–285.
  • Israel, G. D. (2014). Using motivating prompts to increase responses to open-ended questions in mixed-mode surveys: Evidence on where the prompt should be placed. Paper presented at the annual conference of the American Association for Public Opinion Research (AAPOR), Anaheim, CA.
  • Keusch, F. (2014). The influence of answer box format on response behavior on list-style open-ended questions. Journal of Survey Statistics and Methodology, 2, 305–322. doi: 10.1093/jssam/smu007
  • Krosnick, J. A. & Presser, S. (2009). Question and questionnaire design. In Marsden, P. V. & Wright, J. D. (Eds.), The handbook of survey research (pp. 263–314). Bingley, UK: Emerald Group Publishing.
  • Kwak, N. & Radler, B. (2002). A comparison between mail and web surveys: Response pattern, respondent profile, and data quality. Journal of Official Statistics, 18, 257–273.
  • MacElroy, B., Mikucki, J., & McDowell, P. (2002). A comparison of quality in open-end responses and response rates between web-based and paper and pencil survey modes. Journal of Online Research. Available at http://www.websm.org/uploadi/editor/comparison.pdf
  • Maloshonok, N. & Terentev, E. (2016). The impact of visual design and response formats on data quality in a web survey of MOOC students. Computers in Human Behavior, 62, 506–515. doi: 10.1016/j.chb.2016.04.025
  • Mohr, A., Sell, A., & Lindsay, T. (2015). Thinking inside the box: Visual design of the response box affects creative divergent thinking in an online survey. Social Science Computer Review, 1–13. doi: 10.1177/0894439315588736
  • Richards, L. (2009). Handling qualitative data: A practical guide. London: SAGE.
  • Scholz, E. & Zuell, C. (2012). Item non-response in open-ended questions: Who does not answer on the meaning of left and right? Social Science Research, 41, 1415–1428.
  • Silverman, D. (2011). Interpreting qualitative data. London: SAGE.
  • Smith, T. W. (1993). Little things matter: A sampler of how differences in questionnaire format can affect survey responses. GSS Methodological Report, 78. Chicago: National Opinion Research Center.
  • Smyth, J. D., Dillman, D. A., Christian, L. M., & McBride, M. (2009). Open-ended questions in web surveys. Can increasing the size of answer boxes and providing extra verbal instructions improve response quality? Public Opinion Quarterly, 73, 325–337.
  • Stern, M. J., Dillman, D. A., & Smyth, J. D. (2007). Visual design, order effects, and respondent characteristics in a self-administered survey. Survey Research Methods, 1, 121–138.
  • Tourangeau, R., Couper, M. P., & Conrad, F. G. (2004). Spacing, position, and order: Interpretive heuristics for visual features of survey questions. Public Opinion Quarterly, 68, 368–393.
  • Zuell, C., Menold, N., & Körber, S. (2014). The influence of the answer box size on item nonresponse to open-ended questions in a web survey. Social Science Computer Review, 33, 115–122. doi: 10.1177/0894439314528091