TY - JOUR
T1 - Apples and Oranges
T2 - An International Comparison of the Public's Experience of Justiciable Problems and the Methodological Issues Affecting Comparative Study
AU - Pleasence, P.
AU - Balmer, N. J.
AU - Sandefur, R. L.
N1 - Publisher Copyright:
© 2016 Cornell Law School and Wiley Subscription Services, Inc.
PY - 2016/3/1
Y1 - 2016/3/1
N2 - Since the mid-1990s, at least 28 large-scale national surveys of the public's experience of justiciable problems have been conducted in at least 15 separate jurisdictions, reflecting widespread legal aid reform activity. While the majority of these surveys take their structure from Genn's Paths to Justice survey (1999), they vary significantly in length, scope, mode of administration, types of problems included, survey reference period, data structure, data analysis, and question formulation. This article draws on surveys from across the world, contrasting their methodologies, comparing their headline findings, and setting out the potential for bias as a consequence of methodological variation. The article also presents findings from five online experiments testing the impact of various question formulations on problem prevalence, use of advice, and formal processes. Specifically, the experiments test whether varying the reference period, describing problems as "legal," offering detailed as opposed to simple problem descriptions, and describing problems as "difficult to solve" had an impact on reported prevalence of justiciable problems, and whether presenting lists as opposed to a series of individual questions had an impact on reported use of advice and processes. The experiments demonstrated that modest differences in question formulation yield significantly different results. Specifically, alteration of survey reference period did not result in a proportional change in reported problem prevalence, introducing problems as either "legal" or "difficult to solve" significantly reduced reported prevalence, and introducing use of advice/processes as multiple questions rather than as lists significantly increased reported use. The risks involved in comparative analysis (and particularly in looking beyond methodology when attempting to explain jurisdictional variation) are discussed. In relation to future studies, the importance of understanding the impact of methodological change, learning the lessons of the past, making technical details transparent, and making data available is highlighted.
AB - Since the mid-1990s, at least 28 large-scale national surveys of the public's experience of justiciable problems have been conducted in at least 15 separate jurisdictions, reflecting widespread legal aid reform activity. While the majority of these surveys take their structure from Genn's Paths to Justice survey (1999), they vary significantly in length, scope, mode of administration, types of problems included, survey reference period, data structure, data analysis, and question formulation. This article draws on surveys from across the world, contrasting their methodologies, comparing their headline findings, and setting out the potential for bias as a consequence of methodological variation. The article also presents findings from five online experiments testing the impact of various question formulations on problem prevalence, use of advice, and formal processes. Specifically, the experiments test whether varying the reference period, describing problems as "legal," offering detailed as opposed to simple problem descriptions, and describing problems as "difficult to solve" had an impact on reported prevalence of justiciable problems, and whether presenting lists as opposed to a series of individual questions had an impact on reported use of advice and processes. The experiments demonstrated that modest differences in question formulation yield significantly different results. Specifically, alteration of survey reference period did not result in a proportional change in reported problem prevalence, introducing problems as either "legal" or "difficult to solve" significantly reduced reported prevalence, and introducing use of advice/processes as multiple questions rather than as lists significantly increased reported use. The risks involved in comparative analysis (and particularly in looking beyond methodology when attempting to explain jurisdictional variation) are discussed. In relation to future studies, the importance of understanding the impact of methodological change, learning the lessons of the past, making technical details transparent, and making data available is highlighted.
UR - http://www.scopus.com/inward/record.url?scp=84958019903&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84958019903&partnerID=8YFLogxK
U2 - 10.1111/jels.12097
DO - 10.1111/jels.12097
M3 - Article
AN - SCOPUS:84958019903
SN - 1740-1453
VL - 13
SP - 50
EP - 93
JO - Journal of Empirical Legal Studies
JF - Journal of Empirical Legal Studies
IS - 1
ER -