
British Journal of Social Work (2013) 43, 1024–1038

Advance Access publication July 1, 2013

Research Note

Developing a Reporting Guideline for Social and Psychological Intervention Trials

Paul Montgomery, Sean Grant*, Sally Hopewell, Geraldine Macdonald, David Moher, and Evan Mayo-Wilson

*Correspondence to Sean Grant, Centre for Evidence-Based Intervention, University of Oxford, Barnett House, 32 Wellington Square, Oxford OX1 2ER, UK. E-mail:

Abstract

Social and psychological interventions are often complex. Understanding randomised controlled trials (RCTs) of these complex interventions requires a detailed description of the interventions tested and the methods used to evaluate them; however, RCT reports often omit, or inadequately report, this information. Incomplete and inaccurate reporting hinders the optimal use of research, wastes resources, and fails to meet ethical obligations to research participants and consumers. In this paper, we explain how reporting guidelines have improved the quality of reports in medicine, and describe the on-going development of a new reporting guideline for RCTs: CONSORT-SPI (an extension for social and psychological interventions). We invite readers to participate in the project by visiting our website, in order to help us reach the best-informed consensus on these guidelines (http://

Keywords: Randomised controlled trial, RCT, CONSORT-SPI, reporting guideline, reporting standards

Accepted: March 2013


© The Author 2013. Published by Oxford University Press on behalf of The British Association of Social Workers. All rights reserved.

Social and psychological interventions aim to improve physical health, mental health, and associated social outcomes. They are often complex and typically involve multiple, interacting intervention components (e.g., several behaviour change techniques) that may act and target outcomes on several levels (e.g., individual, family, community) (Medical Research Council, 2008). Moreover, these interventions may be contextually dependent upon the hard-to-control environments in which they are delivered (e.g., health care settings, correctional facilities) (Bonell, 2002; Pawson et al., 2004). The functions and processes of these interventions may be designed to accommodate particular individuals or contexts, taking on different forms while still aiming to achieve the same objective (Bonell et al., 2012; Hawe et al., 2004). Complex interventions are common in public health, psychology, education, social work, criminology, and related disciplines. For example, Multisystemic Therapy (MST) is an intensive intervention for juvenile offenders. Based on social ecological and family systems theories, MST providers target a variety of individual, family, school, peer, neighbourhood, and community influences on psychosocial and behavioural problems (Henggeler et al., 2002). Treatment teams of professional therapists and caseworkers work with individuals, their families, and their peer groups to provide tailored services (Littell et al., 2009). These services may be delivered in homes, social care, and community settings. Other examples of social and psychological interventions may be found in reviews by the Cochrane Collaboration (e.g., the Developmental, Psychosocial, and Learning Problems Group; the Cochrane Public Health Group) and the Campbell Collaboration.

To understand their effects and to keep services up to date, academics, policy makers, journalists, clinicians and consumers rely on research reports of intervention studies in scientific journals. Such reports should explain the methods, including the design, delivery, uptake and context of interventions, as well as subsequent results. Accurate, complete and transparent reporting is essential for readers to make best use of new evidence, to achieve returns on research investment, to meet ethical obligations to research participants and consumers of interventions, and to minimise waste in research. However, randomised controlled trials (RCTs) are often insufficiently reported across disciplines including criminology (Perry et al., 2010), social work (Naleppa and Cagle, 2010), education (Torgerson et al., 2005), psychology (Michie et al., 2011; Stinson et al., 2003) and public health (Semaan et al., 2002). Biomedical researchers have developed guidelines to improve the reporting of RCTs of health-related interventions (Schulz et al., 2010). However, many social and behavioural scientists have not fully adopted these guidelines, which may not be wholly adequate for social and psychological interventions in their current form (Bonell et al., 2006; Davidson et al., 2003; Perry et al., 2010; Stinson et al., 2003). Because of the unique features of these interventions, updated reporting guidance is needed.

This article describes the development of a reporting guideline that aims to improve the quality of reports of RCTs of social and psychological interventions. We explain how reporting guidelines have improved the quality of reports in medicine, and why guidelines have not yet improved the quality of reports in other disciplines. We then introduce a plan to develop a new reporting guideline for RCTs, CONSORT-SPI (an extension for social and psychological interventions), which will be written using recommended techniques for guideline development and dissemination (Moher et al., 2010b). Wide stakeholder involvement and consensus are needed to create a useful, acceptable and evidence-based guideline, so we hope to recruit stakeholders from multiple disciplines and professions.

Randomised trials are not the only rigorous method for evaluating interventions; many alternatives exist when RCTs are not possible or appropriate due to scientific, practical, and ethical concerns (Bonell et al., 2011). Nonetheless, RCTs are important to policy makers, practitioners, scientists and service users, for they are generally considered the most valid and reliable research method for estimating the effectiveness of interventions (Chalmers, 2003). As a result, this project will focus on standards for RCTs, which could then also inform the development of future guidelines for other evaluation designs.

Impact of CONSORT guidelines

Reporting guidelines list (in the form of a checklist) the minimum information required to understand the methods and results of studies. They do not prescribe research conduct, but facilitate the writing of reports by authors and the appraisal of reports by research consumers. For example, the Consolidated Standards of Reporting Trials (CONSORT) Statement 2010 is an evidence-based guideline; to identify items, the developers reviewed evidence of trial design and conduct that could contribute to bias. Using consensus methods, they developed a checklist of 25 items and a flow diagram (Schulz et al., 2010). CONSORT has improved the reporting of thousands of medical experiments (Turner et al., 2012). It has been endorsed by over 600 journals (Moher et al., 2004) and it is supported by the Institute of Education Sciences (Torgerson et al., 2005). CONSORT is the only guideline for reporting RCTs that has been developed with such rigour, and it has remained more prominent than any other guideline for over 15 years; for greatest impact, then, any further reporting guidelines related to RCTs should be developed in collaboration with the CONSORT Group.

Limitations of previous reporting guidelines for social and psychological interventions

Researchers and journal editors in the social and behavioural sciences are generally aware of CONSORT but often object that it is not appropriate for social and psychological interventions (Bonell et al., 2006; Davidson et al., 2003; Perry et al., 2010; Stinson et al., 2003). As a result, uptake of CONSORT guidelines in these disciplines is low. While some criticisms are due to inaccurate perceptions about common features of RCTs across disciplines, many relate to real limitations for social and psychological interventions (Mayo-Wilson, 2007). For example, CONSORT is most relevant to RCTs in medical disciplines; it was developed by biostatisticians and medical researchers with minimal input from experts in other disciplines. Journal editors, as well as social and behavioural science researchers, believe there is a need to include appropriate stakeholders in developing a new, targeted guideline to improve uptake in their disciplines (Gill, 2011; Torgerson et al., 2005). The CONSORT Group has produced extensions of the original CONSORT Statement relevant to social and psychological interventions, such as additional checklists for cluster (Campbell et al., 2004), non-pharmacological (Boutron et al., 2008), pragmatic (Zwarenstein et al., 2008) and quality-of-life RCTs (Calvert et al., 2011). These extensions provide important insights, but complex social and psychological interventions, for example, include multiple, interacting components at several levels, with various outcomes. These RCTs require use of several extensions at once, creating a barrier to guideline uptake; increasing intervention complexity also gives rise to new issues that are not included in existing guidelines. Therefore, simply disseminating CONSORT guidelines as they stand is insufficient, as this would not address the need for editors and authors to 'buy-in' to this process. To improve uptake in these disciplines, CONSORT guidelines need to be extended to specifically address the important features of social and psychological interventions.

Social and behavioural scientists have developed other reporting guidelines, including the Workgroup for Intervention Development and Evaluation Research (WIDER) Recommendations for behaviour change interventions (Abraham, 2009; Michie et al., 2011), the American Educational Research Association's (AERA) Standards for Reporting Research (AERA, 2006), the REPOSE guidelines for primary research in education (Newman and Elbourne, 2004) and the Journal Article Reporting Standards (JARS) of the American Psychological Association (APA JARS Group, 2008). Whilst they address issues not covered by the CONSORT Statement and its extensions, these guidelines (except for JARS (APA JARS Group, 2008)) do not provide specific guidance for RCTs. Moreover, compared with the CONSORT Statement and its official extensions, guidelines in the social and behavioural sciences have not consistently followed optimal techniques for guideline development and dissemination that are recommended by international leaders in the advancement of reporting guidelines (Moher et al., 2010), such as the use of systematic literature reviews and formal consensus methods to select reporting standards (Grant et al., 2012). Researchers in public health, psychology, education, social work, and criminology have noted that these guidelines could be more 'user-friendly', and dissemination could benefit from up-to-date knowledge transfer techniques (Abraham, 2009; Armstrong et al., 2008; Davidson et al., 2003; Naleppa and Cagle, 2010; Perry and Johnson, 2008; Stinson et al., 2003; Torgerson et al., 2005).

For example, JARS, a notable and valuable guideline for empirical psychological research, is endorsed by a handful of journals outside of the APA, whereas CONSORT is endorsed by hundreds of journals internationally. According to ISI Web of Knowledge and Google Scholar citations, JARS is cited approximately a dozen times annually, while CONSORT guidelines are cited hundreds of times per year. Moreover, the APA commissioned a select group of APA journal editors and reviewers to develop JARS, and the group based most of their work on existent CONSORT guidelines; by comparison, official CONSORT extensions have been developed using rigorous consensus methods, have involved various international stakeholders in guideline development and dissemination, and update content based on the most recent scientific literature. Nonetheless, no current CONSORT guideline adequately addresses the unique features of social and psychological interventions. This new CONSORT extension will incorporate lessons from previous extensions, reporting guidelines, and the research literature to aid the critical appraisal, replication, and uptake of this research.

Aspects of internal validity

Internal validity is the extent to which the results of a study may be influenced by bias. Like other study designs, the validity of RCTs depends on high-quality execution. Poorly conducted RCTs can produce more biased results than well-conducted trials and well-conducted non-randomised studies (Pildal et al., 2007; Prescott et al., 1999). For example, evidence indicates that RCTs that do not adequately conceal the randomisation sequence exaggerate effect estimates by up to 30 per cent (Schulz et al., 1995), while low-quality reports of these RCTs are associated with effect estimates exaggerated by up to 35 per cent (Moher et al., 1999). Social and psychological intervention RCTs are susceptible to these risks of bias as well.

Some aspects of internal validity, although included in CONSORT, remain poorly reported, even in the least complex social and psychological intervention studies. Reports of RCTs should describe procedures for minimising selection bias, but reports often omit information about random sequence generation and allocation concealment (Ladd et al., 2010; Perry and Johnson, 2008), and psychological journals report methods of sequence generation less frequently than medical journals (Stinson et al., 2003). A review of educational reports found no studies that adequately reported allocation concealment (Torgerson et al., 2005), and reports in criminology often lack information about randomisation procedures (Gill, 2011; Perry et al., 2010). RCTs of social and psychological interventions may also use non-traditional randomisation techniques, such as stepped-wedge or natural allocation (Medical Research Council, 2011), which need to be thoroughly described. In addition, reports of social and psychological intervention trials often fail to include details about trial registration, protocols and adverse events (Ladd et al., 2010; Perry and Johnson, 2008), which may include important negative consequences at individual, familial and community levels.

Other aspects of CONSORT may require greater emphasis or modification for RCTs of social and psychological interventions. In developing this CONSORT extension, we expect to identify new items and to adapt existing items that relate to internal validity. These may include items discussed during the development of previous CONSORT extensions or other guidelines, as well as items suggested by participants in this project. For example, it may not be possible to blind participants and providers of interventions; blinding of outcome assessors is often possible but rarely reported, and few studies explain whether blinding was actually maintained or how lack of blinding was handled (Davidson et al., 2003; Ladd et al., 2010; Perry and Johnson, 2008). In social and psychological intervention studies, outcome measures are often subjective, variables may relate to latent constructs and information may come from multiple sources (e.g. participants, providers). While an issue in other areas of research, the influence of the quality of subjective outcome measures on RCT results has long been highlighted given their prevalence in social and psychological intervention research (Marshall et al., 2010). Descriptions of the validity, reliability, and psychometric properties of such measures are therefore particularly useful for social and psychological intervention trials, especially when they are not widely available or discussed in the research literature (Campbell et al., 2004; Fraser et al., 2009). Moreover, multiple measures may be analysed in several ways, so authors need to transparently report which procedures were performed and to explain their rationale.

Aspects of external validity

External validity is the extent to which a study's results are applicable in other settings or populations. Currently, given that RCTs are primarily designed to increase the internal validity of study findings, the CONSORT Statement gives relatively little attention to external validity. While high internal validity is an important precondition for any discussion of an RCT's external validity, updating the CONSORT Statement to include more information about external validity is critical for the relevance and uptake of a CONSORT extension for social and psychological interventions. These interventions may be influenced by context, as different underlying social, institutional, psychological, and physical structures may yield different causal and probabilistic relations between interventions and observed outcomes. Contextual information is necessary to compare the effectiveness of an intervention across time and place (Cartwright and Munro, 2010). Lack of information about external validity may prevent practitioners or policy makers from using evidence to inform decision making, yet existing guidelines do not adequately explain how authors should describe (i) how interventions work, (ii) for whom and (iii) under what conditions (Moore and Moore, 2011).

First, it is useful for authors to explain the key components of interventions, how those components could be delivered and how they relate to the outcomes selected. At present, authors can follow current standards for reporting interventions without providing adequate details about complex interventions (Shepperd et al., 2009). Many reports neither contain sufficient information about the interventions tested nor reference treatment manuals (Glasziou et al., 2008). Providing logic models, as described in the Medical Research Council (MRC) Framework for Complex Interventions (Craig et al., 2008), or presenting theories of change can help elucidate links in causal chains that can be tested, identify important mediators and moderators, and facilitate syntheses in reviews (Ivers et al., 2012). Moreover, interventions are rarely implemented exactly as designed, and complex interventions may be designed to be implemented with some flexibility, in order to accommodate differences across participants (Hawe et al., 2004), so it is important to report how interventions were actually delivered by providers and actually received by participants (Hardeman et al., 2008). Particularly for social and psychological interventions, it is essential to understand the integrity with which the intended functions and processes of the intervention were implemented (Hawe et al., 2004). As RCTs of a particular intervention can yield different relative effects depending on the nature of the control groups, information about delivery and uptake should be provided for all trial arms (McGrath et al., 2003).

Second, reports should describe recruitment processes and the representativeness of samples. Participants in RCTs of social and psychological interventions are often recruited outside of practice settings via processes that differ from routine services (AERA, 2004). An intervention that works for one group of people may not work for people living in different cultures or physical spaces, or it may not work for people with slightly different problems and co-morbidities. Enrolling in an RCT can be a complex process that affects the measured and unmeasured characteristics of participants, and recruitment may differ from how users normally access interventions. Well-described RCT reports will include the characteristics of all participants (volunteers, those who enrolled and those who completed) in sufficient detail for readers to assess the comparability of the study sample to populations in everyday services (AERA, 2006; APA JARS Group, 2008; Evans and Brown, 2003).

Finally, given that these interventions often occur in social environments, reports should describe factors of the RCT context that are believed to support, attenuate or frustrate observed effects (Moore, 2002). Interventions may differ across groups of different social or socioeconomic positions, and equity considerations should be addressed explicitly (Tugwell et al., 2010; Welch et al., 2012). Several aspects of setting and implementation may be important to consider, such as administrative support, staff training and supervision, organisational resources, the wider service system and concurrent political or social events (Bonell et al., 2012; Fixsen et al., 2005; Shepperd et al., 2009; Wang et al., 2006). Reporting process evaluations may help readers understand mechanisms and outcomes.

Developing a new CONSORT extension

This new reporting guideline for RCTs of social and psychological interventions will be an official extension of the CONSORT Statement. Optimally, it will help improve the reporting of these studies. Like other official CONSORT extensions (Boutron et al., 2008; Campbell et al., 2004; Hopewell et al., 2008; Zwarenstein et al., 2008), this guideline will be integrated with the CONSORT Statement and previous extensions, and updates of the CONSORT Statement may incorporate references to this extension.

The project is being led by an international collaboration of researchers, methodologists, guideline developers, funders, service providers, journal editors and consumer advocacy groups. We will be recruiting participants in a manner similar to other reporting guideline initiatives: identifying stakeholders through literature reviews, the project's International Advisory Group, and stakeholder-initiated interest in the project (Michie et al., 2011; Schulz et al., 2010). We hope to recruit stakeholders with expertise from all related disciplines and regions of the world, including low- and middle-income countries. Methodologists will identify items that relate to known sources of bias, and they will identify items that facilitate systematic reviews and research synthesis. Funders will consider how the guideline can aid the assessment of grant applications for trials and methodological innovations in intervention evaluation. Practitioners will identify information that can aid decision making. Journal editors will identify practical steps to implement the guideline and to ensure uptake.

We will use consensus techniques to reduce bias in group decision making and to promote widespread guideline uptake and knowledge translation activities upon project completion (Murphy et al., 1998). Following rigorous reviews of existing guidelines and current reporting quality, we will conduct an online Delphi process to identify a prioritised list of reporting items to consider for the extension. That is, we will invite a group of experts to answer questions about reporting items and to suggest further questions. We will circulate their feedback to the group and ask a second round of questions. The Delphi process will capture a variety of international perspectives and allow participants to share their views anonymously. Following the Delphi process, we will host a consensus meeting to review the findings and to generate a list of minimal reporting standards, mirroring the development of previous CONSORT guidelines (Boutron et al., 2008; Schulz et al., 2010; Zwarenstein et al., 2008).

Together, participants in this process will create a checklist of reporting items and a flowchart for reporting social and psychological intervention RCTs. In addition, we will develop an Explanation and Elaboration (E&E) Document to explain the scientific rationale for each recommendation and to provide examples of clear reporting; a similar document was developed by the CONSORT Group to help disseminate a better understanding of each included checklist item (Moher et al., 2010a). This document will help persuade editors, authors and funders of the importance of the guideline. It will be a useful pedagogical tool, helping students and researchers understand methods for conducting RCTs of social and psychological interventions, and it will help authors meet the guideline requirements (Moher et al., 2010b).

The success of this project depends on widespread involvement and agreement among key international stakeholders in research, policy and practice. For example, previous developers have obtained guideline endorsement by journal editors, who require authors and peer reviewers to use the guideline during manuscript submission and who must enforce journal article word limits (Michie et al., 2009; Moher et al., 2010b). Many journal editors have already agreed to participate, and we hope other researchers and stakeholders will volunteer their time and expertise.

Conclusion

Reporting guidelines help us use scarce resources efficiently and ethically. Randomised controlled trials are expensive, and the public have a right to expect returns on their investments through transparent, usable reports. When RCT reports cannot be used (for whatever reason), resources are wasted. Participants contribute their time and put themselves at risk of harm to generate evidence that will help others, and researchers should disseminate that information effectively (Davidson et al., 2003). Policy makers benefit from research when developing effective, affordable standards of practice and choosing which programmes and services to fund. Administrators and managers are required to make contextually appropriate decisions. Transparent reporting of primary studies is essential for their inclusion in systematic reviews that inform these activities. For example, there is a need to determine if primary studies are comparable, examine biases within included studies, assess the generalisability of results and implement effective interventions. Finally, we hope this guideline will reduce the effort and time required for authors to write reports of RCTs.

Randomised controlled trials are not the only valid method for evaluating interventions (Bonell et al., 2011), nor are they the only type of research that would benefit from better reporting (Goldbeck and Vitiello, 2011). Colleagues have identified the importance of reporting standards for other types of research, including observational (von Elm et al., 2007), quasi-experimental (Des Jarlais et al., 2004) and qualitative studies (Tong et al., 2007). This project is a first step towards improving reports of the many designs used to evaluate social and psychological interventions, which we hope will be addressed by this and future projects. We invite stakeholders from disciplines that frequently research these interventions to join this important effort and participate in guideline development by visiting our website, where they can find more information about the project and updates on its progress, and sign up to be involved ( CONSORT-study).

Acknowledgements

All of the authors are involved in the development of this CONSORT extension. P.M., E.M.W. and S.G. conceived of the idea for the project. All authors helped to draft the manuscript, and all have read and approved the final manuscript. This paper is submitted on behalf of the CONSORT International Advisory Group: J. Lawrence Aber, Distinguished Professor of Applied Psychology and Public Policy, Steinhardt School of Culture, Education, and Human Development, New York University; Chris Bonell, Professor of Sociology and Social Intervention, Centre for Evidence Based Intervention, University of Oxford; David M. Clark, Chair of Psychology, Department of Experimental Psychology, University of Oxford; Frances Gardner, Professor of Child and Family Psychology, Centre for Evidence Based Intervention, University of Oxford; Steven Hollon, American Psychological Association Guidelines Committee (Chair), Gertrude Conaway Professor of Psychology, Department of Psychology, Vanderbilt University; Jim McCambridge, Senior Lecturer in Behaviour Change, Department of Social and Environmental Health Research, London School of Hygiene and Tropical Medicine; Susan Michie, Professor of Health Psychology, Department of Clinical, Educational & Health Psychology, University College London; Laurence Moore, Professor of Public Health Improvement, Cardiff School of Social Sciences, Cardiff University; Mark Petticrew, Professor of Public Health Evaluation, Department of Social and Environmental Health Research, London School of Hygiene and Tropical Medicine; Lawrence Sherman, Wolfson Professor of Criminology, Cambridge Institute of Criminology, Cambridge University; Steve Pilling, Director, Centre for Outcomes Research and Effectiveness, University College London; James Thomas, Associate Director, EPPI-Centre, Reader in Social Policy, Institute of Education, University of London; Elizabeth Waters, Jack Brockhoff Chair of Child Public Health, McCaughey VicHealth Centre for Community Well-being, Melbourne School of Population & Global Health, University of Melbourne, Australia; David Weisburd, Director and Walter E. Meyer Professor of Law and Criminal Justice, Institute of Criminology, Hebrew University Faculty of Law, Jerusalem; Jo Yaffe, Associate Professor, College of Social Work, University of Utah. This project is funded by the UK Economic and Social Research Council (ES/K00087X/1). We thank the Centre for Evidence Based Intervention (Oxford University), the Centre for Outcomes Research and Effectiveness (University College London), and the National Collaborating Centre for Mental Health (NCCMH) for their support. S.G. is supported by a linked Clarendon Fund-Green Templeton College Annual Fund Scholarship to support his doctoral studies and research. D.M. is supported by a University Research Chair.

References I

Abraham, C., for the Workgroup for Intervention Development and Evaluation Research I

(2009) 'Wider recommendations to improve reporting of the content of behaviour 3

change interventions', available online at p

uploads /2009/02/wider-recommendations.pdf. b

American Educational Research Association (AERA) (2006) 'Standards for reporting on W

empirical social science research in AERA publications', Educational Researcher, X

35(6), pp. 33-40. I

American Psychological Association Journal Article Reporting Standards Group (APA JARS Group) (2008) 'Reporting standards for research in psychology: Why do we need them? What might they be?', American Psychologist, 63, pp. 839-51.

Armstrong, R., Waters, E., Moore, L., Riggs, E., Cuervo, L. G., Lumbiganon, P. and Hawe, P. (2008) 'Improving the reporting of public health intervention research: Advancing TREND and CONSORT', Journal of Public Health, 30(1), pp. 103-9.

Bonell, C. (2002) 'The utility of randomized controlled trials of social interventions: An examination of two trials of HIV prevention', Critical Public Health, 12(4), pp. 321-34.

Bonell, C., Fletcher, A., Morton, M., Lorenc, T. and Moore, L. (2012) 'Realist randomised controlled trials: A new approach to evaluating complex public health interventions', Social Science & Medicine, 75(12), pp. 2299-306.

Bonell, C., Oakley, A., Hargreaves, J., Strange, V. and Rees, R. (2006) 'Assessment of generalisability in trials of health interventions: Suggested framework and systematic review', BMJ, 333, pp. 346-9.

Bonell, C. P., Hargreaves, J., Cousens, S., Ross, D., Hayes, R., Petticrew, M. and Kirkwood, B. R. (2011) 'Alternatives to randomisation in the evaluation of public health interventions: Design challenges and solutions', Journal of Epidemiology and Community Health, 65, pp. 582-7.

Boutron, I., Moher, D., Altman, D. G., Schulz, K. and Ravaud, P., for the CONSORT Group (2008) 'Extending the CONSORT Statement to randomized trials of nonpharmacologic treatment: Explanation and elaboration', Annals of Internal Medicine, 148, pp. 295-309.

Calvert, M., Blazeby, J., Revicki, D., Moher, D. and Brundage, M. (2011) 'Reporting quality of life in clinical trials: A CONSORT extension', The Lancet, 378, pp. 1684-5.

Campbell Collaboration (2013) Available at:

Campbell, M. K., Elbourne, D. R. and Altman, D. G. (2004) 'CONSORT statement: Extension to cluster randomised trials', BMJ, 328, pp. 702-8.

Cartwright, N. and Munro, E. (2010) 'The limitations of randomized controlled trials in predicting effectiveness', Journal of Evaluation in Clinical Practice, 16, pp. 260-6.

Chalmers, I. (2003) 'Trying to do more good than harm in policy and practice: The role of rigorous, transparent, up-to-date evaluations', The ANNALS of the American Academy of Political and Social Science, 589(22), pp. 22-40.

Cochrane Collaboration (2013) Available at:

Craig, P., Dieppe, P., Macintyre, S., Michie, S., Nazareth, I. and Petticrew, M. (2008) 'Developing and evaluating complex interventions: The new Medical Research Council guidance', BMJ, 337, pp. 979-83.

Davidson, K. W., Goldstein, M., Kaplan, R. M., Kaufmann, P. G., Knatterud, G. L., Orleans, C. T., Spring, B., Trudeau, K. J. and Whitlock, E. P. (2003) 'Evidence-based behavioural medicine: What is it and how do we achieve it?', Annals of Behavioral Medicine, 26, pp. 161-71.

Des Jarlais, D. C., Lyles, C. and Crepaz, N., for the TREND Group (2004) 'Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: The TREND statement', American Journal of Public Health, 94, pp. 361-6.

Evans, T. and Brown, H. (2003) 'Road traffic crashes: Operationalizing equity in the context of health sector reform', Injury Control and Safety Promotion, 10(1-2), pp. 11-12.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. and Wallace, F. (2005) Implementation Research: A Synthesis of the Literature, Tampa, FL, University of South Florida.

Fraser, M. W., Galinsky, M. J., Richman, J. M. and Day, S. H. (2009) Intervention Research: Developing Social Programs, Oxford, Oxford University Press.

Gill, C. E. (2011) 'Missing links: How descriptive validity impacts the policy relevance of randomized controlled trials in criminology', Journal of Experimental Criminology, 7(3), pp. 201-24.

Glasziou, P., Meats, E., Heneghan, C. and Shepperd, S. (2008) 'What is missing from descriptions of treatment in trials and reviews?', BMJ, 336, pp. 1472-4.

Goldbeck, L. and Vitiello, B. (2011) 'Reporting clinical trials of psychosocial interventions in child and adolescent psychiatry and mental health', Child and Adolescent Psychiatry and Mental Health, 5(1), p. 4.

Grant, S., Montgomery, P. and Mayo-Wilson, E. (2012) 'Development of a CONSORT extension for interventions in public health and related disciplines', The Lancet, 380(Supp. 3), p. S14. Available at:

Hardeman, W., Michie, S., Fanshawe, T., Prevost, A. T., McLoughlin, K. and Kinmonth, A. L. (2008) 'Fidelity of delivery of a physical activity intervention: Predictors and consequences', Psychology & Health, 23, pp. 11-24.

Hawe, P., Shiell, A. and Riley, T. (2004) 'Complex interventions: How "out of control" can a randomised controlled trial be?', BMJ, 328(7455), pp. 1561-3.

Henggeler, S. W., Schoenwald, S. K., Rowland, M. D. and Cunningham, P. B. (2002) Serious Emotional Disturbances in Children and Adolescents: Multisystemic Therapy, New York, Guilford Press.

Hopewell, S., Clarke, M., Moher, D., Wager, E., Middleton, P., Altman, D. G. and Schulz, K. F., and the CONSORT Group (2008) 'CONSORT for reporting randomised trials in journal and conference abstracts', Lancet, 371, pp. 281-3.

Ivers, N., Jamtvedt, G., Flottorp, S., Young, J. M., Odgaard-Jensen, J., French, S. D., O'Brien, M. A., Johansen, M., Grimshaw, J. and Oxman, A. D. (2012) 'Audit and feedback: Effects on professional practice and healthcare outcomes', Cochrane Database of Systematic Reviews, 6, p. CD000259.

Ladd, B. O., McCrady, B. S., Manuel, J. K. and Campbell, W. (2010) 'Improving the quality of reporting alcohol outcome studies: Effects of the CONSORT statement', Addictive Behaviors, 35, pp. 660-6.

Littell, J. H., Campbell, M., Green, S. and Toews, B. (2009) 'Multisystemic Therapy for social, emotional, and behavioral problems in youth aged 10-17', Cochrane Database of Systematic Reviews, 4, CD004797.

Marshall, M., Lockwood, A., Bradley, C., Adams, C., Joy, C. and Fenton, M. (2000) 'Unpublished rating scales: A major source of bias in randomised controlled trials of treatments for schizophrenia', British Journal of Psychiatry, 176, pp. 249-52.

Mayo-Wilson, E. (2007) 'Reporting implementation in randomized trials: Proposed additions to the Consolidated Standards of Reporting Trials statement', American Journal of Public Health, 97, pp. 630-3.

McGrath, P. J., Stinson, J. and Davidson, K. (2003) 'Commentary: The Journal of Pediatric Psychology should adopt the CONSORT statement as a way of improving the evidence base in pediatric psychology', Journal of Pediatric Psychology, 28, pp. 169-71.

Medical Research Council (2008) A Framework for Development and Evaluation of RCTs for Complex Interventions to Improve Health, London, Medical Research Council.

Medical Research Council (2011) Using Natural Experiments to Evaluate Population Health Interventions: Guidance for Producers and Users of Evidence, London, Medical Research Council.

Michie, S., Abraham, C., Eccles, M. P., Francis, J. J., Hardeman, W. and Johnston, M. (2011) 'Strengthening evaluation and implementation by specifying components of behaviour change interventions: A study protocol', Implementation Science, 6, p. 10.

Michie, S., Fixsen, D., Grimshaw, J. M. and Eccles, M. P. (2009) 'Specifying and reporting complex behaviour change interventions: The need for a scientific method', Implementation Science, 4, p. 40.

Moher, D., Altman, D. G., Schulz, K. F. and Elbourne, D. R. (2004) 'Opportunities and challenges for improving the quality of reporting clinical research: CONSORT and beyond', Canadian Medical Association Journal, 171(4), pp. 349-50.

Moher, D., Hopewell, S., Schulz, K. F., Montori, V., Gøtzsche, P. C., Devereaux, P. J., Elbourne, D., Egger, M. and Altman, D. G. (2010a) 'CONSORT 2010 Explanation and Elaboration: Updated guidelines for reporting parallel group randomised trials', BMJ, 340, p. c869.

Moher, D., Pham, B., Jones, A., Cook, D. J., Jadad, A. R., Moher, M., Tugwell, P. and Klassen, T. P. (1999) 'Does quality of reports of randomized trials affect estimates of intervention efficacy reported in meta-analyses?', Lancet, 352, pp. 609-13.

Moher, D., Schulz, K. F., Simera, I. and Altman, D. G. (2010b) 'Guidance for developers of health research reporting guidelines', PLoS Medicine, 7(2), p. e1000217.

Moore, L. (2002) 'Research design for the rigorous evaluation of complex educational interventions: Lessons from health services research', Building Research Capacity, 1, pp. 4-5.

Moore, L. and Moore, G. F. (2011) 'Public health evaluation: Which designs work, for whom and under what circumstances?', Journal of Epidemiology & Community Health, 65, pp. 596-7.

Murphy, M. K., Black, N. A., Lamping, D. L., McKee, C. M., Sanders, C. F. B., Askham, J. and Marteau, T. (1998) 'Consensus development methods, and their use in clinical guideline development', Health Technology Assessment, 2(3), pp. 1-88.

Naleppa, M. J. and Cagle, J. G. (2010) 'Treatment fidelity in social work intervention research: A review of published studies', Research on Social Work Practice, 20, pp. 674-81.

Newman, M. and Elbourne, D. (2004) 'Improving the usability of educational research: Guidelines for the REPOrting of primary empirical research Studies in Education (The REPOSE Guidelines)', Evaluation & Research in Education, 18(4), pp. 201-12.

Pawson, R., Greenhalgh, T., Harvey, G. and Walshe, K. (2004) Realist Synthesis: An Introduction, Manchester, ESRC Research Methods Programme, University of Manchester.

Perry, A. E. and Johnson, M. (2008) 'Applying the Consolidated Standards of Reporting Trials (CONSORT) to studies of mental health provision for juvenile offenders: A research note', Journal of Experimental Criminology, 4, pp. 165-85.

Perry, A. E., Weisburd, D. and Hewitt, C. (2010) 'Are criminologists describing randomized controlled trials in ways that allow us to assess them? Findings from a sample of crime and justice trials', Journal of Experimental Criminology, 6, pp. 245-62.

Pildal, J., Hróbjartsson, A., Jørgensen, K. J., Hilden, J., Altman, D. G. and Gøtzsche, P. C. (2007) 'Impact of allocation concealment on conclusions drawn from meta-analyses of randomized trials', International Journal of Epidemiology, 36(4), pp. 847-57.

Prescott, R. J., Counsell, C. E., Gillespie, W. J., Grant, A. M., Russell, I. T., Kiauka, S., Colthart, I. R., Ross, S., Shepherd, S. M. and Russell, D. (1999) 'Factors that limit the quality, number and progress of randomised controlled trials', Health Technology Assessment, 3(20), pp. 1-143.

Schulz, K. F., Altman, D. G. and Moher, D., for the CONSORT Group (2010) 'CONSORT 2010 Statement: Updated guidelines for reporting parallel group randomised trials', BMJ, 340, pp. 698-702.

Schulz, K. F., Chalmers, I., Hayes, R. J. and Altman, D. G. (1995) 'Allocation concealment in randomised trials: Defending against deciphering', Lancet, 359, pp. 614-17.

Semaan, S., Kay, L., Strouse, D., Sogolow, E., Mullen, P. D., Neumann, M. S., Flores, S. A., Peersman, G., Johnson, W. D., Lipman, P. D., Eke, A. and Des Jarlais, D. C. (2002) 'A profile of U.S.-based trials of behavioral and social interventions for HIV risk reduction', Journal of Acquired Immune Deficiency Syndromes, 30, pp. S30-50.

Shepperd, S., Lewin, S., Straus, S., Clarke, M., Eccles, M. P., Fitzpatrick, R., Wong, G. and Sheikh, A. (2009) 'Can we systematically review studies that evaluate complex interventions?', PLoS Medicine, 6(8), p. e1000086.

Stinson, J. N., McGrath, P. J. and Yamada, J. T. (2003) 'Clinical trials in the Journal of Pediatric Psychology: Applying the CONSORT statement', Journal of Pediatric Psychology, 28, pp. 159-67.

Tong, A., Sainsbury, P. and Craig, J. (2007) 'Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups', International Journal for Quality in Health Care, 19(6), pp. 349-57.

Torgerson, C. J., Torgerson, D. J., Birks, Y. F. and Porthouse, J. (2005) 'A comparison of RCTs in health and education', British Educational Research Journal, 31(6), pp. 761-85.

Tugwell, P., Petticrew, M., Kristjansson, E., Welch, V. and Ueffing, E., et al. (2010) 'Assessing equity in systematic reviews: Realising the recommendations of the Commission on Social Determinants of Health', BMJ, 341, p. c4739.

Turner, L., Shamseer, L., Altman, D. G., Weeks, L., Peters, J., Kober, T., Dias, S., Schulz, K. F., Plint, A. C. and Moher, D. (2012) 'Consolidated standards of reporting trials (CONSORT) and the completeness of reporting of randomised controlled trials (RCTs) published in medical journals', Cochrane Database of Systematic Reviews, 11, p. MR000030.

von Elm, E., Altman, D. G., Egger, M., Pocock, S. J., Gøtzsche, P. C. and Vandenbroucke, J. P. (2007) 'The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: Guidelines for reporting observational studies', Annals of Internal Medicine, 147(8), pp. 573-7.

Welch, V., Petticrew, M., Tugwell, P., Moher, D., O'Neill, J., Waters, E. and White, H., and the PRISMA-Equity Bellagio Group (2012) 'PRISMA-Equity 2012 extension: Reporting guidelines for systematic reviews with a focus on health equity', PLoS Medicine, 9(10), p. e1001333.

Zwarenstein, M., Treweek, S., Gagnier, J. J., Altman, D. G., Tunis, S., Haynes, B., Oxman, A. D. and Moher, D., for the CONSORT and Pragmatic Trials in Healthcare (Practihc) Group (2008) 'Improving the reporting of pragmatic trials: An extension of the CONSORT statement', BMJ, 337, p. a2390.