Quality of stroke rehabilitation clinical practice guidelines

Abstract
Background and purpose: Clinical practice guidelines (CPGs) are systematically developed statements that assist practitioners in providing appropriate, evidence-based care. The purpose of this study was to evaluate the quality of currently published CPGs for stroke care and to examine the reliability and validity of the Appraisal of Guidelines, Research and Evaluation (AGREE) instrument.

Methods: Multiple databases and Internet resources were searched for stroke care CPGs. Guidelines were included if they were published in English or French from 1998 to 2004 and developed by a group process. Four appraisers evaluated each CPG using the AGREE instrument. The AGREE consists of 23 items, rated on a 4-point Likert scale and organized into six domains. A standardized score, ranging from 0 to 100, is calculated separately for each domain.

Results: Eight guidelines were identified. AGREE quality scores were high for the ‘scope and purpose’ domain (mean ± SE = 71.2 ± 5.48, intra-class correlation (ICC) = 0.66) and the ‘clarity and presentation’ domain (mean ± SE = 70.6 ± 4.43, ICC = 0.66). There was wide variation in ratings of ‘rigour of development’ (mean ± SE = 60.7 ± 7.1, ICC = 0.75) and ‘stakeholder involvement’ (mean ± SE = 52.6 ± 7.14, ICC = 0.89). The ‘editorial independence’ (mean ± SE = 38.1 ± 8.72, ICC = 0.88) and ‘applicability’ (mean ± SE = 35.1 ± 4.93, ICC = 0.75) domains had the lowest scores.

Conclusions: There is considerable variability in the quality of stroke care guidelines, but stroke guidelines score higher on the AGREE ‘rigour of development’ domain than CPGs from other medical fields. The Scottish Intercollegiate Guidelines Network, Veterans Affairs/Department of Defence, Royal College of Physicians, and New Zealand Guidelines Group guidelines consistently scored the highest across the domains. Stroke rehabilitation clinicians should consider these results in selecting a guideline. CPG development groups can improve their AGREE scores by considering the cost of implementing their CPGs, pilot testing their CPGs, recording conflicts of interest of development panel members, and providing tools that support application of their CPGs.