
Table 1 Comparison of direct-observation park evaluation tools

From: A reliability assessment of a direct-observation park evaluation tool: the Parks, activity and recreation among kids (PARK) tool

| Tool name | Author [ref] | Date | Total items | No. of test parks | Test site | Reliability estimate | Lowest estimate | Highest estimate | % items ≥ 0.40 kappa or ≥ 70 % agreement | Overall reliability |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Recreation Facilities Assessment Tool | Cavnar et al. [17] | 2004 | 61 | 27 | Southeastern USA | Kappa | −0.50 | 1.00 | 75 % | 0.80 kappa |
| Public Open Space Tool (POST) | Giles-Corti et al. [24] | 2005 | 49 | 516 | Perth, Australia | Kappa | 0.60 | 1.00 | 100 % | NA |
| Physical Activity Resource Assessment instrument (PARA) | Lee et al. [22] | 2005 | 34 | 22 | Kansas City, Kansas and Missouri, USA | 10 % overlap | NA | NA | NA | rs ≥ 0.77 |
| Bedimo-Rung Assessment Tool - Direct Observation (BRAT-DO) | Bedimo-Rung et al. [19] | 2006 | 181 | 2 | New Orleans, Louisiana, USA | Percent agreement | 63.60 % | 100 % | 95.60 % | 87.20 % agreement |
| Environmental Assessment of Public Recreation Spaces (EAPRS) | Saelens et al. [18] | 2006 | 646 | 225 | Greater Cincinnati area, Ohio, USA | Kappa, ICC and percent agreement | NA | NA | 56 % | 65.6 % of 506 items were either kappa/ICC ≥ 0.60 or ≥ 75 % agreement |
| Path Environment Audit Tool (PEAT) | Troped et al. [23] | 2006 | 40 | 6 | Massachusetts, USA | Kappa (15 of 16 primary amenity items) | −0.03 kappa | 1.00 kappa | 75 % ≥ 0.40 kappa | ≥ 0.49 kappa |
| | | | | | | Kappa (7 binary items) | 0.19 kappa | 0.71 kappa | | |
| | | | | | | ICC (3 of 5 ordinal items) | −0.04 ICC | 0.84 ICC | 43 % ≥ 0.40 ICC | ≥ 0.49 ICC |
| | | | | | | Percent agreement | 34 % agreement | 100 % agreement | 85 % ≥ 70 % agreement | ≥ 81 % agreement |
| Children's Public Open Space Tool (C-POST) | Crawford et al. [21] | 2008 | 27 | 19 | Melbourne, Australia | Inter- and intra-rater reliability | NA | NA | NA | NA |
| Community Park Audit Tool (CPAT) | Kaczynski et al. [20] | 2012 | 140 | 59 | Kansas City, Missouri, USA | Kappa | NA | NA | 89 % ≥ 0.40 kappa | ≥ 0.40 for all but 8 of 56 items where kappa could be calculated |
| | | | | | | Percent agreement | NA | NA | 97 % ≥ 70 % agreement | ≥ 70 % agreement for all but 4 items |
NA not available; ICC intraclass correlation coefficient
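
Reader note (not part of the original table): most tools above report Cohen's kappa, which corrects raw percent agreement for the agreement two raters would reach by chance alone. With $p_o$ the observed proportion of agreement and $p_e$ the chance-expected proportion,

$$\kappa = \frac{p_o - p_e}{1 - p_e}.$$

The table's threshold of kappa ≥ 0.40 corresponds to the conventional Landis and Koch cut-off for at least "moderate" agreement.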