Open Access

Methods for the evaluation of the Jamie Oliver Ministry of Food program, Australia

  • Anna Flego1,
  • Jessica Herbert1,
  • Lisa Gibbs2,
  • Boyd Swinburn3, 4,
  • Catherine Keating1,
  • Elizabeth Waters2 and
  • Marj Moodie1
BMC Public Health 2013, 13:411

DOI: 10.1186/1471-2458-13-411

Received: 9 January 2013

Accepted: 15 April 2013

Published: 30 April 2013

Abstract

Background

Community-based programs aimed at improving cooking skills, cooking confidence and individual eating behaviours have grown in number over the past two decades. Whilst some evidence exists to support their effectiveness, only small behavioural changes have been reported and limitations in study design may have impacted on results.

This paper describes the first evaluation of the Jamie Oliver Ministry of Food Program (JMoF) Australia, in Ipswich, Queensland. JMoF Australia is a community-based cooking skills program open to the general public, consisting of 1.5 hour classes weekly over a 10 week period, based on the program of the same name originating in the United Kingdom.

Methods/Design

A mixed methods study design is proposed. Given the programmatic implementation of JMoF in Ipswich, the quantitative study is a non-randomised, pre-post design comparing participants undergoing the program with a wait-list control group. There will be two primary outcome measures: (i) change in cooking confidence (self-efficacy) and (ii) change in self-reported mean vegetable intake (serves per day). Secondary outcome measures will include change in individual cooking and eating behaviours and psycho-social measures such as social connectedness and self-esteem. Repeated measures will be collected at baseline, program completion (10 weeks) and 6 months follow up from program completion. A sample of 250 participants per group will be recruited for the evaluation to detect a mean change of 0.5 serves a day of vegetables at 80% power (0.05 significance level). Data analysis will assess the magnitude of change of these variables both within and between groups and use sub-group analysis to explore the relationships between socio-demographic characteristics and outcomes.

The qualitative study will be a longitudinal design consisting of semi-structured interviews with approximately 10-15 participants conducted at successive time points. An inductive thematic analysis will be conducted to explore social, attitudinal and behavioural changes experienced by program participants.

Discussion

This evaluation will contribute to the evidence of whether cooking programs work in terms of improving health and wellbeing and the underlying mechanisms which may lead to positive behaviour change.

Trial registration

Australian and New Zealand Trial registration number: ACTRN12611001209987.

Background

Cooking skills programs have been described as a practical illustration of how to simultaneously change knowledge, attitudes and behaviours around healthy eating practices [1]. Interest in cooking has been stimulated by media attention afforded to celebrity chefs and popular prime time television cooking programs. However, the need to promote cooking skills to individuals has in part stemmed from a decline in the traditional pathways by which individuals learn to cook [2], and from the hypothesis that a decline in cooking skills may have contributed to the growth in nutrition-related disease in certain sub-sections of western populations [3]. In Australia, Winkler investigated the relationship between a lack of confidence to cook and the purchasing of fruit and vegetables. The author concluded that cooking skills may contribute to socio-economic differences in dietary intake and that promotion of cooking skills could be a useful strategy to improve fruit and vegetable intake [4].

In the past two decades, there has been an increase in the number of not-for-profit community-based cooking skills programs both in Australia and internationally [5–8]. Such programs have been conducted in a variety of community and institutional settings, targeting different sub-populations and varying in purpose; however, they are predominantly aimed at increasing confidence to cook, promoting healthy eating, addressing health inequalities and increasing access to healthy food [9].

Whilst there is emerging evidence of the effectiveness of these adult programs in terms of increasing confidence to cook and creating positive dietary change, this evidence, to date, has been based on small scale evaluations that are subject to methodological limitations [9]. In a recent systematic review of the effectiveness of adult community cooking programs conducted in the United Kingdom, only one evaluation was identified as suitably robust to provide reliable findings with respect to program effectiveness [10]. This highlights the need for more rigorous, larger scale studies to examine the range of impacts and outcomes of cooking skills programs and the underlying potential mechanisms for change in individual behaviour. At the same time, study designs must take account of the challenges associated with evaluation in community settings and be practical, feasible and sensitive to all stakeholders involved.

The Jamie Oliver Ministry of Food program, Australia

This methods paper describes the evaluation framework and design for the Jamie Oliver Ministry of Food program (JMoF) Australia, Ipswich site. The JMoF program was originally developed by Jamie Oliver, a renowned celebrity chef and food author based in the United Kingdom (UK). JMoF Australia has been specifically adapted for the Australian setting. It is a community focused program that teaches basic cooking skills and good nutrition to non-cooks. It consists of 10 weekly 1.5 hour cooking skills classes aimed at getting people of all ages and backgrounds cooking simple, fresh, healthy food quickly and easily [11]. Participants pay AUD10 per class and, where this may pose a barrier to entry, subsidies are made available.

JMoF was pioneered as a community-based cooking skills program in Rotherham, UK in 2008 and since then, other centres have opened in Bradford and Leeds, along with a mobile centre in the North West of the UK. These centres were reliant on funding mostly from local councils and, to a lesser extent, charities and the private sector. The first Australian site opened in Ipswich in the state of Queensland in April 2011, co-funded by a local philanthropic non-government organisation (NGO), The Good Foundation (TGF), and the Queensland Department of Health. Ipswich was intentionally chosen given its significant low socio-economic status population [12] and increasing levels of overweight and obesity [13].

Objectives of the JMoF program Australia

Consultation occurred between TGF, Queensland Health (as program co-funder) and the program evaluation team to describe the objectives of the JMoF program Australia in sufficient detail to be tested in an applied evaluation. The following program objectives resulted:
  1. To provide opportunities for people of different ages and demographic backgrounds to experience and learn how to cook healthy meals quickly and cheaply.

  2. To increase program participants’ cooking skills, knowledge and self-efficacy.

  3. To increase program participants’ enjoyment of food and social connectedness.

Theoretical perspectives

A program logic model was developed as a framework to describe the potential pathways to behaviour change, and in turn to guide evaluative enquiry (Figure 1). Whilst some steps along the logic pathway were grounded by emerging or convincing evidence, other areas were backed by limited evidence, thereby requiring further hypothesis testing.
Figure 1

JMOF Australia program logic model.

Theoretical frameworks were not explicitly stated for the JMoF Program. However, Caraher and Lang, 1999 [2] have identified theoretical perspectives specific to cooking skills that are in keeping with the objectives of the JMoF program and its evaluation: cooking skills empower individuals in preparation for healthy eating, encourage self-esteem and provide opportunities for leisure and enjoyment.

Other theories that resonate with the program include Kolb’s concept of experiential learning [14], which identifies the importance of empowering participants with practical “get your hands dirty” experience in learning to cook from scratch as a basis for skill acquisition, and Bandura’s Social Cognitive Theory [15], which states that changes in attitudes and beliefs and the development of self-efficacy (i.e. confidence in cooking) are central to influencing behaviour change. Bandura’s Social Learning Theory [16] also states that modelling is an important component of the learning process and that opportunities for practising learned behaviours and positive reinforcement are needed for learning to take place. An important element in the learning process is role model credibility [17].

Methods/Design

The evaluation will be conducted over a 2.5 year period from late 2011 to early 2014. The evaluation was approved by the Deakin University Human Research Ethics Committee (HEAG-H 117_11) in October 2011. Evaluation project governance will be provided by a reference group (comprising personnel from the TGF team and the research team members) which will meet twice yearly to oversee the project. A representative from Queensland Health will be invited to attend Reference Group meetings, when appropriate.

The evaluation will use a questions-oriented approach [18] derived primarily from the JMoF Program objectives. It will also incorporate additional economic questions of relevance to potential government funders and program stakeholders. A longitudinal mixed methods evaluation design will be employed. The quantitative and qualitative components will be conducted sequentially, with baseline quantitative data informing sampling for the initial qualitative interviews. Each component will then be analysed independently, with merging of data occurring at the interpretation stage [19].

Quantitative study

Research questions

The quantitative component of the evaluation will answer the following research questions:

  1. Does the JMoF program increase participants’ skills, knowledge, attitudes, enjoyment and satisfaction of cooking and cooking confidence (self-efficacy)?

  2. Does the JMoF program result in broader positive outcomes for participants in terms of behaviour change to a healthier diet, more affordable healthy meals, improved self-esteem and social connectedness?

Outcome measures

There will be two primary outcome measures: a change in cooking confidence (self-efficacy) and a change in self-reported mean vegetable intake (serves per day). Secondary outcome measures will include change in self-reported measures of: (i) mean daily fruit intake, (ii) mean weekly takeaway/fast food intake, (iii) frequency of cooking the main meal from basic ingredients, (iv) nutrition knowledge, (v) attitudes towards cooking, (vi) willingness to try new foods and (vii) enjoyment and satisfaction of cooking. Change in psycho-social measures such as (viii) global self-esteem and (ix) social connectedness in relation to cooking and eating will be measured, as will (x) a change in participants’ total expenditure on food.

Study design

A quasi-experimental pre-post design will consist of an intervention group of participants undergoing the JMoF program and a control group comprising participants from the program waitlist who are waiting at least 10 weeks until program entry. Recruitment to each group will be based on program start date and will not be subject to randomisation. Randomisation was not possible as it would not allow participants any choice as to when and with whom they participated in the program – which are important aspects of the JMoF program design [20].

Intervention participants will be surveyed at three time-points: program commencement, program completion (10 weeks) and at six months post program completion. Controls drawn from the waitlist will be surveyed at two time-points: 10 weeks prior to program commencement and on completion of their 10 weeks on the waitlist (which will correspond to their program entry). A time-three measurement will not be obtained from controls as it was considered neither feasible nor acceptable for the waitlist controls to have to wait a further six months before entering the program (equivalent to the intervention follow-up period); this potentially would lead to a high drop-out rate both from the evaluation and the program itself. However, for one of the primary outcome measures, vegetable intake, Queensland state-wide monitoring data will be used as a proxy time-three measure for the control group.

Survey instrument

In collaboration with key stakeholders, a quantitative measurement tool was developed. The self-administered questionnaire was designed to be completed in approximately 15 minutes. Given the lack of validated and reliable survey tools which can accurately measure the impacts and outcomes of cooking skills programs in varying population groups [21], a prototype questionnaire was designed to address the unique objectives of the evaluation. Where suitable, specific questions were incorporated that have been previously used to measure the impact of cooking skills programs, particularly on cooking confidence and cooking behaviours, such as those used by Barton et al., 2011 [21].

To measure the primary outcome of cooking confidence as a proxy for cooking self-efficacy, questions were developed addressing confidence in relation to specific cooking skills, based on the work of Short [22] and Barton et al., 2011 [21]. These items are presented on a 5-point Likert confidence scale ranging from ‘not at all confident’ to ‘extremely confident’.

The other primary outcome measure of change in vegetable intake will be captured through self-report questions of vegetable intake (serves per day), aligned with baseline measurement questions of the same nature used by Queensland Health in its population-based self-reported health surveys [23]. Survey items addressing specific secondary outcome measures such as self-reported mean daily fruit intake and mean daily takeaway/fast food intake were also aligned with corresponding questions from the same baseline population health survey [23].

Other secondary outcome items include measuring change in the self-reported frequency of cooking the main meal from basic ingredients and the inclusion of salad or vegetables with the main meal. Nutrition knowledge questions, aligned with the nutrition messages embedded within the program, will test knowledge around salt, fat and sugar intake and have been adapted from Parmenter et al.’s work [24].

Participant attitudes towards cooking and eating healthy foods, willingness to try new foods, and enjoyment and satisfaction in cooking and eating healthy foods will be tested using Likert scale based questions. Questions about shared enjoyment of cooking, eating and normative eating behaviours were adapted from questions from The Stephanie Alexander Kitchen Garden (SAKG) evaluation [25]. The Rosenberg Self-Esteem Scale (RES), a widely validated and reliable measure of self-esteem, will also be administered [26].

Participants will be asked to report their total household food and drink (excluding alcohol) expenditure and expenditure specifically on take-away food and fruit and vegetables. Height and weight will be self-reported to enable the calculation of Body Mass Index (BMI).

Piloting

The questionnaire was piloted in a 3-step process to test the design, content and potential delivery methods. Following comment by the reference group and academic colleagues on content and layout/design, a paper-based version of the questionnaire was piloted with a sample (N = 30) of the current JMoF population. Feedback was invited through informal focus group sessions, facilitating the identification of any questions that were ambiguous, sensitive or required further development.

The final stage of piloting involved testing the questionnaire in an online format whilst simultaneously testing the online survey distribution system and the likely response rate. As the survey distribution required integration between the JMoF participant database and the survey platform, Qualtrics™ [27], the piloting tested that these two components were effectively integrated and capable of meeting the necessary information needs.

Recruitment

All participants registered on the JMoF Australia waitlist database, aged 18 years or over, and who had received a confirmed start date for the program, will be eligible to participate in the quantitative component of the evaluation. Whilst the program is open to all members of the Ipswich community over the age of 12 years, the evaluation will target the adult population only, as other cooking skills programs exist within the Greater Ipswich region that are specifically targeting children and adolescents in educational settings.

All participants will be required to consent prior to participation in the evaluation. A link to the questionnaire will be embedded within an email generated by the JMoF program database, inviting program participants to participate in the evaluation questionnaire.

Recruitment to both groups will be based on confirmed program start dates. As control participants are required to be on the program waitlist for 10 weeks, and because the JMoF waitlist is sufficiently large to enable this to occur naturally, participants who are allocated a start date more than 10 weeks ahead will automatically be made controls, whilst persons commencing the program within 10 weeks will be assigned to the intervention group. Computer programmed “rules” in the JMoF participant database have been created to automate this process.
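The automated allocation rule could be sketched as follows. This is a hypothetical illustration only: the function and variable names are our own, as the actual database rules are not published.

```python
from datetime import date, timedelta

# Control period before program entry, as described in the protocol
WAITLIST_WINDOW = timedelta(weeks=10)

def assign_group(registration_date: date, confirmed_start: date) -> str:
    """Allocate a waitlisted registrant by confirmed program start date.

    Registrants starting more than 10 weeks out serve as waitlist
    controls; those starting within 10 weeks join the intervention group.
    """
    if confirmed_start - registration_date > WAITLIST_WINDOW:
        return "control"
    return "intervention"

# A start date roughly 16 weeks away yields a control allocation
print(assign_group(date(2012, 2, 1), date(2012, 5, 23)))  # control
```

Encoding the rule as a pure function of two dates makes the allocation deterministic and auditable, which matters for a non-randomised design where group assignment must be reproducible from the database alone.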

To show appreciation for participation, those intervention participants who complete the third and final questionnaire at six months post completion of the program will be sent a $20 “Good Guys” store charge card redeemable at any Good Guys store (white goods retailer) across Australia.

Data collection

Data collection commenced in February 2012 following completion of piloting. Whilst the challenges of delivering online surveys are well documented [28], online delivery was deemed the most feasible and practical first method, with subsequent delivery of a postal version to non-respondents and persons who do not have a working email address or access to a computer.

Sample size

Given that the specific questions developed for this study to measure confidence to cook (cooking self-efficacy) have not been previously employed, a precise sample size calculation for this primary outcome could not be performed, given the absence of a priori baseline measures, measures of effect size and measures of variance (standard deviation).

Sample size calculations around the second primary outcome of a change in vegetable intake between the two groups assumed the use of a split-plot ANOVA and F test for interaction, given the waitlist design. The literature does not provide clear guidance with regards to an expected effect size for a program of this nature [29, 30]. However, as there is some evidence from Wrieden et al. that an effect size of one serve per day may be too large for a program of this nature [20], sample size calculations are based on an effect size range likely to be achieved [29, 30]. For an effect size of 0.5 serves a day, starting from a mean daily consumption at baseline of 2.5 serves per day [13, 23], 250 subjects per group will be required at 80% power (0.05 significance level). In the event that accrual is slower than expected, recruiting at least 140 participants will give 80% power for an effect size of 0.7 serves a day. There will be no analysis of the data before accrual has closed.
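The quoted numbers can be roughly reproduced with the standard normal-approximation formula for comparing two means. The standard deviation of about 2 serves/day used below is our illustrative assumption, not a figure reported in the protocol:

```python
from statistics import NormalDist
from math import ceil

def per_group_n(delta: float, sd: float,
                alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group sample size for detecting a mean difference `delta`
    between two groups (two-sided test, normal approximation)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)  # critical value for two-sided significance
    z_b = z.inv_cdf(power)          # quantile for the desired power
    return ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

print(per_group_n(0.5, 2.0))  # 252, close to the 250 per group quoted
print(per_group_n(0.7, 2.0))  # 129, within the >=140 fallback recruitment
```

The simple formula ignores the split-plot ANOVA structure the protocol actually specifies, so it should be read only as a sanity check on the order of magnitude.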

Data analysis

Demographic and baseline characteristics will be summarised for both intervention and control groups using standard summary statistics (mean and standard deviation) and non-parametric statistics (medians and inter-quartile ranges) where applicable. Frequencies and percentages will be reported for categorical variables. The magnitude of change both within and between intervention and control groups for the T1, T2 and T3 time-points will be assessed. For continuous data, such as fruit and vegetable intake, two-sample t-tests will be employed to compare means between intervention and control groups at each time point, and paired t-tests for within-group comparisons. A split-plot in time ANOVA will be used as the basis for these t-tests. For categorical data, frequencies will be reported for intervention and control groups and chi-squared analysis will provide between-group comparison results. Regression analysis will be conducted to determine the potential contribution of specific demographic factors to the outcome variables of interest. All data analysis will be performed using Stata S.E. 12.0 statistical software.
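As a rough illustration of the between-group comparison only (the protocol's actual analysis embeds the t-tests in a split-plot ANOVA in Stata), a Welch two-sample t statistic can be sketched with the standard library; the data below are invented toy values:

```python
from statistics import mean, variance
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch two-sample t statistic and approximate degrees of freedom
    for comparing group means (e.g. vegetable serves/day,
    intervention vs control). Illustration only."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    se2 = va / na + vb / nb                          # squared standard error
    t = (mean(sample_a) - mean(sample_b)) / sqrt(se2)
    # Welch-Satterthwaite approximation to the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented serves/day for two tiny groups; a real analysis would use
# the full survey data within the split-plot framework.
t, df = welch_t([3.0, 2.5, 3.5, 4.0], [2.0, 2.5, 1.5, 2.0])
print(round(t, 2), round(df, 1))  # 3.27 5.1
```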

Qualitative study

Research questions

To further understand how and why the JMoF program impacts on participants, the qualitative investigation will explore the following:

  1. What are the expectations and experiences of participants?

  2. What are the moderators, facilitators and barriers to behaviour change?

  3. Were there any unanticipated outcomes?

Study design

A longitudinal design will be employed for the qualitative study to follow program participants over the course of their JMoF journey. This will allow for prospective accounts of a participant’s experience and change over time [31, 32]. Repeated semi-structured interviews will be conducted with participants. Up to three interviews will be conducted: prior to program commencement, on program completion, and six months after completion.

Participant interviews

The interview structure was generated to capture participant perspectives, to explore moderators, facilitators and barriers to behaviour change, and to capture any unanticipated outcomes from program involvement. As indicated in Table 1, the purpose of the sequential interviews is to understand participant expectations and experiences at different stages of program involvement. The interviews will be unstructured; however, questions and prompts will be used to guide the discussion where appropriate. Table 1 lists the general discussion points used for each participant (the second and third interviews will build on previous discussions).
Table 1

Interview structure

Interview 1 (prior to commencement of the program): Motivations for registering for the program and expectations of the program; discussion of previous and current food and cooking experiences.

Contact (during program): Phone conversation to recruit participants to repeat interviews and to enquire how the course is going.

Interview 2 (after program completion): Discussion of the program experience and whether program expectations were met; whether participants experienced any changes in food and cooking behaviour, and any unanticipated changes as a result of the program.

Interview 3 (six months after program completion): Discussion of whether any changes in food and cooking behaviour resulting from the program have been sustained; any unanticipated changes as a result of the program; reflection on what was talked about in the last interview.

Interviews of approximately 30-40 minutes duration will take place in a public location that provides a comfortable environment for both interviewer and participant. All interviews will be conducted by the same researcher, and participants will be required to consent to both participation and recording of the interview. To show appreciation for participation, participants will be thanked with a $15 Coles supermarket gift voucher at each interview.

Participant sampling

Purposive sampling will be employed initially to capture maximum variation [33] in factors considered likely to influence expectations and experiences of the JMoF program. These factors were captured in the participant questionnaire and included socio-economic status, age, gender, family structure, and cooking confidence level. Further sampling will be informed by themes emerging from the initial interviews and instructor observations about characteristics that seem to influence participant motivations and experience. Interview one will subsequently be conducted with approximately 10-15 participants. Participants who provide rich data in terms of insights and unique perspectives, and who are willing to continue, will be invited to progress to interviews two and three. In the event that there are insufficient numbers to progress (fewer than 6), new participants will be recruited from new enrolments.

Participant recruitment

Participants who have completed the baseline quantitative study survey and have agreed to be contacted for an interview will be eligible to participate, as well as all participants who are within the first three weeks of commencing the program. The baseline survey data will assist in purposive selection for the qualitative interviews, using criteria based on demographic and personal circumstances as previously described. Opportunistic purposive sampling will also be carried out during the researcher’s time in Ipswich during class observations. Selected participants will be contacted by phone or in person (in class context), provided with information about the qualitative component of the study and invited to participate in the interviews. Written consent will be required prior to participation in the interview process.

Data analysis

The interviews will be transcribed verbatim. The data will be managed with the assistance of the qualitative software package NVivo 9 (NVivo 9 [program]: QSR International Pty Ltd 2011). Concurrent data collection and analysis will be conducted to allow for confirmation of emerging themes and clarification of any contradictory findings [34]. To contribute to the analytical process, the interviewer will record post-interview memos as reflections, including contextual information, non-verbal factors of note, reflections on the interview process, and thoughts about emerging patterns or contradictions in the data [33, 35].

The analysis of interview transcripts and interview memos will be conducted by the interviewer using inductive thematic analysis. The data will first be coded and then categorised to allow themes and patterns to emerge [34, 36]. A second researcher will independently generate codes on a sub-sample of transcripts [37]. Comparisons will then be conducted and any differences will be discussed to achieve consensus in the final codes. The categorised data will then be reviewed to explore similarities and differences, to identify patterns and to determine whether there are specific relationships occurring between categories that together provide an overall conceptual picture of the impact of the JMoF program within the context of its unique setting and population. The resultant conceptual analysis will then be compared both to relevant theoretical frameworks and to the literature base to determine if it resonates with existing knowledge or makes a new contribution to the evidence.

Integrated analysis

In addition to the separate analysis of the quantitative and qualitative results, integration of the respective findings will be conducted. This will involve examining consistencies and inconsistencies in the findings from each method [19, 38] to build a more nuanced and comprehensive understanding of the JMoF program impacts and outcomes. This added depth and breadth will inform the conclusions drawn from the evaluation.

Discussion

The evaluation of the JMoF program will contribute to the growing body of literature on the effectiveness of community-based cooking skills programs. It will employ a mixed methodology to draw on the strengths of both quantitative and qualitative study design to best capture and measure experiences, impacts and outcomes of cooking skills programs. The methods described herein will inform the research community about program outcomes and facilitate comparisons of results with other cooking skills programs conducted in comparable populations.

This study will also provide insights into practical considerations required when designing program evaluations in community settings. These include factors such as recruitment of a comparison group, minimisation of participant data collection burden, and the suitability and feasibility of selected data collection modes, which must be considered without compromising study design or program integrity.

There are both strengths and limitations to the evaluation design. Mixed methods studies can risk compromising methodological rigour if integration occurs at the point of data collection and/or analysis, potentially undermining paradigm and process considerations [18, 19]. This is not an issue in the current evaluation, with integration only occurring in relation to sample identification and final integration of findings.

Whilst it is acknowledged that the use of a non-randomised quasi-experimental design makes the quantitative study vulnerable to sampling bias, practical limitations prevented the application of a randomised design. Despite this potential bias, the use of a waitlist control and pre and post measures supports attribution of any changes to the program.

Potential selection bias associated with the choice to participate or not in the quantitative study may also occur. However, various methods were employed in an attempt to address this issue: providing participants with multiple options for survey completion, follow up of non-responders and the use of incentives.

In the quantitative study, there is no direct measure of cooking skills, despite the JMoF program being a cooking skills program per se. However, there is currently no gold standard for the measurement of cooking skills in an adult population, nor consensus on the definition of cooking skills or whether changes in cooking skills alone will predict the likelihood of changes in cooking behaviour [3]. Therefore confidence to cook, which reflects self-efficacy, a relatively strong predictor of behaviour change, was the chosen measure for the evaluation, as suggested by Winkler, Wrieden and Barton et al. [3, 20, 21]. It is noted that even Barton et al.’s confidence questions, upon which some of the current survey questions are based, whilst considered reliable, have yet to be formally tested in the community setting [21]. Another limitation of the quantitative study is the reliance on self-reported measures. Yet lessons learnt from previous evaluations [20, 21] suggest that the use of more intensive methods would likely overburden participants and lead to low participation rates.

In summary, the use, in this evaluation, of a mixed method, pre-post design with a waitlist control group will provide sufficient strength of evidence to assess the impact of the JMoF program on participants’ attitudes and behaviours. It will also make a contribution to the limited evidence base about the effectiveness of community-based cooking programs.

Declarations

Acknowledgments

The authors would like to acknowledge Alicia Peardon and all the staff of The Good Foundation, as well as Christina Stubbs of the Queensland Department of Health, for their contributions in relation to evaluation design and implementation. The authors also thank Associate Professor John Reynolds, Deakin University, for his contribution to the quantitative data analysis design. Moodie and Swinburn are researchers within an NHMRC Centre for Research Excellence in Obesity Policy and Food Systems (APP1041020).

Authors’ Affiliations

(1)
Deakin Health Economics, Faculty of Health, Deakin University
(2)
Jack Brockhoff Child Health and Wellbeing Program, The McCaughey Centre, Melbourne School of Population Health, The University of Melbourne
(3)
WHO Collaborating Centre for Obesity Prevention, Faculty of Health, Deakin University
(4)
School of Population Health, Faculty of Medical and Health Sciences, University of Auckland

References

  1. Caraher M, Dixon P, Lang T, Carr-Hill R: The state of cooking in England: the relationship of cooking skills to food choice. Brit Food J. 1999, 101 (8): 590-609. 10.1108/00070709910288289.
  2. Caraher M, Lang T: Can't cook, won't cook: a review of cooking skills and their relevance to health promotion. Int J Health Promot Educ. 1999, 37 (3): 89-100.
  3. Winkler E: Food accessibility, affordability, cooking skills, and socioeconomic differences in fruit and vegetable purchasing in Brisbane, Australia. 2008, Australia: Queensland University of Technology, Institute of Health and Biomedical Innovation, School of Public Health.
  4. Winkler E, Turrell G: Confidence to cook vegetables and the buying habits of Australian households. J Am Diet Assoc. 2009, 109 (10): 1759-1768. 10.1016/j.jada.2009.07.006.
  5. Block K, Johnson B: Evaluation of the Stephanie Alexander Kitchen Garden Program. Final report to the Stephanie Alexander Kitchen Garden Foundation. Edited by: The McCaughey Centre, VicHealth Centre for the Promotion of Mental Health and Community Wellbeing. 2009, Melbourne: University of Melbourne and Deakin University.
  6. Fulkerson JA, Rydell S, Kubik MY, Lytle L, Boutelle K, Story M, Neumark-Sztainer D, Dudovitz B, Garwick A: Healthy Home Offerings via the Mealtime Environment (HOME): feasibility, acceptability, and outcomes of a pilot study. Obesity. 2010, 18 (Suppl 1): S69-S74. 10.1038/oby.2009.434.
  7. Keller HH, Gibbs A, Wong S, Vanderkooy PD, Hedley M: Men can cook! Development, implementation, and evaluation of a senior men's cooking group. J Nutr Elderly. 2004, 24 (1): 71-87. 10.1300/J052v24n01_06.
  8. Foley W, Spurr S, Lenoy L, De Jong M, Fichera R: Cooking skills are important competencies for promoting healthy eating in an urban Indigenous health service. Nutr Diet. 2011, 68 (4): 291-296. 10.1111/j.1747-0080.2011.01551.x.
  9. Engler-Stringer R: Food, cooking skills, and health: a literature review. Can J Diet Pract Res. 2010, 71 (3): 141-145. 10.3148/71.3.2010.141.
  10. Rees R, Hinds K, Dickson K, O'Mara-Eves A, Thomas J: Communities that cook: a systematic review of the effectiveness and appropriateness of interventions to introduce adults to home cooking. EPPI-Centre report 2004. 2012, London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.
  11. Ministry of Food Mission. http://www.thegoodfoundation.com.au/about-us/
  12. Office of Economic and Statistical Research: Queensland Regional Profiles: Ipswich City, based on local government area (2010). Profile generated on 27 May 2011. 2011.
  13. Queensland Health: Self-reported Health Status 2009-2010: Local Government Area Summary Report. 2011, Brisbane.
  14. Kolb DA: Experiential learning: experience as the source of learning and development. 1984, Englewood Cliffs, NJ: Prentice Hall.
  15. Bandura A: Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev. 1977, 84 (2): 191-215.
  16. Bandura A: Social Learning Theory. 1977, Englewood Cliffs, NJ: Prentice-Hall.
  17. Turner G, Shepherd J: A method in search of a theory: peer education and health promotion. Health Educ Res. 1999, 14 (2): 235-247. 10.1093/her/14.2.235.
  18. Stufflebeam DL: Evaluation models. New Directions for Evaluation. 2001, San Francisco, CA: Jossey-Bass, 89.
  19. An inclusive framework for conceptualizing mixed method design typologies. Edited by: Nastasi B, Jea H. 2010, London: Sage.
  20. Wrieden WL, Anderson AS, Longbottom PJ, Valentine K, Stead M, Caraher M, Lang T, Gray B, Dowler E: The impact of a community-based food skills intervention on cooking confidence, food preparation methods and dietary choices - an exploratory trial. Public Health Nutr. 2006, 10 (2): 203-211.
  21. Barton KL, Anderson AS, Wrieden WL: Validity and reliability of a short questionnaire for assessing the impact of cooking skills interventions. J Hum Nutr Diet. 2011, 24: 588-595. 10.1111/j.1365-277X.2011.01180.x.
  22. Short F: Domestic cooking skills - what are they?. J HEIA. 2003, 10 (3): 13-22.
  23. Queensland Health: Self-Reported Adult Health Status: Queensland. 2009 Survey Report. Edited by: Pollard G, White D, Harper C. 2009, Brisbane: Queensland Health.
  24. Parmenter K, Wardle J: Development of a general nutrition knowledge questionnaire for adults. Eur J Clin Nutr. 1999, 53 (4): 298-308. 10.1038/sj.ejcn.1600726.
  25. Block K, Johnson B, Gibbs L: Final report to the Stephanie Alexander Kitchen Garden Foundation. Edited by: The McCaughey Centre, VicHealth Centre for the Promotion of Mental Health and Community Wellbeing. 2009, Melbourne: University of Melbourne and Deakin University.
  26. Ang RP, Neubronner M, Oh S-A, Leong V: Dimensionality of Rosenberg's Self-Esteem Scale among normal-technical stream students in Singapore. Curr Psychol. 2006, 25 (2): 121-131.
  27. Qualtrics online survey software. http://www.qualtrics.com
  28. Lefever S, Dal M, Matthíasdóttir A: Online data collection in academic research: advantages and limitations. Brit J Educ Technol. 2007, 38 (4): 574-582. 10.1111/j.1467-8535.2006.00638.x.
  29. Brunner EJ, Rees K, Ward K, Burke M, Thorogood M: Dietary advice for reducing cardiovascular risk. Cochrane Database Syst Rev. 2007, 4: CD002128.
  30. Pomerleau J, Lock K, Knai C, McKee M: Interventions designed to increase adult fruit and vegetable intake can be effective: a systematic review of the literature. J Nutr. 2005, 135: 2486-2495.
  31. Shirani F, Henwood K: Continuity and change in a qualitative longitudinal study of fatherhood: relevance without responsibility. Int J Soc Res Methodol. 2010, 14 (1): 17-29.
  32. Farrall S: What is qualitative longitudinal research?. Papers in Social Research Methods, Qualitative Series No. 11. 2006, London: London School of Economics and Political Science, Methodology Institute.
  33. Qualitative research and evaluation methods. Edited by: Patton M. 2002, California: Sage, 3.
  34. Green J, Willis K, Hughes E, Small R, Welch N, Gibbs L, Daly J: Generating best evidence from qualitative research: the role of data analysis. Aust N Z J Public Health. 2007, 31 (6): 545-550. 10.1111/j.1753-6405.2007.00141.x.
  35. Gibbs L, Kealy M, Willis K, Green J, Welch N, Daly J: What have sampling and data collection got to do with good qualitative research?. Aust N Z J Public Health. 2007, 31 (6): 540-544. 10.1111/j.1753-6405.2007.00140.x.
  36. Liamputtong P: Qualitative data analysis: conceptual and practical considerations. Health Promot J Austr. 2009, 20 (2): 133-139.
  37. Liamputtong P: Qualitative research methods. 2009, Melbourne: Oxford University Press, 3.
  38. Tashakkori A, Teddlie C: Sage handbook of mixed methods in social & behavioral research. 2010, Los Angeles: SAGE Publications, 2.
Pre-publication history

The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1471-2458/13/411/prepub

Copyright

© Flego et al.; licensee BioMed Central Ltd. 2013

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
