The evolution of health literacy assessment tools: a systematic review

Abstract

Background

Health literacy (HL) is seen as an increasingly relevant issue for global public health and requires a reliable and comprehensive operationalization. To date, there is limited evidence on how the development of tools measuring HL has proceeded in recent years and whether scholars considered existing methodological guidance when developing an instrument.

Methods

We performed a systematic review of generic measurement tools developed to assess HL by searching PubMed, ERIC, CINAHL and Web of Knowledge (2009 onward). Two reviewers independently reviewed abstracts and full-text articles for inclusion according to predefined criteria. Additionally, we conducted a reporting-quality appraisal according to the survey reporting guideline SURGE.

Results

We identified 17 articles reporting on the development and validation of 17 instruments measuring health literacy. More than two-thirds of all instruments are based on a multidimensional construct of health literacy. Moreover, there is a trend towards mixed measurement (self-report and direct test) of health literacy, applied in 41% of instruments, though results strongly indicate weak coherence between the underlying constructs measured. Overall, almost one in three instruments is based on assessment formats modeled on existing functional literacy screeners such as the REALM or the TOFHLA, and 30% of the included articles do not report on significant reporting features specified in the SURGE guideline.

Conclusions

Scholars who have recently developed instruments to measure health literacy largely comply with recommendations of the scientific community by applying multidimensional constructs and combining measurement approaches to capture health literacy comprehensively. Nonetheless, there is still a dependence on assessment formats rooted in functional literacy measurement, which contradicts the widespread call for new instruments. All things considered, there is no clear “consensus” on HL measurement but a convergence towards more comprehensive tools. Attention to this finding can help to guide the development of comparable and reliable health literacy assessment tools that effectively respond to the informational needs of populations.

Background

Health literacy is an important determinant of public and individual health and is seen as a core element of patient-centered care [1]. In recent years there has been a growing effort to adjust the structures of health care systems to the population’s health literacy, helping people navigate the layers of the health care system successfully [2]. The underlying objective is to enhance access to health care services for vulnerable populations [3].

Overall, health literacy denotes “people’s knowledge, motivation and competences to access, understand, appraise and apply health information in order to make judgments and take decisions in everyday life concerning health care to maintain or improve quality of life during the life course” [4]. Improving people’s knowledge is important because numerous studies have demonstrated a distinct interplay between limited health literacy, poor health outcomes and avoidable health care service utilization [5–7]. Meanwhile, the prevalence of limited health literacy is high, affecting 26% of the population in the United States and between 29% and 62% of the populations of eight European countries [8, 9]. Consequently, the importance of health literacy has been recognized at national and international levels, and great efforts are being made to reduce the risk of limited health literacy by setting up international collaborations, drafting national priority action plans and determining legal regulations [10, 11]. Following this course, a key prerequisite for the transformation to a health-literacy-friendly health care system is the availability of detailed and comparable information on population-based health literacy [12].

Therefore, calls for the development of an internationally comparable and reliable population-based measure of health literacy are increasing [12].

Several definitions and theoretical frameworks of health literacy are now in place, serving as a foundation for operationalizing health literacy through framework-based measures [4]. These instruments have been developed to measure health literacy on the basis of skills related to finding, understanding, evaluating, communicating and using health-related information in healthcare decision making [13, 14]. Using objective or subjective measurement modes, that is, deriving a direct test of skills or obtaining a self-report of perceived skills, scholars have identified central pillars of health literacy such as print, prose and document literacy, numeracy and oral literacy [15]. Although multiple measurement modes are applied, a number of specific critiques appear in the academic literature, principally concerning varying definitions and frameworks of health literacy, incomprehensive measurement approaches and inconsistent reporting of psychometric properties [16, 17]. Health literacy thus involves a “constellation of skills” [18] including the ability to interpret documents, read and write prose (print literacy), use quantitative information (numeracy or quantitative literacy) and communicate effectively (oral literacy), and all of these skills need to be addressed when developing a tool [15]. To date, there is no evidence on how health literacy measurement has proceeded in the last few years and whether recently published articles dealing with the development of health literacy measures consider the methodological critiques and recommendations of the scientific community regarding the set of features an instrument has to cover [16, 17].

In this systematic review, we evaluate the status quo of health literacy measurement by providing insights into the currently applied measurement approaches and modes. Further, we appraise the reporting quality of publications dealing with the development and validation of instruments measuring health literacy. The review will help to verify whether recently developed tools aiming to measure health literacy consider the methodological critiques in the academic literature and contribute to the improvement of health literacy measurement.

Methods

We conducted a systematic review of generic measurement instruments developed and validated to assess health literacy. Our review is in accordance with the recently extended guidelines of the PRISMA statement for reporting systematic reviews [19] (see Figure 1 and Additional file 1). The 27-item checklist ensures transparent and complete reporting of systematic reviews and meta-analyses.

Figure 1

PRISMA flow diagram of systematic review inclusion and exclusion process.

Data sources and selection

The review was completed using PubMed, the Educational Resources Information Center (ERIC), the Cumulative Index to Nursing and Allied Health Literature (CINAHL) and Web of Knowledge databases. Additionally, references in previously published reviews and other publications were screened, and a manual search of websites and print sources dealing with health literacy measurement was conducted. The search strategies encompassed key words as well as MeSH terms, depending on the database, and were supplemented by synonyms and thesaurus terms as described in Additional file 2. The search covered the period from January 2009 to 24 April 2013 and was limited to fully available English-language publications developing and validating (testing, evaluating) generic instruments to measure health literacy. We decided on this specific period to cover literature left out in previous reviews on health literacy measurement [13, 20]. The search was limited to instruments targeting adolescents and adults. Translations of instruments originally published before the search period were excluded.

Data extraction and assessment of reporting quality

Two independent reviewers screened the titles and abstracts of all unique publications and then reviewed the full texts of all records passing the title/abstract screen. All health literacy instruments were categorized according to their characteristics, including their purpose, applied health literacy taxonomy and construct, instrument design, availability, scoring method, validation-study sample characteristics, and psychometric properties such as reliability and validity. Additionally, a quality assessment according to the specifications of the reporting guidelines for survey research (SURGE) was performed. It encompasses reporting items in eight subdomains: article background, methods used, sample selection criteria, research tool characteristics, response rate, presentation of results, interpretation and discussion of findings, and ethics and disclosure requirements [21]. Categories within the framework were adapted when relevant for the appraisal of health literacy indices. Accurate reporting on the development and validation of instruments assessing patient-reported outcomes such as health literacy is important for an objective assessment of the applied methods and the identified psychometric properties, and therefore for the generalizability of study results. During the research process in particular, transparency concerning methodological issues can help to enhance overall study quality by allowing refinements of the instrument. SURGE is well suited to appraising the reporting quality of surveys, including detailed information on the characteristics of the survey instruments used, and therefore served as an appropriate instrument to appraise the reporting quality of health literacy indices.

After extraction, the instrument characteristics were entered into an evidence table and critically assessed for reporting quality by two independent raters, followed by a third rater checking the extracted information for accuracy. Disagreements were resolved by a consensus process among the three raters.

Results

The PRISMA flow chart in Figure 1 summarises the results of the search process. Our search yielded 17 generic instruments. The majority of excluded articles did not measure health literacy (n = 196) or did not report on the development and/or validation of a novel health literacy assessment tool (n = 168). Nine instruments had to be excluded due to a non-generic measurement approach [22–30] and eight were direct translations of already developed instruments into other languages [22, 28, 30–35].

Study characteristics

Among the 17 included publications on the development/validation of generic health literacy measurement tools, certain patterns can be identified. As depicted in Table 1, about one third of the instruments use either a direct test of an individual’s abilities (objective measurement) or the elicitation of self-reported abilities (subjective measurement). In studies using the objective measurement approach, patient abilities are assessed by solving tasks dealing with print literacy, numeracy or oral literacy, whereas the subjective approach is characterized by the self-report of perceived abilities in multiple domains. Moreover, according to Table 1, the combination of both measurement modes is found in 41.2% of all identified instruments.

Table 1 Measurement modes and approaches of health literacy

The generic instruments identified here follow the multidimensional measurement approach: all of them assess print literacy, and almost half measure quantitative abilities. In contrast, only three instruments consider communication skills when measuring an individual’s health literacy. In line with the theoretical framework of health literacy as a multidimensional, dynamic construct [36] requiring a comprehensive approach, 76.5% of all identified instruments are based on a multidimensional construct of health literacy, as shown in Table 1. These instruments address multiple domains of health literacy such as healthcare information seeking, communication in the patient-provider encounter, interaction with the health care system and awareness of rights and responsibilities [37]. Additionally, a multidimensional measurement approach is pursued in almost all instruments, mostly assessing print and quantitative literacy.

Health literacy assessment by an objective measurement approach

The direct testing of competencies related to the health literacy construct is used frequently in the academic literature, and five novel instruments were published in the search period. The Medical Term Recognition Test (METER), developed in the United States, is a brief self-administered screening tool (2 min administration time) for the clinical setting and includes 40 medical words and 40 words without an actual meaning (non-words); examinees are asked to identify the medical words [38]. The format of the tool includes many words from the Rapid Estimate of Adult Literacy in Medicine (REALM) [39]; accordingly, there is a high correlation (r = 0.74) between the instruments [38]. The Short Assessment of Health Literacy for Spanish and English populations (SAHL-S&E) also uses a word recognition approach as applied in the REALM and combines it with a comprehension test using multiple choice questions designed by an expert panel [40]. To ensure word recognition as well as comprehension, examinees read aloud 18 medical terms and associate each term with another word similar in meaning. Both the English and the Spanish versions of the test demonstrate high correlations with other health literacy indices, display high reliability and are particularly suitable for screening individuals with low health literacy [40]. One instrument developed to measure health and financial literacy addresses the link between literacy and decision making in the context of health-related and financial factors. It examines health literacy using nine items dealing with health knowledge regarding health insurance, burden of disease and medication skills [41]. The Critical Health Competence Test (CHC Test) consists of 72 items presented in four scenarios dealing with skills such as understanding medical concepts, searching the literature, basic statistics and the design of experiments and samples [42]. The bilingual health literacy assessment (Talking Touchscreen) builds a novel item pool in accordance with items used in the Test of Functional Health Literacy in Adults (TOFHLA). It measures prose, document and quantitative literacy in the field of certain lifestyle diseases as well as insurance-related issues and patient rights, administering these items with a multimedia device [43, 44]. A detailed description of the characteristics of instruments using an objective measurement approach is presented in Table 2.

Table 2 Main instrument characteristics categorized into objective, subjective and mixed measurement

Health literacy assessment by subjective measurement tools

All identified instruments measuring health literacy by self-report use a multidimensional concept of health literacy, integrating several domains and factors associated with health literacy. The self-report approach was applied in five instruments published in the search period. The Multidimensional Measure of Adolescent Health Literacy (MAHL) assesses health literacy as a dynamic construct by addressing several domains: the patient-provider encounter, interaction with the health care system, rights and responsibilities, and health information. Its items were developed by analyzing numerous existing instruments, identifying relevant items and modifying or supplementing them with new items [37]. The Health Literacy Management Scale (HeLMS) consists of eight scales with 4–5 items each and aims to assess health literacy using a comprehensive approach. It encompasses multiple domains such as patient attitudes towards health and their proactivity, access to, understanding and use of health information, and access to and communication with healthcare professionals [16]. The 127-item Swiss Health Literacy Survey (HLS-CH) also addresses numerous domains such as information and (critical) decision making, cognitive and interpersonal skills as well as problem solving. In this view, health literacy is a package of competencies interacting with each other [45]. The All Aspects of Health Literacy Scale (AAHLS) measures health literacy based on the framework developed by Nutbeam [46] and assesses functional, communicative and critical literacy using 14 items derived from an analysis of existing scales in the fields of health and media literacy [47]. Seemingly relevant items from numerous sources were adopted, partially modified and supplemented, resulting in adequate overall reliability (Cronbach’s alpha = 0.74) but weak consistency among the subscales. The 63-item Health Literacy Scale developed in Taiwan (MHLS) also captures health literacy as a multi-domain construct encompassing obtaining, understanding and processing health-related information related to health promotion, disease symptoms, diagnosis and treatment, and using it in decision making [48]. A further detailed description of the characteristics of instruments applying a subjective measurement approach is presented in Table 2.

Health literacy assessment by a mixed measurement approach

The combination of direct testing and self-report of health literacy skills is practiced frequently; seven instruments identified in the search period use this approach. It combines the methodological advantages of both approaches while diminishing their respective limitations [49]. The Health Literacy Skills Instrument (HLSI) and its short form (HLSI-SF) are 25- and 10-item tools that use real-life health stimuli to assess an individual’s health literacy, addressing print, oral, quantitative and internet-based information-seeking skills. The short form was derived by analyzing the psychometric properties of the HLSI and selecting the best performing items. Additionally, an 8-item self-report of perceived performance on the skills addressed in the direct assessment is conducted. Both components assess print literacy, numeracy, oral literacy and media literacy in different ways, demonstrating acceptable internal consistency reliability with a Cronbach’s alpha of 0.86 for the HLSI and 0.70 for the HLSI-SF [36, 50]. The European Health Literacy Survey (HLS-EU), carried out in eight European countries (Germany (NRW), Bulgaria, Austria, Greece, Spain, Ireland, the Netherlands, Poland), also uses a mixed assessment approach, measuring functional health literacy with the Newest Vital Sign (NVS) and applying a self-report survey with 47 items. It defines health literacy in three domains (health care, disease prevention, health promotion) and four modes (access, understand, evaluate and apply health information). Although the HLS-EU demonstrates robust reliability, with a Cronbach’s alpha of 0.97 for general health literacy, the Spearman’s rho correlation between the NVS and the HLS-EU (r = 0.245) is comparatively low, indicating different underlying constructs of health literacy [8, 49, 51]. Similar findings are apparent in a Canadian exploratory study aiming to define a health literacy measure by combining nine self-report items dealing with the access, understanding and appraisal of health information as well as communication skills in the patient-provider encounter with nine task-performance (objective) items focusing on understanding health-related skills; a correlation between the measurement approaches could not be demonstrated [52]. A further Canadian study, developing an instrument for measuring the health literacy of Canadian high school students, focuses on skills to understand and evaluate health information. It uses 11 health-related passages from several sources (internet, health centers, health education and media materials) and 47 items examining the comprehension and interpretation of the information presented in the passages. A self-rating of health literacy skills is also included. Despite a satisfactory overall reliability (Cronbach’s alpha of 0.92), the bivariate correlation of r = 0.256 between the self-rating and the direct testing does not indicate strong coherence [53]. The brief subjective measures of numeracy (SNS) and general health literacy (SLS) form an 11-item instrument combining a subjective measurement of functional literacy using the SBSQ [54] and the Subjective Numeracy Scale (SNS) [55] with several previously developed objective indices to reduce self-report bias, demonstrating robust internal reliability [56].
The health literacy measurement applied in the Special Diabetes Program for Indians (SDPI-HH-PL) follows a similar approach by combining items of the SBSQ to measure document literacy by self-report with items of previously published instruments to measure numeracy by directly testing quantitative skills [54, 57–59]. Although the mixed measurement approach broadens the health literacy framework, some studies indicate an absence of coherence between the underlying constructs, subsequently finding no meaningful correlations between the measurement approaches [8, 52, 53]. A further detailed description of the characteristics of instruments applying a mixed measurement approach is presented in Table 2.
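
To make the reported statistics concrete, the sketch below uses simulated data (not drawn from any of the reviewed instruments; all variable names and values are illustrative assumptions) to show how internal consistency (Cronbach’s alpha) and cross-mode coherence (Spearman’s rho between a self-report sum score and an objective test score) are typically computed.

```python
import numpy as np
from scipy.stats import spearmanr

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return (k / (k - 1)) * (1 - item_var / total_var)

# Simulated data: 200 respondents, 10 hypothetical self-report items, and one
# objective (task-based) score that shares only part of the variance.
rng = np.random.default_rng(0)
latent = rng.normal(size=200)
self_report = latent[:, None] + rng.normal(scale=0.8, size=(200, 10))
objective_score = 0.3 * latent + rng.normal(size=200)

alpha = cronbach_alpha(self_report)                           # internal consistency
rho, _ = spearmanr(self_report.sum(axis=1), objective_score)  # cross-mode coherence
print(f"Cronbach's alpha = {alpha:.2f}, Spearman's rho = {rho:.2f}")
```

A pattern such as a high alpha combined with a low rho, as reported for the HLS-EU versus the NVS, suggests a reliable scale that nonetheless captures a different construct than the objective test.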

Reporting quality of identified health literacy instrument studies

The application of reporting guidelines is a useful way to facilitate transparency and gauge the reliability of an instrument used in a survey. However, compliance with reporting guidelines such as the “reporting guideline for survey research” recently compiled by Bennett and colleagues [21] is limited among papers reporting on the development and validation of health literacy indices, as depicted in Table 3. Among the 17 identified publications, about a third do not report on the significant reporting features specified in the guideline. The reporting frequency varies across the domains of the guideline. Study objectives, presentation of the results, and interpretation and discussion of the findings are appropriately described in all publications. Article parts related to methodological issues such as data replication and verification (58.8%), the procedures of sample selection such as sample size calculation (23.5%), and representativeness of the sample (41.2%) are reported noticeably less often, as described in Table 3. Furthermore, the description of the characteristics of the health literacy indices is limited: instrument pretesting, reliability and validity, and the scoring method are not described in 52.9%, 23.5% and 64.7% of all publications, respectively. Additionally, 58.8% (n = 10) of the articles do not present the items of the instrument in full, making it difficult to perform an appraisal, as presented in Table 2. Although reflection on non-response is central to the analysis of quantitative data, only two-thirds of the publications report response rates, and 82.4% do not discuss the role of non-response in the study performed, as listed in Table 3. Similar findings apply to the handling of missing data, which is not described in more than two-thirds of the publications. However, several checklists provide guidance on the reporting of survey research and instrument development and could be used to report study results adequately [60, 61].
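
As an illustration of the appraisal arithmetic, the hypothetical sketch below tallies per-item reporting frequencies across included publications; the item names and boolean values are invented for illustration and are not the authors’ actual SURGE extraction data.

```python
from collections import Counter

# Hypothetical extraction table: one dict per included publication, with True
# meaning that the SURGE item was reported (item names are illustrative).
appraisals = [
    {"pretesting": False, "scoring_method": False, "response_rate": True},
    {"pretesting": True,  "scoring_method": False, "response_rate": True},
    {"pretesting": False, "scoring_method": True,  "response_rate": False},
    # ... one entry per included publication (n = 17 in this review)
]

n = len(appraisals)
reported = Counter()
for publication in appraisals:
    for item, is_reported in publication.items():
        reported[item] += is_reported          # True counts as 1

for item, count in reported.items():
    print(f"{item}: reported in {count}/{n} publications ({100 * count / n:.1f}%)")
```

With 17 included publications, 10 publications reporting an item corresponds to 58.8% and 4 publications to 23.5%, which is how the percentages above map onto counts.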

Table 3 Survey reporting quality of identified studies dealing with the development and/or validation of health literacy indices

Discussion

In our review, we identified recently published (2009 onward) articles dealing with novel instruments developed and validated to measure health literacy. The review followed two main objectives. First, we examined how the measurement of health literacy has proceeded in recent years, particularly emphasizing whether novel instruments consider existing recommendations of the scientific community on the features an instrument measuring health literacy should cover. In addition, we analyzed the reporting quality of the identified papers dealing with the development of health literacy measurement tools.

Our analysis resulted in six major findings, which extend prior knowledge of health literacy measurement.

First of all, we found an increasing use of multidimensional constructs to measure health literacy. Instruments with a subjective measurement format in particular address numerous domains of health literacy such as the patient-provider encounter; interaction with the health care system; rights and responsibilities; health information seeking; understanding, processing and using healthcare information; and communication with healthcare professionals [8, 16, 36, 37, 45, 48, 50]. In this regard, earlier critiques of the one-dimensional measurement modes usually used in health literacy measurement are taken into consideration when developing novel instruments [12]. This in turn allows a more in-depth and comprehensive operationalization of the dynamic construct of health literacy and helps to improve its measurement.

Furthermore, we found that almost all instruments apply a multidimensional measurement of health literacy, principally assessing print literacy and numeracy and in some cases adding oral literacy. Previous reviews of health literacy measurement tools emphasized the lack of instruments integrating communication skills (oral literacy) into the health literacy construct [17]. To fill this gap, three novel instruments containing oral literacy were developed and validated in the search period of our review (2009 onward) [16, 36, 50]. This result further indicates that newly developed instruments take the recommendations of the scientific community into consideration.

In addition, we identified a trend towards combining objective (task-based) and subjective (self-report-based) measurement approaches. Scholars using this mixed measurement approach often apply already existing health literacy screeners (e.g. SBSQ, NVS) and develop additional item batteries [8, 56, 59]. In principle, the mixed measurement approach offers advantages by broadening the health literacy concept and enabling researchers to address multiple skills. However, studies using this approach in our review found weak coherence between the underlying constructs measured by the different approaches, which results in limited correlation between the measurement approaches [8, 52, 53]. Consequently, these results should be taken into consideration when using the mixed measurement approach.

A further striking finding is that, regardless of the measurement approach used, scholars do not sufficiently explain why they chose a certain type of measurement. According to Abel, the first step in the cycle of instrument development is to determine the purpose of the instrument by answering the “what for” question. Once the theoretical context and setting are clear, ideas on the mode of measurement can be developed systematically [62]. If the rationale for a certain approach is not clearly determined, the development of a structured and comparable procedure to measure health literacy will be hard to achieve.

Finally, there is extensive use of assessment formats modeled on already existing instruments such as the REALM or the TOFHLA, with mostly straightforward additions [37, 38, 40, 44, 47]. Since these instruments have many weaknesses, researchers are calling for the development and use of new measurement approaches to avoid stagnation [17].

The appraisal of the reporting quality of publications dealing with the development and validation of health literacy indices has yielded mixed findings. Some domains such as the description of the article background and the presentation and interpretation of results are reported thoroughly, while domains addressing methodological properties have received less consideration. Overall, the papers included in the review demonstrate a lack of compliance with reporting guidelines, especially for methodological issues such as the psychometric properties of the developed instruments, the sample selection strategy and the presentation of response rates. These findings are in line with previous research stating that key survey characteristics in the health care literature in general [63, 64] and in health literacy research in particular [13] are often underreported. Although Jordan and colleagues had already identified these weaknesses in their review of measurement tools published between 1990 and 2008 [13], only few improvements are noticeable. In particular, the reporting on the psychometric properties (reliability, validity) of the instruments is still not adequate in nearly one third of all instruments. Additionally, more than two-thirds of the articles neither mention the issue of instrument scoring nor discuss the significance of non-response in the study setting. These findings demonstrate potential for further improvements in health literacy research.

From an overall perspective, almost all identified instruments apply a multidimensional measurement (often print literacy and numeracy), and the largest share use a mixed measurement approach (objective and subjective measurement) with a multidimensional construct, enhancing the comprehensiveness of tools measuring health literacy. Nevertheless, there is no clear indication of the demanded “consensus” on health literacy measurement. This is mainly because there have been only minor developments in the measurement formats, as can be seen in the increased use of formats from earlier instruments, even though the academic world is calling for new instruments [17].

To continuously advance the field of health literacy measurement, work should proceed on several fronts. Although there is currently a major effort to measure health literacy more comprehensively, the format of measurement generally relies on already existing approaches such as the cloze technique (used in the Test of Functional Health Literacy in Adults (TOFHLA)) or word recognition (used in the Rapid Estimate of Adult Literacy in Medicine (REALM)) [37, 38, 40, 44, 47]. Therefore, future health literacy research should strongly emphasize the development of new measurement approaches such as skill-based concepts with a generic approach [36, 50]. Here, the use of vignettes assessing one’s abilities in a daily-life setting could be an innovative step, an approach that is already being used for measuring mental health literacy [65]. Consideration of measurement formats used in the field of information literacy could also be of great interest, as they focus on the handling of information [66, 67]. Of course, these would need to be tailored to the capacities of lay people.

Apart from the issue of originality, it would be necessary to reflect more closely on the combination of objective and subjective measurement instruments, since current studies show limited coherence between them. Although the limited compliance of health literacy instruments with reporting guidelines was identified by Jordan and colleagues before [13], our analysis displays similar findings. In particular, the poor reporting of scoring methods and the weaknesses in the procedures currently used to determine construct validity need to be addressed. Construct validity is most often assessed by comparing the instrument with screeners of functional literacy derived from standardized literacy tests, without taking into account that health literacy is a dynamic and comprehensive construct and therefore not comparable with such tests. This procedure does not contribute to the qualitative improvement of health literacy indices but increases path dependency. The consequences are recognizable in newly developed instruments in European countries, which often simply translate literacy-based screeners developed in English-speaking countries [32, 33] without considering cultural and institutional differences.

In considering such recommendations, certain limitations of our review should be noted. Although we followed the PRISMA guidelines when performing our systematic review and used MeSH terms and key words, we may have missed relevant literature. Furthermore, no reporting guideline was available that provided a scoring scheme for reporting quality. As a consequence, we could not grade the reporting quality of the identified articles, so the results are presented descriptively. Finally, the appraisal of health literacy instruments was limited, as the item batteries and scoring methods were not always available despite direct requests to the authors.

Apart from this, our review exhibits certain strengths, such as compliance with guidelines when performing the literature search, data selection, analysis and appraisal of the reporting quality of the identified articles.

Conclusions

Our review offers insights into the status quo of health literacy measurement. It critically appraises the applied measurement approaches and analyses reporting quality, commenting on current developments and their value for the further evolution of health literacy measurement. Attention to the evidence presented here can help to guide the development of comparable and reliable health literacy assessment tools that effectively respond to the informational needs of populations.

Authors’ information

All authors are affiliated with the Institute for Health Economics and Clinical Epidemiology, University Hospital of Cologne, and primarily deal with health systems and outcomes research focusing on chronic care and disease management. Prof. Dr. med. Stephanie Stock is the chairwoman of the German Health Literacy Network and coordinates the network activities in Germany.

References

  1. Nutbeam D: The evolving concept of health literacy. Soc Sci Med. 2008, 67: 2072-8. 10.1016/j.socscimed.2008.09.050.

  2. Koh HK, Berwick DM, Clancy CM, Baur C, Brach C, Harris LM, Zerhusen EG: New federal policy initiatives to boost health literacy can help the nation move beyond the cycle of costly 'crisis care'. Health Aff. 2012, 31: 434-43. 10.1377/hlthaff.2011.1169.

  3. European Innovation Partnership on Active and Healthy Ageing: Action plan on prevention and early diagnosis of frailty and functional decline, both physical and cognitive, in older people. 2012, Brussels, available at http://ec.europa.eu/research/innovation-union/pdf/active-healthy-ageing/a3_action_plan.pdf#view=fit%26pagemode=none last accessed 12th Dec. 2013

  4. Sorensen K, Van den Broucke S, Fullam J, Doyle G, Pelikan J, Slonska Z, Brand H, HLS-EU Consortium Health Literacy Project European: Health literacy and public health: a systematic review and integration of definitions and models. BMC Public Health. 2012, 12: 80-10.1186/1471-2458-12-80.

  5. Baker DW, Wolf MS, Feinglass J, Thompson JA, Gazmararian JA, Huang J: Health literacy and mortality among elderly persons. Arch Intern Med. 2007, 167: 1503-9. 10.1001/archinte.167.14.1503.

  6. Cho YI, Lee SY, Arozullah AM, Crittenden KS: Effects of health literacy on health status and health service utilization amongst the elderly. Soc Sci Med. 2008, 66: 1809-16. 10.1016/j.socscimed.2008.01.003.

  7. Hope CJ, Wu J, Tu W, Young J, Murray MD: Association of medication adherence, knowledge, and skills with emergency department visits by adults 50 years or older with congestive heart failure. Am J Health Syst Pharm. 2004, 61: 2043-9.

  8. HLS-EU Consortium: Comparative report of health literacy in eight EU member states. The European Health Literacy Survey HLS-EU. 2012, available at http://www.health-literacy.eu last accessed 22nd Dec. 2013

  9. Paasche-Orlow MK, Parker RM, Gazmararian JA, Nielsen-Bohlman LT, Rudd RR: The prevalence of limited health literacy. J Gen Intern Med. 2005, 20: 175-84. 10.1111/j.1525-1497.2005.40245.x.

  10. European Commission: Together for Health: A strategic approach for the EU 2008–2013. 2007, Brussels, available at http://ec.europa.eu/health/ph_overview/Documents/strategy_wp_en.pdf last accessed 13th Dec. 2013

  11. World Health Organization (WHO): Health literacy and health behavior. 2011, available at http://www.who.int/healthpromotion/conferences/7gchp/track2/en/ last accessed 14th Dec. 2013

  12. Pleasant A, McKinney J: Coming to consensus on health literacy measurement: an online discussion and consensus-gauging process. Nurs Outlook. 2011, 59: 95-106 e1. 10.1016/j.outlook.2010.12.006.

  13. Jordan JE, Osborne RH, Buchbinder R: Critical appraisal of health literacy indices revealed variable underlying constructs, narrow content and psychometric weaknesses. J Clin Epidemiol. 2011, 64: 366-79. 10.1016/j.jclinepi.2010.04.005.

  14. Ormshaw MJ, Paakkari LT, Kannas LK: Measuring child and adolescent health literacy: a systematic literature review. Health Educ. 2013, 113: 433-455. 10.1108/HE-07-2012-0039.

  15. Berkman ND, Davis TC, McCormack L: Health literacy: what is it?. J Health Commun. 2010, 15 (Suppl 2): 9-19.

  16. Jordan JE, Buchbinder R, Briggs AM, Elsworth GR, Busija L, Batterham R, Osborne RH: The health literacy management scale (HeLMS): a measure of an individual's capacity to seek, understand and use health information within the healthcare setting. Patient Educ Couns. 2013, 91: 228-35. 10.1016/j.pec.2013.01.013.

  17. Pleasant A, McKinney J, Rikard RV: Health literacy measurement: a proposed research agenda. J Health Commun. 2011, 16 (Suppl 3): 11-21.

  18. American Medical Association: Health literacy: report of the Council on Scientific Affairs. JAMA. 1999, 281: 552-7. 10.1001/jama.281.6.552.

  19. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. J Clin Epidemiol. 2009, 62: 1006-12. 10.1016/j.jclinepi.2009.06.005.

  20. Mancuso JM: Assessment and measurement of health literacy: an integrative review of the literature. Nurs Health Sci. 2009, 11: 77-89. 10.1111/j.1442-2018.2008.00408.x.

  21. Bennett C, Khangura S, Brehaut JC, Graham ID, Moher D, Potter BK, Grimshaw JM: Reporting guidelines for survey research: an analysis of published guidance and reporting practices. PLoS Med. 2010, 8: e1001069-

  22. Chang LC, Hsieh PL, Liu CH: Psychometric evaluation of the Chinese version of short-form Test of Functional Health Literacy in Adolescents. J Clin Nurs. 2012, 21: 2429-37. 10.1111/j.1365-2702.2012.04147.x.

  23. Helitzer D, Hollis C, Sanders M, Roybal S: Addressing the "other" health literacy competencies–knowledge, dispositions, and oral/aural communication: development of TALKDOC, an intervention assessment tool. J Health Commun. 2012, 17 (Suppl 3): 160-75.

  24. Lee J, Stucky B, Rozier G, Lee SY, Zeldin LP: Oral Health Literacy Assessment: development of an oral health literacy instrument for Spanish speakers. J Public Health Dent. 2013, 73: 1-8. 10.1111/jphd.12000.

  25. Levin-Zamir D, Lemish D, Gofin R: Media Health Literacy (MHL): development and measurement of the concept among adolescents. Health Educ Res. 2011, 26: 323-35. 10.1093/her/cyr007.

  26. Mazor KM, Rogers HJ, Williams AE, Roblin DW, Gaglio B, Field TS, Greene SM, Han PK, Costanza ME: The Cancer Message Literacy Tests: psychometric analyses and validity studies. Patient Educ Couns. 2012, 89: 69-75. 10.1016/j.pec.2012.06.018.

  27. Sabbahi DA, Lawrence HP, Limeback H, Rootman I: Development and evaluation of an oral health literacy instrument for adults. Community Dent Oral Epidemiol. 2009, 37: 451-62. 10.1111/j.1600-0528.2009.00490.x.

  28. Sauceda JA, Loya AM, Sias JJ, Taylor T, Wiebe JS, Rivera JO: Medication literacy in Spanish and English: psychometric evaluation of a new assessment tool. J Am Pharm Assoc (2003). 2012, 52: e231-40. 10.1331/JAPhA.2012.11264.

  29. Stucky BD, Lee JY, Lee SY, Rozier RG: Development of the two-stage rapid estimate of adult literacy in dentistry. Community Dent Oral Epidemiol. 2011, 39: 474-80. 10.1111/j.1600-0528.2011.00619.x.

  30. van der Vaart R, Drossaert CH, Taal E, ten Klooster PM, Hilderink-Koertshuis RT, Klaase JM, van de Laar MA: Validation of the Dutch functional, communicative and critical health literacy scales. Patient Educ Couns. 2012, 89: 82-8. 10.1016/j.pec.2012.07.014.

  31. Apolinario D, Braga Rde C, Magaldi RM, Busse AL, Campora F, Brucki S, Lee SY: Short Assessment of Health Literacy for Portuguese-speaking Adults. Rev Saude Publica. 2012, 46: 702-11. 10.1590/S0034-89102012005000047.

  32. Connor M, Mantwill S, Schulz PJ: Functional health literacy in Switzerland–validation of a German, Italian, and French health literacy test. Patient Educ Couns. 2013, 90: 12-7. 10.1016/j.pec.2012.08.018.

  33. Fransen MP, Van Schaik TM, Twickler TB, Essink-Bot ML: Applicability of internationally available health literacy measures in the Netherlands. J Health Commun. 2011, 16 (Suppl 3): 134-49.

  34. Ko Y, Lee JY, Toh MP, Tang WE, Tan AS: Development and validation of a general health literacy test in Singapore. Health Promot Int. 2012, 27: 45-51. 10.1093/heapro/dar020.

  35. van der Vaart R, van Deursen AJ, Drossaert CH, Taal E, van Dijk JA, van de Laar MA: Does the eHealth Literacy Scale (eHEALS) measure what it intends to measure? Validation of a Dutch version of the eHEALS in two adult populations. J Med Internet Res. 2011, 13: e86-10.2196/jmir.1840.

  36. McCormack L, Bann C, Squiers L, Berkman ND, Squire C, Schillinger D, Ohene-Frempong J, Hibbard J: Measuring health literacy: a pilot study of a new skills-based instrument. J Health Commun. 2010, 15 (Suppl 2): 51-71.

  37. Massey P, Prelip M, Calimlim B, Afifi A, Quiter E, Nessim S, Wongvipat-Kalev N, Glik D: Findings Toward a Multidimensional Measure of Adolescent Health Literacy. Am J Health Behav. 2013, 37: 342-350(9). 10.5993/AJHB.37.3.7.

  38. Rawson KA, Gunstad J, Hughes J, Spitznagel MB, Potter V, Waechter D, Rosneck J: The METER: a brief, self-administered measure of health literacy. J Gen Intern Med. 2010, 25: 67-71. 10.1007/s11606-009-1158-7.

  39. Davis TC, Crouch MA, Long SW, Jackson RH, Bates P, George RB, Bairnsfather LE: Rapid assessment of literacy levels of adult primary care patients. Fam Med. 1991, 23: 433-5.

  40. Lee SY, Stucky BD, Lee JY, Rozier RG, Bender DE: Short Assessment of Health Literacy-Spanish and English: a comparable test of health literacy for Spanish and English speakers. Health Serv Res. 2010, 45: 1105-20. 10.1111/j.1475-6773.2010.01119.x.

  41. James BD, Boyle PA, Bennett JS, Bennett DA: The impact of health and financial literacy on decision making in community-based older adults. Gerontology. 2012, 58: 531-9. 10.1159/000339094.

  42. Steckelberg A, Hulfenhaus C, Kasper J, Rost J, Muhlhauser I: How to measure critical health competences: development and validation of the Critical Health Competence Test (CHC Test). Adv Health Sci Educ Theory Pract. 2009, 14: 11-22. 10.1007/s10459-007-9083-1.

  43. Parker RM, Baker DW, Williams MV, Nurss JR: The test of functional health literacy in adults: a new instrument for measuring patients' literacy skills. J Gen Intern Med. 1995, 10: 537-41. 10.1007/BF02640361.

  44. Yost KJ, Webster K, Baker DW, Choi SW, Bode RK, Hahn EA: Bilingual health literacy assessment using the Talking Touchscreen/la Pantalla Parlanchina: Development and pilot testing. Patient Educ Couns. 2009, 75: 295-301. 10.1016/j.pec.2009.02.020.

  45. Wang J, Thombs BD, Schmid MR: The Swiss Health Literacy Survey: development and psychometric properties of a multidimensional instrument to assess competencies for health. Health Expect. 2014, 17 (3): 396-417. 10.1111/j.1369-7625.2012.00766.x.

  46. Nutbeam D, Kickbusch I: Advancing health literacy: a global challenge for the 21st century. Health Promot Int. 2000, 15 (3): 183-184. 10.1093/heapro/15.3.183.

  47. Chinn D, McCarthy C: All Aspects of Health Literacy Scale (AAHLS): developing a tool to measure functional, communicative and critical health literacy in primary healthcare settings. Patient Educ Couns. 2013, 90: 247-53. 10.1016/j.pec.2012.10.019.

  48. Tsai TI, Lee SY, Tsai YW, Kuo KN: Methodology and validation of health literacy scale development in Taiwan. J Health Commun. 2011, 16: 50-61.

  49. Chan D: So why ask me? Are self-report data really that bad?. Statistical and methodological myths and urban legends: Doctrine, verity and fable in the organizational and social sciences. Edited by: Lance CE, Vandenberg RJ. 2008, New York: Routledge, 309-336.

  50. Bann CM, McCormack LA, Berkman ND, Squiers LB: The Health Literacy Skills Instrument: a 10-item short form. J Health Commun. 2012, 17 (Suppl 3): 191-202.

  51. Weiss BD, Mays MZ, Martz W, Castro KM, DeWalt DA, Pignone MP, Mockbee J, Hale FA: Quick assessment of literacy in primary care: the newest vital sign. Ann Fam Med. 2005, 3: 514-22. 10.1370/afm.405.

  52. Begoray DL, Kwan B: A Canadian exploratory study to define a measure of health literacy. Health Promot Int. 2012, 27: 23-32. 10.1093/heapro/dar015.

  53. Wu AD, Begoray DL, Macdonald M, Wharf Higgins J, Frankish J, Kwan B, Fung W, Rootman I: Developing and evaluating a relevant and feasible instrument for measuring health literacy of Canadian high school students. Health Promot Int. 2010, 25: 444-52. 10.1093/heapro/daq032.

  54. Chew LD, Griffin JM, Partin MR, Noorbaloochi S, Grill JP, Snyder A, Bradley KA, Nugent SM, Baines AD, Vanryn M: Validation of screening questions for limited health literacy in a large VA outpatient population. J Gen Intern Med. 2008, 23: 561-6. 10.1007/s11606-008-0520-5.

  55. Fagerlin A, Zikmund-Fisher BJ, Ubel PA, Jankovic A, Derry HA, Smith DM: Measuring numeracy without a math test: development of the Subjective Numeracy Scale. Med Decis Making. 2007, 27: 672-80. 10.1177/0272989X07304449.

  56. McNaughton C, Wallston KA, Rothman RL, Marcovitz DE, Storrow AB: Short, subjective measures of numeracy and general health literacy in an adult emergency department. Acad Emerg Med. 2011, 18: 1148-55. 10.1111/j.1553-2712.2011.01210.x.

  57. Lipkus IM, Samsa G, Rimer BK: General performance on a numeracy scale among highly educated samples. Med Decis Making. 2001, 21: 37-44.

  58. Baker DW, Williams MV, Parker RM, Gazmararian JA, Nurss J: Development of a brief test to measure functional health literacy. Patient Educ Couns. 1999, 38: 33-42. 10.1016/S0738-3991(98)00116-5.

  59. Brega AG, Jiang L, Beals J, Manson SM, Acton KJ, Roubideaux Y, Special Diabetes Program for Indians Healthy Heart Demonstration Project: Special diabetes program for Indians: reliability and validity of brief measures of print literacy and numeracy. Ethn Dis. 2012, 22: 207-14.

  60. Kelley K, Clark B, Brown V, Sitzia J: Good practice in the conduct and reporting of survey research. Int J Qual Health Care. 2003, 15: 261-6. 10.1093/intqhc/mzg031.

  61. Draugalis JR, Plaza CM: Best practices for survey research reports revisited: implications of target population, probability sampling, and response rate. Am J Pharm Educ. 2009, 73: 142-10.5688/aj7308142.

  62. Abel T: Measuring health literacy: moving towards a health - promotion perspective. Int J Public Health. 2008, 53: 169-70.

  63. Cummings SM, Savitz LA, Konrad TR: Reported response rates to mailed physician questionnaires. Health Serv Res. 2001, 35: 1347-55.

  64. Schilling LM, Kozak K, Lundahl K, Dellavalle RP: Inaccessible novel questionnaires in published medical research: hidden methods, hidden costs. Am J Epidemiol. 2006, 164: 1141-4. 10.1093/aje/kwj349.

  65. Melas PA, Tartani E, Forsner T, Edhborg M, Forsell Y: Mental health literacy about depression and schizophrenia among adolescents in Sweden. Eur Psychiatry. 2013, 28: 404-411. 10.1016/j.eurpsy.2013.02.002.

  66. Eskola EL: Information literacy of medical students studying in the problem-based and traditional curriculum. Information Research. 2005, 10: 221-

  67. Thompson N, Lewis S, Brennan P, Robinson J: Information literacy: are final-year medical radiation science students on the pathway to success?. J Allied Health. 2010, 39: e83-9.

Acknowledgements

The authors thank Prof. Claus Wendt and Dr. Nadine Reibling, who contributed to the article by critically revising the draft for important intellectual content. Further, we thank our colleague Stephanie Sangalang M.A., who provided medical writing services. The whole project and the manuscript preparation (medical writing services included) were funded by the Robert Bosch Foundation as well as the Institute for Health Economics and Clinical Epidemiology at the University Hospital of Cologne.

Author information

Corresponding author

Correspondence to Sibel Vildan Altin.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

Background: SA, ST, SKA; Methods: SA, IF; Results: SA, IF; Discussion: SA, IF, ST, SKA; Conclusions: SA, IF, ST, SKA. All authors read and approved the final manuscript.

Electronic supplementary material

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Altin, S.V., Finke, I., Kautz-Freimuth, S. et al. The evolution of health literacy assessment tools: a systematic review. BMC Public Health 14, 1207 (2014). https://doi.org/10.1186/1471-2458-14-1207

