On September 12, 1987, Torontonians awoke to find their Saturday Star headlining "One in four Canadians functionally illiterate, national survey finds." In December 1999, the Minister of Education of Nova Scotia, Jane Purvis, was confronted by reporters asking for her comments on the claim of a government-appointed Task Force on Fiscal Management that between 40 and 50 percent of Nova Scotians are functionally illiterate. For over ten years, Canadian adult literacy work has had statistics about the reading skills of adults from a series of national surveys. These surveys, I believe, have both highlighted the scope of the problem and defined a discourse about adult literacy. While the Southam 1987 survey, the source of the Star headline, was the first conducted in Canada, it built on over a decade's worth of surveys of adult literacy in the United States, beginning, at least, with the Louis Harris Poll survey of survival literacy in 1970. These surveys were distinguished by the fact that they tested respondents' ability to read rather than simply asking them how well they could read. The early surveys, such as the Harris, attempted to identify a small number of documents that all adults might be expected to be able to read (but ones that not all were able to read). The problem was that getting agreement on what this small number of documents might be was not easy. Further, it was difficult to know, from a test using only ten rather familiar documents, how well someone might do on a broader range of reading. To meet this need to generalise beyond the actual texts used in the test, Irwin Kirsch and his associates at Educational Testing Service developed a methodology for the Young Adult Literacy Survey (YALS) in the United States that allowed them to use texts that were representative of the range of reading adults did, but were not necessarily ones essential to survival.
The Southam Survey borrowed and adapted its test materials from YALS, but chose to interpret the results as if the texts were survival texts. The Southam Survey generated considerable interest, especially in the then newly formed National Literacy Secretariat. The NLS, however, was more interested in the YALS approach to surveying and analysing adult literacy, because that would allow researchers to estimate how well Canadians could read a broad range of texts, not just those on the test itself. To get this data, NLS contracted with Statistics Canada in 1988 for the Survey of Literacy Skills Used in Daily Activities (LSUDA). This Statistics Canada survey was to be the first national survey of its kind that covered the adult working population, aged 16 to 65. Further, it was the first to measure adult literacy in two languages in a comparable way. LSUDA built on work done for YALS and on work done in Ontario on the Test of Adult Functional Literacy and its French counterpart, the Test ontarien d'alphabétisation fonctionnelle des adultes. It used the results to categorise adult reading ability by levels, referenced to the relative complexity of reading tasks an individual could easily and regularly carry out; in this it went beyond the Young Adult Literacy Survey and introduced an approach that was picked up in later surveys that followed LSUDA and YALS in the United States. The results from LSUDA were released in June of 1990, International Literacy Year. It is fair to say that the results had an impact on policy, but not on the public. The results were publicly released only in Statistics Canada's Daily, along with other regular official statistics. While there was some press coverage, there was no report as such to which interested adult literacy workers could turn for information. NLS developed a series of in-house reports to assist it in developing and managing policy, but these were seldom widely available.
The Government of Ontario had helped fund the survey and prepared a report, and although it was not widely circulated, it did have a role in developing policy for Ontario. The biggest impact of LSUDA may, in fact, have been in Paris, at the Organisation for Economic Co-operation and Development. OECD had long been interested in the role of human capital and the skills of the work force in labour market policy, but had no means for adequately measuring them. OECD contracted for its own report on LSUDA and began working with Statistics Canada, and soon with Educational Testing Service, to develop an internationally comparable survey of adult literacy. That survey was called IALS, the International Adult Literacy Survey. Originally conducted in Canada, the United States, Germany, Sweden, the Netherlands, Switzerland, and Poland in 1994, its first results were released in December 1995 for an international meeting of Ministers of Education at the OECD, a meeting that saw the Ministers endorse the IALS concept and urge other countries to participate. Since 1995, a second report covering an additional five countries (Ireland, the United Kingdom, Australia, New Zealand, and Flemish Belgium) has been released, and another ten countries will be included in the third international report in the spring of 2000. More importantly for Canada, a detailed report on the Canadian data was released by Statistics Canada in 1996 (Reading the Future: A Portrait of Literacy in Canada), and a series of reports jointly published by Statistics Canada and Human Resources Development Canada have used the IALS data to look at particular issues in adult literacy: immigration, labour force participation, health, and seniors. IALS has had a larger impact on Canadian adult literacy work than LSUDA. There are several reasons for this. Statistics Canada made a determined effort to publicise IALS in a way that it had not done for LSUDA.
The international comparisons possible with IALS meant that it could be used as a benchmark in a way that LSUDA could not. And the adult literacy community was ready for IALS in a way it had not been for LSUDA. In 1990, Carleton University held a summer institute for literacy workers in Ontario, shortly after the LSUDA report. Many of the participants at the institute, who were starting in recently funded positions, were new to literacy work. While one of the impacts of IALS is undoubtedly the data that has been used to support the need for adult literacy programs and for other policy developments, as by the Nova Scotia Task Force, it has also had an impact on how adult literacy in Canada is described. Some ten years ago, adult literacy development was referenced to grade levels (someone could read at, say, the ninth grade level). Although most adult literacy workers recognised how inappropriate such grade levels were for adults, there was nothing else to use. Because IALS, like LSUDA, identified levels of reading ability in terms of task complexity, it provided a tool for talking about adult literacy development. In many ways the IALS levels have become the framework in which adult literacy work is defined. It is a standard reference point for policy at the NLS and elsewhere in the federal government. Ontario bases much of its program development on the IALS levels. A workplace literacy test (TOWES, the Test of Workplace Essential Skills) is based on the IALS levels. Thus, the impact of the survey is greater than simply a set of numbers. IALS has now become a standard part of Canadian discourse about adult literacy. It appears as if it will also be a standard part of international discourse about adult skills. Now that most of the member countries of the OECD have completed an IALS survey, they are interested in expanding the methodology to other adult skills.
An international group is now working on the next decade's survey, IALS+, to measure literacy plus numeracy and, perhaps, problem solving, teamwork, and skills in technology and communications. As means for talking about these skills are developed through co-operation on the survey, they too may well become a standard part of the discourse on adult learning in Canada, and elsewhere.