Research Note: The Rate at Which Education Increases Literacy Skill

Not all of DataAngel’s analysis ends up in published reports. Some of this material is nonetheless useful in that it offers unique insights into matters of crucial importance to policy. The following research note highlights the rate at which additional education adds to Canada’s literacy skill supply.

Research shows that education is the most important determinant of literacy skill (Desjardins, 2004) and that increases in average literacy scores over time explain 55% of differences in the rates of growth of GDP and labour productivity over the long run (Coulombe, Tremblay and Marchand, 2004). So policy makers have an interest in the rate at which rising levels of education are adding to Canada’s literacy skill supply.

The chart below uses data from the 2011 OECD PIAAC adult skill assessment for Canada to estimate the relationship between years of education and literacy.

It reveals a strong linear relationship between the two variables: in the 2011 PIAAC file, each additional year of education adds an average of 9.1 points to literacy scores.

This relationship suggests that, on average, it takes roughly 5.5 additional years of education to move up one level on the PIAAC literacy proficiency scale. Thus, even at Canada’s relatively rapid rate of increase in years of education, it will take several decades for the population’s average literacy skill to rise from the bottom of Level 3, where it currently sits, to the bottom of Level 4. In this sense, adult literacy skill upgrading is significantly more efficient: it can generate the same gain in as little as 30 hours of focused instruction.
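To make the arithmetic behind these figures explicit, the sketch below fits the education-to-literacy line and converts the slope into years of schooling per proficiency level. It is a minimal illustration rather than DataAngel’s actual estimation code: the file name and column names are hypothetical, a full PIAAC analysis would use the survey weights and all ten plausible values, and the only figure taken from the assessment design is the 50-point width of each literacy proficiency level.

```python
import numpy as np
import pandas as pd

# Hypothetical extract of the 2011 PIAAC file for Canada; the file name and
# column names are illustrative, not the official PIAAC variable names.
df = pd.read_csv("piaac_canada_2011.csv")
years = df["years_of_education"].to_numpy(dtype=float)
literacy = df["literacy_score"].to_numpy(dtype=float)

# Unweighted least-squares fit: literacy = intercept + slope * years.
# The research note reports a slope of about 9.1 points per year for 2011
# (versus 10.2 points in the 2003 IALSS data).
slope, intercept = np.polyfit(years, literacy, deg=1)

# Each PIAAC literacy proficiency level spans 50 score points, so the
# schooling implied to climb one level is 50 / 9.1, or roughly 5.5 years.
years_per_level = 50.0 / slope

print(f"{slope:.1f} points per additional year of education")
print(f"{years_per_level:.1f} years of education per proficiency level")
```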

The 2011 gain of 9.1 points per additional year of education is slightly lower than the 10.2 points observed in the 2003 IALSS data for Canada. The difference is likely the joint product of greater skill loss in the 2011 study and a higher proportion of immigrants arriving from less efficient education systems.

Readers should also keep several things in mind when interpreting these data.

First, the skill level observed at every level of education reflects significant amounts of skill gain and loss that occur after graduation. Because there has been more skill loss than gain over the past three decades, the skill levels that would have been observed at the point the highest credential was obtained, and hence the average literacy skill gain per additional year of education, would both be higher than the figures reported here.

Second, readers should remember that these average score gains obscure significant variation in scores at every level of education.

Notwithstanding these caveats, these data make it clear that Canada cannot rely on increasing post-secondary education levels to fill the growing shortage of workers with Level 3 and above literacy skill. Significant investments in adult literacy skill upgrading will be needed. At $1,000 per learner, the cost of adult literacy skill upgrading is low enough for employers to fund it themselves. Given the enormous economic and social costs of Canada’s growing literacy skill shortages, a case can be made for governments to subsidize the cost of literacy skill upgrading.

 



Research Note: Market Failure in Canada’s Skills Market

Not all of DataAngel’s analysis ends up in published reports. Some of this material is nonetheless useful in that it offers unique insights into matters of crucial importance to policy. The following research note highlights a key aspect of the fairness of adult literacy and numeracy skill upgrading, specifically the need for the instructional offer to reflect learners’ needs and objectives. Learners deserve and expect instruction that is efficient, effective and produces consistent results. Sadly, in the majority of programs this is not the case.

One of the fundamental principles of adult education is that the program offer should reflect learners’ learning needs and objectives. Vygotsky defined a region slightly above a learner’s current skill level where instruction is most efficient and effective – the so-called “zone of proximal development”. By extension, instruction outside this zone leads to disengaged learners and highly variable score gains within any given group of learners.

The simplest way to create classes with homogeneous learning needs is to assess the learners and sort them into classes.
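As a minimal sketch of that sorting step (the learner records, domain names and levels below are invented for illustration and are not drawn from any particular assessment), learners with the same assessed profile can be grouped into a class:

```python
from collections import defaultdict

# Illustrative intake records: each learner has an assessed proficiency
# level (1 to 5) in each domain. All values here are made up.
learners = [
    {"name": "Learner A", "prose": 2, "document": 2, "numeracy": 1},
    {"name": "Learner B", "prose": 3, "document": 3, "numeracy": 3},
    {"name": "Learner C", "prose": 2, "document": 2, "numeracy": 1},
    {"name": "Learner D", "prose": 3, "document": 3, "numeracy": 3},
    {"name": "Learner E", "prose": 2, "document": 3, "numeracy": 2},
]

# Sort learners into classes whose members share the same profile of
# assessed levels, so instruction can target each group's actual needs.
classes = defaultdict(list)
for learner in learners:
    profile = (learner["prose"], learner["document"], learner["numeracy"])
    classes[profile].append(learner["name"])

for profile, members in sorted(classes.items()):
    print(f"prose/document/numeracy levels {profile}: {members}")
```

Where a profile occurs only once, there is no homogeneous class to place the learner in, which is the situation the next paragraph addresses.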

Where enrolment volumes or context do not allow multiple classes to be created, systematic assessment also gives instructors a means to adjust the content and pace of instruction to compensate. Research identifies 64 distinct patterns of strength and weakness across proficiency levels in oral fluency, prose literacy, document literacy and numeracy, a level of diversity that even skilled and experienced instructors would be unable to detect unaided.

Systematic assessment of a broad range of skills at the point of program intake is the only answer.

I argue that learners also need to be systematically tested at the end of their instructional program to confirm how much they have learned and to update their learning plan. The same information on skill gain would serve several related purposes:

  • It would provide instructors with the means to reflect on their instructional practice,
  • When aggregated at the program level, it would give training providers a way to compare the performance of instructors, and give funders a way to compare program efficiency, effectiveness and consistency across programs (see the sketch after this list),
  • When published, it would provide training providers, learners and funders with a way to compare programs, information that is crucial to detecting promising innovations and to driving poor training providers out of business.
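A minimal sketch of how such pre- and post-scores might be aggregated is shown below; the programs, scores and instructional hours are invented for illustration and are not TOWES-Prime output. Mean gain speaks to effectiveness, gain per instructional hour to efficiency, and the spread of gains across learners to consistency.

```python
from statistics import mean, pstdev

# Invented records per program: (pre-test score, post-test score, hours of instruction).
results = {
    "Program X": [(230, 262, 40), (241, 268, 40), (255, 279, 40)],
    "Program Y": [(228, 241, 60), (250, 290, 60), (236, 247, 60)],
}

for program, records in results.items():
    gains = [post - pre for pre, post, _ in records]
    hours = [h for _, _, h in records]
    effectiveness = mean(gains)               # average score gain
    efficiency = effectiveness / mean(hours)  # points gained per hour of instruction
    consistency = pstdev(gains)               # spread of gains across learners
    print(f"{program}: mean gain {effectiveness:.1f} pts, "
          f"{efficiency:.2f} pts per hour, std dev of gains {consistency:.1f} pts")
```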

There is no excuse for programs not to assess all learners at the point of program entry and exit. Employment and Social Development Canada has funded the development and validation of the TOWES-Prime suite of web-based, fully adaptive tests of prose literacy, document literacy, numeracy and reading components. These low-cost tests offer real-time results that are reliable at four levels of precision, including one that yields reliable estimates of skill gain when administered pre- and post-training.

Governments, as the funders of much of the language, literacy and numeracy instruction on offer, need to demand systematic pre- and post-assessment of participants’ skills.

 

T. Scott Murray is a retired senior manager from Statistics Canada and President, DataAngel Policy Research Incorporated, a global specialist in education, skills and productivity.

T. Scott Murray
DataAngel Policy Research
Email: dataangel@mac.com
Web: http://www.dataangel.ca
Mobile: +1 613 240 8433