The Journal of Community Informatics ISSN: 1721-4441

Articles

Mobile phones and reading for enjoyment: evidence of use and behaviour change

Colin Tredoux, Department of Psychology, University of Cape Town, South Africa. colin.tredoux@uct.ac.za
Johann Louw, Department of Psychology, University of Cape Town, South Africa (corresponding author). johann.louw@uct.ac.za
Joha Louw-Potgieter, School of Management Studies, University of Cape Town, South Africa. joha.louw-potgieter@uct.ac.za

A South African non-profit organisation, FunDza, launched a programme that delivers reading material via mobile phones. Computer log files of user activity over an eight-month period were analysed (N = 9,212,716 records), which showed that relatively large numbers of readers (N = 65,533) made use of the material, and read a substantial amount of it. We found evidence of positive shifts in reading behaviour. Further analysis showed that greater levels of participation in the programme were associated with greater enjoyment of reading. Furthermore, the longer participants read, the more confident they felt about their self-rated reading proficiency.

Tredoux, C., Louw, J., & Louw-Potgieter, J. (2016). Mobile phones and reading for enjoyment: evidence of use and behaviour change. The Journal of Community Informatics, 12(1), 90-103. Date submitted: 2015-01-21. Date accepted: 2015-12-03. Copyright (C) 2016 (the authors as stated). Licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 2.5. Available at: www.ci-journal.net/index.php/ciej/article/view/1191

Introduction

Schoolchildren in South Africa show worryingly low levels of literacy. The national Department of Basic Education conducts annual national assessments to determine the literacy competency of learners in Grades 1 to 6 and Grade 9 in schools in South Africa. In 2013, the national average performance of Grade 3 learners in literacy stood at 51% for Home Language. For Grade 9, the national average performance stood at 43% for Home Language, and at 33% for First Additional Language (Department of Basic Education, 2013). Achievement between 50 and 59% on the assessment is regarded as "adequate", between 40 and 49% as "moderate", and between 30 and 39% as "elementary". The First Additional Language is typically English, the language of learning and teaching, and the poor performance of Grade 9 learners in it is of particular concern. Not surprisingly, schools in economically disadvantaged areas did worse than schools in more affluent areas. The average mark achieved in the poorest schools in the assessment of Home Language in Grade 9 was 32%, while for the most affluent schools it was 55%. For First Additional Language, the respective percentages were 30% and 48%.

Data from the Progress in International Reading Literacy Study (PIRLS) of 2011 confirm these findings. Howie, van Staden, Tshele, Dowse and Zimmerman (2012) reported that Grade 5 learners in South Africa performed below the international centre point (500 on a scale from 0 to 1000), and that 43% were unable to reach the Low International Benchmark, meaning that they had not mastered basic reading skills. The SACMEQ III country report (Moloi & Chetty, 2010) focused on the schooling and home environments of South African learners, and tested the literacy skills of both Grade 6 learners and their teachers.
Fifteen sub-Saharan countries participated in the overall project (Spaull, 2012). Twenty-seven percent of Grade 6 learners in South Africa were deemed functionally illiterate (that is, unable to read a short simple text and extract its meaning). This compared poorly to the average of 18% for all participating countries. South Africa ranked tenth out of the 15 participating countries in mean reading score (Spaull, 2011). The results of these surveys indicate the extent of the literacy problem in South Africa.

Mobile technology is considered a promising platform to deliver various educational services to young people. Roberts and Vänskä (2011), for example, reported on the use of a mobile mathematics learning service in South Africa. In terms of literacy, UNESCO (2013), at its second Mobile Learning Week, addressed the question "How can mobile technology support literacy development for young people and adults?" In many developing countries, the absence of books makes it difficult for young learners to reach acceptable levels of literacy, and under such conditions, mobile phones are increasingly considered an option to compensate for the absence of books. As UNESCO (2014) states on its website, "[a]lthough many parts of the world are book-poor, these same places are increasingly mobile-phone rich".

Overview of the project

In an attempt to make use of mobile technology to address the challenge of literacy achievement in South Africa, a local not-for-profit organisation, the FunDza Literacy Trust (2014), launched the Growing Communities of Readers Programme (GCRP). In this programme, professional writers are commissioned to produce content specifically aimed at the target group, namely teenage and young adult readers who have limited access to books, and who read infrequently.
The reading material is accessible through both feature phones and smartphones, using the mobile instant messaging service Mxit, and through computers with an Internet connection (FunDza Literacy Trust, 2016). A new story is published each week: it starts on a Friday and is released in serialised format over the course of the week, with a new chapter each day. Discussion questions at the end of each chapter encourage interaction and prompt readers to think about the story and how it relates to their lives and experiences. All stories are then archived in FunDza's growing "mobi library" (FunDza Literacy Trust, 2016). (MOBI, or Mobipocket, is an e-book format based on the Open eBook standard.) The organisation also publishes articles on inspirational young South Africans; full-length books, both fiction and non-fiction, some of which are released as "premium", or paid-for, content; information booklets on issues such as pregnancy, refugees, and work opportunities; and other relevant materials.

In a recent review of 44 mobile-for-reading projects, the United States Agency for International Development (USAID, 2014) found that fewer than half of the projects focused on providing content. Underlying such projects is the belief that a lack of reading materials in general is strongly implicated in low levels of literacy. South Africa's Department of Basic Education's reading interventions follow the same logic: the department has asserted that "learners who are exposed to quality reading resources, have access to libraries and good quality reading programmes and instruction, perform above the national target for the grade" (Department of Basic Education, 2013, p. 10).

By increasing access, the GCRP hopes to achieve a secondary objective, namely that learners will engage in self-initiated reading as a leisure activity.
FunDza reasons that this is likely to increase learners' interest in reading, their enjoyment of it, and the confidence that learners have in their own reading ability. This will, in turn, reinforce and strengthen learners' reading: they will read more (for longer, and with greater frequency), and will read different content.

Clark and De Zoysa (2011) have acknowledged that the relationship between enjoyment of reading, attitudes toward reading (positive or negative), reading behaviour (frequency and amount of time), and reading attainment is a complex one. Nevertheless, they indicate that reading for pleasure has a positive effect on both personal and educational development. The Organisation for Economic Co-operation and Development's (OECD) Programme for International Student Assessment (PISA) (2010) showed that in all countries, with one exception, students who enjoyed reading the most performed significantly better than students who enjoyed reading the least. As Krashen (2004, p. 7) has asserted, "the relationship between reported free voluntary reading and literacy development is not always large, but it is remarkably consistent". Thus the strategy adopted by FunDza is supported in the research literature.

As the use of mobile learning grows across the world, we need evidence about how such services are actually used, and how well they deliver on claims about reading, learning, and the capacity to address specific social problems. Indeed, in the USAID (2014) literature review the researchers identified the lack of evidence, and the need for monitoring and evaluation, as major challenges in the area of mobiles for reading. We saw an opportunity in FunDza's GCRP to collect evidence on the effectiveness of one of these mobiles-for-reading programmes. We were able to address two questions specific to the programme, but with wider implications for evidence regarding mobile learning.
These questions are "Do programmes of this kind attract significant levels of participation?" and "Does participating in such programmes change reading behaviour?" We designed two studies to address these questions.

Study 1: Level of Participation

Method

A major advantage of delivering reading material and interventions through mobile technology is that engagement with the texts can be measured behaviourally, as well as through self-report. As readers access the MOBI site and Mxit portal, details of usage are generated and stored on data servers as 'log files'. We were given access to 245 log files, containing 9,212,716 records of usage on the site in the period 3 February 2013 to 6 October 2013. The log files were provided to us by FunDza, and anonymised, so that it was not possible to identify individual users. The files did, however, include uniform resource locators (URLs) for the material that users accessed, so we were able to use the URL information to retrieve the individual stories referenced by the URL addresses, allowing us to estimate the amount of reading material accessed, in characters, per visit. From this we were able to find records of 65,533 unique users that had visited the site and clicked on content of more than 65 characters (the length of the shortest poem). In addition, entries of below 100 characters that were not poems, such as promotional material, site announcements, and other peripheral material, were excluded. We then computed the number of visits, and the total amount of text (in characters) contained in the FunDza pages that users clicked on, for each unique Mxit identity, thus cumulating the amount of reading material accessed through the service per user over the period in question.
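The per-user aggregation described above can be sketched roughly as follows. This is an illustrative reconstruction, not FunDza's actual pipeline: the record layout, the URL-based poem check, and all function names are hypothetical, while the character thresholds and the characters-to-pages conversion rule come from the article itself.

```python
from collections import defaultdict

# Conversion assumptions stated in the article's footnote:
CHARS_PER_WORD = 5       # approximately five characters per word
WORDS_PER_PAGE = 360     # words on a double-spaced A4 page, 12-point font


def aggregate(records, url_lengths):
    """Cumulate visits and characters of content per unique (anonymised) user.

    records     : iterable of (user_id, url) pairs parsed from the log files
    url_lengths : dict mapping url -> length of the story text, in characters
    """
    visits = defaultdict(int)
    chars = defaultdict(int)
    for user_id, url in records:
        length = url_lengths.get(url, 0)
        # Exclusion rules from the study: keep content longer than 65
        # characters (the shortest poem), and drop non-poem entries below
        # 100 characters (promotions, announcements, peripheral material).
        # The "/poems/" prefix is a hypothetical way to flag poems.
        if length < 65 or (length < 100 and not url.startswith("/poems/")):
            continue
        visits[user_id] += 1
        chars[user_id] += length
    return visits, chars


def chars_to_pages(n_chars):
    """Convert a character count to approximate A4 pages (footnote rule)."""
    return n_chars / (CHARS_PER_WORD * WORDS_PER_PAGE)
```

Applied to the median heavy reader's total of 139,860 characters, `chars_to_pages` gives roughly 77.7 pages, matching the figure reported in the Results below.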
Results

The available data allowed us to examine two questions to gauge participation in the GCRP: "How many people visited the FunDza site?" and "How much did they read when they were there?" Both variables, namely number of visits to the site and length of reading material accessed, were highly skewed. In the case of number of visits, 50% of users visited FunDza seven times or less. In the case of length of reading material accessed, 50% of users viewed or read material of 18,541 characters, about the equivalent of ten A4 manuscript pages, or less. (Assuming approximately five characters per word, and 360 words on a double-spaced printed page in a 12-point font.)

Given the skewness of the variables, it is informative to consider them at different points of their distribution. If we consider the top quartile (25%) of the variable recording number of visits, we find that users in this quartile each visited FunDza 39 times or more. Similarly, users in the top 1% (n = 655) each visited FunDza at least 717 times. In terms of amount viewed or read, if we consider readers who visited more than seven times (the top 50% of the distribution), the median number of characters read was 139,860, the equivalent of about 77 A4 pages. In more practical terms, this means that about 33,000 readers were reached who clicked on the equivalent of over 77 pages of reading material. Readers in the top quartile of this set (16,375 of them) viewed or read a median of 243 pages of text in the eight-month period in question, and readers in the top 1% read at least 1,660 pages of text.

Reflections

To be regarded as successful, a programme such as FunDza's GCRP must at the least attract readers, by getting them to read the material provided.
The question of how many readers should be attracted, and how much these readers should read, is of course difficult to settle, and FunDza has no specific targets in mind. The results above show that a large number of readers (more than 65,000) used the site between 3 February 2013 and 6 October 2013, and read at least 65 characters per visit. Fifty percent of these readers visited the site seven times or more, and the top 25% of these readers visited the site 39 times or more. More than half of the visitors to the site (i.e. more than 33,000 people) read a significant amount of material, the equivalent of over 75 A4 pages. Furthermore, the 1% of readers that visited the site most frequently, over 650 of them, read what seems to us an undeniably large amount of material: a minimum of 1,660 pages. Of course, we do not know whether these readers were motivated by the GCRP to read more than they usually do. In Study 2, we attempt to establish possible changes in reading behaviour.

Study 2: Changes in reading behaviour

Method

We wanted to test more rigorously the notion that visiting the FunDza site increases reading enjoyment and gets its target audience to read more. A major objective of the GCRP is to get young people to read more "for enjoyment" (i.e. outside of the school context), meaning that they would read for longer periods of time, and would read more often. To assess possible changes in reading behaviour, a single-group quasi-experimental pretest-posttest design was used to compare reading behaviour before joining FunDza with reading behaviour after joining. For this purpose, an eight-item questionnaire (see Table 1) was designed to gather information about the key variables. The most convenient, and perhaps only, way we could access this population in sufficiently large numbers was for respondents to receive the questionnaire on their mobile phones.
Also, since the respondents accessed the FunDza readings via their mobile phones, it made sense to deliver the questionnaire through the same channel. However, delivering the questionnaire via mobile phone placed severe limits on the number of questions that could be asked, and on the possible length of the questions (small screen size, limited display graphics on low-end mobile models, and the cost to the respondent in terms of airtime were some of the considerations in this respect). We required information about reading behaviour (questions 2, 3, and 4 in Table 1), reading enjoyment (question 7), access to books at home (question 1), self-assessed reading proficiency (question 6), reading preferences (question 5), and preferred leisure activities (question 8). Almost all of these questions were drawn from examples used in the national surveys of the British National Literacy Trust (Clark, 2011; Clark & De Zoysa, 2011), as the variables they assessed came closest to our interests in the present study. Some of the questions were slightly re-phrased to fit local circumstances better, or to make them more suitable for delivery via mobile phone. We added one question about the number of books at home, as research by, for example, Evans, Kelley, Sikora, and Treiman (2010) indicated that the size of a home library contributes substantially to a child's educational achievement.

Table 1: Questionnaire items assessing reading behaviour

1. How many books do you have at home? 1 = None; 2 = 1-10 books; 3 = 11-50 books; 4 = More than 50 books.
2. How many books did you read last month outside of school? 1 = None; 2 = 1 to 2 books; 3 = 3 to 5 books; 4 = More than 5 books.
3. When you read, how long do you normally read for? 1 = I do not read; 2 = Up to 10 minutes; 3 = About 30 minutes; 4 = About 1 hour.
4. How often do you read outside of school? 1 = Never; 2 = About once a week; 3 = 2-3 times a week; 4 = Every day.
5. Which of these do you read most frequently outside of school, and at least once a month? 1 = Magazines; 2 = Websites; 3 = Books (storybooks, novels); 4 = Newspapers; 5 = Comics; 6 = I don't read any of these at least once a month.
6. How good a reader do you think you are? 1 = Not a good reader; 2 = An average reader; 3 = A very good reader.
7. How much do you enjoy reading? 1 = Don't enjoy; 2 = Enjoy a little; 3 = Enjoy a lot.
8. If you have free time after school, what do you prefer to do? 1 = Watch TV; 2 = Just chill, don't do much; 3 = Read something.

New GCRP readers could access the pretest questionnaire between 5 November 2013 and 30 January 2014, and 6,466 completed it. (An incentive was offered to respondents: they could earn a small amount of 'moolah', or airtime, by completing the survey. This may explain the large number of respondents, and the difference between the pretest and the posttest response rates.) The posttest questionnaire was sent to all these users between 10 February and 28 February 2014. A total of 542 users completed both the pretest and the posttest questionnaires. Since both questionnaires were required in order to assess the effect of the intervention, only these 542 users were considered for the analysis. In addition, we included only respondents who were younger than 25 years of age (N = 476), in order to fit FunDza's stated goal of improving reading in adolescents and younger adults. The majority of the respondents were women (63.2%), with an average age of 18.53 years (SD = 2.82, min. = 13), and respondents in the younger age groups were perhaps underrepresented (below age 15: 8.6%).

Results

Descriptive results (means and standard deviations) are shown in Table 2 for six of the eight questions used in the questionnaire, along with the results of paired sample t-tests.
For the remaining two questions, which did not lend themselves to mean comparisons, we report tests of difference between dependent proportions for the modal category. (Note that incorporating some of the measures of reading derived from the user log files, as discussed later in the report, reduced the pretest sample size by about 40 cases, due to missing data.)

Table 2: Means (standard deviations) across the pretest and the posttest for six questionnaire items

Question                                                   Pretest      Posttest     t       d
How many books do you have at home?                        2.35 (.93)   2.51 (.83)   4.13**  .19
How many books did you read last month outside of school?  2.63 (.90)   2.86 (.88)   4.48**  .26
When you read, how long do you normally read for?          3.44 (.74)   3.51 (.71)   2.14†   .12
How often do you read outside of school?                   3.20 (.81)   3.37 (.69)   3.89**  .21
How good a reader do you think you are?                    2.55 (.57)   2.60 (.56)   2.03†   *
How much do you enjoy reading?                             2.89 (.34)   2.86 (.40)   *       *

* not computed (see narrative). ** p < .001. † p < .05. d = Cohen's mean standardised difference. All df = 435.

How many books do you have at home?

A majority of respondents in both the pretest and the posttest questionnaires reported having fewer than 11 books at home (64% vs 57%; less than value 2 on the scale we used). However, at the second measurement a modest but statistically significant increase in the number of books was reported (paired sample t-test: t = 4.13, df = 435, p < .001, d = .19). A survey conducted by the South African Book Development Council (2007) estimated that on average there are eight books in a typical South African household. In the evaluation of the iREAD project in Ghana, similar results emerged: all the primary school children had fewer than 11 books at home (Worldreader et al., 2012). The fact that the majority of respondents in the present study reported having so few books at home indicates that the GCRP attracts what it regards as the appropriate target population, that is, young people who are "book-poor".

How many books did you read last month outside of school?

A majority of respondents in both the pretest and the posttest questionnaires reported reading more than two books outside of school in the most recent month (53% vs 65%). This is not so different from what 16- to 17-year-olds in the USA reported in a recent Pew Center survey (2012), namely 18 books on average per year. Our respondents reported reading more books at the later measurement: there was a statistically significant increase between the pretest and the posttest (paired sample t-test: t = 4.48, df = 435, p < .001, d = .26), which was moderate in size.

When you read, how long do you normally read for?

A majority of respondents in both the pretest and the posttest questionnaires reported their normal reading time to be about 1 hour (58% and 62%). Clark (2012) showed that about 28% of British schoolchildren between the ages of 8 and 16 reported that they read for about an hour or longer at a time. There was a statistically significant increase between the pretest and the posttest (paired sample t-test: t = 2.14, df = 435, p = .033, d = .12), which, although small in size, indicated that the respondents read for longer at the posttest.

How often do you read outside of school?

A large majority of respondents in both the pretest and the posttest questionnaires reported that they read two to three times a week, or every day (82% vs 88%). This is comparable to Clark's (2012) findings, where nearly 60% of her respondents read at least a few times a week.
Again, in the present study there was a statistically significant increase between the pretest and the posttest (paired sample t-test: t = 3.89, df = 435, p < .001, d = .21), which shows that the respondents read more frequently at the posttest.

How good a reader do you think you are?

A majority of respondents in both the pretest and the posttest questionnaires reported that they consider themselves very good or average readers. Nearly 90% of Clark's (2012) British respondents answered in the same way. The distributions in the pretest and the posttest were highly skewed, so we did not test the mean difference. However, we did compare the proportions answering that they were "very good" readers, and there was a statistically significant increase in this proportion from the pretest to the posttest (from 59% to 64%). This was tested with an exact McNemar test: n = 435, p = .021. In other words, at the second measurement more respondents regarded themselves as very good readers.

How much do you enjoy reading?

A near complete majority of respondents in both the pretest and the posttest questionnaires reported that they enjoy reading very much (89% and 88% respectively). Clark (2012) found lower numbers: 50% of the British sample enjoyed reading "very much" or "a lot". The distributions in the pretest and the posttest were highly skewed, so we did not test the mean difference. There was also strong evidence of a 'ceiling effect', meaning either that the scale was not sensitive enough, or that the population in question has very few members who do not enjoy reading.

Which of these do you read most frequently outside of school, and at least once a month: books, magazines, newspapers, or websites?

A majority of respondents in both the pretest and the posttest questionnaires reported that they read books more frequently than other material (magazines, newspapers, and websites).
There was a statistically significant increase in the reading of books between the pretest and the posttest (from 49% to 58%), which was moderate in size. This was tested with an exact McNemar test: n = 435, p = .002. The OECD's (2010) assessment showed that "students who reported reading fiction and non-fiction books regularly, i.e. several times a month or several times a week, are particularly likely to perform well in the PISA reading assessment" (p. 35).

If you have free time after school, what would you prefer to do?

A large majority of respondents in both the pretest and the posttest questionnaires reported that they prefer to read in their free time, rather than watch television or "just chill out". This proportion was already high in the pretest (76%), but increased significantly in the posttest (81%). This was tested with an exact McNemar test: n = 435, p = .03. We are somewhat sceptical about such high percentages, and suspect that respondents may have answered in a socially desirable way, especially as 54% of Clark's (2012) British sample preferred watching television over reading.

Reflections

In summary, all the comparisons made between the "before" and the "after" measures show moderate but significant changes in the desired direction: respondents reported that they have more books at home, read more books outside of school, read more frequently and for longer, prefer reading books over other activities, and consider themselves better readers.

Since reading for enjoyment is an important activity that FunDza wants to encourage, that finding justifies a brief comment. Enjoyment of reading was high before participation in the programme, and remained high at the posttest. One way to interpret this finding is that the programme attracts young people who already enjoy reading, but do not have access to the physical books that they may want, i.e. they are "book-poor".
After all, reading books or stories on the small screen of a mobile phone could be regarded as a poor second choice to reading physical books. One could therefore say that the programme gives people who want to read access to reading material that they otherwise lack; whether the programme attracts new readers and turns them into enthusiastic ones cannot be inferred from the current data. This interpretation is supported by the responses to the question about how respondents spend their free time: the majority indicated that they prefer to read rather than watch television or just relax, at the "before" as well as the "after" stage.

These results are based on self-report data, but our analysis below of actual reading activity shows similar evidence of positive change. For a start, a large number of readers were attracted to the reading programme, and a significant proportion kept on reading. The majority read a substantial amount of material. In fact, one could argue that the respondents read a surprisingly large amount of material, given that they were reading it on a mobile phone, which typically has a very small, low-resolution screen. It is also not likely, given the time delay, that the respondents' responses in the pretest positively biased their responses in the posttest.

We conducted one further analysis to establish whether the programme had brought about changes in reading behaviour, as we had data regarding the extent to which the respondents had participated in the programme, and we could relate these data to the changes that the respondents reported.

Analysis of change between pretest and posttest, controlling for 'dosage'

For this analysis, we relied on log file records of participants' activity in the FunDza service, in order to factor in the amount of reading they were doing of material provided through the service.
If the changes we observed between the pretest and the posttest are a function of the amount of reading (i.e. programme "dosage"), then we can be more confident of the efficacy of the programme itself. Over 3.5 million individual records were processed, and from these we were able to extract several aspects of the survey respondents' usage of the service. We opted for a measure of how many Internet pages participants had read, rather than visits to the FunDza site or any of the other possible measures. On average, participants read over 1,000 story pages from the FunDza site, the equivalent of approximately 100 A4 pages. A more detailed breakdown shows that the bottom 25% of the sample read fewer than 78 story pages, while the top 25% read 13,800 or more story pages. The most active 5% of the sample read 27,000 or more story pages. The findings for this smaller sample thus partially replicate the findings of Study 1, in that these new readers also read a substantial amount of material.

The number of pages that participants read was taken as a dosage measure, that is, a measure of how much of the programme a participant received. If the programme is achieving its desired outcomes, one expects a positive relationship between participation in the programme (i.e. how much participants read) and the responses given to key questions. To explore this, we calculated Pearson correlations between the total number of pages read and the questions we had asked in the pretest and posttest surveys. These correlations are shown in Table 3. In general, they show a positive relationship between the number of pages read and agreement with items on the questionnaires: the more participants read, the more likely they are to indicate that they enjoy reading outside of school, that they consider themselves good readers, that they read for longer, and that they read more outside of school in the most recent month.
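As an illustration, the two computations at the heart of this dosage analysis, the Pearson correlation of pages read with questionnaire responses, and the hierarchical step of testing whether pages read still predicts a posttest score once the pretest score is controlled for, can be sketched in a few lines. This is a minimal sketch with numpy; the variable and function names are ours, and the published analyses were presumably run in a standard statistics package.

```python
import numpy as np


def pearson_r(x, y):
    """Pearson correlation between two equal-length vectors."""
    return float(np.corrcoef(x, y)[0, 1])


def dosage_effect(pretest, posttest, pages_read):
    """Hierarchical regression step: does amount read predict the posttest
    score after the pretest score has been factored out?

    Returns the slope for pages_read in the full model, and the increase in
    R-squared when pages_read is added to a pretest-only model.
    """
    n = len(posttest)
    intercept = np.ones(n)

    def fit(X, y):
        # Ordinary least squares via numpy; returns coefficients and R^2.
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        ss_res = float(resid @ resid)
        ss_tot = float((y - y.mean()) @ (y - y.mean()))
        return beta, 1 - ss_res / ss_tot

    # Step 1: pretest only (controls for pre-existing level).
    _, r2_base = fit(np.column_stack([intercept, pretest]), posttest)
    # Step 2: pretest plus the dosage measure.
    beta_full, r2_full = fit(
        np.column_stack([intercept, pretest, pages_read]), posttest
    )
    return beta_full[2], r2_full - r2_base
```

A positive slope and a non-trivial R-squared increase in step 2 correspond to the pattern reported in Table 4: dosage predicts posttest responses over and above pretest responses.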
Most of the correlations are not very strong, but some questionnaire items had restricted range, and/or showed ceiling or near-ceiling effects, which is likely to have attenuated the coefficients.

Table 3: Correlations between questionnaire items and amount of material read

      Questionnaire item                                           Total pages read
Pre   How many books do you have at home?                                .106*
Post  How many books do you have at home?                                .057
Pre   How many books (such as storybooks and novels) did you
      read last month outside of school?                                 .132**
Post  How many books (such as storybooks and novels) did you
      read last month outside of school?                                 .125**
Pre   When you read, how long do you normally read for?                  .215**
Post  When you read, how long do you normally read for?                  .159**
Pre   How often do you read outside of school?                           .004
Post  How often do you read outside of school?                           .083
Pre   How good a reader do you think you are?                            .052
Post  How good a reader do you think you are?                            .119*
Pre   How much do you enjoy reading?                                     .128**
Post  How much do you enjoy reading?                                     .165**
* p < 0.05 (2-tailed). ** p < 0.01 (2-tailed). Listwise N = 435. Some signs have been reversed for readability.

More importantly, the measure of amount of reading correlates with items from both the pretest and the posttest. This is to be expected, since participants are in the programme voluntarily, and their pre-existing enjoyment of reading, or frequency of reading (among other things), is likely to be one of the reasons they enrolled in the programme. It is, however, important, from our point of view, to show that the amount of reading predicts improvement in the posttest scores in particular. This comes much closer to the overall issue of interest that we alluded to above: does the programme bring about changes in the outcomes of interest? We assessed this by conducting hierarchical linear multiple regression analyses, factoring out any pre-existing relation between pretest items and posttest items before entering the reading measure. These analyses are shown in the first six lines of Table 4, for items with which we could do such an analysis (i.e. interval measurement scales). In the final two lines of the table we show results for items that we rescaled to dichotomies and analysed with logistic regression. (Two of the items in the questionnaire were of a categorical nature, e.g. the item that asked participants what kind of material they like to read. We recoded these items to a binary form, since in each case one of the options could be considered a positive outcome intended by the programme, and the other options could be considered less important, or undesirable.)

Table 4: Hierarchical regression "dosage" analysis

Posttest item                                            B (s.e.)       R2     F/Wald   p
How many books do you have at home?                      -.002 (.023)   .000   0.01     .940
How many books did you read last month outside
  of school?                                             .050 (.027)    .005   2.80     .100
When you read, how long do you normally read for?        .044 (.022)    .008   3.99     .046*
How often do you read outside of school?                 .040 (.021)    .007   3.37     .067
How good a reader do you think you are?                  .038 (.017)    .010   5.09     .025*
How much do you enjoy reading?                           .040 (.013)    .016   7.69     .006✝
Reads books more frequently than other materials°        .170 (.073)    -      5.4      .020*
Reads in free time rather than other activities°         .029 (.317)    -      0.97     .330
Note: For hierarchical linear regression, all df = 1,433; for logistic regression, df = 1. Some signs have been reversed for readability. ° These items were analysed using hierarchical logistic regression on recoded dichotomies. * p < .05. ✝ p < .01.

The analyses show that the amount of reading, measured in number of FunDza story pages, is a statistically significant predictor of changes in four of the eight relevant items, and is nearly significant as a predictor for two of the remaining items, once pre-existing scores on the items have been controlled for. For instance, the amount of reading (indicated by the log files) is a positive and significant predictor of self-rated enjoyment of reading, even after controlling for the enjoyment reported before the FunDza intervention. This, along with the reported change between the pretest and the posttest, is evidence of the effectiveness of the intervention for increasing enjoyment of reading, self-rated proficiency in reading, self-rated amount of reading, and self-rated book reading.

Conclusion

The USAID (2014) review of the mobiles-for-reading landscape identified six purposes of these interventions, and the GCRP of FunDza falls into the "providing content" category. Projects with this purpose in mind typically provide reading content via mobile devices, because the assumption is that young people lack access to textbooks and reading materials. The long-term objective of FunDza is to improve the literacy levels of "book-poor" young people in South Africa by making reading material available to them via mobile phones. The current study, of course, is unable to answer questions about improved literacy. Nevertheless, it is important to know whether the GCRP could appeal in the short term to sufficiently large numbers of young people who actually read the material provided to them. In the first study, we provided data that strongly suggest that this is the case. Having this kind of data, and analysing it in some detail, gives a more positive indication of engagement with the material than merely documenting "access" to books and stories (see, for example, Worldreader et al., 2012). Authors like Wigfield, Guthrie, Perencevich, Klauda, McRae and Barbosa (2008) indeed have shown that reading engagement increases reading comprehension, which in turn is expected to lead to higher reading achievement.
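The hierarchical "dosage" regression reported earlier, in which a posttest item is regressed first on its pretest counterpart and only then on the amount read, can be sketched as follows. This is an illustrative sketch, not the authors' analysis code: the data, variable names, and the pages-in-thousands scaling are hypothetical assumptions.

```python
# Sketch of a two-step hierarchical regression: step 1 enters only the
# pretest score; step 2 adds pages read. The gain in R^2 and the pages
# coefficient show whether dosage predicts posttest scores over and above
# pre-existing differences. All data here are hypothetical.

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination; returns (coefficients, R^2)."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):  # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    y_hat = [sum(X[i][p] * beta[p] for p in range(k)) for i in range(n)]
    y_bar = sum(y) / n
    ss_res = sum((y[i] - y_hat[i]) ** 2 for i in range(n))
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    return beta, 1 - ss_res / ss_tot

# Hypothetical participants: pretest rating, pages read, posttest rating.
pretest = [2, 3, 2, 4, 3, 5, 4, 3, 2, 4]
pages = [78, 150, 560, 900, 1040, 2300, 5100, 9000, 13800, 27000]
posttest = [2, 3, 3, 4, 3, 5, 5, 4, 4, 5]

step1_X = [[1.0, pre] for pre in pretest]
step2_X = [[1.0, pre, pg / 1000.0] for pre, pg in zip(pretest, pages)]
_, r2_step1 = ols(step1_X, posttest)
beta2, r2_step2 = ols(step2_X, posttest)
print(f"R^2 step 1 = {r2_step1:.3f}, step 2 = {r2_step2:.3f}, "
      f"pages coefficient = {beta2[2]:.3f}")
```

Because the two models are nested, R² at step 2 can never be lower than at step 1; what matters, as in Table 4, is whether the increment attributable to pages read is statistically significant.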
Study 2 suggests, firstly, that the programme attracts young people who already enjoy reading, but do not have access to physical books that they may want to read. Although this is a positive finding, it does not tell us whether the programme draws in non-readers and makes readers out of them. A quite different study would be required to answer such a question. Secondly, measured over a relatively short period of time, respondents reported that they had more books at home (although still a limited number), that they read more books outside of school, that they read more frequently and for longer, that they preferred reading books over other activities, and that they considered themselves better readers. Finally, when taking a dosage variable into account, we established positive relationships between the number of pages read and enjoyment of reading outside of school, reading confidence, how long respondents read at a time, and how much they read after school. This is entirely in line with the studies we have cited here, such as Wigfield et al. (2008), OECD (2010), and Worldreader et al. (2012). We have acknowledged a few limitations to the study, but overall the results point in the same positive direction, and they interlink with and support each other. One obvious strength of this study is its direct measurement of reading behaviour via log files, instead of relying on self-report data alone. This kind of information is not often available in such a format.

References

Clark, C. (2011). Setting the baseline. The National Literacy Trust's first annual survey into young people's reading – 2010. London: National Literacy Trust.
Clark, C. (2012). Children's and young people's reading today. Findings from the 2011 National Literacy Trust's annual survey. London: National Literacy Trust.
Clark, C., & De Zoysa, S. (2011). Mapping the interrelationships of reading enjoyment, attitudes, behaviour and attainment. An exploratory investigation.
London: National Literacy Trust.
Department of Basic Education. (2013). Report on the Annual National Assessment of 2013. Pretoria: Department of Basic Education. http://www.education.gov.za/Curriculum/AnnualNationalAssessment/tabid/424/Default.aspx
Evans, M.D.R., Kelley, J., Sikora, J., & Treiman, D.J. (2010). Family scholarly culture and educational success: Books and schooling in 27 nations. Research in Social Stratification and Mobility, 28, 171-197.
FunDza Literacy Trust. (2014). The FunDza Literacy Trust. http://www.fundza.co.za/
FunDza Literacy Trust. (2016). Our Mobi library. http://www.fundza.co.za/our-programmes/growing-communities-of-readers-programme/our-mobi-library/
Howie, S., van Staden, S., Tshele, M., Dowse, C., & Zimmerman, L. (2012). PIRLS 2011: South African Children's Reading Literacy Achievement Report. Pretoria: Centre for Evaluation and Assessment, University of Pretoria. http://www.up.ac.za/media/shared/Legacy/sitefiles/file/publications/2013/pirls_2011_report_12_dec.pdf
Krashen, S.D. (2004). The power of reading. Westport, CT: Libraries Unlimited.
Moloi, M.Q., & Chetty, M. (2010). The SACMEQ III project in South Africa: A study of the conditions of schooling and the quality of education. Pretoria: Department of Basic Education.
OECD. (2010). PISA 2009 results: Learning to learn – student engagement, strategies and practices (Volume III). http://dx.doi.org/10.1787/9789264083943-en
Pew Research Center. (2012). The rise of e-reading. Washington, DC: Pew Research Center. http://libraries.pewinternet.org/files/legacy-pdf/The%20rise%20of%20e-reading%204.5.12.pdf
Roberts, N., & Vänskä, R. (2011). Challenging assumptions: Mobile Learning for Mathematics Project in South Africa. Distance Education, 32, 243-259.
Spaull, N. (2011). A preliminary analysis of SACMEQ III South Africa. Stellenbosch University Economic Working Papers: 11/11. http://www.ekon.sun.ac.za/wpapers/2011/wp112011/wp-11-2011.pdf
Spaull, N. (2012). South Africa at a glance. SACMEQ at a glance series. Stellenbosch University Research on Socio-economic Policy (RESEP). http://resep.sun.ac.za/index.php/projects/
South African Book Development Council. (2007). National survey into the reading and book reading behaviour of adult South Africans: Quantitative research into the reading, book reading & book buying habits of South Africans from age 16. Pretoria: Print Industries Cluster Council; Department of Arts and Culture. http://www.sabookcouncil.co.za/sabookcouncil/pdf/NRSDOCopt.pdf
UNESCO. (2013). UNESCO Mobile Learning Week. Retrieved July 10, 2014. http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/ED/pdf/MLW_2013_Program-withlinks.pdf
UNESCO. (2014). Mobile reading. http://www.unesco.org/new/en/unesco/themes/icts/m4ed/mobile-reading/
USAID. (2014). Mobiles for reading: A landscape research review. Washington, DC: USAID. http://literacy.org/sites/literacy.org/files/publications/wagner_mobiles4reading_usaid_june_14.pdf
Wigfield, A., Guthrie, J. T., Perencevich, A. T., Klauda, S. L., McRae, A., & Barbosa, P. (2008). Role of reading engagement in mediating effects of reading comprehension instruction on reading outcomes. Psychology in the Schools, 45, 432-445.
Worldreader, ILC Africa & USAID. (2012). IRead Ghana study: Final evaluation report. Washington, DC: USAID. http://pdf.usaid.gov/pdf_docs/pnadz402.pdf