Australasian Journal of Educational Technology, 2019, 35(6).

Digital equity and accessible MOOCs: Accessibility evaluations of mobile MOOCs for learners with visual impairments

Kyudong Park
Pohang University of Science and Technology, Korea

Hyo-Jeong So
Ewha Womans University, Korea

Hyunjin Cha
SoonChunHyang University, Korea

Despite the popular claim that massive open online courses (MOOCs) can democratise educational opportunities, this study suggests that current MOOC platforms are not designed to be accessible and inclusive for learners with disabilities. Our main goals in this study were to identify the needs and barriers that learners with visual impairments face when learning with mobile devices in MOOCs and to make recommendations for designing MOOCs that are more accessible and inclusive. We conducted this study in two phases: a user study (Phase I) and a heuristic walkthrough (Phase II). In Phase I, we conducted a user study with three university students with visual impairments to identify their needs and the barriers to learning that they encounter in mobile MOOC platforms. In Phase II, five evaluators conducted a heuristic walkthrough based on the Web Content Accessibility Guidelines 2.0 (World Wide Web Consortium, 2008) to examine the degree of accessibility of a MOOC platform. Overall, the results indicate that serious accessibility issues exist in MOOC platforms, preventing learners with visual impairments from fully participating in learning activities. We conclude this paper by recommending ways to design mobile MOOCs to make them more accessible for learners with visual impairments.

Implications for practice or policy:
• To make MOOCs accessible to learners with visual impairment, MOOC platforms need to provide auto-translation and downloadable lectures with subtitles.
• Efforts should be devoted to providing alternative texts for non-text content media and information on the current state of hidden menu elements.
• The use of bypass buttons can help learners with visual impairment better access repetitive information.
• This study recommends improving the accessibility of MOOCs based on the universal design for learning principles.

Keywords: digital equity, MOOCs, learners with visual impairments, universal design for learning, accessibility, mobile learning

Introduction

This study focused on digital equity issues in mobile massive open online course (MOOC) environments. MOOCs have emerged in the higher education scene as part of a growing effort to extend learning opportunities to a wider range of learners. In the context of this openness movement, MOOC platforms must have high levels of accessibility for learners with and without disabilities. Learning activities in MOOCs include not only passive activities such as reading materials and watching online video lectures but also active participation such as writing posts in discussion forums, taking quizzes, submitting assignments, and performing peer reviews. However, whether current MOOCs provide truly open and accessible learning platforms that enable learners with disabilities to participate in such diverse learning activities remains an open question.

From the perspective of digital equity, Selwyn (2016) argued that, despite the popular claim that digital technologies can democratise educational opportunities, online learning tends to be designed and configured to the norm of self-motivated and highly able students. Hence, the advantages of online learning disappear when those who do not conform to this norm and come from marginalised groups engage in online learning. The accessibility issues of MOOCs are also indicative of the failure of the digital democracy claim in education.
Existing research generally suggests that current MOOC platforms have serious accessibility issues that prevent learners with disabilities from fully participating in learning activities (e.g., Akgül, 2018; Iniesto, McAndrew, Minocha, & Coughlan, 2016a). Sanchez-Gordon and Luján-Mora (2017) conducted a systematic literature review of 40 studies published between 2009 and 2016 on the accessibility of MOOCs for diverse learners and identified challenges to achieving accessible MOOCs. The review highlighted the need for more research investigating accessibility needs from learners’ perspectives and for considering accessibility guidelines from the early stages of development.

Against this backdrop, we set out to contribute to the body of research on accessible MOOCs by evaluating the accessibility of mobile MOOC platforms for learners with visual impairments. Recognising that most studies on accessible MOOCs have been conducted in the desktop computer environment (e.g., Al-Mouh, Al-Khalifa, & Al-Khalifa, 2014; Bohnsack & Puhl, 2014; Iniesto & Rodrigo, 2014), we focused specifically on evaluating the accessibility of MOOCs in mobile contexts. Given the widespread adoption of mobile devices, we postulated that mobile learning environments will become a dominant mode of learning, making it critically important to examine the extent to which mobile MOOCs accommodate the needs of learners with diverse backgrounds, abilities, and disabilities.

To this end, we conducted the study in two phases: a user study and a heuristic walkthrough. In Phase I, we conducted a user study with learners with visual impairments to identify their needs and the barriers to learning that they encounter in mobile MOOC platforms. In Phase II, five evaluators conducted a heuristic walkthrough based on the Web Content Accessibility Guidelines (WCAG) 2.0 (World Wide Web Consortium [W3C], 2008) to examine the degree of accessibility of a MOOC platform.
We conclude this paper by discussing the implications of our findings for the development of accessible mobile MOOCs for learners with visual impairments.

Theoretical background and related work

MOOCs and digital equity

MOOCs originated from the open educational resources (OER) movement, which is based on the principle of providing everyone with free educational content and open access to learning resources using digital technologies, in the hope of expanding educational opportunities to all (Atiaja & Guerero-Proenza, 2016). In particular, collaboration among prestigious universities and educational institutions through alliances and partnerships has fuelled the creation of open learning platforms available to large audiences (Atiaja & Guerero-Proenza, 2016). This proliferation of MOOC platforms has transformed the educational sphere of lifelong learning for all, making learning accessible to anyone, anywhere.

MOOCs have benefitted diverse groups of learners through open and flexible features such as free or low-cost access to diverse learning resources, flexible participation in learning activities at the learners’ preferred pace and in their preferred locations, social learning opportunities with learners in different geographical locations, and the acquisition of certificates from prestigious universities (Iniesto, McAndrew, Minocha, & Coughlan, 2017a, 2017b). However, recent studies related to the accessibility of MOOCs have indicated that, in spite of their massive and open nature, MOOCs might not offer accessible learning environments for learners with special needs (Akgül, 2018). Sanchez-Gordon and Luján-Mora (2016) even insisted that MOOCs ignore the needs of many people who would like to access their learning environments and, in fact, only pretend to democratise education.
Indeed, technology has played an important role in enabling people with disabilities to overcome barriers and pursue opportunities to expand their learning and life skills (Foley & Ferri, 2012). Previous studies have shown that e-learning and open content can create better learning environments for students with disabilities (Guglielman, 2010; Scanlon, McAndrew, & O’Shea, 2015). For instance, statistics from the Open University in the United Kingdom show that the proportion of students with disabilities increased annually, from 4.18% (2010–2011) to 13.83% (2014–2015), testifying to the increased participation of learners with disabilities in online learning environments (Iniesto et al., 2016b). In this respect, MOOCs are also believed to have great potential to provide alternative learning opportunities for learners with disabilities who cannot attend educational institutions through traditional means. Despite this great potential and their suitability to support people with disabilities, researchers have argued that most MOOC platforms remain less than universally accessible (Akgül, 2018).

Research on accessible MOOCs

Accessibility refers to the design and practice of making products, services, and environments usable by all people, including persons with special needs (Henry, Abou-Zahra, & Brewer, 2014). As web technologies are rapidly advancing and Internet use on a wide range of devices is growing, web accessibility issues have received increasing attention in the literature. To provide equal access and mitigate the various limitations that disabled people face while using the web on various devices, W3C developed WCAG 1.0 in 1999 and improved the guidelines to WCAG 2.0 in 2008 (Akram & Sulaiman, 2017).
WCAG 2.0, the standard most widely utilised in previous accessibility studies, consists of 61 success criteria and 12 guidelines under four principles: 22 success criteria under the perceivable principle, 20 under the operable principle, 17 under the understandable principle, and two under the robust principle (W3C, 2008). The success criteria were organised (based on the extent to which the web service meets the needs of different people and situations) into three levels of conformance: A (lowest), AA (mid-range), and AAA (highest) (W3C, 2008). W3C also extended the accessibility standards to several products and services, developing ARIA (Accessible Rich Internet Applications), ATAG (Authoring Tool Accessibility Guidelines), and UAAG (User Agent Accessibility Guidelines) (W3C, n.d.).

Coughlan, Ullmann, and Lister (2017) emphasised that although accessibility guidelines and standards can promote public awareness and compliance with necessary requirements, disabled people still face problems. They contended that more empirical studies should be carried out to identify limitations in such guidelines. In addition, Cooper, Sloan, Kelly, and Lewthwaite (2012) discussed whether accessibility issues should be handled with consideration of the user’s contextual factors.

In the education sector, many institutions and projects have addressed the issue of accessibility for disabled students in online learning environments and e-learning, considering teaching and learning contexts (Jemni, Laabidi, & Ayed, 2014). For instance, the Digital Accessible Information System (DAISY) standard was developed for accessible e-books and materials (DAISY Consortium, n.d.), and the IMS AccessForAll and ISO standards (IMS Global, 2012) were developed to define and describe accessible learning resources and content.
Furthermore, previous studies have attempted to develop accessible content and tools in education, including a teacher tool for developing e-learning content for visually impaired students and Websign, an interface for teaching sign language to deaf students in online learning environments (Jemni et al., 2014). However, the number of studies focusing on accessibility issues in MOOCs remains limited. Furthermore, as MOOCs become increasingly popular, the need for empirical research examining the accessibility levels of MOOCs and the related requirements for learners with disabilities is increasingly urgent.

In general, there are three types of web page accessibility evaluation techniques: automated testing, manual testing, and user testing (Abou-Zahra, 2008). In this section, we group and discuss existing research on the accessibility of MOOCs according to each of these evaluation techniques; Table 1 summarises our findings.

First, automated testing is a technique for evaluating accessibility using automatic programs or tools. This method usually does not require help from human evaluators. For example, Iniesto, Rodrigo, and Moreira Teixeira (2014) evaluated MOOCs using eXaminator, an automatic online testing tool. Calle-Jimenez, Sanchez-Gordon, and Luján-Mora (2014) evaluated a MOOC on geographical information systems using three automated tools: Accessibility Audit, eXaminator, and WAVE. They contended that researchers should supplement automated testing with user testing involving users with different types of disabilities because automatic tools cannot detect certain errors.

Second, manual testing is a technique in which human evaluators identify and test accessibility issues, though it can sometimes be aided by software tools.
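At their core, automated tools of the kind described above (and the software aids used in manual testing) perform programmatic rule checks against markup. As a minimal, hypothetical illustration, not drawn from any of the cited tools, the sketch below uses only Python’s standard library to flag images that lack the text alternatives required by WCAG 2.0 success criterion 1.1.1:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> elements with no alt attribute (WCAG 2.0 SC 1.1.1).

    A deliberately minimal sketch of automated accessibility testing;
    production tools such as WAVE or AChecker evaluate many success
    criteria, not just this one.
    """
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # alt="" is legitimate for purely decorative images, so only a
        # completely absent attribute is reported here.
        if tag == "img" and "alt" not in attrs:
            self.problems.append(attrs.get("src", "<unknown src>"))

# Hypothetical page fragment: the second image is invisible to a screen reader.
page = '<img src="logo.png" alt="Course logo"><img src="enrol.png">'

checker = AltTextChecker()
checker.feed(page)
print(checker.problems)  # -> ['enrol.png']
```

A check like this cannot judge whether an alt text is actually meaningful to a learner, which is precisely why the studies above argue for combining automated runs with manual and user testing.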
Sanchez-Gordon and Luján-Mora (2013) deduced accessibility requirements for elderly users based on WCAG 2.0 and the Web Accessibility Initiative – Ageing Education and Harmonisation (WAI-AGE) recommendations for five Coursera courses. They identified 22 accessibility requirements corresponding to WCAG 2.0 and seven requirements from the recommendations of WAI-AGE. Al-Mouh et al. (2014) conducted both expert and user evaluations to identify accessibility problems on Coursera for learners with visual impairments. Two experts participated in the heuristic evaluation to examine Coursera’s accessibility based on WCAG 2.0. Furthermore, three blind users took part in the usability testing, completing a predefined set of tasks on MOOCs by utilising screen readers. Based on the results of both studies, the researchers developed five specific recommendations related to information presentation, text alternatives, form input labels, fill-in blanks in quizzes, and descriptive alternatives on links, tables, and images.

Third, user testing is a technique for testing a target website with real end users. In the study by Bohnsack and Puhl (2014), a blind user examined courses from Udacity, Coursera, edX, OpenCourseWorld, and Iversity. The study revealed that, among these, only edX was accessible to visually impaired users.

Table 1
Summary of the literature on the accessibility issues of MOOCs

Evaluation technique | Reference | MOOC platform tested | Method | Findings
Automated testing | Iniesto et al. (2014); Iniesto & Rodrigo (2014) | Spanish and Portuguese MOOCs (UNED COMA, UAb iMOOC) | Automated tool and simulator; heuristic evaluation with eXaminator and aDesigner | MOOC platforms have serious accessibility problems.
Automated testing | Martín, Amado-Salvatierra, & Hilera (2016) | Coursera, edX, Udacity, MiriadaX, UNED COMA, Udemy, FutureLearn, & NovoEd | Accessibility tools such as eXaminator, FAE, & Tingtun | Among eight MOOC platforms, the level of accessibility (from high to low) was edX, FutureLearn, UNED COMA, NovoEd, Coursera, MiriadaX, Udemy, & Udacity.
Automated testing | Akgül (2018) | Turkish MOOCs | AChecker tool with 3-level priority accessibility checkpoints based on WCAG 2.0 | MOOCs failed to comply with WCAG 2.0.
Manual testing | Sanchez-Gordon & Luján-Mora (2013) | Coursera | Heuristic evaluation by experts with WCAG 2.0 | Coursera failed to reflect some accessibility needs of elderly people, based on WCAG 2.0 success criteria.
Manual testing | Al-Mouh et al. (2014) | Coursera | Heuristic evaluation with WCAG 2.0 using PC and mobile | Coursera failed to conform to WCAG 2.0.
Manual testing | Sanchez-Gordon & Luján-Mora (2016) | edX Studio, a course-authoring software | Expert evaluation of the conformance of edX Studio based on ATAG 2.0 | edX Studio does not comply with ATAG 2.0.
User testing | Bohnsack & Puhl (2014) | Five MOOC platforms (Coursera, Udacity, edX, Iversity, & OpenCourseWorld) | A blind person was asked to register for random courses on MOOC platforms | The user could not access most MOOC platforms or faced serious accessibility problems.
User testing | Al-Mouh et al. (2014) | Coursera | User evaluations by three visually impaired persons on the web | Coursera failed to conform to WCAG 2.0.

In addition, some studies have applied holistic evaluation methodologies to test the accessibility of MOOCs. Researchers from the Open University in the United Kingdom have developed conceptual models and a holistic evaluation framework to improve the accessibility levels of MOOCs.
These studies focused on designing a MOOC recommender system (Iniesto et al., 2014; Iniesto & Rodrigo, 2014); developing strategies for improving MOOC accessibility through the analysis of MOOC platforms, meta-information from user profiles, and educational content (Iniesto et al., 2016a, 2016b; Iniesto & Rodrigo, 2016); and implementing an audit instrument that combines expert-based heuristic evaluations with user-based evaluations (Iniesto, McAndrew, Minocha, & Coughlan, 2017a, 2017b). They also expanded approaches to examining accessibility to encompass the perspectives of various stakeholders by conducting semi-structured interviews with MOOC platform content managers, software designers, and MOOC accessibility researchers. The results revealed that it is necessary for providers to better understand the accessibility needs of potential learners with disabilities, and for developers to clearly understand legislative and organisational requirements. Furthermore, Sanderson, Chen, Bong, and Kessel (2016) emphasised the importance of the systematic study of MOOC accessibility from the perspective of instructors and evaluated the Canvas platform using a heuristic evaluation method with ATAG 2.0 by W3C. They recommended three ways to improve MOOC accessibility: support for efficient keyboard navigation, support for screen readers, and options for avoiding or correcting mistakes.

Universal design for learning (UDL)

One crucial issue with MOOC accessibility research is that most studies in this area have used technical web accessibility guidelines to conduct their evaluations, giving little consideration to teaching and learning activities. Certain MOOC platform websites may be accessible in a technical sense but may not fully support teaching and learning activities (e.g., participating in a discussion forum).
Hence, to further identify accessibility problems related to pedagogical features, we adopted universal design for learning (UDL) principles in this study.

Historically, UDL is rooted in the philosophy of universal design (UD) in the field of architecture. Originally proposed by Ron Mace at the Center for Universal Design, UD refers to designing environments and products usable by all people without the need for extra adaptation for special needs (McGuire, Scott, & Shaw, 2006). The original notion of UD encompassed seven principles: equitable use, flexibility in use, simple and intuitive use, perceptible information, tolerance for error, low physical effort, and size and space for approach and use. Because UD was conceived in the context of architectural design, these seven principles were structured around the design of physical objects and built environments that people use or occupy.

Initial efforts to integrate the notion of UD with learning shifted the focus from environment and product design to curriculum design that facilitates multiple means of presentation, expression, and engagement (Meyer & Rose, 1998). Since then, some adaptations of the original UDL framework have been made to better reflect the nature of inclusion and accessibility in educational contexts. Table 2 provides a chronological overview of three widely cited UDL frameworks. For instance, Scott, McGuire, and Shaw (2003) derived the idea of universal design for instruction (UDI) from the original UD framework and added two principles specific to learning situations, namely community of learners and support, and instructional climate. Later, Burgstahler and Cory (2008) proposed a UDI framework for higher education contexts with eight principles, focusing on the design of instructional materials and activities that facilitate the learning of individual learners with diverse physical, cognitive, and linguistic abilities.
They further emphasised the critical importance of flexibility and early integration in UDL, stating that UDL can be achieved by designing teaching and learning around flexible curricula and instructional methods from the beginning.

Table 2
A chronological review of UDL frameworks

UDL – Meyer & Rose (1998): (1) multiple means of presentation; (2) multiple means of expression; (3) multiple options for engagement.
UDI – Scott et al. (2003): (1) equitable use; (2) flexible use; (3) simple and intuitive; (4) perceptible information; (5) tolerance for error; (6) low physical and technical effort; (7) community of learners and support; (8) instructional climate.
UDI – Burgstahler & Cory (2008): (1) class climate; (2) interaction; (3) physical environments and products; (4) delivery method; (5) information resources and technology; (6) feedback; (7) assessment; (8) accommodation.

Methodology

Research rationale and purpose

Our main goal in this study was to investigate the needs and barriers that visually impaired learners face when learning with mobile MOOCs. While researchers have actively conducted and published studies on MOOC accessibility since 2014, few have attempted to identify specific requirements based on learner disability types. Regarding future research directions, Iniesto et al. (2016b) argued that it is crucial to build insights into the accessibility needs of learners with disabilities in order to determine how to adapt learning resources and content to accommodate a wide variety of disabilities. Indeed, studies need to identify accessibility problems and needs that are specific to individual disability types. The above literature review, however, revealed a lack of empirical user studies regarding the accessibility problems that learners with disabilities face while performing learning tasks in MOOCs. In addition, few studies have considered accessibility issues in mobile MOOC platforms.
Taking this background into consideration, our main aim was to identify the specific accessibility requirements of learners with visual impairments by exploring the accessibility problems they face when performing learning tasks in the context of mobile MOOCs.

Overall process

We conducted this research in two phases (Figure 1). In Phase I, we conducted a user study with three learners with visual impairments. They performed specific tasks related to using MOOCs on three representative MOOC platforms (i.e., edX, Coursera, and Khan Academy). In the interview session, we asked participants about current accessibility levels, main barriers, and the pedagogical usefulness of mobile MOOCs. In Phase II, Coursera was evaluated using a heuristic walkthrough method to further identify accessibility problems; we recruited five evaluators who had extensive experience designing mobile applications and knowledge of human–computer interaction. From these two phases, we derived key findings and recommendations for making MOOCs more accessible in terms of UDL principles.

Figure 1. Overall research process

Phase I: User study

Participants and apparatus

We recruited three university students with visual impairments (two males and one female) for the user study. Table 3 presents the profiles of the three users. Their ages ranged from 18 to 26 years, and all of them were legally blind. All of them also owned smartphones, with the average length of smartphone usage being 4.5 years. In addition, all participants indicated that they had experience taking online courses and that they had sufficient English competence to understand MOOCs in English. Finally, all three participants used their own smartphones – iPhones (iOS 9.0 or higher) with VoiceOver – to perform the tasks.
Using Safari as the web browser for the experiment, we conducted the user study on three representative MOOC platforms – Coursera, edX, and Khan Academy – and selected a suitable course for the participants to perform the predefined tasks (see Table 4).

Table 3
Participant profiles and MOOC platforms used for testing

ID | Gender | Age | Years of blindness | Years of smartphone usage | MOOC platform
P1 | Male | 18 | 18 | 4.5 | Coursera
P2 | Male | 26 | 10 | 4.0 | edX
P3 | Female | 24 | 13 | 5.0 | Khan Academy

Procedure

The user study consisted of three parts: a pre-interview, an experiment, and a post-interview. First of all, we asked participants to sign a consent form. In the pre-interview, we asked participants basic questions about their backgrounds, such as “Have you heard about MOOCs?” and “Have you taken any online courses?” Then, we explained the purpose and methods of the experiment. In the experiment, which was conducted in a quiet room, we asked the participants to perform nine predefined tasks (Table 4) that we deemed necessary for taking courses in MOOCs. We designed two types of tasks: general tasks and course-specific tasks. While general tasks were those that learners needed to perform regardless of the course (e.g., registration), course-specific tasks depended on the course that learners were taking and were more pedagogical than the general tasks. Participants had 5 minutes to complete each task. When performing each task, the participants were asked to think aloud, saying whatever they were thinking, doing, and feeling at each moment. Lastly, during the post-interviews, we asked the participants the following questions to further examine their overall experiences and feelings about using MOOCs: (1) What are the pros and cons of MOOCs in your experience? (2) How do you evaluate the usefulness of MOOCs from both accessibility and learning perspectives?
(3) How can MOOCs be improved to increase accessibility for learners with disabilities? We transcribed and analysed the interview responses based on the UDI principles developed by Scott et al. (2003).

Table 4
Predefined tasks in the user study

Task type | Task | Description
General task | Register | Sign up; access a profile menu and check profile features
General task | Sign in and out | Sign in and out
General task | Help centre | Access a help centre
Course-specific task | Search & enrol | Search for a given course and check course information; enrol in the course
Course-specific task | Course home | Access a course home page; check a course announcement
Course-specific task | Lecture | Play and pause a lecture video clip; adjust playback speed
Course-specific task | Quiz | Submit an exercise or a quiz
Course-specific task | Discussion forum | Write, edit, and delete a post; write, edit, and delete a reply
Course-specific task | Un-enrol | Un-enrol from the course

Phase II: Heuristic walkthrough

Purpose

The heuristic walkthrough is a hybrid usability method that combines the advantages of heuristic evaluation and cognitive walkthrough, two popular inspection-based evaluation techniques (Sears, 1997). In a heuristic evaluation, evaluators use usability heuristics to evaluate any aspect of an interface and document any problems that violate the heuristics (Nielsen & Molich, 1990). A potential disadvantage of heuristic evaluation is that novice evaluators may identify large numbers of false positives that would not actually hinder users. Meanwhile, a cognitive walkthrough is a more structured and task-specific method in which evaluators work through a series of tasks from a user’s perspective. The main assumption underlying the heuristic walkthrough method is that the rigour of heuristic evaluation, which tends to be prone to false positives, can be enhanced by a task-based review in the cognitive walkthrough process (Sears, 1997). The heuristic walkthrough method is a two-pass process.
The first pass involves a cognitive walkthrough in which evaluators work through a set of tasks with thought-provoking questions to identify problems. The second pass is a free-form evaluation in which evaluators use heuristics to identify additional problems.

Evaluators and apparatus

Although the proper number of evaluators remains controversial, researchers generally agree that at least three to five evaluators are required to identify serious usability problems (Nielsen, 1993). The five evaluators in this study had sufficient knowledge of and experience with usability engineering and mobile computing. Evaluators used iPhones (iOS 9.3.1) with VoiceOver and Safari as the web browser. We conducted the heuristic walkthrough on a course on the Coursera platform, since the evaluators needed to use a common platform to compile evaluation scores and suggestions.

Procedure

This phase of the research included three parts: a pre-evaluation session, a task-oriented evaluation, and a free-form evaluation. First, we conducted the pre-evaluation session for about 30 minutes to help the evaluators become familiar with using the screen reader (iPhone’s VoiceOver). They were blindfolded and asked to use VoiceOver on some frequently accessed websites. Next, in the task-oriented evaluation, the evaluators performed the nine tasks in Table 4, the same tasks that the participants with visual impairments performed in the Phase I study. We asked evaluators to perform a cognitive walkthrough for each task from the perspective of visually impaired users. For each step in the sequence, the evaluators recorded accessibility and usability problems in the form of success and failure stories using the following thought-provoking questions (Sears, 1997; Wharton, Rieman, Lewis, & Polson, 1994):

• Will users know what they need to do next?
• Will users notice that there is a control available that will allow them to accomplish the next part of their task?
• Once users find a control, will they know how to use it?
• If users perform the correct action, will they see that progress is being made towards completing the task?

Then, they evaluated each problem using a 4-point severity rating scale (Rubin & Chisnell, 2008):

• 1 (Irritant): The problem occurs only intermittently and can be circumvented easily.
• 2 (Moderate): The user will have to exert some moderate effort to get around the problem.
• 3 (Severe): The user will be severely limited in his or her ability to overcome the problem.
• 4 (Unusable): The user either will be unable or unwilling to use a particular part of the product.

Lastly, in the free-form evaluation, the evaluators were free to explore any aspect of the MOOC platform using WCAG 2.0 as heuristics (Caldwell, Cooper, Reid, & Vanderheiden, n.d.); WCAG 2.0 is an international technical standard for making web content accessible, comprising four principles (i.e., perceivable, operable, understandable, and robust), 12 guidelines, and 61 success criteria. Each evaluator compiled problems where the MOOC platform violated the heuristics. After documenting accessibility problems, we asked the evaluators to rate the severity of each problem on the 4-point severity rating scale. Among the three tested MOOC platforms, Coursera and edX display accessibility statements that their websites conform to WCAG 2.0 level AA, but Khan Academy does not have an explicit accessibility statement available.

Results

Phase I: User study results

Table 5 shows the success scores for each participant across the nine tasks. Overall, the participants struggled to complete the given tasks. All three participants failed to complete seven of the nine tasks within the given time. Each of the two remaining tasks – accessing a course home page and taking a quiz – was completed by two participants.
Table 5
Task completion in the user study

Task list | P1 (Coursera) | P2 (edX) | P3 (Khan Academy) | Success score
Register & profile | X | X | X | 0
Sign in and out | X | X | X | 0
Help centre | X | X | X | 0
Search & enrol | X | X | X | 0
Course home | O | X | O | 2
Lecture | X | X | X | 0
Quiz | X | O | O | 2
Discussion forum | X | X | X | 0
Un-enrol | X | X | X | 0

The qualitative data (e.g., think-aloud scripts and interview data) were content-analysed based on the seven UDI principles developed by Scott et al. (2003). Among the seven principles, the main usability problems and requirements concerned equitable use, perceptible information, and low technical and physical effort.

First, the main equitable use problems were related to language and translation. Participants wanted an auto-translation feature in the form of translated text files. P2 and P3 commented that dubbing of textual information would be desirable. The interviews confirmed that the participants would like to download lectures and subtitles. Although edX had a menu for downloading lectures and their subtitles, the menu was located under the Interactive transcript item, which most participants could not find: “It would be better if all lectures and translated scripts were downloadable. [After checking the position of the download button] Oh, I think its location is too far from the video” (P2).

The participants also identified several perceptible information problems related to the alternative texts for some functional buttons. The participants could not intuitively perceive the exact functions of certain buttons. For instance, all participants failed in the Register & profile and Search & enrol tasks because the buttons were hidden in a dropdown menu: “I didn’t know that there was a toggle menu at first” (P1). Only one participant succeeded in accessing the Help Centre, as the Help button was located in a submenu in the footer.
In addition, no participant succeeded in participating in the discussion forum, because the screen reader announced the Write button merely as "button" and the text field as a "plus button", which made it difficult for them to intuitively understand the buttons' functions. On the other hand, alternative texts for the images and symbols in the content were well provided.

Concerning the principle of low technical and physical effort, we observed that the participants had to listen to all the web page elements repeatedly with the screen reader because the category information about courses in the test platform was usually positioned at the top of each web page. The participants thus had to spend considerable time navigating through repetitive content, which easily discouraged them: "Same information about categories is presented over and over again every time the web-page is loaded. It is really annoying" (P2). The users were also irritated by the illogical stream of focus, which resulted from structural problems with the web pages. For example, when a participant clicked on a button, the focus was forcibly directed to the top of the screen in the mobile web page.

Overall, it appeared that users with visual impairments would not be able to enjoy the benefits of the MOOCs' services in the mobile environment, and the participants themselves confirmed this: "If I use this site from today, it would be hard without someone's help" (P2); "Other people with visual impairment will give up on using this platform because the structure is very complex" (P3).

Phase II: Heuristic walkthrough results

In the task-oriented evaluation, the evaluators used four thought-provoking questions to identify 10 failure stories, summarised with their respective severity scores in Table 6. Overall, the severity scores of failure stories 4, 5, 6, 9, and 10 were relatively high.
The severity means of seven failure stories were at or above 3 (severe), indicating that users would be severely limited in their ability to perform the given tasks. Failure stories 4 and 5 concerned the lack of text alternatives for non-text content such as images. Failure story 6 concerned the activation of toggle navigation through VoiceOver. Failure stories 9 and 10 concerned the absence of feedback about where new content was located.

Table 6
Failure stories and severity ratings in the task-oriented evaluation

| Failure story | Description | Severity (Mean) |
|---|---|---|
| 1 | Users easily miss "Menu" functions since the meaning of the term is too general. | 2.2 |
| 2 | There is no submission button after filling in a text field in the Search area. Searching starts only when users press the return key after typing text. | 1.6 |
| 3 | Users may not know the type of content they are accessing (e.g., video clip or reading material) until they move to the next page. | 2.0 |
| 4 | It is difficult for users to notice the button for writing new text because it is labelled with an image without a text alternative. | 3.8 |
| 5 | It is difficult for users to notice the course settings button because it is labelled with an image without a text alternative. | 3.6 |
| 6 | Users do not know how to activate toggle navigation. | 3.6 |
| 7 | Users do not receive feedback on the current state of toggle navigation – whether it is open or closed. | 3.4 |
| 8 | Users do not receive feedback on the current state of a "menu" link – whether it is open or closed. | 3.0 |
| 9 | When users click on the popup button, the popup code is created at the bottom of the page, and it is difficult to find the newly added content. | 3.6 |
| 10 | When users click on the reply button, the content is automatically created without reloading the web page. However, there is no feedback that signals the creation of new content. | 3.6 |

In the free-form evaluation, the evaluators used the four principles (perceivable, operable, understandable, and robust), 12 guidelines, and 61 success criteria of WCAG 2.0 (W3C, 2008) to evaluate how serious each violation is for learners with visual impairments taking MOOCs. Table 7 presents the severity ratings for the respective success criteria. The evaluators considered criterion 1.1.1 (level A) the most serious and criterion 1.2.7 (level AAA) the least serious.

Concerning the perceivable principle, the evaluators identified 26 violations of success criterion 1.1.1 (level A), which requires the provision of text alternatives for any non-text content, such as controls, input, and time-based media (guideline 1.1). The evaluators perceived criterion 1.1.1 as the most serious issue, one that renders users unable or unwilling to use the MOOCs. Next, although time-based media include both pre-recorded and live content, the MOOC platforms in this study provided only pre-recorded content (e.g., video lectures). Success criteria 1.2.3 (level A), 1.2.5 (level AA), and 1.2.7 (level AAA), which concern the provision of media alternatives, audio descriptions, and extended audio descriptions respectively, were not satisfied. The severity ratings were 2.8 for criterion 1.2.3, 2.2 for criterion 1.2.5, and 1.6 for criterion 1.2.7. Interestingly, the evaluators rated criterion 1.2.7 the least serious even though its conformance level is the highest. The evaluators may have perceived that this criterion applies infrequently and that not all websites need to deal with it; indeed, WCAG 2.0 indicates that, due to the difficulty of achieving level AAA, conformance across entire sites is not required as a general policy (W3C, 2008).
We suggest that when pictures and diagrams appear in pre-recorded lecture videos, audio descriptions should be provided to help visually impaired learners understand their meaning via screen readers. To satisfy criterion 1.2.7 (level AAA), an extended version of the video should be prepared to allow sufficient time for audio descriptions of the video content.

Regarding the operable principle, guideline 2.4 specifies that a platform should offer ways to help users navigate, find content, and determine their location. The Coursera platform, however, did not meet success criterion 2.4.1 (level A), since it did not provide ways to bypass elements repeated on multiple web pages. Likewise, the platform did not satisfy success criterion 2.4.3 (level A), since it was difficult for screen readers to identify the focus order in the navigation sequence. The severity ratings indicate that the evaluators perceived that users need to exert some moderate effort to get around the problems related to these two criteria.

Under the understandable principle, guideline 3.1 specifies that a platform should make text content readable and understandable. The default language of the web pages in Coursera was not programmatically determined (success criterion 3.1.1, level A). In addition, the predictable guideline (3.2) specifies that platforms operate in predictable ways. The Coursera platform, however, did not satisfy success criterion 3.2.1 (level A), because the screen reader focus was forcibly moved to the top of the web page when specific links were opened. The evaluators rated criterion 3.2.1 (M = 3.6) as more serious than criterion 3.1.1 (M = 2.4).

Concerning the robust principle, guideline 4.1 specifies that a platform should maximise compatibility with current and future user agents, including assistive technologies. Success criterion 4.1.1 (level A) was not satisfied, since several elements with the same ID were found in the HTML document.
The mean severity score for this criterion was 2.4, indicating that users have to exert some moderate effort to get around this problem.

Table 7
Severity ratings according to WCAG 2.0 (W3C, 2008) in the free-form evaluation

| Principle | Guideline | Success criterion (conformance level) | Severity (Mean) |
|---|---|---|---|
| Perceivable | Text alternatives | 1.1.1 (A) All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for the situations listed below. | 4.0 |
| | Time-based media | 1.2.3 (A) An alternative for time-based media or audio description of the prerecorded video content is provided for synchronised media, except when the media is a media alternative for text and is clearly labeled as such. | 2.8 |
| | | 1.2.5 (AA) Audio description is provided for all prerecorded video content in synchronised media. | 2.2 |
| | | 1.2.7 (AAA) Where pauses in foreground audio are insufficient to allow audio descriptions to convey the sense of the video, extended audio description is provided for all prerecorded video content in synchronised media. | 1.6 |
| Operable | Navigable | 2.4.1 (A) A mechanism is available to bypass blocks of content that are repeated on multiple Web pages. | 2.4 |
| | | 2.4.3 (A) If a Web page can be navigated sequentially and the navigation sequences affect meaning or operation, focusable components receive focus in an order that preserves meaning and operability. | 2.8 |
| Understandable | Readable | 3.1.1 (A) The default human language of each Web page can be programmatically determined. | 2.4 |
| | Predictable | 3.2.1 (A) When any component receives focus, it does not initiate a change of context. | 3.6 |
| Robust | Compatible | 4.1.1 (A) In content implemented using markup languages, elements have complete start and end tags, elements are nested according to their specifications, elements do not contain duplicate attributes, and IDs are unique, except where the specifications allow these features. | 2.4 |
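Two of the level A violations identified here have straightforward markup-level remedies. As a minimal sketch (element names and ids are hypothetical, not taken from the evaluated platforms), a page can satisfy criterion 3.1.1 by declaring its default language and criterion 4.1.1 by keeping ids unique:

```html
<!-- Criterion 3.1.1: the lang attribute lets a screen reader pick the
     correct speech synthesiser voice for the page's default language -->
<html lang="en">
  <body>
    <!-- Criterion 4.1.1: each id value appears exactly once; duplicate
         ids can cause screen readers to skip or misread elements -->
    <nav id="course-navigation">...</nav>
    <main id="main-content">...</main>
  </body>
</html>
```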
Discussion

Recommendations on accessible MOOCs

In this study, we attempted to identify the learning needs of learners with visual impairments in mobile MOOCs, hoping to contribute to the design of MOOCs that are more accessible. We conducted the research in two phases: a user study (Phase I) and a heuristic walkthrough (Phase II). In this section, we summarise our key findings and derive recommendations for improving the accessibility of MOOCs based on the UDI principles (Scott et al., 2003).

First, we found that MOOC platforms have both pros and cons in supporting the equitable use principle for learners with visual impairments. Serious accessibility issues in the translation and language selection functions of MOOC platforms limit equitable use. Thus, we recommend providing auto-translation and downloadable lectures with subtitles to make MOOC platforms more accessible to learners with visual impairments, as well as to learners using languages other than English and/or those with poor network connections. In terms of their overall perception of MOOCs, the participants in the user study shared a positive view of MOOCs as useful platforms that provide them with wider learning opportunities, especially through mobile phones, with which they can access learning content anytime and anywhere.

Second, another significant barrier that prevents learners with visual impairments from fully participating in MOOC activities concerned screen readers' inability to read information in dropdown menus and the lack of alternative texts for non-text content and time-based media. These issues violate the simple and intuitive and perceptible information UDI principles. A dropdown menu is a toggle navigation element that presents users with a predefined list of items only when the mouse rolls over it. Since VoiceOver reads this menu as a single button, it is difficult for learners with visual impairments to detect it as a toggle navigation menu.
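A dropdown of this kind can expose its role and current state to VoiceOver through WAI-ARIA attributes. The following is a minimal sketch, not the markup of any evaluated platform (all ids, labels, and file names are hypothetical):

```html
<!-- Without a text alternative and state information, VoiceOver
     announces this control only as "button" -->
<button aria-label="Course menu"
        aria-haspopup="true"
        aria-expanded="false"
        aria-controls="course-menu">
  <img src="menu-icon.png" alt="" /> <!-- decorative icon: empty alt -->
</button>
<!-- The list stays hidden until a script opens it and flips
     aria-expanded to "true", so users hear the menu's current state -->
<ul id="course-menu" hidden>
  <li><a href="/profile">Register &amp; profile</a></li>
  <li><a href="/help">Help centre</a></li>
</ul>
```

With aria-expanded kept in sync by the opening script, a screen reader can announce the control as, for example, "Course menu, button, collapsed", which addresses the missing state feedback observed in failure stories 6–8.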
Furthermore, several tasks failed mainly due to the absence of alternative texts for some functional buttons. Accordingly, we recommend that efforts be devoted to providing both alternative texts for non-text media content, especially the video lectures on which most MOOCs rely, and information on the current state of hidden menu elements.

Third, mobile learning should be designed to require low physical and technical effort. Ideally, the information needed by visually impaired students who use screen readers would be presented on a single page. This study found that participants had no choice but to listen to repetitive information. To avoid this, the use of bypass buttons should be considered. Bypass buttons are objects accessed only by screen readers that can perform specific functions. For example, edX provides a practical feature of this kind: using a bypass button, users can skip the header content and go directly to the main content. By significantly reducing the processing of repeated information, bypass buttons would enable learners to focus on learning activities rather than on dealing with technical issues.

Lastly, although this study mainly focused on the technical dimensions of MOOC accessibility, we suggest that supporting the platforms' pedagogical dimensions, including the UDI principles of community of learners and support and instructional climate, is of critical importance. To make MOOCs more accessible, we recommend facilitating community support for learning through the development of support services and appropriate tools. cMOOCs (connectivist MOOCs) emphasise the connectivist approach to learning, in which learners collaborate through learning communities and social media (Milligan, Littlejohn, & Margaryan, 2013). The current study, however, revealed that participants with visual impairments had difficulties participating in even simple activities such as writing posts in discussion forums.
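The bypass-button recommendation above, together with the missing feedback for dynamically created content observed in the heuristic walkthrough (failure stories 9 and 10), can be illustrated in markup. This is a minimal sketch of a generic course page (ids and class names are hypothetical), not the implementation used by edX:

```html
<!-- Bypass link: placed first in the focus order and usually hidden
     off-screen with CSS; it satisfies WCAG 2.0 criterion 2.4.1 by
     letting screen reader users jump over the repeated navigation -->
<a class="skip-link" href="#main-content">Skip to main content</a>

<nav><!-- repeated course category menu --></nav>

<main id="main-content">
  <!-- Live region: content inserted here without a page reload
       (e.g., a new forum reply) is announced automatically -->
  <div id="forum-replies" aria-live="polite"></div>
</main>
```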
Elias (2011) proposed UDI principles specific to mobile learning, including strategies for supporting learning communities and instructional climates conducive to learning for students with disabilities. These strategies include having study groups, providing links to support services, communicating regularly with students, and making one-to-one consultation available. We found, however, that these strategies may not be workable in current MOOC platforms due to their massive and open nature. Nevertheless, we recommend that some support structures be considered for learners with disabilities, such as providing multiple means of participation through non-text modes (e.g., video, audio), organising study groups where students with similar special needs can help each other, and offering a support service dedicated to students with disabilities, through which they can easily communicate and report any technical and learning-related issues they encounter on the platforms.

With more intelligent functions from learning analytics and big data being integrated into MOOC platforms (e.g., Guo, Kim, & Rubin, 2014), we predict that the next generation of MOOC platforms will be able to reduce the level of digital inequity among learners with disabilities by providing more customisable functions and content. For example, it is possible to send customised notifications using various smartphone functions or learning analytics techniques (Park, Cha, & Lee, 2016).

Limitations and directions for future research

Despite the significance of our findings, this study had some limitations that future research should address. First, we evaluated the accessibility of the mobile environment based on mobile web page responsiveness. Further research should be undertaken to also evaluate the accessibility of mobile applications.
In addition, future research should consider the variations and complexity that exist in MOOC platforms, users' skill levels, and user agent applications (e.g., VoiceOver). Second, the participants had to complete predefined tasks in a single session, which might have limited their ability to identify accessibility problems related to sustained usage patterns at deeper levels. We suggest that future researchers conduct prolonged studies in which participants take one or more MOOCs over longer periods, and that they also track users' clickstream data and logs to conduct learning analytics. Third, this study focused only on learners with visual impairments and did not include other types of disabilities. Hence, caution should be exercised when generalising our findings to other contexts involving learners with disabilities. The small number of participants is another limitation. We plan to expand the study to include more learners with other types of disabilities, such as hearing impairments and physical disabilities that require the help of assistive technologies. Lastly, it should be noted that while this study utilised WCAG 2.0, WCAG 2.1 (published in 2018) includes several criteria specific to mobile accessibility. WCAG 2.1 builds on WCAG 2.0 by improving accessibility for users with cognitive or learning disabilities and users with low vision, and by adding new accessibility requirements related to mobile devices (W3C, 2018). WCAG 2.1 added one guideline and 17 success criteria. At the time of the present study, only WCAG 2.0 was available. Since criteria more specific to mobile accessibility are available in WCAG 2.1, future research should utilise it to better evaluate mobile accessibility issues in MOOCs.

Conclusion

The rapid expansion of mobile technologies and devices has fuelled a growing recognition that learning can take place outside of classrooms and formal learning contexts (Sharples, 2015).
MOOCs were launched with the great promise of providing open and accessible learning opportunities to a wide range of learners. Despite such promises, however, this study reveals that current MOOC platforms do not address some critical issues of accessibility and UD principles for learners with disabilities. Recognising that MOOCs, as part of the open education movement, are likely to be a major teaching and learning trend for the next decade, this study highlights ways to make open courses and resources more accessible to learners with disabilities. We believe that this study provides empirical evidence of certain user requirements, which will aid in designing more inclusive and accessible MOOCs.

Acknowledgements

This paper is an extension of the research work presented at the HCI Korea 2016 Conference on Human Factors in Computing Systems (Park, Kim, & So, 2016). We would like to thank all the participants and the Center for Students with Disabilities at Daegu University, who made this research possible.

References

Abou-Zahra, S. (2008). Web accessibility evaluation. In S. Harper & Y. Yesilada (Eds.), Web accessibility (pp. 79–106). London, United Kingdom: Springer. https://doi.org/10.1007/978-1-84800-050-6_7

Akgül, Y. (2018). Accessibility evaluation of MOOCs' websites of Turkey. Journal of Life Economics, 5(4), 23–36. https://doi.org/10.15637/jlecon.259

Akram, M., & Sulaiman, R. B. (2017). A systematic literature review to determine the web accessibility issues in Saudi Arabian university and government websites for disable people. International Journal of Advanced Computer Science and Applications, 8(6), 321–329. https://doi.org/10.14569/IJACSA.2017.080642

Al-Mouh, N. A., Al-Khalifa, A. S., & Al-Khalifa, H. S. (2014). A first look into MOOCs accessibility: The case of Coursera. In K. Miesenberger, D. Fels, D. Archambault, P. Peňáz, & W. Zagler (Eds.), Computers helping people with special needs: Proceedings of the International Conference on Computers for Handicapped Persons 2014 (pp. 145–152). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-08596-8_22

Atiaja, L., & Guerero-Proenza, R. S. (2016). MOOCs: Origin, characterization, principal problems and challenges in higher education. Journal of e-Learning and Knowledge Society, 12(1), 65–76. Retrieved from https://www.learntechlib.org/p/171428/

Bohnsack, M., & Puhl, S. (2014). Accessibility of MOOCs. In K. Miesenberger, D. Fels, D. Archambault, P. Peňáz, & W. Zagler (Eds.), Computers helping people with special needs: Proceedings of the International Conference on Computers for Handicapped Persons 2014 (pp. 141–144). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-08596-8_21

Burgstahler, S., & Cory, R. (2008). Universal design in higher education: From principles to practice. Boston, MA: Harvard Education Press.

Caldwell, B., Cooper, M., Reid, L. G., & Vanderheiden, G. (Eds.). (n.d.). Web content accessibility guidelines (WCAG) 2.0. Retrieved from https://www.w3.org/TR/WCAG20/

Calle-Jimenez, T., Sanchez-Gordon, S., & Luján-Mora, S. (2014). Web accessibility evaluation of massive open online courses on Geographical Information Systems. In 2014 IEEE Global Engineering Education Conference (pp. 680–686). Piscataway, NJ: IEEE. https://doi.org/10.1109/educon.2014.6826167

Cooper, M., Sloan, D., Kelly, B., & Lewthwaite, S. (2012). A challenge to web accessibility metrics and guidelines: Putting people and processes first. In Proceedings of the International Cross-disciplinary Conference on Web Accessibility (Article 20). New York, NY: ACM. https://doi.org/10.1145/2207016.2207028

Coughlan, T., Ullmann, T. D., & Lister, K. (2017). Understanding accessibility as a process through the analysis of feedback from disabled students. In Proceedings of the 14th Web for All Conference on the Future of Accessible Work (Article 14). New York, NY: ACM. https://doi.org/10.1145/3058555.3058561

DAISY Consortium. (n.d.). DAISY Format 3. Retrieved from https://daisy.org/activities/standards/daisy/daisy-3/

Elias, T. (2011). Universal instructional design principles for mobile learning. The International Review of Research in Open and Distributed Learning, 12(2), 143–156. https://doi.org/10.19173/irrodl.v12i2.965

Foley, A., & Ferri, B. A. (2012). Technology for people, not disabilities: Ensuring access and inclusion. Journal of Research in Special Educational Needs, 12(4), 192–200. https://doi.org/10.1111/j.1471-3802.2011.01230.x

Guglielman, E. (2010). E-learning and disability: Accessibility as a contributor to inclusion. In K. Maillet, R. Klamma, T. Klobucar, D. Gillet, & M. Joubert (Eds.), Proceedings of the 5th Doctoral Consortium at the European Conference on Technology Enhanced Learning (pp. 31–36). Barcelona, Spain: CEUR-WS. Retrieved from http://ceur-ws.org/Vol-709/paper06.pdf

Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the first ACM Conference on Learning@Scale (pp. 41–50). New York, NY: ACM. https://doi.org/10.1145/2556325.2566239

Henry, S. L., Abou-Zahra, S., & Brewer, J. (2014). The role of accessibility in a universal web. In Proceedings of the 11th Web for All Conference (Article 17). New York, NY: ACM. https://doi.org/10.1145/2596695.2596719

IMS Global. (2012). IMS Global AccessforAll (AfA) Primer. Retrieved from https://www.imsglobal.org/accessibility/afav3p0pd/AfAv3p0_SpecPrimer_v1p0pd.html

Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2016a). The current state of accessibility of MOOCs: What are the next steps? In Proceedings of Open Education Global Conference 2016 (pp. 8–14). Retrieved from http://oro.open.ac.uk/id/eprint/46070

Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2016b). Accessibility of MOOCs: Understanding the provider perspective. Journal of Interactive Media in Education, 2016(1). https://doi.org/10.5334/jime.430

Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2017a). What are the expectations of disabled learners when participating in a MOOC? In Proceedings of the Fourth ACM Conference on Learning@Scale (pp. 225–228). New York, NY: ACM. https://doi.org/10.1145/3051457.3053991

Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2017b). Auditing the accessibility of massive open online courses (MOOCs). In Proceedings of the 14th Congress of the Association for the Advancement of Assistive Technology in Europe. Retrieved from http://oro.open.ac.uk/id/eprint/50394

Iniesto, F., & Rodrigo, C. (2014). Accessibility assessment of MOOC platforms in Spanish: UNED COMA, COLMENIA and Miriada X. In J.-L. Sierra-Rodríguez, J.-M. Dodero-Beardo, & D. Burgos (Eds.), Proceedings of the International Symposium on Computers in Education (pp. 169–172). Piscataway, NJ: IEEE. https://doi.org/10.1109/siie.2014.7017724

Iniesto, F., & Rodrigo, C. (2016). Strategies for improving the level of accessibility in the design of MOOC-based learning services. In Proceedings of the International Symposium on Computers in Education (SIIE) (pp. 1–6). Piscataway, NJ: IEEE. https://doi.org/10.1109/SIIE.2016.7751841

Iniesto, F., Rodrigo, C., & Moreira Teixeira, A. (2014). Accessibility analysis in MOOC platforms. A case study: UNED COMA and UAB iMOOC. In Proceedings of the International Conference on Quality and Accessibility of Virtual Learning (CAFVIR) (pp. 545–550). Retrieved from http://oro.open.ac.uk/id/eprint/45192

Jemni, M., Laabidi, M., & Ayed, L. J. B. (2014). Accessible e-learning for students with disabilities: From the design to the implementation. In R. Huang, Kinshuk, & N. S. Chen (Eds.), The new development of technology enhanced learning: Lecture notes in educational technology (pp. 53–74). Heidelberg, Germany: Springer. https://doi.org/10.1007/978-3-642-38291-8_4

Martín, J. L., Amado-Salvatierra, H. R., & Hilera, J. R. (2016). MOOCs for all: Evaluating the accessibility of top MOOC platforms. The International Journal of Engineering Education, 32(5), 2374–2383. Retrieved from https://dialnet.unirioja.es/servlet/articulo?codigo=6919332

McGuire, J. M., Scott, S. S., & Shaw, S. F. (2006). Universal design and its applications in educational environments. Remedial and Special Education, 27(3), 166–175. https://doi.org/10.1177/07419325060270030501

Meyer, A., & Rose, D. H. (1998). Learning to read in the computer age. Cambridge, MA: Brookline.

Milligan, C., Littlejohn, A., & Margaryan, A. (2013). Patterns of engagement in connectivist MOOCs. MERLOT Journal of Online Learning and Teaching, 9(2), 149–159. Retrieved from https://jolt.merlot.org/vol9no2/milligan_0613.pdf

Nielsen, J. (1993). Usability engineering. San Francisco, CA: Morgan Kaufmann.

Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. In J. Carrasco Chew & J. Whiteside (Eds.), Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 249–256). New York, NY: ACM. https://doi.org/10.1145/97243.97281

Park, K., Kim, H. J., & So, H. J. (2016). Are massive open online courses (MOOCs) really open to everyone? A study of accessibility evaluation from the perspective of universal design for learning. In HCI Korea: Proceedings of the Conference on Human Factors in Computing Systems (pp. 29–36). Seoul, South Korea: Hanbit Media. https://doi.org/10.17210/hcik.2016.01.29

Park, T. J., Cha, H. J., & Lee, G. Y. (2016). A study on design guidelines of learning analytics to facilitate self-regulated learning in MOOCs. Educational Technology International, 17(1), 117–150. Retrieved from http://kset.or.kr/eti_ojs/index.php/instruction/article/view/61/pdf_19

Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design and conduct effective tests. Indianapolis, IN: John Wiley & Sons.

Sanchez-Gordon, S., & Luján-Mora, S. (2013). Web accessibility of MOOCs for elderly students. In Proceedings of the International Conference on Information Technology Based Higher Education and Training (pp. 1–6). Piscataway, NJ: IEEE. https://doi.org/10.1109/ITHET.2013.6671024

Sanchez-Gordon, S., & Luján-Mora, S. (2017). Research challenges in accessible MOOCs: A systematic literature review 2008–2016. Universal Access in the Information Society, 17(4), 775–789. https://doi.org/10.1007/s10209-017-0531-2

Sanderson, N. C., Chen, W., Bong, W. K., & Kessel, S. (2016). The accessibility of MOOC platforms from instructors' perspective. In M. Antona & C. Stephanidis (Eds.), Universal access in human-computer interaction: Users and context diversity (pp. 124–134). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-40238-3_13

Scanlon, E., McAndrew, P., & O'Shea, T. (2015). Designing for educational technology to enhance the experience of learners in distance education: How open educational resources, learning design and MOOCs are influencing learning. Journal of Interactive Media in Education, 2015(1), 1–9. https://doi.org/10.5334/jime.al

Scott, S. S., McGuire, J. M., & Shaw, S. F. (2003). Universal design for instruction. Remedial and Special Education, 24(6), 369–379. https://doi.org/10.1177/07419325030240060801

Sears, A. (1997). Heuristic walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interaction, 9(3), 213–234. https://doi.org/10.1207/s15327590ijhc0903_2

Selwyn, N. (2016). Is technology good for education? Cambridge, United Kingdom: Polity Press.

Sharples, M. (2015). Seamless learning despite context. In L. H. Wong, M. Milrad, & M. Specht (Eds.), Seamless learning in the age of mobile connectivity (pp. 41–55). Singapore: Springer. https://doi.org/10.1007/978-981-287-113-8_2

Wharton, C., Rieman, J., Lewis, C., & Polson, P. (1994). The cognitive walkthrough method: A practitioner's guide. In J. Nielsen & R. L. Mack (Eds.), Usability inspection methods (pp. 105–140). New York, NY: John Wiley & Sons. Retrieved from https://dl.acm.org/citation.cfm?id=189214

World Wide Web Consortium. (2008). Web Content Accessibility Guidelines (WCAG) 2.0. Retrieved from https://www.w3.org/TR/2008/REC-WCAG20-20081211/

World Wide Web Consortium. (2018). Web Content Accessibility Guidelines (WCAG) 2.1. Retrieved from https://www.w3.org/TR/WCAG21/

World Wide Web Consortium. (n.d.). Accessibility. Retrieved from http://www.w3.org/standards/webdesign/accessibility

Corresponding author: Hyo-Jeong So, hyojeongso@ewha.ac.kr

Copyright: Articles published in the Australasian Journal of Educational Technology (AJET) are available under a Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET right of first publication under CC BY-NC-ND 4.0.

Please cite as: Park, K., So, H.-J., & Cha, H. (2019). Digital equity and accessible MOOCs: Accessibility evaluations of mobile MOOCs for learners with visual impairments. Australasian Journal of Educational Technology, 35(6), 48–63. https://doi.org/10.14742/ajet.5521