International Journal of Interactive Mobile Technologies (iJIM) – eISSN: 1865-7923 – Vol. 13, No. 8, 2019

User-Centred Design in Content Management System Development: The Case of EMasters

https://doi.org/10.3991/ijim.v13i08.10727

Jelena Nakić (*), University of Split, Split, Croatia, jelena.nakic@pmfst.hr
Antonija Burčul, Science High School, Split, Croatia
Nikola Marangunić, University of Split, Split, Croatia

Abstract—Involving users in the design and development of an interactive product is crucial to achieving a high level of usability. Content management systems have two categories of users, content creators and content consumers, and designers of these systems have to consider the needs of both groups. In the design of interactive learning systems, special attention has to be given to the process of learning: a functional, accessible and usable interface has to serve the purpose of knowledge acquisition. Designing for mobile learning brings additional challenges due to the small screens of mobile devices. The paper describes the application of user-centred design in the development of a simple content management system for learning called EMasters. The aim of EMasters is to enable teachers to easily create and organize courses that are delivered to students to facilitate web-based and mobile learning. Following the user-centred design approach, teachers and students were involved in an iterative process of design, implementation and evaluation of EMasters. The evaluation study used complementary methods and provided quantitative and qualitative feedback. The usability score reached a good level and guidelines for redesign of the system interface are drawn. Based on the obtained results, the proposed framework is confirmed to be applicable to user-centred design of content management systems in general.
In addition, directions for adjusting the framework to specific cases are provided.

Keywords—Mobile learning, user-centred design, user interface, content management system, usability, rapid prototyping, user testing

1 Introduction

The concept of learning anywhere, anytime is rather old. If we consider textbooks as the first mobile learning devices, as suggested by Searson [1], then learning anywhere and anytime begins with students reading textbooks on the bus on their way to school. Contemporary digital learning environments support this concept since they are typically web-based and therefore available on any device connected to the Internet. Beyond learning itself, they bring new affordances such as online testing, communication with teachers and collaboration with peers. The pervasive ownership of smartphones, tablets and other mobile devices allows learners to move freely while learning and to communicate with peers faster than ever before. Furthermore, the emerging paradigm of ubiquitous learning adds the dimension of "anyhow" to the plethora of benefits of mobile learning [2], [3], [4]. Recent trends in digital education also include adaptive learning systems, serious games and virtual environments. Research in intelligent and adaptive user interfaces has enabled the development of personalized learning environments that address individual differences and deliver individually tailored content and learning paths through a course [4], [5], [6]. Introducing gamification elements into digital learning resources keeps students more focused and engaged in learning [7].
In simulated game-like environments such as 3D virtual worlds, serious games and virtual reality environments, learners can face realistic situations and learn directly from these first-person experiences [8], [9], [10], [11]. The common goal of all technological interventions in learning, including the above-mentioned trends, is to increase the ease of use of learning applications and to improve learning outcomes. These applications are usually highly interactive, which keeps focus and motivation on learning. All these desirable features result from applying the principles of interaction design and learner-centred design [12]. Considering mobile learning specifically, today's learning applications compete with the prevalent use of mobile phones for chatting, music, videos and social networks. Learning is no longer a single process closed in a controlled environment such as a virtual world or virtual laboratory [9], [11]. Instead, learning activities are frequently interrupted by notifications from various applications that distract the student while learning. Thus, besides the fact that uncontrolled use of mobile phones leads to decreased learning performance of students in general [13], in mobile learning the process of learning is directly affected by events on the same screen on which the learning occurs. In these circumstances, good design of learning applications for mobile devices becomes crucial. If a student is frustrated by the learning application he is currently using, the chances that he will return to learning after responding to a chat message are probably lower than in the case of a pleasant and meaningful learning experience. This raises the need for further research on finding more efficient ways of using mobile phones for learning. The paper presents the process of design and development of a simple content management system for learning called EMasters.
The system is responsive, i.e. adapted to different screen sizes across laptops, tablets and smartphones. The development cycle is an iterative process conducted according to the principles of user-centred design. The paper is structured as follows. After explaining the rationale for the study in the introductory section, Section 2 provides a theoretical framework with a short review of online educational systems, their strengths and weaknesses. The theoretical section continues with definitions of the terms and methods used in the study. Section 3 briefly presents the key points in the design and development of EMasters. Section 4 describes the procedure of the conducted evaluation study, and the results follow in Section 5. In Section 6 the obtained results are discussed and the implications of the findings are presented. Section 7 provides closing remarks.

2 Theoretical Background

2.1 Online educational systems

Many academic institutions and private organizations make continuous efforts and invest significant resources in providing online learning opportunities for their students or employees, and network technologies continuously open new possibilities for successful utilization of online learning for personal growth [14]. The pool of learners has therefore expanded from "traditional" students to a much broader scope that includes employees in a company or self-motivated learners who choose to attend courses in their free time for individual development. The most common forms of online education in institutional settings are Massive Open Online Courses (MOOCs) [15] and Learning Management Systems (LMSs), e.g. Moodle [16], a category of Content Management Systems (CMSs) specifically intended for learning.
Despite the different definitions and basic functions of these systems, as explained in [17], they all aim to provide high-quality education to cohorts of students in a cost-effective manner. They are widely implemented in higher education institutions and in public and private organizations, either to support blended learning or to provide fully online education. These solutions are convenient for large institutions because teachers and management have control over the learning achievements of the participants. In addition, they can obtain numerous learning analytics reports, which form a basis for developing new policies and strategies and for the progress of the institution. However, researchers suggest that these institutional systems still fail to meet all requirements. Some frequently reported weak points are: high costs of development and continuous maintenance; user interfaces that are often non-responsive, i.e. not adapted to the small screens typical of mobile devices; students who sometimes oppose mandatory software; and teachers who, although expected to design learning content, are sometimes not ready to take on the new role of course creators [18], [19], [20]. As part of an answer to these challenges, small e-learning or microlearning applications have appeared lately as an alternative to the learning content in typically large e-learning systems [21]. Microlearning as a novel type of digital instruction can contribute to interactivity in the learning experience due to its very simple form and its easy integration into online educational systems and virtual environments [21]. The common feature of all these forms of educational applications is that the teacher/instructional designer need not have any programming skills to successfully create learning content. According to the definition of the CMSs, which are
in focus of this paper, teachers are provided with tools and features for creating the content of online courses and for moderating interaction with their students [17].

2.2 User-centred design

User-centred design (UCD) is covered by ISO standards related to the broader scope of human-centred design and usability [22]. ISO 9241-210:2010 describes requirements for human-centred design principles and activities related to the use of computer systems. The standard concerns ways to enhance human–system interaction through the use of both hardware and software components of interactive computer systems, and is intended for professionals who manage the design and development of interactive systems. Figure 1 shows the typical stages of UCD according to ISO:

• Plan the UCD process
• Understand and specify the context of use
• Specify user requirements
• Produce design solutions
• Evaluate designs against requirements
• Design solution meets user requirements

In the iterative process of design, implementation and evaluation, several phases recur as many times as needed to reach the final stage, in which the design solution meets user requirements to a great extent.

Fig. 1. The user-centred design process according to ISO 9241-210:2010

For software designers building interactive learning applications, Quintana, Krajcik, and Soloway [23] extended the traditional definition of the UCD approach and proposed a definition of learner-centred design. They considered three dimensions of interaction in learning systems: the audience (users vs. learners), the addressed problem (using tools vs. learning work) and the underlying approach (supporting action vs. supporting learning).
When designing for learners, educational software must address several unique needs of learners as users: the concept of learning by doing, individual differences and different levels of motivation [24]. Specifically, when designing CMSs, two categories of users have to be considered, namely content creators and content consumers. In interactive learning systems, content creators are teachers or instructional designers, while content consumers are learners who use the delivered content for knowledge acquisition. To ensure that the system will be adopted by its users, both categories must be engaged in the process of system design. Users expect to be able to control the interaction, and user involvement in the design process is crucial to achieving this goal. This is especially important in informal learning, where there is no institutional incentive or requirement to use specific software. To be adopted and actually used, an application for both institutional and informal learning has to provide an appealing user experience for teachers and learners. The UCD approach focuses on usability [25]. According to ISO [22], usability is the "extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". The recently published revision of the ISO standard for usability, usability reports and usability measures, ISO 9241-11:2018, extends this definition and explains that usability should not be considered a property of a system but an outcome of use [26]. Usability evaluation is regularly conducted during the evaluation phase of the UCD process described in Figure 1. Moreover, evaluation and design are closely integrated in this process, and some of the same methods are used both in evaluation and in the phase of specifying user requirements [12].
Methods of usability evaluation include [27]:

• Usability testing – an evaluation process in which users perform real tasks on a prototype or real system while their behaviour and emotional responses during the interaction are carefully observed; also called user testing;
• Usability inspection – a method in which usability experts evaluate the interface against a set of usability guidelines, with no user involvement;
• Usability inquiry – a method of gathering user feedback after interaction with the system, usually through surveys, interviews or focus groups.

The methods differ in nature and in the number of participants. Expert inspection requires a significantly smaller number of participants than methods that involve end users. Since it may be very hard to find five usability experts to do the evaluation [28], user testing and usability inquiry are more frequently applied. However, to get different perspectives on the interface, research suggests triangulation of methods when possible [12]. Qualitative methods are usually conducted as part of formative evaluation, to check compliance with user requirements and to provide guidelines for interface redesign [29]. Quantitative methods, on the other hand, are more often used as part of summative evaluation, to confirm that the design has reached a high level of perceived usability [12], [29]. In the rest of the paper, we explain in detail the methods chosen for the usability evaluation of EMasters, namely the thinking-aloud protocol as a user testing technique, followed by several questionnaires and an interview used in the post-session usability inquiry (see Section 4).

3 Design and Development of the EMasters

The aim of EMasters is to enable teachers to easily create and organize courses that are delivered to students to facilitate web-based and mobile learning.
Teachers register in the system, create courses and organize their content. As part of a course, they may write text and import other objects such as pictures, videos and microlearning applications. Besides learning content, teachers can create quizzes for evaluation or self-evaluation of the knowledge acquisition process. They can update and delete any piece of the course they have created. The application also has a forum section in which both teachers and students can start a discussion on a topic and reply to previous comments. In addition to these two user groups, the application has an administrator who is in charge of the system and controls the proper entry of all data. The first design of EMasters was made as a wireframe, i.e. a paper prototype, for desktop, web-based and mobile interfaces. In a simplified UCD process involving iterations of design, implementation and evaluation, pilot testing of the first design was carried out with five students of the Faculty of Science, University of Split, Croatia. Three participants tested the wireframe in the teacher's role and two in the student's role. Based on the feedback from pilot testing, the second wireframe was made. The redesign included several changes in the teacher's and student's interfaces. For example, buttons for updating and deleting a chapter in the teacher's interface were replaced with hyperlinks. This feature is shown in Figure 2, as developed later in the implementation phase. In addition, the forum page that shows a topic was redesigned: the field for entering a reply was enlarged and the button for submitting the entry is red instead of green. Figure 3 shows the forum page with the theme "Regular expressions", a topic in the "Unix" course.

Fig. 2.
Homepage of the UNIX course in EMasters as displayed for teachers (left) and students (right)

Fig. 3. Screenshot of a forum topic on the theme "Regular expressions" in EMasters

The EMasters application was created in the Python programming language using the Django framework [30]. Django uses the Model-Template-View (MTV) application architecture, as presented in Figure 4. The Model is linked to a database where the data are stored and retrieved. This layer contains all information regarding data storage, data connections, entry limitations, mandatory or non-mandatory data, etc. The Template is a presentation of the model, i.e. an HTML webpage that contains data together with instructions on how the page should be displayed. The View serves as a bridge between the other two layers. The user accesses the server through a web browser, after which Django opens a specific view according to the user's request. Finally, the database is accessed, and the data are used to display the webpage in the browser.

Fig. 4. Model-Template-View (MTV) software architecture [30]

To adapt the interface to devices with different screen sizes, Bootstrap is used as a front-end framework. Figure 5 shows the responsive design of a typical content page for a teacher, displayed on a laptop and an iPhone 6/7/8. The layout of the same page for students has neither the hyperlinks for updating and deleting existing objects nor the hyperlinks at the bottom of the page that enable adding new content and creating a quiz.

Fig. 5.
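To make the division of responsibilities concrete, the request flow described above can be mimicked in plain Python. The sketch below is framework-free and all names and data in it are hypothetical; in EMasters itself these layers are provided by Django (models.Model subclasses, template files and view functions), whose source is not published in the paper.

```python
# Framework-free sketch of the Model-Template-View flow described above.
# All names and data are hypothetical illustrations, not EMasters code.

COURSES = {1: {"title": "Unix", "description": "Shell basics"}}  # stands in for the database

def get_course(course_id):
    """Model layer: retrieves stored data (Django: a models.Model subclass)."""
    return COURSES[course_id]

def render_template(course):
    """Template layer: describes how the data should be displayed
    (Django: an HTML template such as {{ course.title }})."""
    return f"<h1>{course['title']}</h1><p>{course['description']}</p>"

def course_view(course_id):
    """View layer: bridges model and template for a browser request
    (Django: a view function called by the URL dispatcher)."""
    return render_template(get_course(course_id))
```

For example, `course_view(1)` returns the rendered page for the hypothetical "Unix" course; Django performs the same three steps for each incoming request.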
Responsive design of a typical content page for a teacher/course creator in EMasters

4 Usability Evaluation

The evaluation study was conducted in several individual user sessions, with slightly different procedures for participants who took the role of teachers and those who took the role of students. The study began with the teachers' sessions. The students' sessions were conducted after all teachers had finished creating their courses in EMasters, so the students were free to choose a course from the pool of courses delivered by the system. All participants followed the same method in the evaluation study but with different instruments according to their role. The evaluation procedure for each participant began with a user testing technique called the thinking-aloud protocol, conducted with teachers in the process of course creation and later again with students in the process of learning in EMasters. The following steps involved usability inquiry methods (see Section 2.2), namely a post-session survey and a semi-structured interview. Figure 6 briefly presents the procedure of the usability study with the steps of the individual sessions carried out with teachers and students.

Fig. 6. The method of the EMasters usability study

Users who accepted the role of teachers were asked to register in EMasters and to create an online course on a topic of their choice. They were instructed to make learning content with at least two chapters and a quiz. They were allowed, but not explicitly told, to import different types of external files, such as images and videos. The structure of the course was not precisely set. However, the teachers were advised to follow the ADDIE model of instructional design [31] in the phases of course design and development. The teachers were able to review the developed course by taking the role of a student.
In the learning session, students were invited to register in the system and to select one of the available courses. They were free to explore the content and take quizzes at their own pace and in the order they preferred. They were also asked to find the forum and write a comment on one of the topics. Both teachers and students were encouraged to think aloud while doing their tasks. The supervisor of the study observed the user's behaviour in real time, noting the user's comments as well as key aspects of the interaction, such as unexpected steps or mistakes the user made in the process of course creation or learning. The supervisor did not help the users in achieving their goals. After all user sessions were completed, the evaluator's notebook became a valuable record of potential usability problems in both the teacher's and the student's interface.

4.1 Post-session survey

Following the procedure presented in Figure 6, after the individual hands-on session in EMasters, each participant was invited to fill in a survey. The post-session survey was hosted on Google Forms and included three sections: the SUS questionnaire, a qualitative feedback section and a background section. The questions in the first two sections were the same for both user roles, while the background sections were designed to differ for teachers and students. The SUS [32], [33] is a standardized questionnaire for measuring perceived usability and is used worldwide for overall usability evaluation of different systems [34], [35]. The reliability, validity and sensitivity of the SUS have been confirmed in numerous studies, as reported by Lewis and Sauro [36]. In the same paper they provided a curved grading scale for interpreting the overall SUS score, and this scale is used in our usability study.
Although Brooke initially limited the interpretation of the SUS to the overall score [32], Lewis and Sauro conducted a comprehensive psychometric analysis of the SUS and suggested that SUS results could also be interpreted by individual items [36]. Based on regression analysis, they provided item benchmarks related to overall SUS scores. Table 1 in the Results section shows all the SUS questions. The second section of the post-session survey was designed to obtain qualitative feedback from the users. It included open-ended questions: What did you like most in the EMasters system? What are its flaws? What improvements do you suggest? The users' subjective opinions about their experience of using EMasters can reveal major flaws in the design and help us develop guidelines for redesigning the system interface. Finally, in the background section of the survey, we obtained demographic data about the participants. There were several multiple-choice questions related to the role of the participant. The background section in the teachers' survey collected age, affiliation and the level of previous experience in using e-learning systems as well as in creating online courses. The survey for students asked their age, time spent in online learning and specifically in mobile learning, and previous experience in using e-learning systems.

4.2 Semi-structured interview

The semi-structured interview was carried out in addition to the qualitative feedback obtained from the thinking-aloud protocol and the post-session survey. The interview was conducted individually with each participant, immediately after they filled in the survey. It usually began with several questions related to the answers the user provided in the survey and then continued as a free dialogue.
The users had the opportunity to review their experience of using the system and to explain what improvements to the interface they would suggest and why. Using complementary methods of usability evaluation, namely quantitative and qualitative methods, we can gain deep insight into the usability issues of the system being developed. The SUS gives us a score that reveals the severity of usability problems, while the qualitative methods may provide initiatives and concrete guidelines for an interface redesign intended to reduce recognized interaction problems and increase the overall usability of the system.

5 Results

The study was conducted in the autumn semester of 2018 at the Faculty of Science, University of Split, Croatia. To capture a potentially wide range of usability issues, participants were recruited from students, pupils, teachers and the general population. The results are presented separately for the teacher's and the student's interface.

5.1 Teacher's interface

Seven participants volunteered to use the application in the role of teacher/course creator. Their age ranged from 24 to 45, with a mean of 37.17 and a standard deviation of 7.97. One participant was a student of educational vocations, two were high school teachers and one was a high school pedagogue. Since the application is intended to be used in informal learning, three participants were recruited from the general population. Previous experience in creating online learning content ranged from novices (3 participants) to experts (1 participant). All participants successfully created a course on a topic of their interest. The SUS items along with the mean scores and standard deviations for the teacher's interface are shown in Table 1. The scores range from 0 to 4.
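The standard SUS scoring rule that produces such 0–4 item scores and the corresponding 0–100 overall scores can be sketched as follows. This is a generic implementation of Brooke's published rule, not code from the study itself.

```python
def sus_item_contributions(responses):
    """Convert ten raw SUS responses (1-5 Likert scale) into 0-4 item scores.
    Odd-numbered items score (response - 1); even-numbered items have
    reversed polarity and score (5 - response), per Brooke's rule."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly ten items")
    return [r - 1 if i % 2 == 0 else 5 - r  # index 0 is item 1 (odd-numbered)
            for i, r in enumerate(responses)]

def sus_score(responses):
    """Overall SUS score on the 0-100 scale: sum of item scores times 2.5."""
    return sum(sus_item_contributions(responses)) * 2.5
```

For example, a respondent answering 4 to every odd-numbered item and 2 to every even-numbered item obtains an item score of 3 on each item and an overall score of 75.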
It has to be noted that even-numbered questions have reversed polarity and their scores are calculated accordingly, as described in [32], [33]. This means that for odd-numbered questions the score in Table 1 represents the level of users' agreement with the statement, while for even-numbered questions it represents the level of users' disagreement with the statement. Thus, for example, the score of 2.71 on item no. 2, "I found the system unnecessarily complex", means that users rated the simplicity of the interface with 2.71, or 67.75%. As a result, a higher score in Table 1 always stands for a higher level of perceived user satisfaction with EMasters.

Table 1. SUS results for the teacher's interface (Mean, SD)

1. I think that I would like to use this system frequently. 3.00 (0.82)
2. I found the system unnecessarily complex. 2.71 (1.38)
3. I thought the system was easy to use. 3.14 (0.38)
4. I think that I would need the support of a technical person to be able to use this system. 1.86 (1.07)
5. I found the various functions in this system were well integrated. 3.00 (0.82)
6. I thought there was too much inconsistency in this system. 2.86 (1.35)
7. I would imagine that most people would learn to use this system very quickly. 2.71 (0.49)
8. I found the system very awkward to use. 3.14 (0.69)
9. I felt very confident using the system. 2.86 (0.69)
10. I needed to learn a lot of things before I could get going with this system. 3.14 (0.90)

The overall SUS score for the teacher's interface is given in Table 2. The obtained value means that the satisfaction of participants who took the role of course creators is 71.1%. Considering individual ratings, the minimum individual grade is 2.2 (55%) and the maximum 3.4 (85%).

Table 2.
Summary of the SUS for the teacher's interface: N = 7, Mean = 2.84, SD = 0.93, Min = 2.2, Max = 3.4

In the qualitative feedback section of the post-session survey, six of the seven participants wrote positive impressions. Most of them appreciated the simplicity of the interface. The comments also included "ease of use", "nice and intuitive design" and highly functional interaction. Five users reported experiencing some flaws of the EMasters system. They were not satisfied with: the restricted possibilities in designing page content, i.e. the positioning of objects on a page; the absence of data on students' usage and achievements; and the fact that the quiz allows only multiple-choice questions. These issues were clarified in individual interviews with each participant. In addition, one user suggested that a teacher should be able to edit only his/her own courses.

5.2 Student's interface

In the role of student, 14 volunteers took part in the study. The mean age was 28.28, with a standard deviation of 10.15, a minimum of 12 and a maximum of 43. Two participants were primary school pupils, two were high school students and one was a university student. The rest were adults with various backgrounds. Prior experience in using computers and mobile devices for learning was almost equally distributed in the sample (rare, sometimes, often, very often, regularly). All participants registered in the system and took time to learn and take the quiz on a selected lesson. All of them succeeded in finding the forum and leaving a comment. The SUS results obtained from students are presented in Table 3 and the overall SUS score is given in Table 4. Overall student satisfaction is 68.9%. The lowest individual student's grade is 1.6 (40%) and the highest is 3.8 (95%).

Table 3. SUS results for the student's interface (Mean, SD)

1. I think that I would like to use this system frequently. 2.43 (1.16)
2. I found the system unnecessarily complex. 2.86 (1.17)
3. I thought the system was easy to use. 2.79 (1.42)
4. I think that I would need the support of a technical person to be able to use this system. 2.50 (1.40)
5. I found the various functions in this system were well integrated. 2.50 (0.65)
6. I thought there was too much inconsistency in this system. 2.86 (1.23)
7. I would imagine that most people would learn to use this system very quickly. 3.07 (1.14)
8. I found the system very awkward to use. 2.57 (1.34)
9. I felt very confident using the system. 3.43 (0.76)
10. I needed to learn a lot of things before I could get going with this system. 2.57 (1.60)

Table 4. Summary of the SUS for the student's interface: N = 14, Mean = 2.76, SD = 1.22, Min = 1.6, Max = 3.8

In the post-session survey, we obtained varied feedback from users in the student's role. Four participants did not write comments regarding positive aspects of the application. The same users wrote that they did not experience difficulties in interaction and had no suggestions for improvement. Six participants liked the ease of use, and several students liked aspects related more to the learning content than to the user interface. Considering users' opinions on the shortcomings of the application, most of the students referred to weaknesses in the learning content (too much text, not enough images, videos, etc.), while two students had difficulties reading because of the small letter size. Reported usability issues and suggestions for improvement from the students' perspective were discussed in post-session interviews with each participant. In contrast to the content pages, the users had no objections to the quizzes and the forum.
6 Discussion

According to the interpretation scale for SUS [36], where an average grade (C) corresponds to SUS scores of 65.0-71.0 and a slightly above average grade (C+) to 71.1-72.5, we can conclude that users rated the EMasters application as good both for course creation (71.1%) and for learning (68.9%). Qualitative analysis also confirmed that the application is generally perceived as easy to use by both course creators and learners. Participants of the study appreciated the simplicity of the interface, the clear design, the availability of the content and the possibility of communication through the forum. Discussing the weaknesses of the application, we collected a significant number of comments. The obtained feedback was used to decide which comments are relevant to the interaction and can provide guidelines for a redesign that could improve the usability of the system. For example, a larger font size can be used for the text in content pages, or users can be given an option to set the font size to a desired value. It is important to communicate all reported issues with the users to reach a final decision on the redesign. Thus, the inability to manipulate the position of objects on a content page is considered less relevant when compared to the risk of a possibly bad layout on the small screens of mobile devices. The conducted evaluation study revealed some concerns that are not usability problems but still demand consideration. They are particularly related to handling the quizzes and managing the course and its participants. In quizzes, a feature of displaying scores immediately after taking the quiz could be added. Major improvements of the system could include features for teachers such as administering the students, tracking their progress through the course and analysing their achievements on quizzes.
Still, the initial incentive of the EMasters was to facilitate informal learning, so these major improvements should be considered only if the application is to be used in institutional settings.

Considering the procedure of the conducted study, we have to note that some of the results regarding the student's interface may be a consequence of poor design of the learning content in some courses. Several users who took the role of course creators are not familiar with the principles of instructional design and did not apply any of them. Although these users can contribute to the usability evaluation of the teacher's interface, especially when developing applications for informal learning, their courses may have major flaws related to content quality. On the other hand, the study shows that students who evaluate the interface in the learning process sometimes have trouble identifying which problems are related to usability and which to content quality. This limitation can be overcome by excluding poorly designed courses from the second part of the study. This approach requires evaluation of the developed courses by instructional design experts prior to the usability study of the student's interface. Another solution is to engage only teachers or instructional designers to create online courses in the first phase of the study. Both options support the initial idea of the EMasters being available for all, meaning that everyone can create a course on a topic of their own choice for an interested audience. Both solutions also ensure that the high usability level of the system will reinforce the process of content creation and of learning.

7 Conclusion

The paper, describing the case of design and development of a simple learning CMS called the EMasters, presents an iterative process of user-centred design.
The process begins with rapid prototyping and pilot testing of the first design. The second design is made according to the users' feedback, and the implementation follows. To evaluate the developed system, a comprehensive usability study was carried out. The evaluation method combined several well-known and reliable techniques of usability testing and usability inquiry. Quantitative results show that the usability score reached a good level. Qualitative user feedback revealed several usability problems as well as other issues related to content quality. The outcomes of the study provide guidelines for improvements which can be implemented in the next iteration of UCD for EMasters. According to the obtained results, the evaluation framework was confirmed to be suitable for the iterative process of user-centred design of CMSs. The study shows that usability evaluation can be successfully applied in the design process in a quick and cost-effective manner. In addition, the framework can be further adjusted and refined to fulfil the requirements of specific CMSs developed for the purpose of learning, thus benefiting researchers and practitioners in the field of design and development of learning CMSs.

8 References

[1] Searson, M. (2014). Foreword. In: Miller, C. and Doering, A. (eds), The new landscape of mobile learning: Redesigning education in an app-based world. Routledge
[2] Pimmer, C., Mateescu, M. and Grohbiel, U. (2016). Mobile and ubiquitous learning in higher education settings. A systematic review of empirical studies. Computers in Human Behavior, 63: 490-501. https://doi.org/10.1016/j.chb.2016.05.057
[3] Cárdenas-Robledo, L. and Peña-Ayala, A. (2018). Ubiquitous learning: A systematic review. Telematics and Informatics, 35(5): 1097-1132. https://doi.org/10.1016/j.tele.2018.01.009
[4] El Guabassi, I., Al Achhab, M., Jellouli, I.
and EL Mohajir, B. E. (2018). Personalized Ubiquitous Learning via an Adaptive Engine. International Journal of Emerging Technologies in Learning. 13(12): 177-190. https://doi.org/10.3991/ijet.v13i12.7918
[5] Nakić, J., Granić, A. and Glavinić, V. (2015). Anatomy of Student Models in Adaptive Learning Systems: A Systematic Literature Review of Individual Differences from 2001 to 2013. Journal of Educational Computing Research. 51(4): 459-489. https://doi.org/10.2190/ec.51.4.e
[6] Tesene, M. (2018). Adaptable Selectivity: A Case Study in Evaluating and Selecting Adaptive Learning Courseware at Georgia State University. Current Issues in Emerging eLearning. 5(1)
[7] Jagušt, T., Botički, I. and So, H.-J. (2018). Examining Competitive, Collaborative and Adaptive Gamification in Young Learners' Math Learning. Computers & Education, 125: 444-457. https://doi.org/10.1016/j.compedu.2018.06.022
[8] Dalgarno, B. and Lee, M. J. W. (2010). What are the learning affordances of 3-D virtual environments? British Journal of Educational Technology. 41(1): 10-32
[9] Granić, A., Nakić, J. and Ćukušić, M. (2017). Preliminary Evaluation of a 3D Serious Game in the Context of Entrepreneurship Education. In: Strahonja, V. and Kirinić, V. (eds), Proceedings of the 28th Central European Conference on Information and Intelligent Systems, CECIIS, Varaždin, Croatia. pp. 91-98
[10] Kokkalia, G., Drigas, A., Economou, A., Roussos, P. and Choli, S. (2017). The Use of Serious Games in Preschool Education. International Journal of Emerging Technologies in Learning. 12(11): 15-27. https://doi.org/10.3991/ijet.v12i11.6991
[11] Khairudin, M., Triatmaja, A. K., Istanto, W. J. and Azman, M. N. A. (2019). Mobile Virtual Reality to Develop a Virtual Laboratorium for the Subject of Digital Engineering. International Journal of Interactive Mobile Technologies. 13(4): 80-95. https://doi.org/10.3991/ijim.v13i04.10522
[12] Preece, J., Rogers, Y. and Sharp, H. (2015).
Interaction Design: Beyond Human-Computer Interaction, 4th edition. John Wiley & Sons
[13] Jumoke, S., Oloruntoba, S. A. and Okafor, B. (2015). Analysis of Mobile Phone Impact on Student Academic Performance in Tertiary Institution. International Journal of Emerging Technology and Advanced Engineering. 5(1): 361-367
[14] Daradkeh, Y. I., Testov, V. A. and Golubev, O. B. (2019). Educational Network Projects as Form of E-Learning. International Journal of Advanced Corporate Learning. 12(1): 29-40. https://doi.org/10.3991/ijac.v12i1.9465
[15] Sanchez-Gordon, S. and Lujan-Mora, S. (2018). Technological Innovations in Large-Scale Teaching: Five Roots of Massive Open Online Courses. Journal of Educational Computing Research. 56(5): 623-644. https://doi.org/10.1177/0735633117727597
[16] https://moodle.org
[17] Alshammari, S. H., Bilal Ali, M. and Rosli, M. S. (2018). LMS, CMS and LCMS: The confusion among them. Science International. 30(3): 455-459
[18] Strang, K. D. and Vajjhala, N. R. (2017). Student Resistance to a Mandatory Learning Management System in Online Supply Chain Courses. Journal of Organizational and End User Computing. 29(3): 49-67. https://doi.org/10.4018/joeuc.2017070103
[19] Schoepp, K. (2005). Barriers to technology integration in a technology-rich environment. Learning and Teaching in Higher Education. 2(1): 1-24
[20] Al Meajel, T. M. and Sharadgah, T. A. (2018). Barriers to Using the Blackboard System in Teaching and Learning: Faculty Perceptions. Technology, Knowledge and Learning. 23(2): 351-366. https://doi.org/10.1007/s10758-017-9323-2
[21] Park, Y. and Kim, Y. (2018). A Design and Development of micro-Learning Content in e-Learning System. International Journal on Advanced Science, Engineering and Information Technology. 8(1): 56-61
[22] ISO 9241-210 (2010). Ergonomics of human-system interaction — Part 210: Human-centred design for interactive systems
[23] Quintana, C., Krajcik, J. and Soloway, E. (2000). Exploring a Structured Definition for Learner-Centered Design. In: Fishman, B. and O'Connor-Divelbiss, S. (eds), Fourth International Conference of the Learning Sciences. Mahwah, NJ: Erlbaum. pp. 256-263
[24] Soloway, E., Jackson, S.
L., Klein, J., Quintana, C., Reed, J., Spitulnik, J., Stratford, S. J., Studer, S., Eng, J. and Scala, N. (1996). Learning Theory in Practice: Case Studies in Learner-Centered Design. Human Factors in Computing Systems: CHI '96 Conference Proceedings, Vancouver, Canada. https://doi.org/10.1145/238386.238476
[25] Norman, D. A. (1986). Cognitive Engineering. In: Norman, D. A. and Draper, S. W. (eds), User Centered System Design. Lawrence Erlbaum Associates
[26] Bevan, N., Carter, J., Earthy, J., Geis, T. and Harker, S. (2016). New ISO Standards for Usability, Usability Reports and Usability Measures. In: Human-Computer Interaction. Theory, Design, Development and Practice: 18th International Conference, HCI International 2016, Toronto, ON, Canada. Proceedings, Part I. pp. 268-278. https://doi.org/10.1007/978-3-319-39510-4_25
[27] Shneiderman, B., Plaisant, C., Cohen, M., Jacobs, S., Elmqvist, N. and Diakopoulos, N. (2016). Designing the User Interface: Strategies for Effective Human-Computer Interaction. 6th ed. Pearson
[28] Lazar, J., Feng, J. H. and Hochheiser, H. (2010). Research Methods in Human-Computer Interaction. Wiley Publishing
[29] Rubin, J. and Chisnell, D. (2008). Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. 2nd ed. Wiley Publishing, Inc., Indianapolis
[30] https://www.djangoproject.com
[31] Arshavskiy, M. (2013). Instructional Design for eLearning: Essential Guide to Creating Successful eLearning Courses. CreateSpace Independent Publishing Platform
[32] Brooke, J. (1996). SUS: A "quick and dirty" usability scale. In: Jordan, P. W., Thomas, B., Weerdmeester, B. A. and McClelland, A. L. (eds), Usability Evaluation in Industry. Taylor and Francis
[33] Brooke, J. (2013). SUS: A retrospective. Journal of Usability Studies. 8(2): 29-40
[34] Sauro, J. and Lewis, J. R. (2009). Correlations among prototypical usability metrics: Evidence for the construct of usability. In: Proceedings of CHI 2009, Boston, MA: ACM. pp. 1609-1618.
https://doi.org/10.1145/1518701.1518947
[35] Lewis, J., Brown, J. and Mayes, D. K. (2015). Psychometric Evaluation of the EMO and the SUS in the Context of a Large-Sample Unmoderated Usability Study. International Journal of Human-Computer Interaction. 31(8): 545-553. https://doi.org/10.1080/10447318.2015.1064665
[36] Lewis, J. R. and Sauro, J. (2018). Item benchmarks for the system usability scale. Journal of Usability Studies. 13(3): 158-167

9 Authors

Jelena Nakić
is a Postdoctoral Researcher at the Department of Computer Science of the Faculty of Science, University of Split, Croatia. She holds a Ph.D. and an M.Sc. in Computer Science from the Faculty of Electrical Engineering and Computing, University of Zagreb, Croatia. Her research focus is web-based learning and adaptive e-learning, particularly the development and evaluation of interactive learning environments.

Antonija Burčul is a teacher of Computer Science at Science High School, Split, Croatia. She also works as a teaching assistant at the Department of Information Technology, University of Split, and at EDIT Code School organized by Split-Dalmatia County.

Nikola Marangunić is an Assistant Professor at the Department of Social and Human Sciences, Faculty of Science, University of Split, Croatia. He graduated in psychology from the Faculty of Philosophy, University of Zagreb, where he also received his M.Sc. and Ph.D. in the field of Human-Computer Interaction. His scientific interests relate to multidisciplinary research, to which he contributes through the perspective of cognitive psychology.

Article submitted 2019-04-24. Resubmitted 2019-05-27. Final acceptance 2019-05-28. Final version published as submitted by the authors.