Bulletin of Social Informatics Theory and Application, ISSN 2614-0047, Vol. 6, No. 2, December 2022, pp. 177-188. https://doi.org/10.31763/businta.v6i2.581

A review of sentiment analysis approaches for quality assurance in teaching and learning

Emughedi Oghu a,1,*, Emeka Ogbuju a, Taiwo Abiodun a, Francisca Oladipo a
a Department of Computer Science, Federal University Lokoja, Nigeria
1 emughedi.oghu-pg@fulokoja.edu.ng
* corresponding author

1. Introduction
The education system presently represents a landscape enriched by a continuous, massive amount of data generated in different formats daily. Embedded in this data is valuable and helpful information, and discovering and extracting it from large amounts of data is one of the benefits that opinion mining and sentiment analysis can offer. Opinions and sentiments that students express are valuable information that can be used to analyze students' views about teachers, courses, and topics. Though opinion mining and sentiment analysis appear similar, they vary slightly from each other: opinion mining means extracting and analyzing individuals' opinions about a particular subject, while sentiment analysis refers to finding sentiment words or phrases that exhibit emotion. In this paper, we use both terms interchangeably. Sentiment/opinion polarity (positive, negative, or neutral) signifies someone's opinion toward a subject, while emotions represent someone's feelings toward a subject. This paper presents a systematic review of sentiment analysis on students' feedback. The aim is to evaluate and present a general summary of research findings and implications for research and practice. This is needed to provide updates concerning the state of research, identify well-researched areas, reveal lagging areas that need further research, and understand common challenges.
The remaining part of this paper is organized as follows: "Background and related work" gives information on how sentiment analysis has been used on students' feedback; "Research methods" discusses the adopted research methodology; "Results" presents the findings of the study; the "Identified gaps and challenges" section presents the challenges found in the reviewed papers on sentiment analysis of student feedback; the "Limitation of the review" section states the limitations of the study; and the paper is concluded in the "Conclusion and future work" section.

ARTICLE INFO
Article history: Received October 30, 2022; Revised November 28, 2022; Accepted December 3, 2022

ABSTRACT
The education industry considers quality to be a crucial factor in its development. Nevertheless, the quality of many institutions is far from perfect, as there is a high rate of systemic failure and low performance among students. Consequently, the application of digital computing plays an increasingly important role in assuring the overall quality of an educational institution. However, the literature lacks a reasonable number of systematic reviews that classify research applying natural language processing and machine learning solutions to students' sentiment analysis and quality assurance feedback. Thus, this paper presents a systematic literature review that structures the papers published between 2014 and 2023 in high-impact journal-indexed databases. The work extracted 59 relevant papers from the 3,392 initially found, using exclusion and inclusion criteria. The results identify the five (5) techniques most researched for sentiment analysis in education, as well as the prevalent supervised machine learning algorithms, lexicon-based approaches, and evaluation metrics used in assessing feedback in the education domain. This is an open access article under the CC–BY-SA license.

Keywords: Quality assurance; Opinion mining; Sentiment analysis; Machine learning algorithms

RETRACTED

2. Background and Related Work
2.1. Background
Many theories regarding emotion detection and analysis have been established since the 1960s. The study conducted by [1] grouped emotions into eight groups: joy, anticipation, anger, disgust, fear, trust, surprise, and sadness. Sentiment analysis can be carried out at the document, sentence, or word level. However, because of the volume of documents, handling sentiment manually is impractical; for this reason, automatic data processing is required. Natural Language Processing (NLP) can be applied to text-based sentiment analysis or document-level corpora. Most studies identified in the research up to 2016–2017 used only NLP methods, such as sentiment analysis techniques based on lexicons and dictionaries; those papers rarely made use of traditional machine learning classifiers. Both recognition and classification of sentiment have recently shifted from purely NLP-based techniques to deep learning-based models, and the number of papers published on the topic has grown dramatically. Recently, the popularity and relevance of student feedback have risen, especially during the COVID-19 pandemic, when most educational institutions shifted from traditional face-to-face interaction to an online format.
The amount of new research indicates a growing interest in using NLP or machine learning techniques for sentiment analysis in education. To the best of our knowledge, the literature lacks a review that systematically classifies and categorizes research and outcomes by showing the frequencies and summaries of publications and trends to determine the state of evidence in education. To carry out a systematic review, this article uses a process structure to respond to research questions. In particular, we created several research questions that address general concerns about the researched sentiment analysis elements, models, methodologies, and trends in assessment metrics in the teaching and learning community.

2.2. Related Work
According to past studies, one study [2] on sentiment analysis (SA) in education concentrated on identifying the methodologies and tools utilized in SA and the significance of using SA on educational data. Our study is an expanded version of this research; therefore, data from different sources, including bibliographic sources, research trends and patterns, and the most current SA tools, are provided. A summary of sentiment analysis techniques for education was presented in a review study by [3], in which the authors presented a sentiment detection and assessment framework for multimodal fusion. Our review paper seeks to cover all issues related to the sentiment analysis of educational content systematically, focusing on textual information instead of the text, audio, and visual signals focused on in [3]. Additionally, we provide a detailed review of current approaches used for sentiment discovery along with the results they achieved.
Similar to [4], which reviewed research journals on SA of education data and helped identify areas for further study, the authors of [4] cover subjects like the building of sentiment analysis systems, the examination of topics relevant to students, the analysis of teachers' teaching ability, etc., from about 41 related publications. In contrast, we first screened 618 research papers from various journals and conferences before conducting our scientific literature review analysis. In this study, we finalized and incorporated 59 of the most relevant and excellent scientific publications published from 2014 to 2023. The primary goal of this work is to systematically compile all of the material currently available on sentiment analysis of educational data in one place. Such review studies are very beneficial for readers in this domain and will help researchers, academicians, and practitioners interested in sentiment analysis and quality assurance in education.

3. Method
The method adopted in this study is a systematic literature review of tools and technologies used in analyzing student opinion in higher education, adopting [5] and [6] as models.

3.1. Research Question
The research questions (RQs) devised for this study were as follows:
• RQ1. What are the most explored aspects of education concerning sentiment analysis?
• RQ2. Which techniques and models are extensively researched for using sentiment analysis in education?
• RQ3. What are the most common metrics for measuring the effectiveness of sentiment analysis systems?
• RQ4. What are the most popular methods for gathering student feedback?

3.2.
Search String
To create a good search string, the keyword phrase should be structured in terms of population, intervention, comparison, and outcome [5]. Relevant papers were obtained by constructing a search phrase using keywords based on the previously stated research questions. Seven (7) common database indexes, Scopus, EBSCOhost, ScienceDirect, IEEE Xplore, Web of Science, SpringerLink, and ACM DL, were used to conduct the searches. The search strings are eleven (11) in total: "sentiment analysis", "opinion mining", "technologies used in sentiment analysis", "sentiment analysis framework", "sentiment analysis algorithms", "sentiment analysis tools", "students' feedback", "teacher assessment", "feedback assessment", "learners' feedback sentiment analysis reviews" and "quality assurance".

3.3. Data Sources
Choosing from broad and standardized databases is more practical as research becomes more multidisciplinary, international, and interactive. The following databases were consulted:
• Scopus: Scopus is a database launched in 2004 that includes citations and abstracts for academic journal articles. It provides a thorough picture of the world's scientific, technical, medical, and social research output and contains over 36,377 publications from over 11,678 publishers. It is the most extensive database of peer-reviewed literature citations and abstracts.
• ScienceDirect: This database is Elsevier's top information resource for students and information professionals. It offers open and subscriber access to a sizable database that combines credible scientific, technical, and healthcare papers with clever, user-friendly features. It has over 35,000 books and over 14,000,000 publications from over 3,800 journals.
• EBSCO: Researchers can access various comprehensive and bibliographic databases through EBSCOhost, which also offers digital journal services for academic and corporate researchers.
Over 900,000 high-quality e-books and publications, 16,711 indexed journals, 14,914 of which come from peer-reviewed sources, over 60,000 recordings, and more than 1,500 prominent academic publishers are all included.
• IEEE Xplore: This database is a research resource for finding and accessing conference proceedings, journal articles, and documents relating to computer science, electronics, and electrical engineering. IEEE Xplore has over 300 peer-reviewed journals, 1,900 international conferences, over 11,000 technical standards, approximately 5,000 e-books, and more than 500 online courses.
• Web of Science: This platform, formerly known as Web of Knowledge, is a paid-subscription platform that gives users access to several databases with reference and citation information from conference proceedings, academic journals, and other publications in various academic subjects.
• SpringerLink: This database is the most extensive online library of books, journals, series, protocols, and reference materials for science and technology. The database provides millions of scientific documents to researchers.
• ACM DL: The ACM DL is a database for research discovery that contains a full-text collection of publications, including books, journals, conference proceedings, technical magazines, and newsletters.

3.4. Data Retrieval
Most high-impact journals and conferences are indexed in this collection of comprehensive databases. The eleven (11) search words were joined using the Boolean 'OR'. As displayed in Table 1, 3,392 articles from the seven databases were retrieved.
Table 1. First search string result
Scopus: 821; ScienceDirect: 437; EBSCOhost: 681; IEEE Xplore: 576; Web of Science: 465; SpringerLink: 268; ACM DL: 144; Total: 3,392

The search was further streamlined by restricting it to computer science-related papers and papers published between 2014 and 2023. At this point, 618 papers remained after a total of 2,774 papers were removed, as displayed in Table 2.

Table 2. Second search string result
Scopus: 129; ScienceDirect: 89; EBSCOhost: 92; IEEE Xplore: 99; Web of Science: 82; SpringerLink: 73; ACM DL: 54; Total: 618

After the second search, we went through the titles of the 618 remaining papers and discovered that only 292 have relevant titles, as shown in Table 3.

Table 3. Papers with relevant titles
Scopus: 69; ScienceDirect: 44; EBSCOhost: 51; IEEE Xplore: 34; Web of Science: 41; SpringerLink: 31; ACM DL: 22; Total: 292

Next, we went through the abstracts and introductions of the papers with relevant titles to determine whether they aligned with the research questions stated earlier. The papers' citations were exported to Microsoft Excel to facilitate analysis, and three categories were used to classify the papers: "relevant", "partially relevant", and "not relevant". Relevant papers were marked green, partially relevant papers yellow, and not-relevant papers red. At this point, 88 papers were determined to be "relevant", 74 papers "partially relevant", and 130 papers "not relevant". After a rigorous review of the abstracts, 233 publications were eliminated based on the exclusion criteria, leaving 59 papers, as indicated in Table 4, for qualitative evaluation according to the study questions.

Table 4. Final selection result
Scopus: 13; ScienceDirect: 10; EBSCOhost: 11; IEEE Xplore: 6; Web of Science: 8; SpringerLink: 7; ACM DL: 5; Total: 59

3.5. Eligibility Criteria
3.6.
Inclusion Criteria
Papers from peer-reviewed conferences, journals, and workshops published between 2014 and 2023 were included. Additionally, where publications reported identical studies and outcomes, the most recent papers were chosen.

3.7. Exclusion Criteria
Papers not written in English, unrelated to sentiment analysis, or whose contributions are not explicitly stated in the abstract were excluded from this study.

4. Results and Discussion
The study's results are now presented in relation to the research questions that guided the conduct of the systematic literature review.
• RQ1. What are the most investigated aspects in the education domain concerning sentiment analysis?
Students' opinions provide essential knowledge on different educational entities, such as lecturers, institutions, classes, and the teaching approaches involving these entities. Recognizing these aspects as they are expressed in students' textual remarks is crucial because it helps decision-makers take the necessary steps to address them specifically. In this context, we examined and categorized the reviewed articles according to the issues the authors wanted to investigate. Specifically, we discovered three groups and the associated teaching aspects that were the focus of these studies. The first group of researchers looked at how students responded to different qualities of their teachers, such as their knowledge, behavior, and pedagogy. The second group includes publications addressing other facets of the three distinct entities: courses, teachers, and institutions. Institution-related features include tuition costs, the campus, student life, and other characteristics connected to the institution entity.
Course-related aspects comprised dimensions like course content, course structure, and evaluation. Meanwhile, the third group includes papers examining the perspectives and attitudes of students toward institutional entities. From our findings, as illustrated in Table 5, 76% of the papers reviewed were based on extracting students' thoughts, opinions, and attitudes toward teachers, and 16% were based on extracting students' opinions toward courses and institutions, while the remaining 8% were based on extracting students' opinions toward institutions.

Table 5. Student feedback aspects examined in the reviewed papers
Students' opinion: Towards teachers: 76%; Towards institutions: 8%; Towards courses and institutions: 16%

• RQ2. Which techniques and models are extensively researched for using sentiment analysis in education?
Various techniques and models have been used to conduct sentiment analysis. These techniques are generally classified into three groups: supervised learning, unsupervised learning, and lexicon-based techniques. While some researchers use only supervised, unsupervised, or lexicon-based techniques, others use a hybrid of two primary techniques. Table 6 shows the learning techniques used for sentiment analysis in the area of education.

Table 6. Learning techniques used for sentiment analysis in the education domain
Supervised: [7], [8], [9], [10], [11], [12], [13], [2], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29].
Unsupervised: [30], [31], [32], [33], [34].
Lexicon-based: [35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47].
Supervised and unsupervised: [48], [49], [50], [51].
Supervised and lexicon-based: [52], [53], [54], [55], [56], [57], [58], [59], [60], [61].
Unsupervised and lexicon-based: [62], [63], [64].
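To make the supervised route in Table 6 concrete, here is a minimal sketch of a multinomial Naïve Bayes polarity classifier of the kind that dominates the reviewed papers. The training sentences and labels below are invented for illustration and are not taken from any reviewed study; real work would train on a labeled feedback corpus with proper preprocessing and feature extraction.

```python
from collections import Counter, defaultdict
import math

# Hypothetical labeled feedback (illustrative only, not from the reviewed papers).
train = [
    ("the lecturer explains concepts clearly", "positive"),
    ("great course and helpful teacher", "positive"),
    ("lectures were boring and confusing", "negative"),
    ("poor feedback and unclear slides", "negative"),
]

def tokenize(text):
    return text.lower().split()

# Per-class word frequencies and class priors.
word_counts = defaultdict(Counter)
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(tokenize(text))

vocab = {w for counts in word_counts.values() for w in counts}

def predict(text):
    """Most likely class under multinomial Naive Bayes with Laplace smoothing."""
    scores = {}
    for label in class_counts:
        log_prob = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for w in tokenize(text):
            log_prob += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = log_prob
    return max(scores, key=scores.get)

print(predict("clearly explained and helpful"))  # prints "positive"
```

The reviewed papers typically pair such classifiers with TF-IDF or n-gram features and far larger corpora; this sketch only shows the mechanics.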
Table 7 highlights the supervised learning models widely studied for sentiment analysis in education. These models include the Decision Tree (DT), Support Vector Machine (SVM), K-Nearest Neighbor (KNN), Naïve Bayes (NB), and Neural Network (NN).

Table 7. Supervised learning models widely studied for sentiment analysis in the education domain
DT: [7], [16], [18], [19], [24], [25], [41], [56], [59].
SVM: [2], [7], [8], [9], [12], [15], [16], [17], [19], [20], [21], [24], [25], [38], [41], [48], [51], [52], [56], [57], [59], [60], [64].
KNN: [9], [15], [18], [19], [23], [52], [64].
NB: [9], [12], [15], [16], [17], [20], [21], [22], [23], [24], [25], [26], [34], [38], [41], [51], [52], [56], [57], [58], [59], [64], [65].
NN: [9], [11], [13], [14], [17], [19], [23], [38], [41], [59], [61].

Additionally, as shown in Table 6, lexicon-based approaches, also called rule-based sentiment analysis, were frequently used in several research studies and were often combined with either supervised or unsupervised learning techniques. We observed that the Valence Aware Dictionary and Sentiment Reasoner (VADER) and SentiWordNet were used far more frequently than TextBlob, MPQA, SentiStrength, and Semantria. Table 8 lists the most commonly used lexicons among the examined publications.

Table 8. Frequently used lexicons
VADER: [36], [38], [42], [43], [48].
SentiWordNet: [46], [56], [57], [65].
Semantria: [45], [58].
SentiStrength: [44].
TextBlob: [38], [49].
MPQA: [20].

• RQ3. What are the most common metrics for measuring the effectiveness of sentiment analysis systems?
Systems designed for sentiment analysis were commonly evaluated using information retrieval-based metrics such as precision, recall, and F1-score.
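The lexicons in Table 8 all share the same rule-based mechanics: look tokens up in a valence dictionary and aggregate their scores, adjusting for cues such as negation and intensifiers. The sketch below imitates that idea with a hand-made mini lexicon; the word valences, negator window, and polarity thresholds are invented assumptions, and real resources such as VADER or SentiWordNet are far larger and apply richer heuristics.

```python
# Toy valence lexicon (scores are invented for illustration).
LEXICON = {"good": 1.9, "great": 3.1, "helpful": 1.8,
           "boring": -1.3, "poor": -2.1, "confusing": -1.8}
NEGATORS = {"not", "never", "no"}
INTENSIFIERS = {"very": 0.3, "extremely": 0.5}

def score(text):
    """Sum word valences with simple intensifier and negation handling."""
    tokens = text.lower().split()
    total = 0.0
    for i, tok in enumerate(tokens):
        if tok not in LEXICON:
            continue
        valence = LEXICON[tok]
        # Boost valence if the preceding token is an intensifier.
        if i > 0 and tokens[i - 1] in INTENSIFIERS:
            valence *= 1.0 + INTENSIFIERS[tokens[i - 1]]
        # Flip valence if a negator appears among the preceding three tokens.
        if any(t in NEGATORS for t in tokens[max(0, i - 3):i]):
            valence = -valence
        total += valence
    return total

def polarity(text, threshold=0.05):
    s = score(text)
    return "positive" if s > threshold else "negative" if s < -threshold else "neutral"

print(polarity("the lecturer was very helpful"))  # prints "positive"
```

Because no training data is needed, this is why lexicon-based methods suit unlabeled student feedback, at the cost of missing domain-specific vocabulary.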
Additionally, other research used statistics-based measures to evaluate system performance. It is interesting to compare the number of articles that utilized a given assessment measure with the number that either performed no evaluation or did not state the metrics employed. Table 9 shows the percentage of articles for each assessment metric.

Table 9. Percentage of evaluation metrics applied in the reviewed papers
Information retrieval-based metrics (accuracy, precision, F1-score, and recall): 67%; Kappa: 4%; Pearson R-value: 3%; N/A: 26%

Table 9 shows that 67% of the publications featured accuracy or other evaluation metrics such as precision, recall, and F1-score. On the other hand, Kappa was employed in just 4% of the research, Pearson's R-value in 3%, and no assessment metrics were specified in 26% of the research.

• RQ4. What are the most popular methods for gathering student feedback?
While reviewing the papers in this study, we found different data sources and divided them into three categories based on their characteristics:
1) Questionnaires/surveys: This dataset category was collected by providing questionnaires to gather student feedback or conducting a survey among teachers and students.
2) Social media and blogs: This category comprises data collected through social media platforms like Facebook, Twitter, and blogs.
3) Education/research platforms: In this category, data are extracted through online education and research platforms such as edX, Coursera, ResearchGate, Kaggle, and LinkedIn.
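Returning to RQ3, every information retrieval-based metric in Table 9 derives from a classifier's confusion counts. A minimal, self-contained sketch (the counts below are hypothetical, not taken from any reviewed paper):

```python
def prf(tp, fp, fn):
    """Precision, recall, and F1 from true-positive, false-positive, and false-negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical sentiment-classifier results: 40 TP, 10 FP, 10 FN.
p, r, f = prf(40, 10, 10)
print(f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")  # prints "precision=0.80 recall=0.80 f1=0.80"
```

In multi-class sentiment settings (positive/negative/neutral), these per-class values are usually macro- or micro-averaged.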
Based on the reviewed papers, roughly two-thirds of the papers disclosed their data source, while about one-third did not disclose information about the source of the dataset collected. A tabular representation of these papers and dataset sources is shown in Table 10.

Table 10. Dataset sources that the reviewed papers have used
1. Questionnaires/surveys: [7], [10], [19], [22], [23], [29], [30], [31], [35], [40], [42], [43], [51], [52], [57], [58], [59], [61]. Collected by providing questionnaires to gather student feedback or conducting a survey among teachers and students.
2. Social media and blogs: [12], [21], [32], [34], [39], [41], [46], [47], [48], [59], [60], [62], [64], [66]. Data collected through social media platforms like Facebook, Twitter, and blogs.
3. Research platforms/education: [8], [9], [13], [24], [25], [26], [28], [50], [56]. Data extracted through online education and research platforms such as edX, Coursera, ResearchGate, Kaggle, and LinkedIn.

4.1. Identified Gaps and Challenges
We observed that some areas in students' feedback sentiment analysis need more research and development. One of these areas, from RQ1, is the handling of figurative language in students' feedback, such as irony and sarcasm; this area is lacking and needs further study. In RQ2, we observed that most domain-specific techniques do not perform well across multiple domains. Another challenge from RQ2 is the inability to handle complex constructs such as abbreviations and words with multiple meanings. In RQ4, most of the datasets in the reviewed papers are unstructured; therefore, identifying the entities to which the sentiments were directed is not feasible without applying an entity extraction model, which limits the application of the existing datasets.

4.2.
Limitation of the Review
As the authors explored papers from Scopus, ScienceDirect, EBSCO, Web of Science, IEEE Xplore, ACM DL, and SpringerLink, relevant papers from other databases may have been missed. Also, the research team's analysis was based on the selected papers that were reviewed, while other research exists concerning techniques and methods, as well as the technologies and tools employed in sentiment analysis.

5. Conclusion
From our review study, we identified the significant student feedback aspects in sentiment analysis. Based on the papers reviewed, we observed that the highest rate, 76%, concerns teachers, while only 8% concerns institutions and the remaining 16% concerns courses and institutions. Furthermore, we identified five (5) techniques that are majorly researched for sentiment analysis in education: supervised learning, unsupervised learning, lexicon-based, supervised & lexicon-based, and unsupervised & lexicon-based. Within the supervised learning approach, we also identified five (5) machine learning algorithms: Decision Tree, Support Vector Machine, K-Nearest Neighbor, Naïve Bayes, and Neural Network. The lexicons associated with the lexicon-based approaches in the reviewed papers are VADER, SentiWordNet, Semantria, SentiStrength, TextBlob, and MPQA. We also identified the most common metrics for measuring the effectiveness of sentiment analysis systems: information retrieval-based evaluation metrics (such as accuracy, precision, F1-score, and recall), Kappa, and Pearson R-value.
We observed that 26% of the papers reviewed did not use any evaluation metrics, while a high percentage (67%) used information retrieval-based evaluation metrics, and Kappa and Pearson R-value were used in 4% and 3% of the papers, respectively. Finally, we identified the most popular methods for gathering student feedback: questionnaires/surveys, social media and blogs, and education/research platforms.

5.1. Further Work
Based on the challenges and gaps identified in the reviewed papers, we recommend future research on the following aspects.
• Dataset size and structure: the majority of the papers reviewed in this research used a small dataset with fewer than five thousand samples, which affected the results [67], so future research can work on larger datasets to make the results more reliable. Also, a structured feedback dataset is needed via surveys and questionnaires, rather than the unstructured formats used.
• Emotion detection: only a few of the reviewed articles focused on detecting students' emotions for sentiment analysis. Thus, we recommend future work that considers using students' emotional expressions as feedback for student sentiment analysis.

Acknowledgment
We want to thank and acknowledge God Almighty for making it possible for us to complete this manuscript. We sincerely thank the Management and staff of the Federal University, Lokoja, for creating a peaceful learning environment devoid of intimidation, harassment, and other forms of crime. We pray and wish that peace continues to exist in this University.

References
[1] R. Plutchik, “The Nature of Emotions,” Am. Sci., vol. 89, no. 4, p. 344, 2001, doi: 10.1511/2001.28.344.
[2] K. Mite-Baidal, C. Delgado-Vera, E. Solís-Avilés, A. H. Espinoza, J. Ortiz-Zambrano, and E. Varela-Tapia, “Sentiment Analysis in Education Domain: A Systematic Literature Review,” in Communications in Computer and Information Science, vol. 883, Springer Verlag, 2018, pp. 285–297, doi: 10.1007/978-3-030-00940-3_21.
[3] Z.
Han, J. Wu, C. Huang, Q. Huang, and M. Zhao, “A review on sentiment discovery and analysis of educational big-data,” WIREs Data Min. Knowl. Discov., vol. 10, no. 1, p. e1328, Jan. 2020, doi: 10.1002/widm.1328.
[4] J. Zhou and J. Ye, “Sentiment analysis in education research: a review of journal publications,” Interactive Learning Environments, pp. 1–13, Routledge, 2020, doi: 10.1080/10494820.2020.1826985.
[5] B. Kitchenham and P. Brereton, “A systematic review of systematic review process research in software engineering,” Inf. Softw. Technol., vol. 55, no. 12, pp. 2049–2075, Dec. 2013, doi: 10.1016/j.infsof.2013.07.010.
[6] M. Höst and A. Oručević-Alagić, “A systematic review of research on open source software in commercial software product development,” Inf. Softw. Technol., vol. 53, no. 6, pp. 616–624, Jun. 2011, doi: 10.1016/j.infsof.2010.12.009.
[7] D. K. Dake and E. Gyimah, “Using sentiment analysis to evaluate qualitative students’ responses,” Educ. Inf. Technol., vol. 28, no. 4, pp. 4629–4647, Apr. 2023, doi: 10.1007/s10639-022-11349-1.
[8] R. Faizi and S. El Fkihi, “A Sentiment Analysis Based Approach for Exploring Student Feedback,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13449 LNCS, Springer Science and Business Media Deutschland GmbH, 2022, pp. 52–59, doi: 10.1007/978-3-031-15273-3_6.
[9] J. G. K. Mabunda, A. Jadhav, and R.
Ajoodha, “Sentiment Analysis of Student Textual Feedback to Improve Teaching,” in Interdisciplinary Research in Technology and Management, London: CRC Press, 2021, pp. 643–651, doi: 10.1201/9781003202240-100. [10] N. R, P. M. S, P. P. Harithas, and V. Hegde, “Sentimental Analysis on Student Feedback using NLP & POS Tagging,” in 2022 International Conference on Edge Computing and Applications (ICECAA), Oct. 2022, pp. 309–313, doi: 10.1109/ICECAA55415.2022.9936569. [11] H. Peng, Z. Zhang, and H. Liu, “A Sentiment Analysis Method for Teaching Evaluation Texts Using Attention Mechanism Combined with CNN-BLSTM Model,” Sci. Program., vol. 2022, pp. 1–9, Feb. 2022, doi: 10.1155/2022/8496151. [12] M. Umair, A. Hakim, A. Hussain, and S. Naseem, “Sentiment Analysis of Students’ Feedback before and after COVID-19 Pandemic,” Int. J. Emerg. Technol., vol. 12, no. 2, pp. 177–182, 2021, [Online]. Available at: https://www.researchgate.net/profile/Muhammad-Umair-35/publication/353305417. [13] S. Katragadda, V. Ravi, P. Kumar, and G. J. Lakshmi, “Performance Analysis on Student Feedback using Machine Learning Algorithms,” in 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), Mar. 2020, pp. 1161–1163, doi: 10.1109/ICACCS48705.2020.9074334. [14] M. L. Barrón Estrada, R. Zatarain Cabada, R. Oramas Bustillos, and M. Graff, “Opinion mining and emotion recognition applied to learning environments,” Expert Syst. Appl., vol. 150, p. 113265, Jul. 2020, doi: 10.1016/j.eswa.2020.113265. [15] K. F. Hew, X. Hu, C. Qiao, and Y. Tang, “What predicts student satisfaction with MOOCs: A gradient boosting trees supervised machine learning and sentiment analysis approach,” Comput. Educ., vol. 145, p. 103724, Feb. 2020, doi: 10.1016/j.compedu.2019.103724. [16] N. T. P. Giang, T. T. Dien, and T. T. M. Khoa, “Sentiment Analysis for University Students’ Feedback,” in Advances in Intelligent Systems and Computing, vol. 1130 AISC, Springer, 2020, pp. 
55–66, doi: 10.1007/978-3-030-39442-4_5. [17] Z. Kastrati, F. Dalipi, A. S. Imran, K. Pireva Nuci, and M. A. Wani, “Sentiment Analysis of Students’ Feedback with NLP and Deep Learning: A Systematic Mapping Study,” Appl. Sci., vol. 11, no. 9, p. 3986, Apr. 2021, doi: 10.3390/app11093986. [18] R. K. Kavitha, “Sentiment Research on Student Feedback to Improve Experiences in Blended Learning Environments,” Int. J. Innov. Technol. Explor. Eng., vol. 8, no. 11S, pp. 159–163, Oct. 2019, doi: 10.35940/ijitee.K1034.09811S19. [19] N. Spatiotis, I. Perikos, I. Mporas, and M. Paraskevas, “Evaluation of an Educational Training Platform Using Text Mining,” in Proceedings of the 10th Hellenic Conference on Artificial Intelligence, Jul. 2018, pp. 1–5, doi: 10.1145/3200947.3201049. [20] A. Terkik, E. Prud’Hommeaux, C. O. Alm, C. Homan, and S. Franklin, “Analyzing Gender Bias in Student Evaluations.” pp. 868–876, 2016, Accessed: Apr. 27, 2023. [Online]. Available at: https://aclanthology.org/C16-1083. [21] G. G. Esparza et al., “A Sentiment Analysis Model to Analyze Students Reviews of Teacher Performance Using Support Vector Machines,” in Advances in Intelligent Systems and Computing, vol. 620, Springer Verlag, 2018, pp. 157–164, doi: 10.1007/978-3-319-62410-5_19. [22] K. Van Nguyen, V. D. Nguyen, P. X. V. Nguyen, T. T. H. Truong, and N. L.-T. Nguyen, “UIT-VSFC: Vietnamese Students’ Feedback Corpus for Sentiment Analysis,” in 2018 10th International Conference on Knowledge and Systems Engineering (KSE), Nov. 2018, pp. 19–24, doi: 10.1109/KSE.2018.8573337. [23] A. Koufakou, J. Gosselin, and D. Guo, “Using data mining to extract knowledge from student evaluation comments in undergraduate courses,” in 2016 International Joint Conference on Neural Networks (IJCNN), Jul. 2016, vol. 2016-Octob, pp. 3138–3142, doi: 10.1109/IJCNN.2016.7727599. [24] J. Sultana, N. Sultana, K. Yadav, and F. 
AlFayez, “Prediction of Sentiment Analysis on Educational Data based on Deep Learning Approach,” in 2018 21st Saudi Computer Society National Computer RE TR AC TE D https://doi.org/10.1007/978-3-031-15273-3_6 https://doi.org/10.1201/9781003202240-100 https://doi.org/10.1109/ICECAA55415.2022.9936569 https://doi.org/10.1155/2022/8496151 https://www.researchgate.net/profile/Muhammad-Umair-35/publication/353305417 https://doi.org/10.1109/ICACCS48705.2020.9074334 https://doi.org/10.1016/j.eswa.2020.113265 https://doi.org/10.1016/j.compedu.2019.103724 https://doi.org/10.1007/978-3-030-39442-4_5 https://doi.org/10.3390/app11093986 https://doi.org/10.35940/ijitee.K1034.09811S19 https://doi.org/10.1145/3200947.3201049 https://aclanthology.org/C16-1083 https://doi.org/10.1007/978-3-319-62410-5_19 https://doi.org/10.1109/KSE.2018.8573337 https://doi.org/10.1109/IJCNN.2016.7727599 186 Bulletin of Social Informatics Theory and Application ISSN 2614-0047 Vol. 6, No. 2, December 2022, pp. 177-188 Oghu et.al (A review of sentiment analysis approaches for quality assurance in teaching and learning) Conference (NCC), Apr. 2018, pp. 1–5, doi: 10.1109/NCG.2018.8593108. [25] Z. Mutlaq Ibrahim, M. Bader-El-Den, and M. Cocea, “A data mining framework for analyzing students’ feedback of assessment,” CEUR Workshop Proc., vol. 2294, pp. 1–7, 2018, [Online]. Available at: https://pure.port.ac.uk/ws/files/13569657/DCECTEL2018_paper_13.pdf. [26] K. S. Krishnaveni, R. R. Pai, and V. Iyer, “Faculty rating system based on student feedbacks using sentimental analysis,” in 2017 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Sep. 2017, vol. 2017-Janua, pp. 1648–1653, doi: 10.1109/ICACCI.2017.8126079. [27] M. P. Ortega, L. B. Mendoza, J. M. Hormaza, and S. V. Soto, “Accuracy’ Measures of Sentiment Analysis Algorithms for Spanish Corpus generated in Peer Assessment,” in Proceedings of the 6th International Conference on Engineering & MIS 2020, Sep. 
2020, pp. 1–7, doi: 10.1145/3410352.3410838. [28] H. H. Lwin, S. Oo, K. Z. Ye, K. Kyaw Lin, W. P. Aung, and P. Paing Ko, “Feedback Analysis in Outcome Base Education Using Machine Learning,” in 2020 17th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), Jun. 2020, pp. 767–770, doi: 10.1109/ECTI-CON49241.2020.9158328. [29] H. G. Jiménez, M. A. Casanova, A. C. Finamore, and G. Simões, “Sentiment Analysis of Student Surveys A Case Study on Assessing the Impact of the COVID-19 Pandemic on Higher Education Teaching.,” in International Educational Data Mining Society, 2021, pp. 1–7, Accessed: Apr. 27, 2023. [Online]. Available at: https://eric.ed.gov/?id=ED615622. [30] S. Cunningham-Nelson, M. Baktashmotlagh, and W. Boles, “Visualizing Student Opinion Through Text Analysis,” IEEE Trans. Educ., vol. 62, no. 4, pp. 305–311, Nov. 2019, doi: 10.1109/TE.2019.2924385. [31] O. Chantamuang, J. Polpinij, V. Vorakitphan, and B. Luaphol, “Sentence-Level Sentiment Analysis for Student Feedback Relevant to Teaching Process Assessment,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13651 LNAI, Springer Science and Business Media Deutschland GmbH, 2022, pp. 156–168, doi: 10.1007/978-3-031-20992-5_14. [32] M. Sivakumar and U. S. Reddy, “Aspect based sentiment analysis of students opinion using machine learning techniques,” in 2017 International Conference on Inventive Computing and Informatics (ICICI), Nov. 2017, pp. 726–731, doi: 10.1109/ICICI.2017.8365231. [33] G. I. Nitin, G. Swapna, and V. Shankararaman, “Analyzing educational comments for topics and sentiments: A text analytics approach,” in 2015 IEEE Frontiers in Education Conference (FIE), Oct. 2015, vol. 2015, pp. 1–9, doi: 10.1109/FIE.2015.7344296. [34] M. L. Barron-Estrada, R. Zatarain-Cabada, R. Oramas-Bustillos, and F. 
Gonzalez-Hernandez, “Sentiment Analysis in an Affective Intelligent Tutoring System,” in 2017 IEEE 17th International Conference on Advanced Learning Technologies (ICALT), Jul. 2017, pp. 394–397, doi: 10.1109/ICALT.2017.137. [35] G. S. Chauhan, P. Agrawal, and Y. K. Meena, “Aspect-Based Sentiment Analysis of Students’ Feedback to Improve Teaching–Learning Process,” in Smart Innovation, Systems and Technologies, vol. 107, Springer Science and Business Media Deutschland GmbH, 2019, pp. 259–266, doi: 10.1007/978-981-13-1747-7_25. [36] M. C. Martinis, C. Zucco, and M. Cannataro, “An Italian lexicon-based sentiment analysis approach for medical applications,” in Proceedings of the 13th ACM International Conference on Bioinformatics, Computational Biology and Health Informatics, Aug. 2022, pp. 1–4, doi: 10.1145/3535508.3545594. [37] Q. Rajput, S. Haider, and S. Ghani, “Lexicon-Based Sentiment Analysis of Teachers’ Evaluation,” Appl. Comput. Intell. Soft Comput., vol. 2016, pp. 1–12, 2016, doi: 10.1155/2016/2385429. [38] R. Cobos, F. Jurado, and A. Blazquez-Herranz, “A Content Analysis System That Supports Sentiment Analysis for Subjectivity and Polarity Detection in Online Courses,” IEEE Rev. Iberoam. Tecnol. del Aprendiz., vol. 14, no. 4, pp. 177–187, Nov. 2019, doi: 10.1109/RITA.2019.2952298. 
[39] C. Pong-inwong and W. Songpan, “Sentiment analysis in teaching evaluations using sentiment phrase pattern matching (SPPM) based on association mining,” Int. J. Mach. Learn. Cybern., vol. 10, no. 8, pp. 2177–2186, Aug. 2019, doi: 10.1007/s13042-018-0800-2.
[40] K. Nilanga, M. Herath, H. Maduwantha, and S. Ranathunga, “Dataset and Baseline for Automatic Student Feedback Analysis,” in Language Resources and Evaluation Conference (LREC 2022), Jun. 2022, pp. 2042–2049. [Online]. Available at: https://aclanthology.org/2022.lrec-1.219.
[41] V. D. Nguyen, K. Van Nguyen, and N. L.-T. Nguyen, “Variants of Long Short-Term Memory for Sentiment Analysis on Vietnamese Students’ Feedback Corpus,” in 2018 10th International Conference on Knowledge and Systems Engineering (KSE), Nov. 2018, pp. 306–311, doi: 10.1109/KSE.2018.8573351.
[42] H. Newman and D. Joyner, “Sentiment Analysis of Student Evaluations of Teaching,” in Lecture Notes in Computer Science, vol. 10948 LNAI, Springer Verlag, 2018, pp. 246–250, doi: 10.1007/978-3-319-93846-2_45.
[43] A. G. S. Raj, K. Ketsuriyonk, J. M. Patel, and R. Halverson, “What Do Students Feel about Learning Programming Using Both English and Their Native Language?,” in 2017 International Conference on Learning and Teaching in Computing and Engineering (LaTICE), Apr. 2017, pp. 1–8, doi: 10.1109/LaTiCE.2017.8.
[44] P. Jiranantanagorn and H. Shen, “Sentiment analysis and visualisation in a backchannel system,” in Proceedings of the 28th Australian Conference on Computer-Human Interaction (OzCHI ’16), Nov. 2016, pp. 353–357, doi: 10.1145/3010915.3010992.
[45] C. L. Santos, P. Rita, and J. Guerreiro, “Improving international attractiveness of higher education institutions based on text mining and sentiment analysis,” Int. J. Educ. Manag., vol. 32, no. 3, pp. 431–447, Apr. 2018, doi: 10.1108/IJEM-01-2017-0027.
[46] A. Iram, “Sentiment Analysis of Student’s Facebook Posts,” in Communications in Computer and Information Science, vol. 932, Springer Verlag, 2019, pp. 86–97, doi: 10.1007/978-981-13-6052-7_8.
[47] Z. Liu, C. Yang, X. Peng, J. Sun, and S. Liu, “Joint Exploration of Negative Academic Emotion and Topics in Student-Generated Online Course Comments,” in 2017 International Conference of Educational Innovation through Technology (EITT), Dec. 2017, pp. 89–93, doi: 10.1109/EITT.2017.29.
[48] G. Elia, G. Solazzo, G. Lorenzo, and G. Passiante, “Assessing learners’ satisfaction in collaborative online courses through a big data approach,” Comput. Human Behav., vol. 92, pp. 589–599, Mar. 2019, doi: 10.1016/j.chb.2018.04.033.
[49] S. Pyasi, S. Gottipati, and V. Shankararaman, “SUFAT - An Analytics Tool for Gaining Insights from Student Feedback Comments,” in 2018 IEEE Frontiers in Education Conference (FIE), Oct. 2018, pp. 1–9, doi: 10.1109/FIE.2018.8658457.
[50] F. F. Lubis, Y. Rosmansyah, and S. H. Supangkat, “Experience in learners review to determine attribute relation for course completion,” in 2016 International Conference on ICT For Smart Society (ICISS), Jul. 2016, pp. 32–36, doi: 10.1109/ICTSS.2016.7792865.
[51] F. Colace, L. Casaburi, M. De Santo, and L. Greco, “Sentiment detection in social networks and in collaborative learning environments,” Comput. Human Behav., vol. 51, pp. 1061–1067, Oct. 2015, doi: 10.1016/j.chb.2014.11.090.
[52] D. S. A. M. Mrtdaa Mohammed Almosawi, “Lexicon-Based Approach For Sentiment Analysis To Student Feedback,” Webology, vol. 19, no. 1, pp. 6971–6989, 2022. [Online]. Available at: http://www.webology.org/abstract.php?id=1537.
[53] N. Spatiotis, I. Perikos, I. Mporas, and M. Paraskevas, “Sentiment Analysis of Teachers Using Social Information in Educational Platform Environments,” Int. J. Artif. Intell. Tools, vol. 29, no. 02, p. 2040004, Mar. 2020, doi: 10.1142/S0218213020400047.
[54] K. Sangeetha and D. Prabha, “RETRACTED ARTICLE: Sentiment analysis of student feedback using multi-head attention fusion model of word and context embedding for LSTM,” J. Ambient Intell. Humaniz. Comput., vol. 12, no. 3, pp. 4117–4126, Mar. 2021, doi: 10.1007/s12652-020-01791-9.
[55] N. Soe and P. T. Soe, “Domain Oriented Aspect Detection for Student Feedback System,” in 2019 International Conference on Advanced Information Technologies (ICAIT), Nov. 2019, pp. 90–95, doi: 10.1109/AITC.2019.8921372.
[56] M. Korkmaz, “Sentiment analysis on university satisfaction in social media,” in 2018 Electric Electronics, Computer Science, Biomedical Engineerings’ Meeting (EBBT), Apr. 2018, pp. 1–4, doi: 10.1109/EBBT.2018.8391463.
[57] L. Mandal, R. Das, S. Bhattacharya, and P. N. Basu, “Intellimote: a hybrid classifier for classifying learners’ emotion in a distributed e-learning environment,” Turkish J. Electr. Eng. Comput. Sci., vol. 25, no. 3, pp. 2084–2095, Jan. 2017, doi: 10.3906/elk-1510-120.
[58] K. Muollo, P. Basavaraj, and I. Garibay, “Understanding Students’ Online Reviews to Improve College Experience and Graduation Rates of STEM Programs at the Largest Post-Secondary Institution: A Learner-Centered Study,” in 2018 IEEE Frontiers in Education Conference (FIE), Oct. 2018, pp. 1–7, doi: 10.1109/FIE.2018.8658450.
[59] Y. Sahu, G. S. Thakur, and S. Dhyani, “Dynamic feature based computational model of sentiment analysis to improve teaching learning system,” Int. J. Emerg. Technol., vol. 10, no. 4, pp. 17–23, 2019. [Online]. Available at: https://www.researchtrend.net/ijet/pdf/Dynamic%20Feature%20based%20Computational%20model%20of%20Sentiment%20Analysis%20to%20Improve%20Teaching%20Learning%20System%20Y.%20SAHU.pdf.
[60] Z. Nasim, Q. Rajput, and S. Haider, “Sentiment analysis of student feedback using machine learning and lexicon based approaches,” in 2017 International Conference on Research and Innovation in Information Systems (ICRIIS), Jul. 2017, pp. 1–6, doi: 10.1109/ICRIIS.2017.8002475.
[61] I. Sindhu, S. Muhammad Daudpota, K. Badar, M. Bakhtyar, J. Baber, and M. Nurunnabi, “Aspect-Based Opinion Mining on Student’s Feedback for Faculty Teaching Performance Evaluation,” IEEE Access, vol. 7, pp. 108729–108741, 2019, doi: 10.1109/ACCESS.2019.2928872.
[62] S. Srinivas and S. Rajendran, “Topic-based knowledge mining of online student reviews for strategic planning in universities,” Comput. Ind. Eng., vol. 128, pp. 974–984, Feb. 2019, doi: 10.1016/j.cie.2018.06.034.
[63] P. X. V. Nguyen, T. T. T. Hong, K. Van Nguyen, and N. L. T. Nguyen, “Deep Learning versus Traditional Classifiers on Vietnamese Students’ Feedback Corpus,” in Proceedings of the 2018 5th NAFOSTED Conference on Information and Computer Science (NICS 2018), Jan. 2019, pp. 75–80, doi: 10.1109/NICS.2018.8606837.
[64] A. A. Abed and A. M. El-Halees, “Detecting Subjectivity in Staff Performance Appraisals by Using Text Mining: Teachers Appraisals of Palestinian Government Case Study,” in Proceedings of the 2017 Palestinian International Conference on Information and Communication Technology (PICICT 2017), Sep. 2017, pp. 120–125, doi: 10.1109/PICICT.2017.25.
[65] L. Balachandran and A. Kirupananda, “Online reviews evaluation system for higher education institution: An aspect based sentiment analysis tool,” in 2017 11th International Conference on Software, Knowledge, Information Management and Applications (SKIMA), Dec. 2017, pp. 1–7, doi: 10.1109/SKIMA.2017.8294118.
[66] K. Nimala and R. Jebakumar, “RETRACTED ARTICLE: Sentiment topic emotion model on students feedback for educational benefits and practices,” Behav. Inf. Technol., vol. 40, no. 3, pp. 311–319, Feb. 2021, doi: 10.1080/0144929X.2019.1687756.
[67] V. Kagklis, A. Karatrantou, M. Tantoula, C. T. Panagiotakopoulos, and V. S. Verykios, “A Learning Analytics Methodology for Detecting Sentiment in Student Fora: A Case Study in Distance Education,” Eur. J. Open, Distance E-Learning, vol. 18, no. 2, pp. 74–94, Dec. 2015, doi: 10.1515/eurodl-2015-0014.