Linguistic, English Education and Art (LEEA) Journal
Volume 5 Nomor 1, Juli-Desember 2021
e-ISSN: 2597-3819  p-ISSN: 2597-9248
DOI: https://doi.org/10.31539/leea.v5i1.3102

IMPLEMENTATION OF LEVELS OF THINKING SKILLS AND COMMUNICATIVE LANGUAGE ABILITY IN ENGLISH NATIONAL EXAM

Hervina 1
STKIP Abdi Pendidikan Payakumbuh

Rendi Afriadi 2
Zetka Harmyn Institute

vinaharmyn@gmail.com 1

Submit, 21-11-2021   Accepted, 06-12-2021   Publish, 25-12-2021

ABSTRACT
This study aims to explore the implementation of the learning domain, manifested in the levels of thinking skills, namely HOTS and LOTS, and the concept of CLA in the English national exam. The research method used is descriptive analysis, examining all the questions in the English national exam for SMK in 2014/2015. The results show that all the questions in the 2014/2015 English national exam for SMK are directed at assessing students' cognitive skill domain at different skill levels and through different thinking processes. The concepts of HOTS, LOTS, and CLA have been implemented and applied in the construction and design of this exam. In conclusion, the English national exam has met the requirements of good language testing.

Keywords: Communicative Language Ability, National Exam, Standardized Test, Thinking Skills

INTRODUCTION
The importance of language assessment has long been established and widely agreed upon. The practice of language assessment is inevitable in every teaching and learning context (Ozdemir-Yilmazer & Ozkan, 2017). In other words, language teachers will always involve their students in testing and assessment during the learning process. Moreover, Ahmed et al. (2019) mention that assessment, regardless of its form, is central to the effectiveness of a successful language program. Without the implementation of language assessment and testing, it is hardly possible to know the progress of students' learning. In this case, language assessment serves as the measurement to determine students' achievement in learning as well as teachers' accomplishment in their teaching (Ridhwan, 2017).
As language testing is important for both learning and teaching practice, a wide range of language test designs has been proposed and developed. One of the common designs widely used is the standardized test. As the name suggests, these tests apply a certain set of standards in their implementation. A standardized test is designed to create a valid measurement from which students' skills and knowledge can be inferred in a standardized manner (Cifuentes-Medina et al., 2019). In other words, test takers are tested against the same criteria regardless of their background, prior knowledge, and experience. An example of a standardized test commonly implemented in Indonesia is Ujian Nasional, or the national exam. This standardized test is held every year with the aim of measuring students' achievement at the elementary and secondary levels of education in Indonesia (Firdaos & Ahmad, 2019; Ratnasari, 2018). The implementation of the national exam is expected to improve educational performance as well as to measure graduates' achievement of competencies in certain subjects nationally (Rosidin et al., 2019). The current national examination is administered under the implementation of the K13 curriculum.
Accordingly, the test items of the national exam, in this case English, have to incorporate the fundamental concepts that underlie K13 itself. K13 is developed to prepare learners as qualified human resources able to compete in an era that requires higher-order thinking skills, including the ability to analyze, evaluate, and create (Pratiwi & Mustadi, 2021). In other words, students are required to have abilities beyond merely understanding the material (LOTS). Higher-order thinking skills are needed in every learning process to improve learners' qualities and education. Learners should reach targeted competencies such as critical thinking, creativity and innovation, communication, and collaboration skills. They also need to have high confidence in their higher-order thinking skills.
In addition, the practice of language teaching has now shifted toward Communicative Language Ability (CLA). Unlike previous types of language testing, which focused only on assessing students' language competence without any concern for contextual use, CLA as formulated by Bachman & Palmer (1996) provides a language testing framework that tests not only language competence but also how learners use that competence in communicative contexts. It provides a broad basis for both the development and the use of language tests, which measure both knowledge, or competence, and the capacity for implementing or executing that competence in appropriate, contextualized communicative language use.
With the purpose of assessing students' progress in a national context and preparing them for international competition, which is the main goal of the K13 curriculum, the national exam is ideally required to consider the thinking skill processes and communicative language ability in its test design. Besides, Zaim (2016) suggests that the concepts of learning taxonomy and communicative language ability are important and useful considerations in language test design and development. Therefore, in this present study, the researcher will explore how these concepts are implemented in the 2014/2015 English national exam for the vocational high school level.

LITERATURE REVIEW
National Exam as Standardized Testing
A good standardized test is the result of empirical research and development that may extend beyond simply the establishment of standards. In order to be classified as standardized, a test has to have four characteristics, as proposed by Brown & Abeywickrama (2010). The first characteristic is being standard-based: a standardized test is standard-based in that its application uses systematic procedures and administration. Such a test presupposes certain objectives or performance levels that are held constant from one form of the test to another. The second characteristic is that the test uses norm-referenced criteria. The goal of the test is to place test takers on a continuum across a range of scores and to differentiate test takers by their relative ranking. The third characteristic is being a product of research and development. As previously mentioned, the standardized test design process is not limited to defining test standards; it involves continuous research and development for revising and improving the current test. The last characteristic is systematic scoring and administration procedures. The practice of establishing standards in test administration is also found in the educational context of Indonesia.
Irdiyansyah & Rizki (2018) mention that standardized tests are widely used in the field of education, including in Indonesia, to measure students cognitively. Such tests have been applied for decades under several names, such as Ujian Negara, Evaluasi Belajar Tahap Akhir Nasional, and Ujian Akhir Nasional. Ujian Nasional (abbreviated as UN), held annually throughout the country to measure students' achievement at the end of a learning period at each level, is the latest form of school-leaving examination in Indonesia, in place since 2005 (Ahmad, 2016). However, the national exam was ended in 2020; students' achievement in the final year of their learning is still assessed by each educational institution through its own final test, designed with attention to the core competencies and basic competencies outlined in the curriculum (Menteri Pendidikan dan Kebudayaan, 2021).

Bloom's Taxonomy and Thinking Skills
Seen from its process, language learning involves students' knowledge domains and thinking processes. During the learning process, students are involved in three domains of learning, known as the cognitive, affective, and psychomotor domains (Hoque, 2017; Sari & Rahmah, 2019; Sönmez, 2017). The cognitive aspect is realized through the mastery of concepts and factual information; it reflects the scientific concepts that students must learn during the learning process. The affective aspect includes attitude, motivation, and values, and the psychomotor aspect is strongly related to physical movement and its underlying causes (Hoque, 2017). Essential to the three learning domains, especially the cognitive aspect, are the thinking processes through which these domains are tested. In light of these thinking processes, the concept of Bloom's taxonomy is introduced. Bloom's taxonomy is a system of classification proposed by Benjamin Bloom in 1956 and later revised by Anderson and Krathwohl (Panggabean & Asariski, 2021). According to Rahman & Manaf (2017), the aim of this taxonomy is to make students aware of what they are learning, hence striving to attain more sophisticated levels of learning through six cognitive-learning categories. The process of learning involves students in a continuum of thinking processes, starting from remembering and ending with creating. Abosalem (2018) notes that there is a tendency among learning theorists to divide thinking skills into low order thinking skills (LOTS) and high order thinking skills (HOTS). The process of learning, however, is expected to reach HOTS. In relation to the six cognitive categories, high order thinking skills (HOTS) are at levels C4 to C6, which are analyzing, evaluating, and creating. On the other hand, low order thinking skills (LOTS) correspond to levels C1 to C3, which are remembering, understanding, and applying (Kusuma et al., 2017).
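As a compact illustration of the banding just described (a sketch added here for clarity, not part of the original study), the six revised Bloom levels and the conventional C1-C3 versus C4-C6 split into LOTS and HOTS can be expressed as a small Python lookup:

# Illustrative sketch only: the six cognitive levels of the revised taxonomy
# and the conventional LOTS/HOTS banding (C1-C3 vs. C4-C6) described above.
BLOOM_LEVELS = {
    "C1": "remembering",
    "C2": "understanding",
    "C3": "applying",
    "C4": "analyzing",
    "C5": "evaluating",
    "C6": "creating",
}

def skill_band(level: str) -> str:
    """Map a Bloom level code (C1-C6) to its thinking-skill band."""
    if level not in BLOOM_LEVELS:
        raise ValueError(f"Unknown Bloom level: {level}")
    return "LOTS" if level in ("C1", "C2", "C3") else "HOTS"

# For example, analyzing (C4) falls in the higher-order band:
assert skill_band("C4") == "HOTS"
assert skill_band("C2") == "LOTS"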
Communicative Language Ability
Communicative Language Ability (CLA) provides a basis for both the development and the use of language tests. This language testing framework belongs to the communicative language testing approach, which holds that language assessment should involve the notion of language use in its process (Amirian et al., 2017). CLA was proposed by Bachman & Palmer (1996) as a response to the necessity of basing language tests on a language proficiency framework. In other words, the design of language assessment and testing should base itself on language proficiency rather than on knowledge alone. Ideally, a language test should measure language learners' ability to use the target language in authentic situations (Morrow, 2018). The ability to use language in authentic situations serves as the basic tenet of the CLA framework. The language test, according to this framework, should integrate both linguistic knowledge and communicative competence (Al-Mekhlafi, 2019). In other words, when performing the test, students will utilize both their linguistic knowledge and their communicative competence, which Bachman (2000) defines as the knowledge and the capacity to use such knowledge in appropriate, contextualized language use. Moreover, Bachman & Palmer used the term language ability to refer to language users' capacity to create and interpret discourse (Bachman & Palmer, 2010).
Bachman & Palmer (1996) formulated CLA through three components. The first is language competence, which comprises a set of specific knowledge components that are utilized in communication via language. The second is strategic competence, which refers to the characterization of the mental capacity for implementing the components of language competence in contextualized communicative language use. The last is psychophysiological mechanisms, which refer to the neurological and psychological processes involved in the actual execution of language as a physical phenomenon.

RESEARCH METHOD
This present article aims to analyze the levels of the questions (HOTS or LOTS) based on Bloom's taxonomy across the three learning domains and the concept of communicative language ability (CLA) as they are implemented in the national examination for the English subject. To that end, a document study in the form of content analysis was conducted. The document used as the source of the content analysis is the English test of the 2014/2015 national exam for Vocational High School (SMK). The units of content analyzed are the test items. The data analysis identifies and explores these items to see how they relate to HOTS or LOTS questions and to CLA.

FINDING
After analyzing the document of the national exam for the English subject, the researcher found that the concepts of HOTS and LOTS as well as CLA have been implemented and applied in the construction and design of this test. From the analysis, it is found that the questions assessing students' Low Order Thinking Skills (LOTS) range from C1 to C2 (remembering and understanding), while the questions assessing High Order Thinking Skills (HOTS) involve only C4 (analyzing). The thinking processes of C3, C5, and C6 (applying, evaluating, and creating) are not assessed in this test. In terms of proportion, Low Order Thinking Skills (LOTS) questions are more dominant than High Order Thinking Skills (HOTS) questions, 27 questions (54%) versus 23 questions (46%). Among the Low Order Thinking Skills (LOTS) questions, 11 questions (48%) involve remembering (C1) and 12 questions (52%) involve understanding (C2). As for High Order Thinking Skills (HOTS), all 23 questions assess students' ability in analyzing.
The questions of remembering are found in both the listening and reading sections of the test. In the listening section, such questions ask students to select the best option as the correct answer based on the utterance given.
They must find the information in the options that suits the question simply by remembering information directly stated in the recording. Hence, these items test the cognitive domain, specifically remembering. Similarly, in the reading section, such questions ask students to select the best option as the correct answer based on the text given. They must find the information in the options that suits the question simply by remembering information directly stated in the text. In conclusion, C1 (remembering) questions take the form of finding directly stated information, which does not require a complex thinking process.
The questions of understanding (C2) are found in both the listening and reading sections of the test. In the listening section, such questions ask students to select the best option describing the picture given. Test takers therefore have to draw conclusions from visual information, so these items test the cognitive domain, specifically understanding, through the process of interpreting information and drawing conclusions. In both sections, such questions also ask students to select the best option as the correct definition of a word based on the text or utterance given. They must find the information in the options that suits the question by interpreting or defining the meaning of the words. In addition, understanding (C2) questions also assess students' cognitive process of inferring information. Such questions ask students to find the best option describing the main point of the text. To find the answer, they must make inferences from the text to identify its main point. Therefore, these items test students' cognitive domain, specifically understanding, through the process of inferring information to find the main idea. In short, the understanding (C2) questions found test students' thinking processes of interpreting and inferring.
The questions of C4 (analyzing) are also found in both listening and reading. In the listening section, such questions ask students to select the best option as the spoken response to the utterance given. To select the suitable answer, they must differentiate the nuances of meaning among the three responses. Therefore, these items test the cognitive domain through the thinking process of differentiating. In addition, in the reading section, such questions ask students to analyze the sentence containing a grammatical error. This tests the cognitive domain through the process of recognizing errors. Finally, still in the reading section, such questions also ask students to select the best option to complete the dialogue given. To select the answer, they must differentiate the nuances of meaning among the four responses given in order to select the one suitable to complete the dialogue. Therefore, these items test the cognitive domain, specifically analyzing, through the process of differentiating. In short, analyzing (C4) questions test students' ability in the cognitive processes of differentiating and recognizing errors.
Regarding the concept of communicative language ability, the analysis reveals that the national exam for the English subject has applied the components of CLA, i.e., language competence, strategic competence, and psychophysiological mechanisms. The language competence tested comprises both organizational competence and pragmatic competence. The components of strategic competence and psychophysiological mechanisms are tested indirectly; in other words, there are no questions specifically designed in the test to assess them.
As strategic competence is the characterization of the mental capacity for implementing the components of language competence in contextualized communicative language use, this competence is realized through students' processing of the language components, such as words and sentences, in the test in their attempt to find the answers. In other words, students who can answer most of the questions given can be said to have used their strategic competence maximally and properly in this language test. Similarly, the component of psychophysiological mechanisms is also tested indirectly; there are no questions specifically designed in the test to assess this component. As psychophysiological mechanisms refer to the neurological and psychological processes involved in the actual execution of language as a physical phenomenon, this process operates through students' activity in absorbing the test material given. In the listening section, psychophysiological mechanisms occur as students' ears take in the utterance and their minds process the meaning. Similarly, in the reading section, psychophysiological mechanisms occur as students' eyes view and read the text and their minds process the meaning.
As for language competence, some components are tested directly and others indirectly. The component tested directly is organizational competence, i.e., grammatical competence encompassing vocabulary and grammar. Some test items require students to find the meaning of words and to identify sentences with correct grammar. However, these competences are also tested indirectly: questions asking about the content of an utterance or a text indirectly assess students' grammatical competence, because they must utilize it to understand the text. Textual competence, as part of organizational competence, is also tested. It is realized through questions requiring students to identify how information is arranged in the text, such as which information is important and which is less important.
Finally, pragmatic competence is tested by involving students in selecting the best response to a spoken utterance or a written dialogue (a completion task). In this set of test items, students are given three options in listening and four in reading as possible matches to the utterance or missing dialogue. These options are highly distracting since students have to find the nuance of meaning suitable to the utterance or dialogue. To do this, they need to apply pragmatic competence, recognizing the context of the utterance or dialogue so that the options can be properly matched.

DISCUSSION
The data analysis indicates that the national test questions have incorporated students' cognitive learning domain, both LOTS and HOTS. However, the other two domains suggested by Zaim (2016), affective and psychomotor, are not assessed in this exam. In other words, this test focuses on assessing students' cognitive domain. Irdiyansyah & Rizki (2018) suggest that standardized tests such as the national exam are intended to measure students' cognitive competence. The main purpose of the test is to determine students' achievement in mastering the factual concepts and information they have learned in the English subject during their three years of study.
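To make the proportions reported in the findings easier to follow, the sketch below (a hypothetical illustration, not the authors' instrument) tallies a set of Bloom-coded items into LOTS and HOTS percentages; the item codes simply mirror the reported totals of 27 LOTS and 23 HOTS items, not the actual item-by-item data.

# Hypothetical tally of Bloom-coded items into LOTS/HOTS shares. The item codes
# below are placeholders mirroring the reported totals, not the real exam data.
from collections import Counter

LOTS_LEVELS = {"C1", "C2", "C3"}

def lots_hots_shares(item_levels):
    """Return the percentage of items that fall into LOTS and HOTS."""
    bands = Counter("LOTS" if lvl in LOTS_LEVELS else "HOTS" for lvl in item_levels)
    total = sum(bands.values())
    return {band: round(100 * count / total, 1) for band, count in bands.items()}

# 27 LOTS items (represented as C2 here purely for simplicity) and 23 HOTS items (all C4):
codes = ["C2"] * 27 + ["C4"] * 23
print(lots_hots_shares(codes))  # -> {'LOTS': 54.0, 'HOTS': 46.0}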
The absence of the affective and psychomotor domains is probably due to the nature of the national exam, which does not fit the assessment criteria for these domains. All questions in the national exam only test students' knowledge of and mastery over the concepts of the subject matter. There are no questions designed to assess students' motivation or attitude in such tests, and similarly, there are no questions to assess students' psychomotor skills either.
Regarding the level of thinking skills, HOTS or LOTS, this study indicates that the questions found in the national exam for the English subject have incorporated these thinking skills. In terms of proportion, questions that require students' high order thinking skills are less dominant than those requiring low order thinking skills, 46% to 54%. By incorporating HOTS questions, the exam has been designed to meet the requirements of the latest curriculum, which expects students to utilize their high order thinking skills more (Sitorus et al., 2021). In fact, in the 2013 curriculum, the government makes an attempt to promote students' critical and creative thinking by involving them in HOTS-based teaching and learning experiences (Utami et al., 2019).
Unlike this study, which indicates only a small difference in the proportions of LOTS and HOTS questions, the study by Ahmad (2016) on the 2013 English national exam revealed that LOTS questions were much more dominant, with a comparison of 87.4% to 10.6%. This is probably due to the fact that the aforementioned study was conducted during the implementation of the school-based curriculum, or the transition to the 2013 curriculum. Again, the national exam questions analyzed in this study show a significant improvement over the previous ones. However, there seems to be little difference in terms of the LOTS and HOTS question subcategories found in both tests.
The prevalence of LOTS questions in the English national exam after the implementation of the 2013 curriculum is also evident in Putra & Abdullah (2019). Their study of the English national exams from 2013 to 2018 revealed that 157 items from this period were categorized as LOTS questions and only 53 items as HOTS questions. In other words, LOTS questions appear consistently in the English national exam even though it was implemented during the 2013 curriculum era, and even after the curriculum's revision in 2017. The English national exam questions in 2019, the last implementation of the national exam, still employed more LOTS questions than HOTS ones, even though the difference was not that significant. Ilham et al. (2020) found in their study that, out of 35 questions in the test, 15 (42%) assessed students' HOTS and 20 (57%) assessed students' LOTS. This proportion seems consistent with the findings of the present study, 46% to 54%. Similarly, the HOTS questions found in all of the previous studies, as well as in this one, are mostly at the analyzing level (C4) (Ahmad, 2016; Ilham et al., 2020; Putra & Abdullah, 2019).
In addition to HOTS and LOTS questions, the English national exam has also incorporated the notion of communicative language ability (CLA), consisting of language competence, strategic competence, and psychophysiological mechanisms. These components are tested both directly and indirectly: language competence is tested directly, while psychophysiological mechanisms and strategic competence are tested indirectly.
In other words, besides addressing the cognitive domain, the test has also made an attempt to incorporate another important element of the language testing process as proposed by Bachman & Palmer (1996).

CONCLUSION
The English national exam has met the requirements of good language testing. It has incorporated various levels of the cognitive domain and of thinking skills, both HOTS and LOTS. Even though LOTS questions are more dominant than HOTS ones, the difference in proportion between the two is not that significant. In addition, this test has also attempted to apply the concept of communicative language ability. In other words, in this test, students are not only tested on factual information but are also involved in various thinking skills and tested in terms of communicative language ability.

REFERENCES
Abosalem, Y. (2018). Assessment Techniques and Students' Higher-Order Thinking Skills. ICSIT 2018 - 9th International Conference on Society and Information Technologies, Proceedings, 4(1), 61–66. https://doi.org/10.11648/j.ijsedu.20160401.11
Ahmad, U. L. (2016). Senior High School National Examination and Thinking Skills. Beyond Words, 4(2), 168–189. http://journal.wima.ac.id/index.php/BW/article/view/945
Ahmed, F., Ali, S., & Shah, R. A. (2019). Exploring Variation in Summative Assessment: Language Teachers' Knowledge of Students' Formative Assessment and Its Effect on Their Summative Assessment. Bulletin of Education and Research, 41(2), 109–119. https://files.eric.ed.gov/fulltext/EJ1229441.pdf
Al-Mekhlafi, A. (2019). The Question of Communicative Language Ability in EFL Testing: The Case of Language Testing in Oman Principles and Characteristics of Communicative Testing. Sumerianz Journal of Education, Linguistics, and Literature, 2(8), 51–61. https://www.sumerianz.com/pdf-files/sjell2(8)51-61.pdf
Amirian, S. M. R., Moqaddam, H. H., & M., Q. J. (2017). Critical Analysis of the Models of Language Proficiency with a Focus on Communicative Models. Theory and Practice in Language Studies, 7(5), 400–407. https://doi.org/10.17507/tpls.0705.11
Bachman, L. F. (2000). Modern Language Testing at the Turn of the Century: Assuring that What We Count Counts. Language Testing, 17(1), 1–42. https://doi.org/10.1191/026553200675041464
Bachman, L. F., & Palmer, A. S. (1996). Language Testing in Practice: Designing and Developing Useful Language Tests. Oxford: Oxford University Press.
Bachman, L., & Palmer, A. (2010). Language Assessment in Practice. Oxford: Oxford University Press.
Brown, H. D., & Abeywickrama, P. (2010). Language Testing: Principles and Classroom Practice. New York: Pearson Education.
Cifuentes-Medina, J. E., Poveda-Pineda, D. F., & Rodríguez-Ortiz, D. A. (2019). Education Quality: Reflections on Its Evaluation Through Standardized Testing. Saber, Ciencia y Libertad, 14(2), 247–255. https://doi.org/10.18041/2382-3240/saber.2019v14n2.5894
Firdaos, R., & Ahmad, A. (2019). The Implementation of National Examination as the Direction of National Education Policy. Al-Tadzkiyyah: Jurnal Pendidikan Islam, 9(1), 105–112. http://dx.doi.org/10.24042/atjpi.v9i1.2788
Hoque, M. E. (2017). Three Domains of Learning: Cognitive, Affective and Psychomotor. The Journal of EFL Education and Research, 2(2), 45–52. https://www.researchgate.net/publication/330811334_Three_Domains_of_Learning_Cognitive_Affective_and_Psychomotor
Ilham, N. W., Jabu, B., & Korompot, C. A. (2020). Analysis of Higher-Order Thinking Skills (HOTS) Items in Senior High School English National Examination 2019. ELT Worldwide: Journal of English Language Teaching, 7(2), 156–162. https://doi.org/10.26858/eltww.v7i2.14764
Irdiyansyah, I., & Rizki, T. (2018). Teachers' Perspective on Standardized Test. Journal of Humanities and Social Studies, 2(1), 18–21. https://doi.org/10.33751/jhss.v2i1.816
Kusuma, M. D., Rosidin, U., Abdurrahman, A., & Suyatna, A. (2017). The Development of Higher Order Thinking Skill (HOTS) Instrument Assessment in Physics Study. IOSR Journal of Research & Method in Education (IOSR-JRME), 7(1), 26–32. https://doi.org/10.9790/7388-0701052632
Menteri Pendidikan dan Kebudayaan. (2021). Surat Edaran Menteri Pendidikan dan Kebudayaan Nomor 1 Tahun 2021 tentang Peniadaan Ujian Nasional dan Ujian Kesetaraan Serta Pelaksanaan Ujian Sekolah dalam Masa Darurat Penyebaran Corona Virus Desease (COVID-19). https://disdik.beraukab.go.id/ova_doc/se-mendikbud-no-1-tahun-2021-peniadaan-ujian-nasional-2021/
Morrow, C. K. (2018). Communicative Language Testing. The TESOL Encyclopedia of English Language Teaching, 1–7. https://doi.org/10.1002/9781118784235.eelt0383
Ozdemir-Yilmazer, M., & Ozkan, Y. (2017). Classroom Assessment and Practice of Language Instructors. Journal of Language and Linguistic Studies, 13(2), 324–345. https://www.jlls.org/index.php/jlls/article/view/678/318
Panggabean, C. I. T., & Asariski, A. (2021). An Analysis of EFL Students' Questions in Research on ELT Class at University of PGRI Ronggolawe Tuban. English Education: Journal of English Teaching and Research, 6(1), 13–21. https://doi.org/10.29407/jetar.v6i1.15919
Pratiwi, N., & Mustadi, A. (2021). HOTS-Based Learning in 2013 Curriculum: Is It Suitable? JPI (Jurnal Pendidikan Indonesia), 10(1), 128–135. https://doi.org/10.23887/jpi-undiksha.v10i1.22781
Putra, T. K., & Abdullah, D. F. (2019). Higher-Order Thinking Skill (HOTS) Questions in English National Examination in Indonesia. Jurnal Bahasa Lingua Scientia, 11(1), 145–160. https://doi.org/10.21274/ls.2019.11.1.145-160
Rahman, S. A., & Manaf, N. F. A. (2017). A Critical Analysis of Bloom's Taxonomy in Teaching Creative and Critical Thinking Skills in Malaysia Through English Literature. English Language Teaching, 10(9), 245–256. https://doi.org/10.5539/elt.v10n9p245
Ratnasari, W. (2018). Students' Perspective toward National Examination in Indonesia. AL-MUQAYYAD: Jurnal Ekonomi Syariah, 1(2), 112–124. https://doi.org/10.46963/jam.v1i2.9
Ridhwan, M. (2017). Understanding Formative and Summative Assessment for EFL Teachers: Theoretical Reflections on Assessment for Learning. J-SHMIC: Journal of English for Academic, 4(1), 40–50. https://doi.org/10.25299/jshmic.2017.vol4(1).505
Rosidin, U., Herpratiwi, Suana, W., & Firdaos, R. (2019). Evaluation of National Examination (UN) and National-Based School Examination (USBN) in Indonesia. European Journal of Educational Research, 8(3), 827–837. https://doi.org/10.12973/eu-jer.8.3.827
Sari, I. D. P., & Rahmah, T. H. (2019). Virtual Discussion for EFL Students Establishing Three Domains: Cognitive, Affective, and Psychomotor. International Journal for Educational and Vocational Studies, 1(3), 249–253. https://doi.org/10.29103/ijevs.v1i3.1586
Sitorus, M. M., Silalahi, L. H., Rajagukguk, H., Panggabean, N., & Nasution, J. (2021). The Effect of Higher-Order Thinking Skill (HOTS) in Reading Comprehension. IDEAS: Journal of Language Teaching and Learning, Linguistics and Literature, 9(1), 109–123. http://ejournal.iainpalopo.ac.id/index.php/ideas/article/view/1895
Sönmez, V. (2017). Association of Cognitive, Affective, Psychomotor and Intuitive Domains in Education, Sönmez Model. Universal Journal of Educational Research, 5(3), 347–356. https://doi.org/10.13189/ujer.2017.050307
Utami, F. D., Nurkamto, J., Marmanto, S., & Taopan, L. L. (2019). The Implementation of Higher-Order Thinking Skills in EFL Classroom: Teachers' Perceptions. Proceeding of the Second International Conference on Future of Education, 2(1), 64–72. https://doi.org/10.17501/26307413.2019.2107
Zaim, M. (2016). Evaluasi Pembelajaran Bahasa Inggris. Jakarta: Kencana.