INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL
ISSN 1841-9836, 11(3):441-449, June 2016.

A Fuzzy Logic Software Tool and a New Scale for the Assessment of Creativity

I. Susnea, G. Vasiliu

Ioan Susnea*, Grigore Vasiliu
Dunarea de Jos University of Galati
Romania, 800008 Galati, Domneasca, 47
ioan.susnea@ugal.ro; grigore.vasiliu@ugal.ro
*Corresponding author: ioan.susnea@ugal.ro

Abstract: It is difficult to measure something we cannot clearly define. No wonder that, for the over 100 definitions of creativity proposed in the literature, there are almost as many scales and assessment tools. Most of these instruments have been designed for research purposes and are difficult to apply and score, especially in the educational environment; many of them are also expensive. The research described in this paper aims to develop a free, fast, and easy to use software tool for the assessment of creativity in the educational context. To this purpose, we have designed a new scale with 20 items, based on a novel approach focused on detecting the factors known to block creativity, such as stereotypical thinking and social conformity. The user input is collected through a web-based interface, and the interpretation of the results is automated by means of a fuzzy logic algorithm. The proposed solution is interesting because it can be easily integrated in almost any e-learning platform, or used as a stand-alone tool for tracing the evolution of the students involved in courses for the development of creative thinking skills, and possibly for other applications.
Keywords: assessment of creativity, e-learning, fuzzy logic

1 Introduction

The world of the 21st century is very different from what it used to be just a couple of decades ago. New professions emerge overnight (think of Android developer, market research data mining specialist, or cloud services engineer), while others quickly fade out (postal services, newspaper delivery, travel agents, word processors, and many others). To help students succeed in this world, the educational system should build new skills and should be able to assess them [5].

One of the fundamental skills required in our rapidly changing society is creative thinking [27], [11], [31]. This is the reason why the modern School has been intensely criticized [23], [22] for being unable to foster the creativity of the students, or, even worse, for killing their innate creativity.

A review of the vast literature dedicated to creativity suggests that the real reason why we don't have serious initiatives to foster creativity in School is that we still don't fully understand this construct [20]. There are currently over 100 definitions of creativity [33], [1], many explanatory theories [29], [12], [26], and an ocean of literature about creativity, but very few initiatives explicitly aimed at developing educational content for creativity education [28].

The researchers' lack of consensus concerning the many facets of creativity is also manifest in the field of formal education: though most teachers declare that they value and encourage the creativity of their students, most of them can barely recognize it and are totally unprepared to stimulate it [7]. An additional obstacle is the lack of easy to use assessment tools to trace the evolution of the students involved in creativity courses, and to demonstrate the efficiency of the specific educational content.
The existing assessment tools are hard, if not impossible, to apply in the educational environment, either because they are too complex and difficult to score, like the famous TTCT [32], or simply because they are not free (e.g. rCAB - the Runco Creativity Assessment Battery, www.creativitytestingservices.com). Considering that the first lesson in any creativity course is that "there are no right and wrong answers", it follows that Moodle-style evaluation tests, wherein the students must select one or more "correct" answers from a predefined list, are useless for this purpose. For these reasons, the main objective of the research described here is to develop a new creativity assessment tool that is free, easy to use and score, and compatible with most existing e-learning platforms.

Unlike the vast majority of the existing psychometric approaches, which treat the tendency towards social conformity as a bias, in our study we assumed that social conformity is a clue indicating a type of "stereotypical thinking" that blocks creative thinking itself, not the process of measuring it. In other words, in our approach, social conformity is treated as a signal, not as noise. The automatic scoring is performed by means of a fuzzy logic algorithm starting from two subscales of 10 items each, focused on different "dimensions" of creativity: one subscale assesses ideational behavior, while the other measures stereotypical thinking and social conformity.

Beyond the present introduction, this work is structured as follows: Section 2 is a brief review of the related work, aimed to clearly define the context of this study. Section 3 contains the description of the proposed solution, and Section 4 is reserved for discussion and conclusions.

2 Related Work

Given the polymorphic nature of creativity, it is no wonder that the assessment instruments are as diverse as the definitions of creativity. For this reason, it is convenient to present the assessment tools from the perspective of the 4 P's (Person, Product, Process, and Press/Place) commonly used to frame the definitions of creativity. Other, more comprehensive reviews of the state of the art in the field of creativity assessment are available in [24], [13], [4].

There are many tests that focus on traits or behaviors specific to creative persons, which is understandable considering that most researchers of creativity are psychologists. Among this type of tests, it is worth mentioning the Kirton Adaptor-Innovator Inventory [16], SCAB - the Scale for Creative Attributes and Behaviors [14], and RIBS - the Runco Ideational Behavior Scale [25]. In another approach, the tests focus on creative behaviors from the past. Examples of instruments based on this idea are SPCA - the Statement of Past Creative Activities [6] and the Creative Achievement Questionnaire [8].

The tests that evaluate creativity by analyzing the creative products are, in most cases, based on measuring divergent thinking by means of open-ended prompts. These are by far the most frequently used, which prompted Kaufman [13] to ironically note: "One of the great ironies of the study of creativity is that so much energy and effort have been focused on a single class of assessments: measures of divergent thinking. In other words, there's not much divergence in the history of creativity assessment."
It is not unlikely that this preference for divergent thinking tests is connected with the popularity of the TTCT (the Torrance Test of Creative Thinking, [32]). However, beyond their incontestable success, the divergent thinking measures have their critics [21], [15], who note that applying and scoring these tests is cumbersome and that their predictive value is questionable. In the same class of product-oriented tests it is worth mentioning CAT - the Consensual Assessment Technique [2], which is based on a methodology of evaluating creative products by independent experts.

The creativity assessment tools based on the analysis of the processes leading to creative outcomes are far less common. One example is CPAC - Cognitive Processes Associated with Creativity [19]. The assessment of the influence of the environment on creativity is a difficult and less studied problem. One notable example in this direction is the KEYS test [3].

There are also complex tests containing specific subscales for multiple P's of creativity. For example, CSQ-R - the Creativity Styles Questionnaire - Revised [17] contains 78 items organized in the following subscales: Belief in Unconscious Processes (person), Superstition (person), Final Product Orientation (product), Use of Techniques (process), Use of Other People (process), Use of Senses (process), and Environmental Control (press).

Regarding the way the user's responses are collected, the most popular solution is the Likert scale, but there are also tests that use a simplified dialogue based on binary Yes/No or True/False responses. From the perspective of the software implementation, the problem of creating the GUI (Graphical User Interface) to collect the user's responses is quite simple. However, things become considerably more complex when it comes to automating the interpretation of multiple subscales addressing distinct "dimensions" of creativity. The next section details how we solved the problem of computing a global creativity quotient (CQ).

We will conclude this brief presentation of the state of the art in the field of creativity assessment by citing again the opinion of Kaufman: "Creativity assessment is a work in progress - we know far less about creativity and its measurement than we would like to know - but that is not to say that we know nothing at all." [13]

3 Description of the Proposed Solution

Probably the most common fallacy about creativity is to conflate it with divergent thinking. In fact, divergent thinking is just one of the many facets of the construct called creativity (see figure 1). When dealing with such complex concepts, evaluations based on a single dimension of creativity - be it divergent thinking or any other - inevitably have limited reliability, while, at the same time, addressing multiple dimensions leads to large and cumbersome scales (e.g. the CSQ-R, described in [17], with 78 items).

One possible approach to facilitate the understanding of complex intellectual constructs is to consider their opposite, or their associated "negative space" (see figure 2 for a graphical metaphor that illustrates this idea). So, what is the opposite of creative thinking? One possible answer is "thinking inside a box": a style of thinking heavily biased by stereotypes, prejudices, illicit generalizations, superficiality and conformism.
Starting from this idea, we have developed a scale that attempts to indirectly assess individual creativity by considering the factors that indicate stereotypical thinking. The proposed scale, called IACEST (Indirect Assessment of Creativity through the Estimation of Stereotypical Thinking), contains two subscales, as shown in Tables 1 and 2. See [19] and [10] for details on how the items of similar scales are formulated.

Note: In order to prevent attempts to learn the "right" answers by heart, a third set of 5 additional filler statements has been included in the online implementation of the scale. (The answers to these items are simply ignored in the evaluation.) For the same reason, the items are presented in random order each time the test starts.

Figure 1: The multiple facets of creativity (the figure contrasts paired dimensions: divergent vs. convergent thinking, subjectivity vs. objectivity, holistic vs. detail-oriented, visual vs. verbal, associative vs. logical, right brain vs. left brain, creativity vs. lack of creativity)

Figure 2: The Apple logo redesigned by Jonathan Mak, a metaphor based on the negative space (A / non-A)

Table 1: Subscale 1. Creative personality and thinking style items.
 1. An image is worth a thousand words.
 2. People say I am a bit lazy and scatterbrained.
 3. I have a great sense of humor, and I always see the funny side of life.
 4. Sometimes I get obsessed with a problem, and I keep trying until I find a solution.
 5. A bit of adrenaline is always welcome. Life is boring without it.
 6. I am very curious.
 7. People think that I am good at finding solutions to common problems.
 8. I enjoy trying to find new solutions to problems.
 9. I have lots of ideas in every domain.
10. One plus one does not always equal two.

Table 2: Subscale 2. Items for detecting stereotypical thinking and other blocking factors for creativity.
 1. I always play by the rules.
 2. My parents were very strict with me.
 3. If anything can go wrong, it will.
 4. I am very disciplined and diligent.
 5. Sometimes I use oracles when I need to make difficult decisions.
 6. I know exactly what I will do next summer.
 7. I always trust reputable scientists.
 8. I like to solve the problems one by one.
 9. I like to quote the opinions of wiser people.
10. I feel very embarrassed if I fail.

The graphical user interface implementing the five-point Likert scale for collecting the user's responses is presented in figure 3.

Figure 3: A snapshot of the GUI of the application

Each answer is scored with a numeric value between 0 (totally disagree) and 4 (definitely agree). For each subscale we compute a total score:

    ss1 = Σ_{i=1}^{10} A_i ;    ss2 = Σ_{i=11}^{20} A_i        (1)

Obviously, ss1, ss2 ∈ [0, 40]. While a high score on the first subscale indicates high creativity, the second subscale is aimed at detecting social conformity tendencies and stereotypical thinking. A high score on the second subscale is likely to indicate the existence of important blocking factors for the subject's creativity. Since the two subscales address distinct factors of creativity, it is not possible to compute the final creativity quotient (CQ) by simply adding or subtracting the scores of the subscales.
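As a concrete illustration of how the raw scores in equation (1) might be computed, the short Python sketch below sums the 0-4 Likert answers per subscale and discards the filler items. This is only our illustration (the actual tool is implemented in PHP, and the item keys and variable names here are hypothetical). The non-trivial step is the aggregation of ss1 and ss2 into a single quotient, which is addressed next.

```python
# Illustrative sketch of equation (1): summing the Likert answers per subscale.
# Not the authors' PHP code; item numbering and variable names are assumptions.

def subscale_totals(answers):
    """answers maps the item number (1..25, shuffled in the GUI) to a score in 0..4."""
    ss1 = sum(answers[i] for i in range(1, 11))   # items 1-10: creative personality / ideational items
    ss2 = sum(answers[i] for i in range(11, 21))  # items 11-20: stereotypical thinking items
    # items 21-25 are the filler statements and are simply ignored
    return ss1, ss2                               # each total lies in [0, 40]
```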
Assuming that the domain of variation for CQ is the interval [0, 100], one possible way to compute this quotient is:

    CQ = 50 * (1 + tanh(k * (ss1 - ss2)))        (2)

where k is a scaling factor, empirically set to the value k = 0.07.

Though this heuristic method of computing CQ spreads the user responses reasonably well over the interval [0, 100] when using exactly two subscales that reflect opposite influences, we preferred to use a fuzzy inference algorithm to compute CQ. This solution has proven effective in many other difficult problems (see for example [9]), provides superior flexibility, and the resulting code is largely reusable in other applications. For this experimental version, and considering the limitations of the PHP language required by the dedicated web-based application, we chose the simplest implementation, with three fuzzy domains for ss1 and ss2 and linear membership functions, as shown in figure 4.

Figure 4: Fuzzy domains and membership functions for ss1, ss2 (three overlapping linear domains, LOW, MEDIUM and HIGH, with membership values µL, µM, µH over the range 0-40)

With these assumptions, the knowledge base used to interpret the scores ss1 and ss2 and compute CQ is described by the set of rules presented in Table 3.

Table 3: The fuzzy rule base
ss1      ss2      CQ
LOW      LOW      LOW
LOW      MEDIUM   LOW
LOW      HIGH     LOW
MEDIUM   LOW      MEDIUM
MEDIUM   MEDIUM   MEDIUM
MEDIUM   HIGH     LOW
HIGH     LOW      HIGH
HIGH     MEDIUM   HIGH
HIGH     HIGH     MEDIUM

Each line of Table 3 should be read as an IF/THEN statement of the following type:

    IF (ss1 is LOW) AND (ss2 is LOW) THEN CQ is LOW

The truth values µL, µM, µH of the statements (ss1 is LOW), (ss1 is MEDIUM), (ss1 is HIGH), (ss2 is LOW), (ss2 is MEDIUM), (ss2 is HIGH), for the particular values of ss1 and ss2 derived from the user's responses, are determined using the equations of the membership functions. The truth value of the entire antecedent of rule i is:

    Z_i = min(µ1, µ2)        (3)

Assuming that the output domain is CQ ∈ [0, 100], we can choose constant values for the output fuzzy domains ("singletons", S_i), e.g. S_LOW = 10, S_MEDIUM = 50, S_HIGH = 100. With these notations, the final "crisp" value of CQ is the center of gravity of the entire knowledge base:

    CQ = (Σ_{i=1}^{9} Z_i * S_i) / (Σ_{i=1}^{9} Z_i)        (4)

For details on the theory behind the above implementation, see [30].
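To make the inference chain concrete, the sketch below reproduces the rule base of Table 3 and equations (3)-(4) in Python. It is only an illustrative re-implementation, not the authors' PHP code, and the membership breakpoints (10, 20, 30) are assumptions inferred from figure 4, which only shows three overlapping linear domains on [0, 40].

```python
# Illustrative Python sketch of the fuzzy scoring described above (not the authors' PHP code).
# The membership breakpoints (10 / 20 / 30) are assumptions read off figure 4.

def mu_low(x):
    # fully LOW up to 10, decreasing linearly to 0 at 20 (assumed breakpoints)
    if x <= 10: return 1.0
    if x >= 20: return 0.0
    return (20 - x) / 10

def mu_medium(x):
    # triangular membership centered on 20, zero outside [10, 30] (assumed)
    if x <= 10 or x >= 30: return 0.0
    return (x - 10) / 10 if x <= 20 else (30 - x) / 10

def mu_high(x):
    # rising from 0 at 20 to fully HIGH at 30 (assumed breakpoints)
    if x <= 20: return 0.0
    if x >= 30: return 1.0
    return (x - 20) / 10

MEMBERSHIP = {"LOW": mu_low, "MEDIUM": mu_medium, "HIGH": mu_high}

# Rule base of Table 3: (ss1 label, ss2 label) -> CQ label
RULES = {
    ("LOW", "LOW"): "LOW",       ("LOW", "MEDIUM"): "LOW",       ("LOW", "HIGH"): "LOW",
    ("MEDIUM", "LOW"): "MEDIUM", ("MEDIUM", "MEDIUM"): "MEDIUM", ("MEDIUM", "HIGH"): "LOW",
    ("HIGH", "LOW"): "HIGH",     ("HIGH", "MEDIUM"): "HIGH",     ("HIGH", "HIGH"): "MEDIUM",
}

# Output singletons on the CQ domain [0, 100], as given in the text
SINGLETONS = {"LOW": 10, "MEDIUM": 50, "HIGH": 100}

def creativity_quotient(ss1, ss2):
    """Compute CQ from the two subscale totals (each in [0, 40])."""
    num = den = 0.0
    for (label1, label2), out_label in RULES.items():
        z = min(MEMBERSHIP[label1](ss1), MEMBERSHIP[label2](ss2))  # rule strength, equation (3)
        num += z * SINGLETONS[out_label]
        den += z
    # center-of-gravity defuzzification over the singletons, equation (4)
    return num / den if den > 0 else 50.0

if __name__ == "__main__":
    print(creativity_quotient(32, 12))  # high ideational score, low conformity -> high CQ
    print(creativity_quotient(15, 28))  # the opposite profile -> low CQ
```

With these assumed breakpoints, a respondent scoring high on subscale 1 and low on subscale 2 obtains a CQ close to 100, while the opposite profile falls near the bottom of the scale. The singleton-based (weighted average) defuzzification keeps the computation light enough for a plain PHP back end, which is consistent with the implementation constraints mentioned above.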
Figure 5 is a snapshot of the screen presenting the results of the test and the computed value of CQ. The actual implementation contains additional software modules for user authentication and report generation. The results of the tests are stored in a database. A beta version of the web application can be tested at http://dev.ugal.ro/creativity/

Figure 5: A snapshot of the final screen presenting the result of the test

4 Discussion and conclusions

The validation of the proposed scale is in progress. A simple pretest for internal consistency has been conducted with N = 30 undergraduate students of the Faculty of Automation, Computers, Electrical and Electronics Engineering, resulting in statistically acceptable values of Cronbach's α: 0.73 for subscale 1 and 0.78 for subscale 2.

Though this research has been conducted in the context of an educational project partly funded by EACEA, the Education, Audiovisual and Culture Executive Agency of the European Commission (namely TECRINO - Teaching creativity in engineering, 538710-LLP-1-2013-1-CY-LEONARDO-LMP), for unknown reasons the EACEA representatives stubbornly denied all requests for permission to disseminate the results of this work through scientific publications, and to allocate funds, within the same budget, to deepen this study. Due to this lack of funding, our work in this direction has been much slower than we hoped. Therefore, for the moment, we must align with the expectations formulated by Miller in [18]: "It should also be noted that validation of any instrument is an ongoing procedure.... Once a measure has been adequately developed, it is the responsibility of all researchers in the field to further the generation of evidence for its validity."

Obviously, further validation studies using a larger sample are definitely required. The preliminary results are promising: the proposed tool is free, simple, easy to use, easy to integrate in almost any e-learning platform, and serves the purpose of assessing the evolution of the students enrolled in creativity training courses. The idea of using a fuzzy algorithm for the automated scoring of psychometric scales may also have other interesting applications.

Acknowledgment

The authors gratefully acknowledge the contribution of Dr. Mihai Vlase, who wrote the code for the software implementation of the instrument described in this paper.

Bibliography

[1] Aleinikov, A., Kackmeister, S., and Koenig, R. (Eds.) (2000); Creating creativity: 101 definitions, Midland, MI: Alden B. Dow Creativity Center, Northwoods University.

[2] Amabile, T.M. (1982); Social psychology of creativity: A consensual assessment technique, Journal of Personality and Social Psychology, 43: 997-1013.

[3] Amabile, T. M., Conti, R., Coon, H., Lazenby, J., and Herron, M. (1996); Assessing the work environment for creativity, Academy of Management Journal, 39(5): 1154-1184.

[4] Batey, M. (2012); The measurement of creativity: From definitional consensus to the introduction of a new heuristic framework, Creativity Research Journal, 24(1): 55-65.

[5] Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., and Rumble, M. (2012); Defining twenty-first century skills, in Assessment and Teaching of 21st Century Skills, Springer Netherlands, 17-66.

[6] Bull, K.S. & Davis, G.A. (1980); Evaluating creative potential using the statement of past creative activities, Journal of Creative Behavior, 14: 249-257.

[7] Cachia, R., Ferrari, A., Kearney, C., Punie, Y., Van den Berghe, W., and Wastiau, P. (2009); Creativity in schools in Europe: A survey of teachers in Europe, European Commission - Joint Research Center - Institute for Prospective Technological Studies, Seville.

[8] Carson, S., Peterson, J.B., & Higgins, D.M. (2005); Reliability, validity, and factor structure of the Creative Achievement Questionnaire, Creativity Research Journal, 17(1): 37-50.

[9] Dzitac, I., Vesselenyi, T., Tarca, R. C. (2011); Identification of ERD using fuzzy inference systems for brain-computer interface, International Journal of Computers Communications & Control, 6(3): 403-417.

[10] Fields, Z., & Bisschoff, C. A. (2014); Developing and assessing a tool to measure the creativity of university students, J Soc Sci, 38(1): 23-31.

[11] Florida, R. (2006); The Flight of the Creative Class: The New Global Competition for Talent, Liberal Education, 92(3): 22-29.
[12] Kasof, J. (1995); Explaining creativity: The attributional perspective, Creativity Research Journal, 8: 311-366.

[13] Kaufman, J. C., Plucker, J. A., & Baer, J. (2008); Essentials of Creativity Assessment, John Wiley & Sons, 53.

[14] Kelly, K.E. (2004); A brief measure of creativity among college students, College Student Journal, 38: 594-596.

[15] Kim, K. H. (2006); Can we trust creativity tests? A review of the Torrance Tests of Creative Thinking (TTCT), Creativity Research Journal, 18(1): 3-14.

[16] Kirton, M. (1976); Adaptors and innovators: A description and measure, Journal of Applied Psychology, 61: 622-629.

[17] Kumar, V. K., Kemmler, D., & Holman, E. R. (1997); The Creativity Styles Questionnaire - Revised, Creativity Research Journal, 10(1): 51-58.

[18] Miller, A. L. (2009); Cognitive processes associated with creativity: Scale development and validation, Doctoral dissertation, Ball State University.

[19] Miller, A. L. (2014); A self-report measure of cognitive processes associated with creativity, Creativity Research Journal, 26(2): 203-218.

[20] Parkhurst, H. B. (1999); Confusion, lack of consensus, and the definition of creativity as a construct, The Journal of Creative Behavior, 33(1): 1-21.

[21] Plucker, J.A., Runco, M.A. (1998); The death of creativity measurement has been greatly exaggerated: Current issues, recent advances, and future directions in creativity assessment, Roeper Review, 21: 36-40.

[22] Resnick, M. (2007); Sowing the seeds for a more creative society, International Society for Technology in Education, 35(4): 18-22.

[23] Robinson, K. (2011); Out of Our Minds: Learning to Be Creative, Capstone.

[24] Runco, M.A. (1999); Appendix II: Tests of creativity. In M.A. Runco and S.R. Pritzker (Eds.), Encyclopedia of Creativity, San Diego, CA: Academic Press, 755-760.

[25] Runco, M. A., Plucker, J. A., & Lim, W. (2001); Development and psychometric integrity of a measure of ideational behavior, Creativity Research Journal, 13(3-4): 393-400.

[26] Sawyer, R. K. (2011); Explaining Creativity: The Science of Human Innovation, Oxford University Press.

[27] Susnea, I., Pecheanu, E., Tudorie, C. and Cocu, A. (2014); The education for creativity - the only student's tool for coping with the uncertainties of the future, MAC ETEL 2014 - International Conference on Education, Teaching and e-Learning, Prague, Oct. 2014.

[28] Susnea, I., Pecheanu, E., Tudorie, C. (2014); Initiatives towards an education for creativity, The 6th International Conference Edu World 2014 - Education Facing Contemporary World Issues, 7th-9th November 2014. Also published in Procedia - Social and Behavioral Sciences, 180: 1520-1526, 2015.

[29] Sternberg, R. J. (1988); A three-facet model of creativity. In R. J. Sternberg (Ed.), The Nature of Creativity, Cambridge: Cambridge University Press, 125-147.

[30] Tanaka, K. (1997); An Introduction to Fuzzy Logic for Practical Applications, Springer Verlag.

[31] Thorsteinsson, G., Page, T., & Niculescu, A. (2010); Adoption of ICT in supporting ideation skills in conventional classroom settings, Journal of Studies in Informatics and Control, 19(3): 309-318.

[32] Torrance, E. P. (1974); Torrance Tests of Creative Thinking: Norms and technical manual, Bensenville, IL: Scholastic Testing Press.
[33] Treffinger, D. J. (1996); Creativity, creative thinking, and critical thinking: In search of definitions, Sarasota, FL: Center for Creative Learning.