Open access peer-reviewed chapter

Perspective Chapter: New Approaches to the Assessment of Domain-Specific Creativity

Written By

Zehra Topal Altindiş

Submitted: 25 November 2021 Reviewed: 21 December 2021 Published: 30 January 2022

DOI: 10.5772/intechopen.102311

From the Edited Volume

Creativity

Edited by Sílvio Manuel Brito and João P. C. Fernandes Thomaz


Abstract

As science and technology continue to advance, the true wealth of our civilization will increasingly manifest in human creative output. Accordingly, technological developments offer great opportunities for creativity research and for the assessment of creativity. The literature contains studies on the creation of computer-based creative products on the one hand, while studies on whether creativity can be evaluated automatically have begun to attract attention on the other. In addition, domain experts have turned to new research to understand whether creativity assessment can be automated and carried out more quickly and with higher quality, and to explore whether such a computational method can be standardized. Research conducted over the last 10 years has shown that computational approaches based on semantic distance have made significant contributions to the field, both in theory and in practice. Even so, very few studies measure creativity on the basis of semantic distance. This chapter presents a brief overview, in light of the evidence obtained from the literature, of whether a computer-based measurement tool that performs automatic calculations can be used in the evaluation of linguistic creativity.

Keywords

  • creativity
  • assessment of domain-specific creativity
  • tests of creative thinking
  • semantic distance
  • latent semantic analysis
This chapter is adapted from part of the author’s PhD thesis.

1. Introduction

Without knowing the force of words, it is impossible to know men.

Confucius

Although there has been a tremendous amount of research on creativity over the decades, many treasures remain to be unearthed in the mine of creativity. Over the years, both the processes of development and change that humanity has been through and advances in technology have produced a wide variety of resources, ranging from the definition of creativity to its evaluation. Despite these advances, each era brings its own needs, and new necessities keep arising in the field of creativity. One of them is the need to develop a web-based, automatically scored instrument that can fairly evaluate the potential of the twenty-first-century individual known as the digital native. This chapter attempts to answer the following questions: What do quantitative measures of semantic distance applied in research tell us about creativity or domain-specific creativity? Why has LSA become popular in recent research? Do LSA scores successfully predict average human creativity scores? In addition, the definition of creativity, the course that creativity research has followed, studies on the evaluation of creativity, the automation of domain-specific creativity assessment, and the use of LSA and related knowledge are discussed.

1.1 The potential that humanity possesses but cannot define: creativity

“The beginning of wisdom is to call things by their right names.”

Socrates

If the invention of writing was a turning point in the history of humanity, the address Guilford gave to the American Psychological Association in 1950 was the same for modern creativity research. This address had a profound effect on domain experts. Before Guilford's address, research related to creativity made up only 0.2% of the entries in the Psychological Abstracts index [1]. After it, both the number of studies and the findings produced in the field of creativity increased.

Today, many definitions, theories, methods, and scales are available in the field of creativity. To begin with definitions, Treffinger [2] reviewed over 100 definitions in the field, and other researchers have compiled 101 contemporary definitions proposed by children and adults [3]. Despite so many definitions, creativity research has not converged on a clear definition, which leads to inconsistent results [4]. This problem in the field of creativity can be likened to the parable of the blind men and the elephant. Recent studies on topics as varied as Covid-19 [5], the measurement of oxytocin levels [6], migration [7], and the creative process [8] have used the blind men and the elephant as an explanatory metaphor. A well-known depiction of the parable is an artwork by the famous calligraphy artist Hanabusa Itchô (see [8] for more detail). In it, a group of blind men tries to understand and define an elephant by touching its body, but each man has only limited knowledge, which leads them to wrong or incomplete guesses. The metaphor reflects the situation of the creativity field: although individual studies focus on different aspects of creativity, their sum can still be valuable for understanding it.

Among studies focusing on the theory and methods of creativity, the best-known classification in the field is the 4P framework of creativity (process, person, product, press) [9]. Other noteworthy theories and models include the associative theory with its stimulus-response (S-R) perspective [10], the Structure of Intellect model (SOI) [11], incubation and intuition [12, 13], the componential model [14], the Geneplore model [15], the investment theory of creativity [16], the systems model of creativity [17], the amusement park theory [18], H-creativity [19], the Four-C Model of Creativity with its multiple levels (Big-C, Pro-c, little-c, and mini-c) [20], the Five A's of creativity (person/actor, process/action, product/artifact, press/audience, and affordances) [21] and, more recently, the minimal theory of creative ability [22]. All of these studies can be seen as concrete proof of the hard work that domain experts have done to understand, evaluate, and build the theoretical framework of creativity.

Creativity has been studied for over a century, so there are many competing and complementary creativity tests in the field, and the situation regarding instruments for evaluating creativity is not bleak. For instance, the National Research Center on the Gifted and Talented lists more than 100 techniques [3], the Center for Creative Learning (CCL) lists 72 tests [23], and Cropley [24] stated that there are at least 255 creativity tests in the literature [25]. According to Weiss, Wilhelm, and Kyllonen [26], 228 distinct creativity measures have appeared in the literature since 1900. The many competing and complementary creativity tests in the field make it difficult for potential users to decide on their appropriateness. The most widely used creativity tests in the literature are grouped below in terms of the 4Ps of creativity, in a scheme created with Coggle.it (Figure 1):

Because creativity is multidimensional and can be represented from different viewpoints, the way creativity is defined affects how it is measured and evaluated [25]. For that reason, the ways in which creativity is evaluated have been outlined in the section above. However, the scales known to measure creativity are usually limited both conceptually and psychometrically. Moreover, selecting and training raters adds expense and reduces the practicality of these tests [27, 28]. In Baer's harsh assessment, whether these tests even have a future in the twenty-first century is clearly in doubt [29].

1.2 The course of creativity research over time

Compared with defining creativity, measuring it with criterion-based objective rating scales can be said to be even more difficult [30]. There are hundreds of tests for evaluating creativity in the literature [31]. The methods and classifications used to define the assessment types mentioned earlier were outlined above (see Figure 1). The course that creativity research has followed is described below.

Figure 1.

Tests of creativity within the 4P framework.

Sawyer [32] divided creativity research into three categories. The first consisted of studies focusing on the creative person and his or her characteristics; the second of the cognitive psychology experiments of the 1980s and studies focusing on the cognitive aspects of creativity; and the third of studies conducted within the sociocultural and interdisciplinary approach of the 1990s. Recent creativity studies focusing on neuroscience and on computer-aided approaches can be seen as a fourth category. Just as Guilford criticized psychology for its indifference to creativity, a similar criticism applies today to the scarcity of creativity studies in neuroscience and artificial intelligence. The four periods mentioned above are schematized in the visual below (Figure 2).

Figure 2.

The course of creativity research over time.

It would not be a sound approach to treat these categories as separate from one another. Because science has a cumulative and progressive structure, the results of earlier studies shape and contribute to later ones.

1.3 Where is the bow of the ship pointing? Is creativity domain-general or domain-specific?

Owing to its very nature, creativity is a generous field of study for domain experts, and each era can be said to give creativity research its own direction. Over the last 15-20 years, creativity research has increasingly focused on particular fields or disciplines (see [29, 33, 34, 35, 36]). One reason for this may be the ongoing question of whether creativity is domain-general or domain-specific. In other words, the ship's wheel has turned 180°, a sign that its bow has also swung from domain-general to domain-specific. Accordingly, scientific creativity tests, artistic creativity tests, and tests measuring linguistic creative potential can all be placed under the roof of scales that treat creativity as domain-specific. Although tests labeled as verbal tests exist in the field, tests that measure linguistic creative potential are still needed.


2. Language and linguistic creativity

Language is the infinite use of finite means.

Wilhelm von Humboldt

Language is common ground especially for linguists, philosophers, sociologists, anthropologists, psychologists, educators, communication experts, and even computer scientists. Why, then, does language mean so much to so many different disciplines? According to many researchers, language is a faculty unique to human beings [37]. In other words, language is a tool in the service of thinking [38] and a tool that helps people make arrangements [37]. In the author's view, language is both a representation of a person's creative potential and a tool for expressing that potential. Similarly, Holtgraves [37] states that language can be seen as a tool used to achieve particular goals.

Just as language comprises more than one skill (listening/observing, speaking, reading, writing), creativity also has multiple facets. Because both are multidimensional, language and creativity can be likened to each other, and for the same reason evaluating either concept is very difficult.

There are a great number of theories and models related to creativity in the literature, and a similar richness can be seen for scientific creativity (see [39, 40, 41, 42, 43]). Linguistic creativity, however, does not enjoy the same richness [44] and unfortunately has a limited research base in the literature. Furthermore, thinking of linguistic creativity only in terms of language-teaching studies is a narrow perspective, and there are limitations from a conceptual standpoint as well.

Definitions of linguistic creativity vary just as definitions of creativity do. For instance, some researchers state that language is an ongoing creation, while others state that language can expand to newly formed borders by exploring new resonances [3]. These definitions show that linguistic creativity had a turbulent history in twentieth-century linguistics [45]. The term linguistic creativity not only denotes a special gift or an unexpected path drawn with words; it also carries a wide range of meanings, from broad understandings of linguistic creativity to very specific uses of the term. Variability is closely related to the concept of linguistic creativity; in other words, creating different language structures requires the flexible association of language units [46].

Where there is language, there are also ideas; and where there are ideas, there are associations, because association-based relationships occur within the framework of the language a person possesses. This recalls Wittgenstein's observation that the limits of our language are the limits of our world. Language, moreover, has both a cognitive and a social side. For instance, we rely on language in academic life, in daily chores, in communicating with people, and in analyzing the events we encounter, filtering them personally and cognitively until we reach an assumption or a conclusion.

When real-life problems are mentioned, mathematics- and science-based disciplines usually come to mind as the source of solutions, or at least such a perception exists. Yet, as noted above, language lies at the center of both experience and problem. Language processes therefore underlie the solution of every kind of problem, and associations and analogies also operate within the framework of domain-specific creativity.


3. The controversial face of evaluation: creativity

Questions about what creativity really is, whether it can be measured, and what an ideal evaluation scale should look like have always been discussed by domain experts. In psychology and psychometrics, the criteria regarded as persuasive and convincing are the "validity" criteria, and several kinds of validity have been defined. One is "face validity", which means the content of the test is meaningful to domain experts. The second is "construct validity", which concerns whether the content of the test reflects the nature of creativity. The third is "predictive validity" [47], which concerns whether the test can predict outcomes related to the concept of creativity; success in real life, for instance, requires creativity. Consider Weisberg's [36] example, made while pointing out that the RAT measures creative thinking potential: "If you want to determine the potential of a good marathon runner, you should measure his lung capacity, not his running speed". Why, then, should we not use scales that help integrate creativity into education programmes or that measure creative potential automatically, given that the greatest service an assessment tool can render is to identify the individuals who will have creative potential in the future?

3.1 The problem of the twenty-first century: how do digital immigrants assess digital natives?

“Our task, regarding creativity, is to help children climb their own mountains, as high as possible. No one can do more.”

Loris Malaguzzi

In a rapidly changing digital society, creativity is accepted as one of the basic skills of the twenty-first century necessary for professional and personal success [20]. For that reason, many countries are focusing more and more on creativity in their education systems, and identifying and developing students' creativity at an early age has become a primary concern [48, 49]. Reliable psychometric tests are therefore necessary to select creative individuals and to determine their potential in a given field [27, 28]. Consequently, using assessment scales that measure individuals' creative potential and can score automatically, in keeping with the conditions of the twenty-first century, is not a choice but a necessity.

Education and instruction environments can be defined as the places that prepare the individual for life. If so, what kind of assessment scale(s) should we use to measure these individuals' creativity? Teachers are not always the right people to measure or assess creativity in the best way. Torrance's [50] twelve-year study can be given as an example (the "father of creativity" was himself an English teacher). According to its results, no relationship could be found between teachers' assessments and students' creative characteristics, yet the same students were later seen to display their creativity in adulthood [51]. The reason for this should be questioned: how well do teachers know their students, and how can they identify the students who have creative potential? If teachers are expected to answer these questions, the system should also be arranged to serve that aim. So, what is the current system actually based on? Let us find out by going back in time.

Organizational processes that treat creativity as a concrete aim are very few. However frequently phrases such as thinking outside the box, being willing to take risks, and being original are uttered, it is open to question whether the system and its decision-makers truly want them. For example, the starting point of current school-based education lies in The Principles of Scientific Management, a book written in 1911 by the mechanical engineer Frederick Taylor to increase workplace productivity. The "Taylorism" approach put forward in that book still affects the education system. One reason it had so much impact on education may be that it was supported by Edward Thorndike, the father of American educational psychology. Taylorism claims that individuals should be placed according to their talents, yet it bases the system's yardstick on speed, because measuring speed is easier than determining an individual's talent in a field. Therefore, to assess people's ability, the focus was on how fast they were, and the emphasis consequently fell on standardization rather than on individuality and creativity [52]. How well this approach serves the human profile needed in the twenty-first century is open to discussion.

Individuals who have grown up as digital natives [53], the twenty-first-century individuals, can also be described as the game generation or the app generation [54]. If we want to assess the natives of this age, we should design scales appropriate to its conditions, and we should become accustomed to doing so when assessing creativity. If not, how can we, as digital immigrants, guide the digital natives who have such great potential to contribute to the world?

Looking at assessment worldwide, it is clear that paper-and-pencil tests are being replaced by web-based tests. This change shows itself not only in the format of the tests but also in the assessment process (see [55]). For example, PISA, a globally administered test, was planned for a computer-based environment and was postponed to 2022 because of the global pandemic; even so, the test was revised according to the needs and conditions of the period. The change in PISA's administration and assessment is also visible in its question styles. According to the announcement published on the official website, for the first time in PISA 2022 students are expected to produce a visual artwork, rather than give a written answer or choose a single correct option, in order to measure their creative potential; the assessment includes open-ended tasks for which there is not one correct answer but many [55]. This implies collecting answers from many students around the world and hints at the use of computer-based (objective and automatic) assessment techniques. An example PISA question is shown below (Figure 3).

Figure 3.

A question related to creativity in PISA.

In addition, a shift from subjective assessment approaches toward objective (and automatic) assessment is also observed in the field of creativity [44, 56, 57, 58, 59]. This may stem from the needs of the age. Furthermore, domain experts should be open to progressive approaches in order to accelerate the growth of the evidence base for creativity assessment [60].


4. New approaches to the assessment of domain-specific creativity

Computer-based analyses provide objective methods for assessing the semantic-linguistic quality of narratives at the text level. In this regard, researchers have begun to explore the benefits of automated scoring approaches using computer-based computational tools [33, 61, 62, 63, 64, 65, 66, 67]. One of these approaches is Latent Semantic Analysis (LSA).

As one of these automatic scoring approaches, LSA is a technique for extracting and representing the meaning of words and passages through mathematical and statistical computation. LSA expresses the idea that similarities and differences in the meanings of words are reflected in the similarities and differences of the overall contexts in which those words do or do not appear [68].

LSA can also be viewed as a model of language learning in which word meanings are extracted from statistical analyses of large bodies of text. It determines whether words are related by analyzing which words frequently occur together and which rarely do [57, 69].

In addition, LSA is used to measure the coherence of texts [70]. The aim of LSA is to create a structure that shows the level of similarity between words [69, 71], and with LSA hidden connections in textual data are revealed. Unstructured data must first be converted into a structured format so that LSA can be used; LSA can then be applied to any collection of documents from which syntactic and grammatical structure has been stripped away [72]. Heinen and Johnson [63] also found that LSA-based measures of semantic distance relate to novelty and appropriateness, two measures of creative output [34, 67].
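As a rough illustration of the mechanics described above, the sketch below builds a tiny LSA-style semantic space with scikit-learn: a weighted term-document matrix is decomposed with truncated SVD, and the resulting word vectors can then be compared for similarity. The corpus, the number of dimensions, and the probe words are illustrative assumptions only; real LSA spaces are trained on very large text collections.

```python
# A minimal LSA-style sketch (toy corpus and parameters, for illustration only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the cat sat on the mat next to the dog",
    "a dog and a cat can both be kept as pets",
    "the stock market fell sharply today",
    "investors sold their shares after the market news",
    "pets such as cats and dogs need daily care",
]

# Weighted term-document matrix (documents are rows, words are columns).
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(corpus)

# Truncated SVD projects documents and words into a low-dimensional space.
svd = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = svd.fit_transform(X)   # documents in the reduced space
word_vectors = svd.components_.T     # words in the same space

# Words used in similar contexts ("cat", "dog") typically end up closer to
# each other than to words from unrelated contexts ("market").
vocab = list(vectorizer.get_feature_names_out())
cat, dog, market = vocab.index("cat"), vocab.index("dog"), vocab.index("market")
print(cosine_similarity(word_vectors[[cat]], word_vectors[[dog]]))
print(cosine_similarity(word_vectors[[cat]], word_vectors[[market]]))
```

In practice, researchers typically rely on semantic spaces trained on much larger corpora, for example through platforms such as SemDis [56].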

4.1 Using semantic distance in assessing creativity

Semantic distance plays a role in various models of the creative process, and Latent Semantic Analysis is a popular computational method for representing semantic distance in creativity research [65, 66, 69]. The role of semantic distance in creativity is rooted in the associative theory of creativity [10, 65, 73]: the further a new concept lies from an existing concept in a semantic space, the more novel or creative it is taken to be. This is consistent with the associative theory, which states that creativity involves the ability to connect relatively weak or distant concepts to each other and to combine them into new and useful objects. That is, the greater the distance between the concepts involved, the newer the resulting combinations and the more creative the product [10, 65, 69].
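In computational terms, this associative account is usually operationalized as the distance between two concept vectors in a semantic space, most often one minus their cosine similarity. The sketch below illustrates the calculation with made-up three-dimensional vectors; in real studies the vectors would come from an LSA space (or another distributional model) trained on a large corpus.

```python
import numpy as np

def semantic_distance(u: np.ndarray, v: np.ndarray) -> float:
    """1 - cosine similarity; larger values indicate more remote associations."""
    cosine = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return 1.0 - float(cosine)

# Toy vectors, invented purely for illustration.
cue    = np.array([0.8, 0.1, 0.2])   # e.g. "brick"
common = np.array([0.7, 0.2, 0.1])   # a typical association ("wall")
remote = np.array([0.1, 0.9, 0.4])   # a remote association ("coconut")

print(semantic_distance(cue, common))  # small distance -> conventional idea
print(semantic_distance(cue, remote))  # large distance -> more original idea
```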

Creativity involves the ability to associate relatively weak or distant semantic components and to combine them into new and useful objects [74]. Researchers focusing on creativity have long struggled with how to measure it [56]. Examining studies showing that creativity can be achieved through computational processes, Boden [19] argued that it may be possible to create a program that can produce works of art or symphonies [36]. The literature contains, on the one hand, studies suggesting that computer-based and artificial intelligence-supported creative products can be developed [75, 76, 77] and, on the other, studies on whether creativity can be assessed automatically [57, 58, 78]. However, studies on measuring creativity based on semantic distance are stated to be limited [74]. For example, the creativity literature does not include much research on estimating the originality of an idea in a written work [44]; only a couple of studies exist, and they deal with behavior [79] or with words and expressions as creative uses of language [80]. In addition, other studies in the literature show that writing quality is associated with stronger reading skills [81, 82, 83], broader vocabulary [81, 84], grade level [85], more flexible thinking skills [86], and knowledge of the topic being written about [87].

One of the more recent studies in the literature, conducted by Runco, Turkman, Acar, and Nural [88], investigated the relationship between idea density and creativity. Another recent study [44] incorporated a new measurement technique (the keywords method) into the assessment of linguistic creativity. Using the computer-based keyword method, yet another study [88] demonstrated how creativity reveals itself in written works (products) and to what extent the method reduces the time and effort required to score multiple thinking tasks. Furthermore, the tests administered to the participants were assessed both by domain experts (subjectively) and with the new computer-based (objective and automatic) method, and the results were promising for the objective assessment.

Researchers in the literature [89, 90] found that divergent thinking (DT) tasks can be properly scored using LSA [72, 91]. The semantic word categorization performance of LSA [92] is reported to be satisfactory and comparable to human performance [65]. Additionally, LSA is preferred because it is objective rather than based on human judgment, which allows reliable results to be obtained across users; it is measurable and numerically applicable; it is grounded in and justified by a theoretical background [93]; and it offers a powerful means of quantitative analysis [73]. These findings indicate the usefulness and effectiveness of semantic distance measures for assessing domain-specific creativity.
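To make the scoring idea concrete, the self-contained sketch below scores free-text responses to an Alternate Uses-style divergent thinking prompt by their semantic distance from the prompt word, using an LSA-style pipeline. The training corpus, dimensionality, and responses are toy assumptions; published approaches (e.g., the SemDis platform [56]) train their spaces on far larger corpora.

```python
# Hedged sketch: scoring DT responses by semantic distance from the prompt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.pipeline import make_pipeline
from sklearn.metrics.pairwise import cosine_similarity

training_corpus = [  # toy corpus; a real space needs far more text
    "a brick is used to build a wall or a house",
    "workers lay a brick with mortar when they build",
    "a coconut has a hard shell that is difficult to crack",
    "a paperweight keeps loose paper on the desk",
]

lsa = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2, random_state=0))
lsa.fit(training_corpus)

prompt = ["brick"]
responses = ["build a wall", "crack open a coconut", "use it as a paperweight"]

prompt_vec = lsa.transform(prompt)
response_vecs = lsa.transform(responses)

# Distance = 1 - cosine similarity; under the associative account, larger
# distances are scored as more original responses.
for text, sim in zip(responses, cosine_similarity(prompt_vec, response_vecs)[0]):
    print(f"{text!r}: distance = {1 - sim:.2f}")
```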


5. Method

This chapter is a descriptive study. The method of research that concerns itself with present phenomena in terms of conditions, practices, beliefs, processes, relationships, or trends is termed a "descriptive survey study" [94]. Descriptive research is devoted to gathering information about prevailing conditions or situations for the purpose of description and interpretation. It is not simply the amassing and tabulating of facts; it includes proper analysis, interpretation, comparison, and the identification of trends and relationships. It is concerned with the present and attempts to determine the status of the phenomenon under investigation. Survey research applies the scientific method by critically analyzing and examining source materials, by analyzing and interpreting data, and by arriving at generalizations and predictions [95].

According to Pandey and Pandey [95], the descriptive method is divided into four types. The schematic below shows them (Figure 4).

Figure 4.

Types of Descriptive Method.


6. Discussion and conclusion

In this chapter, the topic has been discussed through the questions posed at the outset. What do quantitative measures of semantic distance applied in research tell us about creativity or domain-specific creativity? The application of such methods can potentially provide a faster and more objective measure of the output of creative thinking [65]. Why has LSA become popular in recent research? Unlike many other methods, LSA employs a preprocessing step in which the overall distribution of a word over its usage contexts, independent of its correlations with other words, is first taken into account; pragmatically, this step improves LSA's results considerably [69]. According to Kenett [65], a growing number of studies are applying quantitative measures of semantic distance in creativity research, and LSA appears to have inspired further researchers to examine it, which may explain why recent research increasingly prefers LSA. Do LSA scores successfully predict average human creativity scores? Yes: according to Forster and Dunbar [90], LSA scores successfully predicted average human creativity scores, and the success of this measurement technique was confirmed against a scale independently judged by humans and shown to be a better approximation of human responses than traditional measures.
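The preprocessing step referred to above is commonly a log-entropy weighting of the raw counts before the singular value decomposition: a local weight reflects how important a word is in a particular context, while a global weight down-weights words that are spread evenly across all contexts. The sketch below shows one common variant; exact formulations differ between LSA implementations, so the details are illustrative rather than canonical.

```python
import numpy as np

counts = np.array([          # rows = words, columns = contexts/documents
    [3.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 1.0, 1.0],    # spread evenly  -> carries little information
    [0.0, 4.0, 0.0, 0.0],    # concentrated   -> carries much information
])

n_contexts = counts.shape[1]
p = counts / counts.sum(axis=1, keepdims=True)      # each word's distribution over contexts
with np.errstate(divide="ignore", invalid="ignore"):
    plogp = np.where(p > 0, p * np.log(p), 0.0)
global_weight = 1.0 + plogp.sum(axis=1) / np.log(n_contexts)   # 0 (uninformative) to 1

weighted = np.log1p(counts) * global_weight[:, np.newaxis]     # matrix passed to the SVD
print(np.round(weighted, 2))
```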

Creativity has various definitions, theories, and models, and it can be assessed in many ways. The many competing and complementary creativity tests in the field make it difficult for potential users to decide on their appropriateness. On the other hand, there is still a surprising gap in the field regarding computer-based (objective and automatic) assessment of creativity. Recently, however, new developments such as the implementation of new technologies [96], digitalization [97], and new scoring or assessment methods [56, 98] have been proposed. Despite the remarkable developments in technology, the literature providing new scoring methods for pioneering creativity tests is still sparse [26].

Indeed, the author's doctoral dissertation, which is still being written, also involves the use of LSA in the automatic assessment of linguistic creativity, and it is hoped that the study will make a modest contribution to meeting the need in the field. Research focusing on LSA continues to be added to the literature day by day, and LSA appears to be an effective method for the automatic and objective assessment of linguistic creativity. In summary, this chapter aims to raise awareness of the need for new computer-based (objective and automatic) scoring methods in the field of creativity.

References

  1. Sternberg RJ. Wisdom, Intelligence and Creativity Synthesized. Cambridge: Cambridge University Press; 2003. 246 p
  2. Treffinger DJ. Creativity, Creative Thinking, and Critical Thinking: In Search of Definitions. Sarasota, FL: Center for Creative Learning; 1996
  3. Treffinger DJ, Young GC, Selby EC, Shepardson C. Assessing Creativity: A Guide for Educators. Sarasota, FL: National Research Center on the Gifted and Talented, Center for Creative Learning; 2002
  4. Plucker JA, Makel MC. Assessment of creativity. In: Kaufman JC, Sternberg RJ, editors. The Cambridge Handbook of Creativity. New York, NY: Cambridge University Press; 2010. pp. 48-73
  5. Simpson FK, Lokugamage AU. The elephant and the blind men: The children of long Covid. BMJ. 2021;372. DOI: 10.1136/bmj.n157
  6. MacLean EL, Wilson SR, Martin WL, Davis JM, Nazarloo HP, Carter CS. Challenges for measuring oxytocin: The blind men and the elephant? Psychoneuroendocrinology. 2019;107:225-231. DOI: 10.1016/j.psyneuen.2019.05.018
  7. DeWind J. Blind men and the elephant: One view of the field of migration studies. Comparative Migration Studies. 2020;8(1):1-16
  8. Nooshin L. The elephant and the blind men: Myth-making, musical tracking and the creative process. In: Keynote Paper presented at Tracking the Creative Process in Music. 2017. pp. 1-19
  9. Rhodes M. An analysis of creativity. Phi Delta Kappan. 1961;42:305-310. DOI: 10.2307/20342603
  10. Mednick S. The associative basis of the creative process. Psychological Review. 1962;69(3):220-232
  11. Guilford JP. The Nature of Human Intelligence. New York: McGraw Hill; 1967. 538 p
  12. Sio UN, Ormerod TC. Does incubation enhance problem solving? A meta-analytic review. Psychological Bulletin. 2009;135:94-120
  13. Wallas G. The Art of Thought. San Diego: Harcourt, Brace; 1926. 202 p
  14. Amabile TM. Social psychology of creativity: A consensual assessment technique. Journal of Personality and Social Psychology. 1982;43:997-1013
  15. Finke RA, Ward TB, Smith SM. Creative Cognition: Theory, Research, and Applications. Cambridge: MIT Press; 1992
  16. Sternberg RJ, Lubart TI. Investing in creativity. American Psychologist. 1995;51(7):677
  17. Csikszentmihalyi M. The Systems Model of Creativity: The Collected Works of Mihaly Csikszentmihalyi. Springer; 2015. 163 p
  18. Kaufman JC, Baer J. The amusement park theory of creativity. In: Kaufman JC, Baer J, editors. Creativity Across Domains: Faces of the Muse. Mahwah, NJ: Erlbaum; 2005. pp. 321-328
  19. Boden MA. The Creative Mind: Myths and Mechanisms. Routledge; 2004
  20. Kaufman JC, Beghetto RA. Beyond big and little: The four-c model of creativity. Review of General Psychology. 2009;13(1):1-12. DOI: 10.1037/a0013688
  21. Glăveanu VP. Rewriting the language of creativity: The five A's framework. Review of General Psychology. 2013;17(1):69-81
  22. Stevenson C, Baas M, van der Maas H. A minimal theory of creative ability. Journal of Intelligence. 2021;9(1):9
  23. Center for Creative Learning. Assessing Creativity Index [Internet]. 2015. Available from: https://www.creativelearning.com/index.php/free-resources/assessing-creativity-index [Accessed: October 17, 2021]
  24. Cropley AJ. Defining and measuring creativity: Are creativity tests worth using? Roeper Review. 2000;23(2):72-79
  25. Walia C. A dynamic definition of creativity. Creativity Research Journal. 2019;31(3):237-247
  26. Weiss S, Wilhelm O, Kyllonen P. An improved taxonomy of creativity measures based on salient task attributes. Psychology of Aesthetics, Creativity, and the Arts. 2021;2-9. DOI: 10.1037/aca0000434
  27. Barbot B, Hass RW, Reiter-Palmon R. Creativity assessment in psychological research: (Re)setting the standards. Psychology of Aesthetics, Creativity, and the Arts. 2019;13(2):233
  28. Said-Metwaly S, Van den Noortgate W, Kyndt E. Methodological issues in measuring creativity: A systematic literature review. Creativity: Theories-Research-Applications. 2017;4(2):276-301
  29. Baer J. Is creativity domain specific? In: Kaufman JC, Sternberg RJ, editors. The Cambridge Handbook of Creativity. 2010. pp. 321-341
  30. Weiss S, Wilhelm O. Coda: Creativity in psychological research versus in linguistics—Same but different? Cognitive Semiotics, Interdisciplinary Creativity Research. 2020;13(1):1-18
  31. Weiss S, Wilhelm O, Kyllonen P. A review and taxonomy of creativity measures. Psychology of Aesthetics, Creativity, and the Arts. 2020. Submitted for publication
  32. Sawyer RK. Explaining Creativity: The Science of Human Innovation. Oxford University Press; 2011
  33. Acar S, Runco MA. Assessing associative distance among ideas elicited by tests of divergent thinking. Creativity Research Journal. 2014;26(2):229-238
  34. Baer J. The case for domain specificity of creativity. Creativity Research Journal. 1998;11:173-177. DOI: 10.1207/s15326934crj1102_7
  35. Plucker JA, Beghetto RA. Why creativity is domain general, why it looks domain specific, and why the distinction does not matter. In: Sternberg RJ, Grigorenko EL, Singer JL, editors. Creativity: From Potential to Realization. American Psychological Association; 2004. pp. 153-167. DOI: 10.1037/10692-009
  36. Weisberg RW. Creativity: Understanding Innovation in Problem Solving, Science, Invention, and the Arts. New Jersey: John Wiley & Sons; 2006. 622 p
  37. Holtgraves TM. Language as Social Action: Social Psychology and Language Use. Psychology Press; 2013
  38. Semin GR. Cognition, language, and communication. In: Fussell SR, Kreuz RJ, editors. Social and Cognitive Approaches to Interpersonal Communication. Hillsdale, NJ: Erlbaum; 1998. pp. 229-257
  39. Astutik S, Prahani BK. The practicality and effectiveness of collaborative creativity learning (CCL) model by using PhET simulation to increase students' scientific creativity. International Journal of Instruction. 2018;11(4):409-424
  40. Hu W, Adey P. A scientific creativity test for secondary school students. International Journal of Science Education. 2002;24(4):389-403. DOI: 10.1080/09500690110098912
  41. Kanlı E. The development of creative scientific associations test and examining its psychometric properties [thesis]. Istanbul: Istanbul University; 2014
  42. Park JW. A suggestion of cognitive model of scientific creativity (CMSC). Journal of the Korean Association for Science Education. 2004;24(2):375-386
  43. Simonton DK. Varieties of (scientific) creativity: A hierarchical model of domain-specific disposition, development, and achievement. Perspectives on Psychological Science. 2009;4(5):441-452
  44. Turkman B, Runco MA. Discovering the creativity of written works: The keywords study. Gifted and Talented International. 2019;34(1-2):19-29
  45. Zawada B. Linguistic creativity from a cognitive perspective. Southern African Linguistics and Applied Language Studies. 2006;24(2):235-254
  46. Langlotz A. Idiomatic Creativity: A Cognitive-Linguistic Model of Idiom-Representation and Idiom-Variation in English. Vol. 17. John Benjamins Publishing; 2006. pp. 185-194
  47. Andreasen NC. The Creating Brain: The Neuroscience of Genius. Dana Press; 2011
  48. Beghetto RA. Creativity in the classroom. In: Kaufman JC, Sternberg RJ, editors. The Cambridge Handbook of Creativity. Cambridge University Press; 2010. pp. 447-463
  49. Atesgoz NN, Sak U. Test of scientific creativity animations for children: Development and validity study. Thinking Skills and Creativity. 2021;40:100818
  50. Torrance EP. Can We Teach Children to Think Creatively? American Educational Research Association; 1972
  51. Woolfolk AE. Educational Psychology. 13th ed. Pearson Education; 2016
  52. Couch JD, Towne J. Rewiring Education: How Technology Can Unlock Every Student's Potential. BenBella Books; 2018
  53. Prensky MR. Teaching Digital Natives: Partnering for Real Learning. Corwin Press; 2010
  54. Gardner H, Davis K. The App Generation. Yale University Press; 2013
  55. What is innovative about the PISA 2022 Creative Thinking assessment? [Internet]. 2021. Available from: https://www.oecd.org/pisa/innovation/creative-thinking/ and http://actnext.info/drawingtool/eng-ZZZ/question01.html [Accessed: September 29, 2021]
  56. Beaty RE, Johnson DR. Automating creativity assessment with SemDis: An open platform for computing semantic distance. Behavior Research Methods. 2021;53:757-780. DOI: 10.3758/s13428-020-01453-w
  57. LaVoie N, Parker J, Legree PJ, Ardison S, Kilcullen RN. Using latent semantic analysis to score short answer constructed responses: Automated scoring of the consequences test. Educational and Psychological Measurement. 2020;80(2):399-414
  58. Olteţeanu AM, Falomir Z. comRAT-C: A computational compound remote associates test solver based on language data and its comparison to human performance. Pattern Recognition Letters. 2015;67:81-90
  59. Olteţeanu AM, Schultheis H, Dyer JB. Computationally constructing a repository of compound remote associates test items in American English with comRAT-G. Behavior Research Methods. 2018;50(5):1971-1980
  60. Villani D, Antonietti A. Measurement of creativity. In: Encyclopedia of Creativity, Invention, Innovation and Entrepreneurship. 2020. pp. 1589-1594. DOI: 10.1007/978-3-319-15347-6_377
  61. Dumas D, Organisciak P, Doherty M. Measuring divergent thinking originality with human raters and text-mining models: A psychometric comparison of methods. Psychology of Aesthetics, Creativity, and the Arts. 2020:1-67. DOI: 10.1037/aca0000319
  62. Dumas D, Runco M. Objectively scoring divergent thinking tests for originality: A re-analysis and extension. Creativity Research Journal. 2018;30(4):466-468
  63. Heinen DJP, Johnson DR. Semantic distance: An automated measure of creativity that is novel and appropriate. Psychology of Aesthetics, Creativity, and the Arts. 2018;12(2):144
  64. Diedrich J, Benedek M, Jauk E, Neubauer AC. Are creative ideas novel and useful? Psychology of Aesthetics, Creativity, and the Arts. 2015;9(1):35-40. DOI: 10.1037/a0038688
  65. Kenett YN. What can quantitative measures of semantic distance tell us about creativity? Current Opinion in Behavioral Sciences. 2019;27:11-16. DOI: 10.1016/j.cobeha.2018.08.010
  66. Prabhakaran R, Green AE, Gray JR. Thin slices of creativity: Using single-word utterances to assess creative cognition. Behavior Research Methods. 2014;46(3):641-659. DOI: 10.3758/s13428-013-0401-7
  67. Zedelius CM, Mills C, Schooler JW. Beyond subjective judgments: Predicting evaluations of creative writing from computational linguistic features. Behavior Research Methods. 2019;51(2):879-894. DOI: 10.3758/s13428-018-1137-1
  68. Ratna AAP, Purnamasari PD, Adhi BA. SIMPLE-O, the essay grading system for Indonesian language using LSA method with multi-level keywords. In: ACE ACSET ACEurs. 2015. pp. 155-164
  69. Landauer TK, Foltz PW, Laham D. An introduction to latent semantic analysis. Discourse Processes. 1998;25(2-3):259-284
  70. Foltz PW, Kintsch W, Landauer TK. The measurement of textual coherence with latent semantic analysis. Discourse Processes. 1998;25(2-3):285-307
  71. Varçın F, Erbay H, Horasan F. Latent semantic analysis via truncated ULV decomposition. In: 24th Signal Processing and Communication Application Conference (SIU '16); 16-19 May 2016. Zonguldak: IEEE; 2016. pp. 1333-1336. DOI: 10.1109/SIU.2016.7495994
  72. Deerwester S, Dumais ST, Furnas GW, Landauer TK, Harshman R. Indexing by latent semantic analysis. Journal of the American Society for Information Science. 1990;41(6):391-407. DOI: 10.1002/(SICI)1097-4571(199009)41:6<391::AID-ASI1>3.0.CO;2-9
  73. Kenett YN. Going the extra creative mile: The role of semantic distance in creativity—Theory, research, and measurement. In: Jung RE, Vartanian O, editors. The Cambridge Handbook of the Neuroscience of Creativity. Cambridge University Press; 2018. pp. 233-248. DOI: 10.1017/9781316556238.014
  74. Liu C, Ren Z, Zhuang K, He L, Yan T, Zeng R, et al. Semantic association ability mediates the relationship between brain structure and human creativity. Neuropsychologia. 2021;151:107722. DOI: 10.1016/j.neuropsychologia.2020.107722
  75. Colton S. The painting fool: Stories from building an automated painter. In: McCormack J, d'Inverno M, editors. Computers and Creativity. Springer; 2012. pp. 3-38
  76. Moruzzi C. Creative AI: Music composition programs as an extension of the composer's mind. In: 3rd Conference on Philosophy and Theory of Artificial Intelligence. 2017. pp. 69-72. DOI: 10.1007/978-3-319-96448-5_8
  77. Williams H, McOwan PW. Magic in the machine: A computational magician's assistant. Frontiers in Psychology. 2014;5:1283. DOI: 10.3389/fpsyg.2014.01283
  78. Issa L, Alghanim F, Obeid N. Computational creativity: The design of a creative computer program. In: 2019 10th International Conference on Information and Communication Systems (ICICS '19); 11-13 June 2019; Irbid, Jordan. IEEE; 2019. pp. 193-198. DOI: 10.1109/IACS.2019.8809107
  79. Ruscio J, Whitney DM, Amabile TM. Looking inside the fishbowl of creativity: Verbal and behavioral predictors of creative performance. Creativity Research Journal. 1998;11(3):243-263. DOI: 10.1207/s15326934crj1103_4
  80. Estes Z, Ward TB. The emergence of novel attributes in concept modification. Creativity Research Journal. 2002;14(2):149-156. DOI: 10.1207/S15326934CRJ1402_2
  81. Allen LK, Snow EL, Crossley SA, Jackson GT, McNamara DS. Reading comprehension components and their relation to writing. L'Année Psychologique. 2014;114(4):663-691. DOI: 10.4074/S0003503314004047
  82. Fitzgerald J, Shanahan T. Reading and writing relations and their development. Educational Psychologist. 2000;35(1):39-50. DOI: 10.1207/S15326985EP3501_5
  83. Tierney RJ, Shanahan T. Research on the reading–writing relationship: Interactions, transactions, and outcomes. In: Barr R, Kamil ML, Mosenthal PB, Pearson PD, editors. Handbook of Reading Research. Vol. 2. 1991. pp. 246-280
  84. Allen LK, McNamara DS. You Are Your Words: Modeling Students' Vocabulary Knowledge with Natural Language Processing Tools. International Educational Data Mining Society; 2015. pp. 258-265
  85. Attali Y, Powers D. A developmental writing scale. ETS Research Report Series. 2008;1:i-59. DOI: 10.1002/j.2333-8504.2008.tb02105.x
  86. Allen LK, Snow EL, McNamara DS. The narrative waltz: The role of flexibility in writing proficiency. Journal of Educational Psychology. 2016;108(7):911-924
  87. Saddler B, Graham S. The relationship between writing knowledge and writing performance among more and less skilled writers. Reading & Writing Quarterly. 2007;23(3):231-247
  88. Runco MA, Turkman B, Acar S, Nural MV. Idea density and the creativity of written works. Journal of Genius and Eminence. 2017;2(1):26-31. DOI: 10.18536/jge.2017.04.02.01.03
  89. Dumas D, Dunbar KN. Understanding fluency and originality: A latent variable perspective. Thinking Skills and Creativity. 2014;14:56-67. DOI: 10.1016/j.tsc.2014.09.003
  90. Forster EA, Dunbar KN. Creativity evaluation through latent semantic analysis. Proceedings of the Annual Conference of the Cognitive Science Society. 2009;31:602-607
  91. Wolfe MBW, Goldman SR. Use of latent semantic analysis for predicting psychological phenomena: Two issues and proposed solutions. Behavior Research Methods, Instruments, & Computers. 2003;35:22-31. DOI: 10.3758/BF03195494
  92. Louwerse MM, Zwaan RA. Language encodes geographical information. Cognitive Science. 2009;33(1):51-73. DOI: 10.1111/j.1551-6709.2008.01003.x
  93. Landauer TK, Dumais ST. A solution to Plato's problem: The latent semantic analysis theory of acquisition, induction, and representation of knowledge. Psychological Review. 1997;104(2):211-240. DOI: 10.1037/0033-295X.104.2.211
  94. Salaria N. Meaning of the term descriptive survey research method. International Journal of Transformations in Business Management. 2012;1(6):1-7
  95. Pandey P, Pandey MM. Research Methodology: Tools and Techniques. Buzau: Bridge Center; 2021. 85 p
  96. Guegan J, Nelson J, Lubart T. The relationship between contextual cues in virtual environments and creative processes. Cyberpsychology, Behavior and Social Networking. 2017;20(3):202-206. DOI: 10.1089/cyber.2016.0503
  97. Cseh GM, Jeffries KK, Lochrie M, Egglestone P, Beattie AA. A DigitalCAT: A fusion of creativity assessment theory and HCI. In: The 30th International BCS Human Computer Interaction Conference (HCI '16); 11-15 July 2016. Poole, UK: Bournemouth University; 2016
  98. Reiter-Palmon R, Forthmann B, Barbot B. Scoring divergent thinking tests: A review and systematic framework. Psychology of Aesthetics, Creativity, and the Arts. 2019;13(2):144-152. DOI: 10.1037/aca0000227
