Automating creativity assessment with SemDis : An open platform for computing semantic distance



Roger E. Beaty 1 & Dan R. Johnson 2

© The Author(s) 2020

Abstract

Creativity research requires assessing the quality of ideas and products. In practice, conducting creativity research often involves asking several human raters to judge participants' responses to creativity tasks, such as judging the novelty of ideas from the alternate uses task (AUT). Although such subjective scoring methods have proved useful, they have two inherent limitations: labor cost (raters typically code thousands of responses) and subjectivity (raters vary in their perceptions and preferences), raising classic psychometric threats to reliability and validity. We sought to address the limitations of subjective scoring by capitalizing on recent developments in automated scoring of verbal creativity via semantic distance, a computational method that uses natural language processing to quantify the semantic relatedness of texts. In five studies, we compare the top-performing semantic models (e.g., GloVe, continuous bag of words) previously shown to have the highest correspondence to human relatedness judgments. We assessed these semantic models in relation to human creativity ratings from a canonical verbal creativity task (AUT; Studies 1–3) and novelty/creativity ratings from two word association tasks (Studies 4–5). We find that a latent semantic distance factor, composed of the common variance from five semantic models, reliably and strongly predicts human creativity and novelty ratings across a range of creativity tasks. We also replicate an established experimental effect in the creativity literature (i.e., the serial order effect) and show that semantic distance correlates with other creativity measures, demonstrating convergent validity. We provide an open platform to efficiently compute semantic distance, including tutorials and documentation (https://osf.io/gz4fc/).

Keywords: Assessment · Creativity · Divergent thinking · Semantic distance · Word association
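To make the core computation concrete: semantic distance is typically operationalized as 1 minus the cosine similarity between word-embedding vectors. The sketch below is illustrative only, not the authors' SemDis implementation; the three-dimensional vectors are made-up toy values, whereas a real analysis would use pretrained embeddings (e.g., GloVe) with hundreds of dimensions.

```python
import math

def semantic_distance(vec_a, vec_b):
    """Return 1 - cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm_a = math.sqrt(sum(a * a for a in vec_a))
    norm_b = math.sqrt(sum(b * b for b in vec_b))
    return 1.0 - dot / (norm_a * norm_b)

# Hypothetical toy embeddings for an AUT cue word and two responses.
brick = [0.9, 0.1, 0.0]
house = [0.8, 0.2, 0.1]      # common use: vector close to the cue
doorstop = [0.1, 0.9, 0.3]   # unusual use: vector far from the cue

# A more novel response yields a larger semantic distance from the cue.
print(semantic_distance(brick, house) < semantic_distance(brick, doorstop))
```

Under this toy setup, the uncommon use ("doorstop") sits farther from the cue in embedding space than the common use ("house"), mirroring how the measure is meant to track novelty.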

R.E.B. is supported by a grant from the National Science Foundation [DRL-1920653].

Electronic supplementary material The online version of this article (https://doi.org/10.3758/s13428-020-01453-w) contains supplementary material, which is available to authorized users.

* Roger E. Beaty [email protected]
* Dan R. Johnson [email protected]

1 Department of Psychology, Pennsylvania State University, 140 Moore Building, University Park, PA 16802, USA

2 Department of Cognitive and Behavioral Science, Washington and Lee University, Lexington, VA 24450, USA

Creativity researchers have long grappled with how to measure creativity. Indeed, the question of how best to capture creativity remains open and active, with a recent special issue on creativity assessment published in Psychology of Aesthetics, Creativity, and the Arts (Barbot, Hass, & Reiter-Palmon, 2019). Over the years, a range of assessment approaches have been developed, from methods that rely on experts to judge the creat