Assessing creativity with divergent thinking tasks: Exploring the reliability and validity of new subjective scoring methods.
- UNCG Author/Contributor (non-UNCG co-authors, if any, appear on the document)
- Paul Silvia, Professor (Creator)
- John T. Willse, Assistant Professor (Contributor)
- Beate P. Winterstein (Contributor)
- Institution
- The University of North Carolina at Greensboro (UNCG)
- Web Site: http://library.uncg.edu/
Abstract: Divergent thinking is central to the study of individual differences in creativity, but the traditional scoring systems (assigning points for infrequent responses and summing the points) face well-known problems. After critically reviewing past scoring methods, this article describes a new approach to assessing divergent thinking and appraises its reliability and validity. In our new Top 2 scoring method, participants complete a divergent thinking task and then circle the two responses they consider their most creative. Raters then evaluate the responses on a 5-point scale. Regarding reliability, a generalizability analysis showed that subjective ratings of unusual-uses tasks and instances tasks yield dependable scores with only 2 or 3 raters. Regarding validity, a latent-variable study (n = 226) predicted divergent thinking from the Big Five factors and their higher-order traits (Plasticity and Stability). Over half of the variance in divergent thinking could be explained by dimensions of personality. The article presents instructions for measuring divergent thinking with the new method.
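The following is a minimal, hypothetical sketch of how a Top 2 score could be computed from rater data, based only on the abstract's description (participants circle their two best responses; raters score responses on a 1-5 scale). The data structure, function name, and example ratings are illustrative assumptions, not the article's materials or analysis code.

```python
# Hypothetical Top 2 scoring sketch (not the article's code).
# Assumption: each rater assigns a 1-5 creativity rating to each of the
# two responses the participant circled; the participant's Top 2 score
# is the mean of those ratings across responses and raters.

from statistics import mean

def top2_score(ratings_by_rater):
    """ratings_by_rater: one (response_1_rating, response_2_rating) pair
    per rater, each rating on the 1-5 scale."""
    return mean(mean(pair) for pair in ratings_by_rater)

# Example: three raters score one participant's two circled responses.
ratings = [(3, 4), (2, 4), (3, 5)]
print(round(top2_score(ratings), 2))  # 3.5
```

With only 2 or 3 raters contributing to each average, the abstract's generalizability results suggest scores of this kind can still be dependable.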
PDF (Portable Document Format)
1879 KB
Created on 1/1/2008
Additional Information
- Publication
- Psychology of Aesthetics, Creativity, and the Arts, 2, 68-85
- Language: English
- Date: 2008
- Keywords
- Creativity, Divergent thinking, Generalizability theory, Validity, Reliability, Assessment