Developing a Test to Measure Design Thinking

Anna Rusmann, Jeppe Bundsgaard

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

2 Citations (Scopus)

Abstract

Like other progressive learning approaches, game-based learning is expected to develop a number of advanced competences in students (Gee, 2007). These competences are often referred to as 21st century skills (Partnership for 21st Century Skills, 2002; Griffin and Care, 2015). Various frameworks of 21st century skills have been proposed, and what they typically have in common is the inclusion of competences such as collaboration, critical thinking and ICT skills. Two ongoing Danish research projects, Game-Based Learning in the 21st Century and Community Drive, involve school-based interventions based on principles from design studies. Students learn to collaboratively investigate real-world problems, conceive ideas, build prototypes of solutions and present them to external parties. The aim is to develop students' design thinking skills, which can be seen as a subset or special category of 21st century skills. To measure the effects of the interventions, we have developed a computer-based test of design thinking. The test is a performance test, which means that, in addition to being asked to answer factual or procedural questions, students are given the opportunity to engage in activities such as building models, conceiving ideas and reflecting on ethical and social conflicts. To date, no other performance-based assessment of design skills has been developed (Razzouk and Shute, 2012). Our test consists of four test modules. Within each module, students solve different types of tasks within a simulated, authentic narrative. The tasks measure various aspects of four design competences, which we argue are relevant for primary and lower secondary school students. The Rasch model is used to test the dimensionality of the data and scale the responses on the four dimensions. A number of tasks can be scored automatically, while others require human evaluation. In this paper, we present the test, the underlying design thinking theory, the design decisions and the statistical Rasch model used to scale the responses and validate the test. The principles behind the task scoring are also outlined, and the limitations of the test format are discussed in relation to the measurement of design thinking.
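
The abstract notes that responses are scaled with the Rasch model. As a rough illustration of what such scaling involves, the sketch below fits a dichotomous Rasch model, P(X=1) = exp(theta - beta) / (1 + exp(theta - beta)), to a hypothetical toy response matrix by joint maximum likelihood in Python. The data, function names and estimation choices are illustrative assumptions only and are not the authors' analysis, which a multidimensional scaling of four competences would typically carry out in dedicated IRT software.

```python
# Minimal sketch of the dichotomous Rasch model (illustration only, not the
# authors' scaling procedure). theta = person ability, beta = item difficulty.
import numpy as np
from scipy.optimize import minimize

def rasch_probability(theta, beta):
    """Probability of a correct/credited response under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - beta)))

def negative_log_likelihood(params, responses):
    """Joint negative log-likelihood for a persons x items 0/1 matrix."""
    n_persons, n_items = responses.shape
    theta = params[:n_persons]           # person abilities
    beta = params[n_persons:]            # item difficulties
    p = rasch_probability(theta[:, None], beta[None, :])
    p = np.clip(p, 1e-9, 1 - 1e-9)       # numerical safety
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

# Hypothetical toy data: 6 students x 4 dichotomously scored tasks.
X = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
], dtype=float)

start = np.zeros(X.shape[0] + X.shape[1])
fit = minimize(negative_log_likelihood, start, args=(X,), method="BFGS")

theta_hat = fit.x[:X.shape[0]]
beta_hat = fit.x[X.shape[0]:]
# The scale is identified only up to a constant, so centre on the item difficulties.
theta_hat -= beta_hat.mean()
beta_hat -= beta_hat.mean()
print("Estimated person abilities:", np.round(theta_hat, 2))
print("Estimated item difficulties:", np.round(beta_hat, 2))
```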
Original language: English
Title of host publication: Proceedings of the 13th European Conference on Game-Based Learning
Editors: Lars Elbaek, Gunver Majgaard, Andrea Valente, Saifuddin Khalid
Number of pages: 9
Place of publication: SDU, Odense, Denmark
Publisher: Academic Conferences and Publishing International
Publication date: 3 Oct 2019
Pages: 587-595
ISBN (Print): 978-1-912764-38-9
ISBN (Electronic): 978-1-912764-37-2
Publication status: Published - 3 Oct 2019
Event: 13th European Conference on Games-Based Learning 2019 - Odense, Denmark
Duration: 3 Oct 2019 - 4 Oct 2019

Conference

Conference: 13th European Conference on Games-Based Learning 2019
Country/Territory: Denmark
City: Odense
Period: 03/10/2019 - 04/10/2019
Series: Proceedings of the European Conference on Games-based Learning
ISSN: 2049-0992

Keywords

  • Competence measurement
  • Design thinking
  • Game-based learning
  • Lower secondary education
  • Primary education
  • Rasch model
