An evaluation methodology for crowdsourced design

Research output: Contribution to journal › Article

15 Citations (Scopus)

Abstract

In recent years, the “power of the crowd” has been repeatedly demonstrated, and various Internet platforms have been used to apply collaborative intelligence to tasks ranging from open innovation to image analysis. However, crowdsourcing applications in the fields of design research and creative innovation have been much slower to emerge. So, although there have been reports of systems and researchers using Internet crowdsourcing to carry out generative design, there are still many gaps in knowledge about the capabilities and limitations of the technology. Indeed, the process models developed to support traditional commercial design (e.g. Pugh’s Total Design, Agile, Double Diamond) have yet to be established for Crowdsourced Design.
As a contribution to the development of such a general model, this paper proposes the cDesign framework to support the creation of Crowdsourced Design activities. Within the cDesign framework, the effective evaluation of design quality is identified as a key component that not only enables the leveraging of a large virtual workforce’s creative activities but is also fundamental to most iterative and optimisation processes.
This paper reports an experimental investigation (developed using the cDesign framework) into two different Crowdsourced Design evaluation approaches: free evaluation and ‘crowdsourced Design Evaluation Criteria’ (cDEC). The results are benchmarked against an evaluation carried out by a panel of experienced designers. The results suggest that the cDEC approach produces design rankings that correlate strongly with the judgements of the expert panel. The paper concludes that the cDEC assessment methodology demonstrates how crowdsourcing can be effectively used to evaluate, as well as generate, new design solutions.
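The benchmarking step described above compares a crowd-derived ranking of designs against the ranking produced by the expert panel. A minimal sketch of how such a comparison can be computed is shown below, using Spearman's rank correlation (a common choice for comparing two rankings; the abstract does not name the exact statistic used in the paper, and the example rankings are hypothetical):

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman's rank correlation for two rankings with no tied ranks.

    Uses the closed form 1 - 6*sum(d^2) / (n*(n^2 - 1)), where d is the
    per-design difference between the two assigned ranks.
    """
    if len(ranks_a) != len(ranks_b):
        raise ValueError("rankings must cover the same set of designs")
    n = len(ranks_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Hypothetical rankings of five candidate designs (1 = best):
crowd_ranking = [1, 2, 3, 5, 4]    # e.g. from a cDEC-based crowd evaluation
expert_ranking = [1, 2, 3, 4, 5]   # e.g. from the experienced-designer panel
print(spearman_rho(crowd_ranking, expert_ranking))  # → 0.9
```

A value near +1 indicates the crowd and expert panel ordered the designs almost identically, which is the kind of strong correlation the paper reports for the cDEC approach.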
Language: English
Pages: 775-786
Number of pages: 12
Journal: Advanced Engineering Informatics
Volume: 29
Issue number: 4
Early online date: 23 Oct 2015
DOI: 10.1016/j.aei.2015.09.005
Publication status: Published - 31 Oct 2015

Keywords

  • crowdsourcing
  • crowdsourced design methodology
  • design evaluation
  • crowdsourced design evaluation criteria
  • collaborative design
  • human based genetic algorithm

Cite this

@article{b1fcc6678dc549179c300b049e52e49a,
title = "An evaluation methodology for crowdsourced design",
abstract = "In recent years, the “power of the crowd” has been repeatedly demonstrated and various Internet platforms have been used to support applications of collaborative intelligence to tasks ranging from open innovation to image analysis. However, crowdsourcing applications in the fields of design research and creative innovation have been much slower to emerge. So, although there have been reports of systems and researchers using Internet crowdsourcing to carry out generative design, there are still many gaps in knowledge about the capability and limitations of the technology. Indeed the process models developed to support traditional commercial design (e.g. Pugh’s Total Design, Agile, Double-Diamond etc.) have yet to be established for Crowdsourced Design. As a contribution to the development of such a general model this paper proposes the cDesign framework to support the creation of Crowdsourced Design activities. Within the cDesign framework the effective evaluation of design quality is identified as a key component that not only enables the leveraging of a large, virtual workforces’ creative activities but is also fundamental to most iterative and optimisation processes.This paper reports an experimental investigation (developed using the cDesign framework) into two different Crowdsourced design evaluation approaches; free evaluation and ‘crowdsourced Design Evaluation Criteria’ (cDEC). The results are benchmarked against an evaluation carried out by a panel of experienced designers. The results suggest that the cDEC approach produces design rankings that correlate strongly with the judgements of an “expert panel”. The paper concludes that cDEC assessment methodology demonstrates how Crowdsourcing can be effectively used to evaluate, as well as generate, new design solutions.",
keywords = "crowdsourcing, crowdsourced design methodology, design evaluation, crowdsourced design evaluation criteria, collaborative design, human based genetic algorithm",
author = "Hao Wu and Jonathan Corney and Michael Grant",
year = "2015",
month = "10",
day = "31",
doi = "10.1016/j.aei.2015.09.005",
language = "English",
volume = "29",
pages = "775--786",
journal = "Advanced Engineering Informatics",
issn = "1474-0346",
number = "4",

}

An evaluation methodology for crowdsourced design. / Wu, Hao; Corney, Jonathan; Grant, Michael.

In: Advanced Engineering Informatics, Vol. 29, No. 4, 31.10.2015, p. 775-786.

Research output: Contribution to journal › Article

TY - JOUR

T1 - An evaluation methodology for crowdsourced design

AU - Wu, Hao

AU - Corney, Jonathan

AU - Grant, Michael

PY - 2015/10/31

Y1 - 2015/10/31

AB - In recent years, the “power of the crowd” has been repeatedly demonstrated and various Internet platforms have been used to support applications of collaborative intelligence to tasks ranging from open innovation to image analysis. However, crowdsourcing applications in the fields of design research and creative innovation have been much slower to emerge. So, although there have been reports of systems and researchers using Internet crowdsourcing to carry out generative design, there are still many gaps in knowledge about the capability and limitations of the technology. Indeed the process models developed to support traditional commercial design (e.g. Pugh’s Total Design, Agile, Double-Diamond etc.) have yet to be established for Crowdsourced Design. As a contribution to the development of such a general model this paper proposes the cDesign framework to support the creation of Crowdsourced Design activities. Within the cDesign framework the effective evaluation of design quality is identified as a key component that not only enables the leveraging of a large, virtual workforces’ creative activities but is also fundamental to most iterative and optimisation processes.This paper reports an experimental investigation (developed using the cDesign framework) into two different Crowdsourced design evaluation approaches; free evaluation and ‘crowdsourced Design Evaluation Criteria’ (cDEC). The results are benchmarked against an evaluation carried out by a panel of experienced designers. The results suggest that the cDEC approach produces design rankings that correlate strongly with the judgements of an “expert panel”. The paper concludes that cDEC assessment methodology demonstrates how Crowdsourcing can be effectively used to evaluate, as well as generate, new design solutions.

KW - crowdsourcing

KW - crowdsourced design methodology

KW - design evaluation

KW - crowdsourced design evaluation criteria

KW - collaborative design

KW - human based genetic algorithm

UR - https://www.sciencedirect.com/journal/advanced-engineering-informatics

U2 - 10.1016/j.aei.2015.09.005

DO - 10.1016/j.aei.2015.09.005

M3 - Article

VL - 29

SP - 775

EP - 786

JO - Advanced Engineering Informatics

T2 - Advanced Engineering Informatics

JF - Advanced Engineering Informatics

SN - 1474-0346

IS - 4

ER -