Applying Rapid Crowdsourced Playtesting to a Human Computation Game

2021 
Player engagement and task effectiveness are crucial factors in human computation games. However, collecting data and making design changes toward these goals can be time-consuming. In this work, we incorporate rapid crowdsourced playtesting via the ARAPID (As Rapid As Possible Iterative Design) system to iterate on the design of a human computation platformer game. In each level of the game, the player's goal is to collect items relevant to a given scenario while avoiding irrelevant items. We extended the visualization modules in the existing ARAPID system with a multi-level data visualization and an item-collection task-effectiveness plot. A designer from the project team used the system to iterate on the game's level design, with the goal of increasing the number of relevant items and decreasing the number of irrelevant items collected by players. A large-scale test with the game versions created during the iterative design process found that the designer was able to use ARAPID to improve the specified goal parameters.
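To make the abstract's goal metrics concrete, the following is a minimal sketch of how per-level item-collection effectiveness might be aggregated from crowdsourced playtest logs. All names here (PlaytestSession, summarize, the field names) are illustrative assumptions, not the paper's actual ARAPID API or data schema.

```python
# Hypothetical sketch: aggregating item-collection effectiveness per level
# from playtest logs. Not the paper's actual ARAPID implementation.
from dataclasses import dataclass

@dataclass
class PlaytestSession:
    level: str
    relevant_collected: int    # scenario-relevant items the player picked up
    relevant_available: int    # scenario-relevant items placed in the level
    irrelevant_collected: int  # irrelevant items the player picked up

def summarize(sessions: list[PlaytestSession]) -> dict[str, dict[str, float]]:
    """Per-level averages a designer might plot while iterating:
    a higher relevant-collection rate and fewer irrelevant pickups
    correspond to the design goals stated in the abstract."""
    by_level: dict[str, list[PlaytestSession]] = {}
    for s in sessions:
        by_level.setdefault(s.level, []).append(s)
    summary: dict[str, dict[str, float]] = {}
    for level, group in by_level.items():
        n = len(group)
        summary[level] = {
            "relevant_rate": sum(s.relevant_collected / s.relevant_available
                                 for s in group) / n,
            "irrelevant_per_session": sum(s.irrelevant_collected
                                          for s in group) / n,
        }
    return summary

if __name__ == "__main__":
    logs = [
        PlaytestSession("level-1", relevant_collected=4,
                        relevant_available=5, irrelevant_collected=2),
        PlaytestSession("level-1", relevant_collected=3,
                        relevant_available=5, irrelevant_collected=1),
    ]
    print(summarize(logs))
```

A designer comparing two level versions would rerun this aggregation per version and look for a rising relevant rate alongside a falling irrelevant count.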