Published in Vol 9, No 2 (2021): Apr-Jun

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/21900.
A Gamification Framework for Cognitive Assessment and Cognitive Training: Qualitative Study

Original Paper

Department of Computer Engineering, Imam Khomeini International University, Qazvin, Iran

Corresponding Author:

Ali Khaleghi, PhD

Department of Computer Engineering

Imam Khomeini International University

Norouzian Boulevard

Qazvin

Iran

Phone: 98 9121003006

Email: ali.khaleghi.ir@gmail.com


Background: Cognitive tasks designed to measure or train cognition are often repetitive and presented in a monotonous manner, features that lead to participant boredom and disengagement. In this situation, participants do not put forth their best effort to do these tasks well. As a result, neuropsychologists cannot draw accurate conclusions about the data collected, and intervention effects are reduced. It is assumed that greater engagement and motivation will manifest as improved data quality. Gamification, the use of game elements in nongame settings, has been heralded as a potential mechanism for increasing participant engagement in cognitive tasks. Some studies have reported a positive effect of gamification on participant performance, although most studies have shown mixed results. One reason for these contrasting findings is that most studies have applied poor and heterogeneous design techniques to gamify cognitive tasks. Therefore, an appropriate gamification design framework is needed in these tasks.

Objective: This study aimed to propose a framework to guide the design of gamification in cognitive tasks.

Methods: We employed a design science research (DSR) approach to provide a framework for gamifying cognitive assessments and training by synthesizing current gamification design frameworks and gamification works in cognitive assessment and training, as well as incorporating field experiences. The prototypes of the framework were iteratively evaluated with 17 relevant experts.

Results: We proposed a framework consisting of 7 phases: (1) preparation; (2) knowing users; (3) exploring existing tools for assessing or training a targeted cognitive context and determining the suitability of game-up and mapping techniques; (4) ideation; (5) prototyping using the Objects, Mechanics, Dynamics, Emotions (OMDE) design guideline; (6) development; and (7) disseminating and monitoring.

Conclusions: We found that (1) an intermediate design framework is needed to gamify cognitive tasks, which means that game elements should be selected by considering current cognitive assessment or training context characteristics since game elements may impose an irrelevant cognitive load that, in turn, can jeopardize data quality; (2) in addition to developing a new gamified cognitive task from scratch, 2 gamification techniques are widely used (first, adding game elements to an existing cognitive task and second, mapping an existing game to a cognitive function or impairment to assess or train it); and (3) further research is required to investigate the interplay of cognitive processes and game mechanics.

JMIR Serious Games 2021;9(2):e21900

doi:10.2196/21900

Keywords



Background

Statistics show that the cognitive assessment and training market will grow at a rate of 32.39% from 2018 to 2022, with gamification as one of the key trends among vendors operating in this market [1]. These figures are consistent with the estimate that 1 out of 5 people will be over 60 years old within the next 40 years [2]. Minor and major neurocognitive disorders, which have prevalences of approximately 10%-20% and 5%-7%, respectively, are global health issues due to the aging population; according to the American Psychiatric Association Diagnostic and Statistical Manual of Mental Disorders, edition 5, minor and major neurocognitive disorders encompass mild cognitive impairment and dementia, respectively [3]. In minor neurocognitive disorders, cognitive abilities decline, but the changes are not severe enough to significantly affect individuals’ activities of daily living (ADLs); 35% of individuals with a minor neurocognitive disorder progress to a major neurocognitive disorder within a 3-year period [4]. At that stage, individuals lose independence in their ADLs and require care and support from others. Aging is not the only cause of cognitive impairments; they may also be present from childhood (as in attention deficit hyperactivity disorder [ADHD] and autism spectrum disorder) or arise from factors such as alcohol or drug abuse [5-7].

Cognitive assessment and training play an essential role in preventing loss of autonomy and independence in ADLs [8]. Cognitive assessment involves evaluating individuals’ cognitive abilities (eg, working memory, attention, executive functions) [9]. Cognitive training refers to using cognitive tasks to maintain or improve a particular aspect of cognitive functioning [9]. Dedicated cognitive tasks exist for both purposes. Tasks such as the Mini-Mental State Examination (MMSE) [10], Confusion Assessment Method (CAM) [11], and Montreal Cognitive Assessment (MoCA) [12] are widely used for cognitive assessment. Cognitive training companies have also developed training tasks such as Cogmed [13], Nintendo Brain Age [14], Lumosity [15], and Posit Science BrainHQ [16]. The prevailing approach in these companies for designing a training task is to convert lab-based, individualistic cognitive assessment tasks into training tasks [17] (eg, Nintendo Brain Age includes several cognitive assessment tasks, such as the Stroop task [18]). Cognitive tasks are thus a vital tool for assessing and training cognitive impairments. However, participants often view them as monotonous and boring because of their repetitive nature and rigid presentation [8,9,19-23]. These features increase the frequency of insufficient effort on these tasks, and consequently, the reliability of the data collected decreases [9,19,23-26]. The typical process cognitive specialists follow to run a cognitive training program is shown in Figure 1.

Figure 1. The process of traditional cognitive training programs (the figure was drawn based on descriptions provided by Vourvopoulos et al [8]). ADLs: activities of daily living.
View this figure

It is assumed that greater engagement and motivation will manifest as improved data quality in cognitive tasks [9,19-27]. Among the existing solutions, gamification, the process of adding game elements (eg, a scoring system, leaderboard, badges) to nongame contexts (eg, education, business, cognitive tasks) [28], stands out as one of the most influential and promising ways to improve motivation in monotonous tasks [9]. A greater understanding of human motivation helps sustain users' willingness to participate in cognitive tasks over time [17]. Motivation is multidimensional and falls on a continuum from intrinsic motivation to extrinsic motivation to amotivation (little to no motivation) [29]. Intrinsic motivation is regulated internally and refers to performing activities for their inherent satisfaction; it is required for long-term engagement and long-term change. In contrast, extrinsic motivation (doing activities solely for their outcomes and rewards) is regulated externally and is useful for short-term engagement and short-term change [29,30]. Self-determination theory (SDT) [31] and flow theory [32] are widely used to improve users’ participation and motivation. According to SDT [31], intrinsic motivation can be sustained by satisfying 3 psychological needs: relatedness (experienced when individuals feel connected to others), autonomy (the need for freedom to make choices based on one’s volition during an activity), and competence (the need for challenge and feelings of self-efficacy). Flow refers to “the state in which people are so involved in an activity that nothing else seems to matter; the experience itself is so enjoyable that people will do it even at great cost, for the sheer sake of doing it” [32]. Gamification can combine intrinsic motivation with extrinsic motivation to raise motivation and engagement [30].
Game elements such as badges, points, game levels, leaderboards, and avatars foster extrinsic motivation and are useful for capturing early user motivation [17]. Based on SDT [31] and flow theory [32], gamification can also improve participants’ intrinsic motivation through game elements such as optimal challenges and positive feedback (elements that satisfy the human need for competence). Figure 2 presents some examples of gamified cognitive tasks from Craven and Groom [33], Boendermaker et al [34], Lumsden et al [9], and Van de Weijer-Bergsma et al [35].

Figure 2. Examples of gamified cognitive tasks. A task was categorized as an assessment or training according to the self-prescription provided by the authors of the publication. The owl and apple pictures were used or adapted by [33] from openclipart [36] and 4vector [37] (online media repository of free graphics). ADHD: attention deficit hyperactivity disorder.
View this figure

Challenges With Designing Gamification for Cognitive Assessment or Training

Despite the growing trend towards using gamification in cognitive tasks, its impact on participant engagement and data quality is not stable. Some studies [20-22,27,38-40] have stated that gamification has a positive effect on users’ engagement as well as data quality. However, other studies [19,24,26,34] reported no effect of adding game elements on users’ performance, even though participants perceived the gamified task as more fun and more challenging than the nongamified version. Still other studies [19,27] showed that gamification worsened data quality while having positive effects on engagement, and the gamification applied by Birk et al [23] not only failed to improve data quality and engagement but actually worsened them [19]. These mixed findings are potentially due to 4 main reasons: (1) most gamified cognitive tasks have been developed by cognitive psychologists rather than professional gamification designers, and for these scientists the clinical effectiveness of a gamified task matters more than employing effective and creative gamification designs [41]; (2) a variety of gamification techniques have been applied to cognitive tasks [9,24,38]; (3) gamification techniques have been applied to different cognitive tasks [24]; and (4) the results obtained from gamified cognitive tasks are often preliminary and limited by small sample sizes, and the durations considered for evaluating the efficacy of gamified tasks are relatively short [9,23-25,42,43].

There are recommendations and design guidelines for integrating game elements into cognitive tasks (eg, [9,30]). However, to our knowledge, none of them proposes a detailed, step-by-step framework that clearly shows which factors are essential to gamification design in these tasks, from the early stages (eg, planning and preparation) through developing, evaluating, and disseminating gamified tasks, followed by monitoring their efficacy in the long term. More generally, several gamification frameworks not designed for cognitive tasks have been developed by experts (eg, [44-53]). These suffer from 3 main limitations: (1) most have been designed for enterprise and business contexts, with less focus on health contexts [54]; (2) cognitive tasks need to engage participants for the long term, but existing frameworks have not been designed for this purpose [54,55]; and (3) they do not specify how game elements should be added to a particular context. Incorporating game elements into cognitive tasks may jeopardize data quality by imposing an additional cognitive load on these tasks [56]. For example, Katz et al [25] gamified the N-Back task by including a real-time scoring system while participants completed the task. The gamified task, in comparison with the standard task, negatively impacted data quality. One possible explanation is that the game features imposed irrelevant cognitive demands by distracting players’ attention.
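As context for the Katz et al [25] example, the following is a minimal, hypothetical sketch of the core N-back logic (function names and parameters are ours, not taken from [25]). The point is that a real-time score display adds visual events on top of this stimulus stream, competing for the very attentional resources the task is meant to measure.

```python
import random

def generate_nback_trials(n=2, length=20, n_targets=6, alphabet="BCDFGHJK", seed=0):
    """Generate a letter stream for an N-back task.

    A trial is a target when its letter matches the letter shown
    n positions earlier. Targets are planted explicitly so the
    sequence contains at least a known number of them.
    """
    rng = random.Random(seed)
    letters = [rng.choice(alphabet) for _ in range(length)]
    target_positions = rng.sample(range(n, length), n_targets)
    for pos in sorted(target_positions):
        letters[pos] = letters[pos - n]  # force an n-back match
    is_target = [i >= n and letters[i] == letters[i - n] for i in range(length)]
    return letters, is_target

def score_responses(is_target, responses):
    """Count hits and false alarms given per-trial yes/no responses."""
    hits = sum(1 for t, r in zip(is_target, responses) if t and r)
    false_alarms = sum(1 for t, r in zip(is_target, responses) if not t and r)
    return hits, false_alarms
```

The sketch plants a known minimum number of targets so that hit and false-alarm rates, the usual data-quality measures, remain well defined.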

Objectives

Despite the mixed findings on using game elements in cognitive tasks, we assume that gamification can positively influence data quality and user engagement. Therefore, we propose a framework to guide the process of incorporating game elements into cognitive tasks by synthesizing (1) existing gamification design frameworks, (2) gamification efforts in cognitive assessment and training, and (3) field experiences.


Overview

We approached the research problem through the design science research (DSR) methodology [57]. Design science is an accepted research methodology in information systems. It emphasizes that research should be firmly grounded in existing knowledge and targeted at the context in which the developed artifact must be used, so as to create scientifically sound artifacts (eg, theories, models, and methods) [58]. The 2 main steps for conducting DSR are (1) developing artifacts and (2) evaluating the developed artifacts [53]. In our DSR approach, similar to Morschheuser et al [53], we employed the assembly-based situational method engineering methodology proposed by Brinkkemper [59]. Method engineering is an approach in information systems for developing new methods from recognized fragments of existing methods in order to produce a situational method tuned to the project at hand. According to method engineering, 3 phases are needed to develop a new method [59]: (1) building a method base, which includes all the resources required for developing a new situational method; (2) constructing the situational method by assembling method fragments from the method base; and (3) evaluating the developed method. Figure 3 provides an overview of our assembly-based situational method engineering.

Figure 3. Situational method engineering approach followed for developing the framework (adapted from [53,59]).
View this figure

Knowledge Base

Defining Sources of Evidence

We selected the following 3 resources to gather the knowledge required for proposing the intended framework: (1) studies that proposed a gamification design framework, since by synthesizing them, we could extract the general framework for the design of gamification; (2) projects that used gamification in cognitive assessment or training to extract key factors and considerations for gamifying these contexts; (3) the experiences of relevant experts to integrate the evidence from live environments.

Exploring Relevant Papers

We used systematic literature review strategies in the Google search engine, Google Scholar, PubMed, and ResearchGate to retrieve a wide variety of relevant papers. The search started in May 2017 and lasted until the submission date. Two search strings were developed based on keywords, their synonyms, and related terms: (1) (“gamification” AND (“framework” OR “platform” OR “process” OR “method”)) and (2) (“gamification” OR “serious games” OR “game up” OR “video game”) AND ((“cognitive” AND (“training” OR “assessment”)) OR “Go/No-Go” OR “N-Back” OR “MMSE” OR “MoCA” OR “ADHD” OR “stop-signal task” OR “dyslexia”). Furthermore, complementary strategies such as checking the reference lists of included studies and cited-reference searching were applied.
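For illustration, the 2 search strings can be composed programmatically; the helper below is a hypothetical sketch (our own, not part of the study's tooling) that joins quoted terms with boolean operators, which is handy when adapting strings to databases with different syntax. The second query is a simplified, flattened version of the original nested string.

```python
def boolean_query(*groups, joiner=" AND "):
    """Join term groups with AND; each group ORs its quoted terms."""
    def clause(terms):
        if isinstance(terms, str):
            terms = [terms]
        quoted = " OR ".join(f'"{t}"' for t in terms)
        return f"({quoted})"
    return joiner.join(clause(g) for g in groups)

# Search string 1: gamification design frameworks
query1 = boolean_query(
    "gamification",
    ["framework", "platform", "process", "method"],
)

# Search string 2: gamification in cognitive assessment/training
# (flattened here; the original nests "cognitive" AND ("training" OR "assessment"))
query2 = boolean_query(
    ["gamification", "serious games", "game up", "video game"],
    ["cognitive training", "cognitive assessment", "Go/No-Go", "N-Back",
     "MMSE", "MoCA", "ADHD", "stop-signal task", "dyslexia"],
)
```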

Inclusion Criteria for Selecting Gamification Frameworks

To select the most highly regarded gamification frameworks, the following 4 criteria were used: (1) the framework covered as many steps of the gamification design process as possible rather than focusing on only parts of it; (2) the framework was judged influential based on its citation count; (3) the framework was developed by gamification experts (we considered an individual an expert if she or he had published at least 10 scientific articles on gamification); and (4) the framework was developed using a robust methodology.

Inclusion Criteria for Selecting Gamification Projects in Cognitive Assessment and Training

First, we included projects that published reports about the impacts of game elements on data quality and user engagement in the form of a scientific paper, to obtain sufficient author-presented analysis of how game elements should be incorporated into cognitive tasks. Second, similar to Lumsden et al [9], we did not select projects based on whether they included the word “gamification”; instead, we selected projects whenever our search strategies found them. We intentionally did not restrict ourselves to gamification as defined by Deterding et al [28] (“the use of game design elements in non-gaming contexts”), since precisely defining the elements that make a game is challenging and limiting [9]. Therefore, we considered a cognitive task gamified if its purpose was to increase participants’ commitment and motivation, even when it used other game-inspired designs such as serious games, video games, games with a purpose, and game-like interventions. We erred on the side of caution to minimize the potential loss of relevant sources.

Expert Evaluation

From the early stages, the framework was screened and judged by a group of 17 experts from relevant disciplines, including information technology, game design, gamification, and cognitive psychology. The average years of experience are presented in Table 1.

Table 1. Experts’ backgrounds and average years of experience.

Background                    | Average experience (years)
Information technology (IT)   | 20
Game design                   | 4.5
Gamification                  | 3
Cognitive psychology          | 11 (academic) and 6 (clinical)

After extracting each piece of evidence and applying it to the under-development framework, the whole framework was visualized for expert evaluation. It was then refined based on the feedback collected.

Method Base

Extracting the General Gamification Framework

After comparing the selected frameworks in terms of main characteristics, merits, and demerits, 2 general gamification design frameworks were elaborated. The first framework was based on analyses presented in Multimedia Appendix 1. The second framework was based on the Mechanics, Dynamics, Aesthetics (MDA) process, a formal framework for designing and analyzing games [60], and 2 adapted versions of MDA presented in [44,52], now known as Objects, Mechanics, Dynamics, Emotions (OMDE). The phases and activities of these frameworks will be described in detail in the subsequent sections while describing the proposed framework.

Customizing the General Gamification Frameworks for Cognitive Assessment and Training

A comprehensive set of 63 empirical project reports and theoretical works that applied gamification to cognitive tasks was gathered to customize the general frameworks for cognitive assessment and training contexts. We tried to extract an abstract process from each paper by examining its different sections. As shown in Figure 3, most articles did not use a specific or formal framework for gamifying cognitive tasks; therefore, either we could not extract a process, or the extracted process consisted of only one chunk. Finally, the isolated processes, fragments, and general gamification frameworks were converged, based on their commonalities and unique features, to assemble the intended framework.


Search Results

The Explored Gamification Design Frameworks

We identified a total of 35 gamification design frameworks (these frameworks are listed in Multimedia Appendix 1). Of these, 11 frameworks were selected for more in-depth analysis (these frameworks are highlighted in bold in Multimedia Appendix 1).

The Explored Gamification Projects in Cognitive Assessment and Training

A total of 63 gamification projects in cognitive assessment or training were selected. Of these, 41 (41/63, 65%) were empirical project reports; 2 (2/63, 3%) used both empirical and theoretical methods, and 20 (20/63, 32%) were theoretical. As for the purpose of the papers, 22 (22/63, 35%) were for assessment, 32 (32/63, 51%) for training, and 9 (9/63, 14%) for both assessment and training. In addition, 19 (19/63, 30%) were for children aged 3-14 years; 16 (16/63, 25%) for adolescents, youth, and adults aged 15-55 years; and 13 (13/63, 21%) for older adults aged 56-94 years. For more details and raw data about the included works, such as their targeted cognitive functions or impairments, please see both tables in Multimedia Appendix 2.

The Proposed Framework

Overview

To address the need for a framework that more effectively integrates game elements into cognitive assessment and training, we introduce a framework consisting of 7 phases: (1) preparation; (2) knowing users; (3) exploring existing tools for assessing or training the targeted cognitive context and determining the suitability of game-up and mapping techniques; (4) ideation; (5) prototyping using OMDE; (6) development; and (7) disseminating and monitoring. These phases are grouped into 3 overarching categories: predevelopment, development, and postdevelopment (Figure 4). Although the framework phases are presented sequentially, they need not be conducted linearly, since different ideas and directions may be explored while integrating gamification into targeted cognitive tasks. Therefore, projects are encouraged to loop back through the phases continuously [61,62].

Figure 4. The proposed framework. OMDE: Objects, Mechanics, Dynamics, Emotions.
View this figure

The framework aims to facilitate the creation of more effective gamified cognitive tasks by using an interdisciplinary team of gamification designers, cognitive experts, and target users. Only a truly multidisciplinary team has the knowledge and expertise to address the complex factors involved in designing gamification into cognitive tasks [61,63,64]. Gamification designers alone are typically not familiar enough with the process and execution of targeted cognitive tasks and, as a result, may incorporate game elements inappropriately [19,23-25,33,65]. Therefore, gamification designers and cognitive experts should be involved throughout the design process, and target users should be involved throughout phases 2, 4, 5, and 6; involving target users in these phases places their needs and motivations at the center of attention. The first 3 framework phases are primarily about gathering information in order to develop a better-accepted and more scientifically grounded gamified task in later stages. Phases 4, 5, and 6 (ideation, prototyping using OMDE, and development, respectively) pursue 2 main objectives: (1) generating gamification design ideas around targeted cognitive tasks (phases 4 and 5) and (2) developing actual gamified cognitive tasks through frequent consultation with gamification and cognitive experts and regular testing with target users (phase 6). Finally, once the efficacy of a gamified cognitive task has been demonstrated in phase 6, the task is disseminated to its target audience and then monitored periodically to maintain its effectiveness over the long term (phase 7).

Phase 1: Preparation

The primary purpose of the preparation stage is to have a good understanding of a gamification project’s objectives [53,66]. Defining objectives has been recommended in most reviewed gamification frameworks (9 of 11) and will support a later stage to figure out if the desired goals have been achieved [66]. The interdisciplinary team should list all potential objectives and then rank and justify the list in terms of importance since trade-offs of less important goals for more important ones might be needed [49,53]. Finally, as the team goes through gamification design and development, it can go back to the list to focus on what is really important [49]. Therefore, the defined objectives should be achievable, specific, relevant, measurable, and time-bound [66]. For instance, an initially broad goal of “increasing participants’ motivation to complete their cognitive training exercises” may be refined to “conduct 10 minutes of training each day.”

According to frameworks such as [50,53,66], gamification’s suitability as an intervention should be examined before starting any gamification design process. This can be done by identifying the problem that gamification is expected to solve through gathering and analyzing quantitative and qualitative information. Once the problem is identified, its root cause must be motivational; otherwise, gamification is not suitable [66]. The root cause can be identified with the “Five Whys” technique, which uncovers the root cause of a problem by repeatedly asking the question “Why?” [67].
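As a toy illustration of the “Five Whys” loop (our own sketch, not from [67]), the function below chains each answer into the next “Why?” question until no deeper cause is offered or the question budget runs out; the canned answers are hypothetical.

```python
def five_whys(problem, answer_why, max_depth=5):
    """Repeatedly ask why; return the chain of causes.

    `answer_why` is a callback mapping a statement to its cause,
    standing in for the team's discussion at each step.
    """
    chain = [problem]
    for _ in range(max_depth):
        cause = answer_why(chain[-1])
        if cause is None:  # no deeper cause found
            break
        chain.append(cause)
    return chain

# Toy walk-through with canned, hypothetical answers
causes = {
    "participants abandon the training program": "sessions feel repetitive",
    "sessions feel repetitive": "tasks give no sense of progress",
    "tasks give no sense of progress": "no feedback on improvement",  # motivational root
}
chain = five_whys("participants abandon the training program", causes.get)
```

Here the chain terminates at a motivational root cause ("no feedback on improvement"), the condition under which gamification is a suitable intervention.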

It is also essential to identify the standard project requirements and constraints such as scope (time, personnel, budget) and legal and ethical constraints since they can affect a gamification project’s success [50,53].

Phase 2: Knowing Users

During this stage, the interdisciplinary team must select one or more of a variety of methods to collect information about target users’ motivations and needs (eg, interviews, observation of target users’ behaviors, surveys, focus groups, questionnaires) [50,53]. After collecting and analyzing users’ data, users with similar characteristics should be segmented into groups to create user personas. The segmentation helps the team choose a more acceptable design to gamify targeted cognitive tasks.

Typically, gamification enriches information systems through motivational affordances [53]. Therefore, it is essential to conduct this phase (9 of the 11 frameworks include a step for understanding users). People are motivated by different motivational affordances depending on characteristics such as their age, gender, and culture. The Octalysis gamification framework is widely used to segment users based on their motivations [68]. Octalysis was developed by Chou [68] as an octagon with one of 8 core drives of individuals on each side: (1) epic meaning and calling, (2) development and accomplishment, (3) empowerment of creativity and feedback, (4) ownership and possession, (5) social influence and relatedness, (6) scarcity and impatience, (7) unpredictability and curiosity, and (8) loss and avoidance. The game strategies or elements associated with each drive are grouped next to it.
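As a rough illustration of motivation-based segmentation, the sketch below ranks a persona's survey scores over the 8 core drives and returns the dominant ones; the drive-to-element pairings are our own illustrative choices, not a canonical part of Octalysis [68].

```python
# The 8 Octalysis core drives, each paired with example game elements
# (the element choices here are illustrative, not prescribed by Chou [68]).
OCTALYSIS_DRIVES = {
    "epic meaning and calling": ["narrative", "elitism"],
    "development and accomplishment": ["points", "badges", "progress bar"],
    "empowerment of creativity and feedback": ["boosters", "milestone unlocks"],
    "ownership and possession": ["avatars", "virtual goods"],
    "social influence and relatedness": ["leaderboard", "gifting"],
    "scarcity and impatience": ["appointment dynamics", "limited rewards"],
    "unpredictability and curiosity": ["random rewards", "easter eggs"],
    "loss and avoidance": ["streaks", "countdown timers"],
}

def dominant_drives(scores, top_n=2):
    """Return the top_n drives (with example elements) for one persona.

    `scores` maps drive name -> survey score, eg on a 1-10 scale.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)[:top_n]
    return {drive: OCTALYSIS_DRIVES[drive] for drive in ranked}

# Hypothetical persona: strong achievement and social motivation
persona = {drive: 5 for drive in OCTALYSIS_DRIVES}
persona["development and accomplishment"] = 9
persona["social influence and relatedness"] = 8
suggestions = dominant_drives(persona)
```

Segmenting personas this way lets the team shortlist candidate game elements per user group before ideation.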

In cognitive contexts, in addition to considering the users’ motivations, their needs should be identified since they may suffer from mild to severe cognitive dysfunction, which may sometimes be accompanied by physical disabilities. Afrasiabi Navan and Khaleghi [7] developed the game “Smile 1” to help Iranian autistic children recognize emotional states such as happiness, sadness, anger, and fear in the cartoon faces of girls and boys that appear in the game. The girls have a scarf (Figure 5) since these children only identify women and girls who wear a scarf (in Iranian culture, girls and women wear a scarf). To meet users’ needs, most gamified assessment and training tasks have tried to provide an experience that allows natural and straightforward interaction, often using touch-based technologies such as smartphones and tablets (22 of 63). For more information about designing a user-friendly interface for people facing cognitive and physical disabilities, please see [2,69-73].

Figure 5. Screenshot of the game “Smile 1,” which was developed to help autistic children recognize different emotions [7].
View this figure
Phase 3: Exploring Existing Tools for Assessing or Training the Targeted Cognitive Context and Determining the Suitability of Game-Up and Mapping Techniques

At this stage, the interdisciplinary team should thoroughly acquaint itself with existing tools for assessing or training the targeted cognitive functions or impairments, through methods such as consulting with cognitive experts and gathering quantitative and qualitative information. This work helps the team incorporate game elements into these tools without changing their process and execution and identify integration points for adding game elements [74]. Describing the tools at a granular level is required; otherwise, there is no guarantee that the subsequent framework steps will lead to the desired outcomes [53,74]. Three main tools can be explored for cognitive assessment and training purposes: standard computerized or noncomputerized cognitive tasks, existing cognitive games, and existing video games.

For standard computerized or noncomputerized cognitive tasks, there may be more than one task designed to assess or train a given cognitive function (eg, the continuous performance test, Go/No-Go test, and stop-signal task for assessing attention, inhibitory control, and motor skills). Selecting appropriate cognitive tasks is very important, as some tasks perform better than others [24,75]. Valladares-Rodriguez et al [75] chose the California Verbal Learning Test II to assess episodic memory; compared with other tasks such as the Children’s Memory Scale, Rey Auditory Verbal Learning Test, and Wechsler Memory Scale, it captures a larger number of variables and produces more qualitative information. Computer versions have now been made for many standard cognitive tasks; these are cheaper, more repeatable, and easier to administer and distribute [76]. Many of these tasks can be found in [77-79].

Regarding existing cognitive games, many games have been developed based on standard cognitive tasks [80]. Finding these games is beneficial, since they can be reused for current cognitive assessment and training purposes, or reviewing how they integrate game elements into a cognitive task can provide initial inspiration for gamifying current tasks [25,33,43,75]. Brain games can be found on platforms such as Cogmed [13], Nintendo Brain Age [14], Lumosity [15], and Posit Science BrainHQ [16].

Regarding existing video games, it has also been demonstrated that classical video games of different genres, not inherently designed to assess or train cognitive functions, can be reused as standard cognitive tasks. Video game challenges come in various forms, and players have to use their underlying neural systems and cognitive abilities to win [81-90]. To reuse a game for assessment or training, each cognitive function is typically characterized by a set of parameters estimated from gameplay; in other words, the team must identify which cognitive skills are central to each game [91]. Winning games like Tetris and Candy Crush requires mental rotation and spatial reasoning skills [91]. Card games like Solitaire and FreeCell correlate reasonably with classical measurements of executive functions and planning abilities [92,93]. The team can explore existing games on platforms such as the App Store and Google Play; these platforms rank games by download counts and player reviews, which helps in selecting games that match target users’ preferences. According to Green and Bavelier [83], Doherty et al [91], and expert experience, it is no longer effective to find appropriate games through earlier categorization methods such as genre-based classification: games that once never overlapped in terms of content and mechanics now have many points of overlap [83].

After collecting the tools, the interdisciplinary team should determine whether game-up and mapping techniques can be used instead of designing a new gamified cognitive task from scratch. Game-up refers to adding game elements such as colors, animations, sound effects, and a backstory into standard cognitive tasks without changing their fundamental properties such as stimuli, design, and procedure [5,26,94] (21 of 63 studies used the game-up technique). The developed gamified cognitive tasks based on game-up are often presented in the form of a battery of mini-games. Each mini-game focuses on a specific cognitive function (eg, [64,76,95]). For example, Zeng et al [76] gamified a computerized cognitive test battery to detect impairments in 5 cognitive functions involved in developing a major neurocognitive disorder. For each test, some mini-games were designed in the context of ADLs such as cooking, cleaning, and shopping. The main feature of game-up is simplicity in terms of its mechanics and design [7,65,96,97]. This feature is useful for individuals who suffer from cognitive impairments like children with learning disabilities and ADHD who have weak working memory capacity [65,96]. Therefore, gamified cognitive tasks for these children should be broken into short and discrete tasks [65,96]. Mapping refers to reusing an existing game (cognitive and classical games) as a cognitive task and can save considerable time and effort that have been applied in the design of explored games [98,99]. The mapping technique was used by 21 of the 63 studies (eg, [72,92,93,100-104]). Explored games should be adjusted appropriately since they usually do not provide cognitive psychologists with sufficient quantitative data about the participant’s performance and progression on one hand. On the other hand, the used game elements and storylines may not align with participants' preferences [103,105,106]. Moreover, they may impose an additional cognitive load. 
Therefore, the exact cognitive demands of selected games should be identified by analyzing their structural characteristics [83]. Each game's structural characteristics should be examined individually because different games, even those in one category such as action or first-person shooter, may make greatly different cognitive demands [81,85,107,108]. In mapping, it is also possible to mash up various games for cognitive assessment and training purposes [81]. For example, dyslexia is associated with a variety of underlying deficits in phonological, auditory, motor, memory, and visual attentional processes. Based on previous findings showing that the core deficit in dyslexia is related to attentional problems, Franceschini et al [81] explored 10 action games to train dyslexic children, as action games can enhance a wide variety of visual attentional abilities, such as segmenting items both in time and across space.

Phase 4: Ideation

The interdisciplinary team is involved in a highly iterative design process through the next 3 phases (ideation, prototyping using OMDE, and development). Iterative processes enable the team to obtain more creative and effective gamification designs. Of the 11 frameworks, 5 have one or more steps that should be iterated until the desired designs and outcomes are reached (ie, [46,48-50,53]). The steps that are often performed iteratively are ideation, prototyping, and development [53].

At this stage, the team combines the analyses and materials obtained in new ways to produce apt and innovative ideas to engage target users. It is necessary to involve a cross-functional group of cognitive experts, gamification designers, and target users to start this stage [61,62]. This helps the interdisciplinary team collect a greater number of more varied and creative ideas [61,62]. Participants should be encouraged to use different ways to be creative [61,62]; brainstorming, co-creation workshops, and mind mapping are some methods [53,61,62]. The important question at this stage is how to help participants generate ideas. One solution is to explore existing games, gamification designs, and examples that may be a perfect fit for the current project [62]. For instance, in the game “Whack a Mole,” moles hide quickly, and the player trying to hit them with a hammer has to be faster. Various types of moles exist in different game versions, such as ninja, pirate, samurai, and batman moles [109]. Based on studies reusing Whack a Mole to measure attention, inhibitory control, and executive functions [72,102,103], one idea is to use ninja and samurai moles as metaphors for Go and No-Go stimuli, respectively. As a result, a gamified cognitive task that mimics the Go/No-Go design can be created in the Whack-a-Mole style. Exploring many games and gamification examples and then mashing them up to fit the current problem is another effective approach for generating ideas to gamify the current task [62].
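As a rough illustration of this idea (not any cited study's actual implementation), the Go/No-Go logic behind the Whack-a-Mole mapping could be sketched as below; the response callback is an assumption standing in for timed player input.

```python
import random

# Illustrative sketch only: ninja moles as Go stimuli, samurai moles as
# No-Go stimuli. A real task would read timed player input instead of a
# callback and would record reaction times as well.

GO, NO_GO = "ninja", "samurai"

def run_trials(n_trials, respond, go_probability=0.75, seed=0):
    """Present a sequence of mole stimuli and score each response."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        mole = GO if rng.random() < go_probability else NO_GO
        whacked = respond(mole)  # True = player hit the mole
        # Correct: whack a ninja (Go) or withhold on a samurai (No-Go).
        correct = (mole == GO) == whacked
        results.append({"stimulus": mole, "response": whacked, "correct": correct})
    return results

# Simulate a perfectly accurate player:
log = run_trials(20, respond=lambda mole: mole == GO)
accuracy = sum(trial["correct"] for trial in log) / len(log)
print(f"accuracy: {accuracy:.2f}")
```

Errors of commission (whacking a samurai) would correspond to No-Go failures, the standard index of inhibitory control in this paradigm.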

After preparing ideas, similar ideas should be clustered using affinity diagrams; then, the clusters should be prioritized using methods such as dot voting. This work helps the team to focus on important ideas in the next 2 phases [50,53].
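The prioritization half of this step reduces to a simple tally; the sketch below illustrates it with invented cluster names and votes (3 dots per team member, an assumed allocation).

```python
from collections import Counter

# Minimal sketch of dot voting: after similar ideas are grouped with an
# affinity diagram, each member places a fixed number of "dots" on
# clusters, which are then ranked by dot count. All data are invented.

votes = [
    ["whack-a-mole", "leaderboard", "whack-a-mole"],  # member 1's 3 dots
    ["backstory", "whack-a-mole", "leaderboard"],     # member 2
    ["whack-a-mole", "backstory", "backstory"],       # member 3
]
tally = Counter(dot for member in votes for dot in member)
for cluster, dots in tally.most_common():
    print(cluster, dots)
```

The top-ranked clusters are the ones carried forward into prototyping and development.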

Phase 5: Prototyping Using OMDE

After collecting the right ideas, the interdisciplinary team needs to start prototyping. Prototyping is the stage in which the team implements the ideas into tangible forms to see how they actually work. In this stage, low-fidelity (ie, “quick and dirty”) prototypes are developed rapidly to gather feedback from relevant experts and target users early and often [61]. Prototyping saves time and resources by helping the team to identify refinements required before solidifying a design [61,62].

During each iteration of prototyping, the team can use the OMDE design guideline to (1) check the motivational characteristics of prototypes (such as fun, flow, engagement, and positive emotions) and (2) validate prototypes from cognitive psychology aspects. OMDE divides the components of a gamified cognitive task into 4 categories: objects, mechanics, dynamics, and emotions. Objects are a gamified cognitive task's assets, such as images, audio, video, and animations [60]. Mechanics refer to a gamified task's components at the level of game rules, algorithms, and data representation [60]. Dynamics are run-time user behaviors that emerge as users partake in the gamified task, such as competition and cooperation [60]. Emotions refer to whatever users feel while interacting with the gamified task, such as fear, happiness, anger, sadness, and pride [44]. Dynamics and emotions emerge from the selected objects and mechanics [44,52,60]. For instance, a leaderboard mechanic leads to dynamics such as competition and comparison and emotions such as fear and happiness: some participants may fear being judged by others, and the leaderboard may demotivate them from continuing the gamified task, whereas many others may enjoy these dynamics and be motivated by it. Therefore, displaying participants' statuses on the leaderboard must be an optional feature in a gamified task.
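The opt-in leaderboard recommendation above can be made concrete with a small sketch; the data model here is an assumption for illustration, not a prescribed design.

```python
from dataclasses import dataclass

# Sketch of the "optional leaderboard" recommendation: only participants
# who explicitly opt in appear in the ranking shown to others.

@dataclass
class Participant:
    name: str
    score: int
    share_on_leaderboard: bool = False  # opt-in, off by default

def visible_leaderboard(participants):
    """Rank only consenting participants, highest score first."""
    shared = [p for p in participants if p.share_on_leaderboard]
    return sorted(shared, key=lambda p: p.score, reverse=True)

people = [
    Participant("A", 120, share_on_leaderboard=True),
    Participant("B", 150),  # highest score, but prefers not to be ranked
    Participant("C", 90, share_on_leaderboard=True),
]
visible = [p.name for p in visible_leaderboard(people)]
print(visible)
```

Making sharing off by default means the competition dynamic is only evoked for participants who welcome it.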

Good dynamics and emotions are vital to ensuring a strong user commitment to participation [44,52,60]. To check gamified tasks’ motivational features with OMDE, the interdisciplinary team must first define the desired dynamics and emotional responses that the designed task should evoke among users. Then, in each iteration, the team must list what dynamics and emotions emerge from the gamified task in practice and then compare the responses with the desired ones to determine if the desired responses have been reached. The team cannot accurately predict what dynamics and emotions will emerge from a gamified task. Therefore, it is necessary to use OMDE iteratively [44,52].

In gamified cognitive tasks, it is also essential to validate the components of OMDE from cognitive aspects because they may impose an additional cognitive load. Objects and mechanics can make it difficult for participants to categorize cognitive tasks’ stimuli or can evoke emotions such as anxiety and stress that may distract participants’ attention from completing gamified tasks [19,23-25,33,65]. In these circumstances, participant errors increase, and the reliability of the data obtained decreases. For instance, in the study by Birk et al [23], the gamified Go/No-Go task decreased users’ performance. In the standard task, a sequence of stimuli is presented for 500 ms, and participants should respond to circles but not to squares. In the gamified version, participants should shoot blond zombies (Go stimulus) but not yellow hat moles (No-Go stimulus). In the standard task, a circle is very different from a square; in the gamified task, by contrast, the colors of the yellow hat moles and blond zombies are close to each other, which can cause difficulty while participants try to identify the gamified task’s stimuli. To gamify the Go/No-Go task, Lumsden et al [19] suggested using red and green colors instead of cartoon characters because participants are more familiar with colors. The components of OMDE can be validated through discussions with cognitive experts, answering questions such as: (1) Is it possible to gamify the cognitive task’s stimuli? If yes, how? (2) Does the team have the freedom to choose objects for gamifying cognitive tasks, or should they be selected from a specified set or from objects that participants are more familiar with, such as everyday objects? (3) What degree of structural similarity (such as shape, size, and color) between objects and mechanics should be adjusted? (4) Is it possible to gamify the surrounding environment of the selected cognitive task’s stimuli? If yes, how should the degree of separation between cognitive and gamified sections be adjusted? (5) Does the designed gamified cognitive task lead to negative emotions like anxiety and stress?
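One way to make the structural-similarity check in question (3) quantitative is to measure the color distance between Go and No-Go stimuli. The sketch below is a hedged illustration: the RGB values are guesses chosen for the example, not measurements from any real task.

```python
import math

# Hedged sketch of a discriminability check: quantify how visually
# similar the dominant colors of Go and No-Go stimuli are. RGB values
# below are illustrative assumptions only.

def rgb_distance(c1, c2):
    """Euclidean distance in RGB space (0 = identical, ~441 = maximal)."""
    return math.dist(c1, c2)

blond = (250, 240, 190)      # assumed dominant color of a "blond zombie"
yellow_hat = (255, 220, 80)  # assumed dominant color of a "yellow hat mole"
red, green = (200, 30, 30), (30, 180, 60)

near = rgb_distance(blond, yellow_hat)
far = rgb_distance(red, green)
print(f"blond vs yellow hat: {near:.0f}, red vs green: {far:.0f}")
```

A small distance between the two stimulus classes, as in the blond/yellow pairing, flags a prototype for revision before it reaches participants; a perceptually uniform color space would be a natural refinement of this check.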

Phase 6: Development

During this stage, actual gamified cognitive tasks are developed through frequent consultation with relevant experts (gamification and cognitive experts) and regular testing with target users (Figure 6). Based on the examined gamification efforts in cognitive tasks, rigorous evaluations of user engagement and data quality are required to test the efficacy of gamified tasks (eg, [9,19-21,23-27,34,38-40]).

Figure 6. Overall structure of the development phase.

Two methods are widely used to evaluate how gamified cognitive tasks influence users’ engagement and motivation [24,27,38,39]: (1) subjective measures of engagement, in which the motivation level of a gamified task is measured through self-report questionnaires based on SDT [31] and flow theory [32] (eg, the Flow State Scale [110]), and (2) objective measures of engagement, such as the number of times participants used a gamified task or the number of optional cognitive assessment or training sessions participants performed, which the interdisciplinary team may prefer [24,38,39,63,111]. A combination of both methods is often used to measure gamified tasks’ motivation levels [24].
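The second, objective measure reduces to counting usage events. The sketch below shows one way to tally optional sessions per participant; the log format is a hypothetical example.

```python
from collections import defaultdict

# Sketch of an objective engagement measure: counting the optional
# (noncompulsory) sessions each participant completed. Log format is
# an illustrative assumption.

usage_log = [
    {"participant": "p1", "session": 1, "optional": False},
    {"participant": "p1", "session": 2, "optional": True},
    {"participant": "p1", "session": 3, "optional": True},
    {"participant": "p2", "session": 1, "optional": False},
]

optional_sessions = defaultdict(int)
for entry in usage_log:
    if entry["optional"]:
        optional_sessions[entry["participant"]] += 1
print(dict(optional_sessions))
```

Participants who complete only the mandatory sessions, such as "p2" here, contribute a count of zero, which is itself an engagement signal worth reporting.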

To assess how gamified cognitive tasks impact the quality of data, and to indicate their maturity for use as valid clinical tools, they must be evaluated for 2 essential properties [43]: (1) reliability, the extent to which a task's results are consistent and repeatable, which comes in 4 types (test-retest reliability, parallel-forms reliability, internal consistency reliability, and interrater reliability), and (2) validity, how well a task measures what it claims to, which includes criterion validity, content validity, construct validity, face validity, external validity, and ecological validity.
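Two of the reliability types named above have standard computations, illustrated here on invented toy scores: test-retest reliability as the Pearson correlation between two administrations, and internal consistency as Cronbach's alpha over item scores.

```python
from statistics import variance

# Worked toy example (invented scores) of two reliability indices.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(items):
    """items: one list of scores per item, participants in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

test1 = [12, 15, 9, 20, 17, 11]   # scores at first administration
test2 = [13, 14, 10, 19, 18, 12]  # same participants, retest
items = [[3, 4, 2, 5], [2, 4, 2, 4], [3, 5, 1, 5]]  # 3 items x 4 people

print(f"test-retest r = {pearson(test1, test2):.2f}")
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```

Conventionally, values around 0.7 or higher are taken as acceptable for both indices, though the threshold depends on the clinical use case.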

In cognitive training, it may also be necessary to measure to what extent gamified tasks produce transfer effects [38,41,83]. The term “transfer” is frequently used in clinical practice and refers to the extent to which a cognitive training task can improve untrained cognitive abilities; new tasks and situations are introduced to measure it. Transfer effects are divided into near and far transfer effects: cognitive training has near transfer effects if it improves cognitive skills that are highly similar to the trained skills, whereas far transfer effects refer to improvements in cognitive skills that are less similar to the trained skills.

There are 2 other essential factors for conducting rigorous evaluations: (1) selecting sufficient sample sizes and (2) selecting an appropriate duration for evaluation. Most gamification efforts in cognitive contexts have used small sample sizes to evaluate the efficacy of gamified cognitive tasks [9]. Also, little consideration has been given to statistical analyses such as power analysis for sample size calculation [9,112]; for more information about how to calculate sufficient sample sizes, see [113,114]. Insufficient sample sizes limit the reliability and generalizability of the results [9,115,116]. Moreover, only a few studies have used randomized controlled trials (RCTs) to evaluate gamified cognitive tasks [30]. In clinical research, RCTs are considered the most robust study design for evaluating the effectiveness of a new tool because they minimize several forms of bias [61]. RCTs randomly assign participants to an experimental group and a control group, and an RCT design comparing gamified (experimental group) and nongamified (control group) versions of the same cognitive task has been highly recommended for evaluating the potential efficacy of gamified tasks [9,30]. Regarding the duration of evaluation, participants are typically not involved in the gamified task over the long term but instead participate for a short time. As a result, it remains unclear after how long participants become bored and how data quality changes in those circumstances [19,25]. Moreover, a short duration can cause errors due to participants' unfamiliarity with the gamified task; in this regard, short tutorials and warm-up sessions before the actual evaluation sessions have been recommended [65,76].
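An a priori power analysis for the recommended two-group RCT design can be sketched as below, using the standard normal approximation; the effect size is an assumed value for illustration, not a figure from any cited study.

```python
import math
from statistics import NormalDist

# Sketch of an a priori power calculation: per-group sample size for a
# two-sided, two-sample comparison (gamified vs nongamified versions of
# the same task), under the normal approximation.

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """n per group to detect a standardized mean difference (Cohen's d)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# An assumed medium effect (d = 0.5) at alpha = .05 and 80% power needs
# roughly 63 participants per group under this approximation; the exact
# t-based answer is slightly larger.
n = sample_size_per_group(0.5)
print(n)
```

The calculation makes the cost of small effects explicit: halving the expected effect size roughly quadruples the required sample.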

Phase 7: Disseminating and Monitoring

Once the gamified cognitive task's efficacy has been demonstrated in the previous step, the task is finally disseminated to its target audience. There have long been calls for disseminating and sharing well-evaluated digital health interventions because of the abundance of low-quality interventions currently available to the public [17,61]. Disseminating gamified tasks provides access to the broader population that may benefit the most from them and helps the industry invest in these interventions more quickly [61,73]. Dissemination can occur via the App Store or Google Play, and industry partnerships can support more effective and sustainable dissemination of gamified cognitive tasks [61,117]. It is also highly recommended that projects disseminate their findings, experiences, and methods for developing gamified cognitive tasks to scientific journals, conferences, researchers, and digital mental health intervention developers; doing so can advance future gamified tasks and improve their effectiveness [61,117]. Dissemination can also include openly sharing gamified cognitive task code via GitHub [118] or allowing free access through a mobile health platform such as Mobile Sensor Data-to-Knowledge (MD2K) [119].

For disseminating gamified tasks, 2 other important factors should be considered by the team. First, according to expert experiences and Doherty et al [91], appropriate guidelines and prescriptions should be prepared, in collaboration with cognitive experts, for the use of gamified cognitive tasks by clinics and target users (eg, determining the minimum effort and time that target users should spend to improve their cognitive skills). These instructions help mental health experts and target users use gamified tasks effectively. Second, ethical issues are another primary concern at this stage because performance variables are embedded in gamified tasks to track improvements in users’ cognitive functions. Therefore, target users' information should be gathered only with their explicit consent [23,47,50,53,73].

After dissemination, gamified tasks should also undergo postlaunch monitoring and evaluation, in which the usage and applicability of the gamified task are observed at regular intervals, using the methods introduced in the previous phases, to compile a list of possible improvements [47,53]. For instance, users may differ in their levels of cognitive impairment, and after a cognitive training program delivered through the gamified task, their cognitive status may improve; the gamified task may then need to be adjusted to users’ new levels [69,73]. Also, gamified tasks’ motivation levels should be continuously monitored to maintain intrinsic motivation over the long term, as the initial effects of game elements can diminish over time [47,53].


Overview

This paper proposes a 7-step framework to guide the design, development, and evaluation of gamified cognitive tasks intended to assess or train cognition. Within these steps is a series of key recommendations on how each step should be operationalized. Along with the framework, the article presents the OMDE guideline at phase 5 of the framework (prototyping using OMDE), which contains vital recommendations for advancing the understanding of design complexities when applying gamification to cognitive tasks.

The prototypes of the framework were designed and evaluated extensively using evidence from 3 sources: (1) existing gamification design frameworks, (2) project reports on applying game elements to cognitive assessment and training, and (3) expert experiences. To our knowledge, this is the first study of its kind to converge these sources into a unified model for designing gamification in cognitive tasks. Notably, gamification efforts in cognitive tasks do not use a specific design process to incorporate game elements. As an alternative, we examined each work and tried to extract clues or pieces of information from which to reconstruct an abstract process, where possible. This helped us identify critical factors and considerations for gamifying cognitive tasks.

Comparison of the Proposed Framework With General Gamification Frameworks

Like general gamification frameworks, the proposed framework follows a user-centered design to improve users’ participation, but it has many added features that make it appropriate for gamifying cognitive tasks. One main finding of this study that gamification designers need to be aware of is that an intermediate design is required for gamifying cognitive tasks, meaning that game elements cannot be selected without considering the characteristics of the targeted cognitive context; otherwise, irrelevant cognitive load may jeopardize data quality. Therefore, collaboration between gamification and mental health experts is required to examine the interplay of game elements and cognitive processes [9]. This intermediate design is a crucial feature that differentiates the gamification design framework required for cognitive assessment and training from those of other contexts. Based on the examined gamification efforts in cognitive tasks, we concluded that, besides designing a new gamified task from scratch, 2 techniques have been widely used to gamify cognitive tasks: (1) applying game-up to an existing cognitive task and (2) mapping an existing game (cognitive or classical) to a cognitive function or impairment in order to assess or train it. The details required to use the game-up and mapping techniques were provided in step 3 of the framework as far as possible.

Limitations

The main limitation was that only a few studies have discussed how game elements impact participants when they interact with a gamified task and how the elements should be utilized in cognitive tasks to positively influence data quality and user engagement. Gathering evidence from the mentioned sources provided only an initial evaluation of the proposed framework because the number of experts and relevant studies was limited; hence, a more robust evaluation is necessary. For this purpose, we have compiled a list of recognized experts in the area to evaluate and refine the framework in subsequent studies.

Possible Future Studies

By further developing the framework presented in this work and utilizing machine learning and deep learning algorithms, it would be possible to create a recommender system that suggests the most appropriate game elements according to the characteristics of the targeted cognitive context and users' preferences and requirements. In our work, only existing games that have been developed or examined in scientific papers were investigated; a large number of current games, from brain games to classical games, could also be analyzed to further develop the framework. Due to time constraints, we did not establish the feasibility of proposing different gamification design frameworks for cognitive functions that inherently share similar cognitive processes, such as processing speed (Gs) and working memory [20].

Conclusions

While more work is needed to further refine and evaluate the framework, we believe it has great potential to serve as a foundation for developing effective gamified cognitive tasks. Furthermore, the ideas presented in this paper can be further developed and researched by other researchers and practitioners.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Explored gamification design frameworks.

DOC File, 100 KB

Multimedia Appendix 2

Included papers that applied gamification in cognitive training/testing.

DOC File, 231 KB

References

  1. Global cognitive assessment and training market 2018-2022. Business Wire. 2018 Feb 08.   URL: https:/​/www.​businesswire.com/​news/​home/​20180208005680/​en/​Global-Cognitive-Assessment-Training-Market-2018-2022-- [accessed 2019-04-05]
  2. Gamberini L, Martino F, Seraglia B, Spagnolli A, Fabregat M, Ibanez F, et al. Eldergames project: An innovative mixed reality table-top solution to preserve cognitive functions in elderly people. 2009 Presented at: 2nd Conference on Human System Interactions; May 21-23, 2009; Catania, Italy. [CrossRef]
  3. Chan JY, Kwong JS, Wong A, Kwok TC, Tsoi KK. Comparison of Computerized and Paper-and-Pencil Memory Tests in Detection of Mild Cognitive Impairment and Dementia: A Systematic Review and Meta-analysis of Diagnostic Studies. J Am Med Dir Assoc 2018 Sep;19(9):748-756.e5. [CrossRef] [Medline]
  4. Potter G, Steffens D. Contribution of depression to cognitive impairment and dementia in older adults. Neurologist 2007 May;13(3):105-117. [CrossRef] [Medline]
  5. Boendermaker WJ, Prins PJ, Wiers RW. Cognitive Bias Modification for adolescents with substance use problems--Can serious games help? J Behav Ther Exp Psychiatry 2015 Dec;49(Pt A):13-20 [FREE Full text] [CrossRef] [Medline]
  6. Mayes R, Bagwell C, Erkulwater J. ADHD and the rise in stimulant use among children. Harv Rev Psychiatry 2008;16(3):151-166. [CrossRef] [Medline]
  7. Afrasiabi Navan A, Khaleghi A. Using Gamification to Improve the Quality of Educating Children with Autism. Rev. Cient 2020 Jan 01;37(1):90-106. [CrossRef]
  8. Vourvopoulos A, Faria A, Ponnam K, Bermudez BS. RehabCity: design and validation of a cognitive assessment and rehabilitation tool through gamified simulations of activities of daily living. 2014 Presented at: ACE '14: 11th Conference on Advances in Computer Entertainment Technology; November 2014; Funchal, Portugal. [CrossRef]
  9. Lumsden J, Edwards EA, Lawrence NS, Coyle D, Munafò MR. Gamification of Cognitive Assessment and Cognitive Training: A Systematic Review of Applications and Efficacy. JMIR Serious Games 2016 Jul 15;4(2):e11 [FREE Full text] [CrossRef] [Medline]
  10. Folstein MF, Folstein SE, McHugh PR. “Mini-mental state”. Journal of Psychiatric Research 1975 Nov;12(3):189-198. [CrossRef]
  11. Inouye SK, van Dyck CH, Alessi CA, Balkin S, Siegal AP, Horwitz RI. Clarifying confusion: the confusion assessment method. A new method for detection of delirium. Ann Intern Med 1990 Dec 15;113(12):941-948. [CrossRef] [Medline]
  12. Nasreddine Z, Phillips N, Bédirian V, Charbonneau S, Whitehead V, Collin I, et al. The Montreal Cognitive Assessment, MoCA: a brief screening tool for mild cognitive impairment. J Am Geriatr Soc 2005 Apr;53(4):695-699. [CrossRef] [Medline]
  13. CogMed.   URL: https://www.cogmed.com/ [accessed 2020-09-21]
  14. Nintendo Brain Age.   URL: https://www.nintendo.com/games/detail/brain-age-train-your-brain-in-minutes-a-day-wii-u/ [accessed 2021-04-17]
  15. Lumosity.   URL: https://www.lumosity.com [accessed 2020-09-21]
  16. BrainHQ. Posit Science.   URL: https://www.brainhq.com/ [accessed 2020-09-21]
  17. Gray SI, Robertson J, Manches A, Rajendran G. BrainQuest: The use of motivational design theories to create a cognitive training game supporting hot executive function. International Journal of Human-Computer Studies 2019 Jul;127:124-149. [CrossRef]
  18. Stroop JR. Studies of interference in serial verbal reactions. Journal of Experimental Psychology 1935;18(6):643-662. [CrossRef]
  19. Lumsden J, Skinner A, Woods A, Lawrence N, Munafò M. The effects of gamelike features and test location on cognitive test performance and participant enjoyment. PeerJ 2016;4:e2184 [FREE Full text] [CrossRef] [Medline]
  20. McPherson J, Burns NR. Assessing the validity of computer-game-like tests of processing speed and working memory. Behav Res Methods 2008 Nov;40(4):969-981. [CrossRef] [Medline]
  21. McPherson J, Burns NR. Gs invaders: assessing a computer game-like test of processing speed. Behav Res Methods 2007 Nov;39(4):876-883. [CrossRef] [Medline]
  22. Ninaus M, Pereira G, Stefitz R, Prada R, Paiva A, Neuper C, et al. Game elements improve performance in a working memory training task. IJSG 2015 Feb 10;2(1):1. [CrossRef]
  23. Birk MV, Mandryk R, Bowey J, Buttlar B. The effects of adding premise and backstory to psychological tasks. 2015 Presented at: CHI'15 Workshop Researching Gamification: Strategies, Opportunities, Challenges, Ethics; April 19, 2015; Seoul, Korea.
  24. Lumsden J, Skinner A, Coyle D, Lawrence N, Munafo M. Attrition from Web-Based Cognitive Testing: A Repeated Measures Comparison of Gamification Techniques. J Med Internet Res 2017 Nov 22;19(11):e395 [FREE Full text] [CrossRef] [Medline]
  25. Katz B, Jaeggi S, Buschkuehl M, Stegman A, Shah P. Differential effect of motivational features on training improvements in school-based cognitive training. Front Hum Neurosci 2014 Apr 24;8:242 [FREE Full text] [CrossRef] [Medline]
  26. Hawkins GE, Rae B, Nesbitt KV, Brown SD. Gamelike features might not improve data. Behav Res Methods 2013 Jun 9;45(2):301-318. [CrossRef] [Medline]
  27. Miranda AT, Palmer EM. Intrinsic motivation and attentional capture from gamelike features in a visual search task. Behav Res Methods 2014 Mar 9;46(1):159-172. [CrossRef] [Medline]
  28. Deterding S, Dixon D, Khaled R, Nacke L. From Game Design Elements to Gamefulness: Defining "Gamification". 2011 Presented at: MindTrek '11: 15th International Academic MindTrek Conference: Envisioning Future Media Environments; September 28-30, 2011; Tampere, Finland p. 9-15. [CrossRef]
  29. Ryan RM, Deci EL. Intrinsic and Extrinsic Motivations: Classic Definitions and New Directions. Contemp Educ Psychol 2000 Jan;25(1):54-67. [CrossRef] [Medline]
  30. Vermeir JF, White MJ, Johnson D, Crombez G, Van Ryckeghem DML. The Effects of Gamification on Computerized Cognitive Training: Systematic Review and Meta-Analysis. JMIR Serious Games 2020 Aug 10;8(3):e18644 [FREE Full text] [CrossRef] [Medline]
  31. Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol 2000 Jan;55(1):68-78. [CrossRef] [Medline]
  32. Csikszentmihalyi M. Flow: The Psychology of Optimal Experience. New York, NY: Harper Collins; 1990.
  33. Craven MP, Groom MJ. Computer Games for User Engagement in Attention Deficit Hyperactivity Disorder (ADHD) Monitoring and Therapy. 2015 Presented at: International Conference on Interactive Technologies and Games; October 22-23, 2015; Nottingham, UK p. 22-23. [CrossRef]
  34. Boendermaker WJ, Boffo M, Wiers RW. Exploring Elements of Fun to Motivate Youth to Do Cognitive Bias Modification. Games Health J 2015 Dec;4(6):434-443. [CrossRef] [Medline]
  35. Van de Weijer-Bergsma E, Kroesbergen EH, Prast EJ, Van Luit JEH. Validity and reliability of an online visual-spatial working memory task for self-reliant administration in school-aged children. Behav Res Methods 2015 Sep 26;47(3):708-719. [CrossRef] [Medline]
  36. Openclipart.   URL: https://openclipart.org/ [accessed 2020-09-25]
  37. 4vector.   URL: https://4vector.com/ [accessed 2020-09-25]
  38. Dörrenbächer S, Müller PM, Tröger J, Kray J. Dissociable effects of game elements on motivation and cognition in a task-switching training in middle childhood. Front Psychol 2014 Nov 13;5:1275 [FREE Full text] [CrossRef] [Medline]
  39. Prins PJ, Dovis S, Ponsioen A, ten Brink E, van der Oord S. Does computerized working memory training with game elements enhance motivation and training efficacy in children with ADHD? Cyberpsychol Behav Soc Netw 2011 Mar;14(3):115-122. [CrossRef] [Medline]
  40. Prins PJ, Brink ET, Dovis S, Ponsioen A, Geurts HM, de Vries M, et al. "Braingame Brian": Toward an Executive Function Training Program with Game Elements for Children with ADHD and Cognitive Control Problems. Games Health J 2013 Feb;2(1):44-49. [CrossRef] [Medline]
  41. Baniqued PL, Lee H, Voss MW, Basak C, Cosman JD, Desouza S, et al. Selling points: What cognitive abilities are tapped by casual video games? Acta Psychol (Amst) 2013 Jan;142(1):74-86 [FREE Full text] [CrossRef] [Medline]
  42. Brown HR, Zeidman P, Smittenaar P, Adams RA, McNab F, Rutledge RB, et al. Crowdsourcing for cognitive science--the utility of smartphones. PLoS One 2014 Jul 15;9(7):e100662 [FREE Full text] [CrossRef] [Medline]
  43. Valladares-Rodríguez S, Pérez-Rodríguez R, Anido-Rifón L, Fernández-Iglesias M. Trends on the application of serious games to neuropsychological evaluation: A scoping review. J Biomed Inform 2016 Dec;64:296-319 [FREE Full text] [CrossRef] [Medline]
  44. Robson K, Plangger K, Kietzmann JH, McCarthy I, Pitt L. Is it all a game? Understanding the principles of gamification. Business Horizons 2015 Jul;58(4):411-420. [CrossRef]
  45. Aparicio AF, Vela FLG, Sánchez JLG, Montes JLI. Analysis and application of gamification. 2012 Presented at: 13th International Conference on Interacción Persona-Ordenador; October 2012; Elche, Spain p. 03-05. [CrossRef]
  46. Brito J, Vieira V, Duran A. Towards a Framework for Gamification Design on Crowdsourcing Systems: The G.A.M.E. Approach. 2015 Presented at: 12th International Conference on Information Technology - New Generations; April 13-15, 2015; Las Vegas, NV. [CrossRef]
  47. Kumar J. Gamification at work: Designing engaging business software. In: International Conference of Design, User Experience, and Usability. Berlin, Germany: Springer Publishing Company; 2012:528-537.
  48. Marache-Francisco C, Brangier E. Process of Gamification From The Consideration of Gamification To Its Practical Implementation. 2013 Presented at: CENTRIC 2013: The Sixth International Conference on Advances in Human oriented and Personalized Mechanisms, Technologies, and Services; October 2013; Venice, Italy p. 527-537.
  49. Werbach K, Hunter D. For The Win: How Game Thinking Can Revolutionize Your Business. Pennsylvania, PA: Wharton Digital Press; 2012.
  50. Deterding S. The Lens of Intrinsic Skill Atoms: A Method for Gameful Design. Human–Computer Interaction 2015 May 15;30(3-4):294-335. [CrossRef]
  51. Li X. A Method to Support Gamification Design Practice with Motivation Analysis and Goal Modeling. 2018 Presented at: 2nd International GamiFIN Conference; 2018; Pori, Finland p. 151-158.
  52. Liu D, Santhanam R, Webster J. Toward Meaningful Engagement: A Framework for Design and Research of Gamified Information Systems. MISQ 2017 Apr 4;41(4):1011-1034. [CrossRef]
  53. Morschheuser B, Hassan L, Werder K, Hamari J. How to design gamification? A method for engineering gamified software. Information and Software Technology 2018 Mar;95:219-237. [CrossRef]
  54. Mora A, Riera D, González C, Arnedo-Moreno J. Gamification: a systematic review of design frameworks. J Comput High Educ 2017 May 23;29(3):516-548. [CrossRef]
  55. Azouz O, Lefdaoui Y. Gamification design frameworks: a systematic mapping study. 2018 Presented at: 6th International Conference on Multimedia Computing and Systems (ICMCS); May 10-12, 2018; Rabat, Morocco. [CrossRef]
  56. Hansen OG. What gamification design do users want in a self-management application for chronic diseases? - The case of Cystic Fibrosis. Norwegian University of Science and Technology. 2017.   URL: https://ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/2458481 [accessed 2021-04-17]
  57. Johannesson P, Perjons E. An Introduction to Design Science. New York, NY: Springer Publishing Company; 2014.
  58. Hevner AR, March ST, Park J, Ram S. Design Science in Information Systems Research. MIS Quarterly 2004;28(1):75-105. [CrossRef]
  59. Brinkkemper S. Method engineering: engineering of information systems development methods and tools. Information and Software Technology 1996 Jan;38(4):275-280. [CrossRef]
  60. Hunicke R, LeBlanc M, Zubek R. MDA: A formal approach to game design and game research. 2004 Presented at: AAAI Workshop on Challenges in Game AI; January 2004; San Jose, CA.
  61. Mummah SA, Robinson TN, King AC, Gardner CD, Sutton S. IDEAS (Integrate, Design, Assess, and Share): A Framework and Toolkit of Strategies for the Development of More Effective Digital Interventions to Change Health Behavior. J Med Internet Res 2016 Dec 16;18(12):e317 [FREE Full text] [CrossRef] [Medline]
  62. Herger M. Enterprise Gamification: Engaging people by letting them have fun. Leipzig: CreateSpace Independent Publishing Platform; 2014.
  63. Manera V, Petit P, Derreumaux A, Orvieto I, Romagnoli M, Lyttle G, et al. 'Kitchen and cooking,' a serious game for mild cognitive impairment and Alzheimer's disease: a pilot study. Front Aging Neurosci 2015 Mar 17;7:24 [FREE Full text] [CrossRef] [Medline]
  64. Delgado MT, Uribe PA, Alonso AA, Díaz RR. TENI: A comprehensive battery for cognitive assessment based on games and technology. Child Neuropsychol 2016 Nov 14;22(3):276-291. [CrossRef] [Medline]
  65. Shaban A, Pearson E. A Learning Design Framework to Support Children with Learning Disabilities Incorporating Gamification Techniques. 2019 Presented at: CHI EA '19: 2019 CHI Conference on Human Factors in Computing Systems; May 2019; Glasgow, Scotland. [CrossRef]
  66. Carreño AM. A framework for agile design of personalized gamification services. Universitat Oberta de Catalunya. 2018 Jun.   URL: https://www.tesisenred.net/bitstream/handle/10803/664190/Thesis_June.pdf [accessed 2021-04-17]
  67. Serrat O. The five whys technique. In: Knowledge Solutions. Singapore: Springer Publishing Company; May 23, 2017:307-310.
  68. Chou Y. Octalysis: complete gamification framework. Gamification & Behavioral Design.   URL: http://yukaichou.com/gamification-examples/octalysis-complete-gamification-framework/ [accessed 2020-09-17]
  69. Söderberg S. A Case Study on the Creative Process of Making a Game for Cognitively Impaired Children. Mälardalens University School of Innovation, Design and Engineering. 2011 Oct 2.   URL: https://www.diva-portal.org/smash/get/diva2:445565/FULLTEXT01.pdf [accessed 2021-04-17]
  70. Guidelines for the development of entertaining software for people with multiple learning disabilities. MediaLT. 2004.   URL: http://www.medialt.no/rapport/entertainment_guidelines/ [accessed 2019-07-25]
  71. Jaramillo-Alcázar A, Luján-Mora S, Salvador-Ullauri L. Inclusive Education: Mobile Serious Games for People with Cognitive Disabilities. Enfoque UTE 2018 Mar 30;9(1):53-66. [CrossRef]
  72. Tong T, Chignell M, Tierney MC, Lee J. A Serious Game for Clinical Assessment of Cognitive Status: Validation Study. JMIR Serious Games 2016 May 27;4(1):e7 [FREE Full text] [CrossRef] [Medline]
  73. Robert PH, König A, Amieva H, Andrieu S, Bremond F, Bullock R, et al. Recommendations for the use of Serious Games in people with Alzheimer's Disease, related disorders and frailty. Front Aging Neurosci 2014 Mar 24;6:54 [FREE Full text] [CrossRef] [Medline]
  74. Klevers M, Sailer M, Günthner W. Implementation model for the gamification of business processes: a study from the field of material handling. In: Kaneda T, Kanegae H, Toyoda Y, Rizzo P, editors. Simulation and gaming in the network society. New York, NY: Springer Publishing Company; 2016:173-184.
  75. Valladares-Rodriguez S, Perez-Rodriguez R, Facal D, Fernandez-Iglesias M, Anido-Rifon L, Mouriño-Garcia M. Design process and preliminary psychometric study of a video game to detect cognitive impairment in senior adults. PeerJ 2017;5:e3508 [FREE Full text] [CrossRef] [Medline]
  76. Zeng Z, Fauvel S, Hsiang BTT, Wang D, Qiu Y, Khuan PCO, et al. Towards long-term tracking and detection of early dementia: a computerized cognitive test battery with gamification. 2018 Presented at: ICCSE'18: 3rd International Conference on Crowd Science and Engineering; July 2018; Singapore. [CrossRef]
  77. Zygouris S, Tsolaki M. Computerized cognitive testing for older adults: a review. Am J Alzheimers Dis Other Demen 2015 Feb 13;30(1):13-28 [FREE Full text] [CrossRef] [Medline]
  78. Wild K, Howieson D, Webbe F, Seelye A, Kaye J. Status of computerized cognitive testing in aging: a systematic review. Alzheimers Dement 2008 Nov 01;4(6):428-437 [FREE Full text] [CrossRef] [Medline]
  79. Tierney MC, Lermer MA. Computerized cognitive assessment in primary care to identify patients with suspected cognitive impairment. J Alzheimers Dis 2010;20(3):823-832. [CrossRef] [Medline]
  80. Gray SI. Developing and evaluating the feasibility of an active training game for smartphones as a tool for promoting executive function in children. The University of Edinburgh. 2017 Jun 07.   URL: https://era.ed.ac.uk/handle/1842/23414 [accessed 2021-04-17]
  81. Franceschini S, Gori S, Ruffino M, Viola S, Molteni M, Facoetti A. Action video games make dyslexic children read better. Curr Biol 2013 Mar 18;23(6):462-466 [FREE Full text] [CrossRef] [Medline]
  82. Green CS, Bavelier D. Action video game modifies visual selective attention. Nature 2003 May 29;423(6939):534-537. [CrossRef] [Medline]
  83. Green CS, Bavelier D. Action video game training for cognitive enhancement. Current Opinion in Behavioral Sciences 2015 Aug;4:103-108. [CrossRef]
  84. Dye MW, Green CS, Bavelier D. Increasing Speed of Processing With Action Video Games. Curr Dir Psychol Sci 2009 Dec;18(6):321-326 [FREE Full text] [CrossRef] [Medline]
  85. Boot WR. Video games as tools to achieve insight into cognitive processes. Front Psychol 2015;6:3 [FREE Full text] [CrossRef] [Medline]
  86. Bavelier D, Green CS. The Brain-Boosting Power of Video Games. Sci Am 2016 Jul 14;315(1):26-31. [CrossRef] [Medline]
  87. Anguera JA, Gazzaley A. Video games, cognitive exercises, and the enhancement of cognitive abilities. Current Opinion in Behavioral Sciences 2015 Aug;4:160-165. [CrossRef]
  88. Brilliant TD, Nouchi R, Kawashima R. Does Video Gaming Have Impacts on the Brain: Evidence from a Systematic Review. Brain Sci 2019 Sep 25;9(10):251 [FREE Full text] [CrossRef] [Medline]
  89. Green CS, Bavelier D. Learning, attentional control, and action video games. Curr Biol 2012 Mar 20;22(6):R197-R206 [FREE Full text] [CrossRef] [Medline]
  90. Bavelier D, Green CS, Han DH, Renshaw PF, Merzenich MM, Gentile DA. Brains on video games. Nat Rev Neurosci 2011 Nov 18;12(12):763-768 [FREE Full text] [CrossRef] [Medline]
  91. Doherty SM, Keebler JR, Davidson SS, Palmer EM, Frederick CM. Recategorization of Video Game Genres. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 2018 Sep 27;62(1):2099-2103. [CrossRef]
  92. Gielis K, Brito F, Tournoy J, Abeele VV. Can Card Games Be Used to Assess Mild Cognitive Impairment? A Study of Klondike Solitaire and Cognitive Functions. 2017 Presented at: CHI PLAY '17: Annual Symposium on Computer-Human Interaction in Play; October 2017; Amsterdam, Netherlands p. 269-276. [CrossRef]
  93. Jimison H, Pavel M, McKanna J, Pavel J. Unobtrusive monitoring of computer interactions to detect cognitive status in elders. IEEE Trans Inf Technol Biomed 2004 Sep;8(3):248-252. [CrossRef] [Medline]
  94. Groznik V, Sadikov A. Gamification in Cognitive Assessment and Cognitive Training for Mild Cognitive Impairment. In: Geroimenko V, editor. Augmented Reality Games II. New York, NY: Springer Publishing Company; 2019:179-204.
  95. Valladares-Rodriguez S, Pérez-Rodriguez R, Fernandez-Iglesias JM, Anido-Rifón L, Facal D, Rivas-Costa C. Learning to Detect Cognitive Impairment through Digital Games and Machine Learning Techniques. Methods Inf Med 2018 Sep 24;57(04):197-207. [CrossRef]
  96. Peijnenborgh JC, Hurks PP, Aldenkamp AP, van der Spek ED, Rauterberg G, Vles JS, et al. A Study on the Validity of a Computer-Based Game to Assess Cognitive Processes, Reward Mechanisms, and Time Perception in Children Aged 4-8 Years. JMIR Serious Games 2016 Sep 22;4(2):e15 [FREE Full text] [CrossRef] [Medline]
  97. Cota TT, Ishitani L, Vieira N. Mobile game design for the elderly: A study with focus on the motivation to play. Computers in Human Behavior 2015 Oct;51:96-105. [CrossRef]
  98. Castronova E, Knowles I. Modding board games into serious games: The case of climate policy. International Journal of Serious Games 2015;2(3):41-62 [FREE Full text]
  99. Vasiljevic GAM, de Miranda LC. Brain–Computer Interface Games Based on Consumer-Grade EEG Devices: A Systematic Literature Review. International Journal of Human–Computer Interaction 2019 Jun 03;36(2):105-142. [CrossRef]
  100. Joselli M, Binder F, Clua E, Soluri E. Concept, development and evaluation of a mind action game with the electroencephalograms as an auxiliary input. JIS 2017 Sep 14;8(1):1. [CrossRef]
  101. Chesham A, Gerber SM, Schütz N, Saner H, Gutbrod K, Müri RM, et al. Search and Match Task: Development of a Taskified Match-3 Puzzle Game to Assess and Practice Visual Search. JMIR Serious Games 2019 May 09;7(2):e13620 [FREE Full text] [CrossRef] [Medline]
  102. Tong T, Chignell M, Sieminowski T. Case Study: A Serious Game for Neurorehabilitation Assessment. Procedia Computer Science 2015;69:125-131. [CrossRef]
  103. Tong T, Chignell M. Developing a Serious Game for Cognitive Assessment: Choosing Settings and Measuring Performance. 2014 Presented at: Chinese CHI '14: Second International Symposium of Chinese CHI; April 26, 2014; Toronto, Ontario, Canada p. 70-79. [CrossRef]
  104. Tong T, Chan JH, Chignell M. Serious Games for Dementia. 2017 Presented at: WWW '17 Companion: 26th International Conference on World Wide Web Companion; April 2017; Perth, Australia p. 1111-1115. [CrossRef]
  105. Khademi M, Hondori HM, McKenzie A, Dodakian L, Lopes CV, Cramer SC. Free-hand interaction with leap motion controller for stroke rehabilitation. 2014 Presented at: CHI EA '14: Human Factors in Computing Systems; April 2014; Toronto, Ontario, Canada p. 1663-1668. [CrossRef]
  106. Tan CW, Yu PD, Lin L. Teaching Computational Thinking Using Mathematics Gamification in Computer Science Game Tournaments. In: Kong SC, Abelson H, editors. Computational Thinking Education. Singapore: Springer Publishing Company; 2019:167-181.
  107. Green CS, Seitz AR. The Impacts of Video Games on Cognition (and How the Government Can Guide the Industry). Policy Insights from the Behavioral and Brain Sciences 2015 Oct 01;2(1):101-110. [CrossRef]
  108. Bavelier D, Green C, Seidenberg M. Cognitive development: gaming your way out of dyslexia? Curr Biol 2013 Apr 08;23(7):R282-R283 [FREE Full text] [CrossRef] [Medline]
  109. Trajkoski M. Whack A Mole Game. App Store.   URL: https://apps.apple.com/ms/app/whack-a-mole-game/id1233662398 [accessed 2019-12-26]
  110. Jackson SA, Marsh HW. Development and validation of a scale to measure optimal experience: the flow state scale. Journal of Sport and Exercise Psychology 1996;18(1):17-35. [CrossRef]
  111. Dörrenbächer S, Kray J. The Impact of Game-Based Task-Shifting Training on Motivation and Executive Control in Children with ADHD. J Cogn Enhanc 2018 Jun 19;3(1):64-84. [CrossRef]
  112. Friehs MA, Dechant M, Vedress S, Frings C, Mandryk RL. Effective Gamification of the Stop-Signal Task: Two Controlled Laboratory Experiments. JMIR Serious Games 2020 Sep 08;8(3):e17810 [FREE Full text] [CrossRef] [Medline]
  113. Lan L, Lian Z. Application of statistical power analysis – How to determine the right sample size in human health, comfort and productivity research. Building and Environment 2010 May;45(5):1202-1213. [CrossRef]
  114. Faul F, Erdfelder E, Lang AG, Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods 2007;39(2):175-191 [FREE Full text]
  115. Faber J, Fonseca LM. How sample size influences research outcomes. Dental Press J Orthod 2014 Aug;19(4):27-29 [FREE Full text] [CrossRef] [Medline]
  116. Aalbers T, Baars MAE, Olde Rikkert MGM, Kessels RPC. Puzzling with online games (BAM-COG): reliability, validity, and feasibility of an online self-monitor for cognitive performance in aging adults. J Med Internet Res 2013 Dec 03;15(12):e270 [FREE Full text] [CrossRef] [Medline]
  117. Fedele D, McConville A, Moon J, Thomas J. Topical Review: Design Considerations When Creating Pediatric Mobile Health Interventions: Applying the IDEAS Framework. J Pediatr Psychol 2019 Apr 01;44(3):343-348 [FREE Full text] [CrossRef] [Medline]
  118. GitHub.   URL: https://github.com/ [accessed 2020-09-24]
  119. MD2K.   URL: https://md2k.org/ [accessed 2020-09-24]


ADHD: attention deficit hyperactivity disorder
ADLs: activities of daily living
CAM: Confusion Assessment Method
DSR: design science research
MD2K: Mobile Sensor Data-To-Knowledge
MDA: Mechanics, Dynamics, and Aesthetics
MMSE: Mini-Mental State Examination
MoCA: Montreal Cognitive Assessment
OMDE: Objects, Mechanics, Dynamics, Emotions
RCT: randomized controlled trial
SDT: self-determination theory


Edited by N Zary; submitted 29.06.20; peer-reviewed by T Tong, P Robert, C Lara; comments to author 12.08.20; revised version received 05.10.20; accepted 08.02.21; published 18.05.21

Copyright

©Ali Khaleghi, Zahra Aghaei, Mohammad Amin Mahdavi. Originally published in JMIR Serious Games (https://games.jmir.org), 18.05.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Serious Games, is properly cited. The complete bibliographic information, a link to the original publication on https://games.jmir.org, as well as this copyright and license information must be included.