Abstract
Collaborative problem solving (CPS) has emerged as a crucial 21st century competence that benefits students’ studies, future careers, and general well-being, prevailing across disciplines and learning approaches. Given the complex and dynamic nature of CPS, teacher-facing learning analytics dashboards (LADs) have increasingly been adopted to support teachers’ CPS assessments by analysing and visualising various dimensions of students’ CPS. However, there is limited research investigating K-12 teachers’ integration of LADs for CPS assessments in authentic classrooms. In this study, a LAD was implemented to assist K-12 teachers in assessing students’ CPS skills in an educational game. Based on the person-environment fit theory, this study aimed to (1) examine the extent to which teachers’ environmental and personal factors influence LAD usage intention and behaviour and (2) identify personal factors mediating the relationships between environmental factors and LAD usage intention and behaviour. Survey data of 300 in-service teachers from ten Chinese K-12 schools were collected and analysed using partial least squares structural equation modelling (PLS-SEM). Results indicated that our proposed model showed strong in-sample explanatory power and out-of-sample predictive capability. Additionally, subjective norms affected technological pedagogical content knowledge (TPACK) and self-efficacy, while school support affected technostress and self-efficacy. Moreover, subjective norms, technostress, and self-efficacy predicted behavioural intention, while school support, TPACK, and behavioural intention predicted actual behaviour. As for mediation effects, school support indirectly affected behavioural intention through self-efficacy, while subjective norms indirectly affected behavioural intention through self-efficacy and affected actual behaviour through TPACK. 
This study makes theoretical, methodological, and practical contributions to technology integration in general and LAD implementation in particular.
1 Introduction
Collaborative problem solving (CPS) is a socio-cognitive process in which group members utilise their shared knowledge, experiences, and skills, and navigate through a series of steps to reach a mutually agreed-upon solution to a particular problem (Fiore et al., 2017; Graesser et al., 2018; Griffin & Care, 2015; OECD, 2017a). Scholars and educators recognise CPS as a critical competence for the younger generation in the 21st century (Cukurova et al., 2018; Fiore et al., 2018). As a domain-general competence (Graesser et al., 2017; Greiff et al., 2014), CPS skills are essential in various active learning approaches (e.g., project-based learning, inquiry-based learning) (Song, 2018; Saleh et al., 2022) and prevail in diverse disciplines (e.g., language, mathematics, and computer science) (see reviews by Baucal et al., 2023 and Tian & Zheng, 2023). Educational and governmental initiatives worldwide, including the Programme for International Student Assessment (PISA), the Assessment and Teaching of 21st Century Skills (ATC21S), and the Educational Testing Service (ETS), have increasingly emphasised the importance of students mastering CPS competence (von Davier & Halpin, 2013; Griffin & Care, 2015; OECD, 2017a). Students competent in CPS are likely to not only excel academically but also become better equipped to effectively address communication issues and navigate interpersonal conflicts in future teamwork scenarios (Fiore et al., 2018; OECD, 2017b; Sun et al., 2022). To achieve successful CPS, students are required to engage both effectively and collectively in the processes of identifying and representing problems, planning, execution, and monitoring (Hesse et al., 2015; OECD, 2017a).
Cognitive and social skills play pivotal roles during CPS activities, enabling teams to coordinate and communicate effectively and pool individual knowledge, experiences, and skills, thereby arriving at better solutions more efficiently than when individuals work alone (Andrews-Todd & Forsyth, 2020; Care et al., 2016; Hesse et al., 2015). However, given the interactive, interdependent, and temporal nature of CPS (Swiecki et al., 2020), it is challenging for teachers to assess and support students’ CPS skills and performance at both individual and group levels during their actual instruction (Scoular & Care, 2018), particularly in classroom settings (Care & Kim, 2018; Martinez-Maldonado, 2019).
In recent years, well-designed games have gained recognition as suitable vehicles for assessing and fostering students’ higher-order skills, including CPS (see review by Qian & Clark, 2016 and Gomez et al., 2022). As an active learning approach, digital game-based learning creates an immersive and playful environment, attracting students’ attention and promoting their engagement in learning tasks, thereby facilitating them to stay focused on the learning objectives (Hsu et al., 2021; Kaimara et al., 2021; Stieler-Hunt & Jones, 2017). More importantly, as students interact with games and teammates, they generate a substantial amount of multimodal gameplay data, such as clickstreams and conversations (Sun et al., 2022; see review by Tlili et al., 2021a). Such multimodal gameplay data can be captured, analysed, and visualised through learning analytics dashboards (LADs), providing teachers with meaningful and actionable information about students’ demonstration of higher-order skills and learning attainments (Chen et al., 2022; Lee-Cultura et al., 2023; Ruipérez-Valiente et al., 2021; Tlili et al., 2021b). LADs, harnessing the power of both learning analytics and visual analytics (Sun & Liu 2022), can illustrate various metrics that reflect different aspects of students’ CPS. Not only do they display students’ individual skills and contributions to collaborative learning along with their changes over time (Hu et al., 2022), but they also illuminate team performance and group dynamics during the collaborative learning processes (Liu et al., 2024; Bao et al., 2021; Zheng et al., 2021).
In CPS contexts, teachers play a guiding role in facilitating students’ collaboration and problem solving (Griffin, 2017). With the assistance of LADs, teachers can monitor students’ CPS progression and evaluate individual contributions and team performance more accurately (Liu et al., 2024; Martinez-Maldonado, 2019). This enables teachers to identify students who are struggling with CPS tasks or ‘gaming the system’ in a timely manner and provide actionable feedback as well as adaptive and personalised support (Chen et al., 2021; Huang et al., 2023). Teacher-facing LADs are widely regarded as effective tools that can aid teachers in facilitating students’ CPS practices (Kaliisa & Dolonen, 2023; Van Leeuwen et al., 2019). However, teachers’ sensemaking of the LADs relies on how they associate dashboard information with their own pedagogical decisions and actions (Van Leeuwen, 2019), and multiple factors can influence this sensemaking process. For example, existing studies (e.g., Zheng et al., 2021) have found that teachers often struggle with complex visualisations in LADs. Some have reported the misalignment between visual representations of LADs and teachers’ diagnoses (Li et al., 2022), whereas others have demonstrated that teachers might find it difficult to use LADs, suggesting the necessity of providing professional training and technological support for those working with LADs (Rienties et al., 2018).
Despite the substantial findings in LAD research, there exists a critical gap between the potential role of LADs designed for CPS assessments and their actual usage by teachers. According to person-environment fit theory (P-E fit theory), the degree of compatibility between a person’s individual characteristics and external environments can shape the person’s actual behaviour (Kristof-Brown et al., 2005; Kristof-Brown & Guay, 2011). Accordingly, this study posits that both teachers’ personal and environmental factors play critical roles in determining their usage behaviour of LADs for CPS assessments. Moreover, through reviewing previous literature on the P-E fit theory (e.g., Al-Fudail & Mellar, 2008; Chou & Chou, 2021; Dong et al., 2020; Govender & Mpungose, 2022) and determinants of teacher-facing LAD adoption (e.g., Kaliisa et al., 2021; Rienties et al., 2018; Van Leeuwen et al., 2021), this study identified several specific personal and environmental factors that may affect teachers’ LAD usage and constructed an integrated conceptual model (see Fig. 1). Specifically, teachers’ personal attributes include technological pedagogical content knowledge (TPACK), technostress, and self-efficacy, while the environmental factors include subjective norms and school support. In light of the potential of LADs for supporting teachers’ CPS assessments and the possible impacts of these factors on their LAD usage, this study aims to examine how these personal and environmental factors and their relationships shape the LAD usage intention and behaviour of K-12 in-service teachers in the classroom context of game-based CPS assessments.
Our paper is structured as follows: first, we define collaborative problem solving (CPS), describe its prevalence in education across learning approaches and contexts, and introduce how educational games can develop and assess learners’ CPS competence. Then, we illustrate the potential of teacher-facing learning analytics dashboards (LADs) for supporting teachers’ assessment of learners’ CPS skills and problematise the gap between their potential and teachers’ actual usage. Next, we present our literature review on the affordances of teacher-facing LADs for facilitating computer-based collaborative learning and on the person-environment fit theory, encompassing various environmental and personal factors influencing teachers’ technology integration. Then, we articulate our research aims, conceptual model and hypotheses, and our methodology for examining our model hypotheses and addressing our research objective. Next, we present the results of our analysis, including the structural and mediation effects of the environmental and personal factors on teachers’ LAD usage intention and behaviour. Finally, we discuss these results and make concluding remarks regarding the contributions of this study, its limitations, and our recommendations for future research.
2 Literature review
2.1 Teacher-facing LADs for computer-based collaborative learning
Teacher-facing LADs function as a supportive tool, enabling teachers to assess and intervene in students’ learning processes effectively and efficiently based on timely visualizations (Van Leeuwen et al., 2019). These LADs provide fine-grained insights that empower teachers to understand, rationalise, and make informed decisions based on the complex data derived from student online learning trajectories and activities (Calvo-Morata et al., 2019; Li et al., 2022). Research has underscored the benefits of integrating LADs into computer-based collaborative learning. For instance, a study by Van Leeuwen et al. (2019) unveiled a teacher-centric LAD aimed at tracking students’ collaborative endeavours and providing indirect support within the classroom setting. The dashboard offers teachers an overview of students’ collaborative learning situations derived from computational analytics, assisting them in detecting students’ problematic learning situations. Kaliisa and Dolonen (2023) introduced university instructors to a LAD designed for online problem-oriented discussions. This dashboard, equipped with visualisations of team interaction and automated discourse analysis of students’ discussion content, facilitated teachers in interpreting students’ learning dynamics within collaborative discussions. While LADs were deemed beneficial for teachers to gain a deeper understanding of student learning and interactions within computer-supported collaborative learning environments, teachers might encounter multiple challenges when attempting to utilise LADs in their teaching (Zheng et al., 2021). For instance, Liu et al. (2023) found that teachers’ resistance to embracing LADs might result from a deficit in supportive knowledge and skills, such as data visualisation literacy, and possible stress and anxiety induced by technology adoption. Li et al. (2022) revealed that teachers viewed the complexity of visualisations from LADs and the inadequacy of LAD capabilities as barriers to interpreting their information. Thus, the identification of these challenges and the implementation of corresponding solutions are crucial in enhancing teachers’ capacity to integrate LADs into teaching practices.
2.2 Person-environment fit theory
Built on various theories and models of technology integration, previous studies have identified numerous factors that may facilitate or impede teachers’ effective use of LADs (e.g., Liu et al., 2023, 2024; Li et al., 2022). Certain factors are tied to teachers’ personal attributes, encompassing their knowledge, skills, and literacy in relation to LADs, as well as psychological aspects like self-efficacy beliefs. On the other hand, some factors derive from external environments around teachers, such as the availability of school support or broader social conditions. To better conceptualise and situate these factors, we employ the P-E fit theory as the theoretical ground for this study. The P-E fit theory refers to the compatibility that arises when people and their environments are well matched (Kristof-Brown et al., 2005). It stresses that compatibility between individual and work environment characteristics influences behaviour and psychological functions, with a higher congruity associated with improved performance and increased productivity (Edwards et al., 1998; Kristof-Brown & Guay, 2011). In this study, the P-E fit theory offers a lens to investigate how individual and environmental factors influence teachers’ use of LADs for assessing and supporting students’ CPS skills. Past research has typically leveraged technology acceptance models (TAMs) to delve into various psychological drivers of technology adoption among teachers (Scherer et al., 2019). While both TAM and P-E fit theories underscore influential factors related to technology integration, P-E fit theory offers added value by clustering these factors and further clarifying the relationships between these clusters. In the following sections, we will elaborate on the personal and environmental factors influencing teachers’ technology integration into teaching practices.
2.3 Environmental factors: subjective norms and school support
Ajzen and Fishbein (1980) defined subjective norms (SN) as one’s normative belief that ‘specific individuals or groups think he should or should not perform the behaviour and his motivation to comply with the specific referents’ (p. 8). A systematic review conducted by Wijnen et al. (2021) highlighted the pivotal role of subjective norms in shaping teachers’ acceptance and adoption of technologies designed to foster primary school students’ higher-order thinking skills. Jeong and Kim (2017) found that teachers with a stronger sense of subjective norms were more inclined to utilise information and communication technologies (ICT) for instructional purposes in early childhood education. Shin’s (2015) study disclosed that a substantial proportion of elementary school teachers perceived the attitudes of administrators towards technology use as a vital determinant in achieving high-quality technology integration.
School support (SS) refers to material and psychological support provided by school administrators for teaching-related technology use (Chou & Chou, 2021). Empirical research has demonstrated that school support can facilitate teachers’ utilisation of technology (Atman Uslu & Usluel, 2019; Hew et al., 2017; Porter & Graham, 2016). For instance, Lam et al. (2010) discovered that secondary school teachers exhibited greater motivation and willingness to adopt technological innovations when they perceived their schools as being more supportive of their competence and autonomy. Koh et al. (2017) revealed that the primary school teachers’ integration of technology was fostered by peer support in teacher professional development activities organised by the schools. Regarding pre-service teachers, their intent to use technology was positively influenced by a range of facilitative conditions, such as the availability of infrastructure, technical assistance, and encouraging policies (Kaimara et al., 2021).
2.4 Personal factors: TPACK, self-efficacy, and technostress
Teachers’ technological, pedagogical, and content knowledge (TPACK) emphasises the affordances of using technologies to improve teaching practices (Archambault & Crippen, 2009; Schmidt et al., 2009). Well-developed TPACK is commonly associated with the successful integration of educational technologies in teaching and learning (Anthony et al., 2021; Koh et al., 2017), as TPACK not only informs teachers about what technologies to use but, more importantly, fosters teachers to think, analyse, and reflect on the use of technology (Huang et al., 2021a). Consequently, TPACK is widely discussed when researchers evaluate teachers’ technology usage and integration. For instance, Schmid et al. (2021) revealed that pre-service teachers’ TPACK skills were associated with how they implemented technologies in lesson plans. Furthermore, previous research has emphasised the positive impact of TPACK on teachers’ attitudes, behavioural intentions, and actual behaviours towards technology integration among K-12 and higher education teachers (e.g., Hsu et al., 2021; Jung et al., 2019; Zhang & Chen, 2022). In other words, teachers with a high level of TPACK are more likely to form favourable perceptions of the value technology can add to teaching and learning. Conversely, the lack of TPACK often poses obstacles to successful technology integration (Liu et al., 2024).
Perceived self-efficacy (SE) was defined as ‘beliefs in one’s capabilities to organise and execute the courses of action required to produce given attainments’ (Bandura, 1997, p. 3). Numerous studies have suggested that teachers’ self-efficacy positively influences their intentions to incorporate educational technologies into actual teaching (e.g., Panisoara et al., 2020; Wijnen et al., 2021). For instance, Joo et al. (2018) found that pre-service teachers’ perceived self-efficacy was positively related to their intention towards technology integration. The same positive relationship between self-efficacy and technology use can also be observed in K-12 educational contexts. Petko et al. (2018) reported that primary school teachers’ perceived technology-related beliefs strongly predicted their short-term and long-term technology use for teaching project-based learning. Similarly, Kwon et al. (2019) demonstrated that teachers’ self-efficacy for technology integration significantly affected their actual use of computing devices (e.g., smartphones and tablets) in secondary schools.
Technostress (TS), particularly in educational settings, has garnered significant research attention due to the pervasive infiltration of new technologies into classrooms spanning various subjects and educational levels (see review by Fernández-Batanero et al., 2021). Weil and Rosen (1997) defined technostress as ‘any negative impact on attitudes, thoughts, behaviors, or body psychology caused directly or indirectly by technology’ (p. 5). Ayyagari et al. (2011) suggested that a high level of technostress leads to users’ lower performance in their actual usage and diminished behavioural intention towards future adoption. Similar findings have likewise been noted within the realm of educational research (e.g., Chou & Chou, 2021; Joo et al., 2016). In particular, technostress could lead to K-12 teachers’ psychological frustration and an inability to cope with teaching tasks (Al-Fudail & Mellar, 2008). Similarly, a large-scale survey on the technology integration behaviour of K-12 teachers showed that teachers’ efforts to integrate technologies were hindered by technostress, which diminished satisfaction with educational technology usage and adversely affected their perceptions of using ICT for teaching (Wu et al., 2022).
2.5 The current study
With the complex and dynamic nature of CPS, there is a growing trend towards the development of teacher-facing LADs to analyse and visualise multimodal data related to students’ CPS skills (e.g., clickstreams, group conversations) collected from learning technologies, including educational games (Liu et al., 2024; Azevedo & Gašević, 2019; Chen et al., 2022; Tlili et al., 2021b). Meanwhile, when attempting to adopt LADs for their teaching, teachers often encounter various barriers and challenges (Kaliisa et al., 2021; Lee-Cultura et al., 2023; Li et al., 2022), often due to the misfit between personal and environmental characteristics (Chou & Chou, 2021; Dong et al., 2020; Govender & Mpungose, 2022). This misfit in turn influences teachers’ technology usage intention and performance (Chou & Chou, 2021; Li & Wang, 2021; Joo et al., 2016). Consequently, to maximise the utility of LADs and mitigate teachers’ stress from integrating them, it is necessary to identify personal and environmental stressors and predict teachers’ LAD usage from a person-environment fit perspective. Despite the substantial existing findings on the influence of personal and environmental factors on teachers’ technology acceptance and adoption, scant research has delved into how these factors affect K-12 teachers’ integration of LADs for CPS assessments in classroom settings. Therefore, in the current study, we implemented a teacher-facing LAD for assessing students’ CPS in K-12 classrooms and examined the relationships between environmental and personal factors and how they shape K-12 teachers’ behavioural intention and actual usage of the LAD. The current study will extend our theoretical understanding of the P-E fit theory through its application in the context of technology-enhanced CPS assessment.
Methodologically, although previous studies on P-E fit have proposed and validated diverse technology integration models, they focused on checking their models’ explanatory power (i.e., R²) and offered limited evidence of the models’ predictive capability and external validity. Thus, whether our proposed model possesses strong out-of-sample predictive power warrants further investigation. Our research findings could shed light on the determinants that either facilitate or hinder teachers’ successful integration of LADs into teaching practices and offer recommendations on how to support teachers in leveraging LADs to foster students’ CPS skills.
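The distinction between in-sample explanatory power (R²) and out-of-sample predictive capability can be illustrated with a simple holdout procedure analogous to the PLSpredict approach used with PLS-SEM. The sketch below is purely illustrative: it uses synthetic data and an ordinary least-squares proxy rather than an actual PLS-SEM estimator, and all variable names and coefficients are assumptions.

```python
import numpy as np

def ols_fit(X, y):
    # Ordinary least squares with an intercept term
    Xb = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return beta

def ols_predict(beta, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return Xb @ beta

def r_squared(y, y_hat):
    # In-sample explanatory power: 1 - SS_res / SS_tot
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

def kfold_oos_rmse(X, y, k=10, seed=0):
    # Out-of-sample RMSE via k-fold cross-validation:
    # each fold is predicted by a model fit on the remaining folds
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errors = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        beta = ols_fit(X[train], y[train])
        errors.extend(y[fold] - ols_predict(beta, X[fold]))
    return float(np.sqrt(np.mean(np.square(errors))))

# Synthetic example: two hypothetical predictors of behavioural intention
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 2))  # e.g., self-efficacy and technostress scores
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.8, size=300)

beta = ols_fit(X, y)
print("in-sample R^2:", round(r_squared(y, ols_predict(beta, X)), 3))
print("out-of-sample RMSE:", round(kfold_oos_rmse(X, y), 3))
```

A model can score well on the first metric yet poorly on the second when it overfits the sample; reporting both, as done in this study, guards against that.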
2.6 Conceptual model and hypotheses
Based on the P-E fit theory and our literature review as discussed above, this study constructed an integrated conceptual model (see Fig. 1), where the hypothesised model relationships are as follows:
Fig. 1 The integrated conceptual model
- H1-H5: subjective norms (SN) affect TPACK, self-efficacy (SE), technostress (TS), behavioural intention (BI), and actual behaviour (AB), respectively.
- H6-H10: school support (SS) affects TPACK, self-efficacy (SE), technostress (TS), behavioural intention (BI), and actual behaviour (AB), respectively.
- H11-H13: TPACK affects technostress (TS), behavioural intention (BI), and actual behaviour (AB), respectively.
- H14-H16: self-efficacy (SE) affects technostress (TS), behavioural intention (BI), and actual behaviour (AB), respectively.
- H17-H18: technostress (TS) affects behavioural intention (BI) and actual behaviour (AB), respectively.
- H19: behavioural intention (BI) affects actual behaviour (AB).
Additionally, we postulate that TPACK and self-efficacy indirectly affect behavioural intention and actual behaviour through technostress. First, Joo et al. (2016) discovered that teachers’ TPACK abilities lessened their levels of technostress, which subsequently enhanced their intention to utilise educational technologies. Concerning the relationship between self-efficacy and technostress, studies on K-12 and university teachers’ technostress towards online teaching tools found that teachers’ self-efficacy reduced their technostress (Chou & Chou, 2021). Similarly, Dong et al. (2020) illustrated that self-efficacy mitigated technostress among K-12 in-service teachers during the integration of ICT into teaching activities. We therefore hypothesise:
- H20-H21: technostress (TS) mediates the relations between TPACK and behavioural intention (BI) and actual behaviour (AB).
- H22-H23: technostress (TS) mediates the relations between self-efficacy (SE) and behavioural intention (BI) and actual behaviour (AB).
We also propose that school support indirectly affects behavioural intention and actual behaviour through TPACK, self-efficacy, and technostress. School support has been identified as a crucial catalyst in motivating teachers to incorporate e-learning into their teaching practices (Atman Uslu & Usluel, 2019; Ifinedo & Kankaanranta, 2021; Liu et al., 2017). Both administrative and collegial support were found to positively influence teachers’ TPACK and computer self-efficacy within K-12 school settings (Dong et al., 2020). In the absence of sufficient school support, K-12 teachers are prone to ‘experience resistance and animosity from colleagues’, which could consequently undermine their self-efficacy when applying educational games to their teaching activities (Stieler-Hunt & Jones, 2017). Numerous studies also suggest that school support can help teachers alleviate their technostress, which in turn facilitates the integration of emerging technologies into teaching (e.g., Joo et al., 2016; Özgür, 2020; Chou & Chou, 2021). We therefore hypothesise:
- H24-H25: TPACK mediates the relations between school support (SS) and behavioural intention (BI) and actual behaviour (AB).
- H26-H27: self-efficacy (SE) mediates the relations between school support (SS) and behavioural intention (BI) and actual behaviour (AB).
- H28-H29: technostress (TS) mediates the relations between school support (SS) and behavioural intention (BI) and actual behaviour (AB).
Subjective norms have been recognised as a key factor in shaping teachers’ technology usage intention and behaviour (Jeong & Kim, 2017; Shin, 2015; Wijnen et al., 2021). Nevertheless, there is a scarcity of knowledge concerning whether and how teachers’ personal factors mediate the influences of subjective norms on their usage intention and behaviour towards educational technologies. Previous studies (e.g., Kwon et al., 2019; Wu et al., 2022; Zhang & Chen, 2022) demonstrated the significant influences of teachers’ TPACK, self-efficacy, and technostress on their behavioural intention and actual usage towards digital learning technologies. At the same time, subjective norms were found to be highly associated with TPACK, self-efficacy, and technostress (Dong et al., 2020; Jang et al., 2021; Scherer et al., 2019). Hence, we expect that subjective norms have significant indirect effects on behavioural intention and actual behaviours through TPACK, self-efficacy, and technostress, respectively. We therefore hypothesise:
- H30-H31: TPACK mediates the relations between subjective norms (SN) and behavioural intention (BI) and actual behaviour (AB).
- H32-H33: self-efficacy (SE) mediates the relations between subjective norms (SN) and behavioural intention (BI) and actual behaviour (AB).
- H34-H35: technostress (TS) mediates the relations between subjective norms (SN) and behavioural intention (BI) and actual behaviour (AB).
3 Methods
3.1 Overview of the CPS game and the LAD
For facilitating the assessment of young students’ CPS skills, Digital City Fighter (D-City Fighter) was developed as part of a larger theme-based research project titled Learning and Assessment for Digital Citizenship. D-City Fighter (see Fig. 2) is a mobile online role-playing game focused on CPS with a 3D interface supporting multiple players. Based on the principles of evidence-centred design, three CPS quests have been designed and developed in the game (Tsang et al., 2020; Liu et al., 2024). As Fig. 2 shows, one of the quests requires a group of four student-players to locate puzzle pieces scattered throughout the digital city and assemble them within a 15-minute timeframe. Specifically, players are tasked with identifying puzzle pieces whose frames correspond to the colour of the circle at their feet and positioning these pieces correctly in their respective locations within the puzzle area. A hidden clue for completing this puzzle task becomes available to players upon entering the bush. Various tools (i.e., Virtual Joystick, Pickup/Putdown, Emoji, Chat, Scoreboard, Map, and Timer) are incorporated into the graphical user interface to support players’ CPS processes. To assess the players’ CPS skills, measures from their gameplay data (e.g., movement trajectories, clickstreams) were mapped to CPS skills, referencing a well-known framework for teachable CPS skills proposed by Hesse et al. (2015): participation, perspective taking, task regulation, social regulation, and knowledge building.
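The mapping from low-level gameplay events to the five CPS skill categories of Hesse et al. (2015) can be sketched as a simple event-tagging and aggregation step. The event names and skill assignments below are hypothetical illustrations only; the actual rubric used in D-City Fighter is based on evidence-centred design and may map events quite differently.

```python
from collections import Counter

# Hypothetical mapping from gameplay event types to the five CPS skills
# of Hesse et al. (2015); the real D-City Fighter rubric may differ.
EVENT_TO_SKILL = {
    "chat_message":      "participation",
    "emoji_sent":        "perspective_taking",
    "piece_pickup":      "task_regulation",
    "piece_placed":      "task_regulation",
    "map_opened":        "task_regulation",
    "scoreboard_opened": "social_regulation",
    "clue_found":        "knowledge_building",
}

def skill_counts(events):
    """Aggregate one player's event stream into per-skill counts.

    `events` is a list of (timestamp_seconds, event_type) tuples;
    event types absent from the mapping are ignored.
    """
    counts = Counter()
    for _, event_type in events:
        skill = EVENT_TO_SKILL.get(event_type)
        if skill is not None:
            counts[skill] += 1
    return dict(counts)

# Example clickstream for a single player
stream = [(3, "chat_message"), (10, "piece_pickup"), (14, "piece_placed"),
          (20, "clue_found"), (25, "chat_message")]
print(skill_counts(stream))
# {'participation': 2, 'task_regulation': 2, 'knowledge_building': 1}
```

In practice such counts would be normalised and combined with movement-trajectory measures before being visualised as skill performance levels.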
Fig. 2 The interface of D-City Fighter (Player E1’s point of view)
CPSLens (see Fig. 3) is a teacher-facing mobile LAD designed to analyse and visualise students’ interactions within virtual CPS environments, such as D-City Fighter. It aims to assist teachers in assessing students’ CPS skills and performance. The pipeline for game learning analytics of CPS is depicted in Fig. 4. Specifically, student-players’ gameplay data are fed into CPSLens, which visualises their CPS skills and processes as well as quest performance and engagement to teachers. Teachers can then use the generated visualisations to evaluate students’ CPS skills and performance. In CPSLens, six visualisation panels are offered: Quest Performance, Player Movement, CPS Performance, Quest Playback, Quest Engagement, and CPS Dynamics. Moreover, CPSLens allows teachers to switch between the visualisation interfaces of different groups and group members.
Fig. 3 The interface of CPSLens (Group B and Player B1)
Fig. 4 Game learning analytics pipeline adapted from Calvo-Morata et al. (2019)
Students’ CPS skills and processes are visualised via the CPS Performance, CPS Dynamics, Player Movement, and Quest Playback panels. Through the CPS Performance panel, teachers can assess group members’ performance on five CPS skills identified in the literature (Hesse et al., 2015), represented by a circular bar chart. Teachers can also visually check the transition patterns among the CPS skills, depicted by nodes of varying sizes corresponding to skill performance levels. When clicking specific skill bars, such as participation (yellow) and task regulation (pink) (See Fig. 3), teachers will be presented with multiple black-and-white bars. Each of these black-and-white bars indicates the proportion of the selected skill performed by one player (black part) relative to the corresponding skill performance of the entire group at a given time point on the x-axis of the CPS Dynamics panel. The CPS Dynamics panel displays a stacked bar chart with coordinates; the x-axis represents time points (in minutes), and the y-axis signifies one group member’s performance scores on the CPS skills, providing teachers with insights into dynamic changes in individual CPS skills over time. The Player Movement panel visualises different group members’ movement trajectories within the digital city. The city’s geography is divided into quest zones containing quest-related elements (e.g., puzzle pieces) and non-quest zones without relevant problem-solving information. This feature enables teachers to track and monitor different group members’ task progress. For instance, teachers can leverage this visualisation panel to identify difficulties that group members encounter as well as unintended events (e.g., ‘gaming the system’, off-quest behaviours) in the CPS process by comparing their actual movement trajectories with the expected ones. Finally, the Quest Playback panel allows teachers to review the video recording of each group member’s CPS process individually.
The Quest Performance and Quest Engagement panels visualise the performance and engagement of the CPS quests, respectively. Teachers can use the Quest Performance panel to examine the quest scores at both individual and group levels, as well as check the time remaining and expended while group members engage with the quests. The quest scores are computed using performance metrics derived from students’ gameplay data, such as the duration of quest completion and the count of incorrect attempts. Additionally, the Quest Engagement panel showcases the level of engagement of each group member, evaluated through the group member’s interactions with teammates, in-game support tools, quest elements, and both quest and non-quest zones. A higher level of engagement corresponds to a darker square colour. The Quest Engagement panel is laid out as a coordinate grid, with each point on the x-axis marking a time point (in minutes) and each label (e.g., B1-B4 in Group B) on the y-axis representing a group member. This panel informs teachers about different group members’ temporal changes in engagement levels throughout the entire CPS process.
The use of CPSLens offers interactive, near real-time visualisations of students’ strengths and weaknesses regarding particular CPS skills. Such feedback can not only help teachers foster students’ CPS skills in an adaptive and personalised way but also help them improve the design and implementation of collaborative learning activities, such as forming appropriate group compositions based on CPS assessment results from CPSLens.
3.2 Participants and procedure
A total of 300 in-service teachers from ten K-12 schools in China participated in this study. Among the participants, 59.3% were female, with an overall average age of 40.40 (SD = 7.05). Table 1 shows their demographic details. None of the participating teachers had experience implementing LADs or educational games in their classrooms. Before data collection, informed consent was obtained from the participating teachers, principals, students, and their parents. The researchers emphasised to the participating teachers that they had the freedom to withdraw from the study at any time without any consequences.
Initially, the research team disseminated multimedia materials for the teachers to acquire knowledge about technology-enhanced CPS assessments, LADs, and their pedagogical affordances. Then, considering teachers’ lack of experience in implementing LADs, the researchers conducted a full-day onsite training session in each school to familiarise the teachers with the operations of D-City Fighter and CPSLens. This comprehensive training, guided by the researchers, included a live demonstration of D-City Fighter and CPSLens, followed by a hands-on trial allowing teachers to gain practical experience with these technology applications. Upon completion, the participating teachers were invited to integrate D-City Fighter and CPSLens into their teaching and implement them within classroom environments in the subsequent semester. Inherently functioning as a domain-general competence (Graesser et al., 2017; Greiff et al., 2014), CPS has been widely applied across various disciplines (e.g., mathematics, computer science) (Baucal et al., 2023; Tian & Zheng, 2023) and different active learning approaches (e.g., project-based learning, problem-based learning) (Song, 2018; Saleh et al., 2022). In this study, as shown in Table 1, the participating teachers implemented the CPS game and the LAD in STEM (e.g., technology, mathematics) and non-STEM (e.g., language) subjects.
Throughout the implementation of D-City Fighter and CPSLens, researchers provided remote assistance via a social media group whenever participants encountered and reported technical difficulties. Excluding these sporadic cases of troubleshooting, all the participants received similar amounts of support from the researchers. One semester after the implementation (approximately four months), we distributed questionnaires to elicit teachers’ perceptions of constructs corresponding to environmental and personal factors (as detailed in the next section), as well as the behavioural intentions and actual use of CPSLens. Log files from CPSLens showed that the teachers, on average, accessed CPSLens 4.63 times, spending approximately 10.13 min during each interaction.
3.3 Measures
We measured subjective norms (SN), self-efficacy (SE), and behavioural intention (BI) using 12 items with a 7-point Likert scale (1 = strongly disagree; 7 = strongly agree). The items were adapted from Admiraal et al. (2017) and Teo and van Schaik (2009). Cronbach’s alpha for SN (five items), SE (four items), and BI (three items) ranged from 0.838 to 0.902.
Technological pedagogical content knowledge (TPACK), school support (SS), and technostress (TS) were measured using 11 items with the same 7-point Likert scale used above. The items were adapted from the highly cited literature (Archambault & Crippen, 2009; Ayyagari et al., 2011; Chou & Chou, 2021; Lam et al., 2010; Schmidt et al., 2009). Cronbach’s alpha for TPACK (four items), SS (four items), and TS (three items) ranged from 0.784 to 0.907.
We assessed actual behaviour (AB) using five items adapted from Davis et al. (1989), Schildkamp et al. (2017), and Siyam (2019). These five 7-point Likert-type items included the teachers’ frequency of utilising the LAD over the one-semester period (from 1 = not at all to 7 = more than 10 times), their duration per LAD usage (from 1 = less than five minutes to 7 = more than half an hour), and three other items measuring the extent to which the teachers used LAD for instructional purposes (from 1 = never to 7 = always). Cronbach’s alpha for AB was 0.838.
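The Cronbach’s alpha values reported above summarise how consistently a construct’s items hang together. As a minimal illustration (not the authors’ actual analysis pipeline), the statistic can be computed directly from an item-score matrix:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

For a construct such as SN, `items` would hold each teacher’s responses to the five SN items; values in the 0.838–0.902 range reported above indicate good internal consistency.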
3.4 Data Analysis
All data analyses were performed using R 4.3.2 and SmartPLS 4.0.7.8. To examine the relationships between personal and environmental factors and how they influenced teachers’ behavioural intention and actual usage of the LAD, we evaluated the proposed model (see Fig. 1) through partial least squares structural equation modelling (PLS-SEM). The PLS-SEM algorithm combines principal component analysis with ordinary least squares regressions to estimate model structures. Compared with covariance-based SEM commonly adopted in studying technology acceptance and adoption, PLS-SEM offers several advantages, including accommodating non-normal data distributions, achieving sufficient statistical power with small sample sizes, and managing complex models with multiple latent and observed variables and their interrelationships. PLS-SEM is also particularly well suited to assessing a model’s out-of-sample predictive power. These strengths align well with our study’s characteristics, such as the non-normality of our data (see item descriptive statistics in Appendix Table 4), complex model relationships within a relatively small sample size, and the requirement to assess the model’s predictive capacity. Although this study’s sample size is relatively small, it exceeds the minimum sample size calculated by G*Power 3.1.9.4 (Faul et al., 2009). With the population effect size, power level, and significance level \(\alpha\) set to 0.15, 0.95, and 0.01 respectively, G*Power suggests that 189 is the minimum sample size required for our model estimation.
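This a priori power analysis can also be approximated programmatically. The sketch below is illustrative only: the number of predictors is our assumption (not stated in the text), and results may differ marginally from G*Power’s. It searches for the smallest N at which the F-test for R² deviation from zero reaches the target power, using the noncentrality parameter \(\lambda = f^{2} \cdot N\):

```python
from scipy.stats import f, ncf

def required_n(f2=0.15, alpha=0.01, power=0.95, n_predictors=5, n_max=1000):
    """Smallest N whose achieved power for the R^2-deviation-from-zero
    F-test reaches the target (a G*Power-style a priori analysis).
    n_predictors=5 is an illustrative assumption."""
    for n in range(n_predictors + 2, n_max):
        df1, df2 = n_predictors, n - n_predictors - 1
        crit = f.ppf(1 - alpha, df1, df2)          # central-F critical value
        achieved = ncf.sf(crit, df1, df2, f2 * n)  # power under noncentral F
        if achieved >= power:
            return n
    return None
```

With f² = 0.15, α = 0.01, and power = 0.95, the returned N falls in the same region as the 189 reported above, varying with the assumed predictor count.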
Adopting the two-stage process delineated by Anderson and Gerbing (1988), the PLS-SEM analysis comprised assessments of the measurement model and the structural model. In the measurement model, we checked indicator reliability (indicator loading \(\ge\) 0.708 and statistically significant), internal consistency reliability (0.70 \(\le\) Cronbach’s alpha and composite reliability (CR) \(\le\) 0.95), convergent validity (average variance extracted (AVE) \(\ge\) 0.50), and discriminant validity (heterotrait-monotrait ratio of correlations (HTMT) \(<\) 0.85; the square root of a construct’s AVE higher than its correlations with other constructs). In the structural model, we examined the statistical significance of path coefficients, collinearity (variance inflation factor (VIF) \(<\) 3.3), and the effect sizes of path coefficients and model explanatory power (i.e., \({R}^{2}\)), ranging from weak (0 \(\le\) x \(\le\) 0.10), modest (0.10 \(<\) x \(\le\) 0.30), and moderate (0.30 \(<\) x \(\le\) 0.50) to strong (x \(>\) 0.50). These assessment criteria follow Hair et al.’s (2021) guidelines for PLS-SEM model evaluation.
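The measurement-model criteria above reduce to simple formulas over indicator loadings and inter-item correlations. A minimal sketch (not SmartPLS’s implementation) of CR, AVE, and HTMT:

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    s2 = lam.sum() ** 2
    return s2 / (s2 + (1 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean squared loading."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

def htmt(corr, idx_a, idx_b):
    """Heterotrait-monotrait ratio for two constructs, given an item correlation matrix."""
    hetero = np.abs(corr[np.ix_(idx_a, idx_b)]).mean()   # between-construct correlations
    def mono(idx):                                       # within-construct correlations
        iu = np.triu_indices(len(idx), k=1)
        return np.abs(corr[np.ix_(idx, idx)])[iu].mean()
    return hetero / np.sqrt(mono(idx_a) * mono(idx_b))
```

For example, a construct with four loadings of 0.80 yields AVE = 0.64 (above the 0.50 threshold) and CR ≈ 0.88, inside the 0.70–0.95 band.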
Following the procedures recommended in Shmueli et al. (2019), this study assessed the out-of-sample predictive power of the proposed model. Initially, a holdout sample-based procedure involving three-fold cross-validation with 20 repetitions was executed to derive training and holdout samples. Next, out-of-sample prediction metrics for the model indicators were computed and compared with the linear regression model (LM) benchmark. According to Shmueli et al. (2019), a PLS model’s \({Q}^{2}_{predict}\) larger than zero and prediction errors (e.g., root-mean-squared error) lower than the LM benchmark indicate sufficient predictive power. Because Shmueli et al. (2019) emphasise that the assessment of a PLS model’s prediction performance should concentrate on its key endogenous constructs, our analysis primarily targeted the BI and AB constructs within the model. For mediation analyses, the multiple mediation effects were analysed with reference to the procedure in Zhao et al. (2010).
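To make the holdout procedure concrete, the sketch below reproduces its core under simplifying assumptions (a single ordinary-least-squares predictor stands in for the full PLS model): k-fold cross-validation, with \({Q}^{2}_{predict} = 1 - \mathrm{SSE}/\mathrm{SSO}\) computed against the training-sample mean as the naive benchmark.

```python
import numpy as np

def q2_predict(y_true, y_pred, y_train_mean):
    """Q^2_predict = 1 - SSE / SSO, with SSO based on the training mean."""
    sse = ((y_true - y_pred) ** 2).sum()
    sso = ((y_true - y_train_mean) ** 2).sum()
    return 1 - sse / sso

def kfold_q2(X, y, k=3, seed=0):
    """Average Q^2_predict over k holdout folds for a simple linear predictor."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    q2s = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        Xtr = np.column_stack([np.ones(len(train)), X[train]])  # add intercept
        Xte = np.column_stack([np.ones(len(test)), X[test]])
        beta, *_ = np.linalg.lstsq(Xtr, y[train], rcond=None)
        q2s.append(q2_predict(y[test], Xte @ beta, y[train].mean()))
    return float(np.mean(q2s))
```

A positive average Q² means the model predicts held-out indicator values better than simply guessing the training mean, which is the criterion applied to the BI and AB indicators above.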
4 Results
4.1 Assessing the measurement model
The results from the measurement model assessment can be found in Appendix Table 4. All indicator loadings proved statistically significant, with all but two items (AB4 = 0.560; AB5 = 0.560) surpassing the threshold of 0.708. These two items were retained because they measure the frequency and duration of LAD use (crucial for the content validity of the measurement) and because the CR and AVE of the construct (i.e., AB) exceeded recommended thresholds (Hair et al., 2021). The CR of all constructs ranged from 0.869 to 0.931, exceeding the cut-off value of 0.70, and Cronbach’s alpha ranged from 0.784 to 0.907, considered ‘satisfactory to good’ according to Hair et al. (2021). Convergent validity was achieved, as denoted by AVE values (ranging between 0.598 and 0.795) exceeding 0.50 (see Appendix Table 4). The discriminant validity of the measurement model was acceptable, with HTMT below 0.85 and the square root of each construct’s AVE greater than its correlations with other constructs within the model (see Table 2).
4.2 Assessing the structural model and mediation effects
Appendix Table 5 and Fig. 5 showcase the results of the hypothesis testing. The structural model did not suffer from collinearity, as the VIF values ranged from 1.109 to 2.274, below the threshold of 3.3. TPACK (𝛽 = 0.596, CI [0.496; 0.681], strong effect size) and SE (𝛽 = 0.494, CI [0.389; 0.587], moderate effect size) were positively predicted by SN, supporting H1 and H2. While SS was a positive predictor of SE (𝛽 = 0.270, CI [0.159; 0.370]), it emerged as a negative predictor of TS (𝛽 = -0.291, CI [-0.413; -0.143]), with both paths bearing modest effect sizes, supporting H7 and H8. SN (𝛽 = 0.346, CI [0.217; 0.473]) and SE (𝛽 = 0.366, CI [0.216; 0.499]) positively predicted BI, each with a moderate effect size, while TS (𝛽 = -0.093, CI [-0.187; -0.010]) negatively predicted BI with a weak effect size, confirming H4, H15, and H17. SS (𝛽 = 0.087, CI [0.011; 0.167]) and TPACK (𝛽 = 0.162, CI [0.056; 0.275]) were predictive of AB, showing a weak and a modest effect size, respectively; therefore, H10 and H13 were supported. With a strong effect size, BI was found to be a predictor of AB (𝛽 = 0.524, CI [0.409; 0.636]), supporting H19.
Results of structural model relationships
The model explained 54.5% of the variance in BI and 62.1% in AB, showing that our research model possessed strong explanatory power (i.e., in-sample predictive power). According to the results of the PLSpredict analysis (see Table 3), the \({Q}^{2}_{predict}\) values for the BI and AB indicators exceeded zero, and the prediction errors of the PLS model were lower than those of the LM benchmark, indicating that our model had high out-of-sample predictive power. Appendix Table 5 and Fig. 6 show the results of the mediation analyses. TPACK (𝛽 = 0.097, CI [0.034; 0.167]) mediated the relationship between SN and AB, supporting H31. SE mediated the relationships between SN and BI (𝛽 = 0.181, CI [0.101; 0.267]) and between SS and BI (𝛽 = 0.099, CI [0.053; 0.162]), confirming H26 and H32.
Statistically significant mediation effects
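The confidence intervals reported for these indirect effects follow the general logic of Zhao et al.’s (2010) bootstrap procedure. A minimal percentile-bootstrap sketch for a single-mediator path (illustrative only, not the SmartPLS routine; the regression-based estimation of the a and b paths is a simplifying assumption):

```python
import numpy as np

def indirect_effect_ci(x, m, y, n_boot=2000, seed=0):
    """Percentile-bootstrap 95% CI for the indirect effect a*b (x -> m -> y)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    effects = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)                     # resample with replacement
        xs, ms, ys = x[idx], m[idx], y[idx]
        a = np.polyfit(xs, ms, 1)[0]                    # x -> m slope
        Xy = np.column_stack([np.ones(n), ms, xs])
        b = np.linalg.lstsq(Xy, ys, rcond=None)[0][1]   # m -> y slope, controlling x
        effects[i] = a * b
    return np.percentile(effects, [2.5, 97.5])
```

A CI excluding zero, as for SN→TPACK→AB and the two paths through SE above, indicates a statistically significant indirect effect.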
5 Discussion
The overarching objective of the current study was to formulate and validate a model that elucidates and predicts in-service K-12 teachers’ integration of LADs for assessing students’ CPS skills in an educational game. In particular, this study investigated the postulated relationships among environmental factors (i.e., SN and SS), personal factors (i.e., TPACK, TS, and SE), as well as intention (BI) and behaviour (AB) regarding LAD usage. Beyond a high explanatory power, our model demonstrated a strong out-of-sample predictive power, which provides supporting evidence for its predictive capability and external validity for similar research contexts of exploring teachers’ technology integration. It was found that SN positively predicted TPACK (SN→TPACK) and SE (SN→SE), which echoes the findings of previous studies, such as Jang et al. (2021), who examined in-service teachers’ integration of augmented reality and virtual reality techniques into teaching practices in elementary schools in South Korea, and Scherer et al. (2019), who identified the environmental and personal factors that determine the success of teacher technology integration using meta-analytic SEM. In line with previous studies (e.g., Dong et al., 2020; Joo et al., 2016), the impacts of SS on SE (SS→SE) and TS (SS→TS) were supported in the current study, indicating that backing and promotion of technology integration by school administrators could bolster teachers’ confidence and mitigate their technostress during the implementation of new technologies, such as LADs, in classrooms.
Results also showed that SS significantly predicted AB (SS→AB), which corroborates Atman Uslu and Usluel’s (2019) assertion that school support directly affected teachers’ utilisation of ICT in K-12 education. This was further substantiated by Hew and Syed A. Kadir (2017), who ascertained that school support is a fundamental prerequisite for the implementation of cloud computing and web 2.0 technologies for teaching purposes. Teachers’ SN was found to positively affect their BI (SN→BI) to employ the LAD, aligning with the findings of Wijnen et al.’s (2021) systematic review, which highlighted the significance of the social acceptability of e-learning technologies within K-12 educational contexts. With respect to the personal factors, namely TPACK, TS, and SE, we found that SE (SE→BI) and TS (TS→BI) significantly predicted BI, which resonates with findings from various previous studies (e.g., Chou & Chou, 2021; Joo et al., 2018; Panisoara et al., 2020). To illustrate, Chou and Chou (2021) underscored the pivotal role of self-efficacy and technostress in shaping K-12 teachers’ intent to persistently employ online teaching technologies, even beyond the COVID-19 pandemic. Apart from SS, our study also identified both TPACK (TPACK→AB) and BI (BI→AB) as significant predictors of AB. Similar results were obtained in a variety of empirical studies (e.g., Anthony et al., 2021; Hsu et al., 2021; Zhang & Chen, 2022). For instance, Anthony et al. (2021) pinpointed lecturers’ TPACK and intention to adopt technologies for teaching as key determinants of their actual usage of e-learning systems.
Additionally, this study supported the mediating role of TPACK in the linkage from SN to AB (SN→TPACK→AB). In other words, when teachers were exposed to social pressure (e.g., from peers or societal trends) to use LADs, they were more likely to incorporate LADs into their teaching with consideration of their pedagogy and subject matter, which subsequently led to increased actual usage of LADs. This finding implies the importance of peer influence in teachers’ adoption of emerging educational technologies. The mediation analyses also revealed that SN (SN→SE→BI) and SS (SS→SE→BI) had indirect effects on BI through SE. That is, teachers experiencing social pressure to use LADs or receiving school support from administrators are more likely to feel confident in actual LAD use within their classrooms, thereby developing stronger intentions towards LAD integration into teaching. Upon reviewing existing literature on technology acceptance and adoption, we discovered numerous studies providing evidence of the direct influences of SN on AB (e.g., see review by Wijnen et al., 2021) and of SN and SS on BI (e.g., Jeong & Kim, 2017; Jung et al., 2019; Porter & Graham, 2016). However, to the best of our knowledge, no studies have examined the mediation effects underlying these relationships.
A plausible explanation for such significant mediation effects could lie in the active environments we created—such as offering training sessions and establishing social media communities—where teachers could learn, discuss, and share ideas on incorporating LAD into their teaching activities, subsequently enhancing their TPACK and confidence in LAD integration. Our findings support the claim of Fishbein and Ajzen (2011) that actual behaviour is shaped by the confluence of personal competence and environmental support, a pattern particularly evident in our research context. Furthermore, in cultures characterised by collectivist traditions and Confucian values, which emphasise conformity and respect for authority, the decisions teachers make regarding technology integration into classroom settings may be directly or indirectly influenced by environmental factors (Huang et al., 2019; Huang & Teo, 2020; Teo et al., 2019). Specifically, the subjective opinions and tangible support from influential figures around them, such as school leaders, administrators, and colleagues, can play crucial roles (Huang & Teo, 2020).
6 Conclusion
To investigate in-service teachers’ integration of a mobile LAD for game-based CPS assessments in K-12 classrooms, the present study constructed and tested an integrated conceptual model based on the person-environment fit theory. This model was validated using PLS-SEM on survey data collected from 300 in-service K-12 teachers from ten schools in China. It was found that teachers’ subjective norms significantly influenced TPACK and self-efficacy, while school support significantly influenced technostress and self-efficacy. More importantly, our proposed model exhibited both strong in-sample explanatory power and out-of-sample predictive capability. In particular, behavioural intention was predicted by subjective norms, technostress, and self-efficacy, while actual behaviour was predicted by school support, TPACK, and behavioural intention. Our analysis results also highlighted the mediating roles of TPACK and self-efficacy. Specifically, TPACK mediated the impact of subjective norms on actual behaviour, and self-efficacy mediated the impacts of subjective norms and school support on behavioural intention.
Our findings yield theoretical implications for studies concerning teacher integration of advanced learning technologies, empowered by artificial intelligence and big data (e.g., learning analytics tools), into teaching practices. Grounded in the person-environment fit theory, this study advances the theoretical understanding of the factors that determine the extent to which teachers incorporate learning analytics applications into their teaching. Our study also extends the literature on teacher technology integration by uncovering the mediation effects of personal factors on the linkages from environmental factors to technology acceptance and adoption. The methodological implication of this study is underscored by its demonstration of how to assess a model’s predictive capability and external validity through out-of-sample predictive power.
The novelty of this study resides not only in the implementation of a LAD designed for CPS assessments in authentic classroom settings but also in the investigation of teachers’ acceptance and adoption of such emerging technologies within educational contexts. Given the interactive, interdependent, and temporal features inherent in CPS (Swiecki et al., 2020), it can be challenging for teachers to measure students’ CPS skills and performance at both individual and group levels during live instruction within physical classrooms. In this study, we introduced a solution leveraging a LAD, which provides teachers with immediate and actionable feedback on individual contributions and group performance. This equips teachers with the ability to implement an evidence-based, data-driven approach to teaching 21st century skills and delivering adaptive learning support. Consequently, teachers’ LAD-empowered teaching may improve student engagement and encourage better learning attainments. Technologically, beyond game-based learning, given the prevalence of CPS skills across educational contexts, our LAD holds the potential to support other active learning approaches (e.g., project-based learning, problem-based learning) across a diversity of STEM and non-STEM disciplines. Our study has also constructed a research model characterised by high explanatory capacity and external validity, which could be generalised to other contexts of educational technology integration. This model illuminates the intricate relationships among environmental factors, personal characteristics, and technology acceptance and adoption. In doing so, it encapsulates multiple critical elements that shape technology integration into teaching practices, which can inform the design, implementation, and evaluation of LADs for CPS assessments.
Our research findings lend themselves to practical recommendations for facilitating teachers’ usage of LADs in their teaching. Firstly, it is advisable for teachers to forge mutually beneficial virtual communities via social media. This would provide a constructive and relaxed atmosphere conducive to dialogue and problem solving, thereby fostering LAD integration into teaching practices. Secondly, schools can launch professional and technological training initiatives, inclusive of workshops, seminars, and certificate programs, with the objective of enhancing teachers’ TPACK, an essential prerequisite for the seamless and sustainable integration of LADs. These professional development programs should also develop teachers’ data literacy knowledge and skills, such as how to interpret data and formulate pedagogical responses (Liu et al., 2023; Khulbe & Tammets, 2023), particularly through capacity-building and reflective activities (Cui & Zhang, 2022). Thirdly, it is suggested that schools provide the required software, hardware, and timely assistance both in person and online, with researchers’ support. These endeavours can mitigate teachers’ technostress, build their confidence, and even directly affect their actual utilisation of LADs. Lastly, given the significant mediating roles of TPACK and self-efficacy, school administrators should pay close attention to the needs of teachers displaying inadequate TPACK and low confidence in LAD usage. This is particularly applicable to those holding traditional teaching conceptions (Tsai & Tsai, 2019) or possessing limited information and digital literacy (Lim, 2023).
This study has several limitations that should be addressed in future research. Firstly, all the participating teachers were from China, which might limit the generalisability of our research findings. Future researchers are encouraged to leverage our proposed model to investigate teachers’ integration of other emerging technologies in other sociocultural contexts. In particular, the cultural norms in different educational systems (e.g., collectivist versus individualist tendencies, respectively in Chinese and Western systems) and teachers’ cultural beliefs can also be considered influential factors in LAD adoption (Huang et al., 2019, 2021b; Teo & Huang, 2019). Secondly, although the proposed model has been validated in this study, the exclusive reliance on survey data might limit our understanding of the in-depth reasons behind teachers’ intentions and behaviours regarding LAD integration. Future investigations would benefit from gathering and analysing multimodal data (e.g., interviews and physiological signals) to corroborate and enrich our research findings. For instance, qualitative data and methodologies (e.g., document analysis of institutional policies) can reveal the extent to which intertwined policy-related and institutional factors (e.g., comprehensiveness of infrastructure) affect LAD integration (Broos et al., 2020). Finally, despite the use of PLS-SEM in our study to examine relationships among variables, this variable-centred method may not fully account for the influences of teachers’ individual characteristics (e.g., digital literacy levels) on the actual use of LADs. Person-centred methods (e.g., clustering analysis and finite mixture modelling) could be adopted to further probe how distinct teacher profiles contribute to variability in integrating LADs into classrooms.
Data availability
The raw datasets used in the current study are not publicly available due to ethics requirements, but the anonymized data are available from the corresponding author upon reasonable request.
References
Admiraal, W., Louws, M., Lockhorst, D., Paas, T., Buynsters, M., Cviko, A., & van der Ven, F. (2017). Teachers in school-based technology innovations: A typology of their beliefs on teaching and technology. Computers & Education,114, 57–68. https://doi.org/10.1016/j.compedu.2017.06.013
Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behaviour. Prentice-Hall.
Al-Fudail, M., & Mellar, H. (2008). Investigating teacher stress when using technology. Computers & Education,51(3), 1103–1110. https://doi.org/10.1016/j.compedu.2007.11.004
Anderson, J. C., & Gerbing, D. W. (1988). Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin,103(3), 411–423.
Andrews-Todd, J., & Forsyth, C. M. (2020). Exploring social and cognitive dimensions of collaborative problem solving in an open online simulation-based task. Computers in Human Behavior,104, 105759. https://doi.org/10.1016/j.chb.2018.10.025
Anthony, B., Kamaludin, A., & Romli, A. (2021). Predicting academic staffs behaviour intention and actual use of blended learning in higher education: Model development and validation. Technology, Knowledge and Learning,28, 1223–1269. https://doi.org/10.1007/s10758-021-09579-2
Archambault, L., & Crippen, K. (2009). Examining TPACK among K-12 online distance educators in the United States. Contemporary Issues in Technology and Teacher Education, 9(1), 71–88. Retrieved from https://www.learntechlib.org/primary/p/29332/
Atman Uslu, N., & Usluel, Y. K. (2019). Predicting technology integration based on a conceptual framework for ICT use in education. Technology, Pedagogy and Education,28(5), 517–531. https://doi.org/10.1080/1475939X.2019.1668293
Ayyagari, R., Grover, V., & Purvis, R. (2011). Technostress: Technological antecedents and implications. MIS Quarterly,35(4), 831–858. https://doi.org/10.2307/41409963
Azevedo, R., & Gašević, D. (2019). Analysing multimodal multichannel data about self-regulated learning with advanced learning technologies: Issues and challenges. Computers in Human Behavior,96, 207–210. https://doi.org/10.1016/j.chb.2019.03.025
Bandura, A. (1997). Self-efficacy: The exercise of control. W. H. Freeman and Company.
Bao, H., Li, Y., Su, Y., Xing, S., Chen, N. S., & Rosé, C. P. (2021). The effects of a learning analytics dashboard on teachers’ diagnosis and intervention in computer-supported collaborative learning. Technology, Pedagogy and Education,30(2), 287–303. https://doi.org/10.1080/1475939X.2021.1902383
Baucal, A., Jošić, S., Ilić, I. S., Videnović, M., Ivanović, J., & Krstić, K. (2023). What makes peer collaborative problem solving productive or unproductive: A qualitative systematic review. Educational Research Review, 100567. https://doi.org/10.1016/j.edurev.2023.100567
Broos, T., Hilliger, I., Pérez-Sanagustín, M., Htun, N. N., Millecamp, M., Pesántez-Cabrera, P., & De Laet, T. (2020). Coordinating learning analytics policymaking and implementation at scale. British Journal of Educational Technology,51(4), 938–954. https://doi.org/10.1111/bjet.12934
Calvo-Morata, A., Alonso-Fernández, C., Pérez-Colado, I. J., Freire, M., Martínez-Ortiz, I., & Fernandez-Manjon, B. (2019). Improving teacher game learning analytics dashboards through ad-hoc development. Journal of Universal Computer Science,25(12), 1507–1530. https://doi.org/10.3217/jucs-025-12-1507
Care, E., & Kim, H. (2018). Assessment of 21st century skills: The issue of authenticity. In E. Care, P. Griffin, & M. Wilson (Eds.), Assessment and teaching of 21st century skills: Research and applications (pp. 21–39). Springer.
Care, E., Scoular, C., & Griffin, P. (2016). Assessment of collaborative problem solving in education environments. Applied Measurement in Education,29(4), 250–264. https://doi.org/10.1080/08957347.2016.1209204
Chen, Y., Hmelo-Silver, C. E., Lajoie, S. P., Zheng, J., Huang, L., & Bodnar, S. (2021). Using teacher dashboards to assess group collaboration in problem-based learning. Interdisciplinary Journal of Problem-Based Learning,15(2), 1–23. https://doi.org/10.14434/ijpbl.v15i2.28792
Chen, Y., Bae, H., Saleh, A., Uttamchandani, S., Hmelo-Silver, C. E., Glazewski, K., ... & Lester, J. (2022). A real-time teacher dashboard for a game-based collaborative inquiry learning environment. In J. Oshima, T. Mochizuki, & Y. Hayashi (Eds.), General Proceedings of the 2nd Annual Meeting of the International Society of the Learning Sciences (pp. 32–35).
Chou, H. L., & Chou, C. (2021). A multigroup analysis of factors underlying teachers’ technostress and their continuance intention toward online teaching. Computers & Education,175, 104335. https://doi.org/10.1016/j.compedu.2021.104335
Cui, Y., & Zhang, H. (2022). Integrating teacher data literacy with TPACK: A self-report study based on a novel framework for teachers’ professional development. Frontiers in Psychology,13, 966575. https://doi.org/10.3389/fpsyg.2022.966575
Cukurova, M., Luckin, R., Millán, E., & Mavrikis, M. (2018). The NISPI framework: Analysing collaborative problem-solving from students’ physical interactions. Computers & Education,116, 93–109. https://doi.org/10.1016/j.compedu.2017.08.007
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science,35(8), 982–1003. https://doi.org/10.1287/mnsc.35.8.982
Dong, Y., Xu, C., Chai, C. S., & Zhai, X. (2020). Exploring the structural relationship among teachers’ technostress, technological pedagogical content knowledge (TPACK), computer self-efficacy and school support. The Asia-Pacific Education Researcher,29, 147–157. https://doi.org/10.1007/s40299-019-00461-5
Edwards, J. R., Caplan, R. D., & Harrison, V. R. (1998). Person-environment fit theory: Conceptual foundations, empirical evidence, and directions for future research. In C. L. Cooper (Ed.), Theories of organisational stress (pp. 28–67). Oxford University Press.
Faul, F., Erdfelder, E., Buchner, A., & Lang, A. G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41(4), 1149–1160. https://doi.org/10.3758/BRM.41.4.1149
Fernández-Batanero, J. M., Román-Graván, P., Reyes-Rebollo, M. M., & Montenegro-Rueda, M. (2021). Impact of educational technology on teacher stress and anxiety: A literature review. International Journal of Environmental Research and Public Health, 18(2), 548. https://doi.org/10.3390/ijerph18020548
Fiore, S. M., Graesser, A., Greiff, S., Griffin, P., Gong, B., Kyllonen, P., ... von Davier, A. A. (2017). Collaborative problem solving: Considerations for the National Assessment of Educational Progress. National Center for Education Statistics.
Fiore, S. M., Graesser, A., & Greiff, S. (2018). Collaborative problem-solving education for the twenty-first-century workforce. Nature Human Behaviour, 2(6), 367–369. https://doi.org/10.1038/s41562-018-0363-y
Fishbein, M., & Ajzen, I. (2011). Predicting and changing behaviour: The reasoned action approach. Taylor and Francis. https://doi.org/10.4324/9780203838020
Gomez, M. J., Ruipérez-Valiente, J. A., & Clemente, F. J. G. (2022). A systematic literature review of game-based assessment studies: Trends and challenges. IEEE Transactions on Learning Technologies, 16(4), 500–515. https://doi.org/10.1109/TLT.2022.3226661
Govender, R., & Mpungose, C. (2022). Lecturers’ technostress at a South African university in the context of coronavirus (COVID-19). Cogent Education, 9(1), 2125205. https://doi.org/10.1080/2331186X.2022.2125205
Graesser, A., Kuo, B. C., & Liao, C. H. (2017). Complex problem solving in assessments of collaborative problem solving. Journal of Intelligence, 5(2), 10. https://doi.org/10.3390/jintelligence5020010
Graesser, A. C., Fiore, S. M., Greiff, S., Andrews-Todd, J., Foltz, P. W., & Hesse, F. W. (2018). Advancing the science of collaborative problem solving. Psychological Science in the Public Interest, 19(2), 59–92. https://doi.org/10.1177/1529100618808244
Greiff, S., Wüstenberg, S., Csapó, B., Demetriou, A., Hautamäki, J., Graesser, A. C., & Martin, R. (2014). Domain-general problem solving skills and education in the 21st century. Educational Research Review, 13, 74–83. https://doi.org/10.1016/j.edurev.2014.10.002
Griffin, P. (2017). Assessing and teaching 21st century skills: Collaborative problem solving as a case study. In A. A. von Davier, M. Zhu, & P. C. Kyllonen (Eds.), Innovative assessment of collaboration (pp. 113–134). Springer.
Griffin, P., & Care, E. (2015). Assessment and teaching of 21st century skills: Methods and approach. Springer.
Hair, J. F., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2021). A primer on partial least squares structural equation modeling (PLS-SEM). Sage Publications.
Hesse, F., Care, E., Buder, J., Sassenberg, K., & Griffin, P. (2015). A framework for teachable collaborative problem solving skills. In P. Griffin & E. Care (Eds.), Assessment and teaching of 21st century skills: Methods and approach (pp. 37–56). Springer.
Hew, T. S., Syed, A., & Kadir, S. L. (2017). Applying channel expansion and self-determination theory in predicting use behaviour of cloud-based VLE. Behaviour & Information Technology, 36(9), 875–896. https://doi.org/10.1080/0144929X.2017.1307450
Hsu, C. Y., Liang, J. C., Chuang, T. Y., Chai, C. S., & Tsai, C. C. (2021). Probing in-service elementary school teachers’ perceptions of TPACK for games, attitudes towards games, and actual teaching usage: A study of their structural models and teaching experiences. Educational Studies, 47(6), 734–750. https://doi.org/10.1080/03055698.2020.1729099
Hu, X., Ng, J. T. D., & Chu, S. K. (2022). Implementing learning analytics in wiki-supported collaborative learning in secondary education: A framework-motivated empirical study. International Journal of Computer-Supported Collaborative Learning, 17(3), 427–455. https://doi.org/10.1007/s11412-022-09377-7
Huang, F., & Teo, T. (2020). Influence of teacher-perceived organisational culture and school policy on Chinese teachers’ intention to use technology: An extension of technology acceptance model. Educational Technology Research and Development, 68(3), 1547–1567. https://doi.org/10.1007/s11423-019-09722-y
Huang, F., Teo, T., Sánchez-Prieto, J. C., García-Peñalvo, F. J., & Olmos-Migueláñez, S. (2019). Cultural values and technology adoption: A model comparison with university teachers from China and Spain. Computers & Education, 133, 69–81. https://doi.org/10.1016/j.compedu.2019.01.012
Huang, L., Li, S., Poitras, E. G., & Lajoie, S. P. (2021a). Latent profiles of self-regulated learning and their impacts on teachers’ technology integration. British Journal of Educational Technology, 52(2), 695–713. https://doi.org/10.1111/bjet.13050
Huang, F., Sánchez-Prieto, J. C., Teo, T., García-Peñalvo, F. J., Olmos-Migueláñez, S., & Zhao, C. (2021b). A cross-cultural study on the influence of cultural values and teacher beliefs on university teachers’ information and communications technology acceptance. Educational Technology Research and Development, 69, 1271–1297. https://doi.org/10.1007/s11423-021-09941-2
Huang, L., Zheng, J., Lajoie, S. P., Chen, Y., Hmelo-Silver, C. E., & Wang, M. (2023). Examining university teachers’ self-regulation in using a learning analytics dashboard for online collaboration. Education and Information Technologies, 1–25. https://doi.org/10.1007/s10639-023-12131-7
Ifinedo, E., & Kankaanranta, M. (2021). Understanding the influence of context in technology integration from teacher educators’ perspective. Technology, Pedagogy and Education, 30(2), 201–215. https://doi.org/10.1080/1475939X.2020.1867231
Jang, J., Ko, Y., Shin, W. S., & Han, I. (2021). Augmented reality and virtual reality for learning: An examination using an extended technology acceptance model. IEEE Access, 9, 6798–6809. https://doi.org/10.1109/ACCESS.2020.3048708
Jeong, H. I., & Kim, Y. (2017). The acceptance of computer technology by teachers in early childhood education. Interactive Learning Environments, 25(4), 496–512. https://doi.org/10.1080/10494820.2016.1143376
Joo, Y. J., Lim, K. Y., & Kim, N. H. (2016). The effects of secondary teachers’ technostress on the intention to use technology in South Korea. Computers & Education, 95, 114–122. https://doi.org/10.1016/j.compedu.2015.12.004
Joo, Y. J., Park, S., & Lim, E. (2018). Factors influencing preservice teachers’ intention to use technology: TPACK, teacher self-efficacy, and technology acceptance model. Journal of Educational Technology & Society, 21(3), 48–59. Retrieved November 11, 2023, from https://www.jstor.org/stable/26458506
Jung, Y. J., Cho, K., & Shin, W. S. (2019). Revisiting critical factors on teachers’ technology integration: The differences between elementary and secondary teachers. Asia Pacific Journal of Education, 39(4), 548–561. https://doi.org/10.1080/02188791.2019.1620683
Kaimara, P., Fokides, E., Oikonomou, A., & Deliyannis, I. (2021). Potential barriers to the implementation of digital game-based learning in the classroom: Preservice teachers’ views. Technology, Knowledge and Learning, 26(4), 825–844. https://doi.org/10.1007/s10758-021-09512-7
Kaliisa, R., & Dolonen, J. A. (2023). CADA: A teacher-facing learning analytics dashboard to foster teachers’ awareness of students’ participation and discourse patterns in online discussions. Technology, Knowledge and Learning, 28(3), 937–958. https://doi.org/10.1007/s10758-022-09598-7
Kaliisa, R., Gillespie, A., Herodotou, C., Kluge, A., & Rienties, B. (2021). Teachers’ perspectives on the promises, needs and challenges of learning analytics dashboards: Insights from institutions offering blended and distance learning. In M. Sahin & D. Ifenthaler (Eds.), Visualisations and dashboards for learning analytics (pp. 351–370). Springer. https://doi.org/10.1007/978-3-030-81222-5_16
Khulbe, M., & Tammets, K. (2023). Mediating teacher professional learning with a learning analytics dashboard and training intervention. Technology, Knowledge and Learning, 28(3), 981–998. https://doi.org/10.1007/s10758-023-09642-0
Koh, J. H. L., Chai, C. S., & Lim, W. Y. (2017). Teacher professional development for TPACK-21CL: Effects on teacher ICT integration and student outcomes. Journal of Educational Computing Research, 55(2), 172–196. https://doi.org/10.1177/0735633116656848
Kristof-Brown, A., & Guay, R. P. (2011). Person-environment fit. In S. Zedeck (Ed.), Handbook of industrial/organisational psychology (Vol. 3, pp. 3–50). American Psychological Association.
Kristof-Brown, A. L., Zimmerman, R. D., & Johnson, E. C. (2005). Consequences of individuals’ fit at work: A meta‐analysis of person-job, person-organisation, person-group, and person-supervisor fit. Personnel Psychology, 58(2), 281–342. https://doi.org/10.1111/j.1744-6570.2005.00672.x
Kwon, K., Ottenbreit-Leftwich, A. T., Sari, A. R., Khlaif, Z., Zhu, M., Nadir, H., & Gok, F. (2019). Teachers’ self-efficacy matters: Exploring the integration of mobile computing device in middle schools. TechTrends, 63, 682–692. https://doi.org/10.1007/s11528-019-00402-5
Lam, S. F., Cheng, R. W. Y., & Choy, H. C. (2010). School support and teacher motivation to implement project-based learning. Learning and Instruction, 20(6), 487–497. https://doi.org/10.1016/j.learninstruc.2009.07.003
Lee-Cultura, S., Sharma, K., & Giannakos, M. (2023). Multimodal teacher dashboards: Challenges and opportunities of enhancing teacher insights through a case study. IEEE Transactions on Learning Technologies, 1–19. https://doi.org/10.1109/TLT.2023.3276848
Li, L., & Wang, X. (2021). Technostress inhibitors and creators and their impacts on university teachers’ work performance in higher education. Cognition, Technology & Work, 23, 315–330. https://doi.org/10.1007/s10111-020-00625-0
Li, Y., Zhang, M., Su, Y., Bao, H., & Xing, S. (2022). Examining teachers’ behavior patterns in and perceptions of using teacher dashboards for facilitating guidance in CSCL. Educational Technology Research and Development, 70(3), 1035–1058. https://doi.org/10.1007/s11423-022-10102-2
Lim, E. M. (2023). The effects of pre-service early childhood teachers’ digital literacy and self-efficacy on their perception of AI education for young children. Education and Information Technologies, 28, 12969–12995. https://doi.org/10.1007/s10639-023-11724-6
Liu, F., Ritzhaupt, A. D., Dawson, K., & Barron, A. E. (2017). Explaining technology integration in K-12 classrooms: A multilevel path analysis model. Educational Technology Research and Development, 65, 795–813. https://doi.org/10.1007/s11423-016-9487-9
Liu, Y., Huang, L., & Doleck, T. (2023). How teachers’ self-regulation, emotions, perceptions, and experiences predict their capacities for learning analytics dashboard: A Bayesian approach. Education and Information Technologies, 1–36. https://doi.org/10.1007/s10639-023-12163-z
Liu, Y., Ng, J. T. D., Hu, X., Ma, Z., & Lai, X. (2024). Adopt or abandon: Facilitators and barriers of in-service teachers’ integration of game learning analytics in K–12 classrooms? Computers & Education, 209, 104951. https://doi.org/10.1016/j.compedu.2023.104951
Martinez-Maldonado, R. (2019). A handheld classroom dashboard: Teachers’ perspectives on the use of real-time collaborative learning analytics. International Journal of Computer-Supported Collaborative Learning, 14, 383–411. https://doi.org/10.1007/s11412-019-09308-z
OECD (2017a). PISA 2015 assessment and analytical framework: Science, reading, mathematic, financial literacy and collaborative problem solving. OECD. https://doi.org/10.1787/9789264281820-en
OECD (2017b). PISA 2015 results (volume V): Collaborative problem solving. OECD. https://doi.org/10.1787/9789264285521-en
Özgür, H. (2020). Relationships between teachers’ technostress, technological pedagogical content knowledge (TPACK), school support and demographic variables: A structural equation modeling. Computers in Human Behavior, 112, 106468. https://doi.org/10.1016/j.chb.2020.106468
Panisoara, I. O., Lazar, I., Panisoara, G., Chirca, R., & Ursu, A. S. (2020). Motivation and continuance intention towards online instruction among teachers during the COVID-19 pandemic: The mediating effect of burnout and technostress. International Journal of Environmental Research and Public Health, 17(21), 8002. https://doi.org/10.3390/ijerph17218002
Petko, D., Prasse, D., & Cantieni, A. (2018). The interplay of school readiness and teacher readiness for educational technology integration: A structural equation model. Computers in the Schools, 35(1), 1–18. https://doi.org/10.1080/07380569.2018.1428007
Porter, W. W., & Graham, C. R. (2016). Institutional drivers and barriers to faculty adoption of blended learning in higher education. British Journal of Educational Technology, 47(4), 748–762. https://doi.org/10.1111/bjet.12269
Qian, M., & Clark, K. R. (2016). Game-based learning and 21st century skills: A review of recent research. Computers in Human Behavior, 63, 50–58. https://doi.org/10.1016/j.chb.2016.05.023
Rienties, B., Herodotou, C., Olney, T., Schencks, M., & Boroowa, A. (2018). Making sense of learning analytics dashboards: A technology acceptance perspective of 95 teachers. International Review of Research in Open and Distributed Learning, 19(5), 187–202. https://doi.org/10.19173/irrodl.v19i5.3493
Ruipérez-Valiente, J. A., Gomez, M. J., Martínez, P. A., & Kim, Y. J. (2021). Ideating and developing a visualisation dashboard to support teachers using educational games in the classroom. IEEE Access, 9, 83467–83481. https://doi.org/10.1109/ACCESS.2021.3086703
Saleh, A., Phillips, T. M., Hmelo-Silver, C. E., Glazewski, K. D., Mott, B. W., & Lester, J. C. (2022). A learning analytics approach towards understanding collaborative inquiry in a problem‐based learning environment. British Journal of Educational Technology, 53(5), 1321–1342. https://doi.org/10.1111/bjet.13198
Scherer, R., Siddiq, F., & Tondeur, J. (2019). The technology acceptance model (TAM): A meta-analytic structural equation modeling approach to explaining teachers’ adoption of digital technology in education. Computers & Education, 128, 13–35. https://doi.org/10.1016/j.compedu.2018.09.009
Schildkamp, K., Poortman, C., Luyten, H., & Ebbeler, J. (2017). Factors promoting and hindering data-based decision making in schools. School Effectiveness and School Improvement, 28(2), 242–258. https://doi.org/10.1080/09243453.2016.1256901
Schmid, M., Brianza, E., & Petko, D. (2021). Self-reported technological pedagogical content knowledge (TPACK) of preservice teachers in relation to digital technology use in lesson plans. Computers in Human Behavior, 115, 106586. https://doi.org/10.1016/j.chb.2020.106586
Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123–149. https://doi.org/10.1080/15391523.2009.10782544
Scoular, C., & Care, E. (2018). Teaching 21st century skills: Implications at system levels in Australia. In E. Care, P. Griffin, & M. Wilson (Eds.), Assessment and teaching of the 21st century skills: Research and applications (pp. 145–162). Springer.
Shin, W. S. (2015). Teachers’ use of technology and its influencing factors in Korean elementary schools. Technology, Pedagogy and Education, 24(4), 461–476. https://doi.org/10.1080/1475939X.2014.915229
Shmueli, G., Sarstedt, M., Hair, J. F., Cheah, J. H., Ting, H., Vaithilingam, S., & Ringle, C. M. (2019). Predictive model assessment in PLS-SEM: Guidelines for using PLSpredict. European Journal of Marketing, 53(11), 2322–2347. https://doi.org/10.1108/EJM-02-2019-0189
Siyam, N. (2019). Factors impacting special education teachers’ acceptance and actual use of technology. Education and Information Technologies, 24(3), 2035–2057. https://doi.org/10.1007/s10639-018-09859-y
Song, Y. (2018). Improving primary students’ collaborative problem solving competency in project-based science learning with productive failure instructional design in a seamless learning environment. Educational Technology Research and Development, 66, 979–1008. https://doi.org/10.1007/s11423-018-9600-3
Stieler-Hunt, C. J., & Jones, C. M. (2017). Feeling alienated – teachers using immersive digital games in classrooms. Technology, Pedagogy and Education, 26(4), 457–470. https://doi.org/10.1080/1475939X.2017.1334227
Sun, J. C. Y., & Liu, Y. (2022). The mediation effect of online self-regulated learning between engagement and cognitive load: A case of an online course with smart instant feedback. International Journal of Online Pedagogy and Course Design (IJOPCD), 12(1), 1–17. https://doi.org/10.4018/IJOPCD.295953
Sun, C., Shute, V. J., Stewart, A. E. B., Beck-White, Q., Reinhardt, C. R., Zhou, G., ... & D’Mello, S. K. (2022). The relationship between collaborative problem solving behaviors and solution outcomes in a game-based learning environment. Computers in Human Behavior, 128, 107120. https://doi.org/10.1016/j.chb.2021.107120
Swiecki, Z., Ruis, A. R., Farrell, C., & Shaffer, D. W. (2020). Assessing individual contributions to collaborative problem solving: A network analysis approach. Computers in Human Behavior, 104, 105876. https://doi.org/10.1016/j.chb.2019.01.009
Teo, T., & Huang, F. (2019). Investigating the influence of individually espoused cultural values on teachers’ intentions to use educational technologies in Chinese universities. Interactive Learning Environments, 27(5–6), 813–829. https://doi.org/10.1080/10494820.2018.1489856
Teo, T., & van Schaik, P. (2009). Understanding technology acceptance among preservice teachers: A structural-equation modeling approach. The Asia-Pacific Education Researcher, 18(1), 47–66. https://doi.org/10.3860/taper.v18i1.1035
Teo, T., Sang, G., Mei, B., & Hoi, C. K. W. (2019). Investigating pre-service teachers’ acceptance of web 2.0 technologies in their future teaching: A Chinese perspective. Interactive Learning Environments, 27(4), 530–546. https://doi.org/10.1080/10494820.2018.1489290
Tian, Q., & Zheng, X. (2023). Effectiveness of online collaborative problem-solving method on students’ learning performance: A meta‐analysis. Journal of Computer Assisted Learning, 1–16. https://doi.org/10.1111/jcal.12884
Tlili, A., Chang, M., Moon, J., Liu, Z., Burgos, D., Chen, N. S., & Kinshuk, K. (2021a). A systematic literature review of empirical studies on learning analytics in educational games. International Journal of Interactive Multimedia and Artificial Intelligence, 7(2), 250–261. https://doi.org/10.9781/ijimai.2021.03.003
Tlili, A., Hattab, S., Essalmi, F., Chen, N. S., Huang, R., Martínez, K., ... & Burgos, D. (2021b). A smart collaborative educational game with learning analytics to support English vocabulary teaching. International Journal of Interactive Multimedia and Artificial Intelligence, 6(6), 215–224. https://doi.org/10.9781/ijimai.2021.03.002
Tsai, P. S., & Tsai, C. C. (2019). Preservice teachers’ conceptions of teaching using mobile devices and the quality of technology integration in lesson plans. British Journal of Educational Technology, 50(2), 614–625. https://doi.org/10.1111/bjet.12613
Tsang, H. W. C., Liu, Y., & Law, N. (2020). An in-depth study of assessment of collaborative problem solving (CPS) skills of students in both technological and authentic learning settings (pp. 1381–1388). International Society of the Learning Sciences. Retrieved November 11, 2023, from https://repository.isls.org//handle/1/6340
Van Leeuwen, A. (2019). Teachers’ perceptions of the usability of learning analytics reports in a flipped university course: When and how does information become actionable knowledge? Educational Technology Research and Development, 67, 1043–1064. https://doi.org/10.1007/s11423-018-09639-y
Van Leeuwen, A., Rummel, N., & Van Gog, T. (2019). What information should CSCL teacher dashboards provide to help teachers interpret CSCL situations? International Journal of Computer-Supported Collaborative Learning, 14, 261–289. https://doi.org/10.1007/s11412-019-09299-x
Van Leeuwen, A., Knoop-van Campen, C. A. N., Molenaar, I., & Rummel, N. (2021). How teacher characteristics relate to how teachers use dashboards: Results from two case studies in K-12. Journal of Learning Analytics, 8(2), 6–21. https://doi.org/10.18608/jla.2021.7325
von Davier, A. A., & Halpin, P. F. (2013). Collaborative problem solving and the assessment of cognitive skills: Psychometric considerations. ETS Research Report Series, 2013(2), i–36. https://doi.org/10.1002/j.2333-8504.2013.tb02348.x
Weil, M. M., & Rosen, L. D. (1997). Technostress: Coping with technology @Work @Home @Play. Wiley.
Wijnen, F., van der Walma, J., & Voogt, J. (2021). Primary school teachers’ attitudes toward technology use and stimulating higher-order thinking in students: A review of the literature. Journal of Research on Technology in Education, 55(4), 545–567. https://doi.org/10.1080/15391523.2021.1991864
Wu, D., Yang, X., Yang, W., Lu, C., & Li, M. (2022). Effects of teacher- and school-level ICT training on teachers’ use of digital educational resources in rural schools in China: A multilevel moderation model. International Journal of Educational Research, 111, 101910. https://doi.org/10.1016/j.ijer.2021.101910
Zhang, M., & Chen, S. (2022). Modeling dichotomous technology use among university EFL teachers in China: The roles of TPACK, affective and evaluative attitudes towards technology. Cogent Education, 9(1), 2013396. https://doi.org/10.1080/2331186X.2021.2013396
Zhao, X., Lynch, J. G., & Chen, Q. (2010). Reconsidering Baron and Kenny: Myths and truths about mediation analysis. Journal of Consumer Research, 37(2), 197–206. https://doi.org/10.1086/651257
Zheng, J., Huang, L., Li, S., Lajoie, S. P., Chen, Y., & Hmelo-Silver, C. E. (2021). Self-regulation and emotion matter: A case study of instructor interactions with a learning analytics dashboard. Computers & Education, 161, 104061. https://doi.org/10.1016/j.compedu.2020.104061
Acknowledgements
The authors would like to extend their gratitude to the participating schools, teachers, and students.
Funding
This work was supported by the Research Grants Council of the HKSAR Government, #T44-707/16 N, under the Theme Based Research Scheme and Guangdong Planning Office of Philosophy and Social Science, China [Grant No. GD24YJY14].
Author information
Contributions
Yiming Liu: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Visualization, Writing - original draft, Writing - review & editing. Xiao Hu: Conceptualization, Funding acquisition, Methodology, Resources, Supervision, Writing - review & editing. Jeremy T. D. Ng: Investigation, Validation, Writing - review & editing. Zhengyang Ma: Investigation, Validation, Software, Visualization, Writing - review & editing. Xiaoyan Lai: Investigation, Validation, Writing - review & editing.
Ethics declarations
Conflicts of interest
There are no conflicts of interest to declare.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Liu, Y., Hu, X., Ng, J.T.D. et al. Ready or not? Investigating in-service teachers’ integration of learning analytics dashboard for assessing students’ collaborative problem solving in K–12 classrooms. Educ Inf Technol 30, 1745–1776 (2025). https://doi.org/10.1007/s10639-024-12842-5