Tracking Students’ Mental Engagement Using EEG Signals during an Interaction with a Virtual Learning Environment

Abstract

Monitoring students’ level of engagement during learning activities is an important challenge in the development of tutoring interventions. In this paper, we explore the feasibility of using electroencephalographic (EEG) signals to monitor the mental engagement index of novice medical students during a reasoning process. More precisely, our objectives were, first, to track the evolution of students’ mental engagement in order to investigate whether particular sections of the learning environment aroused the highest engagement levels and, second, to determine whether these sections had an impact on learners’ performance. Experimental analyses showed the same trends across the different resolution phases as well as across the different regions of the environment. However, we noticed a higher engagement index during the treatment identification phase, since it required more mental effort. Moreover, statistically significant effects were found between mental engagement and students’ performance.


1. Introduction

Endowing computer-based environments with the ability to monitor users’ experience is an important step toward improving human-computer interactions and understanding users’ needs [1] [2] [3]. More precisely, the use of neurophysiological measures such as heart rate, galvanic skin response, body temperature and electroencephalography (EEG) is continuously gaining importance as a means of assessing users’ behavior and mental state.

In particular, EEG data provide valuable quantitative and unbiased information on brain activity at a millisecond time scale. EEG has become a low-cost, non-invasive tool that is increasingly being used in a wide range of applications, including medical diagnosis (e.g. patient coma monitoring and epilepsy detection) [4] [5] [6], emotion recognition (e.g. boredom assessment) [7] [8] [9], affective modeling (e.g. uncertainty modeling, emotional state classification) [10] [11], performance assessment (e.g. outcome prediction, learners’ classification) [12] [13] [14] [15] [16] and users’ mental activity assessment [17] [18] [19].

Monitoring learners’ mental state is of primary interest, particularly in computer-based learning environments [20] [21] [22]. The ability to recognize and measure students’ attention during the learning process is an important part of successful knowledge acquisition, since it impacts their cognitive performance. Therefore, obtaining an accurate view of learners’ mental state may allow interactive learning systems to adjust the tutoring content, formulate appropriate help strategies and enhance learning outcomes.

In this paper, we recorded learners’ electrophysiological activity as they interacted with a medical serious game in order to track their mental state and assess their learning performance. We aim, first, to monitor EEG signals in order to explore how students’ mental engagement evolves across the different phases of the learning environment. Second, we examine whether being engaged during particular phases of the game, or paying attention to specific relevant areas of the environment, has an impact on the learners’ outcomes.

The remainder of the paper is organized as follows: Section 2 outlines previous work on students’ engagement assessment within computer-based learning environments, Section 3 describes our experimental setup, Section 4 discusses the obtained results and Section 5 presents the conclusion and future work.

2. Related Work

The key to a successful learning experience is the ability for educators to provide adequate assistance to learners. In this context, many researchers are showing a growing interest in assessing students’ mental engagement in an effort to keep students engaged during computer-based learning interactions [11] [23]. Engagement is a mental state that can be seen as attention, involvement, focus or interest [24] [25]. In the educational context, the engagement state consists of deploying all the mental mechanisms involved in information processing to achieve optimal learning performance [26]. Both qualitative and quantitative methods are used to measure students’ engagement.

Quantitative approaches are the most common measures. They are based on self-assessment tools, such as questionnaires and surveys, to be completed by the student. Self-report measures are used to assess students’ emotional and cognitive engagement [27] [28]. Pintrich and De Groot (1990) used a self-report questionnaire to measure students’ self-regulated learning components and how these relate to students’ performance and cognitive engagement. The results demonstrated a positive correlation between self-efficacy and cognitive engagement: students who self-regulate are more cognitively engaged and perform better than the others. These methods are practical, low cost and easy to use for large samples and distance learning [29]. However, the obtained data are not entirely reliable, since the responses may be biased. In fact, as learners fill in the self-report questionnaires themselves, the data are not objective enough to draw effective conclusions about students’ engagement. One of the proposed solutions to address these issues is the use of qualitative approaches.

Qualitative methods measure students’ engagement by means of various techniques such as observations, interviews and educators’ ratings [30] [31] [32]. Helme and Clarke (2001) used interview data to identify indicators of cognitive engagement during mathematics lessons in four classroom situations. The results revealed different patterns of cognitive engagement; for instance, learners showed a higher level of engagement in student-student interactions than in student-teacher interactions. Another study used classroom observations to help conceptualize students’ engagement and identify academic disengagement; the types of observations conducted in this work included discussions, projects and labs [33]. Although qualitative measures provide reliable indicators of students’ level of engagement, they remain time-consuming and non-scalable, since the data are gathered and analyzed by humans.

In recent years, an alternative approach to measuring students’ engagement has emerged: the use of physiological sensors. The prime advantage of using such sensing technologies in learning environments is that they can provide valuable quantitative data about the cognitive behavior of the learner, which is not directly observable. The use of physiological sensors such as skin conductance, heart rate and electroencephalography has proved effective in monitoring changes in learners’ mental state [34] - [39]. Boucheix et al. (2013) used eye tracking to study how different graphic representations can have an impact on students’ engagement and learning outcomes. Whitehill et al. (2014) used facial expressions for the automatic detection of students’ engagement. D’Mello, Chipman and Graesser (2007) used students’ posture to discriminate between low engagement (boredom) and high engagement (flow).

Among all the existing sensors, researchers have reported promising results when using EEG to measure students’ engagement. In fact, EEG can be a viable indicator of moment-to-moment changes in learners’ attention. In their tutoring system, Chaouachi and his colleagues (2015) automatically adapted the learning material according to learners’ attention and workload, measured using EEG signals. For instance, when the learner is disengaged, overloaded or underloaded, a worked example is given as the next activity in order to keep the learner engaged. In another study, EEG technology was used in an e-learning environment that detects individual mental effort using a mental state classification system [40]. The authors used a supervised learning technique to identify video segments during which students produced high mental effort. Both students and instructors can access these segments in order to improve learning outcomes: students can self-regulate their mental state, and instructors can adapt the learning activities accordingly.

In this paper, we propose to use EEG to track learners’ mental engagement while they are reasoning and solving different medical cases. We aim to assess how students’ engagement varies between the three phases of the problem-solving task and across the different areas of interest within the environment. Then we investigate whether learners’ engagement can have an impact on their reasoning outcomes.

3. Experimental Design

Fifteen undergraduate medical students (7 female), with a mean age of 21.8 ± 2.73 years (range 20 - 27 years), participated in the experiment and received a compensation of 20 dollars.

Upon arrival at the laboratory, participants were asked to sign a consent form explaining the experiment and the material. They were then outfitted with the Emotiv EEG headset and placed in front of the eye tracker to record, respectively, their brain activity and eye movements while interacting with our medical learning environment, called Amnesia. During the session, 30 - 45 minutes were dedicated to the game, and at the end, participants were invited to complete a post-game questionnaire to collect feedback about the ergonomics and usability of the game.

3.1. Amnesia: An Educational Video Game

Amnesia is a learning environment developed for novice medical students to assess their clinical abilities through different problem-solving tasks. The game features a virtual hospital where the player, who takes the role of a doctor, is mistakenly diagnosed with amnesia and trapped within the hospital. In order to escape, students must first solve some cognitive tasks, such as logic tests, and then prove their clinical skills by resolving six medical cases that were designed and validated with the help of a medical professional. Each medical case represents a different disease: flu, bacterial pneumonia, measles, Ebola, mumps and whooping cough.

The resolution task is divided into three phases: exploration, diagnostic and treatment; in each phase, students can also collect additional data, such as analyses and antecedents, to establish a diagnosis. The exploration phase is the first part of the clinical case resolution process, in which the student analyzes the patient’s demographic information as well as his/her clinical data (e.g. symptoms, antecedents, etc.). Once all the useful data are gathered, the objective of the diagnostic phase is to identify the correct disease from a list of six proposed ones; the student has up to three trials to find the correct response. After completing the diagnosis, the student is shown a list of different treatments and asked to find the appropriate one(s); participants are also given up to three attempts to discover the appropriate treatment.

Furthermore, in each medical case, we divided the resolution environment into different sections in order to analyze the level of engagement according to different areas of interest (AOI). For this purpose, we defined six specific sections, representing six task-relevant regions of the screen: Information (I), Antecedents (A), Symptoms (S), Analyses (N), Diagnosis (D) and Treatment (T). The Information AOI includes the demographic information of the patient (e.g. name, origin, weight and height). The Antecedents AOI presents the conditions the patient has had before (e.g. allergies). The Symptoms region includes all the symptoms related to the specific disease. The Analyses AOI presents other clinical data (e.g. temperature, heart rate and blood pressure). The Diagnosis area shows the different diagnoses offered to the student, among which he/she has to choose the correct answer. Finally, the Treatment AOI presents different proposals, among which the student should select the appropriate treatments.
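
To make the area-based analyses more concrete, the following sketch illustrates how gaze samples could be mapped to these six sections. It assumes a hypothetical screen layout: the bounding boxes and the aoi_of function are illustrative and do not reflect the actual coordinates of the Amnesia interface.

# Hypothetical AOI layout, for illustration only; the pixel coordinates below are
# placeholders and not the actual screen regions of the Amnesia environment.
AOI_BOXES = {
    "Information": (0, 0, 400, 200),      # (x, y, width, height) in pixels
    "Antecedents": (0, 200, 400, 200),
    "Symptoms":    (0, 400, 400, 200),
    "Analyses":    (400, 0, 400, 300),
    "Diagnosis":   (400, 300, 400, 150),
    "Treatment":   (400, 450, 400, 150),
}

def aoi_of(gaze_x, gaze_y):
    """Return the AOI containing a gaze point, or None if it falls outside all regions."""
    for name, (x, y, w, h) in AOI_BOXES.items():
        if x <= gaze_x < x + w and y <= gaze_y < y + h:
            return name
    return None

Each per-second engagement value can then be attributed to the region fixated at that moment, which is the grouping used in the analyses of Section 4.2.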

3.2. EEG Recordings

Participants’ brain activity was recorded using the Emotiv headset, which contains 16 electrodes placed according to the 10-20 international standard [41]. It simultaneously records 14 regions of the brain (O1, O2, P7, P8, T7, T8, FC5, FC6, F3, F4, F7, F8, AF3 and AF4). The two remaining electrodes, located at the P3 and P4 positions, are used as references: the Common Mode Sense (CMS) and the Driven Right Leg (DRL), respectively. The EEG data were recorded at a sampling rate of 128 Hz. The methodology of Chaouachi and his colleagues [42] [43] was used to compute, each second, an engagement index defined as a ratio of the three EEG frequency bands, namely θ (4 - 8 Hz), α (8 - 13 Hz) and β (13 - 22 Hz): Engagement index = β / (θ + α).

The three frequency bands were extracted by multiplying one second of the EEG signal by a Hamming window and applying a Fast Fourier Transform. Then, combined values of θ, α and β were computed by summing their values over all 14 measured regions. Finally, as in [44], the EEG engagement index at instant T is computed by averaging the per-second engagement ratios within a 40-second sliding window preceding T. This procedure is repeated every 2 s, with a new 40 s sliding window used to update the index.
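
As a concrete illustration of this pipeline, the sketch below computes the engagement index from raw EEG, assuming the recording is available as a NumPy array of shape (channels, samples) sampled at 128 Hz. The band limits, window lengths and update step follow the description above; the function and variable names are ours, not taken from the authors’ implementation.

import numpy as np

FS = 128                                                   # sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 22)}

def band_powers(epoch):
    """Band powers of a 1-second epoch (n_channels, FS), summed over bins and channels."""
    windowed = epoch * np.hamming(epoch.shape[1])          # Hamming window
    spectrum = np.abs(np.fft.rfft(windowed, axis=1)) ** 2  # power spectrum per channel
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1.0 / FS)
    return {name: spectrum[:, (freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

def engagement_ratios(eeg):
    """Per-second ratio beta / (theta + alpha) over the whole recording."""
    ratios = []
    for s in range(eeg.shape[1] // FS):
        p = band_powers(eeg[:, s * FS:(s + 1) * FS])
        ratios.append(p["beta"] / (p["theta"] + p["alpha"]))
    return np.array(ratios)

def engagement_index(ratios, window_s=40, step_s=2):
    """Average the per-second ratios over a 40 s window preceding each instant, every 2 s."""
    return np.array([ratios[t - window_s:t].mean()
                     for t in range(window_s, len(ratios) + 1, step_s)])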

4. Results and Discussion

In this study, EEG data were gathered and analyzed for all participants to investigate students’ mental state using the engagement brain index. The experimental results are divided into three parts: 1) we assess the evolution of students’ brain activity during the three phases of the clinical problem-solving task (exploration, diagnostic and treatment); 2) we analyze the distribution of the engagement index across the different AOIs; and 3) we investigate the impact of the engagement level on performance.

4.1. Engagement Index Evolution through the Resolution Process

Our first objective was to analyze how the engagement index evolves across the different phases of the resolution process for all participants, in order to investigate whether there was a particular period of time within the game that aroused the most attention. From Figure 1, we can clearly see that case 1 yielded the highest engagement index, especially at the beginning of the case. This result is expected, since the interface of the medical case resolution was shown for the first time to the participants: they needed to provide more mental effort to understand the different elements of the environment with which they had to interact. In case 3, we notice a decrease in the engagement index during the period in which the students had to identify the correct disease. This drop in attention affected their performance, since the number of failed attempts in this case was 21, representing 54% of the total number of attempts (see Table 1). The remaining cases show similar variations.

For a more detailed comparison, we plotted a bar chart to study the variation of the engagement index across the exploration, diagnostic and treatment phases.

Figure 1. EEG engagement evolution in each medical case.

Figure 2. Distribution of students’ engagement index across the different phases of the resolution process.

Table 1. Descriptive statistics for the medical case resolution.

Figure 2 shows the distribution of the engagement index in each phase of each medical case. We notice that in almost all cases the exploration phase caught the most attention, followed by the treatment phase and then by the diagnostic phase. In general, however, there is no important difference between the cases or the phases in terms of engagement level; the highest variation, about 30%, is observed in the first case. To support this conclusion, statistical testing was performed using analysis of variance (ANOVA) with repeated measures. No statistically significant results (p = n.s.) were found across the three phases, suggesting that the engagement index is not related to a particular phase or case. These findings underline the fact that the students were engaged in the same way throughout the reasoning process, from exploration to treatment identification.
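
For reference, the repeated-measures analysis reported above could be reproduced with a standard statistical package such as statsmodels. The sketch below assumes a long-format table with one mean engagement value per participant, case and phase; the file and column names are illustrative, not the authors’ actual data files.

import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one row per subject x case x phase, holding the
# mean EEG engagement index for that combination.
df = pd.read_csv("engagement_by_phase.csv")  # columns: subject, case, phase, engagement

res = AnovaRM(df, depvar="engagement", subject="subject",
              within=["case", "phase"]).fit()
print(res.anova_table)  # F statistics and p-values for case, phase and their interaction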

4.2. Engagement Index Distribution across the AOIs

The previous analyses showed that the engagement index varies in a similar way across all phases of the game, all cases combined. In our next investigation, we performed further analyses to examine how the engagement index is distributed across the AOIs, and whether there is a specific area among those identified (i.e. Information, Antecedents, Symptoms, Analyses, Diagnosis and Treatment) that may have an impact on students’ engagement.

Preliminary statistical comparisons were made in terms of engagement index. Table 2 shows the descriptive statistics, using mean values and standard deviations, in order to see in which areas there was a high level of engagement.

Table 2. Descriptive statistics for the engagement index per medical case, means (standard deviation).

In the first case, the highest value was obtained for the Information area. Indeed, this corresponds to the first time the participants interacted with the clinical environment, which is why they were more engaged and focused on understanding the tutoring material compared to the other regions. The Antecedents area was the least engaging in almost all cases, since it did not contain specific relevant material, except in the fourth medical case, where the engagement index was high. Indeed, in this case the diagnosis was Ebola, and the Antecedents area contained an important clue (“recently travelled to Guinea”); to correctly identify the disease, the students needed to pay particular attention to this information. Among the remaining areas, the Treatment AOI sparked the highest engagement level. Indeed, once the diagnosis was made, the students had to establish the appropriate treatments for the identified illness. A list of six different treatments was presented to the participants, and they were asked to choose the right ones (two or three, depending on the case). Hence, the students had to pay attention to all the suggestions in order not to make mistakes. As a result, this step required more concentration than the others, especially since the students were informed that the game would be over after three errors.

Based on these results, one-way ANOVAs were performed to examine whether there were significant differences among the AOIs in terms of engagement index. Three main effects were found, respectively, in the first case (F (5, 1312) = 7.905, p < 0.01), the third case (F (5, 738) = 4.495, p < 0.01) and the fifth one (F (5, 292) = 2.559, p < 0.05). This suggests that, in these cases, statistically significant differences exist between the six regions of interest in terms of mental engagement.
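
The per-case comparison can be expressed as a standard one-way ANOVA, for instance with SciPy. The sketch below assumes the engagement values recorded in each AOI for a given case have been grouped in a dictionary; this data layout is illustrative, not the authors’ code.

from scipy.stats import f_oneway

def aoi_anova(samples):
    """samples: dict mapping each AOI name to the engagement values observed in it."""
    f_stat, p_value = f_oneway(*samples.values())  # one-way ANOVA across the six AOIs
    return f_stat, p_value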

Post hoc tests with a Bonferroni adjustment were run for each of these cases to determine which specific regions raised the highest attention. Results indicate that in the first case, the engagement level was significantly higher in the Information area than in the Symptoms (p = 0.021), Diagnosis (p = 0.006) and Treatment (p < 0.001) areas. Significant differences were also found in case 3 for the Treatment area compared to the Symptoms (p = 0.010) and Analyses (p = 0.002) areas. Finally, in case 5, the pairwise comparisons showed no significant effects, indicating that there were no differences between the AOIs; indeed, as shown in Table 2, all the engagement index values in this case are relatively similar.
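
The Bonferroni-adjusted pairwise comparisons could be sketched as follows, using independent-samples t-tests corrected with a Bonferroni adjustment as an illustration; the exact post hoc procedure implemented by the authors’ statistics software may differ.

from itertools import combinations
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

def bonferroni_posthoc(samples, alpha=0.05):
    """Pairwise AOI comparisons with Bonferroni-adjusted p-values."""
    pairs = list(combinations(samples.keys(), 2))
    raw_p = [ttest_ind(samples[a], samples[b]).pvalue for a, b in pairs]
    reject, adj_p, _, _ = multipletests(raw_p, alpha=alpha, method="bonferroni")
    return {pair: (p, rej) for pair, p, rej in zip(pairs, adj_p, reject)}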

These results demonstrate that no particular AOI consistently aroused the most attention across the medical cases. These findings contrast with our previous work [2], where the statistical analyses showed the opposite effect in terms of fixation duration: one-way ANOVAs showed that the S region was the most fixated over all the medical cases. This suggests that longer fixations do not necessarily imply higher attention. Yet, does this imply that paying attention to specific relevant information can lead students to correctly resolve the medical cases? We explore this question in the next section: first, we check whether there are statistical relationships between engagement and performance in the medical case resolution (success or failure); second, we check whether the engagement level in each AOI has an impact on students’ performance.

4.3. Engagement Index and Performance

One-way ANOVAs were first conducted to compare two groups of learners (group 1: success; group 2: failure) in terms of engagement level throughout the case resolution. For cases 2 and 5, the analysis was not conducted since the group sizes were not balanced (all the participants succeeded in resolving these cases). For the remaining cases, the analysis yielded a main effect (p < 0.05), showing a significant difference in the students’ engagement index between the two groups, as depicted in Table 3. In cases 1, 4 and 6, we observed higher engagement levels for group 1 than for group 2. This suggests that being involved and concentrated in the task’s resolution leads to correct reasoning and thus a correct response: the more the learners were engaged, the more they were able to reach the solution. Yet, in case 3, another trend was observed: the highest engagement index was identified in group 2 (i.e. the participants who failed to resolve the medical case). One possible explanation is that, in that case, the learners were struggling to identify the diagnosis/treatment and therefore engaged more and provided more mental effort to reach the solution.

Moreover, within-subject repeated-measures ANOVAs were performed to investigate whether there is a specific region that may have an effect on students’ performance. No significant differences (p = n.s.) were found between the AOIs in terms of success/failure over all cases, suggesting that being engaged when focusing on a specific AOI did not have an impact on students’ performance.

Table 3. ANOVAs’ results summary.

These results confirm our hypothesis in [2] regarding fixation duration: longer fixation durations on specific information areas have no effect on learners’ performance. One can explain these results in different ways. A first explanation is that the fixated AOI may not contain important information, i.e. the student is engaged but not in the right direction, as in case 4, where learners had to pay particular attention to the Antecedents area of interest; however, the statistical results show that this was the area with the lowest engagement index (see Table 2). A second explanation is that the student may face difficulties in understanding some of the material and therefore tends to engage more and provide more mental effort when focusing on the areas that can help him/her find the right answer, such as the S or N regions.

To sum up, the first experimental analyses are in line with our assumptions. In case 1, and during the exploration phase, students were more likely to experience high mental engagement since it was the first medical case they had to resolve. Also, the displayed interface differed from the other scenes of the game, which made them more attentive in order to understand how to solve the case. No particular variation was identified for the other cases in terms of engagement index: the three resolution phases aroused almost the same level of attention. When solving the medical cases, participants needed to focus on particular regions of the screen. Therefore, we examined whether there were specific areas of interest where mental engagement was high. Similar trends were observed in most cases: learners were mostly engaged when focusing on the Treatment AOI. In fact, this part requires more mental effort, knowing that for each disease identified in each medical case, at least two treatments are needed. On the other hand, the Antecedents area is the least engaging, as it does not contain significant information to support the students’ reasoning process. Finally, in the second part of the experimental design, we hypothesized that a change in the students’ engagement level across the fixated AOIs could have an impact on learners’ performance. Results showed statistically significant relationships between students’ performance and engagement index. Nevertheless, based on the remaining analyses, we cannot conclude that there is a unique area that has an effect on students’ outcomes: being engaged when fixating an AOI, important or not, does not necessarily lead to success or failure in the medical case resolution. Indeed, in clinical problem-solving tasks, all the steps of the resolution are mandatory; for instance, students cannot be engaged only when reading the symptoms and forget about the antecedents, which may contain relevant information as well.

5. Conclusions

In this study, an experimental protocol was established to measure the mental engagement index of fifteen participants using EEG signals. Participants were shown six medical cases and asked to resolve them by identifying the correct diagnosis and treatment. The objective of this research was to draw a general overview of the students’ brain activity changes while reasoning, in order to subsequently develop appropriate help strategies. Indeed, tracking learners’ mental engagement is very important, especially in high-stakes medical learning environments, in order to constantly maintain an updated view of their progress and level of knowledge acquisition.

First, we assessed the evolution of the engagement index across the different phases of the resolution process and the different areas of interest, with the aim of identifying potential time frames or regions that could yield a high level of engagement. We found that the first case aroused a high level of engagement, especially during the exploration phase, which was the first scene exhibited to the participants. Then, we analyzed the relationship between mental engagement and learners’ performance. The results showed that engagement has, overall, a significant positive impact on students’ outcomes; however, this relationship was not modulated by paying more attention to a particular section of the game or to a particular area of interest.

In our future research, we propose to combine eye-tracking data with EEG in order to obtain a multimodal sensor-based assessment of students’ learning behavior. In the long term, we plan to provide novice medical students with timely interventions that foster their analytical reasoning process according to both their mental and visual behaviors.

Acknowledgements

This work was supported by the NSERC (Natural Sciences and Engineering Research Council) and the SSHRC (Social Sciences and Humanities Research Council) through the LEADS project.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Jraidi, I., Chaouachi, M. and Frasson, C. (2013) A Dynamic Multimodal Approach for Assessing Learners’ Interaction Experience. 15th International Conference on Multimodal Interaction, Sydney, 9-13 December 2013, 271-278.
https://doi.org/10.1145/2522848.2522896
[2] Ben Khedher, A., Jraidi, I. and Frasson, C. (2018) Static and Dynamic Eye Movement Metrics for Students’ Performance Assessment. Smart Learning Environments, 5, 14.
https://doi.org/10.1186/s40561-018-0065-y
[3] Ben Khedher, A., Jraidi, I. and Frasson, C. (2017) Assessing Learners’ Reasoning Using Eye Tracking and a Sequence Alignment Method. International Conference on Intelligent Computing, Liverpool, 7-10 August 2017, 47-57.
https://doi.org/10.1007/978-3-319-63312-1_5
[4] Roach, B.J. and Mathalon, D.H. (2008) Event-Related EEG Time-Frequency Analysis: An Overview of Measures and an Analysis of Early Gamma Band Phase Locking in Schizophrenia. Schizophrenia Bulletin, 34, 907-926.
https://doi.org/10.1093/schbul/sbn093
[5] Ben Hamida, S., Penzel, T. and Ahmed, B. (2015) EEG Time and Frequency Domain Analysis of Primary Insomnia. 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, 25-29 August 2015, 6206-6209.
[6] Loo, S.K., Lenartowicz, A. and Makeig, S. (2016) Use of EEG Biomarkers in Child Psychiatry Research: Current State and Future Directions. Journal of Child Psychology and Psychiatry, 57, 4-17.
https://doi.org/10.1111/jcpp.12435
[7] Soleymani, M., Asghari-Esfeden, S., Fu, Y. and Pantic, M. (2016) Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection. IEEE Transactions on Affective Computing, 7, 17-28.
https://doi.org/10.1109/TAFFC.2015.2436926
[8] Kim, J., Seo, J. and Laine, T.H. (2018) Detecting Boredom from Eye Gaze and EEG. Biomedical Signal Processing and Control, 46, 302-313.
https://doi.org/10.1016/j.bspc.2018.05.034
[9] Zhuang, N., Zeng, Y., Tong, L., Zhang, C., Zhang, H. and Yan, B. (2017) Emotion Recognition from EEG Signals Using Multidimensional Information in EMD Domain. BioMed Research International, 2017, Article ID: 8317357.
https://doi.org/10.1155/2017/8317357
[10] Wang, X.-W., Nie, D. and Lu, B.-L. (2014) Emotional State Classification from EEG Data Using Machine Learning Approach. Neurocomputing, 129, 94-106.
https://doi.org/10.1016/j.neucom.2013.06.046
[11] Jraidi, I., Chaouachi, M. and Frasson, C. (2014) A Hierarchical Probabilistic Framework for Recognizing Learners’ Interaction Experience Trends and Emotions. Advances in Human-Computer Interaction, 2014, Article ID: 632630.
https://doi.org/10.1155/2014/632630
[12] Lujan-Moreno, G.A., Atkinson, R.K. and Runger, G. (2016) EEG-Based User Performance Prediction Using Random Forest in a Dynamic Learning Environment. Intelligent Tutoring Systems: Structure, Applications and Challenges, 105-128.
[13] Jraidi, I., and Frasson, C. (2010) Subliminally Enhancing Self-Esteem: Impact on Learner Performance and Affective State. Intelligent Tutoring Systems, 11-20.
https://doi.org/10.1007/978-3-642-13437-1_2
[14] Van der Hiele, K., et al. (2007) EEG Correlates in the Spectrum of Cognitive Decline. Clinical Neurophysiology, 118, 1931-1939.
https://doi.org/10.1016/j.clinph.2007.05.070
[15] Jraidi, I., Chalfoun, P. and Frasson, C. (2012) Implicit Strategies for Intelligent Tutoring Systems. Intelligent Tutoring Systems, 1-10.
[16] Ben Khedher, A., Jraidi, I. and Frasson, C. (2018) Exploring Students’ Eye Movements to Assess Learning Performance in a Serious Game. EdMedia + Innovate Learning, 394-401.
[17] Aricò, P., Borghini, G., Di Flumeri, G., Colosimo, A., Pozzi, S. and Babiloni, F. (2016) A Passive Brain-Computer Interface Application for the Mental Workload Assessment on Professional Air Traffic Controllers during Realistic Air Traffic Control Tasks. Progress in Brain Research, 228, 295-328.
https://doi.org/10.1016/bs.pbr.2016.04.021
[18] Wang, S., Gwizdka, J. and Chaovalitwongse, W.A. (2016) Using Wireless EEG Signals to Assess Memory Workload in the n-Back Task. IEEE Transactions on Human-Machine Systems, 46, 424-435.
https://doi.org/10.1109/THMS.2015.2476818
[19] Keith, J.R., Rapgay, L., Theodore, D., Schwartz, J.M. and Ross, J.L. (2015) An Assessment of an Automated EEG Biofeedback System for Attention Deficits in a Substance Use Disorders Residential Treatment Setting. Psychology of Addictive Behaviors, 29, 17-25.
https://doi.org/10.1037/adb0000016
[20] Jraidi, I. and Frasson, C. (2013) Student’s Uncertainty Modeling through a Multimodal Sensor-Based Approach. Journal of Educational Technology & Society, 16, 219-230.
[21] Chen, C.-M., Wang, J.-Y. and Yu, C.-M. (2017) Assessing the Attention Levels of Students by Using a Novel Attention Aware System Based on Brainwave Signals. British Journal of Educational Technology, 48, 348-369.
https://doi.org/10.1111/bjet.12359
[22] Mills, C., Fridman, I., Soussou, W., Waghray, D., Olney, A.M. and D’Mello, S.K. (2017) Put Your Thinking Cap on: Detecting Cognitive Load Using EEG during Learning. Proceedings of the 7th International Learning Analytics & Knowledge Conference, Vancouver, 13-17 March 2017, 80-89.
https://doi.org/10.1145/3027385.3027431
[23] Henrie, C.R., Halverson, L.R. and Graham, C.R. (2015) Measuring Student Engagement in Technology-Mediated Learning: A Review. Computers & Education, 90, 36-53.
https://doi.org/10.1016/j.compedu.2015.09.005
[24] Nakamaru, S. (2011) Investment and Return. Journal of Research on Technology in Education, 44, 273-291.
https://doi.org/10.1080/15391523.2012.10782591
[25] Yang, Y.-F. (2011) Engaging Students in an Online Situated Language Learning Environment. Computer Assisted Language Learning, 24, 181-198.
https://doi.org/10.1080/09588221.2010.538700
[26] Azevedo, R. (2015) Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues. Educational Psychologist, 50, 84-94.
https://doi.org/10.1080/00461520.2015.1004069
[27] Chen, P.-S.D., Lambert, A.D. and Guidry, K.R. (2010) Engaging Online Learners: The Impact of Web-Based Learning Technology on College Student Engagement. Computers & Education, 54, 1222-1232.
https://doi.org/10.1016/j.compedu.2009.11.008
[28] Jaafar, S., Awaludin, N.S. and Bakar, N.S. (2014) Motivational and Self-Regulated Learning Components of Classroom Academic Performance. Journal of Educational Psychology, 82, 33-40.
[29] Fredricks, J.A. and McColskey, W. (2012) The Measurement of Student Engagement: A Comparative Analysis of Various Methods and Student Self-Report Instruments. In: Christenson, S.L., Reschly, A.L. and Wylie, C., Eds., Handbook of Research on Student Engagement, Springer US, Boston, 763-782.
[30] Wigfield, A., et al. (2008) Role of Reading Engagement in Mediating Effects of Reading Comprehension Instruction on Reading Outcomes. Psychology in the Schools, 45, 432-445.
https://doi.org/10.1002/pits.20307
[31] Helme, S. and Clarke, D. (2001) Identifying Cognitive Engagement in the Mathematics Classroom. Mathematics Education Research Journal, 13, 133-153.
https://doi.org/10.1007/BF03217103
[32] Alford, B.L., Rollins, K.B., Padrón, Y.N. and Waxman, H.C. (2016) Using Systematic Classroom Observation to Explore Student Engagement as a Function of Teachers’ Developmentally Appropriate Instructional Practices (DAIP) in Ethnically Diverse Pre-Kindergarten through Second-Grade Classrooms. Early Childhood Education Journal, 44, 623-635.
https://doi.org/10.1007/s10643-015-0748-8
[33] Turner, J.C., Christensen, A., Kackar-Cam, H.Z., Trucano, M. and Fulmer, S.M. (2014) Enhancing Students’ Engagement: Report of a 3-Year Intervention with Middle School Teachers. American Educational Research Journal, 51, 1195-1226.
https://doi.org/10.3102/0002831214532515
[34] Whitehill, J., Serpell, Z., Lin, Y., Foster, A. and Movellan, J.R. (2014) The Faces of Engagement: Automatic Recognition of Student Engagement from Facial Expressions. IEEE Transactions on Affective Computing, 5, 86-98.
https://doi.org/10.1109/TAFFC.2014.2316163
[35] Benlamine, S., Bouslimi, S., Harley, J., Frasson, C. and Dufresne, A. (2015) Toward Brain-Based Gaming: Measuring Engagement during Gameplay. EdMedia: World Conference on Educational Media and Technology, Montréal, 22-25 June 2015, 717-722.
[36] Berka, C., et al. (2007) EEG Correlates of Task Engagement and Mental Workload in Vigilance, Learning, and Memory Tasks. Aviation, Space, and Environmental Medicine, 78, B231-B244.
[37] D’Mello, S., Chipman, P. and Graesser, A. (2007) Posture as a Predictor of Learner’s Affective Engagement. Proceedings of the 29th Annual Meeting of the Cognitive Science Society, Nashville, 1-4 August 2007, 905-910.
[38] Pham, P. and Wang, J. (2015) Attentive Learner: Improving Mobile MOOC Learning via Implicit Heart Rate Tracking. International Conference on Artificial Intelligence in Education, Madrid, 21-25 June 2015, 367-376.
https://doi.org/10.1007/978-3-319-19773-9_37
[39] Boucheix, J.-M., Lowe, R.K., Putri, D.K. and Groff, J. (2013) Cueing Animations: Dynamic Signaling Aids Information Extraction and Comprehension. Learning and Instruction, 25, 71-84.
https://doi.org/10.1016/j.learninstruc.2012.11.005
[40] Lin, F.-R. and Kao, C.-M. (2018) Mental Effort Detection Using EEG Data in E-Learning Contexts. Computers & Education, 122, 63-79.
https://doi.org/10.1016/j.compedu.2018.03.020
[41] Klem, G.H., Lüders, H.O., Jasper, H.H. and Elger, C. (1999) The Ten-Twenty Electrode System of the International Federation. The International Federation of Clinical Neurophysiology. Electroencephalography and Clinical Neurophysiology, 52, 3-6.
[42] Chaouachi, M., Jraidi, I. and Frasson, C. (2015) MENTOR: A Physiologically Controlled Tutoring System. In: User Modeling, Adaptation and Personalization, Springer, Berlin, 56-67.
https://doi.org/10.1007/978-3-319-20267-9_5
[43] Chaouachi, M., Jraidi, I. and Frasson, C. (2011) Modeling Mental Workload Using EEG Features for Intelligent Systems. In: User Modeling, Adaption and Personalization, Springer, Berlin, 50-61.
https://doi.org/10.1007/978-3-642-22362-4_5
[44] Chaouachi, M., Chalfoun, P., Jraidi, I. and Frasson, C. (2010) Affect and Mental Engagement: Towards Adaptability for Intelligent Systems. 23rd International FLAIRS Conference, Florida, 19-21 May 2010, 6.
