From Teaching Excellence to Expertise Development: A Pedagogical Framework for Developing Expertise

Abstract

Given the challenges that the Covid-19 pandemic has brought about, many organizations are reconfiguring their operations to lower costs and raise productivity. One increasingly viable approach is to use artificial intelligence (AI) which, as it is deployed more widely, will transform the post-pandemic workplace accordingly. If new graduates are to be ready for work in these new organizational contexts, it behooves educationists to transform teaching from an instructor-centric model that focuses on “teaching excellence” to a learner-centric one that focuses on developing the kind of expertise that will be particularly needed in the AI-enabled workplace. This paper proposes a pedagogical framework for expertise development built upon two concepts, viz., metacognitive development and deliberate practice. The author has put this framework to actual use over four semesters to teach an introductory course in organizational behavior (OB) at a university in Singapore. Instructors can use this framework to develop learners who have a sound understanding of AI through a business lens, learners who can become workers with the skill sets and requisite expertise to excel in the AI-enabled organizations of the future.

1. Introduction

Given the Covid-19 pandemic, many organizations are having to reconfigure their operations to lower costs and raise productivity. One increasingly viable solution for many organizations is to use artificial intelligence (AI), which is now being deployed ever more widely. But even before the Covid-19 pandemic side-swiped the world, a new workplace enabled increasingly by AI, machine learning, and robotics was already emerging, whether decision-makers in organizations were sensitized to it or not.

We regard thinking, emotion, and creativity as qualities unique to humans, but the coming of human-like robots may invalidate such an idea. Already there are so-called Einstein robots that are able to recognize hundreds of human facial expressions. That ability enables them to interact with humans at the emotional level and converse with people while maintaining eye contact with their human interlocutors, updating their knowledge about those humans because they can learn. As their makers have included personalities in their algorithms, these robots with AI-empowered cognitive abilities are becoming more and more human (Zhao & Liu, 2018; Crowe, 2017). A robot named YuMi has performed in Pisa, Italy, actually conducting an orchestra, with the world-famous tenor Andrea Bocelli performing alongside it on the occasion (Ong, 2017). In the Kodaiji temple in Kyoto, Japan, a robot priest named Mindar can preach on the Buddhist scripture called the Heart Sutra (Hardingham-Gill, 2019; Samuel, 2020). Finally, there is a robot artist named Ai-Da, whose works will be exhibited at the Design Museum in London in May 2021. But alongside these almost fantastical robots, we already have in our daily lives virtual assistants like Alexa or Siri, as well as chatbots that can respond instantly to frequently asked questions.

More mundane examples are already seen in Singapore, where AI-enabled autonomous robot couriers are being deployed to deliver groceries that consumers buy, whether in-store or online (Tan, 2021). Even a student start-up in a Singapore university has been using autonomous robots to deliver cooked food since June 2020 (Chong, 2021). In addition, other AI-enabled autonomous robots are being tried out at food courts in Singapore to collect dirty dishes, clean floors, inspect false ceilings, disinfect lift panels, and even map the density of mosquitoes in surrounding areas (Choo, 2020).

As AI systems creep into the economy, there will be rising market demand for people who can perform those tasks that call for non-routine cognitive skills, e.g., managers, engineers and health professionals, but also people who can perform the dirty and dangerous tasks that call for non-routine manual skills, e.g., roofers, plumbers, artisans. By the same token, market demand will fall for workers who can only perform tasks that require merely routine manual and cognitive abilities, e.g., clerks, machine operators, and assemblers.

With predictive and self-maintaining machines at factory level that can communicate not only with each other but also with suppliers and customers, industry based on Internet-of-Things may see not only customized but even personalized manufacturing. New services will be needed in R & D, robotics, and data analytics. These things together could lead to more localized production, especially when 3-D printing can be scaled up, which might mean extended supply chains could be a thing of the past.

AI in autonomous transportation might help ameliorate the traffic congestion that bedevils many cities, which ought to improve productivity. AI will likely improve healthcare technology given that it will boost monitoring and diagnostic capabilities, making for personalized medicine. But if these gains translate into higher healthcare costs, then the impact will be adverse for the socioeconomically less well off.

The quick survey above of how AI is changing the economy is obviously neither comprehensive nor granular, but it is enough to show that there will be a need for new workers with the kind of expertise appropriate to an AI-enabled workplace. It therefore behooves instructors to design their courses in ways that help students develop that sort of expertise; that is, instructional design will need to focus on developing the kind of expertise that new workers will need in the AI-enabled workplace.

How may pedagogies be tweaked to future-proof workers for an AI-prevalent workplace? According to the World Economic Forum report entitled The Future of Jobs 2020, “skills gaps continue to be high as in-demand skills across jobs change in the next five years. The top skills and skill groups which employers see as rising in prominence in the lead up to 2025 include groups such as critical thinking and analysis as well as problem-solving, and skills in self-management such as active learning, resilience, stress tolerance and flexibility” (World Economic Forum, 2020: p. 35). One would also add ‘creativity’ to this list of desirable skills, so that workers may readily innovate when opportunities present themselves in the workplace and creative solutions may transform business operations and processes. In a 2016 report entitled “The new basics: Big data reveals the skills young people need for the new work order,” the Foundation for Young Australians identified transferable “enterprise skills” that include problem solving, critical thinking, communication, teamwork, and presentation skills (FYA, 2016). Likewise, Rampersad (2020) found that the factors significant in driving innovation among students included critical thinking, problem solving, communication, and teamwork. These qualities will clearly remain relevant in an AI-enabled workplace where the human worker performs those tasks that robots/AI cannot yet do on their own, or those tasks that are to be done in collaboration with robots.

In view of all that has been sketched above, this paper proposes an expertise development pedagogical framework, whose implementation is described for the delivery of an undergraduate OB course over the past few semesters at a medium-sized university in Singapore. In Section 2 immediately following, the case is made for moving away from the traditional focus on the “teaching excellence” of instructors to a new focus on “expertise development” in students. The proposed framework is adumbrated in Section 3, while the actual implementation of said framework in a real classroom setting is described in Section 4. Section 5 provides some qualitative evidence of the framework’s effectiveness, while Section 6 provides a quick recapitulation of how the framework may help in advancing metacognitive development and deliberate practice. The paper then concludes.

2. From Teaching Excellence to Expertise Development

Heretofore, educational institutions have tended to focus on the teaching of their instructors, hence the ubiquitous Teacher of the Year award; but such a focus may well be afflicted with the unintended but predictable consequence of crowd-pleasing on the part of some instructors. The ones assessing instructors in this case are their students, young people who are not able to look at pedagogical approaches with any criticality since students have, by definition, no training to do so. As such, instructors may become risk-averse and shrink away from challenging their students to think harder or from experimenting with innovative pedagogies that make students work harder (Gourlay & Stevenson, 2017). They might well feel that it would be wiser to stay with the tried and tested pedagogical approaches that most students may be comfortable with.

Student feedback is usually more a measure of a teacher’s popularity, which may come down to a teacher’s demeanor, personableness, friendliness, approachability, and leniency. As the Best Teacher of the Year award is inherently competitive, it tends to reward performativity (Behari-Leek & McKenna, 2017; Saunders & Ramirez, 2017), while that which students really need, i.e., domain expertise, may go unattended. Collegiality may also suffer if competition for teaching excellence awards leads to tension among instructors (Bahia et al., 2017), all of which may well lead to mediocrity (Morley, 2003).

Given the pace of technological change that AI is fostering, instructors, whether in contention for Best Teacher awards or not, must look beyond the mere imparting of domain knowledge to students. Instead, they ought to be aiming to develop relevant expertise in their students so that the latter can function well in AI- and robotics-enabled workplaces. Thus, instructors must look for ways to design their courses so that their students, who are novices, can be taken on a journey to become workers with the right kind of expertise.

Clearly, expertise is not something that novices have, for otherwise they would be experts. Novices differ from experts in several dimensions (Persky & Robinson, 2017). Obviously, experts know more than novices but, more importantly, experts have better knowledge structures that help them organize their knowledge, integrate new knowledge into the structure, and analyze new contexts to fit into the structure. Better knowledge structures enable experts to access their knowledge more efficiently than novices. Moreover, experts also use their knowledge more deftly to interpret information, analyze situations, and develop solutions to problems. Thus, a retired physician who has a time-tested structure of medical knowledge in his head can understand and use new facts about the Covid-19 pandemic as they become known whereas a person who is not medically trained has no such structure upon which to hang these new facts about evolving pandemic scenarios to make good sense of them.

However, the accumulation of experience or of knowledge per se does not automatically lead to expertise development (Ericsson, 2006; Chi, Glaser, & Farr, 1988; Swanson, O’Connor, & Cooney, 1990). How then can instructors help learners develop domain-related expertise? The answer may well pivot on framing one’s pedagogy to incorporate two key components, namely, metacognitive development and deliberate practice (Persky & Robinson, 2017). Recent technological advancements and the emergence of new fields of knowledge have made it necessary for students to be knowledgeable about these developments in addition to their own disciplines. Above all, they must acquire the skills to learn new things quickly, for which purpose developing metacognition is critical, metacognition being “knowledge and cognition about cognitive phenomena” (Flavell, 1979: p. 906).

For Tarricone (2011), metacognition is both the knowledge and regulation of cognition, while Flavell & Wellmann (1977) emphasize how metacognition is the knowledge of cognitive processes themselves. Metacognition helps students to organize and regulate their learning (Carneiro, 2007) and also gets them to concurrently develop their cognitive skills (Gourgey, 2001; Hartman, 2001). It improves student ability to apply what they know to different contexts.

Metacognition is not automatic (Bransford, Brown, & Cocking, 2000) or reflexive. Instead, it must be intentionally fostered and purposively applied in specific contexts for a particular topic, domain, or discipline (Zohar & David, 2009). For Pintrich (2002), it is important to distinguish metacognitive knowledge from metacognitive control and self-regulatory processes. The former is what one knows about one’s cognition, including “knowledge of general strategies that might be used for different tasks, knowledge of conditions under which the strategies are effective, and knowledge of self” (p. 219). The latter comprises “cognitive processes that learners use to monitor, control and regulate their cognition and learning” (p. 220).

It may be useful to think of three types of metacognitive knowledge, namely, “strategic knowledge,” “knowledge about cognitive tasks,” and “self-knowledge” (Pintrich, 2002). The first is that which is not specific to any domain of knowledge. Instead, it comprises generic ways to learn, cogitate and develop solutions to problems. This “meta-strategic knowledge” (Zohar & David, 2009: p. 179) is generic knowledge about strategies for higher order thinking.

The second type of metacognitive knowledge is knowledge about cognitive tasks: “Because not all strategies are appropriate for all situations, the learner must develop some knowledge of the different conditions and tasks where the different strategies are used most appropriately” (Pintrich, 2002: p. 221).

The third type of metacognitive knowledge is self-knowledge, which includes being aware of the areas in which one may be strong and those in which one might be weak, how deep and how broad one’s knowledge base might be, and also how one is motivated and how self-efficacious one is (Flavell, 1979).

Deliberate practice, the second component of the proposed pedagogical framework, involves highly structured activities that have been especially designed to improve performance. One who is practicing but is not doing it deliberately does that which one already knows how to do. One who is practicing deliberately does that which one does not do well or that which one cannot do at all. Deliberate practice involves specific, substantive, and constant effort, so learners must not only “practice deliberately but also think deliberately” about that practicing while in the process of practicing itself (Ericsson, Prietula, & Cokely, 2007: p. 118). Only by thinking about what one is practicing may one practice deliberately.

Regardless of what specific domain knowledge is involved, it is only practicing something that one does not do well that can transform one from a novice into an expert (Ericsson, Prietula, & Cokely, 2007). That is why military schools run war games, and law schools run moot courts. These exercises give students ample opportunity to experience and practice, over and over again, the crucial facets of a work situation, whether fighting or litigating, say, while bettering their performance step by step by attending to feedback that their instructors and peers may provide (Ericsson, Krampe, & Tesch-Römer, 1993). Of course, there is no assurance that repeated practice will propel one up into the highest performance levels in and of itself. Instead, learners must look at their own learning metacognitively (Campitelli & Gobet, 2011).

To change gears a bit and consider learning to golf, for instance: the golfing novice studies or is taught the basic strokes, and she will typically take great pains to avoid making blatant mistakes, such as hitting the ball in the direction of a fellow golfer in the group ahead of her on the fairway. She might practice diligently at the putting green as well as the driving range. She might play nine holes, half a full game, probably with other novices and other amateurs who have never gotten very far with their game anyway. It is said that after perhaps just fifty hours or so of such practice, her game will have improved as much as it ever will; afterwards, more practice simply makes her strokes more and more automatic. From then on, she will play intuitively, without overthinking the process, and golf becomes a welcome social engagement. On and off, she may need to focus on how she is actually hitting the ball, but more time on the green will not see her game significantly improve for decades to come, because there is no deliberate learning (Ericsson, Prietula, & Cokely, 2007) and, arguably, no attending to her own metacognition.

Such golfers never quite make it to higher performance levels in a game that is self-paced and lasts for hours, with extended intervals between shots during which golfers could, in fact, focus on their own metacognitive processes involved in the game. They could reflect upon those processes, and plan what and how to hit the ball better next time, if they are to improve their game (Singer, 2002). Deliberative thought processing while doing a task matters (Ericsson & Kintsch, 1995). Golfers who are tasked to putt the ball in a lab setting have demonstrated that their thought processes range to-and-fro to take stock of the situation; plan how to execute what the situation requires of them; go over the actual situation again; and then, and only then, prepare psychologically and physically to actually do the putting (Eccles & Arsal, 2017).

Ideally, deliberate practice must go along with metacognitive development which, among other things, has been recognized as key to how elite sportspeople perform at world-class levels (MacIntyre, Igou, Campbell, Moran, & Matthews, 2014). These are persons whose motor skills far exceed those of ordinary mortals, who cannot metacognize about, reflect upon, and plan their actions to the same degree (MacIntyre et al., 2014). Elite sportspersons, who have the expertise to perform at very high levels, have acquired domain-specific representations and working memory skills, which support specialized planning, reasoning, and evaluation (Ericsson & Kintsch, 1995).

In addition to developing their metacognition about, reflecting upon, and planning what their actions will be, these elite performers may also deliberately shift their thinking to-and-fro between their long-term memory and whatever is happening in their immediate environment (Ericsson & Kintsch, 1995). It has been shown that better golfers’ pre-shot and post-shot cognitive processes involve knowing both what self-regulating strategies work best for oneself and when to use which ones (Whitehead & Jackman, 2021). As such, thinking about which strategy to use and when to use it, pre-shot and post-shot, could well help golfers self-regulate much better (Flavell, 1979). Likewise, it has been shown that planning and reasoning about one’s performance, as well as evaluating problems as they arise during a game, matter a lot to how well very good tennis players perform on the court (McPherson, 1999). The same holds for the performance of triathletes (Baker et al., 2005) and Australian rules footballers (Elliott et al., 2020).

These examples from sports, especially elite sports where the star performers must necessarily be very good learners to get to where they are, suggest that the pedagogical design of a course ought to promote metacognitive development and enable deliberate practice concurrently. For the former, instructors must strive to make the thought processes that may be involved quite explicit. For the latter, the design should require students to acquire the relevant background knowledge, prompt them to make meaningful connections with previously learned content, and also reinforce their learning through repeated recalls and reviews. In other words, the course design should encourage students to search for relevant contextual information and repeatedly retrieve knowledge they have already acquired. Such repeated information seeking, and repeated knowledge retrieval could be incorporated into structured learning activities and assignments to be completed outside of class. All these ideas can now be put together in an “expertise development pedagogical framework” in the section following.

3. An Expertise Development Pedagogical Framework

Figure 1 presents an overview of an expertise development pedagogical framework that the author has implemented in an introductory course in OB, one that has now been delivered to four cohorts of undergraduate business students in a university in Singapore, over as many semesters. This course was designed so that the development of several skill sets is imbricated into the very learning of OB domain knowledge itself. Specifically, metacognitive development is incorporated into the design of assessment methods and their associated assessment rubrics, while deliberate practice is incorporated into the design of several types of active learning activities.

The framework consists of three elements that are tightly linked to one another. The first element specifies clearly the learning outcomes of a particular course. The second element specifies how the performance for the course will be assessed, i.e., what assignments students will have to do and how the assignments will be graded. The third element provides details on how the course will be delivered, i.e., its mode(s) of delivery, and its kinds of learning activities.

In terms of delivery modes, one may adopt a flipped-classroom approach for in-person classes or run classes in a hybrid-mode with both in-person classes and built-in e-learning components. In terms of the kinds of learning activities to be used in a class, instructors could choose to conduct a pop-up quiz, run a fun game, or get students to role play.

Figure 1. An expertise development pedagogical framework.

The three elements must be well aligned. That means the second element must truly assess student performance in terms of the learning outcomes stated in the first element. And the third element must comprise learning activities that facilitate student acquisition of the requisite domain knowledge as well as the skill sets essential for them to complete their course assignments as required in the second element. Figure 1 shows how these three elements make up together the framework proposed.

3.1. The First Element: Learning Outcomes

The first design element involves the explicit stating of what the key learning outcomes of a particular course will be, i.e., the acquisition of OB expertise in my example, and laying out what that acquisition actually entails. These unambiguous statements permit both instructor and students to know clearly what skill sets are essential for acquiring OB expertise, viz., critical thinking skills; knowledge acquisition skills; self-learning and knowledge sharing skills; collaborative learning skills; design thinking skills; and reflective learning skills (Figure 1).

Next, the thinking processes involved and the behavioral manifestations expected for each skill set are made explicit in the design of the assessment rubrics. For examples, see Table 2.

3.2. The Second Element: Learning Assessments

The second design element deals with formal learning assessments. Here, the assessment methods, their associated learning goals, assignment instructions, and assessment rubrics are clearly stated. The design of appropriate assessments and the provision of feedback are important in determining whether the course design and its learning activities are effective in achieving its stated learning outcomes. Whether course assessments are appropriate or not might be determined by the extent to which they provide students with opportunities to demonstrate the necessary skill sets.

The six assessment methods for said OB course are listed in Table 1. I name them as follows: the critical thinking assignment; the teamUP case analysis; the individual lightUP knowledge sharing assignment; the design thinking project; the end-of-semester My Reflective Learning assignment; and the voiceUP assessment of participation and contribution. (Giving these assignments their own names facilitates referring to and communicating about them during the entire semester.)

First, the critical thinking assignment is a learning activity to develop critical thinking skills in students (Lang, 2017). The thought processes for critical thinking are laid out explicitly in a critical thinking rubric. For this assignment, students are expected to write a critique of an assigned Harvard Business Review article about real-life business issues. To do well in this assignment, students are expected to use the criteria stated in the critical thinking rubric to guide their thinking processes. To enrich their understanding of the assigned article, students are expected to search for other viewpoints and theoretical perspectives that different experts may have about the subject matter of the article concerned. By doing so, students develop a more comprehensive understanding of the subject matter and become equipped with the necessary knowledge to critically analyze the assigned article. By going through a process of examining alternative perspectives, assessing the validity and reliability of various arguments, investigating the strength of evidence offered in support of an argument, and reconciling different viewpoints, students learn the inherent complexities of a particular real-life issue and so develop a more nuanced understanding of the subject.

Table 1. Assessment methods.

Second, the teamUP assessment is a learning activity designed to facilitate the development of knowledge acquisition skills and collaborative learning skills. This teamUP case analysis requires students, working in teams, to analyze real-life events sourced from the Internet. Collaborative learning is important for three main reasons. First, collaborative learning makes learning meaningful, relevant, and enjoyable. Second, collaborative learning speeds up the process of learning through active discussion and deliberation. Finally, collaborative learning facilitates the acquisition and accumulation of a broader set of knowledge, which is not possible with individual learning alone.

An embedded objective is for students to keep abreast with the latest HR developments in AI-enabled organizations, of which Google is the exemplar par excellence. Students in my course were asked to access https://www.fastcompany.com/90230655/how-google-motivates-its-employees to get to know what Google did in motivating its employees. Guided by the first criterion given in the assessment rubric, students attempted to apply the relevant motivation theories and frameworks to identify the strengths and weaknesses of Google’s practices.

In order to analyze the effectiveness or ineffectiveness of Google’s practices, students then performed what is required by the rubric’s second criterion, i.e., search for additional information from the Internet to know more about the relevant contexts at the time that Google was carrying out its various motivational practices. With all this additional information in hand, students were able to examine how external contexts might have influenced the appropriateness and effectiveness of Google’s practices.

To do well for the teamUP assignment, students have to demonstrate their skills in knowledge acquisition and in collaborative learning. Students have to pre-learn, on their own, the concepts and theoretical frameworks of an assigned OB topic, and then discuss how these may be applied appropriately to a given real-life context, like Google, say, to gain a deeper understanding of the context and then to perform an analysis.

The third criterion in the rubric raises students’ awareness of the limitations of their understanding of the case, or of their analysis of Google’s practices, by focusing explicitly on what information is ambiguous or lacking. It can be seen that by following the thought processes neatly detailed in the assessment rubrics, students can develop some metacognition of their own knowledge acquisition.

Third, the lightUP assessment method is a learning activity meant to motivate students to explore frontier knowledge by extracting OB insights from recent academic publications. Students are then encouraged to share new insights with their course-mates in an engaging manner so as to facilitate collaborative learning. Meant to try to keep students at the frontiers of knowledge, lightUP consists of two components, i.e., a self-learning component, and a knowledge-sharing one. Students are required to search for recently published academic articles on artificial intelligence, neuroscience, or cross-cultural research; extract some significant insights from these articles that may enrich the understanding of certain OB concepts or frameworks; and then share these insights with the whole class.

The lightUP assessment gets students up-to-speed with the latest developments in AI and neuroscience, so they become knowledgeable about the capabilities of AI and machine learning in transforming the workplace. In addition, careful perusal of the latest neuroscience publications can lead students to better understand how the brain functions, and how that may impact human emotion, perception, and motivation, which are important to understanding worker behavior in organizations.

In their sharing, students are discouraged from using PowerPoint slides. They are also told not to present their insights in an un-nuanced manner, such as the flat delivery of a memorized speech. Instead, they are expected to share their insights as if they were having an interesting discussion or animated conversation with their peers. Indeed, students do come up with interesting and enriching insights which they share in class. The following is a sampling of what students in my OB classes have shared for their lightUP assignments:

· In a class on employee motivation, an engineering student taking the OB course studied very thoroughly a very recent neuroscience article published in a prestigious journal. He then shared about the neural networks in the brain, explaining how different mental processes might fire up different parts of the brain, which he related to the various motivational theories being discussed in that seminar. Using a variety of visual aids and props to enhance the clarity of his explanation, the student spoke conversationally to engage his audience.

· In a class on workplace emotions, a student shared about emotional AI, explaining how AI algorithms were already being used to decode facial expressions and analyze vocal patterns in order to better understand true emotions in the workplace. That student also discussed possible biases in emotional AI.

· In a class on team dynamics, a student shared about the design of robot co-workers, the different kinds of human-robot interactions, and how AI could assist in fostering robot-human team collaboration.

· In a class on power and influence, a student discussed how power might influence brain function, both cognitively and emotionally. Another student discussed the relationships between power, testosterone levels, and dopaminergic activity in the striatal reward networks in the adult brain.

Fourth, the Design Thinking project is designed to encourage students to think creatively by way of seeking out any connections and links which others may not have detected. Working in teams, students identify a specific job to focus on, and then interview a worker who is currently doing that very job so as to better understand the intricacies and complexities of the job in real life. With a good knowledge of the job in hand, students then seek inspiration from a diversity of places, such as the Salvador Dali painting of melting clocks. Students then try to explore ways to re-design the job to make it future ready. One student team connected the Dali melting clocks to the notion of “stretchable time,” and used this idea to redesign a specific element in the job of a court interpreter.

Fifth, My Reflective Learning is a learning activity that requires a report that students must write at the end of the course. It is used to encourage students to retrieve knowledge and skills that they may have acquired from the course and then apply them to a significant situation or event in their personal lives, thus making the knowledge they have acquired relevant to and meaningful in their lives.

For example, one student reflected upon the time when his father passed away unexpectedly, and how distressed and helpless he felt then. By analyzing his own perceptions of the event and applying the stress framework to that significant life event, he was able to generate a broader range of actions that he may take to manage stress better should he face another highly stressful situation in the future.

Finally, voiceUP is a learning activity based on the idea that creating a “community of learners” will foster metacognitive development. Within such a community, it is suggested, there is “the development of a discourse genre in which constructive discussion, questioning, querying, and criticism are the mode rather than the exception. In time, these reflective activities become internalized as self-reflective practices” (Brown, 1997: p. 406).

Since the appropriate social environment for learning is important if learning is to be meaningful and relevant, voiceUP was designed to encourage active collaborative learning and peer-to-peer interaction with sharing. These elements make for an environment that is conducive to active, engaged learning. In contrast to traditional modes of assessing class participation, the quality of student contribution to four core learning activities was taken into consideration. In addition to using teamUP, and lightUP for this latter purpose as well, I also designed sizeUP, and powerUP for assessing the quality of student contribution to learning (Figure 1).

For the teamUP case analysis, a question-and-answer session was incorporated that permitted students to question the very team that performed the analysis itself, the members of which would try to answer those questions posed to them. Students were graded based on the quality of the questions asked and answers given.

For lightUP, students searched for recently published academic articles on artificial intelligence, neuroscience, or cross-cultural research. They then extracted some significant insights from these publications, which they presented to their classmates, who in turn provided feedback to the speakers. Students were graded based on the quality of their feedback to the lightUP speakers.

While teamUP and lightUP (already discussed earlier) are graded learning activities, sizeUP is a learning activity comprising ungraded quizzes given to students to check whether their self-learning is effective. These quizzes are discussed in class, during which students are encouraged to contribute examples that clarify their understanding of OB theories and frameworks. The other ungraded learning activity designed to assess class participation is powerUP, where students are given the opportunity to share with classmates any insights generated from their active learning activities.

Students are told clearly what knowledge and skill sets they ought to demonstrate in completing the assignment. For this purpose, each assessment method is accompanied by an appropriate assessment rubric, each of which lays out the observable traits of the skills to be assessed and describes how different performance levels of each trait are ascertained. Table 2 provides sample rubrics for assessing the skill sets specified in element 1. It is important to ensure that the observable traits of each rubric are valid measurements of the relevant skill. Assessment rubrics facilitate the provision of precise and actionable feedback so that students become more aware of their own skill levels.

The development of metacognition is facilitated by specifying clearly the thought processes that students must engage in so as to complete a particular assignment. Students are unlikely to engage in metacognitive thinking unprompted (Lin, 2001), so it is important to include in one’s instructional design support for metacognitive thinking. This is akin to Hartman and Sternberg’s (1993) point that attention must be paid to both instructional techniques and the classroom environment to improve learner cognition and metacognition. It is known that some instructional techniques can impact learner metacognition positively, e.g., “reciprocal teaching” (Palincsar & Brown, 1984), or “peer instruction,” (Mazur, 2017).

The critical thinking rubric specifies five observable processes that are used to grade the quality of students’ critique of an article. These five criteria have been identified as the main components of critical thinking (Alghalith, 2015).

The knowledge acquisition rubric, which is used to grade students’ performance of the teamUP case analysis, states clearly the four observable processes used to judge their knowledge acquisition skills. These four processes are derived from Glazer’s (1998) “Measuring the knower: Towards a theory of knowledge equity.” This rubric shows that it will not be enough for students to simply learn concepts or theoretical frameworks acontextually. Instead, they must be able to demonstrate their skills in applying these concepts or theoretical frameworks appropriately to a real-life situation. They must examine how the temporal and contextual properties of information may impact their analysis of the situation, identify possible gaps in the information, and provide an overall conclusion.

Finally, the design thinking rubric is used to assess the quality of students’ design thinking project. The observable traits in the design thinking rubric are delineated based on the key concepts discussed in Brown (2008). This rubric is used to guide students’ thinking processes in their design thinking project. The reflective learning rubric specifies clearly the thought processes to be used for reflective learning. The observable traits used in this rubric are based on the key concepts proposed by Ryan and Ryan (2012), as well as Bain et al. (2002).

Table 2. Assessment rubrics.

3.3. The Third Element: Learning Pedagogy

The third element of the framework specifies the modes of course delivery and the design of learning activities that can promote active learning. The author adopted a flipped classroom approach for the delivery of this course to encourage self-learning and collaborative learning. These were fully in-person classes conducted during pre-Covid-19 semesters. During the height of the pandemic in 2020, all in-person classes were moved online from the fifth week of classes. For the 2020/2021 academic year, when a return to some in-person classes became possible, a hybrid mode was adopted with a combination of in-person classes and online learning.

Active learning pedagogy transforms students from passive learners into active ones when they participate actively in various learning activities. Instead of lectures, or even mini lectures, domain knowledge is deftly woven into and embedded in the design of each learning activity.

A good learning activity is one that places students within a context that not only makes learning meaningful and relevant but also requires the application of relevant knowledge and skills. Students are guided in their learning as they perform various learning activities. This is akin to a child learning about gravity through the play activity of stacking up differently shaped blocks while trying to build as stable a structure as possible.

Besides facilitating metacognitive development through assessment rubrics that lay out clearly the thinking processes involved, the course prompts students to develop their own metacognitive strategies, given that their self-learning will be assessed with a quiz and that they will be expected to participate in class activities to demonstrate that they have acquired the requisite knowledge. Being aware that they will have to make their learning explicit in class, students may adopt effective learning processes that involve more metacognitive strategies to help them retain the newly acquired knowledge.

Indeed, sizeUP and powerUP are specially designed for this purpose, where sizeUP are quizzes conducted using response-ware, i.e., online tools such as TurningPoint, wooclap, or Kahoot. These provide a quick-and-easy way for students to recall and review what they have learned on their own before a particular class. By having students learn the basic course material outside of actual class time, more class time becomes available for collaborative learning and face-to-face discussion.

For powerUP, students participate in a variety of active learning exercises especially designed for them to apply theories and frameworks specific to a particular topic in OB. For example, students may be placed in a situation where a team member role plays as a “manager” who conducts a performance appraisal that motivates her subordinates, while other team members role play as “subordinates” who have different personalities and varying performance levels. The “manager” is then required to apply the appropriate motivation theories to come up with some effective strategies to motivate her “subordinates.” Role playing is enriching for students not only in allowing them to experience the emotional dynamics of a workplace setting but also to understand the complexities of motivating people (Lang, 2019).

The teamUP segment is time allocated for the pre-assigned team to present their case analysis to the class to facilitate collaborative learning. To encourage students to pay close attention to what is being shared and to make them think more deeply about the analyses being presented, students are required to write probing questions that promote critical thinking. Having students generate questions can be a great way to see whether they have watched the team presentations (in-person or on videos) diligently enough.

Requiring students to come up with probing questions encourages them to put in the time and effort needed to learn from the team presentations. Because they know they are expected to generate probing questions, students tend to pay closer attention to their peers’ presentations of their case analyses. They also think more critically about what needs to be asked so that their peers will regard their questions as being worthy of further discussion. All of this is time well spent in honing their skills in developing probing questions. This activity not only helps in engaging learners but also impresses upon them that good ideas are valued.

In my course, written questions were submitted to a course blog, and students were encouraged to study the questions that others had posted, which helped them think more deeply about a particular topic. The author has also consistently found it very rewarding, even illuminating, to read some of the intelligent and interesting questions that students post. This exercise gives the instructor a window into how their students think, and it can be quite amazing to see how their minds work.

For example, in a class of mine, one student team was assigned to analyze the leadership of Daniel Zhang, who became Alibaba CEO in 2015, and then its executive chairman in 2019. This team’s case analysis was shared with other students during the teamUP segment of a class. After watching the presentation, two students in the audience asked the presentation team these questions: 1) Your team mentioned that Daniel Zhang displayed transformational, managerial leadership (task-oriented), path-goal leadership (participative) and authentic leadership: In your own opinion, which leadership style (or a mix of leadership styles) do you think propelled Zhang to success as Jack Ma’s successor? 2) If Daniel Zhang were to manage another company that was less successful than Alibaba, do you think the company will be successful under his management style as you discern it?

Students are pre-assigned to the lightUP segment, which is class-time allocated for these students to individually share OB insights that they might have gathered by reading a recently published article on AI or neuroscience. The requirement for students to provide written feedback to the lightUP speaker is to encourage them to learn from the lightUP speaker while also being critical in their observations about the quality of the sharing. The following are two samples of feedback provided by two students to a particular lightUP speaker:

Sample #1:

· Key learning points: Our teammates in the future might well be machines with humanlike qualities, such as having helpful personalities, being able to read human body language, having moral values, and being able to set goals and attain them. Machines that can learn might also be able to bring leadership skills and problem-solving skills to help team members.

· Key strengths of the speaker: There was a lot of emphasis on those words and phrases that he wanted to highlight to the audience. His gestures were appropriate and, interestingly, he incorporated intelligent machines into the concept of team dynamics.

· Areas for improvement: He could speak a bit faster, it not being necessary to emphasize so many words. Instead, he should stress only those words that are central to his argument in order to not sound so choppy.

Sample #2:

· Key learning points: 1) AI-robots can expand team diversity, which may enable teams to better deal with complex problems. Machines may have technical capabilities that humans do not have. 2) Since robots can work 24/7, team performance ought to be boosted thereby.

· Key strengths of the speaker: I really liked how he aligned his verbal with his nonverbal communication by using visual aids to make the sharing more memorable. For example, he showed how the robot was able to read body language and set goals for its team. He then shared that the robot could even be the team leader, sharing its bigger and deeper store of knowledge and coordinating more seamlessly among team members. I liked how he considered opposing views as well, showing that he had conscientiously researched the topic at hand. Presenting counterarguments also enabled the audience to consider both sides of the debate in deciding if AI and robots are truly good for people.

· Areas for improvement: He could speak faster, with better intonation to engage the audience more consistently throughout his presentation. He could also include more frameworks and more concepts in his sharing so that the audience may grasp better how the issue at hand is connected to conceptual OB material we were dealing with at that juncture.

Now, what about deliberate practice? This is embedded in the very sequencing of the various learning activities, which facilitate frequent information seeking and repeated knowledge retrieval by students. First, sizeUP provides students the very first opportunity to recall what they have learned on their own before the actual class and to review the effectiveness of their self-learning.

This is followed by the powerUP active learning activity, which requires students to recall their OB knowledge about a topic to apply them in specific activities, be it a role play, a video case analysis, or a real-life vignette analysis.

Next comes teamUP which again compels students to recall knowledge learned and apply it to real-life workplace situations or organizations based on information gathered from the Internet.

Then there is lightUP which encourages students to be on the lookout for emerging knowledge in other disciplines, especially artificial intelligence, neuroscience, and cross-cultural research, and to bring them to bear on the OB knowledge they have acquired.

Thus, in each class, students are given four opportunities to retrieve and apply the knowledge they have acquired in different contexts. That learning culminates in a final written assignment of self-reflection at the end of the course, when all students must submit a “My Reflective Learning” essay. This requires them to apply the OB knowledge they have acquired the whole semester to personal life events. With deliberate practice built into the entire course design, the probability of students storing their knowledge with better knowledge structures in their memories ought to be heightened.

Having described the actual learning activities in some detail, the paper now turns to how the whole plan is implemented, which is the burden of the section immediately following.

4. The Implementation Plan

The reader may think that this course design might be too cumbersome to implement or that the course may be too burdensome for some students. In fact, neither is true. The implementation plan for the introductory OB course that the author has personally conducted is shown in Table 4. The entire course, which is delivered in thirteen (13) seminar sessions, has been taught for four semesters over two academic years.

In actual practice, the first four weeks of a semester were devoted to guiding students in regard to the various graded assessments. During these four weeks, students were given the opportunity to trial run the various learning activities so that they might understand the criteria embedded in the various assessment rubrics. The sole purpose of these trial runs was for students to develop their self-confidence and self-efficacy in completing the various assessments.

Not all learning activities were graded. To cultivate a culture where students would be more open to learning from their own mistakes, performance in the sizeUP quizzes and powerUP learning activities was not graded. Only a student’s voluntary participation in class discussions was graded, in voiceUP.

The arrangements for teamUP and lightUP were as follows. For a class of about forty students, eight teams of four or five students per team were created. Each team was preassigned an Internet article about a real-life case to be studied and analyzed. Each team then shared its analysis with the class in a specified seminar.

For lightUP, four or five students were scheduled as lightUP speakers for each seminar. From Seminar 5 onwards, there was a teamUP group presenting its case analysis while four or five lightUP speakers would present their analytical insights individually. The rest of the class would watch the teamUP presentation and then post their questions to the teamUP group. Then, they would watch the presentation by the lightUP speakers, and provide their written feedback to the instructor, who would then share their feedback anonymously with the lightUP speakers. The instructor would then grade the quality of student questions and feedback in voiceUP.

Table 3 shows how the various learning activities were scheduled for a typical three-hour in-person class after the first four weeks. Table 4 shows how this schedule would look if the course were delivered in a hybrid mode, i.e., with an e-learning component added to the course. At the peak of the pandemic in early 2020, the author was able to move all in-person class activities online with minimal disruption.

Now that the actual way to implement the plan has been sketched above, one may wonder if this course design is effective for developing expertise in learners. The next section offers some qualitative evidence that it does.

5. Evidence of Pedagogical Effectiveness

Is there empirical evidence to show that this course design, adopted in teaching an OB course in both the Fall and Spring semesters, was effective? Perusal of end-of-course student evaluations over the last three consecutive semesters would seem to suggest so. Some examples of the comments that students have provided in their feedback are given in Table 5.

Table 3. Scheduling of learning activities for a three-hour in-person seminar class.

Table 4. Scheduling of learning activities for a three-hour seminar delivered in a hybrid mode.

Table 5. Samples of qualitative feedback from students.

6. Discussion

This paper proposes a pedagogical framework that incorporates two elements critical to the development of learner expertise, viz., metacognitive development and deliberate practice. To encourage students’ development of metacognition, several assessment methods and assessment rubrics were created to see if OB students can acquire the three types of metacognitive knowledge, i.e., strategic knowledge (strategies for learning and thinking), contextual knowledge (knowledge about different types of cognitive tasks in different contexts), and self-knowledge (knowledge about oneself).

Learning and applying theories and conceptual frameworks in OB can enable the learner to develop what Pintrich (2002), cited earlier, calls “strategic knowledge,” “knowledge about cognitive tasks,” and “self-knowledge.” When students do the teamUP case analysis, the design thinking project, and the self-reflection essay, they acquire “strategic knowledge” that is not domain-specific. All three of these assessment methods require different cognitive processes that are made explicit in the respective assessment rubrics. The knowledge acquisition skills, design thinking skills, and reflection skills acquired in this OB course can thus be applied to other domains or disciplines.

Students also acquire “knowledge about cognitive tasks” when instructors provide clear instructions on the assignments to be done and delineate the thinking processes that are needed when doing those assignments.

Students acquire contextual knowledge specific to the tasks involved when they examine the temporal and contextual properties of the teamUP cases they are assigned. Students also acquire contextual knowledge specific to significant life events in writing the “My Reflective Learning” essay. This is an exercise promoting a heightened sense of self-awareness, i.e., self-knowledge, as students reflect on their own personalities, values, strengths and weaknesses, and also examine how these things may impact their own actions or reactions in the context of a significant life event.

Students’ metacognitive control and self-regulatory processes are activated through a process of questioning and receiving feedback. They are encouraged to ask questions during the sizeUP, powerUP, and teamUP segments, and when they provide feedback to lightUP speakers. Students also receive formative and summative feedback from the course instructor.

Deliberate practice is embedded in the design of various active learning activities to be conducted in-class. The four active learning activities, sizeUP, powerUP, teamUP, and lightUP, encourage frequent information seeking and knowledge retrieval. Students may have to retrieve some OB knowledge that they might have acquired on their own to do the sizeUP quizzes. They may also retrieve the same knowledge again to apply in powerUP learning activities, which may include role plays, or analysis of video cases, or real-life vignettes. They could retrieve the same knowledge a third time when they watch a teamUP group presentation. They have to do that with a critical eye so that they can come up with interesting and probing questions for the presentation team. And they may also retrieve the knowledge a fourth time when they watch lightUP speakers sharing new insights from newly published articles about AI, neuroscience, or cross-cultural studies.

Through such deliberate practice, students may internalize what they have learned more effectively and be more likely to remember the conceptual material they have learned. That material serves as a knowledge structure that helps retain learning and onto which new knowledge encountered in different contexts can be hung, giving students a mental map that makes better sense of new information.

7. Conclusion

Artificial intelligence and robotics are fast becoming important features of workplaces. With machine learning, deep learning, and reinforcement learning, such systems are quickly becoming “AI experts” in different disciplines: AI-driven robotic surgery, AI expert systems that read chest X-rays, AI systems that provide social support to the elderly, and so on. Since the workplace is being transformed by advances in AI, instructors ought to transform the way they educate their students so that they can hit the ground running in such future workplaces.

This paper proposes a pedagogical framework to help students develop expertise in any domain of knowledge so that they are ready for the new workplace. The framework is grounded in two main factors for the development of expertise in learners, namely, metacognitive development and deliberate practice (Ericsson, Krampe, & Tesch-Römer, 1993; Medina, Castleberry, & Persky, 2017). The framework has three main elements. The first is explicitly stating upfront which skill-based learning outcomes (skill sets) are being aimed for in a particular course of instruction. The second comprises the assessment methods to be used in the course, along with assessment rubrics that spell out clearly the thinking processes or behaviors needed to develop the specified skill sets; this element facilitates the development of metacognition in students because they must work through various thinking processes to complete their assignments. The third comprises learning activities that foster deliberate practice. By participating in different activities sequentially within a variety of contexts, students see how theoretical concepts relevant to a particular topic may be applied in various situations. In this way, they not only internalize what they have learned but also acquire the mental agility to apply that knowledge.

By requiring students to do Internet searches for detailed, up-to-date information, the process leads them to the frontiers of knowledge, which makes self-learning meaningful. Such learners often feel a certain pride in acquiring knowledge of their own accord, knowledge they can then share with others. Since new knowledge appears far faster than authors can revise their textbooks, this way of learning is better suited to today’s demands than the old textbook-based, chalk-and-talk, sage-on-stage model of teaching excellence.

Developing expertise is difficult, which is also to say that metacognitive development and deliberate practice are neither simple nor easy. What instructors deliver in a semester-long course will only benefit students if the latter continue to review and practice the skills acquired in that semester over the long term, either by reinforcing those skill sets in other courses or by practicing them constantly in their personal or work lives. To help students develop expertise associated with their domain knowledge, instructors can intentionally scaffold their delivery and course assessment methods on the proposed pedagogical framework, creating a learning environment that fosters both metacognitive thinking and deliberate practice.

Conflicts of Interest

The author declares no conflicts of interest regarding the publication of this paper.

References

[1] Alghalith, N. (2015). Using Course-Embedded Assessment: Defining and Assessing Critical Thinking Skills of MIS Students. Journal of Higher Education Theory and Practice, 15, 77-83.
[2] Bahia, S., Freire, I. P., Estrela, M. T., Amaral, A., & Santo, J. A. E. (2017). The Bologna Process and the Search for Excellence: Between Rhetoric and Reality, the Emotional Reactions of Teachers. Teaching in Higher Education, 22, 467-482.
https://doi.org/10.1080/13562517.2017.1303471
[3] Bain, J. D., Ballantyne, R., Mills, C., & Lester, N. C. (2002). Reflecting on Practice: Student Teachers’ Perspectives. Flaxton: Post Pressed.
[4] Baker, J., Cote, J., & Deakin, J. (2005). Cognitive Characteristics of Expert, Middle of the Pack, and Back of the Pack Ultra-Endurance Triathletes. Psychology of Sport and Exercise, 6, 551-558.
https://doi.org/10.1016/j.psychsport.2004.04.005
[5] Behari-Leak, K., & McKenna, S. (2017). Generic Gold-Standard or Contextualized Public Good? Teaching Excellence Awards in Postcolonial South Africa. Teaching in Higher Education, 22, 408-422.
https://doi.org/10.1080/13562517.2017.1301910
[6] Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How People Learn: Brain, Mind, Experience, and School. Washington DC: National Academy Press.
[7] Brown, A. (1997). Transforming Schools into Communities of Thinking and Learning about Serious Matters. American Psychologist, 52, 399-413.
https://doi.org/10.1037/0003-066X.52.4.399
[8] Brown, T. (2008). Design Thinking. Harvard Business Review, June, 84-92.
[9] Campitelli, G., & Gobet, F. (2011). Deliberate Practice: Necessary but Not Sufficient. Current Directions in Psychological Science, 20, 280-285.
https://doi.org/10.1177/0963721411421922
[10] Carneiro, R. (2007). The Big Picture: Understanding Learning and Meta-Learning Challenges. European Journal of Education, 42, 151-172.
https://doi.org/10.1111/j.1465-3435.2007.00303.x
[11] Chi, M. T. H., Glaser, R., & Farr, M. J. (1988). The Nature of Expertise. Hillsdale, NJ: L. Erlbaum Associates.
[12] Chong, C. (2021). Student Start-Up Serves Up Food Deliveries on NTU Campus on Self-Driving Robots. The Straits Times.
https://www.straitstimes.com/singapore/student-start-up-serves-up-food-deliveries-on-ntu-campus-on-self-driving-robots
[13] Choo, Y. T. (2020). Robots That Clean Floors, Chase Pigeons Away among Technologies Being Tested at Tampines Food Centre. The Straits Times.
https://www.straitstimes.com/singapore/robots-for-cleaning-floors-disinfecting-lift-panels-among-technologies-being-tested-at
[14] Crowe, S. (2017). Professor Einstein Robot Launches on Kickstarter. Robotics Business Review, January 23.
https://www.roboticsbusinessreview.com/rbr/professor_einstein_robot_launches_on_kickstarter
[15] Eccles, D. W., & Arsal, G. (2017). The Think Aloud Method: What Is It and How Do I Use It? Qualitative Research in Sport, Exercise and Health, 9, 514-531.
https://doi.org/10.1080/2159676X.2017.1331501
[16] Elliott, S., Whitehead, A., & Magias, T. (2020). Thought Processes During Set Shot Goalkicking in Australian Rules Football: An Analysis of Youth and Semi-Professional Footballers Using Think Aloud. Psychology of Sport and Exercise, 48, Article ID: 101659.
https://doi.org/10.1016/j.psychsport.2020.101659
[17] Ericsson, K. A. (2006). The Cambridge Handbook of Expertise and Expert Performance. New York: Cambridge University Press.
https://doi.org/10.1017/CBO9780511816796
[18] Ericsson, K. A., & Kintsch, W. (1995). Long-Term Working Memory. Psychological Review, 102, 211-245.
https://doi.org/10.1037/0033-295X.102.2.211
[19] Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The Role of Deliberate Practice in the Acquisition of Expert Performance. Psychological Review, 100, 363-406.
https://doi.org/10.1037/0033-295X.100.3.363
[20] Ericsson, K. A., Prietula, M. J., & Cokely, E. T. (2007). The Making of an Expert. Harvard Business Review, July-August, 115-121.
[21] Flavell, J. H. (1979). Metacognition and Cognitive Monitoring: A New Area of Cognitive-Developmental Inquiry. American Psychologist, 34, 906-911.
https://doi.org/10.1037/0003-066X.34.10.906
[22] Flavell, J. H., & Wellman, H. M. (1977). Metamemory. In R. V. Kail, & J. W. Hagen (Eds.), Perspectives on the Development of Memory and Cognition (pp. 3-33). Hillsdale, NJ: Lawrence Erlbaum Associates.
[23] FYA (2016). The New Basics: Big Data Reveals the Skills Young People Need for the New Work Order. Foundation for Young Australians Report.
[24] Glazer, R. (1998). Measuring the Knower: Towards a Theory of Knowledge Equity. California Management Review, 40, 175-194.
https://doi.org/10.2307/41165949
[25] Gourgey, A. F. (2001). Metacognition in Basic Skills Instruction. In H. J. Hartman (Ed.), Metacognition in Learning and Instruction (pp. 17-32). Dordrecht: Springer.
https://doi.org/10.1007/978-94-017-2243-8_2
[26] Gourlay, L., & Stevenson, J. (2017). Teaching Excellence in Higher Education: Critical Perspectives. Teaching in Higher Education, 22, 391-395.
https://doi.org/10.1080/13562517.2017.1304632
[27] Hardingham-Gill, T. (2019). The Android Priest That’s Revolutionizing Buddhism. CNN.
https://edition.cnn.com/travel/article/mindar-android-buddhist-priest-japan/index.html
[28] Hartman, H. J. (2001). Developing Students’ Metacognitive Knowledge and Skills. In H. J. Hartman (Ed.), Metacognition in Learning and Instruction (pp. 33-68). Dordrecht: Springer.
https://doi.org/10.1007/978-94-017-2243-8_3
[29] Hartman, H., & Sternberg, R. J. (1993). A Broad BACEIS for Improving Thinking. Instructional Science, 21, 401-425.
https://doi.org/10.1007/BF00121204
[30] Lang, C. J. (2017). The Flipped Classroom for Teaching Millennials: A Competency-Based Pedagogical Approach. Creative Education, 8, 1571-1589.
https://doi.org/10.4236/ce.2017.810108
[31] Lang, C. J. (2019). Teaching Leadership Better: A Framework for Developing Contextual-Intelligent Leadership. Creative Education, 10, 443-463.
https://doi.org/10.4236/ce.2019.102032
[32] Lin, X. (2001). Designing Metacognitive Activities. Educational Technology Research and Development, 49, 23-40.
https://doi.org/10.1007/BF02504926
[33] MacIntyre, T. E., Igou, E. R., Campbell, M. J., Moran, A. P., & Matthews, J. (2014). Metacognition and Action: A New Pathway to Understanding Social and Cognitive Aspects of Expertise in Sport. Frontiers in Psychology, 5, 1-12.
https://doi.org/10.3389/fpsyg.2014.01155
[34] Mazur, E. (2017). Peer Instruction (pp. 9-19). Upper Saddle River, NJ: Prentice Hall.
https://doi.org/10.1007/978-3-662-54377-1
[35] McPherson, S. L. (1999). Expert-Novice Differences in Performance Skills and Problem Representations of Youth and Adults during Tennis Competition. Research Quarterly for Exercise & Sport, 70, 233-251.
https://doi.org/10.1080/02701367.1999.10608043
[36] Medina, M. S., Castleberry, A. N., & Persky, A. M. (2017). Strategies for Improving Learner Metacognition in Health Professional Education. American Journal of Pharmaceutical Education, 81, 72-80.
https://doi.org/10.5688/ajpe81478
[37] Morley, L. (2003). Quality and Power in Higher Education. Maidenhead: Open University Press.
[38] Ong, T. (2017). YuMi the Robot Makes Debut as Orchestra Conductor in Italy. The Verge.
https://www.theverge.com/2017/9/14/16306528/yumi-robot-abb-debut-orchestra-conductor-italy
[39] Palincsar, A. S., & Brown, A. L. (1984). Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction, 1, 117-175.
https://doi.org/10.1207/s1532690xci0102_1
[40] Persky, A. M., & Robinson, J. D. (2017). Moving from Novice to Expertise and Its Implications for Instruction. American Journal of Pharmaceutical Education, 81, Article ID: 6065.
https://doi.org/10.5688/ajpe6065
[41] Pintrich, P. R. (2002). The Role of Metacognitive Knowledge in Learning, Teaching, and Assessing. Theory into Practice, 41, 219-225.
https://doi.org/10.1207/s15430421tip4104_3
[42] Rampersad, G. (2020). Robot Will Take Your Job: Innovation for an Era of Artificial Intelligence. Journal of Business Research, 116, 68-74.
https://doi.org/10.1016/j.jbusres.2020.05.019
[43] Ryan, M., & Ryan, M. (2012). Theorising a Model for Teaching and Assessing Reflective Learning in Higher Education. Higher Education Research & Development, 32, 244-257.
https://doi.org/10.1080/07294360.2012.661704
[44] Samuel, S. (2020). Robot Priests Can Bless You, Advise You, and Even Perform Your Funeral. Vox.
https://www.vox.com/future-perfect/2019/9/9/20851753/ai-religion-robot-priest-mindar-buddhism-christianity
[45] Saunders, D., & Ramirez, G. B. (2017). Against “Teaching Excellence”: Ideology, Commodification and the Neoliberalisation of Postsecondary Education. Teaching in Higher Education, 22, 396-407.
https://doi.org/10.1080/13562517.2017.1301913
[46] Singer, R. N. (2002). Pre-Performance State, Routines, and Automaticity: What Does It Take to Realize Expertise in Self-Paced Events? Journal of Sport & Exercise Psychology, 24, 359-375.
https://doi.org/10.1123/jsep.24.4.359
[47] Swanson, H. L., O’Connor, J. E., & Cooney, J. B. (1990). An Information Processing Analysis of Expert and Novice Teachers’ Problem Solving. American Educational Research Journal, 27, 533-556.
https://doi.org/10.3102/00028312027003533
[48] Tan, A. (2021). Robots Deliver Groceries and Parcels to Punggol Residents in One-Year Trial. The Straits Times.
https://www.straitstimes.com/singapore/consumer/robots-to-deliver-groceries-and-parcels-to-some-residents-in-punggol-in-one-year
[49] Tarricone, P. (2011). The Taxonomy of Metacognition. New York: Psychology Press.
https://doi.org/10.4324/9780203830529
[50] Whitehead, A. E., & Jackman, P. C. (2021). Towards a Framework of Cognitive Processes during Competitive Golf Using the Think Aloud Method. Psychology of Sport & Exercise, 53, Article ID: 101869.
https://doi.org/10.1016/j.psychsport.2020.101869
[51] World Economic Forum (2020). The Future of Jobs Report 2020.
https://www.weforum.org/reports/the-future-of-jobs-report-2020
[52] Zhao, Y., & Liu, G. (2018). How Do Teachers Face Educational Changes in Artificial Intelligence Era. Advances in Social Science, Education and Humanities Research, 300, 47-50.
[53] Zohar, A., & David, A. B. (2009). Paving a Clear Path in a Thick Forest: A Conceptual Analysis of a Metacognitive Component. Metacognition Learning, 4, 177-195.
https://doi.org/10.1007/s11409-009-9044-6
