Individual education approach in the context of your own discipline, students and the wider context of HE

Word count: 1148

In my high school years, there were only a few subjects, each with a large amount of material, meaning that one had to study each subject in depth. The approach of drilling past exam papers and memorising the highlights of each chapter, which most of us had adopted for the (equivalent of) GCSE exams, no longer worked. Being more able in mathematics-related subjects than most of the class, I was frequently approached by classmates for help, which presented me with an opportunity to learn in a unique way: by explaining the questions to others. This unintentionally trained me to explain abstract concepts clearly and plainly, not only to others but also to myself, echoing a quality of good teaching mentioned by Pratt (2002).

This approach continued into my undergraduate studies, when I occasionally helped my friends; into my postgraduate research studies, when I was also a teaching assistant helping with tutorials for some modules; and now into my teaching practice. This is particularly important for advanced mathematics and statistics modules, as they usually contain concepts unfamiliar even to somebody who has taken high school mathematics. It is not uncommon for these concepts to be so abstract or alien that they have never crossed the students’ minds until the module begins.

Due to their aforementioned nature, advanced technical topics, “unlike many social sciences, do not easily lend themselves to energetic discussions of controversial issues” (Rosenthal 1995). Coupled with the idea that the black/whiteboard “makes visible the process of mathematical reasoning” (Greiffenhagen 2014), this led me to adopt by default the traditional lecture format, as well as the transmission perspective on teaching (Pratt 2002). In the rest of this patch, I will argue that this format-perspective pair is neither wrong nor bad in itself, by making some observations on my teaching.

First, the extra training I received in my learning years helped me overcome some difficulties mentioned by Pratt (2002). For example, I understand that students might not completely grasp the logic of the content at the first attempt, and that it might take a few iterations of thinking to reach the “Eureka” moment. The key is not to assume the same level of understanding as mine, and to explain things as simply and intuitively as possible. This echoes a quality of good teaching identified by Ramsden (2003), namely a “facility for engaging with students at their level of understanding”.

Second, the perspective is not independent of the topic being taught. Over the last few years I have been teaching a statistical programming module (which also appears in subsequent patches). Its practical nature and the wide industrial adoption of the programming language naturally call for the apprenticeship perspective, as students will use the skills they learn in their future careers. On the other hand, the developmental perspective comes in when teaching statistical methods, because of the nature of the subject: all models are wrong (Box 1976). When applying the methods to actual data, students will choose a model based on principles for selecting the most appropriate one, as well as on the specialised techniques they learn in other modules. Interestingly, this perspective is close to the student-centred approach in mathematics education (Felton and Koestler 2015), where “students are asked to invent their own strategies using what they already know, … and to consider mathematical connections between them”.
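The idea of choosing among candidate models by principle can be sketched in code. Below is a minimal Python illustration (the module itself teaches statistical programming in R; the data and function names here are hypothetical), using AIC, one common selection principle, to compare a linear and a quadratic fit to the same data:

```python
import numpy as np

def aic_least_squares(y, y_hat, n_params):
    """AIC for a least-squares fit: n * log(RSS / n) + 2k, up to a constant."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + 2 * n_params

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.5, size=x.size)  # data from a linear model

# Candidate models: straight line (degree 1) versus quadratic (degree 2)
aics = {}
for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    aics[degree] = aic_least_squares(y, np.polyval(coeffs, x), n_params=degree + 1)

chosen = min(aics, key=aics.get)  # lower AIC is preferred
```

The quadratic fit always lowers the residual sum of squares slightly, but pays a penalty for the extra parameter, illustrating the parsimony that “all models are wrong” encourages.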

Third, neither is the teaching format independent of the subject matter, and such dependence is related to what Brookfield (2017) said about assumptions of good teaching. Just because group discussion with minimal lecturing worked for some subjects at some point does not say much about whether it will work (or not) for the statistical topics I am teaching. In fact, the two most inspiring teachers in my high school and university taught exclusively via the black/whiteboard in lectures, but they succeeded because they injected quality teaching into their lectures, with “an ability to make the material being taught stimulating and interesting” (Ramsden 2003).

This leads to the fourth observation, which is what I inject into my teaching. I teach with flexibility and responsiveness, bearing in mind that the perspective and teaching activities may change over the course of the module. While a certain amount of content has to be taught in the first few lectures before giving students more power to own their learning, I usually gauge the students’ understanding and obtain feedback around the mid-point, as experience tells me that I will likely have to explain some concepts further before moving on with the set materials. Looking back, this iterative process fits other qualities of good teaching listed by Ramsden (2003), such as the “ability to improvise and adapt to new demands” and to “learn from students … about the effects of teaching and how it can be improved”.

The quality of teaching implicitly depends on the diversity of students, as teaching that is good for one student might not be good for another from a different background or ability level. This diversity is visible in my cohort: there is a roughly equal split of UK and international students, intertwined with a substantial proportion of mature (and usually working) students and a wide distribution of prior mathematical and programming abilities. While this is markedly different to, and less challenging than, studies on diversity where primarily White (and female) teachers were enrolled in courses “designed to prepare teachers to teach mathematics for diversity” (Dunn 2004) or taught in schools “where students are primarily poor and ethnically diverse” (Shandomo 2010), there is still a challenge. To level the playing field, more time and help are offered to the less able students in teaching sessions, and multiple solutions that complete the same task are accepted in assessment, as is usually the case in programming. The latter echoes the “environment with multiple voices” and “appropriate assessment” advocated by Northedge (2003) for teaching in the context of diversity, albeit superficially.
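The practice of accepting multiple solutions to the same task can be made concrete with a small sketch (in Python rather than the R taught in the module; all function names are hypothetical): an output-based check marks stylistically different submissions identically.

```python
def squares_loop(xs):
    """A beginner's solution: explicit loop and accumulator."""
    out = []
    for x in xs:
        out.append(x * x)
    return out

def squares_comprehension(xs):
    """A more idiomatic solution: list comprehension."""
    return [x * x for x in xs]

def mark(solution, cases):
    """Output-based marking: any solution producing the right answers passes."""
    return all(solution(inp) == expected for inp, expected in cases)

cases = [([1, 2, 3], [1, 4, 9]), ([], []), ([-2], [4])]
assert mark(squares_loop, cases) and mark(squares_comprehension, cases)
```

Marking on outputs rather than on a single model answer keeps the playing field level for students arriving with different programming styles and backgrounds.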

To summarise, the traditional approach can work well in statistics education, if aspects of enhancing the learning experience are put into practice. These include paying attention to the interdependence of topic, format and perspective, being flexible and responsive, injecting the qualities of good teaching, and taking student diversity into account. This of course does not mean other approaches will not work. The group-based approach, which comes from the constructivist philosophy of learning, has been advocated by some in mathematics and statistics education, for example Rosenthal (1995), Voskoglou (2019), and Jones and Palmer (2022). I would like to explore this and the flipped classroom approach (Lo, Hew, and Chen 2017), which has become popular recently (see patch 5). These approaches can be facilitated by evolving technological tools, many of which are suggested by Jones and Palmer (2022) and by Teaching and Learning Mathematics Online (TALMO). Embedding them in my current and future approaches will be another of my development aims.

Evaluation of teaching and learning support practice to develop effective learning environments

Word count: 1202

There has never been a better time to discuss online and hybrid teaching, which became prevalent and necessary due to the COVID-19 pandemic. Before such a discussion, the terminology should be made clear, as different terms have been used interchangeably:

  1. in-person is equivalent to face-to-face or physical classroom, while online is equivalent to virtual or remote;
  2. synchronous means the learning activities take place in real time, e.g. a live lecture (in-person or online), while asynchronous means the teaching and learning potentially take place at different instants, e.g. a pre-recorded video;
  3. hybrid and blended are equivalent, and mean both in-person and online teaching and learning taking place simultaneously; this definition is echoed by Gilbert, Hodds, and Lawson (2021);
  4. dual-mode means having both in-person and online activities that take place at different times, and is therefore different to hybrid/blended.

I shall focus on a 2-hour synchronous teaching session in a hybrid environment, as it encompasses both online and in-person environments. This session took place in a computer lab, and was essentially a “traditional” lecture plus a workshop. In the first hour I went through slides with examples in a statistical programming language. In the second hour, with the help of the teaching assistants, students individually went through the slides and lecture notes, and practised on programming tasks. Towards the end, I summarised the content covered and cleared up any confusion that had arisen while the students practised their own coding.

Deviating from a physical classroom setting, I incorporated two technological changes to ensure that teaching and learning happened in person and online simultaneously. The first was that my teaching was shared over Microsoft Teams via the desktop in the lab. This means the students joining online and those physically present were listening to and seeing the same materials without time lag. My mentor observed that I “was skilled and prompt at dealing with resulting issues so the experience for those attending in-person was barely affected by the blended model.” To make this environment inclusive (albeit less effective) for students who could not join at the scheduled time, the “lecture” was also recorded.

Looking into the literature, this hybrid environment covers most ground in terms of accessibility and inclusivity. First, practitioners surveyed by Gilbert, Hodds, and Lawson (2021) agreed that the biggest benefit of an online element is accessibility for remote learners, but such an element is a double-edged sword, as issues like poor wi-fi might hinder their learning. Our hybrid environment conversely provides an alternative for these students: simply turn up to the physical classroom if they can. Another issue is raised by Kempen and Liebendörfer (2021), who in their cluster analysis discovered a significant group of “digitals”. Students in this group rely more on external videos and webpages of unverified quality than on traditional lectures or tutorials. The recorded lecture in the hybrid environment allows them to access materials tailored for the module, but in their preferred way.

In another mentor observation, there were “suitable pauses and opportunities for interaction” during the first hour of my teaching. While both groups of students were prompted to ask questions directly, most were rarely willing to speak or type in the chat in front of a large class. This prompted the second change: soliciting their feedback on Mentimeter, an online polling platform. Comments were instant and anonymous, with no distinction between in-person and online students. My mentor observed that “this was a great way of soliciting student involvement and got a number of prompt responses”. To strengthen the usefulness of this exercise, I collated the comments and responded to them during the closing summary. This exercise gives the students a collective voice without exerting much pressure on them, and lets me instantaneously gauge their understanding. See a similar example in Chapter 5 of Bovill (2020).

Not all good practices were incorporated right from the start. For example, asking for instant feedback should have been done earlier in the module to monitor the students’ progress, as there was a wide distribution of prior programming ability, and some students struggled to keep up with the materials early on. On the other hand, from this positive experience, it is hard to see why such practices cannot be used in the traditional classroom. In fact, effective online tools that emerged from the move to online teaching should be encouraged when in-person teaching returns, as they greatly enhance the teaching and learning experience.

Another aspect enhanced after peer observation is live coding in front of the class, which is a direct way for students to learn programming. My mentor observed that, as this module “is a very hands-on programming module, real time demonstration of some of the code may be effective in reinforcing the students’ understanding”. Whilst not directly related to the hybrid environment, this element can be incorporated without any adjustment, thanks to the sharing functionality of Microsoft Teams, and precisely because of its independence of the learning environment.

Incorporating these practices requires effort, so familiarising oneself with the equipment in the classroom or lab is recommended, to iron out potential technical issues beforehand. And there is more to it than practising clicking buttons and opening apps on the computer. My peer observer noted that setting up takes time at the beginning, and that it is worth briefing the students on the session plan to keep their attention. Talking coherently while simultaneously sorting out technical issues is therefore worth practising too.

The need to be well trained in these technologies is echoed by those surveyed in Ní Fhloinn and Fitzmaurice (2021). They pointed out that mathematics lecturers did not make the best use of technology pre-pandemic, because they were wedded to the traditional form of lecturing on a black/whiteboard. However, forced by the move to online teaching, there has been a positive shift (Gilbert, Hodds, and Lawson 2021). Comparing the results of their survey conducted in early 2021 to those from May 2020, they found that practitioners were more experienced and confident in using online methods.

The literature on hybrid teaching and learning is understandably still developing. While TALMO contains an up-to-date list of resources, of most relevance is the journal Special Issue “Restarting the new normal” (Gillard et al. 2021). This issue includes the articles cited in this patch, the findings of which lend support to the hybrid environment.

In the survey results of Ní Fhloinn and Fitzmaurice (2021), lecturers were less inclined to take up technological tools pre-pandemic because of the difficulty of replicating an in-person environment. While it is possible to create an environment close to a black/whiteboard with more effort (Green 2021), the hybrid environment requires minimal additional effort, and can be a good starting point for incorporating technology in teaching.

While the aforementioned practices may seem challenging or alien to some, we should not be afraid of them. Most students these days grow up with internet technology and welcome tools that they feel comfortable with. The more willing we are to embrace these tools, the more accepting the students are. As higher education evolves with technology, it is hoped that such practices will enable more flexible learning (and thus less falling behind) for the students, while incurring minimal additional setup costs for the lecturers.

Curriculum design in context

Word count: 1218

In this patch I will focus on a module taught in the first term of the 21/22 academic year, which is the same module discussed in patches 2 and 4. The module is titled ‘MATH550: Statistics in Practice’, a compulsory 15-credit module at Level 7 of the Higher Education Qualification Frameworks. The module was attended by 34 students in the MSc Statistics programme, accredited by the Royal Statistical Society (RSS).

Regarding the learning outcomes (LOs), on completion of the module a student should:

  1. be able to use statistical (R and either SAS or Python) and type-setting software (LaTeX);
  2. understand and demonstrate appropriate report writing structure;
  3. present data and the results of statistical models in graphs, tables and orally;
  4. work as part of a group.

In my opinion, the standards of these LOs are set lower, and are less specific, than one would expect for a Level 7 module, for two reasons. First, the active verbs ‘be able to use’ and ‘understand’ are on a similar level to those categorised as vague or broad (Educational Development, n.d.), while ‘understand’ also refers to learning rather than the representation of learning (Moon 2002). Borrowing from Educational Development (2022) and Anderson and Krathwohl (2001), potential improvements include adding “effectively and independently” to some LOs, or changing the first one to “synthesise and apply statistical and type-setting software to complex data analysis problems”. Second, from my observation, the Year 2 undergraduate students I supervised in a group project would have achieved exactly the same LOs at face value. However, I would stress that the actual content taught reached the required level, as students reported that they learned things they had not in their undergraduate studies.

Another thing I have learned from Educational Development (n.d.) is that LOs should be future-proof. Why, then, are R, SAS, Python, and LaTeX all specified in the first LO? This is due to the nature of the subject and the popularity of those listed: LaTeX is popular on the science side of academia, SAS in the pharmaceutical industry, and R in statistics in general (Ghosh, n.d.; Muenchen 2019). Moreover, the popularity of R and Python is such that the accreditation body RSS provides its own training courses. As this dominance is expected to continue for a while, I would safely claim the LOs to be future-proof “for at least the next 20 years”. Furthermore, the programme LOs are kept broad and do not mention the specific software, so it is possible to change this module without affecting the programme as a whole. Lastly, the fact that the software and the concepts of programming are universal (at least in statistics) indicates that the curriculum itself is sufficiently inclusive and accessible for the diverse student body mentioned in patch 1.

This is the third year I have taught the module, and the LOs have stayed the same since 18/19, barring the last one, which was changed due to a programme overhaul. However, the teaching activities were implemented (seemingly) differently in each iteration, due to both changing student numbers and the move to online and hybrid teaching during the COVID-19 pandemic. Traditional teaching sessions took place in 19/20, with some lecturing in the first third of each session, followed by a workshop for the remaining two thirds, where students worked on problems similar to those in the assessed coursework. While some might think that the move to online teaching in 20/21 allowed us to answer the question “do things have to be done the same way as long as LOs are met”, I am here to disappoint them, as the format of the teaching sessions remained largely the same, apart from taking place exclusively on Microsoft Teams. In 21/22, we moved the teaching sessions back onto campus, with the option of joining online, leading to the hybrid environment described in patch 2. While I was only responsible for the teaching of R, with two colleagues responsible for SAS and LaTeX, the way the teaching was conducted (lecturing followed by a practical) has been largely the same across the three components.

Looking back on the teaching approach, the lecturing part is largely traditional, as slides and notes are involved, which suggests a possible misalignment with the LOs: using a statistical programming language and LaTeX is mainly for solving data analysis problems and presenting the analyses. An alternative way of encouraging students to work towards the LOs would be a problem-based approach, investigated by Chis et al. (2018), Taşpolat, Özdamli, and Soykan (2021), Jones and Palmer (2022), and Lee and Ban (2022) in the context of statistics and programming (see patch 5). Also, while using such an approach to integrate directed independent learning (DIL) into the curriculum design is a noble pursuit, this being the very first core module of the programme makes it difficult, echoing the observation by Thomas, Jones, and Ottaway (2015) that “many staff feel that students need some core knowledge in order to undertake effective DIL”.

In my implementation, instead of overhauling the teaching approach, I made adjustments by demonstrating live coding in front of the class, echoing my mentor’s observation in patch 2. Essentially, demonstrating the skills the students are expected to acquire (i.e. using software) on computers is the most natural way of creating an effective learning environment. This is also where the apprenticeship perspective (Pratt 2002) mentioned in patch 1 comes in. Situated in a computer lab with an internet connection and all required software available (thus an inclusive and accessible environment in the narrow sense), students indeed “work on authentic tasks in real settings of application”.

Teaching technologies come in to broaden the scope of “environment” and enhance its effectiveness. These include not only those mentioned above and in patch 2, but also one aspect concerning the materials (which are available in pdf and html, hence accessible, again in the narrow sense). The technology lies in the fact that I made the materials using the very skills I teach the students, and demonstrated as much, essentially preaching what I do and doing what I preach. Students therefore learn not only from the materials but also from how the materials were made. This echoes Stewart (2013) on digital fluency, that students not only learn through technologies but also about technologies, as well as Hart (1987), that “what we need from educational technology is forms of knowledge which may lead to understanding, rather than information overload and confusion”. Such a strange loop for the materials will be demonstrated again in patch 4, where the alternative media is in the same format as the assessment I examine.
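As a toy illustration of this strange loop, the sketch below (hypothetical, and in Python rather than the statistical and type-setting software with which the actual materials were built) shows a snippet that is both executed for its result and rendered into the HTML material alongside that result, so the material exposes how it was made:

```python
import html
import statistics

# The teaching material: a code snippet that is both SHOWN to students
# and EXECUTED to produce the result displayed next to it.
snippet = "result = statistics.mean([4.0, 3.5, 5.0, 4.5])"
namespace = {"statistics": statistics}
exec(snippet, namespace)

# Render an HTML fragment: source and output side by side, so students
# learn not just the result but how the page itself was produced.
page = (
    "<h2>Sample mean</h2>\n"
    f"<pre>{html.escape(snippet)}</pre>\n"
    f"<p>Output: {namespace['result']:.2f}</p>"
)
```

In practice, literate-programming tools automate exactly this weaving of executed code into the rendered pdf/html materials.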

Speaking of assessment, in addition to the component presented in patch 4, which partially meets LOs 1 and 3, there are a written report and a group project, both due towards the end of the module, that jointly meet all the LOs. While there is room for improvement in how the LOs are written, they are clear enough for students to know whether they have achieved them. However, this does not mean that the assessments have to be designed this way. For example, a group presentation at the end of an in-class problem-solving session could easily meet LOs 1, 3, and 4. To avoid exceeding the word limit and too much overlap, discussion of such an activity is delayed to patch 5.

Assessment for learning rationale and analysis

Duration: 12:32

Link to recorded presentation: https://web.microsoftstream.com/video/ef6b97c4-aaa5-48b7-82d4-445ea3f1289c

Link to slides: https://clement-lee.github.io/pgcap/patch4.pdf

References used in the presentation:

Critical reflections towards future development and practice

Word count: 1384

In patch 1 we looked at how I ended up with the traditional lecture approach and various perspectives, and identified trying other approaches, such as the flipped classroom or group-based learning, as the main development aim. In patch 2 we looked at the use of technology and online tools to facilitate hybrid learning and obtain student feedback. In patch 3 we saw that teaching activities and assessment methods could vary in the module we looked into, as long as the learning outcomes are fulfilled. This is largely due to constructive alignment (Biggs 1996), the outcome-based approach to teaching in which the learning outcomes are defined before teaching takes place. In patch 4 we looked at an assessment method for statistical programming that is new to the module, involves web tools and a recorded presentation, and thus potentially improves students’ employability in the post-pandemic world.

One common observation is that the generation who grew up with the boom of Web 2.0 learns via different media to how we teachers learned back in the day. They predominantly watch videos (mostly on YouTube) and search the Internet to learn things. Naturally there is academic research on the effect of videos on learning in higher education, which has been systematically reviewed by Noetel et al. (2021). Coupled with the summaries above, can we synergise the approaches I want to explore, the flexible hybrid/online environment, and the predominant way students learn these days, to design teaching activities and assessment methods that differ from what I have been practising but still meet the learning outcomes? This is highly relevant for mathematics and statistics, as these two subjects increasingly involve programming, while watching videos and using web tools involve computers and information technology.

I would like to set two things straight before diving into the literature. The first is the focus of the search for relevant works. While (pure) mathematics and statistics, as two subjects, share some core modules at undergraduate level, topics at an advanced or postgraduate level are taught substantially differently in terms of resources and methods. While lectures on the board may suffice for some mathematics modules, some programming is almost always required for statistics modules. In fact, the study by Mascaró, Sacristán, and Rufino (2016) revealed that incorporating programming helps with understanding statistics even when it is not compulsory in the module design. Indeed, statistics taught at an advanced level is closer to programming, data science, or even software engineering than to pure mathematics. (There had been efforts on using programming (Sterling and Pollack 1966; Green 1990) and web technologies (Dokter and Heimann 1999) to aid learning statistics, long before the boom of information technology over the last two decades.) Therefore, I shall focus on the teaching of statistical methods and (general) programming, to avoid observations or recommendations that are inapplicable to the range of modules concerned. A higher education setting is implicitly assumed, due to the context of this assessment.

The second thing to set straight is laying out my prior beliefs on the applicability of the aforementioned approaches. This is because specifying a prior belief and adjusting it in the presence of data is not only the routine workflow for a statistician; I would also like to document the changes in my perceptions as a starting point for my future practice and ongoing professional development. My prior belief is that the flipped classroom will be more useful for teaching statistical methods than statistical programming, because the latter requires practice within the teaching session, while group-based learning will be useful for both but more effective for the latter, as collaboration in programming is highly common these days.

The results of looking into the literature were mildly surprising to me. A lot of flipped classroom studies involved in-class group activities, both in statistics (Chen, Chen, and Chen 2015; Meyer 2022; Lee and Ban 2022) and in programming (Knutas et al. 2016; Durak 2020; Petersson, Hatakka, and Chatzipetrou 2020; Meyer 2022). Some studies involved group work as part of the assessment, in statistics (Chen, Chen, and Chen 2015; Nielsen, Bean, and Larsen 2018) and in programming (Paez 2017; Pattanaphanchai 2019). It can be established that there is improvement in students’ performance, as most, if not all, of the studies with more validity reported a controlled experiment (i.e. a group in a traditional classroom versus a group in a flipped classroom), including Souza and Rodrigues (2015), Peterson (2016), Chis et al. (2018), Nielsen, Bean, and Larsen (2018), P., V.G., and Murthy (2018), Cao and Grabchak (2019), Farmus, Cribbie, and Rotondi (2020), and Taşpolat, Özdamli, and Soykan (2021). While there is likely to be publication bias, and the sample sizes are usually small enough (20-30) for a positive effect to have happened by chance, these studies provide useful indicators of what potentially works, which will serve well as a starting point for my practice development.

These results led to some new understandings that will help answer the questions I asked at the beginning of this patch. The first is that the flipped classroom, or any innovative teaching approach in general, can be implemented independently of the subject, as long as it is implemented well. In fact, in the study by Tarimo, Deeb, and Hickey (2016) on a programming class, the controlled experiment was not between flipped and traditional classrooms, but between a flipped classroom with computers and one without, and they found that the learning outcomes were still fulfilled in the latter, indicating successful implementation.

Of course, not everything in the garden is rosy, and the above condition comes with the catch that not all studies, nor all aspects of each study, were implemented well. For example, Petersson, Hatakka, and Chatzipetrou (2020) found that students were not engaging in classroom activities such as group workshops when these were carried out online, as opposed to in person. Chen, Chen, and Chen (2015) found that there are distinct sets of opinions on the flipped classroom, just as there are learners with diverse needs and backgrounds in traditional classrooms. Peterson (2016), Cao and Grabchak (2019), and Taşpolat, Özdamli, and Soykan (2021) all mentioned that students did not spend enough time on the preparatory material, while Taşpolat, Özdamli, and Soykan (2021) also pointed to technological requirements and poor engagement and interaction. Last but not least, the systematic review by Veras, Rocha, and Viana (2020) found that teachers were overworked and time-constrained, and that it is difficult to sustain student motivation. On top of learning what works, bearing these pitfalls in mind is equally important when implementing a flipped classroom module.

The second understanding is that flipped classroom and group-based (or collaborative) learning usually take place together. This is because, on the practical level, the in-class activities in a flipped classroom commonly involve problem-solving, discussion, and in-class assignments, all of which are usually more effective in a group setting. More importantly, on the theoretical level, they all stem from the constructivist view of education, or equivalently the developmental perspective advocated by Pratt (2002). Such a view is further shared by problem-based (or inquiry-based) learning (Chis et al. 2018; Taşpolat, Özdamli, and Soykan 2021; Lee and Ban 2022) and self-directed (or self-regulated) learning (Çakiroğlu and Öztürk 2017; Nuri and Marsigit 2019; Veras, Rocha, and Viana 2020). Another example of the intersection of these approaches can be seen in the review of group-based and problem-based learning by Jones and Palmer (2022), which does so without mentioning flipped classroom, let alone active learning. To take this constructivist view further, there are studies looking into collaborative learning among teachers (Jaworski 2003) and between teachers and researchers (Heaton and Mickelson 2002).

The best way to wrap up is to come back to my question in the second paragraph, to which the short answer is yes. However, the more I read, the more I realised that there have already been studies that ticked all the boxes, and they serve as the best starting point for my practice development. Also, as all these approaches share the same perspective, the exact approach is not as important as keeping that perspective in mind while developing activities based on one or more of them. At the risk of quoting too much, I would end this patch with the words of Fraser (2015): “an innovative teacher is identified as more than just one who uses a new or significantly improved technique for teaching and learning, but rather s/he is committed to the goals or philosophy of improving the quality of student learning through innovation”.

References

Anderson, Lorin W., and David R. Krathwohl, eds. 2001. A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. Longman.

Biggs, John. 1996. “Enhancing Teaching Through Constructive Alignment.” Higher Education 32: 347–64. https://doi.org/10.1007/bf00138871.

Boud, David, and Nancy Falchikov, eds. 2007. Rethinking Assessment in Higher Education. 1st ed. Routledge. https://doi.org/10.4324/9780203964309.

Bovill, Catherine. 2020. Co-Creating Learning and Teaching: Towards Relational Pedagogy in Higher Education. 1st ed. St Albans: Critical Publishing.

Box, George E. P. 1976. “Science and Statistics.” Journal of the American Statistical Association 71 (356): 791–99. https://doi.org/10.1080/01621459.1976.10480949.

Brookfield, Stephen D. 2017. Becoming a Critically Reflective Teacher. 2nd ed. John Wiley & Sons.

Bryan, Cordelia, and Karen Clegg, eds. 2019. Innovative Assessment in Higher Education: A Handbook for Academic Practitioners. 2nd ed. Taylor & Francis Group.

Çakiroğlu, Ünal, and Mücahit Öztürk. 2017. “Flipped Classroom with Problem Based Activities: Exploring Self-Regulated Learning in a Programming Language Course.” Journal of Educational Technology & Society 20 (1): 337–49. http://www.jstor.org/stable/jeductechsoci.20.1.337.

Cao, Lijuan, and Michael Grabchak. 2019. “Interactive Preparatory Work in a Flipped Programming Course.” In Proceedings of the ACM Conference on Global Computing Education, 229–35. CompEd ’19. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/3300115.3309520.

Chen, Liwen, Tung-Liang Chen, and Nian-Shing Chen. 2015. “Students’ Perspectives of Using Cooperative Learning in a Flipped Statistics Classroom.” Australasian Journal of Educational Technology 31 (6). https://doi.org/10.14742/ajet.1876.

Chis, Adriana E., Arghir-Nicolae Moldovan, Lisa Murphy, Pramod Pathak, and Cristina Hava Muntean. 2018. “Investigating Flipped Classroom and Problem-Based Learning in a Programming Module for Computing Conversion Course.” Journal of Educational Technology & Society 21 (4): 232–47. http://www.jstor.org/stable/26511551.

Cox, Andrew M., Ana Cristina Vasconcelos, and Peter Holdridge. 2010. “Diversifying Assessment Through Multimedia Creation in a Non-Technical Module: Reflections on the MAIK Project.” Assessment & Evaluation in Higher Education 35 (7): 831–46. https://doi.org/10.1080/02602930903125249.

Deeley, Susan J. 2018. “Using Technology to Facilitate Effective Assessment for Learning and Feedback in Higher Education.” Assessment & Evaluation in Higher Education 43 (3): 439–48. https://doi.org/10.1080/02602938.2017.1356906.

Dokter, Christina, and Larry Heimann. 1999. “A Web Site as a Tool for Learning Statistics.” Computers in the Schools 16 (1): 221–29. https://doi.org/10.1300/J025v16n01_05.

Dunn, Thea K. 2004. “Engaging Prospective Teachers in Critical Reflection: Facilitating a Disposition to Teach Mathematics for Diversity.” In Preparing Mathematics and Science Teachers for Diverse Classrooms: Promising Strategies for Transformative Pedagogy, edited by Alberto J. Rodriguez and Richard S. Kitchen, 139–54. Taylor & Francis Group.

Durak, Hatice Yildiz. 2020. “Modeling Different Variables in Learning Basic Concepts of Programming in Flipped Classrooms.” Journal of Educational Computing Research 58 (1): 160–99. https://doi.org/10.1177/0735633119827956.

Educational Development, Lancaster University. 2022. “Learning Outcomes by Level and Language Updated Jan 2022.” https://www.lancaster.ac.uk/media/lancaster-university/content-assets/images/oed/ed/LearningOutcomes-bylevelandlanguageupdateJanuary2022.pdf.

Farmus, Linda, Robert A. Cribbie, and Michael A. Rotondi. 2020. “The Flipped Classroom in Introductory Statistics: Early Evidence from a Systematic Review and Meta-Analysis.” Journal of Statistics Education 28 (3): 316–25. https://doi.org/10.1080/10691898.2020.1834475.

Felton, Mathew D., and Courtney Koestler. 2015. “‘Math Is All Around Us and … We Can Use It to Help Us’: Teacher Agency in Mathematics Education Through Critical Reflection.” The New Educator 11 (4): 260–76. https://doi.org/10.1080/1547688X.2015.1087745.

Fraser, Sharon P. 2015. “Transformative Science Teaching in Higher Education.” Journal of Transformative Education 13 (2): 140–60. https://doi.org/10.1177/1541344615571417.

Ghosh, Amit. n.d. “What’s the Best Statistical Software? A Comparison of R, Python, SAS, SPSS and STATA.” https://www.inwt-statistics.com/read-blog/comparison-of-r-python-sas-spss-and-stata.html.

Gilbert, Holly, Mark Hodds, and Duncan Lawson. 2021. “‘Everyone seems to be agreeing at the minute that face-to-face is the way forward’: practitioners’ perspectives on post-pandemic mathematics and statistics support.” Teaching Mathematics and Its Applications: An International Journal of the IMA 40 (4): 296–316. https://doi.org/10.1093/teamat/hrab019.

Gillard, Jonathan, Claire Ketnor, Ciarán Mac an Bhaird, and Cathy Smith. 2021. “Special issue editorial: restarting the new normal.” Teaching Mathematics and Its Applications: An International Journal of the IMA 40 (4): 249–53. https://doi.org/10.1093/teamat/hrab026.

Green, David. 1990. “Using Computer Simulation to Develop Statistical Concepts.” Teaching Mathematics and Its Applications 9 (2). https://doi.org/10.1093/teamat/9.2.58.

Green, Dermot. 2021. “‘Lightboard’ for Connected Learning in Applied Mathematics and Theoretical Physics – Dermot Green (Queen’s University Belfast).” http://talmo.uk/dayMTFT2021.html#Green.

Greiffenhagen, Christian. 2014. “The Materiality of Mathematics: Presenting Mathematics at the Blackboard.” The British Journal of Sociology 65 (3): 502–28. https://doi.org/10.1111/1468-4446.12037.

Hart, Andrew. 1987. “The Political Economy of Interactive Video in British Higher Education.” In Interactive Media: Working Methods and Practical Applications, 171–89. USA: Halsted Press.

Heaton, R.M., and W.T. Mickelson. 2002. “The Learning and Teaching of Statistical Investigation in Teaching and Teacher Education.” Journal of Mathematics Teacher Education 5: 35–59. https://doi.org/10.1023/A:1013886730487.

Irwin, Brian, and Stuart Hepplestone. 2012. “Examining Increased Flexibility in Assessment Formats.” Assessment & Evaluation in Higher Education 37 (7): 773–85. https://doi.org/10.1080/02602938.2011.573842.

Jaworski, B. 2003. “Research Practice into/Influencing Mathematics Teaching and Learning Development: Towards a Theoretical Framework Based on Co-Learning Partnerships.” Educational Studies in Mathematics 54: 249–82. https://doi.org/10.1023/B:EDUC.0000006160.91028.f0.

Jones, Elinor, and Tom Palmer. 2022. “A review of group-based methods for teaching statistics in higher education.” Teaching Mathematics and Its Applications: An International Journal of the IMA 41 (1): 69–86. https://doi.org/10.1093/teamat/hrab002.

Kempen, Leander, and Michael Liebendörfer. 2021. “University students’ fully digital study of mathematics: an identification of student-groups via their resources usage and a characterization by personal and affective characteristics.” Teaching Mathematics and Its Applications: An International Journal of the IMA 40 (4): 436–54. https://doi.org/10.1093/teamat/hrab020.

King, Helen. 2019. “Stepping Back to Move Forward: The Wider Context of Assessment in Higher Education.” In Innovative Assessment in Higher Education, edited by Cordelia Bryan and Karen Clegg, 2nd ed. Taylor & Francis Group.

Knutas, Antti, Antti Herala, Erno Vanhala, and Jouni Ikonen. 2016. “The Flipped Classroom Method: Lessons Learned from Flipping Two Programming Courses.” In Proceedings of the 17th International Conference on Computer Systems and Technologies 2016, 423–30. CompSysTech ’16. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2983468.2983524.

Lee, Jae Ki, and Sun Young Ban. 2022. “Teaching Statistics with an Inquiry-Based Learning Approach.” Journal of Mathematics Education at Teachers College 12 (2): 21–32. https://doi.org/10.52214/jmetc.v12i2.8544.

Lo, Chung Kwan, Khe Foon Hew, and Gaowei Chen. 2017. “Toward a Set of Design Principles for Mathematics Flipped Classrooms: A Synthesis of Research in Mathematics Education.” Educational Research Review 22: 50–73. https://doi.org/10.1016/j.edurev.2017.08.002.

Mascaró, Maite, Ana Isabel Sacristán, and Marta M. Rufino. 2016. “For the love of statistics: appreciating and learning to apply experimental analysis and statistics through computer programming activities.” Teaching Mathematics and Its Applications: An International Journal of the IMA 35 (2): 74–87. https://doi.org/10.1093/teamat/hrw006.

Meyer, Cosima. 2022. “Bringing the World to the Classroom: Teaching Statistics and Programming in a Project-Based Setting.” PS: Political Science & Politics 55 (1): 193–97. https://doi.org/10.1017/S1049096521001104.

Moon, J. 2002. The Module and Programme Development Handbook: A Practical Guide to Linking Levels, Outcomes and Assessment Criteria. London: Taylor & Francis Group.

Muenchen, Robert A. 2019. “The Popularity of Data Science Software.” https://r4stats.com/articles/popularity/.

Murillo-Zamorano, Luis R., and Manuel Montanero. 2018. “Oral Presentations in Higher Education: A Comparison of the Impact of Peer and Teacher Feedback.” Assessment & Evaluation in Higher Education 43 (1): 138–50. https://doi.org/10.1080/02602938.2017.1303032.

Murphy, Karen, and Shane Barry. 2016. “Feed-Forward: Students Gaining More from Assessment via Deeper Engagement in Video-Recorded Presentations.” Assessment & Evaluation in Higher Education 41 (2): 213–27. https://doi.org/10.1080/02602938.2014.996206.

Nielsen, Perpetua Lynne, Nathan William Bean, and Ross Allen Andrew Larsen. 2018. “The Impact of a Flipped Classroom Model of Learning on a Large Undergraduate Statistics Class.” Statistics Education Research Journal 17 (1): 121–40. https://doi.org/10.52041/serj.v17i1.179.

Ní Fhloinn, Eabhnat, and Olivia Fitzmaurice. 2021. “How and why? Technology and practices used by university mathematics lecturers for emergency remote teaching during the COVID-19 pandemic.” Teaching Mathematics and Its Applications: An International Journal of the IMA 40 (4): 392–416. https://doi.org/10.1093/teamat/hrab018.

Noetel, Michael, Shantell Griffith, Oscar Delaney, Taren Sanders, Philip Parker, Borja del Pozo Cruz, and Chris Lonsdale. 2021. “Video Improves Learning in Higher Education: A Systematic Review.” Review of Educational Research 91 (2): 204–36. https://doi.org/10.3102/0034654321990713.

Northedge, Andrew. 2003. “Rethinking Teaching in the Context of Diversity.” Teaching in Higher Education 8 (1): 17–32. https://doi.org/10.1080/1356251032000052302.

Nuri, Bulan, and Marsigit. 2019. “Self-Directed Learning of Student in Mathematics Education: Is There Any Problem?” In Proceedings of the 2019 International Conference on Mathematics, Science and Technology Teaching and Learning, 44–47. ICMSTTL 2019. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/3348400.3348409.

P., Manoj Kumar, Renumol V.G., and Sahana Murthy. 2018. “Flipped Classroom Strategy to Help Underachievers in Java Programming.” In 2018 International Conference on Learning and Teaching in Computing and Engineering (Latice), 44–49. https://doi.org/10.1109/LaTICE.2018.000-7.

Paez, Nicolás Martín. 2017. “A Flipped Classroom Experience Teaching Software Engineering.” In 2017 IEEE/ACM 1st International Workshop on Software Engineering Curricula for Millennials (SECM), 16–20. https://doi.org/10.1109/SECM.2017.6.

Pattanaphanchai, Jarutas. 2019. “An Investigation of Students’ Learning Achievement and Perception Using Flipped Classroom in an Introductory Programming Course: A Case Study of Thailand Higher Education.” Journal of University Teaching and Learning Practice 16 (5). https://eric.ed.gov/?id=EJ1237873.

Peterson, Daniel J. 2016. “The Flipped Classroom Improves Student Achievement and Course Satisfaction in a Statistics Course: A Quasi-Experimental Study.” Teaching of Psychology 43 (1): 10–15. https://doi.org/10.1177/0098628315620063.

Petersson, Johan, Mathias Hatakka, and Panagiota Chatzipetrou. 2020. “Students’ Perception on Group Workshops: A Comparison Between Campus-Based and Online Workshops.” In European Conference on E-Learning, 397–405, XV, XIX. https://www.proquest.com/conference-papers-proceedings/students-perception-on-group-workshops-comparison/docview/2473444688/se-2?accountid=12753.

Pratt, Daniel D. 2002. “Good Teaching: One Size Fits All?” New Directions for Adult and Continuing Education, no. 93: 5–15. https://doi.org/10.1002/ace.45.

Ramsden, Paul. 2003. Learning to Teach in Higher Education. 2nd ed. Taylor & Francis Group.

Rosenthal, Jeffrey S. 1995. “Active Learning Strategies in Advanced Mathematics Classes.” Studies in Higher Education 20 (2): 223–28. https://doi.org/10.1080/03075079512331381723.

Sambell, Kay, Liz McDowell, and Catherine Montgomery. 2012. Assessment for Learning in Higher Education. Taylor & Francis Group.

Shandomo, Hibajene M. 2010. “The Role of Critical Reflection in Teacher Education.” School-University Partnerships 4 (1): 101–13. https://eric.ed.gov/?id=EJ915885.

Souza, Manoj Joseph D, and Paul Rodrigues. 2015. “Investigating the Effectiveness of the Flipped Classroom in an Introductory Programming Course.” The New Educational Review 40: 129–39. https://doi.org/10.15804/tner.2015.40.2.11.

Sterling, Theodor D., and Seymour V. Pollack. 1966. “Use of the Computer to Teach Introductory Statistics.” Communications of the ACM 9 (4). https://doi.org/10.1145/365278.365536.

Stewart, Martyn. 2013. “Understanding Learning: Theories and Critique.” In University Teaching in Focus: A Learning-Centred Approach, edited by Lynne Hunt and Denise Chalmers. London: Taylor & Francis Group.

Tarimo, William T., Fatima Abu Deeb, and Timothy J. Hickey. 2016. “A Flipped Classroom with and Without Computers.” In Computer Supported Education, edited by Susan Zvacek, Maria Teresa Restivo, James Uhomoibhi, and Markus Helfert, 333–47. Cham: Springer International Publishing.

Taşpolat, Ata, Fezile Özdamli, and Emrah Soykan. 2021. “Programming Language Training with the Flipped Classroom Model.” SAGE Open 11 (2): 21582440211021403. https://doi.org/10.1177/21582440211021403.

Thomas, Liz, Robert Jones, and James Ottaway. 2015. “Effective Practice in the Design of Directed Independent Learning Opportunities.” https://www.advance-he.ac.uk/knowledge-hub/effective-practice-design-directed-independent-learning-opportunities.

Veras, Nécio L., Lincoln S. Rocha, and Windson Viana. 2020. “Flipped Classroom in Software Engineering: A Systematic Mapping Study.” In Proceedings of the 34th Brazilian Symposium on Software Engineering, 720–29. SBES ’20. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/3422392.3422490.

Voskoglou, Michael. 2019. “Comparing Teaching Methods of Mathematics at University Level.” Education Sciences 9 (204). https://doi.org/10.3390/educsci9030204.


  1. http://talmo.uk/resources.html

  2. https://www.menti.com/

  3. http://talmo.uk/resources.html

  4. https://rss.org.uk/training-events/training/public-courses/software-training/