Exploring Context-based Teaching of Statistical Literacy and Graphical Communication in a Life Sciences Module in an Asian University



LEE Zheng-Wei, CHEW Yuanyuan, and YEONG Foong May

Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore

Name:     A/P YEONG Foong May
Address: MD4 Level 1, Yong Loo Lin School of Medicine, National University of Singapore (NUS), 10 Medical Drive, Singapore 117597
Email:    bchyfm@nus.edu.sg      

Recommended Citation:
Lee Z-W., Chew Y. Y., & Yeong F. M. (2020). Exploring context-based teaching of statistical literacy and graphical communication in a Life Sciences module in an Asian University. Asian Journal of the Scholarship of Teaching and Learning, 10(2), 118-133.



In the first run of our Life Sciences module, we highlighted to students that they should apply their prior knowledge of statistical skills in their mini-projects. In their final reports and presentations, we noticed a lack of statistics and graphical representation skills among students. We therefore designed a context-based approach to teach students these skills using EXCEL as a tool during the second run of the module. We used a mixed-methods approach to analyse students’ artefacts from these two student cohorts. We used content analysis to code students’ reports and presentations based on criteria derived from scientific practices that the module instructors were familiar with and aligned to meta-representational competence. Frequencies of use and quality of statistics were higher among students in the second cohort, though not statistically different from the first cohort. For graphical representation of data, there was a statistically significant increase in frequency of use in the second cohort, together with an improvement in quality. In our student perception survey, students in the second cohort recognised the relevance of these skills beyond the module. Unlike students in the first cohort, those in the second cohort likely saw the practical purpose of learning the skills, suggesting a pragmatic approach that our local students might take in learning to apply such skills in graded assignments. We suggest that context-based instruction could be an effective way to foster learning of competencies.

Keywords: Context-based teaching; statistical literacy; graphical communication; EXCEL skills; pragmatic learning approach.


Undergraduate modules in Life Sciences are traditionally taught in a didactic manner, mostly with a narrow focus on domain knowledge. Domain-specific content, especially when too academic, might reduce student engagement if students deem the materials irrelevant to themselves. Demonstrating the connections between materials taught in classes and the practical use of the knowledge could be one way to engage students. Indeed, successful student learning depends largely on students being able to link and organise module materials, such as through teachers “making sense” of the content for students (Belt, 2006; Overton, 2007). Such context-based or situated-learning design of teaching and learning activities could better engage students in learning and acquiring useful skills. 

In a third-year Medical Mycology and Drug Discovery module taught at the National University of Singapore (NUS), students worked on a laboratory-based mini-project to screen for possible novel antifungals targeting a real-world fungal pathogen. The experimental data generated needed to be processed, analysed, and presented in graded final presentations and written reports. In the first run of the module in 2018, we observed that students were ill-equipped with data representation and fundamental statistical analysis skills. Furthermore, anecdotal feedback from Life Sciences students returning from internships suggested a lack of training in EXCEL skills that seemed necessary in various workplace settings. To deliver beyond domain-specific content, in 2019 we designed activities to incorporate the use of EXCEL functions and basic statistical analysis. The intended outcomes included improved scientific research competencies that are core to STEM students (Hilborn & Friedlander, 2013).

We will discuss the outcomes of the intervention in terms of its impact on scientific competencies and whether introducing general skills in a domain-specific, context-relevant manner could be accepted by students. We will also discuss whether interventions designed with a deliberate aim to promote skills application could be one way to enhance students’ learning outcomes in our Asian context.

Theoretical Background

Context-based teaching

The notion that learning is normally more effective within a relevant context or specific practice has roots in ideas previously proposed by Dewey (2004), as well as Lave and Wenger (1991), among others. By providing relevant contexts to students, instructors could foster students’ interest in learning (Dewey, 2004; Gilbert, 2006). Contexts are “focal events” embedded within a cultural setting [Duranti & Goodwin, 1992, p. 3, as cited by Gilbert (2006)]. Student interest in learning is linked to models of engagement, as it pertains to students’ investment of effort in learning a concept or task presented to them, and has implications for student achievement in schools (Finn & Zimmer, 2012). 

A context may support engagement and learning by providing a coherent structural basis that students can use to build mental models, making connections between new concepts being taught and prior knowledge (Gilbert, 2006). According to Gilbert (2006, and citations therein), several attributes define an educational context, namely: a setting within which mental encounters with focal events are situated; a behavioural environment of the encounters, such that task(s) related to the focal event are used to frame a discussion; the use of specific language associated with the focal event; and a relationship to background knowledge. 

Indeed, context-based teaching is not uncommon and has been applied in STEM education, including mathematics (Anderson et al., 1999), chemistry (Belt et al., 2006) and biology (Wieringa et al., 2011). In terms of scientific literacy, we require students to be proficient in competencies such as data analysis and representation. We tapped on the meta-representational competence (MRC) model, which includes the ability to select, produce, and use appropriate representations, and the abilities to critique and modify representations (diSessa & Sherin, 2000). According to the authors, children might have their own prior ideas of how to represent data, drawn from daily activities. However, there are benefits to providing instruction directly on MRC, not least because students get to learn practices used in science. Indeed, in the Life Sciences, recent attempts to understand MRC among undergraduates and professors indicated that there is room for scaffolding MRC for undergraduates (Angra & Gardner, 2017).

In this module, we embedded the MRC model using focal events of statistical analysis and graphical representation as a form of a community of practice among researchers (Lave & Wenger, 1991). These focal events have broader implications beyond the module itself, as they relate to day-to-day scientific practices. As researchers themselves, the instructors demonstrate to students, who are mostly located at the peripheral level of participation in the research arena, the practices of data processing and analysis in a manner consistent with that of professional researchers. This can support students in learning the skills that would enable them to develop from novices to experts (Benner, 1982), and to enter the community of researchers in time to come.  

We further drew upon Chi and Wylie’s (2014) “Interactive > Constructive > Active > Passive” (ICAP) cognitive engagement model when purposefully designing our intervention. This model proposes that the Interactive mode of engagement achieves the greatest level of learning, greater than the Constructive mode, which is greater than the Active mode, which in turn is greater than the Passive mode. As such, we incorporated activities during classes for students to be engaged in, such as tutorials on statistical and EXCEL skills that are relevant to students when they write their project reports (see below). Students had opportunities not only to interact with the instructors during the tutorials, but also to use their own data to practise the skills taught. This taps on experiential learning, which, among other things, supposes that learners need to transact with their environment in order to construct their knowledge (Kolb & Kolb, 2005). 

Asian students’ learning styles

In the general context of Asian students’ learning styles, a teacher-centred learning approach is prioritised (Ballard & Clanchy, 1991). There is also pressure for students to score well in high-stakes assignments (Loh & Teo, 2017; Thanh-Pham, 2011). There have been, however, studies showing that there might not be as much difference between Asian and Western students as assumed (e.g. Campbell & Li, 2008; Kember, 2000), though these studies were conducted in different settings. There are also indications that culture might play some role in learning styles (Manikutty et al., 2007).

As we were dealing mostly with Singaporean students, we looked to studies on their learning styles. At least one study was conducted on Singaporean students at the Secondary Three level to understand students’ purposes of learning in relation to academic achievement in mathematics (Luo et al., 2011). The authors derived two main goal types: mastery goals and performance goals. With mastery goals, students aim to develop competence through the acquisition of new skills, while with performance goals, students aim to demonstrate competence relative to others; the latter includes performance avoidance, i.e. avoiding looking incompetent to peers and teachers. 

The students were clustered into groups, and the data indicated that, in general, students with high mastery and performance-approach goals combined with low performance-avoidance goals showed the best academic achievement: these students demonstrated higher levels of self-efficacy, subjective task values, class engagement, homework engagement, time management, and metacognitive self-regulation than those with moderate mastery or low performance-approach and performance-avoidance goals. Therefore, to motivate student learning, instructors could foster an environment that promotes mastery and performance goals while minimising performance-avoidance behaviour.

Problem statement

In our module, we discussed with students the use of statistical and graphical representations for the data that they had obtained from their mini-projects. However, in the module’s first run, we made the assumption that students had acquired basic skills in statistics and graphical representations from prior modules they had read, either in biostatistics or in applications of EXCEL in the Life Sciences. When we evaluated students’ final reports and presentations, we noted an inadequate demonstration of those skills. 

This observation led us to question whether there was a need to explicitly instruct students on the use of EXCEL for basic statistical analysis of data and to demonstrate the use of various graphical representations of quantitative data in the context of their project work. We hence designed three tutorial sessions on the use of EXCEL for basic statistical analysis of the data students had obtained from their projects. We also intentionally provided time in these tutorial sessions to discuss and demonstrate the use of various graphical representations of quantitative data. While the sessions were not graded and no specific assessments on these skills were used, during the tutorials we discussed the usefulness of data analysis and representation in scientific reports as a means of prompting students to learn and apply them in their final reports and project presentations. Moreover, we anticipated that these MRC activities would be highly engaging (diSessa & Sherin, 2000) and could foster student learning. 

Objectives and Research Questions

The main objective of this evaluation project was therefore to assess student accomplishment arising from the use of context-based teaching in delivering statistical literacy and graphical communication. 

Concerning the incorporation of tutorials that explicitly involved teaching of basic statistics and graphical representation using EXCEL in the module, our research questions are as follows:

When comparing between the two cohorts,

  1. did the tutorials increase the use of statistics and graphical representations by students in their mini-project reports and presentations?
  2. did such tutorials improve the quality of students’ use of statistics and graphical representations?

We would expect students from the cohort provided with explicit EXCEL training to be more ready to use statistical or graphical representations to summarise and draw conclusions from their own results in their mini-project reports and presentations. Ideally, the incorporation of EXCEL skills in the tutorial sessions should equip students with higher levels of confidence and knowledge in statistics and graphical communication.

Materials and Methods

Module information

Our module LSM3226 “Medical Mycology and Drug Discovery” is a third-year elective module for Life Sciences students that runs for 13 weeks. This is a small-class module with an embedded mini-project that spans five practical sessions (5 × 4 hrs = 20 hrs), with the rest of class time devoted to lectures (8 × 2 hrs = 16 hrs) or tutorials (5 × 1 hr = 5 hrs). There were two (2019) to three (2018) lecturers for the module, and one full-time teaching assistant (TA; this paper’s first author). The coordinator of the module (this paper’s last author) taught the bulk of the classes, and the TA was involved in all the practical sessions and the tutorial sessions on statistics and graphical representations using EXCEL. The student enrolment was 18 in 2018 and 21 in 2019. 

Intervention design

In the first year we ran the module, in 2018 (referred to as Cohort 1), students were not taught statistics and graphical representations specifically. In the second run of the module in 2019 (referred to as Cohort 2), we incorporated different components of statistical literacy and graphical communication over three tutorial sessions. We reasoned that our students would find these skills useful since the tutorials were based on data they themselves had generated in the practical experiments. Moreover, our students needed to perform similar data analysis, and potentially use the relevant statistics and graphical representations, in their graded mini-project reports and presentations. The detailed components, examples of questions used in the tutorials, and the teaching and learning activities are summarised in Table 1. The activities were informed by the module instructors’ research experiences and were aligned with the MRC model (diSessa & Sherin, 2000).

Table 1
Topics covered in each tutorial session on statistical literacy and graphical communication using EXCEL as a tool

Based on Gilbert (2006), our intervention fits the idea of context-based teaching based on the attributes defined, as illustrated in Table 2.

Table 2
Aspects of our intervention mapped to attributes of “context”

Analysis of students’ work

We applied a mixed-methods approach in our study. Our analysis was based on students’ group presentations and individual reports. We evaluated students’ artefacts in relation to data processing, analysis and representation. The frequency count of the use of statistics and graphical representations in the students’ artefacts would inform us how readily our students applied such knowledge in their work. Deeper evaluation of the performance level of students’ use of statistics and graphs can also reflect students’ strengths in those topics and possibly highlight specific topics requiring our attention in future instructional design. Components and criteria used for the evaluation are summarised in Table 3.

In Cohort 2, we conducted a student perception survey asking respondents to reflect on the skills they used in the module. We quantified the number of student responses that mentioned the teaching of EXCEL skills (without explicit prompting on our part), the usefulness of the skills learnt, and whether students thought these skills would be useful in future modules or the workplace. These numbers would indicate how well our students perceived the teaching and learning in the EXCEL tutorials.

Table 3
Components and criteria used to evaluate the use of statistics and graphical representations in students' mini-project reports and presentations

Statistical method

Comparisons of the frequency of use and the performance level of statistics and graphical representations between the two cohorts were performed using Pearson’s Chi-square test on 2×2 or 2×3 contingency tables, as appropriate. A p-value < 0.05 was considered statistically significant. 
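For readers who wish to reproduce this kind of comparison outside of EXCEL, a minimal Python sketch of Pearson’s Chi-square test on a 2×2 table is shown below. The counts are hypothetical, chosen only to illustrate the mechanics of the test; the actual per-cohort counts are those behind Tables 4 and 5.

```python
import math

def chi_square_2x2(table):
    """Pearson's Chi-square test for a 2x2 contingency table.

    Returns the Chi-square statistic and the p-value (df = 1),
    computed without external dependencies.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = (a + b, c + d)
    col_totals = (a + c, b + d)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    # For df = 1, the survival function of the Chi-square
    # distribution reduces to erfc(sqrt(chi2 / 2)).
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

# Hypothetical counts only: (used graphs, did not use graphs)
# per cohort, roughly mirroring the proportions reported below.
chi2, p = chi_square_2x2([[5, 7], [27, 1]])
print(f"chi2 = {chi2:.2f}, p = {p:.5f}")
```

A statistics package would typically also apply Yates’ continuity correction for small 2×2 tables; the uncorrected form above matches the textbook definition of the statistic.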


Students used more statistics and graphical representations when taught skills explicitly 

Throughout the mini-project, our students performed drug treatments aimed at identifying novel compounds with antifungal activity. Each student performed at least triplicates for a single treatment condition, and two or three students worked together as a group. One common way to summarise the group data was to obtain the mean for a given treatment condition and to account for spread across data points, usually by calculating the standard deviation (SD) or standard error (SE), as appropriate. 
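As a sketch of the summary statistics involved (written in Python rather than EXCEL, and using made-up triplicate values rather than students’ data), the mean, sample SD, and SE of a triplicate are computed as follows:

```python
import math
import statistics

# Hypothetical triplicate readings (e.g. % growth inhibition) for one
# treatment condition; illustrative values only.
replicates = [62.0, 58.0, 60.0]

n = len(replicates)
mean = statistics.mean(replicates)   # central tendency of the replicates
sd = statistics.stdev(replicates)    # sample SD (n - 1 in the denominator)
se = sd / math.sqrt(n)               # SE of the mean: SD / sqrt(n)

print(f"mean = {mean}, SD = {sd}, SE = {se:.2f}")
```

In EXCEL, the corresponding worksheet functions are AVERAGE, STDEV.S, and STDEV.S(...)/SQRT(COUNT(...)) for the SE.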

Among the students’ artefacts, including their mini-project reports and presentations, about 88% of Cohort 1 students used mean values to summarise their individual triplicate data sets or combined group data (Table 4). About 83% of the students also reported SD or SE values together with the mean values. Students in Cohort 2 showed similar frequencies of use of mean values and SD or SE values, at slightly higher albeit not statistically significant frequencies. Thus, it seems that our students, perhaps through previous exposure in an introductory statistics module, had learned how to express data using mean values and SD or SE. 

In terms of presenting data in graphical form, in Cohort 1, only 41.7% of our students used graphs in their reports or presentations, and only 8.3% were able to describe the variables and patterns in the graphs. The frequency counts of these two components increased significantly in Cohort 2, where 96.4% of students presented their data with graphs and 57.1% were able to describe trends in the graphs. Overall, the tutorials seemed to have increased students’ familiarity and confidence in graphical communication in Cohort 2 relative to Cohort 1 (p-values < 0.001).

Table 4
Frequency count of the use of statistics and graphical representations in reports and presentations

Students demonstrated better quality of statistics and graphical representations when taught skills explicitly

We further assessed the quality of students’ use of statistics and graphical representations, such as whether the correct raw data sets were used to determine mean values and if SD or SE values were correctly derived. We also examined whether appropriate graphs were used to represent the data types, using the evaluation criteria defined in Table 3. 

Three-quarters of Cohort 1 students fell into the “Unsatisfactory” rating for their use of statistics, while the remainder were rated “Satisfactory” (Table 5). Among those rated “Unsatisfactory”, there was incorrect understanding of SD values, such as students calculating an SD even though only two data points were included in the analysis. For the same statistical literacy component, Cohort 2 students demonstrated better performance levels, though the difference was not statistically significant when we compared the two cohorts using Pearson’s Chi-square test. Half of the Cohort 2 students were rated “Unsatisfactory” in their statistical performance; here we observed that students confused SD and SE and used the terms interchangeably. However, we observed an increased proportion of “Satisfactory” ratings (46.4%), and one student’s work (3.6%) was rated as having managed to “Exceed Standards”, where a proper statistical test and p-value were used to arrive at an evidence-based conclusion for hypothesis testing. 

Given categorical data, our students were able to use the correct type of graph, i.e. bar charts, to present their results. However, several students had inadequate labelling, or no labelling, of the axes on their graphs. A graph without proper labelling is not acceptable, as there is no indication of the variables plotted. Apart from that, some students failed to indicate clearly that the value plotted on a bar chart was a mean calculated from their replicates. Such “Unsatisfactory” performance in graphical communication was common among Cohort 1 students (70.8%); only 29% of Cohort 1 students were rated “Satisfactory” for their ability to use graphs and communicate properly. Although the comparison between the two cohorts was not statistically significant, our tutorials appeared to improve the quality of students’ graphical communication: compared with Cohort 1, we observed in Cohort 2 a smaller proportion of “Unsatisfactory” ratings (32.1%), a higher proportion of “Satisfactory” ratings (53.6%), and 14.3% of students’ artefacts that managed to “Exceed Standards”.

Table 5
Performance level of the use of statistics and graphical representations by students

Integration of statistical literacy and graphical communication by students in making scientific inferences 

One important aspect of statistical literacy in scientific research, particularly in discovery science as is the case here, is the ability to make an inference or conclusion based on the data and evidence collected. None of our students in Cohort 1 demonstrated the ability to refer to appropriate pieces of evidence when comparing two groups of data to make validated inferences (Table 4). In Cohort 2, which underwent our tutorial intervention, this proportion increased slightly (14.3%). However, the difference between Cohorts 1 and 2 across the different components of statistical literacy we defined was not statistically significant (p-value = 0.054). 

When we examined students’ artefacts from both cohorts in greater detail, we noted that those in the “Unsatisfactory” rating were especially weak in making valid comparisons. Many of them merely wrote statements such as “Drug X is better than Drug Y”, without further elaboration or reference to relevant evidence. The graphs presented carried no clear annotations of any group comparison, despite comparisons being mentioned in the text. An expected statement would be “Drug X is better than Drug Y because the MIC value for drug X is lower than that for drug Y (concentration ± SE vs concentration ± SE, p<0.05)”. This highlighted an important aspect that will require our attention in future semesters. 
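The comparison behind such an evidence-based statement can be run as a simple two-sample t-test. The stdlib-only Python sketch below uses hypothetical MIC triplicates (our own made-up numbers, not the students’ data) and compares a pooled-variance t statistic against a tabulated critical value:

```python
import math
import statistics

def pooled_t_statistic(group_x, group_y):
    """Two-sample t statistic assuming equal variances (pooled form)."""
    n1, n2 = len(group_x), len(group_y)
    s1, s2 = statistics.variance(group_x), statistics.variance(group_y)
    pooled = ((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)
    diff = statistics.mean(group_x) - statistics.mean(group_y)
    return diff / math.sqrt(pooled * (1 / n1 + 1 / n2))

# Hypothetical MIC triplicates for two drugs (lower MIC = more potent).
drug_x = [2.0, 2.2, 1.8]
drug_y = [4.0, 4.4, 3.6]

t = pooled_t_statistic(drug_x, drug_y)
# Two-tailed critical value at alpha = 0.05 with df = n1 + n2 - 2 = 4,
# taken from a standard t-table.
T_CRIT = 2.776
print(f"t = {t:.2f}; |t| > {T_CRIT}: {abs(t) > T_CRIT}")
```

In EXCEL, T.TEST(range1, range2, 2, 2) returns the two-tailed p-value for this equal-variance comparison directly, which is what students would report in a statement of the form shown above.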

Students’ perception survey

At the end of the semester, we conducted an anonymous student survey for Cohort 2 on various aspects of the module. We provided a text box for students to respond freely to a question on their perceptions of the skills they used in the module. We reasoned that if our students mentioned the use of statistics and graphical representations in the survey without direct prompting from us, we could gauge the extent to which our students acknowledged the tutorials and found them useful.

Many of our students referred to the statistics and graphical representation skills collectively as “EXCEL skills”, and we therefore refer to them as such hereafter. Of the 20 survey responses we collected from our Cohort 2 students, 80% acknowledged the use of EXCEL skills in the module, particularly in collecting, analysing, and interpreting the data they generated during the mini-project (Table 6). 45% of student responses stated that they found the EXCEL skills useful in the context of the module, and 15% thought that these EXCEL skills would be useful beyond the module. In particular, these students mentioned in their survey responses that the EXCEL skills would come in handy in the lab attachments they were currently doing. They could directly see the use of such skills in their data presentation outside of the module, and possibly in their future workplace. In fact, we recently had a Cohort 2 student who highlighted to us the usefulness of EXCEL skills, such as summarising large data sets using pivot tables, which he applied in the industrial attachment he undertook after our module.

Table 6
Students' perception survey on the skills used in the module


Discussion

In this study, we examined whether our intervention, in which we provided explicit instruction on statistical analysis and graphical representation of data in a context-specific way, was effective at getting students to process, analyse and represent their experimental data appropriately in scientific reports and presentations. The intervention, based on the idea of context-based teaching (Gilbert, 2006), was designed to address what we had seen during the previous semester, namely a lack of such skills in students. From our exploratory study, we observed that students exposed to the intervention were able to demonstrate use of the skills more frequently, and with better quality.

Our students in both cohorts, perhaps with their previous exposure to fundamental statistics in other modules, had sufficient knowledge to attempt to provide mean values and SD/SE in their data descriptions. However, the quality of the skills they demonstrated was inadequate. For instance, some students were not able to derive correct mean and SD values. Very often, they merely calculated SD/SE values but did not use them to address the spread of the data in their discussion. In terms of inferential statistics, students calculated average values and p-values between two comparison groups but did not make proper reference to the statistical test they had performed when making conclusive statements. This observation highlighted gaps where students might need further scaffolding and practice in making inferences.

Students also lacked the graphical communication skills to present data; this appeared to be a skill that required prompting from the instructors. In Cohort 1, most of our students presented their data as numerical mean values in table format and failed to discuss their data thoroughly. Our demonstrations in tutorials using similar data, and students’ practice using their own data, could have promoted students’ familiarity with the skills, increasing their confidence to apply them in their graded presentations and reports. In future tutorials, we could place more emphasis on the visual aesthetics of graphical representation, as students’ graphs were mostly low-quality plots.

The statistics and graphical representation skills that we targeted are essential in preparing students for their future workplace. Kemp and Seagraves (1995) previously reported that a substantial number of students who entered industrial practice identified that they needed more help, especially in four major personal transferable skills (PTS): report writing, oral presentation, graphical communication, and group working. 

Despite many instructors claiming that the four components mentioned above were among their intended learning outcomes and were part of their teaching and learning activities, students perceived that formal instruction might or might not contribute directly to their preparedness for the workplace (Kemp & Seagraves, 1995). The authors posited that the skills, “whether learned or not, are required to be used by the students, on a daily basis”. The integration of these skills in a “relevant context” could possibly raise students’ motivation and make those skills “transferable”. Indeed, this was likely what we observed across the two student cohorts, though a future in-depth study of this aspect of student learning among our local students is needed. Nonetheless, in our aim to get students to relate the use of these skills to their own projects, we had intentionally emphasised the use of such statistical and graphical skills in the context of our mini-project. 

Domain-specific knowledge is critical for delivering important content to students in STEM education. However, years of studies in chemical education have identified several challenges, including content overload, isolated facts, lack of transfer, lack of relevance, and inadequate emphasis on the development of “scientific literacy” (Gilbert, 2006). These challenges are similar to those in our Life Sciences undergraduate education. Hence, if we are to improve our efforts at educating our students to be scientifically literate, we should create appropriate “contexts” for our students (see Table 2), such that the domain-specific knowledge or skills we intend to deliver “make sense” to them. Students’ learning takes place when they are able to see the personal relevance of the knowledge and make sense of it. 

Asian students see “effort” as internal and controllable by themselves, and by being strategic and coping adaptively, they are able to attain higher levels of performance. They like to be able to “see a practical purpose” in the tasks they have to do, and they could do this with a high degree of autonomy (William, 2000). Being pragmatic in learning is perhaps in line with repeated observations of Asian students, who consider internal regulation, particularly “effort”, to be the most important attribute for academic success (Luo et al., 2014). 

From Singaporean students’ point of view, understanding the relevance of knowledge might serve pragmatic ends such as getting good grades in assessments (performance-approach goals), or not looking bad in front of peers or teachers (performance-avoidance goals) (Luo et al., 2011). However, these behaviours of needing practical relevance and being strategic in learning are in fact not very different between Asian and European countries (William, 2000). That we observed our students being engaged and performing better in statistics and graphical representation with context-based teaching might therefore not be specific to Asian students. Nonetheless, “effort” as a determinant of academic success is highly correlated with instructors’ efforts at providing routes and guides for students to actualise their potential (Luo et al., 2014). 

Our survey responses showed that our students acknowledged the usefulness of the statistics and graphical representation skills in and beyond the module. This was a good indication that it was worth taking the time to demonstrate how the skills are used, and to give students opportunities to practise and apply them. This experiential learning design (Kolb & Kolb, 2005) was perhaps important for supporting performance goals. That some students also recognised the possible transfer of what they had learnt about EXCEL functions to their future workplace further underscores the need for engaging teaching and learning activities that encourage mastery of skills (Luo et al., 2011) for purposes beyond academic achievement. 


In conclusion, we find that designing instructional materials to scaffold the learning of specific skills is useful in driving learning among our local students. The context-based approach made the relevance of the skills clear to students. Based on our experience, we will improve our learning activities using context-based approaches where possible, to encourage better use of other scientific skills, as we believe such skills will serve students beyond our module. It would also be useful to consider conducting longitudinal studies in future to investigate whether our interventions have lasting effects. 


Anderson, J. R., Reder, L. M., & Simon, H. A. (1996). Situated learning and education. Educational Researcher, 25(4), 5–11. http://dx.doi.org/10.3102/0013189x025004005

Angra, A., & Gardner, S. M. (2017). Reflecting on graphs: Attributes of graph choice and construction practices in biology. CBE Life Sciences Education, 16(3), 1–15. http://dx.doi.org/10.1187/cbe.16-08-0245

Ballard, B. & Clanchy, J. (1991). Teaching students from overseas: A brief guide for lecturers and supervisors. Melbourne: Longman Cheshire.

Belt, S. T., Leisvik, M. J., Hyde, A. J. & Overton, T. L. (2006). Using a context‐based approach to undergraduate chemistry teaching: A case study for introductory physical chemistry. Chemistry Education Research and Practice, 6(3), 166–179. http://dx.doi.org/10.1039/B5RP90007G

Benner, P. (1982). From novice to expert. The American Journal of Nursing, 82(3), 402-407. http://dx.doi.org/10.2307/3462928

Campbell, J., & Li, M. (2008). Asian students’ voices: An empirical study of Asian students’ learning experiences at a New Zealand University. Journal of Studies in International Education, 12(4), 375–396. http://dx.doi.org/10.1177/1028315307299422

Chi, M. T. H., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219–243. http://dx.doi.org/10.1080/00461520.2014.965823

Dewey, J. (2004). Democracy and education: An introduction to the philosophy of education. Norwood, Mass.: Macmillan.

diSessa, A. A., & Sherin, B. L. (2000). Meta-representation: An introduction. The Journal of Mathematical Behavior, 19(4), 385-398. http://dx.doi.org/10.1016/S0732-3123(01)00051-7

Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson (Ed.), Handbook of Research on Student Engagement (pp. 97–131). Springer.

Gilbert, J. K. (2006). On the nature of “context” in chemical education. International Journal of Science Education, 28(9), 957-976. http://dx.doi.org/10.1080/09500690600702470

Hilborn, R. C., & Friedlander, M. J. (2013). Biology and physics competencies for pre-health and other life sciences students. CBE Life Sciences Education, 12(2), 170–174. http://dx.doi.org/10.1187/cbe.12-10-0184

Kember, D. (2000). Misconceptions about the learning approaches, motivation and study practices of Asian students. Higher Education, 40(1), 99–121. http://dx.doi.org/10.1023/A:1004036826490

Kemp I. J., & Seagraves L. (1995). Transferable skills—Can higher education deliver? Studies in Higher Education, 20(3), 315-328. http://dx.doi.org/10.1080/03075079512331381585

Kolb, A. Y., & Kolb, D. A. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. Academy of Management Learning & Education, 4(2), 193–212. http://dx.doi.org/10.5465/amle.2005.17268566

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press.

Loh, C. Y. R. & Teo, T. C. (2017). Understanding Asian students learning styles, cultural influence and learning strategies. Journal of Education & Social Policy, 4(1), 194-210. http://jespnet.com/journals/Vol_4_No_1_March_2017/23.pdf

Luo, W., Hogan, D. J., Yeung, S. A, Sheng, Y. Z. & Aye, K. M. (2014). Attributional beliefs of Singapore students: relations to self-construal, competence and achievement goals. Educational Psychology, 34(2), 154-170. https://doi.org/10.1080/01443410.2013.785056

Luo, W., Paris, S. G., Hogan, D., & Luo, Z. (2011). Do performance goals promote learning? A pattern analysis of Singapore students’ achievement goals. Contemporary Educational Psychology, 36(2), 165–176. https://doi.org/10.1016/j.cedpsych.2011.02.003

Manikutty, S., Anuradha, N. S., & Hansen, K. (2007). Does culture influence learning styles in higher education? International Journal of Learning and Change, 2(1), 70–87. https://dx.doi.org/10.1504/IJLC.2007.014896

Overton, T. (2007). Context and problem‐based learning. New Directions in the Teaching of Physical Science, 3, 7–12. https://doi.org/10.29311/ndtps.v0i3.409

Thanh-Pham T. H. (2011). Issues to consider when implementing student-centred learning practices at Asian higher education institutions. Journal of Higher Education Policy and Management, 33(5), 519-528. https://doi.org/10.1080/1360080X.2011.605226

Wieringa, N., Janssen, F. J. J. M., & van Driel, J. H. (2011). Biology teachers designing context-based lessons for their classroom practice: The importance of rules-of-thumb. International Journal of Science Education, 33(17), 2437–2462. https://doi.org/10.1080/09500693.2011.553969

William, L. (2000). Do Asian students really want to listen and obey? ELT Journal, 54(1), 31–36. https://doi.org/10.1093/elt/54.1.31  

About the Corresponding Author

YEONG Foong May is Associate Professor at the Department of Biochemistry, NUS. She is a cell and molecular biologist and her interest in higher education revolves around collaborative and authentic learning. She is a Fellow of the NUS Teaching Academy, affiliate of the NUS Institute of Applied Learning Sciences and Educational Technology (NUS ALSET) and Senior Fellow of the Higher Education Academy, UK.