The Programme for International Student Assessment (PISA) has attracted the attention of many researchers and educational policy makers over the past few years. PISA measures the extent to which 15-year-old students, as they near the end of their compulsory education, have acquired the knowledge and competencies essential for success in modern societies and economies. To do so, the PISA items, which students solve by applying their knowledge and competencies, are set in a variety of personal, social, and global contexts. The PISA 2015 survey does not just measure how well a 15-year-old student has acquired essential knowledge and competencies; it also includes items probing students’ attitudes toward science, in order to improve our understanding of why students perform well in the knowledge and competency domains. PISA rotates reading, mathematics, and science as the major assessment domain every three years. The major assessment domains in PISA 2012 and PISA 2015 were mathematical literacy and scientific literacy, respectively. Additionally, both the 2012 and 2015 surveys placed a focus on the measurement of opportunity to learn. The science test items have also continuously evolved to reflect the latest advances in science and technology in a rapidly changing world. Other revisions to the PISA assessment reflect innovations in existing theories and models of science and mathematics education. The return of mathematics and science as the major assessment domains in PISA 2012 and PISA 2015 gave participating countries opportunities to compare their students’ performances over 9 years, and to re-examine their curricula and educational policies in light of the changes that have occurred in their own educational systems and instruction. The main intention of the PISA surveys is to inform the decision making of educational researchers and policy makers so that their decisions have greater impact at the classroom level. Taken together, these benefits have led to a significant increase in the number of countries/economies participating in PISA international assessments over the years.

In PISA 2006, the scientific literacy framework consisted of contexts (personal, social, and global), knowledge (content and procedural), and competencies (explain phenomena scientifically, identify scientific issues, and use scientific evidence) (Organisation for Economic Co-operation and Development [OECD], 2006). The PISA 2015 scientific literacy framework evolved to include a refined categorization of contexts (personal, local/national, and global), knowledge (content, procedural, and epistemic), and competencies (explain phenomena scientifically, evaluate and design scientific inquiry, and interpret data and evidence scientifically) (OECD, 2016a) (Fig. 1). One of the major differences from PISA 2006 is that, in PISA 2015, epistemic knowledge was assessed as an additional category of knowledge about science, alongside procedural and content knowledge. To distinguish procedural from epistemic knowledge, we provide two example questions. In one question testing procedural knowledge, students had to identify a factor that might make volunteers’ counts of migrating birds inaccurate and explain how it might affect the count (OECD, 2016a, p. 463). In a question testing epistemic knowledge, students had to use given data to decide whether a difference in soil moisture between two slopes of a valley was due to different solar radiation or different rainfall; success here demonstrates one major aspect of epistemic knowledge, namely evaluating competing claims by interpreting data (OECD, 2016a, p. 470). Epistemic knowledge provides a rationale for the procedures and practices in which scientists engage, a knowledge of the structures and defining features that guide scientific inquiry, and a foundation for belief in the claims that science makes about the natural world (OECD, 2016b, p. 28). Scientific epistemic knowledge and scientific inquiry ability have long been considered pivotal to the core of science learning. Hence, the PISA surveys have continued to evolve their approach to evaluating scientific inquiry: in PISA 2006, scientific inquiry was captured only by the competency of identifying scientific issues, whereas in PISA 2015 this was replaced by the competency to evaluate and design scientific inquiry. Including ‘epistemic knowledge’ in the knowledge domain and ‘evaluate and design scientific inquiry’ in the competency domain was indeed a significant step. Thus, country scores were reported for overall performance, for performance on each of the three content knowledge categories (living systems, physical systems, earth and space systems), for performance on procedural and epistemic knowledge combined, and for performance on each of the three competencies (explain phenomena scientifically, evaluate and design scientific inquiry, and interpret data and evidence scientifically). The format of the 2015 assessment also changed from paper-based to computer-based delivery, reflecting the availability of digital devices in schools; this change also provided the opportunity to assess students’ competency in evaluating and designing scientific inquiry through interaction with simulated experiments. PISA is thus guiding the direction of international assessment by incorporating the most up-to-date theories and core aspects of science education and mathematics education.

Fig. 1 PISA 2015 scientific literacy assessment framework (OECD, 2016a)

PISA 2003 was the first survey to assess students’ mathematical literacy, and its framework consisted of the situations/contexts (personal, educational/occupational, public, and scientific) in which the problems were set, the mathematical content categories (quantity, space and shape, change and relationships, uncertainty), and the competencies (thinking and reasoning; argumentation; communication; modelling; problem posing and solving; representation; using symbolic, formal, and technical language and operations; and the use of aids and tools) used to solve the problems (OECD, 2003). Of these framework elements, country scores were produced only for overall performance and for performance in each of the four content categories. In PISA 2012, the mathematical literacy framework evolved significantly to include a refined classification of contexts (personal, societal, occupational, and scientific), mathematical content (quantity, uncertainty and data, change and relationships, space and shape), processes (formulating situations mathematically; employing mathematical concepts, facts, procedures, and reasoning; and interpreting, applying, and evaluating mathematical outcomes), and the associated underlying fundamental capabilities (communication; representation; devising strategies; mathematisation; reasoning and argument; using symbolic, formal, and technical language and operations; and using mathematical tools) (OECD, 2013) (Fig. 2). One of the major changes from PISA 2003 was that PISA 2012 defined the mathematical processes in which students engage as they solve problems, making PISA 2012 the first survey in which the processes were used as a primary assessment dimension for mathematics. Thus, country scores were reported for overall performance, for performance on each of the four content categories (quantity, uncertainty and data, change and relationships, space and shape), and for performance on each of the three processes (formulating situations mathematically; employing mathematical concepts, facts, procedures, and reasoning; and interpreting, applying, and evaluating mathematical outcomes). Relatedly, the PISA 2012 mathematical literacy construct evolved from measuring the competencies alone to combining the processes with the underlying fundamental capabilities (competencies) (Fig. 2). In addition, the PISA 2012 model of mathematical literacy became more sophisticated, encompassing more relevant and explicit components in light of a rapidly changing world. An optional additional assessment of computer-based mathematics (including items in which the computational power of the computer was used to obtain the solution) was also available to countries in 2012. In 2015, the previously paper-based items of PISA 2012 were delivered in computer-based mode in many countries.

Fig. 2 PISA 2012 mathematics literacy assessment framework (OECD, 2013)

The theme of this special issue is Science and Mathematics Literacy: PISA for Better School Education. This special issue is intended to go beyond the official PISA reports on frameworks and results in order to explore the implications of PISA results for shaping students’ science and mathematics learning and teachers’ instruction, while also guiding further changes in education systems within individual countries and around the world. The first article in the special issue, by Xiang Hu, Frederick Leung, and Yuan Teng, provides an in-depth analysis of the influence of culture on students’ mathematics achievement, using data from the PISA 2015 database in a two-level hierarchical linear modelling analysis. Data from 51 of the 71 countries that participated in PISA 2015 were analysed to explore the influence of six cultural dimensions on students’ mathematical literacy, controlling for two variables at the student level, gender and socioeconomic status, and one at the country level, GDP per capita. The results indicated that national culture accounted for 23.89% of the between-country performance differences. One cultural dimension in particular, long-term orientation, was found to be positively associated with students’ mathematical literacy.
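To make the analytic approach concrete, the sketch below shows how a two-level hierarchical linear model of this general kind can be fitted in Python with statsmodels. It is a minimal illustration under stated assumptions, not the authors’ code: the file name and column names (math_score, escs, country, gdp_per_capita, and the six dimension scores, here labelled after Hofstede’s dimensions since long-term orientation is one of them) are hypothetical, and PISA-specific complications such as plausible values and sampling weights are omitted.

```python
# Minimal two-level HLM sketch: students (level 1) nested in countries (level 2).
# All file and column names below are hypothetical illustrations.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pisa2015_math.csv")  # assumed: one row per student

# Null model: partition outcome variance into within- and between-country parts.
null = smf.mixedlm("math_score ~ 1", data=df, groups=df["country"]).fit()

# Conditional model: student-level controls (gender, SES), a country-level
# control (GDP per capita), and six country-level cultural dimension scores.
full = smf.mixedlm(
    "math_score ~ gender + escs + gdp_per_capita"
    " + power_distance + individualism + masculinity"
    " + uncertainty_avoidance + long_term_orientation + indulgence",
    data=df,
    groups=df["country"],
).fit()

# Reduction in between-country (random intercept) variance after adding the
# predictors -- analogous in spirit to the 23.89% figure reported above,
# though the paper's exact variance decomposition may differ.
tau_null = null.cov_re.iloc[0, 0]
tau_full = full.cov_re.iloc[0, 0]
print(f"Between-country variance explained: {(tau_null - tau_full) / tau_null:.2%}")
```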

Meanwhile, the article by Jihyun Hwang, Kyong Mi Choi, Yejun Bae, and Dong Hoon Shin investigates whether student-centred instruction moderates equity in mathematical literacy and scientific literacy performance, using the PISA 2012 and PISA 2015 databases, respectively, in a linear regression analysis. Ten countries that participated in both PISA 2012 and PISA 2015 were selected for the analysis, namely, Brazil, Taiwan, Finland, France, Korea, Norway, Peru, Qatar, Singapore, and the USA. The results indicated that mathematical literacy scores in all ten countries tended to decrease as the frequency with which students received student-oriented instruction in mathematics classrooms increased. A similar pattern was reported for scientific literacy: students’ scientific literacy scores generally decreased as inquiry-based instruction was given more frequently. However, the gap in mathematical and scientific literacy between students with high and low socioeconomic status varied across countries, being narrow in some, non-existent in others, and broad in still others. The implications of these findings are thoroughly discussed in the paper.
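As an illustration of how such a moderation question can be operationalised, the sketch below fits an ordinary least squares model with an SES-by-instruction interaction term for a single country. It is a minimal sketch under stated assumptions, not the authors’ model: the data file and column names (math_score, escs, student_oriented) are hypothetical, and plausible values and sampling weights are again omitted.

```python
# Minimal moderation sketch: does the frequency of student-oriented
# instruction moderate the SES-achievement relation in one country?
# File and column names are hypothetical illustrations.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pisa2012_math_one_country.csv")  # assumed student-level records

# "escs * student_oriented" expands to both main effects plus their interaction.
# The sign and size of the escs:student_oriented coefficient indicate whether
# the SES gap in mathematical literacy narrows or widens as student-oriented
# instruction becomes more frequent.
model = smf.ols("math_score ~ escs * student_oriented", data=df).fit()
print(model.summary())
```

Fitting this model separately for each of the ten countries would reproduce the kind of cross-country comparison of SES gaps described above.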