Programme for International Student Assessment

The Programme for International Student Assessment (PISA) is a triennial worldwide test of 15-year-old schoolchildren's scholastic performance, coordinated by the Organisation for Economic Co-operation and Development (OECD).

The aim of the PISA study is to test and compare schoolchildren's performance across the world, with a view to improving educational methods and outcomes.

Development and implementation

PISA was developed from 1997, and the first assessment was carried out in 2000. The tests are taken every three years. Each period of assessment specialises in one particular subject but also tests the other main areas studied; the subject specialisation rotates through each PISA cycle.
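
Since the major domain rotates on a fixed cycle (reading in 2000, mathematics in 2003, science in 2006, then reading again in 2009), the focus subject for any assessment year can be computed mechanically. The sketch below illustrates this rotation; the function name is illustrative and not part of any official PISA tooling:

    def pisa_major_domain(year):
        """Return the major (focus) domain for a PISA assessment year.

        Assumes the cycle observed so far: reading (2000), mathematics
        (2003), science (2006), repeating every nine years.
        """
        if year < 2000 or (year - 2000) % 3 != 0:
            raise ValueError("PISA assessments run every three years from 2000")
        domains = ["reading", "mathematics", "science"]
        return domains[((year - 2000) // 3) % 3]

For example, pisa_major_domain(2009) returns "reading", matching the planned 2009 focus.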

In 2000, 265 000 students from 32 countries took part in PISA; 28 of them were OECD member countries. In 2002 the same tests were taken by 11 more "partner" countries (i.e. non-OECD members). The main focus of the 2000 tests was reading literacy, with two thirds of the questions being on that subject.

PISA's debut round in 2000 was delivered on the OECD's behalf by an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER has continued to lead the design and implementation of subsequent rounds of PISA for the OECD.

Over 275 000 students took part in PISA 2003, which was conducted in 41 countries, including all 30 OECD countries. (The United Kingdom's data collection, however, failed to meet PISA's quality standards, so the UK was not included in the international comparisons.) The focus was mathematics literacy, testing real-life situations in which mathematics is useful. Problem solving was also tested for the first time.

In 2006, 57 countries participated; the main focus was science literacy. Results are due out in late 2007. Researchers have begun preparation for PISA 2009, in which reading literacy will again be the main focus, giving the first opportunity to measure improvements in that domain. At last count (end of March 2007), about 63 countries were set to participate in PISA 2009, and more are anticipated to join before then.

ACER leads the development of the methodology and procedures required to implement the PISA survey in all participating countries, including developing and implementing sampling procedures and assisting with monitoring sampling outcomes across these countries. ACER also constructs and refines the assessment instruments for PISA's reading, mathematics, science, problem-solving and computer-based testing, along with the background and contextual questionnaires, develops purpose-built software to assist with sampling and data capture, and analyses all the data.

Seeing a single PISA cycle through from start to finish takes over four years.

Comparison with TIMSS and PIRLS

Another international mathematics assessment test is the Trends in International Mathematics and Science Study (TIMSS), undertaken by the International Association for the Evaluation of Educational Achievement (IEA). Results from TIMSS often contradict results of the PISA test. The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts; to solve the problems, students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them. It divides mathematical domains into two dimensions: the applied-knowledge "cognitive domains" and the more traditional "content domains". The cognitive domains it covers are "Knowing Facts and Procedures, Using Concepts, Solving Routine Problems and Reasoning", and the content domains are "Number, Algebra, Measurement, Geometry and Data". The latter reflect "the importance of being able to continue comparisons of achievement with previous assessments in these content domains" (TIMSS Assessment Framework 2003, pdf). PISA argues that international assessment should not be restricted to a set body of knowledge; instead, it deals with education's application to real-life problems and life-long learning.

In reading literacy, the equivalent of TIMSS is the Progress in International Reading Literacy Study (PIRLS). According to the OECD, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling". Instead, students should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts" (Chapter 2 of the publication "PISA 2003 Assessment Framework", pdf). PIRLS, on the other hand, describes reading literacy as "the ability to understand and use those written language forms required by society and/or valued by the individual" (Chapter 1 of PIRLS 2006 Assessment Framework, pdf); that is, PIRLS includes the use of written language forms in reading literacy. However, according to the IEA, in scoring the PIRLS tests "the focus is solely on students' understanding of the text, not on their ability to write well" (Chapter 4 of PIRLS 2006 Assessment Framework, pdf).

Method of testing

The students tested by PISA are aged between 15 years 3 months and 16 years 2 months at the beginning of the assessment period; the school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which also made it possible to study how age and school year interact.
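
In code, the age window amounts to a simple check on age in completed months at the start of the assessment period, as in the sketch below (the exact boundary handling here is an assumption; the official sampling manual defines the precise cut-offs):

    from datetime import date

    def completed_months(born: date, on: date) -> int:
        """Whole months elapsed between two dates."""
        months = (on.year - born.year) * 12 + (on.month - born.month)
        if on.day < born.day:
            months -= 1  # the current month is not yet complete
        return months

    def pisa_eligible(born: date, assessment_start: date) -> bool:
        """True if age is between 15 years 3 months and 16 years 2 months
        (inclusive) at the beginning of the assessment period."""
        age = completed_months(born, assessment_start)
        return 15 * 12 + 3 <= age <= 16 * 12 + 2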

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but each student is tested on only part of it. Participating students also answer a questionnaire on their background, including learning habits, motivation and family. School principals also fill in a questionnaire describing school demographics, funding, etc.
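
This design, in which the full pool of material is divided into timed clusters and each student sits a booklet containing only some of them, can be illustrated with a toy rotation scheme. The cluster counts below follow from the figures above (thirteen half-hour clusters make six and a half hours; four clusters make a two-hour booklet), but the rotation itself is a simplified illustration, not PISA's actual booklet design:

    def rotated_booklets(clusters, per_booklet):
        """Build one booklet per cluster by cyclic rotation, so every
        cluster appears in exactly `per_booklet` booklets."""
        n = len(clusters)
        return [[clusters[(start + i) % n] for i in range(per_booklet)]
                for start in range(n)]

    # Thirteen half-hour clusters (6.5 hours of material),
    # four clusters (two hours) per student booklet.
    booklets = rotated_booklets([f"C{i}" for i in range(1, 14)], per_booklet=4)

Each cluster then appears in four of the thirteen booklets, so every item is answered by a comparable share of the sample even though no student sees all of the material.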

Results

The results of each period of assessment normally take at least a year to be analysed. The first results for PISA 2000 came out in 2001 (OECD, 2001a) and 2003 (OECD, 2003c), and were followed by thematic reports studying particular aspects of the results. The evaluation of PISA 2003 was published in two volumes: Learning for Tomorrow's World: First Results from PISA 2003 (OECD, 2004) and Problem Solving for Tomorrow's World: First Measures of Cross-Curricular Competencies from PISA 2003 (OECD, 2004d).

Here is an overview of the top six places in 2003 (tied scores share a rank):

Mathematics             Reading literacy        Science                 Problem solving
1. Hong Kong 550        1. Finland 543          1. Finland 548          1. South Korea 550
2. Finland 544          2. South Korea 534      1. Japan 548            2. Finland 548
3. South Korea 542      3. Canada 528           3. Hong Kong 539        2. Hong Kong 548
4. Netherlands 538      4. Australia 525        4. South Korea 538      4. Japan 547
5. Liechtenstein 536    5. Liechtenstein 525    5. Liechtenstein 525    5. New Zealand 533
6. Japan 534            6. New Zealand 522      5. Australia 525        6. Macau 532
                                                6. Macau 525

Professor Jouni Välijärvi was in charge of the Finnish PISA study. He believed that the high Finnish score was due both to the excellent Finnish teachers and to Finland's 1990s LUMA programme, which was developed to improve children's skills in mathematics and natural sciences. He also drew attention to the Finnish school system, which teaches the same curriculum to all pupils. Indeed, individual Finnish students' results did not vary a great deal, and all schools had similar scores.

An evaluation of the 2003 results showed that the countries which spent more on education did not necessarily do better than those which spent less. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, Korea and the Netherlands spent less but did relatively well, whereas the United States spent much more but was below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States, which came 24th out of the 29 countries compared. The spending gap may be partly explained by large differences in teacher salaries between countries: the average American teacher is paid around US$40,000 (~€30,000) per year, while the average Finnish teacher is paid €25,000 per year.

Compared with 2000, Poland, Belgium, the Czech Republic and Germany all improved their results. Poland's gain was especially striking: apparently owing to changes to its school system after PISA 2000, Polish students showed above-average reading skills in PISA 2003, having been near the bottom of the list in PISA 2000.

Another point made in the evaluation was that students with higher-earning parents are better educated and tend to achieve higher results. This was true in all the countries tested, although it was more pronounced in certain countries, such as Germany.

Reactions to the results

For many countries, the first PISA results were a rude awakening; in Germany, for example, the comparatively low scores brought on a heated debate about how the school system should be changed. Other countries were pleasantly surprised by their results.


Further information


Criticism

  • Jahnke, Thomas and Meyerhöfer, Wolfram (eds.): PISA & Co: Kritik eines Programms. Franzbecker, Hildesheim (2006). ISBN 978-3-88120-428-6. In German.
  • Hopmann, Stefan Thomas / Brinek, Gertrude / Retzl, Martin (eds.): PISA zufolge PISA (PISA According to PISA): Hält PISA, was es verspricht? (Does PISA Keep What It Promises?). LIT, Wien (2007). ISBN 978-3-8258-0946-1. Four contributions in German, thirteen contributions in English.
  • Rindermann, Heiner (2007). "The g-factor of international cognitive ability comparisons: the homogeneity of results in PISA, TIMSS, PIRLS and IQ-tests across nations." European Journal of Personality, 21, 667–706.