Programme for International Student Assessment


Abbreviation: PISA
Formation: 1997
Purpose: Comparison of educational attainment across the world
Headquarters: OECD Headquarters
Region served: World
Membership: 59 government education departments
Head of the Early Childhood and Schools Division: Michael Davidson
Main organ: PISA Governing Body (Chair: Lorna Bertrand, England)
Parent organization: OECD
Website: PISA

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) of 15-year-old school pupils' scholastic performance in mathematics, science, and reading, carried out in member and non-member nations. It was first performed in 2000 and has been repeated every three years since, with a view to improving education policies and outcomes. The data have increasingly been used both to assess the impact of education quality on incomes and growth and to understand what causes differences in achievement across nations.[1]

In PISA 2009, 470,000 15-year-old students representing 65 nations and territories participated; an additional 50,000 students representing nine nations were tested in 2010.[2]

The Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) by the International Association for the Evaluation of Educational Achievement are similar studies.

Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics, and science.

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts."[3]

Development and implementation

PISA was developed from 1997 onward, and the first assessment was carried out in 2000. The results of each assessment period take about a year and a half to analyse: the first results were published in November 2001, and the release of raw data and the publication of the technical report and data handbook only took place in spring 2002. The triennial repeats follow a similar schedule, so seeing a single PISA cycle through from start to finish takes over four years.

Each assessment period focuses on one of the three competence fields of reading, mathematics and science, but the other two are tested as well. A full cycle is completed every nine years: reading, the main domain in 2000, was again the main domain in 2009.

Period | Main focus | OECD countries | Other countries | Students | Notes
2000 | Reading | 28 | 4 | 265,000 | The Netherlands was disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 | Mathematics | 30 | 11 | 275,000 | The UK was disqualified from data analysis. Also included a test in problem solving.
2006 | Science | 30 | 27 | |
2009 | Reading | 34 | 33? | | Results made available on 7 December 2010.[4]

PISA is sponsored, governed, and coordinated by the OECD. The test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER leads the development and implementation of sampling procedures and assists with monitoring sampling outcomes across participating countries. It also constructs and refines the assessment instruments underlying PISA's reading, mathematics, science, problem-solving and computer-based tests, as well as the background and contextual questionnaires. ACER further develops purpose-built software to assist in sampling and data capture, and analyses all data. The source code of the data analysis software is not made public.

Method of testing

Sampling

Students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period; the school year they are in is not taken into consideration. Only students in school are tested, not home-schooled students. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.

Test

PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. There are six and a half hours of assessment material in total, but no student is tested on all of it; the material is rotated across test booklets, as the sketch below illustrates. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation and family. School directors fill in a questionnaire describing school demographics, funding, etc.
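
A minimal Python sketch of such a rotated-booklet design (the cluster count, booklet size and cyclic assignment are illustrative assumptions, not the official PISA booklet layout):

    # Hypothetical rotated-booklet design (illustrative, not the official
    # PISA layout): the assessment material is split into half-hour item
    # clusters, and each booklet carries only four of them, so no single
    # student works through all six and a half hours of material.
    clusters = [f"C{i:02d}" for i in range(1, 14)]   # 13 assumed half-hour clusters

    booklets = [[clusters[(start + k) % len(clusters)] for k in range(4)]
                for start in range(len(clusters))]   # cyclic cluster assignment

    for n, booklet in enumerate(booklets, 1):
        print(f"Booklet {n:2d}: {' '.join(booklet)}")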

In selected countries, PISA has begun experimenting with computer adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E for Ergänzung, German for "complement"). The test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students take both the international and the national test, another 45,000 take only the latter. This large sample is needed to allow analysis by federal state. Following a clash over the interpretation of the 2006 results, the OECD warned Germany that it might withdraw its right to use the "PISA" label for national tests.[5]

Data scaling

From the beginning, PISA was designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. This scaling is done using the Rasch model of item response theory (IRT). Under IRT, the competence of students who solve none, or all, of the test items cannot be estimated; this problem is circumvented by imposing a Gaussian prior probability distribution on competences.[6]

A single scale is used to express both item difficulties and student competences. The scaling procedure is tuned so that the posterior distribution of student competences, with equal weight given to all OECD countries, has mean 500 and standard deviation 100.
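
The two ingredients just described can be illustrated with a short Python sketch: the Rasch item response function, and a linear mapping onto the mean-500/SD-100 reporting scale (the function names and the stand-alone rescaling are simplifications for illustration, not the operational PISA procedure):

    import numpy as np

    def rasch_prob(theta, b):
        # Rasch model: probability that a student of ability theta solves
        # an item of difficulty b (theta and b share one logit scale).
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def to_reporting_scale(theta, ref_mean, ref_sd):
        # Linear map onto the reporting scale: the reference (equally
        # weighted OECD) ability distribution goes to mean 500, SD 100.
        return 500.0 + 100.0 * (theta - ref_mean) / ref_sd

    rng = np.random.default_rng(0)
    abilities = rng.normal(0.0, 1.0, 10_000)    # Gaussian prior on abilities
    print(rasch_prob(0.5, 0.0))                 # ~0.62 for this student/item
    scores = to_reporting_scale(abilities, abilities.mean(), abilities.std())
    print(scores.mean(), scores.std())          # ~500, ~100 by construction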

Results

The official reports only contain domain-specific scores and do not combine the different domains into an overall score. The final scoring is adjusted so that the OECD average in each domain is 500 and the standard deviation is 100.[7]

Historical tables

All PISA results are broken down by country. Public attention concentrates on just one outcome: mean achievement values by country. These data are regularly published in the form of "league tables".[citation needed]

The following table gives the mean achievements of OECD member countries in the principal testing domain of each period:[8]

In the official reports, country rankings are communicated in a more elaborate form: not as lists, but as cross tables, indicating for each pair of countries whether mean score differences are statistically significant (that is, unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant, as the sketch below illustrates.[citation needed]
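
A pairwise comparison of this kind amounts to a z-test on the difference of two country means. A minimal sketch, with standard errors assumed at about 3.2 points so that a 9-point gap is just significant (real PISA standard errors vary by country):

    import math

    def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
        # Two-sided z-test on the difference of two country means,
        # treating the two sampling standard errors as independent.
        return abs(mean_a - mean_b) / math.sqrt(se_a**2 + se_b**2) > z_crit

    # 9 / sqrt(3.2**2 + 3.2**2) ≈ 1.99 > 1.96, so a 9-point gap is
    # (just) significant under these assumed standard errors.
    print(significantly_different(505.0, 3.2, 496.0, 3.2))   # True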

In some popular media, test results from all three literacy domains have been consolidated into an overall country ranking. Such meta-analysis is not endorsed by the OECD: the official reports only contain domain-specific country scores. In parts of the official reports, however, scores from a period's principal testing domain are used as a proxy for overall student ability.[9]

2000–2006

Top results for the main areas of investigation of PISA in 2000, 2003 and 2006.

OECD members as of the time of the study are in boldface. The 11 partner countries tested in 2002 after the main group of 32 are italicized.
Mathematics
1  Hong Kong, China 560
2  Japan 557
3  Korea 547
4  New Zealand 537
5  Finland 536
6  Australia 533
7  Canada 533
8   Switzerland 529
9  United Kingdom 529
10  Belgium 520
11  France 517
12  Austria 515
13  Denmark 514
14  Iceland 514
15  Liechtenstein 514
16  Sweden 510
17  Ireland 503
18  Norway 499
19  Czech Republic 498
20  United States 493
21  Germany 490
22  Hungary 488
23  Russia 478
24  Spain 476
25  Poland 470
26  Latvia 463
27  Italy 457
28  Portugal 454
29  Greece 447
30  Luxembourg 446
31  Israel 433
32  Thailand 432
33  Bulgaria 430
34  Argentina 388
35  Mexico 387
36  Chile 384
37  Albania 381
38  Macedonia 381
39  Indonesia 367
40  Brazil 334
41  Peru 292

Science
1  Korea 552
2  Japan 550
3  Hong Kong, China 541
4  Finland 538
5  United Kingdom 532
6  Canada 529
7  New Zealand 528
8  Australia 528
9  Austria 519
10  Ireland 513
11  Sweden 512
12  Czech Republic 511
13  France 500
14  Norway 500
15  United States 499
16  Hungary 496
17  Iceland 496
18  Belgium 496
19   Switzerland 496
20  Spain 491
21  Germany 487
22  Poland 483
23  Denmark 481
24  Italy 478
25  Liechtenstein 476
26  Greece 461
27  Russia 460
28  Latvia 460
29  Portugal 459
30  Bulgaria 448
31  Luxembourg 443
32  Thailand 436
33  Israel 434
34  Mexico 422
35  Chile 415
36  Macedonia 401
37  Argentina 396
38  Indonesia 393
39  Albania 376
40  Brazil 375
41  Peru 333

Reading
1  Finland 546
2  Canada 534
3  New Zealand 529
4  Australia 528
5  Ireland 527
6  Hong Kong, China 525
7  Korea 525
8  United Kingdom 523
9  Japan 522
10  Sweden 516
11  Austria 507
12  Belgium 507
13  Iceland 507
14  Norway 505
15  France 505
16  United States 504
17  Denmark 497
18   Switzerland 494
19  Spain 493
20  Czech Republic 492
21  Italy 487
22  Germany 484
23  Liechtenstein 483
24  Hungary 480
25  Poland 479
26  Greece 474
27  Portugal 470
28  Russia 462
29  Latvia 458
30  Israel 452
31  Luxembourg 441
32  Thailand 431
33  Bulgaria 430
34  Mexico 422
35  Argentina 418
36  Chile 410
37  Brazil 396
38  Macedonia 373
39  Indonesia 371
40  Albania 349
41  Peru 327

2006

Top 10 countries for PISA 2006 results in mathematics, science and reading.

OECD members as of the time of the study are in boldface. Reading scores for the United States were disqualified.
Mathematics
1  Taiwan 549
2  Finland 548
3  South Korea 547
4  Hong Kong, China 547
5  Netherlands 531
6   Switzerland 530
7  Canada 527
8  Macau, China 525
9  Liechtenstein 525
10  Japan 523
11  New Zealand 522
12  Belgium 520
13  Australia 520
14  Estonia 515
15  Denmark 513
16  Czech Republic 510
17  Iceland 506
18  Austria 505
19  Slovenia 504
20  Germany 504
21  Sweden 502
22  Ireland 501
23  France 496
24  United Kingdom 495
25  Poland 495
26  Slovakia 492
27  Hungary 491
28  Norway 490
29  Luxembourg 490
30  Lithuania 486
31  Latvia 486
32  Spain 480
33  Russia 476
34  Azerbaijan 476
35  United States 474
36  Croatia 467
37  Portugal 466
38  Italy 462
39  Greece 459
40  Israel 442
41  Serbia 435
42  Uruguay 427
43  Turkey 424
44  Thailand 417
45  Romania 415
46  Bulgaria 413
47  Chile 411
48  Mexico 406
49  Montenegro 399
50  Indonesia 391
51  Jordan 384
52  Argentina 381
53  Colombia 370
54  Brazil 370
55  Tunisia 365
56  Qatar 318
57  Kyrgyzstan 311

Science
1  Finland 563
2  Hong Kong, China 542
3  Canada 534
4  Taiwan 532
5  Japan 531
6  Estonia 531
7  New Zealand 530
8  Australia 527
9  Netherlands 525
10  Liechtenstein 522
11  South Korea 522
12  Slovenia 519
13  Germany 516
14  United Kingdom 515
15  Czech Republic 513
16   Switzerland 512
17  Austria 511
18  Macau, China 511
19  Belgium 510
20  Ireland 508
21  Hungary 504
22  Sweden 503
23  Poland 498
24  Denmark 496
25  France 495
26  Croatia 493
27  Iceland 491
28  Latvia 490
29  United States 489
30  Slovakia 488
31  Spain 488
32  Lithuania 488
33  Norway 487
34  Luxembourg 486
35  Russia 479
36  Italy 475
37  Portugal 474
38  Greece 473
39  Israel 454
40  Chile 438
41  Serbia 436
42  Bulgaria 434
43  Uruguay 428
44  Turkey 424
45  Jordan 422
46  Thailand 421
47  Romania 418
48  Montenegro 412
49  Mexico 410
50  Indonesia 393
51  Argentina 391
52  Brazil 390
53  Colombia 388
54  Tunisia 386
55  Azerbaijan 382
56  Qatar 349
57  Kyrgyzstan 322

Reading
1  South Korea 556
2  Finland 547
3  Hong Kong, China 536
4  Canada 527
5  New Zealand 521
6  Ireland 517
7  Australia 513
8  Liechtenstein 510
9  Poland 508
10  Sweden 507
11  Netherlands 507
12  Belgium 501
13  Estonia 501
14   Switzerland 499
15  Japan 498
16  Taiwan 496
17  United Kingdom 495
18  Germany 495
19  Denmark 494
20  Slovenia 494
21  Macau, China 492
22  Austria 490
23  France 488
24  Iceland 484
25  Norway 484
26  Czech Republic 483
27  Hungary 482
28  Latvia 479
29  Luxembourg 479
30  Croatia 477
31  Portugal 472
32  Lithuania 470
33  Italy 469
34  Slovakia 466
35  Spain 461
36  Greece 460
37  Turkey 447
38  Chile 442
39  Russia 440
40  Israel 439
41  Thailand 417
42  Uruguay 413
43  Mexico 410
44  Bulgaria 402
45  Serbia 401
46  Jordan 401
47  Romania 396
48  Indonesia 393
49  Brazil 393
50  Montenegro 392
51  Colombia 385
52  Tunisia 380
53  Argentina 374
54  Azerbaijan 353
55  Qatar 312
56  Kyrgyzstan 285

2009

The PISA 2009 results in mathematics, science and reading for all 34 OECD members and 37 partner countries. Of the partner countries, only selected areas of three countries (India, Venezuela and China) were assessed. Due to scheduling constraints, 10 of those partners carried out their tests in 2010 rather than 2009.

OECD members as of the time of the study are in boldface. Participants in PISA 2009+, which were tested in 2010 after the main group of 65, are italicized.
Mathematics
1  Shanghai, China 600
2  Singapore 562
3  Hong Kong, China 555
4  South Korea 546
5  Taiwan 543
6  Finland 541
7  Liechtenstein 536
8   Switzerland 534
9  Japan 529
10  Canada 527
11  Netherlands 526
12  Macau, China 525
13  New Zealand 519
14  Belgium 515
15  Australia 514
16  Germany 513
17  Estonia 512
18  Iceland 507
19  Denmark 503
20  Slovenia 501
21  Norway 498
22  France 497
23  Slovakia 497
24  Austria 496
25  Poland 495
26  Sweden 494
27  Czech Republic 493
28  United Kingdom 492
29  Hungary 490
30  Luxembourg 489
31  United States 487
32  Portugal 487
33  Ireland 487
34  Spain 483
35  Italy 483
36  Latvia 482
37  Lithuania 477
38  Russia 468
39  Greece 466
40  Malta 463
41  Croatia 460
42  Israel 447
43  Turkey 445
44  Serbia 442
45  Azerbaijan 431
46  Bulgaria 428
47  Uruguay 427
48  Romania 427
49  United Arab Emirates 421
50  Chile 421
51  Mauritius 420
52  Thailand 419
53  Mexico 419
54  Trinidad and Tobago 414
55  Costa Rica 409
56  Kazakhstan 405
57  Malaysia 404
58  Montenegro 403
59  Moldova 397
60  Miranda, Venezuela 397
61  Argentina 388
62  Jordan 387
63  Brazil 386
64  Colombia 381
65  Georgia 379
66  Albania 377
67  Tunisia 371
68  Indonesia 371
69  Qatar 368
70  Peru 365
71  Panama 360
72  Tamil Nadu, India 351
73  Himachal Pradesh, India 338
74  Kyrgyzstan 331

Science
1  Shanghai, China 575
2  Finland 554
3  Hong Kong, China 549
4  Singapore 542
5  Japan 539
6  South Korea 538
7  New Zealand 532
8  Canada 529
9  Estonia 528
10  Australia 527
11  Netherlands 522
12  Liechtenstein 520
13  Germany 520
14  Taiwan 520
15   Switzerland 517
16  United Kingdom 514
17  Slovenia 512
18  Macau, China 511
19  Poland 508
20  Ireland 508
21  Belgium 507
22  Hungary 503
23  United States 502
24  Norway 500
25  Czech Republic 500
26  Denmark 499
27  France 498
28  Iceland 496
29  Sweden 495
30  Latvia 494
31  Austria 494
32  Portugal 493
33  Lithuania 491
34  Slovakia 490
35  Italy 489
36  Spain 488
37  Croatia 486
38  Luxembourg 484
39  Russia 478
40  Greece 470
41  Malta 461
42  Israel 455
43  Turkey 454
44  Chile 447
45  Serbia 443
46  Bulgaria 439
47  United Arab Emirates 438
48  Costa Rica 430
49  Romania 428
50  Uruguay 427
51  Thailand 425
52  Miranda, Venezuela 422
53  Malaysia 422
54  Mauritius 417
55  Mexico 416
56  Jordan 415
57  Moldova 413
58  Trinidad and Tobago 410
59  Brazil 405
60  Colombia 402
61  Tunisia 401
62  Montenegro 401
63  Argentina 401
64  Kazakhstan 400
65  Albania 391
66  Indonesia 383
67  Qatar 379
68  Panama 376
69  Georgia 373
70  Azerbaijan 373
71  Peru 369
72  Tamil Nadu, India 348
73  Kyrgyzstan 330
74  Himachal Pradesh, India 325

Reading
1  Shanghai, China 556
2  South Korea 539
3  Finland 536
4  Hong Kong, China 533
5  Singapore 526
6  Canada 524
7  New Zealand 521
8  Japan 520
9  Australia 515
10  Netherlands 508
11  Belgium 506
12  Norway 503
13  Estonia 501
14   Switzerland 501
15  Poland 500
16  Iceland 500
17  United States 500
18  Liechtenstein 499
19  Sweden 497
20  Germany 497
21  Ireland 496
22  France 496
23  Taiwan 495
24  Denmark 495
25  United Kingdom 494
26  Hungary 494
27  Portugal 489
28  Macau, China 487
29  Italy 486
30  Latvia 484
31  Greece 483
32  Slovenia 483
33  Spain 481
34  Czech Republic 478
35  Slovakia 477
36  Croatia 476
37  Israel 474
38  Luxembourg 472
39  Austria 470
40  Lithuania 468
41  Turkey 464
42  Russia 459
43  Chile 449
44  Costa Rica 443
45  Malta 442
46  Serbia 442
47  United Arab Emirates 431
48  Bulgaria 429
49  Uruguay 426
50  Mexico 425
51  Romania 424
52  Miranda, Venezuela 422
53  Thailand 421
54  Trinidad and Tobago 416
55  Malaysia 414
56  Colombia 413
57  Brazil 412
58  Montenegro 408
59  Mauritius 407
60  Jordan 405
61  Tunisia 404
62  Indonesia 402
63  Argentina 398
64  Kazakhstan 390
65  Moldova 388
66  Albania 385
67  Georgia 374
68  Qatar 372
69  Panama 371
70  Peru 370
71  Azerbaijan 362
72  Tamil Nadu, India 337
73  Himachal Pradesh, India 317
74  Kyrgyzstan 314

Comparison with other studies

The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science. The values drop to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. Such high correlations indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. Western countries perform slightly better in PISA; Eastern European and Asian countries perform slightly better in TIMSS. Content balance and years of schooling explain most of the variation.[10]
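
Cross-study figures of this kind are ordinary Pearson correlations computed over country mean scores. A minimal sketch with invented numbers (hypothetical data, not real PISA or TIMSS results):

    import numpy as np

    # Hypothetical country means on two assessments (invented for
    # illustration), showing how a country-level cross-study
    # correlation such as the 0.84 quoted above is computed.
    pisa_math  = np.array([544, 542, 538, 527, 516, 503, 495, 483, 466, 385])
    timss_math = np.array([589, 570, 585, 541, 531, 508, 498, 505, 477, 390])

    r = np.corrcoef(pisa_math, timss_math)[0, 1]
    print(f"country-level correlation: {r:.2f}")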

Reception

For many countries, the results from PISA 2000 were surprising. In Germany and the United States, for example, the comparatively low scores brought on heated debate about how the school system should be changed.[citation needed]

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman".[12]

China

Education professor Yong Zhao has noted that PISA 2009 did not receive much attention in the Chinese media, and that the high scores in China are due to excessive workload and testing, adding that it is "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle: Singapore, Korea, Japan, and Hong Kong."[13]

India

Of the 74 countries and economies tested in the PISA 2009 cycle, including the "+" nations, the two Indian states came 72nd and 73rd out of 74 in both reading and mathematics, and 73rd and 74th in science. India's poor performance may not be linguistic, as some have suggested: 12.87% of US students indicated that the language of the test differed from the language spoken at home, while a significantly higher 30.77% of Himachal Pradesh students did so.[14] However, unlike American students, Indian students with a different language at home did better on the PISA test than those with the same language.[15] India's poor performance on PISA is consistent with its poor performance in the only other instance in which India's government allowed an international organization to test its students,[16] and with India's own testing of its elite students in a study titled Student Learning in the Metro 2006. These studies were conducted using TIMSS questions. The poor result in PISA was greeted with dismay in the Indian media.[17] The BBC reported that as of 2008, only 15% of India's students reach high school.[18]

India pulled out of the 2012 round of PISA testing in August 2012, with the Indian government attributing its decision to the unfairness of PISA testing to Indian students.[19] The Indian Express reported on 9/3/2012 that "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's 'socio-cultural milieu'. India's participation in the next PISA cycle will hinge on this".[20] The Indian Express also noted that "Considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".

In June 2013, the Indian government, still concerned about the fairness of PISA testing for Indian students, pulled India out of the 2015 round of PISA testing as well.[21]

UK

On 26 July 2013, the TES (Times Educational Supplement) published academic criticisms of the conceptual foundations and methods of PISA.[22]

It quotes University of Bristol Professor Harvey Goldstein, who explains that when the OECD tries to rule out questions suspected of bias, it can have the effect of “smoothing out” key differences between countries. “That is leaving out many of the important things,” he warns. “They simply don’t get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart.”

The article cites University of Copenhagen Professor Svend Kreiner, who looked at the reading results for 2006 in detail. He says that only about 10 per cent of the students who took part in PISA were tested on all 28 reading questions. “This in itself is ridiculous,” Kreiner told the TES. “Most people don’t know that half of the students taking part in PISA (2006) do not respond to any reading item at all. Despite that, PISA assigns reading scores to these children.”[23]

Queen's University Belfast mathematician Dr. Hugh Morrison argues that the statistical model underlying PISA has a fundamental, insoluble mathematical error that renders PISA rankings "valueless".[24] Goldstein told the TES that Morrison highlights "an important technical issue" rather than a "profound conceptual error". Nevertheless, he argued, PISA "has been used inappropriately and some of the blame for that lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects."

Research on causes of country differences

Large international student assessment programs such as PISA and TIMSS have provided essential data for many recent analyses of how student achievement affects society at large, such as economic development,[25] democratization and health.[26]

Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, other researchers have investigated single educational factors such as central exams,[27] private schooling, or streaming between schools at a later age.[28] An extensive literature on cross-country differences in scores has also developed since 2000.[1]

Finland

The stable, high marks of Finnish students have attracted a lot of attention. According to Hannu Simola,[29] the results reflect a paradoxical mix of progressive policies implemented in a rather conservative pedagogic setting, where teachers' high levels of academic preparation, social status, professionalism and motivation for the job coexist with adherence to traditional roles and methods by both teachers and pupils in Finland's changing, but still rather authoritarian, culture. Others have suggested that Finland's low poverty rate is a reason for its success.[30][31]

Finnish education reformer Pasi Sahlberg attributes Finland's high educational achievement to the country's focus on access to quality education for all, as opposed to competition among teachers and schools.[32] Lynn and Meisenberg (2010) found very high correlations (r > 0.90) between mean student assessment results from PISA, TIMSS, PIRLS and other studies and IQ measurements at the country level.[33]

An evaluation of the 2003 results showed that countries that spent more on education did not necessarily do better. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, South Korea, New Zealand and the Netherlands spent less but did relatively well, whereas the United States spent much more yet scored below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States, which came 24th out of the 29 countries compared.[citation needed]

Another point made in the evaluation was that students with higher-earning parents tend to be better educated and to achieve higher results. This was true in all the countries tested, although it was more pronounced in certain countries, such as Germany.[citation needed]

China

In 2010, the PISA 2009 results revealed that Shanghai students had scored highest in the world in every category (mathematics, reading and science). The OECD described Shanghai as a pioneer of educational reform, noting that "there has been a sea change in pedagogy" and that the authorities "abandoned their focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving."[34]

The OECD has also noted that even in rural China results approached average levels for OECD countries: "Citing further, as-yet unpublished OECD research, Mr Schleicher said, 'We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average.'"[35] For a developing country, China's 99.4% enrolment in primary education is already, as the OECD puts it, "the envy of many countries", while junior secondary school participation rates in China are now 99%. In Shanghai, not only has senior secondary school enrolment reached 98%, but admissions into higher education have reached 80% of the relevant age group. That this growth reflects quality, not just quantity, is confirmed by the OECD's ranking of Shanghai's secondary education as number one in the world.[35] According to the OECD, China has also expanded school access and moved away from learning by rote.[36] "'The last point is key: Russia performs well in rote-based assessments, but not in Pisa,' says Schleicher, head of the indicators and analysis division at the OECD's directorate for education. 'China does well in both rote-based and broader assessments.'"[35]

United States

Two studies have compared high achievers in mathematics on PISA and the U.S. National Assessment of Educational Progress (NAEP), matching the "advanced" and "proficient" levels on the NAEP with corresponding performance on PISA. Overall, 30 nations had higher percentages of students at the "advanced" level of mathematics than the U.S.; the only OECD countries with worse results were Portugal, Greece, Turkey, and Mexico. Six percent of U.S. students were "advanced" in mathematics, compared with 28 percent in Taiwan. The highest-ranked U.S. state, Massachusetts, would have been just 15th in the world had it been compared with the nations participating in PISA. For the "proficient" level, 31 nations had higher percentages of students than the U.S.; Massachusetts was again the best U.S. state, but would have ranked just ninth in the world.[37][38]

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings.[39] The difference in apparent rankings is, however, almost entirely accounted for by the sampling of countries: PISA includes all of the OECD countries, while TIMSS sampling is weighted much more toward developing countries.

Poverty

University of Southern California professor Stephen Krashen[40] and Mel Riddile of the NASSP argue that low performance in the United States is closely related to American poverty, and that the same reasoning applies to other countries.[30][31]

Participation in free or reduced-price school lunch programmes is the only available within-country poverty indicator for US schoolchildren. US schools where fewer than 10% of students received free or reduced-price lunch averaged a PISA score of 551, higher than that of any other OECD country. The table below compares US schools, grouped by this indicator, with other OECD countries, for which figures on children living in relative poverty are tabled:[31]

Country | % of students with reduced-price school lunch (US)[31] or % of children in relative poverty (other OECD)[41] | PISA score[42]
United States | < 10% | 551
Finland | 3.4% | 536
Netherlands | 9.0% | 508
Belgium | 6.7% | 506
United States | 10%–24.9% | 527
Canada | 13.6% | 524
New Zealand | 16.3% | 521
Japan | 14.3% | 520
Australia | 11.6% | 515
United States | 25%–49.9% | 502
Estonia | 40.1% | 501
United States | 50%–74.9% | 471
Russian Federation | 58.3% | 459
United States | > 75% | 446

Portugal

According to the OECD's PISA, the average Portuguese 15-year-old student was for many years among the lowest performers in the OECD in reading literacy, mathematics and science, nearly tied with Italian students and just above those from countries like Greece, Turkey and Mexico. Since 2010, however, PISA results for Portuguese students have improved dramatically. The Portuguese Ministry of Education had announced a 2010 report published by its office for education evaluation, GAVE (Gabinete de Avaliação do Ministério da Educação), which criticized the PISA 2009 results and claimed that the average Portuguese teenage student had profound handicaps in terms of expression, communication and logic, as well as low performance when asked to solve problems. The report also claimed that these shortcomings are not exclusive to Portugal but occur in other countries due to the way PISA was designed.[43]

References

  1. ^ a b Hanushek, Eric A., and Ludger Woessmann. 2011. "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland: 89-200.
  2. ^ PISA 2009 Technical Report, 2012, OECD, http://www.oecd.org/dataoecd/60/31/50036771.pdf
  3. ^ Chapter 2 of the publication "PISA 2003 Assessment Framework", pdf
  4. ^ http://www.oecd.org/document/34/0,3343,en_2649_35845621_44949730_1_1_1_1,00.html
  5. ^ C. Füller: Pisa hat einen kleinen, fröhlichen Bruder ["Pisa has a little, cheerful brother"]. taz, 5 December 2007 [1]
  6. ^ The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003 and 2006. It is similar to procedures employed in NAEP and TIMSS. According to J. Wuttke, Die Insignifikanz signifikanter Unterschiede ["The insignificance of significant differences"] (2007, in German), the description in the Technical Reports is incomplete and plagued by notational errors.
  7. ^ PISA 2009. http://www.pisa.oecd.org/document/61/0,3746,en_32252351_32235731_46567613_1_1_1_1,00.html
  8. ^ OECD (2001) p. 53; OECD (2004a) p. 92; OECD (2007) p. 56.
  9. ^ E.g. OECD (2001), chapters 7 and 8: Influence of school organization and socio-economic background upon performance in the reading test. Reading was the main domain of PISA 2000.
  10. ^ M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March, 2008. [2].
  11. ^ Preocupe-se. Seu filho é mal educado ["Worry. Your child is badly educated"], Veja, November 7, 2007, retrieved April 13, 2013 (in Portuguese)
  12. ^ "Waiting for "Superman" trailer". Retrieved October 8, 2010.
  13. ^ Yong Zhao (2010-12-10), A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students Top PISA Performance
  14. ^ http://pisa2009.acer.edu.au/interactive_results.php
  15. ^ http://pisa2009.acer.edu.au/interactive_results.php
  16. ^ http://ddp-ext.worldbank.org/EdStats/INDprwp08b.pdf
  17. ^ Vishnoi, Anubhuti (2012-01-07), Poor PISA ranks: HRD seeks reason, The Indian Express
  18. ^ Masani, Zareer (February 27, 2008). "India still Asia's reluctant tiger". BBC News.
  19. ^ http://articles.timesofindia.indiatimes.com/2012-08-03/mumbai/33019239_1_india-backs-global-stage-math-and-science
  20. ^ http://www.indianexpress.com/news/poor-pisa-score-govt-blames--disconnect--with-india/996890/
  21. ^ http://timesofindia.indiatimes.com/home/education/news/India-chickens-out-of-international-students-assessment-programme-again/articleshow/20375670.cms
  22. ^ http://www.tes.co.uk/article.aspx?storycode=6344672
  23. ^ https://ifsv.sund.ku.dk/biostat/biostat_annualreport/images/c/ca/ResearchReport-2011-1.pdf
  24. ^ http://www.qub.ac.uk/schools/SchoolofEducation/AboutUs/Staff/Academic/DrHughMorrison/Filestore/Filetoupload,387514,en.pdf
  25. ^ Hanushek, Eric; Woessmann, Ludger (2008), "The role of cognitive skills in economic development" (PDF), Journal of Economic Literature, 46 (3): 607–668, doi:10.1257/jel.46.3.607
  26. ^ Rindermann, Heiner; Ceci, Stephen J (2009), "Educational policy and country outcomes in international cognitive competence studies", Perspectives on Psychological Science, 4 (6): 551–577, doi:10.1111/j.1745-6924.2009.01165.x
  27. ^ Bishop, John H (1997), "The effect of national standards and curriculum-based exams on achievement", American Economic Review, 87 (2): 260–264
  28. ^ Hanushek, Eric; Woessmann, Ludger (2006), "Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries" (PDF), Economic Journal, 116 (510): C63–C76
  29. ^ Simola, H. (2005). The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education. Comparative Education, 41, 455-470.
  30. ^ a b "The Economics Behind International Education Rankings" National Educational Association
  31. ^ a b c d Riddile, Mel (2010-12-15), PISA: It's Poverty Not Stupid, National Association of Secondary School Principals
  32. ^ http://www.theatlantic.com/national/archive/2011/12/what-americans-keep-ignoring-about-finlands-school-success/250564/
  33. ^ Lynn, R. & Meisenberg, G. (2010). National IQs calculated and validated for 108 nations. Intelligence, 38, 353-360.
  34. ^ Gumbel, Peter (2010-12-07), China Beats Out Finland for Top Marks in Education, TIME, retrieved 2012-06-27
  35. ^ a b c Cook, Chris (2010-12-07), Shanghai tops global state school rankings, Financial Times, retrieved 2012-06-28
  36. ^ Mance, Henry (2010-12-07), Why are Chinese schoolkids so good?, Financial Times, retrieved 2012-06-28
  37. ^ Paul E. Peterson, Ludger Woessmann, Eric A. Hanushek, and Carlos X. Lastra-Anadón (2011) "Are U.S. students ready to compete? The latest on each state’s international standing." Education Next 11, no. 4 (Fall): 51-59. http://educationnext.org/are-u-s-students-ready-to-compete/
  38. ^ Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (2011) "Teaching math to the talented." Education Next 11, no. 1 (Winter): 10-18. http://educationnext.org/teaching-math-to-the-talented/
  39. ^ Gary W. Phillips (2007) Chance favors the prepared mind: Mathematics and science indicators for comparing states. Washington: American Institutes for Research (November 14); Gary W. Phillips (2009) The Second Derivative:International Benchmarks in Mathematics For U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  40. ^ "How poverty affected U.S. PISA scores" The Washington Post
  41. ^ "Child poverty statistics: how the UK compares to other countries", The Guardian. The same UNICEF figures were used by Riddile.
  42. ^ Highlights From PISA 2009, Table 3.
  43. ^ Estudo do ministério aponta graves problemas aos alunos portugueses ["Ministry study points to serious problems among Portuguese students"], GAVE (Gabinete de Avaliação do Ministério da Educação) 2010 report in RTP (in Portuguese)

Further reading

Official websites and reports

  • OECD/PISA website
    • OECD (1999): Measuring Student Knowledge and Skills. A New Framework for Assessment. Paris: OECD, ISBN 92-64-17053-7 [3]
    • OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
    • OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD, ISBN 978-92-64-10172-2 [4]
    • OECD (2004a): Learning for Tomorrow's World. First Results from PISA 2003. Paris: OECD, ISBN 978-92-64-00724-6 [5]
    • OECD (2004b): Problem Solving for Tomorrow's World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD, ISBN 978-92-64-00642-3
    • OECD (2005): PISA 2003 Technical Report. Paris: OECD, ISBN 978-92-64-01053-6
    • OECD (2007): Science Competencies for Tomorrow's World: Results from PISA 2006 [6]

Reception and political consequences

  • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD ["Diffusion through international organizations: the education policy of the OECD"]. In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken ["Transfer, diffusion and convergence of policies"]. VS Verlag für Sozialwissenschaften, 2007. (in German)

France

  • N. Mons, X. Pons: The reception and use of Pisa in France.

Germany

  • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD observer, no. 231/232, May 2002. pp. 33–34.
  • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, v 32 n 5 pp 619–634 Nov 2006.

United Kingdom

  • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland. [7]

Criticism

Books

  • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA. PISA According to PISA. LIT-Verlag, Wien 2007, ISBN 3-8258-0946-3 (partly in German, partly in English)
  • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms ["PISA & Co: critique of a programme"]. Franzbecker, Hildesheim 2007 (2nd edn.), ISBN 978-3-88120-464-4 (in German)
  • R. Münch: Globale Eliten, lokale Autoritäten: Bildung und Wissenschaft unter dem Regime von PISA, McKinsey & Co. ["Global elites, local authorities: education and science under the regime of PISA, McKinsey & Co."]. Frankfurt am Main: Suhrkamp, 2009. ISBN 978-3-518-12560-1 (in German)
