
Collegiate Learning Assessment



The Collegiate Learning Assessment (CLA) is a standardized testing initiative in United States higher education evaluation and assessment. It uses a "value-added" outcome model to examine a college or university's contribution to student learning, relying on the institution, rather than the individual student, as the primary unit of analysis. The CLA measures are designed to test for critical thinking, analytic reasoning, problem solving, and written communication skills. The assessment consists of open-ended questions, is administered to students online, and controls for incoming academic ability. An institution's average score on the CLA measures correlates highly with the institution's average SAT score (r = 0.90).[1] Institutional results are not published.
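
As an illustration of the value-added idea only (a minimal sketch assuming a simple ordinary-least-squares residual approach with made-up institution-level data, not CAE's published scoring procedure), an institution's "value added" can be read as the gap between its observed average CLA score and the score predicted from its students' entering SAT average:

    # Minimal sketch of a regression-residual reading of "value added".
    # NOTE: the data and the least-squares approach are illustrative
    # assumptions, not CAE's actual scoring model.
    import numpy as np

    # Hypothetical institution-level averages: entering SAT and CLA score.
    sat = np.array([1000.0, 1100.0, 1200.0, 1300.0, 1400.0])
    cla = np.array([1010.0, 1125.0, 1180.0, 1330.0, 1390.0])

    # Predict the expected CLA score from entering ability (simple linear fit).
    slope, intercept = np.polyfit(sat, cla, 1)
    expected = slope * sat + intercept

    # "Value added" is read here as observed minus expected performance.
    value_added = cla - expected
    for s, v in zip(sat, value_added):
        print(f"Entering SAT {s:.0f}: value added {v:+.1f}")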

History and test format


The CLA was first launched in 2000 by the Council for Aid to Education (CAE), a national nonprofit organization based in New York City. Rather than testing for specific content knowledge gained in particular courses or majors, the intent was to assess “the collective and cumulative result of what takes place or does not take place over the four to six years of undergraduate education in and out of the classroom.”[2] The most developed and sophisticated part of the test is its performance task component,[3] in which students are given ninety minutes to respond to a writing prompt associated with a set of background documents. The testing materials, including the documents, are accessed through a computer. CAE has published several examples of representative performance tasks, one of which is described below (from Academically Adrift: Limited Learning on College Campuses):

The “DynaTech” performance task asks students to generate a memo advising an employer about the desirability of purchasing a type of airplane that has recently crashed. Students are informed: “You are the assistant to Pat Williams, the president of DynaTech, a company that makes precision electronic instruments and navigational equipment. Sally Evans, a member of DynaTech’s sales force, recommended that DynaTech buy a small private plane (a SwiftAir 235) that she and other members of the sales force could use to visit customers. Pat was about to approve the purchase when there was an accident involving a SwiftAir 235.” Students are provided with the following set of documents for this activity: newspaper articles about the accident, a federal accident report on in-flight breakups in single engine planes, Pat Williams’s e-mail to her assistant and Sally Evans’s e-mail to Pat Williams, charts on SwiftAir’s performance characteristics, an article from Amateur Pilot magazine comparing SwiftAir 235 to similar planes, and pictures and descriptions of SwiftAir models 180 and 235. Students are then instructed to “prepare a memo that addresses several questions, including what data support or refute the claim that the type of wing on the SwiftAir 235 leads to more in-flight breakups, what other factors might have contributed to the accident and should be taken into account, and your overall recommendation about whether or not DynaTech should purchase the plane.”[4]

CAE also publishes its scoring rubric.[5] The design of both the prompts and the evaluation criteria reflects the CLA's focus on complex, holistic, real-world problem-solving as a measure of high-level learning. As a result, the argument goes, institutions that attempt to “teach to the test” will be schools that teach students “to think critically, reason analytically, solve problems, and communicate clearly.”[6]

Criticisms


According to Academically Adrift, there are four primary criticisms of the CLA. The first two relate to the validity of the instrument:

  1. By focusing on general skills rather than domain knowledge and specialization, the instrument may lack the construct validity to measure the specialized knowledge that students go to college to obtain.
  2. The CLA lacks instrumental validity to measure individual performance. This concern, however, may have been partially addressed by a 2009 test validity study organized by the Fund for the Improvement of Postsecondary Education (FIPSE). The results showed that while these tests should not be used as a basis to make institutional decisions about students as individuals (e.g., promotion or course placement), when aggregated in larger samples they can provide reliable estimates of institutional or group-level differences in performance on these tasks.[7][8]

Two other criticisms relate to the normative implications of the CLA:

  1. An aversion to standardized testing in general, combined with a belief that market forces already hold higher education accountable: if students were not gaining something from their education, they would not be paying for it. On this view, the CLA is a waste of time and money.
  2. The use of the test implies that the value added by higher education can be measured. Individuals who reject the idea that learning consists of acquiring specific knowledge or mental constructs, or that it is intended to improve performance on some measure (such as income or lifespan), may therefore oppose the measurement of learning.

See also


Voluntary System of Accountability – an initiative developed by the American Association of State Colleges and Universities (AASCU) and the National Association of State Universities and Land-Grant Colleges (NASULGC) for four-year public colleges and universities. The VSA endorses the use of the CLA for reporting student learning outcomes through the College Portrait.

References

  1. ^ Benjamin, Robert; Chun, Marc (2003). "A New Field of Dreams: The Collegiate Learning Assessment Project". Peer Review. 5 (4): 26–29. Retrieved 2011-11-19.
  2. ^ Hersch, Richard (2007). "Going Naked". Peer Review. 9.
  3. ^ Arum, Richard; Roksa, Josipa (2011). Academically Adrift: Limited Learning on College Campuses. University of Chicago Press.
  4. ^ Arum, Richard; Roksa, Josipa (2011). Academically Adrift: Limited Learning on College Campuses. University of Chicago Press.
  5. ^ "Collegiate Learning Assessment Common Scoring Rubric". Council for Aid to Education. New York: Council for Aid to Education, 2008. Retrieved 2013-05-16.
  6. ^ Shavelson, Richard (2008). "The Collegiate Learning Assessment". Ford Policy Forum 2008: Forum for the Future of Higher Education.
  7. ^ Arum, Richard; Roksa, Josipa (2011). Academically Adrift: Limited Learning on College Campuses. University of Chicago Press.
  8. ^ Klein, Stephen; Liu, Ou Lydia; Sconing, James (September 29, 2009). "Test Validity Study (TVS) Report" (PDF).