


The Geoscience Concept Inventory

A valid and reliable assessment instrument designed for diagnosis of alternative conceptions and assessment of learning in entry-level earth science courses

Overview
Have comments about the Geoscience Concept Inventory (GCI)?
We have been collecting comments and suggestions from users since the GCI's inception. In fact, user comments are an important source of validity information. We are also initiating a new project to revise and expand the GCI, and extend an open invitation to the geoscience community to participate. Visit the GCI Wiki to view and comment on existing GCI questions, or submit a question of your own and become a co-author of the GCI. To assess student learning online, please visit The GCI Assessment Site.

What is the Geoscience Concept Inventory (GCI)?
The Geoscience Concept Inventory (GCI) is a multiple-choice assessment instrument for use in the Earth sciences classroom. The GCI v.1.0 consisted of 69 validated questions that could be selected by an instructor to create a customized 15-question GCI subtest for use in their course. These test items cover topics related to general physical geology concepts, as well as underlying fundamental ideas in physics and chemistry, such as gravity and radioactivity, that are integral to understanding the conceptual Earth. Each question has gone through rigorous reliability and validation studies.

We built the GCI using the most rigorous methodologies available, including scale development theory, grounded theory, and item response theory (IRT). To ensure inventory validity, we incorporated a mixed-methods approach using advanced psychometric techniques not commonly used in developing content-specific assessment instruments. We conducted ~75 interviews with college students, collected nearly 1000 open-ended questionnaires, grounded test content in these qualitative data, and piloted test items at over 40 institutions nationwide, with ~5000 student participants.

In brief, development of the GCI involved interviewing students, collecting open-ended questionnaires, generating test items based upon student responses, soliciting external review of items by both scientists and educators, pilot testing items, analyzing items via standard factor analysis and item response theory, conducting "think aloud" interviews with students during test piloting, and iteratively revising, re-piloting, and re-analyzing items. Although time consuming, the statistical rigor of the resulting items on an IRT scale suggests that this method of assessment test development is reliable.
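As a simpler companion to the factor-analytic and IRT analyses described above, classical item analysis is often run on pilot data: each item's difficulty (proportion of students answering correctly) and discrimination (point-biserial correlation between item score and total score) flag items needing revision. The sketch below illustrates this on a small hypothetical response matrix; the data and function names are illustrative, not drawn from the GCI itself.

```python
# Minimal classical item analysis on a hypothetical 0/1 response matrix
# (rows = students, columns = items). Illustrative only; the GCI's own
# analyses used factor analysis and item response theory.
from statistics import mean, pstdev

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

totals = [sum(row) for row in responses]  # each student's total score

def point_biserial(item, totals):
    """Pearson correlation between a dichotomous item and total scores."""
    mi, mt = mean(item), mean(totals)
    si, st = pstdev(item), pstdev(totals)
    if si == 0 or st == 0:          # no variance -> correlation undefined
        return 0.0
    cov = mean((i - mi) * (t - mt) for i, t in zip(item, totals))
    return cov / (si * st)

for j in range(len(responses[0])):
    item = [row[j] for row in responses]
    difficulty = mean(item)          # proportion answering correctly
    discrimination = point_biserial(item, totals)
    print(f"item {j + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:.2f}")
```

In practice, items with very high or very low difficulty, or with low discrimination, are candidates for the revision and re-piloting cycle described above.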