Data Collection and Analysis

Tamara van Gog, Fred Paas, Wilhelmina Savenye, Rhonda Robinson, Mary Niemczyk, Robert Atkinson, Tristan E. Johnson, Debra L. O’Connor, Remy M.J.P. Rikers, Paul Ayres, Aaron R. Duley, Paul Ward, Peter A. Hancock

    Research output: Chapter in Book/Report/Conference proceeding › Chapter

    2 Scopus citations


    The focus of this chapter is on methods of data collection and analysis for the assessment of learning processes and complex performance, the last part of the empirical cycle after theory development and experimental design. In the introduction (van Gog and Paas), the general background and the relation between the chapter sections are briefly described. The section by Savenye, Robinson, Niemczyk, and Atkinson focuses on methods of data collection and analysis for assessment of individual learning processes, whereas the section by Johnson and O’Connor is concerned with methods for assessment of group learning processes. The chapter section by van Gog, Rikers, and Ayres discusses the assessment of complex performance, and the final chapter section by Duley, Ward, Szalma, and Hancock is concerned with setting up laboratories to measure learning and complex performance.

    Original language: English (US)
    Title of host publication: Handbook of Research on Educational Communications and Technology, Third Edition
    Publisher: Taylor and Francis
    Number of pages: 44
    ISBN (Electronic): 9781135596910
    ISBN (Print): 9780203880869
    State: Published - Jan 1 2008


    • Assessment criteria: Describe the aspects of performance that will be assessed.
    • Assessment of learning: Measuring learning achievement, performance, outcomes, and processes by many means.
    • Assessment standards: Describe the quality of performance on each of the criteria that can be expected of participants at different stages (e.g., age, grade) based on a participant’s past performance (self-referenced), peer group performance (norm-referenced), or an objective standard (criterion-referenced).
    • Collective data collection: Obtaining data from individual group members; data are later aggregated or manipulated into a representation of the group as a whole.
    • Complex performance: Refers to real-world activities that require the integration of disparate measurement instrumentation and time-critical experimental control.
    • Direct process measure: Continuous elicitation of data from beginning to end of the (group) process; direct process measures involve videotaping, audiotaping, direct researcher observation, or a combination of these methods.
    • Group learning process: Actions and interactions performed by group members during the group learning task.
    • Group: Two or more individuals working together to achieve a common goal.
    • Holistic data collection: Obtaining data from the group as a whole; as this type of data collection results in a representation of the group rather than of individual group members, it is not necessary to aggregate or manipulate the data.
    • Indirect process measure: Discrete measure at a specific point in time during the (group) process; often involves multiple points of data collection; indirect process measures may measure processes, outcomes, products, or other factors related to group process.
    • Instrumentation: Hardware devices used to assist with the process of data acquisition and measurement.
    • Mixed-methods research: Studies that rely on quantitative and qualitative as well as other methods for formulating research questions, collecting and analyzing data, and interpreting findings.
    • Online/offline measures: Online measures are recorded during task performance; offline measures are recorded after task performance.
    • Process-tracing techniques: Record performance process data, such as verbal reports, eye movements, and actions, that can be used to make inferences about the cognitive processes or knowledge underlying task performance.
    • Qualitative research: Sometimes called naturalistic; research on human systems whose hallmarks include researcher as instrument, natural settings, and little manipulation.
    • Quantitative research: Often conceived of as more traditional or positivistic; typified by experimental or correlational studies. Data and findings are usually represented through numbers and results of statistical tests.
    • Task complexity: Can be defined subjectively (individual characteristics, such as expertise or perception), objectively (task characteristics, such as multiple solution paths or goals), or as an interaction (individual and task characteristics).
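The three referencing schemes named under "Assessment standards" above can be made concrete with a small sketch. The function names, score scales, and cutoff are hypothetical illustrations, not part of the chapter; the point is only how the same raw score is interpreted against three different baselines.

```python
# Hypothetical sketch of the three referencing schemes for assessment
# standards: self-, norm-, and criterion-referenced interpretation of
# one and the same test score.

def self_referenced(score: float, past_scores: list[float]) -> float:
    """Gain relative to the participant's own past mean performance."""
    return score - sum(past_scores) / len(past_scores)

def norm_referenced(score: float, peer_scores: list[float]) -> float:
    """Standing relative to the peer group, expressed as a z-score."""
    mean = sum(peer_scores) / len(peer_scores)
    variance = sum((s - mean) ** 2 for s in peer_scores) / len(peer_scores)
    return (score - mean) / variance ** 0.5

def criterion_referenced(score: float, cutoff: float) -> bool:
    """Pass/fail judgment against a fixed objective standard."""
    return score >= cutoff
```

For example, a score of 80 counts as a gain of 7.5 points for a learner whose past scores averaged 72.5 (self-referenced), while the same score might sit exactly at the peer mean (norm-referenced, z = 0) and still clear a fixed pass mark of 75 (criterion-referenced).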

    ASJC Scopus subject areas

    • Social Sciences (all)

