license: mit
language:
  - en
tags:
  - education
  - k-12
  - science
  - stem
  - ngss
  - assessment
  - curriculum
  - learning
  - standards
  - educational-ai
  - three-dimensional-learning
  - bloom-taxonomy
  - depth-of-knowledge
  - scientific-practices
  - crosscutting-concepts
pretty_name: K-12 Science Standards Aligned Learning Framework
size_categories:
  - 1K<n<10K
task_categories:
  - text-classification
  - text-generation
  - question-answering
task_ids:
  - text2text-generation
  - multi-class-classification
  - open-domain-qa
dataset_info:
  features:
    - name: instruction
      dtype: string
    - name: input
      dtype: string
    - name: output
      dtype: string
    - name: task
      dtype: string
    - name: metadata_standard_code
      dtype: string
    - name: metadata_grade_level
      dtype: string
    - name: metadata_domain
      dtype: string
    - name: metadata_core_idea
      dtype: string
    - name: metadata_core_idea_title
      dtype: string
    - name: metadata_ngss_practice
      dtype: string
    - name: metadata_crosscutting_concept
      dtype: string
    - name: metadata_dok_level
      dtype: string
    - name: metadata_bloom_level
      dtype: string
    - name: metadata_complexity_level
      dtype: string
    - name: metadata_three_dimensional
      dtype: string
    - name: metadata_ngss_aligned
      dtype: string
    - name: metadata_assessment_type
      dtype: string
    - name: metadata_estimated_time
      dtype: string
  splits:
    - name: train
      num_bytes: 3778013
      num_examples: 4750
    - name: validation
      num_bytes: 808193
      num_examples: 1018
    - name: test
      num_bytes: 810805
      num_examples: 1019
  download_size: 202337
  dataset_size: 5397011
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
      - split: validation
        path: data/validation-*
      - split: test
        path: data/test-*

K-12 Science Standards Aligned Learning Framework Dataset

A comprehensive dataset of K-12 science educational content aligned with the Next Generation Science Standards (NGSS), designed for training and evaluating educational AI systems.

Dataset Overview

This dataset contains 6,787 examples of educational content spanning all K-12 grade levels and science domains. Each example includes instructional content, student inputs, expected outputs, and rich metadata aligned with educational standards.

Quick Stats

  • Total Examples: 6,787 (Train: 4,750 | Validation: 1,018 | Test: 1,019)
  • Grade Coverage: Kindergarten through Grade 12
  • Science Domains: Life Sciences, Physical Sciences, Earth & Space Sciences, Engineering Design
  • Format: Parquet files for efficient loading and processing

Dataset Structure

Core Fields

  • instruction: Learning objective or task description
  • input: Student prompt or context information
  • output: Expected response or assessment criteria
  • task: Type of scientific thinking skill required

Educational Metadata

  • metadata_standard_code: NGSS standard identifier (e.g., "MS-LS3-5"); see the parsing sketch after this list
  • metadata_grade_level: Grade level (K, 1-12)
  • metadata_domain: Science domain
  • metadata_core_idea: NGSS disciplinary core idea
  • metadata_ngss_practice: Science and engineering practice
  • metadata_crosscutting_concept: NGSS crosscutting concept
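
Standard codes follow the NGSS performance-expectation pattern of grade band, disciplinary core idea, and expectation number (e.g., "MS-LS3-5" above). The sketch below shows one way to split a code into those parts; the regular expression and the parse_standard_code helper are illustrative assumptions, not part of the dataset.

import re

# Assumed pattern: <grade band>-<domain code><core idea number>-<expectation number>
# Grade band: K, a single grade number, MS, or HS; domain code: PS, LS, ESS, or ETS.
NGSS_CODE = re.compile(r'^(K|\d{1,2}|MS|HS)-(PS|LS|ESS|ETS)(\d+)-(\d+)$')

def parse_standard_code(code):
    match = NGSS_CODE.match(code)
    if match is None:
        return None
    grade_band, domain, core_idea, expectation = match.groups()
    return {
        'grade_band': grade_band,
        'domain_code': domain,
        'core_idea': f'{domain}{core_idea}',
        'expectation': expectation,
    }

print(parse_standard_code('MS-LS3-5'))
# {'grade_band': 'MS', 'domain_code': 'LS', 'core_idea': 'LS3', 'expectation': '5'}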

Assessment Metadata

  • metadata_dok_level: Depth of Knowledge level (1-4)
  • metadata_bloom_level: Bloom's taxonomy level
  • metadata_complexity_level: Learning complexity assessment
  • metadata_assessment_type: Type of assessment activity
  • metadata_estimated_time: Estimated completion time (minutes)
  • metadata_three_dimensional: Three-dimensional learning indicator
  • metadata_ngss_aligned: NGSS alignment verification
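
These fields can be inspected directly on a loaded split. A minimal sketch with pandas (the parquet path follows the file layout shown under Usage; nothing about the stored values is assumed):

import pandas as pd

# Load the training split and look at one record's educational metadata
train_df = pd.read_parquet('data/train-00000-of-00001.parquet')

metadata_columns = [c for c in train_df.columns if c.startswith('metadata_')]
example = train_df.iloc[0]

print(example['instruction'])
print(example[metadata_columns])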

Content Categories

Grade Levels

Coverage spans Kindergarten through Grade 12, with standards-aligned examples at every grade level.

Science Domains

  • Life Sciences: Biology, ecology, heredity, evolution
  • Physical Sciences: Chemistry, physics, energy, matter
  • Earth and Space Sciences: Geology, astronomy, climate, natural resources
  • Engineering Design: Design thinking, problem-solving, technological solutions

Task Types

  • Data Analysis: Interpreting scientific data and evidence
  • Evidence Evaluation: Assessing the validity of scientific claims
  • Experimental Design: Planning and designing investigations
  • Scientific Inquiry: Asking questions and forming hypotheses
  • Scientific Explanation: Constructing evidence-based explanations
  • Model Construction: Building and using scientific models
  • Engineering Design: Solving problems through design processes
  • Hypothesis Formation: Developing testable predictions

Assessment Types

  • Argument Construction
  • Model Building
  • Engineering Design Challenges
  • Computational Modeling
  • Lab Investigations
  • Data Analysis Tasks
  • Scientific Argumentation
  • Research Projects
  • Observation Tasks
  • Hands-on Investigations
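
The breakdown of these categories in a given split can be checked directly from the metadata columns. A quick sketch with pandas (column names follow the schema above; the exact label strings stored in each column are not assumed here):

import pandas as pd

train_df = pd.read_parquet('data/train-00000-of-00001.parquet')

# Distribution of task types and assessment types in the training split
print(train_df['task'].value_counts())
print(train_df['metadata_assessment_type'].value_counts())

# How assessment types break down by science domain
print(pd.crosstab(train_df['metadata_domain'], train_df['metadata_assessment_type']))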

Usage

Loading the Dataset

Using Pandas

import pandas as pd

# Load individual splits
train_df = pd.read_parquet('data/train-00000-of-00001.parquet')
val_df = pd.read_parquet('data/validation-00000-of-00001.parquet')
test_df = pd.read_parquet('data/test-00000-of-00001.parquet')

# Load all data
all_df = pd.concat([train_df, val_df, test_df], ignore_index=True)

Using HuggingFace Datasets

from datasets import load_dataset

# Load from local directory
dataset = load_dataset('parquet', data_dir='data/')

# Access splits
train_dataset = dataset['train']
validation_dataset = dataset['validation']
test_dataset = dataset['test']

Example Usage Patterns

Filter by Grade Level

# Get middle school examples (grades 6-8)
middle_school = train_df[train_df['metadata_grade_level'].isin(['6', '7', '8'])]

Filter by Domain

# Get Life Sciences examples
life_sciences = train_df[train_df['metadata_domain'] == 'Life Sciences']

Filter by Complexity

# Get proficient-level assessments
proficient = train_df[train_df['metadata_complexity_level'] == 'Proficient']
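
Filter with the Datasets Library

The same filters can be applied to the dataset object loaded under "Using HuggingFace Datasets" above; a minimal sketch using the datasets filter API:

# Equivalent filtering through the datasets API (reuses `dataset` from the loading example above)
middle_school_ds = dataset['train'].filter(
    lambda example: example['metadata_grade_level'] in {'6', '7', '8'}
)
life_sciences_ds = dataset['train'].filter(
    lambda example: example['metadata_domain'] == 'Life Sciences'
)
print(len(middle_school_ds), len(life_sciences_ds))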

Educational Standards Alignment

This dataset is meticulously aligned with:

  • Next Generation Science Standards (NGSS): All content maps to specific NGSS performance expectations
  • Three-Dimensional Learning: Integrates disciplinary core ideas, crosscutting concepts, and science practices
  • Depth of Knowledge (DOK): Content is categorized by cognitive complexity levels
  • Bloom's Taxonomy: Learning objectives are classified by cognitive processes

Applications

This dataset is designed for:

  • Educational AI Training: Developing AI tutors and assessment systems (see the formatting sketch after this list)
  • Curriculum Development: Creating standards-aligned educational content
  • Assessment Research: Studying educational measurement and evaluation
  • Learning Analytics: Analyzing student learning patterns and outcomes
  • Teacher Professional Development: Training educators on standards implementation
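
For the educational AI training use case, the instruction, input, and output fields map naturally onto a supervised fine-tuning format. Below is a minimal sketch of one possible prompt template; the template text and the format_example helper are illustrative assumptions, not a prescribed format, and train_df is the training split loaded as shown under Usage.

def format_example(record):
    # Join instruction, input, and output into a single training prompt (illustrative template)
    return (
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Input:\n{record['input']}\n\n"
        f"### Response:\n{record['output']}"
    )

formatted = [format_example(row) for _, row in train_df.iterrows()]
print(formatted[0])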

Data Quality

  • Standards Verification: All content verified against official NGSS documentation
  • Educational Review: Content reviewed by certified science educators
  • Cognitive Alignment: DOK and Bloom's levels validated by assessment experts
  • Three-Dimensional Integration: Ensures authentic scientific learning experiences

Ethical Considerations

  • Content is designed to be inclusive and culturally responsive
  • Assessment examples avoid bias and promote equity in science education
  • All content supports diverse learners and learning styles
  • Aligned with educational best practices for K-12 science instruction

Citation

If you use this dataset in your research, please cite:

@dataset{k12_science_standards_2024,
  title={K-12 Science Standards Aligned Learning Framework Dataset},
  author={[Author Information]},
  year={2024},
  publisher={[Publisher Information]},
  url={[Repository URL]}
}

License

This dataset is released under the MIT License.

Contributing

We welcome contributions to improve the dataset quality and coverage. Please see our contribution guidelines for more information.


This dataset supports the development of AI systems that can provide high-quality, standards-aligned science education for all K-12 students.