Code Defense Lab
AI use is allowed. Understanding is required.

AI-Era Assessment

Assess understanding, not just code output.

Code Defense Lab helps instructors evaluate whether learners can explain, trace, adapt, and repair AI-assisted code. Professors launch course-based assessments. Learners enter a specific homework and defend the logic step by step.

Professor

Register courses and launch assignments

Build a course space, attach homework, enable the hotspot, trace, mutation, and repair checks, then review consistency student by student.

Enter professor workspace

Learner

Open a course and defend a homework

See enrolled courses, choose the exact assignment, paste your code, and continue into explanation, trace, adaptation, and repair.

Enter student workspace
1. choose_role()
2. open_course()
3. select_homework()
4. submit_code()
5. explain_predict_adapt_repair()
6. review_consistency()
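The six steps above form a fixed order. A minimal sketch of that ordering in Python (the `Session` class and its `advance` method are illustrative assumptions, not the product's actual API):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the six-step defense flow; step names come from
# the list above, everything else is illustrative.
STEPS = [
    "choose_role",
    "open_course",
    "select_homework",
    "submit_code",
    "explain_predict_adapt_repair",
    "review_consistency",
]

@dataclass
class Session:
    completed: list = field(default_factory=list)

    def advance(self, step: str) -> None:
        # Enforce the fixed order: each step unlocks only after the
        # previous one has been completed.
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.completed.append(step)

session = Session()
for step in STEPS:
    session.advance(step)
print(session.completed == STEPS)  # True when all six steps run in order
```

Skipping ahead (say, submitting code before selecting a homework) would raise a `ValueError`, mirroring how the workspace only reveals the next step once the current one is done.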

Professor Journey

Course creation before evaluation

Professors now begin from their own dashboard: they register a course, then attach assignments with the desired understanding checks.

Learner Journey

Course first, homework second, code third

Learners no longer land directly inside a task. They first see enrolled courses and move into a specific homework intentionally.

Assessment Logic

One pipeline, two role-specific entries

The system is now easier to follow because instructor setup and learner execution each start from a role-specific entry point, then converge on the same assessment pipeline.
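The two-entries, one-pipeline idea can be sketched as follows (a minimal illustration; the function names, parameters, and check list are assumptions drawn from the copy above, not the real implementation):

```python
# Illustrative sketch: two role-specific entry points converging on one
# shared assessment pipeline. All names here are hypothetical.

def shared_pipeline(course: str, homework: str, code: str) -> dict:
    # Both roles ultimately drive the same understanding checks.
    return {
        "course": course,
        "homework": homework,
        "code": code,
        "checks": ["hotspot", "trace", "mutation", "repair"],
    }

def professor_entry(course: str, homework: str) -> dict:
    # Instructors configure the assessment before any learner submits.
    return shared_pipeline(course, homework, code="")

def learner_entry(course: str, homework: str, pasted_code: str) -> dict:
    # Learners open an existing assignment and defend their own code.
    return shared_pipeline(course, homework, pasted_code)

print(learner_entry("CS101", "HW2", "x = 1")["checks"])
# → ['hotspot', 'trace', 'mutation', 'repair']
```

Both entry points return the same check set, which is the point: the role determines where you start, not what the assessment measures.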