Assessment: Every Step Informs the Next
Assessment
Category: Assessment & Evaluation
Use: Measure, guide, and support learner progress
Best for: Checking readiness, tracking progress, and evaluating mastery
Related Concepts: Formative, Summative, Diagnostic, Rubrics, Validity
When we hear the word “assessment,” we often picture final scores, high-stakes exams, or pass/fail checkboxes. However, in learning design, assessment is not simply an endpoint. It is a sequence of intentional steps that guide, inform, and support the learner’s growth.
Like moving from one stepping stone to the next, every assessment — whether diagnostic, formative, or summative — serves a distinct purpose in helping learners make progress. When implemented effectively, assessment becomes less about judgment and more about creating momentum.
In this post, we’ll explore:
- How to choose and apply the three essential types of assessment: diagnostic, formative, and summative
- Common assessment mistakes and how to avoid them
- Best practices for accessibility, feedback, and follow-up
Three Essential Types of Assessment
Before selecting assessment tools or formats, it helps to understand the three core types of assessment — diagnostic, formative, and summative. Each one plays a specific role in the learning process:
- Diagnostic — Where is the learner starting?
- Formative — How are they progressing?
- Summative — Did they reach the goal?
The cards below will help you identify which type of assessment fits your learning moment — and how to use it meaningfully.
Diagnostic Assessment
When: Before learning
Purpose: Identify prior knowledge, readiness, and gaps
Examples: Pre-tests, skills inventories, self-checklists
Real-World Scenarios:
- A manager completes a skills inventory before being assigned a coaching module.
- Learners take a self-assessment to uncover knowledge gaps before starting a technical course.
Common Mistake: Skipping pre-assessment and designing for the “average” learner instead of meeting learners where they are.
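If your pre-assessment tool lets you act on scores, even by exporting them, a simple routing rule can help you meet learners where they are instead of building one path for the "average" learner. The sketch below is a minimal illustration in Python; the module names and cut scores are hypothetical placeholders, not recommendations.

```python
# Hypothetical pre-test scores (0-100) and cut scores for routing.
# Module names and thresholds are placeholders, not a prescription.
def recommended_start(pretest_score: int) -> str:
    if pretest_score < 50:
        return "Foundations module"   # significant gaps: start at the beginning
    if pretest_score < 80:
        return "Core module"          # some gaps: skip the basics
    return "Advanced scenarios"       # strong prior knowledge: go straight to practice

for learner, score in {"Asha": 42, "Ben": 68, "Chao": 91}.items():
    print(f"{learner}: {score} -> {recommended_start(score)}")
```

The point is not the code itself but the design decision behind it: diagnostic results should change what the learner sees next.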
Formative Assessment
When: During learning
Purpose: Monitor progress and provide feedback
Examples: Knowledge checks, polls, discussions, peer feedback
Real-World Scenarios:
- Participants respond to a poll during a live session to gauge understanding before moving on.
- A learner receives peer feedback on a draft proposal during a collaborative activity.
Common Mistake: Making feedback too vague or infrequent, which weakens learner growth and confidence.
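One way to avoid vague feedback is to write a specific, growth-focused note for every distractor before you build the knowledge check, so no learner ever sees a bare "Incorrect." The sketch below shows the idea with a hypothetical coaching question; the content is invented for illustration.

```python
# Hypothetical knowledge check: each wrong option carries specific feedback
# instead of a bare "Incorrect."
question = "A learner misses a deadline for the second time. What should the coach do first?"
options = {
    "A": ("Issue a formal warning",
          "Too soon: jumping to consequences skips the conversation that uncovers the cause."),
    "B": ("Ask what got in the way and listen", None),  # correct answer
    "C": ("Extend the deadline again",
          "This removes the symptom but not the cause, so the pattern will likely repeat."),
}

def feedback(choice: str) -> str:
    label, note = options[choice]
    if note is None:
        return f"Correct: '{label}' opens the conversation before any corrective action."
    return f"Not quite: '{label}'. {note}"

print(feedback("C"))
```

Drafting the feedback alongside the question also doubles as a quality check: if you cannot explain why a distractor is wrong, it may not belong in the item.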
Summative Assessment
When: After learning
Purpose: Measure mastery and final outcomes
Examples: Final quizzes, scenario tasks, certification tests
Real-World Scenarios:
- A customer service rep completes a simulation-based quiz to earn a course certificate.
- Participants present a final proposal as a capstone in a leadership development program.
Common Mistake: Designing a final assessment that does not align with the learning objectives or real-world application.
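A quick way to catch the alignment mistake above is a coverage check: list your learning objectives, note which objective each final assessment item measures, and look for objectives with no items. The sketch below uses hypothetical objectives from a customer service course to show the idea.

```python
# Hypothetical learning objectives and the objective each final item maps to.
objectives = {
    "O1: De-escalate an upset customer",
    "O2: Document the interaction in the CRM",
    "O3: Escalate according to policy",
}

item_map = {
    "Scenario task 1": "O1: De-escalate an upset customer",
    "Scenario task 2": "O1: De-escalate an upset customer",
    "Quiz section B":  "O3: Escalate according to policy",
}

covered = set(item_map.values())
print("Objectives with no assessment item:", objectives - covered)  # O2 is untested
print("Items mapped to unknown objectives:", covered - objectives)  # none here
```

Any objective that comes back uncovered either needs an assessment item or may not belong in the course at all.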
Quality Check: Validity and Reliability
Before you finalize an assessment, take a moment to step back and ask: Is it doing what it’s supposed to do, and can others rely on the results? These two questions point to the core of assessment quality: validity and reliability.
The cards below can help you keep both in mind as you design with intention.
🎯 Validity
Does the assessment measure what it is intended to measure?
Example: Are you testing analysis, or just memorization?
📏 Reliability
Would different instructors or reviewers score the learner the same way?
This matters most for subjective assessments like presentations or portfolios.
You do not need to be perfect — but you should be intentional.
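If you want a rough read on reliability for a subjectively scored assessment, have two reviewers score the same set of submissions with your rubric and compare the ratings. The sketch below uses hypothetical scores and simple percent agreement; more formal measures such as Cohen's kappa also account for agreement expected by chance.

```python
# Two reviewers score the same five portfolio submissions on a 1-4 rubric.
# Hypothetical data for illustration only.
reviewer_a = [4, 3, 2, 4, 3]
reviewer_b = [4, 2, 2, 4, 3]

# Exact-match percent agreement: how often did the reviewers give the same score?
matches = sum(1 for a, b in zip(reviewer_a, reviewer_b) if a == b)
agreement = matches / len(reviewer_a)

print(f"Percent agreement: {agreement:.0%}")  # 80% in this example

# Low agreement usually means the rubric needs clearer criteria,
# or the reviewers need a calibration session before scoring counts.
```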
Accessibility and Inclusion in Assessments
Great assessments are designed for all learners.
- Use clear, plain language
- Ensure screen reader compatibility and keyboard navigation
- Avoid bias in scenarios (for example, by not assuming cultural norms or access to specific technologies)
Inclusivity begins with awareness and improves with practice.
What Happens After the Assessment?
This is where learning design becomes dynamic. Once the assessment is complete, the real value comes from how the results are used. Thoughtful follow-up turns a one-time check into an opportunity for deeper growth.
- Analyze the results: Look for patterns across learners, not just scores (see the sketch after this list).
- Give feedback: Make it timely, clear, and focused on growth.
- Adjust instruction: Address learning gaps or revise future content.
- Support next steps: Recommend resources, coaching, or follow-up activities.
- Track long-term impact: Evaluate whether learning outcomes are sustained over time.
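If your assessment results export as a simple table of learners by questions, even a short script can surface the patterns mentioned in the first step above. The sketch below uses hypothetical item-level results and flags questions that most learners missed.

```python
# Hypothetical item-level results: 1 = correct, 0 = incorrect.
# Each list holds one answer per learner for that question.
results = {
    "Q1": [1, 1, 1, 0, 1],
    "Q2": [0, 1, 0, 0, 1],
    "Q3": [1, 1, 1, 1, 1],
    "Q4": [0, 0, 1, 0, 0],
}

# Percent of learners who answered each question correctly.
for question, answers in results.items():
    correct_rate = sum(answers) / len(answers)
    flag = "  <-- review this item or reteach the concept" if correct_rate < 0.5 else ""
    print(f"{question}: {correct_rate:.0%} correct{flag}")
```

A low-scoring item can mean the concept was not taught well, the question was ambiguous, or both; the pattern tells you where to look, not what to conclude.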
Final Thoughts
Assessment is not about passing or failing. It is about making learning visible, useful, and transformational for learners and for you as the designer.
When you are planning a course, pause and ask:
- What do I need to know about the learner?
- How can we use assessment as a bridge rather than a barrier?