# Documentation

The Resident Experience Score measures how accessible and user-friendly a form is, scored out of 100.
This guide covers:

- **The four scoring dimensions**: how points are allocated
- **Score bands**: what different scores mean
- **Issue severity levels**: high, medium, and low
- **Confidence levels**: automated vs heuristic checks
## The four scoring dimensions

Your score is calculated across four dimensions, each measuring a different aspect of the form experience.
- **Accessibility.** Core accessibility requirements: proper labels, keyboard navigation, screen reader compatibility, and semantic HTML structure.
- **Clarity.** How clearly the form communicates what's required: visible labels, helpful descriptions, and transparent data handling.
- **Ease of completion.** How easy it is to fill out the form: appropriate input types, autofill support, and reasonable field counts.
- **Error handling.** How the form handles mistakes: clear error messages, inline validation, and helpful recovery paths.
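As a sketch, the four dimension scores could roll up into the overall score out of 100. The equal 25-point allocation and the dimension keys below are assumptions for illustration, not the documented weighting:

```python
# Hypothetical equal weighting: each dimension contributes up to 25 points.
# The real allocation may differ; this only shows the roll-up shape.
DIMENSIONS = ("accessibility", "clarity", "ease", "error_handling")

def overall_score(dimension_scores: dict[str, float]) -> float:
    """Sum four per-dimension scores (each 0-25) into a 0-100 total."""
    return sum(dimension_scores[d] for d in DIMENSIONS)

print(overall_score(
    {"accessibility": 22, "clarity": 20, "ease": 18, "error_handling": 15}
))  # 75
```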
## Score bands

| Score | Band | Meaning |
|---|---|---|
| 90–100 | Excellent | Few or no issues detected |
| 75–89 | Good | Minor issues to address |
| 60–74 | Needs work | Notable accessibility gaps |
| Below 60 | High friction | Significant barriers for users |
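The band boundaries above translate directly into a lookup, for example:

```python
def score_band(score: int) -> str:
    """Map a 0-100 score to its band, per the score-band table."""
    if score >= 90:
        return "Excellent"
    if score >= 75:
        return "Good"
    if score >= 60:
        return "Needs work"
    return "High friction"

print(score_band(82))  # Good
```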
## Issue severity levels

Each issue is categorised by how much it impacts users:

- **High.** Blocks or significantly hinders users from completing the form. Examples: missing labels, no submit button.
- **Medium.** Creates friction but doesn't fully block completion. Examples: unclear required fields, poor keyboard order.
- **Low.** Minor improvement opportunities. Examples: placeholder text, missing autocomplete hints.
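A consumer of the report might tag each issue with a severity and surface the highest-impact ones first. The issue messages and the ordered-enum representation below are illustrative, not the tool's actual data model:

```python
from enum import IntEnum

class Severity(IntEnum):
    """Ordered so that higher values mean greater user impact."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Hypothetical issues, one per severity level.
issues = [
    {"message": "Placeholder used as label", "severity": Severity.LOW},
    {"message": "Missing label on email field", "severity": Severity.HIGH},
    {"message": "Required fields not indicated", "severity": Severity.MEDIUM},
]

# Surface the most impactful issues first.
issues.sort(key=lambda i: i["severity"], reverse=True)
print([i["message"] for i in issues])
```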
## Confidence levels

- **Automated checks.** Definitive checks with reliable detection. These have high confidence and rarely produce false positives.
- **Heuristic checks.** Pattern-based checks that may have edge cases. Use your judgement when these flag potential issues.

The overall score confidence reflects the mix of check types used. A score based primarily on automated checks will have higher confidence than one relying on heuristics.
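As an illustrative sketch, overall confidence could be derived from the share of checks that were automated. The thresholds and labels below are assumptions; the documentation states only the principle, not a formula:

```python
def score_confidence(automated: int, heuristic: int) -> str:
    """Rate confidence by the fraction of automated checks.

    The 0.8 and 0.5 cut-offs are hypothetical, chosen only to
    illustrate "more automated checks -> higher confidence".
    """
    total = automated + heuristic
    ratio = automated / total if total else 0.0
    if ratio >= 0.8:
        return "high"
    if ratio >= 0.5:
        return "medium"
    return "low"

print(score_confidence(automated=9, heuristic=1))  # high
```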