Understanding the Inter-Rater Consistency Report
Helping you interpret and use Inter-Rater Consistency Report insights effectively
The Inter-Rater Consistency Report is a powerful tool designed to help administrative-level users analyze the consistency of subjective ratings across multiple evaluators. This report provides a visual overview of how well raters agree when scoring scenarios, evaluations, or courses.
Things to Know Before You Begin
- This report is only available to administrative-level users.
- It is designed to assess agreement among multiple raters who provide subjective evaluations.
- You can export aggregated data across:
- Organizations
- Courses
- Scenarios
- Evaluations
For more detailed guidance, refer to the full Inter-Rater Consistency Report documentation.
What the Report Shows
- Visual overview of rating consistency
- Helps identify variability or bias in scoring
- Supports quality assurance and training improvements for evaluators
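To make "rating consistency" concrete: agreement among raters is commonly quantified with chance-corrected statistics such as Cohen's kappa. The sketch below is a minimal, hypothetical illustration of that idea for two evaluators scoring the same items; SimCapture does not document which agreement statistic the report uses internally, so treat this only as background on the concept.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items the two raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two evaluators scoring six checklist items.
a = ["pass", "pass", "fail", "pass", "fail", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass"]
print(round(cohens_kappa(a, b), 2))  # prints 0.67
```

Values near 1.0 indicate strong agreement; values near 0 mean agreement is no better than chance, which is the kind of variability or bias the report is meant to surface.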
Exporting Data
You can export the report data for further analysis or record-keeping. Export options typically include:
- CSV or Excel formats
- Filters by organization, course, scenario, or evaluation
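If you export the report as CSV for further analysis, a few lines of scripting can reproduce the same filters offline. The snippet below is a sketch only: the column names (`course`, `scenario`, `rater`, `score`) are assumptions for illustration and may differ from the actual SimCapture export layout.

```python
import csv
import io

# Hypothetical export snippet; real SimCapture column names may differ.
export = """course,scenario,rater,score
Cardiology 101,Chest Pain,Dr. Lee,4
Cardiology 101,Chest Pain,Dr. Shah,2
Trauma Basics,Fall Injury,Dr. Lee,5
"""

rows = list(csv.DictReader(io.StringIO(export)))
# Filter to a single course, mirroring the report's course filter.
cardio = [r for r in rows if r["course"] == "Cardiology 101"]
scores = [int(r["score"]) for r in cardio]
# A wide score spread between raters on the same scenario can flag
# inconsistency worth reviewing in rater training.
print(max(scores) - min(scores))  # prints 2
```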