
Implementation Evaluation of the Strengthening Community Colleges Training Grant Program (SCC) Cohort 2 and Cohort 3




APPENDIX A

Instructions for Document Analysis Rubric and Coding Scheme


OMB Control Number: 1290-0043

OMB Expiration Date: 10/31/2025



Familiarize Yourself with the Document Analysis Rubric

Before starting the document analysis, each research team member should thoroughly read and understand the Document Analysis Rubric. The team should collectively discuss and clarify any questions or uncertainties about the Rubric to ensure consistency in its application. Analysis will begin only after an interrater reliability exercise in which all evaluators review the rubric parameters.
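The Rubric does not prescribe a particular agreement statistic for this exercise. As one illustration, agreement between two raters could be summarized with Cohen's kappa; the minimal Python sketch below assumes a simple list of scores per rater, and the scores shown are invented for illustration only. With more than two raters, an extension such as Fleiss' kappa could be used instead.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters scoring the same documents."""
        n = len(rater_a)
        # Observed agreement: share of documents scored identically.
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement, from each rater's marginal score distribution.
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical scores on one rubric category across six documents.
    print(round(cohens_kappa([3, 2, 4, 3, 1, 2], [3, 2, 4, 2, 1, 2]), 2))  # 0.77

A kappa in the vicinity of 0.7 or above is often treated as acceptable agreement, although the team should agree on its own threshold before the exercise.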


Understand the Document

Read each document thoroughly to understand its content and context. Note the document's type, purpose, author, and intended audience, as these can influence its content and presentation. Documents will include grant proposals, quarterly reports, interim and final reports, marketing materials, curriculum and program documents, and related items.


Apply the Rubric

Apply the Rubric to each document individually. For each of the five categories (Alignment with SCC Goals, Program Design and Implementation, Program Outcomes, Challenges and Solutions, Implications and Recommendations), assess the document and assign a score based on the criteria outlined in the Rubric.


Justify Your Score

Note specific examples or evidence from the document that support your score. This evidence keeps the scoring process transparent and understandable to others. Two raters will evaluate each document. The team will enter all scores from the rubric evaluation into NVivo qualitative analysis software.


Check for Consistency

At several points throughout the evaluation period, it may be helpful for multiple team members to score a few documents independently and then compare their scores to check for consistency in applying the Rubric. Discuss any discrepancies and refine your approach if necessary.
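As a concrete, hypothetical illustration of such a check, the Python sketch below flags document-category pairs where two raters' scores differ by more than one point. The data structure, category keys, and one-point threshold are assumptions for illustration, not rubric requirements.

    CATEGORIES = [
        "Alignment with SCC Goals",
        "Program Design and Implementation",
        "Program Outcomes",
        "Challenges and Solutions",
        "Implications and Recommendations",
    ]

    def flag_discrepancies(scores_a, scores_b, threshold=1):
        """List (document, category, score_a, score_b) tuples where the
        two raters' scores differ by more than `threshold` points."""
        # scores_a and scores_b map document name -> {category: score},
        # e.g. {"Quarterly report 1": {"Program Outcomes": 3, ...}, ...}
        flags = []
        for doc in scores_a:
            for cat in CATEGORIES:
                a, b = scores_a[doc][cat], scores_b[doc][cat]
                if abs(a - b) > threshold:
                    flags.append((doc, cat, a, b))
        return flags

Flagged pairs would then be discussed and, where needed, rescored before the evaluation proceeds.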


Record Scores

Keep a detailed record of the scores assigned to each document in a spreadsheet with one column per rubric category and one row per document. Record any relevant notes or comments alongside the scores.
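For example, the spreadsheet might take the shape of the CSV below. The column names are one possible rendering of the five rubric categories, and the document name, scores, and note are invented placeholders.

    document,alignment_scc_goals,program_design_implementation,program_outcomes,challenges_solutions,implications_recommendations,notes
    "Quarterly report, College A",3,4,2,3,3,"Strong evidence of employer partnerships"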


Interpret the Scores

Once the evaluators complete the scoring, the research team will analyze and interpret the scores. Remember that the scores are not absolute measures but are relative to each document's content and purpose. High scores do not necessarily indicate "good" documents, nor do low scores necessarily indicate "bad" documents.


Apply the Coding Scheme

Following the rubric evaluation, two evaluators will reread each document using the coding scheme developed by the research team. Coding categories align with the rubric criteria, and evaluators will note the points in each document where each category is evident. The team will enter all codes into NVivo for analysis.


Improve Continuously

The team should regularly reflect on the effectiveness of the coding scheme and make any adjustments needed to improve its application and usefulness. As with the Rubric, the team will conduct an initial (pre-coding) interrater reliability exercise among all evaluators.


Respect Confidentiality

Treat all documents as confidential, in keeping with ethical guidelines. Handle any personal or potentially sensitive information appropriately.


Communicate the Results

When presenting or reporting your findings, explain the Rubric, the scoring process, how the researchers interpreted the scores, the coding process, and the analysis of the codes. This explanation ensures your approach is transparent and understood by others.

