Grading Exams
Alejandro Gonzalez Recuenco
2024-01-23
This vignette describes how to grade an exam using the script found in the exec/ folder. You can find where this script is located by running the following in an R console:
```r
system.file("exec", package = "TexExamRandomizer")
## [1] "/home/runner/work/_temp/Library/TexExamRandomizer/exec"
```
Format of the student’s answers
When you collect the responses from the students, you should follow these conventions:
- The actual answers to each question should be integer numbers (1 being the first choice), written in the order in which the questions are presented in that student’s exam. The columns should be called “Question 1”, “Question 2”, etc. (or “Q 1”, “Q2”, etc.).
- If a certain question is left unanswered, give it the value 0 in the table.
- If you want to give extra points, add a column called “Extra Points” with the added points.
- You must include a column called “Version”, containing the version number of each exam.
- All other columns will be carried over to the output “_Graded.csv” table.
Class | Exam.Version | Nickname | Roll.Number | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | Q11 | Q12 | Q13 | Q14 | Q15 | ExtraPoints |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
M601 | 21 | Titar | 1 | 3 | 4 | 4 | 1 | 1 | 1 | 1 | 3 | 4 | 1 | 3 | 4 | 1 | 2 | 4 | 9 |
M601 | 1 | Suwannapoe | 3 | 2 | 3 | 1 | 2 | 3 | 3 | 2 | 4 | 4 | 2 | 3 | 3 | 2 | 2 | 3 | 8 |
M601 | 5 | Kan | 3 | 3 | 3 | 4 | 2 | 1 | 3 | 4 | 2 | 1 | 1 | 3 | 4 | 2 | 3 | 3 | 8 |
M601 | 16 | Mei | 16 | 1 | 1 | 2 | 4 | 1 | 2 | 3 | 2 | 1 | 3 | 1 | 3 | 2 | 3 | 1 | 5 |
M601 | 17 | Offside | 6 | 2 | 4 | 4 | 1 | 3 | 2 | 1 | 2 | 3 | 3 | 2 | 1 | 1 | 2 | 3 | 1 |
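As a quick sanity check before grading, you can verify that a responses table follows these conventions. This is only a sketch, not part of the package; the file name is hypothetical and the checks simply mirror the rules listed above.

```r
# Minimal sanity check for a responses table (file name is hypothetical)
responses <- read.csv("responses_M601.csv", check.names = FALSE)

# A version column is required
stopifnot(any(grepl("Version", names(responses))))

# Question columns ("Question 1", "Q 1", "Q1", ...) must exist, and
# unanswered questions must be coded as 0, not left blank
question_cols <- grep("^Q", names(responses), value = TRUE)
stopifnot(length(question_cols) > 0)
stopifnot(!anyNA(responses[question_cols]))
```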
Format of the answer sheet
When you create the exams with the examrandomizer script, it generates a fullanswersheet.csv file. When grading, the script assumes that the wrong-answer and correct-answer tags are, respectively, “choice” and “CorrectChoice”.
If that is not the case in your document structure, simply rename those columns to “choice” and “CorrectChoice” (see the sketch after the table below).
Version | index | questions_original | question_original | questions | question | choices_original | X.choice.CorrectChoice._original | choices | choice | CorrectChoice |
---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | NA |
0 | 2 | 1 | 1 | 1 | 1 | 1 | 2 | 1 | 2 | NA |
0 | 3 | 1 | 1 | 1 | 1 | 1 | 3 | 1 | 3 | NA |
0 | 4 | 1 | 1 | 1 | 1 | 1 | 4 | 1 | NA | 4 |
0 | 5 | 1 | 2 | 1 | 2 | 1 | 1 | 1 | 1 | NA |
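For instance, renaming could look like the following sketch, where “wrongtag” and “righttag” are hypothetical stand-ins for whatever tags your exam actually used.

```r
# Rename custom tag columns to the names the grader expects
sheet <- read.csv("fullanswersheet.csv")
names(sheet)[names(sheet) == "wrongtag"] <- "choice"
names(sheet)[names(sheet) == "righttag"] <- "CorrectChoice"
write.csv(sheet, "fullanswersheet.csv", row.names = FALSE)
```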
(But if you really want to change that assumption, you can open the script and edit ASHEET_COLCORRECT and ASHEET_COLINCORRECT.)
What if I wrote some questions wrong and I noticed too late? I already printed the exams
If you realize that a question is incorrectly written, but it is too late to rewrite the exam, find the original answer sheet in fullanswersheet.csv (the rows with version number 0). Then remove the lines from the answer sheet that refer to the question you want to ignore (keep a backup of the answer sheet, just in case).
When the students take the exam, tell them to write any answer for that question; it will be ignored by the program.
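A minimal sketch of that removal, assuming the broken question is, say, number 2 in the original numbering (the question_original column of the answer sheet):

```r
# Keep a backup before editing the answer sheet
file.copy("fullanswersheet.csv", "fullanswersheet_backup.csv")

sheet <- read.csv("fullanswersheet.csv")
# Drop every row that refers to question 2 of the original (version 0) exam
sheet <- sheet[sheet$question_original != 2, ]
write.csv(sheet, "fullanswersheet.csv", row.names = FALSE)
```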
Grading the exam
You can directly specify the students’ responses (--resp) and the answer sheet (--answer):
```
gradeexamrandomizer --resp <student responses csv> --answer <fullanswersheet csv>
```
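For instance, with the fullanswersheet.csv generated earlier and a hypothetical responses file named responses_M601.csv:

```
gradeexamrandomizer --resp responses_M601.csv --answer fullanswersheet.csv
```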
If you have both of those files in the same directory and their names match the patterns “responses*.csv” and “answer*.csv”, you can use the shorthand version:
```
gradeexamrandomizer --dir <dirname>
```
The output will be two csv files, placed in the same directory as the students’ responses file, called *_Graded.csv and *_Stats.csv.
Class | Exam.Version | Nickname | Roll.Number | addedPoints | addedAllPoints | maxGrade | Grade | Grade_Total_Exam |
---|---|---|---|---|---|---|---|---|
M601 | 21 | Titar | 1 | 0 | 0 | 14 | 14 | 100.00000 |
M601 | 1 | Suwannapoe | 3 | 0 | 0 | 14 | 10 | 71.42857 |
M601 | 5 | Kan | 3 | 0 | 0 | 15 | 5 | 33.33333 |
M601 | 16 | Mei | 16 | 0 | 0 | 15 | 8 | 53.33333 |
M601 | 17 | Offside | 6 | 0 | 0 | 14 | 6 | 42.85714 |
Version | index | questions_original | question_original | questions | question | choices_original | X.choice.CorrectChoice._original | choices | choice | CorrectChoice | ExamAnswerCount |
---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | NA | 1 |
0 | 2 | 1 | 1 | 1 | 1 | 1 | 2 | 1 | 2 | NA | 1 |
0 | 3 | 1 | 1 | 1 | 1 | 1 | 3 | 1 | 3 | NA | 5 |
0 | 4 | 1 | 1 | 1 | 1 | 1 | 4 | 1 | NA | 4 | 18 |
0 | 5 | 1 | 2 | 1 | 2 | 1 | 1 | 1 | 1 | NA | 1 |
How is the grade calculated?
In the output *_Graded.csv table, as you can see in the example above, five columns are added to the output:
- addedPoints: Points added to each individual student, supplied through the “ExtraPoints” column of the responses table.
- addedAllPoints: Points added to all students. These points are added as if the exam had addedAllPoints extra questions and every student answered them correctly. (With the current script this will always be zero.)
- maxGrade: The maximum number of answers in that exam. If you removed a question from the answer sheet but that question is not present in all versions, some versions will have a greater maxGrade than others.
- Grade: The number of correct answers on the student’s exam. (If a question has more than one correct answer, the script won’t be able to detect all of them, only whether the student wrote one of the correct answers.)
- Grade_Total_Exam: The total grade, after scaling. It assumes the maximum grade of the exam is 100%, which for generality we denote MAX. (To change it, use the option --max <integer>.)
\[Grade_{Total Exam} = \frac{Grade + addedAllPoints}{maxGrade + addedAllPoints} \cdot MAX + addedPoints\]