|Contact:||UW-Test Scoring, firstname.lastname@example.org, x4864|
|Drop-off Location:||Help Desk - Andersen Library 2000|
|Help Desk Hours:||Monday-Thursday, 8:00 a.m. to 9:00 p.m., Friday 8:00 a.m. to 4:30 p.m.|
For those planning to request teacher evaluation forms, the deadline for making prep requests for all courses (3-week, 8-week or 16-week) is two weeks prior to the final day of class.
Please list any special instructions in the special instructions area on the test request form. If more than one class section is included in an envelope, please specify whether each section should be scored separately or if all sections should be combined.
The scanning of true/false exams, multiple-choice exams or surveys on machine readable answer sheets is a service provided by ICIT. Results are sent to the instructor's campus email account. Note: ICIT will not print results, as they can be printed by the instructor. If you need assistance printing, contact the Help Desk at Ext. 4357. Instructors may also request individual results to be sent to student email accounts.
Scanning requests are picked up Monday through Friday at 10:00 AM (excluding holidays). Requests dropped off after 10:00 AM will be picked up and processed the following business day. Please allow up to two business days from the time the exams/surveys are picked up for processing.
Test Request Forms and answer sheets are available in departmental offices or you may use the following link: Test Request Form (PDF)
Instructors requesting the scanning service must submit the following:
1. Test Request Form
Please submit a completed form with each scanning request. An example of a completed Test Request Form can be found here.
2. Answer Key
Answer keys must be completed using a No. 2 pencil, filling in bubbles completely and darkly. An example of a completed Answer Key can be found here. (see ExampleCompletedKey.pdf)
The Name Field and Identification Field must be completed as follows:
3. Weight Key (If applicable)
For more information on creating weight keys see our FAQs page by clicking on the "FAQs" tab above.
4. Student answer sheets (Scantrons ONLY)
Scantrons must be completed using a No. 2 pencil. Student information should be completed as follows:
Note: Students must indicate their student ID# on the answer sheet in order for their scores to be sent via email. The instructor's email address must be indicated on the request sheet for scores to be emailed to students.
Emailed results are available as four text files or as four worksheets in a Microsoft Excel workbook.
We process all exams as quickly as possible on a first-come, first-served basis. However, the workload varies throughout the year and staffing is limited. The two-business-day window accounts for high-volume periods (such as mid-term or final weeks) and staff shortages due to unexpected absences or scheduled time off. If you have not received your exam results within this allotted time, please contact our offices at Ext. 4864.
You can drop off your tests at the Help Desk in Andersen Library Room 2000 between the hours of 8:00 AM and 9:00 PM, Monday through Thursday, and between 8:00 AM and 4:30 PM on Fridays during the school year.
Whenever the Help Desk is closed, including weekends, arrangements have been made with the Library circulation desk for after-hours drop-offs. You are encouraged to call Ext. 4864 to let us know of an after-hours drop-off so we can be sure it is picked up and processed the next business day.
Currently, this can only be accomplished by doing the following:
No. Just leave that question blank on the answer key and make a note in the special instructions that the question or questions are to be omitted, so we know the omission is intentional. If you decide to drop a question after you've given the tests back to the students, we will usually need to re-run the results. You can request this by calling Ext. 4864. We usually do not need to re-scan the tests.
Include that information in the special instructions or call Test Scoring at Ext. 4864. The tests will be run twice with the two different answer keys. You will receive two reports and will need to take the better of the two scores for each student. If a question has more than two correct answers, or if any answer is to be accepted as correct, you will probably need to hand-score that question. Requesting the output in Excel will simplify this process.
We moved the drop-off location to the Help Desk to improve customer service. The Help Desk area is also more publicly accessible. We appreciate your understanding and cooperation as we make this transition. All test scoring operations and the campus mail address for Test Scoring remain in McGraw Hall 208. Only the drop-off location has changed.
We no longer provide hard-copy printouts. However, you may request Notepad file output, and you will receive four text files as email attachments, which you can print yourself. These will be EXACTLY like the standard reports we provided in the past. If you are having trouble printing, check that the print setup is set to "landscape" and the font size to 8. Contact Test Scoring at Ext. 4864 or the Help Desk at Ext. 4357 if you need help with this.
By default, each exam question is worth one point. Using a weight key allows you to vary the point value of each question from 1 to 5 using the corresponding scantron bubble (A=1, B=2, C=3, D=4, E=5).
A typical situation where you might use a weight key is if you have true/false questions and multiple-choice questions on the same exam. Say you want the true/false questions to be worth 1 point and the multiple choice questions to be worth 2 points. In this case, you would create a weight key where you would fill in A for each true/false question and B for each multiple choice question. An example of a completed weight key can be found here (PDF).
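As a rough sketch of the arithmetic involved (this is an illustration, not the actual scanner software), weighting works by adding the point value from the weight key whenever a student's answer matches the answer key:

```python
# Illustrative sketch of weight-key scoring (not the ICIT scoring software).
# Bubble letters on the weight key map to point values A=1 ... E=5.
WEIGHT = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

def score_exam(answer_key, weight_key, student_answers):
    """Sum the weight of every question the student answered correctly."""
    total = 0
    for correct, weight_bubble, given in zip(answer_key, weight_key, student_answers):
        if given == correct:
            total += WEIGHT[weight_bubble]
    return total

# Example: Q1-Q2 are true/false worth 1 point (weight A),
# Q3-Q4 are multiple choice worth 2 points (weight B).
key = ["A", "B", "C", "D"]
weights = ["A", "A", "B", "B"]
student = ["A", "B", "C", "A"]            # misses the last question
print(score_exam(key, weights, student))  # 4 of a possible 6 points
```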
If you have further questions regarding weight keys and their use, please contact our offices at Ext. 4864.
Yes. Just be sure to note that in the special instructions on the request sheet. Please call or email email@example.com when you are ready to send the emails to the students.
If you have been getting the output in Notepad format, you may try asking for it in Excel; then you can highlight whatever you want and either print that or save it into a new Excel file. Call Ext. 4864 if you need tips on working with Excel for your needs. The Help Desk can also help you with technical questions.
Notify us in the "Special Instructions" portion of the request form that the exam includes polling questions, and fill in all bubbles for each poll question on the answer key. Filling in all bubbles on the answer key causes the scanner to omit the question from grading and serves as a second indicator, to both Test Scoring Services and yourself, that the exam contained poll questions. You will then see how each poll question was answered in the Item Analysis portion of the exam results you receive.
To see a completed example of an answer key containing poll questions, click here.
Students will receive their scores by email only if the professor requests it and the students fill out the Identification Number field with their seven-digit Student ID#. If a student does not want the results emailed for some reason, he or she can leave the field blank.
If a student has not received his or her results while the rest of the class has, it is usually for one of the following reasons:
Tests are sent back one business day after processing. This holding period gives professors time to review exam results and contact Test Scoring Services in case an error has occurred and exams need to be re-scanned. If an instructor needs to have the exam sheets back sooner, he or she may fill out a Special Pick Up Request form. Special Pick Up Request forms can be found at the Andersen Help Desk location, or can be downloaded by clicking here.
Since the new student IDs were introduced in the fall of 1999, it is no longer necessary to use Social Security Numbers (SSNs) on any tests. If you have any instructions that still say to use the SSN, they need to be corrected immediately. Please disregard the words "Social Security Number" in the Identification box on the answer sheets and use only the student ID#, if requested or desired. In short, you should NEVER use SSNs on tests.
For some basic explanation of item analysis and statistics concepts, you can see the Statistics Explained page.
The test scoring process in no way updates any student records. Test scoring is a self-contained system that does not update or change any other data. We just generate the scores for that particular test and we provide that to the instructor. He/she or the department keeps the running tally for the final grade and submits that to records. If you have any concerns contact your instructor or the department office.
The following description of the test item analysis produced by this program is rather general in nature. For more specific information concerning any particular statistic, reference should be made to any good statistics or tests-and-measurements textbook.
Indicates the question number on the exam.
Indicates what the correct answer was as indicated on the answer key.
Indicates the point value of the question.
The point-biserial correlation coefficient measures the relationship between the score on the item and the score on the test. The value of this statistic ranges between -100 and +100. A high positive value indicates that those who answered the item correctly also received higher scores on the test than those who answered the item incorrectly. A high negative value indicates that those who answered the item correctly received lower scores on the test than those who answered the test item incorrectly. A near-zero value indicates that there is little relationship between the score on the item and the score on the test. It is desirable to retain items with a high positive correlation coefficient and to eliminate those with near zero or negative values. As a rough guide, it is suggested that the items with negative or near-zero (10 or less) correlations be eliminated or substantially revised, and those with low positive (10-30) correlations be studied to determine how improvement might be accomplished.
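One common way to compute the statistic described above, sketched here for illustration (this is not the ICIT program itself, and some texts exclude the item from the total before correlating), is as the Pearson correlation between the 0/1 item score and the total test score, scaled by 100 to match the report's -100 to +100 range:

```python
# Hedged sketch of a point-biserial computation: the Pearson correlation
# between a dichotomous (0/1) item score and the total test score,
# multiplied by 100 to match the scale used in the report.
import math

def point_biserial(item_correct, total_scores):
    n = len(item_correct)
    mean_x = sum(item_correct) / n
    mean_y = sum(total_scores) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(item_correct, total_scores))
    var_x = sum((x - mean_x) ** 2 for x in item_correct)
    var_y = sum((y - mean_y) ** 2 for y in total_scores)
    return 100 * cov / math.sqrt(var_x * var_y)

# Students who answered the item correctly also scored higher overall,
# so the statistic is strongly positive.
item = [1, 1, 1, 0, 0]
totals = [48, 45, 42, 30, 28]
print(round(point_biserial(item, totals)))  # prints 97
```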
Percentage of students who got the answer correct (the closer to 100, the easier the question).
The “average” student response to an item.
The columns headed A, B, C, D, E, and omitted indicate the percent and number of students who responded accordingly. These can be used to determine the pattern of responses for multiple-choice items.
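As an illustration of how those columns could be tallied for a single item (a sketch only, with a blank string standing in for an omitted response):

```python
# Hedged sketch: tally the count and percent of students choosing each
# option A-E, plus omitted (represented here by the empty string "").
from collections import Counter

def response_distribution(responses):
    counts = Counter(responses)
    n = len(responses)
    return {opt: (counts.get(opt, 0), round(100 * counts.get(opt, 0) / n))
            for opt in ["A", "B", "C", "D", "E", ""]}

# Eight students' answers to one item; one student left it blank.
answers = ["A", "A", "B", "C", "A", "", "D", "A"]
print(response_distribution(answers))
```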
The Kuder-Richardson internal consistency formula number 20 has been used to compute the reliability estimate provided in this analysis. A reliability coefficient of this type gives an indication of the extent to which individuals taking the test again would receive the same scores. Values of the Kuder-Richardson reliability estimate range between 0.000 and 1.000. A value close to +1.000 indicates the test exhibits a high degree of reliability. Estimates should be interpreted cautiously if large numbers of students are unable to complete the test within the allotted time. For a typical 50-minute classroom examination covering related subject matter, a reliability coefficient of at least .75 is desirable. Reliability can be improved through item revision based upon the item analysis data computed in this program. Lengthening the test (when this is practical) will also increase reliability, particularly in the case of short examinations.
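The KR-20 estimate follows a standard textbook formula, r = k/(k-1) * (1 - Σpq / σ²), where k is the number of items, p and q are the proportions answering each item correctly and incorrectly, and σ² is the variance of total scores. As a rough illustration (a sketch, not the actual scoring program), it can be computed from a matrix of 0/1 item scores:

```python
# Hedged sketch of the KR-20 reliability estimate from a score matrix:
# rows are students, columns are items, entries are 1 (correct) or 0.
def kr20(item_matrix):
    n = len(item_matrix)           # number of students
    k = len(item_matrix[0])        # number of items
    totals = [sum(row) for row in item_matrix]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n   # population variance
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_matrix) / n       # proportion correct
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)

# Four students, three items of increasing difficulty.
matrix = [[1, 1, 1],
          [1, 1, 0],
          [1, 0, 0],
          [0, 0, 0]]
print(round(kr20(matrix), 2))  # prints 0.75
```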
Standard Error of Measurement
The standard error of measurement is an estimate of the amount of error to be expected in test scores. It is interpreted in the same manner as a standard deviation. A standard error of measurement of 3.500, for example, indicates that for any particular test score, the odds are 2 to 1 that the student's true score (his or her average score on several similar tests) will not deviate from the one obtained by more than 3.500 points. The more reliable and error-free the test, the smaller the standard error of measurement. This direct application to scores makes the standard error of measurement especially useful when evaluating differences among students or assigning grades.
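The standard error of measurement is conventionally derived from the test's standard deviation and its reliability via SEM = SD * sqrt(1 - r). A minimal sketch of that relationship (illustrative values only, not output from the scoring program):

```python
# Hedged sketch of the textbook relationship SEM = SD * sqrt(1 - reliability).
import math

def sem(test_sd, reliability):
    """Standard error of measurement from the test's standard deviation
    and a reliability estimate such as KR-20."""
    return test_sd * math.sqrt(1 - reliability)

# A test with a standard deviation of 7 points and a KR-20 reliability
# of 0.75 has a standard error of measurement of 3.5 points.
print(sem(7.0, 0.75))  # prints 3.5
```

Note how a perfectly reliable test (r = 1.0) would have a standard error of zero, which matches the statement above that more reliable tests have smaller standard errors.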