Is anyone familiar with how difficulty and discrimination are calculated in the item analysis of the assessment report?
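For context, item analysis reports usually follow classical test theory: difficulty is the proportion of students who answered the item correctly, and discrimination is commonly the point-biserial correlation between the item score (0/1) and the student's total score. This is a general sketch of those standard formulas, not a statement of how ANGEL computes them internally:

```python
# Classical item analysis sketch (standard CTT definitions; ANGEL's exact
# method may differ). Item scores are 0/1; totals are the students' test scores.

def item_difficulty(item_scores):
    """Proportion of students who answered the item correctly (0..1)."""
    return sum(item_scores) / len(item_scores)

def item_discrimination(item_scores, total_scores):
    """Point-biserial correlation between item score and total test score.

    Values near +1 mean high scorers tend to get the item right;
    values near 0 (or negative) suggest the item doesn't discriminate.
    """
    n = len(item_scores)
    mean_i = sum(item_scores) / n
    mean_t = sum(total_scores) / n
    cov = sum((i - mean_i) * (t - mean_t)
              for i, t in zip(item_scores, total_scores)) / n
    var_i = sum((i - mean_i) ** 2 for i in item_scores) / n
    var_t = sum((t - mean_t) ** 2 for t in total_scores) / n
    return cov / (var_i ** 0.5 * var_t ** 0.5)

# Hypothetical example: six students' responses to one item, plus totals.
item = [1, 1, 1, 0, 0, 1]
totals = [18, 17, 15, 9, 8, 14]
print(round(item_difficulty(item), 2))            # → 0.67
print(round(item_discrimination(item, totals), 2))  # → 0.94
```

A difficulty near 1.0 means almost everyone got the item right (an "easy" item); many reports also use an upper-group/lower-group difference instead of the point-biserial for discrimination.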
An instructor has an essay portion on her assessment. When grading the essay, she also adds feedback in the comment box.
In the grades category under the Reports tab, the students can see their total score (for the multiple choice questions and the final essay); however, the instructor's comments do not show in the box.
I never had this issue with the quiz feature. Do I need to adjust the settings somewhere?
Is anyone else as uncomfortable with the overemphasis on timed everything as I am? Obviously, insufficient time results in lower levels of performance but what is less apparent is that just the perception of insufficient time results in reduced performance. Very dangerous.
I am concerned, too, about all the emphasis on assessment. Reading about it, one would think there is a shortage of assessment tools. I would like to see some PL authoring tools made available so the emphasis can be rebalanced and moved back toward learning.
I have a faculty member that I'm working with who would like to set up quizzes in ANGEL like this:
1) There are a total of 20 multiple choice questions.
2) There is a section heading inserted that selects 10 questions out of 20.
3) Each question is displayed one at a time.
4) The student has 3 attempts to take the quiz; the highest score is recorded.
5) The faculty member wants the same questions and to have them in the same order for all 3 attempts.
When I created the section heading I selected "never scramble," but when I tested the quiz as a student, taking it 3 times, each attempt included at least 1 or 2 questions that weren't in the other attempts, and the questions were never in the same order.
Any suggestions for how to fix this but still have the 20 question bank so that all students aren't receiving the same quiz?
Does anyone know how well the calculated quiz question wizard works within ANGEL? What has been the faculty experience with this tool? I'm asking so that I know if this is something I should promote/advertise within the College of Engineering.
I'm assisting a professor who administers her exams in ANGEL. She has her exam set up so that the test is 35 questions, chosen randomly from a pool of 50 questions. She wants to be able to view and print the entire 50-question pool with the answer choices. When I use the preview tool, I can only view 35 randomly chosen questions. The default edit view doesn't show the answers until you go into a specific question to edit it. I also tried the print / save-as-PDF option, but I still only get 35 questions. Has anyone dealt with a situation like this before, or have an idea how to view all questions and answers at once?
Q: Can grades from UTS be pulled into the Course Gradebook?
A: Instructors can pull in grades from two services provided by UTS: Test Pilot and bubble sheets. There is a link on the Lessons tab that enables instructors to retrieve these test scores from shire and then place them in the Gradebook (figure). There is a link in ANGEL Help under Faculty Documentation for the Penn State Test Tool that provides additional information (figure).