Each major project's project manager should be responsible for producing a report to share at the end of the year. Reports should provide a general summary of the activity for the year, supporting data, and recommendations for future changes.
Academic Year 2008-09
This will be administered in an ongoing fashion beginning with the Fall 2008 semester. Data will be evaluated for inclusion in the annual report (May 2009).
Create a simple survey that will be openly accessible from the front page of the Digital Commons website. Participants will click the link, authenticate, and complete the survey. A workflow will be established with consultants to encourage completion. Simple survey of satisfaction with:
- Environment (cleanliness, comfort of seating, desks, etc.)
- Did someone work with you?
- Working on an assignment, for teaching purposes, or personal interest
Can we target machines for survey presentation? In other words, can we show surveys on DC machines and not on standard Macs at UP? Cole will talk with Justin Elliot. Justin will check on this.
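One possible approach, pending Justin's answer: gate the survey prompt on the machine's hostname. The prefixes below are purely hypothetical stand-ins for whatever naming convention the DC machines actually use; this is a sketch of the idea, not a confirmed implementation.

```python
import socket
from typing import Optional

# Hypothetical naming prefixes for Digital Commons machines; the real
# convention would come from Justin's group.
DC_HOSTNAME_PREFIXES = ("dc-lab", "dc-mac")

def should_show_survey(hostname: Optional[str] = None) -> bool:
    """Show the survey link only on targeted DC machines."""
    name = (hostname or socket.gethostname()).lower()
    # str.startswith accepts a tuple, so any listed prefix matches.
    return name.startswith(DC_HOSTNAME_PREFIXES)

print(should_show_survey("DC-LAB-042"))  # a Digital Commons machine
print(should_show_survey("up-mac-017"))  # a standard UP Mac
```

The same check could run in whatever login script or kiosk wrapper the lab machines already use, so standard Macs at UP never see the prompt.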
Three surveys have been created. The first is for student end-users, the second is for support staff, and a third for administrators. These are being reviewed again before implementation. A Google form survey was sent to local support personnel and the results analyzed. Most folks are very pleased with the DC staff support.
ANGEL 7.2 WebEQ
Since the upgrade to 7.3, WebEQ is available for faculty use. No assessment is currently planned.
Later on, we may conduct interviews with instructors and surveys of students in courses using the ANGEL WebEQ feature. We might ask how well it meets the needs of math and engineering instructors.
Collect usage data and case studies from those using this service, perhaps via a survey or interviews. Ask about awareness of, usage of, and impact from the use of SCOLA.
How do we get people to actually use the service? Targeted marketing needs to happen very soon, getting the right information to the campuses early. Possible audiences include:
- foreign languages
- cultural studies
- political science
Vicki talked with Derick and they are trying a different approach.
Technology Learning Assistants (TLA)
Temporarily on hold for further review.
This will be administered in an ongoing fashion beginning with the Fall 2008 semester. Data will be evaluated for inclusion in the annual report (May 2009).
Create a simple survey that will be openly accessible from the front page of the TLA website. Participants will click the link, authenticate, and complete the survey. A workflow will be established with TLAs to encourage completion. Simple survey of satisfaction with:
- Degree of satisfaction with TLA
- What was the issue?
- Satisfaction with the assistance received
Also assess the impact on the student learning assistants themselves. How did this experience in tutoring benefit them?
Vicki will talk with Jeff Swain to develop the survey/interview questions.
Can this be included in a general customer service approach? There is also the learning experience from the student's viewpoint. Can we use Lynda.com for any training?
FACAC 2009 Student Survey
Data from the 2008 FACAC survey is being analyzed in more depth in reference to demographics and class or faculty rank. While the information from staff and faculty may change little over a year or two, student behaviors may demonstrate greater variability. As a result, a 2009 survey of undergraduate students would help monitor these behaviors.
Question selection to begin May 2008 with a proposed September 2009 delivery of survey, limited to the PSU undergraduate population.
To collect data such as:
- student use of technology tools (ongoing data)
- student use of multimedia tools
- student gaming habits (hours spent, what games, social implications, etc.)
- student experiences with hybrid / blended learning (readiness / expectations)
The external data collection will be managed by the State Data Center's Center for Survey Research at Penn State Harrisburg. General descriptives and crosstabs will be reported to us and a copy of the dataset will be provided no later than the end of November.
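The descriptives and crosstabs the Center for Survey Research will deliver can be reproduced locally once the dataset arrives. The sketch below uses pandas on a toy stand-in dataset; the column names and codes are illustrative only, not the actual survey items.

```python
import pandas as pd

# Toy stand-in for the dataset the Center for Survey Research would return;
# column names and response codes are illustrative only.
df = pd.DataFrame({
    "class_rank": ["Freshman", "Senior", "Freshman", "Junior", "Senior"],
    "uses_multimedia": ["Yes", "No", "Yes", "Yes", "No"],
    "gaming_hours": [10, 2, 14, 6, 0],
})

# General descriptives for a numeric item (count, mean, quartiles, etc.).
descriptives = df["gaming_hours"].describe()

# Crosstab of a behavior item against a demographic item.
ct = pd.crosstab(df["class_rank"], df["uses_multimedia"])

print(descriptives)
print(ct)
```

Running the same two calls over each behavior/demographic pairing would generate the tables needed for the "bite-sized" reporting described below in a few lines per item.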
Outcomes from this finer-grained analysis will be packaged into "bite-sized" information chunks to share detailed insights and facilitate decision-making.
Vicki will talk to each group within TLT regarding their needs for student information.
Question: If PSU participates in the 2009 ECAR survey, perhaps much of the information can be obtained in the PSU subset of that survey, in addition to our own student survey.
A Faculty-TA/Staff survey will come early in 2010.
National Survey of Student Engagement (NSSE)
Vicki and Brett co-chair a committee to review the NSSE results and see how they fit into the TLT Strategic Plan. I am currently reviewing the NSSE results for the College of Earth & Mineral Sciences to identify issues.
I created a matrix that shows the strategic goals matched to NSSE goals. Brett and I have filled in the TLT initiatives/services column with what we know and will enlist the Managers' group to contribute during the 6/23 Ops meeting.
Customer Satisfaction of Provided Services
Major plan for overall assessment of services provided by TLT. Barb has a list of services. The plan calls for:
1. a path for immediate feedback and notice of problems
2. a centrally distributed survey of customer satisfaction
3. an information-rich survey for each specific service
Math 21 UP
Mastery quizzes. Originally, Gary converted quizzes into an ANGEL-friendly format. The idea is to use these quizzes in ANGEL as measures of mastery in Math 21 (College Algebra I). Students can attempt the quizzes as preparation for exams. Fall 08 will pilot these quizzes. Students will be surveyed about their use of and experience with the quizzes. Aggregate grades will be compared to previous semesters.
IRB application in progress; followup questions answered. Students will take the first survey sometime mid-September, followed by another mid-semester, and a third at the end of the semester.
The first survey was administered the second week of September. The data was collected, analyzed, and reported by September 29. No real surprises were noted, except that expectations for grades were high. The second survey is being administered October 6th through 17th, and the last should be near November 15th. Vicki met with the instructor to plan out the rest of this semester and the analysis. Stan would like to compare student learning outcomes with things like the predictive nature of FTCAP actions and responses to attitudinal questions.
The third and final survey of the semester was administered last week (12-10) and has a 20% return so far. The survey will be closed 12-19 and a summary report provided to the Math 21 instructor, Stan Smith, and James Sellers.
Spring 2009: This semester, we repeated the pilot with another four sections. Procedures were approximately the same. The first two surveys have been completed and reported. The third was administered the third week in April, and the results were tabulated and distributed May and June 2009.
Math 21 DuBois
Another effort to improve student math performance is underway at PS DuBois. In the Fall 08 semester, the instructor (Rick Brazier) conducted a paper-and-pencil assessment of students' views of how the math course was going. In Spring 2009, three instructors began using the Hawkes Learning System, which pairs a textbook with a CD-ROM set of exercises and quizzes. Near the end of the semester, we sent surveys to 67 students in three (3) sections. The data was analyzed and reported in June 2009. Results will be used to plan next year's instruction.
Textbook selection and online resources for Math 22 (College Algebra II) consist of product demonstrations from Pearson and Cengage and a comparison of features. Currently, the College Algebra, 10/E, Margaret L. Lial (Pearson Higher Ed) text is favored. In conjunction with the text, MyMathLab is favored over WebAssign at this point, due to its friendlier interface and reference connections to the text. To date, a brief survey was administered to students at the end of the SP08 semester regarding the effect on grades and the frequency with which homework is collected.
Jeff Swain and Vicki meet separately with James Sellers and Mary Erickson on Math 21, 22, and 110 needs.
Vicki met with Mary and discussed the needs for a collaborative math learning space where students could meet to work on homework or peer tutor, as well as with instructors for additional learning opportunities. They need 100 computers for their lab classes and collaborative space for their activity classes.
Spring 2009 We used a survey that was based on the Math 21 entry survey and incorporated questions from the ALEKS folks at Pearson. (3-20-09) The first survey was completed and reported. The second survey was administered 2/3 of the way through the semester. Results were analyzed and reported May/June 2009.
Vicki met with Mary to discuss the implications of the results. Planning for fall is designed to address the student issues highlighted by the survey. They will be changing the weekly class sessions to 1 lecture session (lectures are being prepared by Mary and distributed to TAs to improve consistency of instruction across sections), 1 lab session (held in Sparks 7, students will work on ALEKS and Content-on-Demand online assignments while instructors monitor and assist), and 1 activity session (students are given a set of problems to solve in self-selected collaborative groups and their solutions must be checked by two other students before submission).
Surveys will be repeated this fall.
Toys'n MORE (NSF)
This is an NSF grant project headed by the Science, Engineering, and Technology program (Dushy is leaving, to be replaced by Renata). It will employ interventions and measure math performance in Math 22, 26, and 140 at the Penn State campuses. The pilot phase will include math students at DuBois, Worthington Scranton, and Shenango Valley.
The official description reads: Strategy I: The Math Online Success through Tutoring (MOST) program proposes to increase retention of students in mathematics course sequences leading to calculus courses by combining instructor intervention and supplemental instruction (tutoring) with personalized online tutorials. Coalition mathematics faculty will develop tutorials and the peer tutors, through the Learning Centers located at the coalition campuses, will host the tutoring sessions.
Strategy II: Toy FUN-damentals is an existing 1-credit seminar currently offered to first-year Mechanical Engineering students. It uses toy making to introduce engineering design and prototyping. This course has proven to increase retention of women in the College of Engineering. A modified version of the course will be implemented at the 14 coalition campus locations to meet the needs of local STEM fields. The course will be open to all STEM students.
Spring 2009, students were sent invitations to participate in the pilot which only tested the consent form and the demographics survey.
Two exams in survey format have been created. The first is a diagnostic pre-test and the other is a final exam designed to measure learning gains. (They required a great deal of time, as they are full of equations.) Those will be piloted fall 2009. A third survey has been designed for the Toys participants to measure attitudes toward math.
Engineering Design is implementing the use of an eText and online resources (ichapters.com / Cengage Learning) for class and classroom presentation/lecture. We will be assessing the impact on student learning and faculty workload with its use. This involves an IRB (under construction) and survey for fall semester.
The student satisfaction survey was administered 11/6 and responses were collected from 25 students in his class.
A paper was written and accepted for presentation in January 2009 at the Annual Engineering Design Graphics Division of ASEE's Midyear Meeting at Berkeley.
A spring semester assessment was conducted and included another control group. The instructor has NO required text for the course, but will recommend students purchase the five essential chapters from the Lieu text. A second survey was administered that measured student attitudes toward the use of Google apps. Twenty-nine PSU engineering students and 19 from a school in Spain participated.
To assess the Blended Learning Initiative course, a student survey has been created and will be administered near the end of the semester. Also, an online SRTE form has been created to provide additional feedback for the department and the instructor.
Students were offered extra credit to complete a survey of their experiences with the course. The survey was administered to 27 students and the results analyzed. The online SRTEs for this course were offered to students, but only three have completed them so far. Elizabeth feels the feedback from the survey is more helpful in making revisions.
Student Online Learning Readiness
A self-assessment tool is being developed for determining the readiness of students to take a course that is entirely online. Currently working with Robin Gill from DuBois, Jean McGrath from WC, and Suzanne Weinstein from Schreyer. More at http://ets.tlt.psu.edu/learningdesign/assessment/onlinecontent/online_readiness
A possible Flex format is being investigated with Stevie Rocco for posting on the PSUonline page along with the Faculty Readiness to Teach Online assessment. Natalia has the Flex code and is changing the text to reflect the Student version. Then Stevie can upload the XML file.
While this approach will work for the e-Learning Cooperative or PSU Online, we still need to validate the tool and test it for reliability, then offer it in several venues for use by advisors as well as students.
The URL was sent to the Director of Admissions at PS Harrisburg for testing. His feedback was positive, and he made suggestions for improving the feedback provided to students at completion of the survey. That feedback will be improved and made available for general use by advisors and prospective students. In the meantime, I have received additional suggestions from admissions advisers and student advisers. The tool will be revised and ready to use during the second semester of 2008-09.
We are evaluating two options. We need to investigate the viability of option one, but we believe it may be programmatically possible.
- Ongoing post session assessment to begin in Fall (as close to the start of classes):
- When user selects to exit Adobe Connect, they are prompted to complete short survey.
- Adobe Connect window closes and they are taken to a survey page (not reliable on log-out, only on log-in, since most users do not log-out, but just close window.)
- Advantage is the "in the moment" access to the participant
- Disadvantage is that the length of the survey must be very short (no more than 5 items) and will appear each and every time one logs in, even if it is to set up a meeting or check settings prior to a meeting.
- Invitation to all previous years' participants (results available for annual report):
- Invitations sent to all users in the system between a date range
- Advantage is that we have the potential to capture a more overall perspective on Adobe Connect satisfaction and use
- We currently have a mass mailing survey instrument prepared
- Disadvantages are low response rate, inaccurate invitation lists, and the inability to capture session by session thoughts
Questions to be answered: Are there existing instruments available that we can use to measure Connect impact? We are already pursuing the Green Meter idea with Adobe via Barb. Can we broaden this to Polycom and other video conferencing systems? Who can provide data or usage reports on videoconferencing in general? What information is the Nursing Program collecting on this? Does Purdue or someone else have an assessment instrument or process for videoconferencing?
We need to ask John Spotts to maintain an ongoing report of service issues and support categories.
Barb Smith, Yvonne Clark, Vicki Williams, and John Spotts will communicate to accomplish this.
As of 10/28/08, an instrument has been created, but a series of interviews will be conducted first to determine the validity of the instrument. 25-30 people were randomly selected from the list of users and will be contacted to answer some questions about their use of Connect. As of 4-15, 150 had been contacted, resulting in 25 good interviews.
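The random selection step above is straightforward to do reproducibly. The sketch below draws 30 names without replacement from an illustrative stand-in user list (the real list would be the exported Connect account roster); the addresses and seed are purely for demonstration.

```python
import random

# Illustrative stand-in for the exported Connect user list; the real
# list would come from the system's account export.
users = [f"user{i:03d}@psu.edu" for i in range(1, 501)]

rng = random.Random(2008)             # fixed seed makes the draw repeatable
interviewees = rng.sample(users, 30)  # 30 names, drawn without replacement

print(len(interviewees))
```

Fixing the seed means the same 30 names can be regenerated later if the contact list needs to be reconstructed, which matters when tracking who among the 150 contacted actually completed an interview.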
For more information on the assessment of Adobe Connect as a delivery system for Training modules, go to Adobe Connect in Training.
Podcasting and iTunes U
This year we propose to introduce a standard survey that will be available towards the end of each semester to gather feedback on the use of podcasting through iTunes U. We maintain a list of all faculty with iTunes U sections. A form email will be sent to faculty with iTunes U sections with a link to a standard assessment form. Data will be evaluated at the end of each semester and rolled into a single report for the annual report. We are recommending a reuse of the original podcasting survey.
Vicki will work with Chris Millet, Tim Perry, and others to do this.
- Ask faculty how they are using blogs
- Ask students how they are using them
- Ask all if they are satisfied
Vicki needs to talk with Cole about the questions he wants answered.
Where does the email list come from?
Educational Gaming Commons
Again, we are proposing multiple assessments around the EGC project: addressing satisfaction with the facility and the services offered within it, providing assistance to faculty who are investigating games for learning, and conducting focus groups/interviews to better understand faculty utilization, using the revised case study methodology described below.
The Facility/Service satisfaction survey will be administered from a link on the EGC website. If we use a card swipe for the doors, we can also send an end of year survey to those who used the room. Basic questions include:
- Environment (cleanliness, comfort of seating, desks, etc.)
- Did someone work with you? Were they helpful?
- Working on an assignment, for teaching purposes, or personal interest
- Perceived impact on learning
Additionally we will produce a series of case studies that will illustrate the impact of gaming in education. Vicki will work with Brett to identify key faculty, a series of questions, and a methodology for capturing the right kinds of information. We'd like to make these publicly available as well. The first case study should be completed by the end of the Fall 08 semester.
So far, an interview script has been created and we next need to identify the faculty members to interview for the case study write-up.
Vicki will work closely with Brett Bixler, Chris Stubbs, and Bart Pursel.
Brett and Vicki have developed a set of questions.
Collaborative Learning Spaces
Data has been collected on the Pollock facility and the report will be updated to include observations from the Spring 08 semester.
Users of the new spaces in HHD and Warnock will be surveyed regarding their preferences in furniture and layout. The survey is being finalized by Vicki and Mary with Natalia's assistance.
Natalia and Naseem have distributed and collected surveys from 13 students and have given copies to the lab attendants.
Group Problem Solving in Engineering
EMECH 211 and 213 are using clickers. While data was collected last year on the effectiveness of clickers as a learning facilitator, this year Group Problem Solving activities were added, and students will be surveyed at semester's end about the activities' effectiveness for comprehension of course concepts and student-student interaction.
VTech Classroom Assessment
This is in partnership with CLC to investigate utilization and satisfaction in VTECH classrooms, specifically 207 and 409 Burrowes. A survey has been created with administration of the survey to begin in July 08, with results available in August.
The survey was distributed to 150 email addresses of people who had logged into the software. Responses included "I never used this" and were generally not helpful in determining usage patterns.
Videoserver / PSUTube / Lynda.com
Develop assessment of new video services. Related is the assessment of the use of Adobe Connect as a delivery system for Training Services and their version of Lynda.com.
Web 2008 Conference
The Web 2008 Conference was evaluated in July 08. Data was collected in survey format, and results were analyzed and reported to the committee (Patti Fantaske). Completed 7/10/08.
Web 2009 Conference
Mark Heckel modified the earlier conference evaluation form for use this year.
2009 TLT Symposium. Survey evaluations were sent to 402 participants. Results were available for the annual debrief approximately two weeks after the event.
Digital Media Day
Derick and Vicki have begun working on an evaluation survey for the Digital Media Day, Jan 7, 2009. Derick will determine the kind of information to be collected and Vicki will craft the survey questions.
General Focus Group and Feedback Sessions
We will be inviting members of our audience to participate in informal discussions related to several TLT initiatives throughout the year. These include, but are not limited to the following:
- New TLT website
- Faculty experience with various technologies
- Student reactions to services
- Futures conversations
- Who do we want to collect information from?
- What information do we want to collect?
- Ask students about a site to buy/sell/trade textbooks and find e-texts
- When will we collect it?
- Where do we anticipate collecting it?
- Why is this important and what will we do with the information we collect?
Revised Case Study Methodology
For case studies, we are proposing to change the model. The Assessment Team will provide a script of questions to the Marketing and Communication Team to use while interviewing selected faculty. The Assessment Team will use the responses to generate case studies while the Marketing and Communications Team will use them to create Profile in Success stories.
All ID Faculty Development Needs Assessment
Working with the ID community, create a survey that can be customized and administered centrally for University-wide collection of data regarding faculty needs for training and professional development in teaching activities.
The hosted survey has been created and we will begin collecting data on October 15th. ID users may include this link in an email to their faculty and we will provide the selected output frequency reports.
Laptop Software for Architecture Students
Jonathan Holman is working with the Architecture department to provide course/lab software loaded onto student laptops. Student users will be surveyed for their opinions about the software experience. Jonathan and Vicki are working on a survey.
Technology Learning/Training Gaps
- Through increased partnerships/programs with units, faculty, and various groups, we'll learn where the gaps are that we may or may not be addressing, and be able to better tailor our services to those gaps. This may very well be in the form of interviews.
- An example would be...we'd really like to be more organized, but we don't know whether to use a wiki, project mgmt software, Sharepoint, or what. More specifically, if they've determined they want to use a wiki for some purposes, but they must work with folks who refuse, we can recommend training on wikispaces, as well as Acrobat which can help convert the wiki page into an Acrobat document.
- It has been several years since Training Services has done a broad assessment of its audience which has made it increasingly difficult to ensure that we are meeting the training needs of our customers. This year we plan to create and distribute a survey (via campus/college/department contacts, dedicated listservs, and newswires) to students, faculty, and staff throughout the University to gather detailed data on:
- preferred learning styles
- training needs
- work environment (i.e. is it conducive for participating in online training)
- preferences for instructor-led seminars
- preferences for learning more about our services
These classes will mostly consist of overviews on some of our most popular topics. Sessions will be offered face-to-face while also being streamed live using Adobe Connect. An email with a link to an online survey will be sent to each participant immediately following the class. Those participating via Connect will be provided a direct link to the survey. Surveys will assess the content, instructor and delivery method for those participating via Connect. After reviewing the survey results, focus groups and/or individual interviews may be conducted to gather additional data, especially regarding the experience of participating via Connect.
- Expected number of offerings per semester: 15
- Expected number of attendees per session: 8 in the classroom and 5 virtually
Hands-on sessions will continue to be assessed using an online survey that participants will complete at the end of the class. These will be used to assess the quality of the content, learning environment and instructor.
- Expected number of offerings per semester: 80
- Expected number of attendees per session: 10
Training on Demand (TOD)
Training Services will make use of Adobe Connect to improve Commonwealth Campus access to our training offerings, including hands-on sessions. Participants will be provided a direct link to the survey which will assess the content, instructor and delivery method. After reviewing the survey results, focus groups and/or individual interviews may be conducted to gather additional data, especially regarding the experience of participating via Connect.
- Expected number of offerings per semester: 10
- Expected number of attendees per session: 10
Through building partnerships with various colleges/departments, we plan on learning more about the University's training needs. One expected outcome of this is to increase the number of face-to-face TODs that we conduct, particularly at University Park. Participants will receive an email containing a link to the survey, which will assess the content, instructor, and delivery method. After reviewing the survey results, focus groups and/or individual interviews may be conducted to gather additional data.
- Expected number of offerings per semester: 40
- Expected number of attendees per session: 16
Because of the IPAS project and the results of the most recent FACAC survey, providing security training will be of utmost importance over the next year. For many topics we will continue to bring in vendor training. Each vendor will be asked to provide us copies of the class surveys for our review.
- Expected number of offerings per semester: TBD in conjunction with SOS.
- Recordings of lectures will be made available on the Training Services web site. At the end of each recording, a survey will be available that will ask the participant to rate the content and their viewing experience. Whatever can be used to motivate survey participation (e.g., chance to win This or That) should be utilized.
- Other asynchronous modules may be developed for internal systems such as ANGEL, Workflow, Data Warehouse, etc. At this time it is not possible to determine the number of modules that will be developed; it will vary depending on how many requests we receive for this type of training. Modules will contain links to surveys to assess the quality of the training.
If a site license is purchased for Lynda.com tutorials, a yearly assessment would be conducted asking users to evaluate the quality of the tutorials, why they are using them (i.e. personal growth, class requirements, part of a training curriculum), additional topics needed, content quality, etc. The survey would be sent to a sample of those who have logged in to use the training modules.
Indiana University Training Materials
In the Summer of 2007, Training Services purchased Indiana University's award-winning training materials on topics such as Office, Adobe products, etc. These materials were made available to trainers throughout the University. This year, as part of our agreement, these individuals will be required to make the materials available to other University training providers and to track which materials they use, the number of training sessions the materials were used for, and the number of participants in those sessions.