Provide Academic Guidance To Re-Admitted Students Through The Monitored Academic Progress (MAP) Program
The MAP program will provide quality academic advice and mentoring to students who, following their suspension, are re-admitted by their respective Deans as probationary students with the goals of returning students to good academic standing, improving their grade point averages, and increasing their awareness of the benefits of academic mentoring.
Objective
Increase Academic Accountability In MAP Students
Students participating in the MAP program will realize the importance of academic skills and accountability.
KPI
MAP Student Surveys
Over the last several years, students have been asked to evaluate the MAP program via a survey concerning their perceptions of the program’s effectiveness and their perceived academic improvement. Although student responses have always been well above our target score, the small number of completed/returned surveys is a concern for the program (76 of the 495 sent [15.4%] during the 2012-2013 academic year). To boost this number, the mentors will take the following actions:
Send out e-mail reminders twice before the end of the semester: (a) once before we send out the e-mail survey and (b) once to remind the student to complete the survey.
Use social media (i.e., Facebook, Twitter) as another avenue to remind the students to complete the online survey.
Provide paper copies of the survey to students upon completion of the program’s requirements (but track the students who were given a survey to remove them from the e-mail blast).
At least 20% of the MAP students will complete surveys regarding the perceived effectiveness of the program.
Result
MAP 2014/2015 Survey Response
Surveys were made available to participating students by their respective mentors at the end of each semester, fall and spring, with the intention of gauging students' perceptions of their academic accountability by having them rate the various requirements of the MAP program and their required participation.
MAP 2014/2015 Program Population Breakdown:
Fall 2014: 313 students enrolled in the program / 27 students (8.63%) completed the survey (10 paper forms, 17 electronic forms).
Spring 2015: 250 students enrolled in the program / 16 students (6.40%) completed the survey (8 paper forms, 8 electronic forms).
Overall: 563 students enrolled in the program / 43 students (7.64%) completed the survey (18 paper forms, 25 electronic forms).
Given these completion rates, the goal of a 20% survey completion rate was not met.
Moreover, the completion rates for the MAP surveys dropped for the third year in a row from 16.1% (2012) to 11.0% (2013) to 8.63% (2014) for the fall semester, from 14.2% (2013) to 12.0% (2014) to 6.40% (2015) for the spring semester, and from 15.4% (2012-2013) to 11.3% (2013-2014) to 7.64% (2014-2015) for the entire academic year. However, despite the small number of replies, the students rated MAP favorably:
Requirements were clearly explained - 100.00% agreed/strongly agreed
Meetings were helpful - 97.67% agreed/strongly agreed
Grade Check Forms (GCFs) were helpful - 95.35% agreed/strongly agreed
Study Skills were helpful - 95.35% agreed/strongly agreed
Treated with courtesy - 100.00% agreed/strongly agreed
Overall, the MAP program was beneficial/helpful - 100.00% agreed/strongly agreed
Action
Survey Response
Despite the additional reminders (pre- and post-), the use of social media, and having the students’ mentors contact them personally, all in addition to the initial e-mail containing the survey as well as the use of paper surveys, the survey completion rate dropped for the third academic year in a row.
However, despite the low percentage of survey participation, the students surveyed overwhelmingly perceived the program to be beneficial (range: 95.35% to 100.00%). As such, although the department will continue to reach out to students, a new focus will be on the academic effect of the program itself (e.g., GPA improvement, course completion) to find out whether it has the desired academic effect.
Goal
Promote Student Classroom Success Through The First Alert (FA) Program
The First Alert program provides quality academic advice and mentoring to students identified by professors as being “at risk” with the goal of preventing their failure in the courses for which they were referred.
Objective
Increased Referral
As a result of more effective contact with new and returning instructors, who in turn refer more students to the FA program, there will be an increase in referrals compared to the previous year.
KPI
Faculty Involvement
All faculty members who have used the First Alert program since 2008 will be e-mailed a survey asking what they perceive as the strengths and weaknesses of the program, as well as their preferred methods of submission.
A 25% completion rate is the mark of success.
Result
First Alert Survey Results
A total of 190 faculty/staff who had previously used the First Alert referral program were e-mailed and asked to complete a survey about their views of the program.
The survey was created using the LimeSurvey software, which includes a token system: the sender can see who has completed the survey, but the responses themselves remain anonymous. This system also allowed the office to continually target the specific professors who had not responded without “pestering” those who had.
E-mails requesting that the 190 faculty/staff complete a survey concerning the First Alert program were sent out in September, October, and December. Each e-mail contained a link to the survey, which used the token system described above to keep survey takers' answers anonymous.
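For illustration, the core idea of such a token system, tracking who has completed the survey while storing answers with no link back to an identity, can be sketched as follows (a minimal Python sketch of the concept with hypothetical names, not LimeSurvey's actual implementation):

```python
import secrets

# Minimal sketch of a survey token system: completion is tracked per
# recipient, but answers are stored with no link back to the token, so
# responses stay anonymous. Hypothetical names; not LimeSurvey's actual code.

tokens = {}      # token -> {"email": ..., "completed": bool}
responses = []   # answers only; deliberately not keyed by token

def issue_token(email):
    """Create a one-time token for a survey invitation."""
    token = secrets.token_hex(8)
    tokens[token] = {"email": email, "completed": False}
    return token

def submit(token, answers):
    """Record a response: mark the token used, store answers anonymously."""
    entry = tokens.get(token)
    if entry is None or entry["completed"]:
        raise ValueError("invalid or already-used token")
    entry["completed"] = True   # the office knows this person responded...
    responses.append(answers)   # ...but the stored answers carry no identity

def reminder_list():
    """Target only non-respondents, so respondents are not 'pestered'."""
    return [e["email"] for e in tokens.values() if not e["completed"]]

t = issue_token("professor@shsu.edu")
issue_token("colleague@shsu.edu")
submit(t, {"strengths": "early outreach", "weaknesses": "extra paperwork"})
print(reminder_list())  # only colleague@shsu.edu still needs a reminder
```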
Of the 190 professors who were e-mailed, 4 responded:
September (1st e-mail) – 1 response
October (2nd e-mail) – 2 responses
December (3rd e-mail) – 1 response
As such, the overall response rate was 2.11%, which was far below the program’s goal of a 25% completion rate.
Action
Ending The Faculty Survey
Given the minimal faculty response rates for two consecutive years (0% for 2013-2014, 2.11% for 2014-2015), it has been decided that SAM Center mentors will no longer survey faculty in an attempt to gain a better understanding of the professors who do not use the First Alert program, as it is clear that this survey approach is not reaching them.
Goal
Provide Academic Guidance To Students On Academic Probation Through The Help Eliminate Probation (HELP) Program
The HELP program will provide quality academic advice and mentoring to students who have been placed on Academic Probation, though not suspended, with the goals of returning students to good academic standing, improving their grade point averages, and increasing their awareness of the benefits of academic mentoring.
Objective
Increase Academic Accountability In HELP Students
Students participating in the HELP program will realize the importance of academic skills and accountability.
KPI
HELP Student Surveys
Over the last several years, students have been asked to evaluate the HELP program via a survey concerning their perceptions of the program’s effectiveness and their perceived academic improvement. Although student responses have always been well above our target score, the small number of completed/returned surveys is a concern for the program (27 of the 249 sent [10.8%] during the 2012-2013 academic year). To boost this number, the mentors will take the following actions:
Send out e-mail reminders twice before the end of the semester: (a) once before we send out the e-mail survey and (b) once to remind the student to complete the survey.
Use social media (i.e., Facebook, Twitter) as another avenue to remind the students to complete the online survey.
Provide paper copies of the survey to students upon completion of the program’s requirements (but track the students who were given a survey to remove them from the e-mail blast).
At least 20% of the HELP students will complete surveys regarding the perceived effectiveness of the program.
Result
HELP Survey Completion Rates
Surveys were made available to participating students at the end of each semester, fall and spring, with the intention of gauging students' perceptions of their academic accountability by having them rate the various requirements of the HELP program and their required participation.
HELP Program Population Breakdown:
Fall 2014: 198 students enrolled in the program / 11 students (5.56%) completed the survey (6 paper forms, 5 electronic forms).
Spring 2015: 575 students enrolled in the program / 55 students (9.57%) completed the survey (27 paper forms, 28 electronic forms).
Academic Year: 773 students enrolled in the program / 66 students (8.54%) completed the survey (33 paper forms, 33 electronic forms).
Given these completion rates, the goal of a 20% survey completion rate was not met.
However, the completion rates for the HELP surveys improved from 3.4% (2013) to 5.56% (2014) for the fall semester, from 6.8% (2014) to 9.57% (2015) for the spring semester, and from 6.3% (2013-2014) to 8.54% (2014-2015) for the entire academic year.
Moreover, the students rated the HELP program very highly:
Requirements were clearly explained - 95.59% agreed/strongly agreed
Meetings were helpful - 94.12% agreed/strongly agreed
Grade Check Forms (GCFs) were helpful - 91.18% agreed/strongly agreed
Study Skills were helpful - 92.65% agreed/strongly agreed
Treated with courtesy - 98.53% agreed/strongly agreed
Overall, the HELP program was beneficial/helpful - 95.59% agreed/strongly agreed
Action
HELP Survey
Despite the additional reminders (pre- and post-), the use of social media, and having the students’ mentors contact them personally, all in addition to the initial e-mail containing the survey as well as the use of paper surveys, the survey completion rate for 2014-2015 was only marginally better than in the 2013-2014 academic year.
However, despite the low percentage of survey participation, the students surveyed overwhelmingly perceived the program to be beneficial (range: 91.18% to 98.53%). As such, although the department will continue to reach out to students, a new focus will be on the academic effect of the program itself (e.g., GPA improvement, course completion) to find out whether it has the desired academic effect.
Goal
Support Academic Performance Of Students Through Study Skills
SAM Center Study Skills programs will support the academic performance of all participating students, regardless of delivery mode.
Objective
Acquisition Of Study Skills
SAM Center Study Skills program participants will acquire study skills involving preparing, avoiding procrastination, managing time, reading textbooks/taking notes, taking tests, and managing stress, regardless of the delivery mode of the program.
Indicator
Learning And Study Strategies Inventory (LASSI)
Program participants will improve their Learning and Study Strategies Inventory (LASSI) scores during the course of the study skills series. The LASSI, a 10-scale, 80-item instrument developed at the University of Texas at Austin, uses rating scales to measure students’ perceptions of their strategic learning involving the following components: (a) skill, which includes their scores on the information processing, selecting main ideas, and test strategies scales; (b) will, which includes their scores on the anxiety, attitude, and motivation scales; and (c) self-regulation, which includes their scores on the concentration, self-testing, study aids, and time management scales. Each of the three LASSI components’ associated scales will be assessed annually on a rotating basis.
Criterion
5% Growth In Selected Scales For At Least 50% Of Participants
To establish a benchmark, at least 50% of participants during the Spring 2015 semester will demonstrate at least 5% growth in each scale of the skill component of the LASSI.
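As a concrete reading of this criterion, the sketch below computes per-scale growth and the share of participants meeting the 5% threshold (assuming growth is measured as relative change from pre-test to post-test score; the data and field names are invented):

```python
# Sketch of the 5%-growth criterion. Assumption: "growth" on a scale is the
# relative change from pre-test to post-test LASSI score. Data are invented.
SKILL_SCALES = ["information_processing", "selecting_main_ideas", "test_strategies"]

def grew_at_least(pre, post, threshold=0.05):
    """True if the post-test score is at least `threshold` above the pre-test."""
    return (post - pre) / pre >= threshold

# Each participant: scale -> (pre-test score, post-test score)
participants = [
    {"information_processing": (20, 24), "selecting_main_ideas": (18, 19),
     "test_strategies": (22, 22)},
    {"information_processing": (25, 27), "selecting_main_ideas": (20, 23),
     "test_strategies": (19, 21)},
]

for scale in SKILL_SCALES:
    met = sum(grew_at_least(*p[scale]) for p in participants)
    share = met / len(participants)
    # Criterion: share must be at least 0.50 for every skill-component scale.
    print(f"{scale}: {share:.0%} showed >= 5% growth")
```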
Finding
5% Growth Criterion Met
A total of 593 individuals registered for at least one section of study skills for the Spring 2015 semester. Of those, 16 either did not enroll at SHSU for spring 2015 or resigned from SHSU for spring 2015; one additional registrant was not a student at SHSU but was granted permission to participate in the program. Of the 576 registrants remaining, 483 enrolled in at least one face-to-face (FtF) section and 93 enrolled in at least one online section. (Some individuals registered for both types of sections.)
Of the 576 registrants, a total of 359 completed both pre- and post-test LASSIs: 315 FtF participants and 44 online participants. Note: One participant took two different FtF sections of study skills and completed pre- and post-test LASSIs for both sections, so only the first set of this participant’s scores was used in the calculation of percentages. Another participant enrolled in both a FtF section and an online section but was counted as an online participant due to completing both pre- and post-test LASSIs only in the online section.
Six participants (one FtF participant and five online participants) did not grant us permission to use their LASSI scores, so the following percentages are based on a total of 353 participants: 314 FtF participants and 39 online participants.
Total Percentages
70% of participants demonstrated at least 5% growth in the information processing scale of the skill component of the LASSI
69% of participants demonstrated at least 5% growth in the selecting main ideas scale of the skill component of the LASSI
67% of participants demonstrated at least 5% growth in the test strategies scale of the skill component of the LASSI
Face-to-face Percentages
68% of participants demonstrated at least 5% growth in the information processing scale of the skill component of the LASSI
68% of participants demonstrated at least 5% growth in the selecting main ideas scale of the skill component of the LASSI
68% of participants demonstrated at least 5% growth in the test strategies scale of the skill component of the LASSI
Online Percentages
87% of participants demonstrated at least 5% growth in the information processing scale of the skill component of the LASSI
77% of participants demonstrated at least 5% growth in the selecting main ideas scale of the skill component of the LASSI
69% of participants demonstrated at least 5% growth in the test strategies scale of the skill component of the LASSI
Action
Raise Criterion For Success And Explore Ways To Improve Growth In Test Strategies
Having far exceeded the benchmark, we intend to raise the criterion for success considerably. The program data for spring 2015 indicated that the average growth for the information processing, selecting main ideas, and test strategies scales was 21.21%, 26.13%, and 16.94%, respectively; therefore, we expect at least 50% of participants to demonstrate at least 20% growth in each scale of the skill component of the LASSI the next time it is assessed in 2017-2018. We intend to keep the current benchmark for the will and self-regulation components of the LASSI because participants' growth in these components has yet to be assessed.
We will also explore participants’ survey responses by the end of the 2015-2016 academic year to determine how to change the study skills content and/or materials to better facilitate participants' growth in the test strategies scale.
Last, the fact that online participants showed greater growth than FtF participants in all scales of the skill component is of interest. If online participants continue to show greater growth in the remaining LASSI components, then we will need to examine possible causes for the disparity.
Objective
Academic Achievement And Progress Toward Graduation
SAM Center Study Skills program participants will demonstrate academic achievement and progress toward graduation, regardless of the delivery mode of the program.
KPI
Grade Point Average (GPA)
Program participants will demonstrate greater grade-point average (GPA) gains during the semester of attendance than nonparticipants. Based upon historical performance, the GPAs of spring 2015 participants will shift in a positive direction 0.3 more than the GPAs of nonparticipants.
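Read literally, this KPI compares the average GPA change of participants against that of nonparticipants. A minimal sketch of the comparison, assuming a "shift" is the change from a student's prior institutional GPA to the end-of-semester GPA, and using invented data:

```python
# Sketch of the GPA-shift KPI. Assumption: a student's "shift" is the change
# from their prior institutional GPA to their end-of-semester GPA, and the
# two groups' average shifts are compared. Data are invented.
def mean_shift(records):
    """Average GPA change over (gpa_before, gpa_after) pairs."""
    return sum(after - before for before, after in records) / len(records)

participants = [(2.1, 2.6), (1.8, 2.5), (2.4, 2.6)]      # invented data
nonparticipants = [(2.3, 2.4), (2.0, 2.1), (2.6, 2.7)]   # invented data

difference = mean_shift(participants) - mean_shift(nonparticipants)
print(f"participants shifted {difference:+.2f} more than nonparticipants")
print("KPI met" if difference >= 0.3 else "KPI not met")  # benchmark: +0.3
```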
Result
GPA Indicator Not Met
Participants
A total of 593 individuals registered for at least one section of study skills for the Spring 2015 semester. Of those, 16 either did not enroll at SHSU for spring 2015 or resigned from SHSU for spring 2015; one additional registrant was not a student at SHSU but was granted permission to participate in the program. Of the 576 registrants remaining, 483 enrolled in at least one face-to-face (FtF) section and 93 enrolled in at least one online section. Because two registrants enrolled in classes at SHSU after the census date, the Office of Institutional Effectiveness (IE) was able to return data for only 574 of the 576 registrants: 481 FtF and 93 online. Note: Some individuals registered for both types of sections; these registrants were eventually classified as participating in the format (FtF or online) in which the majority of sessions were completed.
Nonparticipants
IE also returned data for 574 nonparticipants, who were selected via a proportionate random stratified sample from the population of SHSU students who enrolled in and completed (i.e., did not resign) the Spring 2015 semester and were not classified as study skills participants during that semester. The stratification variable used was student classification at the beginning of spring 2015 (e.g., if 10% of the participants were freshmen, then 10% of the nonparticipants in the sample were freshmen).
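For illustration, a proportionate random stratified sample of this kind can be sketched as follows (a simplified illustration with hypothetical data, not IE's actual procedure):

```python
import random
from collections import Counter

# Sketch of proportionate stratified sampling: the nonparticipant sample
# mirrors the participants' classification mix (e.g., 10% freshmen among
# participants -> 10% freshmen in the sample). Simplified illustration with
# hypothetical data, not IE's actual procedure.
def proportionate_sample(participant_classes, population, n, seed=0):
    rng = random.Random(seed)
    mix = Counter(participant_classes)      # participants' classification mix
    total = len(participant_classes)
    sample = []
    for classification, count in mix.items():
        quota = round(n * count / total)    # per-stratum quota (rounding may
                                            # make the final size differ from n)
        stratum = [s for s in population if s["class"] == classification]
        sample.extend(rng.sample(stratum, min(quota, len(stratum))))
    return sample

population = [{"id": i, "class": c}
              for i, c in enumerate(["freshman"] * 300 + ["sophomore"] * 200)]
participant_classes = ["freshman"] * 10 + ["sophomore"] * 90

sample = proportionate_sample(participant_classes, population, n=100)
print(Counter(s["class"] for s in sample))  # ~10 freshmen, ~90 sophomores
```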
Total Results
GPAs of the 556 participants who had been SHSU students prior to spring 2015 (and thus already had institutional GPAs) shifted in a positive direction 0.25 more, on average, than the GPAs of nonparticipants who had been SHSU students prior to spring 2015; however, the final GPAs of the 18 participants who had not been SHSU students prior to spring 2015 were, on average, 0.08 less than the GPAs of nonparticipants who had not been SHSU students prior to spring 2015.
Face-to-face Results
GPAs of the 465 participants who had been SHSU students prior to spring 2015 (and thus already had institutional GPAs) shifted in a positive direction 0.25 more, on average, than the GPAs of nonparticipants who had been SHSU students prior to spring 2015; however, the final GPAs of the 16 participants who had not been SHSU students prior to spring 2015 were, on average, 0.22 less than the GPAs of nonparticipants who had not been SHSU students prior to spring 2015.
Online Results
GPAs of the 91 participants who had been SHSU students prior to spring 2015 (and thus already had institutional GPAs) shifted in a positive direction 0.22 more, on average, than the GPAs of nonparticipants who had been SHSU students prior to spring 2015; however, the final GPAs of the 2 participants who had not been SHSU students prior to spring 2015 were, on average, 1.10 more than the GPAs of nonparticipants who had not been SHSU students prior to spring 2015.
KPI
Course Completion Rate
Program participants will demonstrate greater course completion rates (the number of semester credit hours completed divided by the number of semester credit hours attempted) during the semester of attendance than nonparticipants. To establish a benchmark, the course completion rates of spring 2015 participants will be, on average, 10% higher than the course completion rates of nonparticipants.
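A minimal sketch of this comparison appears below, assuming the benchmark means participants' average completion rate should exceed nonparticipants' average by at least 10 percentage points (data are invented):

```python
# Sketch of the course-completion comparison. Assumption: the benchmark means
# participants' average completion rate exceeds nonparticipants' average by
# at least 10 percentage points. Data are invented.
def mean_rate(records):
    """Average of completed/attempted hours over (completed, attempted) pairs."""
    return sum(completed / attempted for completed, attempted in records) / len(records)

participants = [(12, 15), (9, 12), (15, 15)]      # invented (completed, attempted)
nonparticipants = [(12, 12), (6, 12), (9, 15)]    # invented (completed, attempted)

gap = (mean_rate(participants) - mean_rate(nonparticipants)) * 100
print(f"participants' average rate: {gap:+.1f} percentage points vs. nonparticipants")
print("benchmark met" if gap >= 10 else "benchmark not met")
```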
Result
Course Completion Indicator Not Met
Participants
A total of 593 individuals registered for at least one section of study skills for the Spring 2015 semester. Of those, 16 either did not enroll at SHSU for spring 2015 or resigned from SHSU for spring 2015; one additional registrant was not a student at SHSU but was granted permission to participate in the program. Of the 576 registrants remaining, 483 enrolled in at least one face-to-face (FtF) section and 93 enrolled in at least one online section. Because two registrants enrolled in classes at SHSU after the census date, the Office of Institutional Effectiveness (IE) was able to return data for only 574 of the 576 registrants: 481 FtF and 93 online. Note: Some individuals registered for both types of sections; these registrants were eventually classified as participating in the format (FtF or online) in which the majority of sessions were completed.
Nonparticipants
IE also returned data for 574 nonparticipants, who were selected via a proportionate random stratified sample from the population of SHSU students who enrolled in and completed (i.e., did not resign) the Spring 2015 semester and were not classified as study skills participants during that semester. The stratification variable used was student classification at the beginning of spring 2015 (e.g., if 10% of the participants were freshmen, then 10% of the nonparticipants in the sample were freshmen).
Total Results
Course completion rates of participants were, on average, 9% lower than the course completion rates of nonparticipants.
Face-to-face Results
Course completion rates of participants were, on average, 9% lower than the course completion rates of nonparticipants.
Online Results
Course completion rates of participants were, on average, 10% lower than the course completion rates of nonparticipants.
KPI
Retention
Program participants will be retained from long semester to long semester at a greater rate than nonparticipants. To establish a benchmark, the retention rates of spring 2015 participants will be, on average, at least 10% higher than the retention rates of nonparticipants.
Result
Retention Data Will Be Reported Next Cycle
We will provide results for this KPI during the next assessment cycle because (a) this data is not available until at least the 20th class day of the long semester following participants' enrollment in study skills, (b) the Office of Institutional Effectiveness (IE) must have adequate time to process the request, and (c) the SAM Center must have adequate time to analyze the data provided by IE.
Action
Encourage Participants To Ask For Help From Academic Mentors
Having fallen short of two of the benchmarks related to academic achievement and progress toward graduation (and with remaining results yet to be determined), we intend to promote more aggressively the fact that academic mentors are available to assist participants and to encourage participants to take advantage of the assistance sooner rather than later. Beginning in spring 2016, we will make available in the face-to-face study skills classroom a "Talk to a Mentor" (or similar) box, in which participants can place their names and e-mail addresses if they would like mentors to reach out to them. In the online study skills classroom, this would take a different form, likely involving a link to the private journal tool at the end of every session. Academic mentors are already mentioned more than once during each series as a resource for further reinforcement of study skills and strategies, but providing a concrete step that students can take in the moment to ask for help might make them more likely to do so. In addition, this will serve to promote the Provost’s AIM High campaign, which seeks to foster a campus culture in which the act of asking for help is both expected and welcomed.
Objective
Student Satisfaction
SAM Center Study Skills program participants will demonstrate satisfaction with program components (i.e., effectiveness, leaders, subject matter, and course design).
KPI
Study Skills Improvement
Program participants who respond to the program satisfaction survey—an internally developed instrument containing 11 closed-ended items (14 for online students), 2 multiple-response items (checklists), and 4 open-ended items—will perceive that the program improved their study skills. Closed-ended items related to participants’ perception include the following:
The program was relevant and useful to me.
The program enhanced my study skills.
I would recommend this group to other students.
Participation in study skills was a valuable use of my time.
The open-ended item related to participants’ perception states the following:
The most important thing I learned was . . .
To establish a benchmark, at least 75% of participants who respond to the satisfaction survey during the Spring 2015 semester will either “strongly agree” or “agree” with the above closed-ended items and reference a particular study skill taught in the open-ended item.
Result
Study Skills Improvement Indicator Met Overall But Only Partially For Online Participants
A total of 444 participants had the opportunity to answer these survey questions (i.e., they either attended session six in person or had access to session six online): 375 face-to-face (FtF) participants and 69 online participants. Respondents numbered 337 (76% response rate) for the closed-ended items: 306 FtF (82% response rate) and 31 online (45% response rate). Because 22 (21 FtF and 1 online) of the original 337 respondents did not answer the open-ended item, respondents numbered 315 (71% response rate) for this item: 285 FtF (76% response rate) and 30 online (43% response rate). Note: The original online response rate was 74% for the closed-ended items and 72% for the open-ended item (provided all original respondents answered the closed-ended items and one did not answer the open-ended item); however, a technical error resulted in the loss of the actual survey responses for series two respondents. Records indicating who completed the survey at the end of series two (necessary for determining online study-skills completion) remained intact. The survey was re-opened for the original 22 respondents, but only 2 retook it.
Total Percentages
84% of participants either strongly agreed or agreed that the program was relevant and useful to them
82% of participants either strongly agreed or agreed that the program enhanced their study skills
87% of participants either strongly agreed or agreed that they would recommend this group to other students*
80% of participants either strongly agreed or agreed that participation in study skills was a valuable use of their time
95% of participants referenced a particular study skill taught in their response to the open-ended item
*Based upon 336 (vs. 337) respondents due to one respondent’s answer not aligning with the question asked
Face-to-face Percentages
84% of participants either strongly agreed or agreed that the program was relevant and useful to them
82% of participants either strongly agreed or agreed that the program enhanced their study skills
86% of participants either strongly agreed or agreed that they would recommend this group to other students*
80% of participants either strongly agreed or agreed that participation in study skills was a valuable use of their time
96% of participants referenced a particular study skill taught in their response to the open-ended item
*Based upon 305 (vs. 306) FtF respondents due to one respondent’s answer not aligning with the question asked
Online Percentages
74% of participants either strongly agreed or agreed that the program was relevant and useful to them
74% of participants either strongly agreed or agreed that the program enhanced their study skills
84% of participants either strongly agreed or agreed that they would recommend this group to other students
74% of participants either strongly agreed or agreed that participation in study skills was a valuable use of their time
80% of participants referenced a particular study skill taught in their response to the open-ended item
KPI
Positive View Of Program Leaders
Program participants who respond to the previously described program satisfaction survey will hold a positive view of program leaders. Closed-ended items related to participants’ perception include the following:
The program objectives were clearly stated and met.
The leader had a good understanding of the content.
The leader engaged students in lively discussion.
The leader used good examples to explain points and responded clearly to questions.
The material was clearly presented.
The open-ended item related to participants’ perception asks the following:
What was your overall impression of the leader’s ability to manage the Study Skills program?
To establish a benchmark, at least 75% of participants who respond to the satisfaction survey during the Spring 2015 semester will either “strongly agree” or “agree” with the above closed-ended items and reference at least one positive leader quality (e.g., knowledgeable, caring, confident) in the open-ended item.
Result
Positive View Of Program Leaders Indicator Met
A total of 444 participants had the opportunity to answer these survey questions (i.e., they either attended session six in person or had access to session six online): 375 face-to-face (FtF) participants and 69 online participants. Respondents numbered 336 (76% response rate) for the first closed-ended item: 305 FtF (81% response rate) and 31 online (45% response rate). Respondents numbered 337 (76% response rate) for the remaining closed-ended items: 306 FtF (82% response rate) and 31 online (45% response rate).
Because 38 (34 FtF and 4 online) of the original 337 respondents either did not answer the open-ended item or provided a response to a question that was not asked (e.g., discussed qualities of the program rather than the leader), respondents numbered 299 (67% response rate) for the open-ended item: 272 FtF (73% response rate) and 27 online (39% response rate). Note: The original online response rate was 74% for the closed-ended items and 68% for the open-ended item (provided all original respondents answered the closed-ended items and the same four either did not answer the open-ended item or provided a response to a question that was not asked); however, a technical error resulted in the loss of the actual survey responses for series two respondents. Records indicating who completed the survey at the end of series two (necessary for determining online study-skills completion) remained intact. The survey was re-opened for the original 22 respondents, but only 2 retook it.
Total Percentages
96% of participants either strongly agreed or agreed that the program objectives were clearly stated and met
95% of participants either strongly agreed or agreed that the leader had a good understanding of the content
91% of participants either strongly agreed or agreed that the leader engaged students in lively discussion
96% of participants either strongly agreed or agreed that the leader used good examples to explain points and responded clearly to questions
96% of participants either strongly agreed or agreed that the material was clearly presented
99% of participants referenced at least one positive leader quality in their response to the open-ended item
Face-to-face Percentages
95% of participants either strongly agreed or agreed that the program objectives were clearly stated and met
96% of participants either strongly agreed or agreed that the leader had a good understanding of the content
91% of participants either strongly agreed or agreed that the leader engaged students in lively discussion
95% of participants either strongly agreed or agreed that the leader used good examples to explain points and responded clearly to questions
96% of participants either strongly agreed or agreed that the material was clearly presented
99% of participants referenced at least one positive leader quality in their response to the open-ended item
Online Percentages
100% of participants either strongly agreed or agreed that the program objectives were clearly stated and met
87% of participants either strongly agreed or agreed that the leader had a good understanding of the content
94% of participants either strongly agreed or agreed that the leader engaged students in lively discussion
97% of participants either strongly agreed or agreed that the leader used good examples to explain points and responded clearly to questions
87% of participants either strongly agreed or agreed that the material was clearly presented
100% of participants referenced at least one positive leader quality in their response to the open-ended item
KPI
Positive View Of Program Subject Matter
Program participants who respond to the previously described program satisfaction survey will hold a positive view of program subject matter. Closed-ended items related to participants’ perception include the following:
The material was well organized.
The handouts were clear and easy to understand.
All multiple-response items (checklists) relate to this perception and ask the participant to select the most helpful session(s) and least helpful session(s).
Open-ended items related to participants’ perception include the following:
In the future, what could be added to improve this program?
In the future, what could be left out to improve this program?
To establish a benchmark, at least 75% of participants who respond to the satisfaction survey during the Spring 2015 semester will (a) either “strongly agree” or “agree” with the above closed-ended items, (b) select more “most helpful” sessions than “least helpful” sessions, and (c) suggest more additions to the program than subtractions.
Result
Positive View Of Program Subject Matter Indicator Only Partially Met
A total of 444 participants had the opportunity to answer these survey questions (i.e., they either attended session six in person or had access to session six online): 375 face-to-face (FtF) participants and 69 online participants. Respondents numbered:
337 (76% response rate) for the first closed-ended item: 306 FtF (82% response rate) and 31 online (45% response rate).
336 (76% response rate) for the second closed-ended item: 305 FtF (81% response rate) and 31 online (45% response rate).
335 (75% response rate) for the multiple-response items: 304 FtF (81% response rate) and 31 online (45% response rate).
Because 44 (41 FtF and 3 online) of the original 337 respondents did not answer the open-ended items and 2 (both FtF) provided unintelligible responses, respondents numbered 291 (65% response rate) for the open-ended items: 263 FtF (70% response rate) and 28 online (41% response rate). Note: The original online response rate was 74% for the closed-ended and multiple-response items and 70% for the open-ended item (provided all original respondents answered the closed-ended items and the same three did not answer the open-ended items); however, a technical error resulted in the loss of the actual survey responses for series two respondents. Records indicating who completed the survey at the end of series two (necessary for determining online study-skills completion) remained intact. The survey was re-opened for the original 22 respondents, but only 2 retook it.
Total Percentages
96% of participants either strongly agreed or agreed that the material was well organized
96% of participants either strongly agreed or agreed that the handouts were clear and easy to understand
58% of participants selected more “most helpful” than “least helpful” sessions
37% of participants suggested more additions to the program than subtractions
Face-to-face Percentages
96% of participants either strongly agreed or agreed that the material was well organized
98% of participants either strongly agreed or agreed that the handouts were clear and easy to understand
58% of participants selected more “most helpful” than “least helpful” sessions
39% of participants suggested more additions to the program than subtractions
Online Percentages
100% of participants either strongly agreed or agreed that the material was well organized
74% of participants either strongly agreed or agreed that the handouts were clear and easy to understand
58% of participants selected more “most helpful” than “least helpful” sessions
21% of participants suggested more additions to the program than subtractions
KPI
Course Design Helpful (Online Participants)
Online program participants who respond to the previously described program satisfaction survey will perceive the program’s course design to be helpful. Closed-ended items related to participants’ perception include the following:
The course design helped me determine the tasks to accomplish each week.
The quizzes helped me gauge my understanding of the material.
To establish a benchmark, at least 75% of online participants who respond to the satisfaction survey during the Spring 2015 semester will either “strongly agree” or “agree” with the above closed-ended items.
Result
Course Design Helpful Indicator Met
A total of 69 participants had the opportunity to answer these survey questions (i.e., they had access to session six online). Respondents numbered 31 (45% response rate) for the first item and 30 (43% response rate) for the second item. Note: The original online response rate was 74% for the first item and 72% for the second item (provided all original respondents answered the first item and one did not answer the second item); however, a technical error resulted in the loss of the actual survey responses for series two respondents. Records indicating who completed the survey at the end of series two (necessary for determining online study-skills completion) remained intact. The survey was re-opened for the original 22 respondents, but only 2 retook it.
84% of participants either strongly agreed or agreed that the course design helped them determine the tasks to accomplish each week
83% of participants either strongly agreed or agreed that the quizzes helped them gauge their understanding of the material
KPI
Leader Responsive (Online Participants)
Online program participants who respond to the previously described program satisfaction survey will perceive the leader to be responsive. The closed-ended item related to participants’ perception states the following:
The leader answered my questions in a timely manner.
To establish a benchmark, at least 75% of online participants who respond to the satisfaction survey during the Spring 2015 semester will either “strongly agree” or “agree” with the above closed-ended item.
Result
Leader Responsive Indicator Not Met
A total of 69 participants had the opportunity to answer this survey question (i.e., they had access to session six online). Respondents numbered 31 (45% response rate). Note: The original online response rate was 74% (provided all original respondents answered this survey question); however, a technical error resulted in the loss of the actual survey responses for series two respondents. Records indicating who completed the survey at the end of series two (necessary for determining online study-skills completion) remained intact. The survey was re-opened for the original 22 respondents, but only 2 retook it.
71% of participants either strongly agreed or agreed that the leader answered their questions in a timely manner
Action
Raising Success Indicators And Addressing Shortcomings/Discrepancies
The planned actions for this objective are as follows:
Study Skills Improvement
The fact that online participants’ responses did not meet the benchmark for three of the survey items is of concern and calls for further investigation, especially given that FtF participants’ responses exceeded the benchmark for all related survey items. One possible explanation for the discrepancy is that participation in the Blackboard discussion board—where online participants and the instructor share thoughts, experiences, and strategies that can help illustrate the relevance and value of study skills—is not mandatory. Though FtF students can benefit from hearing a discussion even if they do not contribute to it, online participants cannot benefit unless they at least read the discussion board. Before changing the discussion board policy, however, we will further explore online participants’ open-ended survey responses by the end of the 2015-2016 academic year to generate other possible explanations.
Positive View of Program Leaders
Having far exceeded the benchmark, we intend to raise the success indicator considerably: In the future, at least 95% of participants who respond to the satisfaction survey will either “strongly agree” or “agree” with the closed-ended items related to program leaders and reference at least one positive leader quality (e.g., knowledgeable, caring, confident) in the open-ended item. In addition, we will further explore online participants’ open-ended survey responses by the end of the 2015-2016 academic year to generate possible explanations for the two areas of greatest discrepancy between online and FtF participants for the items related to program leaders (the leader had a good understanding of the content and the material was clearly presented).
Positive View of Subject Matter
The most likely reason for the large discrepancy between online and FtF participants’ responses for the item related to the clarity of handouts—and for the fact that online participants’ responses did not meet the benchmark for this item—is that FtF participants automatically receive the handouts and online participants are only presented with the opportunity to open/print the handouts and encouraged to do so. From now on, the online study skills instructor will reference the handouts specifically in the weekly updates for the first two sessions of each series of study skills, explaining what they are and the importance of accessing them. In addition, we will explore other ways to present the handouts in Blackboard (e.g., having the content open automatically rather than having the participants open/print it, if possible) during the Fall 2015 semester.
In addition, because all participants’ responses fell considerably short of the benchmark regarding most helpful/least helpful sessions and program additions/subtractions, we will further explore these responses by the end of the 2015-2016 academic year to determine if there are any patterns. For example, were some sessions consistently mentioned as being least helpful and/or was a particular addition/subtraction mentioned by the majority of students?
Course Design Helpful
Having exceeded the benchmark, we intend to raise the success indicator: In the future, at least 85% of participants who respond to the satisfaction survey will either “strongly agree” or “agree” with the closed-ended items related to course design. In addition, other planned actions (e.g., exploring other ways to present handouts in Blackboard, possibly making the discussion board mandatory) are likely to improve participants’ overall perception of course design.
Leader Responsive
The most likely reason that participants’ responses did not meet the benchmark for this item is that the online study skills instructor does not respond to e-mail outside of normal business hours. Beginning in fall 2015, online participants will be able to access a virtual office that houses threads to answer frequently asked questions. For example, if the instructor gets a question from a participant during the week and feels others could also benefit from the answer, the instructor will post the question/answer in the virtual office for everyone. In addition, online participants could create threads to ask questions of their classmates and also reply to threads, thereby answering their classmates’ questions. Last, the forum that houses the virtual office will remind participants that the instructor is available only during normal business hours and will give them a time frame in which to expect e-mail responses from the instructor.
Goal
Provide Academic Advising To Undergraduate Students
The SAM Center will provide academic advising to undergraduate students of all classifications to facilitate student understanding of degree plans, degree requirements, and institutional rules and regulations.
Objective
Provide A Positive And Informative Advising Experience
Students advised at the SAM Center will understand their degree requirements and be satisfied with their advising experience.
KPI
Distance/Online Services
Students are enrolling in online and distance learning courses in record numbers; as such, advising must adapt to provide services to these students. Therefore, advising will:
research online advising software,
push for a full-time advisor at TWC for the distance students,
and try to identify the students who are 100% online in order to offer them options concerning advising.
Result
Distance/Online Services
In their efforts to support online/distance students, SAM Center advisors...
researched online advising software, identifying both Blackboard Collaborate and EAB SSC Campus as potential aids for online advising;
pushed for a full-time advisor at TWC for the distance students, but the request was denied due to university budgetary issues; and
discovered that there is an intradepartmental "tag" for students who are 100% online (i.e., taking all courses online) and are currently in talks with the respective departments to incorporate this tag into the SSC Insight software.
KPI
Advising Workbooks
The SAM Center will update the advising workbooks in order to accommodate the changes in core curriculum and departmental majors scheduled to occur in the 2014-2015 academic year, thereby providing more accurate advising to students.
Result
Workbooks
Given the cross-campus cooperation necessary to achieve the core curriculum and departmental major updates, the SAM Center has no direct control over the process; as such, it cannot adequately track these changes, which ultimately leads to a continuing cycle of piecemeal updates to advising protocols and workbooks. Concerning practical advising, the SAM Center will simply have to disseminate the information to each advisor as the department is made aware of changes.
Action
SAM Center Updates
Due to the complexities of interdepartmental communication and program modification, the process of updating advisors of major and core curriculum changes will have to remain a piecemeal dissemination of information to advisors.
Given the newness of the two software options, the advancement of SAM Center distance/online services will be reassessed in a few years in order to give the university technical team time to work out bugs and the SAM Center advisors time to learn the systems and then guide students in their operation.
Additionally, the SAM Center will continue to request a full-time advising slot at The Woodlands Center.
Finally, given that the "100% tag" exists, the SAM Center will simply wait for the tag to be put in place, as there is nothing else that the department can do.