OATdb Archive

2013 - 2014

Academic Planning And Assessment

Goal
Promote An Environment That Encourages Continuous Improvement Of Assessment Initiatives
The Office of Academic Planning and Assessment will encourage and promote an environment of continuous improvement at Sam Houston State University.

Objective
Ensure Quality Annual Assessment Processes
The Office of Academic Planning and Assessment will ensure that members of the university community are conducting a quality and effective annual assessment process.


KPI
Annual Meta-assessment Process
The Office of Academic Planning and Assessment will use a locally developed rubric, designed to evaluate the overall quality of a program's annual assessment plan, to facilitate an annual review of the assessment plans stored within the Online Assessment Tracking Database (OATDB). The results of this evaluation should indicate that 80% or more of the reviewed assessment plans within each College/Division are rated "Acceptable" or better. Additionally, 80% or more of the total number of assessment plans reviewed from across the University should be rated "Acceptable" or better.

Result
Meta-assessment Results For 2013-2014
During the 2013-2014 cycle, the Office of Academic Planning and Assessment successfully facilitated meta-assessment processes for each of the Academic Colleges and the Division of Student Services.

Assessment plans from each College or Division were reviewed twice (excluding those plans used as part of group norming, which received only one score). Each assessment plan element (e.g., Goal, Objective, Indicator, KPI), as well as the plan as a whole, received a score of Developing, Acceptable, or Exemplary from each rater. Percentages were derived by totaling the number of units receiving Developing, Acceptable, or Exemplary scores within each element and dividing these totals by the total number of possible scores for that element.
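
For illustration only, the derivation described above can be sketched as follows (a minimal example in Python; the ratings shown are hypothetical and are not actual 2013-2014 data):

    # Illustrative sketch only: the ratings below are hypothetical, not actual 2013-2014 data.
    # Each element of each reviewed plan receives one rating per rater, so every rating
    # counts toward the total number of possible scores for that element.
    ratings = ["Acceptable", "Developing", "Exemplary", "Acceptable",
               "Acceptable", "Developing", "Exemplary", "Acceptable"]

    total_possible = len(ratings)
    for level in ("Developing", "Acceptable", "Exemplary"):
        share = ratings.count(level) / total_possible
        print(f"{level}: {share:.2%}")

    # The KPI target compares the combined Acceptable + Exemplary share against the 80% threshold.
    acceptable_or_better = sum(r != "Developing" for r in ratings) / total_possible
    print(f"Acceptable or better: {acceptable_or_better:.2%}")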

College 1
Overall – 92.59%
Goals – 100%
Objectives – 100%
Indicators – 88.89%
Criterion – 85.19%
Findings – 88.46%
Actions – 61.54%
Plan for Continuous Improvement Update – 77.42%
New Plan for Continuous Improvement – 65.39%

College 2
Overall – 50%
Goals – 68.24%
Objectives – 68.18%
Indicators – 68.18%
Criterion – 68.18%
Findings – 47.37%
Actions – 36.36%
Plan for Continuous Improvement Update – 72.73%
New Plan for Continuous Improvement – 54.54%

College 3
Overall – 78.57%
Goals – 90.48%
Objectives – 90.48%
Indicators – 92.85%
Criterion – 90.48%
Findings – 85.71%
Actions – 69.05%
Plan for Continuous Improvement Update – 80.96%
New Plan for Continuous Improvement – 73.81%

College 4
Overall – 65.62%
Goals – 96.55%
Objectives – 100%
Indicators – 85.72%
Criterion – 57.14%
Findings – 65.52%
Actions – 62.06%
Plan for Continuous Improvement Update – 79.31%
New Plan for Continuous Improvement – 62.52%

College 5
Overall – 88.89%
Goals – 94.44%
Objectives – 94.44%
Indicators – 94.44%
Criterion – 94.44%
Findings – 82.35%
Actions – 66.67%
Plan for Continuous Improvement Update – 72.22%
New Plan for Continuous Improvement – 77.78%

College 6
Overall – 87.10%
Goals – 96.77%
Objectives – 91.94%
Indicators – 93.55%
Criterion – 95.16%
Findings – 87.10%
Actions – 87.10%
Plan for Continuous Improvement Update – 93.44%
New Plan for Continuous Improvement – 79.66%

College 7
Overall – 64.91%
Goals – 96.49%
Objectives – 84.21%
Indicators – 85.96%
Criterion – 85.96%
Findings – 73.68%
Actions – 45.61%
Plan for Continuous Improvement Update – 66.07%
New Plan for Continuous Improvement – 57.14%

Overall for all Academic Colleges
Overall – 75.88%
Goals – 93.39%
Objectives – 89.88%
Indicators – 88.28%
Criterion – 84.77%
Findings – 78.17%
Actions – 63.67%
Plan for Continuous Improvement Update – 78.74%
New Plan for Continuous Improvement – 68.25%

Division of Student Services
Overall – 80.77%
Goals – 92.31%
Objectives – 80.77%
Indicators – 71.43%
Criterion – 64.29%
KPIs – 76%
Findings/KPI Results – 88.46%
Actions – 57.69%
Plan for Continuous Improvement Update – 80.77%
New Plan for Continuous Improvement – 73.08%

The meta-assessment results reveal several areas for improvement. First, no individual college or division exceeded 80% for all assessment elements.
Generally, the following elements showed the greatest weaknesses: Findings/Results, Actions, and Plan for Continuous Improvement. This indicates that many programs are having difficulty documenting the collection of assessment data and using those data for programmatic improvement.

A complete breakdown of the meta-assessment results can be found in the attached file.


Action
Continued Promotion Of Quality Assessment Processes
Given the success of the meta-assessment pilots within the academic colleges and the Student Services Division, OAPA will continue to roll out meta-assessment to the remaining divisions across campus.  The initial focus for 2014-2015 will be the Division of Academic Affairs.  Additionally, the Director of Assessment will continue to coordinate with the academic colleges and the Division of Student Services to finalize the ongoing, annual meta-assessment processes that will take place within their colleges and divisions beginning with the 2014-2015 assessment cycle.

The meta-assessment results from the various colleges and the Division of Student Services have revealed some University-wide assessment weaknesses, particularly with regard to the collection of data and the use of data to drive continuous improvement. Therefore, during the 2014-2015 cycle, OAPA will focus on providing specific training and resources on these assessment elements in an effort to reduce these deficiencies in future cycles. Specifically, OAPA will expand the number of formal training sessions and workshops on these topics, identify and highlight examples of best practice, and provide additional training materials and presentations on its website.


Objective
Provide Quality Assessment Support Resources
The Office of Academic Planning and Assessment will provide quality assessment resources to the University community through both its website and ongoing training sessions and workshops.

KPI
Website Tracking
Utilizing Google Analytics, the Office of Academic Planning and Assessment will track traffic coming to the department's website. 


Result
Google Analytics Results For The Office Of Academic Planning And Assessment
The Office of Academic Planning and Assessment began tracking website visitors in June, 2014. To date, results are available for June, July, and August, 2014, and have been provided as supporting documents. Additionally, these results reflect only visitors to the Office's homepage, not any of its secondary pages. A summary of key results is provided below:

June - 221 sessions, 193 users, 476 pageviews
July - 231 sessions, 203 users, 419 pageviews
August - 241 sessions, 217 users, 447 pageviews

KPI
Number Of Workshops/Training Sessions Held
The Office of Academic Planning and Assessment will conduct at least 50 workshops/training sessions related to the annual assessment process being conducted at SHSU.  These sessions may range from large, group workshops to individual training sessions.


Result
Number Of Workshops/Training Sessions Held For 2013-2014
The Office of Academic Planning and Assessment has conducted a total of 97 meetings, workshops, and training sessions for the SHSU community during the 2013-2014 assessment cycle.  These sessions can be broken down as follows:

63 small group assessment meetings
13 meetings related to Meta-assessment
6 small group OATDB training sessions
5 meetings related to Core Curriculum Assessment
5 assessment workshops
3 meetings related to SACSCOC accreditation
2 presentations to the Council of Associate Deans

KPI
Satisfaction With Workshops/Training
Training session attendees will complete a brief survey, consisting of one Likert-scale question and three open-response questions, indicating their satisfaction with the services provided by the Office of Academic Planning and Assessment. A copy of the survey is provided as an attachment. The average response to the Likert-scale question should be 4 or higher, indicating that attendees were satisfied with the services provided by our Office. Additionally, respondent comments on the three open-response questions should be generally positive.


Result
Satisfaction With Workshops And Training For 2013-2014
Five assessment workshops were conducted in 2013-2014, and evaluations were administered at two of them. The results from one evaluation were never returned to the Office of Academic Planning and Assessment. The second evaluation yielded an average of 4.86 on the Likert-scale question, and responses to the open-response questions were similarly positive. Please see the attached report for more information.


Action
Continued Expansion Of Assessment Support Resources

OAPA wants to continue to build upon the success it has had in expanding the training and resources provided by the office to the university community. 

First, the Office will continue to expand the number of workshops and training sessions conducted by its staff, especially with regards to group training sessions and workshops.  The desired number of training meetings, workshops, and training sessions will be expanded to 100.  Furthermore, at least 10 should be formalized group training sessions/workshops.  Additionally, the Director of Assessment would like to institute regular assessment "brown-bags."  These events would provide a more informal atmosphere to discuss various assessment topics, and allow attendees to share ideas and examples of best practice.

Second, OAPA has revised its Workshop/Training evaluation survey to ask additional questions about respondents' "confidence with regards to designing and implementing effective programmatic assessment." Data from these questions will hopefully provide OAPA with additional information about the effectiveness of its training.

Finally, OAPA will continue to refine how it tracks visitors to its website.  The use of Google Analytics will be expanded to include all pages that are part of the OAPA website.  Sessions, users, and pageviews will continue to be tracked monthly. 


Goal
Promote The Scholarship Of Assessment
The Office of Academic Planning and Assessment will promote the growing scholarship of assessment, within SHSU, Texas, and the nation, through research, presentations, and publications.

Objective
Scholarly Presentations And Publications
The Office of Academic Planning and Assessment will make presentations and submit publications on various assessment-related topics through state, regional, and national venues.

KPI
Scholarly Presentations
The Office of Academic Planning and Assessment will track the number of scholarly presentations conducted by members of its staff for the 2013-2014 assessment cycle.  The minimum target for success will be 4 presentations at state, regional, or national conferences or meetings. 

Result
Scholarly Presentations
During the 2013-2014 assessment cycle, members of the Office of Academic Planning and Assessment made four presentations: three at a national conference and one at a state meeting. Presentations by OAPA staff are outlined below:

How do we Assess the Assessment? Developing and Implementing a Process for Formal Meta-assessment - Presented at the 14th Annual Texas A&M Assessment Conference

(Re)Building Credibility: One University's Journey into Writing Assessment - Presented at the 14th Annual Texas A&M Assessment Conference

Sam Houston State University's Plan for Core Curriculum Assessment - Presented at the 2014 LEAP Institute

Sam Houston State University's Plan for Assessing its Core Curriculum and the THECB's Six Core Learning Objectives - Presented at the 14th Annual Texas A&M Assessment Conference


KPI
Scholarly Publications
The Office of Academic Planning and Assessment will track the number of scholarly articles submitted and accepted for publication by members of its staff. As this is a new measure, the minimum target for success will be one article submitted and accepted for publication per year.

Result
Scholarly Publications
No publications were submitted for 2013-2014.  One article is currently in progress and will be submitted early in the 2014-2015 cycle.


Action
Promotion And Expansion Of Scholarly Activities Relating To Assessment
OAPA met its objective for the number of scholarly presentations; however, it did not meet its objective for scholarly publications. OAPA will redouble its efforts with regard to scholarly publications and seek research partnerships with offices and departments from around the University. Many such opportunities will exist as the University begins the assessment of the State's new core learning objectives.

In addition, OAPA will seek to promote assessment practices and scholarship from around campus through the creation of an assessment mini-grant program. This will represent a new objective for OAPA in the 2014-2015 assessment cycle.


Goal
Support The Institution's Ongoing Southern Association Of Colleges And Schools Commission On Colleges (SACSCOC) Accreditation Efforts
The Office of Academic Planning and Assessment will support the institution's ongoing efforts to respond to all SACSCOC requirements for maintaining accreditation.

Objective
Facilitate Completion Of The SACSCOC Fifth-Year Interim Report
The Office of Academic Planning and Assessment will work with the University community to ensure the successful completion of the SACSCOC Fifth-Year Interim Report. To this end, the Office will work to disseminate information and resources, provide necessary training, and complete and submit all required documents.


KPI
Prepare A Quality And Thorough Compliance Narrative Document For The 5th Year Interim Report
The Office of Academic Planning and Assessment will work with university personnel to ensure that a thorough, accurate, and quality compliance narrative document is prepared for the SACSCOC 5th Year Interim Report. 


Result
5th Year Report Knowledge Acquisition And Dissemination
Office personnel have attended SACSCOC meetings (Summer 2013 and December 2013) and webinars, gaining necessary information relating to 5th year reporting requirements.  Information was then shared with faculty and administrators as appropriate (numerous individual faculty assessment discussions, Council of Academic Deans meetings, and President’s Cabinet) to prepare the campus for 5th year reporting expectations.


Result
5th Year Report Committee Work
In June 2014, a 5th Year Report Committee began work to develop the compliance narratives for the SACSCOC 5th Year Interim Report due in March 2015.  By August 2014, the committee had identified anticipated data needs and/or potential obstacles in documenting compliance.

Action
5th Year Report Completion And Submission
Committee work will progress throughout the Fall 2014 and Spring 2015 semesters.  A draft report is targeted for completion by October 1, 2014.  Final edits and updates will be made in November through February, as necessary.

Objective
Ensure Institutional Compliance With And Timely Submission Of Required SACSCOC Documentation
The Office of Academic Planning and Assessment will work with the University administration to ensure that all required SACSCOC documents are submitted in a timely and appropriate manner.


KPI
Appropriate Submission Of SACSCOC Required Documentation
The SACSCOC liaison and the Office of Academic Planning and Assessment will ensure that all required SACSCOC documents, such as Institutional Profiles, Letters of Notification, Prospectuses, etc., are submitted to SACSCOC in a timely and appropriate manner.


Result
Enrollment And Financial Profiles
SHSU submitted annual Enrollment Profiles and Financial Profiles, as required by SACSCOC, by stated deadlines.

Result
Substantive Change Reporting
SHSU submitted substantive change notifications and prospectuses to SACSCOC, per required timelines, for applicable changes implemented in the 2013-2014 academic year.

Following information obtained at the 2014 SACSCOC Summer Institute, SHSU determined a need to conduct an internal compliance audit with regard to the reporting of substantive changes. The audit revealed a need to submit mea-culpa substantive change documentation to SACSCOC for three changes made at the institution since the last accreditation visit in 2008. The appropriate substantive change documentation was submitted to SACSCOC in June, 2014.

KPI
Address Functional Deficits In Faculty Credentials Reporting System
Following the institution's conversion from a 'home-grown' ERP system to Banner, some functionality relating to Faculty Credentials reporting was lost. Steps will be taken to restore this functionality, including centralizing faculty degree entry in Banner and altering existing reports to align with the new Banner structure.

Result
Faculty Data Transition Into Banner
In the 2013-2014 academic year, limited components of faculty data (e.g., faculty titles, ranks, and tenure status) were transitioned into Banner. As of August 2014, faculty degree information had not been transitioned into Banner, preventing automated updating of the Faculty Credentials System. The Faculty Credentials System has, however, been updated to automatically pull course listing information from the schedule of classes into the report.

Action
Substantive Change Policy And Procedures Revisions
Due to the failure to report a small number of substantive changes since SHSU's last reaffirmation visit in 2008, SHSU will review its Substantive Change Policy and related procedures. The policy will be updated to reflect updated SACSCOC expectations. Further, the Office of Academic Planning and Assessment will develop training and education materials for campus distribution.

Action
Faculty Data Transition Into Banner
Work continues to transition faculty data into Banner to allow for an automated update of the degree information within the Faculty Credentials program/report.

Goal
Support And Facilitate The Undergraduate Program Review Process
The Office of Academic Planning and Assessment will support and facilitate the Undergraduate Program Review Process at Sam Houston State University.

Objective
Facilitate A Quality Undergraduate Program Review Process
The Office of Academic Planning and Assessment will facilitate a quality undergraduate program review process through the creation of formal policies and guidelines, program review templates, and acquisition of program-specific data.  Such resources will be made available through the Office website and print materials.

KPI
Undergraduate Program Review Guidelines
Development and approval of formal policies and guidelines for Undergraduate Program Review.


Result
Acquisition Of Program Review Data
Program review data was identified and requested from Institutional Effectiveness in November, 2013. Due to a high volume of data requests and limited staffing resources, the program review data was not available until July, 2014. As of August, 2014, the data is under review.

Action
Program Review Data Population And Template Review
Program review data will be populated into the draft templates and presented to the Council of Academic Deans (CAD) in Fall 2014 for review.  Implementation through a pilot review will follow, likely in the Spring 2015 semester, dependent upon CAD approval.

Goal
Support The Strategic Planning Process For The Division Of Academic Affairs
The Office of Academic Planning and Assessment will support the ongoing strategic planning process underway within the Division of Academic Affairs.

Objective
Provide Quality Strategic Planning Resources And Processes
The Office of Academic Planning and Assessment will provide quality strategic planning resources and facilitate effective planning processes within the Division of Academic Affairs.

KPI
Facilitate Development Of A Comprehensive And Quality Academic Affairs Strategic Plan
The Office of Academic Planning and Assessment will facilitate strategic planning discussions within Academic Affairs, providing the necessary resources and structure to the process.  Planning meetings and retreats will be scheduled and data resources provided as needed. 


Result
Strategic Planning Retreat
In October 2013, a strategic planning retreat was held. Academic Affairs leadership (Deans, Assistant/Associate VPs, and the Provost) participated. Participants discussed the need for a 6-year strategic plan for the division, as well as the necessary components and content for the plan. As a first step, the Academic Affairs leadership group requested a comprehensive set of data around which planning should revolve. Data was requested from Institutional Research; however, due to a high volume of data requests and limited staffing resources, the strategic planning data was not available until July, 2014. As of August, 2014, data is under review.

In addition to data acquisition, work was completed with regard to the Curriculum Planning component of the Strategic Plan. In conjunction with the Office of Graduate Studies/Academic Affairs, a curriculum planning database was developed to document curricular forecasts for a 6-year period.


Action
Strategic Planning Retreat - Phase 2
A follow-up Strategic Planning retreat will be scheduled for October 2014.  Participants will review the requested data, discuss the curriculum plan, and begin work to identify strategic initiatives and related resource needs and prioritizations.


Update to previous cycle's plan for continuous improvement

The Office of Academic Planning and Assessment was successful in adding a large number of resources to its website, including formal presentations on assessment practices, best practices in distance education assessment, an assessment FAQ, and a revised OATDB user guide. Additional documents were posted relating to the core curriculum assessment process at SHSU. However, work remains to be done in better utilizing the strategic planning portion of the website. Finally, OAPA has begun utilizing Google Analytics to track website traffic.

OAPA did see an increase in the number of training sessions; however, because of other projects, the Office was unable to significantly increase the number of formal training sessions about assessment practices and the use of the OATDB. This will remain a goal for the 2014-2015 cycle.

OAPA successfully implemented the large-scale meta-assessment pilot within the Academic Colleges and the Division of Student Services. Results of those reviews have been distributed to the various programs for use in improving their assessment plans. Given the success of the pilot implementation, OAPA will continue to roll out meta-assessment across campus.

OAPA successfully addressed all notification and approval requirements for SACSCOC substantive changes.  SHSU is currently reviewing its Substantive Change policy to identify areas for improvement in communication of substantive changes from the campus community to the SACSCOC Liaison.  In addition, SHSU began the process of preparing for the SACSCOC 5th Year Interim report.  Much progress has been made with the appointment of a 5th Year Report committee, development of draft narratives, acquisition and identification of support documentation, and development of a report website.  Work will continue into 2014-2015 with finalization of report narratives, website creation, and final submission.

Much progress was made with regard to the conversion of faculty data into the Banner system. Phase I of data entry and conversion has been completed, with faculty rank and load information now being tracked within Banner. Phase II of the conversion is underway, with identification of the variables to be moved into Banner (e.g., faculty degrees and tenure appointment data). Currently, SHSU is working with an external consultant to make the modifications to Banner forms needed to support the required reporting capabilities. During the 14-15 cycle, the remaining faculty data variables will be converted into the Banner system. With this conversion complete, the Faculty Credentials System will allow for more timely and accurate reporting.

During the 13-14 cycle, the SHSU Division of Academic Affairs made progress on long-term strategic planning. Specifically, division leadership determined a need for a 6-year plan, identified the necessary plan components, requested a comprehensive dataset to be used in planning discussions, and began development and implementation of the curriculum plan component. The curriculum plan component was implemented, with all departments identifying a 6-year projection of curricular changes and associated budgetary impacts. Additionally, SHSU has elected to consult with a third-party strategic planning expert to facilitate the development of the remaining plan components. As of the close of the 13-14 cycle, a preliminary investigation of consulting services was underway.

Program review data was obtained in July, 2014. As of the close of the 13-14 cycle, the data were still under review. In light of pending projects and limited staffing resources, the Undergraduate Program Review process was not implemented during the 13-14 year as planned. Work will continue into the 14-15 cycle to gain approval from the Academic Deans for the Undergraduate Program Review guidelines, process, and templates.


Plan for continuous improvement

Following the success experienced in its first full year in operation, OAPA has an ambitious Plan for Continuous Improvement.

The successful pilot administrations of meta-assessment within the academic colleges and the Student Services Division have provided valuable data about the quality of programmatic assessment, which OAPA has used to identify areas for improvement. In particular, weaknesses were identified in how programs were collecting data and how data were being used to drive continuous improvement. Therefore, during the 2014-2015 cycle, OAPA will focus on providing specific training and resources on these assessment elements in an effort to reduce these deficiencies in future cycles. OAPA will expand the number of formal training sessions and workshops on these topics, identify and highlight examples of best practice, and provide additional training materials and presentations on its website.

Building upon this, OAPA will expand the number of workshops and training sessions conducted by its staff in general, especially with regards to group training sessions and workshops.  The desired number of training meetings, workshops, and training sessions for 2014-2015 will be 100.  Furthermore, at least 10 of these sessions should be formalized group training sessions/workshops.  New for 2014-2015, the Director of Assessment would like to institute regular assessment "brown-bags."  These events would provide a more informal atmosphere to discuss various assessment topics, and allow attendees to share ideas and examples of best practice.

To help better evaluate these formal training sessions/workshops, OAPA has also revised its Workshop/Training evaluation survey to include additional questions about respondents' "confidence with regards to designing and implementing effective programmatic assessment." Data from these questions will hopefully provide OAPA with additional information about the effectiveness of its training practices.

Given the demonstrated value of meta-assessment, OAPA will continue to expand its application for the remaining divisions across campus.  The initial focus for 2014-2015 will be the Division of Academic Affairs.  Finally, the Director of Assessment will continue to coordinate with the academic colleges and the Division of Student Services to finalize the ongoing, annual meta-assessment processes that will take place within their colleges and divisions beginning with the 2014-2015 assessment cycle.

OAPA is committed to promoting the scholarship of assessment at SHSU. While the office met its objective for scholarly presentations, it did not meet its objective for scholarly publications. OAPA will redouble its efforts with regard to scholarly publications and seek research partnerships with offices and departments from around the University. Additionally, OAPA will further promote good assessment practices and assessment scholarship at SHSU through the creation of an assessment mini-grant program. Members of the university community will be able to apply for grants to make assessment-related conference presentations and to fund assessment-related activities around campus. This will represent a new objective for OAPA and will be included in the 2014-2015 assessment cycle.

Due to the failure to report a small number of substantive changes since SHSU's last reaffirmation visit in 2008, SHSU will review its Substantive Change Policy and related procedures to ensure they reflect updated SACSCOC expectations with regard to substantive change. Further, OAPA will develop training and education materials for campus distribution regarding substantive change policy.

OAPA continues to move forward with the Undergraduate Program Review process. Program review data will be populated into the draft templates and presented to the Council of Academic Deans (CAD) in Fall 2014 for review. Dependent upon CAD approval, implementation of a pilot review will follow in Spring 2015.

Finally, OAPA will continue to refine how it tracks visitors to its website.  The use of Google Analytics will be expanded to include all pages that are part of the OAPA website.  Sessions, users, and page views will continue to be tracked monthly.