OATdb Archive

2011 - 2012

Institutional Effectiveness

Goal
Enhance Data Deliveries To Campus Units.


Objective
Provide Useful, Accurate Data
Provide useful, accurate data to both internal and external constituents. Help support the President's Goal to Make Data- and Outcome-Based Continuous Improvement part of our daily environment.


KPI
Useful, Accurate Data Performance Indicator
Attain a 75% satisfaction rate on all customer service related surveys.  Attain a 70% return customer rate.
100% of mandatory reports will be completed by deadlines.  Less than 10% of mandatory reports will receive responses from agencies concerning valid recognized errors.

Success Criteria:

(A)  Attain a 75% satisfaction rate on all customer service related surveys.  

(B)  Attain a 70% return customer rate. 

(C)  Complete 100% of mandatory reports by deadlines. 

(D)  Less than 10% of mandatory reports will receive responses from agencies concerning valid recognized errors.


Result
1.1.1. (A-D) Useful, Accurate Data Performance Indicator
RESULTS: 1.1.1. (A-D)

Planned Assessment for Criterion (A), "Attain a 75% satisfaction rate on all customer service related surveys," was the completion of the Client Satisfaction Survey by FY12 clients following work order delivery.


This assessment was originally planned for early spring semester and summer; however, due to staffing changes, turnover, and departmental/divisional reorganization, the distribution of satisfaction surveys for FY12 took place at the conclusion of the summer 2012 term.

Seventy-seven of the eighty unique, internal SHSU clients associated with the 368 completed work orders in the FY12 Institutional Effectiveness Work Order Table received an e-mail request to complete a web-based survey. Each was asked to reflect upon his/her satisfaction with various Institutional Effectiveness services performed between September 1, 2011, and August 31, 2012. The Client Satisfaction Survey, delivered via a LimeSurvey link, was revised from the previous year. The revision slightly increased the specificity of some questions and added a “not applicable” response option to the array of possible answers. Input was solicited concerning:
  • whether the client received what was requested
  • the quality of customer service given
  • the accuracy of the data
  • the format in which the data was delivered
  • the clarity of the information or service that was delivered
  • the usefulness of the data or service
  • open-ended feedback on areas needing improvement
Twenty-eight of the seventy-seven clients contacted completed surveys (a 36.36 percent response rate), compared with 19 respondents in the previous year.

A copy of the Client Satisfaction Survey is attached to this OATdb assessment entry.

Outcomes for Criterion (A), "Attain a 75% satisfaction rate on all customer service related surveys."

The Office of Institutional Effectiveness (IE) slightly surpassed the targeted 75 percent satisfaction rate on the Client Satisfaction Survey. The weighted average of the proportion of positive responses to 15 specific questions was 79.5 percent, and the percentage of aggregated “Satisfied-Very Satisfied” responses ranged from 50 to 93 percent. The variations in levels of satisfaction for different areas of service provided more information than the overall weighted average of the proportion of positive responses. For example, it is clear that concerns with the time required to complete a work order adversely affected client satisfaction (71.43 percent positive/10.71 percent negative) compared with the positive response to customer service received from IE staff (92.86 percent positive/7.14 percent negative).

The level of satisfaction expressed in relation to both the thoroughness and clarity of work order results delivered in the form of graphs (Thoroughness: 50 percent positive/10 percent negative; Clarity: 60 percent positive/10 percent negative) indicated this is an area of service that would benefit from improvement compared to the delivery of work order results as data lists and tables (Thoroughness: 83.33 percent positive/8.33 percent negative; Clarity: 92.31 percent positive/3.85 percent negative). Open-ended feedback from almost 54 percent of survey respondents also provided specific insights into targets for service improvement. A brief summary of positive and negative responses to the 16 closed-ended survey questions has been attached to this entry along with a more detailed analysis.
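To make the weighted-average calculation reported above concrete, the following Python sketch shows one plausible way a response-count-weighted average of per-question positive-response proportions could be computed. It is illustrative only: the per-question counts are hypothetical, not the actual FY12 survey tallies, and the exact weighting used in the office's analysis may differ.

# Illustrative sketch only: per-question counts are hypothetical.
# Each tuple is (positive responses, total substantive responses
# excluding "not applicable" answers) for one closed-ended question.
question_counts = [
    (26, 28),   # e.g., satisfaction with customer service received
    (20, 28),   # e.g., satisfaction with time required to complete the work order
    (24, 26),   # e.g., satisfaction with accuracy of delivered data
    # ... one entry per question included in the average
]

total_positive = sum(pos for pos, _ in question_counts)
total_responses = sum(total for _, total in question_counts)

# Weighting each question's positive proportion by its response count is
# equivalent to dividing the pooled positive count by the pooled total.
weighted_positive_rate = 100 * total_positive / total_responses
print(f"Weighted positive response rate: {weighted_positive_rate:.1f}%")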

Planned Assessment for Criterion (B), "Attain a 70% return customer rate," was to examine, compare, and tally entries in the IR 2011-2012 WO table and the IR 2010-2011 WO table for returning clients. Repeat clients within 2011-2012 were also tallied and included in the proportion of repeat clients. The proportion of unique FY12 clients who were repeat clients was calculated and compared to the targeted 70%.
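As an illustration of the comparison described above, the Python sketch below tallies repeat clients from two years of work order entries and computes the proportion checked against the 70% target. The client names are hypothetical placeholders; the actual comparison was performed against the FY11 and FY12 Work Order Tables summarized in the attached workbook.

# Hypothetical client identifiers standing in for entries in the
# FY11 (2010-2011) and FY12 (2011-2012) Work Order Tables.
fy11_clients = ["Client A", "Client B", "Client C", "Client D"]
fy12_client_per_work_order = ["Client B", "Client C", "Client C",
                              "Client E", "Client F", "Client B"]

fy12_unique = set(fy12_client_per_work_order)

# A client counts as a repeat client if seen in FY11 or seen on
# more than one FY12 work order.
returning_from_fy11 = fy12_unique & set(fy11_clients)
repeat_within_fy12 = {c for c in fy12_unique
                      if fy12_client_per_work_order.count(c) > 1}
repeat_clients = returning_from_fy11 | repeat_within_fy12

repeat_rate = 100 * len(repeat_clients) / len(fy12_unique)
print(f"Repeat-client rate: {repeat_rate:.0f}% (target: 70%)")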

Outcomes for Criterion (B):
A 70 percent rate of return business was not achieved in FY12. The percentage of individual FY12 clients who were returning or repeat clients was 59 percent (66 repeat clients of 111 total unique clients in FY12). The percentage of FY12 unique organizational unit clients who had been clients in FY11 was 48 percent (31 of 65 unique FY12 organizational unit clients). It appears the success criterion for FY12 had been set at 70 percent somewhat arbitrarily, as evidence of a precedent for repeat business at a level approaching 70 percent was not seen in the FY11 IE assessment. A workbook of FY11 and FY12 Work Order Table information summaries and calculations is attached to this entry.

Further, the documentation of FY11 client activity in relation to FY12 client activity may be only moderately consistent between years and slightly more reliable within years. As a result, it is not clear whether the lower-than-targeted ‘return business’ was unremarkable because the success criterion had little relation to history or to circumstances under the control of IE staff, or whether the validity of the measurement methodology was questionable. More important, however, is the question of whether increasing repeat clients is actually a goal worth pursuing in a public-sector educational environment.

Planned Assessment for Criterion (C), "Complete 100% of mandatory reports by deadlines."

The proportion of total work orders that were delivered late (and without authorized extension) was calculated from work order completion notes in the IE/IR 2011-2012 Work Order Table.

Outcomes for Criterion (C):
One hundred percent of mandatory reports and surveys were completed for on-time delivery; there were no late submissions of mandatory reports. This is an area where the IE office would be highly unlikely to allow failure, given the risk of negative consequences associated with missing hard deadlines for official reports. This fact raises the question of whether a KPI of this nature is truly an appropriate target for formal administrative assessment, or whether it would be more appropriately acknowledged as a fundamentally critical and obligatory operational requirement subject to ongoing monitoring.

Planned Assessment for Criterion (D), "Less than 10% of mandatory reports will receive responses from agencies concerning valid recognized errors."
The proportion of total mandatory work orders in which errors were found was calculated from work order completion notes in the IE/IR 2011-2012 Work Order Table and notes saved in National Survey and recurring state and federal report folders in T:\IR.
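The proportions for criteria (C) and (D) reduce to simple counts over the completion notes. A minimal Python sketch, using hypothetical records rather than the actual FY12 notes, is shown below for clarity.

# Hypothetical completion-note records; the real notes reside in the
# IE/IR 2011-2012 Work Order Table and the report folders in T:\IR.
mandatory_reports = [
    {"report": "Report A", "late": False, "valid_error_response": False},
    {"report": "Report B", "late": False, "valid_error_response": False},
    {"report": "Report C", "late": False, "valid_error_response": False},
    {"report": "Report D", "late": False, "valid_error_response": False},
]

total = len(mandatory_reports)
late_pct = 100 * sum(r["late"] for r in mandatory_reports) / total
error_pct = 100 * sum(r["valid_error_response"] for r in mandatory_reports) / total

print(f"Late submissions: {late_pct:.0f}% (target: 0%)")
print(f"Reports drawing valid error responses: {error_pct:.0f}% (target: under 10%)")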

Outcomes for Criterion (D):
The percentage of mandatory reports submitted in FY12 with errors was less than ten percent. Although no errors were disclosed to IE by any of the external agencies receiving official reports or surveys, an error in the student debt rate reported to the U.S. News and World Report survey in a previous year came to light when it was encountered and questioned by another Texas university. The underreported U.S. News and World Report student debt rate was recomputed by the Sam Houston State University Financial Aid Office using its own internal data and submitted to the publisher as a correction.


Action
Improve KPI Maintenance Outcomes And Their Assessments
CONCLUSIONS AND ACTIONS:  1.1.1. (For GOAL 1. OBJECTIVE 1. KPI 1.)

The following are reflections and planned actions proceeding from the assessment of four operational performance indicators that materially support the delivery of useful, accurate data to the Sam Houston State University community and to the external audiences in which the university is represented by its data.  The Office of Institutional Effectiveness (IE) affirms the importance of an appropriately balanced assessment philosophy and recognizes the validity and importance of critical operational process monitoring.  Further, the staff acknowledges that monitoring fundamental department operations is critical to developing and maintaining a solid operational foundation.

Each of the four performance indicators associated with the objective of delivering useful and accurate data may be considered a critical maintenance indicator, requiring consistent and continual monitoring and demanding as much attention to process and practice as to evidence of successful outcomes.  In the course of assessing the success criteria associated with this Objective and KPI, Institutional Effectiveness observed that assessment results have questionable meaning when they are not focused upon monitoring processes and practices of recognizable importance.  For example, aiming to achieve a 70 percent return rate could be argued to be at odds with encouraging data self-service sufficiency.  It may be more valuable to implement mindfulness and incentives to assure that IE processes and staff practices do not confuse potential clients or create barriers to effective communication.

In keeping with the conception of Objective 1. KPI 1. (A) through (D) as critical maintenance indicators, the Office of Institutional Effectiveness proposes to take action to improve the effectiveness and efficiencies in both documentation practices and the artifacts that support consistency and reliability in documentation.

(1) Designate a single individual to initiate all work order documentation in order to promote consistently applied rules, judgments and interpretations of information capture and to minimize duplication.

(2) Thoroughly document work hours dedicated to each work order through written or digital logs and clearly note significant alterations to work order structure as a task develops.

(3) Include an organizational schema within the work order table permitting the nesting of jobs within jobs, i.e., the hierarchical or familial linking of a new job to the original job from which it evolves as a variation (a minimal sketch of such a structure follows this list).

(4) Acquire an automated, database-driven, searchable work order system.
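As referenced in item (3), the following Python sketch illustrates, with hypothetical field names, the kind of record structure that would let a new job be linked back to the original job from which it evolved, together with a simple keyword search of the sort item (4) envisions. The eventual system would be database-driven rather than an in-memory structure like this one.

from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkOrder:
    # Field names are hypothetical stand-ins for columns the office would
    # define in an actual work order table.
    wo_id: int
    client: str
    description: str
    hours_logged: float = 0.0
    parent_id: Optional[int] = None   # links a variation back to its original job

orders = {
    1: WorkOrder(1, "Hypothetical Client", "Fall enrollment counts by college"),
    2: WorkOrder(2, "Hypothetical Client", "Variation: enrollment counts by department",
                 parent_id=1),
}

def job_family(wo_id: int):
    """Follow parent links from a job back to the original work order."""
    chain = []
    current = orders.get(wo_id)
    while current is not None:
        chain.append(current.wo_id)
        current = orders.get(current.parent_id) if current.parent_id else None
    return chain

def search(term: str):
    """Simple keyword search over descriptions, standing in for item (4)."""
    return [wo for wo in orders.values() if term.lower() in wo.description.lower()]

print(job_family(2))                               # [2, 1]
print([wo.wo_id for wo in search("enrollment")])   # [1, 2]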

 Continuous maintenance monitoring also requires regular sources of input from clients and other constituencies.

(1) Manual or automated release of client satisfaction surveys at relatively frequent and regular intervals.  Automation of this process is ideal in order to mitigate the impact of staffing and workload issues.

(2) Periodic outreach to suppliers and colleagues for input on perceptions and expectations concerning IE functions, products, and staff.

 Actions proposed in response to FY12 Client Satisfaction Assessment include:

  1. Regularly contact clients who have submitted work orders regarding the progress of their work order in the current and projected queue of jobs.  Inform the client of his/her IE WO number and discuss any challenges to fulfilling the work order within the client’s preferred timeframe that may be inherent to the nature of the request.
  2. Schedule training sessions with IT@SAM to move forward with commitments made in Summer 2011 to provide training to all Cognos Report Studio users in advanced graphical report design and production in Cognos.
  3. Regularly initiate follow-up contact with clients whose work orders have been completed, in order to invite questions about the WO results that were delivered and to clarify any issues that may otherwise remain unresolved.

Objective
Improve Office Of Institutional Effectiveness Capacity
Improve Office of Institutional Effectiveness capacity to meet higher and different workload demand. Significant increases in work order demand for Institutional Effectiveness (IE) and major changes in the structure of data and the technological tools used to store, access and report data have resulted in the need to acquire and practice many new skills and to perform many functions historically associated with IT@SAM services.  This includes in-house responsibility for acquisition of official university data utilized by Institutional Effectiveness, with limited support from IT@SAM resources.  Data acquisition and management protocols encompass data cleaning, validation and trouble-shooting and require the construction of a data warehouse. Transitioning from a Microsoft server and MS SQL Developer to a more modern Oracle SQL Developer limited access for uploading data to the IE/IR Warehouse. Currently, these protocols fall to a small staff of two full-time analysts and two part-time student analysts, requiring intensive training and supervision.

Increased demand and the expanded areas of responsibility described above are addressed in addition to a full complement of typical Institutional Effectiveness functions, demanding more time, effort and care than can be successfully devoted to analysis, reporting, evaluation and technical support.  Existing staff require instructional support and practice time to develop and improve competence in understanding the new university ERP and in the use of SQL and Cognos programming, in addition to data analysis, manipulation and reporting in SPSS and Excel for university reporting and evaluation research.

KPI
1.2.1. Improved Capacity Performance Within A Two-Year Time Frame
Success Criteria:   

(A)  Document 20% volume and FTE increases on Institutional Effectiveness data warehouse from FY12 to FY13.

(B)  Document 20% volume and FTE increases on new and special data acquisition processes (e.g., warehousing of longitudinal state and federal data; comparative benchmark data from other institutions; special initiative data not in Banner; financial, space and external funding data not in Banner, etc.).

(C)  Document 20% volume and FTE increases in original program writing or revision from FY12 to FY13.

(D)  Document 15% increase in overall overtime FTE by Institutional Effectiveness staff.


Result
Improved Capacity Performance Outcomes Within A Two-Year Time Frame
Two-year Planned Assessment for Criteria (A) through (D) is to calculate percentage increases in FY13 over FY12 using FY12 as baseline.  FY12 assessment tasks consisted of gathering and presenting FY12 baseline data and contextual information pertaining to success criteria (A) through (D).  This includes tallying the number of times particular types of new tasks were performed in FY12:

(A)  FY12 activities related to data stores of special data views

(B)  New and special data acquisition processes (e.g., warehousing of longitudinal state and federal data; comparative benchmark data from other institutions; special initiative data not in Banner; financial, space and external funding data not in Banner, etc.)

(C)  Original program writing or revision

Methodology:
Because the tasks enumerated in criteria (A) through (C) were not directly associated with client-initiated work orders, Institutional Effectiveness did not expect to find regular and recurring documentation of these tasks in the IE/IR 2011-2012 Work Order Table. Estimates of the number and variety of tasks were collected through interviews with the Institutional Effectiveness Sr. Analyst, both Graduate Assistant analysts, and Analyst I in conjunction with reviews of the data files, tables, views, SQL and Cognos reports stored in T:\IR\!IRA Data Warehouse, in the IE office schema of the Oracle SQL Developer Server and in the IE office folders within the SHSU COGNOS Public Folders.

Benchmark Outcomes for Criteria (A) through (C)

FY12 Baselines:

(A)  Data stores of special data views

(B)  New and special data acquisition processes

(C)  Original program writing or revision

In Summer 2011, SHSU converted from a legacy system to enterprise resource planning software (Banner) for collection of primary data across the institution.  At this time, neither frozen (copies of) student data sets nor access to live or staged tables was provided to IE. IE utilized less than 20% of the time of a database administrator funded by IT for assistance in addressing data access, report construction and general information warehouse concerns.  Data used by IE was independently acquired by IE staff and prepared for storage in SPSS, in preparation for the decommissioning of the office's SQL server per IT determination.  An alternative data warehouse technology/hardware solution had not been identified.  The identification of new data fields and tables in ODS to isolate previously identified student attributes, admissions records, academic programs and semester credit hour data was not performed consistently or in a timely manner by IT or Student Team members, due to the demand for implementation resources across the university. To fill the gaps in data, proxies for Summer 2011 frozen data were retroactively created by IT and IE. Previously submitted raw data files were requested from Texas Higher Education Coordinating Board staff to fill gaps in certified report data.

Beginning Fall 2011, database administrator (DBA) personnel resources were designated for IE data assistance, not to exceed 20% of the DBA's FTE.  With the assistance of the DBA, IE analysts were permitted to extract predefined datasets each semester from the operational data store (ODS) to generate Excel files that were then converted to SPSS files. In late 2011, IE was able to store data and began to create views in the new Oracle Developer server, similar to those that analysts had created in the MS SQL server, to increase efficiency and save time. During the course of the transition from legacy to Banner, IE continued to receive and fulfill data requests.  Due to the variety of filters employed and not employed in the freezing process, managed solely by IT programmers, IE dedicated a great deal of time attempting to filter frozen files in a manner that would generate accurate query results and attempting to match enrollment counts in various parts of the Enrollment Management Daily Counts report. CB report file data, managed by university staff outside of the IE office, were discovered to be discrepant with data officially reported to the THECB.  IE requested additional CBM report data files from THECB to replace files deemed questionable.

In Spring 2012, the IT DBA dedicated to IE work was authorized to devote 40% of his time to providing IE with data and consultation. The DBA generated and stored full tables pertinent to enrollment, academic program information, student attributes, semester credit hour and course information from both ODS and Banner staging tables in the IE Oracle SQL Developer.  In addition, IE analysts began to write and store data views in the SQL Developer. Due to the limitations of the baseline ODS packages and the frequent necessity for combining tables across packages or combining data in ODS with data not captured in ODS, IE analysts began to increase the volume of direct report writing with SQL code in Cognos Report Writer.

With Summer 2012 underway, the IT database administrator (DBA) continued to be authorized to devote 40% or more of his time to providing IE with data and consultative services. The Oracle SQL Developer continued to be loaded with frozen data tables and other key tables and views from ODS and Banner. Additional tables and views were added to the set delivered in Spring 2012 to fill gaps.  Variable verification and report testing remain problematic in Cognos, so SQL testing in the IR Data Warehouse is often required in advance of relying on Cognos output.  Reports and analyses requiring the use of historical data still require use of SQL Developer and cannot be written in Cognos until ODS is populated with historical legacy data.  The role of the IE office has shifted from directly acquiring and storing institutional data to defining and performing data verification activities after the data sets are delivered.

A summary table of FY12 baselines is attached to this entry.  The progress of measures benchmarked in Summer 2011 through Summer 2012 indicates there was a high level of data acquisition and data storage activity in the earliest period of the University's transition to Banner, followed by a decrease in that activity and a rise in the creation of new and revised programs in SQL and Cognos.

(D) Increase in overall overtime FTE

This criterion should have been eliminated from the FY12 Objective 2 KPI list of criteria. While the general intention of this criterion may have been appropriate, it would not be possible to enforce time-monitoring policies for the documentation of extra time voluntarily spent by staff members beyond expected work hours. As a result, the criterion was judged to be inappropriate shortly after the FY12 KPIs were submitted.

The intention of this KPI criterion was to present evidence of the extraordinary time and effort spent in FY12 on data acquisition, data preparation and data maintenance tasks that had not historically been required under the legacy framework in order to maintain a sufficient data inventory. Because data had previously been supplied primarily by IT, these tasks were performed in previous years by IE personnel only for the purpose of stocking the internal database, which was a secondary data source providing optional stores and views of data. This criterion was also originally intended to document the additional tasks required to generate new programming and views in new media such as Cognos Report Studio, Cognos SQL and Oracle Developer SQL.


Action
Improve Capacity Performance Within A Two-Year Time Frame
FORMATIVE CONCLUSIONS AND ACTIONS: 1.2.1. (For GOAL 1. OBJECTIVE 2. KPI 1.)

Although the data limitations the Office of Institutional Effectiveness experienced were extreme by many estimations, conditions improved as the University's transition to Banner evolved. Plans for resource reallocation and functional redefinition began to come together. The benchmark data reveal that responsibility for direct data acquisition and data storage has migrated back to IT, and the demand for new and revised report and analysis programming continues to increase, particularly in the area of Cognos report writing.

Functional training continues to be critical to bring all IE staff up to highly skilled levels with new tools and new data structures.  Due to the dependence on a variety of data sources created over a number of years, it is important to educate IE staff on the relationships of Banner data structures to legacy data structures, which must very often be linked together to fulfill data requests.

In the assessment of the KPIs for this Objective, it was observed that informative measurement of quantitative outcomes for particular task performance relies heavily upon reliable, thorough documentation.  Therefore, it is noted that the effectiveness and practicality of internal monitoring processes require improvements such as those already proposed in the Conclusions and Actions for 1.1.1.


Objective
Improve And Expand
Improve and expand evaluative work, analysis and reporting on special academic initiatives, strategies and externally-funded programs to inform and support the President's Goal to Adopt Innovative Methods To Improve The Quality And Access To Instruction.


KPI
Improve And Expand Performance Indicator
KPI: 1.3.1.  Improve and Expand Evaluative Work Performance Indicator #2

Success Criteria

(A) Document a 30% increase over last year in program, initiative and academic policy strategy evaluation or needs assessment for internally- or externally-funded initiative proposals.

(B) Sustain minimum baseline of 75% satisfaction with related programs.


Result
Outcomes Of Improvement And Expansion Of Evaluative Work
RESULTS: 1.3.1. (A-B)

Planned Assessment for Criterion (A), "Document a 30% increase over last year in program, initiative and academic policy strategy evaluation or needs assessment for internally- or externally-funded initiative proposals."

Methodology:
Count the number of work orders in both the FY11 and FY12 IR Work Order Tables involving initiative program or externally-funded program assessment/evaluation and compare those recorded in FY11 to those recorded in the FY12 WO table. Calculate the percentage change from FY11 to FY12.
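For clarity, the two figures reported in the outcome below (the change in the number of evaluation/assessment jobs and the change in their share of all completed work orders) can be computed as in the following Python sketch. The counts used here are hypothetical, chosen only so the arithmetic roughly reproduces the reported percentages; the actual tallies are in the attached workbook.

# Hypothetical FY11/FY12 tallies for illustration only.
fy11_eval_jobs, fy11_total_jobs = 29, 467
fy12_eval_jobs, fy12_total_jobs = 35, 368

# Percentage change in the raw number of evaluation/needs assessment jobs.
count_change = 100 * (fy12_eval_jobs - fy11_eval_jobs) / fy11_eval_jobs

# Change in those jobs' share of all completed work orders.
fy11_share = 100 * fy11_eval_jobs / fy11_total_jobs
fy12_share = 100 * fy12_eval_jobs / fy12_total_jobs
share_growth = 100 * (fy12_share - fy11_share) / fy11_share

print(f"Count change: {count_change:.0f}%")                       # roughly 21%
print(f"Share of completed jobs: {fy11_share:.2f}% -> {fy12_share:.2f}% "
      f"({share_growth:.0f}% relative growth)")                   # roughly 53%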

Outcome for Criterion (A):
In FY12 there was not a 30 percent increase over the prior year in the number of program or initiative evaluations/assessments.  The number of program evaluation/needs assessment jobs increased by 21 percent from FY11 to FY12; however, the growth from FY11 to FY12 in program evaluation/needs assessment jobs as a proportion of total completed work orders was 53 percent.  Program or initiative evaluation/needs assessment made up 9.51 percent of completed jobs in FY12, an increase from 6.21 percent of total FY11 work orders.  The workbook of 2011-2012 and 2010-2011 Work Order Table summaries and calculations is attached to this entry.

The assessment of this KPI in relation to its success criterion raised questions about the reason for specifying success as a 30 percent increase in the number of these particular types of jobs. In addition, the assessment process highlighted, again, the degree to which the measurement of outcomes depends upon the reliability of the internal monitoring documentation processes implemented from year to year.  As observed and noted in the assessment of another KPI outcome, failure to meet the expected performance level may have been due to inconsistent documentation or to unexpectedly weak growth in demand for evaluative reporting on program initiatives and externally funded programs.



Planned Assessment for Criterion (B), "Sustain minimum baseline of 75% satisfaction with related programs."

Methodology:
The plan for this assessment was to analyze customer service feedback from the FY12 Client Satisfaction Survey submitted by evaluation and assessment clients. However, satisfaction levels of clients who submitted program or initiative evaluation/assessment jobs were not measured distinctly from those of all other clients because satisfaction surveys were administered anonymously.  It was necessary to administer the survey anonymously in order to encourage a high rate of response to the survey invitation within an abbreviated timeline.  The rushed circumstances under which the survey was administered were explained in detail in the Methodology discussion of the KPI results narrative for Goal 1. Objective 1. KPI 1. (1.1.1 (A-D)).

Outcome for Criterion (B):
The overall response of all clients in relation to all jobs was 79.5 percent positive, based upon the weighted average of the proportion of positive responses to 15 closed-ended survey questions (previously reported in the Outcomes narrative for 1.1.1.).




Action
Improve And Expand Evaluative Work
CONCLUSIONS AND ACTIONS: 1.3.1. (For GOAL 1. OBJECTIVE 3. KPI 1.)

Outcomes targeted to indicate the success of strategies that were implemented to support the FY12 Objective to improve and expand program and initiative evaluation/assessment work suggested that the strategies succeeded in moving the IE office in the desired direction. However, detailed feedback and recommendations from the clients of this work are needed and proposed. This is to be undertaken in conjunction with the examination of the tasks involved in corresponding work orders, in order to evaluate not only the results but also the implementation of services in this area. An additional action proposed in this vein is to conduct an assessment of unmet needs within the University and an analysis of the desirability of expanding this area further in relation to policy priorities that govern resource allocation.



Update to previous cycle's plan for continuous improvement

Plan for continuous improvement

In reviewing and assessing the three Institutional Effectiveness FY12 OBJECTIVES for the overarching GOAL of Enhancing Data Deliveries to Campus Units, a number of determinations for improvement became clear.  First, it was concluded that all three FY12 OBJECTIVES would have been more accurately assessed if internal documentation processes were more consistent, precise and thorough within and across fiscal years.  A second determination identified the need for more frequent and pointed client feedback and needs input.  Third, the decision to measure particular outcomes as presumed evidence of targeted improvements, such as assumed increases in work order requests, should be based on clear trends or established operational standards rather than on an assumption of growth.  The identification of a targeted outcome requires more consideration than simply the availability of related and/or measurable data.  The following are strategies proposed in the preceding Conclusions and Actions narratives, which are expected to offer assistance in the Office of Institutional Effectiveness' efforts at continuous improvement.

I.  Improving operational process monitoring and documentation will be essential for maintenance of fundamental operational quality, as well as for producing reliable measurements of efforts to improve productivity and efficiency.  The systems currently in place need reinforcement with streamlined approaches and incentives for capturing and documenting initial and evolving information about each work order and the staff hours spent fulfilling each work order, including evolutionary variations. Improvement actions which have been proposed in this vein include:

(A)    Designate a single individual to initiate each work order as a single point of contact in order to promote consistently applied rules, judgments and interpretations of information capture.

(B)    Establish staff review of work order status, regularly sharing brief oral reports of time invested in completing each work order and noting important changes and developments.

(C)    Develop and implement an organizational schema for a work order table that will permit the documentation of jobs within jobs, i.e., the hierarchical or familial linking of a new job to the original job from which it evolved as a variation.

(D)    Implement an automated, database-driven, searchable work order documentation system linked to a web-based input page, as permitted by funding opportunities or by sharing resources with IT.

II. Refining and improving the solicitation of client feedback and potential client needs will help focus efforts on particular areas in need of improvement.

(A)    Administration of Client Satisfaction Surveys with tokens in order to link feedback to corresponding work orders.

(B)    Administration of Client Satisfaction Surveys near the end of each term (Fall, Spring, Summer), if not immediately following the delivery of the completed work order, in order to increase the reliability and specificity of work order-related feedback.

(C)    Administration of strategic needs assessments regarding underutilized IE services or services which may be anticipated to play a stronger role in supporting a strategic direction in which the University may move.


  1. Encouraging the habit of regular follow-up contact with completed work order clients will provide the IE staff the opportunity to offer highly individualized explanations and assistance in interpreting delivered results.  FY12 Client Satisfaction Survey results suggest written explanations were less clear and thorough than oral explanations, which did not appear to be frequently offered.


  2. In order to help assure that IE staff are prepared to meet the challenges of mastering new and changing technology, analysis tools and applications, provide adequate time and opportunity for formal training, including time for practice, self-teaching and peer-tutelage.


In April 2012, the former Office of Institutional Research and Analysis was included in a multi-divisional University reorganization.  The office was moved from the Division of Finance and Operations to the Division of Enrollment Management, reporting to the VP for Enrollment Management.  Personnel were reassigned within the department, and the name of the department changed to Institutional Effectiveness.  Administrative program assessment, legislative relations and public policy responsibilities were added to the department's institutional research function.  State institutional data reporting responsibilities were also moved to IE by virtue of reassigning a position to IE from the Registrar's Office, where the state reporting functions had been performed for several years.  Before the end of April 2012, the individual handling the reporting function accepted a position in another area of the University.  Although personnel positions have been increased in IE since the reorganization, the department has experienced challenges in filling those positions, which require a certain level of technical data management skills.  Two staff were added to the department, and two positions remain unfilled pending acceptable applicants.  IE continues to build on its new structure, while facing transitional challenges.