OATdb Archive

2010 - 2011

Psychology And Philosophy, Department Of

Goal
Faculty Scholarship
Faculty generate and disseminate scholarship

Objective
Scholarship Portfolios
Faculty will be effective scholars as evidenced by the portfolio documenting their scholarship on an annual basis

Indicator
Review Of Faculty Scholarship
Evaluation of faculty scholarship portfolios according to the Department guidelines for Scholarly & Artistic Endeavors. Factors in this category include: number and assessed quality of publications in externally reviewed journals; number of presentations at national, international, and regional conferences; and number of students impacted by the faculty member in a mentorship role.

Criterion
Scholarship Evaluation Score
A score of 3 or more on a 5-point scale on the Scholarly & Artistic Endeavors Rubric is considered to be minimally satisfactory. The scale takes into consideration: 1. published books; 2. articles printed in reviewed journals; 3. articles in print in non-reviewed journals; 4. articles submitted to reviewed journals; 5. articles in press; 6. presentations at regional/national/international conventions.

Finding
Scholarship Evaluation Scores
The mean Scholarship Evaluation Score for members of the Department of Psychology and Philosophy was 3.32 on a scale of 1-5. Five of the twenty-two members of the Department scored either a 1 (n=3) or a 2 (n=2), which is below the Department's expectations. Seventeen of the twenty-two members of the Department scored at 3 or better: 3 (n=7); 4 (n=5); 5 (n=5).
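As a simple arithmetic check (using only the score distribution reported above), the stated departmental mean follows directly from the counts:

\[
\bar{x} = \frac{(1)(3) + (2)(2) + (3)(7) + (4)(5) + (5)(5)}{22} = \frac{73}{22} \approx 3.32
\]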

Action
Scholarship Evaluation Actions
Two of the five faculty members falling below criterion for Scholarship are "brand new" faculty, not only to Sam Houston State University but to academia itself. The remaining three are tenured faculty members. Each of these individuals met privately with the Chair of the Department to consult about their progress with respect to scholarship. Both new faculty members currently are engaged in research, either at the IRB submission phase or at the phase in which they are collecting/analyzing data, and each expects to have published articles during the current academic year. Of the tenured faculty, one has organized a research team of graduate and undergraduate students that is actively collecting data in the area of social psychology. Another is working intensively on developing assessment tools for the evaluation of practicum/internship students and currently is preparing accreditation materials for one of our graduate programs. The third is on a 4-4 (teaching-track) load but has been afforded laboratory space and is being encouraged to submit IRB protocols and collect data.

Goal
Teaching Excellence
Faculty demonstrate a high level of teaching effectiveness

Objective
Teaching Portfolios
Faculty will engage in effective teaching as evidenced by a portfolio documenting their teaching activities on an annual basis

Indicator
Teaching Portfolio Review
Evaluation of faculty teaching portfolios according to the Department Chair Guidelines for Teaching Evaluations. Factors in this indicator include: student evaluations, faculty track (teaching or research), number of sections, number of students in each section, etc.

Criterion
Teaching Portfolio Ratings
All department faculty are expected to score a 3 or better on the Chair's evaluation of teaching rubric. The rubric for this evaluation consists of: 1. number of courses taught; 2. IDEA Excellent Teacher evaluation; 3. IDEA Summary Course evaluation; 4. number of sections taught; 5. number of students instructed.

Finding
Teaching Evaluations (Chair's Rating)
The criterion for effective teaching as rated by the chair of the department is a 3 on a scale of 1 to 5. All members of the department met criterion, with scores ranging from 3 to 5. The mean of the chair's ratings was 3.99.


Action
Teaching Effectiveness (Chair's Evaluations)
Criterion was met, but, oddly, the performance scores were below the 2009-2010 levels. Essentially, the chair, in making his assessment, did not take into account all of the variables that he used in the preceding year. A better measure of whether the faculty improved during this academic year relative to the last would probably be the students' IDEA evaluations.

All faculty have met with the chair to discuss any classroom issues that appear to be recurring. In addition, all faculty are strongly encouraged to attend the annual College Teaching Conference that will occur in August 2011.

Faculty members with lower teaching scores are also strongly encouraged to seek guidance at the PACE Center. Interestingly, one faculty member who had scored poorly in last year's evaluations attended sessions at the PACE Center and posted a significant increase in her evaluations for the current year over the previous year.

Because the Chair was not pleased with the current rubric for his evaluation of teaching, he appointed three ad hoc groups to develop a list of variables that the faculty would like him to use in future evaluations.

Objective
Individual Development And Educational Assessment (IDEA) Evaluations
IDEA student evaluations of teaching will indicate that faculty are engaging in effective teaching as indicated by their summary scores

Indicator
IDEA Ratings
A summary IDEA score at or above the institution mean is considered to be satisfactory. Consistent with IDEA recommendations, converted averages on IDEA evaluations that are in the gray box (middle 40%) are considered to be "effective teaching." All faculty have students evaluate each of their classes during the Fall and Spring semesters using the IDEA teaching evaluations. The IDEA system focuses on student learning of 12 specific objectives, and the system solicits students' feedback on their own learning progress, effort, and motivation, as well as their perceptions of the instructor's use of 20 instructional strategies and teaching methods. In addition, the system surveys instructors regarding their overall goals and highlights these for them in the analysis and report. The system adjusts evaluation scores for five areas beyond the instructor's control, such as class size; student motivation, effort and work habits; and disciplinary difficulty. The scores are then compared to national norms. Teaching effectiveness is assessed in two ways: A. Progress on Relevant Objectives, a weighted average of student ratings of the progress they reported on objectives selected as "Important" or "Essential" (double weighted) and B. Overall Ratings, the average student agreement with statements that the teacher and the course were excellent.

Criterion
Teaching Effectiveness
The criteria for teaching effectiveness are a score at or above 3.8 on Progress on Relevant Objectives and a score at or above 4.0 on Overall Ratings (the average of the excellent teacher and excellent course items).

Each faculty member will receive a summary score at or above the institution average to be considered satisfactory.

Finding
Teaching Effectiveness
The mean score for "progress on relevant objectives" for the department was 4.35, with a range of 3.4 to 5.0. Three courses did not meet the criterion of 3.8 or above on this measure. Two courses were in Philosophy and one was in Psychology, though the averages for the faculty members teaching those courses did reach the 3.8 criterion. During the faculty member/chair FES meetings, the parties discussed possible ways to better meet the criterion in these classes.

With respect to overall teacher ratings, the mean was 4.36 with a range of 3.7-4.9. One faculty member scored a 3.7 and three scored a 3.9; all the rest were at 4 or above and thus met criterion.


Action
IDEA Scores
Criterion for meeting relevant objectives was met, with the exception of three courses, one course for each of three individual faculty members; taking into account the other courses taught by these faculty members, each did, overall, meet criterion. Given that these faculty members consistently have been above criterion over the past several semesters, we must consider that this may have been an anomaly caused by the student population that semester. We will continue to monitor those variables in the upcoming semester and, if the problem persists, speak with the faculty members about what they wish to attain in the class and how they can better their chances of being successful.

With respect to overall teaching ratings, one faculty member scored a 3.7 and three scored 3.9s. Two faculty members of this group are tenured and two are tenure track, one being a brand-new, first-year addition to the department. These faculty members are strongly encouraged to attend the CHSS Teaching Workshop in August 2011. In addition, at least one of the two tenure-track faculty members will be observed by a senior faculty member in a classroom setting during the upcoming semesters, as is the new policy.

Goal
Curriculum
Faculty assess the extent to which the curriculum covers a broad base of the field of psychology

Objective
Curriculum Evaluation
Courses in the Department of Psychology will be evaluated in terms of the breadth of topics covered in the field.

Indicator
Curriculum Matrix
Courses will be compared to the matrix designed by Levy et al. and published in Teaching of Psychology (1999). The chair made the comparisons based upon the syllabi for each course. In addition, the chair asked individual faculty about specific courses and whether those courses met the criteria of the Levy et al. matrix.

Criterion
Percent Of Courses Covering Current Perspectives
50% of courses in the psychology curriculum are expected to require knowledge of the "Current Perspectives" section of the Levy Curriculum Matrix.

Finding
Curriculum Evaluation
Nine of the courses/areas listed in the Current Perspectives portion of the Levy Curriculum Matrix met the criterion requiring knowledge of that section.

Action
Curriculum Evaluation
We are continuing to monitor courses taught in the department with respect to meeting the criteria of the Levy Curriculum Matrix. We also will encourage faculty members who are proposing courses to keep the Matrix in mind when they are developing those courses.

Goal
Undergraduate Student Perception Of Psychology Learning
Undergraduate students will be satisfied with learning opportunities.

Objective
Undergraduate Student Perception Of Psychology Offerings
Students will indicate an appreciation for the diversity of fields within psychology and a realization that elementary statistics has improved their critical thinking in evaluating ideas and arguments and in problem solving.


Indicator
Senior Survey
The Psychology Senior Survey is given to graduating seniors.

Criterion
75% Of Seniors Rate Diversity High
The senior survey asks: How would you rate the diversity of your education in psychology based upon the types and number of courses taken? Using a 5-point Likert scale, the criterion is for 75% of respondents to indicate a 4 or 5 (diverse) on the survey. A copy of the senior survey is attached.


Finding
Senior Survey
The percentage of students completing the Senior Survey who rated the diversity of the program at 4 or higher on the Likert scale was 84% (Fall 2010) and 73% (Spring 2011). Thus, criterion was met for the Fall semester and fell just below criterion for the Spring semester.

Indicator
Individual Development And Educational Assessment (IDEA) Objective Ratings
The IDEA student evaluation asks students to rate various objectives of the class. For this objective, the following IDEA objectives will be examined for the course in Elementary Statistics: (A) to apply course material to improve thinking and problem solving and (B) to apply course material to critically evaluate ideas and arguments.


Criterion
75% Of Students Rate Objectives High
Of the students having taken Elementary Statistics, 75% will indicate a 4 or 5 on the category, "Apply course materials to improve thinking and problem solving." Also, 75% of students having taken Elementary Statistics will indicate a 4 or 5 on the category, "Apply course materials to critically evaluate ideas and arguments."

Finding
IDEA Ratings
Of the students having taken Elementary Statistics, 81% indicated a 4 or 5 on the category, "Apply course materials to improve thinking and problem solving." Thus, criterion was met.
Fifty-eight percent of the respondents indicated a 4 or 5 on the category, "Apply course material to critically evaluate ideas and arguments." Criterion was not met.

Action
Student Perceptions
With respect to Diversity of the Program, criterion was met or nearly met. We are inspecting our programs and attempting to add courses that may provide a greater degree of diversity for the students. Additionally, to reach a greater audience, we are now offering particular courses from different sub-areas through distance learning. We believe this should expand the opportunities for our students to take advantage of the different types of courses offered by the department.
With respect to applying course material to (1) improve thinking and problem solving and (2) critically evaluate ideas and arguments, even though we did not meet criterion on the latter, we have progressed from years past; in 2008-09, our percentage on critically evaluating ideas and arguments was 24%, and last year it was 49%. With 58% this year, the progression is good and we hope to reach criterion next year. In the statistics/methods courses, we will stress the ability to make informed, probabilistic decisions based upon the statistical/methodological procedures taught in those classes, and we will seek more germane examples that point out the pragmatic value of the material students are learning.



Update to previous cycle's plan for continuous improvement

Plan for continuous improvement: With respect to Teaching Excellence, as measured by the Teaching Portfolio Rating, all members of the Department met the criterion. In the IDEA evaluation summary scores, four of twenty-two faculty members failed to reach criterion. With the criterion set at a 4.0 on a scale of 1-5, three of those four individuals received a 3.9. We are not overly concerned with these scores, as they are not consistent with previous years' performances. The individual who scored a 3.7 (and one must remember that this reflects only one course) has agreed to attend and participate in the CHSS Teaching Conference in August.

With respect to "Meeting Relevant Objectives" on the IDEA forms, the Department, on the whole, surpassed the criterion. Two philosophy and one psychology faculty members were slightly below the criterion in one course each. Of course, overall, they did meet criterion, but in those individual courses the level was not met. The faculty members will be asked to reassess their goals and expectations for these individual courses, and if the current levels persist, the faculty members will be directed to the PACE Center.

With respect to Curriculum Evaluation, the Department met its criterion with classes in each category of the Current Perspectives section of the Levy Curriculum Matrix. We continue to monitor these courses and encourage consideration of the Matrix when faculty are developing courses to be added to the curriculum.

Concerning Faculty Scholarship, five of 22 faculty members failed to reach criterion. Two of those were brand-new faculty who currently are working on building a research program, a third is actively collecting data with a group of undergraduate and graduate students serving as her research team, and the fourth is being encouraged to become more involved. The last person has been afforded laboratory space, and money has been set aside for travel. Ultimately, though, it is up to the individual faculty member to take advantage of the opportunities made available by the department and the university. The chair is fairly limited to providing encouragement and incentives.

We also need to incorporate more methodology in the content courses that we teach. This will reinforce how scientific decisions are made and how we gain knowledge through theory-based methodology and testing.

Our program is viewed as relatively diverse, but we need to make sure that we consider adding new courses, or alternative ways of presenting courses, to reach the maximum number of students with the maximum impact.