Assuring Validity Of Summative Outcomes Evaluation
The terminal goal is to assist in assuring the validity of the outcome evaluation upon which the Florida A&M University (FAMU) NSF HBCU-UP project will base its summative conclusions regarding the overall success or failure of its initiatives and efforts.
Objective
A. Advise & Collaborate In Evaluation Design
A. Collaborate with Florida A&M University's designated assessment team to advise the project's leadership on planning and implementing an appropriate and valid evaluation design and measurement & assessment processes.
Indicator
A. Phone Conferences & Email On Evaluation Design
A. Evidence of phone meetings and email exchanges with leaders of the FAMU HBCU-UP project in which the consultant is briefed on evaluation plans developed by the group at various points in the project's unfolding and brainstorms about the project's evaluation design.
Criterion
A. One phone meeting and two email/document exchanges on evaluation design
A. At least one phone meeting and two email/document exchanges per semester addressing evaluation design: initializing, revising, or confirming and following up on the project's planned evaluation design and specific elements therein.
Finding
A. Phone meeting, email & document exchange
There was at least one in-depth phone meeting and more than two email/document exchanges between the FAMU project and the consultant early in the Spring semester. Due to changes in the implementation of the program during the 2007-08 academic year, concomitant design changes were discussed in depth in preparation for the meeting of the Advisory Board in February. See attachment. Just prior to the Fall term, the MOU for this contract was officially approved. In the process of creating and obtaining approval of the contract, FAMU project leaders and the consultant reviewed the latest version of the project design and its implicit and expressed evaluation design.
Indicator
A. Face To Face Meetings On Evaluation Design
A. Document and count face-to-face meetings with leaders of the FAMU HBCU-UP project held for briefings on evaluation plans developed by the group at various points in the project's unfolding and for brainstorming about the project's evaluation design.
Criterion
A. One face to face meeting
A. Meet face to face at least once during the contract with FAMU project leadership, assessment personnel, and persons who will implement program strategies, either on site at FAMU or off-site (to be arranged), to develop the necessary project background and understanding. This is important for developing background to help with advising on design, for understanding assessment implementation challenges and practical solutions, and for interpreting assessment results.
Finding
A. Face to Face Meeting
A. Consultant met face to face with FAMU HBCU-UP project leaders and with their Advisory Board in Orlando, FL, on February 28-29, 2008.
Indicator
A. Client Evaluation Of Consultant On Advice In Evaluation Design
A. Client evaluation of Consultant collaboration and advice on assessment and evaluation design
Criterion
A. Client Evaluation on evaluation design
A. Favorable client evaluation with regard to helpfulness of Consultant's advising on evaluation design
Finding
A. Client evaluation of consultant design help
A. A structured request for Client Evaluation of Consultant had not been delivered to the client as of July 2008. The concept and form of this Client Evaluation of Consultant request were devised in Summer 2008.
Action
A. Continue contributing on design and assessment
A. Consultant has demonstrated commitment to continue working with the FAMU project through program design adjustments and to encourage and support the identification of appropriate treatment-population comparison groups as well as pre-project performance benchmarks. Consultant voiced these commitments at the Feb 29 Advisory Board meeting and in a July 15 phone conference with an NSF site-visit team and the director of the FAMU HBCU-UP project. Revision plans will be reviewed, and any assessments actually conducted will be requested for review.
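For illustration only, the following minimal sketch shows the kind of treatment-versus-comparison and pre-project benchmark check the Consultant has encouraged; the data file, column names, group labels, and benchmark value are hypothetical placeholders, not FAMU project artifacts.

```python
# Illustrative sketch only: compares a hypothetical treatment group with a
# comparison group and with an assumed pre-project benchmark. File name,
# column names, and the benchmark value are placeholders.
import pandas as pd
from scipy import stats

PRE_PROJECT_BENCHMARK_GPA = 2.60  # assumed pre-project performance benchmark

# Hypothetical student-outcome table: one row per student, with a group label.
df = pd.read_csv("student_outcomes.csv")  # assumed columns: student_id, group, gpa

treated = df.loc[df["group"] == "treatment", "gpa"].dropna()
comparison = df.loc[df["group"] == "comparison", "gpa"].dropna()

# Treatment vs. comparison group (two-sample t-test, unequal variances).
t_groups, p_groups = stats.ttest_ind(treated, comparison, equal_var=False)

# Treatment group vs. the pre-project benchmark (one-sample t-test).
t_bench, p_bench = stats.ttest_1samp(treated, PRE_PROJECT_BENCHMARK_GPA)

print(f"Treatment vs comparison: t = {t_groups:.2f}, p = {p_groups:.3f}")
print(f"Treatment vs pre-project benchmark: t = {t_bench:.2f}, p = {p_bench:.3f}")
```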
Objective
B. Assessment Implementation Input And Review
Provide input on challenges to planned assessment implementation, and provide recommendations for improving student-outcomes assessment implementation.
Indicator
B. Evidence Of Assmt Review Feedback
B. Documentation and other evidence of informal feedback given to client in response to review of planned assessment implementation
Criterion
B. One written assessment implementation review
B. Assuming that the client has provided the consultant with written information about planned assessment implementation and copies, or written descriptions, of the measurement instruments or assessment tools to be used, at least one written report or reaction to the implementation plan and materials must be provided to the client each year.
Finding
B. Evidence Of Assmt Review Feedback
B. The FAMU client did not provide the consultant with the current year's assessment implementation plans in written form, and copies of assessment materials were not provided. Hence there was nothing concrete upon which to base a written review and written feedback about assessment implementation. This is a frustrating point for both the client and the consultant.
Indicator
B. Evidence Of Response To Need For Methodological Assistance
B. Documentation and other evidence of having responded to FAMU client requests for input on methodological challenges that arose in implementing assessment
Criterion
B. One Instance of Methodological Assistance
B. Assuming that the client requested assistance with a methodological challenge, or that Consultant noticed such a challenge and responded to it without being asked, there should be evidence of technical methodological assistance having been provided.
Finding
B. One Instance of Methodological Assistance
B. Assistance with methodological problems was requested and given via phone conference on Feb 21. Reference to the Feb 21 Phone Conference Follow-up document (attached to an earlier finding) will provide details about the nature of the technical problems presented by anticipated program design changes.
Indicator
B. Client Evaluation On Assmt Implementation Input
B. Client evaluation of Consultant input on Assmt Implementation
Criterion
B. Client evaluation on assessment implementation
B. Favorable client evaluation with regard to helpfulness in advising on assessment implementation
Finding
B. Client evaluation on assessment implementation
B. A structured request for Client Evaluation of Consultant had not been delivered to the client as of July 2008. The concept and form of this Client Evaluation of Consultant request were devised in Summer 2008.
Action
B. Improve communication of Assmt Implementation
B. Develop a plan for encouraging and prompting the client (1) to share written information about program changes and evaluation design changes; (2) to identify persons in the institution with whom the consultant can work to identify variables on which to define comparison populations and to draw down benchmark outcome information; (3) to provide data and summary results from past assessments; and (4) to provide future assessment results on a regular and timely basis. Explore the application of the IDEA faculty and student evaluation survey to measure faculty and student alignment on learning objectives.
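For illustration only, the sketch below shows one way faculty/student alignment on learning objectives might be summarized if survey ratings were exported to a simple flat table; the file and column names are hypothetical and do not reflect the IDEA survey's actual data format.

```python
# Illustrative sketch only: summarizes faculty/student alignment on learning
# objectives from a hypothetical flat export of survey ratings.
import pandas as pd

ratings = pd.read_csv("objective_ratings.csv")
# Assumed columns: objective, faculty_importance (1-5), student_progress (1-5)

# Mean faculty-rated importance and student-rated progress for each objective.
by_objective = ratings.groupby("objective")[["faculty_importance", "student_progress"]].mean()

# A simple alignment index: correlation between the two means across objectives
# (higher values suggest faculty emphasis and student progress are better aligned).
alignment = by_objective["faculty_importance"].corr(by_objective["student_progress"])

print(by_objective.round(2))
print(f"Alignment (Pearson r across objectives): {alignment:.2f}")
```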
Objective
C. Documentation Of Recommendations Or Confirmation
Provide documented feedback on implementation methods and procedures requiring improvement. Provide documented confirmation of successful methods and results.
Indicator
C. Evidence Of Feedback On Validity Of Results
C. Written documentation of feedback on assessment results vis-à-vis technical validity, reliability, and authenticity
Criterion
C. One feedback document on formative results and methods
C. At least one written document of feedback, recommendations, or confirmation should be delivered to the client on formative results and their associated assessment methods each year of the project and contract duration. This assumes that the FAMU HBCU-UP project client produces and provides formative assessment results and information on the assessment methods used to obtain them.
Finding
C. Evidence Of Feedback On Validity Of Results
C. The FAMU HBCU-UP project has conducted limited assessment due to numerous changed conditions that delayed program implementation and forced revisions in program design, which in turn delayed and changed evaluation design and implementation. Some data were collected at the end of the Spring semester and have been compiled by FAMU project staff, but they had not been shared with the Consultant as of July 25, 2008. At the current pace, the first formative results may not be available for review and feedback until Fall 2008.
Criterion
C. Documentation of feedback on summative results and methods
C. Over the course of the contract, at least one written document of feedback, recommendations, or confirmation must be delivered to the client on summative results and their associated assessment methods. This assumes that the FAMU HBCU-UP project client produces and provides summative assessment results and information on the assessment methods used to obtain them.
Finding
C. Documentation of feedback on summative results
C. Although summative results are not expected until the culmination of the project, the project has conducted limited assessment due to numerous changed conditions that delayed program implementation and forced revisions in program design, which in turn delayed and changed evaluation design and implementation. Some data were collected at the end of the Spring semester and have been compiled by FAMU project staff, but they had not been shared with the Consultant as of July 25, 2008. At the current pace, summative results may not be available for review and feedback until Spring 2010.
Indicator
C. Client Evaluation On Written Feedback Of Formative Results And Methods
C. Client evaluation of Consultant's written feedback on formative results and methods
Criterion
C. Client evaluation on formative results
C. Favorable client evaluation with regard to helpfulness of written feedback on formative results
Finding
C. Client evaluation on formative results feedback
C. A structured request for Client Evaluation of Consultant had not been delivered to the client as of July 2008. The concept and form of this Client Evaluation of Consultant request were devised in Summer 2008.
Indicator
C. Client Evaluation On Written Feedback Of Summative Results And Methods
C. Client evaluation of Consultant's written feedback on summative results and methods
Criterion
C. Client evaluation on summative results feedback
C. Favorable client evaluation with regard to helpfulness of written feedback on summative results
Finding
C. Client evaluation on summative results feedback
C. A structured request for Client Evaluation of Consultant had not been delivered to the client as of July 2008. The concept and form of this Client Evaluation of Consultant request were devised in Summer 2008.
Action
C. Plan for validity review
C. Once the previous two actions have been initiated and are under way, the emphasis with the client must shift toward taking measures ahead of time to assure the validity of assessment results. Visit the FAMU campus in order to understand the availability of historical student data and the project's ability to preserve current student data. More written documentation must be produced of the strategies and procedures required for producing valid results.
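For illustration only, the sketch below shows one routine ahead-of-time check on an assessment instrument, an internal-consistency (Cronbach's alpha) estimate computed from item-level responses; the file name and column layout are hypothetical placeholders.

```python
# Illustrative sketch only: internal-consistency reliability (Cronbach's alpha)
# for a hypothetical instrument, computed from item-level responses.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    items = items.dropna()
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical item-level responses: one row per student, one column per item.
responses = pd.read_csv("instrument_item_scores.csv")
alpha = cronbach_alpha(responses.filter(like="item_"))
print(f"Cronbach's alpha: {alpha:.2f}")  # values near or above 0.70 are commonly taken as adequate
```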
Goal
Validity Of Formative Outcomes Assessments
The intermediate goal is to assist in assuring the validity of interim student outcome measures and of the formative strategy assessments upon which the Florida A&M University NSF HBCU-UP project will base decisions about program development and improvement.
Objective
A. Advise & Collaborate In Evaluation Design
A. Collaborate with Florida A&M University's designated assessment team to advise the project's leadership on planning and implementing an appropriate and valid evaluation design and measurement & assessment processes.
Indicator
A. Phone Conferences & Email On Evaluation Design
A. Evidence of phone meetings and email exchanges with leaders of the FAMU HBCU-UP project in which the consultant is briefed on evaluation plans developed by the group at various points in the project's unfolding and brainstorms about the project's evaluation design.
Criterion
A. One phone meeting and two email/document exchanges on evaluation design
A. At least one phone meeting and two email/document exchanges per semester addressing evaluation design: initializing, revising, or confirming and following up on the project's planned evaluation design and specific elements therein.
Finding
A. Phone meeting, email & document exchange
There was at least one in-depth phone meeting and more than two email/document exchanges between the FAMU project and the consultant early in the Spring semester. Due to changes in the implementation of the program during the 2007-08 academic year, concomitant design changes were discussed in depth in preparation for the meeting of the Advisory Board in February. See attachment. Just prior to the Fall term, the MOU for this contract was officially approved. In the process of creating and obtaining approval of the contract, FAMU project leaders and the consultant reviewed the latest version of the project design and its implicit and expressed evaluation design.
Indicator
A. Face To Face Meetings On Evaluation Design
A. Document and count face-to-face meetings with leaders of the FAMU HBCU-UP project held for briefings on evaluation plans developed by the group at various points in the project's unfolding and for brainstorming about the project's evaluation design.
Criterion
A. One face to face meeting
A. Meet face to face at least once during the contract with FAMU project leadership, assessment personnel, and persons who will implement program strategies, either on site at FAMU or off-site (to be arranged), to develop the necessary project background and understanding. This is important for developing background to help with advising on design, for understanding assessment implementation challenges and practical solutions, and for interpreting assessment results.
Finding
A. Face to Face Meeting
A. Consultant met face to face with FAMU HBCU-UP project leaders and with their Advisory Board in Orlando, FL, on February 28-29, 2008.
Indicator
A. Client Evaluation Of Consultant On Advice In Evaluation Design
A. Client evaluation of Consultant collaboration and advice on assessment and evaluation design
Criterion
A. Client Evaluation on evaluation design
A. Favorable client evaluation with regard to helpfulness of Consultant's advising on evaluation design
Finding
A. Client evaluation of consultant design help
A. A structured request for Client Evaluation of Consultant had not been delivered to the client as of July 2008. The concept and form of this Client Evaluation of Consultant request were devised in Summer 2008.
Action
A. Continue contributing on design and assessment
A. Consultant has demonstrated commitment to continue working with the FAMU project through program design adjustments and to encourage and support the identification of appropriate treatment-population comparison groups as well as pre-project performance benchmarks. Consultant voiced these commitments at the Feb 29 Advisory Board meeting and in a July 15 phone conference with an NSF site-visit team and the director of the FAMU HBCU-UP project. Revision plans will be reviewed, and any assessments actually conducted will be requested for review.
Objective
B. Assessment Implementation Input And Review
Provide input on challenges to planned assessment implementation, and provide recommendations for improving student-outcomes assessment implementation.
Indicator
B. Evidence Of Assmt Review Feedback
B. Documentation and other evidence of informal feedback given to client in response to review of planned assessment implementation
Criterion
B. One written assessment implementation review
B. Assuming that the client has provided the consultant with written information about planned assessment implementation and copies, or written descriptions, of the measurement instruments or assessment tools to be used, at least one written report or reaction to the implementation plan and materials must be provided to the client each year.
Finding
B. Evidence Of Assmt Review Feedback
B. The FAMU client did not provide the consultant with the current year's assessment implementation plans in written form, and copies of assessment materials were not provided. Hence there was nothing concrete upon which to base a written review and written feedback about assessment implementation. This is a frustrating point for both the client and the consultant.
Indicator
B. Evidence Of Response To Need For Methodological Assistance
B. Documentation and other evidence of having responded to FAMU client requests for input on methodological challenges that arose in implementing assessment
Criterion
B. One Instance of Methodological Assistance
B. Assuming that the client requested assistance with a methodological challenge, or that Consultant noticed such a challenge and responded to it without being asked, there should be evidence of technical methodological assistance having been provided.
Finding
B. One Instance of Methodological Assistance
B. Assistance with methodological problems was requested and given via phone conference on Feb 21. Reference to the Feb 21 Phone Conference Follow-up document (attached to an earlier finding) will provide details about the nature of the technical problems presented by anticipated program design changes.
Indicator
B. Client Evaluation On Assmt Implementation Input
B. Client evaluation of Consultant input on Assmt Implementation
Criterion
B. Client evaluation on assessment implementation
B. Favorable client evaluation with regard to helpfulness in advising on assessment implementation
Finding
B. Client evaluation on assessment implementation
B. A structured request for Client Evaluation of Consultant had not been delivered to the client as of July 2008. The concept and form of this Client Evaluation of Consultant request were devised in Summer 2008.
Action
B. Improve communication of Assmt Implementation
B. Develop a plan for encouraging and prompting the client (1) to share written information about program changes and evaluation design changes; (2) to identify persons in the institution with whom the consultant can work to identify variables on which to define comparison populations and to draw down benchmark outcome information; (3) to provide data and summary results from past assessments; and (4) to provide future assessment results on a regular and timely basis. Explore the application of the IDEA faculty and student evaluation survey to measure faculty and student alignment on learning objectives.