New York Codes, Rules and Regulations
Title 8 - EDUCATION DEPARTMENT
Chapter XXI - Annual Program Plans
Part 2302 - Fiscal Year 1978 Annual Program Plan For Libraries And Learning Resources And Educational Innovation And Support
Section 2302.6 - State advisory council evaluation

Current through Register Vol. 46, No. 12, March 20, 2024

(a) In conducting its advisory, evaluation and reporting activities, the New York State advisory council is supported by the Office for Program Analysis and Evaluation in the State Education Department. The Office for Program Analysis and Evaluation serves as support staff for the Executive Deputy Commissioner of Education by conducting evaluation activities in the areas of policy planning, policy analysis, and program evaluation. The office's program evaluation function is executed by the Bureau for Program Planning and Evaluation. Major efforts have been made over the past two years by that bureau to develop and implement a model for the evaluation of department programs. Results of program evaluations conducted by the bureau are reported directly to the executive deputy commissioner. It is in the context of this program evaluation function that the Bureau for Program Planning and Evaluation works with the State advisory council in conducting an evaluation of the title IV of the Elementary and Secondary Education Act program.

(b) The organizational pattern described above places the title IV of the Elementary and Secondary Education Act evaluators from the Office for Program Analysis and Evaluation (Bureau for Program Planning and Evaluation) in a unique position to assist the council in monitoring and evaluating implementation of the title IV of the Elementary and Secondary Education Act program. While individual advisory council members are encouraged to conduct assessments of the title IV of the Elementary and Secondary Education Act program on their own, it is recognized that they cannot commit the amount of time and effort necessary to obtain a comprehensive understanding of the title IV of the Elementary and Secondary Education Act program. In assisting the council in its evaluation, the title IV of the Elementary and Secondary Education Act evaluators will be able to observe the program on a day-to-day, week-to-week basis without having direct responsibility for the administration of the program. This allows the evaluators to retain their objectivity while still being knowledgeable about the intricacies of the program. In addition, the credibility of the evaluation findings is increased because the evaluators are viewed by both council members and program administrators as knowledgeable about program operations. This increased level of credibility should result in greater utilization of evaluation data by the advisory council in making decisions on program policy and by program administrators in modifying program operations.

(c) During the past two Federal fiscal years (fiscal year 1976 and fiscal year 1977) the Office for Program Analysis and Evaluation in the State Education Department has assisted the New York State advisory council in conducting its advisory, evaluation and reporting activities. In both of these years, the State advisory council's evaluation of title IV of the Elementary and Secondary Education Act has focused on the management processes employed in the program. For example, under part B, evaluators have assessed the extent to which LEA part B applications reflect contemporary standards of planning and evaluation for the purpose of improving State guidelines, forms, and instructions as well as technical assistance given to LEA's. In addition, the evaluators worked with part B program administrators in developing instruments for the collection of monitoring and evaluation information on projects. Under part C, evaluators collected and analyzed data on technical assistance given to local school districts. In addition, evaluators worked with program administrators in devising a system for reviewing competitive project proposals submitted under part C. Joint efforts between the evaluators and program administrators have also resulted in the development of a series of instruments to be used for monitoring the implementation of part C projects.

(d) The emphasis on evaluating management processes under title IV of the Elementary and Secondary Education Act was necessary because local district projects were only in the early stages of implementation during fiscal year 1976 (i.e. preparing project plans and beginning implementation of the plans). During fiscal year 1977 most local projects under parts B and C have been implemented. Procedures for monitoring both B and C projects have been established and implemented. Therefore, evaluation work in fiscal year 1978 will focus on the results of project implementation. Emphasis will be placed on evaluating the outcomes of projects and the progress being made in following New York State strategies for part B and part C. The next two sections of this evaluation plan will present the implications of this outcome orientation for evaluation of part B and part C.

(e) Part B--Evaluation strategy.

(1) As indicated in the fiscal year 1977 program plan, the State advisory council has viewed its part B evaluation strategy as a cumulative strategy; i.e. generalizations on the impact of part B projects in the State can be made only after reviewing data from a number of project years. Since part B is an acquisitions program, the first project year in a district is often spent obtaining materials and infusing these materials into the library, guidance program or the curriculum in a subject area. At the end of the initial project year, librarians, counselors or teachers may have actually used materials with students for only a few months. It is extremely difficult to attribute changes in student behavior to materials which have been used for such a short period of time. Therefore, a cumulative strategy in which data is gathered on a project over time will begin to give the council greater insights into the impact of IV-B programs on target populations. Implementation of the part B evaluation strategy for fiscal year 1978 will complete the four-phase cumulative approach for the fiscal year 1976 and fiscal year 1977 projects.

(2) The part B evaluation strategy for fiscal year 1978 is designed to complete phases three and four of the strategy outlined in the fiscal year 1977 plan for projects funded with fiscal year 1976 funds and fiscal year 1977 funds (see diagram below). For projects supported with fiscal year 1976 funds, phases one and two have been completed for a sample of districts. The data collection instruments for phases three and four have been developed for use on-site by representatives of the IV-B program office. Evaluation work in fiscal year 1978 will concentrate on conducting analyses with data gathered on project implementation and impact for fiscal year 1976 projects.

(3) In terms of projects funded with fiscal year 1977 monies, phases one and two are being conducted in spring 1977 (see fiscal year 1977 program plan). Phases three and four will be completed in spring 1978 after projects have been implemented. Prior to collecting data on fiscal year 1977 projects, revisions will be made in the data collection instruments based on experience with the fiscal year 1976 projects. Analysis of the data on project implementation and impact for fiscal year 1977 projects will also be conducted in spring 1978.

(4) The emphasis on evaluating the outcomes of part B projects will be continued for the projects funded with fiscal year 1978 money. The four-phase strategy will be modified for these projects by dropping phase 2--planning quality study--from the evaluation design. Work on the financial impact study will be conducted for fiscal year 1978 projects in spring 1978. Phases three and four, which deal with project monitoring and evaluation, will be implemented in spring 1979. The specific objectives and activities of phases three and four for fiscal year 1978 projects will be included in the fiscal year 1979 program plan since these activities will be conducted during that fiscal year.

(f) Phase 1--Financial impact study. The objective of phase 1 is to determine the impact of part B funding under the current New York State formula on existing LEA expenditures (from State and local sources) in areas eligible for funding under the purposes of part B. Information regarding the increase in expenditures (impact) attributed to part B will be used by the advisory council in considering possible adjustments to the part B distribution formula. In addition, the impact of alternative local school district expenditure strategies (i.e. how local districts decide to spend their part B money, such as concentrating the funds on one of the part B purposes or on one target group) will be documented. This information will be useful in determining how to maximize the financial impact of part B monies.
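
The increase-in-expenditures calculation described above reduces to simple per-district arithmetic: compare State and local spending in eligible areas before and after part B funding, and ask whether part B dollars supplemented or supplanted existing local effort. The sketch below is a minimal illustration of that comparison; the district names, dollar figures and record layout are hypothetical and are not taken from the plan.

    # Minimal sketch of the phase 1 financial impact calculation.
    # All district records and dollar figures are hypothetical.

    leas = [
        # (district, State/local eligible-area spending before part B,
        #  the same spending after part B, part B allocation)
        ("District A", 100_000, 104_000, 5_000),
        ("District B", 80_000, 78_000, 4_000),
    ]

    for name, before, after, part_b in leas:
        change = after - before  # change in State/local spending
        # If State/local spending held steady or grew, part B money
        # supplemented existing effort; if it fell, part B dollars may
        # have supplanted State and local dollars instead.
        net_impact = change + part_b  # net change in eligible-area resources
        print(f"{name}: State/local change {change:+,}, "
              f"net eligible-area impact {net_impact:+,}")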

(g) Phase 2--Planning quality study.

(1) The objective of phase 2 is to assess the extent to which LEA part B applications reflect contemporary standards of planning and evaluation. The planning quality study focuses on whether LEA's identify their needs, develop programs which are coordinated with those needs and use part B funds to meet those needs. The study is mainly directed at determining the validity of the premise of the part B program that LEA's are best aware of their needs in the areas eligible for funding and therefore should be given complete discretion in determining how to spend the funds. The results of the study may lead to the advisory council's recommending changes in the part B legislation or regulations. In addition, the results of the study, combined with information on local district expenditure strategies, can be correlated with results of phases 3 and 4 of the evaluation plan. Essentially the question is whether local school districts that plan programs in a certain way and expend funds in a certain manner are more likely to implement programs which have greater educational impact on students.
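
The question posed at the end of this paragraph amounts to asking whether planning-quality ratings (phase 2) covary with project impact measures (phase 4). The sketch below shows one minimal way to quantify that relationship with a Pearson correlation; the district scores are hypothetical and the plan does not prescribe a particular statistic.

    # Sketch: relating phase 2 planning-quality scores to phase 4 impact
    # measures with a Pearson correlation. All scores are hypothetical.
    from statistics import correlation  # available in Python 3.10+

    planning_quality = [3.2, 4.1, 2.5, 4.8, 3.9]     # one rating per sample district
    project_impact = [0.40, 0.55, 0.20, 0.70, 0.50]  # e.g. normalized achievement gains

    r = correlation(planning_quality, project_impact)
    print(f"planning quality vs. project impact: r = {r:.2f}")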

(2) Finally, the results of the planning quality study will assist title IV of the Elementary and Secondary Education Act evaluators and the part B program administrators in identifying local districts with some capability in evaluation. These districts would then serve as prototype districts for developing a reasonable set of evaluation requirements for local districts under the part B program.

(h) Phase 3--Collection of data on part B project implementation. The objective of phase 3 is to determine if a district's ability to implement part B projects is related to how money is spent and how well programs are planned. Title IV of the Elementary and Secondary Education Act evaluators and part B program administrators have jointly determined the criteria by which project implementation will be assessed. Data collection instruments will be used to secure the necessary information. The results of these efforts will be reviewed for the sample districts in terms of results from phases 1 and 2 of the evaluation.

(i) Phase 4--Collection of data on impact of part B projects in LEA's. The objective of phase 4 is to determine if districts that are more successful in achieving part B objectives employ higher quality planning and use money differently than districts that are less successful in achieving part B objectives. Again, the title IV of the Elementary and Secondary Education Act evaluators and the title IV-B program office have determined the criteria for measuring project success and the process for collecting empirical data relating quality of planning to project impact. The ultimate aim of phase 4 is to identify factors associated with successful part B projects so program managers at the State and local levels can attempt to put these factors into place. Information from phase 4 activities will also serve as a major input into the advisory council's annual evaluation report to USOE on the overall impact of the title IV-B program.

(j) Part C--Evaluation strategy.

(1) Similar to the evaluation strategy for part B, the strategy for part C in fiscal year 1976 and fiscal year 1977 focused on the management processes of the program. In those fiscal years, a three-phase evaluation strategy was implemented which was designed to evaluate IV-C projects at three points--initial review and approval of projects, monitoring of project implementation and evaluation of effectiveness of projects. As a result of cooperative efforts between the evaluators and the IV-C program office, criteria and procedures for project review and approval were established, procedures and data collection instruments for project monitoring were developed and designs for evaluating project effectiveness were devised. The evaluation strategy for fiscal year 1978 will emphasize evaluating the progress being made by IV-C projects in each of the State priority areas in fulfilling the transferring success strategy of the program.

(2) The fiscal year 1978 part C evaluation strategy will assess projects in each of the three grant types--developer, validation and demonstration--both in terms of the effectiveness of the individual project and in relation to the progress that project has made in fulfilling the IV-C program strategy.
(i) Developer grants.
(a) Developer grants are awarded to LEA's for the development of new programs aimed at meeting needs common to many school districts in the State. Developer grants are multi-year, funded on a year-by-year basis, with a project generally operating for three years as a developer. Three different levels of evaluation will be addressed for developer grants under the fiscal year 1978 evaluation strategy.

(b) The first level of evaluation is judging the effectiveness of the IV-C project for the LEA. Effectiveness will be judged based on the degree of implementation of project activities and achievement of project objectives as planned in the project proposal. The source of information on implementation of project activities will be progress reports and follow-up site visits. The source of information on achievement of project objectives will be the annual evaluation report which is based on the proposed evaluation design. Project effectiveness will be assessed at the end of each project year for developer grants. The project effectiveness evaluation will be used in conjunction with the proposal for project continuation to determine whether or not a developer project will be funded for an additional year.

(c) The second level of evaluation for developer grants is assessing the impact of the project on the target population. Impact of the project will be judged in terms of changes in the behavior of target populations (e.g. gains in student achievement, changes in teacher performance, dollar savings attributed to the project). The source of this information on project impact is the annual evaluation report of the project. However, information on project impact will probably not be available at the end of the initial year of the project. Project impact evaluation will occur at the end of the second year of the project (these data will be used in making decisions on continuing to fund the project) and at the end of the third and final year of the project (these data will be used as an input to making decisions on validating the project).

(d) The third level of evaluation of developer grants is assessing the progress of that project in moving to the succeeding phases of the IV-C transferring success strategy. For each priority area, the percentage of projects being validated can be calculated. In addition, the cost-effectiveness of developer grants in each priority area can be determined by calculating the total investment of Federal (and possibly local) funds and comparing that investment to the number of projects receiving validation and eventually becoming demonstrators (e.g. cost/validation and cost/demonstration).
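
The figures named here are simple ratios computed per priority area: the share of developer projects reaching validation, and the total Federal (and possibly local) investment divided by the number of projects validated and by the number becoming demonstrators. The same ratios are applied to validation grants below. A minimal sketch follows; the priority areas, dollar totals and project counts are invented for illustration.

    # Sketch of the cost/validation and cost/demonstration ratios for
    # developer grants, aggregated by priority area. All figures are invented.

    priority_areas = {
        # area: (developer projects funded, total Federal investment,
        #        projects validated, projects becoming demonstrators)
        "reading": (10, 600_000, 4, 2),
        "mathematics": (8, 450_000, 3, 1),
    }

    for area, (funded, investment, validated, demos) in priority_areas.items():
        # assumes at least one validated project and one demonstrator per area
        print(f"{area}: {validated / funded:.0%} validated, "
              f"cost/validation = ${investment / validated:,.0f}, "
              f"cost/demonstration = ${investment / demos:,.0f}")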

(ii) Validation grants.
(a) Validation grants are given to LEA's to cover the costs of gathering data on an already existing program that is of high quality but lacks sufficient hard data for validation. Validation grants are designed to cover costs for only one year, after which a district is expected to submit its project for validation.

(b) The evaluation strategy for validation grants combines the project effectiveness level discussed under developer grants with assessing the progress of the project in moving to the next phase of the IV-C transferring success strategy. In terms of validation grants, the proof of project effectiveness is the project's ability to be validated. A validation grant is awarded to an LEA precisely because the district has shown that it lacks the information needed to have its project validated. Therefore, effective use of the funds should result in the project being validated. As was the case for developer grants, the cost-effectiveness of validation grants can be measured for a priority area based on the total investment of Federal (and possibly local) funds per project being validated (cost/validation) and eventually becoming a demonstrator (cost/demonstration).

(iii) Demonstration grants.
(a) Demonstration grants are given to districts with State and/or nationally validated programs to enable them to inform, assist and train potential adopters. Demonstration funds are granted on a year-to-year basis with a typical project operating for two years. Three different levels of evaluation will be addressed for demonstration grants under the fiscal year 1978 evaluation strategy.

(b) The first level of evaluation is a process evaluation of demonstration activities leading to approval of contracts with replicators. These demonstration activities include sending out first- and second-level awareness materials, contacting potential replicators through workshops and on-site visitations, selecting replicators and formulating a contract between the demonstrator and replicator for services, which is subject to the approval of the State Education Department. The source of information for the process evaluation will be the demonstrator's progress report, which documents implementation of the above activities. In addition, the costs (Federal and local) of the demonstration activities will be calculated, enabling evaluators to determine the cost per replication contract.

(c) The second level of evaluation for demonstration grants is also a process evaluation, but focuses on demonstration activities leading to installation of the exemplary project in the replicating district. These demonstration activities include training staff from the replicating district, providing assistance and advice to that staff and providing any other follow-up information or consultation requested by the replicating district once the project has been installed. The sources of information for the second level of evaluation will be the demonstrator's progress report, the annual evaluation report of demonstrators and possibly a progress report from replicators.

(d) The third level of evaluation for demonstration grants is based on the actual adoption/adaption of IV-C projects. Adoption/adaption will be measured in three ways. The first measure of adoption/adaption deals with quantity. For this measure the number of districts implementing replications will be calculated. In addition, the number of pupils served by districts replicating exemplary programs will be determined. The quantity measures will be aggregated by priority area. The sources of information for the quantity measure will be the demonstrator's progress reports and the annual evaluation report of demonstrators, as well as similar documents from replicators.
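
The quantity measures reduce to counting replicating districts and summing the pupils they serve, grouped by priority area. A minimal sketch follows, using hypothetical replication records.

    # Sketch of the quantity measures: number of replicating districts and
    # pupils served, aggregated by priority area. All records are hypothetical.
    from collections import defaultdict

    replications = [
        # (priority area, replicating district, pupils served)
        ("reading", "District A", 1_200),
        ("reading", "District B", 800),
        ("mathematics", "District C", 950),
    ]

    districts = defaultdict(set)
    pupils = defaultdict(int)
    for area, district, served in replications:
        districts[area].add(district)
        pupils[area] += served

    for area in districts:
        print(f"{area}: {len(districts[area])} replicating districts, "
              f"{pupils[area]:,} pupils served")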

(e) Another measure of adoption/adaption deals with the quality of the adoption/adaption. Quality will be assessed in terms of whether the key elements of the exemplary project are in place in the replicating district. The demonstrator will be asked to specify the key elements of the exemplary project which should be present in replicating districts. The source of information used to assess the quality of the adoption/adaption will be on-site visits to a sample of replicator districts.

(f) The third measure of adoption/adaption will address the results of adoption/adaption. Results will be addressed in terms of whether the replicating district has in fact achieved results with target populations (e.g. student achievement, changes in behavior of teachers, dollar savings) comparable to those of the demonstrating district. The source of information for the results measure will be the evaluation report from the replicator.

(g) All three of the measures comprising the third level of evaluation for demonstration grants can be compared to cost figures as a means of conducting cost-effectiveness analysis. Therefore, cost-effectiveness for demonstration grants will be based on cost/quantity (the number of replications), cost/quality (the number of replications containing the key elements of the original project) and cost/results (the results achieved by replicating districts).
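
Each of the three ratios divides the cost of the demonstration activities by one of the measures above; the counts would be drawn from the progress and evaluation reports identified earlier. A minimal sketch, with invented figures:

    # Sketch of the three cost-effectiveness ratios for a single
    # demonstration grant. All dollar figures and counts are hypothetical.

    demo_cost = 120_000          # Federal and local cost of demonstration activities
    replications = 8             # quantity: districts implementing replications
    faithful_replications = 6    # quality: replications with the key elements in place
    effective_replications = 5   # results: replications matching demonstrator outcomes

    print(f"cost/quantity: ${demo_cost / replications:,.0f} per replication")
    print(f"cost/quality: ${demo_cost / faithful_replications:,.0f} per faithful replication")
    print(f"cost/results: ${demo_cost / effective_replications:,.0f} per effective replication")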

(k) Part C--Strengthening evaluation.

(1) At the December 1976 meeting of the State advisory council, the committee on evaluation reviewed the evaluation reports from the title IV-C strengthening projects for inclusion in the title IV of the Elementary and Secondary Education Act annual report for fiscal year 1976. The committee expressed a need to improve the process for assessing the effectiveness of strengthening projects. Projects operating in fiscal year 1976 were required to complete a self-evaluation report presenting the achievements and accomplishments of the project in that year. This strategy was also planned for fiscal year 1977. While members of the committee found the self-evaluation reports to be informative, many members expressed a need to review some of the more than 25 strengthening projects in depth. The evaluation strategy for strengthening projects in fiscal year 1978 will include both the self-evaluation mechanism completed by project managers and in-depth evaluations of a limited number of projects conducted by staff from the Office for Program Analysis and Evaluation.

(2) During fiscal year 1978, the State advisory council will establish a set of priority projects for in-depth evaluation. These priorities will be reviewed by the Executive Deputy Commissioner of Education, who will make the final decision on which projects will be evaluated during the fiscal year. The number of evaluations will depend on available staff resources. The in-depth evaluations will be conducted using an evaluation model developed by the Office for Program Analysis and Evaluation for reviewing Education Department programs. The evaluation model was developed during 1974 and 1975. The first version of the model was field-tested in early 1975 and revised in May 1975. The model has subsequently been used in 1975 and 1976 to review programs in the State Education Department.

(3) The evaluation model is designed to evaluate both the managerial quality and the effectiveness of programs in the Education Department. Since most department programs do not directly serve clients (i.e. students), examining client outcomes would have only limited utility for such programs. Consequently, the model places heavy emphasis on reviewing a program from the point of view of whether or not the elements of good management can be found. These elements have been specified by the department and consist of such things as:
(i) a well-defined program direction including specification of client needs, objectives for clients, conditions required for clients to realize the objectives and activities that the program will undertake to establish the conditions;

(ii) consensus of staff concerning program direction;

(iii) evaluation which assesses the extent of activity implementation, condition establishment and the realization of objectives for clients.
