Moldovan National System of External Quality Assurance in Higher Education: Steps to Quality

...

Elena Petrov, Vice-President of ANACEC


1. Reform of the external quality assurance system

In 2018 the national system of external quality assurance in education of the Republic of Moldova underwent a number of changes. The reform, implemented in the context of the public administration transformation strategy for 2016-2020[1], combined organizational transformations of several entities with content-related changes. Thus, the National Agency for Quality Assurance in Professional Education (ANACIP), a relatively young entity established by Government Resolution in 2016 and responsible for quality assurance in higher, vocational and continuing education in the Republic of Moldova, was transformed in February 2018 into the National Agency for Quality Assurance in Education and Research, created by merging ANACIP, the National School Inspectorate and the National Council for Accreditation and Attestation[2].

Mission, functions and authorities of the reformed Agency

The National Agency for Quality Assurance in Education and Research (ANACEC) is the successor in rights and obligations of the merged legal entities. The range of the Agency's activities has thus been significantly expanded.

The mission of ANACEC is to implement state policy and to contribute to development oriented toward the highest international standards in its designated areas of competence.

To successfully implement its new mission, the Agency performs the functions established by the legislative and regulatory framework of the Republic of Moldova in two main areas:

(1) in the field of education including general, vocational, higher education and continuing vocational training;

(2) in the field of scientific research and innovations.

The Agency’s functions include the following:

• quality control and external quality evaluation across all levels of education, including general, vocational and higher education;

• external evaluation of continuing vocational training programs;

• external evaluation of research and innovation organizations;

• evaluation of research and academic staff.

The Agency's broadened powers[3] cover the following:

1) quality assurance in general education through external evaluation and accreditation of general education institutions and through evaluation of curricula and of teaching, auxiliary teaching and managerial staff in the general education system;

2) quality assurance in vocational education, in particular, through external evaluation of programs and institutions, as well as evaluation of teaching, auxiliary teaching and managerial staff in vocational education, in accordance with the established standards and developed methodologies;

3) quality assurance in higher education through external evaluation for authorization for temporary operation or accreditation of study programs and higher education institutions;

4) evaluation of continuing vocational training programs;

5) evaluation of research and innovation organizations as well as research and academic staff.

In its new format, the Agency is a state-funded administrative body subordinate to the Ministry of Education, Culture and Research.

Organizational structure

The organizational structure of the Agency has undergone only partial changes. The Agency's administrative apparatus was supplemented with new subdivisions responsible for quality assurance in general education and in the field of research and innovations. The collective governing body that ensures the development and implementation of the Agency's strategy is still the Governing Board, consisting of 15 people: 13 teaching staff representatives with academic titles and degrees, one students' representative and one business representative. Moreover, the responsibilities and duties of ANACIP's Governing Board were retained until the expiration of its mandate in summer 2019. Relevant specialized committees are an important element of the Agency's structure. They will be established in accordance with the rules set by the Regulation on specialized committees, the draft of which has been developed and is to be approved by the Government.

The competition-based approach to selecting the Governing Board members has been retained. The basic requirements for the competition are openness and the participation of an international jury: the competition committee must comprise five international experts delegated by quality assurance bodies from European Union countries. A limiting rule applies whereby the same person may stand for election as a Governing Board member for no more than two successive mandates.

New elements include:

• organization of the competition by the Ministry of Education, Culture and Research;

• approval of the Governing Board's members by the Ministerial Order;

• appointment by the Government of the person recognized as the winner of the competition to the position of Chairperson;

• appointment by the Ministry of Education, Culture and Research of the persons recognized as the winners of the competition to the vacant positions of Deputy Chairperson and Secretary General.

Basic arguments in support of the reform:

1. Concentration of external quality assurance tasks across all levels of education and in the field of research and innovations in the hands of a single nationwide specialized entity, in order to ensure consistency, conformity and continuity of processes.

2. Consolidation and optimization of all available resources in the field of external quality evaluation.

3. Optimization of the entity's staff, with a subsequent increase in motivation, in particular through a salary increase.

2. Regulatory and methodological support of external quality evaluation processes

Delegation of additional functions to the Agency in the field of quality assurance at the general education level, as well as in the field of research and innovations, made the provision of a regulatory and methodological framework for these processes a vital task. Thus, a priority area for ANACEC is the development and submission to the relevant Ministry, for approval or for forwarding to the Government for approval, of a number of documents including:

• methodologies required for quality assurance function in general education;

• methodology of comprehensive evaluation of the potential of organizations in the field of research and innovations, as well as of their research and academic staff;

• methodologies to confirm academic degrees (Doctor and Doctor Habilitat) and academic ranks (Conferentiar/Associate Professor and Professor).

Draft documents have been developed in all of the listed areas, involving all concerned players. The final documents will be completed taking into account well-reasoned proposals, approved in accordance with the established procedure and subsequently put into practice.

At this stage, it was decided not to modify the current methodology of external quality evaluation for authorization for temporary operation and accreditation of educational programs and institutions of vocational, higher and continuing education[4]. From the Agency's perspective, this will allow the process of external quality evaluation of higher education First Cycle programs to be completed under identical conditions, applying the same standards and tools and thus allowing objective comparison.

3. External quality evaluation of higher education programs in figures

Between summer 2016 and December 2018 the Agency performed external evaluation of over 300 university-level educational programs, including:

• 21 programs for authorization for temporary operation;

• 283 programs of First Cycle for accreditation.

Sixty external evaluation committees were involved in the implementation of this complex process. For this purpose, 246 persons were identified and trained in the implementation of external evaluation of university programs, including:

– 129 teaching staff representatives;

– 72 employers' representatives;

– 45 students.

The external evaluation committees also included nine external experts recommended by the Romanian Agency for Quality Assurance in Higher Education (ARACIS).

By the order of the relevant Ministry, external evaluation of First Cycle programs in Pedagogical sciences was organized between October 2016 and January 2017. A total of 105 programs implemented by eight HEIs were evaluated. In accordance with the current Regulation, the Governing Board's decisions on these programs were distributed as shown in Diagram 1.

In spring 2017 an external evaluation of First Cycle university programs in Economic sciences was carried out. A total of 53 programs functioning at 14 HEIs were evaluated. The decisions of the Governing Board of the Agency were distributed as shown in Diagram 2.

Afterwards, external evaluation covered the programs in some other study fields including:

• Natural sciences

• Public services

• Exact sciences

• Humanities

• Engineering

• Social and behavioral sciences

• Political sciences

• Information and communication technologies

• Manufacturing and processing technologies

• Healthcare, etc.

The Governing Board's decisions on these programs were distributed as shown in Diagram 3.

Thus, over the past period the process of external evaluation of study programs has covered 23 HEIs including 17 public and six private universities.

The number of evaluated programs significantly varies from HEI to HEI and looks as shown in Table 1.

4. Methodological elements for analyzing the influence of the external evaluation process

Certainly, improvement of the external evaluation process is a permanent and highly important task of the Agency. To achieve this, we apply various methodological elements that fulfill different functions, in particular:

(1) To analyze the goals, processes, structure, preferences, actions and institutional and program changes, the following is predominantly used:

– document analysis and direct observations;

– standardized questionnaires for different target groups (students, teaching staff responsible for quality at the level of university/department, managerial staff of HEIs/ departments, the Agency’s experts, etc.);

(2) To analyze causal mechanisms, structured interviews with key players are used.

5. Survey of external evaluation process participants

In order to survey the opinions of the process participants, an online questionnaire was developed and used. It contained both general sections and special sections for certain categories of respondents. There were two categories of survey respondents:

• university representatives including:

– senior management representatives (rectors, vice-rectors, deans, heads of departments),

– persons responsible for quality assurance at the university level,

– program coordinators,

– teaching staff;

• Agency's experts.

As a result, in spring 2018, this questionnaire was filled in by 74 university representatives and 101 external experts who served as evaluation committee members. In the second round, in fall 2018, 28 university representatives and 48 experts participated in the survey. It is important to emphasize that about two thirds of the survey respondents were involved in two or more external evaluation processes and thus accumulated considerable experience in external quality evaluation.

The questionnaire includes several question pools. For the first category of respondents, university representatives, the first pool contains questions relating to the visit to a university and the opinion of the external evaluation committees formed during the visit, in particular regarding:

• composition of external evaluation committees, their competence level for implementing external evaluation and quality of their performance of the set tasks;

• the quality of the interviews with target groups during the visit to a university and the transparency of the committees in helping/encouraging participants to express their own point of view.

Thus, 60% of the first survey participants and 74% of the second survey participants rated their satisfaction with the external evaluation committee members at level 3 or 4 out of four possible levels. The percentage of respondents least satisfied with the results of the discussions decreased from 14% in the first survey to 5% in the second.

The percentage of participants who considered that they had an opportunity to fully or partly express their own views during the interviews organized by the evaluation committees was 92% in the first round and 97% in the second round (Diagrams 4, 5).

It must be emphasized that the respondents noted the importance of continuously replenishing the register of the Agency's external experts with new nominees who combine professionalism, competence, integrity and an appropriate ethical profile. At the same time, most of them placed emphasis on the importance of:

(1) testing experts' professional skills and external evaluation skills;

(2) identification and prevention of the conflict of interest between the external evaluation committee members and the university involved in the external evaluation;

(3) organization by the Agency of ongoing stage-by-stage training, education and attestation of external evaluation committee members/experts;

(4) searching for and inviting experts from abroad to join external evaluation committees;

(5) an increase in the time allowed for the documentation visit to the university up to 2-3 days.

Some of the comments were also gratifying: "I was pleasantly surprised by the external evaluation committee's attitude. Frankly speaking, I was expecting a public execution, but in fact this was a fair evaluation procedure and a collegial attitude".

The second question pool is about the applicable external evaluation procedure and quality standards, criteria and indicators used during the process.

On this subject, 90% and 65% of respondents, respectively, rated the availability and their understanding of the external evaluation procedure at the time of the expert committee's visit to the university as above average (Diagrams 6, 7).

At the same time, according to the respondents, external evaluation contributed to the increase in the level of understanding of evaluation procedures, standards, criteria and indicators. Thus, in the second survey all respondents partly or fully acknowledged this fact (Diagrams 8, 9).

The respondents also expressed their opinion on how the applied standards and indicators relate to the assurance/improvement of the quality of the educational process and of the management of the educational institution.

The respondents formulated a number of proposals speaking in favor of:

• simplification and reduction of the number of evaluation standards, in particular of the minimal standards;

• avoiding duplication of certain indicators across different standards;

• a more detailed description of certain standards (for example, mobility or internationalization standards, etc.);

• indicator weight redistribution within the standards;

• review of indicators relating to:

– the level of university graduates' employment;

– students' academic mobility;

– enrolment of people with disabilities/special educational needs in university programs, etc.

According to a number of survey participants, some performance indicators are highly ambitious and, as things stand now, far removed from the current realities of the Republic of Moldova.

The third question pool aims to determine the level of the process beneficiaries' satisfaction with communication with the Agency and with the support provided by its employees, in particular committee coordinators, at different stages, including preparation of the self-evaluation report, organization of the expert committee's visit to the university, etc. Thus, the percentage of respondents,

• whose level of satisfaction with cooperation with coordinators of external evaluation committees was "above the average", was 86% in the first survey and 75% in the second survey;

• whose level of satisfaction with communication with the Agency's management was "above the average", was 92% in the first survey and 96% in the second survey, respectively.

The same section asks what respondents think about the level of objectivity and transparency in the discussion and adoption of decisions on the results of external evaluation by the Agency's Governing Board. A negative trend could be observed here, as the percentage of persons fully or partly sharing this view decreased from 100% in the first survey to 84% in the second. It should be noted, however, that the first questionnaire offered only two answer options, "I fully agree" and "I partly agree", while the second survey added two further options, "I fully disagree" and "I cannot express my opinion", which expanded the range of choices.

The fourth question pool aims to identify the level of influence of the external evaluation process on the improvement of the quality of programs and of the educational process, and to find out whether universities took corrective or preventive measures in the post-evaluation period. According to the survey results:

• 91% of respondents in the first round and 96% in the second round fully or generally agree that external evaluation influenced the improvement of the educational process quality as part of First Cycle educational programs undergoing external evaluation (Diagrams 10, 11);

• 94% of respondents in the first round and 93% in the second round consider that external evaluation processes contributed to the subsequent introduction of certain corrective measures by the university (Diagrams 12, 13).

The respondents noted a number of positive aspects, including the following:

• external evaluation contributes to sharing of experience and stimulates this process;

• external evaluation contributes to the creation and strengthening of lasting connections and partnerships between all parties involved.

A particular target group in the survey was the Agency's experts involved in the external evaluation of university programs as members of external evaluation committees. Of the total number of respondents who filled in the questionnaires, 67% in the first round and 71% in the second were representatives of the academic community (teachers); 23% and 12% were employers' representatives; and 10% and 17%, respectively, were students' representatives.

The questions formulated during the survey of experts' opinions covered the following:

– the experts' competence level;

– understanding of the applicable external evaluation procedure and quality standards, criteria and indicators used during the process;

– experts' communication with university representatives;

– experts' communication with the Agency’s representatives;

– influence of the external evaluation process on programs/HEI being evaluated.

Below are some results of the experts' survey in figures:

1. 65% in the first survey and 90% in the second survey noted a high level of satisfaction (3 and 4 out of four possible levels) with seminars organized at the stage of preparation for the visit to a university.

2. All survey participants confirmed that they had an opportunity to ask a question to the coordinator, the Agency’s representative and get an answer at all stages of the external evaluation process.

3. 88% in the first survey and 81% in the second survey fully confirmed that participation in the external evaluation process contributed to their professional growth and, in particular, to an increase in their understanding of the external evaluation procedure and of the content of the standards, performance criteria and indicators applied. The remaining 12% and 19%, respectively, partly agreed with this.

4. 98% and 99% of the survey participants, respectively, confirmed that communication with university representatives was constructive and open.

5. Most of the survey participants fully or partly confirm that the information obtained from the self-evaluation report, as well as the data recorded in the Visit report during the visit to a university, allowed them to form an objective opinion on the evaluated program.

6. 92% of respondents in the first round and, respectively, 77% in the second round noted that the external evaluation visit to a university was coordinated in advance and effectively organized.

7. About 16% of respondents in the first round (22% in the second) consider that one working day is not enough for an objective analysis of the educational program during the visit to a university.

8. The percentage of respondents who rated the level of support provided by the Agency during preparation of the external evaluation report and throughout the external evaluation process as high (4 or 5 out of five possible levels) increased from 60% to 97%.

9. The level of experts' satisfaction with the cooperation within external evaluation committees increased from 36% to 76%.

The Agency's experts also formulated a number of proposals to improve the processes. Thus, in the context of identifying experts for external evaluation of study programs/educational institutions and of their continuing training, the respondents noted, in particular, that the following is necessary:

• Inclusion of responsible and demanding professionals in external evaluation committees;

• Organization of more workshops/seminars on external quality evaluation topics;

• A more detailed explanation of performance indicators, evaluation criteria and standards;

• Use of the analysis based on thematic studies/case studies for illustrative purposes during seminars;

• Modeling/simulation of the documentation visit to a university;

• Organization of targeted discussions and sharing of experience between experts following external evaluation;

• Analysis of the best identified quality assurance practices at the national and international levels;

• Certification of trained/qualified external evaluation experts.

The experts, who filled in the questionnaire, also made a number of proposals to improve the standards and performance indicators applied. In particular, the respondents noted that it is necessary to:

• shift emphasis from performance indicators to compliance indicators;

• avoid duplication/redundancy of certain information as part of several performance indicators;

• put more focus on the criteria related to the development of competencies;

• consider and describe general standards relating to the university as a whole rather than to each program, if a number of study programs undergo simultaneous external evaluation;

• simplify the visit report format, etc.

To make the evaluation visit to a university more effective, the following was suggested:

• to increase the time allowed for the visit;

• to make sure that all members of the external evaluation committee are able to participate fully in the committee's work throughout the process;

• to require HEIs to include all necessary supporting documents in the attachment to the self-evaluation report;

• to make focused efforts to enhance meaningfulness and quality of self-evaluation reports, developed by HEIs, as a core document in the external evaluation process.

It is encouraging that many survey participants noted significant progress in a number of areas during the Agency’s activities, in particular:

• growing interest of HEIs in external evaluation issues;

• increasing quality of self-evaluation reports;

• proper organization and coordination of expert committees' activity;

• creation of real conditions for the objective and transparent process of external evaluation.

At the same time, experience accumulated by the Agency also identified a number of challenges relating to external evaluation committees (Table 2).

All the aspects listed show that the Agency's expert training programs need to be strengthened and consolidated.

Certainly, for more in-depth analysis of the influence of external evaluation processes, we will have to introduce additional methodological elements, in particular:

• before and after comparison design, which allows us to analyze how and when a certain effect is achieved;

• formulation and analysis of hypotheses about causal mechanisms that allow us to analyze how effects are achieved.

We still have work to do in this area.

6. General conclusions and reservations. Lessons learned

ANACEC (ANACIP)'s hands-on experience in external evaluation of the quality of university-level study programs allows us to draw a number of conclusions, in particular:

• The Moldovan HEIs have established special institutional structures for internal quality assurance;

• The HEIs have identified and have been introducing specific quality assurance procedures;

• Students are involved in quality assurance processes.

At the same time, a number of reservations can also be noted. Thus:

– Most HEIs are still at the stage of transition from a focus on quality assurance to the development of a genuine quality culture;

– Continuous and progressive quality improvement has not yet reached the desired level;

– ESG used by ANACIP/ANACEC are already well-known in universities/in the academic community, but less known among students;

– The level of students'/beneficiaries' participation in internal and external quality assurance processes, and the influence of this participation, vary considerably.

The results of approximately two years of ANACEC (ANACIP)'s activity were discussed at the international seminar "Assurance of university education quality: from challenges to prospects"[5], organized by the Agency on October 25, 2018. The seminar brought together representatives of the relevant Ministry and of the HEIs' academic community.

During the seminar the Agency staff informed the participants on:

• achievements in external quality evaluation at the higher education level;

• the problems and challenges in this process;

• the on-line questionnaire survey results and perception of the process of external quality evaluation by ANACEC (ANACIP)'s beneficiaries.

Our guests Iordan Petrescu, President of the Romanian Agency for Quality Assurance in Higher Education (ARACIS) and Emilia Gogu, expert of the Agency, shared their experience in external quality assurance at the university level.

The seminar participants discussed a topical question: how to increase the attractiveness of Moldovan HEIs and their educational programs at the national and international levels. Everybody agreed that this can be achieved, in particular, by improving the quality of the programs currently offered, as well as the quality of the educational services provided by HEIs. The Agency, as the national entity responsible for external quality assurance, plays a significant role in addressing this vital task.

The seminar also summarized the lessons learned over the period of the Agency's performance, which materialized in a whole range of questions still to be answered, in particular:

• How do external evaluation and the Agency's final decisions influence higher education components, including key players?

• To what extent is the national quality assurance system, with the particular standards promoted by ANACEC, able to provide real support to HEIs in quality assurance and in the development of a genuine quality culture?

• How does the national system of external quality assurance influence the quality of learning, teaching and evaluation processes?

• Which accreditation standards applied during the external evaluation process achieved the desired effect? How did they influence the study programs that underwent external evaluation?

• Which accreditation standards applied during the external evaluation process did not achieve the expected effect, and why?

• Did the current quality assurance system help HEIs to increase the number of students and graduates?

• Did the current quality assurance system help HEIs to develop students' competencies sought after on the labor market?

• To what extent has the national quality assurance system been contributing to the growth of students' academic mobility and to the development of the internationalization process in Moldovan higher education?

• To what extent do the results of external evaluation of programs/HEIs influence students' choice of mobility path?

7. External quality evaluation: tasks for the future

The Ministry of Education, Culture and Research entrusted the Agency with the task of completing external evaluation of all First Cycle (Bachelor's degree) programs in 2018 and then fully covering Second Cycle (Master's degree) programs in 2019 and Third Cycle (Doctoral) programs in 2020. The task appears rather complicated, given that at the moment there is no clear information on the number of active programs, especially Master's programs, offered by Moldovan HEIs. Concrete data on this issue would allow the external evaluation process to be planned more effectively.

It should be noted that the number of the Agency's employees responsible for higher education has been reduced in the new format. ANACEC now has one subdivision, the Directorate of Evaluation in Higher Education, consisting of five people (at the moment two positions are vacant); for comparison, ANACIP comprised two subdivisions with eight specialists. At the same time, the Agency faces difficulties in organizing competitions to recruit new employees and in finding highly qualified specialists. Thus, about ten months after its establishment, only 65% of the Agency's positions are filled.

Nevertheless, the Agency is taking specific measures to assure the quality of its own performance.

Taking into account the lessons learned from its experience and the questions still awaiting a concrete answer, the Agency has defined a number of priority tasks for the foreseeable future, covering various areas of activity.

The most important tasks are presented in Figure 1.

To solve the listed tasks, we intend to:

• continuously improve external evaluation procedures;

• regularly consult all parties concerned;

• consolidate the Agency's expert database through its continuous replenishment, as well as through systematic training of external evaluation experts;

• include experts recommended by foreign quality assurance agencies in all external evaluation committees;

• identify and disseminate advanced quality assurance practices developed and introduced at both the national and European levels;

• achieve international recognition of the Moldovan system of external quality assurance in higher education through ENQA’s external evaluation;

• enhance international visibility and importance of ANACEC.

And, though the solution of these tasks is a complicated and lengthy process, we act according to the motto "Keep calm and carry on."

 



 

[1] Public administration transformation strategy for 2016-2020. Government Decree no. 911 of July 25, 2016. www.lex.justice.md

[2] Code of Education no. 152; Regulation on the organization and operation of the National Agency for Quality Assurance in Education and Research. Government Decree no. 201 of February 28, 2018. www.lex.justice.md

[3] Ibidem.

[4] Government Decree no. 616 of May 18, 2016. www.lex.justice.md

[5] See https://www.facebook.com/www.anacip.md/
