Monitoring universities when there will be results.  The results of monitoring Russian universities from the Ministry of Education and Science


The Ministry of Education and Science of the Russian Federation has published a report on the monitoring of Russian universities in terms of the effectiveness of their activities.

Monitoring results are presented for 502 state universities and 930 university branches.

Earlier Deputy Minister of Education and Science of the Russian Federation Alexander Klimov said: "If a university or branch is included in the list of universities with signs of inefficiency, this does not mean that it will be closed. There are many options for solving the problem: from strengthening to joining another university."

Download the full list of efficient and inefficient universities in all regions of Russia:

Key indicators for evaluating the effectiveness of universities:

1. Educational activities: the average USE (Unified State Exam) score of students admitted to full-time bachelor's and specialist programs, whether funded from the budgets of the budgetary system of the Russian Federation or paid for by individuals and legal entities (weighted average);

2. Research activities: the volume of R&D per one faculty member;

3. International activities: the share of foreign students who have completed basic educational programs of higher professional education in the total number of graduates (adjusted contingent);

4. Financial and economic activity: university income from all sources per one faculty member;

5. Infrastructure: the total area of educational and laboratory buildings per student (reduced contingent), owned by the university or assigned to it under the right of operational management.

Key indicators for evaluating the effectiveness of the activities of branches (in addition to the five indicators for evaluating universities):

6. The reduced contingent (enrollment adjusted for the form of study);

7. The share of candidates and doctors of sciences among the teaching staff (excluding external part-timers and those working under civil-law contracts);

8. The share of full-time teaching staff (excluding external part-timers and those working under civil-law contracts) in the total number of teaching staff.

"For the first time we have conducted a full-scale diagnosis of the quality of higher education. Nothing like this has happened before. It is important that all universities were assessed according to uniform and understandable criteria. Now we have a complete set of data on the quality of education in each branch, university, region. These data should become a signal for further work," said Dmitry Livanov, Minister of Education and Science of the Russian Federation.

Universities (Moscow)

1. State Musical and Pedagogical Institute named after M.M. Ippolitova-Ivanova
2. State Specialized Institute of Arts
3. State University for Land Management
4. State University of Management
5. Literary Institute named after A.M. Gorky
6. Moscow State Academy of Water Transport
7. Moscow Architectural Institute (State Academy)
8. Moscow State Agroengineering University named after V.P. Goryachkin
9. Moscow State Evening Metallurgical Institute
10. Moscow State Humanitarian and Economic Institute
11. Moscow State Open University
12. Moscow State Technical University "MAMI"
13. Moscow State University of Design and Technology
14. Moscow State University of Environmental Engineering
15. Moscow State University of Technology and Management
16. Moscow Pedagogical State University
17. Russian State University for the Humanities
18. Russian State Social University
19. Russian State Trade and Economic University
20. Federal State Budgetary Educational Institution of Higher Professional Education "Moscow State University of Printing Arts named after Ivan Fedorov"

Branches

1. Dmitrovsky branch of the Federal State Budgetary Educational Institution of Higher Professional Education "Moscow State Agroengineering University named after V.P. Goryachkin"
2. Moscow Film and Video Institute (branch) of the Federal State Budgetary Educational Institution of Higher Professional Education "St. Petersburg State University of Film and Television"


Universities (St. Petersburg)

1. State Polar Academy
2. St. Petersburg State Academy of Veterinary Medicine
3. St. Petersburg State Academy of Theater Arts
4. St. Petersburg State University of Architecture and Civil Engineering
5. St. Petersburg State University of Engineering and Economics
6. St. Petersburg State University of Water Communications
7. St. Petersburg State University of Cinema and Television
8. St. Petersburg State University of Culture and Arts
9. St. Petersburg State University of Service and Economics
10. St. Petersburg State University of Technology and Design

Branches

1. branch of the federal state budgetary educational institution of higher professional education "Russian State University for the Humanities" in St. Petersburg

The General Director of the Research Institute for Monitoring the Quality of Education, Doctor of Technical Sciences, Professor Vladimir NAVODNOV answered the questions of the journal "Accreditation in Education"; the full interview is given below.

Please note: in the results of the 2019 performance monitoring, the indicator "Salary of teaching staff" is not evaluated, and data on the indicator "Employment of graduates" have not been published. The rating is therefore built on 5 target indicators.

Universities are sorted in the following order: by league, then by index J, then by university name in alphabetical order.

For each university performance monitoring indicator (taking its specifics into account), with the exception of "Salary of teaching staff" and "Employment of graduates", values are ranked in descending order.

There are 5 areas (A, B, C, D, E). Each area is assigned a weight:

  • Area A - the value of the indicator is above the value of the 1st quartile. Weight +5
  • Area B - the value of the indicator is above the threshold and above the median, but is not included in area A. Weight +3
  • Area C - the value of the indicator is above the threshold, but is not included in area A and B. Weight +1
  • Area D - the value of the indicator is below the threshold, but above the 3rd quartile. Weight 0
  • Area E - the value of the indicator is below the threshold and is not included in area D. Weight -1

Based on which areas the indicator values fall into, the J index is calculated as the sum of the corresponding area weights.
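The weighting scheme can be sketched in a few lines. This is a minimal illustration; the function name and structure are mine, not taken from the monitoring software:

```python
# Area weights as listed above: A=+5, B=+3, C=+1, D=0, E=-1.
AREA_WEIGHTS = {"A": 5, "B": 3, "C": 1, "D": 0, "E": -1}

def j_index(areas):
    """Sum the weights of the areas the five indicator values fell into."""
    return sum(AREA_WEIGHTS[a] for a in areas)

# All five target indicators in area A give the maximum J = 25.
print(j_index(["A", "A", "A", "A", "A"]))  # 25
print(j_index(["A", "A", "B", "A", "A"]))  # 23
```

With five indicators, J ranges from -5 (all values in area E) to 25 (all in area A), which matches the League 1 criterion of J = 25 below.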

In the league table, one icon marks indicator values above the threshold (and another marks values below it); a separate icon marks universities that have fulfilled 3 or more indicators.

There are 10 leagues in total:

  • League 1: universities with J = 25, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 2: universities with 22 ≤ J ≤ 24, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 3: universities with 19 ≤ J ≤ 21, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 4: universities with 15 ≤ J ≤ 18, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 5: universities with 11 ≤ J ≤ 14, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 6: universities with 7 ≤ J ≤ 10, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 7: universities with 1 ≤ J ≤ 6, the indicator "Salary of teaching staff" fulfilled, and the number of completed indicators ≥ 3
  • League 8: universities with 3 ≤ J ≤ 9 and the number of completed indicators < 3
  • League 9: universities with 0 ≤ J ≤ 2 and the number of completed indicators < 3
  • League 10: universities with J ≤ -1 and the number of completed indicators < 3

Reference:
The median is the midpoint of the distribution: half of the observations are above it and the other half are below it (the median of numbers 3, 4, 5, 6, and 102 is 5).
When there is an even number of observations, the median is the midpoint between the two middle observations.
The distribution can also be divided into quarters, called quartiles: the first quartile contains the bottom 25% of observations, the second the next 25%, and so on.
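The reference example can be checked with Python's standard library; the choice of interpolation method below is mine, since quartile conventions vary:

```python
import statistics

data = [3, 4, 5, 6, 102]

# Median: the middle of the ordered observations.
print(statistics.median(data))  # 5

# Quartile cut points with the "inclusive" method, which interpolates
# within the observed range: lower quartile, median, upper quartile.
print(statistics.quantiles(data, n=4, method="inclusive"))  # [4.0, 5.0, 6.0]
```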

League | University/branch | Fulfilled ≥ 3 indicators | J | Educational activity | Research activity | International activity / Reduced contingent | Financial and economic activity | Salary of teaching staff | Additional indicator
1 | State Institute of the Russian Language named after A.S. Pushkin | Yes | 25 | 89.45 A | 466.63 A | 28.25 A | 8221.51 A | 184.24 | 11.32 A
1 | Moscow Polytechnic University | Yes | 25 | 69.51 A | 412.42 A | 17.74 A | 3834.79 A | 204.82 | 4.53 A
1 | Moscow Institute of Physics and Technology (National Research University) | Yes | 25 | 94.56 A | 4061.84 A | 11.00 A | 8767.60 A | 244.94 | 6.70 A
1 | National Research Tomsk State University | Yes | 25 | 76.23 A | 1694.19 A | 20.74 A | 5485.34 A | 239.66 | 5.02 A
1 | National Research Tomsk Polytechnic University | Yes | 25 | 77.58 A | 1434.51 A | 27.92 A | 3969.73 A | 218.50 | 7.36 A
1 | National Research Technological University "MISiS" | Yes | 25 | 73.76 A | 2463.22 A | 26.14 A | 11304.18 A | 211.19 | 5.25 A
1 | National Research Nuclear University "MEPhI" | Yes | 25 | 89.40 A | 3187.97 A | 21.83 A | 9751.86 A | 282.76 | 7.72 A
1 | First St. Petersburg State Medical University named after Academician I.P. Pavlov | Yes | 25 | 79.44 A | 271.30 A | 12.99 A | 7334.07 A | 201.28 | 70.26 A
1 | Privolzhsky Research Medical University | Yes | 25 | 70.98 A | 311.93 A | 15.77 A | 5453.38 A | 199.19 | 69.13 A
1 | Peoples' Friendship University of Russia | Yes | 25 | 68.72 A | 302.66 A | 28.49 A | 6835.70 A | 225.23 | 5.03 A
1 | Saint Petersburg State University | Yes | 25 | 86.91 A | 603.40 A | 13.87 A | 4236.28 A | 194.32 | 15.21 A
1 | Peter the Great St. Petersburg Polytechnic University | Yes | 25 | 75.89 A | 1480.98 A | 15.38 A | 5668.39 A | 244.76 | 4.53 A
1 | Branch of the "Russian Customs Academy" in Vladivostok | Yes | 25 | 67.87 A | 304.62 A | 658.10 A | 4194.62 A | 309.04 | 5.10 A
2 | State University for Land Management | Yes | 23 | 70.04 A | 393.80 A | 6.72 B | 3914.61 A | 204.77 | 5.25 A
2 | Far Eastern Federal University | Yes | 23 | 68.98 A | 447.90 A | 9.52 B | 6341.82 A | 202.54 | 4.58 A
2 | Kazan (Volga Region) Federal University | Yes | 23 | 71.81 A | 615.50 A | 15.71 A | 3298.97 B | 223.91 | 5.01 A
2 | Kazan National Research Technical University named after A.N. Tupolev-KAI | Yes | 23 | 70.49 A | 1030.71 A | 6.01 B | 4068.06 A | 231.82 | 4.67 A
2 | Moscow State Academy of Veterinary Medicine and Biotechnology - MBA named after K.I. Scriabin | Yes | 23 | 71.06 A | 285.18 A | 5.52 B | 4052.62 A | 225.57 | 89.97 A
2 | Moscow Aviation Institute (National Research University) | Yes | 23 | 72.25 A | 1141.22 A | 5.76 B | 4516.10 A | 206.13 | 5.51 A
2 | Moscow State Institute of International Relations (University) of the Ministry of Foreign Affairs of the Russian Federation | Yes | 23 | 88.48 A | 164.51 B | 13.18 A | 3937.38 A | 205.68 | 7.52 A

We invite you to familiarize yourself with the results of the regular monitoring of the effectiveness of Russian universities, which was conducted by the Ministry of Education and Science on the basis of data provided by educational organizations.

Monitoring of the education system is a universal means of control, systematization, and constructive development of one of the most important sectors of society and the state.

In addition, the results of the research allow future students to judge the quality of education in a particular university, its prestige and the possibility of obtaining the best knowledge that will really become useful in their professional activities.


Why is monitoring needed?

The main goal of regular monitoring of the effectiveness of universities is development and improvement: raising education to a new, higher level.

Monitoring studies show:

  • the quality of the work of the teaching staff, and the degree to which students assimilate the program material;
  • the consistency of the goals and objectives of training with the methods of presenting the material. Technical teaching aids and knowledge control significantly simplify the educational process and help make rational use of the study time of students and teachers;
  • the structures and forms that make it possible to acquire knowledge under optimal conditions. Monitoring statistics reflect the number of specialized universities in each region, full-time and correspondence forms of education, and the material conditions created for independent educational and scientific activity;
  • the effectiveness of the educational process, reflected in data on the employment of graduates in their specialty.

Monitoring includes a number of other assessment items related to control and educational process management. The regulation of budgetary funds allocated to improve the quality of higher education is also based on the results of research.

It should be noted that compared to last year's research results, the number of universities that improved their results in four or more indicators increased by 2.5 times. That is, monitoring proves its effectiveness and positive impact on improving the quality of education.


Key performance indicators of universities

The 2017 study involved 769 universities and 692 branches of educational organizations of various forms of ownership (state, municipal and private).

The effectiveness of universities was evaluated on the basis of indicators characterizing:

  • Educational activity - average USE score;
  • Research activity - the volume of research and development work per employee;
  • International activities - the percentage of foreign students to the total number of students;
  • Financial and economic activity - income of an educational organization per employee;
  • Salary of the teaching staff - the percentage ratio of the salary of employees to the average salary in the region;
  • Employment - the percentage of graduates who were employed in the year following graduation, to the total number of graduates;
  • Additional indicators - the share of student-athletes, the share of employees with state awards, the share of students in advanced training and professional retraining programs, etc.

Main indicators of monitoring by districts

Universities of all sectors, located in eight federal districts and almost all regional centers of the Russian Federation, took part in the latest monitoring.

Central District

438 higher education organizations (including 156 branches) took part in the monitoring. 49 universities were able to achieve the required values ​​for all 7 indicators, including:

Northwestern District

152 higher education organizations (including 60 branches) took part in the monitoring. 20 universities were able to achieve the required values ​​for all 7 indicators, including:

Privolzhsky District

273 higher education organizations (including 155 branches) took part in the monitoring. 40 universities were able to achieve the required values ​​for all 7 indicators, including:

Southern District

151 higher education organizations (including 92 branches) took part in the monitoring. 16 universities were able to achieve the required values ​​for all 7 indicators, including:

North Caucasian District

95 organizations of higher education (including 50 branches) took part in the monitoring. Only 2 universities were able to achieve the required values ​​for all 7 indicators, namely:

  • Vladikavkaz branch of the Financial University under the Government of the Russian Federation.

Ural District

112 higher education organizations (including 59 branches) took part in the monitoring. 12 universities were able to achieve the required values ​​for all 7 indicators, including.

Compiled based on the results of efficiency monitoring conducted by the Ministry of Education and Science of the Russian Federation. The team of researchers who created the rating continued their work. The methodology was improved, and the number of leagues increased from 7 to 10. But first things first... General Director of the Research Institute for Monitoring the Quality of Education, Doctor of Technical Sciences, Professor Vladimir NAVODNOV answered our questions.

Vladimir Grigoryevich, what conclusions did you come to when analyzing the results of monitoring the effectiveness of universities in 2017?

The conclusions, as they say, are disappointing.

First, there are unfavorable statistics. This year the monitoring was carried out for the fifth time. During this time, from 2013 to 2017, the Russian higher education system lost about a thousand universities and branches. It turns out, excluding weekends and holidays, on average, every working day, Rosobrnadzor closed one educational organization. There has never been such a process in history. I must say, not only in Russia, but nowhere at all.

Secondly, "information noise" (a huge amount of data), constant changes in the rules of the game and calculation methods do not allow educational organizations to quickly model and predict their performance.

- In this case, does it make sense to make any predictions?

I still think yes. You have to be at least somehow mentally prepared for a not very predictable reality.

Moreover, we have created such software that allows us to model future situations with a certain degree of probability. But first - not about that. Let's look at the technological chain of ranking.

Formation of indicators;

Collection of data on the designated indicators;

Data verification;

Calculation and division into groups.

Three tasks out of four were solved by the relevant ministry. First, the formulation of indicators is very important. In fact, this is a task for the development vector of the system, a system of views on the development of education in the country. The second, hardest task is to collect data. This can be done through open sources or through specialized collection. The task is being solved, but as experience has shown, there are a huge number of inconsistencies. Third, the data is verified. And, finally, all calculations come down to dividing into "efficient" and "inefficient" universities.

- Pretty rough division.

Very rough. After all, what happens? There are universities that easily overcame all the threshold values, some that overcame with great difficulty, there are almost overcame, and there are those that have not overcome almost all indicators. In connection with this, the task arises - to describe a more subtle tool for dividing into groups or, in other words, leagues, and not to lump everything together.

Based on the results of performance monitoring in 2016, you have built a table of seven leagues. Was such a rating compiled based on the results of 2017?

Yes, we continued this work, improved the methodology, and 7 leagues were not enough, we made 10 (see Diag. 1).

Let me explain. The publication "Seven Shades of Monitoring" aroused great interest; there were many calls and appeals, especially regarding the formation of the last, 7th league, which included all the "inefficient" universities. It turned out to be very large: from universities that slightly fell short of the effective ones to those that really look very bad. Hence the task of expanding the number of leagues. I want to emphasize that the choice of the number of leagues is up to the developers. So far we have stopped at 10. Let's see how this mechanism works.

What's new this year compared to last year?

Firstly, the methodology was improved by increasing the number of leagues and introducing additional parameters that are necessary for the calculation. Secondly, calculations were made not only based on new data for 2017, but also “backward calculations” for all five years of monitoring. This makes it possible to analyze the development of the education system.

Let me remind you that the methodology for determining the threshold values formed the basis. True, when it was created twenty years ago, the lower quartile was proposed as the threshold value, which, for each accreditation indicator, divided universities into the 75% best and the 25% worst. In the monitoring, the median was taken as the threshold value, dividing universities in half: into the 50% best and the 50% worst. For each performance monitoring indicator, a university falls into one of four groups: A if it is among the 25% best; B if it is among the 50% best but not in area A; C if the value of the indicator is above the threshold but falls into neither area A nor area B; and, finally, D if the value of the indicator is below the threshold. Plus, this year we added an E grade: the value of the indicator is below the lower quartile and is not included in area D.
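The A-E grading just described can be sketched as follows. This assumes each indicator's cut points are ordered upper quartile > median > threshold > lower quartile; the boundary handling and the example cut points are my assumptions:

```python
def grade(value, upper_q, median, threshold, lower_q):
    """Classify one indicator value into area A-E (strict boundaries assumed)."""
    if value > upper_q:
        return "A"   # among the 25% best
    if value > median:
        return "B"   # among the 50% best, but not A
    if value > threshold:
        return "C"   # above the threshold, but neither A nor B
    if value > lower_q:
        return "D"   # below the threshold, above the lower quartile
    return "E"       # below the lower quartile

# Hypothetical cut points for one indicator: Q3=80, median=60, threshold=50, Q1=40.
print(grade(90, 80, 60, 50, 40))  # A
print(grade(45, 80, 60, 50, 40))  # D
print(grade(30, 80, 60, 50, 40))  # E
```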

Each educational organization received a certain set of assessments based on monitoring indicators. It is extremely difficult to do this manually, so special software was created. It was called LiftUp, posted on the site msd-nica.ru, and anyone can use it. And not only to analyze the current state and compare with previous results, but also to predict and model future results of performance monitoring.

- What do the results show? Do educational organizations move through leagues?

An amazing result: in 2017, only the Russian State University of Oil and Gas (National Research University) named after I.M. Gubkin received straight "A" grades and turned out to be the only university in the first league. By the way, it has held this leading position for the third year in a row. As a visual appendix, we give the lists of the 1st, 2nd and 3rd leagues (see pp. 40-44).

In general, the changes that have taken place can be tracked on the msd-nica.ru website, where the full version of the rating based on the results of performance monitoring is presented in the form of a league table.

If we look at the distribution of universities by leagues, we will see that it is close to normal: relatively few universities sit in the 1st-2nd and 9th-10th leagues, and the bulk is concentrated in the 4th-6th leagues. We have made tables that show how the number of universities in each league has changed over the course of five years. There is no clean comparison here, because the number of universities for which the ministry's website has data differs from year to year. In the first year, 1,874 educational organizations were processed; then the number falls, which would be quite justified, since the number of universities and their branches is decreasing. But for some reason there was a surge in 2016, when the number of organizations processed increased, and in 2017 it fell again. Therefore, apparently, it makes sense to talk not about absolute numbers but about percentages.

The bad news is that there are fewer leaders, but the good news is that the number of laggards has decreased. The number of universities and branches located in the "red zone" has sharply decreased. These are leagues 8-10. There are many reasons for that. One of them is that there are simply fewer of them - weak universities and branches are closed. Secondly, some of the results were not processed. Well, I must say that the system still responds. The monitoring has been carried out for five years. Universities are adapting to the rules that exist today, and are tightening their indicators - a completely natural process. But, unfortunately, it cannot be said that it goes exactly over the years. Much, of course, depends on the rules of the game, they change from year to year. But at the same time, the rules are the same for everyone, so everyone is in the same conditions.

The msd-nica.ru website also provides an additional opportunity to analyze information not only by leagues, but also by federal subjects. And you can see the dynamics of a particular university over the years, comparing it, for example, not only with regional competitors or in the federal district, but also by profile: for example, medical with medical.

- I wonder what indicators turned out to be the most difficult to achieve this year?

Expectedly, the indicator "Employment" ranks first in non-fulfillment (see Diagram 3): it is below the threshold value for every second organization. But the problem does not seem to lie with the universities. Take, for example, the published employment data for graduates of Lomonosov Moscow State University. They cannot be zero, yet that is what the table shows.

More details about the new rating of Russian universities based on the results of performance monitoring will be discussed at our traditional webinar, which will be held on March 30, 2018. If readers have any questions, they can be asked right now at [email protected] and we will try to answer them.

Interviewed by Ekaterina SINDEEVA.