4th Steering Committee
Logroño, 28th November 2006
Draft Evaluation Methodology
Futurreg
Evaluation:
“The evaluation of public intervention consists of judging its value
in relation to explicit criteria, and on the basis of information that
has been specially gathered and analysed.”
Intrinsic features:
• Description
• Regulatory dimension
• Instrumental dimension
Other features:
• Scope of public policy evaluations: controlling public expenditure and improving administration.
• Evaluation should be more than just an instrument of control; it should also measure complex features.
• Key actors involved in the process.
Why evaluate?
• Verify programme effectiveness and efficiency
• Learn
• Aid reflection
• See what is happening
• Aid transparency
Futurreg
• Foresight as a technique for decision making in organisations.
• Evaluation strengthens the stated purpose of the project to provide
a learning tool for European regions.
• Evaluate to ensure transfer is as good as possible.
• Foresight must be evaluated in context.
• Evaluate whether the results and conclusions are used and/or taken
into account in the region.
What kind of evaluation is required?
• Ongoing evaluation supplements the monitoring system and broadens the capacity to manoeuvre.
• Ex post evaluation attempts to measure project effectiveness and efficiency.
Evaluation structure
• Context
• Methodology
• Data collection
• Data analysis
• Global analysis and selection of cases for study
• Summaries of results and/or impacts
• Communication of results and recommendations
Evaluation Methodology
• Analysis of the project in two dimensions:
  • Analysis of the project as a whole
  • Analysis of the foresight applications performed in Futurreg, grouped into different categories
• Indicators concerning the project as a whole, its objectives, the toolkit and the foresight methods
• Three questionnaires for the different actors involved in the project: partners (7), application managers (14-15) and participants in foresight applications (case studies)
Analyze the project as a whole (I)
• As Futurreg is part of the INTERREG IIIC programme, the project's contribution to the programme objectives needs to be evaluated.
• It is important to evaluate aspects such as cooperation and inter-communication between project partners, the exchange of partner experience, the toolkit, etc.
• The project serves as a learning experience for the participating regions and as a methodological guide for future experiences of other European regions in the futures and foresight field.
Analyze the project as a whole (II)
At this level, the main questions for evaluation are:
• Were the objectives well defined?
• Have they been achieved?
• Have the expected results been obtained?
• Was the workplan carried out?
• Have the deadlines and planned costs of the project been respected?
• Have the results been transferred from the project to other regions?
Analyze the project as a whole (III)
Indicators:
• Project activity and working plan indicators (objective indicators)
• Result indicators
• Other indicators
Indicators (I)
Project activity and working plan indicators (objective indicators)

Area: Organization and coordination
• Steering Committee meetings (objective: 6)
• Partner participation in Steering Committee meetings (objective: 100%)
• Recruitment of local auditors (objective: 6)
• Partner refund deadlines (objective: 2 weeks)
• Activities performed according to initial project agenda deadlines
• Good working atmosphere created and consensus promoted
• Reports sent on time (objective: 100%)
• Activity and financial reports (objective: 5)
• Monitoring reports (objective: every six months)
• Evaluation reports (objective: ongoing / ex post)
• Project development in alignment with planned objectives
• Project development in alignment with costs

Area: Benchmarking
• Number of regional and interregional stakeholders informed about Futurreg and the evaluation
• Number of regions informed about their benchmarking position as per regional indicators (objective: 70, 10 per region)
• Foresight tool report (objective: 5)
• Regional benchmarking and evaluation report (objective: 7)
• Opportunity matrix for regional toolkit applications
• Creation of a basis and evaluation criteria for project applications and implementation (objective: 1)
• Ongoing evaluation report
• Ex post evaluation report
• Evaluation of interregional workshops

Area: Definition of foresight processes
• Participants in interregional workshops (objective: 30)
• Type of participants in interregional workshops
• Participant satisfaction with interregional workshops (objective: 85%)
• Production of regional futures toolkit with guide
• Production of materials for up-skilling workshops (objective: 2 sets)
• Interregional workshops (objective: 2)
• Participants in each workshop (objective: 30 in each workshop)
• Production of plans for regional applications (objective: 14)
• Interregional matrix comparing application-related issues and features
Indicators (II)
Project activity and working plan indicators (objective indicators)

Area: Implementing foresight
• Number of regional stakeholders used in futures toolkit applications (objective: 28 stakeholders, 4 per region)
• Satisfaction of regional organizations with accepted applications
• Number of futures-toolkit-related plans and future collaborations
• Applications of the futures toolkit (objective: 14)
• Regional application monitoring reports (objective: starting from 2006, one biannual report per region; 21 in all)
• Summary report on steps to ensure sustainability of resulting operations (objective: 14)

Area: Diffusion
• Diffusion level for “champions” in public sector organizations: number of encounters held and number of partners used (objective: 150 colleagues used by the champions, 5 per champion)
• Visits to web page (objective: 5,000 visits)
• Publication of the Futurreg project publicity brochure (objective: 3,000 copies)
• Encounters with 105 stakeholders to diffuse the toolkit and the project (objective: 2)
• Interregional conference
• Project web page, with information of interest on the project and applications
• Project web page on partners’ web pages

Other project indicators
• Meetings held
• Press notes released after meetings
• Targets hit
• Applications developed
• Number of intra-regional and inter-regional networks created and their participants (public administrations, universities, technology centres, companies, etc.)
• Number of foresight experiences generated

Objective indicators
Objective 1
• Number of agents involved (and interested?) in regional foresight practice (objective: 100 participants)
Objective 2
• Number of experiences at regional level in the use of foresight techniques
• Number of foresight exercises practised
• Number of users requesting application in their own region (number of users demanding foresight in their own region)
Objective 3
• Number of integrated sources of information
Indicators (III)
Other indicators

Area: General
• Progress on creating a foresight culture
• Beneficiaries taking foresight results on board
• Regional public administrations’ satisfaction with, and interest in, foresight

Area: Objective 1
• Transferability: toolkit transferability level to other European regions (number of applications from other regions interested in the application)
• Extension of the interregional network that maintains the foresight tools: number of partners, class of partners, type of channels and communication flows, etc.
• Toolkit adaptability and flexibility for other regions and studies; user friendliness, success of project, etc.

Area: Objective 2
• Applicability: implementation or use status of foresight results in regional economic policies
• Regional public administration and business interest in the toolkit
• Regional agents’ integrated use of foresight
• Sectors where it is applied (sector type)
• Future sustainability of the application in the region

Area: Objective 3
• Interest for the region of the area/sector benefiting from the use of foresight
• Impact on the regional economy
• Quality of sources
Evaluating applications (I)
Critical areas (1) were used in classifying applications:
• Social: factors influencing human beings, society and lifestyle
• Technological: factors stimulated by science and its applications
• Economic: factors affecting industry and wealth creation
• Environment: factors impacting on the physical world we live in
• Political: factors relating to government and administration
(1) A preliminary version of the UPGRADE blueprint was produced for the dissemination conference “Building the future on knowledge”, Brussels, 23 September 2004.
Social
• Menter a Busnes. Main theme: economic value of the Welsh language. Type of application: company development and strategic contribution to a policy framework and programme for action.
• Fondation Rurale de Wallonie. Main theme: support for rural development operations and ongoing education. Type of application: diagnosis of the future needs of the rural territories of Wallonia in order to define the new strategy of the organisation (FRW); support for the organisation's existing working groups.
• Institute of Technology, Sligo. Main theme: higher education. Type of application: strategy development.

Technological / Economic
• Athlone Institute of Technology. Main theme: innovation / sustainable development. Type of application: development of a strategy / organizational development.
• Loimaa Region Development Centre. Main theme: economic development (entrepreneurship). Type of application: create networks and start up new joint projects among entrepreneurs.
• University of Malta. Main theme: knowledge transfer. Type of application: development of a knowledge transfer strategy within the University's ten-year vision.
• Tr@me SCRL. Main theme: territorial development, tourism and patrimony, agriculture and sustainable energies. Type of application: improve the use of foresight in rural areas of Wallonia.
• Aristotle University of Thessaloniki. Main theme: innovation, new product development. Type of application: strategy development towards technology and knowledge transfer.
• Galway-Mayo Institute of Technology. Main theme: innovation in SMEs. Type of application: product development and innovation planning.
• ADER, INESCOP and AICCOR. Main theme: innovation, research and technology. Type of application: identification of future technological trends in the footwear sector.
• Regional Council of Satakunta. Main theme: responsibility for long-term regional development, defining long-term development objectives and providing land-use guidelines for the member communities. Type of application: preparation of a new regional strategic plan.

Environment / Political
• Countryside Council of Wales. Main theme: conserving the natural environment and helping people to enjoy it and earn from it. Type of application: developing the organisation's strategic direction in the medium term.
• Municipality of Thermi. Main theme: Digital Research Centre: digital cities, e-government. Type of application: regional policies, organizational restructuring.
• Malta Enterprise. Main theme: innovation. Type of application: development of a regional innovation strategy.
Evaluating applications (II)
Case study selection criteria:
• Operational level and situation of exercise.
• Design of foresight exercise.
• Relevance of issues on which the foresight exercise is carried out.
• Territorial balance according to types and regions involved in the project.
• Novelty.
Evaluating applications (III)
The main questions (1) the evaluation considers at this level are:
1. Introduction of foresight in the region and complementarity with other regional actions
Are the defined objectives consistent with the needs of the environment?
Are they defined in detail? Has foresight implementation design taken into
account regional characteristics and problems? Is foresight integrated and
complemented by other types of regional actions, with which it shares
common objectives?
2. Target hits
Has the project fulfilled its general objectives? Were results as expected?
(1) The questions are a result of the collective work of the partners and of the conclusions proposed in the last up-skilling workshop.
Evaluating applications (IV)
3. Foresight as a useful tool
Has foresight proved useful as a tool for partners and regional
organizations? What types of methods were used? Were the methods used
appropriate for the targets set? To what extent was the foresight guide
useful in the development of applications?
4. Foresight as a participatory process
What is the role of private and public agents in the development of the
applications? Was civil society involved? Was the foresight practice
carried out as an open, participatory process?
5. The role of experts, stakeholders and external consultants
What role did foresight experts, stakeholders and external consultants play
in the performance of the project and applications?
Evaluating applications (V)
6. Application of results
Have the results of the foresight applications been applied in the regions?
7. Diffusion of regional foresight results
Have other European regions been informed of the programme results? Do
regional agents know about regional foresight applications and their
results?
8. Extension and sustainability of the foresight culture
In view of the foresight experience in the region, has the foresight culture
developed in the region? Are regional organizations showing interest in
maintaining and using this tool?
Evaluating applications (VI)
Indicators:
• Indicators on Futures toolkit
• Indicators on foresight methods
Indicators (I)
Indicators on Futures toolkit
Number of foresight methods used
Availability of data and sources. (Number and quality of sources).
Duration of foresight exercise with regard to period planned.
Cost of foresight exercise in comparison to originally estimated cost.
Staff deployed in foresight process.
Number of experiences in which it is being applied.
Number of agents involved.
Indicators (II)
Indicators on foresight methods
Scenarios
Months in operation.
Cost of application.
Staff deployed and taking part.
Number of environment-related variables.
Number of scenarios considered and degree of difference between them.
Trend analysis
Available data.
Time used.
Cost of application.
Staff deployed and taking part.
Number of variables selected.
Number of trend conditioning factors detected.
Delphi
Time used.
Staff deployed and taking part.
Number of experts.
Level of knowledge in research field.
Quantification of questions.
Number of consultations.
Degree of consensus
Visionary Management
Number of participants (working groups or key people).
Communication tools used.
Time used.
Cost of application.
Staff deployed and taking part.
Horizon Scanning
Number of studies identified.
Number of studies selected.
Available data.
Number of trend keys detected.
Time used.
Cost of application.
Staff deployed and taking part.
Indicators (III)
Indicators on foresight methods
Panel of experts
Selecting the experts: profile, qualification, etc.
Range and variety of agents taking part (origin or sector to which they belong)
Indicators on work done by panel of experts: meetings, working documentation, documents
generated, deadlines reached, motivation of participants, etc.
Time used.
Cost of application.
Staff deployed and taking part.
Diffusion of panel’s results
Level of consensus achieved
Ideas generated
Futures Workshop
Indicators on the management of this method: cooperation between agents, transparency
levels and fairness of process, number and length of workshops held, intervals between them,
etc.
Study horizon.
Time used.
Cost of application.
Staff deployed and taking part.
Ideas generated
Indicators (IV)
Other indicators

Area: Toolkit
• Degree of complexity in handling
• Degree of satisfaction in handling
Questionnaires
Presented below are three kinds of questionnaires for the agents taking part in the Futurreg project:
• Questionnaire for partners
The questionnaire is designed to obtain information on how the working plan developed and how the measures worked, on the information generated, the types of difficulties encountered, etc.
• Questionnaire for application agents
Sent to the agents of all regional applications. It will provide valuable information on the management of the applications, their design, how they worked, the methods used, the measures developed once foresight had concluded, the role of the experts, etc.
• Questionnaire for participants in applications
This questionnaire will only be sent to participants in the regional applications selected as case studies.
Chronogram
Evaluation methodology:
• Presentation of the draft: 28 November 2006.
• Deadline for comments: end of December 2006.
• Methodology evaluation document: 15 January 2007.
Ongoing evaluation:
• Evaluation criteria: 28 November 2006.
• Indicators: 28 November 2006.
• Evaluation of applications (selection of a specific number of applications): 30 November 2006.
• Questionnaire: 28 November 2006.
• Data processing: 31 January 2007.
• Draft ongoing evaluation report: 31 May 2007.
• Ongoing evaluation report: 29 June 2007.
Ex post evaluation:
• Evaluation criteria: 31 August 2007.
• Indicators: 31 August 2007.
• Evaluation of applications (selection of a specific number of applications): 28 September 2007.
• Questionnaire: 28 September 2007.
• Data processing: 31 October 2007.
• Draft ex post evaluation report: 30 November 2007.
• Ex post evaluation report: 28 December 2007.