5. Evaluate

Brief – It is critical that effective projects are maintained and less effective ones are either dropped or improved.  Evaluation is what makes the difference between random acts of kindness and developing professional knowledge.

Evaluation is the most critical stage of the whole process, but because it is such a skilled area it tends to be the stage most often left out.  To address this, I have set out a ladder so that, depending on the skill base in your school, you can work at the level that best suits you.

1.   Basic evaluation – Personal reflection: What worked well? What would you improve? Would you do it again?

2.   Single loop evaluation – What was the original intention?  Did the project deliver on it?

3.   Double loop evaluation – Do the students believe that the project successfully met its goals?

4.   Collaborative evaluation – Bring your own views and those of students to your Professional Learning Team.

5.   Impact assessment – The same measurements are taken before and after the project.

6.   Evidence-based impact assessment – The core aim itself is measured before and after.

7.   Transferable evaluation – Confidence indicators are included to enable scaling of what has been learned.

8.   Professional knowledge – The work contributes to the professional knowledge base.

9.   Multiple context evaluation – The project is evaluated in multiple contexts for generalised learning.

Some projects will be outstanding and others are unlikely to ever be repeated.  Some will logically start to merge into bigger projects with more staff involved.  Some may be loved by the teachers but found to be ineffective by the students, and so on.  The key question here is…

Which projects are really having the desired impact and how can we build on these?

This is a key question, and answering it requires some form of measurement before and after the project.  The first time through this cycle it is common that no 'before' measurements were taken, which normally means that impact assessment only becomes possible on subsequent passes through the cycle.

Unintended benefits may also emerge from the project.  It is worth commenting on such observations, but they cannot be treated as firm conclusions because there is no starting data to compare against.  They can, however, provide a clear rationale for new projects designed to test the hunch.

Obsession with measurement data can be very dangerous.  Some improvements are driven by ethical needs, so the impact assessment might be that unethical outcomes are now less likely to happen, or that an aspect of the work is now more closely in line with the values and principles of the organisation.
