UK media reports of the inherent unreliability of school exam results have drawn criticism for undermining public confidence in qualifications. Can the same observations be made about the veracity of PR qualifications?
One of the foundations of assessment is the belief that there is a standard represented by a particular mark, grade or qualification. Career opportunities arise as a result of trust in qualifications – especially in topics such as PR, where there may be cynicism among employers who aren’t familiar with what is involved in studying for such degrees or gaining professional qualifications.
We live in a world where certificates can simply be bought online – so it is vital that genuine qualifications, where candidates have invested considerable time and money, have a reliable reputation.
But in reality, it is difficult for markers to agree 100% on the quality of work in topics such as PR where there is a requirement to assess intangible aspects involving personal judgement rather than looking for absolute rights/wrongs.
I’ve been involved in many discussions regarding methods of assessment – both in terms of the actual assignment and the marking system. Personally, I struggle with the general approach of allocating percentage points to work where doing so is not mathematically logical. I prefer to assess against agreed criteria – although, again, this involves subjectivity.
So isn’t it sensible to acknowledge that it is impossible to be absolutely accurate in assessing the essentials of PR? Should we support a call for a public debate about the nature of an examination system?
Or do we need the illusion that evaluation can be objective in order to maintain confidence in the reliability of those vital qualifications?
Mariana,
I agree that qualifications are only one part of demonstrating competence, but not that these are the first step. The value of learning – including gaining qualifications – applies throughout our careers.
I also don’t understand why in PR it is seen as a sign of insecurity to demonstrate knowledge or accountability. That is a basic expectation in many careers – would you want to fly with a pilot or be operated on by a doctor who has not proved either?
My initial point, though, was that the challenge in PR is often that assessing competence may be subjective, while the expectation is that objective measures can be used.
What we maybe need more of in life – in practice and in education, PR and otherwise – is recognition that people will have a range of competences and abilities, and that these are developed through experience and study. That is, there may not be an absolute set of skills, nor an objective measure by which to assess them.
This issue of exam qualifications as a kind of standardizing rule, dividing those who are PR professionals from those who are not, seems to me to be based on the same need for approval and accountability that PR pros are experiencing, and on the recent urge to measure PR results numerically.
It seems that only those exams that can be marked objectively can be trusted.
In my opinion, the result of an exam may bring you closer to or further from the top scores at a university, or land you closer to a nice position in a company. But it’s only the first step: what you do and how you do it is almost the only thing that counts in today’s business world.
Jean, I am aware that a lot of work is being undertaken in respect of standards and criteria, both for membership of PR professional bodies and for qualifications. Indeed, just yesterday the CIPR in the UK published a consultation paper on professional practice and discipline and a summary of proposals to enable members to obtain a chartered practitioner “gold standard”.
As you indicate, at the heart of attempting to put some measures around PR competencies lies the “what is PR” question. This is complex when the profession involves so many diverse aspects, a lot of which also connect with core management, organisational or communication skills, plus competencies related to complementary disciplines (marketing, HR, legal, finance, etc), and sector specialisation requirements.
And, of course, we need to recognise that PR is a developing discipline, so any standards must accommodate new skills and competencies, i.e. support a reflective practice.
So the issue of how well someone can be assessed across or within such variation of possible skills, knowledge and capability is similarly complicated.
As Caroline indicates, PR is not a “tick box” area of study either and, at present, we are fortunate that those involved in assessment aren’t quite working in a “sausage machine” compared to standard school assessment. However, I am currently assessing over 80 exam scripts for undergraduate students with a very tight deadline – so the numbers are coming. I understand that in China there are potentially millions of PR practitioners wanting qualifications. Even the CIPR could be faced with hundreds of members seeking chartered status, which will be a difficult administrative challenge.
Of course, these arguments apply equally to medicine, law and other areas which Brandon indicates have standardised testing. In the case of the traditional professions, perhaps there is a clearer body of knowledge and more agreement over what is “right/wrong” in terms of accepted standards.
Can we ever be confident that a PR practitioner can hold up a certificate or membership of a professional body as a similar badge of competency?
Heather,
This is THE issue for emerging PR professionals. There is no doubt that PR as a profession is in a time of high demand as companies across nearly every sector begin to realize the value of effective communications.
PR, though, is in a state of limbo. Along with increasing recognition comes the issue of legitimization. Doctors, lawyers and other professions have standardized testing. This standardization legitimizes the profession because with a set level of testing comes the understanding that all of these professionals have met or exceeded this difficult testing. PR lacks standardization. Of course, there are certifications – in Canada, the two most recognized are the ABC from IABC and the APR from CPRS – but even these lack a uniform set of standards.
The issue becomes fuzzier in post-secondary schooling. Undergraduate degrees in PR are very exclusive, and even post-graduate certificates are only offered by a few schools. With the advent of web 2.0, PR is experiencing a grand shift, and while some schools are coping well with the need to alter their programs, others have been left in the dust. The result: some students are more prepared than others, leaving employers with a guessing game when it comes to hiring qualified new graduates.
The first step towards legitimizing this profession is standardized testing.
It is a great shame that we can’t expect a little more reliability from exams that young people put so much effort into.
But I wonder if school exams aren’t a bit like the UK NHS – we expect such a lot.
PR exams benefit from a fair bit of investment, and therefore from skilled marking and cross-marking.
GCSEs and A levels are taken by gazillions of children (my son is one of the sausages going through the machine at the moment), and their marking is just not going to meet the same standard as a professional PR exam without a lot more cash.
The human race is a flawed lot. But a bit of subjectivity is the price we pay for not having exams that are all simple tick boxes that usually end up being more of a test of memory than talent.
Yes – things could be better – but I read somewhere that your exam qualifications make up only about 20% of the explanation of your career prospects anyway.
And as we all become global players, it’s good to hear Jean report on building the framework for a shared level of expectation – but it’s also important to remember that an exam is only one way to make a judgement about talent.
Heather,
Interesting post.
Perhaps you are aware that the Global Alliance is looking into various aspects of the issue you raise.
Two initiatives are being examined. One is more advanced than the other – as you can appreciate, examining standards and processes from different parts of the world is not only tricky but requires a lot of thought before settling on the criteria to be used for analysis.
First, personal credentials – such as the APR and ABC. There is a report on the GA web site, led by Pierette Leonard of Canada, which expanded on previous work done by Margaret Moscardi of South Africa. The latest report concludes that all existing schemes to measure the competency of members who have been in practice for a while and voluntarily submit to an examination have a lot of similarity in WHAT they measure – not HOW they measure it. Developing this analytical grid took time and effort, but I believe we got it right. The working group is still digging through another layer to ascertain that the self-reported information obtained against this grid is indeed ready to withstand scrutiny before the GA considers adopting this as a global standard.
The second ‘world standard’ project is curriculum standards. We are still in the initial stages on this one, but a solid base which we intend to use for validation is the ‘Professional Bond’ report published last year by a consortium of US organisations. I am not sure we will ever dig deep enough to be able to assess how assessments (pardon the pun) are being made in the education community – which is what I think you are raising – but there are sufficient elements to be explored, ranging from what a minimum curriculum standard could be, to the amount and quality of resources available to faculty and students, to style/method of teaching, etc., before we tackle the issue you raise. Our nascent working group is being led by John Paluszek of PRSA, who oversaw the ‘Professional Bond’ report with other colleagues. You would be welcome to share your views on our ‘little’ project! One of our starting points would be to agree on a common definition of public relations – no small task.