There’s no room for complacency.

Over 62,000 findings of hate speech and 900,000 of toxic language.

The 2022 version of Keeping Children Safe in Education brought in a new requirement for schools to “consider carrying out an online search” as part of their safer recruitment checks. The phrase “online checks” caused a lot of confusion, with schools and trusts unsure how to discharge this duty and, specifically, whether to include checks of social media as part of that search. Many schools and trusts opted to include social media checks in their process and, of those, many chose to use the Social Media Check (SMC) service.

Why undertake online checks? The first 12 months in numbers

Keeping Children Safe in Education (KCSIE) 2023 states that schools “should consider” carrying out an online search as part of due diligence on shortlisted candidates. The new KCSIE guidance states that schools and colleges must let potential candidates know that online searches will be done as part of due diligence checks ahead of their interview. KCSIE guidance must be followed unless there is a very good reason not to, which should be recorded.

Key statistics that might surprise you

Many schools and trusts did decide to include online checks in their process and, of those, many chose to use SMC. This has enabled us to analyse data from thousands of checks in the sectors in which we operate (we never see individual reports), as well as to gather anecdotal feedback from our education partners, trusts and schools. The resulting trends and statistics should convince you of the value of an online check (essentially, what’s found on social media platforms), particularly in regard to characteristics that “may help identify any incidents or issues that have happened, and are publicly available online, which the school or college might want to explore with the applicant at interview” (KCSIE 2023).

Did you know?

The 3 most common Social Media platforms

  • 2.12 is the average number of social media accounts per individual

  • 93.5% of individuals checked use at least one social media platform

  • 100% of checks are returned within the hour

  • 0.5% of individuals have refused to undertake a check

The average number of posts returned on an SMC report

  • 2634 posts on average for all reports

  • 312 posts on average for public only reports in education

N.B.
(a) A social media post or tweet refers to content shared on social media through a user’s profile (i.e. an individual). It can be just text, but can also include images, videos, and links to other content.
(b) A report is the output generated by running an SMC check on an individual.
(c) A social media account refers to the platform on which the user can make posts or tweets e.g. Facebook/Twitter.

Analysis of Findings by behavioural characteristic

SMC’s fully automated solution searches posts against seven key behavioural characteristics₅: toxic language, extremist groups, hate speech, swearing and profanity, potential nudity, negative sentiment and violent images, highlighting content which might cause embarrassment, harm or reputational damage. Organisations can also add specific key client words to the search.

N.B.
(a) Toxic language: a way of communicating that harms other people. It can be threatening, blaming or labelling, causing emotional turmoil to the recipient. For example, trolling is when someone posts or comments online to ‘bait’ people by deliberately provoking an argument or emotional reaction.
(b) Extremist groups: are groups of individuals whose values, ideals and beliefs fall far outside of what society considers normal. An extremist group is often associated with violent tactics to convey their point to outsiders. This includes banned and proscribed groups such as Al Qa’ida or activist groups such as Just Stop Oil.
(c) Hate Speech: denigrates an individual or a group based on perceived identity factors including: religion, ethnicity, nationality, race, colour, descent, gender, in addition to characteristics such as language, economic or social origin, disability, health status, or sexual orientation, among many others. It can also include symbols e.g. Nazi swastika.
(d) Swearing and profanity: refers to language that includes four-letter words, cursing, cussing and expletives used in a manner deemed to be socially offensive.
(e) Potential nudity: refers to images ranging from partial or complete nakedness to more explicit pornographic material.
(f) Negative sentiment: highlights the lack of positive or affirmative qualities such as enthusiasm, interest or optimism expressed in opinions and thoughts – perhaps tending towards opposition or resistance.
(g) Violent images: defined as any image that conveys an imminent physical or existential threat to person(s), property, or society, with or without weaponry.
(h) Key words: customised words that are relevant to an organisation such as the name of a school, governor etc.

Distribution of report findings by comparison to the age of the candidate when posted

Why are social media checks important?

Reputation and trust are particularly valuable commodities for educational establishments and are key factors for parents and students when deciding on a school. The reputational damage caused by an unsavoury post coming to light should not be underestimated.

Moreover, screening candidates’ social media profiles is a valuable tool in the recruitment process. It can reveal information that’s often overlooked during other pre-employment screening checks, thereby helping to ensure that the candidate is the right fit for the school.

In summary Social Media Checks:

  • Play an increasingly important role in vetting procedures, whether required by law or guidance, or adopted simply as best practice.

  • Protect organisations from brand reputational damage.

  • Avoid the cost of a bad hire, loss of contracts and financial penalties.

  • Ensure company values and culture are maintained.

  • Demonstrate a commitment to due diligence in the vetting process.

The dangers of manual searches and human interaction

Even before the KCSIE guidelines, schools might well have checked a candidate’s social media background. Naturally, the first thought is to ask Google or another search engine for information available in the public domain. However, most such searches are conducted without the prior knowledge of the individual, and this DIY approach to an online search often provides an incomplete picture and, more importantly, can be non-compliant and, in some cases, unlawful. Leading education legal firm Browne Jacobson₃ provides some useful guidance on this subject.

First and foremost, with any online search you cannot be certain that the person you are Googling is indeed the right one. Even if you have a photo of the individual, many accounts use avatars, pictures of the family pet or some other image. Doing this thoroughly, even on the right person, requires a great deal of searching and trawling through the social media posts it reveals. Worse still, a manual search opens you up to subjectivity based on your own values, beliefs and perceptions, and you may discover information that isn’t relevant to the hire, such as protected characteristics. Finally, you will most likely need to produce some sort of report verifying that you have carried out the check. Bearing all this in mind, it’s unlikely that you can perform a manual search and produce a report in less than half a day.

In short, manual searches are fraught with dangers. Have you:

  • Ensured you’re checking the right individual? – same name but wrong profile

  • Mistakenly identified protected characteristics not relevant to a hire? – unlawful and costly

  • Avoided unconscious bias and subjectivity? – through human intervention

  • Gained consent of the individual? – enabling a collaborative and compliant approach

  • Calculated the true cost of the time taken? – an automated search saves time and money

  • Created an auditable report? – providing a defensible position on inspection.

The benefits of a technology solution

The vast majority of online checks relate to information found on social media platforms. A technology solution to a social media check is more efficient, thorough, reliable, objective and consistent than a manual check.

Social Media Check (SMC) is a fully automated solution that can check posts, both public and private (optional), against a number of risk categories. It searches all posts across the major social media platforms using machine learning and algorithms, eliminating the need for manual searching and the possibility of unconscious bias. Checks exclude private messaging apps such as WhatsApp.

Using algorithms and machine learning, SMC provides a comprehensive and easy-to-follow interactive report and certificate in under an hour. It searches posts against seven key behavioural characteristics: toxic language, extremist groups, hate speech, swearing and profanity, potential nudity, negative sentiment and violent images, highlighting content which might cause embarrassment, harm or reputational damage. Organisations can also add specific key client words to the search, such as the name of a school. Content can be either text or images, as SMC uses Optical Character Recognition (OCR) to provide a complete and robust solution.

Social media checks from SMC can only be carried out with the individual’s consent, ensuring GDPR compliance and a collaborative approach to an online search. Our data shows that less than 1% of individuals refuse to undertake an SMC search.

The consent process is simple and easy for the individual to follow, requiring only an e-mail address and date of birth (to comply with social media platform guidelines). SMC never has access to passwords.

SMC is also an approved app partner of the major social media platforms, and searches can be conducted over an unlimited period of time. Nowadays, ignorance of a person’s affiliation with a prohibited organisation, or of a propensity for racist or gender-based slurs, isn’t acceptable, particularly if you have failed to make the right effort to check their online behaviour. We might have short memories, but the internet doesn’t. Quite often people have simply forgotten a naïve post made several years ago; the option to edit or delete a post can avoid reputational risk and embarrassment for both employer and individual.

Key features

  • Checks private and public posts across major social media platforms.

  • Fully automated system using machine learning and algorithms.

  • Provides a fully interactive and auditable report within the hour.

  • Requires the individual’s consent.

  • UK based team and systems.

  • Approved app partner of key social media platforms.

  • No IT integration required.

Great benefits

  • Identifies risks against eleven key categories: extremist groups, hate speech, negative sentiment, potential nudity, swearing and profanity, toxic language, violent images, drugs, weapons, firearms, client keywords.

  • No manual searching or unconscious bias.

  • Quick, easy and cost-effective to manage.

  • GDPR compliant.

  • 100% secure.

  • Reliable and comprehensive.

  • Easy set-up within 24 hours.

How SMC has helped the education market

With multiple schools in our Trust, we needed a quick, cost-effective and compliant procedure to conduct online checks of candidates. Social Media Check’s automated solution has been very effective. It is quicker and more comprehensive than a manual check, with results often delivered in under an hour.

Debbie Duggan, Operational Resources Director, The Two Counties Trust

With the changes to the KCSIE guidelines, we needed an efficient, cost effective and compliant way to conduct online checks. Manually checking is time consuming and can find characteristics irrelevant to a hire. Social Media Check’s automated solution reduces the risk of subjectivity and unconscious bias and its ease of use and comprehensive reporting has enabled us to realise operational efficiencies in our recruitment process.

Gail Murphy, Operations Manager, Lumen Learning Trust

FAQs handled by independent, education, legal experts Browne Jacobson

Browne Jacobson are well known in the education sector and frequently speak at safeguarding events. They have provided a comprehensive FAQ section on their website₃ covering how to carry out the KCSIE online checks. We have picked out one answer below which is particularly relevant to our solution, and recommend you refer to the full set of questions and answers₃.

We advise that you do. Ultimately, if content falls into the four categories mentioned above (extremism and hate speech; violent images; nudity; toxic language, swearing and profanity), then it is most likely to appear in social media posts and comments. Yes, there is an off-chance that they may have penned an article on an obscure website that has some dangerous views but the vast majority of the time, if there is anything we need to see, it will be contained in social media.


Source Material

₃ FAQs: How to carry out KCSiE online checks | Education Law (brownejacobson.com)

₄ SMC analysis on reports carried out in last xxx months (Sept 2023)

Safeguard children in education

For more information

Get in touch today to see the benefits of Social Media Check.

Call: 01565 874 241