
Psyched For Business Podcast Episode 8

by Richard Anderson - Co-Founder on

Episode 8:
Picking the brains of the assessment agony uncle

In this podcast episode, Richard Anderson is joined by Ben Williams, a chartered occupational psychologist, assessment design expert and Managing Director at Sten10.
In this episode, we cover:
✅ How Ben got started in the world of occupational psychology and assessment
✅ A whistle-stop tour of the range of bespoke assessment projects that he gets involved with.

Subscribe to the podcast on your favourite platform:

Apple Podcasts

Spotify

Amazon/Audible

Pocketcasts

Other Platforms


Episode 08 - Transcript 

[00:01]
Welcome to Psyched for Business, helping business leaders understand and apply cutting-edge business psychology principles in the workplace.

[00:10]
Richard Anderson: Hi, and welcome to Psyched for Business. I'm Richard Anderson. Thank you very much for joining me. In this episode, I'm joined by Ben Williams, Managing Director at Sten10. Ben is a chartered occupational psychologist and assessment design expert. In this episode, Ben talks us through how he got into the world of assessment and psychology and also gives us a whistle-stop tour of the various bespoke assessment projects that he gets involved with. Thanks again for listening. Ben Williams, what a delight to have you on. How are you doing?

[00:41]
Ben Williams: I'm very well, thanks Rich. I was up at 5:30 AM to speak to a client in Australia, so a little bit bleary-eyed. So, apologies.

[00:49]
Richard Anderson: And you've come to this directly after a talk that you were giving, haven't you? The agony uncle session. What was that about?

[00:54]
Ben Williams: Initially, it was a new feature on LinkedIn that popped up: audio events. So, a little bit like the Clubhouse app, where people can just talk into a microphone and chat, almost like a live podcast where you can pick a topic and you can all contribute in a more democratic way, perhaps, than a Zoom call where everyone's just talking over each other and it's a bit of a mess. Nice technology. But what I was doing was acting, without wanting to blow one's own trumpet, as an oracle on all things assessment. So people came with questions about rating scales: what kind of rating scales should you use at an assessment center? People asked about personality questionnaires: should you use normative or ipsative for selection or development? There were people who were MSc students, through to leadership consultants, through to in-house people, with questions about how you measure empathy and develop people on that front. So, we had a little discussion about the skill of building upon one's empathy, but also the motivation to do it, because some people just won't see the business case for it and they'll just say, oh, I'm not going to spend all my time listening to people's feelings. So yeah, it was a really interesting discussion.

[02:00]
Richard Anderson: You weren't blowing your own trumpet. I think it's the very fact that you are an oracle in those things that I was very keen to speak to you. And obviously, you and I have known each other for, I was trying to work this out recently, but it must be over 10 years, in a past life of mine. You were a kind of partner organization of ours, I think, when you were independent, maybe before you started Sten10. And we probably haven't had a full conversation about what got you into this world in the first place. So, I'd be keen, if you'd be happy to just indulge me for a couple of minutes, to kind of take me back to why psychology to begin with. Business psychology, you know, that's one area of psychology, but why did you decide to go into business psychology and then assessment? We'll get on to the assessment bit.

[02:37]
Ben Williams: I can go earlier than that if you like, Rich.

[02:39]
Richard Anderson: Yeah. Why not?

[02:40]
Ben Williams: When I was a baby... no, for my A Levels, I'd chosen English and history, and they said, those are two tough subjects, you want an easy third subject? And they said, psychology's easy. And I said, yeah, go on. I didn't really know much about it. Then when I started it, and it's interesting, I didn't realize that even at 16 years old, what appealed to me then has led to my career now. It was about putting a number to things that I thought were not measurable. So, five factors of personality, seven chunks of information we can hold in our short-term memory. And I was like, what? You can measure this stuff? So that got me more interested in that than dissecting Ted Hughes' Crow for the umpteenth time in English, or reading about crop rotation in the 14th century in history. So, I then did my undergraduate in experimental psychology, because I still thought maybe I'll be a clinical psychologist. I learned about the full spectrum of psychology, but always everything that could be shown through evidence, rather than 'lie on this sofa and talk to me about your childhood'. So, it was less the psychoanalysis, more experimental. I tried out a stint in a prison as work experience, rather than at Her Majesty's pleasure, in the forensic psychology department. So, I thought I might go into that. At the time, I was pretty young and I couldn't quite separate out, ethically, shaking hands with someone who had murdered someone but also perhaps needed my help. And I thought I would struggle with that on a day-to-day basis, to be able to switch off afterwards.

[04:13]
Richard Anderson: And that's interesting. Do you think you'd still think the same now?

[04:16]
Ben Williams: I think I'd be a bit more mature now. I mean, especially since becoming a dad, I think my empathy levels have gone through the roof. Especially given the behavior of my children, I can empathize with prisoners now. I think it would change, but it was quite interesting actually, because of that whole empathy piece and what you need in order to work in some of these settings. So, we looked at a study back when I was working at one of the test publishers that looked at what makes a good nurse. And actually, having empathy for others, whilst you might think it's a good thing to have, is actually quite tough, because you can't switch off after work. And these nurses find that they're quite stressed if they can't switch off after work. So actually, if you're working at a prison as a psychologist, you benefit from being able to switch off afterwards and not carry it with you. So it's really quite an interesting one. So, business psychology was basically because one of my other tutors was at OPP, Oxford Psychologists Press: Robert McHenry, who was always a really interesting tutor, but he also had a very exotic lifestyle. He'd be traveling around the world. He'd say to me, Ben, I will mark your essays soon, but I'm currently in Africa watching herds of elephants sweep past my hotel suite. And he wore a blazer with gold buttons and cufflinks, which I always remembered. I thought, that's a sign of success. So yeah, then I did a master's at Surrey and entered the world of business psychology that way. So that was my journey into the field.

[05:43]
Richard Anderson: It's a really interesting journey as well. It's funny, because when you talk about doing your A Levels, English, history, I did something similar, but I went down the sociology route for the same reason, because that's what people said: sociology would be an easy one. And I think, well, I passed it, Ben, and I found it interesting enough, but it was nothing. I think for me it was nothing more than that at the time. I think if I was to do it, a bit like what you were saying before, you know, with the benefit of a few years of experience, if I was to go and do that again, I think I'd find it much more enjoyable than Emily Brontë or whatever it was that I was studying in English. So that was the journey into business psychology. Obviously we know what you do now, and we'll give you the chance to explain that probably towards the end of the podcast. But what then took you into the world of assessment?

[06:28]
Ben Williams: So, there's an official answer and then there's a bit of a silly answer. So the kind of silly answer, which I don't usually mention, is love. And I don't mean love for psychometrics, I mean the love of a woman. So, my current wife, and then girlfriend, at the test publisher I worked for had a particular passion for people development and coaching, one that's kind of followed her through to today. And when we left the graduate portion of our training, they needed to decide where to allocate us, and my wife went into the people development team. But they felt, well, these two are in a relationship, maybe we should keep them separate. I'm not quite sure what they thought we'd be doing on company time if we were in the same team, but so I was actually put into the training team. So I trained people for about 18 months in how to use psychometrics and how to assess, which, whilst being terrifying at first, because you're dealing with HR professionals who are far more experienced than I was, I took solace in the fact that I'm an expert in this one little niche area. And I got every question under the sun thrown at me. I learned how to either bat them away, answer them, or postpone them, always, I guess, just trying to be frank with people. And I guess that follows through to the webinar thing I did this morning. It was just about listening to questions and responding to them in the moment. So I think that forged my skill in that area. And then I went into the assessment team, so assessment center design, assessment center delivery, before my other two roles after that, where I kind of continued that but broadened it out into different client types. The love reason is I was being kept separate from the love of my life. But also I think it goes back to that A Level thing. I'm intrinsically interested in measuring things that are really hard and intangible to assess. That's what gets me excited and interested.
I do like the coaching and development side, but yeah, it's the assessment side where my main interest is.

[08:28]
Richard Anderson: And I can imagine that at that fairly young age, at which I imagine you were delivering training to HR professionals, that would've been a daunting thing. But the fact that you were able to give yourself that level of comfort, that I'm an expert in this particular area, I guess that was probably a big learning curve for you, and that maybe put you on the path that you've gone on since.

[08:47]
Ben Williams: Yeah.

[08:48]
Richard Anderson: Because it can be intimidating, Ben, that sort of thing when you’re young, you're standing in front of a room full of people and you're delivering a training session and you're getting stern looks.

[08:56]
Ben Williams: It probably did forge my whole approach to how I act as a consultant because if you blag, it quickly comes unraveled and you just end up being far more embarrassed than if you say, I don’t know that, but great question, I'll look into it. And just showing that honesty and integrity and upfrontness not only I think leads to people just trusting you more, but it's less hassle for me personally. So I'm not thinking I'm operating in an area that I'm not an expert in. I'm either in my zone or I'm saying, sorry, that's not something I can help you with and here's someone who can.

[09:31]
Richard Anderson: I definitely think it's the right way to go. And so when it comes to assessment or when it came to assessment, particularly at that time or maybe even at this stage, are there any areas or were there any areas that interested you more than others when it came to assessment?

[09:46]
Ben Williams: At the time, I was more of an assessment center person, so I liked the idea of creating day-in-the-life experiences. And I think as companies got bolder, we started hearing about things like branding the room in which the assessment center takes place, maybe putting in some fun elements. So, when I worked at TMP, they were busing candidates to the assessment center in a London open-top bus, which had nibbles and things on board. So, all part of the experience, and I thought, oh, that sounds really fun. And yeah, just trying to give them a preview, but also sell the job to them. So that was probably my initial interest. And psychometrics were more, oh, you use something off the shelf, and it'd be interesting, but I wasn't so much involved in their design. Then as a freelancer, that's where I started to get intense experience doing that, starting off with a bit of work through test publishers, but also through test practice sites. So, writing hundreds and hundreds of abstract reasoning, verbal reasoning, numerical reasoning questions, which I liked designing. Obviously, that volume is a bit of a challenge, but then that kind of took a sidestep into personality. And now I think probably what's most interesting to me is trying to assess something new and thinking about the best way of doing that. So when a client says, we've got a model of what makes a great leader, we don't want to use a more vanilla, off-the-shelf personality tool, we want the language, the length, the reports all to feel very much like us, can you help us to create that? So, that's really interesting. It obviously has its own challenges, but yeah, that's probably where my interest is at the moment.

[11:24]
Richard Anderson: Brilliant. And you've given me a few things there in the last minute or so; we could take this probably in a couple of different directions. Now, I'm going to come back to what you've just mentioned there when it comes to bespoke assessments, when you create psychometrics in line with somebody's model or somebody's brand or whatever it might be. And just to take a step back to the assessment center stuff that you've just talked about, the open-top bus or whatever on the way to the assessment center. So, talk me through a little bit of that, because I've never done an assessment center in my life, ever. I mean, as you know, I've only worked for small businesses. And I'm not saying that these are exclusive to large businesses, but I guess they're more heavily used in larger organizations. So, how does an assessment center typically work, and how is it different now compared to what it was at that time?

[12:09]
Ben Williams: Yeah, so an assessment center has its roots in the Second World War, I think. They were finding that promoting people to officer on the basis of what education they'd received, who their family was, or just personal recommendations, wasn't necessarily leading to the best officers. So, they needed a structured way of assessing people against job-relevant tasks, and they instigated the first assessment centers. Over time, that transferred into the commercial world. And it's really the idea of multiple assessments. You've got multiple different competencies. So, unlike an ability test, it's not looking at just one thing; it's looking at maybe 8, 10, 12 different competencies in one day or one half day. You've got multiple exercises in which to assess them. So, if someone feels they stuffed up the group exercise, they've always got a second chance to show that same competency, maybe in the interview or even in the personality questionnaire. And then there's multiple assessors as well. So, you've not just got one person's judgment determining your success or failure, with all of those biases that can come in with it. You've got multiple different assessors, multiple different perspectives, which can then be challenged at the wash-up session. So, at an assessment center, typically, you'll arrive for the day and you'll be presented with a brief, usually for a fictitious company you've joined: you get an org chart, financial information, some email threads, and then you've got a diary of meetings or deadlines throughout the day. So, maybe there's a report that's due by five o'clock, maybe you've got a team discussion about a problem your company's facing and you need to sort it out together, maybe there's a role play with an angry customer that you need to go to at two. So that's how they work. And I think over the years, they've probably become a little bit more immersive and face valid.
So previously there were independent exercises, administered where you read off an admin card and timed with a stopwatch. Now, they're far more: this is a day at the office, you manage it as you want to.

[14:05]
Richard Anderson: I'm sorry, Ben, just to jump in: face valid, meaning?

[14:08]
Ben Williams: Meaning it looks to be an accurate representation of what you'll need to do on the job, rather than it being set in, I don't know, if you're applying for a job in a bank, sitting an assessment center exercise that's set on an oil rig, and you're thinking, hmm, I can't really see the link. Obviously, the big move was in the pandemic when, before that, most assessment centers were face to face and all of a sudden, 100% of them had to be either canceled or go online. There have been challenges and benefits to that. So, some of the benefits obviously are reduced cost of travel and hotels, and assessor time is far reduced because they don't have to travel. There are, I guess, mixed views on its fairness. Mostly it's positive, because they say, well look, people don't need to take a day off work to travel somewhere to an assessment center. They can dial into it from home. There are some concerns over new biases that might sneak in, though. So, when you're being assessed face to face, you're at the company's premises in a little side room. When you're being assessed from home, I've got the world's most boring background behind me now, but if I had a poster of Pulp Fiction up there, would that be damaging my credibility as an oracle of psychology, with people saying, oh, he's a bit of a flake? Or if I had a poster of Donald Trump here, I love Donald, could that be leading people to draw certain conclusions about my personality? So yeah, there's an element of that. I think the big challenge that companies are facing now is that virtual assessment centers are here to stay, even if that's sometimes hybrid and they might do a bit face to face, but it's how do you get across what a great place a company is to work when it's all being done virtually.
So, you can't do the London bus anymore, you can't do free sweets, you can't do the football table. It's log on, do eight hours' worth of Zoom calls, and then we'll tell you whether you've got a job or not. How do you entice them to join?

[16:00]
Richard Anderson: And how much of that, Ben, is just based on the world in which we live now? I was going to say the majority of companies, and I should have some stats to hand, but many companies work exclusively from home now, or are certainly hybrid, and maybe that's just the world that we live in now, do you think?

[16:16]
Ben Williams: It is, and I think that means there are different competencies we need now compared to what we used to. Whereas at a face-to-face assessment center, you get to see the office, you get to see the coffee machine, you get to maybe chat to a graduate in between. If the assessment center's done very much at a distance, there isn't that human connection. Even if you're only going to go into the office once a month, or work entirely remotely, you need to build in opportunities for discussions. You need to maybe have a video tour if you are going to be in the office occasionally, maybe you need that welcome from the CEO at the start of the day, so they say, oh, they really do care about me, this has got that personal touch. So, it's trying to win people's hearts as well as their minds. It's not just, this looks like a good job, good salary. It's, I want to work here because I like these people.

[17:01]
Richard Anderson: So, maybe, as the general consensus often is when we speak about the topic of working from home or working remotely, maybe hybrid is the best approach. Probably the same with assessments then?

[17:11]
Ben Williams: Yeah, I mean, it's so different for different companies. I mean, I've been kind of dragged on a little bit of a journey. When I say dragged, I think we were probably ahead of the curve in that we moved to four days in the office rather than five, well before the pandemic, thinking, oh well, let's have Fridays as the day that we don't have to commute. And that was quite forward thinking, but then the pandemic meant a hundred percent remote, and I was a bit reluctant to then say, right, now let's continue that completely. So, we've gone from two days in the office a week, and now we're on one day in the office a week, with constant check-ins on messaging, but also a check-in on a Wednesday. But then other companies I know, even in the same area, they'll come in once a month, or another company comes in five days a week; they've gone straight back to that. It's really tough, and it's always a balance of, well, how are we going to keep the idea generation going, the sense of loyalty and commitment amongst the employees, but also entice people to join us when our competitors are saying, you don't need to spend any money on travel, you don't need to devote any of your evenings to the commute back from central London.

[18:10]
Richard Anderson: It's so difficult. And, you know, I share that with you because I don't know what the right answer is. I wish I did. And everyone's got their own; there are a lot of views on it, I have to say, but I don't think anyone's come up with a perfect solution just yet. That's the assessment center piece. I was just interested, and that's really insightful; it gives me a bit of an idea about how these things are structured, because although that's an area of what you do that I don't particularly get involved with, it's nice to hear how that works. To reverse back to maybe five minutes ago, when you were talking about creating those bespoke assessments, where a client will come to you with a very specific requirement, they're looking to measure these leadership competencies or behaviors, how do we go about doing it? That's something that you get involved with a lot, isn't it?

[18:54]
Ben Williams: Yeah, the shape of Sten10's business has really shifted over the years. So at one point, situational judgment tests were over half of all of our work, and they're still a portion of our work, but a much smaller percentage now. But custom psychometrics, so personality questionnaires, motivation questionnaires, are now quite a big part of our work. At the moment, it's typically at the leadership level: a company that either already has a clearly defined model, or wants to work with us on their own unique perspective, or maybe they've written a book and they want to say, look, how can we turn that book's philosophy into a psychometric to help embed our branding. And there's always a bit of a balance to be struck. So, obviously you can go out and buy tests that have been around for years and will do a great job at assessing someone's personality, but you'll need to put in a bit of legwork to interpret the report in the way that you want to. The language might not be quite right. The norms might not be quite right. So, we worked with a company recently based in Africa (I know that we collaborated on this) that said, actually, a lot of the psychometrics on the market have a western bias in the language that's used and how they interpret competency scores off the back of that. So, we want to develop an Africa-first psychometric. So, it's those kinds of scenarios, where people are finding either a frustration or they've seen a commercial opportunity to promote their brand, that they come to us and say, help us to create something.

[20:20]
Richard Anderson: Many of these types of businesses are entrepreneurial; they've maybe spotted that gap in the market, and they think they can do things better, and you are there, and of course we're there, to support. We hear these terms, Ben, in psychometrics and in assessment: reliability, validity, those types of things. And without making any assumptions audience-wise, I'd be grateful if you could just kind of define what those terms mean. But how do you ensure that the assessments you create from scratch are going to be reliable, they're going to be valid, those types of things? How do you do that?

[20:52]
Ben Williams: Reliability is shorthand, in layman's terms, for consistency. So just think of consistency. There are different ways we can see how consistent a test is. One is to see, is it consistent over time? So if I answer this personality questionnaire today, then I go home, have a late night, maybe receive an email, feel a bit more stressed, and sit it again tomorrow, will my responses be wildly different, or is it a bit resilient to that? Is it quite consistent as a measure? And you want that to be as high as possible. And I guess how you achieve that is going to be around your administration instructions, telling people in what environment to sit the test, and also how to interpret the questions. So, whether you are saying, well look, how are you feeling right this second, versus how do you generally feel, and also writing just very clear questions that can't be misconstrued depending upon the perspective that you read them from. So, that's one type of consistency. And the other common one is called internal consistency. So, say you've got a scale in a personality tool that might be measuring something like, I don't know, extroversion: how wild and outgoing and lively are you, how much energy do you get from others? If you write 10 questions, they should all be measuring extroversion. What you don't want is one of them sneaking in there that measures something else. So, another of the big five personality traits is agreeableness. So, it's how much you tend to get on with people, whether you are a warm kind of person. Now, if you had a question that was meant to be for extroversion but actually talked about how warm and engaging and sympathetic you are, then you'd say that's not the same thing, and you're going to find that internal consistency is low. Now you could say, is that just something that psychologists worry about and get themselves caught up in?
And it shouldn't be, because if you have an inconsistent scale, if you tell someone, Richard, you are a real extrovert, what does that actually mean? If it's a whole melting pot of how agreeable you are, how extroverted you are, how much you like persuading people, then it's, well, what exactly is going on there? Whereas, if it's a really tight definition of lively, outgoing, gets energy from others, that means much more. That's just good test question writing and piloting; that's how you get that. And then validity: again, there are many types, but I'll only talk about the two most popular, most important ones. One is content validity. So, that is the lowest form of legally defensible validity in the UK, and it's generally described as asking, have you done appropriate job analysis? So, are you analyzing the right qualities required in the job? For a bespoke personality questionnaire, I'll be wanting to see that the areas they're proposing to measure have a track record of leading to success for people in the job, so they can say, we've been in this executive search industry for 15 years and our directors have pooled their collective wisdom, and time and time again, these are the traits that lead to success. So you say, okay, great, that seems solid. The second type we often don't get before launching, and that's because it's quite tricky to get. It's called criterion validity, and what you want to see is whether a score on a test predicts behavior in a job, or tenure in a job, or customer satisfaction, or something like that. And you need to correlate, so you need to do some statistics on it, to see how strong that relationship is, and you need reasonable sample sizes for that. So, you need at least 50 people to do that, and you generally need some time to have elapsed, because if you're going to use an ability test to screen people, you need to say, look, are they the best ones six months down the line, a year down the line?
So that's usually done after it's launched.
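The internal consistency Ben describes is usually quantified with Cronbach's alpha, which compares the variance of the individual questions with the variance of the total scale score. A minimal sketch in Python, using an invented five-question extroversion scale (none of this data comes from Sten10 or the conversation):

```python
# Cronbach's alpha: do the questions on one scale move together?
# alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
from statistics import pvariance

def cronbach_alpha(responses):
    """responses: one list of item scores per respondent."""
    k = len(responses[0])                  # number of questions
    items = list(zip(*responses))          # regroup scores by question
    item_var = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Six respondents rating five extroversion questions from 1 to 5
scale = [
    [4, 5, 4, 4, 5],
    [2, 1, 2, 2, 1],
    [3, 3, 4, 3, 3],
    [5, 4, 5, 5, 4],
    [1, 2, 1, 2, 2],
    [3, 4, 3, 3, 4],
]
print(round(cronbach_alpha(scale), 2))  # close to 1: the items hang together
```

A stray agreeableness question mixed into this scale would pull the alpha down, which is exactly the melting-pot problem Ben warns about.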

[24:35]
Richard Anderson: That makes sense. And how long after, typically, would that be?

[24:39]
Ben Williams: You'd want to wait at least six months, because if someone's just new into a job, they're going to need to get their feet under the table and get into the rhythm of it before you start judging their performance. So yeah, at least six months, ideally a bit longer.
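The criterion validity check Ben outlines comes down to a correlation: do scores at selection line up with performance measured six months or more later? A sketch with entirely made-up numbers, just to show the shape of the calculation (a real study would want his minimum of around 50 people):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented ability test scores at hire...
test_scores = [12, 18, 9, 15, 20, 11, 16, 14]
# ...and the same people's performance ratings six months later
performance = [5, 8, 4, 6, 9, 5, 7, 6]

print(round(pearson_r(test_scores, performance), 2))  # strongly positive here
```

The closer r is to 1, the better the test predicts the later criterion; in practice, validity coefficients for selection tests are far more modest than this illustration.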

[24:50]
Richard Anderson: Okay, brilliant. And while we're on this topic, I think it's probably relevant to go down the route of scoring. So, how are these assessments typically scored? I know that we hear a lot about raw scores and percentiles and Sten scores, of course, which is where your name came from. Who decides, and what informs the decision around, how we're going to score a particular test or questionnaire?

[25:13]
Ben Williams: Usually, the decision is made in the very early stages of the design, when we come up with what's called a test design blueprint and we talk about all the parameters of the design. If we take some of those terms you mentioned: a raw score on its own, you scored nine out of 12, isn't that meaningful, because you don't know if that's good, bad, or typical of most people. So, what we do is we norm the scores. So, we say, a nine out of 12, how does that compare to most people who have sat the test in the past? Usually we take a sample of about a hundred people, and then we can report what's called a percentile. So we can say, you did better than 30% of people who sat the test before, in which case nine out of 12 is actually quite bad, or you did better than 90% of people who sat the test before, in which case that's outstanding, well done. So, percentiles are quite useful for feedback, because, like the way I just fed it back to you, you can understand them and you can set cutoff points quite clearly. So percentiles are good. But if you want to start doing anything clever with your scores, if you want to start weighting, saying one test is more or less important than another, or you want to start averaging them across a few tests, you can't do that with percentiles, because percentiles are basically telling you what rank order you came in versus everyone that's done it before. So, they're not equal units of measurement. That's where standardized scores come in. It's like saying, well look, here's the bell curve of everyone that sat the test before. The percentiles say whether you're up this end or down that end or in the middle. A standard score is like putting a ruler along the bottom of that and saying, right, we've got equal units of measurement now, let's see what point you came along at. So common ones would be a T score (a transformed score) for ability tests, or a Sten score for personality questionnaires.
And our name Sten10 came from that, because a Sten score of 10 is the top band: precisely 2.28% of the population fall at that point. But of course I didn't realize about the cartoon character Ben 10 when I named the company Sten10, and that's caused untold confusion when I'm being introduced to clients as Ben 10. But yeah, a Sten score is often used for personality, because T scores are more granular, they're more finely divided, but with personality a 1 to 10 scale is fine.
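The raw-score, percentile, and Sten conversions Ben walks through can be sketched end to end. A Sten rescales a standardized score onto a band with mean 5.5 and standard deviation 2, clipped to 1 through 10; the norm group below is randomly invented for illustration:

```python
from statistics import mean, stdev
import random

def percentile(raw, norm_group):
    """Percentage of the norm group scoring below this raw score."""
    below = sum(1 for s in norm_group if s < raw)
    return 100 * below / len(norm_group)

def sten(raw, norm_group):
    """Standard ten: mean 5.5, SD 2, clipped to the 1..10 band."""
    z = (raw - mean(norm_group)) / stdev(norm_group)
    return max(1, min(10, round(5.5 + 2 * z)))

# A made-up norm sample of 100 raw scores out of 12
random.seed(1)
norm = [min(12, max(0, round(random.gauss(7, 2)))) for _ in range(100)]

for raw in (5, 9, 12):
    print(raw, percentile(raw, norm), sten(raw, norm))
```

Because percentiles are only rank orders, the gap between two raw scores can look huge or tiny in percentile terms depending on how the norm group bunches, which is why weighting and averaging are done on standardized scores instead.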

[27:25]
Richard Anderson: It's really interesting. So I think we'll put some links up alongside this podcast transcript for anybody who's interested in going into real granular detail on those scoring systems. One thing I'm very keen to speak to you about, a really important topic, of course, is diversity and inclusion, which we're hearing more and more about; we know the importance of it, and we hear this term 'adverse impact' quite frequently when it comes to assessment. How do you go about ensuring, you know, if you were creating a leadership development test from scratch, or an ability test or whatever it might be, that it won't adversely impact certain groups?

[28:07]
Ben Williams: There's a whole toolkit of things you can do to try to reduce it. The first thing I'd say is that there are differences between certain groups. Men tend to be more competitive than women on average, and if we come up with a questionnaire that looks at competitiveness and measures it properly, then men will always score a bit higher than women on that, and women will always score a bit higher on the more compassionate, empathic scales. There are group differences that have been seen countless times before, so it doesn't necessarily mean your test is wrong; it's reflecting a real difference. But if your aim is to equalize opportunity for people from those groups, then you can do things like weighting certain questions as more important than others, or you can just be a bit more modest with your cutoff scores, because if you set your cutoff point really high, the adverse impact is going to be much larger. We were talking to a client the other day that set their cutoff point at the 80th percentile, but they also had a diversity agenda, and you're kind of balancing it: well yes, that helps get your numbers down, but it's not going to help your diversity agenda.
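To make the point about high cutoffs concrete, here's a small hypothetical Python simulation: two groups whose score distributions differ by half a standard deviation (an illustrative number, not from any real test), compared at roughly a 50th- versus 80th-percentile cutoff. The 0.8 threshold mentioned in the comment is the commonly cited "four-fifths rule" of thumb for flagging adverse impact.

```python
from statistics import NormalDist

def pass_rate(cutoff, group_mean, group_sd=1.0):
    """Proportion of a normally distributed group scoring above the cutoff."""
    return 1 - NormalDist(group_mean, group_sd).cdf(cutoff)

# Hypothetical half-SD difference in mean score between groups A and B.
for cutoff in (0.0, 0.84):  # ~50th vs ~80th percentile of group A
    rate_a = pass_rate(cutoff, group_mean=0.0)
    rate_b = pass_rate(cutoff, group_mean=-0.5)
    ratio = rate_b / rate_a  # impact ratio; below 0.8 flags adverse impact
    print(f"cutoff={cutoff:+.2f}  A={rate_a:.0%}  B={rate_b:.0%}  ratio={ratio:.2f}")
```

Raising the cutoff shrinks both pass rates, but it shrinks the lower-scoring group's rate disproportionately, so the impact ratio gets worse exactly as Ben describes.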

[29:17]
Richard Anderson: So it's a case of striking that balance?

[29:20]
Ben Williams: Yeah. Also, there are certain, I guess, societal factors that are going to impact how people from, let's say, lower socioeconomic groups are going to do in some of these assessments. Less access to practice materials, less coaching, fewer role models maybe. And this applies to different ethnic groups as well. Maybe a little bit of stereotype threat: I can't see anyone like me employed here, do I fit? And there's been quite a bit of research showing that affects your test performance as well. So trying to reduce that stereotype threat, saying, look, there are people from all walks of life and all different groups who work here, and getting people to relax before they take it, etc., is all going to help. That's part of it. When we write the questions, we can put them through gendered language checkers to make sure they're not overly biased one way or the other. Verbal reasoning tests tend to be the biggest offenders when it comes to ethnic group differences. And I think the key thing is to ask, what is your test seeking to assess? Is it assessing reasoning skills? In which case a nonverbal test is going to do that for you, so maybe you should be using an abstract reasoning test instead. Is it looking at English language skills? In which case a better test would be looking at qualifications in English language, from the CV or application form. If the answer is no, it is verbal reasoning that we're looking at and we do need to simulate this, then it's a case of saying, right, if it's a genuine requirement of the role, make sure the reading difficulty is as difficult as you need it to be, but no more. In Word, for example, there's a readability checker, and you can ask, what's the typical range expected for an A-level student, or a 14-year-old, or a white-collar professional?
So you can start getting a sense of how complex your questions are and whether you've over-egged it. There's also, in the design phase, speaking to a diverse range of people to make sure there's nothing culturally specific or any idiomatic language in there, piloting it on a diverse group, norming it on a diverse group. And there's a whole host of things you can do from a neurodiversity perspective too. What are we going to do for candidates who have dyslexia? Have we built in the facility to change the time limit? Have we built in the facility to change the color of the font? Does it have screen reader technology for people who are partially sighted? People who have autism can find certain types of tests more of a challenge than others. Things like situational judgment tests, these hypothetical scenarios: place yourself in this fictitious, imaginary scenario and tell us what you would do. That isn't so easy to relate to. In that case, it might just be, well, don't use an SJT in that scenario; instead, either move them through the process or have an interview to discuss the areas you want to cover. So yeah, a multifaceted approach.
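Word's readability statistics are based on formulas like Flesch-Kincaid, which estimates a reading grade level from sentence length and word length. A rough Python sketch of that calculation (the syllable counter here is deliberately naive, just counting vowel groups) might look like:

```python
import re

def naive_syllables(word):
    """Very rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level: higher means harder to read."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(naive_syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words))
            - 15.59)

simple = "The cat sat on the mat. It was warm."
complex_text = ("Notwithstanding contemporaneous organisational considerations, "
                "stakeholders habitually underestimate administrative complexity.")
```

Running both samples through `flesch_kincaid_grade` shows the second scoring far higher, which is the kind of check Ben suggests for keeping a verbal reasoning test no harder than the role genuinely requires.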

[32:27]
Richard Anderson: No, it's really interesting stuff, Ben. I'm guessing here, and please correct me if I'm wrong, but the stuff we're talking about, particularly adverse impact and ability tests, is more often than not applied to recruitment and selection scenarios. Are you finding at the minute, and I've got my own views on this, that your assessments are being used as much in recruitment as they are in learning and development settings? Since Evolve started, it's been ups and downs: more recruitment one year, or for six months, then more learning and development. Obviously we had the pandemic, we had the war for talent, all of those things. What are you finding at the minute?

[33:11]
Ben Williams: Like you, I think it's quite up and down. So we'll be heads down: recruitment, recruitment, recruitment. There are obviously certain cycles in the year for graduate recruitment, so a lot of that will be happening now. We'll have less graduate assessment center design right now because those have been rolled out, but that will pick up again in quarter one, quarter two next year. I'd say a lot of the requests for bespoke psychometrics at the moment are for use in a selection setting, or as a complementary piece to a selection setting, rather than purely as a development tool. It's probably a side effect of the pandemic, with people quitting their roles and then applying for new ones, maybe ones that suit them better. Maybe there's this greater attention to getting the person-to-job match right, so they'll stay for longer and feel satisfied. Maybe that's driving some of it.

[34:08]
Richard Anderson: I guess just to finish off, if you'd be happy to, I know we've talked about this throughout, but maybe just let the audience know the different types of things you do at Sten10. Again, I know we touched on it, but maybe a quick whistle-stop tour, and how people can get in touch with you if they want to.

[34:25]
Ben Williams: Great. Thanks, Rich. I'll keep it really quick. We help you to identify what you want to assess, so we design competency frameworks, values frameworks, strengths frameworks, etc. That's the foundation. We then help to design people assessments at any stage in the selection process, or for development purposes. So we'll do everything from right up front, structured application forms, telephone interviews, ability tests, all the way through to the final stages like the assessment center. While we do come up with bespoke assessments, and that's our specific area of expertise, we're also publisher agnostic. If we think a quicker win is for you to use an off-the-shelf tool, then we'll always recommend that, and we've got relationships with all of our test partners. We'll train people how to interview and how to assess, often to accompany our interventions, though it doesn't have to, and that also covers things like unconscious bias. And we'll evaluate: we'll do statistical analyses that help you decide which parts of your assessment process are working the best or the hardest, where there might be adverse impact, and where you could save time by removing duplication of effort in your assessment process, etc. So, in a nutshell: one-stop-shop assessment experts. Get in touch at sten10.com.

[35:41]
Richard Anderson: There you go. Not Ben 10. Brilliant. We'll obviously link your LinkedIn profile and the Sten10 website address as part of this blog. Thank you very much, really appreciate your time. It's great catching up as always, and in this setting it was even more interesting for me. There you go. Thanks, Ben.

[36:00]
Ben Williams: Brilliant. Thanks, Rich.

[36:02]
Thanks for listening to Psyched for Business. For show notes, resources and more, visit evolveassess.com.