THERE AREN'T ANY lies on my CV, but there is one thing that is, at best, a bit of an exaggeration. For someone who is "comfortable with Excel," I look suspiciously panicked when asked to do anything more complex than calculate the sum of a column.
For some, though, the CV is a complete work of fiction, which is why LinkedIn introduced endorsements. That utopian ideal, that former colleagues could vouch for your skills in C++ or PowerPoint, was somewhat undermined by them preferring to praise their coworkers for the quality of their hugs and coffee-making abilities instead.
So now LinkedIn has another idea: Skill Assessments. The idea is that you prove your ability by taking a quick quiz. If you pass, you get a badge for your profile, one which the company says makes you 30 per cent more likely to get hired. If you fail, well, that's just between you and Microsoft.
There's only one problem: it's incredibly easy to cheat.
The trouble with Skill Assessments
Tigran Sloyan has 11 LinkedIn Skill Assessment badges on his profile with certificates in everything from C++ to MySQL. Although the former Googler is undoubtedly qualified, by his own admission he's not that qualified. "Even things like QuickBooks that I've never touched in my life: I've got a passing grade now," he tells me via telephone. "I know nothing about it."
To be clear, Sloyan has a vested interest in this area: After leaving Google, he set up CodeSignal, which specialises in exactly the kind of accreditation that LinkedIn is now pursuing. Initially, he was excited by LinkedIn's arrival in this space, but that quickly evaporated: "I'm like, 'oh my God, this is not going to work'," he recalls. "The only way assessments are going to be successful is if they can be trusted," he continues.
People will cheat, he says. And he knows this for sure, because CodeSignal had exactly the same problem at first, even with far more rigorous tests that required applicants to demonstrate their coding skills. "They were asking their friends to do it for them, they were posting questions online..." he recalls.
Multiple choice is not just easy to game: it's not hugely helpful either, which is why brain surgeons aren't handpicked via a Facebook personality quiz.
"Let's imagine you're hiring a pilot who's going to fly your plane and you just ask some multiple-choice questions about flying. 'You know where the cockpit is?' 'Sure, it's in the middle of that thing'. 'You know what to do when you have to crash land?' 'Of course, I do.'" Unsurprisingly, that's not how flight schools do things: "no, you put them in a flight simulator and see if they can do the job."
How many people would cheat on a LinkedIn test? It doesn't really have to be a big number to make a difference. "Even if this is like 10 per cent of the population, right, or six per cent. When you have 1,000 candidates that you're trying to choose from, six to 10 per cent is like 60 to 100 people who are completely unqualified, right?"
Following in Tigran's footsteps
So it's time to beef up my digital CV. The trouble is that almost all the tests are about coding, and the only bit of code I know is how to add HTML links: a skill I occasionally have to look up if I haven't done it in a while. Tigran was helped by a knowledge of some code, which made the answers Googlable within the 90-second time limit: for me, I'd barely be able to read the acknowledgements page of Coding for Dummies.
That's a problem, because even with multiple choice, the odds aren't in my favour. With 15 four-answer questions, relying on pure guesswork, I stand less than a 0.0001 per cent chance of getting every one right. Even to get half marks, it's around a one per cent chance. Fortunately, there are 29 tests to take, which increases the odds somewhat because… you know what, let's just add probability to the long list of skills I don't have and move on, shall we?
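For the curious (or the sceptical), those guesswork odds can be sanity-checked with a few lines of Python. This is a back-of-the-envelope sketch: it assumes the 15-question, four-option format described above, and it assumes, for illustration, that "half marks" means at least 8 of 15 — LinkedIn doesn't publish its pass threshold.

```python
from math import comb

def at_least(k, n=15, p=0.25):
    """Chance of at least k correct answers from n pure four-option guesses."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# All 15 right by guesswork: (1/4)^15 -- comfortably under 0.0001 per cent
print(f"{0.25 ** 15:.10%}")

# At least half marks (assumed here as 8 of 15): on the order of one per cent
print(f"{at_least(8):.2%}")
```

Across 29 independent tests, of course, even a roughly-one-per-cent shot at scraping a pass starts to add up — which is rather the point.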
Nonetheless, like a problem gambler, I take my first shot, starting at the very top of the list with Amazon Web Services. Any hopes that it might be about buying suspiciously cheap books are quickly dashed as I'm hit with questions involving more brackets and symbols than any sentence should ever have.
Somehow, I pass. How the hell did I manage that? A combination of quick Googling, guesswork and the old pub quiz machine classic: looking at what the answers have in common and picking the ones with similar words to each other.
Anyway, I've tasted (very low-level) fraud and I like it. Next up is AutoCAD, which feels appropriate for a cheating cad like me. No joy, this time: just a sorry looking Google search history and nothing to show for it.
PowerPoint, on the other hand, is a rip-roaring success, and another string to my imaginary bow despite me using it precisely once in my life. Quite rightly, I flunk HTML after a catastrophic choice on the examiner's part not to focus entirely on how to add clickable links.
At this point, I decide to try another strategy: outside help. I message the QuickBooks questions to an accountant as I go, to see if 90 seconds is enough time to scrape a passing grade with a little assistance. First off, LinkedIn won't let you copy and paste here, so I'm forced to use Microsoft's tools against it, nefariously grabbing each question with the Windows Snipping Tool and WhatsApping it to my contact. This usually costs me five to 15 seconds, which isn't too bad, excluding the one where I sent the same question twice, panicked and guessed wrong.
But I/my contact didn't pass: turns out that five of the questions were a bit ambiguous and some were clearly aimed at American accountants. "I think that because I know all the exceptions to rules (or most of them), then none of the answers were clear cut to me," he explained afterwards. Oh, and he doesn't use QuickBooks much. Should probably have factored that in: we can't all be Amazon Web Services experts, like me.
So overall, it's Me 2, LinkedIn 3.
Fixing the problem
Clearly, if you're determined to cheat, you can. I didn't resort to it, but no doubt I could have found someone on Upwork who would have helped me out for a few quid, if I really wanted to pose as a Python pro. Now I know that's not a euphemism.
So how can LinkedIn fix its problem? For Sloyan and CodeSignal, the answer is longer exams that actually make you prove your skills rather than pick the right answer. The company's tests have shown that 60 to 80-minute exams are right for this: "if your assessments are less than 60 minutes on any proper subject, you cannot claim to have measured ability reliably on the other side," he says. Suffice it to say if LinkedIn's tests were 60 minutes long, I wouldn't have taken five of them. Nor, I suspect, would anyone, and there's the rub.
But for all of the problems with Skill Assessments, Sloyan still thinks it's the right way to go. "I admire that you guys are getting into the space, but this is not how you're going to do it," he explains. "Recruiters have gotten really used to pedigree recruiting and it's going to take a lot of work to undo that mindset and get them into measuring ability directly.
"Trusting the ability of someone who's maybe a college dropout or went to a community school or maybe doesn't have any of the companies that you expect to see on their resume... For them to put that aside and trust an assessment, it's going to take a lot of work and not just like 'hey, they passed a multiple-choice test.'"
It's a fair point. And anyone who hires me to look after a project based on Amazon Web Services will quickly learn that lesson the hard way. µ