IT needs more brains, so why is it so bad at getting them?::Open-book exams aren’t nearly open enough

  • Artair Geal@pawb.social · 27 points · 1 year ago

    Speaking from years of experience in IT (nearly thirty of them), I can give my own unscientific opinion: because people put too much faith in certifications, and refuse to do any on-the-job training. You can have five of the six skills listed in a job ad, but if you don’t have that all-important sixth one, your application will get round-filed. It doesn’t matter that it would be simple to train a tech on that one thing. Businesses want phoenixes for chicken scratch.

    Certifications are a boondoggle, and have been for years. The tests have been rigged in such a way that candidates need to take them again and again to pass, and they get charged a fee for each attempt. The test itself is a revenue source for companies. The “prestige” those certifications bring for the companies that front them is based on their difficulty, not on their relevance or fairness.

    I once attended a Microsoft certification “boot camp.” We all worked our asses off, studied the material, and most of us passed at least one test. Only one person passed all three exams. I had noticed that person using test prep software with a logo that didn’t match the material we’d been given. It looked like an orange DNA helix.

    After the last test, a bunch of us milled around outside the building, and I asked the guy who passed how he made it through. He ran for his truck so fast that there was practically a dust cloud behind him. That’s when I decided to look up that logo on Google.

    He’d been using a “brain dump” service. For those unaware of what a “brain dump” is, it’s when a third-party company sends a bunch of people to intentionally fail the exams over and over. During each attempt, those people memorize the test questions. Then the company has their plants aggregate all the possible questions in an exam pool and the correct answers to them. In effect, it’s a copy of the whole test.

    Brain dumps are extremely common in IT. When I worked at VMware, many of our own employees used them to pass certification exams that were mandatory for continued employment. Those people had been doing their jobs for years. They just needed a bogus piece of virtual paper to prove it to our executive leadership. It was all about appearances.

    Why is tech struggling for qualified workers?

    Because it refuses to acknowledge them.

    • Dark Arc@social.packetloss.gg · 11 points · 1 year ago

      Why is tech struggling for qualified workers?

      Because it refuses to acknowledge them.

      This seems to be a common problem with industries that just can’t find talent. “Qualified” is used in place of “they meet our desires perfectly.”

      It’s the same idea as absurd incel dating standards. Sure, the issue may be the candidates; but maybe, just maybe, the issue is that you need to look in the mirror and ask yourself whether you’re being unreasonable.

    • Oliver Lowe@lemmy.sdf.org · 9 points · 1 year ago

      Fascinating insight about those brain dump services.

      Thanks for sharing your experiences. Massive respect for having done 30 years in this silly industry!

  • thelastknowngod@lemm.ee · 7 points · 1 year ago

    Honestly just changing the interview process would be enough to get more people into the business.

    Literally yesterday I did a code challenge to track the distance, speed, and maintenance schedules of forklifts in a warehouse and predict collisions between them. The job I was applying for was a pretty average SRE role: system design, IaC, CI/CD pipelines, PromQL, etc. How is that code challenge representative of the job in any way?
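
    For contrast, here is a rough sketch of the kind of thing the actual role involves day to day: pulling a PromQL result out of Prometheus’s HTTP API. The server URL and metric name below are made up for illustration.

    ```python
    import requests

    # Hypothetical Prometheus server; any real deployment exposes the same API.
    PROMETHEUS_URL = "http://prometheus.example.internal:9090"

    def error_rate(job: str, window: str = "5m") -> list:
        """Per-instance rate of 5xx responses for a job, via PromQL."""
        query = (
            f'sum by (instance) '
            f'(rate(http_requests_total{{job="{job}", code=~"5.."}}[{window}]))'
        )
        resp = requests.get(
            f"{PROMETHEUS_URL}/api/v1/query",
            params={"query": query},
            timeout=10,
        )
        resp.raise_for_status()
        body = resp.json()
        if body.get("status") != "success":
            raise RuntimeError(f"query failed: {body}")
        # Each result carries the label set and the latest sample value.
        return body["data"]["result"]

    if __name__ == "__main__":
        for series in error_rate("api-gateway"):
            print(series["metric"].get("instance"), series["value"][1])
    ```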

    I feel like I need to learn leetcode algorithm patterns just for the interviews… I never need them for the actual jobs I get hired for.
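
    And by “leetcode patterns” I mean drills like this one, the classic sliding-window exercise (longest substring without repeating characters), sketched from memory:

    ```python
    def longest_unique_substring(s: str) -> int:
        """Length of the longest substring of s with no repeated characters."""
        last_seen = {}  # character -> index where it was last seen
        start = 0       # left edge of the current window
        best = 0
        for i, ch in enumerate(s):
            # If ch already appears inside the window, shrink the window past it.
            if ch in last_seen and last_seen[ch] >= start:
                start = last_seen[ch] + 1
            last_seen[ch] = i
            best = max(best, i - start + 1)
        return best

    assert longest_unique_substring("abcabcbb") == 3  # "abc"
    assert longest_unique_substring("bbbbb") == 1
    ```

    Useful for passing the interview; I have yet to need it on call.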

    • huginn@feddit.it · 2 points · 1 year ago

      Leetcode-style interviews are good for showing off that you’re a smart, flexible employee who can solve novel problems.

      The issue is that most companies don’t have any novel problems; they just need quiet competence. But they want the best/smartest anyway.

    • linearchaos@lemmy.world · 1 point · 1 year ago

      Pre-COVID, I needed a low- to mid-level help desk person.

      My screening questions were:

      What are the steps for troubleshooting a user who can’t print?

      Excluding out-of-paper and out-of-toner/ink states, which are clearly displayed to the user, what is the most likely cause of not being able to print?

      If a user puts in a ticket saying they got a BSoD but missed what the message was, how do you find out what that message was?

      I wasn’t even looking for right answers; I was just looking for some signal that they had seen these problems before, or had a reasonable thought process for how to proceed.

      I had around 150 applicants; six of them said anything at all that made me think they had ever seen a printer problem or a blue screen of death before.
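
      For anyone curious about that last question: on most Windows machines the crash gets recorded in the System event log as event ID 1001 (“The computer has rebooted from a bugcheck”), so besides clicking through Event Viewer, one rough way to dig the message back out is a few lines of Python wrapping the built-in wevtutil tool:

      ```python
      import subprocess

      # Query the System log for recent "rebooted from a bugcheck" entries.
      # wevtutil ships with Windows; the XPath filter selects event ID 1001.
      # (1001 is also used by Windows Error Reporting, so check the source
      # shown in the output.)
      cmd = [
          "wevtutil", "qe", "System",
          "/q:*[System[(EventID=1001)]]",
          "/c:3",      # last three matches
          "/rd:true",  # newest first
          "/f:text",   # human-readable output
      ]
      result = subprocess.run(cmd, capture_output=True, text=True, check=True)
      print(result.stdout)
      ```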

  • Oliver Lowe@lemmy.sdf.org · 6 points (1 downvote) · 1 year ago

    The article argues for a reworked IT education industry in the hopes of a more skilled workforce:

    The result would solve the industry’s most pressing need, for good people doing good work, and through expansion into other areas benefit us more than AI will ever manage.

    Most IT today exists to support business and commerce. Corporations post absurd profits year over year. They don’t need more knowledgeable IT staff. What is “good” for IT industry employers may simply be more staff willing to say “yes, sir” and kick the can down the road. Business doesn’t care about efficient systems as long as the systems are profitable.

    So why is IT bad at getting brains? Because it is against most leadership’s interests. Progress, change, and automation all introduce risk, which can hurt profitability.

    • T156@lemmy.world · 2 points · 1 year ago (edited)

      If you’re not familiar with what they do, IT can also be seen as a money sink, since there’s no obvious sign of them preventing things from going wrong. They can look like a department that just sits there wasting money, or, when the company is inevitably hacked, like a department you wasted money on because it didn’t stop the attack in the first place.

  • lilShalom@lemmy.basedcount.com · 5 points (1 downvote) · 1 year ago

    IT requires you to constantly learn new things to stay relevant. I don’t know if any other industry requires this as much as IT.

    • Oliver Lowe@lemmy.sdf.org · 2 points · 1 year ago

      For me, that feeling of needing to learn new things comes not from new tech or tooling, but from needing to solve different problems all the time. There is definitely a fast-moving, hype-driven churn in web development (particularly frontend development!), and it really does wear me down. But outside of that, in IT you’re almost always interacting with stuff that has been the same for decades.

      Off the top of my head…

      Networking. From Ethernet and Wi-Fi, up to TCP/IP, packet switching, and protocols like HTTP.

      Operating systems. Vastly dominated by Windows and Linux. UNIX dates back to the 70s, and Windows on the NT kernel is no spring chicken either.

      Hardware. There have been amazing developments over the years. But incredibly this has been mostly transparent to IT workers.

      Programming. Check The Top Programming Languages 2023. Python, Java, C: decades old.

      User interfaces. Desktop GUI principles are unchanged. iOS and Android are almost 15 years old now.

      Dealing with public cloud infrastructure, for example, you’re still dealing with datacentres and servers. Instead of connecting to stuff over a serial console, you get the same data over VNC over HTTP. When you ask for 50 database servers, you make an HTTP request to some service, you wait, and you get a cluster of MySQL or PostgreSQL (written in C!) running on a UNIX-like OS (written in C!) that you interact with using SQL (almost 50 years old now?) over TCP/IP.

      As I spend more time in the industry I am constantly learning. But this comes more from me wanting to, or needing to, dig deeper.
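
      To make that concrete, here is a rough sketch of what “asking for 50 database servers” tends to look like in practice: one HTTP call to a provisioning service, then plain old SQL over TCP/IP to whatever comes back. The endpoint, payload, and credentials below are invented; every provider dresses this step up differently.

      ```python
      import requests
      import psycopg2  # ordinary PostgreSQL client, speaking SQL over TCP/IP

      # Hypothetical provisioning API; real clouds differ in detail, but the
      # shape (HTTP request in, connection details out) is the same.
      resp = requests.post(
          "https://provisioning.example.internal/v1/database-clusters",
          json={"engine": "postgres", "nodes": 50, "region": "eu-west-1"},
          timeout=30,
      )
      resp.raise_for_status()
      cluster = resp.json()  # assume: {"host": ..., "port": ..., "password": ...}

      # ...some waiting later, it is just a database like any other.
      conn = psycopg2.connect(
          host=cluster["host"],
          port=cluster["port"],
          user="admin",
          password=cluster["password"],
          dbname="postgres",
      )
      with conn.cursor() as cur:
          cur.execute("SELECT version();")
          print(cur.fetchone()[0])
      conn.close()
      ```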

      • Aceticon@lemmy.world · 1 point · 1 year ago (edited)

        This is also my experience.

        Whilst one can viably move around in IT to stay near the bleeding edge (which itself drifts from area to area over timeframes of a decade or so), most of what’s done in IT is the same old same old, just with bigger tech stacks. Expectations of fancy features keep going up while the timeframes stay the same (for example, integration with remote systems over the network used to be a pretty big deal, but nowadays it’s expected as the norm in plenty of situations), so you end up with ever larger frameworks and ever thicker stacks of external dependencies. Twenty or thirty years ago it was normal to manually manage the entire hierarchy of library dependencies, whereas nowadays you pull a clean project from source control and spend the next half an hour waiting for the dependencies to be downloaded by whatever dependency management system the project’s build framework (itself much more complex) uses.

  • uzay@infosec.pub · 3 points · 1 year ago

    Judging from the AI-generated picture above, I assume it’s because IT is an undead nightmare hellscape where you are shackled to ancient technology that sucks the life-blood out of you until you inevitably fuse into it and become part of the unending doom machine that is late-stage capitalism.

  • Sygheil@lemmy.world · 1 point · 1 year ago

    Certifications vs. real-world experience. Hoodies are better than suits. The pioneers didn’t even have one, and yet here we are.