• fuzzy_tinker@lemmy.world · 2 months ago

    This is fantastic and I appreciate that it scales well on the server side.

    AI scraping is a scourge. I would love to know the collective amount of power wasted on countermeasures like this, added to the total already wasted by AI itself.

      • adr1an@programming.dev · 2 months ago

        That’s awful: it would mean my photo ID gets stolen hundreds of times per day, or faked with something like thisfacedoesntexist… It won’t work, for many reasons. Not all websites require an account, and even those that do have a hard time implementing “personal verification” when they ask for it (like dating apps do). Most “serious” cases use human review of the photo plus a video where your face moves in and out of an oval shape…

  • Jankatarch@lemmy.world · 2 months ago

    Every time I see Anubis I get happy, because I know the website has some quality information.

  • refalo@programming.dev · 2 months ago

    I don’t understand how or why this got so popular out of nowhere… the same solution has already existed for years in the form of haproxy-protection and a couple of others… but nobody seems to care about those.

    • LePoisson@lemmy.world · 2 months ago

      Probably a similar reason as to why we don’t hear about the other potential hundreds of competing products or solutions to the same problem (in general).

      Luck.

      It’s just not fair in our world.

  • not_amm@lemmy.ml · 2 months ago

    I had seen that prompt but never looked into it. I found it a little annoying, mostly because I didn’t know what it was for, but now I won’t mind. I hope more solutions are developed :D

  • inbeesee@lemmy.world · 2 months ago

    Fantastic article! It makes me less afraid to host a website, knowing a potential solution like this exists.

  • koper@feddit.nl · 2 months ago

    I get that website admins are desperate for a solution, but Anubis is fundamentally flawed.

    It is hostile to the user, because it is very slow on older hardware and forces you to use JavaScript.

    It is bad for the environment, because it wastes energy on useless computations similar to mining crypto. If more websites start using this, that really adds up.

    But most importantly, it won’t work in the end. These scraping tech companies have much deeper pockets and can use specialized hardware that is much more efficient at solving these challenges than a normal web browser.
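
    For readers who haven’t seen how these interstitials work under the hood, the core mechanism is a hash puzzle: the browser burns CPU finding an answer that the server can verify with a single hash. A minimal sketch in Python (the seed string and leading-zero target here are illustrative, not Anubis’s actual wire format):

```python
import hashlib
import itertools

def solve_challenge(seed: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(seed + nonce) has `difficulty`
    leading zero hex digits. This is the work the visitor's browser does."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{seed}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(seed: str, difficulty: int, nonce: int) -> bool:
    """The server checks the answer with a single hash, which is why the
    scheme is cheap on the server side and expensive on the client side."""
    digest = hashlib.sha256(f"{seed}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_challenge("per-visitor-seed", 4)
assert verify("per-visitor-seed", 4, nonce)
```

    Each extra zero of difficulty multiplies the expected client work by 16, while verification stays a single hash, which is the asymmetry the whole debate below is about.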

    • Luke@lemmy.ml · 2 months ago

      she’s working on a non cryptographic challenge so it taxes users’ CPUs less, and also thinking about a version that doesn’t require JavaScript

      Sounds like the developer of Anubis is aware of and working on these shortcomings.

      Still, IMO these are minor short-term issues compared to the scope of the AI problem it’s addressing.

      • koper@feddit.nl · 2 months ago

        To be clear, I am not minimizing the problems of scrapers. I am merely pointing out that this strategy of proof-of-work has nasty side effects and we need something better.

        These issues are not short term. PoW means you are entering into an arms race against an adversary with bottomless pockets that inherently requires a ton of useless computations in the browser.

        When it comes to moving towards something based on heuristics, which is what the developer was talking about there, that is much better. But that is basically what many others are already doing (like the “I am not a robot” checkmark) and fundamentally different from the PoW that I argue against.

        Go do heuristics, not PoW.

    • BeNotAfraid@lemmy.world · 2 months ago

      It is basically instantaneous on my 12 year old Kepler GPU Linux box, and it is substantially less impactful on the environment than AI tar pits and other deterrents. The cryptography involved is something almost all browsers from the last 10 years can do natively, while scrapers have to be individually programmed to do it, making it several orders of magnitude beyond impractical for every single corporate bot to be repurposed, only to then be rendered moot, because it’s an open-source project where someone will just update the cryptographic algorithm. These posts contain links to articles; if you read them you might answer some of your own questions and have more to contribute to the conversation.

      • koper@feddit.nl · 2 months ago

        It is basically instantaneous on my 12 year old Kepler GPU Linux box.

        It depends on what the website admin sets, but I’ve had checks take more than 20 seconds on my reasonably modern phone. And as scrapers get more ruthless, that difficulty setting will have to go up.

        The Cryptography happening is something almost all browsers from the last 10 years can do natively that Scrapers have to be individually programmed to do. Making it several orders of magnitude beyond impractical for every single corporate bot to be repurposed for.

        At best these browsers are going to have some efficient CPU implementation. Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient. This is also not complex, a team of engineers could set this up in a few days.

        Only to then be rendered moot, because it’s an open-source project that someone will just update the cryptographic algorithm for.

        There might be something in switching to a better, GPU-resistant algorithm like argon2, but browsers don’t support those natively, so you would rely on an even less efficient implementation in JS or WASM. Quickly changing details of the algorithm in a game of whack-a-mole could work to an extent, but that turns this into an arms race, and the scrapers can afford far more development time than the maintainers of Anubis.
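
        To illustrate the memory-hardness point: argon2 has no Python stdlib binding, so scrypt stands in for it in this sketch (the parameters are illustrative, not a tuned recommendation):

```python
import hashlib

# scrypt with n=2**14, r=8 forces each evaluation to touch roughly
# 128 * r * n bytes (~16 MiB) of RAM. Plain SHA-256 needs almost no
# memory, so a GPU can run thousands of attempts in parallel; with a
# memory-hard function, RAM per attempt becomes the bottleneck,
# blunting the specialized-hardware advantage.
digest = hashlib.scrypt(b"nonce-guess", salt=b"per-visit-seed",
                        n=2**14, r=8, p=1, dklen=32)
print(len(digest))  # 32-byte tag the server can check like any hash
```

        The catch, as noted above, is that the browser would have to run this in JS or WASM rather than natively, giving legitimate visitors a slower implementation than the scrapers have.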

        These posts contain links to articles, if you read them you might answer some of your own questions and have more to contribute to the conversation.

        This is very condescending. I would prefer if you would just engage with my arguments.

        • BeNotAfraid@lemmy.world · 2 months ago

          At best these browsers are going to have some efficient CPU implementation.

          Means absolutely nothing in context to what I said, or any information contained in this article. Does not relate to anything I originally replied to.

          Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient.

          Not what’s happening here. Be Serious.

          I would prefer if you would just engage with my arguments.

          I did, your arguments are bad and you’re being intellectually disingenuous.

          This is very condescending.

          Yeah, that’s the point. Very Astute

    • DaPorkchop_@lemmy.ml · 2 months ago

      It takes like half a second on my Fairphone 3, and the CPU in this thing is absolute dogshit. I also doubt that the power consumption is particularly significant compared to the overhead of parsing, executing and JIT-compiling the 14MiB of JavaScript frameworks on the actual website.

      • koper@feddit.nl · 2 months ago

        It depends on the website’s setting. I have the same phone and there was one website where it took more than 20 seconds.

        The power consumption is significant, because it needs to be; that is the entire point of this design. If it doesn’t take a significant number of CPU cycles, scrapers will just power through the challenges. It may not matter for an individual user, but it adds up once this reaches widespread adoption and everyone’s devices have to solve those challenges.

        • iopq@lemmy.world · 2 months ago

          The phone’s CPU usually draws around 1 W, but can jump to 5–6 W when boosting to solve a nasty challenge. At 20 s per challenge, that’s about 0.03 Wh. You would need to see a thousand of these challenges to use up 0.03 kWh.

          My last power bill was around 300 kWh, or 10,000 times what your phone would use on those thousand challenges, and roughly ten million times what a single 20 s challenge uses.

    • FuckBigTech347@lemmygrad.ml · 2 months ago

      But most importantly, it won’t work in the end. These scraping tech companies have much deeper pockets and can use specialized hardware that is much more efficient at solving these challenges than a normal web browser.

      A lot of people don’t seem to be able to comprehend this. Even the most basic Server Hardware that these companies have access to is many times more powerful than the best Gaming PC you can get right now. And if things get too slow they can always just spin up more nodes, which is trivial to them. If anything, they could use this as an excuse to justify higher production costs, which would make resulting datasets and models more valuable.

      If this PoW crap becomes widespread, it will only make the Internet shittier and less usable for the average person in the long term. I despise the idea of running completely arbitrary computations just so some web admin somewhere can be relieved to know that the CPU spikes coming from their shitty NodeJS/Python framework, which generates all the HTML+CSS on the fly, does a couple of round trips and writes tens of log lines on every single request, are maybe, hopefully caused by a real human and not a sophisticated web crawler.

      My theory is people like to glaze Anubis because it’s linked to the general “Anti-AI” sentiment now (thanks to tech journalism), and also probably because its mascot character is an anime girl and the Developer/CEO of Techaro is a streamer/vtuber.

  • fox2263@lemmy.world · 2 months ago

    I’d like to use Anubis, but the strange hentai character as a mascot is not very professional.

    • sleepydragn1@lemmy.world · 2 months ago

      I actually really like the developer’s rationale for why they use an anime character as the mascot.

      The whole blog post is worth reading, but the TL;DR is this:

      Of course, nothing is stopping you from forking the software to replace the art assets. Instead of doing that, I would rather you support the project and purchase a license for the commercial variant of Anubis named BotStopper. Doing this will make sure that the project is sustainable and that I don’t burn myself out to a crisp in the process of keeping small internet websites open to the public.

      At some level, I use the presence of the Anubis mascot as a “shopping cart test”. If you either pay me for the unbranded version or leave the character intact, I’m going to take any bug reports more seriously. It’s a positive sign that you are willing to invest in the project’s success and help make sure that people developing vital infrastructure are not neglected.

    • TomAwezome@lemmy.world · 2 months ago

      It’s just image files; you can remove them or replace them with something more corporate. The author does state they’d prefer you didn’t change the pictures, but the license doesn’t require adhering to that personal request. At least 2 sites I’ve visited had Anubis running with a generic checkmark or X in place of the mascot.