• BeNotAfraid@lemmy.world

    It is basically instantaneous on my 12-year-old Kepler GPU Linux box, and it is substantially less impactful on the environment than AI tar pits and other deterrents. The cryptography involved is something almost all browsers from the last 10 years can do natively, while scrapers have to be individually programmed to do it, which makes it several orders of magnitude beyond impractical for every single corporate bot to be repurposed for. And even then it would be rendered moot, because it’s an open-source project whose cryptographic algorithm someone will simply update. These posts contain links to articles; if you read them, you might answer some of your own questions and have more to contribute to the conversation.

    • koper@feddit.nl

      > It is basically instantaneous on my 12-year-old Kepler GPU Linux box.

      It depends on what the website admin sets, but I’ve had checks take more than 20 seconds on my reasonably modern phone. And as scrapers get more ruthless, that difficulty setting will have to go up.
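      The scaling behind that difficulty setting is worth making concrete. Anubis-style challenges are typically of the form "find a nonce whose hash has N leading zero bits," so the client's expected work doubles with every extra bit while the server still verifies with a single hash. A minimal sketch in Python (the leading-zero-bit rule and names here are illustrative assumptions, not the exact Anubis protocol):

      ```python
      import hashlib

      def solve_pow(challenge: str, difficulty_bits: int) -> int:
          """Brute-force a nonce so that SHA-256(challenge + nonce) falls
          below a target, i.e. starts with `difficulty_bits` zero bits.
          Expected attempts: 2 ** difficulty_bits."""
          target = 1 << (256 - difficulty_bits)
          nonce = 0
          while True:
              digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
              if int.from_bytes(digest, "big") < target:
                  return nonce
              nonce += 1

      def verify_pow(challenge: str, nonce: int, difficulty_bits: int) -> bool:
          """Server-side check: one hash, regardless of difficulty."""
          digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
          return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

      nonce = solve_pow("example-challenge", 16)  # ~65,000 hashes on average
      assert verify_pow("example-challenge", nonce, 16)
      ```

      Because client cost is exponential in the difficulty while verification stays constant, raising the setting to outpace scrapers also multiplies solve times on slow phones, which is exactly the tension described above.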

      > The cryptography happening is something almost all browsers from the last 10 years can do natively that scrapers have to be individually programmed to do, making it several orders of magnitude beyond impractical for every single corporate bot to be repurposed for.

      At best these browsers are going to have some efficient CPU implementation. Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient. This is also not complex; a team of engineers could set it up in a few days.

      > Only to then be rendered moot, because it’s an open-source project that someone will just update the cryptographic algorithm for.

      There might be something in changing to a better, GPU-resistant algorithm like Argon2, but browsers don’t support those natively, so you would rely on an even less efficient implementation in JS or WASM. Quickly changing details of the algorithm in a game of whack-a-mole could work to an extent, but that would turn this into an arms race, and the scrapers can afford far more development time than the maintainers of Anubis.
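      The GPU-resistance argument comes down to memory-hardness. Argon2 isn’t in the Python standard library, but scrypt (another memory-hard KDF, in `hashlib` since Python 3.6) illustrates the same property: the cost parameters force each hash attempt to touch megabytes of RAM, which erodes the advantage of hardware built for compute-bound hashes like SHA-256. A sketch with illustrative (not recommended) parameters:

      ```python
      import hashlib

      # Compute-bound hash: tiny state, trivially parallel on GPUs/FPGAs.
      fast = hashlib.sha256(b"challenge-nonce").hexdigest()

      # Memory-hard KDF: memory used is roughly 128 * n * r bytes,
      # so n=2**14, r=8 needs ~16 MiB of scratch space per attempt.
      # Thousands of parallel attempts need thousands of 16 MiB buffers.
      hard = hashlib.scrypt(
          b"challenge-nonce",
          salt=b"per-challenge-salt",
          n=2**14,              # CPU/memory cost (work factor)
          r=8,                  # block size multiplier
          p=1,                  # parallelization
          maxmem=64 * 1024 * 1024,
          dklen=32,
      )
      ```

      The trade-off the comment above points out still holds: a browser running Argon2 or scrypt in JS/WASM pays more per attempt than a native implementation, so memory-hardness narrows but does not close the gap between legitimate visitors and dedicated farms.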

      > These posts contain links to articles, if you read them you might answer some of your own questions and have more to contribute to the conversation.

      This is very condescending. I would prefer if you would just engage with my arguments.

      • BeNotAfraid@lemmy.world
        link
        fedilink
        arrow-up
        2
        arrow-down
        1
        ·
        edit-2
        2 months ago

        > At best these browsers are going to have some efficient CPU implementation.

        That means absolutely nothing in the context of what I said, or of any information contained in the article. It does not relate to anything I originally replied to.

        > Scrapers can send these challenges off to dedicated GPU farms or even FPGAs, which are an order of magnitude faster and more efficient.

        That is not what’s happening here. Be serious.

        > I would prefer if you would just engage with my arguments.

        I did; your arguments are bad and you’re being intellectually disingenuous.

        > This is very condescending.

        Yeah, that’s the point. Very astute.