• unskilled5117@feddit.org
    1 day ago

    That seems highly unlikely to me. Could one reason be AI website scrapers using different user agents to avoid being blocked? The recent reports of various projects being plagued by scrapers fit the timeline.
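
    The user-agent trick mentioned above is simple to pull off. As a minimal sketch (the agent strings and helper name are illustrative, not from any specific scraper), a crawler can just pick a different browser-like `User-Agent` header for each request so that naive blocklists keyed on a single agent string never match:

    ```python
    import random

    # Hypothetical pool of common browser user-agent strings a scraper
    # might rotate through to avoid matching a simple blocklist.
    USER_AGENTS = [
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15",
        "Mozilla/5.0 (X11; Linux x86_64; rv:124.0) Gecko/20100101 Firefox/124.0",
    ]

    def rotating_headers():
        """Return HTTP headers with a randomly chosen User-Agent,
        so consecutive requests appear to come from different browsers."""
        return {"User-Agent": random.choice(USER_AGENTS)}

    headers = rotating_headers()
    print(headers["User-Agent"])
    ```

    This is exactly why operators have moved to behavioral detection (request rate, IP reputation, proof-of-work challenges) rather than trusting the user-agent string.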

      • unskilled5117@feddit.org
        15 hours ago

        Yeah, I read that too. But how well is it working? I mean, that’s what the news was all about these last few months: a lot of projects were having trouble blocking the bots because the bots were trying, and succeeding, to circumvent detection measures.