Reddit's logo displayed at the New York Stock Exchange in New York City, US, March 21, 2024.

Social media platform Reddit said on Tuesday that it will update a web standard it uses to block automated scraping of its website, following reports that AI startups were bypassing the rule to gather content for their systems.
Reddit said it would update the Robots Exclusion Protocol, or "robots.txt," a widely accepted standard that tells crawlers which parts of a site they are allowed to visit. More recently, robots.txt has become a key tool that publishers use to prevent tech companies from taking their content free of charge to train AI algorithms and generate summaries in response to search queries.
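For readers unfamiliar with the standard, a robots.txt file is a plain-text file served at a site's root that lists per-crawler access rules. The entries below are a minimal, hypothetical sketch of the format; the crawler names and paths are illustrative examples, not Reddit's actual rules.

```
# Hypothetical robots.txt — example rules, not Reddit's real policy.

# Block a specific AI training crawler from the entire site.
User-agent: ExampleAIBot
Disallow: /

# Allow all other crawlers, except for a private path.
User-agent: *
Disallow: /private/
Allow: /
```

Compliance with robots.txt is voluntary: a crawler must choose to read and honor the file, which is why reports of AI startups bypassing it have prompted publishers to seek stronger measures.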
This follows a Wired investigation that found AI search startup Perplexity likely bypassed efforts to block its web crawler via robots.txt.
Source: ChannelNewsAsia