---
title: "CNET didn’t have to delete old articles"
date: 2023-08-15T09:22:31-07:00
replyURI: "https://gizmodo.com/cnet-deletes-thousands-old-articles-google-search-seo-1850721475"
replyTitle: "CNET Deletes Thousands of Old Articles to Game Google Search"
replyType: "NewsArticle"
replyAuthor: "Gizmodo"
replyAuthorType: "NewsMediaOrganization"
replyAuthorURI: "https://gizmodo.com/"
syndicatedCopies:
  - title: 'The Fediverse'
    url: 'https://pleroma.envs.net/notice/AYlCIhRRPwoFEDB0dM'
  - title: 'The Mojeek Discourse'
    url: 'https://community.mojeek.com/t/cnet-didn-t-have-to-delete-old-articles/703'
---

CNET actually didn't have to delete old articles to improve its ranking. If CNET had simply removed those articles from its sitemap, used [WebSub](https://www.w3.org/TR/websub/) to inform Google (and IndexNow to inform Bing, Seznam, and Yandex) of new higher-priority pages, and maybe used `robots.txt` to disallow crawling of stale pages, it could have kept its old content while prioritizing the crawling of recent content. I've sketched what each of those steps could look like at the end of this post. Nothing I just described is Google-specific; these are all agreed-upon standards that work across several search engines.

I suppose it's easier to just delete pages, though. Less labor means fewer expenses. After all, this is the outlet that [cut costs with algorithmically-generated articles](https://gizmodo.com/cnet-ai-chatgpt-tech-news-1850017739).
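For the curious, here's roughly what each step involves. First, the sitemap: removing stale entries is just a matter of no longer listing them. The pages stay online; they simply stop being advertised for crawling. A minimal sketch, with made-up URLs under `example.com`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Trimmed sitemap: only recent, high-priority pages are listed.
     Old articles remain online; they're just no longer advertised. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/2023/08/new-flagship-review/</loc>
    <lastmod>2023-08-14</lastmod>
  </url>
  <url>
    <loc>https://example.com/2023/08/back-to-school-deals/</loc>
    <lastmod>2023-08-15</lastmod>
  </url>
  <!-- no entries for decades-old posts -->
</urlset>
```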
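Second, WebSub. The spec leaves the publisher-to-hub notification mechanism open, but the common convention (inherited from PubSubHubbub, and accepted by Google's public hub) is a form-encoded POST with `hub.mode=publish` naming the updated feed. A minimal sketch in Python, assuming that hub and a hypothetical feed URL:

```python
# WebSub "publish" ping: tell the hub that a feed (the "topic") has new
# entries, so subscribers get pushed the update instead of polling for it.
from urllib.parse import urlencode
from urllib.request import Request, urlopen

HUB = "https://pubsubhubbub.appspot.com/"       # Google's public hub
FEED = "https://example.com/rss/all.xml"        # topic that just gained entries

data = urlencode({"hub.mode": "publish", "hub.url": FEED}).encode()
with urlopen(Request(HUB, data=data)) as resp:  # data= makes this a POST
    print(resp.status)                          # hubs typically answer 204
```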
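Third, IndexNow. Participating engines (Bing, Seznam, and Yandex among them) share submitted URLs with each other, so one JSON POST to the shared endpoint covers all of them. The key is a string you generate and also host at `https://<host>/<key>.txt` so the engines can verify site ownership. Another sketch, with a made-up key:

```python
# IndexNow batch submission: one POST reaches every participating engine.
import json
from urllib.request import Request, urlopen

ENDPOINT = "https://api.indexnow.org/indexnow"
payload = {
    "host": "example.com",
    # Made-up key; the same string must also be served at
    # https://example.com/0123456789abcdef0123456789abcdef.txt
    "key": "0123456789abcdef0123456789abcdef",
    "urlList": [
        "https://example.com/2023/08/new-flagship-review/",
        "https://example.com/2023/08/back-to-school-deals/",
    ],
}
req = Request(
    ENDPOINT,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json; charset=utf-8"},
)
with urlopen(req) as resp:
    print(resp.status)  # 200 or 202 means the submission was accepted
```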
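Finally, `robots.txt`. Disallowing stale sections asks crawlers to stop spending crawl budget there without taking anything offline (these paths are hypothetical):

```text
# Keep old sections online, but ask crawlers not to
# spend crawl budget on them.
User-agent: *
Disallow: /archive/
Disallow: /reviews/discontinued/

Sitemap: https://example.com/sitemap.xml
```

Note that `Disallow` governs crawling, not publication: readers and inbound links still work, which is the whole point of doing this instead of deleting.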