---
title: "CNET didn’t have to delete old articles"
date: 2023-08-15T09:22:31-07:00
replyURI: "https://gizmodo.com/cnet-deletes-thousands-old-articles-google-search-seo-1850721475"
replyTitle: "CNET Deletes Thousands of Old Articles to Game Google Search"
replyType: "NewsArticle"
replyAuthor: "Gizmodo"
replyAuthorType: "NewsMediaOrganization"
replyAuthorURI: "https://gizmodo.com/"
syndicatedCopies:
---
CNET didn't actually have to delete old articles to improve its ranking. Had CNET simply removed those articles from its sitemap, used WebSub to inform Google (and IndexNow to inform Bing, Seznam, and Yandex) of new higher-priority pages, and perhaps used robots.txt to disallow crawling of stale pages, it could have kept its old content while prioritizing the crawling of recent content. Nothing I just described is Google-specific; these are all agreed-upon standards that work across several search engines.
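Here's a minimal sketch of what that could look like. The `example.com` domain, the Atom feed path, the `/archive/` prefix, and the IndexNow key are all hypothetical placeholders; only the endpoints (Google's public WebSub hub and the shared IndexNow API) are real. First, a robots.txt rule to disallow crawling of a stale-pages prefix:

```
# Hypothetical robots.txt rule: stop crawlers from fetching stale pages
# filed under /archive/, without deleting them.
User-agent: *
Disallow: /archive/
```

And the two notification pings, sketched in Python with only the standard library:

```python
# Sketch of the WebSub and IndexNow pings described above. All names
# below are illustrative placeholders, not CNET's real infrastructure.
import json
import urllib.parse
import urllib.request

SITE = "https://example.com"               # hypothetical site
FEED = f"{SITE}/atom.xml"                  # WebSub topic: the site's feed
HUB = "https://pubsubhubbub.appspot.com/"  # Google's public WebSub hub
INDEXNOW_KEY = "0123456789abcdef"          # hypothetical key, hosted at keyLocation


def websub_publish(topic: str) -> None:
    """Tell the WebSub hub that the topic (the site's feed) has updated."""
    data = urllib.parse.urlencode(
        {"hub.mode": "publish", "hub.url": topic}
    ).encode()
    urllib.request.urlopen(urllib.request.Request(HUB, data=data))


def indexnow_submit(urls: list[str]) -> None:
    """Submit new or updated URLs via IndexNow."""
    body = json.dumps({
        "host": urllib.parse.urlparse(SITE).netloc,
        "key": INDEXNOW_KEY,
        "keyLocation": f"{SITE}/{INDEXNOW_KEY}.txt",
        "urlList": urls,
    }).encode()
    req = urllib.request.Request(
        "https://api.indexnow.org/indexnow",
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    websub_publish(FEED)
    indexnow_submit([f"{SITE}/new-article/"])
```

Submitting to any one IndexNow-participating engine shares the URLs with the rest, which is why a single POST covers Bing, Seznam, and Yandex.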
I suppose it's easier to just delete pages, though. Less labor means fewer expenses. After all, this is the outlet that cut costs with algorithmically generated articles.