
syndicate

Rohan Kumar 2023-08-15 09:27:46 -07:00
parent 1163e0d311
commit 4ee4730cb8


@@ -7,6 +7,11 @@ replyType: "NewsArticle"
replyAuthor: "Gizmodo"
replyAuthorType: "NewsMediaOrganization"
replyAuthorURI: "https://gizmodo.com/"
syndicatedCopies:
  - title: 'The Fediverse'
    url: 'https://pleroma.envs.net/notice/AYlCIhRRPwoFEDB0dM'
  - title: 'The Mojeek Discourse'
    url: 'https://community.mojeek.com/t/cnet-didn-t-have-to-delete-old-articles/703'
---
CNET didn't actually have to delete old articles to improve its ranking. If CNET had simply removed those articles from its sitemap, used [WebSub](https://www.w3.org/TR/websub/) to inform Google (and IndexNow to inform Bing, Seznam, and Yandex) of new, higher-priority pages, and perhaps used `robots.txt` to disallow crawling of stale pages, it could have kept old content while prioritizing the crawling of recent content. Nothing I just described is Google-specific; these are all agreed-upon standards that work across several search engines.
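
To make the notification step concrete, here's a minimal sketch. Everything in it is an assumption for illustration: it uses Google's reference WebSub hub at pubsubhubbub.appspot.com, the shared api.indexnow.org endpoint, and placeholder values for the feed URL, page URLs, and IndexNow key; none of these reflect CNET's actual setup.

```python
#!/usr/bin/env python3
"""Ping search engines about new pages instead of deleting old ones.

All values below are placeholders: the hub, endpoint, key, feed, and
URLs are assumptions for illustration, not CNET's real configuration.
"""
import urllib.parse
import urllib.request

WEBSUB_HUB = "https://pubsubhubbub.appspot.com/"  # Google's reference hub (assumed)
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"  # shared IndexNow endpoint
INDEXNOW_KEY = "your-indexnow-key"  # hypothetical key, hosted at /<key>.txt on the site
FEED_URL = "https://example.com/atom.xml"  # feed listing the new, high-priority pages
NEW_URLS = ["https://example.com/reviews/new-laptop/"]  # hypothetical fresh pages


def websub_publish(topic: str) -> int:
    """POST a WebSub 'publish' ping so the hub refetches the updated feed."""
    data = urllib.parse.urlencode({"hub.mode": "publish", "hub.url": topic}).encode()
    with urllib.request.urlopen(urllib.request.Request(WEBSUB_HUB, data=data)) as resp:
        return resp.status  # the reference hub answers 204 on success


def indexnow_submit(url: str) -> int:
    """GET the IndexNow endpoint to notify Bing, Seznam, and Yandex about one URL."""
    query = urllib.parse.urlencode({"url": url, "key": INDEXNOW_KEY})
    with urllib.request.urlopen(f"{INDEXNOW_ENDPOINT}?{query}") as resp:
        return resp.status  # 200 or 202 once the submission is accepted


if __name__ == "__main__":
    websub_publish(FEED_URL)  # one ping for the updated feed
    for url in NEW_URLS:
        indexnow_submit(url)  # one ping per new page
```

The complementary `robots.txt` change would just be a `Disallow:` line covering whatever path prefix holds the stale archive; the exact rule depends entirely on CNET's URL structure.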