diff --git a/content/about/uses.gmi b/content/about/uses.gmi
index bef657f..022bd2d 100644
--- a/content/about/uses.gmi
+++ b/content/about/uses.gmi
@@ -66,6 +66,7 @@ I don't currently use a prebuilt desktop environment. I assemble mine out of the
 * wormhole-william
 * rdrview
 * Efficient Compression Tool (better than Zopfli/ZopfliPNG)
+* usvg and resvg
 * zpaqfranz
 * scc
 * Pandoc
diff --git a/content/about/uses.md b/content/about/uses.md
index 4181bb4..42fec05 100644
--- a/content/about/uses.md
+++ b/content/about/uses.md
@@ -172,6 +172,11 @@ z.lua
 [Efficient Compression Tool](https://github.com/fhanau/Efficient-Compression-Tool)
 : The last word in optimizing gzip or PNG size. Runs circles around Zopfli, ZopfliPNG, oxipng, etc. I use it in combination with `brotli` to compress all static text and PNGs on this site.
 
+[usvg](https://github.com/RazrFalcon/resvg/tree/master/crates/usvg)
+: An SVG compiler, and one of the most under-appreciated tools I use. It compiles complex SVGs into simpler path-based SVGs. Edge-case SVGs may render incorrectly in some renderers (e.g. librsvg), but compiling them with usvg tends to iron these edge-cases out and make them more compatible. `usvg` is part of the [resvg](https://github.com/RazrFalcon/resvg) project, which is the most conformant SVG renderer I know of.
+
+ All the SVGs I serve on seirdy.one have gone through `usvg`.
+
 [zpaqfranz](https://github.com/fcorbelli/zpaqfranz)
 : I use this for my long-term backups. `zpaq` is a journaling archiver, which allows me to compress backup deltas without having to use a journaling filesystem. `zpaqfranz` adds several features related to integrity-checking. The compression ratios are ridiculously good, even without the journaling; it beats every other realistic option, especially when combined with pre-processing offered by [lrzip-next](https://github.com/pete4abw/lrzip-next).
 
diff --git a/static/robots.txt b/static/robots.txt
index 1ca14c3..0998ff5 100644
--- a/static/robots.txt
+++ b/static/robots.txt
@@ -110,8 +110,9 @@
 Disallow: /
 
 # FacebookBot crawls public web pages to improve language models for our speech
 # recognition technology.
-# UPDATE 2024-07: The Meta-ExternalAgent crawler crawls the web for use cases such as training AI models or improving products by indexing content directly.
 #
+# UPDATE: The Meta-ExternalAgent crawler crawls the web for use cases such as training AI models or improving products by indexing content directly.
+#
 User-Agent: FacebookBot
 User-Agent: meta-externalagent
 Disallow: /
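The uses.md hunk above describes passing every served SVG through `usvg` before deployment. A minimal sketch of that pass is below; the input filenames and the `.min.svg` output naming are assumptions, not taken from the patch, and the commands are printed rather than executed so the sketch works even where `usvg` is not installed (`usvg`'s actual CLI is `usvg <input.svg> <output.svg>`):

```shell
#!/bin/sh
# Dry-run sketch of a usvg compile pass over site assets.
# Hypothetical input paths; swap in the real asset directory.
for f in assets/logo.svg assets/icon.svg; do
    # ${f%.svg} strips the .svg suffix so we can append .min.svg
    printf 'usvg %s %s\n' "$f" "${f%.svg}.min.svg"
done
```

Removing the `printf` wrapper (i.e. calling `usvg "$f" "${f%.svg}.min.svg"` directly) would perform the actual compilation.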