Use a new branch of webmentiond that lets me pull in all webmentions for
all pages in a single JSON response
Before, Hugo would make one request to webmentiond per page to ask for
approved webmentions for that page. Sometimes it made two requests,
because some pages used to have a different canonical location. In all,
it ended up making over 150 requests within a second or two. Webmentiond
can handle this for now, but it isn't sustainable: the page count will
only grow with time. I wanted Hugo to instead fetch all webmentions for
all pages in one cached request.
I recompiled webmentiond from
https://github.com/zerok/webmentiond/pull/65, which updates the API to
support admin access keys. The admin API allows pulling in all
webmentions for all pages at once, instead of one page at a time.
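Roughly, that means one request instead of one per page. A minimal
sketch; the endpoint path, query parameter, and hostname are
placeholders rather than the documented API:

    # Pull every approved webmention in a single request using the admin
    # bearer token. The /manage/mentions path and the status parameter
    # are assumptions; the hostname is a placeholder.
    curl --silent --fail \
        --header "Authorization: Bearer ${TOKEN}" \
        "https://webmentions.example.com/manage/mentions?status=approved" \
        > data/webmentions.json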
Doing so requires a bearer token, so I had to manage a new CI secret:
the password used to fetch that token. I fetch the token in a shell
script (get-token.sh) and write it to a temporary file, then have Hugo
read the token from that file. The shell script gets the password from
either the CI secret (in CI) or my password manager (on my
workstation).
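A minimal sketch of that script, assuming the CI secret arrives as an
environment variable and the password manager is pass(1); the
authentication endpoint and all names are placeholders:

    #!/bin/sh
    # get-token.sh: exchange the webmentiond password for a bearer token
    # and write it to a temporary file for Hugo to read.
    # The endpoint, env-var name, and pass(1) entry are assumptions.
    set -eu

    if [ -n "${CI:-}" ]; then
        password="${WEBMENTIOND_PASSWORD:?}"   # CI secret, injected as an env var
    else
        password="$(pass show webmentiond)"    # local password manager
    fi

    # Exchange the password for a bearer token (endpoint is an assumption).
    token="$(curl --silent --fail \
        --data-urlencode "password=${password}" \
        "https://webmentions.example.com/authenticate/access-key")"

    # Hugo reads the token from this file at build time.
    printf '%s' "$token" > "${TOKEN_FILE:-/tmp/webmentiond-token}"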
TODO: support marginalia (mentions with fragments in their targets)
- Allow long Cache-Control durations in webhint
- webhint warns when load time is within a whole second of the target
  load time, so bump the target up a little.
- Add a comment to the build manifest
Make it possible to build the tilde, staging, production, and onion
sites in parallel. Lint the staging site before deploying it.
Also make the build recipes bmake-compatible so I can use bmake instead
of gmake.
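In CI that boils down to something like the following; the target names
are hypothetical:

    # Build all four flavors of the site concurrently; -j works in both
    # bmake and gmake. Target names are hypothetical.
    make -j4 site-tilde site-staging site-production site-onion
    make lint-staging    # lint the staging build before deploying it
    make deploy-staging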
ECT is more efficient than Zopfli given the same amount of time. It uses
Zopfli under the hood.
- Switch from binaries.tar.br to binaries.tar.gz
- Bring in the statically-linked ect/brotli binaries from
binaries.tar.gz
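The compression step now looks roughly like this; the tarball layout,
file globs, and ect flags are from memory and could differ:

    # Unpack the statically-linked compressors shipped as a build artifact.
    tar -xzf binaries.tar.gz    # assumed to provide ./bin/ect and ./bin/brotli

    # Gzip-precompress text assets with ect instead of stock zopfli;
    # the -gzip flag spelling may differ between ect versions.
    find public -type f \( -name '*.html' -o -name '*.css' -o -name '*.js' \
        -o -name '*.svg' -o -name '*.xml' \) \
        -exec ./bin/ect -9 -gzip {} +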
Update the Makefile to download the old version of the site, run
static-webmentions, and collect the WebMentions to send into a JSON
file saved as a build artifact.
Don't send these automatically; just save them for now. Until I work
out a solution for recording sent WebMentions and avoiding duplicates,
I'll keep WebMention sending manual.
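As a sketch, the new step amounts to the following; the archive URL,
paths, and the static-webmentions invocation are assumptions about its
interface, not its documented usage:

    # Fetch the currently-deployed version of the site to diff against.
    # URL and paths are placeholders.
    mkdir -p old-site
    curl --silent --fail https://example.com/site-archive.tar.gz \
        | tar -xz -C old-site

    # Find outgoing WebMentions by comparing the old and new builds, and
    # save them as a CI artifact. Nothing is sent here; sending stays manual.
    static-webmentions old-site public > mentions-to-send.json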
For some reason, this caused webhint's axe/aria test to error out with
a "Protocol error", so I disabled it. Axe tests are covered by
Lighthouse anyway.
Compress with brotli ahead of time in CI, just like we do with zopfli
for gzip_static (see the sketch below).
Update hintrc to check for brotli compression.
Update the Lighthouse config to throttle the CPU some more, since
brotli decompression can be heavier.
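The pre-compression pass is the same idea as the gzip step, just
producing .br files; globs and paths are assumptions:

    # Brotli-compress the same text assets we already gzip, so the
    # server can hand out the .br files statically.
    # File globs and the output directory are assumptions.
    find public -type f \( -name '*.html' -o -name '*.css' -o -name '*.js' \
        -o -name '*.svg' -o -name '*.xml' \) \
        -exec brotli -q 11 -f -k {} +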