- Avoid relative paths, so the scripts can run from any working
  directory.
- Make a dedicated curl-wrapping shell script instead of re-defining
  the same alias everywhere (first sketch after this list).
- Support extended offline periods: allow get-webmentions.sh to fall
  back to the cached copy of my webmentions for up to a day, without
  accidentally destroying it; keep changes atomic (second sketch after
  this list).
- Verify that the fetched webmentions are legitimate before replacing
  the cached ones.
- Make shellcheck happy about quoting in vnu.sh by passing the list of
  files with xargs instead of a shell variable (third sketch after
  this list).
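
A minimal sketch of the curl wrapper; the name ccurl.sh and the flag
choices are placeholders, not the actual script:

```sh
#!/bin/sh
# ccurl.sh: one shared curl wrapper instead of re-defining the same
# alias in every script. Flag choices here are assumptions.
# --fail turns HTTP errors into nonzero exit codes; --silent plus
# --show-error drops the progress bar but keeps error messages.
exec curl --fail --silent --show-error --location "$@"
```

Callers can then resolve the wrapper relative to their own location,
e.g. `"$(dirname "$0")/ccurl.sh"`, so nothing depends on the working
directory.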
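
A sketch of the fallback-and-atomic-replace logic in
get-webmentions.sh. The cache path, the endpoint, and the jq sanity
check are assumptions; "legitimate" is reduced here to "parses as
non-empty JSON":

```sh
#!/bin/sh
set -eu
endpoint='https://example.com/webmentions' # placeholder URL
cache="${XDG_CACHE_HOME:-$HOME/.cache}/webmentions.json"
mkdir -p "${cache%/*}"
# Create the temp file next to the cache so the final mv(1) is an
# atomic rename on the same filesystem.
tmp="$(mktemp "$cache.XXXXXX")"
trap 'rm -f "$tmp"' EXIT

if "$(dirname "$0")/ccurl.sh" "$endpoint" >"$tmp" \
  && jq -e 'length > 0' "$tmp" >/dev/null; then
  # Fetch and sanity check passed: replace the cache atomically,
  # so a failed run can never clobber a good cached copy.
  mv "$tmp" "$cache"
elif [ -n "$(find "$cache" -mtime -1 2>/dev/null)" ]; then
  # Fetch failed: fall back to a cached copy under a day old.
  echo 'using cached webmentions' >&2
else
  echo 'no fresh webmentions and no recent cache' >&2
  exit 1
fi
```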
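
And the vnu.sh change, roughly; the output directory and the vnu
invocation are assumptions:

```sh
# Before: "vnu $files" relied on word-splitting an unquoted variable,
# which shellcheck flags (SC2086). Let xargs build the argument list
# instead; NUL separators survive unusual filenames.
find public -type f \( -name '*.html' -o -name '*.xhtml' \) -print0 \
  | xargs -0 vnu
```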
html-tidy takes care of some post-processing, rendering other
substitutions obsolete. Remove the obsolete regex substitutions.
With those gone, the remaining substitutions can be expressed as
vanilla POSIX or POSIX-Extended regular expressions. Replace sd with
sed, and group the substitutions into a single invocation instead of
piping several invocations together. This makes post-processing almost
as fast as the initial build step.
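
The grouping looks roughly like this; the patterns are placeholders
rather than the real substitutions:

```sh
# Before: one process per substitution, piped together:
#   sd 'pattern1' 'repl1' | sd 'pattern2' 'repl2' | ...
# After: a single sed process applies every expression in order.
sed -E \
  -e 's/[[:space:]]+$//' \
  -e 's|<br/>|<br />|g' \
  <"$infile" >"$outfile"
```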
Use a new branch of webmentiond that lets me pull in all webmentions
for all pages in a single JSON response.
Before, Hugo made one request to webmentiond per page to ask for that
page's approved webmentions. Sometimes it made two requests, because
some pages used to have a different canonical location. In all, it
made over 150 requests within a second or two. Webmentiond can handle
this for now, but it isn't sustainable: the page count will only grow
with time. I wanted Hugo to instead fetch all webmentions for all
pages in one cached request.
I recompiled webmentiond from
https://github.com/zerok/webmentiond/pull/65, which updates the API to
support admin access keys. The admin API allows pulling in all
webmentions for all pages at once, instead of one page at a time.
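
What the single fetch might look like; the endpoint path and query
parameter are assumptions based on the PR's description, not the
confirmed API:

```sh
# One request for every page's approved webmentions, instead of 150+
# per-page requests. get-token.sh (sketched below) prints the path of
# the temp file holding the bearer token.
token="$(cat "$(./get-token.sh)")"
./ccurl.sh --header "Authorization: Bearer $token" \
  'https://example.com/webmentiond/manage/mentions?status=approved' \
  >webmentions.json
```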
Doing so requires a bearer token, so I had to manage a new CI secret:
the password used to request the token. A shell script (get-token.sh)
fetches the token and writes it to a temporary file, and Hugo reads
the token from that file. The script gets the password from either the
CI secret (in CI) or my password manager (on my workstation); a sketch
follows.
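
A sketch of get-token.sh under stated assumptions: the token endpoint,
the environment-variable names, and the use of pass(1) as the password
manager are all placeholders:

```sh
#!/bin/sh
set -eu
if [ -n "${CI:-}" ]; then
  # In CI, the password arrives as a secret environment variable.
  password="$WEBMENTIOND_PASSWORD"
else
  # On my workstation, ask the password manager.
  password="$(pass show webmentiond/admin)"
fi
# Exchange the password for a bearer token and save it where Hugo
# can read it. The endpoint path is an assumption.
token_file="$(mktemp)"
./ccurl.sh --request POST --data-urlencode "key=$password" \
  'https://example.com/webmentiond/token' >"$token_file"
printf '%s\n' "$token_file"
```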
TODO: support marginalia (mentions with fragments in their targets)
Left because I quit trying to make a good first-party iframe
alternative that conformed to my site's design standards while also
imparting the message of GEORGE as intended.
Whether I join or leave, GEORGE lives on. Whether GEORGE of the JUNGLE
or CURIOUS GEORGE, GEORGE is coming and GEORGE will be known to all as
the one true GEORGE.
- Make xhtml and html alternates the same (we're polyglot), cutting
  static-compression time in half (sketch after this list).
- Make axe-ff run on local files, reducing some overhead.
- Run webhint in parallel with other site checks, right after the
deployment finishes
- Fix the webhint performance budget: webhint warns when a metric
  comes within 0.5 s of its budget, so give each budget a 0.5 s
  buffer.
- Throttle LH a bit more
- Increase list padding so that ordered-list decimal markers have space
to fit without overflowing.
- Improve the style that removes the underline between the h-card name
  and photo, so it doesn't apply to unnecessary elements.
- Reduce budget for document size.
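
One way the halved compression time could work; this is my assumed
mechanism, not confirmed, and the paths are placeholders:

```sh
# With byte-identical alternates, the .xhtml file can be a hard link
# to the .html file, and the precompressed artifact can be shared the
# same way, so the compressor runs once per page instead of twice.
for page in public/*.html; do
  alt="${page%.html}.xhtml"
  ln -f "$page" "$alt"
  gzip --keep --force "$page"
  ln -f "$page.gz" "$alt.gz"
done
```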
FeedValidator was fixed upstream; it no longer uses an allowlist of
(X)HTML attributes. After updating FeedValidator to the latest commit,
I don't need to filter out false positives for these attributes
anymore. The FeedValidator commit that resolved the issue: 0b130bd5d7.
- Replace achecker flags with a config file
- Bring back webhint
- Amend check-whole-site so that it deploys to staging if all checks
  pass, and then runs webhint on every staging page (sketch below).
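
The amended check-whole-site flow, sketched; every script name and the
URL-list source here are assumptions:

```sh
#!/bin/sh
set -eu
# Run all site checks first; set -e aborts before deployment if any
# of them fails.
./vnu.sh
./achecker.sh
./axe-ff.sh

# All checks passed: deploy to the staging site.
./deploy.sh --staging

# Then run webhint against every page on staging.
while IFS= read -r url; do
  npx hint "$url"
done <staging-urls.txt
```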