- I forgot to compress XHTML files; fix that.
- Stylistic change: remove unnecessary brace expansions
- Don't repeatedly append to a file; run the commands in a different scope
and write their output all at once.
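A minimal sketch of the pattern, assuming a hypothetical `lint` command and report file:

```sh
# Before: re-opens report.txt on every iteration
for f in public/*.html; do
    lint "$f" >> report.txt
done

# After: run the loop in a grouped scope and redirect its output once
{
    for f in public/*.html; do
        lint "$f"
    done
} > report.txt
```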
Move Nu HTML Validator filtering into a shell script (a sketch follows this list):
- Return a nonzero exit code if validation errors remain after filtering
- Remove null-ish values from the JSON output; the final output *should*
be an empty string, since nothing should be reported.
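A minimal sketch of such a script, assuming vnu was invoked with `--format json`; the false-positive pattern is a placeholder:

```sh
#!/bin/sh
# Read the Nu HTML Validator's JSON report on stdin, drop known false
# positives, and fail if any real messages remain.
remaining="$(jq '
    [.messages[]?
     | select(.message | test("placeholder for a known false positive") | not)]
    # an empty or null-ish result collapses to no output at all
    | if . == [] or . == null then empty else . end
')"

if [ -n "$remaining" ]; then
    printf '%s\n' "$remaining" >&2
    exit 1  # filtered report was not empty
fi
```

When nothing survives the filter, the script prints an empty string and exits 0.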
- Remove the XHTML content-type meta tag from HTML documents, reverting
to the meta charset
- Give XHTML documents their own XML declaration
- Remove now-redundant meta charset from XHTML
- Since XHTML and HTML documents now differ, compress after running
xhtmlize, and make xhtmlize act only on uncompressed files (see the
sketch after this entry).
- Validate XHTML using vnu
The site now has polyglot markup that can be parsed under both XHTML5 and
HTML5 rules. My staging site will be served as XHTML, but my main site
will stay HTML5, just in case of parse errors.
If other tools (e.g. Lighthouse) end up supporting XHTML5, I'll consider
switching the content-type to XHTML.
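Roughly how the pieces fit together; this is a sketch only, where GNU sed, gzip, and the paths are assumptions, and the real xhtmlize does more than this. Validation then feeds vnu's report through the filter script sketched above.

```sh
#!/bin/sh
# 1. xhtmlize: act only on the uncompressed .xhtml copies
find public -type f -name '*.xhtml' | while read -r f; do
    sed -i '1i <?xml version="1.0" encoding="utf-8"?>' "$f"  # XML declaration
    sed -i '/<meta charset=/d' "$f"                          # now redundant
done

# 2. compress only after the HTML and XHTML variants have diverged
find public -type f \( -name '*.html' -o -name '*.xhtml' \) \
    -exec gzip -k -f -9 {} +
```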
- Stop using draft WAI-ARIA 1.3 features that aren't supported yet
- Make in-page links focusable across shortcodes/partials
- Replace existing in-page heading anchor links with a more accessible
option.
- Give backlinks an aria-labelledby attribute instead of an aria-label,
so their accessible names can be translated.
- Streamline CSS to reduce duplication
- Better backlink accessible names for endnotes with multiple backlinks.
This required updating a false-positive filter in my vnu jq filter.
- Add a lint step using a local installation of the Nu HTML Validator and
some jq-based filtering of false positives (sketched below)
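A sketch of what that lint step might look like, assuming `vnu` is on PATH (it may instead be `java -jar vnu.jar`); the path and filter pattern are placeholders, and vnu writes its JSON report to stderr, hence the redirect:

```sh
# run the locally installed Nu HTML Validator over the built site and
# drop known false positives before anything reaches the terminal
vnu --format json --skip-non-html --exit-zero-always public/ 2>&1 \
    | jq '.messages[]? | select(.message | test("known false positive") | not)'
```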
- Move linter configs into their own directory, to de-clutter the repo