mirror of https://git.sr.ht/~seirdy/seirdy.one synced 2024-11-24 05:02:10 +00:00

Compare commits


No commits in common. "71ebd8ca56e24401f4472db8c6c0b6e26307ef47" and "eb690f5e09a23b94d5f7a2d8f28e563b3faffdd3" have entirely different histories.

4 changed files with 4 additions and 21 deletions


@@ -68,22 +68,12 @@ My main computer is a 2013 HP Elitebook 840 G1. It has a dual-core Intel i5-4300
=> https://github.com/fhanau/Efficient-Compression-Tool Efficient Compression Tool
=> https://github.com/boyter/scc scc
## Server-side stuff
* Custom build of nginx-quic with some patches. Statically linked against zlib-ng, BoringSSL, PCRE2, musl, headers_more, and ngx_brotli. Patched for dynamic TLS record sizing, externally-managed OCSP-stapling files, static HPACK dictionaries, server-signature removal, and dark mode on in-binary error pages.
* certbot-ocsp-fetcher
* webmentiond Webmention receiver
* Agate Gemini server
* searchmysite-go
* Conduit matrix server
## Services
* Migadu: email provider
* deSEC: managed DNS name servers
* Namecheap: domain registrar (not endorsed)
* Digital Ocean: VPS (not endorsed)
* Search My Site: search API
## What I don't use


@@ -250,7 +250,7 @@ I run these tools locally, on every applicable file. A full run takes under <tim
: I use this just like axe-core: as a CLI utility to check every page on my sitemap for basic accessibility violations. I disable "potential-violations" checks because those have false positives.
jq
: I use jq to ensure that all my JSON is syntactically valid. This includes my Web App Manifest file and Webfinger JSON. I also use jq to filter out specific false positives from the Nu HTML Checker, all of which are reported upstream.
: I use jq to ensure that all my JSON is valid. This includes my Web App Manifest file and Webfinger JSON. I also use jq to filter out false positives from the Nu HTML Checker. (A sketch of this kind of syntax check appears after this list.)
[Feed Validator](https://github.com/w3c/feedvalidator)
: I validate my Atom feeds using this tool. Like always, I filter out false positives and report them upstream.
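
The jq entry above amounts to a syntax pass over every JSON artifact in the built site. This isn't my actual jq invocation; it's a minimal sketch of an equivalent check in Go, assuming a hypothetical `public/` output directory and the file extensions named in the comments:

```go
// A syntax-only JSON check: walk the build output and fail if any JSON-ish
// file doesn't parse. Equivalent in spirit to piping each file through jq.
package main

import (
	"encoding/json"
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	root := "public" // hypothetical build-output directory
	bad := 0
	err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		switch strings.ToLower(filepath.Ext(path)) {
		case ".json", ".webmanifest", ".jrd": // manifest and Webfinger-style files
			data, readErr := os.ReadFile(path)
			if readErr != nil {
				return readErr
			}
			if !json.Valid(data) {
				fmt.Fprintln(os.Stderr, "invalid JSON:", path)
				bad++
			}
		}
		return nil
	})
	if err != nil || bad > 0 {
		os.Exit(1)
	}
}
```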
@@ -273,13 +273,13 @@ All my server daemons are statically-linked binaries, which makes sandboxing eas
Nginx
: Specifically, [nginx-quic](https://quic.nginx.org/) with the [headers_more](https://github.com/openresty/headers-more-nginx-module) and [ngx_brotli (static)](https://github.com/google/ngx_brotli) modules. Statically linked against zlib-ng, BoringSSL, PCRE2 (non-JIT), and musl libc; patched for dynamic TLS records, basic externally-managed OCSP-stapling support, static HPACK compression, server-signature removal, and dark mode on in-binary error pages. I recommend most people use Caddy instead of Nginx. The only benefits of Nginx are certain modules providing application-server capabilities, the ability to reload all configs with zero downtime, better requests-per-second on limited hardware (although most sites won't need to handle more than a few hundred requests per second, which Caddy can handle *easily*), and kernel-accelerated TLS for maximizing bandwidth (usually unnecessary).
: Specifically, [nginx-quic](https://quic.nginx.org/) with the [headers_more](https://github.com/openresty/headers-more-nginx-module) and [ngx_brotli](https://github.com/google/ngx_brotli) modules. Statically linked against zlib-ng, BoringSSL, PCRE2 (non-JIT), and musl libc; patched for dynamic TLS records, basic OCSP support, larger buffers for dynamic zlib compression (necessary for zlib-ng), and static HPACK compression. I recommend most people use Caddy instead of Nginx. The only benefits of Nginx are certain modules providing application-server capabilities, the ability to re-load all configs with zero downtime, and better performance on limited hardware (although most sites won't need to handle more than a few hundred requests per second, which Caddy can handle perfectly well).
[certbot-ocsp-fetcher](https://github.com/tomwassenberg/certbot-ocsp-fetcher)
: Shell script to manage the OCSP cache for Nginx, since Nginx's own implementation shouldn't be used without running a trusted resolver (and is completely non-existent if you build with BoringSSL). A sketch of what such a fetcher does appears after this list.
[nginx-rotate-session-ticket-keys](https://github.com/GrapheneOS/nginx-rotate-session-ticket-keys)
: Shell script to manage TLS session tickets, since Nginx's own implementation used to be really flawed (update: Nginx fixed it! I still keep this script since I can't be bothered to remove it). This replaces its default stateful session cache and also allows 0-RTT (also known as "early data") for idempotent requests. I patched it to use my statically-linked build of BoringSSL (I already had it sitting around after building it for Nginx).
: Shell script to manage TLS session tickets, since Nginx's own implementation is really flawed. This replaces its default stateful session cache and also allows 0-RTT (also known as "early data") for idempotent requests. I patched it to use my statically-linked build of BoringSSL (I already had it sitting around after building it for Nginx).
[webmentiond](https://webmentiond.org/)
: Lightweight Webmention receiver.
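
certbot-ocsp-fetcher is a shell script; the Go program below is not its code, just a rough illustration of the job it performs: fetch a DER-encoded OCSP response for a certificate and write it somewhere Nginx's `ssl_stapling_file` directive can read it. The certificate and output paths are made up.

```go
// ocsp-fetch: grab an OCSP response for a leaf certificate and save it where
// the web server's stapling-file directive can read it.
package main

import (
	"bytes"
	"crypto/x509"
	"encoding/pem"
	"io"
	"log"
	"net/http"
	"os"

	"golang.org/x/crypto/ocsp"
)

// loadCert reads a single PEM-encoded certificate from disk.
func loadCert(path string) *x509.Certificate {
	data, err := os.ReadFile(path)
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	return cert
}

func main() {
	// Hypothetical paths; a real deployment would loop over every certificate lineage.
	leaf := loadCert("/etc/ssl/example/cert.pem")
	issuer := loadCert("/etc/ssl/example/chain.pem")

	req, err := ocsp.CreateRequest(leaf, issuer, nil)
	if err != nil {
		log.Fatal(err)
	}
	// The responder URL comes from the certificate's AIA extension.
	if len(leaf.OCSPServer) == 0 {
		log.Fatal("certificate has no OCSP responder URL")
	}
	httpResp, err := http.Post(leaf.OCSPServer[0], "application/ocsp-request", bytes.NewReader(req))
	if err != nil {
		log.Fatal(err)
	}
	defer httpResp.Body.Close()
	der, err := io.ReadAll(httpResp.Body)
	if err != nil {
		log.Fatal(err)
	}
	// Sanity-check the response before handing it to the web server.
	if parsed, err := ocsp.ParseResponse(der, issuer); err != nil || parsed.Status != ocsp.Good {
		log.Fatalf("unusable OCSP response: %v", err)
	}
	// Nginx's ssl_stapling_file directive expects the raw DER-encoded response.
	if err := os.WriteFile("/var/cache/ocsp/example.der", der, 0o644); err != nil {
		log.Fatal(err)
	}
}
```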
@@ -311,9 +311,6 @@ I generally try to limit my dependence on services, preferring to run software m
[Digital Ocean](https://www.digitalocean.com)
: My VPS provider. I do not endorse Digital Ocean for most people's needs. It's far pricier than equivalent options, and only worth that price if you need top-tier support and a very good SLA. That being said, it does offer a lot of free credits ($100 if you sign up with someone's referral code; another $100 if you're a student); I started using Digital Ocean for the free credits. Scaleway and BuyVM are much better options if you want to go cheap. If I ever manage to get my hands on a home internet connection with excellent uptime, I might switch to self-hosting.
[Search My Site](https://searchmysite.net/)
: I already pay for it; I might as well use it! Its API powers the site's search functionality, with searches proxied through a tiny Go wrapper on my backend.
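
The "tiny Go wrapper" is searchmysite-go; its actual code isn't reproduced here. The sketch below only illustrates the idea: the backend accepts the visitor's query, forwards it to the hosted Search My Site API, and relays the JSON. The endpoint path and parameters are assumptions, not the documented API.

```go
// searchproxy: forward site-search queries to the Search My Site API so the
// upstream details stay on the backend.
package main

import (
	"io"
	"log"
	"net/http"
	"net/url"
)

// Assumed endpoint shape; consult the Search My Site API docs for the real one.
const upstream = "https://searchmysite.net/api/v1/search/seirdy.one"

func search(w http.ResponseWriter, r *http.Request) {
	q := r.URL.Query().Get("q")
	if q == "" {
		http.Error(w, "missing query", http.StatusBadRequest)
		return
	}
	resp, err := http.Get(upstream + "?q=" + url.QueryEscape(q))
	if err != nil {
		http.Error(w, "upstream error", http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(resp.StatusCode)
	if _, err := io.Copy(w, resp.Body); err != nil { // relay the JSON results as-is
		log.Println(err)
	}
}

func main() {
	http.HandleFunc("/search", search)
	log.Fatal(http.ListenAndServe("localhost:8080", nil))
}
```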
What I don't use
----------------


@@ -15,8 +15,6 @@ A much smaller semi-curated subset of pleroma.envs.net suitable for the majority
A curated subset of tier0.csv, containing what I deem the "worse half" of it. This contains instances I really do recommend most people block, or at least avoid. I try to make it a suitable candidate for a "default blocklist", and use it as reference when I evaluate the quality of other blocklists.
Some of these lists are also sources for the Oliphant blocklists:
=> https://writer.oliphant.social/oliphant/the-blocklist-algorithm The Blocklist Algorithm - The Oliphant
This post is an attempt to document how they are made, their differences, their intended use, and especially their caveats.


@@ -21,14 +21,12 @@ I maintain three blocklists for the Fediverse:
[`FediNuke.txt`](https://seirdy.one/pb/FediNuke.txt)
: A curated subset of `tier0.csv`, containing what I deem the "worse half" of it. This contains instances I really do recommend most people block, or at least avoid. I try to make it a suitable candidate for a "default blocklist", and use it as reference when I evaluate the quality of other blocklists.
Some of these lists are also sources for the Oliphant blocklists. {{<mention-work itemtype="BlogPosting">}}{{<indieweb-person name="Oliphant" url="https://oliphant.social/@Oliphant" itemprop="author">}} describes them in his article {{<cited-work name="The Blocklist Algorithm" extraName="headline" url="https://writer.oliphant.social/oliphant/the-blocklist-algorithm">}}.{{</mention-work>}}
This post is an attempt to document how they are made, their differences, their intended use, and especially their caveats.
How Tier-0 and FediNuke work
----------------------------
[My tier-0 list](https://seirdy.one/pb/tier0.csv) (mirrored to `tier0.csv` in [the Oliphant repository](https://codeberg.org/oliphant/blocklists)) is a subset of the `pleroma.envs.net` blocklist. It contains entries that appeared on at least **11 out of 20** other hand-picked instance blocklists ("bias sources"), with exceptions detailed below. A smaller list containing what I personally deem the "worse half" of Tier 0 is [FediNuke.txt](https://seirdy.one/pb/FediNuke.txt).
[My tier-0 list](https://seirdy.one/tier0.csv) (mirrored to `tier0.csv` in [the Oliphant repository](https://codeberg.org/oliphant/blocklists)) is a subset of the `pleroma.envs.net` blocklist. It contains entries that appeared on at least **11 out of 20** other hand-picked instance blocklists ("bias sources"), with exceptions detailed below. A smaller list containing what I personally deem the "worse half" of Tier 0 is [FediNuke.txt](https://seirdy.one/FediNuke.txt).
When I add a bias source, I may also increase the minimum number of votes required if I find that its blocklist is too close to (or mainly just imports all of) tier-0 or another bias source's blocklist. That's why the threshold is 11 instead of 10.
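
Here's a condensed sketch of the vote-counting step in Go. It is not my actual tooling: it treats each bias source as a plain list of blocked domains (one per line), skips the requirement that entries already appear on the `pleroma.envs.net` list, and ignores the exceptions and overrides described elsewhere in this post.

```go
// tier0: count how many bias-source blocklists contain each domain and keep
// the domains that meet the vote threshold.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"sort"
	"strings"
)

const threshold = 11 // minimum number of bias sources that must block a domain

func main() {
	votes := make(map[string]int)
	for _, path := range os.Args[1:] { // one file per bias source
		f, err := os.Open(path)
		if err != nil {
			log.Fatal(err)
		}
		seen := make(map[string]bool) // a source only gets one vote per domain
		scanner := bufio.NewScanner(f)
		for scanner.Scan() {
			domain := strings.ToLower(strings.TrimSpace(scanner.Text()))
			if domain == "" || strings.HasPrefix(domain, "#") || seen[domain] {
				continue
			}
			seen[domain] = true
			votes[domain]++
		}
		f.Close()
	}
	var tier0 []string
	for domain, n := range votes {
		if n >= threshold {
			tier0 = append(tier0, domain)
		}
	}
	sort.Strings(tier0)
	for _, domain := range tier0 {
		fmt.Println(domain)
	}
}
```

Run it with the bias-source files as arguments; raising `threshold` corresponds to the vote-count bump described above.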