the macrobean philosophy
we were promised simplicity
Published 2025-12-05
The Physics behind your internet speed: A 40-Year Journey
a research note (my work-in-progress)
We don’t see any scaled-up BERTs anymore. The majority of LLMs (Claude, GPT, Llama, ...) are decoders. BERT has been deprecated in favour of more flexible forms such as T5, which are mostly complementary to decoders.
How data is split into a byte stream, and my take on obscure binaries
Talked about the building blocks of computer networking and the socket API