full of useful and juicy nuggets, e.g. on GPU failures due to thermal stress: "So let’s aggregate up. Imagine you had a 10,000 or even a 20,000 GPU data center. You should expect on the statistics a chip to fail about every 3 or 4 hours. So long before I get to the point where I’m rapidly turning these over because there’s a new generation of chips, I’m turning over a vast chunk of my chips just because they’re failing under thermal stress."; "TSMC now is something like 15% of Taiwan’s GDP" (!!); "the U.S. has been very good at speculative bubbles. This is one of our main core competencies here. They tend to be about real estate, or... technology, or... loose credit, and sometimes they even have a government role with respect to some kind of perverse incentive that was created. This is the first bubble to have all four.... The sum of all bubbles"; "[Like shale wells], it’s an extractive resource economy in surprising ways. So not just in terms of the... declining return from the GPUs themselves, but also the declining return... of these giant training sets that allowed us to scale up the so-called scaling laws for large language models...." Also, it's adorable that Paul Krugman thought that GPU stands for "general processing unit"
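The aggregation in that first quote is easy to sanity-check: a fleet-level failure every 3 or 4 hours actually implies a fairly respectable per-chip lifetime. A back-of-the-envelope sketch (the per-GPU MTBF here is inferred from the quote itself, not from any vendor reliability data):

```python
# Back-of-the-envelope check of "a chip fails about every 3 or 4 hours."
# Assumes failures are independent, so the fleet-wide mean time between
# failures is just per-GPU MTBF divided by fleet size.

def fleet_mtbf_hours(per_gpu_mtbf_hours: float, fleet_size: int) -> float:
    """Expected hours between failures anywhere in the fleet."""
    return per_gpu_mtbf_hours / fleet_size

# Working backwards: what per-GPU lifetime does the quote imply?
for fleet, hours_between_failures in [(10_000, 4), (20_000, 3)]:
    per_gpu = hours_between_failures * fleet
    years = per_gpu / (24 * 365)
    print(f"{fleet:,} GPUs, one failure every {hours_between_failures}h "
          f"-> per-GPU MTBF ~{per_gpu:,}h (~{years:.1f} years)")
```

So the quote is consistent with individual GPUs averaging roughly five to seven years between failures; the point is that at data-center scale, even decent per-chip reliability produces a constant stream of dead silicon.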
very generous resource with activities, readings and an example syllabus "designed to help you develop a creative practice rooted in inquiry, context and change"
"We have created a system where the only way to survive is to be destitute enough to qualify for aid, or rich enough to ignore the cost. Everyone in the middle is being cannibalized. The rich know this… and they are increasingly opting out of the shared spaces"
chock full of pithy condensations of important facts, e.g. "Most phones — indeed, most computers — are 24+ month old Androids."; "[I]f you spend a majority of your time in front of a computer looking at a screen that's larger than 10", you live in a privilege bubble."; "In April the median mobile page grew to be larger than a copy of DOOM, and the 75th percentile site is now larger than two copies of DOOM."
if trump's tariffs really are about tanking the dollar, a weird and ironic side effect is that it would become less profitable to propagandize us into voting for people like trump
"[T]he Sinclair ZX80 tokenized BASIC keywords at the keyboard. For instance, after typing a line number, the cursor would flash with an inverse K, indicating that the next key you hit would insert a corresponding keyword that could start a BASIC line: for instance, typing V then didn’t produce a V but instead inserted the keyword GOSUB, typing Y inserted REM, and so on." kinda reminds me of the MegaZeux Robotic editor
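For flavor, the one-keystroke-per-keyword scheme is easy to mimic: while the "K" cursor is showing, a single key looks up a whole tokenized keyword instead of inserting a letter. A toy sketch (only the V → GOSUB and Y → REM pairs come from the quote; the other mappings are made up for illustration and are not the ZX80's actual layout):

```python
# Toy model of ZX80-style keyword entry: the first keystroke after the
# line number is in keyword ("K") mode and inserts a whole BASIC keyword.
# Only V and Y are from the quoted article; G and P are hypothetical.
KEYWORD_MODE = {
    "V": "GOSUB ",
    "Y": "REM ",
    "G": "GOTO ",   # hypothetical mapping
    "P": "PRINT ",  # hypothetical mapping
}

def type_line(keys: str) -> str:
    """Simulate typing a line body; the first key is in keyword mode."""
    out, keyword_mode = [], True
    for key in keys:
        if keyword_mode and key.upper() in KEYWORD_MODE:
            out.append(KEYWORD_MODE[key.upper()])  # one key, whole keyword
        else:
            out.append(key)
        keyword_mode = False  # subsequent keys insert literally
    return "".join(out)

print(type_line("Yhello"))  # -> "REM hello"
print(type_line("V100"))    # -> "GOSUB 100"
```

The appeal on real hardware was that keywords were stored as single tokens, so this saved both keystrokes and precious bytes of RAM.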
"[S]ometimes I wear a cool Hawaiian shirt on Fridays, and it’s commonly accepted that bad people don’t wear shirts with flowers on them. That’s just a fact."
"[T]hat’s not cool. That’s an ad hominem, and very immature of you. What do you have against a mixture of dried carbohydrates and powdered seasonings?"
"...[T]here is zero, less than zero, stress put on the relation between those two 'sides,' or their histories, or their sponsors, or their relative evidentiary authority, or any of it. Instead, what you get is a piece making the various more or less bovine noises of studious grey-lady impartiality, with the labor of anything resembling 'appraisal' surgically excised."
"There were a tiny handful of incredible nerds who thought [Zork] was fun, mostly because 3D graphics and the physical touch of another human being hadn't been invented yet. But for the most part, people would tire of the novelty because trying to guess what to type to make something happen is a terrible and exhausting user interface. This was also why people hated operating systems like MS-DOS, and why even all the Linux users reading this right now are doing so in a graphical user interface." I get that Dash is exaggerating here for effect, but the comparison between command-line interfaces and LLM chatbot interfaces rings false to me. Command-line interfaces have steep learning curves, but they're deterministic and composable, which is why we still use them today. The reason that chatbot interfaces suck isn't that they're based on textual inputs, but that they're neither deterministic nor composable. (GUIs have different affordances than command-line interfaces which make sense for some use cases, but they also have their own learning curves and limitations. A hypothetical GUI-based LLM interface would have all the same problems as a text-based one.) Plus the Zork series sold like 700,000 copies
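To make the "deterministic and composable" point concrete: the Unix-filter model lets small text-in/text-out programs be chained, and the same input always produces the same output. A toy sketch in Python rather than actual shell (the filter names just echo their Unix namesakes):

```python
# Toy Unix-style filters: each is a pure function from text to text,
# so pipelines are deterministic and the pieces compose freely.
from functools import reduce

def grep(pattern):
    """Keep only lines containing the pattern, like `grep`."""
    return lambda text: "\n".join(
        line for line in text.splitlines() if pattern in line)

def count_lines(text):
    """Count lines, like `wc -l`."""
    return str(len(text.splitlines()))

def pipeline(*filters):
    """Compose filters left to right, like `cmd1 | cmd2 | cmd3`."""
    return lambda text: reduce(lambda acc, f: f(acc), filters, text)

log = "GET /a\nPOST /b\nGET /c\n"
count_gets = pipeline(grep("GET"), count_lines)
print(count_gets(log))  # -> "2", every single time; that's the contract
```

Drop a chatbot into the middle of that pipe and both properties break at once: the same prompt can yield different output, and the output format isn't stable enough for the next stage to parse.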