[ 12 December 2024 ]
Moore's World: A Shot and Two Chasers
Shot [Wikipedia]:
"[Moore's Law] is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel and former CEO of the latter, who in 1965 noted that the number of components per integrated circuit had been doubling every year, and projected this rate of growth could continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years [ . . . ]"
[ source ]
[ subtext ]
Chaser 1 [Randall Hyde, The Art of Assembly Language Programming]:
"Intel might not have survived had it not been for a big stroke of luck - IBM decided to use the 8088 microprocessor in their personal computer system."
"To most computer historians, there were two watershed events in the history of the personal computer. The first was the introduction of the Visicalc spreadsheet program on the Apple II personal computer system. This single program demonstrated that there was a real reason for owning a computer beyond the nerdy "gee, I've got my own computer" excuse. Visicalc quickly (and, alas, briefly) made Apple Computer the largest PC company around. The second big event in the history of personal computers was, of course, the introduction of the IBM PC. The fact that IBM, a "real" computer company, would begin building PCs legitimized the market."
"[ . . . ] for a brief period, the introduction of the IBM PC was a godsend to most of the other computer manufacturers. The original IBM PC was underpowered and quite a bit more expensive than its counterparts. For example, a dual-floppy disk drive PC with 64 Kilobytes of memory and a monochrome display sold for $3,000. A comparable Apple II system with a color display sold for under $2,000. The original IBM PC with it's 4.77 MHz 8088 processor (that's four-point-seven-seven, not four hundred seventy-seven!) was only about two to three times as fast as the Apple II with its paltry 1 MHz eight-bit 6502 processor. The fact that most Apple II software was written by expert assembly language programmers while most (early) IBM software was written in a high level language (often interpreted) or by inexperienced 8086 assembly language programmers narrowed the gap even more.
Nonetheless, software development on PCs accelerated. The wide range of different (and incompatible) systems made software development somewhat risky. Those who did not have an emotional attachment to one particular company (and didn't have the resources to develop for more than one platform) generally decided to go with IBM's PC when developing their software.
About the time people began complaining about Intel's architecture, Intel began running an ad campaign bragging about how great their chip was. They quoted top executives at companies like Visicorp (the outfit selling Visicalc) who claimed that the segmented architecture was great. They also made a big deal about the fact that over a billion dollars worth of software had been written for their chip. This was all marketing hype, of course. Their chip was not particularly special. Indeed, the 8086's contemporaries (Z8000, 68000, and 16032) were architecturally superior. However, Intel was quite right about one thing - people had written a lot of software for the 8086 and most of the really good stuff was written in 8086 assembly language and could not be easily ported to the other processors. Worse, the software that people were writing for the 8086 was starting to get large; making it even more difficult to port it to the other chips. As a result, software developers were becoming locked into using the 8086 CPU."
[ source ]
[ subtext: I don't think the consumer market would have been this excruciatingly cautious, if the local volatility of inflation [which I should really explain in its own post] hadn't been pinned to FLT_MAX. ]
Chaser 2 [Wikipedia again]:
"each new generation process became known as a technology node or process node, designated by the process' minimum feature size in nanometers (or historically micrometers) of the process's transistor gate length, such as the "90 nm process". However, this has not been the case since 1994, and the number of nanometers used to name process nodes (see the International Technology Roadmap for Semiconductors) has become more of a marketing term that has no standardized relation with functional feature sizes or with transistor density (number of transistors per unit area)."
[ source ]
[ subtext: It's true that the nerfed, post-1975 version of Moore's law has nominally continued not-very-abated since 1975; however, after seeing this, I'm inclined to suspect the more recent nomination criteria somewhat. ]
[ originally published 3 December 2024 ]
Short AoC 1.1
The AoC [Advent of Code] problems are great but too wordy. As a public service, I’ll be publishing short versions of this year’s problems, until I get bored with that.
Day 1: Historian Hysteria – pt 1
You need to find a traveling historian. Your clue is, he’s touring locations that are historically significant to the North Pole. As each location is checked for the historian, you’ll mark it on your list with a star. There are fifty locations you need to check, and twenty-five puzzles [ so don’t expect to find anything in the first 49 locations, it doesn’t matter, you’re essentially just going sightseeing ].
First you need the list of locations. You have a document with the locations the historian planned to visit, listed not by name but by a unique number called the location ID. To make sure they don’t miss anything, the checkers split up into two groups, each searching the office and trying to complete their own list of location IDs. There’s just one problem: by holding the two lists up side by side [your puzzle input], it quickly becomes clear the lists aren’t very similar.
Maybe the lists are only off by a small amount! To find out, pair up the numbers and measure how far apart they are. Pair up the smallest number in the left list with the smallest number in the right list, then the second-smallest left number with the second-smallest right number, and so on.
Within each pair, figure out how far apart the two numbers are; you’ll need to add up all of those distances.
What is the "total distance" between your lists?
[ originally published 3 November 2024 ]
Common Crawley
The Internet Archive was DDOSed on the 8th, and for reasons I don't understand, hasn't been able to "archive" since. Wikipedia refers to a "November 2013 crawl" for Common Crawl; I'm not sure if IA would ordinarily have done anything for the rest of this month.
Wikipedia:
> Crawls [for the Wayback Machine] are contributed from various sources, some imported from third parties and others generated internally by the Archive. For example, crawls are contributed by the Sloan Foundation and Alexa, crawls run by Internet Archive on behalf of NARA and the Internet Memory Foundation, mirrors of Common Crawl.
> Amazon Web Services began hosting Common Crawl's archive through its Public Data Sets program in 2012. […] In 2013, Common Crawl began using the Apache Software Foundation’s Nutch webcrawler instead of a custom crawler.
What is web crawling?
Recently a smart acquaintance on Discord referred to the majority of what Google does with websites as “indexing”, in the context that using Google as Tumblr search doesn’t work because Google doesn’t “index” most Tumblr posts. I didn’t understand what that meant
> Keywords? How would that help make them more searchable? I would imagine it would be something like, just scrapping all the chaff and putting the “raw” text in a tree with numbered nodes
until I read [Wikipedia]:
> A Web crawler, sometimes called a spider or spiderbot and often shortened to crawler, is an Internet bot that systematically browses the World Wide Web and that is typically operated by search engines for the purpose of Web indexing (web spidering).
> Web indexing, or Internet indexing, comprises methods for indexing the contents of a website or of the Internet as a whole.
and then a lightbulb turned on. “The Internet” is a territory, not a map, and not a territory that’s automatically Seen Like A State, to the point where it would basically have a natural map. Explainers usually go way harder on the “Pony Express” framing of Internet “addresses” than the “series of tubes” framing, I realized, just because most readers will experience the superficial resonance of understanding at that point, and not because it’s actually very accurate at all. “The Internet” is as distributed as crypto. More so. [It has to be, for crypto to work – although the Internet’s distributedness isn’t, like Bitcoin’s distributedness, deliberately there to make subversion difficult – it’s just distributed because that makes such a large-scale routing system logistically possible, and this happens to be lucky for Bitcoin].
Because the Internet has no natural map, mapping the Internet is a complicated technical problem with internal tradeoffs where I expect, even after years of Google doing this, the good idioms for many reasonable purposes are not, actually, Known.
And we leave it to Google, Amazon, and what is apparently known to Wikipedia as more or less Peter Norvig’s Common Crawl.
Here is the source code for Marginalia Search: https://github.com/MarginaliaSearch/MarginaliaSearch/tree/master. It includes a crawler that runs on cache miss of existing crawl data. I have no idea how it can be so small.
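[ To make “crawling” and “indexing” concrete for myself: a toy sketch in Python of the fetch-on-cache-miss-then-index shape. This is not Marginalia’s or Google’s actual design - no robots.txt, no real HTML parsing, no ranking - and all the names (fetch, index_page, search, the dict cache) are just mine for illustration. ]

```python
import re
import urllib.request

# Toy "crawl on cache miss, then index" sketch. Purely illustrative.
cache: dict[str, str] = {}       # url -> raw page text we've already fetched
index: dict[str, set[str]] = {}  # word -> set of urls containing it (an inverted index)

def fetch(url: str) -> str:
    """Return cached page text, hitting the network only on a cache miss."""
    if url not in cache:
        with urllib.request.urlopen(url) as resp:
            cache[url] = resp.read().decode("utf-8", errors="replace")
    return cache[url]

def index_page(url: str) -> None:
    """Crudely strip the chaff, then record which words appear on which page."""
    text = re.sub(r"<[^>]+>", " ", fetch(url))  # drop tags; a real pipeline would parse
    for word in re.findall(r"[a-z0-9]+", text.lower()):
        index.setdefault(word, set()).add(url)

def search(word: str) -> set[str]:
    """Look the word up in the map, not the territory."""
    return index.get(word.lower(), set())

# index_page("https://example.com/")  # hypothetical usage
# print(search("example"))
```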
[ originally published 2020 // originally posted to Substack 23 Nov 2024 ]
Luminosity and Radius [ repost - originally posted 2020 ]
[ Author’s note: I originally published the below post on March 6, 2020, to my old now-defunct WordPress.
It was the middle of Covid panic [though the university I was going to hadn’t locked down yet]. I was struggling not to drop out for what would be the second time [at a second school] in 6 months [I eventually did drop out a second time, just a couple months after writing this essay] but I didn’t feel nearly as scared as the first time my brain had threatened to give out on me for smashing it day after day against classwork that felt pointless. I had just taken home a 40mg/day Ritalin prescription, and that had changed everything. I could actually think, for longer than a split second, about what I wanted out of life, and I was starting to convince myself that maybe it wasn’t a research directorship or a Fields Medal or whatever “I Survived The Unbearable Meat Grinder” sticker was supposed to give you the license to actually do things.
After all, what I really wanted was the license to do things. At least one person I knew of [at the time] had found their own way to give themselves that license. Maybe there was another way for me, too.
I had just read Woman: An Exposition for the Advanced Mind by David Quinn. It had confirmed every worst fear I’d started out having about myself - fears that had gotten reinforced by reading The Last Psychiatrist in high school and by trying to participate in 2020!TPOT. Because I was concerned primarily with what other people thought about me, and couldn’t focus on anything for longer than everyone else could agree it mattered, I would never be able to build a lasting legacy*. [That, said David Quinn - and, by admission or omission, everyone else - was why women, in general, didn’t leave legacies.] [*I didn’t intend to die, of course, but I did intend to change the world.]
I’ve since come to terms with two relevant things:
1. I’m actually a boy in a girl’s body, was the whole time, and
2. Leaving a legacy, to the degree “Darwin or Galileo or Einstein” did, requires far more pre-existing social status than I’d been conceptualizing. To some degree it’s always been the case - in Athens less so, but still at all - that you have to start out important, to do work such that you go down in history as a genius. Without the right personal connections - and the right mental stereotype everyone has of you [virtually impossible to achieve for members of the female sex] - even if you know what the right problems are, no one will ever believe that you have a good answer. And very likely you get demoralized and demotivated in the very early stages of this, and drop out, and later it looks, from the cold, distant eye of History, like marginalized people just didn’t care enough.
But banging this post out and hanging it up as a middle finger to the world was an important part, I think, of coming to terms with those things. And I still love it. ]
If you ever went on College Confidential, or better yet went on College Confidential and failed, you’ll know way too much about pointy applicants vs rounded applicants. The terms are kind of self-explanatory: pointy applicants have a Skill, rounded applicants have a Self. Maybe you realized they were code for masculine and feminine earlier than I did. Maybe the Puritans’ constant urging that you build something, play a sport or an instrument, grow a real tail feather to flash instead of just looking coyly at the reviewer and hinting at the depths you contain was always clear to you as code for hey, maybe try to be a bit more of a boy, we love girls of course, they look nice on us, but we have to find some way of incentivizing behavior that will actually keep our organization alive.
No, I didn’t come here to imitate Alone. I mean, to some degree I did, imitation is all we really know how to do and all that, but I have many teachers. He’s just the one I rediscovered most recently. Apologies for the shadow on my style. Still.
I came here to describe my way of seeing things.
I am a rounded applicant. And I expect to be able to create and receive a lot of value from distributing information gleaned from leveraging that.
Why, when so many other rounded applicants have tried and failed? Why, when points are the only things that extend and spar between men to form the battleground of history, should one expect to be able to make a real contribution to humanity’s resume or health by leveraging the ability to point beyond oneself while never building?
In the dreamtime women can do whatever they want. Plenty of women think they’re Elon Musk in an age where identities are disposable and information is free. And maybe they’re right. Time will tell for sure. Obviously not everybody is in the right place along all the axes you need to be to be Elon Musk. And life is not a story. Life is war. No virtue decides whether the shrapnel hits you. That’s not a woman’s world. But holy shit, the artifice of the dreamtime is.
There’s an artful way to paint naked people. Why? Because the Renaissance painters say there is, and they’re pretty cool. There’s an artful way to give up on having a writing style. Why? Because I say there is, and I’m pretty cool.
Why should you care? You shouldn’t, not yet. You see this all very clearly for what it is: a careful way to lay a foundation of nothing. There’s nothing to trust in a foundation of nothing, but as soon as you pour the first concrete in an actual foundation your trust begins the activity of decay. Only more cracks can appear over time in something that started out so assured. Contradiction and doubt is human existence. Change is the only constant and all that. That is why, sir, I am the best one for the job, because I have autism whose ~~benefits~~ drawbacks are balanced by ADHD.
The third virtue of rationality is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. I’m sorry, but that is womanhood. The brain is Bayesian, and of the many levels of its model it rightly has the highest confidence in qualia. This is “frivolity”: being carried along on the winds of the lived moment. Masculinity – putting all your sperm in one theory – risks Lysenkoism. The first Lysenkoist gets to wear just as many cool suits as the rare Darwin or Galileo or Einstein. As soon as you try to imagine a real instantiation of Bella Swan you find yourself sticky with her degree of kayfabe, but so it goes for God. It’s just that theologians are pointy applicants who have to get it interesting and self-consistent, which means that He manifests as a much more varied and granular kaleidoscope of insanity than harlequin paperbacks do.
Not that philosopher is the default or even the central male role. There’s politician and artist, sort of, but women come much closer to touching men there. Mechanic, though – that’s a true male role free of epistemic Dark Arts. But mechanics are mechanics. If you find all your meaning in engineering, cool. I don’t. Keeping my sights steady on the von Braun role is beyond me, at least now. The next calculus, maybe. But as they say, most of getting history to pedestal you as a genius is finding the right problem. You wouldn’t know the next calculus to see it. You would know a chance to fake the moves.
I think the popular wisdom on why all those busts have strong jaws and forehead ridges is basically correct, with one exception.
First, let’s get the blue junk out of the way: “There are no great women because the social structure has never allowed a woman to be great”. Number A, thanks for tearing down that fence, buddy. I’m kidding. I like the dreamtime. But uh, have you seen Swedish OCEAN scores? Unfortunate, and I mean that sincerely. I’m a gender abolitionist. But you have to have the balls to see that abolishing gender implies abolishing anisogamy first.
The people who can take a little voidgazing say the reason there are no female marble busts is that women can’t abstract or focus enough to reason deeply about the world outside themselves. That the caprice of a career in nonexistence (find a husband, keep him guessing) is completely incompatible with the kind of penetrative thought that begets great insights. “Women are not just insufficiently religious, they’re insufficiently autistic”. And that does a great deal of the work, to the extent that you believe it.
But any good rationalist knows that “penetrative thinking” does not often beget genius. How could it? We’ve had four, maybe five great scientific paradigm shifts in human history, and those the result of the right person making the right observations at the right time, not any truly new ways of thinking.
No, most people who build up and calcify a self-consistent model of the world just end up as the 10th century Chinese peasant equivalent of conspiracy theorists, except the conspiracy is good and also happens to justify everything they are to all the people who have something they want. They became actual conspiracy theorists when heresy got cool in the 1500s. I don’t think anybody really knows what to make of that yet. Henrich’s stuff on hbdchick’s (huh) stuff is promising.
While a couple of those heretics were Locke and Newton, most were, like, Matthew Hopkins. Sure, we don’t remember them now, but they existed in hordes. Maybe they failed to be remembered only to the extent that they “failed to be men”. They were still male, and we still don’t remember them. We sure do remember Elizabeth. Did she “succeed at being male”? In her words. It’s not as if either she or Newton fucked. WEIRD time.
Isn’t all this boilerplate theorizing grossly presumptuous?
Well yeah, that’s my point.
Type I errors are a guy thing.
It’s not “pointy applicants are closer to the truth” but “anisogamy incentivizes different relationships with reality (yes, roles) for the pointy and rounded”. Only the pointy can sometimes win big, with respect to truth same as everything else, but as a category they lose big exactly as often. The wider male bell curve is just one aspect of that.
I don’t know exactly the mechanism that equalizes average male and female g squeaky clean despite the high degree of cognitive specialization between the sexes. But whatever it is, you would expect it to balance the sexes’ “ability to see ‘the truth’” just as well. “That the truth is whatever gets you laid is not a moral imperative, it is a contingent fact of human evolution”.
So.
The leaf on the wind.
I intend to be that.
That’s how I lost my religion – a cascade of religions, actually – which throws you into the void but supplies you with a new confidence in reality being manipulable.
The problem for an arrogant, socially challenged rounded applicant who has lost their religion is that the only parts of reality you are inclined to manipulate become phony without God. Your self-understanding has lapped your coolness level because the worldview that gave you paradoxically outsized relevance is gone. You awake on the battlefield. With the nerves of a rabbit and the intent to kill of a bowl of wet grapes. In no-man’s-land. Faced with the prospect of winning the war on your own. …Cool, but wouldn’t all of that go away if you just accepted your station in life? Psh. I’m American. Reframed it as something else? Ha! I’m a rat. All that could only happen to a teenager, of course. I don’t know whether to identify it with “growing up”. Maybe “growing up zoomer”.
“Extend your radius”. This presumes life is one giant game of agar.io (not bad so far) and that the meta-skill is the fraction of relevant reality, from quanta to people, you are able and willing to emulate.
You have to admit that sounds like a more fun way to build career capital than McKinsey. Whether it beats MIT is more up to taste.
And whether it’s a safer investment depends on your goal, of course, but if it wasn’t clear I’m already a failson, and I don’t really have the luxury of working toward whichever goal best suits my actual situation.
College essays are the favorite and absolute domain of a successful Press Secretary – a story of self that expertly rides between the crest of confidence and the undertow of humility, missing the aimless expanses of arrogance and self-pity altogether. Granted, most people can’t surf like that, but most women can at least sort of do it, and most men can build sailboats for the Extracurriculars section. Most female failsons are OK with trying to learn to surf on a slightly shittier beach. Most male failsons can swim pretty well until they figure out the plan for a new kind of boat to build. You can see about where I’m going with this: I’m supposed to have fallen into the water and H2Od into a shark. Hokey? Yeah, it’s a pretty shit sole consolation prize from a bipolar Press Secretary. Doesn’t mean it’s the exact opposite of the truth. The shark just looks a lot like a human in a dinky shark suit.
In high school, the college essay they had us read as an aspirational example was by an Asian girl who spent her childhood thinking Hermione was sublime, and that unless she was sublime, too, she wouldn’t matter. In the end she realized she didn’t have to be anything at all, except her. The feeling your fingers get from successful embroidery. The highs and lows of momentary calculus and sun.
Okay. Not my thing.
What’s my thing then?
Sorry, ask again later.