• 6 Posts
  • 342 Comments
Joined 2 years ago
Cake day: August 29, 2023


  • Yud: “Woe is me, a child who was lied to!”

    He really can’t let that one go, it keeps coming up. It was at least vaguely relevant to a Harry Potter self-insert, but his frustrated gifted child vibes keep leaking into other weird places. (Like Project Lawful, among its many digressions, had an aside about how dath ilan raises its children to avoid this. It almost made me sympathetic towards the child-abusing devil worshipers who had to put up with these asides to get to the main character’s chemistry and math lectures.)

    Of course this is a meandering plug for his book!

    Yup, now that he has a book out he’s going to keep referencing back to it, and it’s being added to the canon that must be read before anyone is allowed to dare disagree with him. (At least the sequences were free and all online.)

    Is that… an incel shape-rotator reference?

    I think shape-rotator has generally permeated the rationalist lingo as a term for a certain kind of math aptitude; I wasn’t aware the term had ties to the incel community. (But it wouldn’t surprise me that much.)




  • “You don’t understand how Eliezer has programmed half the people in your company to believe in that stuff,” he is reported to have told Altman at a dinner party in late 2023. “You need to take this more seriously.” Altman “tried not to roll his eyes,” according to Wall Street Journal reporter Keach Hagey.

    I wonder exactly when this was. The attempted ouster of Sam Altman was November 17, 2023. So either this warning was timely (but something Sam already had the pieces in place to make a counterplay against), or it came a bit too late (as Sam had just beaten an attempt by the true believers to oust him).

    Sam Altman has proved adept at keeping the plates spinning and wheedling his way through various deals. I agree with the common sentiment here that his underlying product just doesn’t work well enough, in a unique/proprietary enough way, for him to actually build a profitable company on it. Pivot-to-AI and Ed Zitron have a guess of 2027 for the plates to come crashing down, but with an IPO on the way to infuse more cash into OpenAI I wouldn’t be that surprised if he delays the bubble pop all the way to 2030, and personally gets away cleanly, with no legal liability and some stock sales lining his pockets.




  • It seems like a complicated but repeatable formula: Start a non-profit dedicated to some technology; leverage the charity status for influence, tax avoidance, PR, and recruiting true believers in the initial stages; then make a bunch of financial deals conditional on your non-profit converting to for-profit; then claim you need to convert to for-profit or your organization will collapse!

    Although I’m not sure how repeatable it is without the “too big to fail” threat of loss of business to state AGs. OTOH, states often bend the rules to gain (or even just avoid losing) embarrassingly few jobs, so IDK.





  • Here: https://glowfic.com/posts/4508

    Be warned, the first three quarters of the thread don’t have much of a plot and are basically two to three characters talking; then the last quarter time-skips ahead and gives massive, clunky worldbuilding dumps. (This is basically par for the course with glowfic: the format favors dialogue-heavy, interaction-heavy stories, and it’s really easy to just kind of let the plot meander. Planecrash, for all of its bloat and diversions into eugenics lectures, is actually relatively plot-heavy for glowfic.)

    On the upside, the first three quarters almost read like a sneer on rationalists.








  • So, to give the first example that comes to mind: in my education from elementary school to high school, the (US) Civil Rights movement of the 1950s and 1960s was taught with a lot of emphasis on passive nonviolent resistance, downplaying just how disruptive the protests had to be to be effective and completely ignoring armed movements like the Black Panthers. Martin Luther King Jr.'s interest in and advocacy for socialism is ignored. The level of organization and careful planning by some of the organizations isn’t properly explained. (For instance, Rosa Parks didn’t just spontaneously decide not to move her seat one day; the organizers planned it and picked her in order to advance a test case, but I don’t think any of my school classes explained that until high school.) Some of the level of force the federal government had to bring in against the Southern states (e.g. Federal Marshals escorting Ruby Bridges) is properly explained, but the full scale is hard to visualize. So the overall misleading impression someone could develop or subconsciously perceive is that rights were given to black people through democratic processes after they politely asked for them with just a touch of protests.

    Someone taking their education’s presentation of the Civil Rights protests at face value, without further study, will miss the role of armed resistance, miss the level of organization and planning going on behind pivotal acts, and miss just how disruptive protests had to get to be effective. If you are a capital owner benefiting from the current status quo (or well-paid middle class who perceive themselves as more aligned with the capital owners than with other people who work for a living), then you have a class interest in keeping protests orderly, quiet, harmless, and non-disruptive. It vents off frustration in a way that ultimately doesn’t force any kind of change.

    This hunger strike and other rationalist attempts at protesting AI advancement seem to suffer from this kind of mentality. They aren’t organized on a large scale and they don’t have coherent demands they agree on (which is partly a symptom of the fact that the thing they are trying to stop is so speculative and uncertain). Key leaders like Eliezer have come out strongly against any form of (non-state) violence. (Which is a good thing, because their fears are unfounded, but if I actually thought we were doomed with p=.98 I would certainly be contemplating vigilante violence.) (Also, note from the “nuke the datacenters” comments, Eliezer is okay with state-level violence.) Additionally, the rationalists often have financial and social ties to the very AI companies they are protesting, further weakening their ability to engage in effective activism.