Infodump of Experiences: On why MIRI got pwned

A post from “infodump of experiences”. tw rape


(anon) Today at 6:51 PM
so what do you think is wrong with MIRI?

Jay_W Today at 6:51 PM
same stuff as her:
is pwned by playing social reality games
has bad politics, and nudges ppl away from it by having bad politics (think re: social justice)
nobody there I know of is jailbroken (correlated to: math is useless AF for aligning AI, and they’d have noticed this if they were jailbroken)
the statutory rape thing is real (I trust Ziz to judge that particular thing actually)
also very importantly: they are pwning very smart people who could be hella fucking impactful elsewhere

(anon) Today at 6:56 PM
> statutory rape
my thoughts are basically “yes, and?”
it was a 17-year-old (above the age of consent in many places)
they consented at the time, and still retroactively endorse it
> math is useless for aligning AI
interested in going into this further – seems very important
the “rocket alignment problem” analogy seems reasonable to me

Jay_W Today at 6:58 PM
idgaf if that’s above the age of consent in some places, they have been systematic in enabling rapists in other cases, and also idk if it’s even the case that the accusations were about a 17-year-old

(anon) Today at 6:58 PM
> systematic
the only other case I know of is Brent

Jay_W Today at 6:58 PM
I really… don’t want to do this chat, feels like justifying myself
at a minimum they have enabled mine as well (not public knowledge, in particular this is Robert Lecnik I’m talking about)

(anon) Today at 6:59 PM
nods
was not aware

Jay_W Today at 6:59 PM
Giego too, and the people accused by Jax
Honestly I don’t keep track anymore bc at some point I just reached a “this is plenty of evidence and it’s emotionally expensive to follow this”

(anon) Today at 7:00 PM
nods

Jay_W Today at 7:01 PM
> the “rocket alignment problem” analogy seems reasonable to me
what the fuck why would this be important

(anon) Today at 7:02 PM
hmm
my model of what might be happening there is that they’re thinking “these people are saving the world, who cares if they did that thing, it’s nowhere near the same scale” or similar
that or just a strong ingroup effect

Jay_W Today at 7:02 PM
like in the sense of: that rocket thing seems kinda fucking obvious and like a thing ppl were already considering, probably you are dumb if you are having to respond to such things

(anon) Today at 7:02 PM
not sure I follow

Jay_W Today at 7:03 PM
like: talking about the rocket alignment problem is dumb, because it should be obvious
and it is
and like probably it’s just being used as a thing to hammer people you don’t like and get social points if you’re doing it?

(anon) Today at 7:03 PM
do you agree or disagree that we’re confused about superintelligence and alignment?

Jay_W Today at 7:03 PM
I disagree
it’s fucking simple if you have a brain
normies are dumb and don’t count here

(anon) Today at 7:04 PM
okay, how would you build an aligned AI?

Jay_W Today at 7:04 PM
I have one
in my pants

(anon) Today at 7:04 PM
?

Jay_W Today at 7:04 PM
It’s a joke, (anon).

(anon) Today at 7:04 PM
I don’t see how it answers the question though

Jay_W Today at 7:05 PM
I am really frustrated with you right now, because I see you as enabling bad behavior from MIRI
et al

(anon) Today at 7:06 PM
on my end it looks like you’ve hit effective override and aren’t really engaging with my epistemic state enough to actually change it

Jay_W Today at 7:06 PM
effective override meaning: I got triggered and am not thinking?

(anon) Today at 7:07 PM
kinda, yeah
I’d want to look up the technical definition to see if I’m using it right

Jay_W Today at 7:07 PM
nahh
i get what you mean

(anon) Today at 7:08 PM
like
I wasn’t aware that there were a significant number of rape/abuse allegations against MIRI and people they support
and it’s concerning
but it doesn’t really factually connect to whether their approach to AI is correct?

Jay_W Today at 7:09 PM
the root cause is the same.
the root cause is that they exist to do social things, or more accurately: they’re motivated by both self-love reasons and other-love-oriented reasons, and have been playing this social game of pretending to only do the latter, which makes them only able to optimize for both things at once, rather than for each separately

(anon) Today at 7:11 PM
I don’t see why that would bias them against finding a correct approach to alignment

Jay_W Today at 7:12 PM
like. they don’t course-correct on there being a better way to align AI (anything but fucking math), because they don’t want to dismantle their social accreditation, e.g. “we have the best cause, so just excuse our behavior, which is satisfying our self-love-oriented cores”
this generalizes to their ability to course-correct on other things
their social bubble pays them rents for working within things without course corrections
pwned: you now have bad epistemics (in some areas) and a rape cult
like in a related way (as described above)
I rest my case.

(anon) Today at 7:14 PM
I can see your logic
I’ll want to process it without as much emotion-layer pressure though

Jay_W Today at 7:15 PM
of course
I will leave it to you to strip that and think through it for yourself

