You mean like MOAR WRITERS??
> As long as it's ZIETS and possibly MITSODA I'm happy.
http://www.formspring.me/GZiets/q/434065342252213648
Don't cry.
>>>>> Oh, you're talking about that Yudkowsky! I haven't read his Harry Potter stuff and I don't find the concept of it that interesting. However, his AI Box experiment ( http://yudkowsky.net/singularity/aibox ) is off-the-charts-awesome.
>>>> I call it an experiment because, in a manner, he actually ran the experiment 5 times already: http://rationalwiki.org/wiki/AI-box_experiment
>>> Well yes, colloquially speaking, we might call it an experiment, but in no way is it scientific.
>> It doesn't seem much worse (less scientific) than most experiments run by, for example, psychologists or other practitioners of various soft sciences.
> I don't think psychologists try to impersonate nonexistent robot gods, or any highly dubious, purely speculative agents for that matter, during their experiments. They also seek peer review rather than shroud their proceedings in mystery. It really ain't science, mate.
Totally agree with you on the peer reviewed front. But that's his choice to make. Maybe he considers his research still a work in progress and not worthy of publishing in peer-reviewed publications, or whatever. Not everything is peer reviewed in the beginning.
It sure is a cool concept for a sci-fi novel but calling it an AI experiment or even an experiment at all is quite a stretch.
> As long as it's ZIETS and possibly MITSODA I'm happy.
I don't understand why Mitsoda is even considered. He can probably barely keep up with work on Dead State as it is. Starting work on a different project is unrealistic.
Damn that is a shame. If only he could help in the creative process of the world/lore/setting. But an area might be nice.
> Totally agree with you on the peer reviewed front. But that's his choice to make. Maybe he considers his research still a work in progress and not worthy of publishing in peer-reviewed publications, or whatever. Not everything is peer reviewed in the beginning.
The made-up, speculative robot god is the dodgy part here. Yudkowsky assumes baseless axioms about nonexistent stuff and works from there - lots of them are implicit, produced by his little community, and not mentioned to the outsider; some are stated but still ridiculous nonetheless. All he is evaluating is how a small, personally selected clique reacts to his make-believe MUD game while he appoints himself as admin. This doesn't meet any standard of being an experiment. I really don't see what this silliness has in common with the Milgram experiment.
As for the impersonating stuff and doing some rather awful things in order to run an experiment, I assume you've never heard of the Milgram experiment ( https://en.wikipedia.org/wiki/Milgram_experiment ).
I have read novels by Nathan Long set in Warhammer Fantasy, and while he's not one to compete with G.R.R.M., he is leagues ahead of Gaider and his ilk.
> The made-up, speculative robot god is the dodgy part here. Yudkowsky assumes baseless axioms about nonexistent stuff and works from there - lots of them are implicit, produced by his little community, and not mentioned to the outsider; some are stated but still ridiculous nonetheless.
In psychological experiments, most test subjects aren't aware of the rules of the experiment they take part in either.
> All he is evaluating is how a small, personally selected clique reacts to his make-believe MUD game while he appoints himself as admin.
As you yourself said earlier, the outsider is not aware of the rules of the "AI", so there is no "personally selected clique" to speak of.
> I really don't see what this silliness has in common with the Milgram experiment.
The test subjects in the Milgram experiment weren't aware of what was actually happening either. If they knew, the experiment wouldn't have had any relevance.
> I have read novels by Nathan Long set in Warhammer Fantasy, and while he's not one to compete with G.R.R.M., he is leagues ahead of Gaider and his ilk.
No. Nathan Long has written the late books in the Gotrek and Felix series. Mindless hack and slash as a series for the most part, but his writing is not bad. Compared to most game writers he is Shakespeare. And he certainly is more creative than AAA designers.
I can only guess that you are talking about Graham McNeill, although I like Dan Abnett much more. I find it strange that nobody has hired him yet to write a game story, as he is leagues ahead of anyone other than Dan.
>> I have read novels by Nathan Long set in Warhammer Fantasy, and while he's not one to compete with G.R.R.M., he is leagues ahead of Gaider and his ilk.
> No. Nathan Long has written the late books in the Gotrek and Felix series. Mindless hack and slash as a series for the most part, but his writing is not bad. Compared to most game writers he is Shakespeare. And he certainly is more creative than AAA designers.
I was talking about the G.R.R.M. abbreviation...
> I was talking about the G.R.R.M. abbreviation...
George R.R. Martin.
>> I was talking about the G.R.R.M. abbreviation...
> George R.R. Martin.
Ah. OK.
>> I really don't see what this silliness has in common with the Milgram experiment.
> The test subjects in the Milgram experiment weren't aware of what was actually happening either. If they knew, the experiment wouldn't have had any relevance.
Yes, that's one of the differences between the two cases. The people Yudkowsky had selectively chosen for his ploy knew the purpose of the "test" beforehand. How couldn't they? The game couldn't proceed otherwise. The only thing "tested" here is their interaction, while the elephant in the room - the improbable, absurd, black-box-like, hocus-pocus, unjustified robot god chatting with someone - is taken for granted.
> Yes, that's one of the differences between the two cases. The people Yudkowsky had selectively chosen for his ploy knew the purpose of the "test" beforehand.
I wasn't aware of this aspect. Under these circumstances, it's all a massive circlejerk and no further debate is necessary.
>> Yes, that's one of the differences between the two cases. The people Yudkowsky had selectively chosen for his ploy knew the purpose of the "test" beforehand.
> I wasn't aware of this aspect. Under these circumstances, it's all a massive circlejerk and no further debate is necessary.
You shouldn't feel bad about it; the likes of Yudkowsky often either omit or obscure vital information to push their agenda and avoid critical scrutiny. I enjoyed his autobiographical fanfic book, though.