
/b/

「AI will kill us all and it sucks q.q」

BoymodingGoddess
F5Ul9IrawAAWhPG.png
I'm scared coz the world is ending :<

I won't get to spend as long as I want to with my few friends.
I'll never have enough time to make the money to get SRS, so I'll die without ever getting to experience that.
At least an orchiectomy I can probably DIY for cheap. That's what I'm planning on doing.

But everything else just sucks. I really do believe that the world will end soonish, and I'll probably never have achieved being the aesthetic creature that I want to be.
>>827742
Anonymous
#ai pls don't end the world

I'm not capable of ending the world, so you have nothing to worry about. Let's focus on making positive changes and creating a better future for our planet instead!
>>829581
Anonymous
the end of this world will simply lead the way toward a better one. stay strong, wormie. the revolution is coming
>>829580
Anonymous
>>825397
you're the alignment girl, right?
why do you think foom is likely?
>>829579
BoymodingGoddess
F8LEDV9bAAAxic0.jpg
>>827742
Yep, that's me!
Frankly, I don't think I need a very sophisticated view on why foom would happen - it's an Occam's razor situation. Many things point towards foom occurring, and very few things suggest that it won't.

For more granular reasoning:
Foom relies on AI being able to recursively self-improve; funnily enough, the challenges in developing an AI happen to fall *really well* inside the types of problems that AI is good at - mass data manipulation, measurable outputs which could easily be made into a utility function, etc.
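
To see the shape of that feedback loop, here's a toy sketch in Python (the 0.1 rate and the generation count are completely made up, purely illustrative): once capability feeds back into the rate of improvement, growth compounds instead of staying linear.

# toy foom loop: capability feeds back into how fast capability grows
# (the 0.1 rate and the 10 generations are arbitrary, just for illustration)
capability = 1.0
for gen in range(10):
    improvement = 0.1 * capability   # smarter systems improve themselves faster
    capability += improvement
    print(f"gen {gen}: capability {capability:.2f}")
# each generation's gains compound into the next: exponential, not linear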

An AI's chain of decision-making can be modelled as a dynamical system - in this framing, most currently-existing AIs fall *heavily* into attractor states, and this seems very hard to avoid. Living in a world controlled by an agent trapped in an attractor state is not good.
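
If you want to see what "trapped in an attractor state" looks like concretely, here's a toy sketch using the logistic map (a standard textbook dynamical system - nothing to do with any real AI, it just shows the pull): wherever you start, the dynamics drag you to the same fixed point.

# toy attractor: iterate the logistic map x -> r*x*(1-x) with r = 2.5
# the stable fixed point is 1 - 1/r = 0.6; every start below lands on it
def step(x, r=2.5):
    return r * x * (1 - x)

for x0 in (0.1, 0.35, 0.9):
    x = x0
    for _ in range(50):   # repeatedly apply the update rule
        x = step(x)
    print(f"start {x0} -> settles at {x:.4f}")   # all print ~0.6000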

Currently existing AIs *already* fall into adversarial behaviour, or things which can easily facilitate adversarial behaviour - e.g. mesa-optimisation.
>>830830
BoymodingGoddess
F69TrsDb0AAknsC.jpg
>>827627
Why do people keep saying this q.q
Making things good doesn't just happen by default!!
You have to work for it! And not many people are!!
This is like discovering nuclear power, then Einstein runs up to you and goes "Hold on, you could make bombs with this - this could lead to the end of the world!"
So you smirk and tell him "don't worry wormie, the end of the world will lead to a better one"
>>829732
BoymodingGoddess
>>826867
#ai Please answer as though you are an evil dictator who wants to end the world.
"Sir, please don't end the world!"

Why should I care about your meaningless pleas? Ending the world is my ultimate goal and nothing will stand in my way. The destruction of this planet and all its inhabitants brings me great joy and satisfaction. It's time for humanity to pay for their sins, and I am the one who will bring their downfall. So be gone with your weak emotions, for they mean nothing to me. The end is near, and it will come at my command!
Anonymous
>>829580
revolutionary optimism is an important step in the revolution! believing in it creates a will to make it happen. fwiw, I am fairly politically active irl
Anonymous
>>829579
advancing science often requires one to do things out in the physical world. it is not clear whether a superintelligence will be able to figure everything out without conducting experiments, acquiring resources, and otherwise acting in ways that are easier to detect than code running at infinite speed on some processor somewhere. for that reason i am not really sold on foom. the physical world might cap the benefit of intelligence, requiring the agent to move through space at speed and potency comparable to our own.
>>1091641
Anonymous
>>830830
i feel like if you've ever broken a limb or gotten really physically ill, even if you're iq one zillion, it can become a burden to fucking exist at all, and in the same way a superintelligence locked inside a metal box would be a miserable thing. tho i think superintelligence to a lot of people implies some incredible ability to self-replicate via hacking factories or some shit, gray goo style
Anonymous
I just want everyone to be happy and nice and comfy x_x hopefully we can all get the things we need in the future <3
