Neuromorphic Computers; Terminators...here we come


    #21
    Originally posted by RussianCoffeeAddict View Post

    I ain’t giving up a perfectly fine dick to the machines.
    I'm ok with it. All mine's good for really is self gratification

    Comment


      #22
      Originally posted by RussianCoffeeAddict View Post

      I ain’t giving up a perfectly fine dick to the machines.
      why don't you just have it attached to ur new robot body. shouldn't be too hard to rewire some tubes/nerves if we're talking about uploading brains

      Comment


        #23
        Originally posted by Aure0lin View Post

        why don't you just have it attached to ur new robot body. shouldn't be too hard to rewire some tubes/nerves if we're talking about uploading brains
        Or just make a synthetic one. One that doesn't, y'know... decompose.
        Originally posted by Wade
        Everything is hidden in plain sight, like in Men in Black. We've all just been neuralized to think it is "normal".

        Comment


          #24
          Helly

          How do you feel about this

          I kno u r scared of the terminators as well

          Comment


            #25
            Originally posted by OrganizationXV View Post

            Or just make a synthetic one. One that doesn't, y'know... decompose.
            as long as it's getting nutrients and has some genes rewritten to stop aging it could prolly last forever, and maybe rca really likes the all natural kind

            Comment


              #26
              Originally posted by RussianCoffeeAddict View Post

              This is MY thread.

              .
              nobody cares

              Comment


                #27
                Originally posted by Beholder View Post

                nobody cares
                I don’t care that you think that nobody cares, faggot

                Comment


                  #28
                  Originally posted by RussianCoffeeAddict View Post
                  Helly

                  How do you feel about this

                  I kno u r scared of the terminators as well
                  As an existential threat? Not really....Unless we're dumb enough to program them with a sense of self-interest, it's unlikely any machine will lead a revolution to kill humans. I don't think neuromorphic computers could grow to simulate something as intricate as that kind of conscious desire, though - not without our help, at least.

                  My concern would be more with AI making humans obsolete in the sense that it will advance to a point where there will never be a need for human doctors, human construction workers, writers and other entertainers, government officials, etc. People tend to think of things as being "too complex" to really be mastered by any machine, until we eventually build one that not only masters it but builds upon it exponentially with an efficiency that no human could ever in a thousand lifetimes hope to match. Everything has a formula. Everything. There's nothing mysterious about anything we do, we just don't have the intelligence to fully understand all of it yet. But struggling through it all is what's so fun about it. Putting immense effort into something, and eventually being awarded for all your hours of hard work...it gives you such a sense of fulfilment that we seriously risk losing if we let AI develop to a point where it could just solve all of our problems for us.
                  Last edited by Helly; March 15th, 2020, 11:27 PM.

                  Comment


                    #29
                    Originally posted by Helly View Post

                    As an existential threat? Not really....Unless we're dumb enough to program them with a sense of self-interest, it's unlikely any machine will lead a revolution to kill humans. I don't think neuromorphic computers could grow to simulate something as intricate as that kind of conscious desire, though - not without our help, at least.

                    My concern would be more with AI making humans obsolete in the sense that it will advance to a point where there will never be a need for human doctors, human construction workers, writers and other entertainers, government officials, etc. People tend to think of things as being "too complex" to really be mastered by any machine, until we eventually build one that not only masters it but builds upon it exponentially with an efficiency that no human could ever in a thousand lifetimes hope to match. Everything has a formula. Everything. There's nothing mysterious about anything we do, we just don't have the intelligence to fully understand all of it yet. But struggling through it all is what's so fun about it. Putting immense effort into something, and eventually being awarded for all your hours of hard work...it gives you such a sense of fulfilment that we seriously risk losing if we let AI develop to a point where it could just solve all of our problems for us.
                    A crisis of purpose then...?

                    I could see AI causing that.

                    Maybe it would get us to space faster, though, who knows, that'd give us something to do then, lol.

                    Comment


                      #30
                      Originally posted by RussianCoffeeAddict View Post

                      A crisis of purpose then...?

                      I could see AI causing that.

                      Maybe it would get us to space faster, though, who knows, that'd give us something to do then, lol.
                      For the first of us, maybe, but what good does a cool planetscape do you when you've grown up in that environment all your life and AI has already painted the most beautiful artworks of that planetscape, using what it knows of human psychology to create perfect, near-individualized renditions? You wouldn't even have the enjoyment of struggling to find the perfect art piece meant for you; search algorithms would be absolutely insanely well-tuned to each type of person by then.

                      Ban AI research tbh.

                      Comment


                        #31
                        Originally posted by Helly View Post

                        As an existential threat? Not really....Unless we're dumb enough to program them with a sense of self-interest, it's unlikely any machine will lead a revolution to kill humans. I don't think neuromorphic computers could grow to simulate something as intricate as that kind of conscious desire, though - not without our help, at least.

                        My concern would be more with AI making humans obsolete in the sense that it will advance to a point where there will never be a need for human doctors, human construction workers, writers and other entertainers, government officials, etc. People tend to think of things as being "too complex" to really be mastered by any machine, until we eventually build one that not only masters it but builds upon it exponentially with an efficiency that no human could ever in a thousand lifetimes hope to match. Everything has a formula. Everything. There's nothing mysterious about anything we do, we just don't have the intelligence to fully understand all of it yet. But struggling through it all is what's so fun about it. Putting immense effort into something, and eventually being awarded for all your hours of hard work...it gives you such a sense of fulfilment that we seriously risk losing if we let AI develop to a point where it could just solve all of our problems for us.
                        And yet, we would have designed something that would be worthy of inheriting our place in the cosmos.
                        Originally posted by Wade
                        Everything is hidden in plain sight, like in Men in Black. We've all just been neuralized to think it is "normal".

                        Comment


                          #32
                          Originally posted by OrganizationXV View Post

                          And yet, we would have designed something that would be worthy of inheriting our place in the cosmos.
                          Maybe....but what's our end-goal, at the end of the day? I don't think it's all about conquest. That may be the purpose that DNA 'intended' for all of its various vehicles, but we compromise and ignore biology constantly for the sake of something else.

                          Comment


                            #33
                            Originally posted by Helly View Post

                            Maybe....but what's our end-goal, at the end of the day? I don't think it's all about conquest. That may be the purpose that DNA 'intended' for all of its various vehicles, but we compromise and ignore biology constantly for the sake of something else.
                            I don't think we really have an end-goal, considering we're a species with half a thousand different cultures, each with its own understanding of many different values. It's mostly just seeing how far we can push things, I think.
                            Originally posted by Wade
                            Everything is hidden in plain sight, like in Men in Black. We've all just been neuralized to think it is "normal".

                            Comment


                              #34
                              Having the hardware capacity to make terminators is one thing, but having the software is key. Without it, it doesn't matter if we manage to build a computer that does calculations a septillion times faster; that machine will not suddenly gain sentience. That's not how life, physics, or reality works.

                              What Helly said is actually more likely, and CGP Grey did a rather informative video on it. This is one of the reasons Andrew Yang was pushing for UBI. For his sources, check the description; he lists them all.



                              Machine learning is what you really want to watch to see how it progresses. At the moment it is very primitive, but with enough data it is capable of some amazing things.




                              Now, even with this, we won't get terminators just yet, or a Matrix universe. With machine learning we are basically brute-forcing the software into figuring things out with enough data. We tell it the answers we want, and every time it gets one wrong we tell it no and have it try again, until it gets a yes enough times that it has "figured out" the problem and appears to have "learned."
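
                              To make that loop concrete, here's a tiny toy sketch in Python of the "guess, get told no, adjust, try again" idea: a perceptron being corrected until its answers match. The learn_and function, the AND-gate data, and the numbers are all made up for illustration; they aren't from any of the videos or tools mentioned here.

# Toy illustration of the guess/correct loop described above.
# The function name, data, and constants are made up for this example.

def learn_and(lr=0.1, max_epochs=100):
    # training data: inputs -> the answer we want the machine to give (logical AND)
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = [0.0, 0.0]  # start knowing nothing
    b = 0.0

    for _ in range(max_epochs):
        mistakes = 0
        for (x1, x2), target in data:
            guess = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = target - guess  # 0 means "yes", nonzero means "no, try again"
            if error != 0:
                mistakes += 1
                # nudge the weights toward the wanted answer before the next pass
                w[0] += lr * error * x1
                w[1] += lr * error * x2
                b += lr * error
        if mistakes == 0:  # it finally gets a "yes" on every example
            break
    return w, b

w, b = learn_and()
print(w, b)  # weights that happen to reproduce AND; no "understanding" anywhere

                              That's all the "learning" is at this level: repeated correction until the outputs match the answers we wanted.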

                              We as a species will never achieve actual artificial intelligence until we learn how consciousness works. Only then will we be able to create our own sentient life in machines. Until then, everything we try will be faking it, operating within the limited parameters we set for it. There will be no way for it to grow as a person.

                              Comment


                                #35
                                Originally posted by OrganizationXV View Post

                                I don't think we really have an end-goal, considering we're a species with half a thousand different cultures, each with its own understanding of many different values. It's mostly just seeing how far we can push things, I think.
                                It might not be a completely united end goal, but I think the way most people behave points to one commonality: we want to be rewarded for our efforts. Our drive to improve ourselves, to work hard at something and get better, is the thing that keeps us all going. Ever hear the phrase "life is suffering"? I usually hear it in a very tragic context, but to me it's a pretty great insight, because a lot of what you'll hear from people who are successful and have made enough to be safe in life is that they miss the fire they felt when they were just starting out. Even if it's a relatively small amount of success, even if it's Egoraptor/Arin Hanson or Matt and Ryan from Supermega reminiscing about when their stuff was completely unknown and they had to put a lot of effort into making good content, constantly having to improve or suffer the humiliation of failing at something. The uncertainty was hard for them at the time, and the thing driving them was the hope that there was a light at the end of the tunnel, the hunger to succeed, to be rewarded for all of their efforts, to be able to walk on their own two feet and enjoy the fruits of their labor if they can just get through these long sleepless nights, if they can just finish this batch of videos, and then their rewards come, and then.....!!!!!

                                And then they miss the sleepless nights. It's like playing a game you've already 100%'d. There's no real fear from any of the monsters anymore. Nothing can hurt you, all your stats are maxed out, you've explored all the dungeons so there's no mystery, you have more gold than you will ever need again, and now all these hurdles roaming the overworld just kind of become very minor inconveniences, more tedious and annoying than threatening and challenging. It's frikkin torture.

                                "To live is to suffer"......I used to hear that, and I used to believe in it in the sense "life is horrible and full of horrible and unfair tragedy". But nowadays, I hear it and I choose to think of it as "to feel alive, you must suffer". We must suffer our ordeals, frens. We must reach out far, far ahead of ourselves, well beyond a horizon that is visible to us, and simpy keep running towards it, and hope that we don't reach it too soon lest we find ourselves trying to run backwards ;]

                                Comment
