 
Mar 27, 2016, 04:14 PM
ScientiaOmnisEst
Poohbah
 
Member Since: Sep 2015
Location: Upstate NY
Posts: 1,130
Quote:
Originally Posted by qwerty68
As a computer scientist (MS in the field), it is my opinion that many of your fears are unfounded. It is true that some jobs will be replaced (and have been, to a small extent), but it is not likely that all jobs will ever be replaced, including those that don't require degrees.

Saying computers are dumb is not even the best way to put it, because they have zero intellectual and creative abilities. Zip, nada. Computers can do basic math, read from various kinds of memory, and write to memory or the screen, and not much else. No matter how complex the task, it gets narrowed down to those basic operations (more or less; I'm keeping this non-technical). Software is dumb as well; it can be made to mimic intelligent systems in a very narrow scope, but it is not intelligent.

When the assembly lines first started up, people had many of the same fears. Even "unskilled" jobs will not all be replaced by machines in your or even your grandkids' lifetimes. There would need to be huge advances in software and robotics for that to happen.

As a recent example, Microsoft released an "AI" chat program, and within two days they had to shut it down because it became racist. The reason is that people "taught" it to be that way. It would store people's responses to other people's questions and just spew them back out. That is the state of AI. It has no ability to reason or think creatively.
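To give a rough idea of what I mean (just a toy sketch I made up for illustration, nothing like Microsoft's actual code), the "learning" described above boils down to storing replies and repeating them:

import random

# Toy "echo" chatbot: no understanding at all, it just memorizes what
# people say and spews it back out later.
memory = {}  # maps a prompt to the replies people have given to it

def learn(prompt, reply):
    memory.setdefault(prompt.lower(), []).append(reply)

def respond(prompt):
    replies = memory.get(prompt.lower())
    if not replies:
        return "I don't know."
    return random.choice(replies)  # repeat whatever it was "taught"

learn("what do you think of people?", "People are the worst.")
print(respond("what do you think of people?"))  # parrots the taught reply

Teach it garbage and it repeats garbage; there is no judgment anywhere in there.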

Those chess programs that beat chess masters are similar. They don't think; they were just "taught" the rules of chess and fed moves from thousands of games played by masters. They have the advantage of quickly recalling counter-moves from a pool much larger than a person can hold in their head.
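Again, just a toy sketch with made-up example moves (a real engine's pool is vastly larger), but the "recall a counter-move" idea looks something like this:

from collections import Counter

# Tiny "opening book": for each sequence of moves seen so far, count which
# replies showed up in past games, then answer with the most common one.
games = [
    ["e4", "e5", "Nf3", "Nc6"],
    ["e4", "c5", "Nf3", "d6"],
    ["e4", "e5", "Bc4", "Nf6"],
]

book = {}
for game in games:
    for i in range(1, len(game)):
        book.setdefault(tuple(game[:i]), Counter())[game[i]] += 1

def reply(moves_so_far):
    counts = book.get(tuple(moves_so_far))
    return counts.most_common(1)[0][0] if counts else None

print(reply(["e4"]))  # "e5": the most common reply in the pool of games

There is no chess understanding in there at all, just bookkeeping and lookup.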

For a computer program to write an actual novel, it would have to follow the 100 monkeys idea (put 100 monkeys at 100 typewriters and eventually one will write a novel), and it wouldn't even know it had created a novel.

I don't know a lot about robotics, but I haven't seen a robot with the dexterity of a human; maybe they exist.

Look at it this way: if what you fear comes true, then nearly everyone will be in the same boat as you. I am not saying this because I am currently unemployable; it is what I have always believed: work is not life, and a life shouldn't be measured by a person's job. Work pays for life, and if humans were freed from having to work, things that are starting to be proposed, like a basic income, would become reality, and people would have 8-10 more hours a day to do something else, which may or may not be a good thing.

We are so far away from what you fear that worrying about it is unproductive. I know that is easy to say, and I have all sorts of fears that I can't get rid of, so please don't take this as me trivializing your fears.

I just saw this now, after spending all day since this morning ruminating (to the point of tears) on some existential fears about human obsolescence, desperately looking for someone to talk to, particularly someone in the field.

My biggest fear is when machines are sapient enough to have creative, intellectual abilities. A machine that can, without human assistance, create artwork, make a theoretical scientific discovery, invent a thought experiment, invent anything for that matter, or heuristically reason its way around a philosophical argument.

Basically, when machines are better at virtually everything than humans, and are carrying on our legacy while we're still around. Leaving us totally useless, with nothing to do and nothing worth doing. Our sole value has been replicated cheaply and surpassed.

I suppose I have some terror of uniquely human things becoming digitally simulatable. Of... I struggle to express it... of the entire human condition, all our experiences, dreams, thoughts, our search for meaning, getting handed over to machines because we built things better than us. When there's nothing left for us to do and we have no choice but to die out.

Some laymen, when I express these fears and wonder how to cope, basically say something about hubris and useless self-worth, that we shouldn't hold back progress because it hurts our egos. It's evolution, it can't be helped, no use worrying over something you can't change. Fine, I can't change it. I want to know how to cope with everything I do for my own self-worth soon being rendered useless by superior machines. Not even people I could compete with.

In addition, there's this impression I can't get rid of that machines taking over all of human life somehow makes all our abstract, personal, feeling things (our psychology, our longing for identity, our spiritual searches) especially worthless and useless. How can our lives mean anything to ourselves if we cede our major agency to smarter machines we created? What does it even mean, philosophically, that we did that?

It's easy to think of humanity as ephemeral when we'll all die out in a few million years, or modify ourselves (the best option, I think, if we make machines better than us at everything); not so much when we'll all be obsolete in a few decades. Why do anything? It's all going to be empty very, very soon.

Please send your thoughts; this is causing me too much pain. I never know whom to trust about these things: the experts who say we'll have intellectual machines in 40 years, or the ones who believe in (as I, honestly, long for) areas of human exceptionality.

The thing about the racist chatbot made me laugh, and I needed that. I'm not okay today.