RIP Stephen Hawking - Remembering His Warning (Original Post) JimGinPA Mar 2018 OP
Our destiny is much like that of the dinosaurs, tavernier Mar 2018 #1
Ironic, because we always held up the demise of the dinosaurs as the proof that they couldn't adapt. Nitram Mar 2018 #5
Well, don't self-exterminate until after you've voted this year. Hortensis Mar 2018 #11
And the byproducts from those dinosaurs.... SergeStorms Mar 2018 #17
No, that's not our destiny Plucketeer Mar 2018 #19
Kick with a candle... NeoGreen Mar 2018 #2
Yes, let us remember the words of this great man. PatrickforO Mar 2018 #3
RIP Mr. Hawking usaf-vet Mar 2018 #4
Oh no! One of my favorite people...😥 BlancheSplanchnik Mar 2018 #6
Yes! :) One of mankind's favorite people. Hortensis Mar 2018 #13
Yes....yes yes.... BlancheSplanchnik Mar 2018 #14
What a lovely idea. You know, he could have Hortensis Mar 2018 #16
I could imagine he'd enjoy using it to heckle! BlancheSplanchnik Mar 2018 #20
Lol. He certainly didn't need it to be a big voice. Hortensis Mar 2018 #21
Wise words from a wise man. NT dreamland Mar 2018 #7
Brilliantly Insightful Words to Live By dlk Mar 2018 #8
His most immediate warning- poboy2 Mar 2018 #9
Technological singularity From Wikipedia, the free encyclopedia poboy2 Mar 2018 #10
Thank you Mr. Hawking bdamomma Mar 2018 #12
This makes me very sad. sweetroxie Mar 2018 #15
I really liked the guy mitch96 Mar 2018 #18

Nitram

(22,794 posts)
5. Ironic, because we always held up the demise of the dinosaurs as the proof that they couldn't adapt.
Wed Mar 14, 2018, 09:42 AM
Mar 2018

Now who's laughing?

SergeStorms

(19,199 posts)
17. And the byproducts from those dinosaurs....
Wed Mar 14, 2018, 10:47 AM
Mar 2018

the black goo that humans so greatly crave will be our demise. Oil, petro-chemicals, and especially plastics are killing animals, the oceans, and ultimately humans themselves. The water we drink is so polluted with micro-beads of plastic that who knows what effects they'll have on the human body? Safe to say, it isn't going to be good.

There is no intelligent life on earth.

 

Plucketeer

(12,882 posts)
19. No, that's not our destiny
Wed Mar 14, 2018, 11:18 AM
Mar 2018

There were NO guarantees for the creatures of this orb when they enjoyed their heydays here. Whether they died from super-volcanoes or asteroids doesn't matter; for the most part they died off almost totally. But like the phoenix, life arose from the wreckage and set about adapting (evolving) to the revised climate that developed after the cataclysmic die-off.
The next big extinction event is going to happen, one way or another. Do we have the power to keep it from being the result of our inability to handle what we've wrought with our capable brains? Or will Yellowstone explode, or will a giant, potato-looking rock dumbly blunder its way into a collision with our fragile spinner of a home? That's why I say it's not our destiny to go out in a nuclear haze. It's an option we can choose, but so is the slow-motion explosion we're in the midst of right now: global warming. Global warming has the potential to cause as much death as a limited nuclear shootout, or more. But again, don't discount the possibility of a big rock or a Yellowstone blowout. Both are possibilities we (unlike previous waves of life) can comprehend but can do nothing about. As for doing ourselves in: even if global solidarity and sanity broke out and infected every human on earth, there's still a fair chance we could literally stand and watch as our demise played out through no one's fault.

PatrickforO

(14,572 posts)
3. Yes, let us remember the words of this great man.
Wed Mar 14, 2018, 09:22 AM
Mar 2018

He was trapped in that disabled body for so long, and did so much good anyway, that he is an inspiration to all of us.

Let us hope he is free now.

BlancheSplanchnik

(20,219 posts)
6. Oh no! One of my favorite people...😥
Wed Mar 14, 2018, 09:53 AM
Mar 2018

Yes. Overpopulation. We can’t continue as we’ve been going.

We will miss you Sir, Dr. Hawking. 😪💔

Hortensis

(58,785 posts)
13. Yes! :) One of mankind's favorite people.
Wed Mar 14, 2018, 10:31 AM
Mar 2018

He's being mourned around the planet. Imagine being such a person.

"Some people would claim that things like love, joy and beauty belong to a different category from science and can't be described in scientific terms, but I think they can now be explained by the theory of evolution." Stephen Hawking

He wasn't quite sure yet that time travel would be possible, but if it is, we will for sure be visiting him again.



BlancheSplanchnik

(20,219 posts)
14. Yes....yes yes....
Wed Mar 14, 2018, 10:41 AM
Mar 2018

I think we get to time travel as much as we want after we die. Maybe a visit with Dr. Hawking (to go to a comedy show together!) will be in order.

Hortensis

(58,785 posts)
16. What a lovely idea. You know, he could have
Wed Mar 14, 2018, 10:46 AM
Mar 2018

updated that half-century-old voice generator, but he said he never found a voice he liked better and it had become part of his identity. If anyone could find a way to have it in a next existence...

 

poboy2

(2,078 posts)
9. His most immediate warning-
Wed Mar 14, 2018, 10:15 AM
Mar 2018
Is Stephen Hawking Right? Could AI Lead To The End Of Humankind?
“Once humans develop artificial intelligence, it would take off on its own and re-design itself at an ever increasing rate,” he reportedly told the BBC.

“The development of full artificial intelligence could spell the end of the human race.”

Could thinking machines take over?

I appreciate the issue of computers taking over (and one day ending humankind) being raised by someone as high profile, able and credible as Prof Hawking – and it deserves a quick response.

The issue of machine intelligence goes back at least as far as the British code-breaker and father of computer science, Alan Turing in 1950, when he considered the question: “Can machines think?”

The issue of these intelligent machines taking over has been discussed in one way or another in a variety of popular media and culture. Think of the movies Colossus: The Forbin Project (1970) and Westworld (1973), and, more recently, Skynet in the 1984 movie The Terminator and its sequels, to name just a few.

Common to all of these is the issue of delegating responsibility to machines. The notion of the technological singularity (or machine super-intelligence) is something which goes back at least as far as artificial intelligence pioneer, Ray Solomonoff – who, in 1967, warned:

Although there is no prospect of very intelligent machines in the near future, the dangers posed are very serious and the problems very difficult. It would be well if a large number of intelligent humans devote a lot of thought to these problems before they arise.

It is my feeling that the realization of artificial intelligence will be a sudden occurrence. At a certain point in the development of the research we will have had no practical experience with machine intelligence of any serious level: a month or so later, we will have a very intelligent machine and all the problems and dangers associated with our inexperience.

As well as giving this variant of Hawking’s warning back in 1967, in 1985 Solomonoff endeavoured to give a time scale for the technological singularity and reflect on social effects.

I share the concerns of Solomonoff, Hawking and others regarding the consequences of faster and more intelligent machines – but American author, computer scientist and inventor, Ray Kurzweil, is one of many seeing the benefits.

Whoever might turn out to be right (provided our planet isn’t destroyed by some other danger in the meantime), I think Solomonoff was prescient in 1967 in advocating we devote a lot of thought to this.

Machines already taking over

In the meantime, we see increasing amounts of responsibility being delegated to machines. On the one hand, this might be hand-held calculators, routine mathematical calculations or global positioning systems (GPSs).

On the other hand, this might be systems for air traffic control, guided missiles, driverless trucks on mine sites or the recent trial appearances of driverless cars on our roads.

Humans delegate responsibility to machines for reasons including improving time, cost and accuracy. But mishaps involving damage by, say, a driverless vehicle raise nightmarish questions of legal liability, insurance and attribution of responsibility.

It is argued that computers might take over when their intelligence supersedes that of humans. But there are also other risks with this delegation of responsibility.

-
http://www.iflscience.com/technology/stephen-hawking-right-could-ai-lead-end-humankind/
 

poboy2

(2,078 posts)
10. Technological singularity From Wikipedia, the free encyclopedia
Wed Mar 14, 2018, 10:16 AM
Mar 2018


The technological singularity (also, simply, the singularity)[1] is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.[2] According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a "runaway reaction" of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence. Stanislaw Ulam reports a discussion with John von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".[3] Subsequent authors have echoed this viewpoint.[2][4] I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.[5] Emeritus professor of computer science at San Diego State University and science fiction author Vernor Vinge said in his 1993 essay The Coming Technological Singularity that this would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.[5]

Four polls conducted in 2012 and 2013 suggested that the median estimate among experts for when artificial general intelligence (AGI) would arrive was 2040 to 2050, depending on the poll.[6][7]

Many notable personalities, including Stephen Hawking and Elon Musk, consider the uncontrolled rise of artificial intelligence as a matter of alarm and concern for humanity's future.[8][9] The consequences of the singularity and its potential benefit or harm to the human race have been hotly debated by various intellectual circles.[citation needed]
-
https://en.wikipedia.org/wiki/Technological_singularity
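The "runaway reaction" of self-improvement cycles described in the excerpt above can be illustrated with a toy numerical sketch. Everything here (the function name, the multiplicative model, the 50% per-cycle gain) is an invented assumption for illustration only, not anything from the article:

```python
# Toy illustration of the "intelligence explosion" idea quoted above:
# each generation improves itself, and because the improvement is
# proportional to current capability, growth compounds exponentially.
# All numbers are arbitrary assumptions chosen for illustration.

def self_improvement(initial=1.0, gain=0.5, generations=10):
    """Return capability after each generation when every cycle
    multiplies capability by (1 + gain)."""
    capability = initial
    history = [capability]
    for _ in range(generations):
        capability *= 1 + gain  # improvement proportional to current level
        history.append(capability)
    return history

trajectory = self_improvement()
# After n cycles, capability is (1 + gain)**n times the starting level.
print(trajectory[-1])  # 57.6650390625 (= 1.5**10)
```

The point of the sketch is only that compounding self-improvement dwarfs linear progress within a few cycles, which is the intuition behind Good's "intelligence explosion" model; it says nothing about whether such a process is actually achievable.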

mitch96

(13,895 posts)
18. I really liked the guy
Wed Mar 14, 2018, 10:59 AM
Mar 2018

Smart and funny too!
Heard this on the news:
there are some weird coincidences about him...
He was born on the anniversary of Galileo's death. Big mind.
He died on the day Albert Einstein was born... another big mind.
That date is March 14, aka 3/14: Pi Day! A very big number.
A fitting tribute to a physics guy... 3.1416 ad nauseam

m
