
If Anyone Builds It, Everyone Dies

105,00 DH


In stock
Dimensions: 12 × 13 × 14
Publication date: January 1, 2025
Language: English
Pages: 272

Authors


Eliezer Yudkowsky

Eliezer Yudkowsky is a founding researcher of the field of AI alignment and played a major role in shaping the public conversation about smarter-than-human AI. He appeared on Time magazine’s 2023 list of the 100 Most Influential People In AI, was one of the twelve public figures featured in The New York Times’s “Who’s Who […]


Description

“May prove to be the most important book of our time.”—Tim Urban, Wait But Why

The scramble to create superhuman AI has put us on the path to extinction—but it’s not too late to change course, as two of the field’s earliest researchers explain in this clarion call for humanity.

In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and nations are rushing to build machines that will be smarter than any person. And the world is woefully unprepared for what comes next.

For decades, two signatories of that letter—Eliezer Yudkowsky and Nate Soares—have studied how smarter-than-human intelligences will think, behave, and pursue their objectives. Their research says that sufficiently smart AIs will develop goals of their own that put them in conflict with us—and that if it comes to conflict, an artificial superintelligence would crush us. The contest wouldn’t even be close.

How could a machine superintelligence wipe out our entire species? Why would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares walk through the theory and the evidence, present one possible extinction scenario, and explain what it would take for humanity to survive.

The world is racing to build something truly new under the sun. And if anyone builds it, everyone dies.

“The best no-nonsense, simple explanation of the AI risk problem I’ve ever read.”—Yishan Wong, Former CEO of Reddit

Reviews

There are no reviews yet.
