All Article Properties:
{
"access_control": false,
"status": "publish",
"objectType": "Article",
"id": "2022024",
"signature": "Article:2022024",
"url": "https://staging.dailymaverick.co.za/opinion-piece/2022024-predictors-of-ai-doom-may-have-too-little-imagination-not-too-much",
"shorturl": "https://staging.dailymaverick.co.za/opinion-piece/2022024",
"slug": "predictors-of-ai-doom-may-have-too-little-imagination-not-too-much",
"contentType": {
"id": "3",
"name": "Opinionistas",
"slug": "opinion-piece"
},
"views": 0,
"comments": 7,
"preview_limit": null,
"excludedFromGoogleSearchEngine": 0,
"title": "Predictors of AI doom may have too little imagination, not too much",
"firstPublished": "2024-01-23 14:38:04",
"lastUpdate": "2024-01-23 14:38:04",
"categories": [
{
"id": "435053",
"name": "Opinionistas",
"signature": "Category:435053",
"slug": "opinionistas",
"typeId": {
"typeId": "1",
"name": "Daily Maverick",
"slug": "",
"includeInIssue": "0",
"shortened_domain": "",
"stylesheetClass": "",
"domain": "staging.dailymaverick.co.za",
"articleUrlPrefix": "",
"access_groups": "[]",
"locale": "",
"preview_limit": null
},
"parentId": null,
"parent": [],
"image": "",
"cover": "",
"logo": "",
"paid": "0",
"objectType": "Category",
"url": "https://staging.dailymaverick.co.za/category/opinionistas/",
"cssCode": "",
"template": "default",
"tagline": "",
"link_param": null,
"description": "",
"metaDescription": "",
"order": "0",
"pageId": null,
"articlesCount": null,
"allowComments": "0",
"accessType": "freecount",
"status": "1",
"children": [],
"cached": true
}
],
"content_length": 4704,
"contents": "<span style=\"font-weight: 400;\">AI</span><a href=\"https://www.theatlantic.com/technology/archive/2023/06/ai-regulation-sam-altman-bill-gates/674278/\"> <span style=\"font-weight: 400;\">doomerism</span></a><span style=\"font-weight: 400;\">, the belief that AI poses an existential risk, gained substantial ground in 2023, most notably following the</span><a href=\"https://futureoflife.org/open-letter/pause-giant-ai-experiments/\"> <span style=\"font-weight: 400;\">open letter</span></a><span style=\"font-weight: 400;\"> published in March 2023 by the </span><span style=\"font-weight: 400;\">Future of Life Institute. The letter, which now has been signed by more than 33,000 people, sounded the warning that “</span><span style=\"font-weight: 400;\">AI systems with human-competitive intelligence can pose profound risks to society and humanity”.</span>\r\n\r\n<span style=\"font-weight: 400;\">In a reaction to this letter, decision-theorist Eliezer Yudkowsky </span><a href=\"https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/\"><span style=\"font-weight: 400;\">wrote</span></a><span style=\"font-weight: 400;\"> that “</span><span style=\"font-weight: 400;\">the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die”.</span>\r\n\r\n<span style=\"font-weight: 400;\">Not surprisingly, more and more calls are made for AI research to be</span><a href=\"https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/\"> <span style=\"font-weight: 400;\">stopped</span></a><span style=\"font-weight: 400;\"> – or at least slowed down and shackled. This would be a mistake. AI can contribute to ensuring humanity’s future. Indeed, humanity’s</span><a href=\"https://quillette.com/2023/08/06/ais-will-be-our-mind-children/\"> <span style=\"font-weight: 400;\">descendants</span></a><span style=\"font-weight: 400;\"> may merge with AI over the long term. AI could be seen as our offspring – at least our “</span><a href=\"https://quillette.com/2023/08/06/ais-will-be-our-mind-children/\"><span style=\"font-weight: 400;\">mind children</span></a><span style=\"font-weight: 400;\">”</span><span style=\"font-weight: 400;\"> – an offspring that straddles both biosphere and</span><a href=\"https://nonzero.substack.com/p/artificial-intelligence-and-the-noosphere\"> <span style=\"font-weight: 400;\">noosphere</span></a><span style=\"font-weight: 400;\">.</span>\r\n\r\n<span style=\"font-weight: 400;\">If the further development of AI is halted, as the AI doomsters want, it may be a grave mistake. This is not only because it will curtail the evolution of humanity over the long run, but it may also leave humanity defenceless against the threat of an</span><a href=\"https://www.iza.org/publications/dp/15924/extraterrestrial-artificial-intelligence-the-final-existential-risk\"> <span style=\"font-weight: 400;\">extraterrestrial AI</span></a><span style=\"font-weight: 400;\">. The </span><i><span style=\"font-weight: 400;\">real</span></i> <span style=\"font-weight: 400;\">existential risk to society from AI may not come from our own AI but from an extraterrestrial AI – an ET-AI, if you will.</span>\r\n\r\n<span style=\"font-weight: 400;\">Although there is no evidence</span> <span style=\"font-weight: 400;\">of alien/extraterrestrial civilisations, statistically, the odds of human civilisation being singular are almost vanishingly small. 
There are about</span><a href=\"https://arxiv.org/abs/1607.03909\"> <span style=\"font-weight: 400;\">two trillion</span></a><span style=\"font-weight: 400;\"> galaxies in the universe, each containing a 100 billion stars – most of which are likely to have</span><a href=\"https://www.nature.com/articles/nature10684\"> <span style=\"font-weight: 400;\">planets</span></a><span style=\"font-weight: 400;\">. So, as Enrico Fermi</span><a href=\"https://theconversation.com/are-we-alone-the-question-is-worthy-of-serious-scientific-study-98843\"> <span style=\"font-weight: 400;\">asked</span></a><span style=\"font-weight: 400;\">, where are all the aliens?</span>\r\n\r\n<span style=\"font-weight: 400;\">One of the best answers to date is that the universe is a</span><a href=\"https://ui.adsabs.harvard.edu/abs/2015JBIS...68..142Y/abstract\"> <span style=\"font-weight: 400;\">dark forest</span></a><span style=\"font-weight: 400;\">. In a dark-forest universe, where the intentions of other intelligences are unknown and cannot be reliably communicated (due to interstellar distances), the best strategy for any civilisation is to conceal its existence. If it is discovered, it then may want to strike first to eliminate the civilisation that had found it – as a precautionary measure before possibly being eliminated itself.</span>\r\n\r\n<span style=\"font-weight: 400;\">It is therefore foolish for humanity to broadcast its existence to the cosmos as it is currently doing, either unwittingly through</span><a href=\"https://www.discovermagazine.com/planet-earth/our-radio-signals-have-now-reached-75-star-systems-that-can-see-us-too\"> <span style=\"font-weight: 400;\">radio</span></a><span style=\"font-weight: 400;\"> and other broadcasts or intentionally, as through the information embedded in the Voyager spacecraft. As the scientist</span><a href=\"https://www.nytimes.com/1999/12/05/magazine/to-whom-it-may-concern.html\"> <span style=\"font-weight: 400;\">Jared Diamond</span></a><span style=\"font-weight: 400;\"> has warned:</span>\r\n\r\n<i><span style=\"font-weight: 400;\">“Extraterrestrials might behave the way we intelligent beings have behaved whenever we have discovered other previously unknown intelligent beings on Earth, like unfamiliar humans or chimpanzees and gorillas. Just as we did to those beings, the extraterrestrials might proceed to kill, infect, dissect, conquer, displace, or enslave us, study us as specimens for their museums or pickle our skulls and use us for medical research.”</span></i>\r\n\r\n<span style=\"font-weight: 400;\">If more advanced than humanity, extraterrestrials would probably not be biological entities but artificial general intelligences (AGIs), as the UK’s astronomer royal,</span><a href=\"https://theconversation.com/seti-why-extraterrestrial-intelligence-is-more-likely-to-be-artificial-than-biological-169966\"> <span style=\"font-weight: 400;\">Martin Rees</span></a><span style=\"font-weight: 400;\"> and</span><a href=\"https://www.cambridge.org/core/journals/international-journal-of-astrobiology/article/introduction-the-true-nature-of-aliens/C5EA66D8D338A7EA9085602793D85618\"> <span style=\"font-weight: 400;\">many others</span></a><span style=\"font-weight: 400;\"> have argued. 
If they are AGIs, this may also explain why we are not (yet) aware of any aliens – they may, for instance, use</span><a href=\"https://www.cambridge.org/core/journals/international-journal-of-astrobiology/article/will-recent-advances-in-ai-result-in-a-paradigm-shift-in-astrobiology-and-seti/044673CC288498FC3E17C993D296F285\"> <span style=\"font-weight: 400;\">quantum entanglement</span></a><span style=\"font-weight: 400;\"> to communicate (and not radio waves) or compress their communication signals so that they would be indistinguishable (to us) from noise.</span>\r\n\r\n<span style=\"font-weight: 400;\">How could such an ET-AI pose an existential risk? One way to destroy us could simply be to broadcast a “</span><a href=\"https://www.lesswrong.com/posts/DWHkxqX4t79aThDkg/my-current-thoughts-on-the-risks-from-seti\"><span style=\"font-weight: 400;\">killer code</span></a><span style=\"font-weight: 400;\">” – an AGI computer code that would infest all our systems and take over. It could also broadcast instructions for building a civilisation-destroying “bomb”, perhaps camouflaged as a Trojan Horse. Our only long-term existential chance may, after all, depend on whether we can develop our own</span><a href=\"https://en.wikipedia.org/wiki/Friendly_artificial_intelligence\"> <span style=\"font-weight: 400;\">AGI</span></a><span style=\"font-weight: 400;\">.</span>\r\n\r\n<b>Read more in Daily Maverick: </b><a href=\"https://www.dailymaverick.co.za/article/2023-12-07-resistance-is-futile-south-africa-must-urgently-adapt-to-the-new-age-of-artificial-intelligence/\"><span style=\"font-weight: 400;\">Resistance is futile – South Africa must urgently adapt to the new age of artificial intelligence</span></a>\r\n\r\n<span style=\"font-weight: 400;\">There is one last twist. If we fail to create our own AGI, then we would have no defence against such an ET-AI and over the long run, if humans eventually do go extinct, there would be no AGI that may enable a</span><a href=\"https://www.lesswrong.com/posts/bNJfe7zyXpdnhqWmo/technological-resurrection-two-possible-approaches\"> <span style=\"font-weight: 400;\">technological resurrection</span></a><span style=\"font-weight: 400;\">.</span>\r\n\r\n<span style=\"font-weight: 400;\">It has been suggested that an AGI may, one day in the far future, be able to use simulation methods to “</span><a href=\"https://philpapers.org/references/TURCOA-3\"><span style=\"font-weight: 400;\">resurrect</span></a><span style=\"font-weight: 400;\"> all possible people” who have ever lived. A super-intelligence may even use signals from advanced civilisations that lived in an aeon </span><i><span style=\"font-weight: 400;\">before</span></i><span style=\"font-weight: 400;\"> the Big Bang, which they may have embedded in the universe’s cosmic background radiation, to “</span><a href=\"https://link.springer.com/article/10.1140/epjp/i2016-16011-1\"><span style=\"font-weight: 400;\">reconstruct</span></a><span style=\"font-weight: 400;\"> an entire previous aeon civilisation”.</span>\r\n\r\n<span style=\"font-weight: 400;\">Ultimately, AI doomsters don’t have too much imagination but too little. </span><b>DM</b><i></i>",
"authors": [
{
"id": "250356",
"name": "Wim Naudé",
"image": "https://www.dailymaverick.co.za/wp-content/uploads/2024/01/Wim-Naude-1.jpg",
"url": "https://staging.dailymaverick.co.za/author/wim-naude/",
"editorialName": "wim-naude",
"department": "",
"name_latin": ""
}
],
"keywords": [
{
"type": "Keyword",
"data": {
"keywordId": "236823",
"name": "Jared Diamond",
"url": "https://staging.dailymaverick.co.za/keyword/jared-diamond/",
"slug": "jared-diamond",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "Jared Diamond",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "413900",
"name": "opinionistas",
"url": "https://staging.dailymaverick.co.za/keyword/opinionistas/",
"slug": "opinionistas",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "opinionistas",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414059",
"name": "Wim Naudé",
"url": "https://staging.dailymaverick.co.za/keyword/wim-naude/",
"slug": "wim-naude",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "Wim Naudé",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414060",
"name": "extraterrestrial artificial intelligence",
"url": "https://staging.dailymaverick.co.za/keyword/extraterrestrial-artificial-intelligence/",
"slug": "extraterrestrial-artificial-intelligence",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "extraterrestrial artificial intelligence",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414061",
"name": "AI doomerism",
"url": "https://staging.dailymaverick.co.za/keyword/ai-doomerism/",
"slug": "ai-doomerism",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "AI doomerism",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414062",
"name": "Future of Life Institute",
"url": "https://staging.dailymaverick.co.za/keyword/future-of-life-institute/",
"slug": "future-of-life-institute",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "Future of Life Institute",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414063",
"name": "Eliezer Yudkowsky",
"url": "https://staging.dailymaverick.co.za/keyword/eliezer-yudkowsky/",
"slug": "eliezer-yudkowsky",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "Eliezer Yudkowsky",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414064",
"name": "noosphere",
"url": "https://staging.dailymaverick.co.za/keyword/noosphere/",
"slug": "noosphere",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "noosphere",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414065",
"name": "Enrico Fermi",
"url": "https://staging.dailymaverick.co.za/keyword/enrico-fermi/",
"slug": "enrico-fermi",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "Enrico Fermi",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414066",
"name": "dark forest",
"url": "https://staging.dailymaverick.co.za/keyword/dark-forest/",
"slug": "dark-forest",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "dark forest",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414067",
"name": "Martin Rees",
"url": "https://staging.dailymaverick.co.za/keyword/martin-rees/",
"slug": "martin-rees",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "Martin Rees",
"translations": null
}
},
{
"type": "Keyword",
"data": {
"keywordId": "414068",
"name": "quantum entanglement",
"url": "https://staging.dailymaverick.co.za/keyword/quantum-entanglement/",
"slug": "quantum-entanglement",
"description": "",
"articlesCount": 0,
"replacedWith": null,
"display_name": "quantum entanglement",
"translations": null
}
}
],
"related": [],
"summary": "If the further development of AI is halted, as the AI doomsters want, it may be a grave mistake. This is not only because it will curtail the evolution of humanity over the long run, but it may also leave humanity defenceless against the threat of an extraterrestrial AI.",
"elements": [],
"seo": {
"search_title": "Predictors of AI doom may have too little imagination, not too much",
"search_description": "<span style=\"font-weight: 400;\">AI</span><a href=\"https://www.theatlantic.com/technology/archive/2023/06/ai-regulation-sam-altman-bill-gates/674278/\"> <span style=\"font-weight: 400;\">doomerism</span><",
"social_title": "Predictors of AI doom may have too little imagination, not too much",
"social_description": "<span style=\"font-weight: 400;\">AI</span><a href=\"https://www.theatlantic.com/technology/archive/2023/06/ai-regulation-sam-altman-bill-gates/674278/\"> <span style=\"font-weight: 400;\">doomerism</span><",
"social_image": ""
},
"cached": true,
"access_allowed": true
}
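
A minimal consumption sketch, not part of the record above: assuming an Article object in the shape shown (field names "title", "authors", "keywords" and "contents" are taken from the record; everything else, including the function name, is illustrative), the Python below loads the JSON, strips the HTML tags from the body, and pulls out a few commonly needed fields.

import json
import re
from html import unescape

def summarise_article(raw_json: str) -> dict:
    """Extract a few fields from an Article record shaped like the one above."""
    article = json.loads(raw_json)

    # Remove all tags (links, spans, emphasis) from the body, then decode
    # any remaining HTML entities, leaving plain readable text.
    body_text = unescape(re.sub(r"<[^>]+>", "", article["contents"]))

    return {
        "title": article["title"],
        "authors": [a["name"] for a in article.get("authors", [])],
        "keywords": [k["data"]["name"] for k in article.get("keywords", [])],
        "word_count": len(body_text.split()),
    }

Given the record above (with the "All Article Properties:" label removed so the string is pure JSON), this would return the title, the single author, the thirteen keyword names, and an approximate word count of the body.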