Artificial General Intelligence: A Potential Threat Comparable to Climate Change and Its Implications for Medical Education


Kıyak Y. S.

8th Medical Education Day #VIIIJEM, Murcia, Spain, 6 September 2024, p. 3

  • Publication Type: Conference Paper / Abstract
  • City of Publication: Murcia
  • Country of Publication: Spain
  • Page Numbers: p. 3
  • Affiliated with Gazi Üniversitesi: Yes

Abstract

Introduction: Artificial General Intelligence (AGI) and Artificial Superintelligence (ASI) represent significantly more capable forms of artificial intelligence that could match or exceed human capabilities across a wide range of domains. This study aims to provide a beginner-level introduction to the concepts of AGI and ASI, compare their potential risks to those of climate change, and call for incorporating AGI awareness into medical education.

Methods: A critical analysis of current literature and expert opinions on AGI/ASI was conducted, focusing on their potential impacts and risks. The study also examined the current state of AGI awareness in medical education curricula compared with climate change education.

Results: The study found that while AGI and climate change both pose existential risks to humanity, there is a significant shortfall in the attention allocated to the threat of AGI. Climate change is addressed relatively widely in curricula, whereas AGI receives little to no attention. The potential risks of AGI/ASI include, but are not limited to, ethical concerns, loss of human control, lack of alignment, and unintended consequences in healthcare applications. The study emphasized the need for decentralized, accountable decision-making systems in AGI development and deployment.

Conclusions: There is an urgent need to integrate AGI/ASI risk awareness into medical curricula. By raising awareness of AGI risks alongside climate change, medical education can better prepare healthcare professionals for future challenges and help ensure that medicine remains a force for good in the age of advanced AI, given that such technology could ultimately lead either to an era of abundance or to human extinction.