Everywhere we remain unfree and chained to technology, whether we passionately affirm or deny it. But we are delivered over to it in the worst possible way when we regard it as something neutral; for this conception of it, to which today we particularly like to pay homage, makes us utterly blind to the essence of technology.
Martin Heidegger—The Question Concerning Technology, 1954
In my post Why I’m No “Thought Leader”, I mentioned that one reason I don’t want to be a thought leader is that the abysses in which I want to travel are best entered alone.
Martin Heidegger is one of those abysses.
As I reread his essay The Question Concerning Technology, my instinct tells me that contained within his work is both the definition of and the solution to what I call “The Problem of AI”: roughly, that artificial intelligence necessarily dehumanizes individuals. I believe this to be the single greatest problem in healthcare, and one of the greatest problems facing the world.
Unfortunately, mining Heidegger’s ideas requires contending with the fact (not the dalliance) of his Nazism.
The dilemma of accepting brilliance of thought while simultaneously rejecting darkness of character is not uncommon. Take a look at Joshua Rothman’s excellent 2014 article Is Heidegger Contaminated by Nazism? (The answer is yes!), or another post of mine, Healthcare AI, Kant’s Deontology, and My Cat, to put some brackets around this challenge.
So what shall I do? Well, for myself (and I speak only for myself here, advising no one) I believe it requires that I maintain a dualistic view of Heidegger (brilliant/repugnant, insightful/ignorant, visionary/blind, and the like) as I study him.
This will be challenging. Even setting aside the intellectual energy needed to maintain a dualistic, “switch always on” view of him, Heidegger is difficult to understand, and finding what I am looking for may be a years-long exploration, not a months-long one. But the tsunami of AI is so powerful that to ignore Heidegger’s lifeboat borders on philosophical negligence.