About


My name is Michael Schmatz, and Finding Protopia is my blog. This is mainly a place for me to think out loud.

This page is about this website. For information about me, see the personal page.

Why Finding Protopia?

Protopia is a term coined by Kevin Kelly in his book The Inevitable1 to describe where he believes technology is taking our world. He believes that rather than heading toward a perfect (yet stagnant) utopia or a terrible (but unsustainable) dystopia, we are heading toward a gradually improving protopia. In a protopia, subtle and incremental progress leads to a future which is better than the present, on the whole. Not a lot better day to day, but better in more ways than it is worse.

I believe I have a moral responsibility to do what I can to ensure the continuance of protopia. Many of my areas of interest center on reducing the risk that this does not happen.

The Risks

Existential Risk

Existential risks are risks that threaten the destruction of humanity’s long-term potential.2 These can manifest through human extinction, an unrecoverable collapse, or a permanent dystopia. Toby Ord argues that preserving and protecting the future potential of humanity is good because it would allow our descendants to fulfill that potential and realize one of the best possible futures for humanity. I believe preventing the destruction of humanity’s long-term potential is an axiomatic good.3 For a stirring vision of what such a future could be like, I recommend Bostrom’s Letter from Utopia.

Like Ord, I’m principally concerned about the existential risk posed by superhuman AI. I’m concerned primarily because there are enormous incentives, both economic and moral, to develop that technology.

Stagnation

We also risk futures in which progress is extremely slow, nonexistent, or negative. This stagnation could be economic, scientific, moral, or otherwise. I generally accept Thielian notions that stagnation in these areas would be bad for a number of reasons and that growth is generally slowing down.4 Preventing or reducing this stagnation would enable a better future.


  1. Kelly, K. (2017). The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. United Kingdom: Penguin Books. ↩︎

  2. Ord, T. (2020). The Precipice: Existential Risk and the Future of Humanity. United Kingdom: Bloomsbury Publishing. ↩︎

  3. Ord spends a considerable number of words explaining why human extinction is bad. There are some philosophical arguments as to why the demise of humanity would be neutral or good. I won’t dig into these, but I generally believe the demise of humanity would be one of the worst things that could happen. ↩︎

  4. Ross Douthat has a good examination of these issues in his book The Decadent Society. ↩︎