<aside>
💛
these notes are part of a larger database.
an intention is for these notes to support collective sensemaking. commenting is always on and is a gift.
</aside>
By Carhart-Harris. Cited by 1050

qs
- i'm curious about how entropy is defined in information theory:
- Entropy in information theory is reflected in the shape of a probability distribution (Ben-Naim, 2012), i.e., we have less confidence (or more uncertainty) about something when the distribution is broader or more evenly spread.
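the point about distribution shape can be made concrete with a tiny sketch (not from the paper, just an illustration): Shannon entropy of a discrete distribution, computed directly from its definition, is highest when probability is spread evenly and lowest when it is concentrated.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# a broad (uniform) distribution: maximum uncertainty over 4 outcomes
broad = [0.25, 0.25, 0.25, 0.25]
# a peaked distribution: we're almost sure of the outcome
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(broad))   # 2.0 bits
print(shannon_entropy(peaked))  # much lower
```

the broader the distribution, the more bits of uncertainty; that's exactly the "less confidence when the distribution is more evenly spread" reading above.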
thoughts
it seems like the DMN/ego is just that helpful filter for an uncertain world. it's like PCA, a dimensionality reduction technique: it helps us control the world via an idea of ego but also constrains the infinitely many things about it.
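the PCA analogy can be sketched quickly (an illustration of the analogy only, nothing from the paper; the data here is synthetic): project high-dimensional "world" data onto its top principal component via SVD, keeping most of the variance while throwing away the rest, the way an ego-filter keeps a compressed model and discards the remainder.

```python
import numpy as np

rng = np.random.default_rng(0)
# a 5-dimensional "world" of 200 points, where most of the variance
# lies along one hidden direction plus a little noise
direction = rng.normal(size=5)
data = rng.normal(size=(200, 1)) * direction + 0.05 * rng.normal(size=(200, 5))

# PCA via SVD on centered data: keep only the top component
centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ Vt[0]          # 1-D summary of each 5-D point

# fraction of total variance the 1-D summary retains
explained = S[0]**2 / (S**2).sum()
print(f"variance kept by one component: {explained:.2%}")
```

one component captures nearly all the structure here; the model is useful precisely because of what it leaves out.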
Doors of Perception
naming = control
highlights
- self-organized criticality - how a complex system forced away from equilibrium by a regular input of energy begins to exhibit interesting properties as it reaches a critical point, in a relatively narrow transition zone between the two extremes of system order and chaos.
- the quality of any conscious state depends on the system's entropy, measured via key parameters of brain function
- human brain exhibits greater entropy than other members of the animal kingdom, which is equivalent to saying that the human mind possesses a greater repertoire of potential mental states than lower animals