Trust and Trustability, part 2

How future society can evaporate in a puff of autonomy

Mark Burgess
18 min read · May 23, 2023

In an earlier post, I discussed how Promise Theory offers an alternative picture of trust: not the usual moral imperative for human behaviour, but a coarse potential landscape that shapes decisions and relationships, more like something you would find at work in the physical sciences. This second part of the summary and discussion considers the implications of this understanding of trust for shaping our modern worlds, which now include both real and virtual places in games, online forums, and augmented reality.

In Promise Theory, trust emerges as a two-part conception concerning the way agents assess and allocate resources to decide how much they can rely on one another. Trust comes in both potential and kinetic forms (passive and active, if you prefer). We call these (potential) trustworthiness and (kinetic or attention) trust. Together they form a coarse, non-local assessment of influence, which acts something like energy in physics or money in economics, only with more specialized semantics. The two parts of trust apply to all processes, and thus (by teleological inference) to the agents engaged in those processes too. Trust is also related to (but distinct from) the notions of confidence and risk, which won't be discussed here.
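To make the potential/kinetic distinction a little more concrete, here is a minimal toy sketch in Python. It is not from the article and is only one possible reading of the terms: each agent carries a stored trustworthiness score for its peers (the potential), and spends a limited attention budget (the kinetic trust) when it actually relies on them.

```python
# Toy illustration (an assumption, not the Promise Theory formalism):
# trustworthiness is a stored "potential" built up from kept promises;
# trust is the "kinetic" attention an agent actually spends on a peer.

class Agent:
    def __init__(self, name, attention_budget=1.0):
        self.name = name
        self.attention_budget = attention_budget   # finite resource to allocate
        self.trustworthiness = {}                  # potential: peer name -> score

    def observe_promise(self, peer, kept):
        """Update the stored (potential) assessment after a kept or broken promise."""
        score = self.trustworthiness.get(peer.name, 0.5)
        delta = 0.1 if kept else -0.2              # broken promises cost more than kept ones gain
        self.trustworthiness[peer.name] = min(1.0, max(0.0, score + delta))

    def allocate_trust(self, peers):
        """Spend the attention budget (kinetic trust) in proportion to stored potential."""
        total = sum(self.trustworthiness.get(p.name, 0.5) for p in peers) or 1.0
        return {
            p.name: self.attention_budget * self.trustworthiness.get(p.name, 0.5) / total
            for p in peers
        }

alice, bob, carol = Agent("alice"), Agent("bob"), Agent("carol")
alice.observe_promise(bob, kept=True)
alice.observe_promise(carol, kept=False)
print(alice.allocate_trust([bob, carol]))   # bob receives the larger share of attention
```

The only point of the sketch is that the two quantities differ in kind: the potential score can sit unused indefinitely, while the kinetic allocation is a scarce resource that has to be spent somewhere.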

The role of trust in a cyborg era
