In a couple of articles (here and here) for the ACLU’s Free Future Blog, Jay Stanley (@JayCStanley) expresses concern about how computers and, in particular, the Internet of Things will be used to impose rules on us:
many computers (often no more than single chips) may come to serve as little mini-bureaucracy “pods” cast off from their agency or company mother ships, allowing those bureaucracies to encode their rules and distribute their power in ways they never could do before.
Jay is skeptical about the possibility of formulating rules (laws) general enough to cope with every possible individual circumstance, and he comes up with a provocative sort of impossibility theorem: a Gödel's incompleteness theorem of law:
no matter how detailed a set of rules is laid out, no matter how comprehensive the attempt to deal with every contingency, in the real world circumstances will arise that will break that ruleset. Applied to such circumstances the rules will be indeterminate and/or self-contradictory. This is sort of a Godel’s incompleteness theorem of law.
It is very interesting. Jay's "theorem" not only raises a serious concern about the development of technology; it also gives us a different way to look, in the light of jurisprudence, at how mathematics has evolved in the face of undecidable propositions.
Oh, by the way, if you are also concerned about these questions, I suggest that you have a look at, and maybe sign, this open letter on "Research Priorities for Robust and Beneficial Artificial Intelligence."