Hart contracts, not smart contracts

I was recently re-reading H.L.A. Hart’s “Positivism and the Separation of Law and Morals”, the opening salvo in the never-ending, intergenerational Hart-Fuller debate, and his description of the “core and penumbra” approach to reasoning about rules perfectly illuminates the error behind so-called smart contracts: blockchain-based, self-enforcing, quasi-contractual agreements.

From pages 606-607 of Hart’s article:

…the “Realists” made us acutely conscious of one cardinal feature of human language and human thought, emphasis on which is vital not only for the understanding of law but in areas of philosophy far beyond the confines of jurisprudence. The insight of this school may be presented in the following example. A legal rule forbids you to take a vehicle into the public park. Plainly this forbids an automobile, but what about bicycles, roller skates, toy automobiles? What about airplanes? Are these, as we say, to be called “vehicles” for the purpose of the rule or not? If we are to communicate with each other at all, and if, as in the most elementary form of law, we are to express our intentions that a certain type of behavior be regulated by rules, then the general words we use - like “vehicle” in the case I consider - must have some standard instance in which no doubts are felt about its application. There must be a core of settled meaning, but there will be, as well, a penumbra of debatable cases in which words are neither obviously applicable nor obviously ruled out. These cases will each have some features in common with the standard case; they will lack others or be accompanied by features not present in the standard case. Human invention and natural processes continually throw up such variants on the familiar, and if we are to say that these ranges of facts do or do not fall under existing rules, then the classifier must make a decision which is not dictated to him, for the facts and phenomena to which we fit our words and apply our rules are as it were dumb. The toy automobile cannot speak up and say, “I am a vehicle for the purpose of this legal rule,” nor can the roller skates chorus, “We are not a vehicle.” Fact situations do not await us neatly labeled, creased, and folded, nor is their legal classification written on them to be simply read off by the judge. Instead, in applying legal rules, someone must take the responsibility of deciding that words do or do not cover some case in hand with all the practical consequences involved in this decision.

We may call the problems which arise outside the hard core of standard instances or settled meaning “problems of the penumbra”; they are always with us whether in relation to such trivial things as the regulation of the use of the public park or in relation to the multidimensional generalities of a constitution. If a penumbra of uncertainty must surround all legal rules, then their application to specific cases in the penumbral area cannot be a matter of logical deduction, and so deductive reasoning, which for generations has been cherished as the very perfection of human reasoning, cannot serve as a model for what judges, or indeed anyone, should do in bringing particular cases under general rules. In this area men cannot live by deduction alone. And it follows that if legal arguments and legal decisions of penumbral questions are to be rational, their rationality must lie in something other than a logical relation to premises.

This is a key thought for software developers, whether blockchain-enamoured or not, to understand. Computers are bad at this. Machine learning models are starting to be okay at producing a simulacrum of this kind of judgment-laden reasoning, but that in itself brings a whole set of different problems.
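To make the point concrete, here is a toy TypeScript sketch of Hart’s park rule. The case list and the ‘rulings’ are invented for illustration; the core is trivial to encode, while the penumbra is not encoded at all, merely decided in advance by whoever wrote the code:

```typescript
// A toy encoding of "no vehicles in the park" (cases invented).

// Core of settled meaning: no doubt is felt about these.
const coreVehicles = new Set(["automobile", "lorry", "motorbike"]);

// Penumbral cases: the rule itself dictates no answer, so every
// entry here is a decision the programmer took in advance.
const penumbraRulings = new Map<string, boolean>([
  ["bicycle", true],        // a judge might well say no
  ["roller skates", false], // a judge might well say yes
  ["toy automobile", false],
  ["airplane", true],
]);

function isVehicleForPurposesOfTheRule(thing: string): boolean {
  if (coreVehicles.has(thing)) return true;
  const ruling = penumbraRulings.get(thing);
  if (ruling !== undefined) return ruling;
  // "Human invention and natural processes continually throw up
  // such variants on the familiar": an e-scooter? a hoverboard?
  // Nothing in the rule dictates an answer; a human must decide.
  throw new Error(`Penumbral case, not settled by the rule: ${thing}`);
}

console.log(isVehicleForPurposesOfTheRule("automobile")); // true (core)
console.log(isVehicleForPurposesOfTheRule("bicycle"));    // true, but only by our fiat
isVehicleForPurposesOfTheRule("e-scooter");               // throws: the code has no answer
```

The Map is doing exactly what Hart describes: someone has taken the responsibility of deciding that words do or do not cover the case in hand, and that decision was not dictated by the rule.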

Smart contracts are inherently rather ludicrous because, when you compare them with ordinary legal contracts, the elements a smart contract can encode and the bits which are actually interesting to us barely overlap. Was there substandard performance of a contract? You can’t write a blob of JavaScript that will tell you whether the builder who redid your kitchen did a good job; you need human judgment to decide. (Incidentally, standard construction contracts like the JCT build in a series of points where a third-party inspector is supposed to come in and decide whether the project is being delivered correctly and to spec.)
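Here, roughly, is what such a contract can actually encode, sketched in plain TypeScript rather than a real contract language (the names and figures are invented):

```typescript
// A toy escrow "contract" for the kitchen job. Everything the code
// can check is mechanical; the one question we actually care about,
// whether the work was any good, arrives as a bare boolean from a
// human inspector. The judgment lives entirely outside the program.

interface KitchenEscrow {
  contractPriceGbp: number;
  paid: boolean;
}

function releasePayment(
  escrow: KitchenEscrow,
  inspectorCertifiesWorkToSpec: boolean, // a human's entire judgment, flattened to one bit
): KitchenEscrow {
  if (escrow.paid) throw new Error("already paid");
  if (!inspectorCertifiesWorkToSpec) {
    throw new Error("inspector has not certified the work");
  }
  return { ...escrow, paid: true };
}

const settled = releasePayment({ contractPriceGbp: 20_000, paid: false }, true);
console.log(settled.paid); // true, because a person said so
```

All of the contested substance of the agreement is compressed into a single boolean that a human has to supply from outside the system, which is just the JCT inspector wearing a different hat.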

Was there misrepresentation that might amount to fraud? You are now asking a computer to evaluate the state of all of human language and knowledge to make that decision. Good luck with that: computers still struggle to distinguish pedestrians, cyclists and unknown objects on the road. Suitably trained humans can do a passable job of it; computers are really bad at it.

Was the property in question passed with good title? Your smart contract needs to know whether every previous transaction in the property’s recorded chain of ownership passed good title. “Ah, well, we will just use the blockchain as a proper record of title”, comes the answer. That doesn’t work, because blockchains can contain factually incorrect information, and previous blockchain-based transactions may themselves have been fraudulent.
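To put the same point in code, here is an invented sketch of what on-chain title checking amounts to:

```typescript
// Toy model of a chain of title (names and history invented).

interface Transfer {
  from: string;
  to: string;
}

// All a program can verify is that each transfer was made by the
// previously recorded owner: internal consistency, nothing more.
function chainIsInternallyConsistent(history: Transfer[]): boolean {
  for (let i = 1; i < history.length; i++) {
    if (history[i].from !== history[i - 1].to) return false;
  }
  return true;
}

// Suppose alice never really owned the property, or bob was defrauded:
// the record below is factually wrong, yet it checks out perfectly.
const history: Transfer[] = [
  { from: "alice", to: "bob" },
  { from: "bob", to: "carol" },
];

console.log(chainIsInternallyConsistent(history)); // true
```

Internal consistency is the most the program can establish; whether the record corresponds to reality is precisely the question it cannot answer.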

If the promise of smart contracts and blockchain technology was supposed to be less lawyerly hair-splitting and formalism, the ‘House of Representatives’ dispute on Augur, a blockchain-based prediction market, puts paid to that. When you encode human rules in computer code, you often get arbitrary and unspecified behaviour (think of every person whose name contains non-ASCII characters having it mangled on airline boarding passes), but the ‘House of Representatives’ dispute is more interesting even than that. You have a distributed community of people betting on the outcome of a question that has two potential interpretations: a technically correct but silly one (the Democrats had won the election but had not yet taken their seats, so were not in control of the House), and a common-sense one. How is this dispute resolved? Not by applying the reasoning of a neutral third-party arbitrator, but by placing ever larger bets on the outcome. It was precisely the lack of any sensible human judgment in the process that enabled the person who created the prediction market to scam people.
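For illustration, here is a deliberately crude TypeScript sketch of resolution-by-stake. The outcomes and figures are invented, and Augur’s actual dispute process is considerably more elaborate, but the essence is the same: capital substitutes for judgment.

```typescript
type Outcome = "Democrats control the House" | "Republicans control the House";

// Whichever reading of the question attracts the most money wins.
// Note what is absent: at no point does anyone weigh reasons,
// interpret the parties' intentions, or apply a standard.
function resolveByStake(stakes: Map<Outcome, number>): Outcome {
  let winner: Outcome | undefined;
  let highest = -Infinity;
  for (const [outcome, staked] of stakes) {
    if (staked > highest) {
      highest = staked;
      winner = outcome;
    }
  }
  if (winner === undefined) throw new Error("no stakes placed");
  return winner;
}

const stakes = new Map<Outcome, number>([
  ["Democrats control the House", 120_000],   // invented figures
  ["Republicans control the House", 150_000],
]);
console.log(resolveByStake(stakes)); // the better-funded interpretation prevails
```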

When software developers tell you their magic blockchain solution will “replace laws” or “supplant contracts”, ask them when they last read a statute or a legal judgment, and what they know about contractual interpretation; just as, when they tell you their magic AI solution will “fix government”, you should ask whether they have any meaningful understanding of what government actually involves. This is what happens when people dream up technical solutions to problems they have only a small fraction of a clue about to start with: they end up having to find retrospective justifications for them.

Incidentally, what we most want from a programming language (clear, unambiguous specification of behaviour, rather than compiler- or implementation-specific behaviour) is often extremely undesirable in laws and rules. When Parliament passed the Rent Act 1977, its members probably did not think that “family” included “two cohabiting gay men”. If the question had arisen during the Parliamentary debate, the answer would very clearly have been ‘no’. More than twenty years later, when the House of Lords decided Fitzpatrick v Sterling Housing Association Ltd, the answer was ‘yes’. And, while homophobes may disagree, I think leaving that behaviour unspecified up front, and settling it only once the question arose, probably led to a more just outcome.