Ariel Porat, "Personalization of the Law"
Presented at the Legal Challenges of the Data Economy conference, March 22, 2019.
Transcript
ARIEL PORAT: Good morning. Thank you for having me. It's a real pleasure to be here. So personalization of the law-- this is a project that I started working on with Professor [INAUDIBLE] from the University of Chicago. But most of the work I've done on personalized law is with my colleague and friend, Omri Ben-Shahar. We are now writing a book together titled Personalized Law-- forthcoming with OUP, one day. We didn't take into account that I might become the president of Tel Aviv University, but we will somehow have to handle this new problem.
So usually, I like to start with an example. Instead of a long introduction, let me start with one. The example is taken from inheritance law. In, I think, all jurisdictions, people can leave a will. But if they don't leave a will, there is intestacy law-- which is actually a set of default rules. So you can opt out of the default rules by leaving a will.
Now most people-- actually, I think more than 50%-- do not leave wills. So intestacy law is really important. And intestacy law is uniform. For example, in many jurisdictions, the estate would be allocated, say, 50% to the children and 50% to the surviving spouse. Of course, it could be different, but in many jurisdictions it is done this way. What's important for our purposes is that intestacy law is uniform. It doesn't change according to the characteristics or the identity of the deceased.
Now there is solid evidence-- research showing a very interesting difference between men and women when they leave a will. This is from the United States; it might be different in other countries. But I think it's quite striking that when women leave a will, at every age, 80% goes to the children and the rest to the surviving spouse. So 80% to the children. With men, it is 40%.
Why? Well, why is not exactly the topic of today's discussion, but of course I cannot resist the temptation to explain. You can think of many reasons, but there is probably a very rational one: the different fertility ages of men and women. Imagine a woman thinking about what would happen after her death. She knows it's quite possible-- or she might have the concern-- that her husband will remarry and have new children.
But for the husband, from a certain age, there is no such concern anymore-- not about remarriage, but about new biological children. If a man's surviving wife is 50 or 55 or 60, the chances that she would have biological children with a new husband are very, very close to zero. It's different the other way around, because men can father children even at a very advanced age. So that might be the reason.
So in other words, women have a stronger concern to protect their children from the future children that their husbands might have after they pass away. Anyway, whatever the reason, what's important for our purposes is that there is a huge difference between men and women. And then the question is, why not have two default rules instead of only one? Instead of a uniform default rule, one default rule for men and another for women. And if we go in this direction, maybe, why only two default rules?
You could also think about Muslims, Jews, Christians. You could think about people at the age of 40, or the age of 80. And you could think of many other differences between people that might affect the default rule that applies to them. Remember-- this is a default rule. They can opt out by leaving a will. But if we go in this direction, the question is, should we do just crude personalization? Or should it be even more granular? With the aid of big data, we might gather information about people's preferences and characteristics and actually tailor the rule, at least theoretically-- I understand that's not something that would happen today or tomorrow.
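[Editor's illustration-- a minimal sketch of how such a personalized intestacy default might be looked up. The profile fields and split figures are hypothetical, loosely echoing the 80%/40% evidence mentioned above; they are not part of the talk.]

```python
# Hypothetical sketch: choosing a personalized intestacy default.
# Profile fields and split figures are illustrative only.

def intestacy_default(profile: dict) -> dict:
    """Return the default allocation of the estate.

    A will, if present, always overrides the default (opt-out).
    """
    if profile.get("will") is not None:
        return profile["will"]  # explicit will opts out of any default
    # Crude personalization: two defaults instead of one uniform rule,
    # echoing the observed 80% (women) vs 40% (men) bequest patterns.
    if profile.get("sex") == "F":
        return {"children": 0.8, "spouse": 0.2}
    if profile.get("sex") == "M":
        return {"children": 0.4, "spouse": 0.6}
    # Fallback: the traditional uniform default.
    return {"children": 0.5, "spouse": 0.5}
```

Finer personalization would simply replace the two hard-coded branches with a prediction over many characteristics.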
But maybe in 10 years, we would be able to tailor a default rule for each and every person. And we argue that it might be a good idea. Now, the second example I want to give you is about personalizing remedies. As you know, there are many differences between civil law and common law countries. By the way, Israel is something in between. It's true that I am personally more of a common lawyer, because I spend a lot of time teaching at American law schools, not only in Israel.
But actually, Israel is a mixture of both. Anyway, in common law-- Anglo-American-- legal systems, we usually say that the primary remedy for breach of contract is damages, while in civil law countries, we usually say it is specific performance. Now suppose we look at these as default rules-- I understand you could argue about whether this is a default rule or a mandatory rule, but let's assume for a moment that these are two default rules. In one type of jurisdiction, the default rule is damages; in the other, specific performance.
Now let's imagine that we have information about buyers. Take the example of a consumer contract-- say, buying a house from a contractor. The question might arise, if there is a breach of contract, whether the remedy would be specific performance or damages. So one possibility is to look into the characteristics and preferences of the parties, again by using big data.
So suppose one of them is the kind of person who is strongly attached to the property he buys. In other words, if he does not get specific performance, there would be some uncompensated emotional losses. The other party to the contract-- a man or a woman, of course-- imagine, is a very rational, economically oriented type. He doesn't really have emotions about property. He has emotions, but not about property. Some people think that economists don't have emotions. This is wrong.
But when it comes to property, this second person is very economically oriented, and for him, maybe damages would be enough. He doesn't need specific performance. Of course, if it is done this way, the price may also change, because specific performance is more burdensome on the seller, and that might affect the price. So if we think about full personalization, it would not be just personalization of the default rules, but also personalization of the price.
The right to return a product-- a third example. As you know, there are jurisdictions in which there is a default rule, sometimes even a mandatory rule, about returning products and getting your money back. So imagine a jurisdiction in which there is such a right to return a product, either as a mandatory rule or a default rule. Suppose it's mandatory-- take the extreme case. Is it good or bad? There could be many arguments for and against.
But one thing should be noticed: if you have one rule that applies to all, there is a problem of both efficiency and equity-- equality, or I would say justice. And this is because of cross-subsidization. What does that mean? Imagine two consumers. One is very careful in purchasing products. He thinks several times before purchasing, and he would almost never return a product and get his money back.
The other one is very careless. He buys without a second thought and uses the right to return almost on a daily basis. Of course, that second person also imposes higher costs on the seller-- we assume that returns are costly. Now, if we have a uniform rule, there is a cross-subsidy, because the price would be the same for both consumers, meaning that the careful consumer subsidizes the careless one. If instead you personalize the right to return, and also personalize prices the same way, then the cross-subsidization would disappear, because every consumer would get a rule with respect to returning the product according to his characteristics, traits, and preferences, but would also pay an adequate price.
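[Editor's illustration-- the cross-subsidy can be seen with made-up numbers. All figures here are hypothetical, not from the talk.]

```python
# Hypothetical numbers illustrating cross-subsidization under a uniform
# right-of-return rule: the careless consumer's returns cost the seller
# more, but a uniform price spreads that cost over both consumers.

base_cost = 100.0  # seller's cost of serving a consumer, before returns
return_cost = {"careful": 2.0, "careless": 30.0}  # expected return costs

# Uniform rule: one price must cover the average return cost.
uniform_price = base_cost + sum(return_cost.values()) / len(return_cost)

# Personalized rule: each consumer pays for the return cost he generates.
personal_price = {k: base_cost + c for k, c in return_cost.items()}

print(uniform_price)   # 116.0 -- the careful consumer overpays by 14
print(personal_price)  # {'careful': 102.0, 'careless': 130.0}
```

Under the uniform price, the careful consumer pays 14 above his own cost; personalization removes exactly that transfer.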
So cross-subsidization would completely disappear. I would say it's not only more efficient; it might also be more just. The last example is about donating organs. It's interesting how sticky the default rule is when it comes to organ donation. There is research showing a very interesting difference between Germany and Austria. The populations of the two countries are not very different in the relevant respects. Yet in Germany, only 10% of people donate their organs after death-- 10%. In Austria, 90%. 10% versus 90%.
Why? Simply because the default rule is different. In Germany, the default rule is no donation unless there is explicit consent. In Austria, it's exactly the opposite. And that is probably the best explanation for the different rates of organ donation after death in the two countries. Here, too, you could think about personalizing the default rule-- again, it is a default rule, of course. So how exactly would it work? I want to focus-- only because we are way short of time-- only on consumer contracts. Think for a moment about how the world of personalization in the consumer context would look if our proposal were adopted by lawmakers.
Actually, quite simply. A consumer would approach a merchant. The merchant would be able to know the cluster, or the profile, of that specific consumer. And the default rules would be adapted accordingly. It would be done by algorithms, based on big data. And the consumer, of course, would be able to ask for full disclosure-- namely, he would just have to give the merchant his ID, and then the merchant would be able to print for him everything that he would like to see.
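[Editor's illustration-- a minimal sketch of that transaction flow. The IDs, cluster labels, and terms are invented for illustration; they are assumptions, not part of the proposal.]

```python
# Hypothetical sketch of the flow described above: the merchant looks up
# the consumer's cluster and attaches the matching default terms.

PROFILES = {  # stands in for a big-data profiling service
    "id-001": "attached-to-goods",  # values specific performance
    "id-002": "purely-economic",    # damages suffice
}

DEFAULT_TERMS = {  # default rules per cluster (illustrative values)
    "attached-to-goods": {"remedy": "specific performance", "return_days": 30},
    "purely-economic":   {"remedy": "damages", "return_days": 7},
    "uniform":           {"remedy": "damages", "return_days": 14},
}

def terms_for(consumer_id: str, opted_out: bool = False) -> dict:
    """Return the default terms for a consumer.

    Opting out of personalization restores the uniform rule; full
    disclosure is simply printing the resulting terms.
    """
    if opted_out:
        return DEFAULT_TERMS["uniform"]
    cluster = PROFILES.get(consumer_id, "uniform")  # unknown -> uniform
    return DEFAULT_TERMS[cluster]
```

For example, `terms_for("id-001")` yields the specific-performance terms, while `terms_for("id-001", opted_out=True)` falls back to the uniform rule.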
Of course, we know that most consumers don't really care and know nothing about the default rules, not to mention the written terms. Many of us do not really read them-- certainly not the default rules, which are not written in the contract at all. Nobody would go to a lawyer to study the default rules when they buy a TV-- a big TV screen, or whatever. But if somebody would like to do it, he would be able to. Again, you could imagine that people would have access to their profiles.
So if you think that your profile is not accurate enough-- does not characterize you accurately-- you would be able to enter that profile and change it accordingly. And of course, if you don't like personalization, you might also be able to opt out and say, I don't want this in my life. So this is how it could work. Feasibility-- I will do this very quickly. There are actually more examples than I'm going to discuss, because Omri told me that it should be very short.
So here are some examples of how merchants currently use big data. I'm sure you know of many other examples, but I want to give some that are not exactly personalization of the law, yet are close enough to show the potential-- in other words, the next step could be personalization of the law. So here are a couple of examples. There are some restaurants, so I've heard-- I think in New York; I've never actually been in such a restaurant, but I know they exist-- in which, when you go to the restaurant, you might get a napkin of a certain color, while the person sitting next to you would get a napkin of a different color.
And you ask yourself-- actually, you like the color you got more than the one your neighbor got, but you don't know why you got that napkin. The answer is that they probably have information about you. Even if it's your first time in the restaurant, they might know it. How? Maybe they acquired information from other restaurants. But even more interestingly, big data might predict what color you would prefer, even if you have never said so. So this is one example.
Another example is the pads on the furniture. Lenders-- you see this also with auto insurance companies-- lenders realized that there is a very interesting correlation between sticking pads to the legs of your furniture, in order to avoid scratching the wooden floor, and being very careful in repaying your loan. So here is free advice: if you want better terms, maybe a lower interest rate when you take a loan, invest $2. Buy pads. Stick them to the legs of your furniture. Now, of course, don't do it.
Because once everybody knows about such a signal, it stops working-- once it is revealed, lenders cannot use it anymore. But it actually happened. And there is the very famous Target example: Target realized that there is a correlation between buying certain vitamins and the buyer being pregnant. So if you buy those vitamins at Target, you might get, two days later, advertisements for diapers and other baby products. There are many other interesting examples.
We know how much merchants can learn from the way people use their cell phones, and from the websites they visit-- these actually reveal a lot about them. And all that is used by merchants. In 2013, according to some estimates, about $34 billion was spent just on collecting data. Today it's at least double. Now, I think I still have time. So now, about personalizing disclosure and informed consent, which is one more example I want to give. And maybe it will be very close to the end.
There is a very interesting book about disclosure, written by Omri Ben-Shahar and Carl Schneider; I recommend reading it. It's a book that actually attacks the practice of disclosure, arguing that it's mostly a waste of money and that, with some exceptions, it doesn't really do any good. The personalization argument-- not exactly a counterargument-- would say: so personalize disclosure. One of the problems that makes disclosure a waste of money is that people don't read. And why don't they read?
Because most of the information is really irrelevant to them. If you buy some drug, there are all kinds of warnings that are completely irrelevant to you. Maybe one might be relevant. It might be written that if you're a pregnant woman, you shouldn't use it; if you are a child at the age of 16 or younger, and so on. But if, instead, it were personalized-- you buy the drug, and then, through an app on your cell phone, you get warnings that are directed exactly to you, according to your characteristics and maybe your health and so on.
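[Editor's illustration-- a minimal sketch of such warning filtering, as a phone app might do it. The warning texts and profile fields are invented for illustration.]

```python
# Hypothetical sketch: filter a drug's warnings down to the ones that
# actually apply to this user's profile. Fields are illustrative.

WARNINGS = [
    {"text": "Do not use if pregnant.",
     "applies": lambda u: bool(u.get("pregnant"))},
    {"text": "Not for children under 16.",
     "applies": lambda u: u.get("age", 99) < 16},
    {"text": "Consult a doctor if diabetic.",
     "applies": lambda u: "diabetes" in u.get("conditions", [])},
]

def personalized_warnings(user: dict) -> list:
    """Return only the warnings relevant to this user."""
    return [w["text"] for w in WARNINGS if w["applies"](user)]
```

So a 40-year-old diabetic would see one targeted warning instead of the whole boilerplate list.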
Maybe then disclosure would make more sense. There is also the example of informed consent. Those of you familiar with tort law know this. It's the same in Europe and the US, and actually everywhere: doctors should disclose information to patients about the risks before treating them. Some patients want more information; others want less. Sometimes more information might have adverse effects on those patients who really don't want such broad disclosure. So again, imagine that you come to a doctor, or to a hospital. They would enter your ID into the computer, and on the screen it would be written, [INAUDIBLE]-- say, a low level of disclosure.
Then disclosure would be at a very low level, while if instead it were medium or high, the disclosure would be higher. So we believe there is potential here, too. Part of it might be that if doctors disclose according to what they have learned about you, they would be shielded from potential liability. Now, there are some objections. Maybe I'll skip them now-- I would be very happy to answer questions about objections that might arise. In conclusion, what we suggest is to start personalization in the easiest cases.
And then, after acquiring experience, proceed to other areas. Omri and I co-authored a paper about personalizing negligence law, which is quite interesting. I won't go into it-- maybe I can take one minute? Is it OK? One more minute, and that will be the end of my presentation. Here, in one minute, is the intuition of the paper on personalizing negligence law. If you want more, you need to read the paper-- it's only 60 pages.
So the intuition is: think about Omri and me as two drivers. Suppose we drive in exactly the same conditions-- everything is the same: same weather, same car, just everything. Now imagine that Omri is a more skillful driver than I am. He is simply better at driving. And he might also have better instincts, or maybe I have better instincts. So the intuition is: why should Omri and I meet the same standard of care? Maybe the standard of care should be adapted to our characteristics, to our skills, and also to the initial risks we create, maybe because we have different instincts. So this is just a very, very short introduction to the personalization of negligence. Thank you very much.
[APPLAUSE]