Nicholas Laskowski: Honesty and Trust
July 12, 2024

- Allen Stairs
It might seem obvious what honesty amounts to, but Prof. Nicholas Laskowski thinks that standard accounts in terms of virtue or truthfulness don't get it right. Nick has been an Assistant Professor in the Philosophy Department since the fall of 2023. He works mainly in moral and social philosophy, and with his collaborator Nathan Howard (University of Toronto), he argues for a novel account of honesty: the constitutive norm and trust (CNT) model.
Start with dishonesty. Suppose Bob says to Alice, "Carl will be out of town this weekend," even though he believes no such thing. (Maybe Bob is jealous of Carl and wants Alice's company for himself.) What exactly has Bob done wrong? Unless people normally believe what they assert, there couldn't be a practice of asserting things at all: hearers couldn't trust speakers. So Bob has violated a rule that's part of what makes assertion possible -- a constitutive norm of assertion. And in doing so, he has violated the trust that his words entitle Alice to place in what he said.
The part about trust is interesting. Carol might shame Dave by speaking the truth about him. Dave might have trusted Carol not to do that. But the nature of assertion doesn't entitle anyone to expect not to be shamed, and we'll suppose Carol never promised any such thing. So Carol hasn't violated any trust that assertion itself licenses, regardless of what Dave might have expected. Or consider a different case. Fran is a casual acquaintance of Ed's. Ed runs into her in the grocery store and asks "How are you?" Fran says "Fine" even though she has a raging migraine. Fran has violated a constitutive rule of assertion, but the customs of polite conversation allow the parties to a conversation to set that rule aside in cases like this. No legitimate trust has been violated.
Respecting constitutive norms isn't enough for honesty, then, and neither is speaking truthfully. On the CNT view, an honest action vindicates the trust that the constitutive norms license. You can be dishonest by saying true things chosen to mislead, violating someone's trust in the process.
In this election year, honesty is front and center. Politicians make claims and promises; the claims may not be true, and the promises may not be kept. One of the attractive features of the approach Nick is working on is that it makes room for the complexity of the web of background norms and expectations. But that same complexity makes the question of when politicians are being dishonest tricky. "When a politician on the campaign trail says that they're going to do X and they know that they're not going to, then perhaps they're not being dishonest." Whether they are will depend on where we've actually set the norms that determine the appropriate level of trust.
But that doesn't mean all is well. Of the norms we actually have, Nick says, "It would have been better if we had put stronger norms in place earlier on." This raises an interesting issue. We've learned in the last few years that norms aren't a matter of set-and-forget. Bald-faced fabrication has a place in American politics that it didn't have even twenty years ago. That surely isn't good for the body politic, but it may be fruitful territory for the CNT account. "I think it's a strength of the view that it allows us to say that something may be honest with respect to one set of norms and dishonest with respect to another," says Nick. "And it may be that in settings where you have an actor from one set of norms and an actor from another, there may be a negotiation of a third set of norms."
Another area where Nick thinks the view he's developing has a contribution to make is in deciding how to think about guardrails for generative AI. Nick was a Residential Fellow with the Center for Artificial Intelligence Safety in San Francisco. In his view, it may be more fruitful to think about how to make chatbots trustworthy than simply how to make them truthful. One reason: truthful statements can be put to harmful ends. A truthful chatbot, Nick points out, could be used to gain someone's trust as part of a cybercrime scheme. Getting a grip on what honesty might mean for AI programs is an important challenge.
Nick has other projects on the go as well. For more about him, and for links to some of his work, visit his webpage.