Discussion about this post

John Schulman

Full exemption from tort liability seems a bit too extreme -- I think there needs to be some strong incentive on *outcomes*, not just *process*. It'll be too hard for regulations to cover all the risks and mitigations. The real threat of liability from bad outcomes will force companies to think more creatively about possible risks and fully account for Knightian uncertainty.

But I like the spirit of it -- liability should be greatly reduced for a company that's been given a high score by a validated private auditor. The amount of liability should be a function of both the damage and the negligence, and if the company gets a high score for safety, then it wasn't being negligent.

Rohit Krishnan

Interesting proposal. I think it ends up becoming something like the equivalent of insurance against a future tort liability, which definitely has useful features but also overhead. That might even be a more direct method. My biggest question is that none of the AI safety research or assessments I have seen is nearly rigorous enough to support a certification -- definitely not enough to form any view on whether a model could cause particular forms of mayhem.
