Digital content today is governed by online providers such as Facebook and YouTube. Increasingly, these providers are expected to enforce the law by removing illegal content, such as copyright-infringing material or hate speech. Typically, once notified of such content, they must assess it and, if they find it unlawful, remove it; otherwise, they face liability. This system of content moderation is a form of delegation of the state’s tasks to private parties. The empirical literature has established that some schemes of delegated enforcement can trigger substantial false positives, mostly due to over-compliance by providers and under-assertion of rights by affected content creators. The result is a phenomenon known as over-blocking: the collateral removal of lawful content. We conduct a laboratory experiment to test a possible solution to this issue proposed by Husovec (2016). Our results show that an external dispute resolution mechanism with a particular fee structure can significantly reduce over-compliance by providers and improve the accuracy of their decisions, largely because content creators take the initiative to dispute removals. The mechanism works by re-calibrating the asymmetry of incentives typical of delegated enforcement schemes. The principles behind this solution also have the potential to improve other schemes of delegated enforcement in which providers have weak incentives to properly execute delegated tasks in the public interest.