Currently, reputation is the primary line of defense for preventing comment abuse. You can't post comments until you earn at least 50 rep - but after that, you have free rein; there's very little practical oversight for comments. As a result, if we want to explore opening up commenting to a broader audience, or experiment with making voting more accessible (and thus gaining reputation), we need to make sure that reputation isn't the sole line of defense - if those changes are made without any changes to comment moderation, then we open up huge vectors for abuse.
With that in mind, here's a proposal for a system to put us in a better position to detect, handle, and prevent comment abuse. This proposal is based on my experience as a moderator and experience with community-developed tools for moderating comments and other content. I'm including all of the different aspects in this one post, since they're rather interconnected, even though in theory parts could be implemented without the others.
Split "No longer needed" back into "chatty" and "obsolete"
This is relatively minor, but it'll be relevant later on.
A while back, we had separate flags for "chatty" and "obsolete" comments. At some point, they were rolled into a single flag, called "No longer needed". Splitting this back into two flags would make it clearer what types of comments are supposed to be flagged, and would give handling moderators better context about why a comment was flagged. The distinction also allows for automated action - more on that later.
I wouldn't necessarily be opposed to merging "rude or abusive" and "unfriendly or unkind" into a single "violates the Code of Conduct" flag, but that's less relevant for this discussion.
A new comments dashboard
There's currently no way for anyone to view new comments posted on the site. That needs to change.
I'm envisioning a dashboard that presents all new comments posted on the site - essentially a onebox of each comment with a link to the parent post, much like the "new answers to old questions" tool. This would allow people to trawl through the list of new comments and spot abuse, making oversight possible, at least in theory. For instance, it would help surface abusive (or just chatty) comments on old posts that wouldn't otherwise be noticed, or spot a single user making a number of problematic comments in a short span of time.
Within this dashboard, comments that might need attention should be marked in some way. Comments on posts that have not been active in, say, 30 days, should be marked with a little notice along the lines of "new comment on inactive post". Short comments that contain certain keywords such as "thanks" should be marked as "possibly chatty".
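The marking logic above could be a simple heuristic. Here's a rough sketch of what it might look like - the keyword list, the 30-day window, the length cutoff, and all of the names here are assumptions for illustration, not an actual implementation:

```python
from datetime import datetime, timedelta

# All values below are assumed placeholders, open to tuning.
CHATTY_KEYWORDS = {"thanks", "thank you", "awesome", "great answer"}
INACTIVE_DAYS = 30
SHORT_COMMENT_CHARS = 80

def attention_markers(comment_text, posted_at, post_last_activity):
    """Return the dashboard labels a new comment might be marked with."""
    markers = []
    # Mark comments arriving long after the post's last activity.
    if posted_at - post_last_activity > timedelta(days=INACTIVE_DAYS):
        markers.append("new comment on inactive post")
    # Mark short comments that contain common "thanks"-style keywords.
    lowered = comment_text.lower()
    if (len(comment_text) <= SHORT_COMMENT_CHARS
            and any(k in lowered for k in CHATTY_KEYWORDS)):
        markers.append("possibly chatty")
    return markers
```

A comment like "Thanks, this worked!" posted two months after a post's last activity would get both markers; a substantive comment on an active post would get none.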
I'm not sure access to this dashboard needs to be gated at all. Getting more eyes on new comments and more people participating in curation seems like a good thing. There is an argument that people shouldn't get involved in comment moderation without being familiar with the specific site's culture and policies, but even so, I think 200 rep is the highest reputation gate I'd be comfortable with; it shouldn't be too hard to get involved.
However, once you do have access, just a list isn't the most useful...
Filters for the dashboard
You should also be able to filter explicitly for comments that are detected as a certain type, such as new comments on inactive posts, possible "thanks" comments, or comments containing external links (particularly important if commenting is ever allowed for 1-rep users). This would allow curators to easily go through comments that are possibly in need of flags.
Moderators should also have an option to include deleted comments in the full list, and to view recently deleted comments. There are cases where deleted comments are symptoms of larger issues; better spotting them would help get attention to the root issue sooner.
Automated action based on flags
Once curators have identified and flagged comments that should be deleted, those flags should result in action. Currently, it takes at least three non-CoC-violation flags to delete a comment without moderator intervention; the higher a comment's score, the more flags it takes. There should be a practical cap on the number of flags it takes to remove a comment, even a highly-scored obsolete one, to allow the community to handle these cases. (Having moderators be able to view recently deleted comments also helps prevent abuse in this arena.)
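To make the capped threshold concrete, here's a minimal sketch. The base of three flags matches current behavior as described above; the scaling rule and the cap of six are assumptions I'm using purely for illustration:

```python
BASE_FLAGS = 3   # current minimum to delete without moderator intervention
FLAG_CAP = 6     # assumed practical cap - the exact value is open to discussion

def flags_needed(score):
    """Flags required to delete a comment: scales with score, but capped."""
    # Assumed scaling rule: one extra flag per point of positive score.
    needed = BASE_FLAGS + max(score, 0)
    return min(needed, FLAG_CAP)
```

Under this sketch, a zero-score comment still takes three flags, but even a +25 obsolete comment would only ever take six, so the community can always clear it without waiting on a moderator.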
Furthermore, enough helpful "chatty" flags on comments by a single user within a certain time period should trigger an automated message to the user whose comments are being flagged, saying something to the effect of "Our system has detected that many of your comments have been removed for being chatty. We wanted to remind you that comments are to be used for the purpose of improving the post; you're welcome to check out chat [and Discussions, if it still exists] for some less restrained discussion". If the chatty comments continue after that point, an automated mod flag is probably a good idea. (See, this is why we need the distinction between "chatty" and "obsolete".)
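The trigger for that automated message could be a simple sliding window over a user's helpfully-flagged chatty comments. This is a sketch under assumed parameters - the threshold of five flags and the seven-day window are placeholders:

```python
from datetime import datetime, timedelta

# Assumed placeholders; the real threshold and window are open questions.
CHATTY_THRESHOLD = 5
WINDOW = timedelta(days=7)

def should_notify(helpful_chatty_flag_times, now):
    """True if enough helpful chatty flags landed on a user's comments
    recently that the automated reminder message should be sent."""
    recent = [t for t in helpful_chatty_flag_times if now - t <= WINDOW]
    return len(recent) >= CHATTY_THRESHOLD
```

The same windowed count, checked again after the reminder has been sent, could drive the follow-up automated mod flag.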
Once a conversation on a post has been moved to chat, it should also be possible for later comments flagged as chatty to be automatically moved to that chat room. Comments currently can't be moved to chat more than once; that should be made possible, either via moderator action ("move comments to existing chat room") or by comments accumulating enough flags after a conversation on the post has already been moved.
Gradually open up commenting to users just earning the privilege
Instead of allowing users who have just earned the privilege to post an unlimited number of comments immediately, there should be rate limits that loosen as they prove themselves. Users should start with a limited pool of comments - I'm open to ideas about the exact numbers - that then expands or stays the same based on the reception of their comments: users whose comments are receiving upvotes should get a larger pool, while users whose comments are receiving chatty or CoC-violation flags shouldn't have their pool increased. It would make sense to remove the limitation entirely once a user hits a certain reputation threshold or other milestone (the Pundit badge?), at which point commenting would behave the way it does today after you hit 50 rep.
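As a rough sketch of how the pool might evolve - every number here (starting pool, cap, the 200-rep milestone, growing the pool by one per upvote) is an assumption for illustration, and as noted above the exact values are open to ideas:

```python
INITIAL_POOL = 10     # assumed starting allowance for new commenters
MAX_POOL = 50         # assumed cap while still under the rate limit
UNLIMITED_REP = 200   # assumed milestone; could instead be the Pundit badge

def next_comment_pool(current_pool, upvotes, chatty_or_coc_flags, rep):
    """Sketch of adjusting a user's comment allowance based on reception.
    Returns None once the user has earned unlimited commenting."""
    if rep >= UNLIMITED_REP:
        return None  # limitation removed entirely at the milestone
    if chatty_or_coc_flags > 0:
        return current_pool  # flagged comments: pool doesn't grow
    if upvotes > 0:
        return min(current_pool + upvotes, MAX_POOL)  # assumed growth rule
    return current_pool
```

So a user picking up upvotes sees their pool grow toward the cap, a user collecting chatty flags stays where they are, and past the milestone the pool stops applying at all.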
All together, I believe this system would put us in a much better position to handle comment abuse, and is a step toward moving away from reputation being the sole defender of the site. Once we move away from reputation being the primary method to prevent abuse and misuse of the system, we can be a lot more flexible in experimenting with concepts such as voting and awarding privileges.
I'm interested in hearing thoughts on this proposal - issues, things I haven't considered, and so on. I'm tagging this discussion as well to encourage that discourse. How can we turn this into a robust comment moderation ecosystem?