So at work one of the things I work on is a multithreaded server application.
I've written about bugs I've tracked down in it before, and I bitch about it a lot because multithreading in general is a huge pain in the ass. If I could possibly come up with a way to avoid it in this particular application, I would, but I haven't been able to find a feasible way to do that yet, for a number of reasons, some good and some bad.
Despite this hard-earned dislike of multithreading, when I read Chris Brumme's weblog entry about some of the issues involved in hosting the .NET CLR inside a high performance multithreaded application (SQL Server specifically), I was just floored by some of the ideas he mentioned. It's stuff like this that makes you want to say "yes damn it, that kind of performance requirement is a damn good reason to use threads".
Specifically, they go to massive lengths in SQL Server to keep exactly the right number of threads executing useful code at any given moment (ideally one busy thread per CPU, rather than hundreds of threads fighting over the scheduler), which gives huge benefits for a number of reasons.
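To give a rough sense of the basic idea (and this is a gross simplification; Brumme's entry describes far more exotic machinery than this, including fibers and cooperative scheduling), here's a minimal C++ sketch of a worker pool sized to the number of CPUs, so incoming work gets queued for a fixed set of threads instead of spawning a new thread per request. The names here are made up purely for illustration:

    // Minimal sketch: a fixed-size worker pool with one thread per logical CPU,
    // so the number of runnable threads roughly matches the number of processors.
    #include <condition_variable>
    #include <functional>
    #include <iostream>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    class WorkerPool {
    public:
        WorkerPool() : stop_(false) {
            // One worker per logical CPU; fall back to 2 if the count is unknown.
            unsigned n = std::thread::hardware_concurrency();
            if (n == 0) n = 2;
            for (unsigned i = 0; i < n; ++i)
                workers_.emplace_back([this] { Run(); });
        }

        ~WorkerPool() {
            {
                std::lock_guard<std::mutex> lock(mutex_);
                stop_ = true;
            }
            cv_.notify_all();
            // Workers drain the remaining queue before exiting.
            for (auto& t : workers_) t.join();
        }

        void Submit(std::function<void()> task) {
            {
                std::lock_guard<std::mutex> lock(mutex_);
                tasks_.push(std::move(task));
            }
            cv_.notify_one();
        }

    private:
        void Run() {
            for (;;) {
                std::function<void()> task;
                {
                    std::unique_lock<std::mutex> lock(mutex_);
                    cv_.wait(lock, [this] { return stop_ || !tasks_.empty(); });
                    if (stop_ && tasks_.empty()) return;
                    task = std::move(tasks_.front());
                    tasks_.pop();
                }
                task();  // run outside the lock so workers don't serialize on it
            }
        }

        std::vector<std::thread> workers_;
        std::queue<std::function<void()>> tasks_;
        std::mutex mutex_;
        std::condition_variable cv_;
        bool stop_;
    };

    int main() {
        WorkerPool pool;
        for (int i = 0; i < 8; ++i)
            pool.Submit([i] { std::cout << "task " + std::to_string(i) + "\n"; });
        // The destructor blocks until all queued tasks have run and workers join.
    }

The real thing obviously goes much further (keeping blocked threads from occupying a CPU slot, avoiding preemption at bad times, and so on), but even this crude version beats thread-per-request once the request count climbs past the processor count.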
It's really neat stuff, and I'm now struggling to determine if any of it is directly applicable to what I'm currently doing. I suspect it'll be difficult to implement, and I'm not sure if it'll be practical due to the constraints of my application (third party code sucks in these sorts of cases), but damn it's an interesting thought experiment if nothing else.
In any event, if you're doing anything with multithreading in your applications you should read his blog entry, even if you never have any intention of hosting the CLR inside an application. The stuff he talks about, in this entry and others, has implications ranging far beyond the CLR, and can be applied to all sorts of programming problems.