Some Technocrats Are Ideologues

Felix Salmon says:

I do have some sympathy for Dani Rodrik’s skepticism when it comes to technocrats, but surely a government of technocrats — which is what I think we now have — is nearly always to be preferred to a government of ideologues.

Perhaps Felix should consider that a large part of the set of technocrats is always a subset of the set of ideologues. I think he means that one prefers technocrats who share one’s ideology. Of course!


12 thoughts on “Some Technocrats Are Ideologues”

  1. We can probably take it for granted that most technocrats are ideologues, as long as the relevant technical field is riven by ideologies. I rather think that Felix is referring to a government of ideologues who are not technocrats – who, indeed, may not appear to have any relevant technical expertise whatsoever. Then his remark makes more sense, but how relevant it is to reality depends on your allegiances to administrations past and present, no?

  2. Out of curiosity, Will, would you prefer (if you were somehow in a position to choose) a government by libertarian technocrats or pure utilitarian technocrats?

  3. I think I'd go with libertarians for consequentialist reasons. As my friend the philosopher David Schmidtz likes to put it, “If all you care about is numbers, you won't get very good numbers.”

  4. This sounds weird to me. Utilitarianism doesn't tell you that you have to believe your model 100% (and it would be madness to, because we don't have good enough models). A model could give confidence estimates, and you could fall back on ideologies like libertarianism (think of this as a very unrefined “model” which has some empirical validation) when the numerical model has low confidence. It seems to me that this use of “fallbacks” is consistent with utilitarianism. So in other words, being a utilitarian just means you use data when you can. Which seems to me to be strictly superior to libertarianism.

  5. mk, it's about decision procedures. Utilitarianism may very well be the truthmaker in a moral theory. (Depends on how far your mileage goes.) However, it makes a shitty decision procedure. First, not all the consequences of any action are ever in. Second, in most cases, we make terrible utility calculations. Third, utility calculations are really, really hard. Fourth, we even make terrible decisions about when we are bad calculators. What we need, if we are utilitarians, is a decision procedure: some rule of thumb which we do not deviate from in order to do utilitarian calculations. These decision procedures presumably are more likely to result in performing the right action. (The right action being what conforms to the truthmaker of the theory.)

  6. Yes, but a “rule of thumb” is an instance of us solving problem (4) successfully under the utilitarian framework. If before I decide something I have to whomp myself in the head so that I feel dizzy, and I find that this is what tends to result in the utility-maximizing decision, then utilitarianism will tell me to whomp myself over the head before I decide something (assuming it doesn't hurt that much). I don't see utilitarianism as prescribing a decision procedure or a high level of confidence in fine-grained mathematical predictions. Rather, utilitarianism says that you should do what works.

     The real problem is that we don't agree on what it means for something to work. But perhaps the idea is to move away from spooky prejudices like “it's what God wants” or “it's what preserves our life-force” towards prosaic statements of what a “good world” looks like. This focus on the prosaic gives us a framework for less blocked conversations about values. Is “let's maximize liberty!” prosaic or spooky? I think for some libertarians liberty is a fetishized (“spooky”) concept. Will treats it somewhat prosaically, but I think there is a lingering hint of the spooky in his treatment too.

     I guess what I'm describing is not really a philosophy but rather a strategy for discussing moral issues by de-spookifying them. I think that strategy is a large part of the appeal of utilitarianism anyway.

  7. A slightly different (dare I say better) way of putting it is: Deontology is essentially consequentialism that recognizes that there are computational limits to rationality.

  8. Deontology is essentially consequentialism that recognizes the computational limitations of rationality.

  9. “Utilitarianism” is a sham. There are always unknowns, always unintended consequences. “Utilitarianism” only works when you have an omniscient, omnipotent god running things.

  10. A long while back, when there was some drama going on in the blogosphere about the relative lack of conservatives in academia, someone over at the Volokh Conspiracy pulled out a study that purported to show that conservatives were, on average, better educated and better informed than liberals. The other interesting thing it purported to show was that better educated liberals tended to be more liberal, and better educated conservatives tended to be more conservative.

      My guess is that smart, well-educated people are much less tolerant of inconsistencies in their own political views, and much more confident that their own views are correct.

