[UPDATE: Here are some more observations from Ruby-coloured glasses.]
Alex Papadimoulis over at The Daily WTF (one of my favorite IT blogs) has posted a lengthy and thoughtful solution to the problems I raised in my post on the “Dead Sea effect”. Specifically, he refers to the “Up or Out” model, pioneered over a century ago by Paul Cravath and well-known to anyone who has worked in a law firm or one of the Big Eight Six Five Four (how many are left now?) Consulting Firms.
In fact, I’m quite familiar with it, because I spent two years as a Director (one level below Partner) at PricewaterhouseCoopers, which definitely used the “Up or Out” model. In addition, I’ve been serving off and on as an expert witness in IT-related litigation for nearly a decade and so have spent many long hours in law firms and with lawyers (ranging from associates to partners), and as Alex notes, law firms tend to use this same model as well.
So, how well does this model work in developing and maintaining top talent? Well, that depends upon what you define as “talent”. I haven’t read Cravath’s book, so I don’t know what he originally proposed. But in modern law firms and large consulting firms, “talent” is mostly defined as “bringing in clients with lots of money” and “keeping the staff under you fully occupied with billable hours” (though, to be fair, it also includes your actual performance over the course of multiple engagements). Let me explain.
Modern “up or out” firms typically have four to six basic job titles, sometimes with subdivisions. (For example, in our division at PwC, we had — as I recall — Partners, Directors, Senior Managers, Managers, Senior Associates, and Associates.) Ideal engagements follow the “pyramid model” — say, a Partner, one or two Directors, two to four Managers, and four to eight Associates. (You scale up all the numbers for larger engagements, and so on.) In a firm like this, everyone below Partner is typically on a fixed salary, but has a billable rate for clients that is roughly some multiple of their salary. So, for example, in such a firm an associate might be paid $70,000/year but might be billed out to clients at $140/hour (the equivalent of $280,000/year, assuming 2,000 billable hours/year).
The two key words are “leverage” and “utilization”. You want an engagement with leverage — that is, one in which you can apply the pyramid model, staffing it primarily with Managers and Associates. You also want an engagement that allows high utilization — that is, one that keeps those Associates (and hopefully the Managers) billing 30+ hours/week (roughly 75–80% utilization against a nominal 40-hour week) throughout the course of the engagement.
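To make the pyramid economics concrete, here is a rough sketch. The staffing mix, salaries, and billing rates are invented for illustration (only the Associate figures echo the example above); they are not actual PwC numbers.

```python
# Hypothetical illustration of the pyramid-model economics described above.
# All figures are invented for the example, not actual firm numbers.

BILLABLE_HOURS_PER_YEAR = 2000  # ~40 hours/week, 50 weeks

# (title, headcount on the engagement, annual salary, hourly billing rate)
staffing = [
    ("Partner",   1, 300_000, 500),
    ("Director",  2, 180_000, 350),
    ("Manager",   3, 120_000, 250),
    ("Associate", 6,  70_000, 140),
]

def engagement_economics(staffing, utilization=0.8):
    """Return (total billings, total salary cost) for one year at the
    given utilization (fraction of available hours actually billed)."""
    billings = sum(n * rate * BILLABLE_HOURS_PER_YEAR * utilization
                   for _, n, _, rate in staffing)
    salaries = sum(n * salary for _, n, salary, _ in staffing)
    return billings, salaries

billings, salaries = engagement_economics(staffing)
print(f"Billings: ${billings:,.0f}  Salaries: ${salaries:,.0f}  "
      f"Ratio: {billings / salaries:.1f}x")
```

Even with these made-up numbers, the pattern is clear: the heavily-leveraged bottom of the pyramid (the six Associates) generates more billings than the Partner does, which is exactly why staffing and utilization, not just individual brilliance, drive the evaluations.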
The “Up or Out” process itself can be pretty brutal. Here’s a simplified example, again typical of such firms. Once a year, each level would stack-rank, on a fixed curve, all those on the level beneath them. So, for example, the Managers would have to stack-rank all the Associates — if you had 20 Associates, you would have to rate them #1 through #20, with no ties. Then you would have to apply a fixed curve — say, 10% A, 20% B, 40% C, 20% D, and 10% F. That would mean that of those 20 Associates, only #s 1 & 2 could be A-rated, #s 3–6 would be B-rated, and so on, regardless of how much or how little separated them. Those not making the minimum grade (whatever that was) would be invited to leave. Even those who were above the cut but who did not score high enough for a certain number of years running would be informed that there was no promotion to Manager in their future and that they should start looking elsewhere. The Directors would repeat the same process for the Managers, and the Partners for the Directors.
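A minimal sketch of that fixed-curve mechanic (the quotas are taken from the example above; the names and the rounding rule are my own assumptions):

```python
# Sketch of the fixed-curve stack ranking described above: rank everyone,
# then force-fit the ranking to fixed grade quotas.

def apply_fixed_curve(ranked_names, curve=None):
    """ranked_names: list ordered best (#1) to worst. Returns {name: grade}.
    Quotas are fractions of the group; rounding can leave a remainder,
    which here is dumped into the lowest grade."""
    if curve is None:
        curve = [("A", 0.10), ("B", 0.20), ("C", 0.40), ("D", 0.20), ("F", 0.10)]
    n = len(ranked_names)
    grades = {}
    i = 0
    for grade, frac in curve:
        quota = round(n * frac)
        for name in ranked_names[i:i + quota]:
            grades[name] = grade
        i += quota
    for name in ranked_names[i:]:   # remainder left over from rounding
        grades[name] = curve[-1][0]
    return grades

associates = [f"Assoc{k:02d}" for k in range(1, 21)]  # 20 associates, pre-ranked
grades = apply_fixed_curve(associates)
# With 20 people: 2 A's, 4 B's, 8 C's, 4 D's, and 2 F's, no matter how
# little actually separated, say, #2 from #3.
```

Note what the quota structure guarantees: someone always gets the F, even in a group where everyone is performing well, which is precisely the brutality described above.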
(All that said, let me add that the mentoring aspect that Alex mentions was very real and extremely valuable. I learned most of what I know about being an expert witness from PwC — specifically from Jeff Parmet, the PwC Partner who hired and supervised me, as well as from other PwC Directors and Partners, not to mention the significant in-house training sessions that PwC ran each year. I enjoyed my two years at PwC and am grateful for them, particularly when I find myself up against an expert witness who has not had such a background.)
Now, if we look at applying “Up or Out” to an in-house IT department, two immediate questions arise. First, what do we define as “Up”? Most large organizations really don’t have a career track for IT engineers above “senior programmer”, except for maybe a handful of “architect” slots and possibly a Chief Technical Officer (CTO) position. The usual “promotion” is to move into a management track, which a lot of IT engineers don’t really want and aren’t necessarily very good at. This is, I believe, one of the factors behind the Dead Sea effect — your talented IT engineers see very few choices ahead within the organization that let them keep doing what they do best, so they leave; the less talented/skilled IT engineers, on the other hand, are content to remain in their current positions and not advance at all.
I have proposed to organizations before that they implement a non-management technical track all the way to the CxO level, because it’s a lot cheaper to keep your best people on salary than to hire them (or their equivalents) back as consultants at 2x to 6x their salaries. Such a track might look like this: Associate Engineer -> Engineer -> Senior Engineer -> Technical Officer -> Senior Technical Officer -> Executive Technical Officer -> Chief Technical Officer (only one of these). The ranks from Technical Officer and up would have salaries, perks, and benefits equivalent to progressing through management (VP, Senior VP, Executive VP), but without actual management/head count responsibilities. Architects, mentors, and project overseers would be drawn from these ranks. I honestly believe that this upper-level technical track would save most organizations anywhere from millions to hundreds of millions of dollars in failed or late IT projects because they would constantly disrupt the ‘thermocline of truth’ before it even formed.
The second question is, how, and on what basis, do we evaluate the IT engineers? This is a trickier question than it would appear at first. That’s because whatever criteria you select for evaluation will then become the key factors that the IT engineers will “game” to. Suppose, for example, that you judge IT engineers on the basis of accurate schedule prediction and on-time completion of subprojects. You will suddenly have a group of hyper-conservative IT engineers who maximize schedule estimates and minimize completion criteria in order to ensure that they have a great (for them) track record. There’s actually some upside to that — you’ll dampen the inherent (and often excessive) optimism found in many IT engineers — but you have to be willing to live with the consequences as well (such as very long, drawn-out projects).
As for the ‘how’ question, I think that a group evaluation from two levels up might work the best; that is, have all the Senior Engineers evaluate the Associate Engineers, the Technical Officers evaluate the Engineers, and so on. I think the extra distance will allow for a more objective evaluation; IT engineers are notorious for being competitive. I don’t even mind stack ranking so long as there is no fixed curve. If you’re doing your recruiting and hiring correctly, you shouldn’t have many (if any) D- or F-level IT engineers.
So, Alex’s proposal has definite merit, but has some potential pitfalls as well. I’m curious to see what more details he may suggest for it. ..bruce..
About the Author: Bruce F. Webster is Principal and Founder at Bruce F. Webster & Associates, as well as an Adjunct Professor for the BYU Computer Science Department. He works with organizations to help them with troubled or failed information technology (IT) projects. He has also worked in several dozen legal cases as a consultant and as a testifying expert, both in the United States and Japan. He can be reached at 303.502.4141 or at email@example.com.