Project Management Essentials

A one-page summary of (Ken Freed's opinion on) what's essential



The challenge is to develop "good" software within budget and on time, without excessive up-front estimate padding or "analysis paralysis".



Effort is Not Timespan

The effort to do a project is an estimate of the full-time hours the project will take, and includes normal full-time work habits such as the occasional coffee break.

  • The customer's cost to do the project is based on the effort.

Timespan is how long it takes to put these given number of hours (of effort) in on a project.

  • In many environments, unscheduled maintenance and support activities pre-empt an effort for a few hours to a few days.  Splitting development time between projects also extends a timespan for completion, but does not affect an individual project's effort.

The Most Frequent Problem in scheduling projects is that timespans to develop code are difficult to predict when unpredictable line maintenance responsibilities are also involved.


Track the actual "availability" for development work over a several-month timespan.  Use this for predicting how long it will take to do N weeks' worth of (full-time) effort.

  • E.g.: A 10-week (full-time) development effort will take 10 weeks / 0.7 availability = a 14.3-week timespan.
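
The arithmetic above can be sketched in a few lines (the 0.7 availability is just the example figure from the text, not a constant):

```python
def timespan_weeks(effort_weeks: float, availability: float) -> float:
    """Convert a full-time effort estimate into a calendar timespan,
    given the measured fraction of time available for project work."""
    if not 0 < availability <= 1:
        raise ValueError("availability must be in (0, 1]")
    return effort_weeks / availability

# The example from the text: 10 weeks of effort at 70% availability
print(round(timespan_weeks(10, 0.7), 1))  # → 14.3
```

The point of tracking availability first is that the division is trivial; the measured 0.7 is the hard-won number.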



No one estimates software projects perfectly all the time.

  • A few good projects and you'll tend to underestimate the next few projects.
  • Get burned a few times and you'll start overestimating instead.
    • I've been estimating and tracking projects for over 25 years, and I still go through this cycle.

Post Coding Estimate Percentages

 I've tracked the data on most of my projects over the years.  For the type of work I've done, given a Waterfall process of Analysis/Requirements, Design, Code, Test/Debug, here's what the later steps wind up being:

  • Debug: +20% of the coding effort for "Glasshouse" or purely computer applications
    • +30% of the coding effort if machinery, instrumentation, vision, robotics, etc. interfacing and coordination is involved
  • Documentation, Limited User Training: +5 to 10% of the coding effort
  • Administration, Meetings, Status Reporting: +15 to 20% of the coding effort

 This applies not only to your own estimates, but also to outside contractors who claim they can complete (e.g.) a 6-month coding effort with two weeks of onsite installation and debug. Chances are that before you're done, it will wind up being around 6 weeks (if the project is to be successful).
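
A sketch of applying these percentages to a coding-effort estimate; where the text gives a range, the midpoint is used (my simplification, not the author's):

```python
def post_coding_estimate(coding_effort: float, hardware_interfacing: bool = False) -> dict:
    """Expand a coding-effort estimate with the post-coding percentages
    from the text.  Midpoints are used for the 5-10% and 15-20% ranges."""
    debug_pct = 0.30 if hardware_interfacing else 0.20
    docs_pct = 0.075   # midpoint of +5 to 10%
    admin_pct = 0.175  # midpoint of +15 to 20%
    breakdown = {
        "coding": coding_effort,
        "debug": coding_effort * debug_pct,
        "docs_training": coding_effort * docs_pct,
        "admin_meetings": coding_effort * admin_pct,
    }
    breakdown["total"] = sum(breakdown.values())
    return breakdown

est = post_coding_estimate(100, hardware_interfacing=True)
print(est["total"])  # 100 + 30 + 7.5 + 17.5 = 155.0
```

Note the knock-on effect: a "100 hour" coding job is really a 150+ hour project before anyone has padded anything.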



There are many ways in which reasonable people can disagree, and whole books have been written that encompass the following points.  These are not "universal truths", but my own boiled-down generalizations, stemming from the particular work environments I've been in.

It should work, be used and useful.

  • Meeting schedules, budgets, and formal requirements with software that end users consider a millstone around their necks is not a good thing.
  • It is important to flesh out screen GUIs and identify awkward functionality with the end users as part of alpha or beta testing.
    • The "scope" of a project is defined as that where likely minor changes do not take a major rework of the design.
  • In the end, the average customer doesn't care if an application is developed in GWBASIC, Visual Basic, C#, or Assembly for that matter, as long as it looks good, runs well, and works well.
    • Among modern programming languages, debugging capability is much more important than language particulars.

Design and Code should strive for ease of understanding.

(more than performance, reuse, generality, terse code,...)

  • This often means that the software should have the right degree of "generalization" for the problem and budget at hand.
  • The less the programmer understands the application domain or the likely paths of change, the more generalized they tend to make custom software (see: Programmer Joke).
    • Reducing a chaotic set of requirements to simple categories, and then coming up with an easily understood, obvious (and by necessity: limited) design - takes far more up front thought and negotiation than an overly generalized solution.
    • The limits of a software project are best described by documenting those features, that the user is most likely to want, that this phase of the project will NOT deliver.
  • Good code reads like a novel (to someone other than the one who wrote it). 
  • Bad code reads like the Rosetta Stone:
    • The correlation between the code, the screen/database/networking functionality, and the business model/machine was taken for granted by the originators - and has now become hard for anyone else to infer from (at worst) a collection of superficially commented, cryptic-looking listings with highly abbreviated subroutine names and unclearly scoped variables.
      • In the real world, most maintainable code falls between these two subjective good-bad extremes (and depends on the coder's familiarity with the domain, application, language, etc.)
  • As in all writing: the more intelligent the writer, the easier they are to understand (Feynman's lectures come to mind).  The ultimate test of code maintainability is whether it lives on (to be used, useful, and enhanced) well past its originators.


 All software development follows some variation of these Waterfall Process steps/stages/phases:

  • Feasibility: An optional phase to figure out essential unknowns.
  • Analysis: What's the problem and how can we solve it? What are the rough costs of different approaches?
  • Design: How we will solve the problem and what it will cost (effort, timespan).  Scope/Limitations: what we won't do in this pass that the users are most likely to want.
  • Code: Implement the design.
  • Test: Desk testing, tool- or simulator-based alpha testing, and beta testing (where the programmer isn't right there).
  • Maintain: As users come up with better ideas for this or that.



This is the approach I used to sort out an in-house manufacturing context with a lot of competing internal priorities. Other contexts might call for other approaches.

  • Initial Project Estimates, Project Prioritization: Quick estimates, accurate to within some margin, are usually better for sorting a list of competing priorities than accurate ones that take 30% of the total project time to pin down.  If the delivery date is not critical (i.e., plant production and customer commitments are not depending upon it), then don't get lost in "analysis paralysis".
  • Actual vs. Estimated Project Effort: Keep the hours tracking simple, easy, and quick.  Unless the customer is paying by the hour, track to the nearest 15-30 minutes per day.
  • Maintenance, Overhead, Project Availability Summary: Using an Excel spreadsheet, it takes me about 8 hours to sum up and issue a memo on a quarter's worth of work.  The purpose is to (i) get the "availability" for future project development, and (ii) see if any maintenance items Pareto out above the noise and should be prioritized as "root cause fix" type projects.
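
The quarterly summary amounts to summing hours by category, then sorting the maintenance items. A minimal sketch, with hypothetical categories and hours (the log entries below are made up for illustration):

```python
from collections import Counter

# Hypothetical quarterly time log: (category, hours)
log = [
    ("project", 280), ("line maintenance: packer jams", 60),
    ("line maintenance: label printer", 15), ("meetings/overhead", 85),
    ("line maintenance: packer jams", 40),
]

totals = Counter()
for category, hours in log:
    totals[category] += hours

all_hours = sum(totals.values())
availability = totals["project"] / all_hours
print(f"availability: {availability:.2f}")  # 280/480 → availability: 0.58

# Pareto view: maintenance items by hours, biggest first -- recurring
# large items are candidates for "root cause fix" type projects.
maintenance = {k: v for k, v in totals.items() if k.startswith("line maintenance")}
for item, hrs in sorted(maintenance.items(), key=lambda kv: -kv[1]):
    print(item, hrs)
```

Here "packer jams" (100 hours) Paretos out well above the noise, and the 0.58 availability feeds the timespan prediction above.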



Some typical effort multipliers based on post-project regressions from other jobs:


The two axes are familiarity with the application domain (you know the ins and outs of what the code is used for, vs. you've never worked with code that does this stuff) and familiarity with the programming language and operating system:

  • You know the programming language, operating system, etc.:
    • You know the application domain and have worked in it: your estimates should be pretty close.  (+0%)
    • You haven't worked in the domain: it will be harder to anticipate the "scope" of the project.
      • The project "scope" is where likely minor changes in requirements don't take major reworks of the design.
  • The programming language, operating system, application, or class library (etc.) are new to you:
    • You know the domain, but not the application or the language: you'll have to add some padding to come up to speed.  (+20-30%)
    • You know neither the domain nor the application: it is difficult to estimate the project.  (+300%)
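
A sketch of the table as a lookup. Note the text gives no percentage for the known-language/unknown-domain cell (only a warning about scope), so this sketch refuses to invent one:

```python
def familiarity_multiplier(knows_domain: bool, knows_language: bool) -> float:
    """Effort multiplier from the familiarity table; percentages are
    the ones given in the text (range midpoint for +20-30%)."""
    if knows_domain and knows_language:
        return 1.0   # estimates should be pretty close: +0%
    if knows_domain and not knows_language:
        return 1.25  # padding to come up to speed: midpoint of +20-30%
    if not knows_domain and not knows_language:
        return 4.0   # very difficult to estimate: +300%
    # Known language, unknown domain: the text gives no figure here,
    # only a warning about anticipating scope -- so no number here either.
    raise ValueError("no multiplier given for this combination")

# A 40-hour estimate in a known domain but a new language
print(familiarity_multiplier(knows_domain=True, knows_language=False) * 40)  # 50.0
```

The +300% cell is the table's real lesson: an unfamiliar domain dominates every other source of estimation error.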

Favorite Workplace Project Management Guide


Final observations ...

  • An engineer writes a program to solve a problem.
    • Con: Their code tends to be a collection of point solutions.  They don't generalize when they should.
  • A programmer invents a language (or a class library) to solve a problem.
    • Con: They tend to overgeneralize, making their code harder to pick up and maintain for anyone else.
  • A computer scientist writes a specification for a language to solve a problem.
    • If they're witty, they write books and lecture, or become software methodologists.
  • A PhD writes a grammar for a specification for a language to solve a problem.