What are alternatives to the Waterfall model?

Can you please suggest a methodology that alleviates the disadvantages of the Waterfall model?

The problem with Waterfall is that it consists of monolithic stages, each building on the previous stage. So the code is developed in one chunk after the entire system has been designed, which in turn happens after all the requirements have been gathered and signed off.
This is a problem because any change has to be ratified through a complex procedure and rippled through all the stages. But the lesson of history is: change happens. The requirements are always incomplete, mis-specified, or simply out of date by the time we get to coding. Too often, design and build proceed on the basis of assumptions which are nullified when the system gets to UAT. This leads to frantic rework and slippages.
The truth is that not many customers are good at the sort of abstract thinking required to envisage a working software system, and too many IT professionals lack the experience necessary to understand business logic. Waterfall refuses to accept these truths.
The only honest requirement specification is "I'll know it when I see it". So it is crucial to get working software in front of real users as soon as possible. Any methodology which focuses on delivering working software incrementally in short iterations will "alleviate the disadvantages of the waterfall model".
Originally that was RAD or DSDM. Then XP took up the banner. Now there are Agile and related approaches like Scrum and Kanban.
So why do people persist with the Waterfall method?
There is a common perception that Agile is just a cover for cowboy hackers to ditch all the boring process stuff and get on with what they enjoy most: writing code. The branding of "Extreme Programming" certainly encourages this thought, and, let's be honest, it is not an unfounded allegation. That is, some coders pretend to be agile as an excuse not to plan, design or document. But this does not reflect the actual practice of Agile, which requires just as much rigour as any other methodology.
Also, Agile requires a much greater commitment of time from the customer's staff, which many organizations are loath to accept. And the people footing the bill may be unwilling to empower their junior staff to make decisions. There is an important distinction between Customer and User.
When it comes to outsourcing, the waterfall model provides an easy framework for matching deliverables to staged payments. Indeed, the contractual aspect may be stronger than that: in the EU, Waterfall is mandated for all projects valued at EUR 100m or more.
Finally, there are projects where Waterfall works well. These projects have knowledge domains which are stable and well-understood by both the customers and the developers.
Last word
Despite its failings, Waterfall has delivered many projects successfully. This is because hard work, aptitude and integrity are more important than methodology.

The waterfall model was documented in 1970 by Dr Winston Royce in a paper titled 'Managing the Development of Large Software Systems', outlining his ideas on sequential development. His idea was that software could be produced in a similar fashion to an automobile, where the vehicle is pieced together in sequential/linear phases.
This linear approach doesn't really allow for changes in a piece of software once development begins. There is no tight relationship with the end user/client, so it's harder to outline possible problem areas.
It's worth noting that some phases of the waterfall model allow for 'splashback', whereby there is enough time in the development period to go back and make small changes. In practice, though, time constraints, the amount of work involved, and budgets don't really allow for much change, if any, under this model.
The waterfall model is old; as time goes by, software paradigms themselves change. Object-oriented programming is popular now, but back then it was barely alive. Through the use of the waterfall model its flaws have become obvious, and this has led to the alternative development methodologies.
OK, so now for alternatives. The incremental model is described by Alistair Cockburn (2008) as a staging and scheduling strategy in which various parts are developed at different times or rates and integrated as they are completed.
Basically incremental looks a lot like this:
Analysis->Design->Code->Test
Analysis->Design->Code->Test
Analysis->Design->Code->Test
Benefits include a flexible lifecycle that allows for change from the get-go.
Working software, or at least parts of it, is generated quickly and early on. The code produced is easier to test and manage thanks to the small iterations of progress. Not all of the requirements of the system are gathered up front, just an outline; this allows for a quick start, but it might be a disadvantage in some systems, as things like the supporting system architecture might be missed.
The iterative model, on the other hand, allows parts of the system to be reworked and revised to improve the system, and time is set aside to allow for this. Iterative development does not start with a full specification of requirements; development is done by specifying and implementing just part of the software, which is then reviewed in order to identify further requirements. This is more of a top-down approach. A disadvantage of this methodology is making sure all the iterations are compatible. As each new iteration is approved, developers may employ a technique known as backwards engineering, a systematic review and check procedure to make sure each new iteration is compatible with previous ones. A major benefit of the constant iterations is that the client is kept in the loop and the final product should meet the requirements.
(Iterative approach diagram)
Other methodologies include prototyping, in two flavours: evolutionary and throwaway. These are also deemed more of a top-down approach. Both processes are borrowed from engineering, where it is common to construct scale models of objects to be built; building models allows the engineer to test certain aspects of the design. The software development prototyping methodology provides the same ideology. Prototyping is not seen as a standalone, complete development methodology, but rather an approach to handling selected portions of a larger, more traditional development methodology.
Throwaway Prototyping - Throwaway prototyping does not preserve the prototype that has been developed. In throwaway prototyping there is never any intention to convert the prototype into a working system. Instead the prototype is developed quickly to demonstrate some aspect of a system design that is unclear. It can also be developed to help users or clients decide between different features or interface characteristics. Once any problems or uncertainties have been addressed, the prototype can be 'thrown away' and the principles learned used in the design and documentation of the actual product.
Evolutionary Prototyping - In Evolutionary prototyping you begin by modeling parts of the target system and if the prototyping process is successful you evolve the rest of the system from those parts. One key aspect of this approach is that the prototype becomes the actual production system. This process allows for difficult parts of the system to be modeled successfully in prototypes and dealt with early on in a project.
Other areas to look into include Agile: Scrum, Extreme Programming, pair programming, etc.
I've tried to keep it short, but people write books on this sort of stuff and there is so much to discuss.
Might be worth having a look at:
Incremental and Iterative

The alternative to the waterfall method is "doing it the correct way".
Waterfall seems to make sense if you are on a factory floor assembly line. But I've never seen it work as part of the design process... and software development is ALL a design process. So the waterfall method never really works, in the sense that it doesn't help facilitate the creation of a high quality product, but rather focuses on process. Process can be great, but what's the point if the product it produces is second rate?

Kanban and Scrum are two of the most commonly used alternatives to Waterfall. I tried to give a good overview and comparison of the different SDLC approaches.
Waterfall relies heavily on massive monolithic phases, as mentioned by APC. This is a huge weak point, because trying to determine the end product from the start is a fruitless endeavor.
Kanban is slightly cowboy, but I find that if you couple it with standups it certainly still has its place.
Scrum is great for putting pressure on the team and getting ownership of tickets. I've found most places have been going with this one, but its downfall is that some people go overboard with having meetings for everything: sprint planning meetings, sprint kickoff meetings, daily standup meetings that last an hour with 20+ people present, demo meetings, and then finally the post-mortem.
Remember that agile is only as good as you make it, and you can easily sink any methodology if you go wild with unrestrained meetings which aren't adding value. Keep it as lean as you can without it being chaotic.

Off the top of my head, I can think of a few ways to palliate the shortcomings of the waterfall model:
Have the coder concentrate on automating the process itself. Automate the transitions between one step and another, so that changes will flow more or less automatically (a sketch of this idea follows these points).
Make the process more bidirectional. One principal characteristic of the waterfall model is that changes flow from top to bottom. This is a unidirectional process, and that is part of the problem.
Another thing which would help (as someone mentioned in an earlier answer) is for the developer to gain a better understanding of the business logic involved and of what the customer wants, and for the customer to gain knowledge about the characteristics of the development process.
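To make the first point concrete, here is a minimal sketch in Python of a make-like runner; the stage artifacts and the pipeline itself are invented for illustration. The idea is that a change to an upstream artifact automatically marks the downstream stages as stale, so changes ripple without a manual ratification procedure.

    import os

    def stale(source: str, target: str) -> bool:
        """A target is stale if it is missing or older than its source."""
        if not os.path.exists(source):
            return False  # nothing to propagate yet
        return (not os.path.exists(target)
                or os.path.getmtime(source) > os.path.getmtime(target))

    # Hypothetical artifacts produced by successive stages.
    PIPELINE = [
        ("requirements.txt", "design.md"),
        ("design.md", "module.py"),
        ("module.py", "test_report.txt"),
    ]

    for source, target in PIPELINE:
        if stale(source, target):
            # A real setup would invoke a generator, compiler or test
            # runner here; this sketch just reports what would ripple.
            print(f"{source} changed -> rebuild {target}")

In practice you would get this from an existing build tool (make, a CI pipeline) rather than hand-rolling it, but the principle is the same.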

Here are some links about Waterfall model:
http://www.cs.odu.edu/~zeil/cs451/Lectures/01overview/process2/process2_htsu2.html
http://www.buzzle.com/editorials/3-13-2005-67039.asp

Related

Where does specification by example complement/replace traditional requirements documentation?

I'm trying to understand where SBEs complement or replace traditional requirements documentation. The "levels of requirements" diagram shows three levels of traditional software requirements.
Which of the items below (from the diagram) does SBE replace and which ones does it complement:
Vision and Scope Document
Business Requirements
Use Case Document
User Requirements
Business Rules
Software Requirements Specification
System Requirements
Functional Requirements
Quality Attributes
External Interfaces
Constraints
My naive understanding of SBEs would say that they are just an alternative form of the Software Requirements Specification. Is this correct?
BDD and SBE are normally used by Agile teams, who don't focus as much on documentation as traditional software development teams do.
BDD is the art of using examples in conversation to illustrate behaviour. SBE then uses the examples as a way of specifying the behaviour that you decide to address (I always think of it as a subset of BDD, since talking through examples often ends up eliminating scope, discovering uncertainty or finding different options, none of which end up as specifications).
There are a couple of things that are hard to do with BDD. One of them is anything which isn't discrete in nature, or which needs to always be true throughout the lifetime of the system: non-functionals, quality attributes, constraints, etc. It's hard to talk through examples of these. These continuous aspects of requirements lend themselves better to monitoring, and monitoring is discrete, so BDD can even be used to help manage them.
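For readers who haven't seen specification by example in practice, here is a minimal sketch of what a talked-through example table can turn into; the discount rule and all names are invented, and a parametrized test is just one possible encoding.

    import pytest

    def discount(customer_type: str, order_total: float) -> float:
        """Toy implementation under test (illustrative only)."""
        if customer_type == "gold" and order_total >= 100:
            return 0.10
        if customer_type == "gold":
            return 0.05
        return 0.0

    # Each row is one concrete example the team talked through;
    # together the rows ARE the specification of the behaviour.
    @pytest.mark.parametrize("customer_type, order_total, expected", [
        ("gold", 150.0, 0.10),      # big order from a gold customer
        ("gold", 50.0, 0.05),       # small order from a gold customer
        ("standard", 150.0, 0.00),  # standard customers get no discount
    ])
    def test_discount_examples(customer_type, order_total, expected):
        assert discount(customer_type, order_total) == expected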
Since an initial vision is usually created to help the company make money, save money, or protect existing revenue (stopping customers going elsewhere, for instance), you can even come up with examples of how the project will do this. In fact, if you can't, the project is likely to fail anyway. So BDD / SBE can also be used to help complement an initial vision and scope.
Therefore, BDD / SBE can complement all of these documents, and in Agile teams, the documents themselves are usually replaced by conversations about the requirements and rules (illustrated by examples), story cards to represent placeholders for those conversations, and perhaps some lightweight capture of those conversations on a Wiki.
It is unlikely that any Agile team captures all of their examples up-front, as this leads to excessive investment in the requirements and tends to turn it into a traditional Waterfall/SDLC project instead.
This blog post I wrote about BDD in the Large may also be of interest.

Greenhopper: only useful if you buy the whole Agile package?

I work in a remote company where email, forums and IM are key communication tools. We don't formally follow any book-methodology but have been evolving towards a more agile-style setup in terms of release schedules, client interaction and a sort of scrum adapted for distributed teams.
We use Jira on one project and I wondered if Greenhopper might be of use. Or whether it is focused on teams who adhere much more formally to the entire Agile dogma, and would end up getting in the way?
I don't want to get into a discussion on if Agile is good or bad, what it involves, etc... just whether Greenhopper is useful to projects whose methodology is inspired by Agile principles rather than defined by them.
GreenHopper is pretty flexible. I've seen marketing teams, operations teams and more using the task board to track their work in progress without any real thought to 'agile'. Give it a shot and let me know how you get on.
Thanks,
Nicholas Muldoon
GreenHopper Product Manager
I won't comment on Greenhopper, since I don't have recent experience regarding it.
Many tools are designed around a "pure agile" model: a single team, working on a single thing, for a single product owner.
However, in my experience the vast majority of real-world organizations – even the very small – have a more complex landscape they need to manage. Often, even a single freelance developer has a load of things to work on: the active project(s), leads for new projects, maintenance of old projects, and so on. And simple tools that have a pure-scrum-one-team-works-on-a-single-thing-at-a-time just won’t cut it…
Agilefant is IMHO, while compliant with "the agile dogma", also pretty flexible with respect to how people can be assigned to multiple things. Also, it's quite flexible in terms of how different types of work can be modeled in it (that is, there's no need to conduct every project as sprints, and so on).
DISCLAIMER: I'm Agilefant's product owner.

Tips for avoiding second system syndrome [closed]

Lately I have seen our development team getting dangerously close to the 'second system syndrome' type ideas while we are planning the next version of our product. While I'm all for making improvements and undoing some of the mistakes of our past, I would hate to see us stuck in an endless loop of rewriting and never launching anything.
Is there a good design / development method that lends itself to building a better version 2.0 while avoiding second system scenarios?
I have experienced second system syndrome from both sides, as a customer/sponsor and as part of a development team.
A root cause of problems is when the team latches on to a Utopian vision of version 2, such as the desire to make the new software "flexible". In this scenario everything is abstracted to the nth degree. For example, instead of data objects that represent real-world entities, a generic "item" object is created that can represent anything. One flawed idea is that we can build so much flexibility into the software, to anticipate future needs, that non-programmers will be able to just configure new capabilities. Often one goal such as "flexibility" overshadows the effort to the point that the resulting software doesn't work.
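To make the "generic item" trap concrete, here is a hedged sketch (all class and field names invented) contrasting the Utopian flexible object with a plain domain class:

    # The "flexible" version: one generic object that can represent anything.
    # Nothing is checked until runtime, and every caller must know the magic keys.
    class Item:
        def __init__(self, attributes: dict):
            self.attributes = attributes  # e.g. {"type": "invoice", "total": 99.0}

    # The concrete version: a real-world entity with an explicit shape.
    class Invoice:
        def __init__(self, number: str, total: float):
            self.number = number
            self.total = total

    # With Item, a typo like item.attributes["totl"] fails somewhere deep
    # inside the system; with Invoice, a bad field name fails immediately.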
A balanced practical consideration of usability, performance, scalability, features, maintainability, and flexibility goals can bring the team back to earth. "It would be great if..." should be prohibited from the vocabulary of the team. The Scrum backlog is a good tool and the team should be heard saying often... "Let's backlog that...we don't need that for version 2."
"I would hate to see us stuck in an endless loop of rewriting and never launching anything."
Which is why people use Scrum.
Define a backlog of things to build.
Prioritize, so that things which lead to a release are first. Things which should be fixed are second.
Execute sprints to get to the release. Execute a release sprint.
Get someone who has written at least three systems to lead the project.
Try to focus on short delivery cycles, i.e. force yourself to deliver something tangible and useful to the users every few weeks or every month. Prioritise the tasks based on their value to the customer. This way you always have a usable, functional application with satisfied users, while under the hood you can refactor the architecture in small steps if you wish (and if you can justify the need for it, that is, to management / the customers, not just teammates!).
It helps a lot if you have a stable build process with a good suite of automatic (unit / integration) tests.
Agile development methods like Scrum do this, and they are warmly recommended. But of course it is not always easy or even possible to adopt such a method in your team. Even if you can't, you can still take elements of it and apply them to your project's benefit (maybe without even mentioning the words "agile" or "Scrum" publicly ;-)
Make sure you document the requirements as well as possible. While obviously you need to also manage what gets into the requirements to avoid over-designing, having a fixed scope helps prevent developers from running off with ideas or gold-plating what needs to be done and it helps keep management or clients from introducing scope creep. Define all requirements and how scope changes will be addressed.
I'm all for short development cycles (make sure you're writing tests) and agile methodology, but neither of those is a shield against second system syndrome. In some ways it's easier to keep adding on feature after feature if you're working in short sprints without stopping to look at the overall picture. Use agile practices to build the simplest thing that works, then keep adding your other requirements as simply as possible. Remember YAGNI, because when you build a system a second time, that's when you're most likely to build something you're sure you'll need at some point, something that will make the system "extensible" so there never has to be a third build. It's the best of intentions, but the road to hell and all that.
You can't get close to second system syndrome. Either you're in it, or you're away from it. You'll know when you're in it, but only after wasting a lot of resources.
Tips are: know about it (so basically we've got that covered already). It's invaluable to know what a "second system" is, and even more to have some experience with one. I think we all have some experience with that, hopefully benign.
That knowledge will often make you think twice and you'll find a solution without venturing into second-system limbo.
PS: Also know how to use your current system; that includes any documented solutions and other documentation.
Focusing on the system architecture should help, e.g.:
Having documented interfaces which support "loose coupling" between sub-systems
Having documented design decisions (avoid re-hashing previously beaten paths)
Hence, without going for an all-out swap, the current system can be "upgraded" with more appropriate interfaces to help future growth.
Another good way to focus: assign a $$$ figure to each feature and prioritize accordingly ;-)
Anyhow, just my 2 cents.
I up-voted S. Lott's answer and would like to add some more suggestions:
Deliver a working product to your customer as frequently as possible. Iterations lasting between a few weeks and 2 months are ideal. Agile methodologies, such as Scrum, lend themselves well to this.
Use FogBugz for feature and bug tracking. Its features are very practical for agile projects. FogBugz allows easy prioritization according to releases and priorities. If your team enters their estimated levels of effort for each task, you can also use this to calculate reasonable ship dates.
Prioritize which features you will develop according to the 80/20 rule (20 percent of the features will be used 80 percent of the time) and the most bang for the buck. This helps keep the system as simple as possible, helps prevent feature bloat, and saves development and testing time.
Give similar thought to both new features and bugs when you determine the priority of an issue. One point of the Joel Test is "Do you fix bugs before writing new code?". In most shops this doesn't happen, but do not make debugging the system an afterthought. Also, keep a working copy of the old system to compare against when bugs are found on the new system (see the sketch after this list).
Also factor in the level of effort required to maintain, and if necessary rewrite, existing code. If you have not already done this, give the team some time to code review whole files that they find troublesome to change. If the system's code was difficult to maintain the first time, this will only get worse and cause your team to spend more time on new features down the road.
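One cheap way to use that working copy of the old system, sketched here with invented stand-in functions, is a characterization (golden master) test: the old system's observed behaviour becomes the expected output for the rewrite, which directly counters second-system drift.

    # Characterization test: pin the new system's output to the old one's.
    def legacy_tax(amount: float) -> float:
        """Stand-in for a call into the old, working system (hypothetical)."""
        return round(amount * 0.175, 2)

    def new_tax(amount: float) -> float:
        """Stand-in for the rewritten code (hypothetical)."""
        return round(amount * 0.175, 2)

    def test_new_matches_old():
        # Inputs chosen to cover cases the old system already handles.
        for amount in [0.0, 9.99, 100.0, 123456.78]:
            assert new_tax(amount) == legacy_tax(amount)

Any divergence from the old behaviour then becomes a conscious decision rather than an accident of the rewrite.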
It can never be avoided in its entirety, but being cautious can alleviate the problem.
You should formulate some rules of thumb based on the vital parameters (the scarcest resources) that define the success of the system. For example, reducing the potential number of bugs might directly decrease operational cost (potential service calls to the support center), but this might not be the case in every system. Another example: sparing use of CPU, memory and other resources might be beneficial in some cases, but there are environments where these are available in abundance.
So, simply to avoid "temptations", identify the scarcest resources (time, effort, money, etc.) and consider implementing only those features that exceed a threshold value [f(s1, s2, ...) > threshold].
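One hedged reading of that f(s1, s2, ...) > threshold idea in code, with an invented scoring scheme and invented numbers:

    # Score each candidate feature against the scarce resources and
    # implement only those whose score clears the threshold.
    def score(value: float, time_cost: float, effort: float) -> float:
        # One possible f(s1, s2, ...): value per unit of scarce resource.
        return value / (time_cost + effort)

    features = {
        "export to CSV": {"value": 8.0, "time": 2.0, "effort": 1.0},
        "plugin system": {"value": 3.0, "time": 8.0, "effort": 9.0},
    }

    THRESHOLD = 1.0
    approved = [name for name, f in features.items()
                if score(f["value"], f["time"], f["effort"]) > THRESHOLD]
    print(approved)  # only the features that clear the bar survive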
Despite the iterative development, I would emphasize how technical debts are handled.
Links that are related to this:
Tech Debts: Effort Vs Time
Tech Debt Quadrant

What kind of software development process should a lone developer have? [closed]

I work as a lone developer in a very small company. My work is quite chaotic and I'm looking for ways to make it more organized.
One problem is that my projects have practically no management. Rarely does anyone ask me what I'm doing or whether I have any problems. At some point there was talk about weekly status meetings, but that was some time ago. It seems that if I wanted something like that, I would have to arrange it myself. Sometimes I'm a bit lost about what I should do next because I don't have tasks or a clear schedule defined.
From books and articles I have found many things that might be helpful. Like having a good coding standard (there exists only a rough style guide which is somewhat outdated in my opinion), code inspections, TDD, unit testing, bug database... But in a small company it seems there are no resources or time for anything that's not essential. The fact that I work in the embedded domain seems to make things only more complicated.
I feel there's also a custom of cutting corners and doing quick hacks on short notice. This leads to unfinished and unprofessional products, with bugs waiting to emerge at a later date. I would imagine they are also a pain to maintain. So, I'm about to inherit a challenging code base, do new development that requires learning a lot of new things, and, I guess, try to build a process for it all at the same time. It might be rewarding in the end, but as I'm not too experienced I'm not sure if I can pull it off.
In a small shop like this the environment is far from optimal for programming. There are many other things that need to be done occasionally, like customer support, answering the phone, signing for parcels, hardware testing, assembly and whatever miscellaneous tasks might appear. So you get the idea about the resources. It's not all bad (sometimes it's enlightening to solve some customer problems) and I believe it can be improved, but it's the other things that I'm really concerned about.
Is it possible to have a development process in a place like this?
Would it help to have some sort of management? What kind of?
Is it possible to make quality products with small resources?
How do I convince myself and others that the company which has worked successfully for decades needs to change? What would be essential?
Maybe there's someone working in a similar shop?
Use Source Control for EVERYTHING
Develop specifications and get signoff before starting - there will be resistance, but explain it's for their own good.
Unit tests! It hurts because you just want to get it done, but this will save you in the long run.
Use bug tracking - Bugzilla or FogBugz if you can afford it.
My advice is not to be extreme. From my experience, pure agile or pure traditional will not work. Before you use any process, know what problem it is meant to solve.
I personally use a variation of Agile RUP. I do some upfront work, such as investigating the actual needs and doing a high-level design with possible extensions, and I ask the customer to sign off some major high-level requirements.
If you work in a small group, detailed design or specification may not be worth the effort. Of course, if there are libraries that are shared by many, they will be worth the trouble.
Decide what to invest in up-front depending on its risk (likelihood and effect).
Moreover, many software best practices really are 'best', like version control and automated testing (which I only use for early regression detection, as I do not believe in TDD as the driving force of development). I suggest you read 'The Pragmatic Programmer'; it presents many of these techniques.
To answer your questions:
(1). Is it possible to have a development process in a place like this?
Yes, but as I said, tailor it to fit your organization.
(2). Would it help to have some sort of management? What kind of?
Management helps, but not a control freak. Plan what to do and when: integration, conflict resolution, deadlines for milestones. And roughly keep to the schedule (I particularly like Scrum's sprints).
(3). Is it possible to make quality products with small resources?
Definitely, as long as the size of the work, the time to develop it, and the size of the team are in balance. That is, if your definition of quality is the same as mine. To me, quality means: solves the problem it set out to solve in an efficient and reliable fashion.
(4). How do I convince myself and others that the company which has worked successfully for decades needs to change? What would be essential?
Point out the problems. If there are none, why change? If you want change, you should be able to identify the problems, or the potential problems.
Some big ones are:
Without any process, it is harder for new recruits to blend in, as they must learn how to deal with things by observing others.
Without a process, it is harder to work under stress.
Without a schedule, it is hard to determine progress.
Without automated testing, it takes more time to identify problems and regressions.
Without version control, it is harder to roll back mistakes, and dividing the work among team members will be a mess.
Just my thoughts.
You need to work with the owner and set up short-, medium- and long-term goals. You will want to let them know your progress, even if only through email.
You will need to enforce some order on your workday or you will never get anything done (those long term goals).
Divide your day up into chunks when you can code, when you are working on hacks to keep things together, when answering emails, etc.
Definitely set up a bug tracker. This can help keep your email clean. You can even set up an email address to forward bugs to, to be categorized later. This is good because the bug reporters will eventually tire of the bug tracker and want to just email you the bugs anyway.
edit
And as lod3n said, source control, but you are using that already right???!!?!
Been there, done that.
The book Planning Extreme Programming helped a lot. I used 3x5 cards stuck on a wall. This kept my boss informed of my progress, helped with estimates and planning, and kept me on track. The Planning Game gives good ammo if your boss's expectations are unrealistic.
Unit testing, as others have stated, helps even if you're a sole developer. I find the TDD style valuable.
lod3n is absolutely right about Source Control.
I've gone with XP-style iterations before; you may want to establish a backlog of items (even something simple like a spreadsheet or 3x5 cards) as well.
Avoid death marches. Stick to 40 hours, in the sense of not working overtime two weeks in a row. Spend extra time outside of work learning new skills - not just technologies, but principles and best practices.
Have a bug tracking system, both for defects and new features. Don't rely on your memory.
Continuous integration and an automated build can help even a single developer.
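As a hedged illustration of what continuous integration can mean for one person, here is a minimal build script; the step commands (pytest, pyflakes) and the src layout are assumptions, not prescriptions. Hooked to a post-commit hook or a cron job, it becomes a poor man's CI server.

    #!/usr/bin/env python3
    # build.py - one-command build-and-test for a lone developer.
    import subprocess
    import sys

    STEPS = [
        ["python", "-m", "pytest", "-q"],     # unit tests (assumes pytest)
        ["python", "-m", "pyflakes", "src"],  # static checks (assumes pyflakes)
    ]

    for step in STEPS:
        print("running:", " ".join(step))
        if subprocess.run(step).returncode != 0:
            sys.exit(f"build failed at: {' '.join(step)}")
    print("build OK")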
Can't emphasize the recommendation for source control enough.
I've decided I don't need unit tests because I can have automated functional/integration tests instead: with incremental development there's no need to test before integration.
As crazy as this sounds, I use Scrum, just because I like the concepts of sprints and backlogs. It makes it easier to set realistic goals. Of course, the scrum master and the team are all you, but if you are working on outside projects where you might pick up an extra team member, your backlogs will make it easy to distribute work. Maybe I just like backlogs. With Scrum you will need to get someone to be the product manager to talk about the features of the product. Version control is a must and should probably be implemented before even worrying about a software development process. I have worked in a company that grew from 2 developers to 12 in a year. We started with no version control and low coding standards. The changes you will need to make will be gradual, so don't worry about rushing to do a 180. Set a goal to change one thing a month and find supporters of your changes to make things go smoothly.
As well as the recommendations of others, I'd say that if you are tight on resources but have more say over how things get done, you should make good use of off-the-shelf and open source products and libraries. This leverages the efforts of others, saving you time, ensures your code base doesn't become too esoteric, and adds to your skillset so you don't end up being an expert in something that's useless everywhere else.
First, let's make a distinction between a development process and best practices. Best practices like source control, defect tracking, unit testing, etc. are a given.
Then there is the actual development process. I would always recommend having a process, no matter how small or large the team is. The trick is finding the right process. You have a process now; it is just an ad-hoc process that doesn't seem to be working out too well for you. Rarely can you take a textbook development process and apply it directly. What you need to do is tailor the process to your company's needs and culture. Look at as many development paradigms as you can and try to find something that is a good fit, then start molding it to your needs. You may have to try and fail with a number of different processes. Perhaps the Personal Software Process would be a good starting point, maybe an agile process, or a variant of RUP? You have a lot of options; start trying them out.
You are also going to have to work with the rest of your organization; they need to be a part of the process. You may be the lone developer, but a development process involves more than the developer: it involves every person that has a say in or an impact on the product.
This may not be a specific answer, but my point is that you will need some kind of process. So start researching them and trying them out and molding them to your needs until you have something that works.

Does anyone still believe in the Capability Maturity Model for Software?

Ten years ago when I first encountered the CMM for software I was, I suppose like many, struck by how accurately it seemed to describe the chaotic "level one" state of software development in many businesses, particularly with its reference to reliance on heroes. It also seemed to provide realistic guidance for an organisation to progress up the levels improving their processes.
But while it seemed to provide a good model and realistic guidance for improvement, I never really witnessed an adherence to CMM having a significant positive impact on any organisation I have worked for, or with. I know of one large software consultancy that claims CMM level 5 - the highest level - when I can see first hand that their processes are as chaotic, and the quality of their software products as varied, as other, non-CMM businesses.
So I'm wondering, has anyone seen a real, tangible benefit from adherence to process improvement according to CMM?
And if you have seen improvement, do you think that the improvement was specifically attributable to CMM, or would an alternative approach (such as six-sigma) have been equally or more beneficial?
Does anyone still believe?
As an aside, for those who haven't yet seen it, check out this funny-because-it's-true parody.
At the heart of the matter lies this problem, neatly described by the CMM guidance itself...
“...Sound judgment is necessary to use the CMM correctly and with insight. Intelligence, experience and knowledge must shape an appropriate interpretation of the CMM in a specific environment. That interpretation should be based on the business needs and objectives of the organization and the projects. A rote, checklist-oriented application of the CMM has the potential to harm an organization rather than help it...”
From Page 14, section 1.6 of The Capability Maturity Model, Guidelines for Improving the Software Process by the Carnegie Mellon University Software Engineering Institute, ISBN 0-201-54664-7.
I found it to be a bloated documentation exercise that was used mainly as a contract-acquiring/maintaining vehicle. Once we had the contract, it was an exercise in getting around the process.
As a developer, I got nothing out of it but lost MONTHS of my professional life fiddle-farting around with CMMI.
The same goes for Six Sigma, which I branded "Common Sense in a Box". I didn't need to be trained to figure out what the problem with a process was - it was generally quite evident.
For me, small teams and agile mechanisms work much better. Short cycles, lots of communication. That might not work in all environments, but it definitely works in mine.
Just my two cents.
For a typical CMM level 1 programming shop, making the effort to get to level 2 is worthwhile; this means that you need to think about your processes and write them down. Naturally, this will meet resistance from cowboy programmers who feel limited by standards, documentation, and test cases.
The effort to get from level 2 ("there is a process") to level 3 ("everyone has the same process") normally gets bogged down in inter-departmental warfare, so it's probably not worth starting.
If you see CMM, run. And run fast.
CMM and CMMI both offer some benefits if your organization takes the lessons they try to teach to heart. The problem is that getting to the higher levels is very difficult and expensive, and the only time I have seen an organization go through the effort is because their customers won't let them bid on contracts until they are at a certain level.
This has the effect of the organization doing everything they can to "just get the number" without actually caring about improving their process.
The higher end? No. CMM-5 shops do not impress me.
The lower end? Yes. CMM-1 organizations scare me.
CMM can help a new/novice team measure themselves and do the self improvement thing.
CMMI isn't really about improving your software, it is about documenting what you have done. You can almost estimate a company's CMMI level by the weight of the documentation it produces.
Background: I have studied CMMI in my Software Engineering graduate program and have worked on a team that followed its guidelines.
My experience is that the CMM is so vague that it's very easy to fulfill. Also, when they come to certify you, they look at the project that your organization chooses. Where I used to work, this was the project with no real deadline, plenty of money, and lots of time to spend on every nook and cranny of the process. Many of the other projects continued with little to no code/design review, sometimes without versioning software.
I think the emphasis on CMM certification is unfortunate. Companies know how to work the system, and do. Instead of focusing on real process improvement that meets their bottom line, they focus on getting a certification and working the system. I honestly think most organizations would rather spend time on the former instead of wasting so much time on the latter.
Really what matters is having conscientious people who want to make good development decisions and know that they will need help making those decisions. There is no substitute for quality programmers who know that programming is an ongoing group activity where they are just as likely to make a mistake as anyone else.
I have been doing a lot of interviewing for small teams doing iterative development. Personally, if I see CMM on a resume it is a big red flag that signals interest in process over results.
All formal methods exist to sell books/training courses/certification, and for no other reason. That's why there are so many formal methods. Once you realise this, you are free :-)
Yourdon still believes. But he might also still believe the world is going to end with Y2K.
This is not something I would personally put a lot of faith in or want to be yoked with in the future. But often ours is not to reason why...
P.S. Though a bit off-topic, I would like to mention that faked CMMI certifications are very common as well as real certifications obtained through bribery.
CMM doesn't really speak to the quality of the software, but more towards the documentation and repeatability of the process. In other words, it is possible to have an orderly and repeatable development process, but still create crappy software. As long as the process is properly documented, it is possible to achieve CMM Level 5.
At the end of the day CMM is another tool that can be used or misused. If the end goal is to improve software quality, it is possible to use CMM to improve the development process and improve software quality. If achieving a certain CMM level is the goal, then most likely software quality will suffer.
The model is losing its credibility, first because companies adopt it not looking for a more mature software development model, but to be appraised at a CMMI level.
The other problem, the one that I think leads to the lost credibility, is that as a contractor you have no guarantee that the project your CMMI-appraised supplier is selling you will be developed using the model's practices. The CMMI label only states that the company has at some point developed projects that were evaluated as adherent to a specific CMMI maturity level.
The problem is not just with CMMI but with the processes developed by the companies. CMMI does not describe the process itself, just what the process should do. You have the same problem with the PMBOK. Actually, the problem is not just the PMBOK; primarily it is the bad project managers that claim to follow the PMI statements.
At school I was taught: CMM is a good idea, but lacking certification (anyone can say they are level 5 / level 4) it ends up being a marketing tool for offshore shops. So, yeah, the idea is sound, but how do you prove adherence?
I used to. But now I find that CMM and CMMI don't really fit that well with agile approaches.
Oh sure, you can squeeze things to get that square peg into the round hole, but when push comes to shove, you are still basing your approach on an ability to predict everything that is needed, and to anticipate everything that will be encountered, when building a software system.
And we all know how well that approach works in real life! (-:
cheers,
Rob
Agile is the next CMM, and both are fragile. The field of process and quality consulting is a good business in any industry, and like the engineering folks, everyone needs new buzzwords to keep the money flowing.
CMM, when it first came out of the SEI, was a good concept based on solid academic work, but it was soon picked up by the process consultants and is a worthless certification now, used by most CIOs to cover their ass (nobody got fired for picking a CMM Level 5 company).
Agile is going to go down that route soon, and then we can be sure to see the next silver bullet on the horizon :)
When I worked on commercial flight software, we used CMM, and as our processes improved, our ability to accurately predict completion times improved. But this was a cumbersome process; other approaches should work just as well.
Smaller projects are less dependent on process for success. The key metric is the Hero-to-Bystander Ratio. Any project with an HTBR of less than 0.2 is in serious trouble.
There are quite a few good ideas that can readily be adapted and adopted by any organisation for their own good, but getting a badge is a pain due to the requirement for all kinds of redundant documentation.
The problem is that CMMI is not a process but just a guide for whatever process you might choose to have, and that in itself invites half-baked ideas to float around.
Another point is that migration is a real pain when you are starting, but it's the same as any other teething trouble, I guess.
The main issue with understanding the value of CMMi is understanding CMMi itself.
CMMi is a documented approach to Continuous Improvement for Software Production.
Understanding continuous improvement with SPC is difficult enough in manufacturing; add the intangible software product and the difficulty is exponential.
I would recommend to anyone, or any organization, new to CMMI: document your current process, then look at which outcomes (cost/benefit) can be measured independently of the process. That way, if any process, procedure, or standard were changed, you could tell whether it yielded a 'better' result. The prerequisite to this exercise is a documented, stable, repeatable process, since it is impossible to measure the benefit of any change within an ad-hoc environment - you are not comparing like for like.
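As a hedged illustration of the SPC idea applied to a software outcome (the metric and all numbers are invented): establish the stable range of the current process first, and only then judge whether a change moved the needle or is just noise.

    import statistics

    # Defects found per iteration under the current, documented process.
    baseline = [12, 9, 14, 11, 10, 13, 12, 8]

    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    upper, lower = mean + 3 * sd, max(0.0, mean - 3 * sd)
    print(f"stable range: {lower:.1f} .. {upper:.1f} defects/iteration")

    # After a process change, a point outside the range (or a sustained
    # shift) is evidence of real change; points inside are just noise.
    for defects in [7, 6, 4]:
        verdict = "signal" if not (lower <= defects <= upper) else "within noise"
        print(defects, verdict)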
By focusing on the above concepts initially, the organization will begin to understand and embrace the essential value of CMMI.
Legend has it that the US Department of Defense, which did a lot of contracting, found that many of its projects faced time and cost overruns, and even when they were delivered, the projects were not exactly what was ordered.
So they wanted a way to be sure that a contractor would be able to deliver on time, within budget, and close to what was required. Thus the Capability Maturity Model was born.
The thesis is that if things are written down, they survive attrition. But just saying "write everything down" would not be enough; it must be checked that things are written down correctly. Among other things.
Throughout all this, it never crossed their minds to consider the cost of doing it. Because from the point of view of the DoD, if it gave out a project for $1 million to get something in a year, and ended up paying $10 million over 10 years and not getting what it wanted, then paying $5 million to get what it actually wanted in two years still saves $5 million - not to mention that it actually gets something.
So if you are a contractor to the US DoD or something like that, go ahead and get CMM, because it will be a requirement. But if you are competing with the thousands of software development shops on Elance, to get projects with limited budgets, limited time and so on... CMM is not a good choice.
That said, feel free to read the CMMI Dev PDF (v1.3 at the time of writing). It makes a lot of good points and deconstructs the organisation very nicely. If you see any point which makes you go "aha! I have this problem", then by all means use that wisdom to resolve your problem. In our case, one small change we made was to keep a list of all the people who are allowed to give us requirements. If more than one person was allowed to give us requirements, then any requirement coming from one source was circulated to the others, and they had to say "okay" before we added it to the backlog. This small change made a big difference in how much we worked and reworked.
In short, look at the process areas and compare them to your pain areas, and take the suggestions given by CMM. The way you implement them is your own, and you can always implement them in a way that does not take too much time or cost too much money. But I guess the same applies even to the relevant ISO/IEC standards.
