How to evaluate the design of a brand new application that falls into a fairly unfamiliar knowledge domain? - sdlc

Recently I participated in designing and writing an application for which my team was given complete requirements and had to design and code it ourselves - it was about automating a 3rd party handwriting recognition platform to interoperate with a couple of our systems. Now, a few months later, the customer called with what seemed at first glance to be a minor issue, but after investigating it turns out that the whole application requires a re-design just to fix this inaccuracy (it's easier to re-design than to patch).
I personally don't think the application was particularly badly designed by any of the points mentioned on this thread; rather, there were way too many small unknowns for us, and they seem to have accumulated into a major design flaw - something we basically failed to see. All those small factors seemed insignificant and ignorable at the design stage, so we thought we were doing OK. Now that the problem has occurred it seems silly that we couldn't spot it at design time, but I guess we ignored some 'small' details and nuances which turned out to be significant after all.
So is there any approach to take when you are entering the design stage of an application you are not too familiar with, but whose design (falsely) seems more or less straightforward (create tables, write BOs, write UI, etc.), so that you can increase your chance of foreseeing this type of pitfall in the implementation stage (or at least certainly before customer deployment)?
PS: Sometimes we hire experts to help, like a mathematician one time, or a geography specialist another, but who can help us incorporate a third party platform into ours except us?

I think the approach must be to find the "best practices" in the domain. Each domain has procedures in which things have always been done; practitioners often forget what the rationale for these practices originally was. As a newcomer, it is good to find out what these best practices are, and to follow them - blindly.
That way, you have a good chance to avoid making common mistakes, and if you do run into problems, there is a chance that these problems are typical for the domain, with well-known solutions/work-arounds.
All speaking in the abstract, of course.

Related

Scale now or later?

I am looking to start developing a relatively simple web application that will pull data from various sources and normalize it. A user can also enter the data directly into the site. I anticipate hitting scale if successful. Is it worth putting in the time now to use scalable or distributed technologies, or should I just start with a LAMP stack? Framework or not? Any thoughts, suggestions, or comments would help.
Disregard my vague description of the idea, I'd love to share once I get further along.
Later. I can't remember who said it (might have been SO's Jeff Atwood) but it rings true: your first problem is getting other people to care about your work. Worry about scale when they do.
Definitely go with a well structured framework for your own sanity though. Even if it doesn't end up with thousands of users, you'll want to add features as time goes on. Maintaining an expanding codebase without good structure quickly becomes fairly horrible (been there, done that, lost the client).
btw, if you're tempted to write your own framework, be aware that it is a lot of work. My company has an in-house one we're quite proud of, but it's taken 3-4 years to mature.
Is it worth putting in the time now to use scalable or distributed technologies or just start with a LAMP stack?
A LAMP stack is scalable. Apache provides many, many alternatives.
Framework or not?
Always use the highest-powered framework you can find. Write as little code as possible. Get something in front of people as soon as you can.
Focus on what's important: Get something to work.
If you don't have something that works, scalability doesn't matter, does it?
Then read up on optimization. http://c2.com/cgi/wiki?RulesOfOptimization is very helpful.
Rule 1. Don't.
Rule 2. Don't yet.
Rule 3. Profile before Optimizing.
Until you have a working application, you don't know what -- specific -- thing limits your scalability.
Don't assume. Measure.
That means build something that people actually use. Scale comes later.
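A concrete way to follow Rule 3 is to let a profiler, not intuition, point at the bottleneck. A minimal sketch using Python's standard-library profiler (`slow_join` is a made-up suspect function, not anything from the question):

```python
import cProfile
import io
import pstats

def slow_join(n):
    # Naive string concatenation: a classic scalability suspect.
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_join(10_000)
profiler.disable()

# Print the few most expensive calls instead of guessing at them.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Only once the report names a specific hot spot is there anything worth optimizing.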
Absolutely do it later. Scaling pains are a good problem to have; it means people like your project enough to stress the hardware it's running on.
The last company I worked at started fairly small with PHP and the very very first versions of CakePHP that came out (when it was still in beta). Some of the code was dirty, the admin tool was a mess (code-wise), and sure it could have been done better from the start. But do you know what? They got it out the door before their competitors did, and became extremely successful.
When I came on board they were starting to hit the limits of their current potential scalability, and that is when they decided to start looking at CDNs, lighttpd caching techniques, and other ways to clean up the code and make things run smoother under heavy load. I don't work for them anymore, but it was a good experience in growing an architecture beyond what it was originally scoped at.
I can tell you right now if they had tried to do the scalability and optimizations before selling content and getting a website live - they would never have grown to the size they are now. The company is www.beatport.com if you're interested in who I'm talking about (To re-iterate, I'm not trying to advertise them as I am no longer affiliated with them, but it stands as a good case study and it's easier for people to understand what I'm talking about when they see their website).
Personally, after working with Ruby and Rails (and understanding the separation!) for a couple of years, and having experience with PHP at Beatport - I can confidently say that I never want to work with PHP code again =p
Funny to ask "scale now or later?" and label it "ruby on rails".
Actually, Ruby on Rails was created by David Heinemeier Hansson, who has a whole chapter in his book labeled "Scale later" :))
http://gettingreal.37signals.com/ch04_Scale_Later.php
I agree with the earlier respondents -- make it useful, make it work and get people motivated to use it first. I also agree that you should pick off-the-shelf components (of which there are many) rather than roll your own, as much as possible. At the same time, make sure that you choose components for your infrastructure that you know to be scalable so that you can go there when you need to, without having to re-write major chunks of your application.
As the Product Manager for Berkeley DB, I've seen countless cases of developers who decided "Oh, we'll just write that to a flat file" or "I can write my own simple B-tree function" or "Database XYZ is 'good enough', I don't have to worry about concurrency or scalability until later". The problem with that approach is that a) you're re-inventing the wheel (and forgoing what others have learned the hard way already) and b) you're ignoring the fact that you'll have to deal with scalability at some point, and a 'good enough' solution will only postpone that.
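For illustration, the gap between a flat file and a real embedded database is smaller than it looks. A sketch using Python's built-in sqlite3 module (not Berkeley DB itself, and the table is made up), which costs roughly flat-file effort while handling concurrency, indexing, and crash safety:

```python
import sqlite3

# An in-memory database for the demo; use a filename in real code.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("ada",), ("grace",)])
conn.commit()

# Querying, sorting, and filtering come for free -- no hand-rolled parsing.
rows = conn.execute("SELECT name FROM users ORDER BY name").fetchall()
print(rows)
```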
Good luck in your implementation.

implementation before documentation

I am doing a design and build dissertation for my final year.
Everything is more or less good, except I can't find a software methodology to fit my process.
Basically I did the implementation FIRST and then from that I used tools to reverse engineer class diagrams, ERD, etc...
I can blag that I followed the waterfall method or something, but I would rather try to find an actual software development methodology which does the implementation first.
I do know that this is REALLY bad and probably non-existent; however, it's a small project for personal use only.
any helpful suggestions are greatly appreciated.
If you didn't design it first I don't think it will fit in any DESIGN pattern...
Most of the design patterns are focused on designing first, to save you the trouble of ending up with badly designed software.
You can just say you did the design you reverse engineered and then followed it, assuming it fits some design pattern.
If the project is small enough to excuse doing something you say is "bad," maybe it's small enough to redo in a way that's "right." Just pick a methodology, (re)design your project according to that methodology, then re-code it according to the design. That way you wouldn't have to fudge anything.
There is no real process or software model that endorses coding before anything else. Well, technically there is the "Code-like-hell" approach but that is universally condemned (and not actually even a "real" approach in that it's considered a mistake.)
(See #27: http://www.stevemcconnell.com/rdenum.htm)
Most software models and processes exist to bring engineering principles to projects, and coding before planning is not an engineering principle. Granted, it's not necessarily wrong on an incredibly small project to just go ahead and code (most people would do it that way). At the same time, it does not follow any type of process or model that is well-established or almost universally accepted.
Your best options now are to redo it, say you reverse engineered it, or try to fudge it to fit some type of model, but I can't actually endorse doing the last one as it would probably end up just becoming lying.
Read about XP: extreme programming
http://www.extremeprogramming.org/

What kind of software development process should a lone developer have? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
I work as a lone developer in a very small company. My work is quite chaotic and I'm looking for ways to make it more organized.
One problem is that my projects have practically no management. Rarely does anyone ask me what I'm doing, or if I have any problems. At some point there was talk about weekly status meetings, but that was some time ago. It seems that if I wanted something like that, I would have to arrange it myself. Sometimes I'm a bit lost on what I should do next because I don't have tasks or a clear schedule defined.
From books and articles I have found many things that might be helpful. Like having a good coding standard (there exists only a rough style guide which is somewhat outdated in my opinion), code inspections, TDD, unit testing, bug database... But in a small company it seems there are no resources or time for anything that's not essential. The fact that I work in the embedded domain seems to make things only more complicated.
I feel there's also a custom of cutting corners and doing quick hacks on short notice. This leads to unfinished and unprofessional products and bugs waiting to emerge at a later date. I would imagine they are also a pain to maintain. So, I'm about to inherit a challenging code base, doing new development that requires learning a lot of new things and I guess trying to build a process for it all at the same time. It might be rewarding in the end, but as not too experienced I'm not sure if I can pull it off.
In a small shop like this the environment is far from optimal for programming. There are many other things that need to be done occasionally, like customer support, answering the phone, signing parcels, hardware testing, assembly and whatever miscellaneous tasks might appear. So you get the idea about the resources. It's not all bad (sometimes it's enlightening to solve some customer problems) and I believe it can be improved, but it's the other things that I'm really concerned about.
Is it possible to have a development process in a place like this?
Would it help to have some sort of management? What kind of?
Is it possible to make quality products with small resources?
How do I convince myself and others that the company which has worked successfully for decades needs to change? What would be essential?
Maybe there's someone working in a similar shop?
Use Source Control for EVERYTHING
Develop specifications and get signoff before starting - there will be resistance, but explain it's for their own good.
Unit tests! It hurts because you just want to get it done, but this will save you in the long run.
Use bug tracking - Bugzilla or FogBugz if you can afford it.
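On the unit-testing point above, the cost of starting is very low. A minimal sketch with Python's built-in unittest module (`parse_price` is a hypothetical helper, invented for the example):

```python
import unittest

def parse_price(text):
    # Hypothetical helper: turn a string like "$1,234.50" into a float.
    return float(text.replace("$", "").replace(",", ""))

class TestParsePrice(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(parse_price("42"), 42.0)

    def test_currency_symbol_and_commas(self):
        self.assertEqual(parse_price("$1,234.50"), 1234.5)

if __name__ == "__main__":
    # exit=False keeps the interpreter alive after the run.
    unittest.main(exit=False)
```

Even two or three tests like these catch regressions the moment you touch the helper again, which is exactly the long-run saving the bullet above is talking about.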
My advice is not to be extreme. From my experience, pure agile or pure traditional will not work. Before you use any process, know what problem it is meant to solve.
I personally use a variation of Agile RUP. I do some upfront effort, such as investigating the actual needs and doing high-level design with possible extensions in mind. And I ask the customer to sign off on some major high-level requirements.
If you work in a small group, detailed design or specification may not be worth the effort. Of course, if there are libraries that are shared by many, it will be worth the trouble.
Decide what to invest in up front depending on its risk (likelihood and impact).
Moreover, many software best practices really are 'best', like version control and automated testing (I only use it to detect regressions early, as I do not believe in TDD as the driving force of development). I suggest you read 'The Pragmatic Programmer'; it presents many of these techniques.
To answer your questions:
(1). Is it possible to have a development process in a place like this?
Yes, but as I say, tailor it to fit your organization.
(2). Would it help to have some sort of management? What kind of?
Management helps, but not control freaks. Plan what to do and when: integration, conflict resolution, deadlines for milestones. And roughly keep them on schedule (I particularly like Scrum's sprints).
(3). Is it possible to make quality products with small resources?
Definitely, as long as the size of the work, the time to develop, and the size of the team are balanced. That is, if your definition of quality is the same as mine: to me, quality means solving the problem it set out to solve in an efficient and reliable fashion.
(4). How do I convince myself and others that the company which has worked successfully for decades needs to change? What would be essential?
Point out the problems. If there are none, why change? If you want to change, you should be able to identify the problems, or at least the potential problems.
Some big ones are:
Without any process, it is harder for new recruits to blend in, as they must learn how things are done by observing others.
Without a process, it is harder to work under stress.
Without a schedule, it is hard to determine progress.
Without automated testing, it takes more time to identify problems and regressions.
Without version control, it is harder to roll back mistakes, and dividing the work among team members will be a mess.
Just my thoughts.
You need to work with the owner and set up short-, medium-, and long-term goals. You will want to let them know your progress, even if only through email.
You will need to enforce some order on your workday or you will never get anything done (those long term goals).
Divide your day up into chunks for when you can code, when you are working on hacks to keep things together, when you are answering emails, etc.
Definitely set up a bug tracker. This can help keep your email clean. You can even set up an email address to forward bugs to, to be categorized later. This is good because the bug reporters will eventually tire of the bug tracker and want to just email you the bugs anyway.
edit
And as lod3n said, source control, but you are using that already right???!!?!
Been there, done that.
The book Planning Extreme Programming helped a lot. I used 3x5 cards stuck on a wall. This kept my boss informed of my progress, helped with estimates and planning, and kept me on track. The Planning Game gives good ammo if your boss's expectations are unrealistic.
Unit testing, as others have stated, helps even if you're a sole developer. I find the TDD style valuable.
lod3n is absolutely right about Source Control.
I've gone with XP-style iterations before; you may want to establish a backlog of items (even something simple like a spreadsheet or 3x5 cards) as well.
Avoid deathmarches. Stick to 40 hours in the sense of not working overtime 2 weeks in a row. Spend extra time outside of work learning new skills - not just technologies, but principles and best practices.
Have a bug tracking system, both for defects and new features. Don't rely on your memory.
Continuous integration and an automated build can help even a single developer.
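For a lone developer, "continuous integration" can start as one script that rebuilds and tests everything, wired to a commit hook or a nightly cron job. A minimal sketch in Python (the project layout and the hook are assumptions, not anything from the answer above):

```python
import compileall
import os
import subprocess
import sys
import tempfile

def build_and_test(source_dir):
    """Return True when everything compiles and the whole test suite passes."""
    # Step 1: syntax-check every module by byte-compiling it.
    if not compileall.compile_dir(source_dir, quiet=1):
        return False
    # Step 2: run the full unittest suite; any failure fails the build.
    result = subprocess.run(
        [sys.executable, "-m", "unittest", "discover", "-s", source_dir],
        capture_output=True,
    )
    return result.returncode == 0

# Demo on a throwaway project containing a single passing test.
project = tempfile.mkdtemp()
with open(os.path.join(project, "test_smoke.py"), "w") as f:
    f.write(
        "import unittest\n"
        "class Smoke(unittest.TestCase):\n"
        "    def test_truth(self):\n"
        "        self.assertTrue(True)\n"
    )
ok = build_and_test(project)
print("build", "passed" if ok else "FAILED")
```

The point is less the tooling than the habit: one command, run automatically, that tells you whether the last change broke anything.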
Can't emphasize the recommendation for source control enough.
I've decided I don't need unit tests because I can have automated functional/integration tests instead: with incremental development there's no need to test before integration.
As crazy as this sounds, I use Scrum just because I like the concepts of sprints and backlogs. It makes it easier to set realistic goals. Of course, the Scrum master and the team are all you, but if you are working on outside projects where you may pick up an extra team member, your backlogs will make it easy to distribute work. Maybe I just like backlogs. With Scrum you will need to get someone to be the product manager to talk about the features of the product.
Version control is a must and should probably be implemented before even worrying about a software development process. I have worked in a company that grew from 2 developers to 12 in a year. We started with no version control and low coding standards. The changes you will need to make will be gradual, so don't worry about rushing to do a 180. Set a goal to change one thing a month and find supporters of your changes to make things go smoothly.
As well as the recommendations of others, I'd say that if you are tight on resources but have more say over how things get done, you should make good use of off-the-shelf and open source products and libraries. This leverages the efforts of others, saves you time, ensures your code base doesn't become too esoteric, and adds to your skill set so you don't end up being an expert in something that's useless everywhere else.
First, let's make a distinction between a development process and best practices. Best practices like source control, defect tracking, unit testing, etc. are a given.
Then there is the actual development process. I would always recommend having a process, no matter how small or large the team is. The trick is finding the right process. You have a process now - it is just an ad-hoc process that doesn't seem to be working out too well for you. Rarely can you take a textbook development process and apply it directly. What you need to do is tailor the process to your company's needs and culture. Look at as many development paradigms as you can and try to find something that is a good fit, and then start molding it to your needs. You may have to try and fail with a number of different processes. Perhaps the Personal Software Process will be a good starting process, maybe an agile process, or a variant of RUP? You have a lot of options; start trying them out.
You are also going to have to work with the rest of your organization - they need to be a part of the process. You may be the lone developer, but a development process involves more than the developer; it involves every person that has a say or an impact in the product.
This may not be a specific answer, but my point is that you will need some kind of process. So start researching them and trying them out and molding them to your needs until you have something that works.

YAGNI and junior developers

When writing code for a new system I don't want to introduce unnecessary complexity in the design that I might never have any need for. So I'm following YAGNI here, and rather refactoring as I see the need for more flexibility or as responsibilities become clearer. This allows me to move faster.
But there is a problem here with junior devs, in that they will not recognize when to refactor or where to build out the design. They just stuff more code into the existing design.
So, what are the best ways to tackle this? Should I more often build a more future-proof design so when adding to it they have a good example to follow, even if we might never have to add anything? Or should I just go ahead with more code reviews, education, etc? Or both?
Have any of you had any experience with this type of problem? How did you solve it?
I would recommend code reviews or pair programming. It gives you the chance to educate your fellow developers and increase the overall quality.
Perhaps you begin by recognizing explicitly that part of your job is to help develop the junior devs. If you're not the boss, management should sign off on this. Management needs to recognize that your choices are to develop them now or clean up after them later, and you need management's backing for the time this will take.
Code reviews and pair programming are fine ideas. They are especially good because they are not "just for junior people"–I do both with one of my close colleagues; together we are nearly 100 years old and have more than 70 years of programming experience :-)
But there's a larger problem here: the programming methodology that enables you to be most effective (YAGNI + refactor) is not effective for your junior partners. My experience is that it takes people years to learn the benefits of YAGNI, so if you expect them just to learn your way of doing things, you are setting yourself up for disappointment.
I would encourage you to identify some methodology that you think is going to be useful with your junior partners. The particular methodology probably doesn't matter (heresy!); I've had success with composite/structured design, object-based design, algebraic specification (!), and extreme programming. But
Do pick something that has a name and some literature devoted to it, that your juniors can take pride in learning, and that is a skill they can carry to future projects.
In order to show that it is tasty, you may need to eat the dog food yourself. Pick something you can live with and be productive in.
Observe your juniors carefully and teach them a decision procedure they can use to identify when they should ask you for guidance.
Good luck!
There is a reason they are junior and you are senior.
The ability to realise when a change in design is needed is one of them.
I would carry on as you are but encourage them to come to you when things are getting difficult. You can then work with them to alter the design if needed, this will be easier for you than refactoring it and will help you pass on knowledge to your junior developers.
A very good way to show how far to build out a design is to be specific about what the design will do when built out, then write tests to cover the new functionality. When the tests pass, development is done.
You might realize along the way that you forgot to test for something. That's fine, and is useful feedback to help you specify better next time. Write the missing test(s), and only enough code to make them pass.
Then refactor. Knowing what to look for when refactoring takes a bit of practice. Start with
Is there duplication in the code we've just written that we can eliminate?
Is there duplication between what we've just written and pre-existing code?
Does the code we've just written concern itself with too many things? (I.e., should we break out collaborators?)
Repeat this a few dozen times, and it'll get easier.
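As an illustration of the first duplication question, with made-up report-formatting functions: the tests pin down behavior first, and only then is the duplicated shape extracted.

```python
# Before: two functions written test-first, with an obviously duplicated shape.
def format_header(title):
    return "=" * len(title) + "\n" + title + "\n" + "=" * len(title)

def format_footer(note):
    return "-" * len(note) + "\n" + note + "\n" + "-" * len(note)

# After refactoring: the shared "banner" shape is extracted once.
def banner(text, rule):
    line = rule * len(text)
    return line + "\n" + text + "\n" + line

def format_header_refactored(title):
    return banner(title, "=")

def format_footer_refactored(note):
    return banner(note, "-")

# Behavior is unchanged: the tests written before refactoring still pass.
assert format_header_refactored("Report") == format_header("Report")
assert format_footer_refactored("End") == format_footer("End")
```

Because the tests came first, the refactoring step is safe: any behavior change would fail an assertion immediately.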
Another way of looking at YAGNI is that any changes to code need to be justified.
Might it be helpful to require any commit needs an associated unit test (or BDD user story, choose your poison)? It helps to communicate your intent (you want people to think about why they are adding this feature) and you get regression tests for free.
Also gets the noobs to start thinking about modularity (usually needed to make your code testable) and will help a lot if you do need to refactor later on.
I'm all for code reviews and teaching, but I think futureproof design is also important. Maybe you could think of it in terms of you designing an API and the junior developers using the API. In this way you are the one who does the hard work that they would screw up (identifying duplicated code and eliminating it) while they do all the grunt work that isn't a productive use of your time.
Of course this has to be balanced with a need to develop your junior developers' skills. Neither side of the equation can be neglected.
It may help to map out what work they will do and then verify it, to help build their sense of judgement, which is really what you are asking about, to my mind. Pairing is one option, but if you can't spare that much time, then have a sort of "checkpoint" to see how they are doing and prevent them from going down the wrong path.

Does anyone still believe in the Capability Maturity Model for Software?

Ten years ago when I first encountered the CMM for software I was, I suppose like many, struck by how accurately it seemed to describe the chaotic "level one" state of software development in many businesses, particularly with its reference to reliance on heroes. It also seemed to provide realistic guidance for an organisation to progress up the levels improving their processes.
But while it seemed to provide a good model and realistic guidance for improvement, I never really witnessed an adherence to CMM having a significant positive impact on any organisation I have worked for, or with. I know of one large software consultancy that claims CMM level 5 - the highest level - when I can see first hand that their processes are as chaotic, and the quality of their software products as varied, as other, non-CMM businesses.
So I'm wondering, has anyone seen a real, tangible benefit from adherence to process improvement according to CMM?
And if you have seen improvement, do you think that the improvement was specifically attributable to CMM, or would an alternative approach (such as six-sigma) have been equally or more beneficial?
Does anyone still believe?
As an aside, for those who haven't yet seen it, check out this funny-because-it's-true parody.
At the heart of the matter lies this problem, neatly described by the CMM guidance itself...
“...Sound judgment is necessary to use the CMM correctly and with insight. Intelligence, experience and knowledge must shape an appropriate interpretation of the CMM in a specific environment. That interpretation should be based on the business needs and objectives of the organization and the projects. A rote, checklist-oriented application of the CMM has the potential to harm an organization rather than help it...”
From Page 14, section 1.6 of The Capability Maturity Model, Guidelines for Improving the Software Process by the Carnegie Mellon University Software Engineering Institute, ISBN 0-201-54664-7.
I found it to be a bloated documentation exercise that was used mainly as a contract-acquiring/maintaining vehicle. Once we had the contract, it was an exercise in getting around the process.
As a developer, I got nothing out of it but lost MONTHS of my professional life fiddle-farting around with CMMI.
The same goes for 6 Sigma, which I branded "Common Sense in a Box". I didn't need to be trained how to figure out what the problem was to a process - it was generally quite evident.
For me, small teams and agile mechanisms work much better. Short cycles, lots of communication. That might not work in all environments, but it definitely works in mine.
Just my two cents.
For a typical CMM level 1 programming shop, making the effort to get to level 2 is worthwhile; this means that you need to think about your processes and write them down. Naturally, this will meet resistance from cowboy programmers who feel limited by standards, documentation, and test cases.
The effort to get from level 2 ("there is a process") to level 3 ("everyone has the same process") normally gets bogged down in inter-departmental warfare, so it's probably not worth starting.
If you see CMM run. And run fast.
CMM and CMMI both offer some benefits if your organization takes the lessons they try to teach to heart. The problem is that getting to the higher levels is very difficult and expensive, and the only time I have seen an organization go through the effort is because their customers won't let them bid on contracts until they are at a certain level.
This has the effect of the organization doing everything it can to "just get the number" without actually caring about improving its process.
The higher end? No. CMM-5 shops do not impress me.
The lower end? Yes. CMM-1 organizations scare me.
CMM can help a new/novice team measure themselves and do the self improvement thing.
CMMI isn't really about improving your software, it is about documenting what you have done. You can almost estimate a company's CMMI level by the weight of the documentation it produces.
Background: I have studied CMMI in my Software Engineering graduate program and have worked on a team that followed its guidelines.
My experience is that the CMM is so vague that it's very easy to fulfill. Also, when they come to certify you, they look at the project that your organization chooses. Where I used to work, this was the project with no real deadline, plenty of money, and lots of time to spend on every nook and cranny of process. Many of the other projects continued with little to no code/design review, sometimes without version control.
I think the emphasis on CMM certification is unfortunate. Companies know how to work the system, and do. Instead of focusing on real process improvement that meets their bottom line, they focus on getting a certification and working the system. I honestly think most organizations would rather spend time on the former instead of wasting so much time on the latter.
Really what matters is having conscientious people who want to make good development decisions and know that they will need help making those decisions. There is no substitute for quality programmers who know that programming is an ongoing group activity where they are just as likely to make a mistake as anyone else.
I have been doing a lot of interviewing for small teams doing iterative development. Personally, if I see CMM on a resume it is a big red flag that signals interest in process over results.
All formal methods exist to sell books/training courses/certification, and for no other reason. That's why there are so many formal methods. Once you realise this, you are free :-)
Yourdon still believes. But he might also still believe the world is going to end with Y2K.
This is not something I would personally put a lot of faith in or want to be yoked with in the future. But often ours is not to reason why...
P.S. Though a bit off-topic, I would like to mention that faked CMMI certifications are very common as well as real certifications obtained through bribery.
CMM doesn't really speak to the quality of the software, but more towards the documentation and repeatability of the process. In other words, it is possible to have an orderly and repeatable development process, but still create crappy software. As long as the process is properly documented, it is possible to achieve CMM Level 5.
At the end of the day CMM is another tool that can be used or misused. If the end goal is to improve software quality, it is possible to use CMM to improve the development process and improve software quality. If achieving a certain CMM level is the goal, then most likely software quality will suffer.
The model is losing its credibility, first because companies adopt the model not because they are looking for a more mature software development process, but in order to be appraised at a CMMI level.
And the other problem, the one that I think leads to the lost credibility, is that as a contractor you have no guarantee that the project your CMMI-appraised supplier is selling you will be developed using the model's practices. The CMMI label only states that the company has, at some point, developed projects that were evaluated as adherent to a specific CMMI maturity level.
The problem is not just on CMMi but on the process developed by the companies. The CMMi does not describe the process itself, but just what the process should do. You have the same problem with PMBOK. Actually the problem is not just the PMBOK, but primarily the problem is the bad project managers that claim to follow the PMI statements.
At school, I was taught: CMM is a good idea, but lacking certification (anyone can claim to be Level 4 or Level 5), it ends up being a marketing tool for offshore shops. So, yeah, the idea is sound, but how do you prove adherence?
I used to. But now I find that CMM and CMMI don't really fit that well with agile approaches.
Oh sure you can squeeze things to get that square peg into the round hole, but when push comes to shove, you are still basing your approach on an ability to predict everything that is needed, and anticipating everything that will be encountered, when building a software system.
And we all know, how well that approach works in real life! (-:
cheers,
Rob
Agile is the next CMM and both are fragile. The field of process and quality consulting is a good business in any industry and like the engineering folks everyone needs new buzzwords to keep the money flowing.
CMM, when it first came out of the SEI, was a good concept based on solid academic work, but it was soon picked up by the process consultants and is now a worthless certification, used by most CIOs to cover their asses (nobody ever got fired for picking a CMM Level 5 company).
Agile is going to go down that route soon, and then we can be sure to see the next silver bullet on the horizon :)
When I worked on commercial flight software, we used CMM, and as our processes improved, our ability to accurately predict completion times improved as well. But it was a cumbersome process; other approaches should work just as well.
Smaller projects are less dependent on process for success. The key metric is the Hero-to-Bystander Ratio: any project with an HTBR of less than 0.2 is in serious trouble.
There are quite a few good ideas that can readily be adapted and adopted by any organisation for its own good, but getting the badge is a pain because of all the redundant documentation it requires.
The problem is that CMMI is not a process but just a guide for whatever process you might choose to have, and that in itself invites half-baked ideas to float around.
Another point is that migration is a real pain when you are starting, but it's the same as any other teething trouble, I guess.
The main issue with understanding the value of CMMi is understanding CMMi itself.
CMMi is a documented approach to Continuous Improvement for Software Production.
Understanding Continuous Improvement with SPC is difficult enough in manufacturing; add the intangible software product, and the difficulty grows exponentially.
I would recommend that anyone, or any organization, new to CMMI first document their current process, then look at what outcomes (cost/benefit) can be measured independently of the process. That way, if any process, procedure, or standard were changed, you could tell whether it yielded a 'better' result. The prerequisite for this exercise is a documented, stable, repeatable process, since it is impossible to measure the benefit of any change within an ad-hoc environment: you are not comparing like for like.
By focusing on the above concepts initially, the organization will begin to understand and embrace the essential value of the CMMi.
Legend has it that the US Department of Defense, which did a lot of contracting, found that many of its projects faced time and cost overruns, and even when they were delivered, the projects were not exactly what was ordered.
So they wanted a way to be sure that a contractor would be able to deliver on time, within budget, and close to what was required. Thus the Capability Maturity Model was born.
The thesis is that if things are written down, then they survive attrition. But saying that write down everything would not be enough, it must be checked that they are written down correctly. Among other things.
Throughout all this, the cost of doing it never really mattered, because from the DoD's point of view: if it paid $1 million for a project expecting delivery in a year, and ended up paying $10 million over 10 years without getting what it wanted, then paying $5 million for the same thing and getting what it actually wanted in two years still saves $5 million, not to mention that it actually gets something.
So if you are a contractor to the US DoD or something like that, go ahead and get CMM, because it will be a requirement. But if you are competing with the thousands of software development shops on Elance to get projects with limited budgets, limited time, and so on, CMM is not a good choice.
That said, feel free to read the CMMI Dev PDF (v1.3 at the time of writing). It makes a lot of good points and deconstructs the organisation very nicely. If you see any point that makes you go 'aha! I have this problem', then by all means use that wisdom to resolve it. In our case, one small change we made was to keep a list of everyone who was allowed to give us requirements. If there was more than one such person, any requirement coming from one source was circulated to the others, and they all had to say 'okay' before we added it to the backlog. This small change made a big difference in how much we worked and reworked.
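The sign-off rule just described can be sketched as a tiny gate in code. This is only an illustration of the idea, not any real tool; the `RequirementGate` class and its names are invented here:

```python
# Sketch of the rule above: a requirement enters the backlog only after
# every authorized requirement source has signed off on it.
# All names here are hypothetical, for illustration only.

class RequirementGate:
    def __init__(self, authorized_sources):
        # Only these people may submit or approve requirements.
        self.authorized = set(authorized_sources)
        # Maps each requirement to the set of sources who have okayed it.
        self.approvals = {}

    def submit(self, requirement, source):
        if source not in self.authorized:
            raise PermissionError(f"{source} may not submit requirements")
        # The submitter implicitly approves their own requirement.
        self.approvals[requirement] = {source}

    def approve(self, requirement, source):
        if source not in self.authorized:
            raise PermissionError(f"{source} may not approve requirements")
        self.approvals[requirement].add(source)

    def ready_for_backlog(self, requirement):
        # Ready only once the approvers are exactly the authorized set.
        return self.approvals.get(requirement, set()) == self.authorized


gate = RequirementGate(["alice", "bob"])
gate.submit("export to PDF", "alice")
print(gate.ready_for_backlog("export to PDF"))  # False: bob has not okayed it
gate.approve("export to PDF", "bob")
print(gate.ready_for_backlog("export to PDF"))  # True: all sources agree
```

The point of the circulation step is exactly what the gate enforces: no single requirement source can unilaterally grow the backlog, which is what cut down the rework.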
In short, look at the process areas, compare them to your pain areas, and take the suggestions CMM gives. How you implement them is up to you, and you can always implement them in a way that does not take too much time or cost too much money. I guess the same applies even to the relevant ISO/IEC standards.
