I work as a lone developer in a very small company. My work is quite chaotic and I'm looking for ways to make it more organized.
One problem is that my projects have practically no management. Rarely does anyone ask me what I'm doing, or whether I have any problems. At some point there was talk about weekly status meetings, but that was some time ago; it seems that if I want something like that, I'll have to arrange it myself. Sometimes I'm a bit lost about what I should do next because I don't have tasks or a clear schedule defined.
From books and articles I have found many things that might help: a good coding standard (only a rough, somewhat outdated style guide exists), code inspections, TDD, unit testing, a bug database... But in a small company it seems there are no resources or time for anything that isn't essential. Working in the embedded domain only seems to complicate things further.
I feel there's also a culture of cutting corners and doing quick hacks on short notice. This leads to unfinished and unprofessional products and bugs waiting to emerge at a later date. I would imagine they are also a pain to maintain. So, I'm about to inherit a challenging code base, do new development that requires learning a lot of new things, and, I guess, try to build a process for it all at the same time. It might be rewarding in the end, but as I'm not very experienced, I'm not sure I can pull it off.
In a small shop like this the environment is far from optimal for programming. There are many other things that occasionally need to be done, like customer support, answering the phone, signing for parcels, hardware testing, assembly, and whatever miscellaneous tasks might appear. So you get the idea about the resources. It's not all bad (sometimes it's enlightening to solve customer problems) and I believe it can be improved, but it's the other things that really concern me.
Is it possible to have a development process in a place like this?
Would it help to have some sort of management? What kind of?
Is it possible to make quality products with small resources?
How do I convince myself and others that the company which has worked successfully for decades needs to change? What would be essential?
Maybe there's someone working in a similar shop?
Use Source Control for EVERYTHING
Develop specifications and get signoff before starting - there will be resistance, but explain it's for their own good.
Unit tests! It hurts because you just want to get it done, but this will save you in the long run (a minimal sketch follows this list).
Use bug tracking - Bugzilla or FogBugz if you can afford it.
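To make the unit-test point concrete, here is a minimal sketch of what a first test file might look like, assuming a Python code base and pytest; parse_reading is an invented stand-in for whatever small function you start with:

    # test_parser.py - a hypothetical first unit test, run with `pytest`
    import pytest

    def parse_reading(raw: str) -> int:
        """Convert a hex string read from the device into an integer value."""
        try:
            return int(raw, 16)
        except ValueError:
            raise ValueError(f"not a valid reading: {raw!r}")

    def test_parse_reading_converts_raw_value():
        # A known raw input should produce the expected value.
        assert parse_reading("0x1A") == 26

    def test_parse_reading_rejects_garbage():
        # Invalid input should fail loudly instead of silently returning junk.
        with pytest.raises(ValueError):
            parse_reading("not-a-number")

Even two or three tests like this around the scariest part of the inherited code base pay for themselves the first time you refactor it.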
My advice is not to be extreme. From my experience, pure agile or pure traditional will not work. Before you adopt any process, know what problem it is meant to solve.
I personally use a variation of agile RUP. I do some up-front work, such as investigating the actual needs and doing a high-level design with possible extensions in mind, and I ask the customer to sign off on the major high-level requirements.
If you work in a small group, detailed design or specification may not be worth the effort. Of course, if there are libraries shared by many people, it will be worth the trouble.
Decide what to invest in up front depending on its risk (likelihood and impact).
Moreover, many software best practices really are "best", like version control and automated testing (I only use tests as a way to detect regressions early, as I don't believe in TDD as the driving force of development). I suggest you read "The Pragmatic Programmer"; it presents many of these techniques.
To answer your questions:
(1). Is it possible to have a development process in a place like this?
Yes, but as I say, tailor it to fit your organization.
(2). Would it help to have some sort of management? What kind of?
Management helps, but not a control freak. Plan what to do and when: integration, conflict resolution, deadlines for milestones. And roughly keep things on schedule (I particularly like Scrum's sprints).
(3). Is it possible to make quality products with small resources?
Definitely, as long as the size of the work, the time to develop it, and the size of the team are balanced; that is, if your definition of quality is the same as mine. To me, quality means: the product solves the problem it set out to solve in an efficient and reliable fashion.
(4). How do I convince myself and others that the company which has worked successfully for decades needs to change? What would be essential?
Point out the problems. If there are none, why change? If you want change, you should be able to identify the actual or potential problems.
Some big ones are:
Without any process, it is harder for new recruits to blend in, as they must learn how to deal with things by observing others.
Without a process, it is harder to work under stress.
Without a schedule, it is hard to determine progress.
Without automated testing, it takes more time to identify problems and regressions.
Without version control, it is harder to roll back mistakes, and dividing the work among team members becomes a mess.
Just my thoughts.
You need to work with the owner and set up short-, medium-, and long-term goals. You will want to let them know your progress, even if only through email.
You will need to enforce some order on your workday or you will never get anything done (those long-term goals).
Divide your day into chunks: when you can code, when you work on hacks to keep things together, when you answer emails, etc.
Definitely set up a bug tracker. This can help keep your email clean. You can even set up an email address to forward bugs to, to be categorized later. This is good because bug reporters will eventually tire of the bug tracker and want to just email you the bugs anyway.
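As a rough sketch of that email-to-tracker idea (not any particular product's feature), the following assumes a plain IMAP mailbox and a local SQLite table; the host, account, and table names are all placeholders:

    # bug_mail_import.py - pull mailed-in bug reports into SQLite for later triage.
    import email
    import imaplib
    import sqlite3

    def import_bug_mail(host: str, user: str, password: str, db_path: str = "bugs.db") -> int:
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS bugs (sender TEXT, subject TEXT, body TEXT)")

        imap = imaplib.IMAP4_SSL(host)
        imap.login(user, password)
        imap.select("INBOX")

        _, data = imap.search(None, "UNSEEN")   # only messages not yet triaged
        imported = 0
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            body = (msg.get_payload(decode=True) or b"") if not msg.is_multipart() else b""
            conn.execute(
                "INSERT INTO bugs (sender, subject, body) VALUES (?, ?, ?)",
                (msg["From"], msg["Subject"], body.decode(errors="replace")),
            )
            imported += 1

        conn.commit()
        imap.logout()
        return imported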
Edit:
And as lod3n said, source control - but you are using that already, right?!
Been there, done that.
The book Planning Extreme Programming helped a lot. I used 3x5 cards stuck on a wall. This kept my boss informed of my progress, helped with estimates and planning, and kept me on track. The Planning Game gives good ammo if your boss's expectations are unrealistic.
Unit testing, as others have stated, helps even if you're a sole developer. I find the TDD style valuable.
lod3n is absolutely right about Source Control.
I've gone with XP-style iterations before; you may want to establish a backlog of items (even something simple like a spreadsheet or 3x5 cards) as well.
Avoid deathmarches. Stick to 40 hours in the sense of not working overtime 2 weeks in a row. Spend extra time outside of work learning new skills - not just technologies, but principles and best practices.
Have a bug tracking system, both for defects and new features. Don't rely on your memory.
Continuous integration and an automated build can help even a single developer.
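On the continuous-integration point, here is a minimal sketch of a one-developer "CI" script that could run from cron or a post-commit hook; the build and test commands are placeholders for whatever the project actually uses:

    # nightly_build.py - run the build and tests, stop loudly on the first failure.
    import subprocess
    import sys

    STEPS = [
        ["git", "pull", "--ff-only"],       # get the latest code
        ["make", "all"],                    # placeholder build command
        ["python", "-m", "pytest", "-q"],   # placeholder test command
    ]

    def main() -> int:
        for step in STEPS:
            print("running:", " ".join(step))
            result = subprocess.run(step)
            if result.returncode != 0:
                print("FAILED at:", " ".join(step), file=sys.stderr)
                return result.returncode    # stop on the first failure
        print("build and tests OK")
        return 0

    if __name__ == "__main__":
        sys.exit(main())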
Can't emphasize the recommendation for source control enough.
I've decided I don't need unit tests, because I can have automated functional/integration tests instead: with incremental development, there's no need to test units separately before integrating them.
As crazy as this sounds, I use Scrum just because I like the concepts of sprints and backlogs. It makes it easier to set realistic goals. Of course, the Scrum master and the team are all you, but if you are working on outside projects where you may pick up an extra team member, your backlogs will make it easy to distribute work. Maybe I just like backlogs. With Scrum you will need to get someone to act as product manager and talk about the features of the product. Version control is a must and should probably be put in place before even worrying about a software development process. I have worked at a company that went from 2 developers to 12 in a year. We started with no version control and low coding standards. The changes you need to make will be gradual, so don't worry about rushing to do a 180. Set a goal to change one thing a month and find supporters of your changes to make things go smoothly.
As well as the recommendations of others, I'd say that if you are tight on resources but have more say over how things get done, you should make good use of off-the-shelf and open-source products and libraries. This leverages the efforts of others, saving you time, ensures your code base doesn't become too esoteric, and adds to your skill set so you don't end up being an expert in something that's useless everywhere else.
First, let's make a distinction between a development process and best practices. Best practices like source control, defect tracking, unit testing, etc. are a given.
Then there is the actual development process. I would always recommend having a process, no matter how small or large the team is. The trick is finding the right process. You have a process now - it is just an ad-hoc process that doesn't seem to be working out too well for you. Rarely can you take a textbook development process and apply it directly. What you need to do is tailor the process to your company's needs and culture. Look at as many development paradigms as you can and try to find something that is a good fit, then start molding it to your needs. You may have to try and fail with a number of different processes. Perhaps the Personal Software Process will be a good starting point, maybe an agile process, a variant of RUP? You have a lot of options; start trying them out.
You are also going to have to work with the rest of your organization - they need to be a part of the process. You may be the lone developer, but a development process involves more than the developer; it involves every person that has a say in or an impact on the product.
This may not be a specific answer, but my point is that you will need some kind of process. So start researching them and trying them out and molding them to your needs until you have something that works.
I am looking to start developing a relatively simple web application that will pull data from various sources and normalize it. A user can also enter the data directly into the site. I anticipate hitting scale, if successful. Is it worth putting in the time now to use scalable or distributed technologies, or just start with a LAMP stack? Framework or not? Any thoughts, suggestions, or comments would help.
Disregard my vague description of the idea; I'd love to share more once I get further along.
Later. I can't remember who said it (might have been SO's Jeff Atwood) but it rings true: your first problem is getting other people to care about your work. Worry about scale when they do.
Definitely go with a well structured framework for your own sanity though. Even if it doesn't end up with thousands of users, you'll want to add features as time goes on. Maintaining an expanding codebase without good structure quickly becomes fairly horrible (been there, done that, lost the client).
btw, if you're tempted to write your own framework, be aware that it is a lot of work. My company has an in-house one we're quite proud of, but it's taken 3-4 years to mature.
Is it worth putting in the time now to use scalable or distributed technologies or just start with a LAMP stack?
A LAMP stack is scalable. Apache provides many, many alternatives.
Framework or not?
Always use the highest-powered framework you can find. Write as little code as possible. Get something in front of people as soon as you can.
Focus on what's important: Get something to work.
If you don't have something that works, scalability doesn't matter, does it?
Then read up on optimization. http://c2.com/cgi/wiki?RulesOfOptimization is very helpful.
Rule 1. Don't.
Rule 2. Don't yet.
Rule 3. Profile before Optimizing.
Until you have a working application, you don't know what -- specific -- thing limits your scalability.
Don't assume. Measure.
That means build something that people actually use. Scale comes later.
Absolutely do it later. Scaling pains are a good problem to have; it means people like your project enough to stress the hardware it's running on.
The last company I worked at started fairly small with PHP and the very very first versions of CakePHP that came out (when it was still in beta). Some of the code was dirty, the admin tool was a mess (code-wise), and sure it could have been done better from the start. But do you know what? They got it out the door before their competitors did, and became extremely successful.
When I came on board they were starting to hit the limits of their current potential scalability, and that is when they decided to start looking at CDN's, lighttpd caching techniques, and other ways to clean up the code and make things run smoother when under heavy load. I don't work for them anymore but it was a good experience in growing an architecture beyond what it was originally scoped at.
I can tell you right now if they had tried to do the scalability and optimizations before selling content and getting a website live - they would never have grown to the size they are now. The company is www.beatport.com if you're interested in who I'm talking about (To re-iterate, I'm not trying to advertise them as I am no longer affiliated with them, but it stands as a good case study and it's easier for people to understand what I'm talking about when they see their website).
Personally, after working with Ruby and Rails (and understanding the separation!) for a couple of years, and having experience with PHP at Beatport - I can confidently say that I never want to work with PHP code again =p
Funny to ask "scale now or later?" and label it "ruby on rails".
Actually, Ruby on Rails was created by David Heinemeier Hansson, who has a whole chapter in his book labeled "Scale later" :))
http://gettingreal.37signals.com/ch04_Scale_Later.php
I agree with the earlier respondents -- make it useful, make it work and get people motivated to use it first. I also agree that you should pick off-the shelf components (of which there are many) rather than roll your own, as much as possible. At the same time, make sure that you choose components for your infrastructure that you know to be scalable so that you can go there when you need to, without having to re-write major chunks of your application.
As the Product Manager for Berkeley DB, I've seen countless cases of developers who decided "Oh, we'll just write that to a flat file" or "I can write my own simple B-tree function" or "Database XYZ is 'good enough', I don't have to worry about concurrency or scalability until later". The problem with that approach is that a) you're re-inventing the wheel (and forgoing what others have already learned the hard way) and b) by going with a 'good enough' solution you're ignoring the fact that you'll have to deal with scalability at some point anyway.
Good luck in your implementation.
Lately I have seen our development team getting dangerously close to the 'second system syndrome' type ideas while we are planning the next version of our product. While I'm all for making improvements and undoing some of the mistakes of our past, I would hate to see us stuck in an endless loop of rewriting and never launching anything.
Is there a good design / development method that lends itself to building a better version 2.0 while avoiding second system scenarios?
I have experienced second-system syndrome from both sides, as a customer/sponsor and as part of a development team.
A root cause of problems is when the team latches on to a utopian vision of version 2, such as the desire to make the new software "flexible". In this scenario everything is abstracted to the nth degree. For example, instead of data objects that represent real-world entities, a generic "item" object is created that can represent anything. One flawed idea is that we can build so much flexibility into the software, anticipating future needs, that non-programmers will be able to just configure new capabilities. Often one goal such as "flexibility" overshadows the effort to the point that the resulting software doesn't work.
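As a contrived illustration of that kind of over-abstraction (the class names are invented for this example):

    # The "flexible" version can represent anything, so nothing about it is
    # checked, named, or understood by the code that uses it.
    from dataclasses import dataclass

    @dataclass
    class Invoice:                      # concrete: fields are explicit and typed
        number: str
        customer: str
        total_cents: int

    class Item:                         # "flexible": just a bag of attributes
        def __init__(self, kind: str, **attributes):
            self.kind = kind
            self.attributes = attributes

    # invoice = Item("invoice", number="2024-001", customer="ACME", total="oops")
    # Nothing stops 'total' from being a string, and every caller must guess
    # which keys exist; the flexibility is paid for everywhere else.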
A balanced practical consideration of usability, performance, scalability, features, maintainability, and flexibility goals can bring the team back to earth. "It would be great if..." should be prohibited from the vocabulary of the team. The Scrum backlog is a good tool and the team should be heard saying often... "Let's backlog that...we don't need that for version 2."
"I would hate to see us stuck in an endless loop of rewriting and never launching anything."
Which is why people use Scrum.
Define a backlog of things to build.
Prioritize, so that things which lead to a release are first. Things which should be fixed are second.
Execute sprints to get to the release. Execute a release sprint.
Get someone who has written at least three systems to lead the project.
Try to focus on short delivery cycles, i.e. force yourself to deliver something tangible and useful to the users every few weeks or month. Prioritise the tasks based on their value to the customer. This way you always have a usable, functional application with satisfied users, while under the hood you can refactor the architecture in small steps if you wish (and if you can justify the need for it - that is, towards management / the customers, not just teammates!).
It helps a lot if you have a stable build process with a good suite of automatic (unit / integration) tests.
Agile development methods like Scrum do this, and they are warmly recommended. But of course it is not always easy or even possible to adopt such a method in your team. Even if you can't, you can still take elements of it and apply them to your project's benefit (maybe without even mentioning the words "agile" or "Scrum" publicly ;-)
Make sure you document the requirements as well as possible. While obviously you need to also manage what gets into the requirements to avoid over-designing, having a fixed scope helps prevent developers from running off with ideas or gold-plating what needs to be done and it helps keep management or clients from introducing scope creep. Define all requirements and how scope changes will be addressed.
I'm all for short development cycles (make sure you're writing tests) and agile methodology, but neither of those is a shield against second syndrome system. In some ways it's easier to keep adding on feature after feature if you're working in short sprints without stopping to look at the overall picture. Use agile practices to build the simplest thing that works, then keep adding your other requirements as simply as possible. Remember YAGNI, because when you build a system a second time, that's when you're most likely to build something you're sure you'll need at some point, something that will make the system "extensible" so there never has to be a third build. It's the best of intentions, but the road to hell and all that.
You can't get close to second system syndrome. Either you're in it, or you're away from it. You'll know when you're in it, but only after wasting a lot of resources.
Tips are: know about it (so basically we got that covered already). It's invaluable information to know what a "second system" is, and even more to have some experience with that. I think we all have some experience with that, hopefully benign.
That knowledge will often make you think twice and you'll find a solution without venturing into second-system limbo.
PS: Also know how to use your current system, that includes, maybe documented solutions, and other documentation.
Focusing on the system architecture should help, e.g.:
Having documented interfaces which support "loose coupling" between sub-systems (see the sketch below)
Having documented design decisions (avoid re-hashing previously beaten paths)
Hence, without going for an all out swap, the current system can be "upgraded" with more appropriate interfaces to help future growth.
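As a small, hedged sketch of such a documented interface (the names are invented; any language with interfaces or abstract classes works the same way):

    # Callers depend only on this contract, so the storage back-end can be
    # replaced without an all-out rewrite of the rest of the system.
    from abc import ABC, abstractmethod

    class MeasurementStore(ABC):
        """Contract between the acquisition code and whatever stores readings."""

        @abstractmethod
        def save(self, sensor_id: str, value: float) -> None:
            """Persist one reading; must not block the acquisition loop."""

        @abstractmethod
        def latest(self, sensor_id: str) -> float:
            """Return the most recent reading for the given sensor."""

    class InMemoryStore(MeasurementStore):
        """Simple implementation used for tests and early versions."""
        def __init__(self) -> None:
            self._data: dict[str, float] = {}

        def save(self, sensor_id: str, value: float) -> None:
            self._data[sensor_id] = value

        def latest(self, sensor_id: str) -> float:
            return self._data[sensor_id]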
Another good way to focus: assign a $$$ figure to each feature and prioritize accordingly ;-)
Anyhow, just my 2cents
I up-voted S. Lott's answer and would like to add some more suggestions:
Deliver a working product to your customer as frequently as possible. Iterations lasting between a few weeks and 2 months are ideal. Agile methodologies, such as Scrum, lend themselves well to this.
Use FogBugz for feature and bug tracking. Its features are very practical for agile projects. FogBugz allows easy prioritization according to releases and priorities. If your team enters their estimated levels of effort for each task, you can also use this to calculate reasonable ship dates.
Prioritize which features you will develop according to the 80/20 rule (20 percent of the features will be used 80 percent of the time) and the most bang for the buck. This helps keep the system as simple as possible, helps prevent feature bloat, and saves development and testing time.
Give similar thought to both new features and bugs when you determine the priority of an issue. One point of the Joel Test is "Do you fix bugs before writing new code?". In most shops this doesn't happen, but do not make debugging the system an afterthought. Also, keep a working copy of the old system to compare against when bugs are found on the new system.
Also factor in the level of effort required to maintain, and if necessary rewrite, existing code. If you have not already done this, give the team some time to code review whole files that they find troublesome to change. If the system's code was difficult to maintain the first time, this will only get worse and cause your team to spend more time on new features down the road.
It can never be avoided in its entirety, but being cautious can alleviate the problem.
You should formulate a rule of thumb based on the vital parameters (the scarcest resources) that define the success of the system. For example, reducing the potential number of bugs might directly decrease operational cost (potential service calls to the support center). But this might not be the case in every system. Another example: sparing use of CPU, memory, and other resources might be beneficial in some cases, but there could be environments where they are available in abundance.
So, simply to avoid "temptations", identify the scarcest resources (time, effort, money, etc.) and consider implementing only those features whose value exceeds a threshold: f(s1, s2, ...) > threshold.
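A toy illustration of that rule of thumb, with made-up weights, scores, and threshold:

    # Score each candidate feature against the scarcest resources and keep
    # only those above a threshold. All numbers here are invented.
    def value_score(benefit: float, time_cost: float, money_cost: float) -> float:
        # Higher benefit is good; cost in scarce resources counts against it.
        return benefit / (1.0 + time_cost + money_cost)

    candidates = {
        "report users asked for": value_score(benefit=8, time_cost=2, money_cost=1),
        "generic rules engine":   value_score(benefit=5, time_cost=7, money_cost=4),
    }
    THRESHOLD = 1.0
    worth_doing = [name for name, score in candidates.items() if score > THRESHOLD]
    print(worth_doing)   # ['report users asked for'] with these made-up numbers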
Alongside the iterative development, I would also emphasize how technical debt is handled.
Links that are related to this:
Tech Debts: Effort Vs Time
Tech Debt Quadrant
What would be the disadvantages (if any) of automating a business process for an enterprise/organization?
Losing discretionary error checking, i.e. someone noticing numbers that look out of line;
Potentially, knowledge of how a process is operated could be lost if it is automated but not documented. More often than not, manual processes are passed on;
Accountability for the process becomes muddled.
You lose some flexibility on unplanned situations. On a manual process, you can often "work around" the process when there is something you did not expect, but with automated processes, you cannot.
The money you need to spend to automate it.
Would depend upon the process, but the classic situation where automation fails is when the automated system decides badly or masks a problem.
Another might be where it becomes a maintenance issue. Humans adapt pretty well to fluctuations in the process flow; automated systems typically don't.
Again, it's a fairly vague question, so it is hard for me to talk specifics. Do you have a particular process you want to automate?
It's always difficult to find a right tool for it.
You may end up ordering a custom tool and it's risky. However an off-the-shelf tool might be as difficult to implement.
It's very difficult to capture the business process right. Sometimes it's quite convoluted - especially in a big and old organisation. You may end up having 80% of cases automated and the other 20% impossible to automate at all.
It's an investment for which a ROI should be calculated carefully. Sometimes you are better off just not doing anything at all.
Having said that, I was involved in projects where automating the business process allowed one person to do the work of a small department, and that department was able to spend its time on significantly increasing the company's revenue.
Automation is a set of tools and procedures to make things work more effectively, but there might be problems:
If user support is automated (and the replies are irrelevant), customers might start to feel that you don't care about them. Some people want personal support instead of automated replies.
If there are problems not covered by the system's feature set, and data goes missing because of it.
If there isn't anyone to check frequently that the system is still working in the right way.
If the automated system hasn't been tested enough and errors cause big losses (financial or reputational).
If something else goes wrong (and there isn't anyone to fix it).
Make sure that you have enough people to maintain and support your system, so that if automation fails, your business doesn't fail with it. There are many examples of failed business-process automation, so plan your projects well before attempting anything. Think of possible (and seemingly impossible) ways automation might fail, and build in enough error checking to make sure that things still go the right way even when errors occur (in input data, in processing, or at some other level of the system).
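As a hedged sketch of that kind of error checking, here is a hypothetical nightly job that imports order records from a CSV export; the file fields and alert address are invented examples:

    # Validate input instead of trusting it, log what was skipped, and tell a
    # human when things look wrong; automation should fail loudly, not silently.
    import csv
    import logging
    import smtplib
    from email.message import EmailMessage

    logging.basicConfig(level=logging.INFO)

    def alert(problem: str) -> None:
        msg = EmailMessage()
        msg["Subject"] = "Order import problem"
        msg["From"] = "automation@example.com"
        msg["To"] = "ops@example.com"
        msg.set_content(problem)
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)

    def import_orders(path: str) -> None:
        good, bad = 0, 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # Check the fields we depend on before processing the row.
                if not row.get("order_id") or not row.get("amount", "").isdigit():
                    logging.warning("skipping bad row: %r", row)
                    bad += 1
                    continue
                # ... hand the validated row to the real processing step ...
                good += 1
        if bad and bad > good:   # more bad rows than good: something is wrong upstream
            alert(f"Order import: {bad} bad rows vs {good} good rows in {path}")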
Sometimes it is better to do things manually, while in other situations automation is the way to go.
Automating a business process just for the sake of automation is a fool's errand, and is likely to cost your business A LOT (in every way you can imagine - financial impact, business disruption, new technical issues, morale, etc.).
Without care, automating a business process can:
cost lots of money (without necessarily the possibility of making it back somehow)
take a long time to implement (use of resources that would be better used elsewhere)
make the process more brittle (loss of flexibility / adaptability)
uncover issues with the current process that are minor and blow them out of proportion
create errors if something in the process is overlooked
demoralize your staff
Based on my experience, assuming the business process has been designed well and implemented correctly, here are the disadvantages of BPM-automated business processes:
Changes to the business process will require changes to the BPM solution.
Making changes to the business process is harder because you have historical data that you need to migrate, or it forces you to use a sub-optimal workaround instead.
The business process can be affected by the maintenance of the BPM platform.
The business process can be lost in the system due to BPM engine error requiring developer support.
The costs?
Loss of staff - automation can work so well that you will need fewer workers.
Loss of poor-quality output - automation can produce very predictable results.
Lack of 9-5 work - automation software can run anytime you set it to. I have built routines that run every three hours and have been running for years.
Loss of unhappy customers - quality and benefits go up when automation is done correctly. I created an automated email routine that told our employees exactly how much money we had placed in their bank accounts on pay day. This was easy to do since I had just automated the creation of the text file we uploaded to our bank to pay them, so I already had the needed numbers.
You get the idea. Automation done right costs a lot up front... but pays for itself in savings and increased benefit over a short amount of time!
In pair programming, the experience of every member of the team can be spread to new members. This experience is always in sync with the code, because the "senior" of the pair knows how the code works and what the design is.
So what is the utility of design documentation in this case?
UPDATE
I don't imply no design, I imply no documentation.
With a team that practices pair programming, I think everybody is replaceable, because everybody knows the code. If the senior developer leaves, there is always at least one person left who knows the code, because the experience was shared before.
What if your team is larger than 2 persons?
Just because two people know a part of a system does not mean it shouldn't be documented.
And I would be glad to know that I don't have to remember every tiny detail of a system just because it's stored nowhere other than in my head.
For a small system this might work, but as the system gets larger, you're limiting yourself and your colleagues. I'd rather use the memory capacity for a new system than to remember everything about the old one.
Have you ever played "telephone?" I don't think you should play it with your codebase.
What if the senior programmer leaves the company/project?
The set of deliverables should be decided independently of whether you use pair programming or not.
Six months or two years later, all the people involved could be in a different project (or a different company). Do you want to be able to come back and use the design documentation? Then, produce it. If you don't want to come back, or the design is simple enough that with the specs and the code you can understand it without the aid of an explicit design document, then you may skip it.
But don't rely on the two people explaining the design to you one year later.
Maintenance. You can't expect the team to remain static, for there to be no new members or loss of old members. Design documentation ensures that those who are new to the project, that have to maintain it years down the line, have information on decisions that were taken, why the approach was chosen, and how it was to be implemented. It's very important for the long term success of a project to have this documentation, which can be provided via a combination of traditional documents, source comments, unit tests, and various other methods.
I don't see that pair programming makes design documentation obsolete. I immediately have to think about the Truck factor. Sure, the senior may know what the design is. But what happens when he is ill? What happens when he gets hit by a truck? What if he is fired?
Pair programming does spread knowledge, but it never hurts to document that knowledge.
Who knows about code that hasn't been written yet? Nobody, because it hasn't been written. And the reason it hasn't been written is that nobody knows yet what to do; hence the need for a design document.
Pair programming is just two people sharing one computer. By itself, it says nothing about what kind of design methodology the pair(s) uses.
Pair programming, when taken as part of "Extreme Programming", means following the Extreme Programming guidelines for design. This typically involves gathering and coding to "user stories". These stories then stand in place of other design documentation.
The experience of people may be in sync with the code, as you say. But the design decisions are not all captured in the code - only the choices made are there.
In my experience, to really understand why code is designed the way it is, you need to know about the design choices that were not selected, the approaches that were tried and failed, etc. You can hope that the "Chinese whispers" chain transmits that correctly, given that there's no record of it in the code to refresh memories or correct errors...
... or you can write some documentation on the design and how it was arrived at. That way, you avoid being taken down a dark alley by the maintenance programmers in future.
Depends what you mean by "design documentation".
If you have functional tests - especially behaviour-driven development (BDD) tests, or Fitnesse or FIT tests then they're certainly a form of "active documentation"... and they certainly have value as well as being regression tests.
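For example, a behaviour-driven style test might look like the following sketch (plain pytest with the given/when/then structure in comments; the Account class is invented purely so the example is self-contained):

    import pytest

    class Account:
        def __init__(self, balance: int = 0) -> None:
            self.balance = balance

        def withdraw(self, amount: int) -> None:
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    def test_customer_cannot_overdraw_account():
        # Given an account with a balance of 50
        account = Account(balance=50)
        # When the customer tries to withdraw more than that
        # Then the withdrawal is refused and the balance is unchanged
        with pytest.raises(ValueError):
            account.withdraw(80)
        assert account.balance == 50

Read aloud, the test states a business rule, which is what makes it documentation as well as a regression test.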
If you write user stories and break them down into tasks and write those tasks on cards for pairs to do then you're doing a form of documentation...
Those are the two main forms of documentation I've used in XP teams that pair on all production code.
The only other document that I find quite handy is a half-page or so set of bullet points showing people how to set up the build environment for a development machine. You're supposed to maintain the list as you go along using it.
The code base may be so large you can't humanly remember every detail of what you were intending to implement. A reference is useful in this case.
Also, you need a design if you are interacting with other components etc.
Well, if you want a spreadsheet program instead of a word processor, a design doc is useful :-)
XP, pair programming, agile, etc. do not mean you do not have a plan; it is just a far less detailed plan (at the micro level) of what is going on. The use cases that the user picks become more of the design, and it is more of a living document than with other styles of design/programming.
Do not fall into the trap of thinking that because you are doing something "cool" you no longer need good practices - indeed, this style of programming requires more discipline, rather than less, to be successful.
Pair programming is an opportunity for the team to avoid having to spend a large proportion of the project time on documenting everything. But the need for documentation depends on how good you are at remembering the important stuff and how good your code is. You may still want lots of documentation if the code is difficult to work with.
You could try some experiments:
Document a couple of small parts of the design and note how often you have to refer to it.
Document stuff that is always a pain to work with.
No. Nor does a lack of pair programming mean you need documentation. Documentation is needed! What it looks like may surprise you!
An agile team will decide when and what documentation is needed. A good rule of thumb: if no one is going to read it, don't write it. Don't get caught up in waterfall artifact thinking by producing artifacts just because the Project Manager says so.
Most think of documentation as something you do with Word. If an agile team is working properly, the code itself, with TDD (test-driven development), will have a set of automated tests that document and enforce the requirements. Imagine: documentation that is in sync with the code... and it stays that way.
Having said that, pairing does help domain, application, practice, and skill knowledge propagate through the team very quickly. Pairing also helps ensure that the team follows the engineering practices, including TDD and other automated tests. The result is that the application remains healthy and future change is easy to bring about.
So, bottom line, pair programming produces better documentation. It does not eliminate documentation (although you might not be able to find a Word document).
I am an advocate and a fan of documentation. Pair programming does not require "one senior developer". In my experience with pair programming, developers of all levels are paired together, for the purpose of rapid development. There were many times I worked with junior developers and we would trade off on the keyboard, and many times I worked with senior architects and did the same. Documentation is still necessary, especially for your core components and database.
Pair programming only covers the coding and logical side of things.
But documentation is good practice. Always do documentation...
Ten years ago when I first encountered the CMM for software I was, I suppose like many, struck by how accurately it seemed to describe the chaotic "level one" state of software development in many businesses, particularly with its reference to reliance on heroes. It also seemed to provide realistic guidance for an organisation to progress up the levels improving their processes.
But while it seemed to provide a good model and realistic guidance for improvement, I never really witnessed an adherence to CMM having a significant positive impact on any organisation I have worked for, or with. I know of one large software consultancy that claims CMM level 5 - the highest level - when I can see first hand that their processes are as chaotic, and the quality of their software products as varied, as other, non-CMM businesses.
So I'm wondering, has anyone seen a real, tangible benefit from adherence to process improvement according to CMM?
And if you have seen improvement, do you think that the improvement was specifically attributable to CMM, or would an alternative approach (such as six-sigma) have been equally or more beneficial?
Does anyone still believe?
As an aside, for those who haven't yet seen it, check out this funny-because-it's-true parody.
At the heart of the matter lies this problem, neatly described by the CMM guidance itself...
“...Sound judgment is necessary to use the CMM correctly and with insight. Intelligence, experience and knowledge must shape an appropriate interpretation of the CMM in a specific environment. That interpretation should be based on the business needs and objectives of the organization and the projects. A rote, checklist-oriented application of the CMM has the potential to harm an organization rather than help it...”
From Page 14, section 1.6 of The Capability Maturity Model, Guidelines for Improving the Software Process by the Carnegie Mellon University Software Engineering Institute, ISBN 0-201-54664-7.
I found it to be a bloated documentation exercise that was used mainly as a vehicle for acquiring and maintaining contracts. Once we had the contract, it became an exercise in getting around the process.
As a developer, I got nothing out of it but lost MONTHS of my professional life fiddle-farting around with CMMI.
The same goes for 6 Sigma, which I branded "Common Sense in a Box". I didn't need to be trained how to figure out what the problem was to a process - it was generally quite evident.
For me, small teams and agile mechanisms work much better. Short cycles, lots of communication. That might not work in all environments, but it definitely works in mine.
Just my two cents.
For a typical CMM level 1 programming shop, making the effort to get to level 2 is worthwhile; this means that you need to think about you processes and write them down. Naturally, this will meet resistance from cowboy programmers who feel limited by standards, documentation, and test cases.
The effort to get from level 2 ("there is a process") to level 3 ("everyone has the same process") normally gets bogged down in inter-departmental warfare, so it's probably not worth starting.
If you see CMM, run. And run fast.
CMM and CMMI both offer some benefits if your organization takes the lessons they try to teach to heart. The problem is that getting to the higher levels is very difficult and expensive, and the only time I have seen an organization go through the effort is because their customers won't let them bid on contracts until they are at a certain level.
This has the effect of the organization doing everything they can to "just get the number" without actually caring about improving their process.
The higher end? No. CMM-5 shops do not impress me.
The lower end? Yes. CMM-1 organizations scare me.
CMM can help a new/novice team measure themselves and do the self improvement thing.
CMMI isn't really about improving your software, it is about documenting what you have done. You can almost estimate a company's CMMI level by the weight of the documentation it produces.
Background: I have studied CMMI in my Software Engineering graduate program and have worked on a team that followed its guidelines.
My experience is that the CMM is so vague that it's very easy to fulfill. Also, when they come to certify you, they look at the project that your organization chooses. Where I used to work, this was the project with no real deadline, plenty of money, and lots of time to spend on every nook and cranny of process. Many of the other projects continued with little to no code/design review, sometimes without versioning software.
I think the emphasis on CMM certification is unfortunate. Companies know how to work the system, and do. Instead of focusing on real process improvement that meets their bottom line, they focus on getting a certification and working the system. I honestly think most organizations would be better off spending time on the former instead of wasting so much time on the latter.
Really what matters is having conscientious people who want to make good development decisions and know that they will need help making those decisions. There is no substitute for quality programmers who know that programming is an ongoing group activity where they are just as likely to make a mistake as anyone else.
I have been doing a lot of interviewing for small teams doing iterative development. Personally, if I see CMM on a resume it is a big red flag that signals interest in process over results.
All formal methods exist to sell books/training courses/certification, and for no other reason. That's why there are so many formal methods. Once you realise this, you are free :-)
Yourdon still believes. But he might also still believe the world is going to end with Y2K.
This is not something I would personally put a lot of faith in or want to be yoked with in the future. But often ours is not to reason why...
P.S. Though a bit off-topic, I would like to mention that faked CMMI certifications are very common as well as real certifications obtained through bribery.
CMM doesn't really speak to the quality of the software, but more towards the documentation and repeatability of the process. In other words, it is possible to have an orderly and repeatable development process, but still create crappy software. As long as the process is properly documented, it is possible to achieve CMM Level 5.
At the end of the day CMM is another tool that can be used or misused. If the end goal is to improve software quality, it is possible to use CMM to improve the development process and improve software quality. If achieving a certain CMM level is the goal, then most likely software quality will suffer.
The model is losing its credibility, first because companies adopt it not because they are looking for a more mature software development process, but to be appraised at a CMMI level.
And the other problem, the one that I think leads to the lost credibility, is that as a contractor you have no guarantee that the project your CMMI-appraised supplier is selling you will be developed using the model's practices. The CMMI label only states that the company has, at some point, developed projects that were evaluated as adherent to a specific CMMI maturity level.
The problem is not just with CMMI but with the processes developed by the companies. CMMI does not describe the process itself, just what the process should do. You have the same problem with the PMBOK. Actually, the problem is not primarily the PMBOK but the bad project managers who claim to follow the PMI guidelines.
At school, I was taught: CMM is a good idea, but lacking certification (anyone can say they are level 5 / level 4) it ends up being a marketing tool for offshore shops. So, yeah, the idea is sound, but how do you prove adherence?
I used to. But now I find that CMM and CMMI don't really fit that well with agile approaches.
Oh sure you can squeeze things to get that square peg into the round hole, but when push comes to shove, you are still basing your approach on an ability to predict everything that is needed, and anticipating everything that will be encountered, when building a software system.
And we all know, how well that approach works in real life! (-:
cheers,
Rob
Agile is the next CMM and both are fragile. The field of process and quality consulting is a good business in any industry and like the engineering folks everyone needs new buzzwords to keep the money flowing.
CMM when it first came out of the SEI was a good concept based on solid academic work but it was soon picked up by the process consultants and is a worthless certification now, which is used by most CIOs to cover their ass (Nobody got fired for picking a CMM Level 5 company)
Agile is going to go down that route soon, and then we can be sure to see the next silver bullet on the horizon :)
When I worked on commercial flight software, we used CMM and as our processes improved our ability to accurately predict completion times improved. But this was a cumbersome process, other approaches should work just as well.
Smaller projects are less dependant on process for success. The key metric is the Hero to Bystander Ratio. Any project with an HTBR of less than 0.2 is in serious trouble.
There are quite a few good ideas that can readily be adapted and adopted by any organisation for their own good, but getting a badge is a pain due to the requirement for all kinds of redundant documentation.
The problem is that CMMI is not a process but just a guide for whatever process you might choose to have, and that in itself invites half-baked ideas to float around.
Another point is that migration is a real pain when you are starting, but it's the same as any other teething trouble, I guess.
The main issue with understanding the value of CMMi is understanding CMMi itself.
CMMi is a documented approach to Continuous Improvement for Software Production.
Understanding Continuous Improvement with SPC is difficult enough in manufacturing but add the intangible Software product and the difficulty is exponential.
I would recommend that anyone, or any organization, new to CMMI first document their current process, then look at what outcomes (cost/benefit) can be measured independently of the process. That way, if any process, procedure, or standard is changed, you can tell whether it yields a "better" result. The prerequisite for this exercise is a documented, stable, repeatable process, since it is impossible to measure the benefit of any change within an ad-hoc environment: you are not comparing like for like.
By focusing on the above concepts initially, the organization will begin to understand and embrace the essential value of the CMMi.
Legend has it that the US Department of Defense, which did a lot of contracting, found that many of its projects faced time and cost overruns, and even when they were delivered, the projects were not exactly what was ordered.
So they wanted a way to be sure that a contractor would be able to deliver on time, within budget, and close to what was required. Thus the Capability Maturity Model was born.
The thesis is that if things are written down, they survive attrition. But just saying "write everything down" would not be enough; it must also be checked that things are written down correctly. Among other things.
Throughout all this, it never crossed their minds to consider the cost of doing all this. Because from the DoD's point of view, if it gave out a project for $1 million to get something in a year, and ended up paying $10 million over 10 years without getting what it wanted, then paying $5 million to get what it actually wanted in two years still saves $5 million, not to mention that it actually gets something.
So if you are contractor to US DoD or something like that, go ahead and get CMM, because it would be a requirement. But if you are competing with the 1000s of software development shops on elance, to get projects with limited budgets, limited time and so on... CMM is not a good choice.
That said, feel free to read the CMMI Dev PDF (v1.3 at the time of writing). It makes a lot of good points. It deconstructs the organisation very nicely. And if you see any points which make you go "aha! I have this problem", then by all means use that wisdom to resolve your problem. In our case, one small change we made was to keep a list of all the people who were allowed to give us requirements. If there was more than one person who was allowed to give us requirements, then any requirement coming from one source was circulated to the others, and they had to say "okay" before we added it to the backlog. This small change made a big difference in how much we worked and reworked.
In short, look at the process areas and compare them to your pain areas, and take the suggestions given by CMM. The way you implement it is your own. And you can always implement it in a way that does not take too much time or cost too much money. But I guess the same applies even to the relevant ISO/IEC standards.