(Rational) Unified Process vs Waterfall Model

I was reading about software development models and life cycles, where I learned about the waterfall model and the Unified Process. However, both processes involve requirements gathering, design, development, testing, and deployment (the inception, elaboration, construction, and transition phases in the Unified Process).
Can anyone help me out with the difference between the two?

You haven't specified which Unified Process or which Waterfall process - there are many variants of both, so some of the comparison will be lost in generalization.
Taking the Rational Unified Process as an example: it differs from waterfall processes in that the disciplines (analysis, design, coding, testing, etc.) are carried out iteratively and concurrently, whereas in waterfall processes the disciplines are generally carried out sequentially (e.g. coding only starts once requirements have been finalized and the design has been accepted).
In RUP, the phases (inception, elaboration, construction, transition) are NOT dedicated to a single discipline or a single deliverable - RUP phases are all multidisciplinary. Although inception is primarily about requirements and analysis, some design and prototype coding are also encouraged to reduce risk and improve estimates for later phases, and even in the construction phase further analysis may be required.
RUP uses the term 'generation' to indicate another full cycle of development, e.g. for a "version 2" of a project, where new work on generation 2 would begin at the inception phase.
Another major difference was that RUP pushed the concept of visual models (especially UML) as deliverable artifacts describing the requirements and the high-level and class-level design (in some cases code generation was even possible from detailed UML models), whereas waterfall artifacts were typically very document-heavy (e.g. the ESA / IEEE processes).
Another difference was in the approach to commercial engagement. Waterfalls typically promoted the concept of a 'contractual' Software Requirements or Software Specification document, which defined the deliverable (functionally and non-functionally) and on which a project budget or fixed-price transaction would be based. RUP instead promoted budgeting on a per-phase basis, where the effort / cost estimate for the following phase was refined and delivered as one of the deliverables of the preceding phase.
In many software development operations, Agile processes have superseded both Waterfall and RUP, although many of the artifacts and learnings of Waterfall and RUP remain. Agile has the primary benefit of breaking work down into much smaller chunks (typically two-week sprints, instead of months-long RUP phases or year-long waterfall projects). This quick turnaround allows features to be delivered on a cost-vs-priority basis, is better adapted to ever-changing requirements, and allows obstacles to success to be identified much sooner than under either Waterfall or RUP. Agile also cuts out a lot of the waste - let's face it, only a small percentage of developers ever read detailed specification documents or pore over detailed UML diagrams.

Related

How to align the BPMN models with the Technology Architecture?

I am stuck on how to proceed further and need some new ideas on aligning these BPMN models, which I have drawn for Customer Relationship Management (CRM) and Human Resources (HR).
As far as the BPMN model is concerned, it is mainly used for the Business Architecture (BA); for the Technology Architecture (TA) I could possibly use the Rational Unified Process (RUP), but when I researched it I could only find IBM Rational Rose software, which is not free...
My questions:
Are there open-source RUP tools which I can use? I looked at OpenUP, but I could not make it work (which is a different issue).
Is this the right approach: BPMN for the BA and RUP for the TA?
The scope section of the BPMN specification (section 1, Scope) says:
The primary goal of BPMN is to provide a notation that is readily understandable by all business users, from the business analysts that create the initial drafts of the processes, to the technical developers responsible for implementing the technology that will perform those processes, and finally, to the business people who will manage and monitor those processes. Thus, BPMN creates a standardized bridge for the gap between the business process design and process implementation.
There is Business Process Management (BPM) software which provides both process modeling and process execution conformance, effectively making the models executable (at least to a certain depth).
In the free / open-source world you can find jBPM, Activiti, etc.
I have tried out jBPM; it is fairly mature and compliant with the standard notation. It also supports modeling, execution, and operational functionality.
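To give a feel for what "making the model executable" means, here is a minimal sketch in Python. This is not the jBPM API - the task names and handlers are hypothetical, and a real engine parses the BPMN XML and manages process state for you:

    # A BPMN-like "model" reduced to an ordered list of tasks,
    # each mapped to an executable handler (hypothetical example).

    def receive_order(ctx):
        ctx["order_id"] = 42        # pretend an order arrived

    def check_stock(ctx):
        ctx["in_stock"] = True      # pretend we checked inventory

    def ship_order(ctx):
        print("shipping order", ctx["order_id"])

    PROCESS = [receive_order, check_stock, ship_order]

    def run(process):
        ctx = {}                    # shared process variables
        for task in process:
            task(ctx)               # execute each task in model order
        return ctx

    run(PROCESS)

A real BPM engine adds gateways, events, persistence, and human tasks on top of this idea, but the core notion is the same: the diagram is the program.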

Where does specification by example complement/replace traditional requirements documentation?

I'm trying to understand where SBE complements or replaces traditional requirements documentation. The "levels of requirements" diagram shows three levels of traditional software requirements.
Which of the items below (from the diagram) does SBE replace and which ones does it complement:
Vision and Scope Document
Business Requirements
Use Case Document
User Requirements
Business Rules
Software Requirements Specification
System Requirements
Functional Requirements
Quality Attributes
External Interfaces
Constraints
My naive understanding of SBE would say that the examples are just an alternative form of the Software Requirements Specification. Is this correct?
BDD and SBE are normally used by Agile teams, who don't focus as much on documentation as traditional software development teams do.
BDD is the art of using examples in conversation to illustrate behaviour. SBE then uses the examples as a way of specifying the behaviour that you decide to address (I always think of it as a subset of BDD, since talking through examples often ends up eliminating scope, discovering uncertainty, or finding different options, none of which end up as specifications).
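As a concrete illustration, here is what a specification-by-example might look like once captured as an executable test. This is a hypothetical sketch in Python with plain asserts, rather than a real BDD tool such as Cucumber; the Account class exists only for this example:

    # Given a customer with a balance of 100,
    # when they withdraw 30,
    # then the withdrawal succeeds and the balance is 70.

    class Account:
        def __init__(self, balance):
            self.balance = balance

        def withdraw(self, amount):
            if amount > self.balance:
                return False            # reject overdrafts
            self.balance -= amount
            return True

    def test_withdraw_within_balance():
        account = Account(balance=100)          # given
        ok = account.withdraw(30)               # when
        assert ok and account.balance == 70     # then

    test_withdraw_within_balance()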
There are a couple of things that are hard to do with BDD. One of them is anything which isn't discrete in nature, or which needs to always be true throughout the lifetime of the system - non-functionals, quality attributes, constraints, etc. It's hard to talk through examples of these. These continuous aspects of requirements lend themselves better to monitoring, and that's discrete, so BDD can even be used to help manage these.
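For instance, "search always responds within two seconds" can't be illustrated by a single conversational example, but a monitoring probe samples it at discrete points in time, and each sample can be written just like a test. A hypothetical sketch, where search() stands in for the real system:

    import time

    def search(query):
        time.sleep(0.01)        # stand-in for the real call
        return ["result"]

    def check_search_responds_within_2s():
        start = time.monotonic()
        search("widgets")
        elapsed = time.monotonic() - start
        assert elapsed < 2.0, f"search took {elapsed:.2f}s"

    check_search_responds_within_2s()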
Since an initial vision is usually created to help the company make money, save money, or protect existing revenue (stopping customers going elsewhere, for instance), you can even come up with examples of how the project will do this. In fact, if you can't, the project is likely to fail anyway. So BDD / SBE can also be used to help complement an initial vision and scope.
Therefore, BDD / SBE can complement all of these documents, and in Agile teams, the documents themselves are usually replaced by conversations about the requirements and rules (illustrated by examples), story cards to represent placeholders for those conversations, and perhaps some lightweight capture of those conversations on a Wiki.
It is unlikely that any Agile team captures all of their examples up-front, as this leads to excessive investment in the requirements and tends to turn it into a traditional Waterfall / SDLC project instead.
This blog post I wrote about BDD in the Large may also be of interest.

SDLC and Software Process

I have some confusion about the terms SDLC and Software Process. With respect to these (and more or less similar) terms, I have a few questions.
What is the difference between SDLC and Software Process? (I understand SDLC is not just Waterfall.)
Can we map the SDLC to the Unified Process?
About the activities: the traditional waterfall model has an analysis phase. Do we do analysis in the Unified Process (any Unified Process, Agile or Rational)?
SDLC stands for System Development Life Cycle, and it is a more or less generic term for whatever standard life cycle you have implemented.
SDLC is essentially your software process, but in my experience most people associate it more directly with waterfall processes, as you indicated, and more specifically with CMMI standards.
Typically with the SDLC, you will find that different groups have different methodologies to express it.
Since I don't recall the exact definition, there may be more linking it to the waterfall methods than just semantics. For instance, I believe Agile methodologies could be considered a type of SDLC, but I could be wrong about that.
I hope this helps.
SDLC is short for Software Development Life Cycle: the process a software product goes through, from requirements to maintenance.
The SDLC contains various methodologies, like Waterfall, Scrum, and Agile. Each of them follows the same process steps (requirements, design, implementation, testing, maintenance), but they differ in how the process is applied. Some methodologies, such as Agile, run multiple activities at the same time, e.g. implementation alongside design, without wanting to write much documentation.
In the waterfall methodology, you cannot move to the next phase until the previous phase is finished, and you cannot run two phases at the same time; for example, you cannot start implementation alongside design - the design phase must be complete first.
Software Process - a set of activities and associated results that produce a software product. There are four fundamental process activities that are common to all software processes:
Software Specification
Software Development
Software Validation
Software Evolution
SDLC - the oldest and most widely used approach in software engineering. It follows a number of sequential phases and a partitioned set of activities, based on an engineering / construction / production view:
Problem exploration
Feasibility Study
Requirement Gathering
Analysis
Design
Construction
IS implementation
Operation and Maintenance
Evolution
Phase out
I very much agree with you; the SDLC dates back to the 1950s, and it was the first framework introduced at the time. However, I have a few notes on the SDLC phases - I'd say that there are 7 stages of the SDLC:
1. Planning
2. Requirements Analysis
3. Design
4. Development
5. Testing
6. Deployment
7. Maintenance and improvement.
Today there are a lot of SDLC models, Waterfall being the most popular one. Though Agile is becoming quite popular lately, I find a lot of teams to be highly disappointed with Agile. "We are constantly changing things, so we never get anything done" - that's the most common phrase I hear.
What is the difference between SDLC and Software Process? (I understand SDLC is not just Waterfall.)
Ans: SDLC is the development life cycle that is used in each and every project. SDLC defines all the standard phases, which is very useful in software development. Software Process defines all the activities / phases carried out to improve the quality of the product.
The software process also covers the testing life cycle, as it includes all the phases, even the basic ones.
Can we map SDLC with Unified Process?
Ans: Yes, you can map them, but only the methodologies, not the life cycle.
Let's clear these queries up one by one.
The difference between SDLC and Software Process:
Software Process (or Software Development Process) and Software Development Life Cycle are both concepts with the same goal: developing software.
There are multiple strategies or models available for developing software, like Waterfall, Agile, etc.
SDLC provides a set of phases for developers to follow, where each phase builds on the result of the previous phase.
The Unified Software Development Process or Unified Process is an iterative and incremental software development process framework.
For more details:
software process: https://www.geeksforgeeks.org/software-processes-in-software-engineering/
Software development life cycle: https://www.tatvasoft.com/outsourcing/2022/09/sdlc-best-practices.html
Yes, we can map the SDLC to the Unified Process.
You can go through this link for more details: https://www2.cdc.gov/cdcup/library/framework/mapping.htm
The Unified Process, like most agile techniques, does not expect the general project plan to define when each use case is going to be implemented. So, object-oriented analysis is required for the design of the information system.
For more details, use this reference: https://www.sciencedirect.com/topics/computer-science/unified-process

What are alternatives to the Waterfall model

Can you please suggest a methodology that alleviates the disadvantages of the waterfall model?
The problem with Waterfall is that it consists of monolithic stages, each building on the previous stage. So the code is developed in one chunk after the entire system has been designed, which in turn happens after all the requirements have been gathered and signed off.
This is a problem because any change has to be ratified by a complex procedure and rippled through all the stages. But the lesson of history is: change happens. The requirements are always incomplete, mis-specified, or simply out-of-date by the time we get to coding. Too often, design and build proceed on the basis of assumptions which are nullified when the system gets to UAT. This leads to frantic re-work and slippages.
The truth is that not many customers are good at the sort of abstract thinking required to envisage a working software system. And too many IT professionals lack the experience necessary to understand business logic. Waterfall refuses to accept these truths.
The only honest requirement specification is "I'll know it when I see it". So it is crucial to get working software in front of real users as soon as possible. Any methodology which focuses on delivering working software incrementally in short iterations will "alleviate the disadvantages of the waterfall model".
Originally that was RAD or DSDM. Then XP took up the banner. Now there is Agile and related things like Scrum and Kanban.
So why do people persist with the Waterfall method?
There is a common perception that Agile is just a cover for cowboy hackers to ditch all the boring process stuff and get on with what they enjoy most: writing code. The branding of "Extreme Programming" certainly encourages this thought, and, let's be honest, it is not an unfounded allegation. That is, some coders pretend to be agile as an excuse not to plan, design, or document. This does not reflect the actual practice of Agile, which requires just as much rigour as any other methodology.
Also Agile requires a much greater commitment of time from the customer's staff, which many organizations are loath to accept. Also the people footing the bill may be unwilling to empower their junior staff to make decisions. There is an important distinction between Customer and User.
When it comes to outsourcing, the waterfall model provides an easy framework for matching deliverables to staged payments. Indeed, the contractual aspect may be stronger than that: in the EU, Waterfall is mandated for all projects valued at EUR 100m or more.
Finally, there are projects where Waterfall works well. These projects have knowledge domains which are stable and well-understood by both the customers and the developers.
Last word
Despite its failings Waterfall has delivered many projects successfully. This is because hard work, aptitude and integrity are more important than methodology.
The waterfall model was documented in 1970 by Dr Winston Royce in a paper titled "Managing the Development of Large Software Systems", outlining his ideas on sequential development. His idea was that software could be produced in a similar fashion to an automobile, where the vehicle is pieced together in sequential/linear phases.
This linear approach doesn't really allow for changes in a piece of software once development begins, and there is no tight relationship with the end user/client, so it's harder to identify possible problem areas.
It's worth noting that some phases of the waterfall model allow for 'splashback', whereby there is enough time in the development period to go back and make small changes. However, time constraints, the amount of work involved, and budgets don't really allow for much change, if any, to be made using this model.
The waterfall model is old; as time goes by, software paradigms themselves change. Object-oriented programming is popular now; back then it was barely alive. Through the use of the waterfall model, its flaws became obvious, and this has led to alternative development methodologies.
OK, so now for alternatives. The incremental model is described by Alistair Cockburn (2008) as a staging and scheduling strategy in which the various parts are developed at different times or rates and integrated upon completion of each specific part.
Basically incremental looks a lot like this:
Analysis->Design->Code->Test
Analysis->Design->Code->Test
Analysis->Design->Code->Test
Benefits include a flexible life cycle that allows for change from the get-go.
Working software, or rather parts of it, is generated quickly and early on. The code produced is easier to test and manage due to the small iterations of progress. Not all of the requirements of the system are gathered up front, just an outline. This allows for a quick start; however, it might be a disadvantage in some systems, as things like the system architecture to be supported might be missed.
Iterative, on the other hand, allows parts of the system to be reworked and revised to improve the system, and time is set aside to allow for this. Iterative does not start with a full specification of requirements. Development is done by specifying and implementing just part of the software, and the software is then reviewed in order to identify further requirements. This is more of a top-down approach. A disadvantage of this methodology is making sure all the iterations are compatible. As each new iteration is approved, developers may employ a technique known as backwards engineering, a systematic review and check procedure to make sure each new iteration is compatible with previous ones. A major benefit of the constant iterations is that the client is kept in the loop and the final product should meet the requirements.
Iterative approach diagram.
Other methodologies include prototyping, in evolutionary and throwaway variants. These are also deemed more of a top-down approach, and both are borrowed from engineering, where it is common to construct scale models of objects to be built; building models allows the engineer to test certain aspects of the design. The software development prototyping methodology provides the same ideology. Prototyping is not seen as a standalone, complete development methodology, but rather an approach to handling selected portions of a larger, more traditional development methodology.
Throwaway prototyping - throwaway prototyping does not preserve the prototype that has been developed; there is never any intention to convert the prototype into a working system. Instead, the prototype is developed quickly to demonstrate some aspect of a system design that is unclear. It can also be developed to help users or clients decide between different features or interface characteristics. Once any problems or uncertainties have been addressed, the prototype can be 'thrown away' and the principles learned used in the design and documentation of the actual product.
Evolutionary Prototyping - In Evolutionary prototyping you begin by modeling parts of the target system and if the prototyping process is successful you evolve the rest of the system from those parts. One key aspect of this approach is that the prototype becomes the actual production system. This process allows for difficult parts of the system to be modeled successfully in prototypes and dealt with early on in a project.
Other areas to look into include Agile methodologies: Scrum, Extreme Programming, pair programming, etc.
Tried to keep it short but people write books on this sort of stuff and there is so much to discuss.
Might be worth having a look at:
Incremental and Iterative
The alternative to the waterfall method is "doing it the correct way".
Waterfall seems to make sense if you are on a factory floor assembly line. But I've never seen it work as part of the design process... and software development is ALL a design process. And so the waterfall method never really works, in the sense that it doesn't help facilitate the creation of a high-quality product, but rather focuses on process. Process can be great, but what's the point if the product it produces is second-rate?
Kanban and Scrum are two of the most commonly used alternatives to Waterfall. I tried to give a good overview and comparison of the different SDLC approaches.
Waterfall relies heavily on massive monolithic phases as mentioned by APC. This is a huge weak point because trying to determine the end product from the start is a fruitless endeavor.
Kanban is slightly cowboy, but I find that if you couple it with standups it certainly still has its place.
Scrum is great for putting pressure on the team and getting ownership on tickets. I've found most places have been going with this one but the downfall of it is some people go overboard with having meetings for everything. Sprint planning meetings, sprint kickoff meetings, daily standup meetings that last 1 hour with 20+ people present, demo meetings, and then finally the post-mortem.
Remember that agile is only as good as you make it and you can easily sink any methodology if you go wild with unrestrained meetings which aren't adding value. Keep it as lean as you can without it being chaotic.
Off the top of my head, I can think of a few ways to palliate the shortcomings of the waterfall model:
Have the coder concentrate on automating the process itself. Automate the transitions between one step and another, so that changes will flow more or less automatically.
Make the process more bidirectional. One principal characteristic in the waterfall model is that changes flow from top to bottom. This is a unidirectional process, and that is part of the problem.
Another thing which would help (as someone mentioned in an earlier answer) is for the developer to gain a better understanding of the business logic involved and of what the customer wants, and for the customer to gain knowledge about the characteristics of the development process.
Here are some links about Waterfall model:
http://www.cs.odu.edu/~zeil/cs451/Lectures/01overview/process2/process2_htsu2.html
http://www.buzzle.com/editorials/3-13-2005-67039.asp

Which software development practices provide the highest ROI?

The software development team in my organization (which develops APIs - middleware) is gearing up to adopt at least one best practice at a time. The following are on the list:
Unit Testing (in its real sense),
Automated unit testing,
Test Driven Design & Development,
Static code analysis,
Continuous integration capabilities, etc..
Can someone please point me to a study that shows which 'best' practices, when adopted, have a better ROI and improve software quality faster? Is there a study out there?
This should help me (support my claim to) prioritize the implementation of these practices.
"a study that shows which 'best' practices when adopted have a better ROI, and improves software quality faster"
Wouldn't that be great! If there was such a thing, we'd all be doing it, and you'd simply read it in DDJ.
Since there isn't, you have to make a painful judgement.
There is no "do X for an ROI of 8%". Some of the techniques require a significant investment. Others can be started for free.
Unit Testing (in its real sense) - Free - ROI starts immediately.
Automated unit testing - not free - requires automation.
Test Driven Design & Development - Free - ROI starts immediately.
Static code analysis - requires tools.
Continuous integration capabilities - inexpensive, but not free
You can't know the ROI. So you can only prioritize on investment. Some things are easier for people to adopt than others. You have to factor in your team's willingness to embrace the technique.
Edit. Unit Testing is Free.
"time spend coding the test could have been taken to code the next feature on the list"
True, testing means developers do more work, but support does less work debugging. I think this is not a 1:1 trade. A little more time spent writing (and passing) formal unit tests dramatically reduces support costs.
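As a concrete illustration of how little a "formal unit test" has to cost, here is a hypothetical sketch in Python; parse_quantity is an invented validator of the sort that otherwise fails in production:

    # A few lines of test pin down behaviour that would otherwise
    # surface later as a support call (hypothetical example).

    def parse_quantity(text):
        """Parse a quantity field from a customer file."""
        value = int(text.strip())       # raises ValueError on junk
        if value <= 0:
            raise ValueError("quantity must be positive")
        return value

    def test_parse_quantity():
        assert parse_quantity(" 12 ") == 12
        for bad in ("0", "-3", "abc"):
            try:
                parse_quantity(bad)
            except ValueError:
                pass                    # rejected, as specified
            else:
                assert False, f"{bad!r} should have been rejected"

    test_parse_quantity()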
"What about legacy code?"
The point is that free is a matter of managing cost. If you add unit tests to legacy code, the cost isn't free. So don't do that. Instead, add unit tests as part of maintenance, bug-fixing and new development -- then it's free.
"Traning is an issue"
In my experience, it's a matter of a few solid examples, and management demand for unit tests in addition to code. It doesn't require more than an all-hands meeting to explain that unit tests are required and here are the examples. Then it requires everyone report their status as "tests written/tests passed". You aren't 60% done, you're 232 out of 315 tests.
"it's only free on average if it works for a given project"
Always true, good point.
"require more time, time aren't free for the business"
You can either write bad code that barely works and requires a lot of support, or you can write good code that works and doesn't require a lot of support. I think that the time spent getting tests to actually pass reduces support, maintenance and debugging costs. In my experience, the value of unit tests for refactoring dramatically reduces the time to make architectural changes. It reduces the time to add features.
"I do not think either that it's ROI immediately"
Actually, one unit test has such a huge ROI that it's hard to characterize. The first test to pass becomes the one thing that you can really trust. Having just one trustworthy piece of code is a time-saver, because it's one less thing you have to spend a lot of time thinking about.
War Story
This week I had to finish a bulk data loader; it validates and loads 30,000-row files we accept from customers. We have a nice library that we use for uploading some internally developed files. I wanted to use that module for the customer files. But the customer files are different enough that I could see that the library module API wasn't really suitable.
So I rewrote the API, reran the tests and checked the changes in. It was a significant API change. Much breakage. Much grepping the source to find every reference and fix them.
After running the relevant tests, I checked it in. And then I reran what I thought was a not-closely-related test. Oops. It had a failure. It was testing something that wasn't part of the API, which also broke. Fixed. Checked in again (an hour late).
Without basic unit testing, this would have broken in QA, required a bug report, required debugging and rework. Look at the labor: 1 hour of QA person to find and report the bug + 2 hours of developer time to reconstruct the QA scenario and locate the problem + 1 hour to determine what to fix.
With unit testing: 1 hour to realize that a test didn't pass, and fix the code.
Bottom Line. Did it take me 3 hours to write the test? No. But the project got three hours back for my investment in writing the test.
Are you looking for something like this?
The ROI of Software Process Improvement: A New 36-Month Case Study, by Capers Jones
Agile Practices with the Highest Return on Investment
You're assuming that the list you present constitutes a set of "best practices" (although I'd agree that it probably does, btw).
Rather than try to cherry-pick one process change, why not examine your current practices?
Ask yourself this:
Where are you feeling the most pain? What might you change to reduce/eliminate it?
Repeat until pain-free.
You don't mention code reviews in your list. For our team, this is probably what gave us the greatest ROI (yes, the investment was steep, but the return was even greater). I know Code Complete (the original version at least) mentioned statistics on the efficiency of reviews vs. testing in finding defects.
There are some references for ROI with respect to unit testing and TDD. See my response to this related question; Is there hard evidence of the ROI of unit testing?.
There is such a thing as a "local optimum". You can read about it in Goldratt's book The Goal. It says that an innovation is of any value only if it improves overall throughput. The decision to implement a new technology should be related to the critical paths inside projects. If a technology speeds up an already fast enough process, it only creates an unnecessary backlog of finished modules, which does not necessarily improve the overall speed of project development.
I wish I had a better answer than the other answers, but I don't, because what I think really pays off is not conventional at present. That is, in design, to minimize redundancy. It is easy to say but takes experience.
In data it means keeping the data normalized, and when it cannot be, handling it in a loose fashion that can tolerate some inconsistency, not relying on tightly-bound notifications. If you do this, it simplifies the code a lot and reduces the need for unit tests.
In source code, it means if some of your "input data" changes at a very slow rate, you could consider code generation, as a way to simplify source code and get additional performance. If the source code is simpler, it is easier to review, and the need for testing it is reduced.
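A hypothetical sketch of that idea in Python: when a lookup table changes only a few times a year, emit a module from it once instead of parsing the data on every run (the table and file name here are invented for illustration):

    # Code generation from slowly-changing "input data".
    # Rerun only when the table actually changes.

    COUNTRY_NAMES = {"US": "United States", "DE": "Germany", "JP": "Japan"}

    def generate_module(path="country_names.py"):
        with open(path, "w") as f:
            f.write("# Generated file - do not edit by hand.\n")
            f.write(f"NAMES = {COUNTRY_NAMES!r}\n")
            f.write("def name_for(code):\n")
            f.write("    return NAMES[code]\n")

    generate_module()

The generated module is trivially simple source code, so it is easier to review, and the runtime parsing code it replaces no longer needs tests.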
Not to be a grump, but I'm afraid, from the projects I've seen, there is a strong tendency to over-design, with way too many "layers of abstraction" whose correctness would not have to be questioned if they weren't even there.
One practice at a time is not going to give the best ROI. The practices are not independent.
