What approach/methodology are you using for one-man software development? [closed]

You can find thousands of questions out there about how you develop software and which methodology is the best one, but these mainly target medium to large teams, with people in different roles and responsibilities.
What I'm interested in is what methodology you are using for your one-man shows. What steps do you follow, and which documents do you create to get the things you want to develop clear, document them well, and share them with the community?
In particular, I'm interested in the following questions:
- Are you using a structured approach even when you're developing on your own, or none at all?
- What phases do you go through?
- Which documents do you write before and after coding?
And if you have "your" standardized approach, can you share the templates you are using?
Thanks in advance,
cheers
Gerry

Personally, I think a solo development process comes down to making deliberate choices. In my case, I wouldn't recommend setting up a massive development process, but I would pick the elements that prevent problems I have run into before. My approach for small applications, in order:
1. Always write down what you are going to make and what you are not going to make (define a scope). Think of functional requirements (Functional Design).
2. (OO only) Make a class diagram that displays the relations between classes (Technical Design). Sequence diagrams, while useful, take up massive amounts of time to make.
3. Write your program (or part of it) according to what you have just written down.
4. Refactor and redesign your application (once every X hours; write this down).
5. Repeat steps 3 and 4 until the result is what you wrote in the Functional Design.
6. Walk through every corner of your application to find every single path, and write these down in a test document. Identify possible problems in the paths and test them (see the sketch after this list).
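As a concrete illustration of step 6, here is a minimal sketch, assuming Python and its standard unittest module, of how the paths collected in the test document can double as an automated checklist. The Checkout class and its paths are invented for the example; they are not from the original question.

    import unittest


    class Checkout:
        """Tiny made-up domain object used to illustrate path-based tests."""

        def total(self, prices, discount_code=None):
            subtotal = sum(prices)
            if discount_code == "SAVE10":
                subtotal *= 0.9
            return round(subtotal, 2)


    class CheckoutPathTests(unittest.TestCase):
        """Each test mirrors one path written down in the test document."""

        def test_path_no_discount(self):
            self.assertEqual(Checkout().total([10.0, 5.0]), 15.0)

        def test_path_with_discount(self):
            self.assertEqual(Checkout().total([10.0, 5.0], "SAVE10"), 13.5)

        def test_path_empty_basket(self):
            self.assertEqual(Checkout().total([]), 0)


    if __name__ == "__main__":
        unittest.main()

Running this file once per refactoring pass (step 4) keeps the written-down paths and the actual behaviour in sync.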
When it comes to big applications, however (or assignments for someone else), I prefer using the "medium to large teams" approach, which all but guarantees that you won't run into most of these problems.

Related

Is learning all this necessary? [closed]

Is it even worth it to learn all of this Bootstrapping and stuff that just feels like I'm not really doing any work?
I feel like it's a bit cheat-y, y'know?
I showed someone a site I had built and they said it was good, but it didn't work well at all across multiple platforms.
So, I Googled for some tips on how I can make the site adjust to different screen sizes, and every link I went to just listed different Bootstrapping things and plug-ins that'll do it for me.
I want to learn this stuff for myself so I have better control over it, I suppose.
Is that really a good idea, or would it be more worth it to look into Bootstrapping and junk?
I would advocate learning how things work first, and then using libraries/frameworks to accelerate your workflow.
The idea behind this is that if those tools have bugs or issues, you'll be much better equipped to dig in and debug.
Trying to build all of these tools yourself, however, is NOT recommended (unless it's for exploratory reasons). These libraries and frameworks exist for a reason: they have many contributors (something you can't compete with as a solo dev) and they solve real-world problems.
That being said, learning how to properly select a given lib/framework for a given use-case is a skill really worth building. And that comes from understanding what problems the libs/frameworks solve, which is the result of having explored "the inner workings" by digging in.
In the end, these tools will greatly accelerate your development speed, which is great, especially when it's business related (your job).

Do BDD/ATDD stories replace the need for traditional requirements? [closed]

From what I can tell from online forums and posts, one of the main focuses of BDD/ATDD seems to be on discussion and on ensuring that the customer, developers, testers, and other relevant parties are involved in understanding what the system must do.
Question 1: Do BDD/ATDD stories replace the need for traditional requirement specifications, such as those captured using the Volere Template?
Because traditional requirement specifications are one of the key inputs for developers and testers, they tend to be comprehensive.
Question 2: Should BDD/ATDD stories also be comprehensive enough to allow a system to be fully tested?
Question 1: Instead of looking at this as a black-and-white situation, we should better understand how these two requirements-capture methods complement each other. Writing a story in the BDD/ATDD methodologies, or in Scrum for example, does not mean taking templates like Volere off the table. If we take a look at the Volere requirements specification, we can see that most of the information concerns project-related issues, and the shell used for functional requirements is not that different from a story. They just capture different information; they are not mutually exclusive.
Question 2: Here the advantage comes from the methodology itself. BDD grew out of TDD, so we can more or less rely on the test-first process to allow the team to test the system. But, as I mentioned for question 1, making a BDD/ATDD story more comprehensive is not a sin and wouldn't compromise the general idea of the story. It would also prove useful when interacting with more experienced clients.
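To make this concrete, here is a minimal sketch, assuming Python and its standard unittest module rather than a dedicated BDD tool, of how a Given/When/Then story can be turned into an executable acceptance test. The story and the Account class are invented for the example.

    import unittest

    # Story (hypothetical example):
    #   Given an account with a balance of 100
    #   When the customer withdraws 30
    #   Then the balance is 70


    class Account:
        """Made-up domain object backing the story above."""

        def __init__(self, balance):
            self.balance = balance

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount


    class WithdrawalStoryTest(unittest.TestCase):
        def test_withdrawal_reduces_balance(self):
            # Given
            account = Account(balance=100)
            # When
            account.withdraw(30)
            # Then
            self.assertEqual(account.balance, 70)


    if __name__ == "__main__":
        unittest.main()

Whether a single scenario like this is "comprehensive" in the Volere sense is exactly the gap discussed above; the story format does not forbid adding more scenarios, it just keeps each one small.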

How do I know if I need to use a design pattern? [closed]

I'm just wondering. I'm new to this pattern subject; I started a couple of weeks ago. My main problem is that when I start writing small applications (for my own purposes), I can't think of where to put any pattern to use. Maybe it's my way of thinking that needs to be tweaked?
If I start a new project, how would I know whether I need to use a pattern? What questions do I ask myself? What steps do I take before writing the actual code?
Look at the Delphi VCL... it basically took the design patterns and ran with them:
Forms are Composite patterns.
Datasets use the Iterator pattern.
Screen and Application are Singleton patterns.
Components use lots of different patterns: Chain of Responsibility, Decorator, and Facade, to name a few...
Patterns are ways to organize your program into loosely coupled objects with jobs that come up over and over again...
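As a minimal sketch of one of the patterns named above, here is the Singleton in Python (not Delphi); the Screen class is a hypothetical stand-in for the VCL's Screen object, invented for the example.

    class Screen:
        """Hypothetical stand-in for a global Screen object, implemented as a Singleton."""

        _instance = None

        def __new__(cls):
            # Always hand back the same instance, no matter how often it is "created".
            if cls._instance is None:
                cls._instance = super().__new__(cls)
                cls._instance.width = 1920
                cls._instance.height = 1080
            return cls._instance


    # Usage: both names refer to the one and only Screen.
    a = Screen()
    b = Screen()
    assert a is b

The point is not the few lines of code but the recurring job they do: guaranteeing a single, globally shared instance, which is exactly what Screen and Application need.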
Design patterns are just ways to approach solutions to common problems. As you internalize the patterns and as you understand the problem better you will sometimes see that the problem (or part of the problem) you are solving is addressed by a particular pattern.
That's when you use it. When you see it solving your problem.
Design patterns are reusable solutions for common problems.
The principles of software engineering cite code reusability: when you use a design pattern, you are using a previously tested concept that has gone through several validations and is less prone to a design error than a model you design yourself.
So first, you have to know the existing design patterns and what they're intended to solve. When you face a common problem, you may remember the design patterns you previously studied and use them to solve the situation you're facing at the moment.

What stage to add authentication? [closed]

I'm about to start building a Rails app that will eventually need to vary CRUD access by user (i.e. which pages do they see, which can they edit, etc).
Is there a best stage of the development cycle to incorporate this?
Part of me feels like it should be the very first thing, since almost every piece of the interface will in some way rely on checking the user's ID, and it will be an inherent part of the DB structure.
Another part feels that this would overcomplicate things to start out with, and that I should instead build the core parts of the app, then layer on the authentication/authorization later.
Are there any best practices around this sort of thing?
I would say that if your system will rely on some kind of authentication... Why wait?
Let's say that you start developing your application without the authentication layer but at the same time you know that at some point you will have to do it. That means that at some point you will develop the authentication layer, and most likely you will have to refactor what you have already built to adapt it to this new layer.
Also, to try to convince you a little bit more... When you say:
I should instead build the core parts of the app
You should consider that the authentication module might be a core part of the app too...
I prefer to do it early, but you really have roughly the same amount of work in front of you regardless of when you do it. It's really a matter of opinion as to when you prefer to do it.
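To illustrate why adding the authorization check early is relatively cheap, here is a minimal sketch in plain Python (the question is about Rails, but the idea carries over) of a role check as a decorator that every new view picks up as it is written. The current_user dictionary and the function and role names are invented for the example.

    from functools import wraps

    # Hypothetical "current user"; in a real app this would come from the
    # session or request context.
    current_user = {"id": 42, "role": "editor"}


    def require_role(role):
        """Guard a view so only users with the given role may call it."""
        def decorator(view):
            @wraps(view)
            def wrapper(*args, **kwargs):
                if current_user is None or current_user["role"] != role:
                    raise PermissionError("access denied")
                return view(*args, **kwargs)
            return wrapper
        return decorator


    @require_role("editor")
    def edit_article(article_id):
        return f"editing article {article_id} as user {current_user['id']}"


    print(edit_article(7))  # works, because the current user is an editor

Views written before such a guard exists would each need to be revisited later, which is the refactoring cost the first answer warns about.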

Does using Extreme Programming have a negative impact on your ability to win new customers? [closed]

I've recently been looking at Extreme Programming and wondering if it would be realistic to implement it where I work.
My question is, if you're pitching to a potential new client and you tell them that you're using XP, and you explain what their responsibilities are as the customer, are they likely to be put off selecting your company if they've never worked within an XP environment before?
What are people's experiences of selling XP to a client, given that it seems to me to be a very customer-intensive software development methodology? The context here is selling medium to large websites to a wide variety of clients.
I usually try to explain it to my clients in non-technical terms, and focus on the benefits of my business model. With XP, you'll always have a higher degree of communication with your clients. This is always a plus for them. They like to know what's going on. Focus on that. Also, focus on the idea that they are able to discuss business requirements with you as the process moves along, so they don't get tied down into doing something the way they first envisioned it 6 months ago when they didn't really know what they wanted. This will also allow your contracts to extend their lifetimes when your clients get comfortable working with you and want to continue improving their products.
I'm working on a project that uses XP. The weekly meetings with our customer, and their outcomes, were so good that our customer decided to try to implement an 'agile-like' process as well.
Additionally, I think that agile is getting more and more common in IT projects and that more customers are satisfied by the outcome of these projects. So I think that in a couple of years it will be harder to sell a non-agile project than an agile one.
