Does it make sense to study COBOL? [closed]

I have had a talk with a friend of mine about the relative vulnerability of different types of IT workers to unexpected unemployment (e.g. layoffs, the company going out of business, obsolete skills, etc.).
It seems COBOL developers (or maintainers?) are very secure in their positions, regardless of the state of the economy or even how good they are. With so much critical COBOL code around on the one side, and the diminishing number of people with COBOL know-how on the other, it actually makes sense to recommend that someone starting out in the IT world and looking for a relatively secure job study and intern in COBOL!
What do you think?

I started as a programmer with Cobol more than 10 years ago, and then worked with Cobol at different institutions for several years.
Cobol as a technology is fairly easy to learn if you know any imperative programming language.
Cobol itself differs a lot across platforms and versions - thus it is difficult to study the right version beforehand.
The real challenge with Cobol is not the technology, but the complexity of the underlying business and the lack of documentation of the systems/programs. Thus the real value of a lot of COBOL experts is in fact not the actual COBOL knowledge, but the understanding of the business.

I think it makes sense to be a good software developer. For me personally, your question sounds a bit like: "What silo should I occupy to feel secure about my job and stop improving myself?" I know you didn't mean exactly this. But anyway, that's not the best motivation for choosing a career path.
I'd say: try tinkering on some COBOL code. If it's fun for you, go for it! Just as for dozens of other things you should try.

You have to understand that the arguments you make are relative to time, that is, to now. We have seen time and again how a technology seems so prominent at time t and becomes obsolete at time (t + 0.001). Your arguments revolve around the very fact that COBOL is more or less obsolete, but someone may find a new way to deal with that legacy code, and again you'd be out of a job. So here's rule number 1:
Never rely on one single technology.
With time, they always find ways to have fewer and fewer resources do the same thing. All you can do is be a smart software professional. Once you grasp the core of computer programming, technologies won't matter; with time you can just learn them. So here's the second rule:
Don't just try to become an expert in a technology; become an expert in software engineering.
Finally, to survive in IT you always need to be aware of the cutting edge. And regardless of all the above, you can still be laid off depending on the harshness of a recession. So finally:
Keep a backup line of work ready; tomorrow the IT industry might not be there :)

Try it, and if you like it then study it seriously.
If you learn it too well you may find you end up stuck in a COBOL role with no way out; it begins slowly and then you are drawn in because the knowledge is very specialised. If you enjoy it, that's fine - but if you try it and don't like it, then don't continue.

IMHO, it always makes sense to study a new language.

It's true that there's a lot of COBOL code running today, and much of that code is mission critical. However, how much actual COBOL coding is happening? I see large enterprises gradually replacing those COBOL systems.

From a practical standpoint, there's a huge base of legacy COBOL code running a lot of systems out in the world (many of them mission-critical) and it's likely cost-prohibitive to replace all of that software any time soon. The average COBOL programmer is probably nearing retirement age. Therefore one could reasonably assume that there will continue to be demand for new COBOL programmers for some indeterminate amount of time to come.
From a personal development/enrichment perspective, it certainly makes sense to study COBOL (and any number of other technologies, both new and "less-new"). I'm not sure I'd put it near the top of my list, but its historical significance is reason enough to put it on the list. Somewhere.

My guess is that one of the reasons COBOL programmers (I'm a Fortran programmer, similar situation I suspect) are relatively secure is because they have oodles of experience; you won't get this from learning the language. Rather than ask how many COBOL jobs are there, ask yourself how often you see a COBOL job advertised. I think that it is much easier to hold on to one of these jobs than to get one.
And, of course, when one is advertised, you're in competition with all those very experienced currently-working-in-COBOL programmers.
Regards

Cobol developers are secure in their positions because their code makes money. It is not a horrible language to learn. Actually, it is rather nice once you grok the structure of it.
But it is only one tool in your tool box, you should have several.

I'd say it is not a complete nonsense to learn COBOL, as long as it is not the only technology you learn.


Which programming methodology to use for a final year project? [closed]

I am a final year Computer Science student, and as part of my Bachelor's degree I am doing a project on data mining of microarray DNA expression data. I will have to develop a few algorithms, such as Bayesian networks, to run on my datasets to find out how the variables (genes) affect each other.
As part of my project proposal I have to talk about which methodology I will use to develop my software. From what I have learnt in school and from extra reading, I find that the incremental development model seems a good idea. I would run 2-3 iterations of plan, design, implement until I get the full functionality of the software. Could somebody with more knowledge than me please tell me if this sounds like a good idea?
The reason why I am not 100% sure which methodology I would use is that I don't have a team to work on the software, I don't have a client with requirements, and I am very limited in the amount of time I can spend on the project, as I have 3 other modules. All the methodologies I have read about seem to be for big software projects with teams of developers. What do you do if you are just one person, focusing mainly on getting 3-4 algorithms to work rather than on a broad range of functionality?
I was also thinking of using UML to get a better idea of what I want the software to do, and using something like a stripped-down version of an object-oriented methodology.
My guess would be I would have to use parts of more than 1 methodology at a very basic level but I just can't pick.
I am very confused and lost on the subject so any help is greatly appreciated.
Thank You,
For this type of work, I would suggest not paying much attention to methodologies, because after all, what matters is the algorithm. But for the sake of having a response to your dilemma, I would suggest using XP (eXtreme Programming). Why?
It's light.
It doesn't require filling in as much paperwork as RUP and others do.
It's better suited to changing environments, such as yours.
Just do a quick Google search for the XP methodology and you'll get a bunch of useful results. ARUP (Agile RUP) might be worth looking at as well.
I hope I can help you.
XP/TDD is harmonious with the scientific method; each iteration is a theory, the tests are experiments
It takes a lot of discipline to follow a methodology while working solo; make sure you pick one that isn't labour-intensive or you'll never live up to it.
If I was back at school in your situation with what I know now I'd probably go for Test Driven Development. Unit tests are ideal for testing algorithms and will leave you with a body of tests that you can use to demonstrate that you did follow a methodology.
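For example, a minimal JUnit 4 sketch, with Euclid's gcd standing in as a hypothetical algorithm under test:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class GcdTest {
        // The algorithm under test: Euclid's greatest common divisor.
        static int gcd(int a, int b) {
            return b == 0 ? a : gcd(b, a % b);
        }

        @Test
        public void knownValues() {
            assertEquals(6, gcd(54, 24));
            assertEquals(1, gcd(17, 13)); // coprime inputs
            assertEquals(5, gcd(0, 5));   // edge case: zero operand
        }
    }

Tests like these double as a documentation trail: the test names and assertions record what the algorithm was supposed to do.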
Your idea to do the project in several iterations of plan, design, code and test is fine; however, with small projects it's sometimes difficult to resist the urge to do it all at once.
In case you do get carried away and finish the project in just one or two iterations, keep notes about the order in which you did things (ideally use a version control system) so that you'll at least be able to fudge your documentation to make it look like you used several iterations. Not that I'd endorse such an approach of course ;-)

What are the essential concepts all programmers should learn and use? [closed]

I'm currently learning to program, and I didn't take CS classes, so I'm basically starting out at the bottom. I have been putting together code on and off for many years, but haven't really had a good understanding of the essential concepts needed for engaging in bigger projects. Object orientation is an obvious one, and I feel I'm beginning to understand some of the concepts there. Then there is a lot of buzz and methodology, such as MVC, UML, SCRUM, SOLID and so forth. I've looked at many of these, but I'm always stumped because most explanations seem to require some understanding of other concepts.
I want to learn this stuff the "right" way, so where do I begin?
What are the overarching constructs I need to understand that enable me to understand all the underpinnings of software architecture/design/development?
What am I missing?
Are there constructs and concepts that can and should wait until I've cleared the foundation?
The SOLID principles are probably the most important.
From those you understand the motivation behind using a pattern such as MVC, why people consider persistence ignorance important, and so on. They are at the core of the majority of good practices.
Loose coupling, high cohesion.
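To make that concrete, here is a minimal Java sketch (hypothetical names) of loose coupling via dependency inversion, the "D" in SOLID: the high-level class depends on an interface rather than on a concrete implementation.

    // High-level code depends on this abstraction only.
    interface MessageSender {
        void send(String text);
    }

    class EmailSender implements MessageSender {
        public void send(String text) {
            System.out.println("emailing: " + text);
        }
    }

    class Notifier {
        private final MessageSender sender; // no reference to any concrete sender

        Notifier(MessageSender sender) {
            this.sender = sender;
        }

        void alert(String text) {
            sender.send(text);
        }
    }

    public class SolidDemo {
        public static void main(String[] args) {
            // Swap in any MessageSender (SMS, push, a test fake) without touching Notifier.
            new Notifier(new EmailSender()).alert("build failed");
        }
    }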
And as for books, Code Complete covers almost everything at some level, at least.
Software development is a HUGE arena and you should be careful that you don't take on too much too quickly. Unless you're going to go in the direction of functional programming I'd suggest you start off by making sure you fully understand the concepts surrounding OO design and programming as this should be your foundation.
Once you understand that well you'll be able to understand design patterns a lot better and get a feeling for when to use them.
I'd suggest you try out a few languages till you find one you feel comfortable with. Personally, my favourite language is Ada, which is a very pure OO language, but in the business world I work in C#, which still has a lot of issues, though these are outweighed by the more vibrant job market.
I wouldn't worry too much about Scrum at this stage as you need to focus more on your dev skills before worrying about project management.
The most important thing is to work with as much code as possible, download lots of good reference solutions and work through the code till you understand it, and try and keep an eye on the development trends.
If it's viable, you may also want to consider attending some developer conferences too, as these can be very inspirational.
Stay away from ACRONYMS (including those you've listed) and Methodologies(tm). At least in the beginning.
Read good books. Start with this one: The Pragmatic Programmer. Learn algorithms and data structures, possibly from Introduction to Algorithms by Cormen et al.
Write a lot of code. Practice is more important than anything else.
Learn how to test software with unit tests. Being able to do that will solve 90% of all the other issues automatically, since you can't test while they are around.
When you know how to test, you can start on advanced topics like design.
I'd recommend Object-Oriented Analysis and Design with Applications by Grady Booch et al. The latest edition has a detailed explanation of the concepts of OOAD, including MVC and UML (which he co-invented), and discussions on how to manage the whole process of software development. The second part of the book exemplifies all this by developing 5 sample systems (with sometimes orthogonal aspects at their core).
Another good one is of course Design Patterns by the GoF, which will give you an idea of loose coupling, ways to efficiently encapsulate and reuse code, etc.
For the algorithmic part, take any book that is not bound to a particular programming language. My favourite is Introduction to Algorithms by T. H. Cormen et al.; it gets a bit theoretical at some points, but I especially like that they prove certain things rather than just asking you to believe them.
When you are working with any modern general-purpose language, it is probably a good idea to get a handle on patterns (MVC, or Model-View-Controller, is one). The book by the "gang of four" is a must-read for this, or at least research a few patterns and use it as a reference.
Refactoring is another concept that should be in your arsenal. The book by Martin Fowler on this subject is a very nice read and helps you understand the aforementioned patterns better; a little explanation of UML is included as well.
Can't post more than one hyperlink so...
search on amazon for: Refactoring, Improving the design of existing code
When you want to communicate your designs, UML (Unified Modelling Language) is the 'tool' of choice for many people. UML is large and unwieldy, but Martin Fowler (again) has managed to boil it down to the essentials.
search on amazon for: UML Distilled (make sure you get the most recent one)
SCRUM is one of many methods that is used to manage software development groups, I do not think there is much merit in learning that when you are just starting out or on your own. Especially not in detail.
Hope it helps...
PS: SOLID I haven't heard about yet, somebody else has to help you there.
You'd have a decent foundation if you surveyed basic Data Structures, Algorithms, and Algorithms Analysis.
I think that you should start coding real world problems to get a feel for problems in the programming domain.
Then you have a better background to understand why objects are important. Then, after managing objects, you will learn why patterns and OO principles are important.
Personally, I highly recommend Agile Software Development by Robert C. Martin.
But it may be a long and tiresome read unless you have a feel for the problems being solved. I'm afraid that you may need 500-1000 hours of coding at the minimum before you get an appreciation that the problems being solved are real.
And it probably takes 7000+ hours before you develop an instinctive heart-felt pain from merely reading the problems, making this sort of book become the page-turner that it should be.
Regrettably, many of the sound practices that you should develop are only appreciated after having to live with your code over time. If you just do many exercises and abandon the code afterwards just "because it works", then you are missing out on the greatest pain of all. That is a luxury our industry does not have, and "technical debt" is very real and costly to those with large code bases.
I feel kinda silly answering my own question like this... :) But one valuable resource I've found for learning to write code is Project Euler at http://www.projecteuler.net
It's basically a collection of mathematical problems that you solve by writing your own solution. Once you've found the answer to a particular problem, you're allowed access to that problem's forum, where different solutions are discussed. I was amazed at how much I was learning by a) solving a challenge, b) reading about other people's approaches, and c) seeing how many programming languages there are out there! :)
The problems start out easy (you can tell by the number of people who've solved them) and progress to harder and harder ones.
Currently I'm working on problem #3, having solved the previous two... I recommend you start chippin' away at them, no matter your level!
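Problem 1, for example, asks for the sum of all the multiples of 3 or 5 below 1000; a brute-force Java sketch is enough to solve it:

    public class Euler1 {
        public static void main(String[] args) {
            int sum = 0;
            for (int n = 1; n < 1000; n++) {
                if (n % 3 == 0 || n % 5 == 0) {
                    sum += n; // count each qualifying number exactly once
                }
            }
            System.out.println(sum);
        }
    }

The fun starts when you read the forum afterwards and see others derive the same answer with an arithmetic-series formula.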

Is a great memory a requirement for great programming? [closed]

Do you think having a great memory is REQUIRED to be a great programmer?
I don't consider myself a great programmer but I do think I am decent. But my memory is REALLY bad so I find myself always having to remind myself how to do things. I mean I "know where to look" but sometimes it makes me feel like I am just a crappy programmer. What makes it even worse is that I am always forgetting where things are in my source code or what algorithm I used for certain situations.
Think back on the great programmers you have encountered in your life, didn't all of them seem to have amazing memories?
Surely apocryphal, but here's the Einstein phone number story:
A reporter interviewed Albert Einstein. At the end of the interview, the reporter asked if he could have Einstein's phone number so he could call if he had further questions. "Certainly," replied Einstein. He picked up the phone directory and looked up his phone number, then wrote it on a slip of paper and handed it to the reporter. Dumbfounded, the reporter said, "You are considered to be the smartest man in the world and you can't remember your own phone number?" Einstein replied, "Why should I memorize something when I know where to find it?"
I have this coworker that writes really bad code that is incredibly hard to maintain. I've come to the conclusion that his problem is good memory. He's simply able to remember where he put what functionality. Therefore he doesn't have to write code that is self-explanatory. He simply remembers that crap. The rest of us have a really hard time figuring out his code.
I'm sure that a good memory isn't that guy's only problem. But I'm sure his code would improve if his memory got worse.
Treat your short-term memory as a stack (not static) and don't expect much more from it. I've come back to code that I wrote only a month ago and it's almost like someone else wrote it... it just takes a while to get back into the same zone.
I often get teased for leaving comments for myself like breadcrumbs... but it works. If I finish some function and say "AHA, that is absolutely BRILLIANT!", I immediately comment the cleverness, as I'm sure to forget.
So now, to answer a question with two questions:
What did you have for lunch last Wednesday?
What is the purpose for 'counter' in hash_foo() ?
At least, with #2, you can quickly go back and look / remember.
As long as you can remember how g-o-o-g-l-e is spelt, you're fine. :)
But seriously, you do need to keep several things in your short-term memory at once. Longer-term memory is, I think, less important. As long as you're aware that something exists, that you've seen something before, etc., then when it becomes relevant you'll know you can dig it up.
Experienced programmers can generally regurgitate APIs, minor details and so forth but in my experience this has never been a case of sitting down and memorizing things by rote. It's a natural consequence of using things again and again.
I would say the opposite: having a good memory may lead to writing code which only the author can understand, because she remembers the details of its logic. I, on the other hand, having a bad memory, document my code and write it as clearly as possible.
Honestly, I've found a poor memory to be an asset, even a poor short-term memory. Poor short-term memory really forces you to break out the separation of concerns. The end result is very clean, very simple, very well-encapsulated code. I actually have a pretty good short-term memory, but I've learned to try really hard not to employ it, after a few experiences writing code while I was distracted enough that I couldn't really retain much at once. I was actually shocked to observe that the code was far cleaner than code I'd developed in the past.
A poor long-term memory is an asset, because you end up training yourself very well on how to find and learn techniques, APIs, algorithms, etc. It also tends to encourage you to find a small set of common themes to guide you in your work.
All in all, the mark of a good programmer isn't complexity (which is really difficult to achieve without good memory), but simplicity (which by nature doesn't require much memory).
I can answer this using exactly one word: NO. Having a great memory to memorize everything about programming is not a must. Experience and tedious learning by practice are what matter most.
I have experienced this too. If you have enough hours (or years) of experience creating software with best practices applied, then you're a true master of your own job and of the programming languages you use. Please don't be sad if you have a bad memory; striving to always learn and practice can defeat that weakness.
To me, there are two kinds of programmers in the world. The first were born to do it, the second learnt. In both groups they range from unbelievably poor to unbelievably great. Does memory denote those ratings? No, absolutely not. While a good memory can help you with learning, nothing helps you more than practice and understanding. After all, being able to remember the entire Encyclopaedia Brittanica means nothing without understanding. My server's storage is a classic example there.
Programming is about logic, both in the code and in how you approach the problem. If you want clear, easy-to-understand code, then chances are you'll break the problem into small manageable chunks (i.e. ones that fit in your head in their entirety) and work on each one. Each function then condenses down into a single command for your next stage of complexity. At the end of that next stage, if there is another, you'll have a set of single commands again to build on. Logical naming, logical partitioning, logical assembly... I think I'm getting my logical point across ;)
My memory is appalling, and I mean appalling. I can be introduced to 3 people, and by the time #3's name is said I've forgotten who #1 is. I can still write some good code, not every time or every day; when you're in the zone it's something else, and at that point it is art. So, put your memory to one side, get yourself either a really quiet space or a pink-noise generator, and dive in. The only thing that's going to make you a better programmer is practice, practice, practice. The only thing to remember is that programming is a skill, and skills are practiced, and best practiced among friends who can give constructive criticism and advice... like Stack Overflow :)
Apologies for the tome level of this answer but I couldn't remember what I'd already written ;)
Having a good memory is quite useful but certainly not required. I would say that it's not that great programmers have a great memory but rather that they have spent a lot of time investigating even the littlest issues, which improved their understanding and improves recall. If you spend 4 minutes resolving a problem (Googling or asking on SO), then you probably won't remember the resolution when you hit it again 4 months down the line. It could be an evolutionary trait or just a bad memory =)
Good programmers also have well thought out principles which allows them to work on auto-pilot without second-guessing themselves. A good set of principles also achieves consistency and predictability (which is a quality of memory) through reinforcement.
This also extends into other domains. Chess grandmasters can recall an entire game played 40 years ago. That's because they remember patterns (openings, variations, root cause and effect of moves which led to the end game, etc). which helps group individual moves into units.
In software, tools like auto-complete, a KB/wiki, searchable check-in history, etc. can help.
No. But maybe it can make you great...
An art of programming (maybe the art) is being able to approach problems in such a way that you can grasp the whole of them, despite your limitations (such as imperfect memory). This is because everyone - including the smartest of us - has limitations. Bumping into your limitations is not a sign that you have limitations, but a sign that you are reaching further.
This art (insofar as I know it) includes things like divide and conquer (using modules of various kinds, to match the shape of the problem); using standard techniques to telegraph your intention (idioms, OO Design Patterns are just one); separating out the core of the problem (this one is not about the code: it's about the problem); and of course comments.
I used to believe that good code was self-documenting (and even, that code is truth), but recently I'm writing parsers, and including the CFG in a comment is a very helpful reference, because it is a much simpler representation of the intention of the code.
A coder's gotta know their limitations. It's unrealistic to expect to have the same grasp of something months later, as when you were in the thick of it. All the above involve accepting that problem, and working on a solution. Not only does it make your code easier for you to grasp later on, it makes it easier for someone else to grasp... but most importantly I firmly believe that clearer and simpler code is fundamentally, and transcendentally, better code.
The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. - Dijkstra
In allusion to Edsger Dijkstra, a competent programmer is fully aware of the limited size of his own skull. The more details you don't cram in your head, the better you can tackle the problem at hand.
Modularize your code heavily, refactor code, package your nifty algorithms into objects, and use those objects; that way, you don't always have to "micro-remember" each and every implementation detail of your programs.
No. The ability to forget about what you know and continue learning is at least equally important in the long term.
Good notes and bookmarks and web searches go a long way.
Remembering the really simple things is required for great programming. Things as simple as "keep at it".
An interesting perspective from the other side of your monitor: Locality of Reference
I think one benefit to having a good memory (modesty point: I have a good memory) is the ability to be able to think on your feet when not actually in front of a computer.
For example, you might be in a meeting when some new kind of functionality for your app is suggested. Can it be done? How long is it likely to take? These are questions which are easier to answer if you can pretty much walk through 250k loc in your head.
That said, I find a grain of truth in others' opinions that my own code might be less clear because I can remember it better.
Some of the best-written code I've ever seen was written in such a way that each design decision was inevitable, and the code read as its own explanation. That's far better IMHO than code which requires the reader (or, worse, the maintainer) to keep tons of arbitrary detail memorized.
My own index of complexity in code is "How much stuff do I have to keep in mind to understand this one line?"
More is worse.
I think having good memory is helpful for learning new things quickly.
This doesn't mean it is a requirement for being a great programmer.
In fact it is more about intelligence than memory capability, but it's too complex a subject to identify particular qualities, compare them with programming skills, and retrieve any relevant conclusions.
That is the mystery of the brain.
Occam's Razor suggests that a simpler theory is likelier to be true.
If code is a theory, describing inputs going to outputs, then shorter code, using expected idioms and libraries, is more likely to be "true" - that is, it is more likely to capture the essence of the solution, so it will generalize to inputs that you didn't expect.
Shorter, unsurprising code is easier to remember.
It all depends on what your good memory remembers...
I've worked on the same project for 10 years or so and I can't remember every line and who wrote it and why.
But... I can remember pretty much all the user requests and user issues. Who wanted what and when.
I can remember pretty much all the support issues we have had.
Finding old code is easy - we have great tools for that. Finding old issues is a much more abstract process: we have JIRA and Wikis but sometimes they don't cut it because they fail to provide the semantic meaning.
So. Pay attention to what REALLY matters and remember that.
Programmers with poor memory are like Universal Turing machines compared to practical computers: technically you can accomplish the same things by referring to information you or someone else has recorded somewhere... it's just that it may take a little longer....
I think it's possible to remember different types of things with differing degrees of aptitude.
For example, I sometimes find I have a pretty bad memory when it comes to random facts and figures, as well as things that I've done or will be doing - the latter meaning I find bug-tracking software an invaluable tool.
On the other hand, I can remember the structure of complex pieces of software I've written, and where to find specific things within that.
This may be about logical association. Well-designed software should (in theory) have a logical structure, which may make it easier to store in your memory if your brain is wired up that way.
Random pieces of information, however, may not have these associations, making them harder to remember.
I would say it's necessary for being great and fast. My memory for programming details is OK (and I have Google for the rest). However, when I sit in front of applications that I've primarily written (~30-40k lines of code), I'm able to load their structure almost completely into my memory. I can find the way I'm doing something in a couple of seconds and recall why I implemented it the way I did. That's invaluable. By 11 am I've been able to do more work than some others do all day. Now, that doesn't make me a great programmer, but it does make me an enormously productive one. This gives me time to refactor, write extra code, surf SO, grab an hour lunch, etc.
Just a simple comment: "Repetition is the mother of learning." It doesn't matter if you have a good or bad memory; the functions you use most in your programs are the ones you will remember best. Also, in my case, I have the internet: when I don't remember something I just ask, even if it is a dumb or easy question, and a lot of times I remember the answer after I post the question, and then I quickly post the answer myself. The problem is how much time you put your mind to work...
:)
I think it depends. Memory for a programmer is very, very important, both short- and long-term. However, what you use that memory for is the important thing. As a programmer, if you're using it to memorize every nuance of an API, then I'd say you're wasting your memory.
Ultimately, I try to use my memory to remember the important things and anything that I can't easily find at a later point in time. I'll usually put API stuff in short term memory and use google and intellisense to help me with the specifics. Design patterns, methodologies, lessons learned from experience, on the other hand are usually what I try to put into long term memory so I can use it effectively in the future without having to relearn everything.
In short, yes a programmer needs a good memory...both long and short term. But they need to be wise in how to use that memory...and that, I think, makes the difference in a great programmer.
Having a strong enough memory to hold the things you need to use today is the important part. If you are constantly searching for answers to the same question, you probably have a weak memory.
The most important thing is remembering where you can find the answers. I will sometimes blog on topics that are a bit more complex so I have a place to find them when I need them. But I don't try to hold onto them perpetually, because I can search my own blog and find them later. I do the same with other people's blogs and I know which blogs to hit for certain types of answers.
When all else fails, Google it!
I used to think having a good memory was a time-saver, because the more you remembered, the less time you spent looking things up, but tools and IDEs have gotten so good now that many things I used to memorize, like syntax and various code snippets, are quickly available in a few keystrokes. That, and the fact that the amount of information in the field grows way faster than any mortal programmer can keep up with, makes me think memory is less important these days. More important is having good access to, and organization of, useful information.

Long-held, incorrect programming assumptions [closed]

I am doing some research into common errors and poor assumptions made by junior (and perhaps senior) software engineers.
What was your longest-held assumption that was eventually corrected?
For example, I didn't realize that the size of an integer isn't a standard; instead, it depends on the language and target. A bit embarrassing to state, but there it is.
Be frank; what firm belief did you have, and roughly how long did you maintain the assumption? It can be about an algorithm, a language, a programming concept, testing, or anything else about programming, programming languages, or computer science.
For a long time I assumed that everyone else had this super-mastery of all programming concepts (design patterns, the latest new language, computational complexity, lambda expressions, you name it).
Reading blogs, Stack Overflow and programming books always seemed to make me feel that I was behind the curve on the things that all programmers must just know intuitively.
I've realized over time that I'm effectively comparing my knowledge to the collective knowledge of many people, not a single individual and that is a pretty high bar for anyone. Most programmers in the real world have a cache of knowledge that is required to do their jobs and have more than a few areas that they are either weak or completely ignorant of.
That people knew what they wanted.
For the longest time I thought I would talk with people, they would describe a problem or workflow and I would put it into code and automate it. Turns out every time that happens, what they thought they wanted wasn't actually what they wanted.
Edit: I agree with most of the comments. This is not a technical answer and may not be what the questioner was looking for. It doesn't apply only to programming. I'm sure it's not my longest-held assumption either, but it was the most striking thing I've learned in the 10 short years I've been doing this. I'm sure it was pure naivete on my part but the way my brain is/was wired and the teaching and experiences I had prior to entering the business world led me to believe that I would be doing what I answered; that I would be able to use code and computers to fix people's problems.
I guess this answer is similar to Robin's about non-programmers understanding/caring about what I'm talking about. It's about learning the business as an agile, iterative, interactive process. It's about learning the difference between being a programming code monkey and being a software developer. It's about realizing that there is a difference between the two, and that to be really good in the field, it takes more than just syntax and typing speed.
Edit: This answer is now community-wiki to appease people upset at this answer giving me rep.
That I know where the performance problem is without profiling
That I should have only one exit point from a function/method.
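A minimal Java sketch (hypothetical method) of the correction: early returns as guard clauses are often clearer than contorting the code to funnel everything to a single exit at the bottom.

    public class Guards {
        static String describe(String s) {
            if (s == null) return "null";    // guard clause: bail out early
            if (s.isEmpty()) return "empty"; // another guard, no nesting needed
            return "length " + s.length();   // the interesting case reads flat
        }

        public static void main(String[] args) {
            System.out.println(describe(null));
            System.out.println(describe(""));
            System.out.println(describe("hello"));
        }
    }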
That nonprogrammers understand what I'm talking about.
That bugfree software was possible.
That private member variables were private to the instance and not the class.
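A minimal Java sketch of the corrected understanding: access control is class-scoped, so one instance may read another instance's private fields.

    public class Account {
        private long balance;

        Account(long balance) {
            this.balance = balance;
        }

        // Compiles fine: "private" restricts access to the class,
        // not to the individual instance.
        boolean richerThan(Account other) {
            return this.balance > other.balance;
        }

        public static void main(String[] args) {
            System.out.println(new Account(100).richerThan(new Account(50))); // true
        }
    }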
I thought that static typing was sitting very still at your keyboard.
That you can fully understand a problem before you start developing.
Smart People are Always Smarter than Me.
I can really beat myself up when I make mistakes and often get told off for self-deprecating. I used to look up in awe at a lot of developers and often assumed that since they knew more than me on X, they knew more than me.
As I have continued to gain experience and meet more people, I have started to realise that oftentimes, while they know more than me in a particular subject, they are not necessarily smarter than me/you.
Moral of the story: Never underestimate what you can bring to the table.
For the longest time I thought that Bad Programming was something that happened on the fringe.. that Doing Things Correctly was the norm. I'm not so naive these days.
I thought I should move towards abstracting as much as possible. I got hit in the head majorly by this, because of too many intertwined little bits of functionality.
Now I try keep things as simple and decoupled as possible. Refactoring to make something abstract is much easier than predicting how I need to abstract something.
Thus I moved from developing the framework that rules them all, to snippets of functionality that get the job done. Never looked back, except when I think about the time I naively thought I would be the one developing the next big thing.
That women find computer programmers sexy...
That the quality of software will lead to greater sales. Sometimes it does but not always.
That all languages are (mostly) created equal.
For a good long while I figured that the language of choice didn't really make much of a difference in the difficulty of the development process and the potential for project success. This is definitely not true.
Choosing the right language for the job is as important/critical as any other single project decision that is made.
That a large comment/code ratio is a good thing.
It took me a while to realize that code should be self-documenting. Sure, a comment here and there is helpful if the code can't be made clearer or if there's an important reason why something is being done. But, in general, it's better to spend that comment time renaming variables. It's cleaner, clearer, and the comments don't get "out of sync" with the code.
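A tiny Java sketch of the renaming idea (hypothetical variable names):

    public class Naming {
        public static void main(String[] args) {
            // Before: the comment does the work the name should do.
            int d = 42; // elapsed time in days

            // After: the name carries the meaning, and unlike the comment
            // above, it can never drift out of sync with the code.
            int elapsedTimeInDays = 42;

            System.out.println(d + " == " + elapsedTimeInDays);
        }
    }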
That programming is impossible.
Not kidding, I always thought that programming was some impossible thing to learn, and I always stayed away from it. And when I got near code, I could never understand it.
Then one day I just sat down and read some basic beginner tutorials, and worked my way from there. And today I work as a programmer and I love every minute of it.
To add, I don't think programming is easy, it's a challenge and I love learning more and there is nothing more fun than to solve some programming problem.
"On Error Resume Next" was some kind of error handling
That programming software requires a strong foundation in higher math.
For years before I started coding I was always told that to be a good programmer you had to be good at advanced algebra, geometry, calculus, trig, etc.
Ten years later and I have only once had to do anything that an eighth grader couldn't.
That optimizing == rewriting in assembly language.
When I first really understood assembly (coming from BASIC), it seemed that the only way to make code run faster was to rewrite it in assembly. It took quite a few years to realize that compilers can be very good at optimization, and especially on CPUs with branch prediction they can probably do a better job than a human can in a reasonable amount of time. Also, spending time optimizing the algorithm is likely to give you a better win than spending time converting from a high-level to a low-level language. And premature optimization is the root of all evil...
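To illustrate the algorithmic point with a minimal Java sketch (a hypothetical duplicate check): moving from O(n^2) to O(n) buys far more than any hand-tuned low-level rewrite of the inner loop would.

    import java.util.HashSet;
    import java.util.Set;

    public class DuplicateCheck {
        // O(n^2): the kind of loop people once reached for assembly to "fix".
        static boolean hasDuplicateSlow(int[] a) {
            for (int i = 0; i < a.length; i++)
                for (int j = i + 1; j < a.length; j++)
                    if (a[i] == a[j]) return true;
            return false;
        }

        // O(n): a better algorithm beats a lower-level language.
        static boolean hasDuplicateFast(int[] a) {
            Set<Integer> seen = new HashSet<>();
            for (int x : a)
                if (!seen.add(x)) return true; // add() returns false if already present
            return false;
        }

        public static void main(String[] args) {
            int[] data = {3, 1, 4, 1, 5};
            System.out.println(hasDuplicateSlow(data)); // true
            System.out.println(hasDuplicateFast(data)); // true
        }
    }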
That the company executives care about the quality of the code.
That fewer lines is better.
I would say that storing the year element of a date as 2 digits was an assumption that afflicted an entire generation of developers. The money that was blown on Y2K was pretty horrific.
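A minimal Java sketch of the underlying ambiguity: the same two-digit year reads as a different century depending on the pivot you choose.

    import java.text.SimpleDateFormat;
    import java.util.GregorianCalendar;

    public class TwoDigitYear {
        public static void main(String[] args) throws Exception {
            SimpleDateFormat f = new SimpleDateFormat("dd/MM/yy");

            // Interpret two-digit years in the 100-year window starting 1900...
            f.set2DigitYearStart(new GregorianCalendar(1900, 0, 1).getTime());
            System.out.println(f.parse("01/01/03")); // 1903

            // ...or in the window starting 2000: same input, different century.
            f.set2DigitYearStart(new GregorianCalendar(2000, 0, 1).getTime());
            System.out.println(f.parse("01/01/03")); // 2003
        }
    }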
That anything other than insertion/bubble sort was quite simply dark magic.
That XML would be a truly interoperable and human readable data format.
That C++ was somehow intrinsically better than all other languages.
This I received from a friend a couple of years ahead of me in college. I kept it with me for an embarrassingly long time (I'm blushing right now). It was only after working with it for 2 years or so that I could see the cracks for what they were.
No one - and nothing - is perfect, there is always room for improvement.
I believed that creating programs would be exactly like what was taught in class...you sit down with a group of people, go over a problem, come up with a solution, etc. etc. Instead, the real world is "Here is my problem, I need it solved, go" and ten minutes later you get another, leaving you no real time to plan out your solution efficiently.
I thought mainstream design patterns were awesome, when they were introduced in a CS class. I had programmed about 8 years as hobby before that, and I really didn't have solid understanding of how to create good abstractions.
Design patterns felt like magic; you could do really neat stuff. Later I discovered functional programming (via Mozart/Oz, OCaml, later Scala, Haskell, and Clojure), and then I understood that many of the patterns were just boilerplate, or additional complexity, because the language wasn't expressive enough.
Of course there are almost always some kind of patterns, but they are in a higher level in expressive languages. Now I've been doing some professional coding in Java, and I really feel the pain when I have to use a convention such as visitor or command pattern, instead of pattern matching and higher order functions.
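As a minimal Java sketch of that last point (hypothetical commands): once functions are first-class, the command pattern shrinks to a queue of lambdas instead of a one-method class per action.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class Commands {
        public static void main(String[] args) {
            Deque<Runnable> queue = new ArrayDeque<>();
            queue.add(() -> System.out.println("save"));  // each lambda is a "command"
            queue.add(() -> System.out.println("print"));
            while (!queue.isEmpty()) {
                queue.poll().run(); // execute the commands in order
            }
        }
    }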
For the first few years I was programming I didn't catch on that 1 Kbyte is technically 1024 bytes, not 1000. I was always a little perplexed by the fact that the sizes of my data files seemed slightly off from what I expected them to be.
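A minimal Java sketch of the discrepancy:

    public class Kilobytes {
        public static void main(String[] args) {
            long bytes = 3 * 1024 * 1024;                // a "3 MB" file as the OS counts it
            System.out.println(bytes);                   // 3145728, not the 3000000 you might expect
            System.out.println(bytes / 1024.0 / 1024.0); // 3.0 binary megabytes
        }
    }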
That condition checks like if (condition1 && condition2 && condition3) are performed in an unspecified order...
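In fact, C-family languages guarantee left-to-right evaluation with short-circuiting, which is exactly what makes guard conditions safe; a minimal Java sketch:

    public class ShortCircuit {
        public static void main(String[] args) {
            int[] a = {};
            int i = 0;
            // && evaluates left to right and stops at the first false operand,
            // so the bounds check safely guards the array access after it.
            if (i < a.length && a[i] > 0) {
                System.out.println("positive");
            } else {
                System.out.println("guarded: no out-of-bounds access");
            }
        }
    }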
That my programming would be faster and better if I performed it alone.

What scares you the most about the integrated IDE of most modern Smalltalks? [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 11 years ago.
As I'm riding the wave of resurgence of Smalltalk (especially because many Ruby-on-Rails people are rediscovering Smalltalk and seeing Seaside as their next upgraded web framework), I get questions like "yeah, but how do I use my favorite editor to edit Smalltalk code?" or "Does Smalltalk still insist on living in a world of its own?".
Now, having first experienced Smalltalk back in 1981, I don't understand these questions very well. It seems rather natural that I'd want the editor and debugger to be savvy of my current code state, and integrate with the change control system that is Smalltalk-aware. Using an external editor or debugger or change control manager would seem very awkward.
So what is it that scares you the most about not being able to edit the five-line methods in Smalltalk with your favorite editor, or use your favorite non-Smalltalk-aware change control system?
Everything's different. Want to go to the end of the line? It's not Ctrl-E. Want to jump a few words over, by word? It's not Meta-F....
Text editing is a fundamental programming activity. Messing with those inputs is messing with something deep in my mind.
Edit: and here is someone asking for emacs key bindings on comp.lang.smalltalk in 1987.
The only Smalltalk I've spent any time with is Squeak, so my views may not apply to other Smalltalk environments.
What concerns me about the image-based approach is that, while you have wonderful things in the Smalltalk environment, it is a walled garden that makes it difficult to interoperate with anything outside that environment. For example, what if I want to use external tools like Yacc and Lex? What if I want to use some C or Python programs to generate Smalltalk code? What if I want to mix Smalltalk in with a bunch of code written in other languages, editing code in all those languages in one editor and keeping it all stored in the same source-code tree?
I'm sure it's possible to deal with all these issues by having your Smalltalk environment invoke system functions to control external tools. But how easy is it to let external tools control your Smalltalk environment? In other words, what if I want Smalltalk to be just another component, rather than the master of everything?
Nothing scares me in particular, but I found working out the APIs in VW a bit of a chore, even when I had used other Smalltalks. The effect of the browsers is that you tend to see the APIs a little bit at a time, and quite often it's not immediately obvious where you should look for particular functionality.
Smalltalk also suffers a bit from the paradigm shift needed to understand how it works. When I was doing my bachelor's degree at university (some time after I had first encountered Smalltalk), I got to enjoy a bit of Schadenfreude watching everyone else in the class get over the initial paradigm hump as they learned the system (Squeak) for the first time.
I think the combination of the paradigm shift and functionality being somewhat buried in the class libraries makes for a bit of a steep learning curve. ST had a reputation for a fairly steep learning curve to really come up to speed - most of this is due to the large class libraries and the fact that most of the language functionality is buried somewhere in the libraries.
Also (and sadly), Java came along in the mid-1990s and grabbed all of the mindshare. The major Smalltalks have either died completely or been sold off to niche players. It's quite ironic (in a happy way) that Ruby has served to re-awaken interest in Smalltalk, but the lingering perception of "also-ran" obsolescence doesn't help.
See this post of mine for some pontification about the merits (as I see them) of getting heavily involved in Smalltalk in this day and age.
I would be quite happy to go back into Smalltalk if the opportunity were to arise.
The one big show-stopper for me is that code I write on one Smalltalk VM is STILL, after all these years, not compatible with other Smalltalk VMs.
I understand why that is: the core of Smalltalk is an extremely small set of axioms and keywords. This means that after 30 minutes of learning Smalltalk, you're already learning the API library rather than the language itself. I like that approach to language design.
What it all boils down to however, in the Smalltalk world, is that unless a consensus is reached between all VM vendors to have a common base Standard API, my Smalltalk code written for one VM is almost certain not to run on other VMs when I decide to switch.
This also has the corollary of obsoleting part of my knowledge of the space when I switch VMs.
Note that I have barely tried Smalltalk in my life. I'm far from being an expert. This understanding comes from speaking with James Robertson about a month ago.
Another point I'd like to make is that Seaside does in fact run on most popular Smalltalk VMs. I wonder how much of (what should have been) a Standard API they had to build for themselves to achieve that feat.
With all that said, I always have an ear out to hear more about the state of Smalltalk. I do want to try out Smalltalk's very powerful development environment (and its other goodies).
I know it's late, but the biggest annoyance for me is that there is no really good editor in any of the Smalltalks. It's a thing I cannot understand: working with text is so essential, and yet so poorly supported...
You're always just staring at one method, and then you need some method finder or another browser around just to check another method. This is what I really dislike...
While the restricted Smalltalk environment made things like relying on a database-driven source control system possible at a time when other languages still struggled with having a proper editor, it makes integration very hard these days.
With tools like Eclipse or Team Foundation Server you get so used to having all tools integrate with each other. E.g. if a requirement is created, it is automatically linked to the change sets that the programmer commits to implement that requirement. This "boundary breaking" between formerly different tools is nearly impossible in the Smalltalk world, but with bigger projects, bigger teams, higher levels of abstraction and so on you need tools which are more than a fancy editor and help you throughout a full software development life cycle.
No useful support for navigating with the keyboard, or supporting platform UI behavior.
While it's true you don't really need an incredible text editor for (well-written) Smalltalk, being able to move around the environment while keeping your hands on the keyboard is quite useful (and in my case, essential to reducing RSI). I just was trying VisualWorks' inspector and the arrow keys didn't even work properly to move up and down a list. When I hit the space bar, I got a walkback. Sigh.
For the Windows world, there is nothing like Dolphin Smalltalk. The IDE is fantastic. Another quality product if you want to try is Visualworks, it works well, has a very fast VM and the documentation is pretty good.
I've used both in the past, there is nothing to fear.
