What is COBOL used for? [closed]

What is COBOL used for?

COmmon Business Oriented Language, 'invented' by Grace Murray Hopper (read about her; she is one of the pivotal people in the development of computing as we know it today). The general idea was to produce a language that was English-based, as opposed to mathematically based or expressed as such in the code.
Very simply put, you would use a construct like
ADD YEARS TO AGE
as opposed to
age = age + years
or
age += years
Appearing in the early 1960s, it was massively adopted for processing in the area of business. There is still a large volume of applications built in COBOL running and being maintained, and it is still very much alive and kicking. Around 1997 Gartner reported that 80% of the world's business ran on COBOL, with over 200 billion lines of code in existence and an estimated 5 billion lines of new code annually. So you could do a lot worse than learn COBOL to ensure you have a job for life.
The structure of a COBOL program is summarised in the mnemonic "In Every Damn Program" (Identification, Environment, Data, Procedure), meaning that there is an:
Identification Division giving information about the program
Environment Division describing the hardware
Data Division describing the data (in my day we used CODASYL, now better known and newly re-invented as NoSQL)
Procedure Division 'Here be code'
Because of the legacy from punch cards (yes, I used them as well) you always started the code in column 8 or beyond, else some compilers would not recognise it (shades of Python, where whitespace is significant).
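A minimal sketch, with invented program and data names, showing the four divisions in order and that fixed-format layout (division headers from column 8, statements from column 12):

      * Illustrative sketch only - names are invented.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. AGE-DEMO.
       ENVIRONMENT DIVISION.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  AGE    PIC 99 VALUE 30.
       01  YEARS  PIC 99 VALUE 5.
       PROCEDURE DIVISION.
      * 'Here be code'
           ADD YEARS TO AGE
           DISPLAY "NEW AGE IS " AGE
           STOP RUN.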
It is of course a compiled language.
Where is it used? Governments, the military, and businesses of all sizes, but usually the larger corporates, so I suppose you could say everywhere. I believe the US's social welfare system runs on several million lines of COBOL written in the mid-60s. Experian, a large UK-based credit rating company, uses it throughout their operation with interfaces to the web. Again, in the UK most of the building societies and banks run their core systems on it.
I could go on, but I won't; go and read about it. And by the way, you can even get Object-Oriented COBOL if you want.

Ever use a credit card? Your transaction is probably touching COBOL code on the backend.

The Right Tool for the Right Job
Batch
The most salient point about COBOL is not its verboseness. It is that it was primarily designed, as a language, to do batch processing. Its I/O functionality in that regard is exceptionally efficient.
Even though it predates OOLs by a geological epoch, it is useful when speaking to a modern-day OO programmer to describe batch programming and COBOL from an OO point of view. Describing it like this, though historically incorrect, helps OO programmers conceptually.
To wit, the utterly fallacious, and yet very true:
COBOL has been "optimized" to iterate over large, nay, vast sequential "collections"
(i.e. batches, also known as files). In fact, it is so optimized, that all the OO
functionality has been stripped out, leaving a basic API that opens files, processes
records, and closes files. In more complex versions of the basic algorithm, multiple
files are opened, their records matched to each other and manipulated to produce one
or more output files (batches).
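A rough sketch of that basic single-file pass (file names, record sizes and the OpenCOBOL-style file assignment are invented for illustration; a real batch program adds record matching, control breaks, checkpointing and so on):

      * Illustrative sketch only - open, read until EOF, write, close.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. BATCH-SKETCH.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT IN-FILE  ASSIGN TO "CUSTOMER.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
           SELECT OUT-FILE ASSIGN TO "REPORT.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  IN-FILE.
       01  IN-REC   PIC X(80).
       FD  OUT-FILE.
       01  OUT-REC  PIC X(80).
       WORKING-STORAGE SECTION.
       01  EOF-FLAG PIC X VALUE "N".
       PROCEDURE DIVISION.
           OPEN INPUT IN-FILE
                OUTPUT OUT-FILE
           PERFORM UNTIL EOF-FLAG = "Y"
               READ IN-FILE
                   AT END
                       MOVE "Y" TO EOF-FLAG
                   NOT AT END
                       MOVE IN-REC TO OUT-REC
                       WRITE OUT-REC
               END-READ
           END-PERFORM
           CLOSE IN-FILE
                 OUT-FILE
           STOP RUN.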
Where COBOL was co-opted for non-batch processes, for instance pseudo-conversational programming (backing up CICS "green" screens - aka BMS), it was least suitable. Not surprisingly, it is this functionality that has been most quickly replaced by GUI apps written in OOLs.
The Editor
The ISPF Editor on IBM mainframes has been optimized to handle the kind of coding COBOL requires. The basic unit of manipulation in the editor is the line. By default, vertical alignment is static and not flowed or shifted based on context; typing to the end of a line results in a keyboard lock. Because of this "conservation of vertical alignment," it is relatively easy to duplicate lines or blocks of lines, and to align them with editor commands. With COBOL, vertical alignment, as a legibility issue, is of greater importance than in OO languages.
It is difficult to describe in a post, but having facility in both programming worlds and with both types of editors, I have to say that I would not want to edit COBOL in an IDE style editor, and I would not want to edit Java and C-family languages in an ISPF Editor. (I imagine you can plug-in an ISPF style editor into the various IDEs, but I haven't had the need to go there.)
N.B. OO COBOL has its uses, but not as a new way to re-engineer code that handles batch processing.

From my admittedly limited experience, COBOL is used a lot with IBM mainframe systems. So I believe that in any situation where I/O is the emphasis (as mentioned above: financial systems, insurance companies, government, etc.), to the extent that a mainframe is needed or preferred and has been around for a while, COBOL is probably used. I say "been around for a while" since these days I do not hear much about COBOL being a go-to language.

Cobol is used primarily for financial processing. Any time banks, brokerage houses, credit card vendors, et al. are doing business, there will be Cobol in the mix.

The ANSI standard for COBOL and some compilers have evolved considerably in the last 15 years and now include libraries for creating and operating web content and interactive sites, for data communications, and for running on small processors and handheld devices. Well-known versions are prefixed with characters like MF, CIS, RF, RM, or the names of mainframe manufacturers old and new, for versions used primarily in data processing installations.

Today COBOL is used only because it used to be popular back in the day, and many old large businesses don't want to re-write their code into a modern language. (mainly cost + time)
The maximum length of a line of COBOL code is 72 characters. Why, you ask? Because code was punched onto 80-column cards and the last 8 columns were reserved for sequence numbers. Even now most shops still code within those columns, although later standards do allow free-format source...
COBOL is an evil, ancient language that has little use any more, unless you are extending OLD programs...

COBOL is used for Business applications. Fortran is for scientific apps. C and C++ for hardware and firmware. Java for the web.
Then you may ask, why COBOL? Well, COBOL is about ten times easier to program for business than any of the other languages.
For example, to move a numeric to a report field and format it as a currency:
MOVE VAL-A TO REPORT-FIELD-A.
There are no getter or setter methods needed. No need to program two methods for each MOVE statement.
And all the conversion to display characters and the formatting to $99,999.99 IS automatic. Try that in any of the other languages.
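A sketch of why that works (the field names here are invented): the receiving field is declared with an edited PICTURE, so the MOVE itself does the type conversion and the currency editing:

      * Illustrative sketch only - in the DATA DIVISION:
       01  VAL-A           PIC 9(5)V99  VALUE 12345.67.
       01  REPORT-FIELD-A  PIC $ZZ,ZZ9.99.
      * In the PROCEDURE DIVISION:
           MOVE VAL-A TO REPORT-FIELD-A.
      * REPORT-FIELD-A now holds "$12,345.67" - no getters,
      * setters or format strings required.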
The dirty little secret is that COBOL is really a glorified ASSEMBLER MACRO language. There is even a compiler option to print the assembler code. That makes it easy to understand and powerful.
COBOL: Easy, quick, accurate, readable and maintainable. Everything a boss could ask for.

Related

Moving away from Itanium [closed]

We currently have a large business-critical application written in COBOL, running on OpenVMS (Integrity/Itanium).
As the months pass, there is more and more speculation about the lifetime of the Itanium architecture. Nothing is said out in the open, of course, but articles like this and this paint a worrying picture. Although I can find nothing official to support this, there are even murmurings in the corridors of our company of HP ditching OpenVMS and HP COBOL along with it.
I cannot believe that we are alone in this.
The way I see it, there are a few options:
Emulate some old hardware and run the application on that using a product like CHARON-VAX or CHARON-AXP. The way I see it, the pros are that the process should be relatively painless, especially if the 64-bit (AXP) option is used. Potential cons are a degradation in performance (although this should be offset by faster and faster hardware);
Port the HP COBOL-based application to a more modern dialect of COBOL, such as Visual COBOL. The pros, then, are the fact that the porting effort is relatively low (it's still COBOL) and the fact that one can run the application on a Unix or Windows platform. The cons are that although you're porting COBOL, the fact that you're porting to a different operating system could make things tricky (esp. if there are OpenVMS-specific dependencies);
Automatically translate the COBOL into a more modern language like Java. This has the obvious benefit of immediately freeing one from all the legacy issues in one fell swoop: hardware support, operating system support, and especially finding administrators and programmers. Apart from this being a big job, an obvious downside is the fact that one will end up with non-idiomatic Java (or whatever target language is ultimately chosen); arguably, this is something that can be ameliorated over time.
A rewrite, from the ground up (naturally, using modern technologies). Anyone who has done this knows how expensive and time-consuming it is. I've only included it to make the list complete :)
Note that there is no dependency on a proprietary DBMS; the database is ISAM file-based.
So ... my question is:
What are others faced with the imminent obsolescence of Itanium doing to maintain business continuity when their platform of choice is OpenVMS and COBOL?
UPDATE:
We have had an official assurance from our local HP representative that Integrity/Itanium/OpenVMS will be supported at least up until 2022. I guess this means that this whole issue is less about the platform, and more about the language (COBOL).
The main problem with this effort will be the portions of the code that are OpenVMS specific. Most applications developed on OpenVMS typically use routines and procedures that are not easily ported to another platform. Rather than worry about specific language compatibility, I would initially focus on the runtime routines and command procedures used by the application.
An alternative approach may be to continue to use the current application while developing a new one or modifying a commercially available application to suit your needs. While the long-term status of Itanium is in question, history indicates that OpenVMS will remain viable for some time to come. There are still VAX machines being used today for business-critical applications. The fact that OpenVMS and its hardware are stable is the main reason for its longevity.
Dan
Looks like COBOL is the main dependency that keeps you worried. I understand that Itanium+OpenVMS in this picture is just a platform.
You're definitely not alone running mission-critical stuff on OpenVMS. The HP site has an OpenVMS roadmap (both Alpha and Integrity); support currently stretches to 2015. Oracle seems to be trying to leverage its Sun assets in different domains recently.
In any case, if your worries are substantial (sure, we all worried about Compaq, then HP, and the VAX >> Alpha >> Itanium transitions in the past), there's time to untie the COBOL dependency.
So I would look now into charting out a migration path from COBOL to a more portable language of choice (e.g. ANSI C/C++ without platform extensions). Perhaps Java isn't the friendliest choice, given Oracle's activity. A rewrite, however unpleasant, will be more progressive and will likely streamline the whole process. The sooner one starts, the sooner one completes.
Also, in addition to emulators, there's still plenty of second-hand hardware. Ironically, one company I know is just now phasing in Integrity platforms to supplant mission-critical Alphas -- I guess it's "corporate testing requirements"...
Do-nothing is an option as well, though obviously riskier: OpenVMS platforms are proven to be dependable, so alternatively, finding a reliable third-party support company may extend your future hardware contingency.
This summer's Rolling Roadmap makes porting off OpenVMS look like an excellent idea.
Given how much COBOL exists in the world finding people to support COBOL will not be a problem for the foreseeable future. As noted above there are COBOL compilers on other platforms. The problem lies in the OpenVMS system service calls and DEC language extensions your application uses. You do not mention where your data is stored, so worst case your COBOL uses RMS. There is a company that provides an implementation of many OpenVMS system services on Linux and the Unixes. Not needing to replace those services while porting to another operating system may reduce the complexity. Check out Sector7.com.

Cobol technical demonstration [closed]

I'm working as a COBOL programmer - with 6 months of work experience - for a consulting firm. Today, with the rest of the COBOL "department", I had a meeting with the new director of my company.
After an initial analysis made by the new team in charge, they noticed that, compared to the other services/technologies our company has to offer - Java, C++, Objective-C, etc. - COBOL was lacking "advertisement". He stated that whenever members of other teams are between projects they implement small demos which can then be shown to our clients whenever there is a presentation of our company. He gave examples of widgets for mobile devices in Objective-C, cool web pages with HTML5, etc.
And he noticed that there is nothing like that in COBOL. So he wants us to develop some kind of tool/app to show off our competences.
We already told him that COBOL is used under the hood and doesn't have bells and whistles to show. Also that, when hiring a COBOL programmer/analyst, the most important thing for the client is that he shows he has "business logic" knowledge.
I know that the coding part is very important, but after a two-week introduction to the mainframe environment and to COBOL I was capable of doing my programming tasks easily. Yet with 6 months of experience working in a bank I find myself having to ask the "gurus" questions about "business logic" nearly every day, just so that I don't end up changing the logic of the process when performing some maintenance task.
Can we really make something (tool/app) that shows our future clients that we have something different to offer than all the other companies which provide the same services? Or, if not, is there anything that we can say to our manager so that he understands that COBOL is different from the other languages and its purpose is not to display pretty graphics on the screen?
Thank you!
I think you're absolutely right - the "curly brace" languages have had a lot of brand marketing from IBM, Oracle/Sun and Microsoft over the last 20 years.
Micro Focus is trying to do something about this. They just relaunched www.cobol.com which includes a nice set of video testimonials to the power of COBOL.
There's a great video available showing some of the demos Micro Focus built as part of the development of the soon to be released "Visual COBOL R3" product. These demos include web services, rich browser-based clients, parallel processing and more (they aren't all shown in the video). Micro Focus will shortly be launching new resources for COBOL developers that will include these samples.
COBOL gets a bad rap in large part because there is the conception that it's outdated and unable to communicate with modern systems. A lot of people think that using COBOL "locks you in," unable to get data from other systems, push data to other systems, or call code that's written in other languages.
Potential clients would be very interested to know that, if they went with your COBOL solution, they could still share data with the rest of the world. I think you could build some compelling demos showing COBOL applications interacting with Web APIs or with code written in C++, Objective-C, etc. Two possibilities:
1) A simple program that submits a query to Google's search API and displays the returned results. This will show that you can issue HTTP requests and parse the responses.
2) A program that gathers data from a custom database and passes that data to a C++ function that somehow grinds up the data and returns a result. This will show that you can use COBOL's data handling strength and pass off to C++ or some other language the more processor-intensive calculation tasks that COBOL traditionally has not been as good at.
I think that's your biggest selling point: that you use COBOL for its strengths, that you're able to share data with the rest of the world, and your COBOL programs can leverage other tools for specific tasks that are not COBOL's primary function.
Having a slick demo in your back pocket to show (snow) clients is kind of like
selling sizzle not the steak. Demo applications may illustrate how skillfully your staff
can knock off a relatively light-weight problem with a lot of flair. But COBOL is less about sizzle and
more about the steak - a marketing nightmare if style is emphasized over content.
Basically, a hot demo illustrates to clients that you have some imagination, know how
to write code within a given framework that integrates various services. Undeniably these are all good selling points.
But COBOL applications don't make for great demos. Most clients running large COBOL shops already know this.
It is unlikely they would be impressed by a demo - unless it addressed some specific point of interest
to them. From my 20+ years experience as a contract COBOL
programmer, I can say my clients are mostly interested in a solid resume backed up with solid references that illustrate a
depth of experience in specific application domains (eg. accounting, human resources, inventory management,
and the like).
A solid company portfolio is probably more important than a demo. A few things you might want to outline are:
Type of businesses that you have supplied resources to (banking, finance, government, retail).
Range of technologies, middleware and frameworks employed on given projects (eg. CICS, MQ-Series, DB/2,
WebSphere, FTP, XML, RD/z, SAP, SAS etc.)
Nature of skills provided to past clients (eg. programmers, technical analysts, business analysts, DBA,
capacity planning, security analysts, data modeling, architect, project managers etc.)
Other than "putting bums in seats", point to any added value your company brought to these projects (eg.
found a no-cost optimization that reduced a client's batch schedule by 2 hours).
Put together a project case history to show how your company was able to do something special
and different for a client.
Point to any technologies, frameworks or application domains where you feel your company has some
sort of competitive advantage over the "other guys".
Try to address questions your prospective clients may have about you.
Big COBOL shops have their own way of doing things. They often use a similar set of tools, but they
all seem to have their own peculiar way of integrating them. They don't want some contractor coming in to
rock their boat. But you can impress them by showing that your "guys" bring clear
thinking, solid skills and a strong work ethic into the projects they have participated in.
Above all, make the most of what you have accomplished but never over state or fluff up your
abilities beyond the absolute truth. If a client ever catches a whiff of BS from anything
you present, they will be out the door before you can blink.
This is a late entry, but I had some fun linking OpenCOBOL WORKING-STORAGE data to CERN's ROOT/CINT framework. CINT is a way cool C/C++ interpreter, ROOT is the framework for visual analysis of particle collider and high energy physics data. Interactive charts and graphs from COBOL working storage.
A small example:
OCOBOL >>SOURCE FORMAT IS FIXED
*> ***************************************************************
*> Author: Brian Tiffin
*> Date: 20101119
*> Purpose: Pass arguments to ROOT/CINT invoked subprograms
*> Tectonics: cobc -fimplicit-init -C cobparams.cob
*> ***************************************************************
REPLACE ==ARRAYSIZE== BY ==450==.
identification division.
program-id. cobfloats.
data division.
working-storage section.
01 cnt pic 999.
01 val usage float-short.
01 xes.
02 an-x usage float-short occurs ARRAYSIZE times.
01 yes.
02 an-y usage float-short occurs ARRAYSIZE times.
linkage section.
01 vxes.
02 an-x usage float-short occurs ARRAYSIZE times.
01 vyes.
02 an-y usage float-short occurs ARRAYSIZE times.
*> ***************************************************************
procedure division using by reference vxes, vyes.
perform varying cnt from 1 by 1 until cnt >= ARRAYSIZE
compute val = cnt * function random() end-compute
move cnt to an-x in xes(cnt)
move val to an-y in yes(cnt)
end-perform
move xes to vxes
move yes to vyes
move cnt to return-code
goback.
end program cobfloats.
and a sample run
$ cobc -fimplicit-init -C cobparams.cob
$ vi cobparams.c
(Add a #pragma K&R to the top of cobparams.c so CINT lets up on type safety)
[btiffin#home cobol]$ root -l
root [0] gSystem->Load("/usr/local/lib/libcob.so");
root [1] .L cobparams.c+
root [2] int a = 0; float x[450]; float y[450];
root [3] a = cobfloats(&x, &y);
root [4] a
(int)450
root [5] TGraph *graph1 = new TGraph(450, x, y);
root [6] graph1->Draw("A*");
which produces an interactive graph of some arbitrarily constrained random numbers.
As you are working in a mainframe environment (I presume it's IBM) I suggest you check out the following link http://www-01.ibm.com/software/awdtools/cobol/ to IBM'S COBOL compiler section. There's information there about linking COBOL with web services and Java, in a mainframe environment.
As a previous answer mentioned, you could also look at Visual COBOL from Micro Focus. It's not a mainframe product but a COBOL compiler within Visual Studio that deploys to .NET.
http://www.microfocus.com/products/micro-focus-developer/micro-focus-cobol/windows-and-net/micro-focus-visual-cobol.aspx
The kind of situation you may be looking for software for is called rapid prototyping. This involves a program code or script generator to demonstrate the visual appearance of displays or forms on a computer screen, and it can include some elements of data input and advancing to further screen pages. This is the data-processing-department style of presentation, rather than the graphics, web access and so on that a programmer of a desktop or laptop/notebook would probably encounter and use.
I realize this is quite a late entry, but why don't you highlight COBOL at its strongest -- churning through massive chunks of data while applying business logic and extracting current state.
May I suggest you prepare a nice set of reports on your current database. Extract, detail, summarize, and slice and dice the data a dozen different ways. Make sure you note the runtimes to create the report suite compared to other languages trying to do the same things. And note the readability of the code (and thus, modifiability of the report suite) compared to what other languages might offer.
Extract to a CSV while you are at it and add in some Excel (or other spreadsheet) graphing and data analysis.
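For the CSV extract, a minimal sketch (the record layout is invented; a real extract would loop over the whole file) is little more than a STRING statement per detail line:

      * Illustrative sketch only - in WORKING-STORAGE:
       01  CUST-NAME    PIC X(20)   VALUE "SMITH".
       01  CUST-BAL     PIC 9(7)V99 VALUE 1234.50.
       01  BAL-DISPLAY  PIC 9(7).99.
       01  CSV-RECORD   PIC X(80).
      * In the PROCEDURE DIVISION, for each detail record:
           MOVE CUST-BAL TO BAL-DISPLAY
           MOVE SPACES TO CSV-RECORD
           STRING CUST-NAME   DELIMITED BY SPACE
                  ","         DELIMITED BY SIZE
                  BAL-DISPLAY DELIMITED BY SIZE
               INTO CSV-RECORD
           END-STRING
      * Write CSV-RECORD to a LINE SEQUENTIAL file and a
      * spreadsheet will open it directly.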

migrate COBOL code

I have a task to convert COBOL code to .NET. Are there any converters available? I am trying to understand the COBOL code at a high level, but I have trouble understanding it. Are there any flowchart generators? I appreciate any help.
Thank you..
Migrating software systems from one language or operating environment to another is always a challenge. Here are
a few things to consider:
Legacy code tends to be poorly structured as a result of a
long history of quick fixes and problem work-arounds. This really lowers the signal-to-noise ratio
when trying to wrap your head around what is really going on.
Converting code leads to further "de-structuring"
to compensate for mis-matches between the source and
target implementation platforms. When you start from a poorly structured base (legacy system),
the end result may be totally un-intelligible.
Documentation of the legacy architecture and/or business processes is generally so far out of
date that it is worse than useless, it may actually be misleading.
Complexity of COBOL code is almost always underestimated.
A number of "features" will be propagated into the converted system that were originally
built to compensate for things that "couldn't be done" at one time (due to smaller memories,
slower computers etc.). Many of these may now be non-issues and you really don't want them.
There are no obvious or straightforward ways to refactor legacy process-driven
systems into an equivalent object oriented system (at least not in a meaningful way).
There have been successful projects that migrated COBOL directly into Java. See naca.
However, the end result is only something its mother (or another COBOL programmer) could love, see this discussion
In general I would be suspicious of any product or tool making claims to convert your COBOL legacy
system into anything but another version of COBOL (e.g. COBOL.net). To this end you still
end up with what is essentially a COBOL system. If this approach is acceptable then you
might want to review this white paper from Micro Focus.
IMHO, your best bet for replacing COBOL is to re-engineer your system. If you ever find
a silver bullet to get from where you are to where you want to be - write a book, become
a consultant and make many millions of dollars.
Sorry to have provided such a negative answer, but if you are working with anything
but a trivial legacy system, the problem is going to be anything but trivial to solve.
Note: Don't bother with flowcharting the existing system. Try to get a handle on process input/output and program to program data transformation and flow. You need to understand the business function here, not a specific implementation of it.
Micro Focus and Fujitsu both have COBOL products that work with .NET. Micro Focus allow you to download a product trial, while the Fujitsu NetCOBOL site has a number of articles and case studies.
Micro Focus
http://www.microfocus.com/products/micro-focus-developer/micro-focus-cobol/windows-and-net/micro-focus-visual-cobol.aspx
Fujitsu
http://www.netcobol.com/products/Fujitsu-NetCOBOL-for-.NET/overview
[Note: I work for Micro Focus]
Hi
Actually, making COBOL applications available on the .NET framework is pretty straightforward (contrary to the claim made in one of the earlier responses). Fujitsu and Micro Focus both have COBOL compilers that can create ILASM code for execution in the CLR.
Micro Focus Visual COBOL (http://www.microfocus.com/visualcobol) makes it particularly easy to deploy traditional, procedural COBOL as managed code with full support for COBOL data types, file systems etc. It also includes an updated OO COBOL syntax that takes away a lot of the verbosity and complexity, making it very easy to write COBOL code based on C# examples. Its unique approach also makes it easy to use all the Visual Studio tools such as IntelliSense.
The original question mentioned "convert" and I would strongly recommend against any approach that requires the source code to be converted to some other language before being used in a .NET environment. The amount of effort and risk involved is highly unlikely to be worth any benefits accrued. On the contrary, keeping the code in COBOL maintains the existing, working code and allows for the option to deploy onto other platforms in the future. For example, how about having a single set of source code and having the option to deploy into .NET as a native language and into a Java environment without changing a line of source code?
I recommend you get a trial copy of Visual COBOL from the link above and see how you can use your existing code in .NET without making any changes.
This is not an easy task. COBOL has fundamental ideas about data types that do not map well onto the object-oriented .NET framework (e.g. in COBOL, all data types are represented in terms of fixed-size buffers), and in particular the way groups and arrays work does not map well to .NET classes.
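For example (an invented layout, just to illustrate the point): a group item is a fixed-size byte buffer whose parts can overlay each other, which has no direct equivalent in a .NET class:

      * Illustrative sketch only - names are invented.
       01  CUSTOMER-REC.
           05  CUST-ID      PIC 9(6).
           05  CUST-NAME    PIC X(30).
           05  CUST-PHONE   PIC X(12) OCCURS 3 TIMES.
      * The same 72 bytes viewed as one big character field:
       01  CUSTOMER-TEXT REDEFINES CUSTOMER-REC PIC X(72).

A translator has to decide whether that becomes a class with properties, a raw byte array, or both, and whichever it picks loses something.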
I believe there are COBOL compilers that can actually compile .NET bytecode, but they would have their own runtime libraries to manage all of that. It might be worth looking at one of these compilers and simply leaving the legacy code in COBOL.
Other than that, line-by-line translation is probably not possible. Look at the code at a higher level and translate blocks of code at a time (e.g. at the procedure level or even higher).
There are a lot of mechanisms for converting COBOL to modern scalable environments, such as .NET or Java.
The first is migration to a new environment, keeping the existing COBOL code with some minor modifications (e.g. Micro Focus COBOL for .NET);
The second is migration to a new platform with simulation of COBOL statements and constructs, where additional .NET/Java libraries simulate specific COBOL logic:
ACCEPT goes to NETLibrary.Accept and so on.
The third approach is the most valuable one: you migrate to "pure" .NET/Java code with all the benefits of the new environment. It can be easily maintained and developed in the future.
However, unique expertise and toolkits are required for this approach, and there are only a few players on the global market that can help you in this case.
If we are talking about automatic migration, the number of players decreases greatly and, unfortunately for you, you have to pay for the specific technologies and tools (like ours).
However, it is a better idea to invest your money in your future growth in the modern environment, than to spend your money on the "simulation" of old technologies.
Translation is not an easy task. Besides Micro Focus and Fujitsu, there is also Raincode, which offers a free version of COBOL that nicely integrates with Visual Studio.

I'm interested in Programming Languages. What areas of programming are good for me?

I've always been interested in writing and designing programming languages. Of course, it's pretty difficult to find an employer that will let you write a programming language as part of your job. So I'm looking for the "next best thing".
What fields of programming will let me get some experience solving some related problems? Or what kinds of employers are most likely to view all of my dinky little interpreters as relevant experience?
If your interest in language design is irrepressible, get a Ph.D. and make it your area of research. You can count on academia to support all manner of unprofitable activity.
None. The bulk of the professionals in that field do not design languages for a living, but retarget existing compilers to new (usually embedded) targets, or work on source2source conversion systems for legacy code, making a few language extensions in the process.
You should really ask yourself if you want this, because, barring an extremely lucky shot, that is the realistic outlook of what you will do if you go into this industry.
Remember that the big public toolchain industry is not very profitable at the moment, and that maybe a good 100 languages are in large-scale public use and continually maintained, after 30 years of programming language creation.
I know this is very gloomy, but I hope it sets you on the path to chuck the romantic, hobbyist view, and start researching what the real world in this field looks like.
Moreover, having done small hobby projects on your own is not really a plus. You need to show that you can work on large projects in a team, more than that you can create a small interpreter on your own. If you really want to pursue this, I'd recommend:
Stay in school, and get a bachelor's (preferably a master's or PhD) in CS.
Join some open-source team that works on a significant project in the field: GCC, but also the Java world, TraceMonkey (Mozilla), Mono, etc. Verifiable experience in real-world scenarios is very important.
I think the best way to get into this type of work would be to undertake an advanced degree with a specific focus on language design, compilers etc. It's going to be very tough for you to walk in off the street into a private company and start writing new language features otherwise.
You could also shoot a little higher and on your own, or with a small team, produce something that is much more than just a dinky little interpreter. Show your potential employer that you can produce something useful.
I have worked as an embedded programmer for the past ten years. Before that I wrote compilers (and assemblers, linkers, debuggers, etc.) for 20 years.
My co-workers joke that I turn every problem in to a parsing problem. And they're right. I've used techniques that are appropriate for language design many times during the course of my career.
Today, I play around with compiler stuff on the side: http://ellcc.org. It helps me scratch my language itch.
Actually, there is a fair bit of work going on with visual programming. It isn't exactly traditional programming language work as we know it but there is a need for it. For example, a lot of advanced data analysis tools rely on visual programming tools (Pentaho). You don't have to look too hard to find good practical uses of visual programming.
To get into visual programming languages, you will need to do an advanced degree with an advisor in the area. You will need to do some human computer interaction / interface work in addition to the programming language stuff.
An employer that has a rich "domain" (i.e. a complex industry) can benefit from a "domain specific language".
Will they realise this? Unlikely. They'll be too likely trapped in their deep domain (and entrenched legacy systems) to see that a targeted language could help unclog the mire.
But if you bury yourself in a complex industry for long enough to gain rich domain knowledge you may then be able to turn them with your own skunkwork DSL. Slim chance.
Stay in academia. If you want to develop a new language your chances of being paid to do so are vanishingly small. Newer languages tend to be expressions of a novel problem domain, and you only really encounter the chance to develop them where (a) novel problems are part of the scenery, and (b) no-one is troubled by the necessity to actually earn a living.
Please take your time over it, as well. Speaking as a jobbing developer, the last thing I need is another blasted language to learn :-)
In static analysis there is a lot to do, and the problems that come up are related to those that interest you.
Most currently popular languages came out of a genuine NEED to scratch a particular ITCH. Python came about because some non-C programmers NEEDed to customize inputs to their C programs and libraries. Lua came out of the NEED to embed a scripting language into C programs. Erlang was created to address the NEED of 99.9999999% uptime, hot code loading, and highly concurrent execution. Perl came out of the NEED to easily write programs that parsed text files.
So the very simple question any employer will be asking themselves, and you should ask yourself is. What NEED can I supply a solution to that doesn't exist. Hobby work very seldom shows that you are providing solutions to a NEED, most of the time it is showing that you like to re-invent the wheel for the sake of re-inventing the wheel.

Textual versus Graphical Programming Languages [closed]

I am part of a high school robotics team, and there is some debate about which language to use to program our robot. We are choosing between C (or maybe C++) and LabVIEW. There are pros for each language.
C(++):
Widely used
Good preparation for the future (most programming positions require text-based programmers.)
We can expand upon our C codebase from last year
Allows us to better understand what our robot is doing.
LabVIEW
Easier to visualize program flow (blocks and wires, instead of lines of code)
Easier to teach (Supposedly...)
"The future of programming is graphical." (Think so?)
Closer to the Robolab background that some new members may have.
Don't need to intimately know what's going on. Simply tell the module to find the red ball, don't need to know how.
This is a very difficult decision for us, and we've been debating for a while. Based on those pros for each language, and on the experience you've got, what do you think the better option is? Keep in mind that we aren't necessarily going for pure efficiency. We also hope to prepare our programmers for a future in programming.
Also:
Do you think that graphical languages such as LabVIEW are the future of programming?
Is a graphical language easier to learn than a textual language? I think that they should be about equally challenging to learn.
Seeing as we are partially rooted in helping people learn, how much should we rely on prewritten modules, and how much should we try to write on our own? ("Good programmers write good code, great programmers copy great code." But isn't it worth being a good programmer, first?)
Thanks for the advice!
Edit:
I'd like to emphasize this question more:
The team captain thinks that LabVIEW is better for its ease of learning and teaching. Is that true? I think that C could be taught just as easily, and beginner-level tasks would still be around with C. I'd really like to hear your opinions. Is there any reason that typing while{} should be any more difficult than creating a "while box?" Isn't it just as intuitive that program flows line by line, only modified by ifs and loops, as it is intuitive that the program flows through the wire, only modified by ifs and loops!?
Thanks again!
Edit:
I just realized that this falls under the topic of "language debate." I hope it's okay, because it's about what's best for a specific branch of programming, with certain goals. If it's not... I'm sorry...
Before I arrived, our group (PhD scientists, with little programming background) had been trying to implement a LabVIEW application on-and-off for nearly a year. The code was untidy, too complex (front and back-end) and, most importantly, did not work. I am a keen programmer but had never used LabVIEW. With a little help from a LabVIEW guru who could help translate the textual programming paradigms I knew into LabVIEW concepts, it was possible to code the app in a week. The point here is that the basic coding concepts still have to be learnt; the language, even one like LabVIEW, is just a different way of expressing them.
LabVIEW is great to use for what it was originally designed for. i.e. to take data from DAQ cards and display it on-screen perhaps with some minor manipulations in-between. However, programming algorithms is no easier and I would even suggest that it is more difficult. For example, in most procedural languages execution order is generally followed line by line, using pseudo mathematical notation (i.e. y = x*x + x + 1) whereas LabVIEW would implement this using a series of VI's which don't necessarily follow from each other (i.e. left-to-right) on the canvas.
Moreover programming as a career is more than knowing the technicalities of coding. Being able to effectively ask for help/search for answers, write readable code and work with legacy code are all key skills which are undeniably more difficult in a graphical language such as LabVIEW.
I believe some aspects of graphical programming may become mainstream - the use of sub-VIs perfectly embodies the 'black-box' principle of programming and is also used in other language abstractions such as Yahoo Pipes and the Apple Automator - and perhaps some future graphical language will revolutionise the way we program, but LabVIEW itself is not a massive paradigm shift in language design: we still have while, for, if flow control, typecasting, event-driven programming, even objects. If the future really will be written in LabVIEW, C++ programmers won't have much trouble crossing over.
As a postscript I'd say that C/C++ is more suited to robotics since the students will no doubt have to deal with embedded systems and FPGAs at some point. Low-level programming knowledge (bits, registers etc.) would be invaluable for this kind of thing.
#mendicant Actually LabVIEW is used a lot in industry, especially for control systems. Granted, NASA is unlikely to use it for on-board satellite systems, but then software development for space systems is a whole different ball game...
I've encountered a somewhat similar situation in the research group I'm currently working in. It's a biophysics group, and we're using LabVIEW all over the place to control our instruments. That works absolutely great: it's easy to assemble a UI to control all aspects of your instruments, to view its status and to save your data.
And now I have to stop myself from writing a 5 page rant, because for me LabVIEW has been a nightmare. Let me instead try to summarize some pros and cons:
Disclaimer I'm not a LabVIEW expert, I might say things that are biased, out-of-date or just plain wrong :)
LabVIEW pros
Yes, it's easy to learn. Many PhD's in our group seem to have acquired enough skills to hack away within a few weeks, or even less.
Libraries. This is a major point. You'd have to carefully investigate this for your own situation (I don't know what you need, if there are good LabVIEW libraries for it, or if there are alternatives in other languages). In my case, finding, e.g., a good, fast charting library in Python has been a major problem, that has prevented me from rewriting some of our programs in Python.
Your school may already have it installed and running.
LabVIEW cons
It's perhaps too easy to learn. In any case, it seems no one really bothers to learn best practices, so programs quickly become a complete, irreparable mess. Sure, that's also bound to happen with text-based languages if you're not careful, but IMO it's much more difficult to do things right in LabVIEW.
There tend to be major issues in LabVIEW with finding sub-VIs (even up to version 8.2, I think). LabVIEW has its own way of knowing where to find libraries and sub-VIs, which makes it very easy to completely break your software. This makes large projects a pain if you don't have someone around who knows how to handle this.
Getting LabVIEW to work with version control is a pain. Sure, it can be done, but in any case I'd refrain from using the built-in VC. Check out LVDiff for a LabVIEW diff tool, but don't even think about merging.
(The last two points make working in a team on one project difficult. That's probably important in your case)
This is personal, but I find that many algorithms just don't work when programmed visually. It's a mess.
One example is stuff that is strictly sequential; that gets cumbersome pretty quickly.
It's difficult to have an overview of the code.
If you use sub-VI's for small tasks (just like it's a good practice to make functions that perform a small task, and that fit on one screen), you can't just give them names, but you have to draw icons for each of them. That gets very annoying and cumbersome within only a few minutes, so you become very tempted not to put stuff in a sub-VI. It's just too much of a hassle. Btw: making a really good icon can take a professional hours. Go try to make a unique, immediately understandable, recognizable icon for every sub-VI you write :)
You'll have carpal tunnel within a week. Guaranteed.
#Brendan: hear, hear!
Concluding remarks
As for your "should I write my own modules" question: I'm not sure. Depends on your time constraints. Don't spend time on reinventing the wheel if you don't have to. It's too easy to spend days on writing low-level code and then realize you've run out of time. If that means you choose LabVIEW, go for it.
If there'd be easy ways to combine LabVIEW and, e.g., C++, I'd love to hear about it: that may give you the best of both worlds, but I doubt there are.
But make sure you and your team spend time on learning best practices. Looking at each other's code. Learning from each other. Writing usable, understandable code. And having fun!
And please forgive me for sounding edgy and perhaps somewhat pedantic. It's just that LabVIEW has been a real nightmare for me :)
I think the choice of LabVIEW or not comes down to whether you want to learn to program in a commonly used language as a marketable skill, or just want to get stuff done. LabVIEW enables you to Get Stuff Done very quickly and productively. As others have observed, it doesn't magically free you from having to understand what you're doing, and it's quite possible to create an unholy mess if you don't - although anecdotally, the worst examples of bad coding style in LabVIEW are generally perpetrated by people who are experienced in a text language and refuse to adapt to how LabVIEW works because they 'already know how to program, dammit!'
That's not to imply that LabVIEW programming isn't a marketable skill, of course; just that it's not as mass-market as C++.
LabVIEW makes it extremely easy to manage different things going on in parallel, which you may well have in a robot control situation. Race conditions in code that should be sequential shouldn't be a problem either (i.e. if they are, you're doing it wrong): there are simple techniques for making sure that stuff happens in the right order where necessary - chaining subVI's using the error wire or other data, using notifiers or queues, building a state machine structure, even using LabVIEW's sequence structure if necessary. Again, this is simply a case of taking the time to understand the tools available in LabVIEW and how they work. I don't think the gripe about having to make subVI icons is very well directed; you can very quickly create one containing a few words of text, maybe with a background colour, and that will be fine for most purposes.
'Are graphical languages the way of the future' is a red herring based on a false dichotomy. Some things are well suited to graphical languages (parallel code, for instance); other things suit text languages much better. I don't expect LabVIEW and graphical programming to either go away, or take over the world.
Incidentally, I would be very surprised if NASA didn't use LabVIEW in the space program. Someone recently described on the Info-LabVIEW mailing list how they had used LabVIEW to develop and test the closed loop control of flight surfaces actuated by electric motors on the Boeing 787, and gave the impression that LabVIEW was used extensively in the plane's development. It's also used for real-time control in the Large Hadron Collider!
The most active place currently for getting further information and help with LabVIEW, apart from National Instruments' own site and forums, seems to be LAVA.
This doesn't answer you question directly, but you may want to consider a third option of mixing in an interpreted language. Lua, for example, is already used in the robotics field. It's fast, light-weight and can be configured to run with fixed-point numbers instead of floating-point since most microcontrollers don't have an FPU. Forth is another alternative with similar usage.
It should be pretty easy to write a thin interface layer in C and then let the students loose with interpreted scripts. You could even set it up to allow code to be loaded dynamically without recompiling and flashing a chip. This should reduce the iteration cycle and allow students to learn better by seeing results more quickly.
I'm biased against using visual tools like LabVIEW. I always seem to hit something that doesn't or won't work quite like I want it to do. So, I prefer the absolute control you get with textual code.
LabVIEW's other strength (besides libraries) is concurrency. It's a dataflow language, which means that the runtime can handle concurrency for you. So if you're doing something highly concurrent and don't want to have to do traditional synchronization, LabVIEW can help you there.
The future doesn't belong to graphical languages as they stand today. It belongs to whoever can come up with a representation of dataflow (or another concurrency-friendly type of programming) that's as straightforward as the graphical approach is, but is also parsable by the programmer's own tools.
There is a published study of the topic hosted by National Instruments:
A Study of Graphical vs. Textual Programming for Teaching DSP
It specifically looks at LabVIEW versus MATLAB (as opposed to C).
I think that graphical languages will always be limited in expressivity compared to textual ones. Compare trying to communicate in visual symbols (e.g., REBUS or sign language) to communicating using words.
For simple tasks, using a graphical language is usually easier but for more intricate logic, I find that graphical languages get in the way.
Another debate implied in this argument, though, is declarative programming vs. imperative. Declarative is usually better for anything where you really don't need the fine-grained control over how something is done. You can use C++ in a declarative way but you would need more work up front to make it so, whereas LabVIEW is designed as a declarative language.
A picture is worth a thousand words but if a picture represents a thousand words that you don't need and you can't change that, then in that case a picture is worthless. Whereas, you can create thousands of pictures using words, specifying every detail and even leading the viewer's focus explicitly.
LabVIEW lets you get started quickly, and (as others have already said) has a massive library of code for doing various test, measurement & control related things.
The single biggest downfall of LabVIEW, though, is that you lose all the tools that programmers write for themselves.
Your code is stored as VIs. These are opaque, binary files. This means that your code really isn't yours, it's LabVIEW's. You can't write your own parser, you can't write a code generator, you can't do automated changes via macros or scripts.
This sucks when you have a 5000 VI app that needs some minor tweak applied universally. Your only option is to go through every VI manually, and heaven help you if you miss a change in one VI off in a corner somewhere.
And yes, since it's binary, you can't do diff/merge/patch like you can with textual languages. This does indeed make working with more than one version of the code a horrific nightmare of maintainability.
By all means, use LabVIEW if you're doing something simple, or need to prototype, or don't plan to maintain your code.
If you want to do real, maintainable programming, use a textual language. You might be slower getting started, but you'll be faster in the long run.
(Oh, and if you need DAQ libraries, NI's got C++ and .Net versions of those, too.)
My first post here :) be gentle ...
I come from an embedded background in the automotive industry and now I'm in the defense industry. I can tell you from experience that C/C++ and LabVIEW are really different beasts with different purposes in mind. C/C++ was always used for the embedded work on microcontrollers because it was compact and compilers/tools were easy to come by. LabVIEW, on the other hand, was used to drive the test system (along with TestStand as a sequencer). Most of the test equipment we used was from NI, so LabVIEW provided an environment where we had the tools and the drivers required for the job, along with the support we wanted.
In terms of ease of learning, there are many, many resources out there for C/C++, and many websites that lay out design considerations and example algorithms for pretty much anything you're after, freely available. For LabVIEW, the user community is probably not as diverse as C/C++'s, and it takes a little more effort to inspect and compare example code (you have to have the right version of LabVIEW, etc.)... I found LabVIEW pretty easy to pick up and learn, but there are nuances, as some have mentioned here, to do with parallelism and various other things that require a bit of experience before you become aware of them.
So the conclusion after all that? I'd say that BOTH languages are worthwhile in learning because they really do represent two different styles of programming and it is certainly worthwhile to be aware and proficient at both.
Oh my God, the answer is so simple. Use LabView.
I have programmed embedded systems for 10 years, and I can say that without at least a couple months of infrastructure (very careful infrastructure!), you will not be as productive as you are on day 1 with LabView.
If you are designing a robot to be sold and used for the military, go ahead and start with C - it's a good call.
Otherwise, use the system that allows you to try out the most variety in the shortest amount of time. That's LabView.
I love LabVIEW. I would highly recommend it, especially if the other members have used something similar. It takes a while for normal programmers to get used to it, but the results are much better if you already know how to program.
C/C++ means managing your own memory. You'll be swimming in memory leaks and worrying about them. Go with LabVIEW, make sure you read the documentation that comes with it, and watch out for race conditions.
Learning a language is easy. Learning how to program is not. This doesn't change even if it's a graphical language. The advantage of Graphical languages is that it is easier to visual what the code will do rather than sit there and decipher a bunch of text.
The important thing is not the language but the programming concepts. It shouldn't matter what language you learn to program in, because with a little effort you should be able to program well in any language. Languages come and go.
Disclaimer: I've not witnessed LabVIEW, but I have used a few other graphical languages including WebMethods Flow and Modeller, dynamic simulation languages at university and, er, MIT's Scratch :).
My experience is that graphical languages can do a good job of the 'plumbing' part of programming, but the ones I've used actively get in the way of algorithmics. If your algorithms are very simple, that might be OK.
On the other hand, I don't think C++ is great for your situation either. You'll spend more time tracking down pointer and memory management issues than you do in useful work.
If your robot can be controlled using a scripting language (Python, Ruby, Perl, whatever), then I think that would be a much better choice.
Then there's hybrid options:
If there's no scripting option for your robot, and you have a C++ geek on your team, then consider having that geek write bindings to map your C++ library to a scripting language. This would allow people with other specialities to program the robot more easily. The bindings would make a good gift to the community.
If LabVIEW allows it, use its graphical language to plumb together modules written in a textual language.
I think that graphical languages might be the language of the future..... for all those ad hoc MS Access developers out there. There will always be a spot for the purely textual coders.
Personally, I've got to ask what is the real fun of building a robot if it's all done for you? If you just drop a 'find the red ball' module in there and watch it go? What sense of pride will you have for your accomplishment? Personally, I wouldn't have much. Plus, what will it teach you of coding, or of the (very important) aspect of the software/hardware interface that is critical in robotics?
I don't claim to be an expert in the field, but ask yourself one thing: Do you think that NASA used LabVIEW to code the Mars Rovers? Do you think that anyone truly prominent in robotics is using LabView?
Really, if you ask me, the only thing using cookie cutter things like LabVIEW to build this is going to prepare you for is to be some backyard robot builder and nothing more. If you want something that will give you something more like industry experience, build your own 'LabVIEW'-type system. Build your own find-the-ball module, or your own 'follow-the-line' module. It will be far more difficult, but it will also be way more cool too. :D
You're in High School. How much time do you have to work on this program? How many people are in your group? Do they know C++ or LabView already?
From your question, I see that you know C++ and most of the group does not. I also suspect that the group leader is perceptive enough to notice that some members of the team may be intimidated by a text based programming language. This is acceptable, you're in high school, and these people are normies. I feel as though normal high schoolers will be able to understand LabView more intuitively than C++. I'm guessing most high school students, like the population in general, are scared of a command line. For you there is much less of a difference, but for them, it is night and day.
You are correct that the same concepts may be applied to LabView as C++. Each has its strengths and weaknesses. The key is selecting the right tool for the job. LabView was designed for this kind of application. C++ is much more generic and can be applied to many other kinds of problems.
I am going to recommend LabView. Given the right hardware, you can be up and running almost out-of-the-box. Your team can spend more time getting the robot to do what you want, which is what the focus of this activity should be.
Graphical Languages are not the future of programming; they have been one of the choices available, created to solve certain types of problems, for many years. The future of programming is layer upon layer of abstraction away from machine code. In the future, we'll be wondering why we wasted all this time programming "semantics" over and over.
how much should we rely on prewritten modules, and how much should we try to write on our own?
You shouldn't waste time reinventing the wheel. If there are device drivers available in Labview, use them. You can learn a lot by copying code that is similar in function and tailoring it to your needs - you get to see how other people solved similar problems, and have to interpret their solution before you can properly apply it to your problem. If you blindly copy code, chances of getting it to work are slim. You have to be good, even if you copy code.
Best of luck!
I would suggest you use LabVIEW, as you can get down to making the robot do what you want faster and more easily. LabVIEW has been designed with this in mind. Of course C(++) are great languages, but LabVIEW does what it is supposed to do better than anything else.
People can write really good software in LabVIEW as it provides ample scope and support for that.
There is one huge thing I found negative in using LabVIEW for my applications: managing design complexity. As a physicist I find LabVIEW great for prototyping, instrument control and mathematical analysis. There is no language in which you get a result faster and better than in LabVIEW. I have used LabVIEW since 1997. In 2005 I switched completely to the .NET framework, since it is easier to design and maintain.
In LabVIEW a simple 'if' structure has to be drawn and uses a lot of space in your graphical design. We found that many of our commercial applications were hard to maintain. The more complex the application became, the more difficult it was to read.
I now use text languages and I am much better at maintaining everything. If I had to compare C++ to LabVIEW I would use LabVIEW, but compared to C# it does not win.
As always, it depends.
I have been using LabVIEW for about 20 years now and have done quite a wide range of jobs, from simple DAQ to very complex visualization, from device control to test sequencers. If it were not good enough, I would surely have switched by now. That said, I started coding Fortran with punch cards and used a whole lot of programming languages on 8-bit 'machines', starting with Z80-based ones. The languages ranged from Assembler to BASIC, from Turbo Pascal to C.
LabVIEW was a major improvement because of its extensive libraries for data acquisition and analysis. One has, however, to learn a different paradigm. And you definitely need a trackball ;-))
I don't know anything about LabVIEW (or much about C/C++), but..
Do you think that graphical languages such as LabVIEW are the future of programming?
No...
Is a graphical language easier to learn than a textual language? I think that they should be about equally challenging to learn.
Easier to learn? No, but they are easier to explain and understand.
To explain a programming language you have to explain what a variable is (which is surprisingly difficult). This isn't a problem with flowgraph/nodal coding tools, like the LEGO Mindstorms programming interface, or Quartz Composer..
For example, this is a fairly complicated LEGO Mindstorms program - it's very easy to understand what is going on... but what if you want the robot to run the INCREASEJITTER block 5 times, then drive forward for 10 seconds, then try the INCREASEJITTER loop again? Things start getting messy very quickly..
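By contrast, in a text language that change is a couple of lines. A hypothetical C++ sketch, where increaseJitter() and driveForwardSeconds() are made-up stand-ins for whatever the robot's API actually provides:

    // Hypothetical sketch: the sequencing above, expressed in a text language.
    for (int i = 0; i < 5; ++i) {
        increaseJitter();
    }
    driveForwardSeconds(10);
    for (int i = 0; i < 5; ++i) {   // try the jitter loop again
        increaseJitter();
    }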
Quartz Composer is a great example of this, and why I don't think graphical languages will ever "be the future".
It makes it very easy to do really cool stuff (3D particle effects, with a camera controlled by the average brightness of pixels from a webcam).. but incredibly difficult to do easy things, like iterate over the elements of an XML file, or store that average pixel value in a file.
Seeing as we are partially rooted in helping people learn, how much should we rely on prewritten modules, and how much should we try to write on our own? ("Good programmers write good code, great programmers copy great code." But isn't it worth being a good programmer, first?)
For learning, it's so much easier to both explain and understand a graphical language..
That said, I would recommend a specialised text-based language as a progression. For example, for graphics something like Processing or NodeBox. They are "normal" languages (Processing is Java, NodeBox is Python) with very specialised, easy to use (but absurdly powerful) frameworks ingrained into them..
Importantly, they are very interactive languages; you don't have to write hundreds of lines just to get a circle onscreen.. You just type oval(100, 200, 10, 10) and press the run button, and it appears! This also makes them very easy to demonstrate and explain.
More robotics-related, even something like LOGO would be a good introduction - you type "FORWARD 1" and the turtle drives forward one box.. Type "LEFT 90" and it turns 90 degrees.. This relates to reality very simply. You can visualise what each instruction will do, then try it out and confirm it really works that way.
Show them shiny-looking things; they will pick up the useful stuff they'd learn from C along the way. If they are interested or progress to the point where they need a "real" language, they'll have all that knowledge, rather than running into the syntax-error and compiling brick wall..
It seems that if you are trying to prepare your team for a future in programming, C(++) may be the better route. The promise of general programming languages built with visual building blocks has never seemed to materialize, and I am beginning to wonder if it ever will. It seems that while it can be done for specific problem domains, once you get into trying to solve many general problems, a text-based programming language is hard to beat.
At one time I had sort of bought into the idea of executable UML but it seems that once you get past the object relationships and some of the process flows UML would be a pretty miserable way to build an app. Imagine trying to wire it all up to a GUI. I wouldn't mind being proven wrong but so far it seems unlikely we'll be point and click programming anytime soon.
I started with LabVIEW about 2 years ago and now use it all the time, so I may be biased, but I find it ideal for applications where data acquisition and control are involved.
We use LabVIEW mainly for testing where we take continuous measurements and control gas valves and ATE enclosures. This involves both digital and analogue input and outputs with signal analysis routines and process control all running from a GUI. By breaking down each part into subVIs we are able to reconfigure the tests with the click and drag of the mouse.
Not exactly the same as C/C++, but a similar implementation of measurement, control and analysis using Visual BASIC appears complex and hard to maintain by comparison.
I think the process of programming is more important than the actual coding language and you should follow the style guidelines for a graphical programming language. LabVIEW block diagrams show the flow of data (Dataflow programming) so it should be easy to see potential race conditions although I've never had any problems. If you have a C codebase then building it into a dll will allow LabVIEW to call it directly.
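To sketch what calling C code from LabVIEW involves (a hedged example, not from the answer above): a small C-callable function exported from a C++ source file as a Windows DLL, which LabVIEW can then call through its Call Library Function Node. The averaging function is hypothetical, and __declspec(dllexport) assumes MSVC on Windows.

    // Sketch: a C-callable function exported from a C++ codebase as a DLL,
    // callable from LabVIEW via the Call Library Function Node.
    extern "C" __declspec(dllexport)
    double average(const double* data, int len) {
        double sum = 0.0;
        for (int i = 0; i < len; ++i) {
            sum += data[i];
        }
        return (len > 0) ? sum / len : 0.0;
    }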
There are definitely merits to both choices; however, since your domain is an educational experience I think a C/C++ solution would most benefit the students. Graphical programming will always be an option but simply does not provide the functionality in an elegant manner that would make it more efficient to use than textual programming for low-level programming. This is not a bad thing - the whole point of abstraction is to allow a new understanding and view of a problem domain. The reason I believe many may be disappointed with graphical programming though is that, for any particular program, the incremental gain in going from programming in C to graphical is not nearly the same as going from assembly to C.
Knowledge of graphical programming would benefit any future programmer for sure. There will probably be opportunities in the future that only require knowledge of graphical programming and perhaps some of your students could benefit from some early experience with it. On the other hand, a solid foundation in fundamental programming concepts afforded by a textual approach will benefit all of your students and surely must be the better answer.
The team captain thinks that LabVIEW is better for its ease of learning and teaching. Is that true?
I doubt that would be true for any single language, or paradigm. LabView could surely be easier for people with an electronics engineering background; making programs in it is "simply" drawing wires. Then again, such people might already have been exposed to programming as well.
One essential difference - apart from the graphics - is that LV is demand-based (dataflow) programming. You start from the outcome and state what is needed to get to it. Traditional programming tends to be imperative (going the other way round).
Some languages can provide both. I crafted a multithreading library for Lua recently (Lanes), and it can be used for demand-based programming in an otherwise imperative environment. I know there are successful robots out there run mostly in Lua (Crazy Ivan at Lua Oh Six).
Have you had a look at the Microsoft Robotics Studio?
http://msdn.microsoft.com/en-us/robotics/default.aspx
It allows for visual programming (VPL):
http://msdn.microsoft.com/en-us/library/bb483047.aspx
as well as modern languages such as C#.
I encourage you to at least take a look at the tutorials.
My gripe against Labview (and Matlab in this respect) is that if you plan on embedding the code in anything other than x86 (and Labview has tools to put Labview VIs on ARMs) then you'll have to throw as much horsepower at the problem as you can because it's inefficient.
Labview is a great prototyping tool: lots of libraries, easy to string together blocks, maybe a little difficult to do advanced algorithms but there's probably a block for what you want to do. You can get functionality done quickly. But if you think you can take that VI and just put it on a device you're wrong. For instance, if you make an adder block in Labview you have two inputs and one output. What is the memory usage for that? Three variables worth of data? Two? In C or C++ you know, because you can either write z=x+y or x+=y and you know exactly what your code is doing and what the memory situation is. Memory usage can spike quickly especially because (as others have pointed out) Labview is highly parallel. So be prepared to throw more RAM than you thought at the problem. And more processing power.
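To make the adder comparison concrete, a trivial C++ illustration (nothing robot-specific, just the point that the memory cost is visible in the source):

    double x = 2.0, y = 3.0;   // two doubles, allocated on the stack
    double z = x + y;          // a third double: the result gets its own storage
    x += y;                    // or update x in place: no extra storage at all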
In short, Labview is great for rapid prototyping but you lose too much control in other situations. If you're working with large amounts of data or limited memory/processing power then use a text-based programming language so you can control what goes on.
People always compare LabVIEW with C++ and say "oh, LabVIEW is high level and it has so much built-in functionality - try acquiring data, doing a DFFT and displaying the data; it's so easy in LabVIEW, try it in C++".
Myth 1: It's hard to get anything done with C++ because it's so low level, while LabVIEW has many things already implemented.
The problem is, if you are developing a robotic system in C++ you MUST use libraries like OpenCV, PCL, etc., and you would be even smarter if you use a software framework designed for building robotic systems like ROS (Robot Operating System). Therefore you need to use a full set of tools. In fact, there are more high-level tools available when you use ROS + Python/C++ with libraries such as OpenCV and PCL. I have used LabVIEW Robotics, and frankly commonly used algorithms like ICP are not there, and it's not like you can easily use other libraries.
Myth 2: Is it easier to understand graphical programming languages?
It depends on the situation. When you are coding a complicated algorithm, the graphical elements take up valuable screen space and it becomes difficult to follow the method. To understand LabVIEW code you have to read over an area - effectively O(n^2) - whereas text code you just read top to bottom.
What if you have parallel systems? ROS implements a graph-based architecture with subscriber/publisher messages implemented using callbacks, and it's pretty easy to have multiple programs running and communicating. Having individual parallel components separated makes it easier to debug. For instance, stepping through parallel LabVIEW code is a nightmare, because control flow jumps from one place to another. In ROS you don't explicitly draw out your architecture like in LabVIEW; however, you can still see it by running rqt_graph (which will show all connected nodes).
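For a feel of the publisher/subscriber style being described, this is roughly the shape of a minimal roscpp subscriber node, along the lines of the standard ROS "listener" tutorial; the topic name and message type are just example choices.

    // Minimal roscpp subscriber, modelled on the standard ROS tutorial listener.
    #include "ros/ros.h"
    #include "std_msgs/String.h"

    void chatterCallback(const std_msgs::String::ConstPtr& msg) {
        ROS_INFO("I heard: [%s]", msg->data.c_str());   // runs whenever a message arrives
    }

    int main(int argc, char** argv) {
        ros::init(argc, argv, "listener");
        ros::NodeHandle n;
        ros::Subscriber sub = n.subscribe("chatter", 1000, chatterCallback);
        ros::spin();   // hand control to ROS; callbacks fire as messages arrive
        return 0;
    }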
"The future of programming is graphical." (Think so?)
I hope not; the current implementation of LabVIEW does not allow mixing text-based and graphical methods. (There is MathScript, however it is incredibly slow.)
It's hard to debug because you can't hide the parallelism easily.
It's hard to read LabVIEW code because you have to look over so much area.
LabVIEW is great for data acquisition and signal processing, but not for experimental robotics, because most of the high-level components like SLAM (simultaneous localisation and mapping), point cloud registration, point cloud processing etc. are missing. Even if they do add these components and they are easy to integrate like in ROS, because LabVIEW is proprietary and expensive it will never keep up with the open source community.
In summary, if LabVIEW is the future for mechatronics, I am changing my career path to investment banking... If I can't enjoy my work, I may as well make some money and retire early...

Resources