As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 10 years ago.
According to the Computer Language Benchmarks Game, the LuaJIT implementation seems to beat every other JIT-ed dynamic language (V8, TraceMonkey, PLT Scheme, Erlang HiPE) by an order of magnitude.
I know these benchmarks are not representative (as they say: "Which programming language implementations have the fastest benchmark programs?"), but this is still really impressive.
Is this really the case in practice? Has anyone tested this Lua implementation?
There's a good discussion at Lambda the Ultimate. LuaJIT is very good.
Many people have reported impressive speedups on lua-l (the Lua mailing list). The speedups are most impressive for pure Lua code; the trace compiler is not as effective when there are lots of calls to C functions in loadable library modules.
In my case (game prototype development), I observed no performance improvement at all. I use Lua for embedding, so there are lots of calls to C++ library functions. Even though the main loop is in a Lua script and all of the important logic is implemented in Lua, the overall performance was determined by the rendering and physics engines implemented in C++.
The original Lua is already fast enough for such applications.
I ran an experiment applying the lessons learned here: http://www.sampalib.org/luajit2.0_tunning.html
Some of that data is no longer valid (maxmcode=1024 is enough), but LuaJIT brings a robust improvement on a 600-line pure Lua script (with no C calls to hurt performance). That's neither a large-scale application nor an embedded use case, but it's much more than the benchmarks.
The performance of a JIT-ed language depends on two things: the performance of the original scripting language, and the performance of the compiler.
Compilation is a fairly mature technique, and most JIT compilers have comparable performance. However, Lua itself, i.e. Lua without a JIT, is probably one of the fastest scripting languages:
Lua is faster than Java without a JIT.
Lua is faster than JavaScript without a JIT.
Lua is faster than most scripting languages without a JIT.
So:
LuaJIT is faster than Java with a JIT (Sun's Java),
LuaJIT is faster than V8 (JavaScript with a JIT),
and so on.
Closed 10 years ago.
If so, what is the storage and memory footprint?
EDIT
I did some research on this but failed to find useful information. The site http://www.erlang-embedded.com/ doesn't help at all. The blog article http://www.1011ltd.com/web/blog/post/embedded_erlang was a little helpful, but it would be nice to hear answers from people with more experience.
EDIT 2
The hardware I intend to use for Erlang has 32 MB of flash storage for the system and 512 MB of RAM. It is dual-core at 400 MHz per core, and it runs Linux 2.6.18.
EDIT 3
The motivation behind my interest in Erlang is to solve concurrency problems gracefully. On the project I work on, we have some complex middleware that is not robust, hard to understand, and hard to extend. Of course, you can write great concurrent software in C, but Erlang just seems like a better tool for this problem domain.
What does "embedded" mean to you?
In my world, it's a system with less than 1 MB of flash and typically ~64 KB of RAM.
In my world, there are C compilers and sometimes C++ compilers, but nobody has ever heard of an Erlang compiler for such a system (and nobody has missed one).
But if "embedded" for you means Windows CE or Linux running on non-PC hardware with > 64 MB of RAM and 1 GB of flash, then there should be no problem with Erlang.
I would echo the sentiment that the question is vague. But, ...
Not trying to troll, but I think the answer is either "Yes!!" or "No!!" depending on your assumptions about hardware and what problems you are trying to solve that aren't easily solved by something more standard like C (i.e., why aren't you using something like C? There must be a reason: reducing code size, needing hot upgrades, {erlang_value_prop, n}, etc.).
Under a certain set of criteria, the answer seems to be "yes". Evidence includes:
EMBEDDED ERLANG? ABSOLUTELY (http://www.1011ltd.com/web/blog/post/embedded_erlang)
Its embedded use in ATM switches and other telecom equipment
There is (or was) an embedded-Erlang group on Google
I think Ulf Wiger has an Embedded Erlang slide-deck as part of his work with Erlang Solutions
etc
No:
Many embedded systems don't have Erlang compilers, while all have C compilers and most have C++ compilers.
Erlang lacks the low-level access that an embedded system requires.
It's certainly possible to run Erlang on a cluster of Raspberry Pis, but that isn't an embedded device.
Closed 11 years ago.
I am having a hard time understanding what the major purpose of Google's programming language Dart is. What's its role? Why would I want to use it?
You can check out the technical aspects in this article. Quote:
The Dart programming language is presented here in its early stages. The following design goals will guide the continued evolution and refinement of this open source project:
Create a structured yet flexible programming language for the web.
Make Dart feel familiar and natural to programmers and thus easy to learn.
Ensure that all Dart language constructs allow high performance and fast application startup.
Make Dart appropriate for the full range of devices on the web—including phones, tablets, laptops, and servers.
Provide tools that make Dart run fast across all major modern browsers.
These design goals address the following problems currently facing web developers:
Small scripts often evolve into large web applications with no apparent structure—they're hard to debug and difficult to maintain. In addition, these monolithic apps can't be split up so that different teams can work on them independently. It's difficult to be productive when a web application gets large.
Scripting languages are popular because their lightweight nature makes it easy to write code quickly. Generally, the contracts with other parts of an application are conveyed in comments rather than in the language structure itself. As a result, it's difficult for someone other than the author to read and maintain a particular piece of code.
With existing languages, the developer is forced to make a choice between static and dynamic languages. Traditional static languages require heavyweight toolchains and a coding style that can feel inflexible and overly constrained.
Developers have not been able to create homogeneous systems that encompass both client and server, except for a few cases such as Node.js and Google Web Toolkit (GWT).
Different languages and formats entail context switches that are cumbersome and add complexity to the coding process.
The major purpose of the Dart language is to replace JavaScript. It fixes common issues with JavaScript; currently it compiles to JavaScript, but in the future it will have its own VM.
Its main advantages over JavaScript are that it is an object-oriented language based on interface inheritance, with support for factory constructors on interfaces, and that it has a simplified actor model called isolates.
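To give a feel for the isolate model (this is a Python analogy, not Dart code: the worker shares no state with its parent and communicates only by message passing, roughly like a Dart isolate and its ports; the names here are made up for illustration):

```python
from multiprocessing import Process, Queue

def isolate(inbox, outbox):
    # Share-nothing worker: reacts only to messages it receives.
    for msg in iter(inbox.get, None):   # stop on a None sentinel
        outbox.put(msg * 2)             # reply to each message

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    worker = Process(target=isolate, args=(inbox, outbox))
    worker.start()
    inbox.put(21)
    print(outbox.get())                 # -> 42
    inbox.put(None)                     # ask the worker to shut down
    worker.join()
```

The point of the analogy is that, as with isolates, there is no shared mutable memory: all coordination happens through the message channels.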
Closed 11 years ago.
I read its documentation and it seems quite awesome. But I never heard of any application developed using it.
What are the main advantages and disadvantage of Vala?
(IMO)
advantages:
No garbage collector!
Generated programs are compiled from C, which should boost performance and require fewer resources than scripting languages (Python) or managed code (Mono).
Provides an easy-to-use API for the huge variety of useful libraries available on Linux, written mostly in C.
Provides a C#-like syntax, which is very popular, and by doing so attracts new developers to OSS programming.
Brings (some level of) OOP syntactic sugar into the world of C, but is easier to use than C++.
disadvantages:
No garbage collector!
Generated programs must be recompiled for each architecture.
It's a young language: the language specification and API change constantly, so maintaining a big project might require extra attention.
Debugging is possible, but a bit tricky.
No stable IDE and tools yet; Valide crashes a lot, and so does vtg.
The language's object model is based on glib/gobject, which seems limited. Dova is being developed to explore an alternative path, but it will not be compatible with GObject.
Closed 10 years ago.
I have a project to recognize the footprints of animals; it is similar to facial recognition.
I need to store footprint images in a database and compare them with images captured by a camera.
What is an appropriate programming language for this?
Any language can be used for image processing, pattern recognition and object detection, which is what you're trying to do here. But you're better off finding a library or even an application instead, and then picking the language based on that choice.
Matlab is fine if you're familiar with it, unless you plan on delivering a working system that will be used by others to add or annotate data. In that case, you'll need something easier to deploy beyond your own workstation.
OpenCV might be a good place to start, and there's an OpenCV tutorial here.
Since it's a similar problem, you may want to check out the Face Recognition Homepage for more detailed information.
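To make the image-comparison step concrete, here is a minimal sketch of one classic similarity measure, normalized cross-correlation, in plain NumPy rather than OpenCV; the synthetic "footprint" arrays and the function name are made up for illustration:

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Similarity score in [-1, 1] between two equal-sized grayscale images."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:
        return 0.0
    return float((a * b).sum() / denom)

# Two synthetic "footprints": an image compared with itself, and with
# a horizontally shifted copy.
img = np.zeros((32, 32))
img[10:20, 10:20] = 1.0
shifted = np.roll(img, 5, axis=1)

print(normalized_cross_correlation(img, img))      # identical -> 1.0
print(normalized_cross_correlation(img, shifted))  # partial overlap -> lower score
```

A real system would first align and segment the prints (this is where a library like OpenCV earns its keep), but the idea of reducing "how alike are these images?" to a single score is the same.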
I think the question is really about how you represent the data and measure likeness/sameness/distance, rather than about the implementation language.
Lisp is a strong candidate, as is C/C++, but really, you are probably better off with whatever language you or your team knows best.
Again, figure out the data representation first.
Also, look for an existing imaging/matching solution out there. There are already ones for license plates, fingerprints, etc.; maybe just use that source. The problem is mostly solved...
If you need to get something working quickly, I would suggest Matlab or some similar math package. There are a lot of built-in algorithms that you can use for image processing and rapid prototyping.
Your biggest problem here is developing the algorithm, not choosing the language. My advice would be to prototype your project in Matlab, if you have access to it. What you are trying to do is an active area of research, and many researchers prefer Matlab and publish their Matlab code. This means that you may be able to find Matlab code on the web that may do at least some of what you need, such as image segmentation.
I would advise against using C++, unless you actually get your algorithm to work, and speed becomes important. Matlab would allow you to quickly try out ideas, and avoid spending most of your time on implementation details. Once you develop your algorithm to the point when you are happy with the results, then you can think about implementing it as a usable system in a "real" programming language.
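As an illustration of the kind of prototyping step involved, here is a sketch of one common segmentation technique, Otsu's threshold (essentially what Matlab's `graythresh` computes), written in plain NumPy; the synthetic image is made up for illustration:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the intensity threshold that maximizes between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, 0.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]                      # pixels at or below t
        if w0 == 0:
            continue
        w1 = total - w0                    # pixels above t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                     # mean of the dark class
        m1 = (sum_all - sum0) / w1         # mean of the bright class
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic image: dark background (20) with a brighter "print" region (200).
img = np.full((64, 64), 20, dtype=np.uint8)
img[20:44, 20:44] = 200
t = otsu_threshold(img)
mask = img > t
print(t, mask.sum())   # mask.sum() == 576 foreground pixels (24 * 24)
```

In Matlab or with OpenCV this is a one-liner, which is exactly why those tools are good for trying out ideas quickly before committing to an implementation language.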
Closed 10 years ago.
I'm interested in studying how an interpreter works, and LOLCODE makes me laugh, so: what's the best open-source LOLCODE interpreter? Bonus points for providing a decent REPL.
It depends on your favorite (or best-understood) language; for example, there are open-source interpreters in both Java and Perl.
My favorite implementation is LOLPython, so it's a great plus if you're a Python fan. :)
And if you want to make changes to what's already defined, it's pretty simple. :D
While maybe not the "best" one, I think it's pretty cool that someone from the DLR team actually created a LOLCODE interpreter based on the DLR, with full access to the .NET Framework.
Certainly, without a definition of "best", there's little way to answer this question with any certainty. I'm writing a LOLCODE interpreter (http://pgfoundry.org/projects/pllolcode/) to support LOLCODE as a language for writing stored procedures in the PostgreSQL database. (Why, you ask? Because I wanted to learn how.) This interpreter is written in C and uses Bison and Flex for parsing. These seemed to be the "best" choices in this case because that's what PostgreSQL itself uses. If you're more familiar with, say, Perl, the Perl-based interpreter is probably better.
I know it's not an interpreter, but I've used the Lolcode.net implementation, and it worked rather well for me. It follows the specification relatively well, except for a few things (like arrays).
Also, I got it to run on Linux using Mono, if Linux compatibility is important to you.