Handling external libs with rebar - erlang

I want to use some libs in my application, like https://github.com/Panmind/erlang-ruby-marshal. That repo holds an src dir, but has no .app file (because it's not an application), so I can't use get-deps.
I tried another approach: adding a libs dir to sub_dirs and adding the repo as a git submodule, but rebar won't compile any of its files. I guess that rebar only compiles OTP applications, not loose .erl files that aren't tied to an application.
How do you manage that kind of dependency? I would like to avoid copying the files into my app dir, because I don't think they belong there, and I rather like the git submodule approach, which lets me keep track of the lib version I am using.

Recent versions of rebar support the raw option for dependencies. When this option is specified, rebar does not require the dependency to have a standard Erlang/OTP layout, which assumes the presence of either a "src/dependency_name.app.src" or an "ebin/dependency_name.app" file (see more details here).
For example:
{deps, [
    {erlang_ruby_marshal, "",
     {git, "https://github.com/Panmind/erlang-ruby-marshal", {branch, "master"}},
     [raw]}
]}.
Note that rebar will now be able to fetch it, but it still won't compile it. As other commenters pointed out, there's no reason why this dependency should not have an .app file. I would fork the repository and add the .app file to it.
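For reference, a minimal src/erlang_ruby_marshal.app.src added in such a fork could look roughly like this (the description and version strings below are made up for illustration):
{application, erlang_ruby_marshal,
 [{description, "Reader for Ruby's Marshal serialization format"},
  {vsn, "0.1.0"},
  {registered, []},
  {applications, [kernel, stdlib]},
  {env, []}
 ]}.
With that file in place, rebar treats the dependency like any other OTP application and compiles its src directory during a normal rebar compile.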

This article walks through the broader process of creating applications and releases with rebar.
More specifically, I think this option in rebar.config might be what you're looking for. The only way I've found so far is to have one entry for each application:
{sub_dirs, ["libs/app1",
            "libs/app2",
            ...]}.
This requires a bit more manual work. Unfortunately, rebar is structured very much around the concept of a single app, and would need better support for handling a repository that contains several equally important applications rather than one application.

If you are using Linux, you can add the required modules as hard links in the src directory of your application.
This is far from optimal but I have yet to find a better way to do this.

Ask the Agner guys to add it to their package management system. In the process they will create a fork and convert the project to make it rebar-compatible. Also, the original maintainer will quite possibly integrate the changes.

Related

Handling complex and large dependencies

Problem
I've been developing a game in C++ in my spare time and I've opted to use Bazel as my build tool, since I have never had a ton of luck (or fun) working with make or cmake. I also have dependencies in other languages (Python for some of the high-level scripting). I'm using glfw for basic window handling and high-level graphics support, and that works well enough, but now comes the problem: I'm uncertain how I should handle dependencies like glfw in a Bazel world.
For some of my dependencies (like gtest and fruit) I can just reference them in my WORKSPACE file and Bazel handles them automagically but glfw hasn't adopted Bazel. So all of this leads me to ask, what should I do about dependencies that don't use Bazel inside a Bazel project?
Current approach
For many of the simpler dependencies I have, I simply created a new_git_repository entry in my WORKSPACE file and created a BUILD file for the library. This works great until you get to really complicated libraries like glfw that have a number of dependencies of their own.
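For illustration, the pattern for one of those simpler dependencies looks roughly like this (the names, URL, commit and globs are placeholders, not my actual setup; depending on the Bazel version you may also need to load new_git_repository from @bazel_tools first):
# WORKSPACE (sketch)
new_git_repository(
    name = "some_lib",
    remote = "https://github.com/example/some_lib.git",  # placeholder URL
    commit = "abc123",                                    # pin a specific revision
    build_file = "third_party/some_lib.BUILD",
)

# third_party/some_lib.BUILD (sketch)
cc_library(
    name = "some_lib",
    srcs = glob(["src/*.cc"]),
    hdrs = glob(["include/**/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
)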
When building glfw for a Linux machine running X11, you now have a dependency on X11, which would mean adding X11 to my Bazel setup. X11 comes with its own set of dependencies (the X11 libraries like X11Cursor) and so on.
glfw also tries to provide basic joystick support, which is provided by default on Linux, which is great! Except that this support comes from the kernel, which means the kernel is also a dependency of my project. I shouldn't need anything more than the kernel headers, but this still seems like a lot to bring in.
Alternative Options
The reason I took the approach I've taken so far is to make the dependencies required to spin up a machine that can successfully build my game very minimal. In theory they just need a C/C++ compiler, Java 8, and Bazel and they're off to the races. This is great since it also means I can create a Docker container that has Bazel installed and do CI/CD really easily.
I could sacrifice this ease and just say that you need to have libraries like glfw installed before attempting to compile the game, but that brings back the whole "which version is installed and how is it configured" problem that Bazel is supposed to help solve.
Surely there is a simpler solution and I'm overthinking this?
If the glfw project has no BUILD files, then you have the following options:
Build glfw inside a genrule.
If glfw supports some other build system like make, you could create a genrule that runs the tool. This approach has obvious drawbacks, like the not-to-be-underestimated impracticality of having to declare all inputs of that genrule, but it'd be the simplest way of Bazel'izing glfw.
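A rough, untested sketch of what that could look like, assuming glfw was fetched into an external repository named glfw and builds with cmake/make (the output path and library name are guesses, not verified):
# In the BUILD file you supply for the glfw repository (sketch only)
genrule(
    name = "glfw_make",
    srcs = glob(["**"]),                            # every file the build reads must be declared
    outs = ["libglfw3.a"],
    cmd = "scratch=$$(mktemp -d) && " +
          "cp -rL external/glfw/* $$scratch/ && " + # inputs are read-only, so build in a scratch dir
          "(cd $$scratch && cmake . && make) && " +
          "cp $$scratch/src/libglfw3.a $@",
)
The resulting archive can then be wrapped in a cc_library so other targets can depend on it.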
Pre-build glfw.o and check it into your source tree.
You can create a cc_library rule for it, and put the .o file in the srcs. This solution is the least flexible of all: you not only restrict the target platform to whatever the .o was built for, but you also make the whole build harder to reproduce. Still, the benefits are sometimes worth the costs.
I view this approach as a last resort. Even in Bazel's own source code there's one cc_library.srcs that includes a raw object file, because it was worth it, as the commit message of 92caf38 explains.
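A sketch of that variant, assuming a prebuilt libglfw3.a and the glfw headers have been checked into a third_party/glfw directory (the names are illustrative):
# third_party/glfw/BUILD (sketch)
cc_library(
    name = "glfw",
    srcs = ["libglfw3.a"],                 # prebuilt archive checked into the tree
    hdrs = glob(["include/GLFW/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
)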
Require that glfw be installed.
You already considered this option. Some people may prefer this to the other approaches.

Best Practices exposing Bower components

I am building a Spring project with Bower to manage client libraries. I am interested to know the best-practice way to expose those libraries (or any sort of client libraries managed by a package manager) to the web client.
I can see that I can use a .bowerrc file to choose where to install the files. I could have them install into a static resources folder, one where each of the files installed would be accessible to http requests. It struck me as a potential code smell, however, to expose all the files, instead of the ones that I specifically need.
I could copy individual files into such a directory, or adopt an automated solution to do the same. If this is not considered necessary, however, I would prefer not to expend the effort.
Which of these, or any other solution, is considered the clear best-practice way to do this, and why? (Please provide a reference to support your answer.) To be clear, I am not interested in individual opinion, but rather in whether there is a known, clearly preferred solution.
After looking at what a lot of projects and tutorials suggest, it seems that the clear way to do this is the following:
Use a framework like Grunt or Gulp to separate "built" code from source code. Built code, in this case, refers to code that is copied, minified, and/or concatenated into a separate folder. The Grunt or Gulp configuration file should include all application code, as well as selected source files from the Bower components. The running application should reference only these "built" files. The directory of "built" client-side code should be served statically by Spring.
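As a sketch of that kind of setup with Grunt (the library files and the static output directory below are only examples, not prescribed paths):
// Gruntfile.js fragment (illustrative)
module.exports = function (grunt) {
    grunt.initConfig({
        copy: {
            vendor: {
                files: [{
                    expand: true,
                    cwd: 'bower_components',
                    // copy only the dist files the application actually references
                    src: ['jquery/dist/jquery.min.js',
                          'bootstrap/dist/css/bootstrap.min.css'],
                    dest: 'src/main/resources/static/lib/'
                }]
            }
        }
    });
    grunt.loadNpmTasks('grunt-contrib-copy');
    grunt.registerTask('build', ['copy']);
};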

How to prevent Bower from bloating my app?

I'm working on a little web application using Express.js, Backbone.js, Bootstrap and some others. I've decided to give Bower a try to manage the front-end components for this one, but after installing the packages, I noticed that all of them installed tons of stuff that I don't need at all: LESS files (Bootstrap), QUnit tests for the framework (Backbone), README.md files, documentation source code, and so on.
It's absolute madness in here.
I've searched the package index a bit and I've found a leaner version of Bootstrap called bootstrap.css, but after installing I noticed it is still version 2.3.2, so pretty much outdated.
Isn't there a way to install up-to-date dist versions of all those libraries?
The idea of having a package manager is nice, but it seems a bit off-putting to have my application source bloated with all this stuff. I most definitely do not need the Backbone documentation installed on my web server.
This is a matter of package authors not configuring their bower.json to ignore those extraneous files and folders. Additionally, not all package authors configure their bower.json to list the main file(s) for their package.
You can see how, without those two pieces of information (which files aren't needed, and which ones are the "main" files), Bower, or any other tool, can't reliably guess what is necessary and what is junk.
As far as bloating your server goes: ideally, you shouldn't be committing Bower components at all. You would have a build process that takes your source files, wherever they may be on disk, and morphs them into one minified file.
You can try bowercopy.
What does it do?
Download all the bower components listed in bower.json
Copy files you need to specified folder
(Optional) Remove the bower_components folder.
Every time you run the bowercopy task, it will do the process above.
A Grunt config example:
bowercopy: {
    options: {
        destPrefix: 'app/jslib', // the destination folder
        clean: true              // optional: remove bower_components afterwards
    },
    dist: {
        // List all the files you need here
        src: 'backbone/backbone.js' // "src" can also be an array
    }
}
Yes, you have to specify all the files you need one by one, but it does achieve your goal.

In Delphi, should I add shared units to my projects, to a shared package, or neither?

This question is similar to this one, but not a duplicate because I'm asking about issues not discussed in that question.
I have a client-server project in Delphi 7 with the following directory structure:
\MyApp
    \MyClientApp
    \MyServerApp
    \lib
There are 2 actual Delphi projects (.dpr), one each in the MyClientApp and MyServerApp folders.
The lib folder has .pas units containing code common to the client and server apps. What I'm wondering is whether I should include those .pas files in the client and server projects. Or should I create a package in the lib folder which includes those units? Or should I just leave the .pas files sitting in the lib folder and not add them to any app/package?
What are the pros/cons of each approach? Which way is "best"? Is there any issue with having those units from the lib folder be included in more than one project?
Right now the units in the lib folder are not a part of any app/package. One disadvantage of this is that when I have my client app open in Delphi, for example, and I want to search in all files in the project for something, it doesn't also search in the units in the lib folder. I get around this by opening those units and doing a find in all open files, or using grep search (but I'd prefer a better solution).
I would also greatly prefer a solution where I will not have to go and open some separate package and recompile it when I make changes to those files in the lib folder (is this where I should use a project group?).
Sharing units between applications always carries the risk that an incompatible change made for one application breaks the other. On the other hand, making copies of these units is even worse, so your approach of moving them to their own subdirectory at least adds a psychological barrier to changing them without considering the other programs.
As for adding them to the project files: I usually add to the project those units which I frequently access (either for extending or for reference) from the IDE, and leave the others for the compiler to pick up via the search path. I do that on a per-project basis, which means some units may be part of several projects; why not?
Putting them into a package only makes sense if you actually want to create a package-based application; otherwise, why bother?
For more info on how I organize my projects and libraries, see http://www.dummzeuch.de/delphi/subversion/english.html
I dislike having files shared by projects. All too often, you'll be tempted to edit one of the shared files, and you'll either break something in the other project, or you'll forget that you have to rebuild the other project at all.
When the shared files are instead separated into their own library (package), then there's a little extra barrier to editing them. I consider that a good thing: it's a light reminder that you're switching from project-specific code to shared code. You can use project groups to keep everything together in a single IDE instance. Arrange the library projects ahead of the executable projects; the "Build All" command will build everything in order, starting with the first project.
Keep your DCU files separate from your PAS files. You can do this easily by setting the "DCU output directory" project option to send your package's units to some other location. Then put that destination directory on your other projects' "search path." They'll find the DCU, but they won't find the PAS file, and so no other project will accidentally recompile a unit that isn't really a member.
Having a separate package also discourages use of project-specific conditional defines. Those cause all sorts of trouble when you're sharing units between projects. Find a way to instead keep all project-specific options within the respective projects. A shared library shouldn't require project-specific modifications. If a library needs to act differently based on who's using it, then employ techniques like callback functions that the library user can set to modify the library's behavior.
I would need a very good reason to add shared code to a package. If you just have a few shared files, stick them all in a directory called Shared. This should make it obvious that the files are shared between projects.
Also use a good build tool to do automated builds so you will find out soon enough if you break something.
.bpl files are fine for components, but bring in serious added complexity for things like this, unless you have a huge amount of shared files.
I usually create a package with all the shared units, and just use the units.
If you do not explicitly enable "Build with runtime packages", the package contents (all used DCUs) will be linked into your project like any other units.
I would only use runtime packages if you actually had two binaries that were supposed to run on the same physical machine and that shared some code. Keep in mind that runtime packages are mostly an all-or-nothing approach. Once you decide to use them you will also no longer be able to link the RTL and VCL units straight into your projects and will instead have to deploy those separately as well.
However, packages might still be a good solution to your problem when combined with project groups which is exactly what I'm doing. I hate having shared units included in multiple projects. Including the shared units in a package (but not compiling your actual projects with runtime packages) allows you to add that package to your project group so you (and the IDE!) will always have them easily accessible yet nicely separated from the project-specific code. Strictly speaking you don't even ever have to compile those packages. They can merely serve as an organisational unit in the project manager.
Note that for Find in Files, you can also specify "in all files in project group".

Directory Layout for Erlang Services?

In our Java applications we typically use the Maven conventions (docs, src/java, test, etc.). For Perl we follow similar conventions, only using a top-level 'lib', which is easy to add to Perl's @INC.
I'm about to embark on creating a service written in Erlang, what's a good source layout for Erlang applications?
The recommended standard Erlang/OTP directory structure can be found here.
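In short, the conventional layout of a single OTP application looks like this (myapp is a placeholder name):
myapp/
    src/        Erlang source files (.erl) and internal include files
    ebin/       compiled code (.beam) and the myapp.app resource file
    include/    public include files (.hrl)
    priv/       application-specific data files
    doc/        documentation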
In addition you may need a few more directories depending on your project, common ones are (credit to Vance Shipley):
lib: OS driver libraries
bin: OS executables
c_src: C language source files (e.g. for drivers)
java_src: Java language source files
examples: Example code
mibs: SNMP MIBs
Other projects such as Mochiweb have their own structures; Mochiweb even has a script to create it all for you. Other projects, such as Erlware, overlay theirs on the standard structure.
Another critical directory is the priv directory. Here you can store files that can easily be found from your applications.
code:priv_dir(Name) -> string() | {error, bad_name}
where Name is the name of your application.
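For example, to locate a data file shipped in the priv directory (the application and file names here are placeholders):
%% e.g. returns ".../lib/my_app-1.0/priv/schema.sql"
Path = filename:join(code:priv_dir(my_app), "schema.sql").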
Erlware is changing that: in a couple of days the Erlware structure will be exactly that of Erlang/OTP. Actually, the structure of app packages is already exactly that of OTP, as specified above. What will change is that the Erlware installed directory structure will fit exactly over an existing Erlang/OTP install (though of course an existing install is not needed in order to use Erlware). Erlware can now be used to add packages to an existing install very easily.
Cheers,
Martin
