I'm writing a new C/C++ library with tasks. It's low level, so it'll need the ability to tune the build for CPU, OS, whether libraries (or tasks) are built opt or dbg, and which network library is used (e.g. generic TCP, Solarflare, Mellanox, Infiniband). This seems like an ideal Bazel use case.
I have basic Bazel working, e.g. I can build sample tasks and libraries with dependencies, and so far that works nicely.
Now, to the point: since the code is just C with some C++, it seems impractical to build a whole new toolchain. What Bazel ships with and/or can do with GCC/Clang is good enough; most defaults are fine, so can't I just customize it? Still, I need a simple way to accomplish the following:
allow developers to choose their compiler (typically clang or gcc) and its version
allow developers to decide how to build their code, e.g. x86-32 dbg libraries but x86-32 opt binaries, or 64-bit versions of the same ...
allow developers to select which network library to link in, and therefore build it but not the others
for dbg builds, developers may want to toss in an extra compilation flag and be assured it's used everywhere when building libraries and tasks
limit developers to valid configurations. For example, if I know the code works on Intel x86 chips (32-bit or 64-bit) but PPC processors aren't supported, then developers may have lots of dials to turn for x86, but it's desirable to stop with an error when cpu=ppc (see the sketch after this list)
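To make the last two points concrete, here is roughly the kind of thing I'm picturing (a sketch only; the --define key, the setting names, and the library targets are invented for illustration):

# BUILD sketch: setting names, the --define key, and library targets are made up.
config_setting(
    name = "net_solarflare",
    define_values = {"net": "solarflare"},   # selected with: bazel build --define net=solarflare //...
)

config_setting(
    name = "cpu_x86_64",
    values = {"cpu": "k8"},
)

config_setting(
    name = "cpu_x86_32",
    values = {"cpu": "piii"},
)

cc_library(
    name = "net",
    deps = select({
        ":net_solarflare": [":net_solarflare_impl"],
        "//conditions:default": [":net_tcp_impl"],
    }),
)

cc_binary(
    name = "task",
    srcs = ["task.c"],
    deps = [":net"] + select(
        {
            ":cpu_x86_32": [],
            ":cpu_x86_64": [],
        },
        no_match_error = "only x86 (32- or 64-bit) CPUs are supported",  # e.g. --cpu=ppc stops with an error
    ),
)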
Which basic approach is best?
allow Bazel to auto-discover the platform and toolchain, and instruct developers to modify it to the task? How?
provide a copy-and-pasted-and-edited custom C/C++ toolchain .bzl file?
Focus on CROSSTOOL only?
Ship the library with customized platform and toolchain files?
TIA
Having just switched to VS2019 I’m exploring whether to use code analysis. In the project properties, “code analysis” tab, there are numerous built-in Microsoft rule sets, and I can see the editor squiggles when my code violates one of these rules. I can customise these rule sets and “save as” to create my own.
I have also seen code analyser NuGet packages such as “Roslynator” and “StyleCop.Analyzers”. What’s the difference between these and the built-in MS rules? Is it really just down to more comprehensive sets of rules/more choice?
If I wanted to stick with the built-in MS rules, are there any limitations? E.g. will they still get run and be reported on during a TFS/Azure DevOps build?
What's the difference between legacy FxCop and FxCop analyzers?
Legacy FxCop runs post-build analysis on a compiled assembly. It runs as a separate executable called FxCopCmd.exe. FxCopCmd.exe loads the compiled assembly, runs code analysis, and then reports the results (or diagnostics).
FxCop analyzers are based on the .NET Compiler Platform ("Roslyn"). You install them as a NuGet package that's referenced by the project or solution. FxCop analyzers run source-code based analysis during compiler execution. FxCop analyzers are hosted within the compiler process, either csc.exe or vbc.exe, and run analysis when the project is built. Analyzer results are reported along with compiler results.
Note
You can also install FxCop analyzers as a Visual Studio extension. In this case, the analyzers execute as you type in the code editor, but they don't execute at build time. If you want to run FxCop analyzers as part of continuous integration (CI), install them as a NuGet package instead.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-analyzers-faq?view=vs-2019
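For reference, installing the analyzers as a NuGet package boils down to a PackageReference in the project file, something like this (the version shown is just an example; use whatever is current):

<!-- .csproj fragment (example only) -->
<ItemGroup>
  <PackageReference Include="Microsoft.CodeAnalysis.FxCopAnalyzers" Version="2.9.8">
    <PrivateAssets>all</PrivateAssets>
  </PackageReference>
</ItemGroup>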
So, the built-in legacy FxCop and NuGet analyzers only run at build time, while the extension analyzers run live in the editor as you type (but not at build time). Also, you have to specifically say to run legacy code analysis on build, whereas the NuGet analyzers will run on build just because they are installed. And analyzers installed as NuGet packages or extensions won't run when you go to the menu option "Run Code Analysis".
At least, that's what I get out of that page.
There's a link near the bottom of that page that takes you to what code analysis rules have moved over to the new analyzers, including rules that are now deprecated.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-rule-port-status?view=vs-2019
The different analyzers attempt to cover different coding styles and things Microsoft didn't cover when they built FxCop. With the little research I just did on this, there's a whole rabbit hole to follow, Alice, that would take more time than I have right now to devote to it. And it seems to be filled with lots of arcane knowledge and OCD style code nitpicks that make Wonderland seem normal. But that's just my opinion.
There's lots of personal and professional opinion about various rules in these and the basic Microsoft rules, so there's plenty of room to use what you want and disable what you don't. For a beginner, I'd suggest turning on only a few rules at a time. That way you aren't inundated with more warnings and errors than lines of code you might have. Ok, so that might be a bit of an exaggeration, but there are so many rules that really are nitpicks, especially on legacy code, that they aren't really worth having enabled, since you likely won't have time to fix it all. You will also want to do basic research and use "common sense" when you decide what to enable. ("Do I really need to worry about variable capitalization coding style consistency on an app that's been ported into 4 different languages over 15+ years and has 10k files?") This is both personal and professional opinion here, so follow it or not.
And don't forget the rules that contradict each other. Those are fun to deal with.......
Problem
I've been developing a game in C++ in my spare time, and I've opted to use Bazel as my build tool since I have never had a ton of luck (or fun) working with make or cmake. I also have dependencies in other languages (Python for some of the high-level scripting). I'm using glfw for basic window handling and high-level graphics support, and that works well enough, but now comes the problem: I'm uncertain how I should handle dependencies like glfw in a Bazel world.
For some of my dependencies (like gtest and fruit) I can just reference them in my WORKSPACE file and Bazel handles them automagically but glfw hasn't adopted Bazel. So all of this leads me to ask, what should I do about dependencies that don't use Bazel inside a Bazel project?
Current approach
For many of the simpler dependencies I have, I simply created a new_git_repository entry in my WORKSPACE file and created a BUILD file for the library. This works great until you get to really complicated libraries like glfw that have a number of dependencies on their own.
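Concretely, each of those entries looks something like this (a sketch; the repository name, URL, commit, and BUILD file path are placeholders, and on older Bazel versions the load line isn't needed):

# WORKSPACE sketch: pull a non-Bazel library and overlay my own BUILD file onto it.
load("@bazel_tools//tools/build_defs/repo:git.bzl", "new_git_repository")

new_git_repository(
    name = "some_lib",
    remote = "https://github.com/example/some_lib.git",   # placeholder URL
    commit = "0123456789abcdef0123456789abcdef01234567",  # pin an exact commit
    build_file = "//third_party:some_lib.BUILD",           # BUILD file written by hand
)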
When building glfw for a Linux machine running X11, you now have a dependency on X11, which would mean adding X11 to my Bazel setup. X11 comes with its own set of dependencies (the X11 libraries like X11Cursor), and so on.
glfw also tries to provide basic joystick support, which is provided by default on Linux, which is great! Except that this is provided by the kernel, which means that the kernel is also a dependency of my project. While I shouldn't need anything more than the kernel headers, this still seems like a lot to bring in.
Alternative Options
The reason I took the approach I've taken so far is to make the dependencies required to spin up a machine that can successfully build my game very minimal. In theory they just need a C/C++ compiler, Java 8, and Bazel and they're off to the races. This is great since it also means I can create a Docker container that has Bazel installed and do CI/CD really easily.
I could sacrifice this ease and just say that you need to have libraries like glfw installed before attempting to compile the game, but that brings back the whole "which version is installed and how is it configured" problem that Bazel is supposed to help solve.
Surely there is a simpler solution and I'm overthinking this?
If the glfw project has no BUILD files, then you have the following options:
Build glfw inside a genrule.
If glfw supports some other build system like make, you could create a genrule that runs the tool. This approach has obvious drawbacks, like the not-to-be-underestimated impracticality of having to declare all inputs of that genrule, but it'd be the simplest way of Bazel'izing glfw.
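A rough sketch of what that could look like (untested; glfw actually uses CMake, so treat the make invocation and the paths as stand-ins):

# BUILD sketch: run the library's own build inside a genrule.
# Every file the external build reads has to be listed in srcs.
genrule(
    name = "glfw_build",
    srcs = glob(["glfw/**"]),
    outs = ["libglfw3.a"],
    cmd = """
        tmp=$$(mktemp -d)
        cp -r $$(dirname $(location glfw/Makefile))/* $$tmp/
        make -C $$tmp >/dev/null
        cp $$tmp/src/libglfw3.a $@
    """,
)

cc_library(
    name = "glfw",
    srcs = [":glfw_build"],
    hdrs = glob(["glfw/include/GLFW/*.h"]),
)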
Pre-build glfw.o and check it into your source tree.
You can create a cc_library rule for it, and put the .o file in the srcs. Even though this solution is the least flexible of all because you not only restrict the target platform to whatever the .o was built for, but also make it harder to reproduce the whole build, the benefits are sometimes worth the costs.
I view this approach as a last resort. Even in Bazel's own source code there's one cc_library.srcs that includes a raw object file, because it was worth it, as the commit message of 92caf38 explains.
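Such a rule would look roughly like this (a sketch; the file names and system link flags are assumptions):

# BUILD sketch: wrap a checked-in, prebuilt object file in a cc_library.
cc_library(
    name = "glfw_prebuilt",
    srcs = ["prebuilt/glfw.o"],           # object built outside Bazel, checked into the tree
    hdrs = glob(["include/GLFW/*.h"]),
    includes = ["include"],
    linkopts = ["-lX11", "-lXcursor"],    # system libraries glfw still needs at link time (assumption)
)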
Require that glfw be installed.
You already considered this option. Some people may prefer this to the other approaches.
I am very new to the Erlang programming language. Is there a standard build tool in Erlang?
I have googled and found the following, but I'm not sure which one I should use or what kind of occasion each is used for.
Rebar2: It was the first usable and de facto build tool, and most Erlang projects use it. It uses an Erlang script for fetching dependencies, compiling, testing and making releases of your project. However, it is not a modern build tool: it suffers from slow compilation during development, is difficult to use in larger projects, and is a bit hard for newcomers to understand.
Rebar3: It is a successor to Rebar2 that attempts to improve its mechanisms and provide new features in line with modern build tools. It is also easier for newcomers to use.
Erlang.mk: It is a big Makefile. As make is fast and available by default on every Unix system, your Erlang application's build can benefit from those properties. It has a package index of most well-known Erlang projects and other standard features like Rebar's. It is also faster than Rebar2 during development (although preliminary results show that Rebar3 is notably faster than Erlang.mk).
I myself use Rebar, and this possible duplicate of your question has two answers that recommend Rebar as well. But it is a matter of taste, and I recommend considering the different approaches and choosing whichever is closest to your purposes.
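For what it's worth, a minimal rebar3 project configuration is just a rebar.config of Erlang terms, roughly like this (the dependency shown is only an example):

%% rebar.config sketch: compiler options plus one example hex dependency.
{erl_opts, [debug_info]}.
{deps, [
    {cowboy, "2.9.0"}    %% example package name and version
]}.

Running rebar3 compile then fetches the declared dependencies and builds the project.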
Question: How do I write a single makefile that compiles for several different systems, environments, and sets of libraries at once?
Info:
I'm a student, and as such most of my work is done for the sake of learning how these things work. Right now I'm building a game engine from the ground up. I'd like to be able to make it cross-platform in terms of OS, but also for different environments. My target environments are both 32-bit and 64-bit (my desktop as well as my netbook), with a graphics card and with Mesa, and Linux and Windows. So overall it should output 8 binaries.
I'm still very new to make, as well as the whole concept of cross compiling. I imagine that the process of compiling more than one binary isn't hard, but where I'm kind of stuck is: how do I get it to attach the right libraries? The Ubuntu Linux vs the WinAPI libraries, 32-bit vs 64-bit libraries, etc. Is it even possible to get libraries in such a manner?
If you need me to clarify further I can. Thanks.
Addendum: Basically I want to know how to compile against headers for drivers I may not have. For example, I want to compile all the files on my netbook, including the ones for OpenCL; I don't want to run them, since my netbook has no GPU, I just want to compile. Conversely, I want to use my desktop to compile for my netbook, which uses Ocelot and Mesa for its GPU dealings, but my desktop does not have Mesa or Ocelot on it. That sort of thing. Thanks.
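To illustrate what I'm imagining (just a rough sketch on my part; the cross-compiler names, flags, and libraries are guesses, and recipe lines must be indented with tabs):

# Makefile sketch: one output per platform/bitness, each picking its own libraries.
SRCS := $(wildcard src/*.cpp)

all: engine-linux64 engine-linux32 engine-win64 engine-win32

engine-linux64: $(SRCS)
	g++ -m64 $^ -lGL -lX11 -o $@

engine-linux32: $(SRCS)
	g++ -m32 $^ -lGL -lX11 -o $@

engine-win64: $(SRCS)
	x86_64-w64-mingw32-g++ $^ -lopengl32 -lgdi32 -o $@.exe

engine-win32: $(SRCS)
	i686-w64-mingw32-g++ $^ -lopengl32 -lgdi32 -o $@.exe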
Apache Maven is a very popular build and dependency management tool in the Java open source ecosphere. I did some tests to find out if it can handle compiled Free Pascal / Delphi units and found it easy to implement. So it would be possible to
release open source libraries precompiled for Free Pascal (or Delphi) in a public Maven repository
include metadata in this repository which contains dependency information
use Maven on the command line to download the open source library from the public repository, and automatically resolve all dependencies
local repositories, working as proxies, could be used to cache frequently used binaries
automatic checksum generation and verification (provided by Maven) would reduce the risk of downloading corrupted binaries
source code and even documentation files could be provided with the binaries
binaries can be provided with or without debug information
continuous integration servers like Hudson, TeamCity or CruiseControl can be used to build projects whenever changes have been submitted to the source control system and notify developers about build errors
This way of dependency management could be very beneficial for open source projects which use many third party libraries with complex dependencies. It would avoid typical conflicts caused by using wrong versions.
For the developer, the workflow for editing and building a project would be reduced to a minimum:
checkout the project source from internal version control system
edit source file(s)
run mvn package to automatically download all required third party libraries (precompiled units) if they are not yet in the workstation's local repository
compile and run
The only additional file required by Apache Maven in the project folder is the pom.xml file containing the project information.
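A minimal pom.xml for such a precompiled-unit artifact could look roughly like this (group, artifact, and version names are placeholders):

<!-- pom.xml sketch: coordinates and the single dependency are placeholders -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.example.fpc</groupId>
  <artifactId>my-units</artifactId>
  <version>0.7.6</version>
  <packaging>pom</packaging>
  <dependencies>
    <dependency>
      <groupId>org.example.fpc</groupId>
      <artifactId>base-units</artifactId>
      <version>1.0.0</version>
    </dependency>
  </dependencies>
</project>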
Edit: while Maven is usable for some of the required tasks, implementing a solution like Maven in native Free Pascal would have some advantages: no Java SDK required, support for all development platforms where Free Pascal is available, maintenance and plugin development in Pascal.
Usage of a Maven-like tool would not be helpful only for open source projects; commercial projects could access and use the artifacts in public Maven repositories in the same way as well.
Maven features are listed at http://maven.apache.org/maven-features.html
Update:
one use case could be the build of Lazarus, where Maven would download all required libraries and invoke the compiler with the necessary build path arguments. Changes in the dependencies on lower levels would be propagated automatically up to the parent build.
Possible benefits:
less time needed to set up a new workstation, no manual installation of third party libraries required
fewer errors caused by wrong library versions, detection of version conflicts (for example if two libraries depend on different versions of a third library)
artifacts which are created in-house can be added to the local Maven repository and shared between developers and projects, central storage of all artifacts with metadata
builds are reproducible, just by using the same source and project metadata file (pom.xml)
can reduce development time and increase project stability
Update #2: FPMake
The FPMake build system for Free Pascal seems to be a tool with much potential; in many details it is quite similar to Maven:
FPMake is a pascal based build system developed for and distributed with FPC
FPMake standardizes the building by defining some limits like standard directories
the command fppkg <packagename> will look in a database for the package, extract it, and then compile fpmake.pp and run it
it has standard build targets (clean, build, install, ...)
it can create a 'manifest' file suitable for import into a repository (like mvn deploy or mvn install); the manifest is an XML file which looks very similar to a pom.xml in Maven:
FPMake manifest file:
<packages>
  <package name="my-package">
    <version major="0" minor="7" micro="6" build="1"/>
    <filename>my-package-0.7.6-1.zip</filename>
    <author>my name</author>
    <license>GPL</license>
    <homepageurl>http://www.freepascal.org/</homepageurl>
    <email>myname@freepascal.org</email>
    <description>this is the package description</description>
    <dependencies>
      <dependency>
        <package packagename="rtl"/>
      </dependency>
    </dependencies>
  </package>
</packages>
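For comparison, the fpmake.pp that fppkg compiles and runs is a small Pascal program along these lines (a sketch; package and unit names are placeholders):

{ fpmake.pp sketch: a minimal FPMake program; names are placeholders. }
program fpmake;

uses fpmkunit;

var
  P: TPackage;
begin
  with Installer do
  begin
    P := AddPackage('my-package');
    P.Version := '0.7.6-1';
    P.Dependencies.Add('rtl');        { depend on the RTL package }
    P.Targets.AddUnit('myunit.pp');   { unit to compile and install }
    Run;
  end;
end.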
Free Pascal has been working on a package system of its own, a cross between apt-get and FreeBSD ports style (download source, build, and install automatically), called fppkg.
However, work has stalled. People investing time are the bottleneck, not people wanting to choose tools.
As far as Maven goes, I don't like auxiliary tools that need installation of huge external runtimes. It might be fine for a big major app (like Open Office), but not for a utility.
I also prefer a tool that is designed to the FPC reality and workflow.
Documentation tools, build tools, download systems, and testsuite systems are already all there; it just needs a person who dedicates a lot of time to it to make it happen.
Some typical problems when introducing a new technology in a project such as FPC, and why it has a tendency to make its own tools:
need to train 20+ committers, who work part-time.
The only COMMON programming language you can assume is Free Pascal. Even Delphi inner workings can't be taken for granted to be known (many committers came directly to FPC or even still via TP or a Mac Pascal)
Obviously that makes something with plugins in a different language annoying.
Bash scripting is a close second, (g)make third, but already a magnitude less familiar.
All servers are *nix-like (FreeBSD, OS X, Linux), but not all run Apache. (e.g. my FreeBSD mirror runs XSHTTPD)
somebody very knowledgeable must be the dedicated maintainer for a long time: fix problems, update, do migrations, etc. Preferably more than one, for obvious reasons.
a major pain is the Linux distributions (and FreeBSD to a lesser degree): most maintainers of *nix packages are not capable of more than "./configure; make; make install", and must be spoonfed with a near-buildable repository and auxiliary files.
In-distribution packaging of FPC/Lazarus has always been important, and is still increasing
All distributions have their own special rules about metadata, dependencies, and how sources must be published. Debian/Ubuntu in particular is very bureaucratic and slow.
Most don't like third party auto-installers on top of their systems (since that bypasses their dependency control)
This all leads to the effective practice that own tools in Pascal with minimal scripting work best. Some tools used:
Gmake is mainly used to parameterise the build process on a per-directory level; a successor, fpcmake (not really a make derivative despite the name), has been begun, but the migration hasn't been completed.
LaTeX and a LaTeX-to-HTML conversion (tex4ht, but Debian uses hevea) are used to build the documentation (the non-library documentation)
The community site (Netscape community server, which uses TCL scripting; a heavy, complex application server) has been trouble ever since it started, but especially lately, since the maintainer became less active.
Mantis has been a problem (especially the email module, which would crash or bog down the server due to the volume), but it has been whipped into shape during successive updates and the hard work of several Lazarus devels. Currently it is a decent workhorse.
The lazarus.freepascal.org phpBB forum, OTOH, is relatively painless, since a lot of younger people know how to deal with it.
The same goes for Subversion (though the more advanced usage needs some adjusting; not everybody is deep into the ins and outs of merge tracking)
If somebody was really serious about Maven, I usually would ask him:
to CRITICALLY investigate the use for the project, in a very concrete way, with a schedule and time estimates. Bird's-eye level "everything's possible" overviews are essentially worthless.
Give some thought to future changes of the technologies used. Every technology is eventually replaced, even the in-house ones, in 18+ year projects. A new technology must not make migrations of other infrastructural components hard or involved. The new technology to end all new technologies doesn't exist.
Make a migration plan. Migration is often underrated and underestimated.
And in the end, there is always the 1,000,000 Euro question: who will do the daily maintenance?
Keep in mind that in a company you can just kick the person responsible for the application server. But in an informal environment this is way harder, especially long term, since people's lives, occupations and time spent on the project vary.
Sounds like an interesting plan, but the Delphi community (and FPC even more so, I'd imagine!) values libraries as source far more than precompiled libraries. The general consensus is that anyone who uses a binary-only library is a fool, for two reasons: You can't fix any bugs you find in it, and compiler changes will break compatibility.