So I'm browsing through a proprietary Nix codebase, trying to make sense of it. Among other things, it builds Rust packages. There is some autogenerated Nix code, produced by crate2nix, which invokes nixpkgs.buildPackages. Googling this expression gives me scattered results about people using it to compile many languages, but I cannot find a single doc that mentions it. Is this some builtin Nix expression? Where is it documented?
This looks like the package set used to explicitly reference packages that run on the build machine during cross-compilation (see pkgs/top-level/stage.nix in Nixpkgs).
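A minimal sketch of how it shows up in cross-compilation (the crossSystem value here is just an example):

```nix
let
  # Cross-compile for aarch64: `pkgs` targets the aarch64 host, while
  # `pkgs.buildPackages` holds tools that run on the build machine.
  pkgs = import <nixpkgs> {
    crossSystem = { config = "aarch64-unknown-linux-gnu"; };
  };
in
  # A compiler that runs on the build machine but emits aarch64 code;
  # this is the set nativeBuildInputs are resolved from. For a native
  # (non-cross) import, buildPackages is essentially pkgs itself.
  pkgs.buildPackages.stdenv.cc
```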
$ nix repl
Welcome to Nix version 2.3.10. Type :? for help.
nix-repl> :l <nixpkgs>
Added 13917 variables.
nix-repl> :t buildPackages
a set
nix-repl> builtins.length (builtins.attrNames buildPackages)
13918
nix-repl> lib.lists.take 20 (builtins.attrNames buildPackages)
[ "AAAAAASomeThingsFailToEvaluate" "AMB-plugins" "ArchiSteamFarm" "AusweisApp2" "CHOWTapeModel" "CoinMP" "DisnixWebService" "EBTKS" "EmptyEpsilon" "FIL-plugins" "Fabric" "LAStools" "LASzip" "LASzip2" "Literate" "MACS2" "MIDIVisualizer" "MMA" "NSPlist" "OSCAR" ]
I'm trying to set up a Nix cache containing all the store paths needed to build a simple derivation. The goal is for this to work on an empty store, with no cache misses so I don't have to hit cache.nixos.org at all. I'm having trouble because Nix seems to download a bunch of extra bootstrapping stuff, apparently to help with fetching.
For example, consider this derivation:
# empty_store_test_simple.nix
with { inherit (import <nixpkgs> {}) fetchFromGitHub; };
(fetchFromGitHub {
owner = "codedownio";
repo = "templates";
rev = "ba68b83d25d2b74f5475521ac00de3bbb884c983";
sha256 = "sha256-LNTi1ZBEsThmGWK53U9Na1j5DKHljcS42/PRXj97p6s=";
})
If I build this on an empty store with --dry-run, I see the following:
λ nix build --impure --store ~/experimental-store --substituters https://cache.nixos.org/ --dry-run -f ./empty_store_test_simple.nix
this derivation will be built:
/nix/store/jr55kq3kk6va95rvjcdyn5jmh059007p-source.drv
these 48 paths will be fetched (24.52 MiB download, 121.41 MiB unpacked):
/nix/store/02bfycjg1607gpcnsg8l13lc45qa8qj3-libssh2-1.10.0
/nix/store/0fi0432kdh46x9kbngnmz2y7z0q68cdz-xz-5.2.5-bin
/nix/store/0rizskpri8d8qawx6qjqcnvlxcvzr1bm-keyutils-1.6.3-lib
/nix/store/1l4r0r4ab3v3a3ppir4jwiah3icalk9d-zlib-1.2.11
/nix/store/1xyz8jwyg9rya2f7gs549c7n2ah378v6-stdenv-linux
/nix/store/3h4a92kysiw3s3rvbsa6a2nys3lf8f8v-libkrb5-1.18
/nix/store/3ibnw61rlgj2lj5hycy2dn3ybpq7wapm-libev-4.33
/nix/store/5qbrz5fimkbywws73vaim8allyh8kjy5-nghttp2-1.43.0-bin
/nix/store/67x6kxbanrqafx3hg7pb3bc83i3d1v3f-gzip-1.11
/nix/store/6irxz4fbf1d1ac7wvdjf8cqb3sgmnvg8-zlib-1.2.11-dev
/nix/store/71pachqc22wlvf3xjhwjh2rqbl6l3ngg-diffutils-3.8
/nix/store/9mp06ni69a44dmrjhn28mn15brdry52w-gnused-4.8
/nix/store/9ppi191zsi7zvynkm8vy2bi22lci9iwg-bzip2-1.0.6.0.2
/nix/store/c9f15p1kwm0mw5p13wsnvd1ixrhbhb12-gcc-10.3.0-lib
/nix/store/d1n274a607fmqdgr7888nq19hdsj7av0-openssl-1.1.1l-bin
/nix/store/d8p27w4d21xs6svkaf3ij60lsw243rn2-openssl-1.1.1l-dev
/nix/store/fdbwa5jrijn0yzwl8l4xdxa0l5daf5j6-curl-7.79.1
/nix/store/fvprxgcxf4px865gdjd81fbwnxcjrg41-coreutils-9.0
/nix/store/gf6j3k1flnhayvpnwnhikkg0s5dxrn1i-openssl-1.1.1l
/nix/store/gmnh4jfjhx83aggwgwzcnrwmpmqr8fwf-gnutar-1.34
/nix/store/gmzhclix3kzhir5jmmwakwhpg6j5zwf1-acl-2.3.1
/nix/store/h0b8ajwz9lvw3a3vqrf41cxrhlx9dz7p-nghttp2-1.43.0-lib
/nix/store/h7srws2r1nalsih91lrm0hfhhar14jzm-libkrb5-1.18-dev
/nix/store/h97sr1q1rpv1ry83031q51jbkba7q0m4-bzip2-1.0.6.0.2-bin
/nix/store/ihscadskdrvwc9dvbirff51lr70cphjj-curl-7.79.1-bin
/nix/store/ikvp5db9hygc14da45lvxi1c9b4ylna9-pcre-8.44
/nix/store/ilszk5f0zcv8lifkixg47ja1f2lsgxkd-nghttp2-1.43.0-dev
/nix/store/k0qa3rjifblr2vrgx4g54a59zxlfhg90-xz-5.2.5
/nix/store/kd14wd2wfmb56zpv5y71yq2lqs11l06k-attr-2.5.1
/nix/store/ksqy6mszsld4z3w8ybxa2vkjf5cqxw3f-c-ares-1.17.2
/nix/store/l0wlqpbsvh1pgvhcdhw7qkka3d31si7k-bash-5.1-p8
/nix/store/lhambyc1v2c7qzzr5sq7p449xs1j6pg8-gnugrep-3.7
/nix/store/lypy3bif096j0qc1divwa87gdvv3r575-curl-7.79.1-dev
/nix/store/p12km8psjlmvbmi52wb9r6gfykqxcdnd-libssh2-1.10.0-dev
/nix/store/pkpynsyxm8c38z4m8ngv52c7v8vhkr2h-unzip-6.0
/nix/store/prq96vz3ywk955nnxlr7s892wf5qvbr0-mirrors-list
/nix/store/psqacrv7k5fxz6mdiawc28sxcdchb4c9-ed-1.17
/nix/store/qbdsd82q5fyr0v31cvfxda0n0h7jh03g-libunistring-0.9.10
/nix/store/qzr7r4w5gm5m20afn2wz4vlv7ah4sr89-gnumake-4.3
/nix/store/r5niwjr8r8qags2bzv9z583r9vajxag3-patchelf-0.13
/nix/store/rnx655nq2qs53yb5arv2gapa91r1wsbn-findutils-4.8.0
/nix/store/scz4zbxirykss3hh5iahgl39wk9wpaps-libidn2-2.3.2
/nix/store/sqn31ly001033hsz0dpxwcsay5qdbk2w-gawk-5.1.1
/nix/store/vslsa0l17xjcrdgm2knwj0z5hlvf73m7-perl-5.34.0
/nix/store/x6pz7c0ffcd6kxzc8m1rflvqmdbjiihh-nghttp2-1.43.0
/nix/store/yj11v0gdjqli4nzax4x48xjnh9y36b2q-curl-7.79.1-man
/nix/store/z56jcx3j1gfyk4sv7g8iaan0ssbdkhz1-glibc-2.33-56
/nix/store/zjm4xv4nr872mdhvv3j22bzb08rgf1hk-patch-2.7.6
However, I can't find all these paths by using the means I would expect:
λ nix repl empty_store_test_simple.nix
nix-repl> :b with import <nixpkgs> {}; closureInfo { rootPaths = [inputDerivation]; }
This derivation produced the following outputs:
out -> /nix/store/1iqm7sr2rr6i5njnfxlqqyzi567mb4cz-closure-info
λ cat /nix/store/1iqm7sr2rr6i5njnfxlqqyzi567mb4cz-closure-info/store-paths
/nix/store/icvdinlsgl6y2kxk9wzkj82a53jgpdlm-source
I would expect to see ~48 paths here, but I only see 1! How can I get all the build-time dependencies indicated by the dry run? I've seen this kind of issue in the past when IFD is present; could there be some IFD going on in Nixpkgs?
If you build your derivation and store it in your own Nix cache, that should work; Nix shouldn't need the build-time dependencies of that thing, since it can just download the result (i.e. the source code you're fetching) from your cache.
If you want to get the build-time dependencies anyway, try:
nix-store -qR $(nix-instantiate empty_store_test_simple.nix)
I think this will get you one step closer, but it's not a complete solution; you'd probably still need to build all the derivations in this list, or something along those lines.
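A sketch of that idea (untested; assumes the empty_store_test_simple.nix file from the question and a local file:// binary cache as the target):

```
# Instantiate to get the .drv, then list its full build-time closure.
drv=$(nix-instantiate empty_store_test_simple.nix)
nix-store --query --requisites "$drv"

# Realise (build or substitute) everything in that closure, then copy
# the paths, including outputs, to a local binary cache.
nix-store --realise $(nix-store --query --requisites "$drv")
nix copy --to file:///tmp/my-cache $(nix-store --query --requisites --include-outputs "$drv")
```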
I'm trying to write a simple machine learning application in Ada, and also trying to find a good framework to use. My knowledge of one is extremely minimal, and of the other somewhat minimal.
There are several nifty machine learning frameworks out there, and I'd like to leverage one from an Ada program, but I guess I'm just... at a loss. Can I use an existing framework written in Python, for instance, and wrap (or, I guess, bind?) the API calls in Ada? Should I just hand the work off to its scripting capabilities? I'm trying to figure it out.
Case in point: Scikit (sklearn)
https://scikit-learn.org/stable/tutorial/text_analytics/working_with_text_data.html#
This does some neat stuff, and I'd like to be able to leverage this, but with an Ada program. Does anyone have advice from a similar experience?
I am just researching, so I have tried finding information.
http://www.inspirel.com/articles/Ada_Python_Binding.html
The Inspirel solution is based on Python 2.7. If you're using anything from Python 3.5 onwards, a few modifications need to be made. On Linux, changing to, say, Python 3.7, you'd just change
--for Default_Switches ("Ada") use ("-lpython2.7");
for Default_Switches ("Ada") use ("-lpython3.7");
but on Windows the libraries aren't placed in a common lib directory, so GNAT doesn't know where to find them; each package is kept separately. A -L switch has to be added to tell the linker where to find the library. Alternatively, you can use a library directory attribute in the project file. In my case I did a non-admin install of Python, so it looks something like
for Default_Switches ("Ada") use ("-L\Users\StdUser\AppData\Local\Programs\Python\Python37-32\libs", "-lpython37");
Note that on Windows the library is called python37, not python3.7. Use gprbuild rather than gnatmake -p, which has been deprecated. If you make all your modifications correctly,
gprbuild ada_main.gpr
should give you an executable in obj\ada_main.exe if it builds. If a later version of Python is used, some edits need to be made:
python_module.py
#print 'Hello from Python module'
print('Hello from Python module')
#print 'Python adding:', a, '+', b
print('Python adding:', a, '+', b)
ada_main.adb
-- Python.Execute_String("print 'Hello from Python!'");
Python.Execute_String("print('Hello from Python!')");
Some routines have been deprecated, so the imports have to change:
python.adb
--pragma Import(C, PyInt_AsLong, "PyInt_AsLong");
pragma Import(C, PyInt_AsLong, "PyLong_AsLong");
--pragma Import(C, PyString_FromString, "PyString_FromString");
pragma Import(C, PyString_FromString, "PyUnicode_FromString");
Running the build and executable should give
C:\Users\StdUser\My Documents\ada-python>gprbuild ada_main.gpr
Compile
[Ada] ada_main.adb
Bind
[gprbind] ada_main.bexch
[Ada] ada_main.ali
Link
[link] ada_main.adb
C:\Users\StdUser\My Documents\ada-python>obj\ada_main.exe
executing Python directly from Ada:
Hello from Python!
loading external Python module and calling functions from that module:
Hello from Python module!
asking Python to add two integers:
Python adding: 10 + 2
Ada got result from Python: 12
we can try other operations, too:
subtract: 8
multiply: 20
divide : 5
Remember to put the pythonxx.dll somewhere on your path otherwise it won't be able to find the library when it starts executing.
Elixir source may be injected using Code.eval_string/3. I don't see mention of running raw Erlang code in the docs:
https://hexdocs.pm/elixir/Code.html#eval_string/3
I am coming from the Scala world, in which Java classes are callable using Scala syntax, and Scala compiles to Java bytecode that can be inspected by intercepting the compiler output (generated directly with scalac).
I get the sense that Elixir does not provide such interoperating features, nor allow injection of custom Erlang into the runtime. Is this the case?
You can use the Erlang standard library modules from Elixir, as described here or here.
For example:
def random_integer(upper) do
  :rand.uniform(upper) # :rand is an Erlang stdlib module
end
You can also add Erlang packages to your mix.exs dependencies and use them in your project, as long as those packages are published on Hex or on GitHub.
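For instance, a mix.exs dependency entry pulling in an Erlang package (jsx, an Erlang JSON library on Hex, chosen purely as an illustration):

```elixir
# In mix.exs — deps/0 returns the project's dependency list.
defp deps do
  [
    # An Erlang package from Hex; called from Elixir as :jsx.encode/1 etc.
    {:jsx, "~> 3.1"}
  ]
end
```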
You can also use Erlang and Elixir code together in a project, as described here.
So yes, it's perfectly possible to call Erlang code from Elixir.
Vice versa is also possible; see here for more information:
Elixir compiles into BEAM byte code (via Erlang Abstract Format). This
means that Elixir code can be called from Erlang and vice versa,
without the need to write any bindings.
Expanding on what @zwippie has written:
All remote function calls (by that I mean calling a function with an explicitly set module or alias) take the form:
<atom with module name>.<function name>(<arguments>)
# Technically it is the same as:
# apply(module, function_name_as_atom, [arguments])
And all "upper case module names" in Elixir are just atoms:
is_atom(Foo) == true
Foo == :"Elixir.Foo" # => true
So from Elixir's viewpoint there is no difference between calling Erlang functions and Elixir functions; it is just a different atom passed as the receiving module.
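A quick sketch of that equivalence (safe to paste into iex):

```elixir
# Elixir module names are atoms with an "Elixir." prefix:
Enum == :"Elixir.Enum"                 # => true

# Calling an Erlang module is a call on a bare atom:
:lists.reverse([1, 2, 3])              # => [3, 2, 1]

# apply/3 makes the symmetry explicit — only the module atom differs:
apply(:lists, :reverse, [[1, 2, 3]])   # => [3, 2, 1]
apply(Enum, :reverse, [[1, 2, 3]])     # => [3, 2, 1]
```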
So you can easily call Erlang modules from Elixir. That means that, without much hassle, you should be able to parse and evaluate Erlang code from within Elixir as well:
"rand:uniform(100)"
|> :merl.quote()
|> :erl_eval.expr(#{})
No need for any mental translation.
Additionally, you can mix Erlang and Elixir code in a single Mix project without any problems, with a tree structure like:
.
├── mix.exs
├── src
│   └── example.erl
└── lib
    └── example.ex
Where example.erl is:
-module(example).
-export([hello/0]).
hello() -> <<"World">>.
And example.ex:
defmodule Example do
def print_hello, do: IO.puts(:example.hello())
end
You can compile the project and run it with
mix run -e "Example.print_hello()"
And see that the Erlang module was successfully compiled and executed from within the Elixir code in the same project, without problems.
One more thing to watch for when calling Erlang code from Elixir: Erlang uses charlists for strings. When you call an Erlang function that takes a string, convert the Elixir string to a charlist first, and convert any returned charlist back to a string.
Examples:
iex(17)> :string.to_upper "test"
** (FunctionClauseError) no function clause matching in :string.to_upper/1
The following arguments were given to :string.to_upper/1:
# 1
"test"
(stdlib 3.15.1) string.erl:2231: :string.to_upper/1
iex(17)> "test" |> String.to_charlist() |> :string.to_upper
'TEST'
iex(18)> "test" |> String.to_charlist() |> :string.to_upper |> to_string
"TEST"
iex(19)>
One of the packages I'm trying to debug is hidden behind a few layers of derivations. I've found a reference in one of the paths, but that's a string I can't do much with inside nix repl. Is there a way to go from a Nix store path (which doesn't exist yet, because that's the derivation that fails) to a Nix derivation object?
nix-repl> de.dev.packages.hie-bios.pkgs
"[{\"paths\":[\"/nix/store/f04qyvqaj6s6y5f5a7svpfppsq5wx2p6-haskell-ide-engine-ghc864-7541d1ec71\"],\"priority\":-864}]"
Doesn't seem like you can. The Nix language does let you access the string context, but the string context is not intended to reproduce a derivation, let alone the original expression that the derivation came from.
nix-repl> :p builtins.getContext "example string ${(import <nixpkgs> {}).hello.outPath}"
{ "/nix/store/m2capxzda4ams4fi3awmriz7hfkdxyp9-hello-2.10.drv" = { outputs = [ "out" ]; }; }
Technically you can read the derivation contents and parse the ATerm inside etc etc, but that's really not supported. It doesn't handle string contexts, probably won't let you build anything and will kill your dog. And even then, you don't get the original expression back.
nix-repl> :p builtins.readFile (builtins.head (builtins.attrNames (builtins.getContext "example string ${(import <nixpkgs> {}).hello.outPath}")))
"Derive([(\"out\",\"/nix/[...]
It's probably best to solve your hie-bios problem directly.
In the recent Erlang R14, inets' file httpd.hrl has been moved from:
src/httpd.hrl
to:
src/http_server/httpd.hrl
The Erlang Web framework includes the file in several places using the following directive:
-include_lib("inets/src/httpd.hrl").
Since I'd love the Erlang Web to compile with both versions of Erlang (R13 and R14), what I'd need ideally is:
-ifdef(OLD_ERTS_VERSION).
-include_lib("inets/src/httpd.hrl").
-else.
-include_lib("inets/src/http_server/httpd.hrl").
-endif.
Even though it is possible to retrieve the ERTS version via:
erlang:system_info(version).
That's indeed not possible at pre-processing time.
How to deal with these situations? Is the parse transform the only way to go? Are there better alternatives?
Not sure if you'll like this hackish trick, but you could use a parse transform.
Let's first define a basic parse transform module:
-module(erts_v).
-export([parse_transform/2]).

parse_transform(AST, _Opts) ->
    io:format("~p~n", [AST]),
    AST.
Compile it, then you can include both headers in the module you want this to work for. This should give the following:
-module(test).
-compile({parse_transform, erts_v}).
-include_lib("inets/src/httpd.hrl").
-include_lib("inets/src/http_server/httpd.hrl").
-export([fake_fun/1]).
fake_fun(A) -> A.
If you're on R14B and compile it, you should have the abstract format of the module looking like this:
[{attribute,1,file,{"./test.erl",1}},
{attribute,1,module,test},
{error,{3,epp,{include,lib,"inets/src/httpd.hrl"}}},
{attribute,1,file,
{"/usr/local/lib/erlang/lib/inets-5.5/src/http_server/httpd.hrl",1}},
{attribute,1,file,
{"/usr/local/lib/erlang/lib/kernel-2.14.1/include/file.hrl",1}},
{attribute,24,record,
{file_info,
[{record_field,25,{atom,25,size}},
{record_field,26,{atom,26,type}},
...
What this tells us is that we can use both headers, and the valid one will automatically be included while the other will error out. All we need to do is remove the {error,...} tuple and get a working compilation. To do this, fix the parse_transform module so it looks like this:
-module(erts_v).
-export([parse_transform/2]).

parse_transform(AST, _Opts) ->
    walk_ast(AST).

walk_ast([{error,{_,epp,{include,lib,"inets/src/httpd.hrl"}}}|AST]) ->
    AST;
walk_ast([{error,{_,epp,{include,lib,"inets/src/http_server/httpd.hrl"}}}|AST]) ->
    AST;
walk_ast([H|T]) ->
    [H|walk_ast(T)];
walk_ast([]) ->
    [].
This will remove the include error, but only for those precise headers; other broken includes will fail as usual.
I haven't tested this on all versions, so if the behaviour changed between them, this won't work. On the other hand, if it stayed the same, this parse transform will be version independent, at the cost of needing to control the compile order of your modules (the parse transform must be compiled first), which is simple enough with Emakefiles and rebar.
If you are using makefiles, you can do something like
ERTS_VER = $(shell erl +V 2>&1 | egrep -o '[0-9]+\.[0-9]+\.[0-9]+')
then match the string and define a macro in the erlc arguments or in your Emakefile.
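A sketch of that approach (the version pattern and the macro name OLD_ERTS_VERSION are illustrative assumptions; R13 shipped ERTS 5.7.x):

```
ERTS_VER = $(shell erl +V 2>&1 | egrep -o '[0-9]+\.[0-9]+\.[0-9]+')

# Define the macro only for R13-era runtimes (ERTS 5.7.x).
ifneq (,$(filter 5.7.%,$(ERTS_VER)))
  ERLC_FLAGS += -DOLD_ERTS_VERSION
endif

%.beam: %.erl
	erlc $(ERLC_FLAGS) $<
```

The -ifdef(OLD_ERTS_VERSION) guard from the question then selects the matching include path at compile time.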
There is no other way, AFAIK.