I'm trying to add some patches to the llvm opam package, but I'm having trouble testing them: running opam install . from the package root seems to ignore the url section and never downloads and unpacks the source archive, so applying the patches fails.
This is the opam file for reference:
opam-version: "2.0"
maintainer: "Kate <kit.ty.kate#disroot.org>"
authors: [
"whitequark <whitequark#whitequark.org>"
"The LLVM team"
]
license: "MIT"
doc: "http://llvm.moe/ocaml"
bug-reports: "http://llvm.org/bugs/"
dev-repo: "git+http://llvm.org/git/llvm.git"
homepage: "http://llvm.moe"
install: [
["bash" "-ex" "install.sh" "%{conf-llvm:config}%" lib "%{conf-cmake:cmd}%" make]
]
depends: [
"ocaml" {>= "4.00.0"}
"ctypes" {>= "0.4"}
"ounit" {with-test}
"ocamlfind" {build}
"conf-llvm" {build & = version}
"conf-python-2-7" {build}
"conf-cmake" {build}
]
patches: [
"fix-shared.patch"
]
synopsis: "The OCaml bindings distributed with LLVM"
description: "Note: LLVM should be installed first."
extra-files: [
["link-META.patch" "md5=ef4ebb8706be2ed402f31fc351d7dc75"]
["install.sh" "md5=683ec0478ee422a57dcd3716277b3ef3"]
["fix-shared.patch" "md5=dce86b1db352332968ceb6d042b408a8"]
["META.patch" "md5=1d0af08bab7a0f831f68849b6556e414"]
["add-buildfence-llvm.ml.patch" "md5=a3bc667bd2fc937ee51c3b9d33b8ad63"]
["add-buildfence-llvm.mli.patch" "md5=99c739d74deeb1b990fe63cf914fc479"]
["add-buildfence-llvm_ocaml.c.patch" "md5=a29282f2e1e435abff57cecfd269ccb9"]
]
url {
src: "https://github.com/llvm/llvm-project/releases/download/llvmorg-11.1.0/llvm-11.1.0.src.tar.xz"
checksum: "sha256=ce8508e318a01a63d4e8b3090ab2ded3c598a50258cc49e2625b9120d4c03ea5"
}
and this is the output of running opam install . -vvv in the package root:
Processing 1/1: [llvm.11.0.0: rsync]
+ /usr/bin/rsync "-rLptgoDrvc" "--exclude" ".git" "--exclude" "_darcs" "--exclude" ".hg" "--exclude" ".#*" "--exclude" "_opam*" "--delete" "--delete-excluded" "/home/frabert/opam-repository/packages/llvm/llvm.11.0.0/" "/home/frabert/.opam/4.11.1/.opam-switch/sources/llvm"
- sending incremental file list
- ./
- out
-
- sent 828 bytes received 39 bytes 1,734.00 bytes/sec
- total size is 19,120 speedup is 22.05
[llvm.11.0.0] synchronised from file:///home/frabert/opam-repository/packages/llvm/llvm.11.0.0
+ /usr/bin/lsb_release "-s" "-r"
- 18.04
+ /usr/bin/ocamlc "-vnum"
- 4.05.0
The following actions will be performed:
∗ install llvm 11.0.0*
+ /usr/bin/rsync "-rLptgoDrvc" "--exclude" ".git" "--exclude" "_darcs" "--exclude" ".hg" "--exclude" ".#*" "--exclude" "_opam*" "--delete" "--delete-excluded" "/home/frabert/opam-repository/packages/llvm/llvm.11.0.0/" "/home/frabert/.opam/4.11.1/.opam-switch/sources/llvm"
- sending incremental file list
- ./
- opam
- out
- files/
- files/META.patch
- files/add-buildfence-llvm.ml.patch
- files/add-buildfence-llvm.mli.patch
- files/add-buildfence-llvm_ocaml.c.patch
- files/fix-shared.patch
- files/install.sh
- files/link-META.patch
-
- sent 20,648 bytes received 202 bytes 41,700.00 bytes/sec
- total size is 19,775 speedup is 0.95
[llvm.11.0.0] synchronised from file:///home/frabert/opam-repository/packages/llvm/llvm.11.0.0
<><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><>
+ /bin/cp "-PRp" "/home/frabert/.opam/4.11.1/.opam-switch/sources/llvm" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/link-META.patch" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/link-META.patch"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/install.sh" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/install.sh"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/fix-shared.patch" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/fix-shared.patch"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/add-buildfence-llvm_ocaml.c.patch" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/add-buildfence-llvm_ocaml.c.patch"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/add-buildfence-llvm.mli.patch" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/add-buildfence-llvm.mli.patch"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/add-buildfence-llvm.ml.patch" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/add-buildfence-llvm.ml.patch"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/overlay/llvm/files/META.patch" "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/META.patch"
+ /bin/cp "/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0/fix-shared.patch" "/home/frabert/.opam/log/processed-patch-13793-c743ac"
Processing 1/2: [llvm: patch]
+ /usr/bin/patch "-p1" "-i" "/home/frabert/.opam/log/processed-patch-13793-c743ac" (CWD=/home/frabert/.opam/4.11.1/.opam-switch/build/llvm.11.0.0)
- can't find file to patch at input line 5
- Perhaps you used the wrong -p or --strip option?
- The text leading up to this was:
- --------------------------
- |diff --git a/cmake/modules/AddOCaml.cmake b/cmake/modules/AddOCaml.cmake
- |index 554046b20..b27cbd36c 100644
- |--- a/cmake/modules/AddOCaml.cmake
- |+++ b/cmake/modules/AddOCaml.cmake
- --------------------------
- File to patch:
- Skip this patch? [y]
- Skipping patch.
- 1 out of 1 hunk ignored
<><> Error report <><><><><><><><><><><><><><><><><><><><><><><><><><><><><><><>
┌─ The following actions failed
│ λ build llvm 11.0.0
└─
╶─ No changes have been performed
Is this a known problem?
EDIT: Clarification regarding the workflow
I have a local git clone of the opam-repository, in which I have edited and committed the llvm.11.0.0 package definition.
To test the edits, I run opam install . from inside the llvm.11.0.0 directory which contains the opam file.
The correct¹ workflow for changing a package definition in ocaml/opam-repository is the following.
1. Clone the opam-repository:
   git clone git@github.com:ocaml/opam-repository.git
2. Add the local repository to the opam repository list:
   opam repo add local ./opam-repository
3. Make a copy of the package you want to change (using llvm as the working example); we use a -<patch> suffix since we're just adding a patch, not releasing a new version:
   cd opam-repository/packages/llvm/
   cp -r llvm.11.0.0/ llvm.11.0.0-1
4. Work on the patched version ...
5. Commit the work:
   git add llvm.11.0.0-1
   git commit -m 'wip'
6. Test it:
   opam update
   opam install llvm
¹ Well, at least it is the workflow that I use every day :)
I was using the wrong workflow, as hinted correctly by @ivg
The correct one, though, appears to be the one described here: https://github.com/ocaml/opam/issues/4654
Basically, I needed to add a local repository, and then install the llvm package as usual.
opam repo add local ~/opam-repository
opam install llvm
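If the package definition changes again later, the local repository has to be refreshed before reinstalling. A minimal sketch, assuming the repository was added under the name local as above:
opam update local
opam reinstall llvm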
Related
I'm attempting to compile a Rust project in Docker and it's proving frustrating. As far as I can tell it should be incredibly straightforward, but for some reason Docker can't find the Rust package.
Here's the Dockerfile:
FROM rust:1.6.1
COPY . .
RUN cargo build --release
CMD ["./run.sh"]
Here's the Cargo.toml:
[package]
name = "project1_1"
version = "0.1.0"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
utility = { path = "./src/utility" }
iced = "0.6.0"
iced_native = "0.4"
iced_style = "0.5"
iced_core = "0.5"
reqwest = { version = "0.11", features = ["json"] }
tokio = { version = "1", features = ["full"] }
webassembly = "0.8"
wry = "0.23.4"
And here's the output:
peterweyand@Peters-MacBook-Pro project1_1 % ./dockerrun.sh
2022/12/15 16:04:06 must use ASL logging (which requires CGO) if running as root
Sending build context to Docker daemon 1.351GB
Step 1/4 : FROM rust:1.6.1
manifest for rust:1.6.1 not found: manifest unknown: manifest unknown
Updating git repository `https://github.com/iced-rs/iced`
Updating crates.io index
error: no matching package found
searched package name: `iced`
perhaps you meant: cached
location searched: https://github.com/iced-rs/iced
required by package `project1_1 v0.1.0 (/)`
If instead of FROM rust:1.6.1 I use FROM rust:latest then Docker hangs like this:
peterweyand@Peters-MacBook-Pro project1_1 % ./dockerrun.sh
2022/12/15 16:08:18 must use ASL logging (which requires CGO) if running as root
Sending build context to Docker daemon 1.351GB
Step 1/4 : FROM rust:latest
---> 7767cd0ef4e0
Step 2/4 : COPY . .
---> 356cfb7caed9
Step 3/4 : RUN cargo build --release
---> Running in 87dbc6c86df9
Updating crates.io index
Does anyone have any ideas of what is going wrong?
EDIT: I usually suck at Docker because there are several ways of doing the same thing.
I tried this
FROM rust:1.66-alpine
WORKDIR /
COPY . .
RUN cargo install --path .
RUN apk add alpine-sdk # This one is necessary for linking cc.
COPY ./src/backend/Cargo.toml ./src/backend/Cargo.lock ./
RUN cargo build --release
CMD ["./run.sh"]
and it gave this output -
Compiling libloading v0.7.4
Compiling lock_api v0.4.9
Compiling parking_lot_core v0.9.5
Compiling toml v0.5.10
Compiling proc-macro-error-attr v1.0.4
Compiling proc-macro-error v1.0.4
Compiling parking_lot v0.12.1
Compiling anyhow v1.0.66
Compiling system-deps v6.0.3
error: linking with `cc` failed: exit status: 1
|
= note: "cc" "-Wl,--version-script=/tmp/rustc2quFlR/list" "/tmp/rustc2quFlR/symbols.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.0.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.1.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.10.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.11.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.12.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.13.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.14.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.15.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.2.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.3.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.4.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.5.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.6.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.7.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.8.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.proc_macro_error_attr.1df7e329-cgu.9.rcgu.o" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.2venke7z0qwetnto.rcgu.rmeta" "/target/release/deps/proc_macro_error_attr-4660f50c1ee088e6.2bt08o7nuxqqfzig.rcgu.o" "-Wl,--as-needed" "-L" "/target/release/deps" "-L" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib" "-Wl,-Bstatic" "/target/release/deps/libquote-a78ece94a3329b46.rlib" "/target/release/deps/libproc_macro2-3f4d87b0b9f5c5f5.rlib" "/target/release/deps/libunicode_ident-5c658c6e9005d30d.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libproc_macro-16667debc3013ce2.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libstd-19bcd24d54b4a32c.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libpanic_unwind-3814851f75d61802.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libobject-5007cbad366e7f54.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libmemchr-c7fe0e6a7e22626a.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libaddr2line-cdbb9a3725d71a8c.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libgimli-ad66f8ef705486ae.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/librustc_demangle-775cf8425902e89f.rlib" 
"/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libstd_detect-d09dc442a73afb02.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libcfg_if-508c53ad79acc8ea.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libhashbrown-ac7c150ef5940f2b.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libminiz_oxide-d2c5dbbafb505b02.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libadler-fa394ecd0326b64a.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/librustc_std_workspace_alloc-d36cf05357e3b9d8.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libunwind-32953872cd386e07.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libcfg_if-e844973f6e14767e.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/liblibc-2f732132bffc407e.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/liballoc-e0c40e7f51c7608f.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/librustc_std_workspace_core-3dc8593378fc4be9.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libcore-d31b035ed558dec3.rlib" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib/libcompiler_builtins-2b952fa9bf703518.rlib" "-Wl,-Bdynamic" "-lgcc_s" "-lc" "-Wl,--eh-frame-hdr" "-Wl,-znoexecstack" "-L" "/usr/local/rustup/toolchains/1.66.0-aarch64-unknown-linux-musl/lib/rustlib/aarch64-unknown-linux-musl/lib" "-o" "/target/release/deps/libproc_macro_error_attr-4660f50c1ee088e6.so" "-Wl,--gc-sections" "-shared" "-Wl,-zrelro,-znow" "-nodefaultlibs"
= note: /usr/lib/gcc/aarch64-alpine-linux-musl/12.2.1/../../../../aarch64-alpine-linux-musl/bin/ld: cannot find crti.o: No such file or directory
collect2: error: ld returned 1 exit status
error: could not compile `proc-macro-error-attr` due to previous error
warning: build failed, waiting for other jobs to finish...
error: failed to compile `project1_1 v0.1.0 (/)`, intermediate artifacts can be found at `/target`
The command '/bin/sh -c cargo install --path .' returned a non-zero code: 101
Unable to find image 'my-rust-app:latest' locally
2022/12/15 17:31:02 must use ASL logging (which requires CGO) if running as root
docker: Error response from daemon: pull access denied for my-rust-app, repository does not exist or may require 'docker login': denied: requested access to the resource is denied.
See 'docker run --help'.
And I also tried this -
FROM ubuntu:latest
WORKDIR /
COPY . .
RUN apt-get update
RUN apt-get install protobuf-compiler openssl pkg-config libssl-dev rustc cmake cargo -y
RUN rustc -V
CMD ["./run.sh"]
But I still ran into problems compiling dependencies that rely on C.
So, my question at this point is: "What is the easiest way of getting a Rust program to run with a standard list of dependencies?" The hello-world examples don't cut it for any Rust program that has a large crate stack.
Thanks.
Here is the Makefile: https://github.com/somersbmatthews/vault/blob/master/Makefile
Here is what happens when I run it:
somersbmatthews@pop-os:~/go/src/vault$ make static-dist dev-ui
--> Installing JavaScript assets
yarn install v1.19.1
[1/5] Validating package.json...
[2/5] Resolving packages...
success Already up-to-date.
Done in 0.75s.
> node-sass@4.14.1 install /home/somersbmatthews/go/src/vault/ui/node_modules/node-sass
> node scripts/install.js
node-sass build Binary found at /home/somersbmatthews/go/src/vault/ui/node_modules/node-sass/vendor/linux-x64-64/binding.node
> node-sass@4.14.1 postinstall /home/somersbmatthews/go/src/vault/ui/node_modules/node-sass
> node scripts/build.js
Binary found at /home/somersbmatthews/go/src/vault/ui/node_modules/node-sass/vendor/linux-x64-64/binding.node
Testing binary
Binary is fine
node-sass@4.14.1 /home/somersbmatthews/go/src/vault/ui/node_modules/node-sass
--> Building Ember application
yarn run v1.19.1
$ ember build -prod
INFORMATION (ember-cli-pretender)
ember-auto-import seems to be in your package dependencies.
As a result, you don't need pretender to be wrapped anymore.
You can install pretender and remove ember-cli-pretender.
⠋ BuildingWARNING: Option "nodeWorker" is deprecated since workerpool@5.0.0. Please use "workerType" instead.
WARNING: Option "nodeWorker" is deprecated since workerpool@5.0.0. Please use "workerType" instead.
WARNING: Option "nodeWorker" is deprecated since workerpool@5.0.0. Please use "workerType" instead.
Environment: production
⠏ BuildingThe 'this' keyword is equivalent to 'undefined' at the top level of an ES module, and has been rewritten
⠦ Building'@ember/string' is imported by ../../../../../../tmp/broccoli-607060b8WPADOlU6j8/cache-260-rollup/build/-private/system/normalize-model-name.js, but could not be resolved – treating it as an external dependency
'@ember/string' is imported by ../../../../../../tmp/broccoli-607060b8WPADOlU6j8/cache-260-rollup/build/-private/adapters/build-url-mixin.js, but could not be resolved – treating it as an external dependency
'@ember/string' is imported by ../../../../../../tmp/broccoli-607060b8WPADOlU6j8/cache-260-rollup/build/-private/system/debug/debug-adapter.js, but could not be resolved – treating it as an external dependency
⠏ Building[BABEL] Note: The code generator has deoptimised the styling of /home/somersbmatthews/go/src/vault/ui/node_modules/swagger-ui-dist/swagger-ui-bundle.js as it exceeds the max of 500KB.
Generating files needed by Storybook
Parsing /tmp/broccoli-607060b8WPADOlU6j8/out-630-broccoli_merge_trees/index.html
Generating preview-head.html
Generating files needed by Storybook
Generating .env
cleaning up...
Built project successfully. Stored in "../pkg/web_ui".
File sizes:
- ../pkg/web_ui/assets/chunk.3.e73ac42f48b4e5ab3d48.js: 1.08 MB (316.69 KB gzipped)
- ../pkg/web_ui/assets/node-asset-manifest.js: 1.02 KB (445 B gzipped)
- ../pkg/web_ui/assets/vault-895816690cab246cbd3b9423defc2f53.css: 482.96 KB (56.99 KB gzipped)
- ../pkg/web_ui/assets/vault-b8afdc29f93ad91f89268835698b0711.js: 1.2 MB (185.17 KB gzipped)
- ../pkg/web_ui/assets/vendor-8381b7eebdb7ea85cb88b80f3029e0e8.css: 14.21 KB (3.66 KB gzipped)
- ../pkg/web_ui/assets/vendor-ded9c2047ac30c216b8015683667178a.js: 1.82 MB (457.27 KB gzipped)
- ../pkg/web_ui/ember-fetch/fetch-fastboot-38cfd9007f94f81f5a2bc13690efc343.js: 1020 B (562 B gzipped)
- ../pkg/web_ui/engines-dist/kmip/assets/engine-ce86d837f49968e27331ecc744f8288d.js: 68.55 KB (9.29 KB gzipped)
- ../pkg/web_ui/engines-dist/kmip/assets/engine-vendor-d41d8cd98f00b204e9800998ecf8427e.css: 0 B
- ../pkg/web_ui/engines-dist/kmip/assets/engine-vendor-d41d8cd98f00b204e9800998ecf8427e.js: 0 B
- ../pkg/web_ui/engines-dist/kmip/config/environment-0123205ae026fc9ed3e41f1d552270f8.js: 86 B (100 B gzipped)
- ../pkg/web_ui/engines-dist/open-api-explorer/assets/engine-83cdd1e87b4c1568b63b394b62e6e0c5.js: 27.16 KB (5.14 KB gzipped)
- ../pkg/web_ui/engines-dist/open-api-explorer/assets/engine-9dcfdf942f31c3caa1d6dfd57c3cc072.css: 3.38 KB (829 B gzipped)
- ../pkg/web_ui/engines-dist/open-api-explorer/assets/engine-vendor-6faadde6d1de73cd00d4f818f4f60c75.css: 149.46 KB (22.77 KB gzipped)
- ../pkg/web_ui/engines-dist/open-api-explorer/assets/engine-vendor-d41d8cd98f00b204e9800998ecf8427e.js: 0 B
- ../pkg/web_ui/engines-dist/open-api-explorer/config/environment-6da0fcce17b2031e2559754701e92d69.js: 194 B (170 B gzipped)
- ../pkg/web_ui/engines-dist/replication/assets/engine-52dc634acbe2629436188771450e81ba.js: 97.81 KB (15.78 KB gzipped)
- ../pkg/web_ui/engines-dist/replication/assets/engine-vendor-d41d8cd98f00b204e9800998ecf8427e.css: 0 B
- ../pkg/web_ui/engines-dist/replication/assets/engine-vendor-d41d8cd98f00b204e9800998ecf8427e.js: 0 B
- ../pkg/web_ui/engines-dist/replication/config/environment-fcc3a0f22bdfd265a50708864776440a.js: 100 B (104 B gzipped)
- ../pkg/web_ui/sw-registration-65dd6e15d4d40ce435383a9edaccfc03.js: 1.14 KB (616 B gzipped)
- ../pkg/web_ui/sw.js: 1.26 KB (675 B gzipped)
Done in 70.33s.
--> Generating static assets
make[1]: Entering directory '/home/somersbmatthews/go/src/vault'
goimports -w $(find . -name '*.go' | grep -v pb.go | grep -v vendor)
make[1]: Leaving directory '/home/somersbmatthews/go/src/vault'
==> Checking compiled UI assets...
==> Checking that build is using go version >= 1.14.7...
==> Using go version 1.15.2...
==> Removing old directory...
==> Building...
flag provided but not defined: -gcflags
Usage: gox [options] [packages]
Gox cross-compiles Go applications in parallel.
If no specific operating systems or architectures are specified, Gox
will build for all pairs supported by your version of Go.
Options:
-arch="" Space-separated list of architectures to build for
-build-toolchain Build cross-compilation toolchain
-ldflags="" Additional '-ldflags' value to pass to go build
-os="" Space-separated list of operating systems to build for
-osarch="" Space-separated list of os/arch pairs to build for
-output="foo" Output path template. See below for more info
-parallel=-1 Amount of parallelism, defaults to number of CPUs
-verbose Verbose mode
Output path template:
The output path for the compiled binaries is specified with the
"-output" flag. The value is a string that is a Go text template.
The default value is "{{.Dir}}_{{.OS}}_{{.Arch}}". The variables and
their values should be self-explanatory.
Platforms (OS/Arch):
The operating systems and architectures to cross-compile for may be
specified with the "-arch" and "-os" flags. These are space separated lists
of valid GOOS/GOARCH values to build for, respectively. You may prefix an
OS or Arch with "!" to negate and not build for that platform. If the list
is made up of only negations, then the negations will come from the default
list.
Additionally, the "-osarch" flag may be used to specify complete os/arch
pairs that should be built or ignored. The syntax for this is what you would
expect: "darwin/amd64" would be a valid osarch value. Multiple can be space
separated. An os/arch pair can begin with "!" to not build for that platform.
The "-osarch" flag has the highest precedent when determing whether to
build for a platform. If it is included in the "-osarch" list, it will be
built even if the specific os and arch is negated in "-os" and "-arch",
respectively.
make: *** [Makefile:39: dev-ui] Error 2
Here is the full repo: https://github.com/somersbmatthews/vault
Lines 38 and 39 in the Makefile are:
dev-ui: assetcheck prep
@CGO_ENABLED=$(CGO_ENABLED) BUILD_TAGS='$(BUILD_TAGS) ui' VAULT_DEV_BUILD=1 sh -c "'$(CURDIR)/scripts/build.sh'"
How do I get more information on this error? "Error 2" appears twice in the code in two files as errors for a MongoDB dependency:
https://github.com/somersbmatthews/vault/blob/master/vendor/go.mongodb.org/mongo-driver/x/mongo/driver/auth/internal/gssapi/sspi_wrapper.h
https://github.com/somersbmatthews/vault/blob/master/vendor/go.mongodb.org/mongo-driver/x/mongo/driver/auth/internal/gssapi/gss_wrapper.h
Thanks for any help :)
Problem solved as per https://groups.google.com/g/vault-tool/c/xyV7-FMHrEE?pli=1
Use the command make bootstrap to update gox and the other Go tools.
NOTE: previous versions of gox will create an empty file called bindata.go. Just delete it.
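A minimal sketch of the recovery steps, assuming the repository checkout from the question (paths are illustrative):
cd ~/go/src/vault
make bootstrap            # reinstalls gox and the other Go build tools
rm -f bindata.go          # delete the empty file older gox versions leave behind (location may vary)
make static-dist dev-ui   # re-run the original build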
Here is what happened: I built a Docker image with stack, but the process got interrupted five times. The message said that each interruption happened while building the package Cabal-2.4.1.0. The process exited with code: ExitFailure (-9) (THIS MAY INDICATE OUT OF MEMORY).
I want to automate the process in a Docker script, so it is essential that stack makes it to the end. That is my problem. Here is what happened when building (I left out some lines to keep your scrolling within reasonable limits, so expect to see <... skipping <n> lines ...> every now and then; continued at the bottom).
BA92-C02VP224HTDF:Ampersand stefjoosten$ docker build -t amp:latest .
Sending build context to Docker daemon 70.28MB
Step 1/8 : FROM ubuntu:latest
---> 7698f282e524
Step 2/8 : RUN apt-get update && apt-get --yes install curl && apt-get --yes install git-core
---> Using cache
---> 81322e02eb2a
Step 3/8 : RUN curl -sSL https://get.haskellstack.org/ | sh # install Haskell and stack
---> Using cache
---> 0ff9ca0665b9
Step 4/8 : WORKDIR /Ampersand/ # build from the Ampersand source code directory
---> Using cache
---> a66b539a6868
Step 5/8 : RUN git clone https://github.com/AmpersandTarski/Ampersand/ .
---> Using cache
---> 1806c9a40c82
Step 6/8 : RUN git checkout feature/rio-phase2 # get Ampersand sources in the correct version
---> Using cache
---> 456074437186
Step 7/8 : RUN stack setup # set up Haskell stack (version taken from stack.yaml)
---> Using cache
---> fcadefd6812c
Step 8/8 : RUN stack install # installs Ampersand executables in /root/.local/bin
---> Running in 0a8d298a757c
Updating package index Hackage (mirrored at https://s3.amazonaws.com/hackage.fpcomplete.com/) ...
Selected mirror https://s3.amazonaws.com/hackage.fpcomplete.com/
Downloading root
Selected mirror https://s3.amazonaws.com/hackage.fpcomplete.com/
Downloading timestamp
Downloading snapshot
Downloading mirrors
Cannot update index (no local copy)
Downloading index
Updated package index downloaded
Update complete
Populating index cache ...
Populated index cache.
[1 of 2] Compiling Main ( /root/.stack/setup-exe-src/setup-mPHDZzAJ.hs, /root/.stack/setup-exe-src/setup-mPHDZzAJ.o )
[2 of 2] Compiling StackSetupShim ( /root/.stack/setup-exe-src/setup-shim-mPHDZzAJ.hs, /root/.stack/setup-exe-src/setup-shim-mPHDZzAJ.o )
Linking /root/.stack/setup-exe-cache/x86_64-linux/tmp-Cabal-simple_mPHDZzAJ_2.4.0.1_ghc-8.6.4 ...
Cabal-2.4.1.0: download
StateVar-1.1.1.1: download
HsYAML-0.1.1.3: download
StateVar-1.1.1.1: configure
SHA-1.6.4.4: download
StateVar-1.1.1.1: build
HsYAML-0.1.1.3: configure
HsYAML-0.1.1.3: build
Cabal-2.4.1.0: configure
StateVar-1.1.1.1: copy/register
<... skipping 55 lines ...>
cereal-0.5.8.0: download
cereal-0.5.8.0: configure
cereal-0.5.8.0: build
basement-0.0.10: copy/register
cereal-0.5.8.0: copy/register
blaze-html-0.9.1.1: copy/register
-- While building package Cabal-2.4.1.0 using:
/root/.stack/setup-exe-cache/x86_64-linux/Cabal-simple_mPHDZzAJ_2.4.0.1_ghc-8.6.4 --builddir=.stack-work/dist/x86_64-linux/Cabal-2.4.0.1 build --ghc-options " -ddump-hi -ddump-to-file"
Process exited with code: ExitFailure (-9) (THIS MAY INDICATE OUT OF MEMORY)
Logs have been written to: /Ampersand/ # build from the Ampersand source code directory/.stack-work/logs/Cabal-2.4.1.0.log
Configuring Cabal-2.4.1.0...
Preprocessing library for Cabal-2.4.1.0..
Building library for Cabal-2.4.1.0..
[ 1 of 220] Compiling Distribution.Compat.Binary ( Distribution/Compat/Binary.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Binary.o )
[ 2 of 220] Compiling Distribution.Compat.Directory ( Distribution/Compat/Directory.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Directory.o )
[ 3 of 220] Compiling Distribution.Compat.Exception ( Distribution/Compat/Exception.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Exception.o )
[ 4 of 220] Compiling Distribution.Compat.Internal.TempFile ( Distribution/Compat/Internal/TempFile.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Internal/TempFile.o )
[ 5 of 220] Compiling Distribution.Compat.MonadFail ( Distribution/Compat/MonadFail.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/MonadFail.o )
[ 6 of 220] Compiling Distribution.Compat.Newtype ( Distribution/Compat/Newtype.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Newtype.o )
<... skipping 56 lines ...>
[ 63 of 220] Compiling Distribution.System ( Distribution/System.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/System.o )
[ 64 of 220] Compiling Distribution.SPDX.LicenseReference ( Distribution/SPDX/LicenseReference.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/SPDX/LicenseReference.o )
[ 65 of 220] Compiling Distribution.SPDX.LicenseId ( Distribution/SPDX/LicenseId.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/SPDX/LicenseId.o )
The command '/bin/sh -c stack install # installs Ampersand executables in /root/.local/bin' returned a non-zero code: 1
BA92-C02VP224HTDF:Ampersand stefjoosten$ docker start `docker ps -q -l` # restart it in the background
0a8d298a757c
BA92-C02VP224HTDF:Ampersand stefjoosten$ docker attach `docker ps -q -l`
cmark-gfm-0.1.8: build
clock-0.7.2: copy/register
colour-2.3.4: download
colour-2.3.4: configure
colour-2.3.4: build
<... skipping 200 lines ...>
primitive-0.6.4.0: copy/register
reflection-2.1.4: copy/register
hxt-9.3.1.16: copy/register
-- While building package Cabal-2.4.1.0 using:
/root/.stack/setup-exe-cache/x86_64-linux/Cabal-simple_mPHDZzAJ_2.4.0.1_ghc-8.6.4 --builddir=.stack-work/dist/x86_64-linux/Cabal-2.4.0.1 build --ghc-options " -ddump-hi -ddump-to-file"
Process exited with code: ExitFailure (-9) (THIS MAY INDICATE OUT OF MEMORY)
Logs have been written to: /Ampersand/ # build from the Ampersand source code directory/.stack-work/logs/Cabal-2.4.1.0.log
Configuring Cabal-2.4.1.0...
Preprocessing library for Cabal-2.4.1.0..
Building library for Cabal-2.4.1.0..
[ 1 of 220] Compiling Distribution.Compat.Binary ( Distribution/Compat/Binary.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Binary.o )
[ 2 of 220] Compiling Distribution.Compat.Directory ( Distribution/Compat/Directory.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Directory.o )
[ 3 of 220] Compiling Distribution.Compat.Exception ( Distribution/Compat/Exception.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Exception.o )
[ 4 of 220] Compiling Distribution.Compat.Internal.TempFile ( Distribution/Compat/Internal/TempFile.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Internal/TempFile.o )
[ 5 of 220] Compiling Distribution.Compat.MonadFail ( Distribution/Compat/MonadFail.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/MonadFail.o )
[ 6 of 220] Compiling Distribution.Compat.Newtype ( Distribution/Compat/Newtype.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Compat/Newtype.o )
<... skipping 104 lines ...>
[111 of 220] Compiling Distribution.Types.AbiDependency ( Distribution/Types/AbiDependency.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Types/AbiDependency.o )
[112 of 220] Compiling Distribution.Simple.InstallDirs ( Distribution/Simple/InstallDirs.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Simple/InstallDirs.o )
[113 of 220] Compiling Distribution.Types.LegacyExeDependency ( Distribution/Types/LegacyExeDependency.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Types/LegacyExeDependency.o )
[114 of 220] Compiling Distribution.Types.BuildInfo ( Distribution/Types/BuildInfo.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Distribution/Types/BuildInfo.o )
BA92-C02VP224HTDF:Ampersand stefjoosten$ docker start `docker ps -q -l` # restart it in the background
0a8d298a757c
BA92-C02VP224HTDF:Ampersand stefjoosten$ docker attach `docker ps -q -l`
regex-base-0.93.2: copy/register
regex-pcre-builtin-0.94.4.8.8.35: download
<... skipping 678 lines with three more interruptions ...>
pandoc-2.5: copy/register
pandoc-crossref-0.3.4.0: download
pandoc-crossref-0.3.4.0: configure
pandoc-crossref-0.3.4.0: build
pandoc-crossref-0.3.4.0: copy/register
Building all executables for `ampersand' once. After a successful build of all of them, only specified executables will be rebuilt.
ampersand-3.17.0: configure (lib + exe)
[1 of 2] Compiling Main ( /Ampersand/ # build from the Ampersand source code directory/Setup.hs, /Ampersand/ # build from the Ampersand source code directory/.stack-work/dist/x86_64-linux/Cabal-2.4.0.1/setup/Main.o )
[2 of 2] Compiling StackSetupShim ( /root/.stack/setup-exe-src/setup-shim-mPHDZzAJ.hs, /Ampersand/ # build from the Ampersand source code directory/.stack-work/dist/x86_64-linux/Cabal-2.4.0.1/setup/StackSetupShim.o )
Linking /Ampersand/ # build from the Ampersand source code directory/.stack-work/dist/x86_64-linux/Cabal-2.4.0.1/setup/setup ...
Configuring ampersand-3.17.0...
ampersand-3.17.0: build (lib + exe)
Warning: Cannot read previously generated src/Ampersand/Prototype/StaticFiles_Generated.hs:
src/Ampersand/Prototype/StaticFiles_Generated.hs: openFile: does not exist (No such file or directory)
This warning should disappear the next time you build Ampersand. If the error persists, please report this as a bug.
Static files have changed, updating src/Ampersand/Prototype/StaticFiles_Generated.hs
Preprocessing library for ampersand-3.17.0..
Building library for ampersand-3.17.0..
[ 1 of 113] Compiling Ampersand.Basics.Prelude ( src/Ampersand/Basics/Prelude.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Basics/Prelude.o )
[ 2 of 113] Compiling Ampersand.Basics.Languages ( src/Ampersand/Basics/Languages.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Basics/Languages.o )
[ 3 of 113] Compiling Ampersand.Basics.Exit ( src/Ampersand/Basics/Exit.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Basics/Exit.o )
[ 4 of 113] Compiling Ampersand.Basics.BuildInfo_Generated ( src/Ampersand/Basics/BuildInfo_Generated.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Basics/BuildInfo_Generated.o )
[ 5 of 113] Compiling Ampersand.Basics.Auxiliaries ( src/Ampersand/Basics/Auxiliaries.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Basics/Auxiliaries.o )
[ 6 of 113] Compiling Ampersand.Basics.String ( src/Ampersand/Basics/String.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Basics/String.o )
<... skipping 103 lines ...>
[110 of 113] Compiling Ampersand.Test ( src/Ampersand/Test.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand/Test.o )
[111 of 113] Compiling Ampersand ( src/Ampersand.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Ampersand.o )
[112 of 113] Compiling MainApps ( src/MainApps.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/MainApps.o )
[113 of 113] Compiling Paths_ampersand ( .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/autogen/Paths_ampersand.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/Paths_ampersand.o )
Preprocessing executable 'ampersand' for ampersand-3.17.0..
Building executable 'ampersand' for ampersand-3.17.0..
[1 of 2] Compiling Main ( app/Ampersand/Main.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/ampersand/ampersand-tmp/Main.o )
[2 of 2] Compiling Paths_ampersand ( .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/ampersand/autogen/Paths_ampersand.hs, .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/ampersand/ampersand-tmp/Paths_ampersand.o )
Linking .stack-work/dist/x86_64-linux/Cabal-2.4.0.1/build/ampersand/ampersand ...
ampersand-3.17.0: copy/register
Installing library in /Ampersand/ # build from the Ampersand source code directory/.stack-work/install/x86_64-linux/lts-13.16/8.6.4/lib/x86_64-linux-ghc-8.6.4/ampersand-3.17.0-K72VvTMgyU7EFfE6avLPOe
Installing executable ampersand in /Ampersand/ # build from the Ampersand source code directory/.stack-work/install/x86_64-linux/lts-13.16/8.6.4/bin
Registering library for ampersand-3.17.0..
Completed 25 action(s).
Copying from /Ampersand/ # build from the Ampersand source code directory/.stack-work/install/x86_64-linux/lts-13.16/8.6.4/bin/ampersand to /root/.local/bin/ampersand
Copied executables to /root/.local/bin:
- ampersand
It is strange to see the build process interrupted for the (possible) reason of memory exhaustion, yet happily continue after the docker build is restarted. I needed 5 restarts to get to the end.
I tried to increase docker's memory, experimenting with the commands:
docker build -m 4g -t amp:latest .
docker build -m 12g -t amp:latest .
but that makes no noteworthy difference.
I'd be grateful for any ideas...
Your best bet is probably to pass --jobs 1 to stack. This turns off concurrent builds, which reduces the memory requirements. GHC is generally a memory hog, and some code in particular takes a lot of memory to compile. What's probably happening is that two modules that both need a lot of memory end up being built at the same time, and that is when you hit the OOM. Each time you rerun the build, a few more packages get built and the build order can change, so eventually you get lucky, the memory hogs don't build concurrently, and the build is able to finish.
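In the Dockerfile from the question, that would mean changing the final step to something like the line below (a sketch; -j 1 is the short form of the flag):
RUN stack install --jobs 1   # build one package at a time to cap peak memory use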
I have a library used by a number of Arduino projects. I use PlatformIO as my build system, so I've created a library.json file in the root of the library to identify dependent libraries that should be loaded when I include this library in a project. All good.
Sometimes the dependent libraries get changed - PlatformIO is particularly sensitive to renaming them in the Arduino library.properties file. It is a pain when I discover that my library is broken only when I try to build a project that uses it.
I'd like to configure Travis to run periodically (thanks, Travis cron jobs!) and confirm that I can load all dependent libraries.
pio ci does not really apply to libraries. pio test requires a PlatformIO subscription (highly recommended, but not always an option).
Put the following in .travis.yml:
```
# PlatformIO dependency test
language: python
python: 2.7
install:
  - pip install -U platformio
script:
  - mkdir test_platformio_deps
  - cd test_platformio_deps
  - echo "[env:adafruit_feather_m0]" > platformio.ini
  - echo "platform = atmelsam" >> platformio.ini
  - echo "board = adafruit_feather_m0" >> platformio.ini
  - echo "framework = arduino" >> platformio.ini
  - if [ "${TRAVIS_PULL_REQUEST_SLUG}" = "" ]; then echo "lib_deps = SPI, https://github.com/${TRAVIS_REPO_SLUG}" ; else echo "lib_deps = SPI, https://github.com/${TRAVIS_PULL_REQUEST_SLUG}#${TRAVIS_PULL_REQUEST_BRANCH}" ; fi >> platformio.ini
  - cat platformio.ini
  - mkdir src
  - echo "int main() {}" > src/main.cpp
  - platformio run
cache:
  directories:
    - "~/.platformio"
```
It will create a simple project that depends on your library and then attempt to build it. If all dependencies load, it will succeed.
The tricky line with TRAVIS_PULL_REQUEST_SLUG handles running the test within a PR.
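For reference, the platformio.ini generated by the script above ends up looking roughly like this (the repository slug is illustrative; it comes from TRAVIS_REPO_SLUG or the pull-request variables):
```
[env:adafruit_feather_m0]
platform = atmelsam
board = adafruit_feather_m0
framework = arduino
lib_deps = SPI, https://github.com/your-user/your-library
```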
I am playing with travis-ci for the first time and have run into an error that's confusing me.
Below is my .travis.yml, which validates when linted.
language: node_js
node_js:
- 0.8
after_script:
- echo "Hello World"
The following is my travis build output:
$ git clone --depth=50 --branch=master git://github.com/iancrowther/travis-experiment.git iancrowther/travis-experiment
Cloning into 'iancrowther/travis-experiment'...
remote: Counting objects: 27, done.
remote: Compressing objects: 100% (21/21), done.
remote: Total 27 (delta 2), reused 20 (delta 0)
Receiving objects: 100% (27/27), done.
Resolving deltas: 100% (2/2), done.
$ cd iancrowther/travis-experiment
git.2
$ git checkout -qf xxx
$ nvm use 0.8
Now using node v0.8.22
$ node --version
v0.8.22
$ npm --version
1.2.14
$ make test
make: *** No rule to make target `test'. Stop.
The command "make test" exited with 2.
after_script
$ echo "Hello World"
Hello World
Done. Your build exited with 1.
How can I prevent the Makefile from being executed?
Any help would be great; I can't seem to find the errors explained in the docs.
Ian
PS: does anyone have any links to a guide about styling code?
UPDATE
When @User re-formatted the question, they added the following comment: "used {} button to make the code visible".
Change after_script to script.
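Applied to the .travis.yml from the question, that gives the following, so the echo runs as the test command and the default (make test in this build) is no longer invoked:
language: node_js
node_js:
- 0.8
script:
- echo "Hello World"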
Have a look at this .travis.yml:
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq python3.2 python2.7
python:
- "2.7"
script: ./run_build.sh
The last line is the important one.
script is the command that runs the test.
The before_* steps are what enables the test to run, and the after_* steps clean up.
It is the same pattern you may know from unit tests: setup, test, teardown.
Since Travis does not find a script: entry, I guess it falls back to a default test command.