How to combine multiple files in a zipFile in Jenkins? - jenkins

I wish to combine 3 files into a single zip file in my Jenkins build. I have taken a look at these two sources (1, 2), but I cannot understand how they apply to my example, which is the following (names made generic to hide the real ones):
zip zipFile: "myZip.zip", archive: true, dir: './someLocation1', glob: 'file1.py*'
zip zipFile: "myZip.zip", archive: true, dir: './someLocation2', glob: 'file2.py*'
zip zipFile: "myZip.zip", archive: true, dir: './someLocation3', glob: ''
With the top two lines I want a specific file from each location, and with the bottom line I want all files from that location, indicated by the empty string (reference) glob: ''. Of course this is not correct and produces an error when it gets to the second line, because myZip.zip already exists.
I want these files to be in the same zip because I want to then send the zip to storage. How can I write this to achieve what I want? Is this the right way to solve this problem?

You can do something like this: first copy all the files you need into a temp directory, then zip that directory.
sh '''
mkdir tempDir
cp ./someLocation1/file1.py tempDir
cp ./someLocation2/file2.py tempDir
cp -r ./someLocation3/* tempDir/
'''
zip zipFile: "myZip.zip", archive: true, dir: './tempDir'
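For context, a minimal pipeline sketch of this approach (the stage name is made up, and someLocation1..3 are the asker's placeholder paths); the `rm -rf tempDir` guard keeps the step repeatable across builds:

```groovy
// Sketch only: stage the files in tempDir, zip once, then clean up.
// Requires the Pipeline Utility Steps plugin for the zip step.
stage('Package') {
    steps {
        sh '''
            rm -rf tempDir && mkdir tempDir
            cp ./someLocation1/file1.py tempDir
            cp ./someLocation2/file2.py tempDir
            cp -r ./someLocation3/* tempDir/
        '''
        zip zipFile: 'myZip.zip', archive: true, dir: './tempDir'
        sh 'rm -rf tempDir'
    }
}
```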

Related

Unix. Parse file with full paths to SHA256 checksums files. Run command in each path/file

I have a file file.txt with filenames ending with *.sha256, including the full paths of each file. This is a toy example:
file.txt:
/path/a/9b/x3.sha256
/path/7c/7j/y2.vcf.gz.sha256
/path/e/g/7z.sha256
Each line has a different path/file. The *.sha256 files have checksums.
I want to run the command "sha256sum -c" on each of these *.sha256 files and write the output to an output_file.txt. However, this command only accepts the name of the .sha256 file, not the name including its full path. I have tried the following:
while read in; do
sha256sum -c "$in" >> output_file.txt
done < file.txt
but I get:
"sha256sum: WARNING: 1 listed file could not be read"
which is due to the path included in the command.
Any suggestion is welcome
#!/bin/bash
while read -r in
do
    thedir=$(dirname "$in")
    thefile=$(basename "$in")
    # cd in a subshell: sha256sum -c must run from the directory that holds
    # the files named inside the .sha256 file, and the subshell keeps the
    # cd from leaking, so output_file.txt stays in the starting directory
    (cd "$thedir" && sha256sum -c "$thefile") >> output_file.txt
done < file.txt
Modify your code to extract the directory and file parts of your in variable.
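As a self-contained illustration of the same loop (everything under /tmp/sha_demo is a made-up fixture for the demo):

```shell
# Build a tiny fixture: one checksum file, referenced by absolute path.
mkdir -p /tmp/sha_demo/a
printf 'hello\n' > /tmp/sha_demo/a/data.txt
( cd /tmp/sha_demo/a && sha256sum data.txt > data.txt.sha256 )
echo '/tmp/sha_demo/a/data.txt.sha256' > /tmp/sha_demo/file.txt

# Same idea as above: cd into each directory in a subshell, so that
# sha256sum -c sees the bare filename recorded in the .sha256 file.
while read -r in; do
  ( cd "$(dirname "$in")" && sha256sum -c "$(basename "$in")" )
done < /tmp/sha_demo/file.txt > /tmp/sha_demo/output_file.txt
```

After running this, output_file.txt contains `data.txt: OK`.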

Jenkins - how to set Variable value into the zip file name?

I have a variable, set as a global environment variable, which holds the build timestamp.
echo "Current build version: ${BUILDVERSION}"
Current build version: 20211117-114343
Now I want to pass this value to the zip step to set the zip file name, and then check the contents of the zip file.
zip zipFile: 'test_${BUILDVERSION}.zip'
sh 'zipinfo -1 test_${BUILDVERSION}.zip'
The zip is not created properly: its name is the literal test_${BUILDVERSION}.zip, while zipinfo picks up the value properly: zipinfo -1 test_20211117-114343.zip
Can you please tell me what I am doing wrong? Thanks
Solution
Use double quotes; read about string interpolation in Groovy.
zip zipFile: "test_${BUILDVERSION}.zip"
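To illustrate the difference (a sketch, assuming BUILDVERSION is the global environment variable from the question):

```groovy
zip zipFile: 'test_${BUILDVERSION}.zip'   // single quotes: Groovy keeps the
                                          // literal text ${BUILDVERSION}
zip zipFile: "test_${BUILDVERSION}.zip"   // double quotes: Groovy interpolates
sh 'zipinfo -1 test_${BUILDVERSION}.zip'  // works even with single quotes,
                                          // because here the *shell* expands
                                          // the environment variable
```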

Dockerignore: allow to add only specific extension like *.json from any subfolder

I have a .dockerignore file and I'm trying to allow Docker to upload only *.json files, from any of the subfolders.
For example, for the following file structure:
public/readme.md
public/subfolder/a.json
public/subfolder/b.json
public/other/c.json
public/other/file.txt
I'm expecting to see only json files in the image:
public/subfolder/a.json
public/subfolder/b.json
public/other/c.json
Of course they must be located in the same directories as in the original source.
I tried several ways but didn't succeed.
Update: I don't know how many subfolders will be created in the public/ directory, nor how deep the directory structure will be.
I think you can achieve what you want with a .dockerignore like this:
public/*
!public/subfolder
public/subfolder/*
!public/other
public/other/*
!**/*.json
The tricky thing is that the first line of this file is public/*, not public nor * (otherwise the subsequent !... lines won't work).
Note also that you may want to automate the generation of such a .dockerignore, to cope with possible changes of the tree structure.
For example:
gen-dockerignore.sh
#!/usr/bin/env bash
{ echo '*' ; # header of the .dockerignore - to be changed if need be
find public -type d -exec echo -en "!{}\n{}/*\n" \; ;
echo '!**/*.json' ; } > .dockerignore
$ ./gen-dockerignore.sh would output the following file:
.dockerignore
*
!public
public/*
!public/other
public/other/*
!public/subfolder
public/subfolder/*
!**/*.json
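To sanity-check the generator, one can rebuild the question's tree in a scratch directory (/tmp/di_demo is a made-up path) and run the same commands as gen-dockerignore.sh:

```shell
# Recreate the file structure from the question.
mkdir -p /tmp/di_demo/public/subfolder /tmp/di_demo/public/other
cd /tmp/di_demo
touch public/readme.md public/subfolder/a.json public/subfolder/b.json \
      public/other/c.json public/other/file.txt

# Same generator as gen-dockerignore.sh above.
{ echo '*' ;
  find public -type d -exec echo -en "!{}\n{}/*\n" \; ;
  echo '!**/*.json' ; } > .dockerignore
cat .dockerignore
```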

extract a file from xz file

I have a huge file file.tar.xz containing many smaller text files with a similar structure. I want to quickly examine one file out of the archive to get a glimpse of the files' content structure. I don't have information about the names of the files within the archive. Is there any way to extract a single file given the above scenario?
Thank you.
EDIT: I don't want to tar -xvf file.tar.xz.
Based on the discussion in the comments, I tried the following, which worked for me. It might not be the optimal solution, and the regex might need some improvement, but you'll get the idea.
I first created a demo archive:
cd /tmp
mkdir demo
for i in {1..100}; do echo $i > "demo/$i.txt"; done
cd demo && tar cfJ ../demo.tar.xz * && cd ..
demo.tar.xz now contains 100 txt files.
The following lists the contents of the archive, selects the first file and stores the path within the archive into the variable firstfile:
firstfile=$(tar -tvf demo.tar.xz | grep -Po -m1 "(?<=:[0-9]{2} ).*$")
echo $firstfile will output 1.txt.
You can now extract this single file from the archive:
tar xf demo.tar.xz $firstfile
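The regex can also be avoided altogether: plain `tar -tf` (without `-v`) already lists bare member names. A small sketch of that variant, with made-up paths under /tmp:

```shell
# Build a demo archive the same way as above.
cd /tmp
mkdir -p demo2 demo2_out
for i in 1 2 3; do echo "$i" > "demo2/$i.txt"; done
( cd demo2 && tar -cJf ../demo2.tar.xz * )

# tar -tf prints one member name per line; take the first one.
firstfile=$(tar -tf demo2.tar.xz | head -n 1)
tar -xf demo2.tar.xz -C demo2_out "$firstfile"
```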

Extract text files in each subfolder and join them with the subfolder name

I have compressed text files in the following folder structure:
~/A/1/1.faa.tgz #each tgz file has dozens of faa text files
~/A/2/2.faa.tgz
~/A/3/3.faa.tgz
I would like to extract the faa files (text) from each tgz file and then join them, using the subfolder name (1, 2 and 3) to create a single text file for each subfolder.
My attempt was the following, but the files were extracted in the folder where I ran the script:
#!/bin/bash
for FILE in ~/A/*/*.faa.tgz; do
tar -vzxf "$FILE"
done
After extracting the faa files I would use "cat" to join them (for example, using cat *.faa > .txt).
Thanks in advance.
To extract:
#!/bin/bash
for FILE in ~/A/*/*.faa.tgz; do
    echo "$FILE"
    # extract next to the archive, so each subfolder keeps its own faa files
    tar -vzxf "$FILE" -C "$(dirname "$FILE")"
done
To join:
#!/bin/bash
for dir in ~/A/*/; do
    (
        cd "$dir"
        file=( *.faa )
        cat "${file[@]}" > "${PWD##*/}.txt"
    )
done
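An end-to-end sketch of both steps on a throwaway tree (/tmp/faa_demo and the seq contents are invented for the demo; the join uses a plain glob instead of an array so it is POSIX-sh safe):

```shell
# Fixture: two subfolders, each with a tgz holding two faa files.
mkdir -p /tmp/faa_demo/A/1 /tmp/faa_demo/A/2
for d in 1 2; do
  printf '>seq%sa\nACGT\n' "$d" > "/tmp/faa_demo/A/$d/x.faa"
  printf '>seq%sb\nTTTT\n' "$d" > "/tmp/faa_demo/A/$d/y.faa"
  tar -czf "/tmp/faa_demo/A/$d/$d.faa.tgz" -C "/tmp/faa_demo/A/$d" x.faa y.faa
  rm "/tmp/faa_demo/A/$d/x.faa" "/tmp/faa_demo/A/$d/y.faa"
done

# Extract each archive next to itself, then join per subfolder.
for FILE in /tmp/faa_demo/A/*/*.faa.tgz; do
  tar -xzf "$FILE" -C "$(dirname "$FILE")"
done
for dir in /tmp/faa_demo/A/*/; do
  ( cd "$dir" && cat *.faa > "${PWD##*/}.txt" )
done
```

Afterwards, /tmp/faa_demo/A/1/1.txt holds the joined contents of that subfolder's faa files.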
