I have a problem with some automated builds in Jenkins.
A batch script triggers a setup.py file that should run with a different Python version than the one commonly used in the project, but it fails with this message: Requested Python Version(3.9) not installed.
I tried re-cloning my branch and running that batch file, and it works locally.
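The failing step boils down to something like this (a sketch; I'm assuming the Windows py launcher is what resolves the version):

    rem Jenkins "Execute Windows batch command" step (sketch; names illustrative)
    rem py -3.9 asks the Windows launcher for Python 3.9, which must be
    rem registered on the Jenkins agent itself, not just on my workstation.
    py -3.9 setup.py build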
I have a C++ project with a large set of dependencies and a very long compilation time. My idea is to distribute only the project executable along with its dependencies using Docker. However, I am unable to work out how to do this.
A common way I've seen this achieved is signed release packages with signature verification, with packages deployed on something like GitHub or as PPA packages for Ubuntu.
I suppose my question is part Docker, part build related.
How do I build and package this?
I am running Arch Linux with a newer kernel, and will be building the Docker image on Ubuntu LTS. Does the binary have any issues depending on which kernel it was built on?
Can I build locally on Arch and package that for release, or do I need to do some CI delivery via GitHub Actions?
Instead of using releases via GitHub/PPA, can I just copy a binary built locally into the image as a Dockerfile step?
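To make that last option concrete, this is the kind of Dockerfile I have in mind (a sketch only; the image tag, binary name and library list are illustrative):

    FROM ubuntu:22.04
    # Runtime libraries the binary links against (discoverable with ldd on the binary).
    RUN apt-get update && apt-get install -y --no-install-recommends \
            libboost-system1.74.0 \
        && rm -rf /var/lib/apt/lists/*
    # Copy the executable produced by the local (or CI) build into the image.
    COPY ./build/myapp /usr/local/bin/myapp
    ENTRYPOINT ["/usr/local/bin/myapp"]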
Thanks!
Is it possible to run bentoml build without importing the services.py file during the process?
I'm trying to put the bento build and containerize steps on our CI/CD server. Our model depends on some installed OS packages and some Python packages. I thought I could run bentoml build to package the model code and binaries that are present, and leave the dependency specification to the containerize step.
To my surprise, the bentoml build process tried to import the service file during packaging, and the build failed since I didn't have the dependencies installed on my CI/CD machine.
Can I prevent this import while building/packaging the model? Maybe I should skip bentoml containerize, create my bento container by hand, and just execute bentoml serve inside.
I feel that having to install the dependencies by hand duplicates the effort of specifying them in the bentofile.yaml and hurts the reproducibility of my environment.
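For context, my bentofile.yaml looks roughly like this (the service path and package names are placeholders):

    service: "service.py:svc"
    include:
      - "*.py"
    python:
      packages:
        - scikit-learn
        - pandas
    docker:
      system_packages:
        - libgomp1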
This is not currently possible. The community is working on an environment management feature, such that an environment with the necessary dependencies will be automatically created during build.
I am currently testing Xcode Cloud as a member of Apple's private beta program and encountered an issue when trying to Archive / Build (any action) on the cloud for my project.
The project is a fairly basic SwiftUI app with CocoaPods dependencies. I have followed the steps to integrate CocoaPods into my project as described by Apple by simply committing my Pods directory to GitHub. However, I am getting a build error on the cloud for every attempted action:
Command PhaseScriptExecution failed with a nonzero exit code
Here is the ASC log for reference:
This is very strange because the same project builds and archives successfully on my local machine. I have used the same macOS and Xcode versions in the Workflow editor as my local version of Xcode.
How can I resolve this error?
Custom shell script restrictions
TL;DR
Apple has locked down the security of their hosted infrastructure by only enabling shell scripts to run as part of ci_post_clone.sh, ci_pre_xcodebuild.sh or ci_post_xcodebuild.sh in the ci_scripts folder. Your Pods project has a custom shell script outside of this folder that is not triggered by one of these CI scripts, so it does not have permission to run.
The solutions for this specific issue are:
(technically) Refactor your build script phase to inline the shell script file inside the run script.
(recommended, but not always possible) Use the Swift Package version of the CocoaPod if available.
(workaround) Downgrade the CocoaPod to a version without an external shell script.
Reference
From Customize your advanced Xcode Cloud workflows:
If your script doesn't appear to be running when you expect it to, double-check that you've named it correctly and placed it in a ci_scripts folder alongside your project.
...
Lastly, it should be noted that in a test action, multiple environments are used to build and run your tests. Only the environment that is used for building your tests will have your source code cloned into it by default. The environments that run your tests won't have source code cloned into them. They'll only have the ci_scripts folder made available on them. As a result, the post-clone script won't run in these environments and your custom scripts and any of their dependencies, such as other shell scripts and small tools, must be entirely contained within the ci_scripts folder.
Build script phases ARE allowed to run user-defined code as part of the build process; however, only inlined custom scripts can run here. As discussed, external shell scripts have restricted permissions. Running an external shell script file after moving it to ci_scripts does NOT work, e.g. "${PODS_ROOT}/../ci_scripts/AppCenter-xcframeworks.sh".
Although not relevant here, note that the environments that test your project won't have the source code cloned into them.
For tests or other environments to reference custom script files, we need to store additional scripts inside the ci_scripts folder to ensure the action has access to them. Apple only allows 3 scripts to run, corresponding to 3 stages of a build:
After cloning the source code into the build environment.
Before running xcodebuild.
After running xcodebuild.
Additional shell scripts can ONLY run here after being delegated to from the respective ci_post_clone.sh, ci_pre_xcodebuild.sh or ci_post_xcodebuild.sh files in the ci_scripts folder.
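For example, a minimal ci_post_clone.sh that delegates to a helper script in the same folder might look like this (the helper name is illustrative):

    #!/bin/sh
    # ci_scripts/ci_post_clone.sh - runs after Xcode Cloud clones the source.
    set -e
    # Delegate to another script that also lives inside ci_scripts.
    sh "$(dirname "$0")/install_dependencies.sh"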
Solution 1
My issue was running an external shell script during the build process. Apple does allow Run Script Build Phases in Xcode Cloud workflows, but they have to be inlined. So, I had to do 4 steps to run a custom Pod shell script as part of a build phase:
Refactor your Build Phases -> Run Script Phase script to inline the shell script file (see the sketch after this list).
Check the project builds locally after clearing DerivedData.
Commit your code to GitHub / VCS.
Trigger the workflow.
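To illustrate step 1, the Run Script phase changes from a file reference to the script body pasted inline (the body shown here is a placeholder, not the Pod's actual script):

    # Before: referencing the Pod's external script file (blocked by Xcode Cloud)
    # "${PODS_ROOT}/AppCenter/AppCenter-xcframeworks.sh"

    # After: the same commands copied verbatim into the Run Script phase
    set -e
    # ...contents of AppCenter-xcframeworks.sh pasted here...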
Solution 2 (recommended, but not always possible)
Add the Swift Package version of the CocoaPod as a dependency following Apple's documentation.
Solution 3 (workaround)
Downgrade your CocoaPod to a version without external shell scripts.
Notes
As you can tell from the amount of effort required to work around custom shell script build phases with Xcode Cloud, I suggest raising an issue on the specific CocoaPod repository to migrate away from custom shell script files. These kinds of steps make using Xcode Cloud very painful.
As Xcode Cloud adoption grows it is entirely possible that individual CocoaPods will stop referencing custom shell script files. I don't see Apple opening up their infrastructure to enable arbitrary shell script execution because this is a security risk and, to be honest, should have been prevented on other CI providers too.
I can also see how hassles like these to include legacy CocoaPods dependencies could accelerate more projects to migrate to SPM. SPM is already popular, and will likely become more popular as Apple ensures first-class integration.
Disclaimer: Xcode Cloud is in private beta so this issue may be resolved in future versions if shell script permissions are relaxed...
I'm using TFS 2015 to build and deploy my websites.
I have multiple websites and I need to deploy them to multiple machines behind an NLB.
So the steps are:
1 - Stop NLB on machine 1
2 - Deploy files
3 - Start NLB on machine 1
4 - Repeat for all machines.
Is there a way of doing this without having to configure these steps for each machine?
Is it possible to have a machine group and apply the steps to each one?
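For illustration, the steps for one machine boil down to this PowerShell sketch (the machine name and site path are placeholders; the cmdlets are from the NLB PowerShell module):

    # Sketch for one machine; currently repeated for web02, web03, ...
    Stop-NlbClusterNode -HostName "web01" -Drain                          # 1 - stop NLB on machine 1
    Copy-Item .\drop\* \\web01\c$\inetpub\wwwroot\site -Recurse -Force    # 2 - deploy files
    Start-NlbClusterNode -HostName "web01"                                # 3 - start NLB on machine 1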
Thanks
You need to use a custom task called Tokenizer in the workflow of the release. It tokenizes the variables in web.config, which can then be transformed. Tokenizer needs the initial values of the custom variables in a specific format.
To install the Tokenizer you first need node.js with the npm package manager installed on your machine. Follow this process to install and use Tokenizer:
Download and install node.js on your machine if it is not present. It also installs the npm package manager.
Download Tokenizer from https://github.com/openalm/VSOtasks. It comes as a .zip file. Unzip it.
Open a command prompt and change directory to the folder “Tokenizer\x.x.x” in the unzipped folder.
From that folder run the command npm install -g tfx-cli to install the command line tool that can upload the tokenizer task.
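After that, uploading the task looks roughly like this (a sketch; the collection URL and task path depend on your environment):

    tfx login --service-url http://your-tfs:8080/tfs/DefaultCollection
    tfx build tasks upload --task-path .\Tokenizer\x.x.x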
After using this you will be able to write environment-specific configuration files when deploying to different environments. For more detailed steps and tutorials, please take a look at this blog from MSDN: Deploy to multiple environments with appropriate configurations
Update
For "rolling deploy", this can't be achieved for now. No this option and task in web base release management. You may have to apply the steps to each machine. If you really need this feature, you can add it in uservoice of VSTS, TFS admin and PM will kindly review your suggestion.
Using csx scripts in Azure Functions I can use the Project.json file to install NuGet packages, but when I'm using fsx scripts the packages aren't installed (the log console never shows the Starting NuGet restore message). The only way I found is to install them locally and upload the dependencies. Am I missing something?
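For reference, the kind of Project.json that triggers the restore for my csx functions looks like this (package and version are just examples):

    {
      "frameworks": {
        "net46": {
          "dependencies": {
            "Newtonsoft.Json": "10.0.3"
          }
        }
      }
    }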
I think that the current execution model for F# in Azure Functions does not support project.json. There is a work-in-progress PR to improve F# support that will enable this.
For now, I think there are two options:
Install the packages locally and upload them to Azure (as you are doing)
If you're deploying via git, then I think the deployment lets you run a deployment script (in the same way that Azure Web Sites lets you run one).
I have not tested the second approach with Azure Functions, but I think it could work. For example, see the F# Snippets deployment script, which calls a build script that starts by using Paket to restore dependencies. This way, you just need paket.bootstrapper.exe and paket.dependencies with paket.lock to specify your NuGet dependencies.
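For illustration, a minimal paket.dependencies for that setup could look like this (the packages are just examples):

    source https://www.nuget.org/api/v2
    nuget Newtonsoft.Json
    nuget FSharp.Data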