I am working on setting up CI/CD for my organization. I am trying to build an automated system that will deploy our changes to our Shopify theme when we push to a branch.
I have a config.yml file that contains values like the password, theme ID, and so on. I am trying to find a way to keep these values out of our repo. GitHub Secrets hasn't worked so far, because I can't figure out how to pass those secrets from my workflow YAML file to my root-level config.yml file. Any help would be greatly appreciated!
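For concreteness, the kind of workflow step I've been trying to write looks roughly like this - only a sketch, where config.yml.tpl, the __TOKEN__ placeholders, and the secret names are all made up by me:

```yaml
# One step in .github/workflows/deploy.yml: render config.yml at build time
# from a committed template, so the real values only ever live in GitHub Secrets.
- name: Render config.yml from secrets
  env:
    SHOPIFY_PASSWORD: ${{ secrets.SHOPIFY_PASSWORD }}   # assumed secret name
    SHOPIFY_THEME_ID: ${{ secrets.SHOPIFY_THEME_ID }}   # assumed secret name
  run: |
    # config.yml.tpl is config.yml with __PASSWORD__/__THEME_ID__ tokens
    sed -e "s|__PASSWORD__|${SHOPIFY_PASSWORD}|" \
        -e "s|__THEME_ID__|${SHOPIFY_THEME_ID}|" \
        config.yml.tpl > config.yml
```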
I've been using AWS Amplify to build my iOS app's backend.
I have created 4 DTAP environments in the backend, with 4 different configurations, and use a run-script to switch in the correct versions of awsconfiguration.json and amplifyconfiguration.json at compile time, based on the selected scheme.
Since these auto-generated config files contain a number of secrets and API keys, I am keeping them out of source control via my .gitignore: committing them would be a point of failure, and I don't want to expose my entire backend that way.
This works fine locally, but when I run my CI on Bitrise, the build fails since these config files aren't present. I need to find a way to get these AWS and Amplify config files into the CI to be able to create my test builds.
If I am being overly cautious, and the config files are actually fine to keep in source control (i.e. not secret), please let me know. I really don't want to set up secrets as individual environment variables, since Amplify will have several secrets and endpoints for each environment I need, and it feels too messy and complicated to have a script building these config files as a CI stage.
Things I've tried:
Creating mock config files with fake secrets that are copied in at compile time - this fails because the compile-time script still tries to copy the non-existent config files for the real environment
Using individual environment variables as secrets in Bitrise - this is likely to work, but will be a monumental effort for my 1-dev startup to maintain
Touching a fake config file to copy over - this works but means the actual AWS infra doesn't work in the test builds
I'll be grateful for any thoughts, suggestions or experience anyone has.
Thanks
Jacob
I would recommend using Bitrise's Generic File Storage and the related File Downloader step to download them. This will inject the files into your build, and you can put them where they need to be before the project is compiled.
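A sketch of what that could look like in bitrise.yml - the BITRISEIO_*_URL variables come from whatever IDs you assign when uploading the files to Generic File Storage, and the destination paths are assumptions for your project layout:

```yaml
workflows:
  primary:
    steps:
    # Each File Downloader step pulls one stored file into the workspace
    - file-downloader@1:
        inputs:
        - source: $BITRISEIO_AWSCONFIGURATION_URL
        - destination: ./awsconfiguration.json
    - file-downloader@1:
        inputs:
        - source: $BITRISEIO_AMPLIFYCONFIGURATION_URL
        - destination: ./amplifyconfiguration.json
```

With the files restored before the Xcode build step runs, your existing scheme-switching run-script should work unchanged.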
I have several separate GitHub repositories in a GitHub organization for which I want to run the same build test with Travis CI.
That is, I want to be able to use the same .travis.yml for all of these repositories. Moreover, I'd like to be able to update this file and have those changes be valid for each repository.
I could copy the .travis.yml into each repository. But if I have a hundred or two hundred repositories, that gets annoying real fast.
Is there any way to simply point each repository to an external .travis.yml, rather than having to put a duplicate .travis.yml file in each repository?
There isn't a way to do this with a remote .travis.yml file, as Travis CI looks for this file at the root of the project. An alternative approach I would suggest to accomplish your goal:
Build automation around updating all of your repositories' .travis.yml files from a shared common file: using your favorite scripting language, update the file in all specified repositories and push the changes to GitHub/GitLab automatically. This should help with the maintenance of your repositories at the cost of just a bit of extra automated work; a sketch follows.
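A minimal sketch of that automation, written here as the CI job of a repo that owns the shared file - the my-org name, the REPOS list, and GITHUB_TOKEN are all placeholders:

```yaml
script:
  - |
    for repo in $REPOS; do
      # clone with a push-capable token, overwrite the config, push it back
      git clone "https://${GITHUB_TOKEN}@github.com/my-org/${repo}.git"
      cp .travis.shared.yml "${repo}/.travis.yml"
      (cd "${repo}" &&
       git add .travis.yml &&
       git commit -m "Sync shared .travis.yml" &&
       git push origin HEAD) || true   # skips repos that are already up to date
    done
```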
I want to include the Terraform orchestration process in my Continuous Integration pipelines. The idea is that each time someone modifies a Terraform template, a new version number is assigned and a snapshot is saved to a repository somewhere, like Nexus.
In a very naive approach, I was thinking of putting a comment at the top of every Terraform template file, like this: # Version 1.0.0, and on every release looking at this string and bumping it up to # Version 1.0.1.
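For what it's worth, that bump is easy enough to script as a pipeline step; a sketch, assuming GNU grep/sed, a GitHub-Actions-style step, and that main.tf is the file carrying the header:

```yaml
- name: Bump patch version in template header
  run: |
    # read the current "# Version x.y.z" header, increment the patch component
    current=$(grep -oP '(?<=^# Version )\d+\.\d+\.\d+' main.tf)
    next="${current%.*}.$(( ${current##*.} + 1 ))"          # 1.0.0 -> 1.0.1
    sed -i "s/^# Version ${current}/# Version ${next}/" main.tf
```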
Is there, however, a recommended way of doing this - the Terraform way?
I believe what you are looking for is a Terraform S3 backend with a Terraboard view.
That way, the state file goes to an S3 bucket whenever a change happens, and Terraboard gives you a good UI to view and compare the versions/states.
https://github.com/camptocamp/terraboard#use-with-docker
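A minimal docker-compose sketch along the lines of that README - the region, bucket name, and port are assumptions, and Terraboard also needs its own small PostgreSQL database, omitted here (the README above has the full compose file):

```yaml
version: "3"
services:
  terraboard:
    image: camptocamp/terraboard
    environment:
      # credentials come from the usual AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
      AWS_REGION: eu-west-1                  # assumed region
      AWS_BUCKET: my-terraform-state-bucket  # assumed state bucket
    ports:
      - "8080:8080"
```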
Remember: the S3 bucket needs to have Versioning enabled.
Thanks.
I have looked into Jenkins tutorials, and almost all of them mention that we should provide the URL to the Git repo.
Fine.
But once Jenkins has access to the Git repo, what part of the project does it look at to figure out which tests should be run, or whether to run them at all, etc.? Is it some configuration file in the repo?
I guess that depends on what kind of project your repo holds, if I understand the question correctly. The provided URL gives Jenkins the information to do a git clone <url>, which checks out the project into the Jenkins workspace.
Then, according to the type - let's say it's a Maven project - you fill in the goals you'd like Jenkins to run, usually clean test. They are run at the top level, the root of the project, on the assumption that a pom.xml will be found there; if not, you'll have to tell Jenkins where to look.
A clearer answer would be easier to give if you said what kind of project you'd like to build.
I'm trying to deploy my MVC4 app to AWS Elastic Beanstalk. The project has several post-build steps which pull together dependencies. The AWS SDK publish wizard therefore does not do the trick - it builds a Web Deploy package behind the scenes, which does not run those post-build steps or preserve the resulting directory structure.
So I downloaded the command-line EB tools and got a Git repository working, but I can't work out the next step: what do I push to the server with git aws.push? If it's just the resulting files, then I can't specify the "Enable 32-bit Applications" flag (required), etc. Do I push a Web Deploy package from my repository instead?
I presume so, but if so, how do I include the files that my post-build steps copy into the output folder during "normal" builds?
Here we go. This seems to be in conflict with what Jim Flanagan was saying: below it's a zip file, but Jim says it's the contents of it.
@Jim Flanagan - perhaps you could comment if you have some time. Thanks.
Hi, thanks for contacting AWS Premium Support.
Here is the communication from the Elastic Beanstalk engineering team.
When you git aws.push an ASP.NET/MVC app, you do not push the Web Deploy archive; rather, you push the artifacts as you want them deployed on the machine. From the customer's Stack Overflow question, it seems they have already found the local Git repo that the VS deployment wizard created, and looking there should give them a good indication of what is needed in the Git repository.
There isn't a nice way through git aws.push to specify what the "Enable 32-bit Applications" app pool setting should be (or any other configuration setting). If you need a specific configuration setting, I would suggest creating the environment via the console or the eb command-line tool, both of which allow you to specify the configuration, and then using git aws.push to deploy to that environment; git aws.push will just use the configuration that is already present on the environment.
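For reference, where the container version supports configuration files, an app pool setting like this can usually also be expressed as an option_settings entry in an .ebextensions/*.config file committed with the application. A sketch only - the namespace and option name below are my best guess and should be verified against the Elastic Beanstalk documentation:

```yaml
# .ebextensions/apppool.config at the application root
option_settings:
  - namespace: aws:elasticbeanstalk:container:dotnet:apppool  # assumed namespace
    option_name: Enable 32-bit Applications
    value: true
```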
The last question, about the deployment still being incremental, is not really valid, since you are not pushing just one big zip file. But if you were, it could still be incremental depending on what changed in the zip file; it might just send a diff between the two versions of the zip file. As the question implies, though, that use case is not really what incremental deployments were designed to help with.