I need to set up TeamCity CI/CD for my iOS project. I'm using Carthage for dependency management.
I understand that performing carthage update for each build is a bad idea: the build agent would rebuild all frameworks for every new build, which is a very time-consuming operation.
Is there any approach to caching my dependencies to speed up a build?
A possible approach would be to set up a separate build configuration (e.g. "Producer") that executes carthage update (whenever needed) and then publishes a zipped Carthage/Build folder as a build artifact. The other build configurations for your project should declare an artifact dependency on "Producer" and unpack the binaries back into Carthage/Build, as sketched below.
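A minimal sketch of the two sides of that setup, assuming TeamCity's artifact dependency mechanism and the default checkout directory (the archive name and paths are illustrative, not a fixed TeamCity convention):

# "Producer" configuration: resolve and build dependencies, then publish the archive.
carthage update --platform iOS --cache-builds
cd Carthage && zip -r ../carthage-build.zip Build && cd ..
# Declare carthage-build.zip as a build artifact in the Producer configuration settings.

# Consumer configurations: the artifact dependency restores carthage-build.zip,
# so only this unpacking step is needed before running xcodebuild.
unzip -o carthage-build.zip -d Carthage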
I am currently using the Carthage dependency manager for iOS, and I am having a hard time clearing the cache when running it on Azure Pipelines. This is what I call to build all the dependencies for the project:
carthage bootstrap --platform ios --verbose --new-resolver --use-xcframeworks
I have tried removing the cache directory shown in the logs:
/Users/*****/Library/Caches/org.carthage.CarthageKit
However, it still grabs the cached version of the repo and fails. Building the same project locally works fine.
I have also tried cleaning the workspace as part of the job, but it still doesn't clear the cache.
Any help in this context is much appreciated.
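For reference, these are the places Carthage keeps cached data; wiping all of them on the agent before the bootstrap step is one way to rule the cache out. This is a sketch of a plain script step, not an official Azure Pipelines task:

# Carthage's global cache of resolved dependencies and prebuilt frameworks.
rm -rf ~/Library/Caches/org.carthage.CarthageKit
# Per-project checkouts and built frameworks inside the repository.
rm -rf Carthage/Checkouts Carthage/Build
carthage bootstrap --platform ios --verbose --new-resolver --use-xcframeworks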
CocoaPods embeds a step in the build phases to check whether the Pods folder is in sync with the versions in Podfile.lock. This blocks the developer from working with stale versions of the Pods, with the following error:
error: The sandbox is not in sync with the Podfile.lock. Run 'pod install' or update your CocoaPods installation.
Carthage has Cartfile.resolved, but how is it used to check whether the Carthage builds are fresh or stale? Is it something that has to be manually enforced via some script?
AFAIK the Carthage guides do not mention this topic at all. Usually a failing compilation informs you that a certain framework is not found, e.g. when you switch to a new branch implementing a feature that needs framework XYZ. Then you know you have to run carthage bootstrap.
We have a script that runs so fast that you could even place it in a git hook so it runs automatically after switching branches (see the hook sketch below). The Carthage part just calls the following command:
carthage bootstrap --use-ssh --use-xcframeworks --cache-builds
It makes sure the Carthage dependencies are up to date, and it is fast since it uses cached builds. It has run fine for several years now.
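A minimal sketch of such a post-checkout hook. The hook location and its arguments are standard git; guarding on a changed Cartfile.resolved is my own assumption so bootstrap doesn't run on every checkout:

#!/bin/sh
# Save as .git/hooks/post-checkout and make it executable (chmod +x).
# $1/$2 are the previous and new HEAD; only re-bootstrap when Cartfile.resolved changed.
if ! git diff --quiet "$1" "$2" -- Cartfile.resolved; then
  carthage bootstrap --use-ssh --use-xcframeworks --cache-builds
fi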
I am starting to use Bitrise as the CI for my iOS app, but I'm on the free plan, so the build has to take 10 minutes or less.
The main issue is the time Bitrise spends building my Carthage dependencies. Is there a way I can prebuild them locally and push everything to my repo so I can completely avoid the Carthage step?
At the moment I'm using carthage bootstrap. I have also tried carthage update --cache-builds.
Thanks
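For what it's worth, committing the prebuilt binaries is mechanically simple; here is a sketch (whether checking build products into git is acceptable for your repository size is a separate question):

# Build once locally, then force-add the normally ignored build output and commit it.
carthage bootstrap --platform iOS --cache-builds
git add -f Carthage/Build
git commit -m "Check in prebuilt Carthage frameworks"
# The Bitrise workflow can then skip the Carthage step entirely.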
Use Rome (https://github.com/tmspzz/Rome). It provides a cloud-storage-based Carthage cache: your Carthage step pulls prebuilt binaries from the cloud storage of your choice (S3, for instance).
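A sketch of the typical Rome workflow on CI, assuming a Romefile already points at your S3 bucket; exact flags can vary between Rome versions:

# Pull whatever prebuilt frameworks are already in the cache.
rome download --platform iOS
# Build only the dependencies that are still missing, then upload them for next time.
rome list --missing --platform iOS | awk '{print $1}' | xargs carthage bootstrap --platform iOS --cache-builds
rome upload --platform iOS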
I have a foo.xcconfig file that has different values depending on where in the world a developer is. Currently, we use a build phase to create this file (which is imported by other config files). With the new build system, Xcode now resolves the dependencies before I have a chance to create a dummy file to satisfy the include.
This is easily solved for CI builds by doing the work before calling xcodebuild, but for local builds using the GUI, there does not seem to be a good solution.
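A minimal sketch of that CI-side workaround; foo.xcconfig comes from the question, while the key, workspace, and scheme names are placeholders, not anything from the original project:

# CI: generate the region-specific config before invoking xcodebuild, so the include resolves.
printf 'API_HOST = ci.example.com\n' > foo.xcconfig   # illustrative key, not from the project
xcodebuild -workspace App.xcworkspace -scheme App archive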
Every time I run xcodebuild ...params... archive, it performs a clean build without reusing the cache from a previous build.
While I understand the motivation behind it, it has serious consequences for my CI build times because I generate an IPA for every push. Since my workspace has a Pods project (unlikely to change between commits) and an App project (changing all the time), I'm looking for a way to cache the Pods compilation phase somehow, which xcodebuild archive doesn't seem to allow.
Is there a way to force using the cache when archiving? Is there any other way to cache the archived version of Pods?
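One commonly tried approach, as a sketch: point xcodebuild at a derived-data directory that your CI preserves between runs, so targets that did not change (such as Pods) are not recompiled. The workspace and scheme names are placeholders, and whether your CI actually keeps that directory across builds is an assumption here:

# Reuse a derived-data folder that survives between CI runs.
xcodebuild -workspace App.xcworkspace -scheme App \
  -derivedDataPath "$HOME/ci-cache/DerivedData" \
  archive -archivePath build/App.xcarchive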