Limit visibility for targets of a subpackage to only that subpackage - bazel

In our mono-repo we have a directory called wild_west, where people are allowed to submit any code they want without code review. It's great for one-off tests and experimental code that is still worth keeping checked into git.
Developers are not allowed to depend on anything from wild_west in the rest of the code-base.
Is there a way to enforce that with bazel?
The only way I can think of currently is to require that all targets under //wild_west must add visibility = ["//wild_west:__subpackages__"]. However, this would not prevent people from simply making targets public. I would ideally like to be able to apply some sort of "maximum visibility scope" for all targets under wild_west so they can't be visible outside of wild_west.
Is this possible?
Alternatively, maybe there is a robust way to add a check in our CI for this?
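To make the CI idea concrete, the check I have in mind is a bazel query that lists every target outside //wild_west that transitively depends on something inside it, and fails if that list is non-empty. Below is a minimal sketch, written as a small Swift script purely for illustration (a shell one-liner around the same query would do just as well); the //wild_west path comes from above, everything else is made up. Unlike a visibility rule, this only inspects the dependency graph, so it would still catch an offender even if someone marks a wild_west target as public.

import Foundation

// Sketch of a CI gate: find every target outside //wild_west that transitively
// depends on anything inside //wild_west, and fail the build if any exist.
let query = "rdeps(//..., //wild_west/...) except //wild_west/..."

let bazel = Process()
bazel.executableURL = URL(fileURLWithPath: "/usr/bin/env")
bazel.arguments = ["bazel", "query", query, "--output=label"]

let pipe = Pipe()
bazel.standardOutput = pipe

do {
    try bazel.run()
} catch {
    print("Failed to launch bazel: \(error)")
    exit(2)
}

// Read the full output before waiting, so a large result can't block the pipe.
let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(), as: UTF8.self)
bazel.waitUntilExit()

let offenders = output.split(separator: "\n").filter { !$0.isEmpty }
if !offenders.isEmpty {
    print("Targets outside //wild_west depend on //wild_west:")
    offenders.forEach { print("  \($0)") }
    exit(1)
}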
Thank you!

Related

Managing multiple versions of an iOS App

Let's say I have an iOS app for Football news, and now I want to create another version for Basketball news that will be based mostly on the Football app, but with the freedom to give each app different behaviour in some aspects, plus the ability to add more apps in the future for other news subjects.
Another condition is that they will have separate Core Data models, assets, icons, etc.
As I understand it, I have a few options:
Manage the apps separately, place them in the same directory and point to the shared files in the first (Football app).
Create a different target for each app in the same project
Create a Workspace with one project that will hold the common code and a project for each app.
What are the pros/cons of each option, and what are the best practices in this situation?
Just to clarify: the apps I mention are an example; the app is not for news, and it must be a different app for each concept.
Thanks
I work in an enterprise environment, and we have a mobile app that's a product of the company I work for. We sell licenses for that software to our customers, which are always huge companies. Our app doesn't go through the App Store.
Each of our clients has some sort of customization of the app, ranging from simply changing their logo to adding features specific to that client. What I mean by this is: we deal every day with a situation very close to what you are describing, and here's my two cents.
In advance: sorry if I'm too honest sometimes, I don't mean to offend anyone.
1. Manage the apps separately, place them in the same directory and point to the shared files in the first (Football app).
Well... That's a weird solution, but it sure could work. It might be hard to maintain locally and even harder when using SVN/Git (especially when working on a team).
I had some issues related to symbolic links before, but I'm not sure if that's what you are referring to in this option. If you explain it a little better, I can edit this and try to give you a better opinion.
2. Create a different target for each app in the same project
That's a better start, in my opinion.
We use this approach mostly to handle various possible backend servers. For example, one of our targets uses our development backend server, while another target uses the production server. This helps us ensure that we can use the development-targeted app without risking serious costs to our team (due to a mistakenly placed order, for instance).
In your case, you could for example configure preprocessor macros on the targets to enable/disable some target-specific feature that's called by code. You could also use different storyboards for each target.
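To make that concrete, here is a minimal sketch of the macro-driven approach. It assumes each target defines its own flag (for Swift, under Active Compilation Conditions; for Objective-C, as a preprocessor macro); the flag and feature names below are purely illustrative.

// Illustrative feature gate: each target defines its own compilation condition
// (e.g. BASKETBALL, DEVELOPMENT) in its build settings, and the shared code
// branches on the resulting flags instead of duplicating logic per target.
enum FeatureFlags {
    static var showsLiveScores: Bool {
        #if BASKETBALL
        return false   // hypothetical: the basketball app hides live scores
        #else
        return true
        #endif
    }

    static var usesProductionServer: Bool {
        #if DEVELOPMENT
        return false   // the development-targeted build talks to the dev backend
        #else
        return true
        #endif
    }
}

// Shared code then reads the flags, e.g.:
// if FeatureFlags.showsLiveScores { showLiveScoresTab() }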
The downside of this option is that the code will be messy, because every piece of code will be in the same project. This is the main reason why I'd go with option #3.
3. Create a Workspace with one project that will hold the common code and a project for each app.
Again, I'd go for this. To be honest, we're not using this at our company YET, but that's due to internal reasons. I'm trying to get this going for our projects as soon as possible.
I wouldn't call it easy to set up, but if done properly it can save you some time on maintenance. You'll be able to reuse any code that can be reused, and still keep your target-specific images, classes and views in their own "container" (project).
This way you'll get a default project (the app itself), multiple targets for it, and a "framework" to keep the code for each one of the targets. In other words, you'll be able to share code between the multiple targets/apps, and at the same time you'll be able to separate what belongs to each one of them. No messy project :)
I'm not sure how Core Data is compiled by Xcode, as we're not using it. But check out the answer I just wrote for another question. It's not Swift, but that shouldn't make much difference, as almost all of the answer is about configuring the workspace to achieve this solution. Unfortunately I think it's too big; that's why I'm linking to the answer instead of pasting it here.
If you need any help setting that up, let me know and I'll do my best to help you.
This may be overkill for you, but this solution is scalable. We had to build ~15 apps from one codebase.
The problem we had to solve was branding. Application design and flow was basically the same, along with the structure of the data we received.
A lot of the heavy lifting was done by our CI server.
We had a core application with all of the UI and some common business logic. This was known as the White-app.
We then had a specific project (frameworks didn't exist then) for each of the different endpoints and data models, plus mappers into the White-app's view models. Those projects were private pods managed by CocoaPods.
Our CI was configured so that it would compile all the 'branded' apps by copying, compiling and signing the varying plist, asset and string files into each application, along with each application's specific data models. So when an end-to-end build was triggered, it would build all the different branded apps.
The advantage of this is that the target layout within Xcode is not cluttered: we had a release, test and development target, which applied to each application built. This meant our project was succinct, with no risk of accidentally editing a branded app's build settings.
This solution will also provide you with an .xcworkspace (mostly utilised by CocoaPods) which contains references to the different model pods.
The downside of this solution is that it takes work to set up; e.g., for building in Xcode we created a special scheme which installed a pod and copied in all the correct assets (as CI would).
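As a rough illustration of how the shared White-app code can consume the per-brand files that CI copies in (the type and plist keys below are made up, not our actual code), all branding can go through a single entry point that reads whichever Brand.plist the build received:

import Foundation

// Illustrative only: CI drops a different Brand.plist into each branded build,
// and the shared (White-app) code reads all branding through this one type.
struct Brand {
    let name: String
    let apiBaseURL: URL
    let primaryColorHex: String

    static let current: Brand = {
        guard
            let url = Bundle.main.url(forResource: "Brand", withExtension: "plist"),
            let dict = NSDictionary(contentsOf: url) as? [String: Any],
            let name = dict["Name"] as? String,
            let api = (dict["APIBaseURL"] as? String).flatMap(URL.init(string:)),
            let color = dict["PrimaryColorHex"] as? String
        else {
            fatalError("Brand.plist missing or malformed for this branded build")
        }
        return Brand(name: name, apiBaseURL: api, primaryColorHex: color)
    }()
}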
This is a question that many developers have thought about many times, and they have come up with different solutions specific to their needs. Here are my thoughts on this.
Putting the common parts, which you could see as the core, into something separate is a good thing. Besides supporting reusability, it often improves code quality through the clear separation and clean interfaces. In my experience, this also makes testing easier. How you package this is determined by what you put in there. A static library is a pretty good start for core business logic, but it lacks support for Swift, and resources are painful to include. Frameworks are great, but raise the bar on the minimum iOS deployment target. Of course, if you're just sharing very few files, simply adding the folder to your app projects might work as well - keeping the project structure up to date can be automated (the dropbox/djinni tool does this), but it's a non-trivial approach.
Then there are the actual products to build, which must include the core module, and the individual parts. This could be a project with several targets, or a workspace with several projects, or a mix of both. In the given context, I make my decision based on how closely the apps are related. If one is just a minor change from the other, like changing a sports team, or configuring some features out as in light vs. pro, this would be different targets in the same project. On the other hand, I'd use different projects (maybe arranged within a common workspace) if the apps are clearly different, like a Facebook client and a Twitter client, or a board game app for offline play and an online gaming app, etc.
Of course, there are many more things to consider. For example, if you build your app for clients and ship the sources, separate projects are probably needed.
In all three options, it's better to create a framework that contains the shared code. Also, the first option is bad in any case. For better control, it is better to go with option 2 or 3. The workspace is preferable, IMHO, since it won't harm the other sub-projects if you, for example, decide to use CocoaPods. The workspace also allows you to have a different set of localizations in each project. Plus, only the targets related to a specific project will appear in its target list, which is better than a bunch of targets in one pile (if you have, for example, a share extension in all products, it will be frustrating to find the one you need). What you choose depends on your needs, but both the second and third options are good enough.
I think that the best way to do this is something that combines all three.
First, I would create a configurable framework that shares everything the targets have in common, from UI (elements such as custom alerts, etc.) to business logic.
Then I would create different bundles or folders for each target, checking the target membership (this way you guarantee that only the exact resources are imported); then, using preprocessor macros, you can create a path builder specific to the bundle or directory where that target's resources reside.
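For example (a sketch only, with made-up condition and bundle names), the path builder can pick the right resource bundle at compile time:

import Foundation

// Sketch of a per-target "path builder": each target includes its own resource
// bundle via target membership, and a compilation condition set per target in
// the build settings picks the matching bundle at compile time.
enum TargetResources {
    #if TARGET_BASKETBALL
    static let bundleName = "BasketballResources"
    #else
    static let bundleName = "FootballResources"
    #endif

    static var bundle: Bundle {
        guard
            let url = Bundle.main.url(forResource: bundleName, withExtension: "bundle"),
            let bundle = Bundle(url: url)
        else {
            return .main   // fall back to the app bundle if the resource bundle is missing
        }
        return bundle
    }

    static func url(forResource name: String, withExtension ext: String?) -> URL? {
        bundle.url(forResource: name, withExtension: ext)
    }
}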
Over the years I've collected some interesting links about best practices.
Here they are:
Use asset catalog with multiple targets
Use multiple targets in Xcode 6
Xcode groups vs. folders
Create libraries with resources
Create lite and pro version of an app
I know that Swift made some changes to preprocessor macros, so some articles are still valid but a little bit outdated.
We all face this kind of situation. Here are the things I do; maybe you can pick something here that can help you (I hope).
have a project that contains the core features
have modular projects that can be used by other variants of the product
manage the project under version control, with a flow (e.g. git flow) that keeps the main source/project in the main branch and accessible through branches/features
create a new branch/feature for the project variant if necessary, or just enable/disable or use the project modules needed for that variant (whichever is most appropriate for the current setup).
if the app connects to a web service, provide a licensing stage where the mobile app makes its first-ever request to a web service URL that is common to all variants (or even all mobile apps). This web service interprets the request and responds with the app's settings (e.g. the web service to connect to for the given license number, the modules to be enabled, the client's logo, etc.); a sketch of such a bootstrap request follows this list.
the main projects and modules created can be converted to frameworks, libraries or even bundles for resources and assets, depending on the level or frequency of changes made to these items. If these items are constantly changing or updated by others, then don't package them; have a workspace with targets that link the whole project/module into the current project variant, so that changes to these modules are reflected immediately (with consideration for version control, of course).
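Here is a rough sketch of that licensing/bootstrap request; the endpoint, field names and types are purely illustrative, not a real API.

import Foundation

// Illustrative bootstrap: every app variant calls one shared licensing endpoint
// and receives its variant-specific settings. Endpoint and fields are made up.
struct LicenseConfig: Decodable {
    let serviceBaseURL: URL
    let enabledModules: [String]
    let logoURL: URL?
}

func fetchLicenseConfig(licenseKey: String,
                        completion: @escaping (Result<LicenseConfig, Error>) -> Void) {
    var request = URLRequest(url: URL(string: "https://licensing.example.com/v1/config")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONEncoder().encode(["licenseKey": licenseKey])

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error {
            return completion(.failure(error))
        }
        do {
            let config = try JSONDecoder().decode(LicenseConfig.self, from: data ?? Data())
            completion(.success(config))
        } catch {
            completion(.failure(error))
        }
    }.resume()
}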

Different files per build

I have a class with stuff that changes per build. For the debug build, some network calls are different because of a different server; the same goes for the release and mock builds.
In Android I use flavors and put a file with the same name in each flavor, but with different code.
I'm looking for the same possibility in Xcode. I've seen tutorials like this, but that does it via plists, and it doesn't work for classes.
There are so many ways to do this.
My personal favorite is to use multiple Targets. I won't go into real detail here, as a simple Google search should reveal plenty of information on how to create and work with multiple targets.
I prefer targets because it is so simple to switch between them depending on your needs, and because you can have each of the targets installed on one device as needed. For example, you can have the current production version of your app on the device along with your latest dev and QA versions as well.
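As a quick illustration (file, target and type names are invented), the target-based analogue of Android flavors is to have one file per target that declares the same type, with Target Membership deciding which definition each build compiles:

import Foundation

// Sketch of the flavor-style setup with targets: two files declare the same type,
// and each file belongs to exactly one target (via Target Membership), so every
// build compiles exactly one definition.

// APIConfig.swift, member of the production target only:
enum APIConfig {
    static let baseURL = URL(string: "https://api.example.com")!
    static let usesMockResponses = false
}

// APIConfig.swift, member of the debug/mock target only (commented out here so
// the snippet stands alone):
// enum APIConfig {
//     static let baseURL = URL(string: "https://dev.api.example.com")!
//     static let usesMockResponses = true
// }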
An alternative would be to use "Categories" - again google should get you plenty of information to implement.
Use the category to extend your class with the specific information you need for each environment. Create multiple iterations of your category (one for each group of settings) and use a pre-build script to copy the desired instance into your project.

How to use TFS Build Process: LabelSources?

I'm attempting to modify my build process file for TFS 2010. I have a flag that is set when queuing the build, and when said flag is set, I want to create a Label, and add all the source files in the compiled project to that label.
On subsequent builds, with the flag set, I then want to replace older source files in said label with anything new in the changeset being compiled.
I've been attempting to do this with LabelSources with no luck, and there is very poor documentation on either LabelSources or LabelWorkspace (what's the difference?).
Here's what I currently have:
<mtbwa:LabelSources
Child="[LabelChildOption.Replace]"
Comment="Published to Container"
DisplayName="Create Container Label"
sap2010:WorkflowViewState.IdRef="LabelSources_1"
Items="[{&quot;$/Foo/LabelTest/Sandbox/&quot;}]"
Name="[String.Format(&quot;{0}-{1}&quot;, LabelName, Version_Container)]"
Recursion="[RecursionType.Full]"
Scope="$/Foo"
mva:VisualBasic.Settings="Assembly references and imported namespaces serialized as XML namespaces"
Version="T" />
It definitely hits the action, but no labels can be found after the fact.
Any help would be much appreciated, and any tangible documentation, other than class documentation with sparse definitions, would also be greatly appreciated.
Edit 1: Tried to clear up my goal.
What you are trying to do is built into the existing template. There should be an option in the process definition that refers to Clean Sources which will be set to True.
This option controls whether the build sources get cleaned (deleted and started afresh) or whether a differential get is done.
If you have a lot of source code you can set clean sources to false and save a bunch of time getting the code.
You can also speed the build by placing a TFS Proxy on the build box which will cache the files and make a clean build quicker.
In my experience, most of the built-in activities are poorly documented for a reason - their only well-tested use case is their use inside TFS's built-in templates (DefaultTemplate.11.0.xaml, etc.). I'm afraid you're going to have to write some custom code, in the form of a custom activity, PowerShell script or something, to achieve other goals.
That said, I don't really understand the process you're trying to set up. Do you just want to have a label set as your latest-successfully-built sources? Why not use the one created automatically by the build itself?

Reuse parts of a TFS build process template

The TFS build flow is defined in TFS 2010's build template (which is in fact a Windows Workflow Foundation file with a *.xaml extension).
It was pretty convenient for dealing with a single build definition in a simple project, but in the near future we'll have a more complicated project with many very different build definitions, while at the same time some of them will share significant common logic.
We have no wish to have the common logic replicated in each build template, and on the other hand having one super-smart, parameterizable build is not considered the best idea either.
Long story short, but the questions is:
is there any way to put common logic into another build template (or whatever) and reuse it?
If not, do you have any approaches/recommendations for such a situation?
UPDATE
As K.Hoff mentioned, there is a possibility to create custom activities, but I want to go deeper and reuse not only activities but sequences as well (put simply, similar to what Ant or NAnt do: include one file in another, call one sequence from another, etc.).
I would recommend checking whether it is possible to write a code activity which executes a workflow (.xaml file) containing the common build functionality. Such a code activity could then be put into several "master" build templates, making it possible to reuse the common flow.
Here is an example how to dynamically load and execute workflow - http://msdn.microsoft.com/en-us/vs2010trainingcourse_introtowf_topic8.aspx.
We have a similar situation, but since most of our build scenarios are similar (i.e. get->build->test->deploy) we have mostly solved it with one big definition and custom activities. But we also make use of the ExecuteWorkflow activity available from Community TFS Build Extensions.
This works well for "simple" scenarios; the reason we don't use it more extensively is that it's quite complicated to pass parameters between workflow executions. Here's a link to a problem I had with this (and further down, the solution I found).
You can create custom code activities as explained here and reuse them in other build templates.
Another way is to implement good old MSBuild scripts and call them from MSBuild execution activities so they can be reused in many build process templates.
I can't find a quick way to reuse complete sequences; the only way we found is to write the activities as generically as possible and inject parameters to get them to run.
But I don't think it's a TFS problem; it's a Workflow problem.

Working with MSBuild and TFS

I'm trying to work with MSBuild and TFS.
I've managed to create my own MSBuild script that works great from the command line. The script works with csproj files, and compiles, obfuscates, signs and copies everything that's needed.
However, looking at the documentation for TFS & Team Build, it appears that it expects solutions as the "input" to the script.
Also, I haven't found an easy/intuitive way of performing a "Get Latest Version" from TFS as part of the script. I'm assuming that Team Build automatically does a "Get Latest" on the solutions it's supposed to compile, but again - I don't (want to) work with solutions...
Any insights? any pointers? any links?
Team Build defines about 25 targets of its own. When you queue a Team Build, they are automatically run for you in the predefined order listed on MSDN. Don't modify this process. Instead, simply set a couple of these properties that determine how the tasks behave. For example, set <IncrementalGet> to "true" if you want ordinary Get behavior, or "false" if you want something closer to tf get /force.
As far as running your own MSBuild script, again this shouldn't be necessary. Start with the TFSBuild.proj file that's provided for you. It should only require minimal modifications to do everything you describe. Call your obfuscation & signing code by overriding a task like AfterCompile or AfterTest. Put your auto-deploy code in AfterDropBuild. Etc.
Even really complex scenarios are possible if you refactor appropriately. See past answers #1 #2.
As far as the actual compile, you're right that Team Build operates on solutions. I recommend giving it what it wants. I'll be the first to admit that *.sln files are ugly and largely undocumented, but at least you're offloading the work to a well tested & supported product.
If you really wanted to, you could give it a blank/dummy solution and override the CoreCompile task with your custom compiler logic. But this is really asking for trouble. At bare minimum, you lose all of Team Build's flexibility WRT building multiple platforms and flavors. More practically, you're bound to spend a lot of time debugging something that's designed to "just work" -- and there are no good MSBuild debuggers yet (that I know of). Not worth it, IMO.
BTW, the solution files do not affect the Get process. As you can see in the 1st link, the Get is done very early on, long before Team Build even reads the solution file(s). Apart from a few options like <IncrementalGet>, this is not controlled from MSBuild at all -- in particular, the paths to be downloaded are determined by the workspace mappings associated with the build definition. I.e., they are stored in the Team Build SQL database, not the filesystem, and managed with tools (like Team Explorer) that call the TFS webservice API.
