I am attempting to implement a way to deploy my compiled assemblies to QA for testing. I have a number of repositories, each of which contains folders with projects and source code, as well as a folder that contains the compiled assemblies produced from those projects. The image below is representative of my source repository workspaces.
I also have a Development user group and a QA user group. The Development user group has full permissions to the source repositories.
I have tried a couple different ways to accomplish my goal, but have had little to no success:
I have tried creating a new repository called Assemblies, and adding Xlinks to my source repositories, and then denying all permissions to the QA user group for all folders and files in the source repository, with the exception of the Assemblies folder. However, when logged into Plastic as a member of the QA user group, I can still download all of the source files.
I also tried directly accessing the source repositories as a member of the QA user group, but I still have access to files that shouldn't be available.
I am considering a check-in trigger. The trigger would add/check in .exe's and .dll's into the Assemblies repository whenever they are checked into the main branch of a source repository. However, I'm not sure if I'm heading in the right direction. I don't want to 'reinvent the wheel' if there is already a preferred method or best practice that I should be employing.
Any suggestions or references would be greatly appreciated.
This is my suggestion:
Create a new repository to hold all your binaries and output artifacts. In your central repository, the one that xlinks to the external 'Third party' repository, create a new xlink to this brand new "Binaries" repository.
Now you will need to change your build system to copy all the binaries and artifacts to this xlinked repository. A "cm ci -a" command will commit all the changed binaries; finally, label the changeset with the "cm label" command. This label will help the QA team to test specific releases.
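For example, a post-build step could look roughly like the sketch below. The paths, comment, label name and changeset number are placeholders, and the exact "cm label" arguments can differ between Plastic SCM versions, so check "cm label --help" on your server:

    # copy the build output into the xlinked Binaries folder of the workspace
    cp build/output/*.dll build/output/*.exe Binaries/
    cm add -R Binaries              # add any binaries that are new to the repository
    cm ci -a -c "QA drop"           # check in all added/changed binaries
    cm label BL_QA_drop cs:1234     # label the changeset created above so QA can fetch it (1234 is a placeholder)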
Remove the view permission for the QA Plastic SCM group at the central repository and the "Third party" repository. Now the QA group will never even know that there is a source code repository. The QA group will have a workspace working against the "Binaries" repository, and your source code/repositories will be safely hidden and not accessible to this QA group.
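If you prefer to script the permission change, something along these lines should work, but treat the exact flags as an assumption and verify them with "cm acl --help"; the group, repository and server names are placeholders:

    # deny the view permission on the source repositories for the QA group (syntax is approximate)
    cm acl --group=QA --denied=+view rep:Central@devMachine:8087
    cm acl --group=QA --denied=+view rep:ThirdParty@devMachine:8087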
You can even host the "Binaries" repository on an external server/machine, so the source code is on devMachine:8087 and the binaries are on a different Plastic SCM server, for example "qaMachine:8087". With this approach the QA Plastic users will be working against a dedicated server.
Related
I have a solution that contains several projects, I want a team to work on this source code, but every developer can only see his own project and can't see the rest of the projects. But he can build and run the whole solution. What solution do you have for this?
You should be able to achieve this whether you are using TFVC or Git as your source control.
A TFS build uses the build service account, not the user who triggers the build. Once the build service account has the appropriate permissions, it will get the source from the TFS server and download it to the build agent.
Permissions in TFS are mutually independent, and they can be granted directly to an individual user or to a group.
In version control permissions, explicit deny takes precedence over administrator group permissions.
You could deny the Read permission for those users, and set the other related build permissions, such as View build definition and Queue builds, to Allow.
Read
Can read the contents of a file or folder. If a user has Read permissions for a folder, the user can see the contents of the folder and the properties of the files in it, even if the user does not have permission to open the files.
Queue builds Can queue new builds.
View build definition Can view build definition(s).
View builds Can view builds belonging to build definition(s).
...
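For the TFVC case, the same deny can also be applied from the command line with "tf permission"; the project path and group name below are placeholders, and you can confirm the option names with "tf permission /?":

    # deny Read on one folder for a specific TFS group
    tf permission /deny:Read /group:"[MyProject]\Outsourced" $/MyProject/Src/SecretComponent
    # show the resulting effective permissions for that group
    tf permission /group:"[MyProject]\Outsourced" $/MyProject/Src/SecretComponent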
However, there are still some differences between Git and TFVC when it comes to controlling project permissions:
In TFVC you can open the web portal and go to the Code tab. There you can right-click on any folder and select permissions. You can apply whatever granularity you like and control inheritance.
In Git you can only control permissions at the repository and branch level.
Besides, you may also need an account with full permissions on the solution and all projects to create the build pipeline. Otherwise, that user may not be able to select the mapping relationship in the Get sources configuration.
We are two developers (me and my friend) working on an ASP.NET MVC project in Visual Studio 2017 with TFS Online (visualstudio.com, TFVC).
Both of us have full access to all files for developing, building, viewing and testing.
We want to outsource part of our project to other developers, but we don't want to grant the new developers full permission to the project files.
However, if we don't grant the new developers full permission to all files, they can't build the project to view and test it.
Is there a way to give other developers access to only some files of the project while still letting them build it to view and test?
VSTS/TFS grants users the specific set of permissions that are appropriate for certain roles in your organization. For details of permissions, please refer to this link.
It's not hard to restrict someone's access to some project files. You can just deny the Read permission at the folder level when using TFVC source control.
Read
Can read the contents of a file or folder. If a user has Read permissions for a folder, the user can see the contents of the folder and the properties of the files in it, even if the user does not have permission to open the files.
If you are using the hosted agent to build, then during the Get sources step you (the new developers) are using your own account to pull source from the server to the build agent. Without access to some files in the project, you cannot pull down the entire project source code, and you will certainly not be able to run the build either.
If you are using a private agent to build, it can use the Network Service account as the build service account and queue builds. Just give the build service account the appropriate permissions, and it will be able to pull down all source files in the project and run the build. However, since you have denied access to some of the other files in the project, the new developers are still not able to view and test them.
In this case, as a workaround, I suggest you create a separate branch from your main development branch and put only the files your new developers need to work with in that branch. Deny their access to your main branch. When their work is done, they can build/test on the new branch.
Once everything is fine, they can merge the changes from the new branch into your main branch. You can then build/test the entire project in the main branch again.
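A minimal TFVC sketch of that branch-and-merge-back flow, with placeholder server paths and comments (this assumes a local workspace mapped to both branches):

    # branch only the part the external developers need to work on
    tf branch $/MyProject/Main/ModuleX $/MyProject/External/ModuleX
    tf checkin /comment:"Create branch for external developers"
    # ...external developers work on $/MyProject/External with Read denied on Main...
    # when their work is done, merge it back and build/test Main again
    tf merge $/MyProject/External/ModuleX $/MyProject/Main/ModuleX /recursive
    tf checkin /comment:"Merge external work back into Main"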
With this kind of architecture, it's easier to manage both permissions and teamwork.
I have a problem getting Google's Repo, Gerrit and Jenkins working together.
Our setup:
We have a Repo built out of a few Git repositories, which are all on our Gerrit server. The manifest is also there.
Workflow:
A user makes changes to a few files which sit in different Git repositories in the Repo.
The user commits & pushes the changes to code review.
Problem:
For each of the repositories, a separate Gerrit code review is created.
For each of those code reviews, Jenkins is triggered.
If the changes are interdependent (which they usually are), the Jenkins build will fail, as it only takes the changes in one repository each time.
How do we make Jenkins/Gerrit cooperate so that the full set of changes goes into one build (that will work)?
Thanks
The configuration described here is very problematic, and is sure to cause you grief in the future as well.
In the long run, consider one of these paths:
Reduce dependencies between the components to a minimum (this is preferred)
Merge the code into a single repository
Meanwhile -
Make sure the verification job (in Jenkins) checks out the latest code from all the other repositories (those that are relevant for building this change); see the sketch after this list
Make (non-breaking) changes to one repository at a time (requires some planning, of course)
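As a rough illustration of that first point, the verification job in Jenkins could re-sync the whole Repo checkout before building, then fetch the change under review on top of it. The manifest URL, branch and build command are placeholders; GERRIT_PROJECT and GERRIT_REFSPEC are environment variables provided by the Gerrit Trigger plugin, and the sketch assumes the local checkout path matches the Gerrit project name:

    # Jenkins "Execute shell" step (sketch only)
    repo init -u ssh://gerrit.example.com/manifest -b master   # placeholder manifest repository
    repo sync -c -j4                                           # latest tip of every project
    # fetch the change under review into its project and build everything together
    cd "$GERRIT_PROJECT" && git fetch origin "$GERRIT_REFSPEC" && git checkout FETCH_HEAD && cd -
    make all                                                   # placeholder for the real multi-repository build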
We are going to be migrating an existing BizTalk codebase into TFS 2015 Update 2. We would like to use Git rather than TFVC for version control.
I have a problem getting my head around the repo-to-project relationship. I would like to run independent TFS projects for managing discrete pieces of work that align with "projects" as run by the business. Instinct tells me that I should create a branch for such projects, but each TFS project seems to need its own repo?
If I stick with a single (BizTalk) TFS project, I will be able to create a branch for each business project but the work items will all be mixed together. This would make helpful reporting tools such as the burndown chart useless.
I guess the other option is to run multiple TFS projects, each with their own repo, and then manually merge between the repos? Maybe have a "Main" project and use its repo as the main branch of the project repos?
How are people managing this problem?
First note that inside a Team Project you can create several Git repositories.
Also, remember that Git branches have repository scope; a branch is not a directory like in TFVC (which is not like Git at all :-) ).
Then if you want to migrate to Git, you need to modularize your projects. Once you have a modular code base, create a Git repository for each module. From each repository you should be able to build and publish a NuGet package. Then resolve inter-module dependencies by means of NuGet packages.
No need to merge anything from repo to repo or from module to module :-) You only need to merge from branch to branch inside the same repository.
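As a rough sketch of the per-module publish step (the module name, version and feed URL are placeholders, not something from the original answer):

    # pack the module and push it to a private NuGet feed
    nuget pack MyModule/MyModule.csproj -Build -Properties Configuration=Release
    nuget push MyModule.1.0.0.nupkg -Source https://pkgs.example.com/nuget/v2 -ApiKey <key>
    # other modules then consume MyModule as a NuGet dependency instead of a cross-repo merge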
A better way is to create multiple repositories in the Git Team Project.
Regarding work items, you may create multiple teams and areas, then put work items in different areas and change the area according to your requirements (the teams can have the same area).
There is a blog that may benefit you: Many Git Repositories, but one Team Project to rule them all
Our builds generally have a mishmash of work items and commits associated with them, and I cannot tell how TFS determines what to add. We are using TFS 2015 Update 3 and TFVC.
When a build runs, it gets code from a location somewhere in the branching and folder structure of TFVC, typically something like "root\dev\src\component name". In this way we avoid getting all of the code in our repository, and we have CI set up so that any change in this folder results in a CI build running.
We also run daily builds which run more tests and create a release package that is used by TFS Release Management. I would expect any changes to code inside the folder defined when setting up the repository for this build to be included in a build's associated changesets, and any changes checked in outside of these branches not to be associated. But this is not the case: we see commits from across the entire project.
Does anyone know how this is supposed to work?
I am not sure if this should go in the question or the answer but I have found some additional information, thanks to the hints provided in the answers below.
It appears that the source settings will take the common root of the mapped folders in the repository settings. So if I have two folders, $/Relo/Dev/B1/src/Claims.Services and $/Relo/Dev/B1/src/PSScripts, it will take the common root $/Relo/Dev/B1/src as the source settings and include any changes from that folder down in the build. Can anyone confirm this? Of course that's not what I want to happen. In the History tab of the build definition, if I look at the diff I can see a field "defaultBranch" in the JSON which seems to be the value that controls this; is there any way to update this field directly?
TFS determines what changesets should be mapped to a build based on the Source Repository Mappings (Build vNext) in the build definition and the last successful build.
So you will see a list of the changesets with files committed under the lowest common base of any of the mapped folders, including all their descendants, since the latest successful build. Whenever you get a successful build (I hope that happens more often than failing ones ;-)) the list will shorten and only show the last check-in.
The example mappings below will result in associating any changeset made to anything below $/Relo/Dev/B1/src (because it is the lowest common base):
$/Relo/Dev/B1/src/Claims.Services
$/Relo/Dev/B1/src/PSScripts
Similarly, it will pick up all the work items related to the above changesets.
This is what should happen. If you see something else, I would have a closer look at the Repository Mappings or Source Settings of the build definition.
@Noel - I guess you are using vNext builds and not XAML builds. Or are you using a mix of XAML and vNext?
In general, a scheduled TFS build will associate all changes that were not already associated with the last successful run of the same build.
I suggest you check once again whether the source folder locations are the same for the CI build and the daily build.