I have a question about copying directories between streams in UCM.
For example, I have two streams in the same project; let's call them A and B.
In stream A, a directory "my folder" was created, which contains several subdirectories and files.
I want to copy the directory "my folder", including its entire content, to stream B.
Can someone help me? What alternatives do I have?
With Merge Manager it does not work, because I have to choose which directories/files from stream B should be merged (but "my folder" only exists on stream A); creating the new directory and files from scratch does not work either, because of the "evil twin" problem.
You first need to merge the parent folder that contains "my folder".
You can do so simply by launching the version tree of that parent folder in a view on stream B, selecting the version of that folder on stream A (in the version tree), and right-clicking "Merge to" the current version on stream B.
That will create a new version of the parent folder (on stream B) which references "my folder": check in that version to conclude that first merge.
Then you can launch a merge in the view on stream B, indicating that you want to merge from stream A.
That will merge "my folder" and the sub-folders of "my folder" into the destination view on stream B.
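For reference, a rough cleartool equivalent run in a view on stream B (the branch name StreamA_branch and the folder names are placeholders for your own stream branch and paths):
# merge only the parent directory element from stream A's branch
cleartool checkout -nc "parent_folder"
cleartool merge -to "parent_folder" -version .../StreamA_branch/LATEST
cleartool checkin -nc "parent_folder"
# then merge "my folder" and everything under it from stream A
cleartool findmerge "parent_folder/my folder" -fversion .../StreamA_branch/LATEST -merge -gmerge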
The OP AlexM adds in the comments:
forcing the merge did not work because on stream B it kept saying the merge was already done (I had previously merged several files from stream A);
I created a stream C at the same level as stream A and did the same steps on stream C: version tree, start merge for the parent folder, choose "my folder" as the folder to be merged...
That way I could copy "my folder"; now I am merging all files from stream A to stream C.
I am trying to copy folders and their files from an FTP server into Azure data storage, by looping through the folders and, for each folder, copying its content into a container that has the folder's name. For this I used a Get Metadata, a ForEach, and a Copy Data activity. For now I am able to copy all the folders into the same container, but what I want is to have multiple containers named after the folders in the output, each one containing the files from the FTP.
PS: I am still new to Azure Data Factory.
Any advice or help is very welcome :)
You need to add a Get Metadata activity before the ForEach. The Get Metadata activity will get the files in the current directory and pass them to the ForEach; you connect it to your Blob storage folder.
Try something like this.
Set up a JSON source:
Create a pipeline and use a Get Metadata activity to list all the folders in the container/storage. In the field list, select childItems.
Feed the Get Metadata output (the list of container contents) into a Filter activity and keep only the folders.
Pass the list of folders to a ForEach activity.
Inside the ForEach, set the current item() to a variable and use it as a parameter for a parameterized source dataset, which is a clone of the original source.
This will list the files from each folder in your container.
Feed this to another Filter activity, this time filtering on files: use @equals(item().type,'File').
Now create another pipeline in which the copy activity will run for each file whose name matches the name of its parent folder.
Create parameters in the new child pipeline to receive the current folder and file names in the iteration from the parent pipeline, so it can evaluate what to copy.
Inside the child pipeline, start with a ForEach whose input is the list of file names inside the folder, received through the parameter: @pipeline().parameters.filesnamesreceived
Use a variable to hold the current item and an If Condition to check whether the file name and the folder name match (see the expression sketch after these steps).
Note: consider dropping the file extension as per your requirement, since the metadata holds the complete file name along with its extension.
If True (the names match), copy from source to sink.
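As a rough sketch of the key expressions (the parameter name foldername and the .json extension are placeholders; adjust them to your own parameter names and file types):
Filter on folders (first Filter activity): @equals(item().type, 'Folder')
Filter on files (second Filter activity): @equals(item().type, 'File')
ForEach items in the child pipeline: @pipeline().parameters.filesnamesreceived
If Condition, comparing the file name without its extension to the folder name: @equals(replace(item().name, '.json', ''), pipeline().parameters.foldername)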
Here the hierarchy is preserved, and you can also use "Prefix" to specify the file path, since the copy preserves the hierarchy. Prefix uses the service-side filter for Blob storage, which performs better than a wildcard filter.
The sub-path after the last "/" in the prefix is preserved. For example, if you have the source container/folder/subfolder/file.txt and configure the prefix as folder/sub, the preserved file path is subfolder/file.txt, which fits your scenario.
This copies files like /source/source/source.json to /sink/source/source.json.
AzCopy is a simpler solution for this than Data Factory, and a dry run can be used to check which files/folders would be copied. With the Azure CLI, copying a single blob looks like this:
az storage blob copy start \
--destination-container destContainer \
--destination-blob myBlob \
--source-account-name mySourceAccount \
--source-account-key mySourceAccountKey \
--source-container myContainer \
--source-blob myBlob
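To copy a whole folder recursively rather than a single blob, a rough AzCopy v10 sketch looks like this (account names, container names and SAS tokens are placeholders; --dry-run only lists what would be copied):
azcopy copy "https://<source-account>.blob.core.windows.net/<source-container>/<folder>?<SAS>" \
    "https://<dest-account>.blob.core.windows.net/<dest-container>?<SAS>" \
    --recursive \
    --dry-run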
I am working on a project that contains multiple files with the same file name. I am using Git to maintain local versions of my changes. After staging the modified files, I notice that files with the same name appear with status "R", implying that one file replaces another with the same name in a different directory tree. How do I make sure both are committed without one replacing the other? I could not find relevant help material on this in the Git documentation.
Since this is proprietary code, I am pasting only a sample directory structure:
M <Proj_Root_Folder>/<dirA>/<dirAA>/file1.h
M <Proj_Root_Folder>/<dirA>/<dirAA>/file2.h
M <Proj_Root_Folder>/<dirA>/<dirAA>/file3.h
R <Proj_Root_Folder>/<dirB>/<dirBA>/file4.h -> <Proj_Root_Folder>/<dirA>/<dirAA>/file4.h
In Git, "R" means "rename". Git thinks that <Proj_Root_Folder>/<dirA>/<dirAA>/file4.h is a file that has been moved from where it was originally, most likely because its content looks similar to the file at the old path. Rename detection is only a heuristic applied when Git displays a status or diff; nothing about the rename is recorded in the commit itself. The "R" line means a staged deletion of the old path has been paired with a staged addition of the new path. If both files are supposed to exist, make sure both paths are present in the working tree and staged with git add: once there is no staged deletion, no rename is reported and both files are committed.
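If you just want the status output to stop pairing the two paths as a rename, rename detection can be switched off for display (the git status options require Git 2.18 or later):
git status --no-renames            # show a separate delete/add pair instead of "R"
git config status.renames false    # or disable rename detection for git status permanently
git diff --staged --no-renames     # likewise for the staged diff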
I have a folder in TFS, and I want to take a branch of this folder programmatically, creating a new folder and putting the branch under this new folder. Normally, when we do this in TFS, it automatically changes the folder into a branch.
When I use the CreateBranch command it works: it creates the folder and, under this new folder, it creates the new branch. However, the branch appears as a folder in TFS, although I can merge it in VS, so it is working. If I want to change the visualization I have to use a second command, CreateBranchObject. Is it possible to do that in one command?
Folder A-->take new branch
Folder A'(New Folder) --> Branch
Code Sample
int changesetId = vcs.CreateBranch(@"myfolder", @"mynewfolder\newbranch",
    VersionSpec.Latest);
Changeset changeset = vcs.GetChangeset(changesetId);
changeset.Update();
This cannot be done in a single command; you will need to call both in sequence.
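A rough sketch of calling the two in sequence (the server paths are placeholders, and the BranchProperties/ItemIdentifier usage is written from memory, so verify it against your TFS SDK version):
// Step 1: branch the folder; this creates the new folder and its content.
int changesetId = vcs.CreateBranch(@"$/Project/myfolder",
                                   @"$/Project/mynewfolder/newbranch",
                                   VersionSpec.Latest);

// Step 2: promote the branched folder to a branch object so that
// Source Control Explorer visualizes it as a branch rather than a plain folder.
vcs.CreateBranchObject(
    new BranchProperties(new ItemIdentifier(@"$/Project/mynewfolder/newbranch")));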
I would like to extend the GitDiffMargin add-in so that when a user is modifying a file in Visual Studio, she can see the updated diff in the margin even without saving the file.
Is it possible with libgit2sharp to do such a diff between a Tree and another Tree which I would have to build myself?
As far as I understand it, this question can be split into three sub-questions:
How to diff two Trees
How to build a new Tree from an existing one by modifying one file (Blob) in it
How to create a Blob from the content of a file that hasn't been previously saved to disk.
How to diff two Trees:
API: repo.Diff.Compare<T>(Tree, Tree)
Tests: DiffTreeToTreeFixture.cs
How to build a new Tree from an existing one by modifying one file (Blob) in it:
API: TreeDefinition.From(Tree), TreeDefinition.Add(string, Blob, Mode) and repo.ObjectDatabase.CreateTree(TreeDefinition)
Tests: TreeDefinitionFixture.cs and ObjectDatabaseFixture.cs
How to create a Blob from the content of a file that hasn't been previously saved to disk:
API: CreateBlob(Stream, string)
Tests: ObjectDatabaseFixture.cs
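Putting the three pieces together, a minimal sketch (the repository path, file path and buffer content are placeholders; it assumes the unsaved editor content is available as a string):
using System.IO;
using System.Text;
using LibGit2Sharp;

using (var repo = new Repository(@"path\to\repo"))
{
    // Placeholders: relative path of the edited file and its unsaved content.
    string relativePath = "src/Program.cs";
    string unsavedText = "...current editor buffer...";

    // Sub-question 3: create a Blob from content never written to disk.
    Blob newBlob;
    using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(unsavedText)))
    {
        newBlob = repo.ObjectDatabase.CreateBlob(stream, relativePath);
    }

    // Sub-question 2: build a new Tree from HEAD's tree, swapping in the new Blob.
    Tree headTree = repo.Head.Tip.Tree;
    TreeDefinition definition = TreeDefinition.From(headTree)
                                              .Add(relativePath, newBlob, Mode.NonExecutableFile);
    Tree newTree = repo.ObjectDatabase.CreateTree(definition);

    // Sub-question 1: diff the two Trees.
    TreeChanges changes = repo.Diff.Compare<TreeChanges>(headTree, newTree);
    foreach (TreeEntryChanges change in changes)
    {
        System.Console.WriteLine(change.Status + ": " + change.Path);
    }
}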
I have 2 separate solutions in TFS, the structure is shown below:
App.A1
App.A1.Web
App.A2
App.A2.Core
App.A2.Web
Now I want to merge them into one solution. Additionally, I want App.A2.Core to become a common Core project for the two Web projects, so finally it should look like this:
App.B
App.B.Core
App.B.A1.Web
App.B.A2.Web
I'm using TFS. How should this be done so as not to lose history?
Are the following steps:
Creating App.B solution folder in Source Control
Branching App.A1.Web, App.A2.Web and App.A2.Core to this folder
Changing names App.A1.Web -> App.B.A1.Web, App.A2.Web -> App.B.A2.Web, App.A2.Core -> App.B.Core
a good solution?
Renaming and moving files in TFS2010 will disconnect the history but not lose it.
So, if I move file X from folder A to folder B, then the history of X starts again in B. It will, however, still be in A (if you switch on the ability to see deleted files), where you can view the full history of X.
It is a similar story if you rename file X to Y: the history of Y starts fresh, while the "deleted" X still has its full history.
What this means for you is that the history is never lost.
As for what you want to do, here's what I'd do:
Repurpose the App.A1 solution as App.B solution
Move all the App.A2 projects to be with the App.A1/B solution (see the command-line sketch after these steps)
Delete the App.A2 solution
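For reference, moving a project folder under the new solution folder from the command line looks like this (team project and folder names are placeholders; run it from a mapped workspace and check in the pending rename afterwards):
tf rename $/TeamProject/App.A2/App.A2.Core $/TeamProject/App.B/App.B.Core
tf checkin /comment:"Move the Core project under App.B"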