I'm writing an application that pulls changesets from TFS and exports a CSV file describing the latest changes, for use in a script that pushes those changes into ClearCase. "Latest" doesn't necessarily mean the most recent change to each file, however. If a file was added and then edited, I only need to know that the file was added, and to get the latest version so that my script knows how to handle it properly. Most of this is fairly straightforward. I'm getting hung up on files that have been renamed or moved, as I do not want to show that item as being deleted and another item added. To uphold the integrity of ClearCase, the CSV file needs to record that the item was moved or renamed, along with the old location and the new location.
So, the issue I'm having is tracing a renamed (or moved) file back to the previous name or location so that I can correlate it to the new location/name. Where in the API can I get this information?
Here is your answer:
http://social.msdn.microsoft.com/Forums/en/tfsgeneral/thread/f9c7e7b4-b05f-4d3e-b8ea-cfbd316ef737
Using QueryHistory you can find out that an item was renamed; then, using its previous changeset (the one before the changeset that records the rename), you can find its previous name.
You will need to use VersionControlServer.QueryHistory in a manner similar to the following method. Pay particular attention to SlotMode, which must be false in order for renames to be followed.
private static void PrintNames(VersionControlServer vcs, Change change)
{
    // The key here is SlotMode: it must be false (disabled) so that renames are followed.
    IEnumerable<Changeset> queryHistory =
        vcs.QueryHistory(
            new QueryHistoryParameters(change.Item.ServerItem, RecursionType.None)
            {
                IncludeChanges = true,
                SlotMode = false,
                VersionEnd = new ChangesetVersionSpec(change.Item.ChangesetId)
            });

    string name = string.Empty;
    var changes = queryHistory.SelectMany(changeset => changeset.Changes);
    foreach (var chng in changes)
    {
        if (name != chng.Item.ServerItem)
        {
            name = chng.Item.ServerItem;
            Console.WriteLine(name);
        }
    }
}
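For context, a minimal sketch of how PrintNames might be driven from a changeset (the changeset lookup and the changesetId variable are my assumptions, not part of the original answer):
// Hypothetical driver: walk a known changeset and print the name history of each renamed item.
// Assumes 'vcs' is an authenticated VersionControlServer and 'changesetId' is already known.
Changeset cs = vcs.GetChangeset(changesetId);
foreach (Change change in cs.Changes)
{
    if (change.ChangeType.HasFlag(ChangeType.Rename))
    {
        PrintNames(vcs, change);
    }
}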
EDIT: Moved the other solution up. What follows worked when I was testing against a pure Rename change, but broke when I tried it against a Rename and Edit change.
This is probably the most efficient way to get the previous name. While it works (TFS 2013 API against a TFS 2012 install), it looks like a bug to me.
private static string GetPreviousServerItem(VersionControlServer vcs, Item item)
{
    Change[] changes = vcs.GetChangesForChangeset(
        item.ChangesetId,
        includeDownloadInfo: false,
        pageSize: int.MaxValue,
        lastItem: new ItemSpec(item.ServerItem, RecursionType.None));

    string previousServerItem = changes.Single().Item.ServerItem;

    //Yep, this passes
    Trace.Assert(item.ServerItem != previousServerItem);

    return previousServerItem;
}
It would be used like this:
if (change.ChangeType.HasFlag(ChangeType.Rename))
{
    string oldServerPath = GetPreviousServerItem(vcs, change.Item);
    // ...
}
All,
I am trying to get the list of all the files in a particular repo in TFS Git using the REST API.
I found the call below, but when I point scopePath at a specific file (e.g. scopePath=/build.xml) it only displays the contents of that one file, build.xml.
What I am trying to do is list all the files in a particular repository without naming a specific file.
Please help me.
https://{accountName}.visualstudio.com/{project}/_apis/git/repositories/{repositoryId}/items?scopePath=/&api-version=4.1
You can use the API below:
https://{accountName}.visualstudio.com/{project}/_apis/git/repositories/{repositoryId}/items?recursionLevel=Full&api-version=4.1
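If you want to call that endpoint from C# rather than a browser, a minimal sketch using HttpClient and a personal access token follows; the PAT handling, the namespaces noted in the comment, and the placeholder values are my assumptions, not part of the original answer:
// Hypothetical example (run inside an async method). Requires System.Net.Http,
// System.Net.Http.Headers and System.Text. Replace the placeholders and the PAT with your own values.
using (var http = new HttpClient())
{
    string pat = "YOUR_PERSONAL_ACCESS_TOKEN";
    http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
        "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));

    string url = "https://{accountName}.visualstudio.com/{project}/_apis/git/repositories/{repositoryId}/items?recursionLevel=Full&api-version=4.1";
    string json = await http.GetStringAsync(url);
    Console.WriteLine(json); // each entry in the "value" array has a "path" for a file or folder
}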
This can also be achieved using the Visual Studio Online client libraries (at the time of writing, rebranded as Azure DevOps): Microsoft.TeamFoundationServer.Client and Microsoft.VisualStudio.Services.Client.
First, you need to create an access token. Then just use the code below:
VssBasicCredential credentials = new VssBasicCredential(String.Empty, "YOUR SECRET CODE HERE");
VssConnection connection = new VssConnection(new Uri("https://yourserverurl.visualstudio.com/"), credentials);

GitHttpClient client = connection.GetClient<GitHttpClient>();

List<GitRepository> repositories = await client.GetRepositoriesAsync(true); // or use GetRepositoryAsync()
var repo = repositories.FirstOrDefault(r => r.Name == "Some.Repo.Name");

GitVersionDescriptor descriptor = new GitVersionDescriptor()
{
    VersionType = GitVersionType.Branch,
    Version = "develop",
    VersionOptions = GitVersionOptions.None
};

List<GitItem> items = await client.GetItemsAsync(repo.Id, scopePath: "/", recursionLevel: VersionControlRecursionType.Full, versionDescriptor: descriptor);
Under the hood it uses the REST API, so if you want the same result from C#, it's better to delegate the work to the library.
You need to call the items endpoint first, which gives you an objectId (the gitObjectType should be "tree"):
http://{tfsURL}/tfs/{collectionId}/{teamProjectId}/_apis/git/repositories/{repositoryId}/items?recursionLevel=Full&api-version=4.1
Then call the trees endpoint to list the objects in the tree:
http://{tfsURL}/tfs/{collectionId}/{teamProjectId}/_apis/git/repositories/{repositoryId}/trees/{objectId}?api-version=4.1
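A rough sketch of that two-step flow in C# is below; it assumes an authenticated HttpClient (named http, e.g. set up as in the earlier snippet) and uses Newtonsoft.Json for parsing, both of which are my assumptions rather than part of the original answer:
// Hypothetical sketch: find the root tree's objectId from the items response,
// then call the trees endpoint. Requires System.Linq and Newtonsoft.Json.
string baseUrl = "http://{tfsURL}/tfs/{collectionId}/{teamProjectId}/_apis/git/repositories/{repositoryId}";

string itemsJson = await http.GetStringAsync(baseUrl + "/items?recursionLevel=Full&api-version=4.1");
var items = Newtonsoft.Json.Linq.JObject.Parse(itemsJson);
string objectId = (string)items["value"]
    .First(i => (string)i["path"] == "/" && (string)i["gitObjectType"] == "tree")["objectId"];

string treeJson = await http.GetStringAsync(baseUrl + "/trees/" + objectId + "?api-version=4.1");
Console.WriteLine(treeJson); // the "treeEntries" array lists the objects in the tree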
The ASP.NET_SessionState table grows all the time and is already at 18 GB, with no sign of expired sessions ever being deleted.
We have tried to execute DynamoDBSessionStateStore.DeleteExpiredSessions, but it seems to have no effect.
Our system is running fine, sessions are created, and end users are not aware of the issue. However, it doesn't make sense that the table keeps growing all the time...
We have triple-checked permissions/security and everything seems to be in order. We use SDK version 3.1.0. What else remains to be checked?
Your table is over 18 GB, which is quite large (in this context), so having looked at the code for the DeleteExpiredSessions method on GitHub, it does not surprise me that this isn't working.
Here is the code:
public static void DeleteExpiredSessions(IAmazonDynamoDB dbClient, string tableName)
{
    LogInfo("DeleteExpiredSessions");
    Table table = Table.LoadTable(dbClient, tableName, DynamoDBEntryConversion.V1);

    ScanFilter filter = new ScanFilter();
    filter.AddCondition(ATTRIBUTE_EXPIRES, ScanOperator.LessThan, DateTime.Now);

    ScanOperationConfig config = new ScanOperationConfig();
    config.AttributesToGet = new List<string> { ATTRIBUTE_SESSION_ID };
    config.Select = SelectValues.SpecificAttributes;
    config.Filter = filter;

    DocumentBatchWrite batchWrite = table.CreateBatchWrite();
    Search search = table.Scan(config);

    do
    {
        List<Document> page = search.GetNextSet();
        foreach (var document in page)
        {
            batchWrite.AddItemToDelete(document);
        }
    } while (!search.IsDone);

    batchWrite.Execute();
}
The above algorithm is executed in two parts. First it performs a Search (table scan) using a filter to identify all expired records. These are then added to a DocumentBatchWrite request that is executed as the second step.
Since your table is so large, the table scan step will take a very, very long time to complete before a single record is deleted. Basically, the above algorithm is useful for lazy garbage collection on small tables, but it does not scale well for large tables.
As best I can tell, the execution never actually gets past the table scan, and you may be consuming all of the read throughput of your table.
A possible solution would be to run a slightly modified version of the above method yourself. You would want to call the DocumentBatchWrite inside the do-while loop so that records start to be deleted before the table scan is concluded.
That would look like:
public static void DeleteExpiredSessions(IAmazonDynamoDB dbClient, string tableName)
{
    LogInfo("DeleteExpiredSessions");
    Table table = Table.LoadTable(dbClient, tableName, DynamoDBEntryConversion.V1);

    ScanFilter filter = new ScanFilter();
    filter.AddCondition(ATTRIBUTE_EXPIRES, ScanOperator.LessThan, DateTime.Now);

    ScanOperationConfig config = new ScanOperationConfig();
    config.AttributesToGet = new List<string> { ATTRIBUTE_SESSION_ID };
    config.Select = SelectValues.SpecificAttributes;
    config.Filter = filter;

    Search search = table.Scan(config);

    do
    {
        // Perform a batch delete for each page returned
        DocumentBatchWrite batchWrite = table.CreateBatchWrite();
        List<Document> page = search.GetNextSet();
        foreach (var document in page)
        {
            batchWrite.AddItemToDelete(document);
        }
        batchWrite.Execute();
    } while (!search.IsDone);
}
Note: I have not tested the above code. It is just a simple modification of the open-source version, so it should work, but it would need to be tested to ensure pagination behaves correctly on a table whose records are being deleted while it is being scanned.
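For completeness, a minimal sketch of how the modified method might be invoked (the client construction below assumes credentials and region come from the standard SDK configuration, and the table name is the default one from the question):
// Hypothetical caller for the modified cleanup method.
IAmazonDynamoDB dbClient = new AmazonDynamoDBClient();
DeleteExpiredSessions(dbClient, "ASP.NET_SessionState");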
I've written a console app using the Umbraco (7.1.4) ContentService API to move some nodes and rename them as part of a site redesign. It all works fine, except that when I rename a document the 'Link to Document' doesn't change. The code is adapted from https://github.com/sitereactor/umbraco-console-example.
private static void MoveNode(IContentService contentService, int nodeId, int newParentId, string newname)
{
    //Get the Root Content
    var nodeContent = contentService.GetByIds(nodeId.AsEnumerableOfOne()).First();
    nodeContent.Name = newname;
    contentService.Move(nodeContent, newParentId);
    var status = contentService.SaveAndPublishWithStatus(nodeContent);
    Console.WriteLine(status);
}
Status is True and the page name is changed when I look at it in the back office, but the 'Link to Document' doesn't change. Now if I use
var status = contentService.PublishWithChildrenWithStatus(nodeContent);
then it works, but it takes a lot longer (minutes). Yet if I change the name in the back office, it only takes seconds and the links are updated correctly. Is there another way to rename a document without publishing all the children?
(I've left out a bit of code above: sometimes it moves, sometimes it just renames, but in either case I have to publish with the children to get it to work.)
It seems this happens because the XML cache file has not been updated.
Have a look here for a few ways of doing it:
http://our.umbraco.org/wiki/reference/api-cheatsheet/publishing-and-republishing
Or the quickest fix is just to delete App_Data\umbraco.config (the XML cache file). Probably not recommended for production, but it gets things working quickly in dev.
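If you'd rather refresh the cache from code than delete the file, the cheatsheet linked above describes the legacy umbraco.library calls; a rough sketch is below. I haven't verified this against 7.1.4 or from a console app outside the web context, so treat it as an assumption to test:
// Hypothetical sketch: after saving/publishing, refresh the XML cache so
// 'Link to Document' picks up the new name.
var status = contentService.SaveAndPublishWithStatus(nodeContent);
umbraco.library.UpdateDocumentCache(nodeContent.Id); // refresh just this node
// or, heavier-handed: umbraco.library.RefreshContent(); // republish the whole cache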
I am looking to write a small Firefox add-on that detects when downloaded files are (or have been) deleted locally and removes the corresponding entries from the Firefox download list.
Can anybody point me to the relevant API for manipulating the download list? I cannot seem to find it.
The relevant API is PlacesUtils, which abstracts the complexity of the Places database.
If your code runs in the context of a chrome window then you get a PlacesUtils global variable for free. Otherwise (bootstrapped, Add-on SDK, whatever) you have to import PlacesUtils.jsm.
Cu.import("resource://gre/modules/PlacesUtils.jsm");
As far as Places is concerned, downloaded files are nothing more than a special kind of visited page, annotated accordingly. It's a matter of just one line of code to get an array of all downloaded files.
var results = PlacesUtils.annotations.getAnnotationsWithName("downloads/destinationFileURI");
Since we asked for the destinationFileURI annotation, each element of the results array holds the download location in its annotationValue property, as a file: URI spec string.
With that you can check whether the file actually exists:
function getFileFromURIspec(fileurispec){
  // if Services is not available in your context: Cu.import("resource://gre/modules/Services.jsm");
  var filehandler = Services.io.getProtocolHandler("file").QueryInterface(Ci.nsIFileProtocolHandler);
  try{
    return filehandler.getFileFromURLSpec(fileurispec);
  }
  catch(e){
    return null;
  }
}
getFileFromURIspec will return an instance of nsIFile, or null if the spec is invalid, which shouldn't happen in this case, but a sanity check never hurts. With that you can call the exists() method, and if it returns false then the associated page entry in Places is eligible for removal. We can tell which page that is by its uri, which conveniently is also a property of each element of results.
PlacesUtils.bhistory.removePage(result.uri);
To sum it up:
var results = PlacesUtils.annotations.getAnnotationsWithName("downloads/destinationFileURI");
results.forEach(function(result){
  var file = getFileFromURIspec(result.annotationValue);
  if(!file){
    // I don't know how you should treat this edge case
    // ask the user, just log, remove, some combination?
  }
  else if(!file.exists()){
    PlacesUtils.bhistory.removePage(result.uri);
  }
});
I was hoping that I could pass a DateVersionSpec into VersionControlServer.DownloadFile() but it doesn't work. It tells me that the item doesn't exist at that version, even though the file existed in source on the date passed.
Do I need to query the Item history just so I can figure out what version the file was at on the date in question? Use QueryHistory(...) method?
My current code:
version = new DateVersionSpec(date);
var changeSets = this.vcServer.QueryHistory(remoteFile, VersionSpec.Latest, 0,
    RecursionType.None, user, version, version, 50, true, false);

if (changeSets == null)
{
    throw new Exception("Failed to find...");
}

foreach (var item in changeSets)
{
}
Currently I'm not getting anything back when I pull the changeSets enumerable.
I'm using code that's a lot like this: http://blogs.microsoft.co.il/blogs/srlteam/archive/2009/06/14/how-to-get-a-file-history-in-tfs-source-control-using-code.aspx
Update: the code that I have is pretty close (practically identical to the code from the post), but it dies if the file was added before the date passed in and hasn't been changed since, i.e. it only has one change and that's an add.
This got me what I was looking for in my app. If it doesn't work, check to make sure your file path is correct; that's what I had wrong the first time around.
this.vcServer.GetItem(remoteFile, new DateVersionSpec(date));
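As a possible follow-up (a sketch, assuming the item really existed at that date; the local path below is a placeholder), once GetItem has resolved the version, the returned Item can download its own contents, which avoids passing a DateVersionSpec to VersionControlServer.DownloadFile() at all:
// Hypothetical continuation: resolve the item at the given date, then download that version.
// @"C:\temp\myfile.txt" is a placeholder local path.
Item item = this.vcServer.GetItem(remoteFile, new DateVersionSpec(date));
item.DownloadFile(@"C:\temp\myfile.txt");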