Azure DevOps Client Library - Download Earlier Repository Files - azure-devops-rest-api

I am trying to download earlier versions of repository files (.csproj files) from the commit records that I am obtaining through the Azure DevOps .NET client library (from NuGet). I want to do this so I can access the Assembly Version information in the .csproj file. The GetFile() function I am using to get the current version of the file works fine, but I want to download the older versions of the file from the commit records.
This is the GetFile function.
public string GetFile(string projectName, string repoName, string fileName)
{
    try
    {
        var items = ListItems(projectName, repoName);
        var projectPath = items.FirstOrDefault(i => i.Path.Contains(fileName))?.Path ?? "";
        GitHttpClient gitClient = GetGitHttpClient();
        GitRepository repo = GetRepositoryAsync(projectName, repoName).Result;
        var stream = gitClient.GetItemTextAsync(repo.Id, projectPath).Result;
        var reader = new StreamReader(stream);
        return reader.ReadToEnd();
    }
    catch (Exception e)
    {
        return "";
    }
}
And this function gets the commits.
public async Task<List<GitCommitRef>> GetCommitsAsync(string projectName, string repoName)
{
    var client = GetGitHttpClient();
    var repo = await GetRepositoryAsync(projectName, repoName);
    var gitQueryCommitsCriteria = new GitQueryCommitsCriteria();
    return await client.GetCommitsAsync(repo.Id, gitQueryCommitsCriteria);
}
Now I want to download each version of the .csproj that relates to each of these commits.
Any help anyone can give me with this would be greatly appreciated.
Kind regards, Stuart

The GetFile() function I am using to get the current version of the
file works fine but I want to download the older versions of the file
from the commit records.
I think you're headed in the right direction. I once used the Azure DevOps REST API Items - Get endpoint to successfully get the content of one specific version of a file with the help of the versionDescriptor parameter.
https://dev.azure.com/MyOrgName/MyProjectName/_apis/git/repositories/MyReposName/Items?path=/README.md&versionDescriptor%5BversionOptions%5D=0&versionDescriptor%5BversionType%5D=2&versionDescriptor%5Bversion%5D={Commit ID}&download=true&resolveLfs=true&%24format=octetStream&api-version=5.0-preview.1
Since the functions available in the client library correspond to those in the REST API, the client library must have corresponding parameters to do this.
And here's what I found:
Most of the overloads of the gitClient.GetItemTextAsync() function take GitVersionDescriptor versionDescriptor = null as input, and I think this is what you need.
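For example, here is a minimal sketch of fetching a file as it existed at a specific commit. It reuses the GetGitHttpClient() and GetRepositoryAsync() helpers from your question; the method name and sample path below are just placeholders:

public async Task<string> GetFileAtCommitAsync(string projectName, string repoName, string path, string commitId)
{
    GitHttpClient gitClient = GetGitHttpClient();
    GitRepository repo = await GetRepositoryAsync(projectName, repoName);

    // Point the version descriptor at the commit whose copy of the file we want.
    var versionDescriptor = new GitVersionDescriptor
    {
        VersionType = GitVersionType.Commit,
        Version = commitId
    };

    using (var stream = await gitClient.GetItemTextAsync(repo.Id, path, versionDescriptor: versionDescriptor))
    using (var reader = new StreamReader(stream))
    {
        return await reader.ReadToEndAsync();
    }
}

You could then call it once per commit returned by your GetCommitsAsync():

foreach (var commit in await GetCommitsAsync(projectName, repoName))
{
    // "/src/MyApp/MyApp.csproj" is a placeholder path.
    var csproj = await GetFileAtCommitAsync(projectName, repoName, "/src/MyApp/MyApp.csproj", commit.CommitId);
}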
Hope it helps.

Related

Configure Bitbucket plugin to avoid hardcoding of secure variables

I have developed an Atlassian Bitbucket plugin which globally listens for pushes/PRs and sends repository details to a database using a REST API.
I need to configure the REST API URL and credentials so that my plugin can make API calls. Currently I have hardcoded the REST API URL and credentials in my plugin's properties file, which I don't like, because every time I need to create a package targeting my test environment or production, I have to change them. I also don't like keeping credentials in the source code.
What is the best way to add a configuration screen to the Bitbucket plugin? I would like to have a form for URL, username and password (once I have installed the plugin) and update the storage in Bitbucket only once. If I need to restart Bitbucket, I do not want to lose the saved data.
I tried to search for how to configure a Bitbucket plugin, but I could not find an easy way. I do see multiple approaches, for example adding a "Configure" button which opens a servlet to take user input. That seems very cryptic to me. I also see many recommendations for templates, for example Velocity, Soy etc., which confused me a lot.
Since I am new to plugin development, I have not been able to explore these. Looking for some help.
I have a solution for this case:
In pom.xml, add this additional dependency:
<dependency>
    <groupId>com.atlassian.plugins</groupId>
    <artifactId>atlassian-plugins-core</artifactId>
    <version>5.0.0</version>
    <scope>provided</scope>
</dependency>
Create a new abc-server.properties in the resources/ folder with the following content:
server.username=YOUR_USERNAME
server.password=YOUR_PASSWORD
Read the values from abc-server.properties in your service class as follows:
import com.atlassian.plugin.util.ClassLoaderUtils;
...
final Properties p = new Properties();
final InputStream is = ClassLoaderUtils.getResourceAsStream("abc-server.properties", this.getClass());
try {
    if (is != null) {
        p.load(is);
        String username = p.getProperty("server.username");
        String password = p.getProperty("server.password");
    }
} catch (IOException e) {
    e.printStackTrace();
}
Please try to implement it. Thanks!
One possibility for a simple configuration file is to read somefile.properties from the Bitbucket home directory; this way the config file will survive application updates.
Create somefile.properties in BITBUCKET_HOME:
server.username=YOUR_USERNAME
server.password=YOUR_PASSWORD
Read the properties in your plugin class like this
// imports
import com.atlassian.bitbucket.server.StorageService;
import com.atlassian.plugin.spring.scanner.annotation.imports.ComponentImport;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

private final StorageService storageService;

// StorageService injected via constructor injection
public SomePlugin(@ComponentImport final StorageService storageService) {
    this.storageService = storageService;
}

Properties p = new Properties();
File file = new File(storageService.getHomeDir().toString(), "somefile.properties");
FileInputStream fileInputStream;
try {
    fileInputStream = new FileInputStream(file);
    p.load(fileInputStream);
    String username = p.getProperty("server.username");
    String password = p.getProperty("server.password");
} catch (IOException e) {
    // handle exception
}

Response Header issue on Azure Web Application

I am not sure what is happening here.
When I run my web application locally and click a button to download a file, the file downloads fine, with the Response Header you can see in the attached screenshot where it says "local".
But when I publish the application to an Azure web app, the download button somehow stops working. I checked the Response Header and you can see the difference.
What would cause this problem? The code is the same. Are there any settings that I should be setting for the Azure web app in the Azure portal?
Updated to add code
I have debugged remotely to figure out what is going on, as @Amor suggested.
It is so strange: when I debug on my local machine, the ExportTo action gets hit first, which prepares the TempData; then the Download action gets called with an ajax call once the first action has completed.
However, this is not the case when I debug remotely. Somehow the ExportTo action never gets called; it directly calls the Download action. As a result the TempData null check always finds null.
But why? Why on earth, and how is that possible? Is there something cached somewhere?
I have wiped the content of the web application on the remote side and re-published everything to ensure everything is updated, but still no success.
here is the code:
[HttpPost]
public virtual ActionResult ExportTo(SearchVm searchVm)
{
    var data = _companyService.GetCompanieBySearchTerm(searchVm).Take(150).ToList();
    string handle = Guid.NewGuid().ToString();
    TempData[handle] = data;
    var fileName = $"C-{handle}.xlsx";
    var locationUrl = Url.Action("Download", new { fileGuid = handle, fileName });
    var downloadUrl = Url.Action("Download");
    return Json(new { success = true, locationUrl, guid = handle, downloadUrl }, JsonRequestBehavior.AllowGet);
}
[HttpGet]
public ActionResult Download(string fileGuid, string fileName)
{
    if (TempData[fileGuid] != null)
    {
        var fileNameSafe = $"C-{fileGuid}.xlsx";
        var data = TempData[fileGuid] as List<Company>;
        using (MemoryStream ms = new MemoryStream())
        {
            GridViewExtension.WriteXlsx(GetGridSettings(fileNameSafe), data, ms);
            MVCxSpreadsheet mySpreadsheet = new MVCxSpreadsheet();
            ms.Position = 0;
            mySpreadsheet.Open("myDoc", DocumentFormat.Xlsx, () =>
            {
                return ms;
            });
            mySpreadsheet.Document.Worksheets.Insert(0);
            var image = Server.MapPath("~/images/logo.png");
            var worksheet = mySpreadsheet.Document.Worksheets[0];
            worksheet.Name = "Logo";
            worksheet.Pictures.AddPicture(image, worksheet.Cells[0, 0]);
            byte[] result = mySpreadsheet.SaveCopy(DocumentFormat.Xlsx);
            DocumentManager.CloseDocument("myDoc");
            Response.Clear();
            //Response.AppendHeader("Set-Cookie", "fileDownload=true; path=/");
            Response.ContentType = "application/force-download";
            Response.AddHeader("content-disposition", $"attachment; filename={fileNameSafe}");
            Response.BinaryWrite(result);
            Response.End();
        }
    }
    return new EmptyResult();
}
here is the javascript:
var exportData = function (urlExport) {
    console.log('Export to link in searchController: ' + urlExport);
    ExportButton.SetEnabled(false);
    var objData = new Object();
    var filterData = companyFilterData(objData);
    console.log(filterData);
    $.post(urlExport, filterData)
        .done(function (data) {
            console.log(data.locationUrl);
            window.location.href = data.locationUrl;
        });
};
When the Export button is clicked, the exportData function is called:
var exportToLink = '@Url.Action("ExportTo")';
console.log('Export to link in index: ' + exportToLink);
SearchController.exportData(exportToLink);
As I mentioned, this code works perfectly on the local machine; something weird is happening on the Azure web app such that the ExportTo action breakpoint never gets hit.
I am not sure what else I could change to get the ExportTo action hit.
Based on the Response Header from the Azure Web App, we find that the value of Content-Length is 0. It means that no data has been sent from the web app server side.
In ASP.NET MVC, we can respond with a file in the following ways.
The first way: send a file which is hosted on the server. For this way, please check whether the Excel file has been uploaded to the Azure Web App. You could use Kudu or FTP to browse to the folder and check whether the file exists.
string fileLocation = Server.MapPath("~/Content/myfile.xlsx");
string contentType = System.Net.Mime.MediaTypeNames.Application.Octet;
string fileName = "file.xlsx";
return File(fileLocation, contentType, fileName);
The second way: read the file from any location (database, server or Azure storage) and send the file content to the client side. For this way, please check whether the file has been read successfully. You can remote debug your Azure web app to check whether the file content is being read the right way.
byte[] fileContent = GetFileContent();
string contentType = System.Net.Mime.MediaTypeNames.Application.Octet;
string fileName = "file.xlsx";
return File(fileContent, contentType, fileName);
5/27/2017 Update
Somehow the ExportTo action never gets called. It directly calls the Download action. As a result the TempData null checking is always null.
How many instances is your Web App assigned? If your Web App has multiple instances, the ExportTo request is handled by one instance and the Download request is handled by another instance. Since TempData is stored in the memory of a dedicated instance, it can't be retrieved from another instance. Based on the remote debugging documentation, I found the reason why the ExportTo action never seems to get called:
If you do have multiple web server instances, when you attach to the debugger you'll get a random instance, and you have no way to ensure that subsequent browser requests will go to that instance.
To solve this issue, I suggest you respond with the data directly from the ExportTo action, or save the temp data in Azure Blob Storage, which can be accessed from all instances.
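If you go the Blob Storage route, a minimal sketch of the shared hand-off could look like the following (assuming the classic WindowsAzure.Storage package; the TempFileStore class, container name and connection string name are placeholders, not anything from your project):

using System.Configuration;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Replaces TempData as the hand-off between ExportTo and Download,
// so any instance can serve the follow-up request.
public static class TempFileStore
{
    private static CloudBlobContainer GetContainer()
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("temp-downloads");
        container.CreateIfNotExists();
        return container;
    }

    public static void Save(string handle, byte[] content)
    {
        GetContainer().GetBlockBlobReference(handle)
                      .UploadFromByteArray(content, 0, content.Length);
    }

    public static byte[] Load(string handle)
    {
        using (var ms = new MemoryStream())
        {
            GetContainer().GetBlockBlobReference(handle).DownloadToStream(ms);
            return ms.ToArray();
        }
    }
}

ExportTo would then call TempFileStore.Save(handle, bytes) instead of assigning TempData[handle], and Download would call TempFileStore.Load(fileGuid), so whichever instance receives the second request can find the data.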

How to fetch Build Warning from TFS MS Build or TFS API

I am trying to fetch build warnings from MSBuild (in a build which contains a number of solutions).
Is it possible to fetch them using the TFS API, or from the TFS database using a query?
You could use the TFS REST API below to get the logs of a TFS build. Once you have the logs, you need to extract the warnings yourself; there's no API that returns only the warnings.
Http method: GET
http://servername:8080/tfs/DefaultCollection/teamproject/_apis/build/builds/391/logs?api-version=2.0
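As a rough sketch of that REST route, you could fetch each log and filter the lines yourself; the server URL, credentials and build id below are placeholders, and the response shape is from memory, so treat this as a starting point rather than a verified implementation:

using System;
using System.Net;
using System.Net.Http;
using Newtonsoft.Json.Linq;

static void PrintBuildWarnings()
{
    // Windows authentication against an on-premises TFS server.
    var handler = new HttpClientHandler
    {
        Credentials = new NetworkCredential("username", "password", "domain")
    };
    using (var client = new HttpClient(handler))
    {
        var logsUrl = "http://servername:8080/tfs/DefaultCollection/teamproject" +
                      "/_apis/build/builds/391/logs?api-version=2.0";

        // The logs endpoint returns JSON with a "value" array describing each log.
        var logs = JObject.Parse(client.GetStringAsync(logsUrl).Result)["value"];
        foreach (var log in logs)
        {
            // Each entry carries a "url" field that serves the raw log text.
            var text = client.GetStringAsync((string)log["url"]).Result;
            foreach (var line in text.Split('\n'))
            {
                if (line.Contains("[Warning]"))
                    Console.WriteLine(line);
            }
        }
    }
}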
You could also install the TFS ExtendedClient NuGet package and use the TFS object model API.
Here is a code snippet.
As the comment above said, VNext build definition information can't be reached using the old-version API. Install the TFS ExtendedClient NuGet package in your project and use the method below to get the build warnings.
using Microsoft.VisualStudio.Services.WebApi;
using Microsoft.VisualStudio.Services.Common;
using Microsoft.TeamFoundation.Build.WebApi;
using Microsoft.TeamFoundation.Core.WebApi;
using Microsoft.VisualStudio.Services.Operations;
private static void GetBuildWarnings(int buildId)
{
    var u = new Uri("http://v-tinmo-12r2:8080/tfs/MyCollection/");
    VssCredentials c = new VssCredentials(new Microsoft.VisualStudio.Services.Common.WindowsCredential(new NetworkCredential("username", "password", "domain")));
    var connection = new VssConnection(u, c);
    BuildHttpClient buildServer = connection.GetClient<BuildHttpClient>();
    List<BuildLog> logs = buildServer.GetBuildLogsAsync("teamprojectname", buildId).Result;
    foreach (BuildLog log in logs)
    {
        var list = buildServer.GetBuildLogLinesAsync("A92FB795-A956-45B5-A017-7A7DFB96A040", buildId, log.Id).Result; // A92FB795-A956-45B5-A017-7A7DFB96A040 is the team project GUID
        foreach (var line in list)
        {
            if (line.Contains("[Warning]"))
            {
                Console.WriteLine(line);
            }
        }
    }
    Console.ReadLine();
}

clone a git repository with SSH and libgit2sharp

I'm trying to use the "libgit2sharp" library to clone a repository via an SSH key and... I can't find anything... I can clone via "https", but what I'd like to do is use an SSH key. It's really unclear whether this is supported or not.
As of now, there is an SSH implementation that uses the libssh2 library. You can find it here: LibGit2Sharp - SSH
You should add a LibGit2Sharp-SSH dependency to your project to be able to use it. It is available as a NuGet package: https://www.nuget.org/packages/LibGit2Sharp-SSH
Disclaimer: I haven't found a formal usage guide yet; what I know is from putting together bits and pieces from other user questions on the LibGit2 forums.
From what I understood, you need to create a new credential using either SshUserKeyCredentials or SshAgentCredentials to authenticate using SSH, and pass it as part of CloneOptions.
In the sample code I use "git" as the user, simply because the remote would be something like git@bitbucket.org:project/reponame.git, in which case "git" is the correct user; otherwise you will get an error saying
$exception {"username does not match previous request"}LibGit2Sharp.LibGit2SharpException
The code to clone a repo with SSH should look something like this:
public CloneOptions cloningSSHAuthentication(string username, string path_to_public_key_file, string path_to_private_key_file)
{
    CloneOptions options = new CloneOptions();
    SshUserKeyCredentials credentials = new SshUserKeyCredentials();
    credentials.Username = username;
    credentials.PublicKey = path_to_public_key_file;
    credentials.PrivateKey = path_to_private_key_file;
    credentials.Passphrase = "ssh_key_password";
    options.CredentialsProvider = new LibGit2Sharp.Handlers.CredentialsHandler((url, usernameFromUrl, types) => credentials);
    return options;
}

public CloneOptions cloneSSHAgent(string username)
{
    CloneOptions options = new CloneOptions();
    SshAgentCredentials credentials = new SshAgentCredentials();
    credentials.Username = username;
    var handler = new LibGit2Sharp.Handlers.CredentialsHandler((url, usernameFromUrl, types) => credentials);
    options.CredentialsProvider = handler;
    return options;
}

public void CloneRepo(string remotePath, string localPath)
{
    CloneOptions options = cloningSSHAuthentication("git", "C:\\folder\\id_rsa.pub", "C:\\folder\\id_rsa");
    Repository.Clone(remotePath, localPath, options);
}

Authenticating on TFS 2010

I'm having trouble authenticating as a specific user on MS Team Foundation Server. In older versions it would look like:
teamFoundationCredential = new System.Net.NetworkCredential("<USERNAME>", "<PASSWORD>", "<DOMAIN>");
TeamFoundationServer tfs = new TeamFoundationServer("http://mars:8080/", teamFoundationCredential);
Can someone tell me the equivalent for the 2010 version? So far I have:
ICredentialsProvider cred = null;
tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://asebeast.cpsc.ucalgar.ca:8080/tfs/DefaultCollection"));
tfs.EnsureAuthenticated();
Thanks
For TFS 2010, use the following:
TfsTeamProjectCollection collection = new TfsTeamProjectCollection(
    new Uri("http://asebeast.cpsc.ucalgar.ca:8080/tfs/DefaultCollection"),
    new System.Net.NetworkCredential("domain_name\\user_name", "pwd"));
collection.EnsureAuthenticated();
I've been having the same problem. The above solution doesn't work for me and I really can't figure out why; I keep getting a cast exception.
I spent a day trying to figure this out, so I thought I'd share my current workaround to the problem.
I've created my own internal class that implements ICredentialsProvider, as below:
private class MyCredentials : ICredentialsProvider
{
    private NetworkCredential credentials;

    #region ICredentialsProvider Members

    public MyCredentials(string user, string password, string domain)
    {
        credentials = new NetworkCredential(user, password, domain);
    }

    public ICredentials GetCredentials(Uri uri, ICredentials failedCredentials)
    {
        return credentials;
    }

    public void NotifyCredentialsAuthenticated(Uri uri)
    {
        throw new NotImplementedException();
    }

    #endregion
}
I then instantiate this and pass it in as below:
MyCredentials credentials = new MyCredentials(UserName, Password, Domain);
TfsTeamProjectCollection configurationServer =
    TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
        new Uri(tfsUri), credentials);
Note that I haven't implemented NotifyCredentialsAuthenticated; I'm not sure what it actually does, so I left the NotImplementedException in there so I could catch when it's called, which so far hasn't happened. I'm now successfully connected to TFS.
I've had some problems connecting to our old TFS 2008 server using this method as well, but the thing that solved my case was really simple:
First I defined the TFS URL as:
private const string Tfs2008Url = @"http://servername:8080/tfs/";
static readonly Uri Tfs2008Uri = new Uri(Tfs2008Url);
The path used in the URL is the one we use when connecting via Visual Studio, so I thought it had to be the same in API calls, but when I tried to use it with the following authentication, I got a TF31002 / 404 error:
var collection = new TfsTeamProjectCollection(Tfs2008Uri, new NetworkCredential("AdminUser", "password", "domain_name"));
collection.EnsureAuthenticated();
But when I changed the URL to the TFS root, it authenticated OK!
private const string Tfs2008Url = @"http://servername:8080/";
static readonly Uri Tfs2008Uri = new Uri(Tfs2008Url);
Don't know if that helped anyone, but it sure did the trick for me!
This has worked pretty well for me:
_tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri);
_tfs.ClientCredentials = new TfsClientCredentials(new WindowsCredential(new NetworkCredential("myUserName", "qwerty_pwd", "myDomainName")));
_tfs.EnsureAuthenticated();
