Uploading large files to SharePoint 365 - sharepoint-api

I'm using the CSOM to upload files to a SharePoint 365 site.
I've logged in successfully with claims-based authentication using the methods found here: http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx
But using SaveBinaryDirect on the ClientContext fails with a 405, because the cookies are attached to the request too late.
Another way of using the CSOM to upload files is shown below, but with SP 365 this limits the file size to about 3 MB.
var newFileFromComputer = new FileCreationInformation
{
    Content = fileContents,
    Url = Path.GetFileName(sourceUrl)
};
Microsoft.SharePoint.Client.File uploadedFile = customerFolder.Files.Add(newFileFromComputer);
context.Load(uploadedFile);
context.ExecuteQuery();
Is there ANY way to do this using CSOM, SP 365 and file sizes up to, say, 100 MB?

Indeed, trying to upload a file to SharePoint Online whose size exceeds the 250 MB limit produces the following exception:
Response received was -1,
Microsoft.SharePoint.Client.InvalidClientQueryException: The request message is too big. The server does not allow messages larger than 262144000 bytes.
To circumvent this error, chunked file upload methods were introduced which support uploading files larger than 250 MB. The article linked under References below includes a sample that demonstrates how to use them via the SharePoint CSOM API.
Supported versions:
SharePoint Online
SharePoint On-Premise 2016 or above
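For the CSOM route, the corresponding methods are StartUpload, ContinueUpload and FinishUpload on Microsoft.SharePoint.Client.File. Here is a condensed, untested sketch of that flow; it assumes the 16.x CSOM assemblies, an authenticated ClientContext, and a file that spans more than one chunk (the method and library names below are illustrative):
// Requires: using System; using System.IO; using Microsoft.SharePoint.Client;
public static void CsomChunkedUpload(ClientContext ctx, string libraryTitle, string sourcePath, int chunkSizeBytes)
{
    var uploadId = Guid.NewGuid();
    var fileName = Path.GetFileName(sourcePath);
    var docs = ctx.Web.Lists.GetByTitle(libraryTitle);

    Microsoft.SharePoint.Client.File uploadFile = null;
    long offset = 0;
    var buffer = new byte[chunkSizeBytes];
    using (var fs = System.IO.File.OpenRead(sourcePath))
    {
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            using (var chunk = new MemoryStream(buffer, 0, bytesRead))
            {
                if (uploadFile == null)
                {
                    // First slice: add a zero-byte file, then open the upload session.
                    var info = new FileCreationInformation
                    {
                        ContentStream = new MemoryStream(),
                        Url = fileName,
                        Overwrite = true
                    };
                    uploadFile = docs.RootFolder.Files.Add(info);
                    uploadFile.StartUpload(uploadId, chunk);
                }
                else if (fs.Position == fs.Length)
                {
                    // Last slice: close the session and commit the file.
                    uploadFile = uploadFile.FinishUpload(uploadId, offset, chunk);
                }
                else
                {
                    uploadFile.ContinueUpload(uploadId, offset, chunk);
                }
                ctx.ExecuteQuery();
                offset += bytesRead;
            }
        }
    }
}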
The following example demonstrates how to use the chunked file upload methods via the SharePoint REST API:
// Requires: using System; using System.Net; using Newtonsoft.Json.Linq;
class FileUploader
{
    public static void ChunkedFileUpload(string webUrl, ICredentials credentials, string sourcePath, string targetFolderUrl, int chunkSizeBytes, Action<long, long> chunkUploaded)
    {
        using (var client = new WebClient())
        {
            client.BaseAddress = webUrl;
            client.Credentials = credentials;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            var formDigest = RequestFormDigest(webUrl, credentials);
            client.Headers.Add("X-RequestDigest", formDigest);
            // Create an empty file first.
            var fileName = System.IO.Path.GetFileName(sourcePath);
            var createFileRequestUrl = string.Format("/_api/web/getfolderbyserverrelativeurl('{0}')/files/add(url='{1}',overwrite=true)", targetFolderUrl, fileName);
            client.UploadString(createFileRequestUrl, "POST");
            // Build the server-relative URL by hand; Path.Combine would emit a backslash on Windows.
            var targetUrl = targetFolderUrl.TrimEnd('/') + "/" + fileName;
            var firstChunk = true;
            var uploadId = Guid.NewGuid();
            var offset = 0L;
            // Note: this sample assumes the file spans more than one chunk; for a file smaller
            // than chunkSizeBytes, startupload would never be followed by finishupload.
            using (var inputStream = System.IO.File.OpenRead(sourcePath))
            {
                var buffer = new byte[chunkSizeBytes];
                int bytesRead;
                while ((bytesRead = inputStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    if (firstChunk)
                    {
                        // Open the upload session with the first chunk.
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/startupload(uploadId=guid'{1}')", targetUrl, uploadId);
                        client.UploadData(endpointUrl, buffer);
                        firstChunk = false;
                    }
                    else if (inputStream.Position == inputStream.Length)
                    {
                        // Final chunk: only the bytes actually read belong in the payload.
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/finishupload(uploadId=guid'{1}',fileOffset={2})", targetUrl, uploadId, offset);
                        var finalBuffer = new byte[bytesRead];
                        Array.Copy(buffer, finalBuffer, finalBuffer.Length);
                        client.UploadData(endpointUrl, finalBuffer);
                    }
                    else
                    {
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/continueupload(uploadId=guid'{1}',fileOffset={2})", targetUrl, uploadId, offset);
                        client.UploadData(endpointUrl, buffer);
                    }
                    offset += bytesRead;
                    chunkUploaded(offset, inputStream.Length);
                }
            }
        }
    }

    public static string RequestFormDigest(string webUrl, ICredentials credentials)
    {
        using (var client = new WebClient())
        {
            client.BaseAddress = webUrl;
            client.Credentials = credentials;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            client.Headers.Add("Accept", "application/json; odata=verbose");
            var endpointUrl = "/_api/contextinfo";
            var content = client.UploadString(endpointUrl, "POST");
            var data = JObject.Parse(content);
            return data["d"]["GetContextWebInformation"]["FormDigestValue"].ToString();
        }
    }
}
Source code: FileUploader.cs
Usage
var userCredentials = GetCredentials(userName, password);
var sourcePath = @"C:\temp\jellyfish-25-mbps-hd-hevc.mkv"; // local file path
var targetFolderUrl = "/Shared Documents"; // library-relative URL
FileUploader.ChunkedFileUpload(webUrl,
    userCredentials,
    sourcePath,
    targetFolderUrl,
    1024 * 1024 * 5, // 5 MB
    (offset, size) =>
    {
        Console.WriteLine("{0:P} completed", offset / (float)size);
    });
References
Always use File Chunking to Upload Files > 250 MB to SharePoint Online

Well, I haven't found a way to do it with the CSOM, and that is truly infuriating.
A workaround was posted by SEvans in the comments on http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx .
Basically, just do an HTTP PUT and attach the cookie collection from the claims-based authentication. SEvans' posted workaround is below:
Great piece of code Wictor. As others have noted, SaveBinaryDirect does not work correctly, as the FedAuth cookies never get attached to the HTTP PUT request that the method generates.
Here's my workaround:
// "url" is the full destination path (including filename, i.e. https://mysite.sharepoint.com/Documents/Test.txt)
// "cookie" is the CookieContainer generated from Wichtor's code
// "data" is the byte array containing the files contents (used a FileStream to load)
System.Net.ServicePointManager.Expect100Continue = false;
HttpWebRequest request = HttpWebRequest.Create(url) as HttpWebRequest;
request.Method = "PUT";
request.Accept = "*/*";
request.ContentType = "multipart/form-data; charset=utf-8";
request.CookieContainer = cookie;
request.AllowAutoRedirect = false;
request.UserAgent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)";
request.Headers.Add("Accept-Language", "en-us");
request.Headers.Add("Translate", "F");
request.Headers.Add("Cache-Control", "no-cache");
request.ContentLength = data.Length;
using (Stream req = request.GetRequestStream())
{
    req.Write(data, 0, data.Length);
}
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream res = response.GetResponseStream();
StreamReader rdr = new StreamReader(res);
string rawResponse = rdr.ReadToEnd();
response.Close();
rdr.Close();

Related

How to upload a small file plus metadata with GraphServiceClient to OneDrive with a single POST request?

I would like to upload small files with metadata (DriveItem) attached so that the LastModifiedDateTime property is set properly.
First, my current workaround is this:
var graphFileSystemInfo = new Microsoft.Graph.FileSystemInfo()
{
    CreatedDateTime = fileSystemInfo.CreationTimeUtc,
    LastAccessedDateTime = fileSystemInfo.LastAccessTimeUtc,
    LastModifiedDateTime = fileSystemInfo.LastWriteTimeUtc
};
using (var stream = System.IO.File.OpenRead(localPath))
{
    if (fileSystemInfo.Length <= 4 * 1024 * 1024) // file.Length <= 4 MB
    {
        var driveItem = new DriveItem()
        {
            File = new File(),
            FileSystemInfo = graphFileSystemInfo,
            Name = Path.GetFileName(item.Path)
        };
        try
        {
            var newDriveItem = await graphClient.Me.Drive.Root.ItemWithPath(item.Path).Content.Request().PutAsync<DriveItem>(stream);
            await graphClient.Me.Drive.Items[newDriveItem.Id].Request().UpdateAsync(driveItem);
        }
        catch (Exception ex)
        {
            throw;
        }
    }
    else
    {
        // large file upload
    }
}
This code works by first uploading the content via PutAsync and then updating the metadata via UpdateAsync. I tried to do it the other way round (as suggested here), but then I get an error that a file without content cannot be created. If I then add content to the DriveItem.Content property, the next error is that the stream's ReadTimeout and WriteTimeout properties cannot be read. With a wrapper class for the FileStream I can overcome this, but then I get the next error: A stream property 'content' has a value in the payload. In OData, stream property must not have a value, it must only use property annotations.
While searching, I found that there is another way to upload data, called multipart upload (link). Following that description, I tried to use the GraphServiceClient to create such a request, but it seems this is only fully implemented for OneNote items. I took this code as a template and created the following function to mimic the OneNote behavior:
public static async Task UploadSmallFile(GraphServiceClient graphClient, DriveItem driveItem, Stream stream)
{
    var jsondata = JsonConvert.SerializeObject(driveItem);
    // Create the metadata part.
    StringContent stringContent = new StringContent(jsondata, Encoding.UTF8, "application/json");
    stringContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("related");
    stringContent.Headers.ContentDisposition.Name = "Metadata";
    stringContent.Headers.ContentType = new MediaTypeHeaderValue("application/json");
    // Create the data part.
    var streamContent = new StreamContent(stream);
    streamContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("related");
    streamContent.Headers.ContentDisposition.Name = "Data";
    streamContent.Headers.ContentType = new MediaTypeHeaderValue("text/plain");
    // Put the multiparts together.
    string boundary = "MultiPartBoundary32541";
    MultipartContent multiPartContent = new MultipartContent("related", boundary);
    multiPartContent.Add(stringContent);
    multiPartContent.Add(streamContent);
    var requestUrl = graphClient.Me.Drive.Items["F4C4DC6C33B9D421!103"].Children.Request().RequestUrl;
    // Create the request message and add the content.
    HttpRequestMessage hrm = new HttpRequestMessage(HttpMethod.Post, requestUrl);
    hrm.Content = multiPartContent;
    // Send the request and get the response.
    var response = await graphClient.HttpProvider.SendAsync(hrm);
}
With this code, I get the error Entity only allows writes with a JSON Content-Type header.
What am I doing wrong?
Not sure why the provided error occurs; your example appears to be valid and corresponds to the Request body example.
But an alternative option could be considered for this matter: since Microsoft Graph supports JSON batching, the following example demonstrates how to upload a file and update its metadata within a single request:
POST https://graph.microsoft.com/v1.0/$batch
Accept: application/json
Content-Type: application/json

{
  "requests": [
    {
      "id": "1",
      "method": "PUT",
      "url": "/me/drive/root:/Sample.docx:/content",
      "headers": {
        "Content-Type": "application/octet-stream"
      }
    },
    {
      "id": "2",
      "method": "PATCH",
      "url": "/me/drive/root:/Sample.docx:",
      "headers": {
        "Content-Type": "application/json; charset=utf-8"
      },
      "body": {
        "fileSystemInfo": {
          "lastModifiedDateTime": "2019-08-09T00:49:37.7758742+03:00"
        }
      },
      "dependsOn": ["1"]
    }
  ]
}
Here is a C# example:
var bytes = System.IO.File.ReadAllBytes(path);
var stream = new MemoryStream(bytes);
var batchRequest = new BatchRequest();
// 1.1 Construct the upload file query.
var uploadRequest = graphClient.Me
    .Drive
    .Root
    .ItemWithPath(System.IO.Path.GetFileName(path))
    .Content
    .Request();
batchRequest.AddQuery(uploadRequest, HttpMethod.Put, new StreamContent(stream));
// 1.2 Construct the update driveItem query.
var updateRequest = graphClient.Me
    .Drive
    .Root
    .ItemWithPath(System.IO.Path.GetFileName(path))
    .Request();
var driveItem = new DriveItem()
{
    FileSystemInfo = new FileSystemInfo()
    {
        LastModifiedDateTime = DateTimeOffset.UtcNow.AddDays(-1)
    }
};
var jsonPayload = new StringContent(graphClient.HttpProvider.Serializer.SerializeObject(driveItem), Encoding.UTF8, "application/json");
batchRequest.AddQuery(updateRequest, new HttpMethod("PATCH"), jsonPayload, true, typeof(Microsoft.Graph.DriveItem));
// 2. Execute the batch request.
var result = await graphClient.SendBatchAsync(batchRequest);
var updatedDriveItem = result[1] as DriveItem;
Console.WriteLine(updatedDriveItem.LastModifiedDateTime);
where SendBatchAsync is an extension method that implements JSON batching support for the Microsoft Graph .NET Client Library.
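If you would rather not maintain a custom extension, newer versions of the Microsoft Graph .NET SDK ship batching types of their own (BatchRequestContent / BatchResponseContent in Microsoft.Graph.Core). Below is a rough, untested sketch of the same two-step batch with those types, reusing stream, driveItem and a fileName from the example above; the exact helpers available vary by SDK version, so the PATCH response is deserialized by hand here:
// Requires: using System.Net.Http; using System.Net.Http.Headers; using System.Text; using Microsoft.Graph;
// Note: the $batch endpoint expects JSON bodies; a binary PUT payload may need
// base64 encoding per the JSON batching documentation — verify for your case.
var uploadMessage = new HttpRequestMessage(HttpMethod.Put,
    graphClient.Me.Drive.Root.ItemWithPath(fileName).Content.Request().RequestUrl)
{
    Content = new StreamContent(stream)
};
uploadMessage.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

var updateMessage = new HttpRequestMessage(new HttpMethod("PATCH"),
    graphClient.Me.Drive.Root.ItemWithPath(fileName).Request().RequestUrl)
{
    Content = new StringContent(
        graphClient.HttpProvider.Serializer.SerializeObject(driveItem),
        Encoding.UTF8, "application/json")
};

var batch = new BatchRequestContent();
// AddBatchRequestStep(HttpRequestMessage) returns the generated step id.
string uploadStepId = batch.AddBatchRequestStep(uploadMessage);
batch.AddBatchRequestStep(new BatchRequestStep("2", updateMessage, new List<string> { uploadStepId }));

BatchResponseContent batchResponse = await graphClient.Batch.Request().PostAsync(batch);
HttpResponseMessage patchResponse = await batchResponse.GetResponseByIdAsync("2");
var updated = graphClient.HttpProvider.Serializer
    .DeserializeObject<DriveItem>(await patchResponse.Content.ReadAsStringAsync());
Console.WriteLine(updated.LastModifiedDateTime);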

Object reference not set to an object while file upload in OneDrive

I am using the Microsoft Graph SDK to upload a file in chunks to OneDrive. I am using the code below to upload the file:
try
{
    GraphServiceClient graphClient = this.GetGraphServiceClient(accessToken);
    string fileName = Path.GetFileName(srcFilePath);
    using (var fileContentStream = System.IO.File.Open(srcFilePath, System.IO.FileMode.Open))
    {
        var uploadSession = await graphClient.Me.Drive.Root.ItemWithPath(fileName).CreateUploadSession().Request().PostAsync();
        var maxChunkSize = 5 * 1024 * 1024;
        var provider = new ChunkedUploadProvider(uploadSession, graphClient, fileContentStream, maxChunkSize);
        var chunkRequests = provider.GetUploadChunkRequests();
        var readBuffer = new byte[maxChunkSize];
        var trackedExceptions = new List<Exception>();
        Microsoft.Graph.DriveItem itemResult = null;
        foreach (var request in chunkRequests)
        {
            var result = await provider.GetChunkRequestResponseAsync(request, readBuffer, trackedExceptions);
            if (result.UploadSucceeded)
            {
                itemResult = result.ItemResponse;
            }
        }
    }
}
catch (Microsoft.Graph.ServiceException e)
{
}
catch (Exception ex)
{
}
The above code works fine with normal file names. However, when I try to upload a file named Test#123.pdf, an "Object reference not set to an instance of an object" exception is thrown at the line var provider = new ChunkedUploadProvider(uploadSession, graphClient, fileContentStream, maxChunkSize);
Is this a limitation of the OneDrive SDK, or am I not passing the parameters correctly?
The # sign has a special meaning in a URL. Before you can use it, you'll need to URL-encode the file name: Test%23123.pdf.
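For illustration, the encoding can be delegated to the framework rather than done by hand; a minimal sketch, reusing the names from the question:
// Escape reserved URL characters such as '#' before building the request path.
string fileName = "Test#123.pdf";
string encodedName = Uri.EscapeDataString(fileName); // "Test%23123.pdf"
var uploadSession = await graphClient.Me.Drive.Root
    .ItemWithPath(encodedName)
    .CreateUploadSession()
    .Request()
    .PostAsync();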

The request was aborted error while writing to request stream

public HttpWebResponse PushFileToWistia(byte[] contentFileByteArray, string fileName)
{
    StringBuilder postDataBuilder = new StringBuilder();
    postDataBuilder.Append("I am appending all the wistia config and setting here");
    byte[] postData = null;
    using (MemoryStream postDataStream = new MemoryStream())
    {
        byte[] postDataBuffer = Encoding.UTF8.GetBytes(postDataBuilder.ToString());
        postDataStream.Write(postDataBuffer, 0, postDataBuffer.Length);
        postDataStream.Write(contentFileByteArray, 0, contentFileByteArray.Length);
        postDataBuffer = Encoding.UTF8.GetBytes("\r\n--" + boundary + "--");
        postDataStream.Write(postDataBuffer, 0, postDataBuffer.Length);
        postData = postDataStream.ToArray();
    }
    ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(AppConfig.WistiaCustomCourseBucket);
    request.Method = "POST";
    request.Expect = String.Empty;
    request.Headers.Clear();
    request.ContentType = "multipart/form-data; boundary=" + boundary;
    request.ContentLength = postData.Length;
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(postData, 0, postData.Length); // for files > 100 MB this call throws: "The request was aborted: The request was canceled."
    requestStream.Flush();
    requestStream.Close();
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return response;
}
The above code works for mp4 video files smaller than 50 MB, but when I try to upload a 100 MB file it throws an exception (The request was aborted.). I need to support file sizes up to 1.5 GB, so now I am not sure whether this approach is right for such large uploads. Any suggestions in the right direction would be helpful... thanks. (I am trying to upload the file to the Wistia server.)
The exception is thrown at this line:
requestStream.Write(postData, 0, postData.Length);
I have tried changing the web.config setting, but it didn't work:
<httpRuntime targetFramework="4.5" maxRequestLength="2048576" executionTimeout="12000" requestLengthDiskThreshold="1024" />
------Async Call-------
MemoryStream wistiaFileStream = null;
using (MemoryStream postDataStream = new MemoryStream())
{
    postDataStream.Write(contentFileByteArray, 0, contentFileByteArray.Length);
    wistiaFileStream = postDataStream;
    postDataStream.Flush();
    postDataStream.Close();
}
Stream requestStream = await request.GetRequestStreamAsync();
await requestStream.WriteAsync(wistiaMetadata, 0, wistiaMetadata.Length);
using (wistiaFileStream)
{
    byte[] wistiaFileBuffer = new byte[500 * 1024];
    int wistiaFileBytesRead = 0;
    while ((wistiaFileBytesRead = await wistiaFileStream.ReadAsync(wistiaFileBuffer, 0, wistiaFileBuffer.Length)) != 0)
    {
        await requestStream.WriteAsync(wistiaFileBuffer, 0, wistiaFileBytesRead);
        await requestStream.FlushAsync();
    }
    await requestStream.WriteAsync(requestBoundary, 0, requestBoundary.Length);
}
I would suggest moving to async and writing the file directly from the file system into the request, to avoid triple-buffering 1.5 GB in memory (warning: the code below is not tested).
public async Task<HttpWebResponse> PushFileToWistiaAsync(string contentFilePath)
{
    string boundary = "---------------------------" + DateTime.Now.Ticks.ToString("x");
    string contentBoundary = "\r\n--" + boundary + "\r\n";
    StringBuilder wistiaMetadataBuilder = new StringBuilder();
    wistiaMetadataBuilder.Append("--" + boundary + "\r\n");
    // Append all the wistia config and settings here.
    byte[] wistiaMetadata = Encoding.UTF8.GetBytes(wistiaMetadataBuilder.ToString());
    byte[] requestBoundary = Encoding.UTF8.GetBytes(contentBoundary);
    ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(AppConfig.WistiaCustomCourseBucket);
    request.Method = "POST";
    request.Headers.Clear();
    request.Expect = String.Empty;
    request.ContentType = "multipart/form-data; boundary=" + boundary;
    Stream requestStream = await request.GetRequestStreamAsync();
    await requestStream.WriteAsync(wistiaMetadata, 0, wistiaMetadata.Length);
    using (FileStream wistiaFileStream = new FileStream(contentFilePath, FileMode.Open, FileAccess.Read))
    {
        byte[] wistiaFileBuffer = new byte[500 * 1024];
        int wistiaFileBytesRead = 0;
        while ((wistiaFileBytesRead = await wistiaFileStream.ReadAsync(wistiaFileBuffer, 0, wistiaFileBuffer.Length)) != 0)
        {
            await requestStream.WriteAsync(wistiaFileBuffer, 0, wistiaFileBytesRead);
            await requestStream.FlushAsync();
        }
    }
    await requestStream.WriteAsync(requestBoundary, 0, requestBoundary.Length);
    return (HttpWebResponse)(await request.GetResponseAsync());
}
You should play with the buffer size, the amount of data you read at once, and request.SendChunked to achieve reasonable performance.
Here is another approach (not asynchronous, so possibly worse scalability) which writes directly from the buffer to the request:
public HttpWebResponse PushFileToWistia(byte[] contentFileByteArray)
{
    string boundary = "---------------------------" + DateTime.Now.Ticks.ToString("x");
    string contentBoundary = "\r\n--" + boundary + "\r\n";
    StringBuilder wistiaMetadataBuilder = new StringBuilder();
    wistiaMetadataBuilder.Append("--" + boundary + "\r\n");
    // Append all the wistia config and settings here.
    byte[] wistiaMetadata = Encoding.UTF8.GetBytes(wistiaMetadataBuilder.ToString());
    byte[] requestBoundary = Encoding.UTF8.GetBytes(contentBoundary);
    ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(AppConfig.WistiaCustomCourseBucket);
    request.Method = "POST";
    request.Headers.Clear();
    request.Expect = String.Empty;
    request.ContentType = "multipart/form-data; boundary=" + boundary;
    request.ContentLength = wistiaMetadata.Length + contentFileByteArray.Length + requestBoundary.Length;
    // You can play with SendChunked and AllowWriteStreamBuffering to control the size of chunks you send and performance.
    //request.SendChunked = true;
    //request.AllowWriteStreamBuffering = false;
    int contentFileChunkSize = 500 * 1024;
    int contentFileBytesRead = 0;
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(wistiaMetadata, 0, wistiaMetadata.Length);
    while (contentFileBytesRead < contentFileByteArray.Length)
    {
        if ((contentFileBytesRead + contentFileChunkSize) > contentFileByteArray.Length)
        {
            contentFileChunkSize = contentFileByteArray.Length - contentFileBytesRead;
        }
        requestStream.Write(contentFileByteArray, contentFileBytesRead, contentFileChunkSize);
        requestStream.Flush();
        contentFileBytesRead += contentFileChunkSize;
    }
    requestStream.Write(requestBoundary, 0, requestBoundary.Length);
    requestStream.Close();
    // You might need to play with request.Timeout here.
    return (HttpWebResponse)request.GetResponse();
}
Also, if you are doing this in a web application and you want to use the asynchronous approach, you need to "async/await" all the way up (so an async action in an async controller, etc.).
In general I would discourage doing this as part of request handling in a web application (the total time observed from the user's perspective would be the sum of uploading to your app and then to Wistia, which might be much more than the client timeout allows). In such cases it is usually better to save the file and schedule some other "background task" to do the upload job, as sketched below.
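For illustration (not part of the original answer), here is a minimal sketch of that save-then-upload-in-the-background pattern, assuming .NET 4.5.2+ (for HostingEnvironment.QueueBackgroundWorkItem) and a hypothetical WistiaUploader class wrapping the async method above:
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
    // Persist the upload first so the request can return quickly.
    string tempPath = Server.MapPath("~/App_Data/" + Path.GetFileName(file.FileName));
    file.SaveAs(tempPath);
    // QueueBackgroundWorkItem keeps the work alive for a grace period during
    // app-domain shutdown; WistiaUploader is a hypothetical wrapper here.
    System.Web.Hosting.HostingEnvironment.QueueBackgroundWorkItem(
        async ct => await new WistiaUploader().PushFileToWistiaAsync(tempPath));
    return new HttpStatusCodeResult(202); // accepted; upload continues in the background
}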

Download functionality MVC 4

I have created a web API which connects users to Dropbox via OAuth. I am using an API to interact with Dropbox, which works locally as I would like; however, when I deploy the API to my Azure server, I am unable to download. I had anticipated this would happen, as my API is currently hard-coded to a path on my machine.
Here is the method I am using:
NOTE: I call this method through an ActionResult, as part of the MVC portion of my project.
public FileSystemInfo DownloadFile(string root, string path)
{
    var uri = new Uri(new Uri(DropboxRestApi.ApiContentServer),
        String.Format("files?root={0}&path={1}", root, UpperCaseUrlEncode(path)));
    var oauth = new OAuth();
    var requestUri = oauth.SignRequest(uri, _consumerKey, _consumerSecret, _accessToken);
    var request = (HttpWebRequest)WebRequest.Create(requestUri);
    request.Method = WebRequestMethods.Http.Get;
    var response = request.GetResponse();
    var metadata = response.Headers["x-dropbox-metadata"];
    var file = ParseJson<FileSystemInfo>(metadata);
    using (Stream responseStream = response.GetResponseStream())
    using (MemoryStream memoryStream = new MemoryStream())
    {
        byte[] buffer = new byte[1024];
        int bytesRead;
        do
        {
            bytesRead = responseStream.Read(buffer, 0, buffer.Length);
            memoryStream.Write(buffer, 0, bytesRead);
        } while (bytesRead > 0);
        file.Data = memoryStream.ToArray();
    }
    return file;
}
This is where I call the method in my ActionResult:
var file = api.DownloadFile("dropbox", "Public/downloadThis.jpg");
path = file.Path;
file.Save(@"....\Desktop\DemoTest\Downloads\downloadThis.jpg"); // --- this is the problem; Save is a stream writer
Is there a procedure to follow when downloading files from a server on a browser?
public ActionResult download(Models.downloadModel dowld, Models.LoggerView log)
{
    string TC_ID = Request.QueryString["id"].ToString();
    string filename = TC_ID + "_LoggerData" + ".zip";
    Response.ContentType = "application/octet-stream";
    Response.AppendHeader("Content-Disposition", "attachment;filename=" + filename);
    Response.TransmitFile(Server.MapPath("~/files/" + filename));
    Response.End();
    return new EmptyResult(); // Response.End() has already ended the request; this satisfies the ActionResult signature
}
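Alternatively (a sketch, not from the original answer), MVC's built-in File helper produces the same download without writing to Response by hand:
public ActionResult Download(string id)
{
    string filename = id + "_LoggerData.zip";
    // FilePathResult streams the file and sets the Content-Disposition header for us.
    return File(Server.MapPath("~/files/" + filename), "application/octet-stream", filename);
}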

Failed to upload image to Twitter using RestSharp

I'm trying to upload an image to Twitter from a WinRT application using RestSharp.
The code is here:
RestClient twClient = new RestClient("https://upload.twitter.com");
twClient.Authenticator = OAuth1Authenticator.ForProtectedResource(........);
var postTweet = new RestRequest("/1/statuses/update_with_media.json", Method.POST);
postTweet.AddParameter("status", TweetBox.Text);
byte[] img = await GetDataAsync(imageFile);
postTweet.AddFile("media[]", img, imageFile.Name, "multipart/form-data");
twClient.ExecuteAsync(postTweet, (response) =>
{
    try
    {
        if (response.StatusCode == HttpStatusCode.OK)
        {
            ....
Here is my GetDataAsync method, which reads a byte array from a file in Isolated Storage:
public static async Task<byte[]> GetDataAsync(StorageFile file)
{
    IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.Read);
    DataReader reader = new DataReader(stream.GetInputStreamAt(0));
    uint streamSize = (uint)stream.Size;
    await reader.LoadAsync(streamSize);
    byte[] buffer = new byte[streamSize];
    reader.ReadBytes(buffer);
    return buffer;
}
Server response:
Expectation Failed
The expectation given in the Expect request-header field could not be met by this server.
The client sent
Expect: 100-continue
but we only allow the 100-continue expectation.
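No answer was recorded for this question, but the server response points at the Expect: 100-continue header that HttpWebRequest adds automatically. As in the SharePoint and Wistia snippets earlier in this section, the usual workaround is to suppress it before the request is created; a minimal sketch:
// RestSharp sits on top of HttpWebRequest, so the global ServicePointManager
// switch (used elsewhere in this thread) applies to it as well.
System.Net.ServicePointManager.Expect100Continue = false;

RestClient twClient = new RestClient("https://upload.twitter.com");
// ... configure the authenticator and execute the request as in the question ...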
