OleDb - not getting up-to-date data from SharePoint

I am having the following problem:
1. I connect to the OleDb data source (a SharePoint list).
2. I select the data using the following code:
success = true;
message = "";
readData = new DataTable();

if (oledbConnection == null)
{
    success = false;
    message = "First establish a connection with the OleDb source.";
}

if (success)
{
    try
    {
        // A fresh command and reader are created for every read; the connection is reused.
        using (OleDbCommand command = new OleDbCommand(queryString, oledbConnection))
        {
            command.CommandTimeout = (int)Convert.ToDecimal(timeout);
            using (OleDbDataReader reader = command.ExecuteReader())
            {
                readData.Load(reader);
            }
        }
    }
    catch (Exception ex)
    {
        success = false;
        message = ex.Message;
    }
}
3. I update the data on SharePoint using the same oledbConnection (still open).
4. I go back to step 2 and read the updated data from the SharePoint list, still using the same oledbConnection (still open). But when the data comes back, I cannot see the changes I've made.
The question is: how can I get the up-to-date data from SharePoint using the same connection (without closing and reopening it)?
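For reference, here is a condensed sketch of the full cycle I described above. The query strings, [MyList] name, and column names are hypothetical placeholders, not my real list; each read creates a fresh command and reader while the one oledbConnection stays open the whole time:
// Sketch only - requires System.Data and System.Data.OleDb; the SQL text and
// [MyList] name are placeholders for illustration.
DataTable ReadList(OleDbConnection conn)
{
    var table = new DataTable();
    using (var command = new OleDbCommand("SELECT * FROM [MyList]", conn))
    using (var reader = command.ExecuteReader())
    {
        table.Load(reader);
    }
    return table;
}

void UpdateList(OleDbConnection conn)
{
    using (var command = new OleDbCommand(
        "UPDATE [MyList] SET [Status] = 'Done' WHERE [ID] = 1", conn))
    {
        command.ExecuteNonQuery();
    }
}

// Step 2: read, step 3: update, step 4: read again - all on the same open connection.
// The second read is where I still see the old values.
var before = ReadList(oledbConnection);
UpdateList(oledbConnection);
var after = ReadList(oledbConnection);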

Related

Large file upload to ASP.NET Core 3.0 Web API fails due to Request Body Too Large

I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I followed the directions from the MS docs to set up the endpoint:
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server in Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per advice from the above article, I have incrementally updated the Kestrel configuration with the following:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder.UseKestrel((ctx, options) =>
    {
        var config = ctx.Configuration;
        options.Limits.MaxRequestBodySize = 6000000000;
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.MinResponseDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.RequestHeadersTimeout =
            TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
});
I also configured FormOptions to accept files up to 6,000,000,000 bytes:
services.Configure<FormOptions>(options =>
{
options.MultipartBodyLengthLimit = 6000000000;
});
And I also set up the API controller action with the following attributes, per advice from the article:
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written, but I have merged it into one method as part of the debugging exercise:
public async Task<IActionResult> Audio()
{
if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
{
throw new ArgumentException("The media file could not be processed.");
}
string mediaId = string.Empty;
string instructorId = string.Empty;
try
{
// process file first
KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
var streamedFileContent = new byte[0];
var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(Request.ContentType),
_defaultFormOptions.MultipartBoundaryLengthLimit
);
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();
while (section != null)
{
var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);
if (hasContentDispositionHeader)
{
if (MultipartRequestHelper
.HasFileContentDisposition(contentDisposition))
{
streamedFileContent =
await FileHelpers.ProcessStreamedFile(section, contentDisposition,
_permittedExtensions, _fileSizeLimit);
}
else if (MultipartRequestHelper
.HasFormDataContentDisposition(contentDisposition))
{
var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
var encoding = FileHelpers.GetEncoding(section);
if (encoding == null)
{
return BadRequest($"The request could not be processed: Bad Encoding");
}
using (var streamReader = new StreamReader(
section.Body,
encoding,
detectEncodingFromByteOrderMarks: true,
bufferSize: 1024,
leaveOpen: true))
{
// The value length limit is enforced by
// MultipartBodyLengthLimit
var value = await streamReader.ReadToEndAsync();
if (string.Equals(value, "undefined",
StringComparison.OrdinalIgnoreCase))
{
value = string.Empty;
}
formAccumulator.Append(key, value);
if (formAccumulator.ValueCount >
_defaultFormOptions.ValueCountLimit)
{
return BadRequest($"The request could not be processed: Key Count limit exceeded.");
}
}
}
}
// Drain any remaining section body that hasn't been consumed and
// read the headers for the next section.
section = await reader.ReadNextSectionAsync();
}
var form = formAccumulator;
var file = streamedFileContent;
var results = form.GetResults();
instructorId = results["instructorId"];
string title = results["title"];
string firstName = results["firstName"];
string lastName = results["lastName"];
string durationInMinutes = results["durationInMinutes"];
//mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);
string fileExtension = "m4a";
// Generate Container Name - InstructorSpecific
string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";
string contentType = "audio/mp4";
FileType fileType = FileType.audio;
string authorName = $"{firstName} {lastName}";
string authorShortName = $"{firstName[0]}{lastName}";
string description = $"{authorShortName} - {title}";
long duration = (Convert.ToInt32(durationInMinutes) * 60000);
// Generate new filename
string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";
DateTime recordingDate = DateTime.UtcNow;
DateTime uploadDate = DateTime.UtcNow;
long blobSize = long.MinValue;
try
{
// Update file properties in storage
Dictionary<string, string> fileProperties = new Dictionary<string, string>();
fileProperties.Add("ContentType", contentType);
// update file metadata in storage
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("author", authorShortName);
metadata.Add("tite", title);
metadata.Add("description", description);
metadata.Add("duration", duration.ToString());
metadata.Add("recordingDate", recordingDate.ToString());
metadata.Add("uploadDate", uploadDate.ToString());
var fileNameWExt = $"{fileName}.{fileExtension}";
var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
try
{
MemoryStream fileContent = new MemoryStream(streamedFileContent);
fileContent.Position = 0;
using (fileContent)
{
await blobContainer.UploadFromStreamAsync(fileContent);
}
}
catch (StorageException e)
{
if (e.RequestInformation.HttpStatusCode == 403)
{
return BadRequest(e.Message);
}
else
{
return BadRequest(e.Message);
}
}
try
{
foreach (var key in metadata.Keys.ToList())
{
blobContainer.Metadata.Add(key, metadata[key]);
}
await blobContainer.SetMetadataAsync();
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
blobSize = await StorageUtils.GetBlobSize(blobContainer);
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate, ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()), StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue), Manifest.Create(new Dictionary<string, string>()));
// upload to MongoDB
if (media != null)
{
var mapper = new Mapper(_mapperConfiguration);
var dao = mapper.Map<ContentDAO>(media);
try
{
await _db.Content.InsertOneAsync(dao);
// only set the id if the insert succeeded
mediaId = dao.Id.ToString();
}
catch (Exception)
{
mediaId = string.Empty;
}
}
else
{
// metadata wasn't stored, remove blob
await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
return BadRequest($"An issue occurred during media upload: rolling back storage change");
}
if (string.IsNullOrEmpty(mediaId))
{
return BadRequest($"Could not add instructor media");
}
}
catch (Exception ex)
{
return BadRequest(ex.Message);
}
var result = new { MediaId = mediaId, InstructorId = instructorId };
return Ok(result);
}
I reiterate: this all works great locally. I do not run it in IIS Express; I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps are. I have refactored this code a few times, but I suspect that it's environment-related.
I cannot find anything that mentions the level of App Service Plan as the culprit, so before I go out spending more money I wanted to see if anyone here has encountered this challenge and could provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate for incoming traffic. Upgrading didn't work either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, AND it has a file size limitation that doesn't appear to get overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands, but I need to move on with development.

How to force excel file to open in browser instead of download?

I am working with Crystal Reports in MVC 5, exporting PDF and Excel files. My code works well for viewing the PDF in the web browser, but when I change the content type to an Excel file it only downloads.
public ActionResult Summary(string startDate, string endDate, string summaryBy, string reportType)
{
    using (MMTModel entity = new MMTModel())
    {
        string CryRpt_Name = null;
        ObjectResult<DeeqtoonSummary> ObjRsl = null;
        if (reportType == "Summary")
        {
            CryRpt_Name = "Summary.rpt";
            ObjRsl = entity.rpt_Summary(startDate, endDate, summaryBy);
        }
        else if (reportType == "Detail")
        {
            CryRpt_Name = "Detail.rpt";
            ObjRsl = entity.rpt_Detail(startDate, endDate);
        }
        ReportDocument rpt = new ReportDocument();
        rpt.Load(Path.Combine(rpt.FileName = Server.MapPath("~/CrystalReports"), CryRpt_Name));
        List<Summary> ObjRslLst = ObjRsl.ToList();
        rpt.SetDataSource(ObjRslLst);
        try
        {
            // Excel file (this branch only downloads)
            Stream stream = rpt.ExportToStream(ExportFormatType.Excel);
            return File(stream, "application/vnd.ms-excel");

            // PDF file (this branch views fine in the browser)
            //Stream stream = rpt.ExportToStream(ExportFormatType.PortableDocFormat);
            //return File(stream, "application/pdf");
        }
        catch (Exception ex)
        {
            throw new Exception(ex.Message);
        }
    }
}
How can I force the Excel file to open in the web browser?
I don't know anything about crystal-reports, but if your browser doesn't have an implementation for viewing Excel documents, it will just download them. I have never been able to open an Excel document in my browser unless the web application itself implements the display of the document.
Basically, you have to implement the MVC view that displays an Excel document yourself. Your browser doesn't support it out of the box.
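If you still want to give the browser the choice rather than forcing an attachment, a minimal sketch (not tested against Crystal Reports output, and the file name here is just an example) is to set the Content-Disposition header to inline yourself inside the try block of the action above. Note this does not make the browser capable of rendering Excel, so most browsers will still download the file:
// Sketch only: explicitly mark the response as inline instead of an attachment.
// Returning File(stream, contentType) without a file name leaves the disposition
// up to the browser; adding this header makes the intent explicit.
Stream stream = rpt.ExportToStream(ExportFormatType.Excel);
Response.AppendHeader("Content-Disposition", "inline; filename=Summary.xls");
return File(stream, "application/vnd.ms-excel");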

Can an EML file open directly in Outlook, instead of being downloaded and then clicked to open?

I have an ASP.NET MVC application; the code below works fine.
However, when I navigate to the Email action in the browser, an EML file is downloaded, and only when I click on that file does it open in Outlook.
Is it possible for the EML file to open directly in Outlook when the action is called, instead of being downloaded and then clicked to open?
Code
public async Task<FileStreamResult> Email()
{
string dummyEmail = "test#localhost.com";
var mailMessage = new MailMessage();
mailMessage.From = new MailAddress(dummyEmail);
mailMessage.To.Add("dejan.caric#gmail.com");
mailMessage.Subject = "Test subject";
mailMessage.Body = "Test body";
// mark as draft
mailMessage.Headers.Add("X-Unsent", "1");
// download image and save it as attachment
using (var httpClient = new HttpClient())
{
var imageStream = await httpClient.GetStreamAsync(new Uri("http://dcaric.com/favicon.ico"));
mailMessage.Attachments.Add(new Attachment(imageStream, "favicon.ico"));
}
var stream = new MemoryStream();
ToEmlStream(mailMessage, stream, dummyEmail);
stream.Position = 0;
return File(stream, "message/rfc822", "test_email.eml");
}
private void ToEmlStream(MailMessage msg, Stream str, string dummyEmail)
{
using (var client = new SmtpClient())
{
var id = Guid.NewGuid();
var tempFolder = Path.Combine(Path.GetTempPath(), Assembly.GetExecutingAssembly().GetName().Name);
tempFolder = Path.Combine(tempFolder, "MailMessageToEMLTemp");
// create a temp folder to hold just this .eml file so that we can find it easily.
tempFolder = Path.Combine(tempFolder, id.ToString());
if (!Directory.Exists(tempFolder))
{
Directory.CreateDirectory(tempFolder);
}
client.UseDefaultCredentials = true;
client.DeliveryMethod = SmtpDeliveryMethod.SpecifiedPickupDirectory;
client.PickupDirectoryLocation = tempFolder;
client.Send(msg);
// tempFolder should contain 1 eml file
var filePath = Directory.GetFiles(tempFolder).Single();
// create new file and remove all lines that start with 'X-Sender:' or 'From:'
string newFile = Path.Combine(tempFolder, "modified.eml");
using (var sr = new StreamReader(filePath))
{
using (var sw = new StreamWriter(newFile))
{
string line;
while ((line = sr.ReadLine()) != null)
{
if (!line.StartsWith("X-Sender:") &&
!line.StartsWith("From:") &&
// dummy email which is used if receiver address is empty
!line.StartsWith("X-Receiver: " + dummyEmail) &&
// dummy email which is used if receiver address is empty
!line.StartsWith("To: " + dummyEmail))
{
sw.WriteLine(line);
}
}
}
}
// stream out the contents
using (var fs = new FileStream(newFile, FileMode.Open))
{
fs.CopyTo(str);
}
}
}
With Chrome you can make it automatically open certain file types once they are downloaded; an .EML file should then attempt to open in Outlook.
I am not sure about other browsers, but Chrome seemed to be the only one with this option.
It's not a perfect solution, because if someone downloads an .EML file from another website in Chrome, it will open automatically as well. I recommend having Chrome dedicated to your web application.
You certainly can open a local .eml file with Outlook, but in the context of a web application it must first be downloaded.

OleDbConnection to Excel File in MOSS 2007 Shared Documents

I need to programmatically open an Excel file that is stored in a MOSS 2007 Shared Documents list. I'd like to use an OleDbConnection so that I can return the contents of the file as a DataTable. I believe this is possible, since a number of articles on the Web imply it. Currently my code fails when trying to initialize a new connection (oledbConn = new OleDbConnection(_connStringName);). The error message is:
Format of the initialization string does not conform to specification starting at index 0.
I believe I am just not able to figure out the right path to the file. Here is my code:
public DataTable GetData(string fileName, string workSheetName, string filePath)
{
// filePath == C:\inetpub\wwwroot\wss\VirtualDirectories\80\MySpWebAppName\Shared Documents\FY12_FHP_SPREADSHEET.xlsx
// Initialize global vars
_connStringName = DataSource.Conn_Excel(fileName, filePath).ToString();
_workSheetName = workSheetName;
dt = new DataTable();
//Create the connection object
if (!string.IsNullOrEmpty(_connStringName))
{
SPSecurity.RunWithElevatedPrivileges(delegate()
{
oledbConn = new OleDbConnection(_connStringName);
try
{
oledbConn.Open();
//Create OleDbCommand obj and select data from worksheet GrandTotals
OleDbCommand cmd = new OleDbCommand("SELECT * FROM " + _workSheetName + ";", oledbConn);
//create new OleDbDataAdapter
OleDbDataAdapter oleda = new OleDbDataAdapter();
oleda.SelectCommand = cmd;
oleda.Fill(dt);
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine(ex.Message);
}
finally
{
oledbConn.Close();
}
});
}
return dt;
}
public static OleDbConnection Conn_Excel(string ExcelFileName, string filePath)
{
    // filePath == C:\inetpub\wwwroot\wss\VirtualDirectories\80\MySpWebAppName\Shared Documents\FY12_FHP_SPREADSHEET.xlsx
    OleDbConnection myConn = new OleDbConnection();
    myConn.ConnectionString = string.Format(@"Provider=Microsoft.ACE.OLEDB.12.0;Data Source={0};Extended Properties=Excel 12.0", filePath);
    return myConn;
}
What am I doing wrong, or is there a better way to get the Excel file contents as a DataTable?
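One thing worth double-checking (an observation from reading the code, not something I have verified against MOSS): Conn_Excel returns an OleDbConnection, and the caller does DataSource.Conn_Excel(fileName, filePath).ToString(). Calling ToString() on a connection object yields the type name rather than the connection string, which would produce exactly this "initialization string" error. A minimal sketch, assuming you keep a helper like this, is to return the connection string text itself (the helper name below is hypothetical):
// Hypothetical variant that returns the connection string text, so the caller
// can pass it straight to new OleDbConnection(...) without calling ToString()
// on a connection object (which returns the type name, not the string).
public static string Conn_ExcelString(string excelFileName, string filePath)
{
    return "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filePath + ";Extended Properties=Excel 12.0";
}

// Usage in GetData:
// _connStringName = DataSource.Conn_ExcelString(fileName, filePath);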
I ended up using the open source project Excel Data Reader

How to handle SQL Query CommandTimeout in C# 2.0

I have got the below code in C#.
SqlConnection conn = new SqlConnection("Data Source=MANOJ-PC\\SQLEXPRESS;Initial Catalog=master;Integrated Security=False;User Id=sa;Password=Manoj;");
conn.Open();
if (conn != null)
{
//create command
SqlCommand cmd = new SqlCommand("dbo.GETTridionLinkData", conn);
cmd.Parameters.AddWithValue("#PageID", "637518");
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandTimeout = 500;
StringBuilder sbXML = new StringBuilder();
//Adding Root node
sbXML.Append("<TridionLinks>");
//Reading all the values of Stored procedure return
using (XmlReader reader = cmd.ExecuteXmlReader())
{
while (reader.Read())
{
sbXML.Append(reader.ReadOuterXml().Replace("//", "/"));
}
}
//Closing the root node tag
sbXML.Append("</TridionLinks>");
XmlDocument xDoc = new XmlDocument();
//Loading string xml in XML Document
xDoc.LoadXml(sbXML.ToString());
}
In the above code you can see that I have set cmd.CommandTimeout = 500; now I want to show the user an error message if the command times out, or if the database is down.
Please suggest!!
Please refer to
How to catch SQLServer timeout exceptions
The question has already been answered there.
To improve the coding, you can use something like:
try
{
    using (SqlConnection conn = new SqlConnection("Data Source=MANOJ-PC\\SQLEXPRESS;Initial Catalog=master;Integrated Security=False;User Id=sa;Password=Manoj;"))
    {
        ...
    }
}
catch (SqlException ex)
{
    if (ex.Number == -2)
    {
        // return your message to the control or display the error
    }
}
Well, it's just an example.
