XNA Xbox highscore port

I'm trying to port my PC XNA game to the Xbox and have tried to implement the XNA EasyStorage library alongside my existing PC file management for highscores. Basically I'm trying to combine http://xnaessentials.com/tutorials/highscores.aspx with http://easystorage.codeplex.com/
I'm running into one specific error in LoadHighScores(): the 'return (data);' line fails to compile with "Use of unassigned local variable 'data'".
I presume this is due to the async design of EasyStorage on the Xbox, but I'm not sure how to resolve it. Code samples are below.
ORIGINAL PC CODE: (works on PC)
public static HighScoreData LoadHighScores(string filename)
{
    HighScoreData data;

    // Get the path of the save game
    string fullpath = "Content/highscores.lst";

    // Open the file
    FileStream stream = File.Open(fullpath, FileMode.Open, FileAccess.Read);
    try
    {
        // Read the data from the file
        XmlSerializer serializer = new XmlSerializer(typeof(HighScoreData));
        data = (HighScoreData)serializer.Deserialize(stream);
    }
    finally
    {
        // Close the file
        stream.Close();
    }

    return (data);
}
XBOX PORT: (with error)
public static HighScoreData LoadHighScores(string container, string filename)
{
    HighScoreData data;

    if (Global.SaveDevice.FileExists(container, filename))
    {
        Global.SaveDevice.Load(container, filename, stream =>
        {
            File.Open(Global.fileName_options, FileMode.Open, //FileMode.OpenOrCreate,
                FileAccess.Read);
            try
            {
                // Read the data from the file
                XmlSerializer serializer = new XmlSerializer(typeof(HighScoreData));
                data = (HighScoreData)serializer.Deserialize(stream);
            }
            finally
            {
                // Close the file
                stream.Close();
            }
        });
    }
    return (data);
}
Any ideas?

Assign data before the return. ;) Give it a default value up front:
HighScoreData data = new HighScoreData(); // if HighScoreData is a struct; use null if it's a class
if (Global.SaveDevice.FileExists(container, filename))
{
    ......
}
return (data);
}
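
Building on that, here is a minimal sketch of the ported loader with the variable definitely assigned, assuming EasyStorage's SaveDevice.Load invokes the delegate synchronously (Global.SaveDevice and HighScoreData are the names from the question):

public static HighScoreData LoadHighScores(string container, string filename)
{
    // Definitely assign data up front; the compiler cannot prove that the
    // lambda below ever runs, hence "Use of unassigned local variable".
    HighScoreData data = default(HighScoreData);

    if (Global.SaveDevice.FileExists(container, filename))
    {
        Global.SaveDevice.Load(container, filename, stream =>
        {
            // Deserialize straight from the stream EasyStorage provides;
            // the extra File.Open from the PC version is not needed here.
            XmlSerializer serializer = new XmlSerializer(typeof(HighScoreData));
            data = (HighScoreData)serializer.Deserialize(stream);
        });
    }

    return data;
}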


Large File upload to ASP.NET Core 3.0 Web API fails due to Request Body Too Large

I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I followed the directions from the MS docs below to set up the endpoint.
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server in Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per advice from the above article, I have incrementally updated the Kestrel configuration with the following:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder.UseKestrel((ctx, options) =>
    {
        var config = ctx.Configuration;
        options.Limits.MaxRequestBodySize = 6000000000;
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.MinResponseDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.RequestHeadersTimeout =
            TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
});
I also configured FormOptions to accept files up to 6000000000 bytes:
services.Configure<FormOptions>(options =>
{
options.MultipartBodyLengthLimit = 6000000000;
});
And I also set up the API controller with the following attributes, per advice from the article:
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
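(Note: [DisableFormValueModelBinding] is not built into ASP.NET Core; it's the custom filter defined in the same MS docs sample. A sketch of that implementation, for anyone wiring this up from scratch:)

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class DisableFormValueModelBindingAttribute : Attribute, IResourceFilter
{
    public void OnResourceExecuting(ResourceExecutingContext context)
    {
        // Remove the form-value providers so model binding never buffers
        // the multipart body; the action reads it as a stream instead.
        context.ValueProviderFactories.RemoveType<FormValueProviderFactory>();
        context.ValueProviderFactories.RemoveType<JQueryFormValueProviderFactory>();
    }

    public void OnResourceExecuted(ResourceExecutedContext context) { }
}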
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written but I have merged it into one method as part of the debugging exercise.
public async Task<IActionResult> Audio()
{
if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
{
throw new ArgumentException("The media file could not be processed.");
}
string mediaId = string.Empty;
string instructorId = string.Empty;
try
{
// process file first
KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
var streamedFileContent = new byte[0];
var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(Request.ContentType),
_defaultFormOptions.MultipartBoundaryLengthLimit
);
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();
while (section != null)
{
var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);
if (hasContentDispositionHeader)
{
if (MultipartRequestHelper
.HasFileContentDisposition(contentDisposition))
{
streamedFileContent =
await FileHelpers.ProcessStreamedFile(section, contentDisposition,
_permittedExtensions, _fileSizeLimit);
}
else if (MultipartRequestHelper
.HasFormDataContentDisposition(contentDisposition))
{
var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
var encoding = FileHelpers.GetEncoding(section);
if (encoding == null)
{
return BadRequest($"The request could not be processed: Bad Encoding");
}
using (var streamReader = new StreamReader(
section.Body,
encoding,
detectEncodingFromByteOrderMarks: true,
bufferSize: 1024,
leaveOpen: true))
{
// The value length limit is enforced by
// MultipartBodyLengthLimit
var value = await streamReader.ReadToEndAsync();
if (string.Equals(value, "undefined",
StringComparison.OrdinalIgnoreCase))
{
value = string.Empty;
}
formAccumulator.Append(key, value);
if (formAccumulator.ValueCount >
_defaultFormOptions.ValueCountLimit)
{
return BadRequest($"The request could not be processed: Key Count limit exceeded.");
}
}
}
}
// Drain any remaining section body that hasn't been consumed and
// read the headers for the next section.
section = await reader.ReadNextSectionAsync();
}
var form = formAccumulator;
var file = streamedFileContent;
var results = form.GetResults();
instructorId = results["instructorId"];
string title = results["title"];
string firstName = results["firstName"];
string lastName = results["lastName"];
string durationInMinutes = results["durationInMinutes"];
//mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);
string fileExtension = "m4a";
// Generate Container Name - InstructorSpecific
string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";
string contentType = "audio/mp4";
FileType fileType = FileType.audio;
string authorName = $"{firstName} {lastName}";
string authorShortName = $"{firstName[0]}{lastName}";
string description = $"{authorShortName} - {title}";
long duration = (Convert.ToInt32(durationInMinutes) * 60000);
// Generate new filename
string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";
DateTime recordingDate = DateTime.UtcNow;
DateTime uploadDate = DateTime.UtcNow;
long blobSize = long.MinValue;
try
{
// Update file properties in storage
Dictionary<string, string> fileProperties = new Dictionary<string, string>();
fileProperties.Add("ContentType", contentType);
// update file metadata in storage
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("author", authorShortName);
metadata.Add("tite", title);
metadata.Add("description", description);
metadata.Add("duration", duration.ToString());
metadata.Add("recordingDate", recordingDate.ToString());
metadata.Add("uploadDate", uploadDate.ToString());
var fileNameWExt = $"{fileName}.{fileExtension}";
var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
try
{
MemoryStream fileContent = new MemoryStream(streamedFileContent);
fileContent.Position = 0;
using (fileContent)
{
await blobContainer.UploadFromStreamAsync(fileContent);
}
}
catch (StorageException e)
{
if (e.RequestInformation.HttpStatusCode == 403)
{
return BadRequest(e.Message);
}
else
{
return BadRequest(e.Message);
}
}
try
{
foreach (var key in metadata.Keys.ToList())
{
blobContainer.Metadata.Add(key, metadata[key]);
}
await blobContainer.SetMetadataAsync();
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
blobSize = await StorageUtils.GetBlobSize(blobContainer);
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate, ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()), StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue), Manifest.Create(new Dictionary<string, string>()));
// upload to MongoDB
if (media != null)
{
var mapper = new Mapper(_mapperConfiguration);
var dao = mapper.Map<ContentDAO>(media);
try
{
await _db.Content.InsertOneAsync(dao);
}
catch (Exception)
{
mediaId = string.Empty;
}
mediaId = dao.Id.ToString();
}
else
{
// metadata wasn't stored, remove blob
await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
return BadRequest($"An issue occurred during media upload: rolling back storage change");
}
if (string.IsNullOrEmpty(mediaId))
{
return BadRequest($"Could not add instructor media");
}
}
catch (Exception ex)
{
return BadRequest(ex.Message);
}
var result = new { MediaId = mediaId, InstructorId = instructorId };
return Ok(result);
}
I reiterate: this all works great locally. I do not run it in IIS Express; I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps are. I have refactored this code a few times, but I suspect it's environment related.
I cannot find anything that mentions the level of App Service Plan as the culprit, so before I go out spending more money I wanted to see if anyone here has encountered this challenge and could provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate for incoming traffic. Upgrading didn't work either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, AND it has a file size limitation that doesn't appear to be overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands, but I need to move on with development.

JW Player unable to pseudo stream a video loaded via HTTP handler

I managed to set up mod_h264 on IIS 7 and pseudo streaming is working great.
However, I can't get JW Player pseudo streaming to work with an HttpHandler in between: the video starts from the beginning whenever you click a different position. If I remove the handler, pseudo streaming works as expected.
My problem here is to prevent people from gaining direct access to my videos (I don't care if they save the video via the browser cache).
I had to load the file in 10 KB chunks, since the videos are big enough to cause memory exceptions.
Here's my HttpHandler:
public class DontStealMyMoviesHandler : IHttpHandler
{
/// <summary>
/// You will need to configure this handler in the web.config file of your
/// web and register it with IIS before being able to use it. For more information
/// see the following link: http://go.microsoft.com/?linkid=8101007
/// </summary>
#region IHttpHandler Members
public bool IsReusable
{
// Return false in case your Managed Handler cannot be reused for another request.
// Usually this would be false in case you have some state information preserved per request.
get { return true; }
}
public void ProcessRequest(HttpContext context)
{
HttpRequest req = context.Request;
string path = req.PhysicalPath;
string extension = null;
string contentType = null;
string fileName = "";
if (req.UrlReferrer == null)
{
context.Response.Redirect("~/Home/");
}
else
{
fileName = "file.mp4";
if (req.UrlReferrer.Host.Length > 0)
{
if (req.UrlReferrer.ToString().ToLower().Contains("/media/"))
{
context.Response.Redirect("~/Home/");
}
}
}
extension = Path.GetExtension(req.PhysicalPath).ToLower();
switch (extension)
{
case ".m4v":
case ".mp4":
contentType = "video/mp4";
break;
case ".avi":
contentType = "video/x-msvideo";
break;
case ".mpeg":
contentType = "video/mpeg";
break;
//default:
// throw new notsupportedexception("unrecognized video type.");
}
if (!File.Exists(path))
{
context.Response.Status = "movie not found";
context.Response.StatusCode = 404;
}
else
{
try
{
//context.Response.Clear();
//context.Response.AddHeader("content-disposition", "attachment; filename=file.mp4");
//context.Response.ContentType = contentType;
//context.Response.WriteFile(path, false);
//if(HttpRuntime.UsingIntegratedPipeline)
// context.Server.TransferRequest(context.Request.Url.ToString(), true);
//else
// context.RewritePath(context.Request.Url.AbsolutePath.ToString(), true);
// Buffer to read 10K bytes in chunk:
byte[] buffer = new Byte[10000];
// Length of the file:
int length;
// Total bytes to read:
long dataToRead;
using (FileStream iStream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
{
// Total bytes to read:
dataToRead = iStream.Length;
context.Response.Clear();
context.Response.Cache.SetNoStore();
context.Response.Cache.SetLastModified(DateTime.Now);
context.Response.AppendHeader("Content-Type", contentType);
context.Response.AddHeader("Content-Disposition", "attachment; filename=" + fileName);
// Read the bytes.
while (dataToRead > 0)
{
// Verify that the client is connected.
if (context.Response.IsClientConnected)
{
// Read the data in buffer.
length = iStream.Read(buffer, 0, 10000);
// Write the data to the current output stream.
context.Response.OutputStream.Write(buffer, 0, length);
// Flush the data to the HTML output.
context.Response.Flush();
buffer = new Byte[10000];
dataToRead = dataToRead - length;
}
else
{
//prevent infinite loop if user disconnects
dataToRead = -1;
}
}
}
}
catch (Exception e)
{
context.Response.Redirect("home");
}
finally
{
context.Response.Close();
}
}
}
#endregion
}
Thank you in advance
I solved it by creating an HttpModule instead, because with an HttpHandler I had to manage the response myself, which caused the pseudo streaming to fail (the file was loaded entirely into the output stream). This way, if someone accesses a file directly, I just do a simple redirect. I don't get why redirecting to "~/" doesn't work.
public class DontStealMyMoviesModule : IHttpModule
{
public DontStealMyMoviesModule()
{
}
public void Init(HttpApplication r_objApplication)
{
// Register our event handler with Application object.
r_objApplication.PreSendRequestContent += new EventHandler(this.AuthorizeContent);
}
public void Dispose()
{
}
private void AuthorizeContent(object r_objSender, EventArgs r_objEventArgs)
{
HttpApplication objApp = (HttpApplication)r_objSender;
HttpContext objContext = (HttpContext)objApp.Context;
HttpRequest req = objContext.Request;
if (Path.GetExtension(req.PhysicalPath).ToLower() != ".mp4") return;
if (req.UrlReferrer == null)
{
objContext.Response.Redirect("/");
}
}
}
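For completeness, the module still has to be registered in web.config; a minimal sketch for the IIS 7 integrated pipeline (the type name assumes the class above lives in no namespace; qualify it if yours does):

<system.webServer>
  <modules>
    <add name="DontStealMyMoviesModule" type="DontStealMyMoviesModule" />
  </modules>
</system.webServer>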

PCLStorage and binary data

I'm new to these PCL libraries. I'm developing an iPhone app with Xamarin, and I can't find a way to save downloaded images on the phone. The closest I've gotten is with PCLStorage, but it only saves text.
Is there another way to save binary files, or some other procedure?
Thank you.
foreach (images element in json_object)
{
    //var nameFile = Path.Combine (directoryname, element.name);
    try
    {
        IFile file = await folder_new.GetFileAsync(element.name);
    }
    catch (FileNotFoundException ex)
    {
        RestClient _Client = new RestClient(element.root);
        RestRequest request_file = new RestRequest("/images/{FileName}");
        request_file.AddParameter("FileName", element.name, ParameterType.UrlSegment);
        _Client.ExecuteAsync<MemoryStream>(
            request_file,
            async Response =>
            {
                if (Response != null)
                {
                    IFolder rootFolder_new = FileSystem.Current.LocalStorage;
                    IFile file_new = await rootFolder_new.CreateFileAsync(element.name, CreationCollisionOption.ReplaceExisting);
                    await file_new.WriteAllTextAsync(Response.Data);
                }
            });
    }
}
Use the IFile.OpenAsync method to get a stream which you can use to read/write binary data. Here's how you would read a file:
IFile file = await folder_new.GetFileAsync(element.name);
using (Stream stream = await file.OpenAsync(FileAccess.Read))
{
// Read stream and process binary data from it...
}
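Writing binary data works the same way; here's a minimal sketch of the write side, assuming your REST client exposes the response as raw bytes (RestSharp's Response.RawBytes, for example):

IFolder rootFolder = FileSystem.Current.LocalStorage;
IFile file = await rootFolder.CreateFileAsync(element.name, CreationCollisionOption.ReplaceExisting);
using (Stream stream = await file.OpenAsync(FileAccess.ReadAndWrite))
{
    byte[] bytes = Response.RawBytes; // raw binary payload, not a string
    await stream.WriteAsync(bytes, 0, bytes.Length);
}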

How to compress files on BlackBerry?

In my application I use an HTML template and images for a browser field, saved on the SD card. Now I want to compress those HTML and image files and send them to a PHP server. How can I compress the files and send them to the server? Some samples would help a lot.
I tried it this way; my code is below.
EDIT:
private void zipthefile() {
String out_path = "file:///SDCard/" + "newtemplate.zip";
String in_path = "file:///SDCard/" + "newtemplate.html";
InputStream inputStream = null;
GZIPOutputStream os = null;
try {
FileConnection fileConnection = (FileConnection) Connector
.open(in_path); // read the input file from its path
if (fileConnection.exists()) {
inputStream = fileConnection.openInputStream();
}
byte[] buffer = new byte[1024];
FileConnection path = (FileConnection) Connector
.open(out_path,
Connector.READ_WRITE); // create the output file path
if (!path.exists()) {
path.create();
}
os = new GZIPOutputStream(path.openOutputStream()); // create the gzip output file
int c;
while ((c = inputStream.read()) != -1) {
os.write(c);
}
} catch (Exception e) {
Dialog.alert("" + e.toString());
} finally {
if (inputStream != null) {
try {
inputStream.close();
} catch (IOException e) {
e.printStackTrace();
Dialog.alert("" + e.toString());
}
}
if (os != null) {
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
Dialog.alert("" + e.toString());
}
}
}
}
This code works fine for a single file, but I want to compress all the files (more than one file) in the folder.
In case you are not familiar with them: in Java, the stream classes follow the decorator pattern. They are meant to be piped into other streams to perform additional tasks. For instance, a FileOutputStream lets you write bytes to a file; if you decorate it with a BufferedOutputStream you also get buffering (big chunks of data are held in RAM before finally being written to disk), and if you decorate it with a GZIPOutputStream you also get compression.
Example:
//To read compressed file:
InputStream is = new GZIPInputStream(new FileInputStream("full_compressed_file_path_here"));
//To write to a compressed file:
OutputStream os = new GZIPOutputStream(new FileOutputStream("full_compressed_file_path_here"));
This is a good tutorial covering basic I/O. Despite being written for Java SE, you'll find it useful since most things work the same on BlackBerry.
In the API you have these classes available:
GZIPInputStream
GZIPOutputStream
ZLibInputStream
ZLibOutputStream
If you need to convert between streams and byte arrays, use the IOUtilities class, or ByteArrayOutputStream and ByteArrayInputStream.

OleDbConnection to Excel File in MOSS 2007 Shared Documents

I need to programmatically open an Excel file that is stored in a MOSS 2007 Shared Documents list. I'd like to use an OleDbConnection so that I can return the contents of the file as a DataTable. I believe this is possible, since a number of articles on the Web imply as much. Currently my code fails when initializing a new connection (oledbConn = new OleDbConnection(_connStringName);). The error message is:
Format of the initialization string does not conform to specification starting at index 0.
I believe I am just not able to figure out the right path to the file. Here is my code:
public DataTable GetData(string fileName, string workSheetName, string filePath)
{
// filePath == C:\inetpub\wwwroot\wss\VirtualDirectories\80\MySpWebAppName\Shared Documents\FY12_FHP_SPREADSHEET.xlsx
// Initialize global vars
_connStringName = DataSource.Conn_Excel(fileName, filePath).ToString();
_workSheetName = workSheetName;
dt = new DataTable();
//Create the connection object
if (!string.IsNullOrEmpty(_connStringName))
{
SPSecurity.RunWithElevatedPrivileges(delegate()
{
oledbConn = new OleDbConnection(_connStringName);
try
{
oledbConn.Open();
//Create OleDbCommand obj and select data from worksheet GrandTotals
OleDbCommand cmd = new OleDbCommand("SELECT * FROM " + _workSheetName + ";", oledbConn);
//create new OleDbDataAdapter
OleDbDataAdapter oleda = new OleDbDataAdapter();
oleda.SelectCommand = cmd;
oleda.Fill(dt);
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine(ex.Message);
}
finally
{
oledbConn.Close();
}
});
}
return dt;
}
public static OleDbConnection Conn_Excel(string ExcelFileName, string filePath)
{
// filePath == C:\inetpub\wwwroot\wss\VirtualDirectories\80\MySpWebAppName\Shared Documents\FY12_FHP_SPREADSHEET.xlsx
OleDbConnection myConn = new OleDbConnection();
myConn.ConnectionString = string.Format(#"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filePath + ";Extended Properties=Excel 12.0");
return myConn;
}
What am I doing wrong, or is there a better way to get the Excel file contents as a DataTable?
I ended up using the open source project Excel Data Reader.
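For reference, a minimal sketch of that approach using the classic ExcelDataReader API (type and member names assume its older releases; check them against whatever version you install):

// Open the workbook from disk and read one worksheet into a DataTable.
using (FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read))
using (IExcelDataReader reader = ExcelReaderFactory.CreateOpenXmlReader(stream)) // .xlsx
{
    reader.IsFirstRowAsColumnNames = true;
    DataSet result = reader.AsDataSet();
    DataTable dt = result.Tables[workSheetName]; // worksheet name as in the original method
}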
