How to pass a string to a beacon - altbeacon

I copied and edited this code in order to pass a string via AltBeacon, but I can only see two characters. For example, with String name = "Paulo" I only see "Pa". I do not understand why. This is the code I used:
if (Build.VERSION.SDK_INT >= 21) {
    // Call some material design APIs here
    device.setText("supported");
    // new code
    String stringToTransmit = "Paulo";
    byte[] stringToTransmitAsAsciiBytes = stringToTransmit.getBytes(StandardCharsets.US_ASCII);
    Beacon beacon = new Beacon.Builder()
            .setId1(MY_MATCHING_IDENTIFIER.toString())
            .setId2(Identifier.fromBytes(stringToTransmitAsAsciiBytes, 0, 5, false).toString())
            .setId3("2")
            .setManufacturer(0x0118)
            .setTxPower(-59)
            .setDataFields(Arrays.asList(new Long[] {255l}))
            .setBluetoothName(Identifier.fromBytes(stringToTransmitAsAsciiBytes, 0, 5, false).toString())
            .build();
    BeaconParser beaconParser = new BeaconParser()
            .setBeaconLayout("m:2-3=beac,i:4-19,i:20-21,i:22-23,p:24-24,d:25-25");
    BeaconTransmitter beaconTransmitter = new BeaconTransmitter(getActivity(), beaconParser);
    beaconTransmitter.startAdvertising(beacon);
}
and I read it back with this code:
for (Beacon b : beacons) {
    // new
    byte[] bytes = b.getId2().toByteArray();
    String receivedString = null;
    try {
        receivedString = new String(bytes, 0, bytes.length, "ASCII");
    } catch (UnsupportedEncodingException e) {
        e.printStackTrace();
    }
}

The code attempts to store the string "Paulo" into the AltBeacon Id2 field. The problem is that ASCII encoding uses one byte for each character, and the Id2 field is only two bytes long. Therefore, the field only has room for two characters. This is why you only see "Pa".
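If you need to transmit a short string like "Paulo", one option is to move it into the Id1 field, which spans 16 bytes (bytes 4-19) in the layout above. The following is only a rough, untested sketch, and note that if you currently match beacons on Id1 you would have to move that matching elsewhere:

// Sender side: pad the ASCII bytes out to 16 bytes so they fill Id1 (bytes 4-19 in the layout above).
String stringToTransmit = "Paulo";
byte[] asciiBytes = stringToTransmit.getBytes(StandardCharsets.US_ASCII);  // must be 16 bytes or fewer
byte[] padded = new byte[16];
System.arraycopy(asciiBytes, 0, padded, 0, asciiBytes.length);

Beacon beacon = new Beacon.Builder()
        .setId1(Identifier.fromBytes(padded, 0, 16, false).toString())
        .setId2("1")
        .setId3("2")
        .setManufacturer(0x0118)
        .setTxPower(-59)
        .build();

// Receiver side, inside the ranging callback where b is a detected Beacon:
byte[] received = b.getId1().toByteArray();
int length = 0;
while (length < received.length && received[length] != 0) {
    length++;  // stop at the zero padding
}
String receivedString = new String(received, 0, length, StandardCharsets.US_ASCII);

Alternatively, you could keep the string in Id2 by defining a custom beacon layout whose second identifier is wider than two bytes, but then every receiver has to be configured with that same custom layout.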

Related

Large file upload to ASP.NET Core 3.0 Web API fails due to Request Body Too Large

I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I followed the directions from the MS docs below to set up the endpoint.
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server in Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per advice from the above article, I have incrementally updated the Kestrel configuration with the following:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder.UseKestrel((ctx, options) =>
    {
        var config = ctx.Configuration;
        options.Limits.MaxRequestBodySize = 6000000000;
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.MinResponseDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.RequestHeadersTimeout =
            TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
});
I also configured FormOptions to accept files up to 6000000000 bytes:
services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = 6000000000;
});
And also set up the API controller with the following attributes, per advice from the article
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written but I have merged it into one method as part of the debugging exercise.
public async Task<IActionResult> Audio()
{
if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
{
throw new ArgumentException("The media file could not be processed.");
}
string mediaId = string.Empty;
string instructorId = string.Empty;
try
{
// process file first
KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
var streamedFileContent = new byte[0];
var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(Request.ContentType),
_defaultFormOptions.MultipartBoundaryLengthLimit
);
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();
while (section != null)
{
var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);
if (hasContentDispositionHeader)
{
if (MultipartRequestHelper
.HasFileContentDisposition(contentDisposition))
{
streamedFileContent =
await FileHelpers.ProcessStreamedFile(section, contentDisposition,
_permittedExtensions, _fileSizeLimit);
}
else if (MultipartRequestHelper
.HasFormDataContentDisposition(contentDisposition))
{
var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
var encoding = FileHelpers.GetEncoding(section);
if (encoding == null)
{
return BadRequest($"The request could not be processed: Bad Encoding");
}
using (var streamReader = new StreamReader(
section.Body,
encoding,
detectEncodingFromByteOrderMarks: true,
bufferSize: 1024,
leaveOpen: true))
{
// The value length limit is enforced by
// MultipartBodyLengthLimit
var value = await streamReader.ReadToEndAsync();
if (string.Equals(value, "undefined",
StringComparison.OrdinalIgnoreCase))
{
value = string.Empty;
}
formAccumulator.Append(key, value);
if (formAccumulator.ValueCount >
_defaultFormOptions.ValueCountLimit)
{
return BadRequest($"The request could not be processed: Key Count limit exceeded.");
}
}
}
}
// Drain any remaining section body that hasn't been consumed and
// read the headers for the next section.
section = await reader.ReadNextSectionAsync();
}
var form = formAccumulator;
var file = streamedFileContent;
var results = form.GetResults();
instructorId = results["instructorId"];
string title = results["title"];
string firstName = results["firstName"];
string lastName = results["lastName"];
string durationInMinutes = results["durationInMinutes"];
//mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);
string fileExtension = "m4a";
// Generate Container Name - InstructorSpecific
string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";
string contentType = "audio/mp4";
FileType fileType = FileType.audio;
string authorName = $"{firstName} {lastName}";
string authorShortName = $"{firstName[0]}{lastName}";
string description = $"{authorShortName} - {title}";
long duration = (Convert.ToInt32(durationInMinutes) * 60000);
// Generate new filename
string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";
DateTime recordingDate = DateTime.UtcNow;
DateTime uploadDate = DateTime.UtcNow;
long blobSize = long.MinValue;
try
{
// Update file properties in storage
Dictionary<string, string> fileProperties = new Dictionary<string, string>();
fileProperties.Add("ContentType", contentType);
// update file metadata in storage
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("author", authorShortName);
metadata.Add("tite", title);
metadata.Add("description", description);
metadata.Add("duration", duration.ToString());
metadata.Add("recordingDate", recordingDate.ToString());
metadata.Add("uploadDate", uploadDate.ToString());
var fileNameWExt = $"{fileName}.{fileExtension}";
var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
try
{
MemoryStream fileContent = new MemoryStream(streamedFileContent);
fileContent.Position = 0;
using (fileContent)
{
await blobContainer.UploadFromStreamAsync(fileContent);
}
}
catch (StorageException e)
{
if (e.RequestInformation.HttpStatusCode == 403)
{
return BadRequest(e.Message);
}
else
{
return BadRequest(e.Message);
}
}
try
{
foreach (var key in metadata.Keys.ToList())
{
blobContainer.Metadata.Add(key, metadata[key]);
}
await blobContainer.SetMetadataAsync();
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
blobSize = await StorageUtils.GetBlobSize(blobContainer);
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate, ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()), StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue), Manifest.Create(new Dictionary<string, string>()));
// upload to MongoDB
if (media != null)
{
var mapper = new Mapper(_mapperConfiguration);
var dao = mapper.Map<ContentDAO>(media);
try
{
await _db.Content.InsertOneAsync(dao);
}
catch (Exception)
{
mediaId = string.Empty;
}
mediaId = dao.Id.ToString();
}
else
{
// metadata wasn't stored, remove blob
await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
return BadRequest($"An issue occurred during media upload: rolling back storage change");
}
if (string.IsNullOrEmpty(mediaId))
{
return BadRequest($"Could not add instructor media");
}
}
catch (Exception ex)
{
return BadRequest(ex.Message);
}
var result = new { MediaId = mediaId, InstructorId = instructorId };
return Ok(result);
}
I reiterate: this all works great locally. I do not run it in IIS Express; I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps are. I have refactored this code a few times, but I suspect the problem is environment-related.
I cannot find anything that mentions the App Service Plan tier as the culprit, so before I go out spending more money I wanted to see if anyone here had encountered this challenge and could provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate for incoming traffic. Upgrading didn't work either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, AND it has a file size limitation that does not appear to be overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands but I need to move on with development.

I am writing a 3DES (using SHA1 hash) encryption algorithm in C#. Key size error

I am writing a 3DES (using a SHA1 hash) encryption algorithm in C#.
I get a size error at tdes.Key = keyArray in the following code. I do not know what went wrong.
public static string Encrypt(string toEncrypt, bool useHashing)
{
    byte[] keyArray;
    byte[] toEncryptArray = UTF8Encoding.UTF8.GetBytes(toEncrypt);
    System.Configuration.AppSettingsReader settingsReader = new AppSettingsReader();
    // Get the key from config file
    string key = (string)settingsReader.GetValue("SecurityKey", typeof(String));
    //System.Windows.Forms.MessageBox.Show(key);
    if (useHashing)
    {
        SHA1CryptoServiceProvider objSHA1CryptoService = new SHA1CryptoServiceProvider();
        keyArray = objSHA1CryptoService.ComputeHash(UTF8Encoding.UTF8.GetBytes(key));
        objSHA1CryptoService.Clear();
    }
    else
        keyArray = UTF8Encoding.UTF8.GetBytes(key);
    TripleDESCryptoServiceProvider tdes = new TripleDESCryptoServiceProvider();
    tdes.Key = keyArray;
    tdes.Mode = CipherMode.ECB;
    tdes.Padding = PaddingMode.PKCS7;
    ICryptoTransform cTransform = tdes.CreateEncryptor();
    byte[] resultArray = cTransform.TransformFinalBlock(toEncryptArray, 0, toEncryptArray.Length);
    tdes.Clear();
    return Convert.ToBase64String(resultArray, 0, resultArray.Length);
}
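The likely cause of the key size error: SHA-1 produces a 20-byte digest, while Triple DES only accepts 16- or 24-byte keys, so assigning the 20-byte hash to tdes.Key throws. The sketch below illustrates the same mismatch in Java rather than C# (the constraint is identical), and the way the digest is extended to 24 bytes is purely illustrative, not a recommended key-derivation scheme:

import java.security.MessageDigest;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class TripleDesKeySize {
    public static void main(String[] args) throws Exception {
        // SHA-1 always yields 20 bytes; Triple DES expects a 16- or 24-byte key.
        byte[] digest = MessageDigest.getInstance("SHA-1").digest("SecurityKey".getBytes("UTF-8"));
        System.out.println(digest.length); // prints 20

        // Illustrative workaround: extend the digest to 24 bytes by reusing its first 4 bytes.
        byte[] key24 = Arrays.copyOf(digest, 24);
        System.arraycopy(digest, 0, key24, 20, 4);

        Cipher cipher = Cipher.getInstance("DESede/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key24, "DESede"));
        byte[] encrypted = cipher.doFinal("hello".getBytes("UTF-8"));
        System.out.println(java.util.Base64.getEncoder().encodeToString(encrypted));
    }
}

In the C# code above the equivalent choice would be to derive a key of a legal length (16 or 24 bytes) before assigning it to tdes.Key.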

TryParse a value from a file

What I am trying to do is take the variable from the file but throw an exception if the input is not a number. I just want an error message to show when the entered amount is a word or a negative number. I want to use a try/catch but am not sure how to structure it. Thank you guys.
StreamReader read = new StreamReader("../../data.dat");
Stopwatch st = new Stopwatch();
bool ok;
int num;
string input = (read.ReadLine());
ok = int.TryParse(input, out num);
if (ok == false)
{
    throw new Exception("Input in incorrect format");
}
int sum = 0;
Scanner scan = new Scanner("../../data.dat");
int num = Integer.MIN_VALUE;
try {
    num = Integer.parseInt(scan.next());
} catch (Exception e) {
    System.out.println("Input in incorrect format.");
    e.printStackTrace();
}
scan.close();
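The Java snippet above only catches non-numeric input; the question also asks to reject negative amounts. Note too that new Scanner("../../data.dat") scans that literal string rather than the file, so pass a File instead. A minimal sketch handling both points (the file name and messages are placeholders):

import java.io.File;
import java.util.Scanner;

public class ReadAmount {
    public static void main(String[] args) throws Exception {
        Scanner scan = new Scanner(new File("../../data.dat"));
        try {
            int num = Integer.parseInt(scan.next());
            if (num < 0) {
                // treat a negative amount the same way as a non-numeric one
                throw new NumberFormatException("negative amount: " + num);
            }
            System.out.println("Amount: " + num);
        } catch (NumberFormatException e) {
            System.out.println("Input in incorrect format.");
        } finally {
            scan.close();
        }
    }
}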

Read incoming sms without spaces

The following code reads an incoming sms and then prints the body of the message. How do I get the app to print out the message without any spaces in between?
For example: The incoming sms reads "Here I am", so "Here I am" is printed out, but I want the app to print out "HereIam".
How can I do this? Any help would be most appreciated.
Here is my code:
public void run() {
    try {
        DatagramConnection _dc = (DatagramConnection) Connector.open("sms://");
        for (;;) {
            Datagram d = _dc.newDatagram(_dc.getMaximumLength());
            _dc.receive(d);
            byte[] bytes = d.getData();
            String address = d.getAddress();
            String msg = new String(bytes);
            System.out.println(address);
            System.out.println(msg);
        }
    } catch (Exception me) {
    }
}
Thanks
Try this. Add this line to your code:
System.out.println(replaceAll(msg, " ", ""));
Add this method as well:
public static String replaceAll(String source, String pattern, String replacement) {
    if (source == null)
        return "";
    StringBuffer sb = new StringBuffer();
    int idx = -1;
    int patIdx = 0;
    while ((idx = source.indexOf(pattern, patIdx)) != -1) {
        sb.append(source.substring(patIdx, idx));
        sb.append(replacement);
        patIdx = idx + pattern.length();
    }
    sb.append(source.substring(patIdx));
    return sb.toString();
}
It replaces all the spaces with empty string, which is what you want.
You can also use String.replaceAll() with a regex (note that String.replace() does a literal replacement, and replaceAll() may not be available on the BlackBerry Java ME String class, which is why the helper above exists):
msg = msg.replaceAll("\\s+", "");

Read Image stored in Oracle using Long DataType

I want to read an image stored in an Oracle LONG datatype column.
A number of images are stored in a remote Oracle database in a column with datatype LONG. I just need to retrieve those images and show them on my aspx page.
I could retrieve the image from the database, but when I tried to cast it to a byte array it threw an error that a string cannot be converted to byte[].
Does anybody have any suggestions on how to retrieve these images stored in a LONG column in the database?
byte[] signatureBlobReceived = cls_TBL_BROKER_BL.GetInstance().GetSignatureBlobFromAccountNumber_BL(strCRNnumber);
return File(signatureBlobReceived, "image/jpeg");

public byte[] GetSignatureBlobFromAccountNumber_BL()
{
    object SignatureBlob = null;
    Database db = DatabaseFactory.CreateDatabase("imageConnectionString");
    DbCommand dbc = db.GetSqlStringCommand(ConfigurationSettings.AppSettings["signqry"].ToString());
    dbc.CommandType = CommandType.Text;
    SignatureBlob = db.ExecuteScalar(dbc);
    byte[] array = Encoding.ASCII.GetBytes(Convert.ToString(SignatureBlob));
    string aa = string.Empty;
    return array;
}
Query used is:
<add key="signqry" value="SELECT image FROM table1"/> `
Try this (odp.net)
string connStr = "User Id=user;Password=pwd;Data Source=mySID;";
OracleConnection _conn = new OracleConnection(connStr);
_conn.Open();

string sel = @"select long_raw_col from long_raw_test";
OracleCommand cmd = new OracleCommand(sel, _conn);
cmd.InitialLONGFetchSize = 5000;
OracleDataReader reader = cmd.ExecuteReader();
int rows = 0;

// loop through rows from table
while (reader.Read())
{
    rows++;
    byte[] buf = new byte[5000];
    long bytesRead = reader.GetBytes(reader.GetOrdinal("long_raw_col"), 0, buf, 0, 5000);
    FileStream fs = new FileStream("C:\\test\\test_long" + rows + ".dat", FileMode.Create);
    fs.Write(buf, 0, (int)bytesRead);
    fs.Close();
    Console.WriteLine("Row " + rows + ": Read " + bytesRead + " bytes from table, see test_long" + rows + ".dat");
}
This example just reads the LONG RAW data from Oracle into a byte array, then writes it out to a file. Note the InitialLONGFetchSize > 0.
I use this class: my database is Informix and the images are stored in a Byte type. Hope this can help you.
public class MyPhoto
{
public static Stream RetrievePhoto()
{
DBConnection DAL_Helper = new DBConnection(ConfigurationSettings.AppSettings["connection"].ToString());
Byte[] myByteBuff;
Stream myImgStream;
string qry = "----------";
DataTable dt = DAL_Helper.Return_DataTable(qry);
try
{
if (dt.Rows.Count > 0)
{
if (!string.IsNullOrEmpty(dt.Rows[0][0].ToString()))
{
myByteBuff = (Byte[])((object)(dt.Rows[0][0]));
myImgStream = new MemoryStream(myByteBuff);
}
else
{
myImgStream = RetrievePhotoNoProfile();
}
}
else
{
myImgStream = RetrievePhotoNoProfile();
}
}
catch (Exception ex)
{
myImgStream = RetrievePhotoNoProfile();
}
return myImgStream;
}
public static byte[] StreamToByteArray(Stream stream)
{
if (stream is MemoryStream)
{
return ((MemoryStream)stream).ToArray();
}
else
{
return ReadFully(stream);
}
}
public static byte[] ReadFully(Stream input)
{
byte[] buffer = new byte[input.Length];
using (MemoryStream ms = new MemoryStream())
{
int read;
while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
{
ms.Write(buffer, 0, read);
}
return ms.ToArray();
}
}
private static Stream RetrievePhotoNoProfile()
{
string noprofileimgPath = HttpContext.Current.Server.MapPath("~/images/noprofile.png");
System.IO.FileStream fs = new System.IO.FileStream(noprofileimgPath, System.IO.FileMode.Open, FileAccess.Read);
byte[] ba = new byte[fs.Length];
fs.Read(ba, 0, (int)fs.Length);
Stream myImgStream = new MemoryStream(ba);
fs.Close();
return myImgStream;
}
public static Image byteArrayToImage(byte[] byteArrayIn)
{
MemoryStream ms = new MemoryStream(byteArrayIn);
Image returnImage = Image.FromStream(ms);
return returnImage;
}
}
