IoT Edge Offline Data Storage decoding .log data file - azure-iot-edge

I have mounted local storage to my Edge Hub/Agent modules, and when the device goes offline it stores the data locally. In the event that the device can't come back online, I need to be able to read the offline data and send it to IoT Hub. After looking at the file, some portion of the message is base64 encoded, and some parts contain non-base64 characters.
Is there a method for decoding the message, or any architecture patterns to support cases where a device can't come back online to upload the data?
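There is no documented decoder for the edgeHub store (it is an internal, RocksDB-backed format that can change between releases), so any manual extraction is best-effort. As a rough sketch in Objective-C, you can scan a dump of the file for base64-looking runs and try to decode them; the file path, the 24-character minimum run length, and the UTF-8 assumption are all guesses on my part, not part of any documented format:

#import <Foundation/Foundation.h>

// Scans a text dump for runs that look like base64 and prints any that
// decode cleanly to UTF-8. Runs whose length is not a valid base64 length
// simply fail to decode and are skipped.
static void DumpBase64Payloads(NSString *path) {
    NSString *raw = [NSString stringWithContentsOfFile:path
                                              encoding:NSISOLatin1StringEncoding
                                                 error:NULL];
    if (!raw) return;

    NSRegularExpression *re =
        [NSRegularExpression regularExpressionWithPattern:@"[A-Za-z0-9+/]{24,}={0,2}"
                                                  options:0
                                                    error:NULL];
    [re enumerateMatchesInString:raw
                         options:0
                           range:NSMakeRange(0, raw.length)
                      usingBlock:^(NSTextCheckingResult *match, NSMatchingFlags flags, BOOL *stop) {
        NSString *candidate = [raw substringWithRange:match.range];
        NSData *decoded = [[NSData alloc] initWithBase64EncodedString:candidate options:0];
        NSString *text = decoded ? [[NSString alloc] initWithData:decoded
                                                         encoding:NSUTF8StringEncoding]
                                 : nil;
        if (text) printf("%s\n", text.UTF8String);
    }];
}

For the architecture-pattern half of the question, one commonly suggested alternative is to have your own module write data to the Azure Blob Storage on IoT Edge module, which stores it locally in a format you control and can upload it automatically when connectivity returns.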

Related

How much data can I store in IndexedDB from a mobile phone using a PWA

I have a PWA designed in Ionic and we are using it to take pictures. However, the requirement is that it needs to work offline, so we are storing these pictures in IndexedDB.
However, I have been unable to track down how much storage we will have access to in IndexedDB, and I am concerned it will only be around 50MB or so. The customer potentially wants to store 100 images or more offline, and when the device gets back to Wi-Fi the app will start sending them up to a remote API.
Is there any information on storage capacity for IndexedDB when a PWA is run on a mobile device, either iOS or Android?
I have seen some posts, but they mainly talk about storage of the JavaScript files, whereas this is actual data captured while the app is running.
NOTE: Individual images will be no more than 2-3MB each in size.

How to get the saved text file from my iOS App to PC?

I built an iOS app that collects data via BLE and saves it as text files. What I want to do now is retrieve the saved data on my PC (Windows) for further analysis. As the developer, I know I can download the app container via Xcode to access the saved files, and it works well. However, I wonder if there is any approach that gets me the saved files without using Xcode? Can I save the text files to a public location so I can access them directly?
In short, you can't do what you want in this exact way, as iOS devices do not support the USB mass storage protocol. The only way to get files out is using iTunes, which can access your iOS device's Documents folder.
Of course, you could reverse-engineer that protocol, but that's a bit unreliable and might break if Apple ever changes something.
Instead, find another way. E.g., you could have your PC app contain a tiny HTTP server and have your iOS app send it an HTTP request containing the file's data. Alternatively, you could go via an actual server on the internet.
Or, given all iOS users have iCloud, you could also just save your data to iCloud, then have your PC users install Apple's iCloud for Windows stuff and then just access the file from there.
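To make the HTTP suggestion concrete, here is a minimal sketch of the iOS side; the address, port, and endpoint path are placeholders for whatever small server you run on the Windows machine:

#import <Foundation/Foundation.h>

// Sends a saved text file to a hypothetical HTTP endpoint on the PC.
// Note: plain http requires an App Transport Security exception in Info.plist.
- (void)uploadLogFileAtURL:(NSURL *)fileURL {
    NSURL *endpoint = [NSURL URLWithString:@"http://192.168.1.10:8080/upload"];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:endpoint];
    request.HTTPMethod = @"POST";
    [request setValue:@"text/plain" forHTTPHeaderField:@"Content-Type"];

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromFile:fileURL
                                          completionHandler:^(NSData *data,
                                                              NSURLResponse *response,
                                                              NSError *error) {
        if (error) {
            NSLog(@"Upload failed: %@", error.localizedDescription);
        } else {
            NSLog(@"Upload finished: %@", response);
        }
    }];
    [task resume];
}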
If I understand you correctly, you want to access the client's data on the PC.
You can create a file server, or use a third-party service like Amazon S3 (https://aws.amazon.com/s3/).
The client uploads the collected data over HTTP/HTTPS.
The PC downloads the client's data for analysis.

Is iOS Data Protection compatible with the Parse local datastore?

Apple offers the Data Protection capability to apps, which encrypts data stored on the device when the device is locked. The Parse iOS local datastore is apparently stored as an unencrypted SQLite database. Is it possible to apply the required Data Protection attributes to the local datastore, specifically NSFileProtectionComplete? The Parse iOS docs don't say anything about this.
I have already applied the appropriate CLPs and ACLs to my Parse classes/objects. I'm looking into iOS Data Protection so that if a user's phone is lost or stolen and a device passcode is in place, the data inside their local datastore cannot be read.
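One thing to try, with the caveat that the datastore location is not a documented API (the path below is a guess based on Parse's "Private Documents" convention, so verify it on a device first): apply the protection attribute yourself once the store exists. Note that NSFileProtectionComplete makes the file unreadable while the device is locked, which could break any Parse work your app does in the background.

#import <Foundation/Foundation.h>

// Attempts to apply NSFileProtectionComplete to the Parse local datastore.
// The path is an assumption about where the SDK keeps its SQLite file.
- (void)protectParseLocalDatastore {
    NSString *library = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory,
                                                            NSUserDomainMask, YES).firstObject;
    NSString *storePath = [library stringByAppendingPathComponent:
                              @"Private Documents/Parse/ParseOfflineStore"];

    NSError *error = nil;
    BOOL ok = [[NSFileManager defaultManager]
        setAttributes:@{NSFileProtectionKey : NSFileProtectionComplete}
         ofItemAtPath:storePath
                error:&error];
    if (!ok) {
        NSLog(@"Could not set protection: %@", error.localizedDescription);
    }
}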

iOS broadcasting live to Azure Media Services

I am trying to make a Periscope-like app (not practically, but technical requirements are alike) where users can start streaming quickly from their iPhone to an unknown amount of users, both mobile. I am trying to use Azure Media Services for live video streaming, but even after reading pages of documentation I'm stuck.
I'm using VideoCore (https://github.com/jgh-/VideoCore) to publish from an iOS device to the RTMP server. Locally (using Wowza), I can just connect to the local server with the username and password I set, as shown:
vcSession = [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720) frameRate:30 bitrate:1000000 useInterfaceOrientation:NO];
[self.view addSubview:vcSession.previewView];
vcSession.previewView.frame = self.view.bounds;
vcSession.delegate = self;
[vcSession startRtmpSessionWithURL:@"rtmp://172.20.10.2:1935/live?rtmpauth=test:test" andStreamKey:@"test"];
Where the rtmpauth parameter has the username:password format; I've set both to test on my local server, and it works. In Azure, I've created a channel named test, and I've got the following Ingest URL:
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string
In Wirecast, I'm able to stream to that URL (though it's EXTREMELY slow and the connection is frequently lost; I don't know why) by selecting Azure Media Services in Output Settings and typing the Ingest URL. On iOS, I have no idea how to connect to Azure Media Services.
In the startRtmpSessionWithURL:andStreamKey: method, I've tried all the possible combinations of URL and stream key, but no luck. I have no idea what my username/password is (nothing is given on the Azure side), what the stream key is (I've tried test, live, and an empty string), or what that long hexadecimal string is (some sources say it's called a locator, though).
What is the correct format of RTMP URL and stream key when connecting to Azure Media Services for streaming?
I'll find someone to help you. I think you are just missing a stream name after the long hex string in the URL.
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string/[YOUR-CUSTOM-STREAM-NAME-Anything Really!]
Also, do you have any control over the encoding settings? It's possible that some encoding settings are not right. We have not tested with that VideoCore library, so it may also be that there is a slight variation in the RTMP protocol (since it is very poorly documented and there is a lot of missing information out there).
I'm curious why your Wirecast setup is having trouble as well. That doesn't sound good to start with. Network issue? Are you setting it to the proper encoder preset with H.264 and NOT x264?
Review your settings in Wirecast against Cenk's blog post here: http://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/
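Putting that together (untested with VideoCore on my side, and whether the stream key must match the appended stream name is an assumption), the corrected call would look something like:

// Ingest URL from the Azure portal, with a custom stream name appended.
// "mystream" is arbitrary; the same value is passed as the stream key here,
// which is a guess rather than a documented requirement.
NSString *ingestURL = @"rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/"
                      @"some-long-hexadecimal-string/mystream";
[vcSession startRtmpSessionWithURL:ingestURL andStreamKey:@"mystream"];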

Why/when does iOS convert audio attachments sent using Messages into different formats?

Our app sends AIFF files as attachments in Messages using UIActivityViewController. Sometimes, when the message arrives on a mobile device or desktop machine, the file example.aiff has become example.amr or example.aiff.m4a.
I imagine that Apple is converting to a compressed format to stay under some size limits, but I can't find any documentation about the specific circumstances under which this happens. Does anyone know the details here?
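For reference, nothing in the standard share-sheet flow requests a format conversion; whatever re-encoding happens is done by iOS after the user picks Messages. A minimal sketch of that flow (the method and parameter names are placeholders):

#import <UIKit/UIKit.h>

// Presents the standard share sheet for an AIFF file. Any re-encoding of
// the attachment (e.g. to AMR or M4A) happens inside iOS, not in this code.
- (void)shareRecordingAtURL:(NSURL *)aiffURL fromViewController:(UIViewController *)vc {
    UIActivityViewController *activity =
        [[UIActivityViewController alloc] initWithActivityItems:@[ aiffURL ]
                                          applicationActivities:nil];
    [vc presentViewController:activity animated:YES completion:nil];
}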
