How to transfer a file over Bluetooth using Ymodem in iOS (Swift)

I am tasked with updating a Bluetooth device's firmware. I am using Swift and need some idea of how to accomplish this. Here are the instructions I was given:
Protocol: Ymodem. Handshake with the unit. In this Ymodem variant, the
1024-byte block size is changed to 256 bytes; for the detailed algorithm,
refer to the source code we provide.
I have the .bin file and am able to download it into my app, but I have no idea how to transfer it to the device using Ymodem after I make the Bluetooth connection. I see very little on the web about Ymodem and iOS.
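The packet framing itself is the easier half of the problem and is standard YModem: a header byte, the block number and its one's complement, the payload, and a big-endian CRC16-CCITT. Below is a minimal Swift sketch adapted to the 256-byte block size the instructions describe. Note the assumption that the header byte stays STX (0x02) for the shortened blocks; the vendor's reference source code is authoritative on that detail.

```swift
import Foundation

// CRC16-CCITT, XMODEM variant (polynomial 0x1021, initial value 0x0000),
// which is the checksum YModem uses on data packets.
func crc16Xmodem(_ data: [UInt8]) -> UInt16 {
    var crc: UInt16 = 0
    for byte in data {
        crc ^= UInt16(byte) << 8
        for _ in 0..<8 {
            crc = (crc & 0x8000) != 0 ? (crc << 1) ^ 0x1021 : crc << 1
        }
    }
    return crc
}

let STX: UInt8 = 0x02   // YModem header byte for long data blocks

// Build one YModem data packet: header, block number, its one's complement,
// payload padded to blockSize with 0x1A (CPMEOF), then CRC16 big-endian.
// blockSize is 256 per the vendor's modified protocol; using STX as the
// header for these shortened blocks is an assumption.
func makePacket(blockNumber: UInt8, payload: [UInt8], blockSize: Int = 256) -> [UInt8] {
    var block = payload
    if block.count < blockSize {
        block.append(contentsOf: [UInt8](repeating: 0x1A, count: blockSize - block.count))
    }
    var packet: [UInt8] = [STX, blockNumber, ~blockNumber]
    packet.append(contentsOf: block)
    let crc = crc16Xmodem(block)
    packet.append(UInt8(crc >> 8))
    packet.append(UInt8(crc & 0xFF))
    return packet
}
```

In standard YModem, block 0 carries the file name and size (zero-padded rather than 0x1A-padded), and the sender waits for the receiver's ACK (0x06) after each packet. Over BLE you would additionally split each packet into writes no larger than the characteristic's MTU.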

Related

UUID when using Bluetooth with IOS and Arduino

I'm attempting to send data between an iPad and Arduino via Bluetooth.
I read several online tutorials but I'm confused about the UUIDs used in them.
How do I find the UUIDs used by services and characteristics for the specific Bluetooth module connected to the Arduino?
Is there an AT command for this? Can they be set? Are they a constant?
This is a pretty simple app, and I already have the Bluetooth module communicating with the Arduino via a terminal program on my Mac. The iOS app just needs to transmit and receive a few bytes of data. Do I really need all the code involved with Core Bluetooth, or is there an easier way? What's the simplest, fastest way to implement two-way communication of a few bytes via Bluetooth?
It appears that the HC-06 Bluetooth module is not BLE and is not compatible with iOS devices. I have succeeded in connecting it to my Mac and have written a test app using ORSSerialPort.
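One way to answer the UUID question empirically is to enumerate them at runtime: connect with Core Bluetooth and print every service and characteristic the module exposes. Many inexpensive BLE UART modules (e.g. the HM-10) expose service FFE0 with characteristic FFE1, but your module's datasheet is authoritative. A minimal sketch:

```swift
import CoreBluetooth

// Minimal sketch: scan for any peripheral, connect, and print every
// service and characteristic UUID it exposes.
final class UUIDDiscoverer: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?   // must keep a strong reference

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil, options: nil) // nil = all
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Found:", peripheral.name ?? "unnamed")
        self.peripheral = peripheral
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices(nil)
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            print("Service:", service.uuid)
            peripheral.discoverCharacteristics(nil, for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for ch in service.characteristics ?? [] {
            print("  Characteristic:", ch.uuid, "properties:", ch.properties)
        }
    }
}
```

For a few bytes in each direction, this delegate boilerplate really is about as small as Core Bluetooth gets; wrapper libraries exist, but they all sit on top of the same flow.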

Redpark Serial Cable Modbus RTU iOS

I'm playing around with a Redpark Serial Cable and the External Accessory framework to talk to a device through Modbus RTU using libmodbus. I found an Objective-C wrapper that I have already used to do this through Modbus TCP.
I am having trouble getting the library to use the Serial Cable as the device to connect. I was wondering if anyone has tried doing this before.
My question really is how does iOS "talk" to the external accessory? What are the paths to these port locations?
I believe that on OS X these paths lie in the /dev directory.
I am trying to use the following function
modbus_t *modbus_new_rtu(const char *device, int baud, char parity, int data_bit, int stop_bit);
Here is the documentation.
From my brief research on the Redpark Lightning Serial Cable, the cable works with the Rsc Mgr SDK. I suspect you'll need to port libmodbus to iOS using the Rsc Mgr SDK for access to the serial data rather than having libmodbus open a serial port directly.
When the iOS accessory manager receives data from the cable and we
receive an event that data is available in the read stream, the
readBytesAvailable call is made - source
iOS appears to talk to the external accessory via the iOS accessory manager.
This may be a bit late for an answer, but I'll give it a go anyway. You will have to modify the Modbus RTU implementation in the original libmodbus to use the Redpark SDK's read/write methods, and adapt the serial settings to the Redpark SDK. A decent guideline may be the Arduino port of libmodbus, which contains the modifications needed to run on Arduino. And since the Redpark SDK is Objective-C, the libmodbus files will need to be renamed with a '.m' extension.
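For reference, the RTU framing itself is independent of the transport: slave address, function code, data, and a CRC-16 (reflected polynomial 0xA001, initial value 0xFFFF) appended low byte first. A minimal Swift sketch of the frame you would hand to the Redpark SDK's write call (the function names here are illustrative, not part of any SDK):

```swift
import Foundation

// Modbus CRC-16: reflected polynomial 0xA001, initial value 0xFFFF.
func modbusCRC(_ data: [UInt8]) -> UInt16 {
    var crc: UInt16 = 0xFFFF
    for byte in data {
        crc ^= UInt16(byte)
        for _ in 0..<8 {
            crc = (crc & 1) != 0 ? (crc >> 1) ^ 0xA001 : crc >> 1
        }
    }
    return crc
}

// Build a Modbus RTU ADU: address, function code, payload, CRC low byte first.
func rtuFrame(slave: UInt8, function: UInt8, payload: [UInt8]) -> [UInt8] {
    var frame: [UInt8] = [slave, function]
    frame.append(contentsOf: payload)
    let crc = modbusCRC(frame)
    frame.append(UInt8(crc & 0xFF))   // CRC is transmitted low byte first
    frame.append(UInt8(crc >> 8))
    return frame
}

// Example: read 10 holding registers starting at address 0 from slave 1
// (function 0x03). These bytes would go through the Redpark write path.
let request = rtuFrame(slave: 1, function: 0x03,
                       payload: [0x00, 0x00, 0x00, 0x0A])
```

A handy self-check for this CRC variant: recomputing the CRC over a frame that already has its CRC appended yields 0.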

external video input into iOS devices

I am working on an experimental project that needs to connect an external video camera to an iPhone.
I found out that we can connect the iPhone to external hardware such as an Arduino using the Redpark cable, which ships with an SDK. But I am not sure how iOS handles the raw data coming from the external camera.
I am wondering if AVFoundation can handle this part, since we can specify the input device, but I am not sure how to point it at an external device.
Or is there any other framework that can handle this task?
I am looking for a tutorial or sample project from which I can learn more about this.
The decoding you need to do depends entirely on the camera you will use.
But, given the data rate limitation of the serial cable you are considering, you will be practically limited to using a camera that can provide a low bit rate h.264 stream.
Decoding such a stream can be done with the ffmpeg library. Instructions for integrating it into an iOS project can be found in this SO question.

Bluetooth connection to LEGO Mindstorms EV3 brick from iOS app

Does anybody know how to establish a bluetooth connection from a self-written iOS app to the
new LEGO Mindstorms EV3 programmable brick?
I tried to do this via the scanForPeripheralsWithServices:options: method of CBCentralManager,
but the brick is not recognized.
But if I enable Bluetooth in the Settings of the iPhone, then the EV3 device is displayed there. There is also an app in the AppStore from LEGO ("Commander") which talks to the brick via Bluetooth, so I think this should be possible in general (as I know, it was not possible for the previous Mindstorms NXT brick).
Does anybody have an idea how I can do this?
Thanks!
As said, the device isn't listed by Core Bluetooth; I got it using the ExternalAccessory framework. You need to have the item "COM.LEGO.MINDSTORMS.EV3" in UISupportedExternalAccessoryProtocols in your App-Info.plist:
<EAAccessory: 0x15567180> {
connected:YES
connectionID:18565483
name: MFI Accessory
manufacturer: LEGO
modelNumber: DM240411
serialNumber:
firmwareRevision: 1.0.0
hardwareRevision: 1.0.0
protocols: (
"COM.LEGO.MINDSTORMS.EV3"
)
delegate: (null)
}
As with the Lego app, you need to first connect to the EV3 using Settings App.
Then, look at Apple's EADemo sample; it shows how to use EASession (which encapsulates the read/write streams).
Maybe sending the same data as the C# code gathered from the monobrick.dk source (mentioned in Mailerdaimon's answer) will work. I'll give it a try via WiFi first (after porting the C# to Objective-C, a long job); after that, writing to an EASession might be easier. I'll update this answer when done.
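A minimal Swift sketch of that EASession setup, assuming the protocol string from the accessory dump above, the Info.plist entry already in place, and the brick already paired in the Settings app:

```swift
import ExternalAccessory

let ev3Protocol = "COM.LEGO.MINDSTORMS.EV3"

// Find the EV3 among connected accessories and open an EASession
// for its protocol string. Returns nil if the brick isn't paired.
func openEV3Session() -> EASession? {
    let accessories = EAAccessoryManager.shared().connectedAccessories
    guard let ev3 = accessories.first(where: { $0.protocolStrings.contains(ev3Protocol) }),
          let session = EASession(accessory: ev3, forProtocol: ev3Protocol) else {
        return nil
    }
    // Streams must be scheduled on a run loop and opened before use.
    session.inputStream?.schedule(in: .current, forMode: .default)
    session.inputStream?.open()
    session.outputStream?.schedule(in: .current, forMode: .default)
    session.outputStream?.open()
    return session
}
```

From there, bytes go out via `outputStream?.write(_:maxLength:)` and arrive via the usual `StreamDelegate` callbacks, exactly as in the EADemo sample.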
You will have to wait until LEGO releases the SDK, which hopefully contains information about the protocol.
It was possible with the NXT, and I think it will be possible with the EV3.
In the meantime you could try to send your messages via WiFi, which is possible right now.
Note that there are two protocols with which the EV3 communicates over Bluetooth. One is used by the LEGO EV3 app on the iPhone; the other is the same one available over USB HID and WiFi, and is partly specified by the communication developer manual and by the source code. The latter protocol is the one you should use.
You can call, link against, or check the source code of my uploader for c4ev3 to see how the connection is realized.
HTH.
1. Enable the Bluetooth and iPhone/iPod option on the EV3 brick (this can be done from the tools menu on the EV3).
2. Enable Bluetooth on the iPhone.
3. Start the Bluetooth pairing process.
4. Launch the LEGO EV3 app on the iPhone.
Done.

Using gPhoto on iOS to communicate with digital cameras over USB

I want bidirectional USB communication between an iOS device and a digital camera using gPhoto2. gPhoto2 "abstracts communication ports and camera protocol, to allow a complete modularity."
Issues I've found:
Apple's strict requirements for apps interfacing with the iOS hardware layer will inevitably lead to rejection in the App Store approval process. Joining the MFi program may mitigate this issue.
Getting full access to the Lightning/30-pin dock connector to send/receive USB packets may require a private iOS library such as IOKit, which would get my binary rejected from the App Store.
Connecting a camera via Lightning/30-pin launches a PTPCamera-like task so the Photos app can take over and import photos. On OS X that task must be killed to get full USB access, so I'm guessing it's similar on iOS, and killing a task from an app's sandbox seems impossible.
Compiling gPhoto2 for iOS is inherently difficult, since I can't dynamically link the gphoto2 library and must therefore compile it as a static library.
Those are some of the issues I've run into. Is this project worth pursuing? Do you think it's even possible?
Yes, I know there are other solutions, such as using a WiFi router or a custom-built Bluetooth module plugged into the camera to shuttle USB packets to and from the iOS device.