Streaming live video from iOS [closed]

I need to stream video from the iPhone/iPad camera to a server. It looks like this will need to be done with AVCaptureSession, but I don't know how best to architect it.
I found this post:
streaming video FROM an iPhone
But it doesn't handle the "live" part: latency needs to be 2 or 3 seconds at most. Devices can be constrained to iPhone 4 or 4S capability if needed, and there is no requirement for HD; VGA is probably what we'll end up with. I assume any solution would use ffmpeg, since I haven't found a more appropriate library.
How is this best accomplished?

According to Apple, if you send large amounts of data from an iPhone app you're going to have to use HTTP Live Streaming.
HTTP Live Streaming Overview
It's possible; here's an app that does it, called Livu.
Try working with ffmpeg for iPhone and the segmenter from Carson Macdonald's excellent Ion Cannon site, which has a lot of useful information on HTTP Live Streaming. He's a user here too and can offer invaluable advice.
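Whatever transport you choose, the capture side usually starts the same way. As a rough, hedged sketch (the class and the encode step below are illustrative, not from any of the linked posts), you can use AVCaptureSession with an AVCaptureVideoDataOutput to get raw frames; each CMSampleBuffer would then be handed to an H.264 encoder and on to whatever the server expects (HLS segments via a segmenter, RTP, etc.):

import AVFoundation

// Minimal capture sketch: VGA frames delivered as CMSampleBuffers.
final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "camera.capture.queue")

    func start() {
        session.sessionPreset = .vga640x480   // VGA, as the question suggests

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // encode(sampleBuffer)  // hypothetical downstream step: H.264 encode, then segment/upload
    }
}

Note that the 2-3 second latency target is the hard part: HTTP Live Streaming typically buffers a few segments, so you may need very short segments or a different transport to stay within that budget.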

Related

How to create a composite video with iOS SDK [closed]

I realize this is a difficult question without a straightforward answer, but I'm hoping for some suggestions of frameworks or libraries I should start with.
Imagine a view that contains a background image, some smaller images overlaid, and a small video (an MPMoviePlayerController view) overlaid. What I need to do is create a composite video of the entire view. So the final result saved to disk would be all the images and the video combined into one video file.
What AV tools are available that would be most likely to help accomplish this?
AVAssetWriter seems to be the best option for doing this.
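As a hedged sketch (the output URL, frame size, and the per-frame rendering step are assumptions, not part of the question), one approach is to render each composed frame, background image, overlays, and the current movie frame, into a CVPixelBuffer and append it through an AVAssetWriter pixel-buffer adaptor:

import AVFoundation
import CoreVideo

// Set up an AVAssetWriter that accepts rendered frames as pixel buffers.
func makeWriter(outputURL: URL, size: CGSize) throws
    -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
    writer.add(input)
    return (writer, input, adaptor)
}

Usage outline: call writer.startWriting() and writer.startSession(atSourceTime: .zero), then for each frame time render the composed view into a CVPixelBuffer and call adaptor.append(pixelBuffer, withPresentationTime: time); finish with writer.finishWriting(completionHandler:).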

Read external input from iPad [closed]

For my current project, I need to read the status of a digital ON/OFF input (0 V or 5 V) from an iPad.
I need to do this over Bluetooth, because the iPad needs its 3G connection to contact some web services on the internet, which prevents me from using a Wi-Fi module.
I read that there are modules such as
RN42 ( https://www.sparkfun.com/products/retired/10253 )
or Bluegiga ( http://www.bluegiga.com/ )
but I can't find any examples on the internet of how to do what I need.
I need help understanding what the best and cheapest hardware to buy is, and, most of all, I need some example code (Xcode) for connecting my iOS program to the Bluetooth module to get the status of my external digital input.
You have a number of options for doing this.
Join the MFI program and either read the input via a physical connector or via Bluetooth.
Use Wi-Fi. Probably the easiest in terms of programming, but it requires more expensive hardware (and maybe a more complicated installation).
Use BLE (Bluetooth Low Energy) and CoreBluetooth. Cheap, easy to use.
As you have already suggested, BLE is an easy way to go that doesn't require joining an expensive program. The Bluegiga chips are excellent at talking to an iOS device (I have personally tried the BLE112), they are easy to program, they come with their own microcontroller, etc.
To start on the iOS side, you need to read up on CoreBluetooth. Apple generally has excellent documentation for this framework.
I would recommend starting with the examples, for instance the Heart Rate Monitor sample project. Also consider buying a dev kit from Bluegiga; among other things, it includes a heart-rate device sample that works with iOS.
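To make the CoreBluetooth side concrete, here is a minimal hedged sketch. The service and characteristic UUIDs are placeholders (use whatever GATT profile your module exposes); the code simply connects and subscribes to a characteristic whose value reflects the digital input:

import CoreBluetooth

// Scan for the module, connect, and subscribe to the characteristic carrying the input state.
final class InputReader: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?
    private let serviceUUID = CBUUID(string: "FFF0")   // placeholder UUID
    private let inputUUID   = CBUUID(string: "FFF1")   // placeholder UUID

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [serviceUUID], options: nil)
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        self.peripheral = peripheral
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([serviceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] where service.uuid == serviceUUID {
            peripheral.discoverCharacteristics([inputUUID], for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? [] where characteristic.uuid == inputUUID {
            peripheral.setNotifyValue(true, for: characteristic)   // notify when the pin changes
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        if let byte = characteristic.value?.first {
            print("Digital input is \(byte == 0 ? "OFF (0 V)" : "ON (5 V)")")
        }
    }
}

On the module side you would define a matching GATT service in the Bluegiga configuration so that the characteristic value tracks the input pin.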

Suggest an OCR Library for iOS [closed]

I want to make an offline iPhone application that can grab text from a picture. Can anyone suggest the best library I can use? I heard ZBar and ZXing can be used only for barcode reading. Are there any other OCR libraries for iOS that can read text from images? I would appreciate your suggestions.
Thanks in advance!
Currently, offline OCR is possible only with Tesseract.
You can get the source code here.
Here is a good tutorial on how to use Tesseract.
You can also perform OCR in multiple languages; you can download trained data for other languages here.
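As a hedged sketch (this assumes the open-source TesseractOCRiOS / G8Tesseract wrapper and that "eng.traineddata" has been added to a "tessdata" folder in the app bundle), offline recognition then comes down to a few lines:

import TesseractOCR   // TesseractOCRiOS wrapper (assumption; other wrappers exist)
import UIKit

// Run Tesseract offline on a UIImage and return the recognized text.
func recognizeText(in image: UIImage) -> String? {
    guard let tesseract = G8Tesseract(language: "eng") else { return nil }   // needs bundled trained data
    tesseract.image = image
    tesseract.recognize()            // entirely offline
    return tesseract.recognizedText
}

Recognition quality improves considerably if the input image is cropped, deskewed, and converted to high-contrast black and white before it is handed to Tesseract.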
You might want to look here. They can detect a wide variety of languages, and I have heard only good things about their framework.
You can also test the framework live: live demo
Unfortunately, I think this framework is very expensive to use. But if you want an A1 (all my Breaking Bad fans ;-)) framework, this might be your best shot.

Is there any way to read a MIDI file into a staff on iOS? [closed]

I just want to know: is there some library or source code that can read a MIDI file, parse it into a staff, and display it on iOS?
Check the MusicPlayer class. Combined with the AUSampler audio unit available since iOS 5.0, you can build a MIDI player quite easily. (The link is for OS X, but it applies to iOS as well.)
Apple Documentation
About the sampler audio unit see Simple embeddable MidiSynth for iOS?
Reference from How to play MIDI on the iPhone?
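For the loading and playback half, here is a minimal hedged sketch using the AudioToolbox MusicSequence/MusicPlayer C API (error codes are ignored for brevity). Turning the loaded sequence into a rendered staff would additionally mean walking each track with a MusicEventIterator and drawing the notes yourself:

import AudioToolbox

// Load a standard MIDI file into a MusicSequence and start playback.
func playMIDI(at url: URL) {
    var maybeSequence: MusicSequence?
    var maybePlayer: MusicPlayer?

    NewMusicSequence(&maybeSequence)
    guard let sequence = maybeSequence else { return }
    MusicSequenceFileLoad(sequence, url as CFURL, .midiType, MusicSequenceLoadFlags())

    NewMusicPlayer(&maybePlayer)
    guard let player = maybePlayer else { return }
    MusicPlayerSetSequence(player, sequence)
    MusicPlayerPreroll(player)
    MusicPlayerStart(player)
}

To route the notes through an AUSampler instead of the default synth, the sequence can be attached to an AUGraph containing the sampler with MusicSequenceSetAUGraph.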

Is there any convenient iOS AR SDK available? [closed]

I want to do something like this:
Using the iPhone's camera and AR technology, I want to stick a virtual piece of paper onto the desk in front of me in the real world, so that wherever I move the iPhone around that spot, the paper still appears to stay in the position where I originally placed it.
What do I need to make this work, and which SDK is most suitable for me to use?
This question is pretty old now. Is it still open? You can try out Qualcomm Vuforia.
Take a look at http://socialcompare.com/en/comparison/augmented-reality-sdks to see which AR SDK support iOS. You can start by looking into ARToolkit, Metaio, String, Vuforia, and Wikitude.
