I am trying to allow HD capture from my iPad app via Apple's HDMI adapter. When trying to do this with various recording devices, I get messages that say the operation is not permitted due to copyright protection. Is it possible to disable HDCP protection for my own app programmatically in some way? The techniques for otherwise capturing HD from the iPad are convoluted to impossible, stringing together various sorts of adapters that may or may not work, and may or may not give full HD resolution results.
We have found a device that works perfectly. It is the BlackMagic H.264 Pro Recorder. It records H.264 at up to 20 Mbps, which is nice enough for broadcast. What a relief - we tried and returned 4 or 5 other devices before finding this one. And yes, it captures the HDCP-protected signal.
It's a very general question; I just want to understand whether it's technically possible, and why (or why not). I'd also like some idea of a good starting point for investigating this topic.
So the question is: is it technically possible to:
Connect another device with HDMI out to an iPad through an HDMI-in-to-USB-C adapter
Capture the HDMI signal from that device
Render the video from that device in an iPad app
I guess this was very limited in previous iOS versions.
But in iOS 16 there is a DriverKit now.
Can it be used to solve this problem?
Short answer: yes.
Long answer:
Previously, this was possible for members of the MFi programme who designed their own MFi-compliant devices. It continues to be the only way to do it for iPhones and iPads which are not based on the M1 SoC.
On iPadOS 16+, on iPads with M1 (or in future, presumably, better) SoC, you can indeed drive near-arbitrary USB devices using a DriverKit extension. Note that there is currently no video capture API you can implement on iPadOS, so you cannot make the device available to any app through a standard interface - each app that uses the DriverKit driver will need to implement that driver's specific IOService user client API. (And the app providing the driver needs to be installed.)
If your capture device also captures audio, it should be picked up automatically if it's a USB Audio Class compliant device; if it's not class-compliant, you can make it available as a system-wide audio device using AudioDriverKit.
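To make the "app-specific user client API" point above concrete, here is a rough Swift sketch of how an app might talk to its DriverKit extension through IOKit. Everything named `com_example_HDMICaptureDriver`, and the method selector `0`, is hypothetical - the real service class name and selectors are defined by your own driver, and the app needs the DriverKit communication entitlement from Apple:

```swift
import Foundation
import IOKit

// Hypothetical sketch: open the user client of a DriverKit capture driver.
// The service class name and the external-method selector are defined by
// your own dext; these values are placeholders.
let service = IOServiceGetMatchingService(
    kIOMainPortDefault,
    IOServiceMatching("com_example_HDMICaptureDriver"))
guard service != IO_OBJECT_NULL else { fatalError("driver not running") }

var connect: io_connect_t = IO_OBJECT_NULL
guard IOServiceOpen(service, mach_task_self_, 0, &connect) == kIOReturnSuccess else {
    fatalError("could not open user client")
}

// Call a (hypothetical) external method - selector 0 - e.g. to start capture.
let kr = IOConnectCallScalarMethod(connect, 0, nil, 0, nil, nil)
print("start-capture returned \(kr)")

IOServiceClose(connect)
IOObjectRelease(service)
```

Bulk video frames would normally come back through shared memory or async callbacks set up on the same connection, not through scalar calls; this only shows the handshake every client app has to reimplement.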
We are developing a web app that is supposed to play high-quality stereo while also accessing the microphone input. We got this to work on all Android and PC browsers, but the iPhone refuses to do it properly.

We have nailed the problem down to accessing the microphone input via "getUserMedia". Web Audio plays stereo until the microphone is enabled; then the quality drops and the output goes mono. I have researched this problem on the internet, but only found posts that are several years old. My hope is that things have changed in the meantime and solutions have been found.

It seems like the phone is switching into some kind of "call mode". I would like to avoid this, either by overriding the corresponding settings or by playing the stereo signal some way other than Web Audio. I am open to any ideas. The worst case seems to be that we have to develop a dedicated native app for iOS. If there is any workaround to make this happen in a web app, it would be highly appreciated. I can provide code snippets if desired, but I think at this point the problem should be clear.
BTW, on Android we had similar problems and found that the "Dolby Atmos" setting was causing a strange mixdown to mono under certain circumstances. Switching it off fixed the Android issues. Maybe this helps somebody else, and maybe there are global settings on the iPhone as well that could cure the problem.
Thank you very much in advance!
Cheers,
Chris
The worst case seems to be that we have to develop a dedicated native app for iOS.
Unfortunately, this is the path forward most likely to yield success for you (if you haven't already figured this problem out, since I'm answering your question almost a year later).
The audio I/O device landscape is complicated, and there are several standards and factors that play into the quality of audio input and output an application yields. For instance:
Is the audio input device the same as the audio output device?
If yes, is the audio I/O device Bluetooth?
If yes, it's unlikely that the Bluetooth audio device supports stereo output and simultaneous audio input. Few I/O devices support that, and few host devices do.
If yes, which Bluetooth version does the I/O device support?
Which operating system is the host device running?
Does the host device's operating system support the selection of audio input and output devices separately?
How much access to the host operating system does your application have?
For example, if your application is running in a browser, your application will have significantly less control over the host device's operating system's audio subsystem.
I have been trying to work around this problem recently, too. Another developer did a detailed investigation of this using a spectrum analyser and scope, and didn't have any luck either: testing iOS audio play/record with scope.
I think building a native app will end up being inevitable, and would also fix the myriad other problems with Safari web audio. Either that or wait until Apple fixes the bugs or implements AudioWorklets.
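If you do go the native route, the part of the problem the browser hides from you is the audio session configuration. A minimal sketch, assuming a plain playback-plus-recording app: the key is to use the `.playAndRecord` category with the `.default` mode, since the `.voiceChat` mode (which WebKit's `getUserMedia` path effectively puts you in) engages the voice-processing "call mode" that mixes output down:

```swift
import AVFoundation

// Minimal sketch of the session setup a native iOS app could use to keep
// stereo output while recording from the microphone.
// - .playAndRecord: input and output at the same time
// - mode .default (NOT .voiceChat, which enables call-style processing)
// - .defaultToSpeaker: route output to the main speaker, not the earpiece
// - .allowBluetoothA2DP: keep high-quality stereo on Bluetooth headphones
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord,
                        mode: .default,
                        options: [.defaultToSpeaker, .allowBluetoothA2DP])
try session.setActive(true)
```

Note the Bluetooth caveat from the list above still applies: even with `.allowBluetoothA2DP`, many headsets cannot do A2DP stereo output and microphone input simultaneously, so the built-in mic is often used alongside Bluetooth output.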
I am developing an app to do live video streaming, and when I recently upgraded to an iPhone 6s, all audio going through the microphone sounds robotic (or like a cricket). I can only reproduce this issue using the Skype app; no other app has similar output. The Skype support forums say this is a problem they're seeing on the iPhone 6s [1], but they don't give any details as to what's causing it.
The interesting thing is that there are no issues when I use a microphone through the headphone jack, only when the built-in microphone is used. Is there a permission or a change I need to make in my app to fix this?
[1] - http://community.skype.com/t5/iOS-iPhone-and-iPad/iPhone-6s-Distorted-sound/td-p/4138308
The 6s is apparently locked to a 48000 Hz sample rate when using the built-in microphone. Maybe you tried to set the format to 44100. I think the only way to deal with this is to query the active AVAudioSession for its sampleRate property and set your format's sample rate to that.
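A minimal sketch of that approach, assuming an AVAudioEngine-based recording path - query the session's rate instead of hard-coding 44100, or simply hand the tap the input node's own format so it always matches the hardware:

```swift
import AVFoundation

// Ask the active session for the hardware rate instead of assuming 44100.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord)
try session.setActive(true)

let hwRate = session.sampleRate   // e.g. 48000.0 on an iPhone 6s built-in mic

// Option A: build your recording format from the hardware rate.
let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                           sampleRate: hwRate,
                           channels: 1,
                           interleaved: false)

// Option B: with AVAudioEngine, use the input node's own format for the tap
// so the buffers arrive at whatever rate the hardware delivers.
let engine = AVAudioEngine()
let input = engine.inputNode
input.installTap(onBus: 0,
                 bufferSize: 1024,
                 format: input.inputFormat(forBus: 0)) { buffer, _ in
    // process buffer at the hardware sample rate
}
try engine.start()
```

If a downstream consumer (an encoder, a streaming backend) really requires 44100 Hz, resample with AVAudioConverter rather than forcing the format at the input, which is what produces the robotic artifacts.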
This is a known issue consumers are seeing. It might be that the iOS 9 update is not working properly.
Here's a source (it might not be that trusted for some, but it's a start):
http://9to5mac.com/2015/09/30/iphone-6s-touch-id-3d-touch-speaker-power-issues/
You can try a different app from the App Store, or call someone on speakerphone, to check whether the issue is reproducible.
I've had a contractor compress a video to be used in an app we're developing, and while the video plays just fine on a 3G device, artifacts appear when it's watched on a 2G device. Why would this be? The 2G device is running OS version 3.1.3.
It probably has to do with a mixture of network speed and hardware. Take a look at Wired's comparison: http://www.wired.com/reviews/2011/02/verizon-iphone/
Same exact phone, but since AT&T's network is faster, the Verizon phone plays a lower-quality video.
It's probably going to be difficult to get a good-looking video over 2G. Also, is it even worth it? How many phones are still on 2G? I can't imagine that many.
Is there a way to capture video of a screen from your own application? As far as I see there is no way to do it with UIImagePickerController (cameras only), but maybe there is a way to do it with iOS 4 AV Foundation or Core Video?
There seem to be two ways of capturing the content of the application while it's running:
Use the private UIGetScreenImage() function, which seems to be accepted by Apple now;
Use the captureView method from the following thread to capture the image.
You'll have to capture it many times per second (I guess 24 frames per second should be OK for persistence of vision), and then you'll have to produce the movie. Perhaps you could use the ffmpeg iPhone port.
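The capture-then-encode loop described above can be sketched roughly as follows on modern iOS (10+), where AVAssetWriter encodes H.264 directly so no ffmpeg step is needed. This is a simplified sketch with error handling omitted; the frame timing drifts with a plain Timer and a CADisplayLink would be better in practice:

```swift
import UIKit
import AVFoundation

// Snapshot a view into a UIImage by rendering its layer.
func snapshot(_ view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { ctx in view.layer.render(in: ctx.cgContext) }
}

// Convert a UIImage into the CVPixelBuffer that AVAssetWriter consumes.
func pixelBuffer(from image: UIImage) -> CVPixelBuffer? {
    let size = image.size
    var pb: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32ARGB, nil, &pb)
    guard let buffer = pb else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let ctx = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                        width: Int(size.width), height: Int(size.height),
                        bitsPerComponent: 8,
                        bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                        space: CGColorSpaceCreateDeviceRGB(),
                        bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    if let cg = image.cgImage {
        ctx?.draw(cg, in: CGRect(origin: .zero, size: size))
    }
    return buffer
}

// Capture `view` at ~24 fps into an H.264 movie at `url`.
// Returns a closure that stops the recording and finalizes the file.
func startRecording(view: UIView, to url: URL) throws -> () -> Void {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(view.bounds.width),
        AVVideoHeightKey: Int(view.bounds.height),
    ])
    input.expectsMediaDataInRealTime = true
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    var frame: Int64 = 0
    let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 24.0,
                                     repeats: true) { _ in
        guard input.isReadyForMoreMediaData,
              let buffer = pixelBuffer(from: snapshot(view)) else { return }
        adaptor.append(buffer,
                       withPresentationTime: CMTime(value: frame, timescale: 24))
        frame += 1
    }
    return {
        timer.invalidate()
        input.markAsFinished()
        writer.finishWriting { }
    }
}
```

On iOS 9 and later, ReplayKit's RPScreenRecorder does essentially this for you with far less code, so the manual loop is mainly useful when you need the raw frames yourself.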
Alternatively, if you'd like to capture your application running for a demo, your best bet would be to run it on the simulator and use Mac OS X screencast software to capture it. See also SimFinger, a "bundle of little tricks to make a screen capture of the iPhone Simulator suck less".
Finally, perhaps the following StackOverflow thread might help you produce better screencasts.
SimFinger and ScreenFlow are great if you can shoot in the simulator.
If you have to shoot on the device (e.g. when the accelerometer, GPS, camera, etc. are used), you currently have to resort to the jailbreak world. The app "Display Recorder", available for $5 in the Cydia Store, allows you to create an AVI movie of the iPhone's screen content. This works across all apps. There's a YouTube video showing it. The movie files can then be uploaded to YouTube or pulled off the iPhone via the built-in web server.