I want to apply a reverb effect to a sound file and save the result to disk using Core Audio. I successfully applied the effect and played it back using kAudioUnitSubType_RemoteIO. Now, to save it to disk, I use kAudioUnitSubType_GenericOutput and call AudioUnitRender, following the same code as in the link below.
core audio offline rendering GenericOutput
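The core of that approach is a pull loop: repeatedly call AudioUnitRender on the GenericOutput unit to pull processed audio through the graph, then append each rendered slice to a file with ExtAudioFileWrite. A simplified Swift sketch of that loop (genericOutputUnit, streamFormat, outputURL, and totalFrames stand in for my actual setup; status-code checks omitted):

    import Foundation
    import AudioToolbox

    // Assumes an already-initialized graph whose final node is the
    // GenericOutput unit, and an interleaved LPCM stream format.
    func renderOffline(genericOutputUnit: AudioUnit,
                       streamFormat: AudioStreamBasicDescription,
                       outputURL: URL,
                       totalFrames: Int64) {
        var format = streamFormat
        var fileRef: ExtAudioFileRef?
        // Create the destination file (CAF here; ExtAudioFile converts as needed).
        ExtAudioFileCreateWithURL(outputURL as CFURL, kAudioFileCAFType, &format,
                                  nil, AudioFileFlags.eraseFile.rawValue, &fileRef)
        guard let file = fileRef else { return }

        var timeStamp = AudioTimeStamp()
        timeStamp.mFlags = .sampleTimeValid
        timeStamp.mSampleTime = 0

        let sliceFrames: UInt32 = 512
        var remaining = totalFrames
        while remaining > 0 {
            let frames = UInt32(min(Int64(sliceFrames), remaining))
            let byteCount = frames * format.mBytesPerFrame
            var bufferList = AudioBufferList(
                mNumberBuffers: 1,
                mBuffers: AudioBuffer(mNumberChannels: format.mChannelsPerFrame,
                                      mDataByteSize: byteCount,
                                      mData: malloc(Int(byteCount))))
            var flags = AudioUnitRenderActionFlags()
            // Pull one slice of processed audio through the output unit...
            AudioUnitRender(genericOutputUnit, &flags, &timeStamp, 0, frames, &bufferList)
            // ...and append it to the file.
            ExtAudioFileWrite(file, frames, &bufferList)
            free(bufferList.mBuffers.mData)
            timeStamp.mSampleTime += Float64(frames)
            remaining -= Int64(frames)
        }
        ExtAudioFileDispose(file)
    }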
The problem is that when I run the code from the above link on iOS 7, it works fine.
When I run it on an iOS 8 device, it saves a file with no sound.
Can anyone help me?
Thanks
The backend team uses ffmpeg to create a video from images.
The strange thing is that the video plays in a browser on Mac and in the iPhone Simulator, but not in a browser or in our iOS app on a real phone.
I tried using AVPlayer to print the error, but error is nil.
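For reference, this is roughly how I'm checking for the error (a sketch; videoURL stands in for the real file URL). As far as I know, a playback failure usually surfaces on the AVPlayerItem rather than on the AVPlayer, so I'm checking both:

    import AVFoundation

    let item = AVPlayerItem(url: videoURL)
    let player = AVPlayer(playerItem: item)

    // Keep a reference to the observation so it isn't deallocated.
    let observation = item.observe(\.status) { item, _ in
        switch item.status {
        case .failed:
            // The item's error and error log are often more informative
            // than player.error, which can stay nil.
            print("item error:", String(describing: item.error))
            print("error log:", String(describing: item.errorLog()))
        case .readyToPlay:
            player.play()
        default:
            break
        }
    }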
Here is the file: https://firebasestorage.googleapis.com/v0/b/chatchatapp123.appspot.com/o/image_rendered.mp4?alt=media
Here is its metadata: https://www.metadata2go.com/result/46e72635-7fac-46ee-acfe-cb6ffda49692
Has anyone encountered this before, and if so, any ideas as to why?
Thanks.
I've noticed that the field order in the metadata is tt, which means:
Interlaced video, top field coded and displayed first
But according to Apple's official HTTP Live Streaming documentation, “Important: Interlaced video is not supported on iOS devices.”
https://developer.apple.com/documentation/http_live_streaming/hls_authoring_specification_for_apple_devices
So the likely fix is for the backend to re-encode the source as progressive video, e.g. with ffmpeg's yadif deinterlace filter (ffmpeg -i input.mp4 -vf yadif output.mp4); that should make the file playable on real devices.
I'm able to record slow-motion videos at 120 fps using AVFoundation. These videos play as expected on an iPhone and in QuickTime Player on a Mac, but other players (video.js, and my own AVPlayer-based player) don't play them in slow motion. However, if I create a slow-motion video using the iPhone's native camera app, those same players play it as expected.
I compared the two videos using MediaInfo and two differences came up:
1. Format profile: High@L5.1 vs. High@L4.1. I used ffmpeg to change my video's profile to High@L5.1, but it didn't make any difference.
2. Format GOP: M=1, N=120 vs. M=1, N=30. Again, I used ffmpeg to set N=120 on my video, but it didn't make any difference.
I also read online that exporting via PHAsset may help, but my video files are created inside my Documents directory and will never go to the Photo Album/Camera Roll. I could try exporting to the Camera Roll via PHAsset and then moving the file back to my Documents directory as a hack (if that works at all), but I really need to know the underlying reason.
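One workaround I'm considering is to re-time the clip myself with AVMutableComposition.scaleTimeRange, so the slow motion is baked into the exported file instead of depending on the player. A rough, untested Swift sketch (sourceURL and outputURL are placeholders; 4x is just an example factor):

    import AVFoundation

    func exportSlowMotion(sourceURL: URL, outputURL: URL,
                          completion: @escaping (Error?) -> Void) {
        let asset = AVURLAsset(url: sourceURL)
        let composition = AVMutableComposition()
        let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
        do {
            // Copy the whole clip into an editable composition.
            try composition.insertTimeRange(fullRange, of: asset, at: .zero)
        } catch {
            completion(error)
            return
        }
        // Stretch the timeline 4x: frames captured in 1 s play over 4 s,
        // so a 120 fps recording is presented at an effective 30 fps.
        composition.scaleTimeRange(fullRange,
                                   toDuration: CMTimeMultiply(asset.duration, multiplier: 4))

        guard let export = AVAssetExportSession(asset: composition,
                                                presetName: AVAssetExportPresetHighestQuality) else {
            completion(nil)
            return
        }
        export.outputURL = outputURL
        export.outputFileType = .mp4
        export.exportAsynchronously {
            completion(export.error)
        }
    }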
Any ideas what is causing this issue?
Thanks
I'm developing an iPhone app using AS3 and Flash Builder.
I'd like to know whether it's possible to record a video and save it directly into the native gallery WITHOUT bringing up the camera UI.
1. I've tried taking a photo and saving it directly without showing the CameraUI, and it works (using cameraRoll.addBitmapData(_video)).
2. But recording a video seems to require calling deviceCameraApp.launch(MediaType.VIDEO), and I guess this is the main reason the app launches the native camera UI.
3. I've tested an AS3 FLVRecorder library built by Joris, and it works well on Mac. I'd like to move it into the app, but once I've recorded a video I can't find it in the gallery, probably because it is saved inside the app's sandbox (e.g. applicationStorageDirectory).
So here are my questions:
1. Is there another way to record and save video directly into the gallery without the native camera UI in AS3?
2. If not, could I do it with the FLVRecorder library by just changing the save path? (I'm looking for something like /private/var/mobile/Media/DCIM.)
Thanks for the help!
I'm using the YouTube API in my Flex app to play videos, and it works fine running on my computer and on Android, but when I try to use it on iOS I just get a black screen!
Any ideas?
I was curious, so I did some research.
First, I searched for SWFLoader on iOS and this post came up. I'll quote:
You cannot load SWFS that run ActionScript on IOS
Then I realized from your comment that you may be using the Loader class, so I googled Flash Loader on iOS and came across this. To quote:
In an iPhone application, you can use the Loader.load() method.
However, you cannot run any ActionScript code in SWF content loaded
with the Loader.load() method. However, you can use assets in the SWF
file (such as movie clips, images, fonts, and sounds in the library).
You can also use the Loader.load() method to load image files.
You may benefit from reading through the actual Adobe docs, specifically this page, which focuses on unsupported APIs. That last link is where the above quote comes from.
It's true you cannot load and run a SWF, and in this case the video is a stream, so there is no media to save. YouTube does have an alternative URL that points to an actual MP4 file, which will play on iOS. You can use a VideoDisplay player with that URL to play the video.
The key functionality of the app would be: 1) recording short videos (typically 20-30 sec), 2) playing the videos 1-5 times right after shooting them (slow motion and pausing are a must), and 3) drawing over the videos, i.e. I'd need an additional data layer on top of the raw video.
I've been studying the HTML5 app platforms (PhoneGap, Titanium) because I'd like to minimize writing native iOS code, but it seems neither recording nor showing embedded video works on these platforms. The record-play-edit process is pretty simple, but it needs to be super smooth and fast.
If you want to use JS/HTML5 and then generate the app with e.g. PhoneGap, one option could be a custom PhoneGap plugin built for media capture, with HTML5 used to build the app logic.
Objective-C Media Capture:
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5
Example PhoneGap plugin for audio capture:
https://github.com/purplecabbage/phonegap-plugins/tree/master/iPhone/AudioRecord
More info about PhoneGap plugin creation for iOS can be found in the PhoneGap wiki.