In AVFoundation we can split and merge tracks of media files. I assume a subtitle is just another track, and I want to include that track based on the language the user chooses. My idea is to bundle a subtitle file in the project for each language I support and add the appropriate subtitle track to the video file at run time.
I am not sure if this is possible with AVFoundation. Please direct me to a solution.
The Apple-provided sample code "avsubtitleswriterOSX" is compatible with iOS 7 and 8. This solved my issue.
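For reference, a track-merging approach is also possible with AVMutableComposition. This is a minimal sketch, assuming the subtitles already exist as a subtitle track inside a separate media file (the avsubtitleswriter sample shows how to author such a track from text); the file URLs and the addSubtitles function are placeholders:

import AVFoundation

// Minimal sketch: copy a movie's tracks into a composition, then append
// the subtitle track for the language the user picked.
func addSubtitles(videoURL: URL, subtitleURL: URL) throws -> AVMutableComposition {
    let video = AVURLAsset(url: videoURL)
    let subtitles = AVURLAsset(url: subtitleURL)
    let composition = AVMutableComposition()
    let fullRange = CMTimeRange(start: .zero, duration: video.duration)

    // Copy every audio/video track of the original movie.
    for track in video.tracks {
        let dst = composition.addMutableTrack(withMediaType: track.mediaType,
                                              preferredTrackID: kCMPersistentTrackID_Invalid)
        try dst?.insertTimeRange(fullRange, of: track, at: .zero)
    }

    // Add the chosen language's subtitle track alongside them.
    if let subTrack = subtitles.tracks(withMediaType: .subtitle).first {
        let dst = composition.addMutableTrack(withMediaType: .subtitle,
                                              preferredTrackID: kCMPersistentTrackID_Invalid)
        try dst?.insertTimeRange(fullRange, of: subTrack, at: .zero)
    }
    return composition // play directly, or export with AVAssetExportSession
}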
I have a requirement to display an interactive 3D model (the client supplies a few FBX files) in one of the screens of a native Objective-C/Swift app.
I know it's possible to work with Unity/Cocos3D, but then the entire app would have to be Unity/Cocos3D based. In my case the app has to be native, and only one screen (one of the tabs in the tab bar controller) should integrate the 3D model, which the user can move/rotate, etc.
Thanks.
The FBX SDK is available on iOS, and I wrote a little tutorial to demo how to use it here. However, the FBX SDK does not contain a viewport/canvas for displaying FBX files on any OS; for displaying FBX you would need to create your own view. There are a couple of solutions you can use here:
1. Without programming, you can use 'FBX Review'. This tool is free and designed to display FBX, DAE, OBJ, ... files.
2. You can implement your own viewport/view, such as the OBJ viewer I wrote here, but it would probably need to be rewritten; I wrote it overnight for a hack, so it needs to be optimized.
3. You can use the iOS SceneKit library, which would be a better approach than implementing your own view.
If you go with option 3 above, you can use Unity to export a Collada (.dae) file for import into SceneKit through the Collada exporter. You can buy the exporter directly on the Unity Asset Store; there's one for Unity 5.x and one for Unity 2017.
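To give an idea of option 3, here is a minimal Swift sketch that displays a bundled Collada scene in a single view controller of an otherwise native app; "model.dae" is a placeholder asset name:

import SceneKit
import UIKit

// Minimal sketch: show an interactive 3D model in one tab of a native app.
final class ModelViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let sceneView = SCNView(frame: view.bounds)
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        sceneView.scene = SCNScene(named: "model.dae")  // placeholder asset name
        sceneView.allowsCameraControl = true            // built-in rotate/pan/zoom gestures
        sceneView.autoenablesDefaultLighting = true
        view.addSubview(sceneView)
    }
}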
I'm using Qt 5.5 for iOS development.
I'm wondering how to find and open a file on an iOS device for reading and writing with Qt 5.5. As far as I know, there's no exposed file tree structure in iOS; when I download a picture, for example, I don't even know where it is stored, yet I can see it in apps.
Can anyone help? Thanks very much.
I am no expert with Qt, but I believe you need the QStandardPaths class.
iOS is no different from any other platform in that it stores files in certain pre-defined locations.
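As a rough sketch (assuming Qt 5, with "notes.txt" as a placeholder file name), resolving the app's sandboxed Documents directory and writing a file there would look something like this:

#include <QCoreApplication>
#include <QStandardPaths>
#include <QFile>
#include <QTextStream>

// Minimal sketch: find the app's sandboxed Documents directory on iOS
// and write a file into it.
int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    const QString docs =
        QStandardPaths::writableLocation(QStandardPaths::DocumentsLocation);

    QFile file(docs + "/notes.txt");
    if (file.open(QIODevice::WriteOnly | QIODevice::Text)) {
        QTextStream out(&file);
        out << "hello from Qt on iOS\n";
    }
    return 0;
}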
I absolutely need to play remote files in an iOS app I'm developing, so using AVPlayer seems to be my only option. (I don't want to download the files as NSData and then use AVAudioPlayer, because that approach doesn't let playback start immediately.) These remote files sometimes vary greatly in output volume from one to another. How can I implement some form of automatic gain control for the AVPlayer? It seems it may not even be possible.
Also: I've explored the MTAudioProcessingTap, but couldn't get it to work using information from the following blogs:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
and
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I'm open to any ideas that involve the AVPlayer. Can it be done? (Thanks in advance - cheers!)
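For anyone exploring the same route, here is a minimal Swift sketch of the MTAudioProcessingTap wiring those posts describe; attachTap is a hypothetical helper, and the process callback is where per-buffer gain would actually be applied:

import AVFoundation
import MediaToolbox

// Hypothetical helper: attach a processing tap to a player item's audio track.
func attachTap(to item: AVPlayerItem) {
    guard let track = item.asset.tracks(withMediaType: .audio).first else { return }

    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: nil,
        init: nil,
        finalize: nil,
        prepare: nil,
        unprepare: nil,
        process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
            // Pull the source audio; scaling the samples in bufferListInOut
            // here is where gain control would happen.
            _ = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                   flagsOut, nil, numberFramesOut)
        })

    var tap: Unmanaged<MTAudioProcessingTap>?
    guard MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                     kMTAudioProcessingTapCreationFlag_PostEffects,
                                     &tap) == noErr,
          let processingTap = tap?.takeRetainedValue() else { return }

    let params = AVMutableAudioMixInputParameters(track: track)
    params.audioTapProcessor = processingTap
    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    item.audioMix = mix
}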
There is a good open-source library, libipodimport, that uses Apple's private frameworks to add audio files to the iTunes library on a jailbroken iOS device. What it's missing is the ability to also add artwork when adding a music file (one contributor checked in code that should do this, but the author later removed it as it didn't work).
Does anyone know what was missing from the code in libipodimport (see previous link), or perhaps an alternative way of adding artwork to a music file?
A contributor to libipodimport told me the missing line that makes adding artwork work:
// metad is a SSDownloadMetadata object, see libipodimport for rest of code
[metad setFullSizeImageURL:[NSURL fileURLWithPath:[userInfo objectForKey:kIPIKeyArtworkPath]]];
It seems to work only with JPEG files, not PNG files.
The library libipodimport will most likely be updated by the author in the near future.
It is possible using AudioUnits on iOS to create samplers that load and play soundfont (or SF2) files. This is a really great feature. The problem is that I don't see any interface for inspecting a soundfont to see: a) how many presets it contains and b) the names of the presets it contains.
It is possible to obtain the current preset name by first loading the soundfont into the sampler using AudioUnitSetProperty with kAUSamplerProperty_LoadInstrument and then calling AudioUnitGetProperty with kAudioUnitProperty_ClassInfo on the sampler. This is not very efficient, however, and it only tells you the name of the currently loaded preset. It also does not seem to tell you how many presets are contained in the soundfont.
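For concreteness, that load-then-query round trip looks roughly like this in Swift; samplerUnit is assumed to be an already-initialized kAudioUnitSubType_Sampler audio unit, and presetName is a hypothetical helper:

import AudioToolbox

// Sketch of the inefficient approach described above: load one preset into
// the sampler, then read the preset's name back out of its ClassInfo.
func presetName(samplerUnit: AudioUnit, soundFontURL: URL, preset: UInt8) -> String? {
    var instrument = AUSamplerInstrumentData(
        fileURL: Unmanaged.passUnretained(soundFontURL as CFURL),
        instrumentType: UInt8(kInstrumentType_SF2Preset),
        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
        bankLSB: UInt8(kAUSampler_DefaultBankLSB),
        presetID: preset)
    guard AudioUnitSetProperty(samplerUnit,
                               AudioUnitPropertyID(kAUSamplerProperty_LoadInstrument),
                               AudioUnitScope(kAudioUnitScope_Global), 0, &instrument,
                               UInt32(MemoryLayout<AUSamplerInstrumentData>.size)) == noErr
    else { return nil }

    // The sampler's ClassInfo dictionary now describes the loaded preset.
    var classInfo: Unmanaged<CFPropertyList>?
    var size = UInt32(MemoryLayout<Unmanaged<CFPropertyList>?>.size)
    guard AudioUnitGetProperty(samplerUnit,
                               AudioUnitPropertyID(kAudioUnitProperty_ClassInfo),
                               AudioUnitScope(kAudioUnitScope_Global), 0, &classInfo,
                               &size) == noErr,
          let dict = classInfo?.takeRetainedValue() as? [String: Any]
    else { return nil }
    return dict["name"] as? String
}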
How does one do these things without using third-party code (surely it is natively supported)?
Another option is a SoundFont editor for OS X called Polyphone.
This is a very old question, but I do have another solution: my SoundFonts application. It is available on the App Store for a small fee, or you can use the source to build what you want.
The repo contains an SF2 parser in C++ that I reworked from some code I found online. It also contains a catalog.py Python script that generates a listing from an SF2 file, using the sf2utils Python package.