I am developing an iPhone app with the capability of recording voice using AVAudioRecorder. Currently I am saving the notes in NSTemporaryDirectory():
recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent: [NSString stringWithFormat:@"Voice-%d.caf",count]]];
I am not sure whether this is the best location to store the file.
Can anyone suggest the best location to store such audio files so that they can later be accessed using their paths?
I used the same path and lowered the sample rate. I also reduced the number of channels from 2 to 1, which decreased the size of my voice recording.
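For reference, a minimal sketch of what that setup might look like, assuming you record into the Documents directory and use the lowered sample rate and single channel mentioned above (the exact values are illustrative, not required):
// Requires AVFoundation. "count" is the same counter used above.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSURL *recordedFile = [NSURL fileURLWithPath:[documentsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"Voice-%d.caf", count]]];

NSDictionary *settings = @{ AVFormatIDKey            : @(kAudioFormatAppleIMA4),
                            AVSampleRateKey          : @8000.0,   // lowered sample rate
                            AVNumberOfChannelsKey    : @1,        // mono instead of stereo
                            AVEncoderAudioQualityKey : @(AVAudioQualityLow) };

NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordedFile settings:settings error:&error];
[recorder prepareToRecord];
[recorder record];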
I want to record something in my app and post it to some API.
Please help
You can follow this link for recording and saving audio to local iPhone storage (e.g. the Documents directory):
Record audio file and save locally on iPhone
Once you have saved it, you can load the file contents as NSData:
NSData *audioData = [NSData dataWithContentsOfFile:audioFilePath];
// audioFilePath is the path of audio either from document
// directory or any other location.
This simple tutorial might help you
https://www.appcoda.com/ios-avfoundation-framework-tutorial/
Once you are done recording, you have to choose whether you want to use AFNetworking or NSURLConnection to send the audio to the server.
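As a rough sketch of the upload step with NSURLConnection (the endpoint URL and Content-Type are placeholders; your API may expect a multipart/form-data body instead of raw data):
// Hypothetical endpoint; replace with your API's upload URL.
NSURL *uploadURL = [NSURL URLWithString:@"https://example.com/api/upload"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"POST";
[request setValue:@"audio/x-caf" forHTTPHeaderField:@"Content-Type"];
request.HTTPBody = audioData; // the NSData loaded from audioFilePath above

[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (error) {
        NSLog(@"Upload failed: %@", error);
    } else {
        NSLog(@"Upload finished");
    }
}];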
I'm using AVPlayer to play YouTube videos; for each YouTube video ID I retrieve a couple of stream URLs in different qualities.
I want to play a particular stream quality according to the network state. For example, if the user is on 3G I want to play the lowest-quality URL, but if the user moves to Wi-Fi I want to switch seamlessly to the better-quality stream.
This is nothing new; YouTube does it in their app, and so do many others.
So I wonder what the best way is to do this kind of switching with AVPlayer. If possible, I don't want the user to notice the switch, and I don't want playback to pause or buffer.
Any advice?
I'm not sure if this kind of functionality is supported on the YouTube servers or if I need to do it on the client side.
You should have a look at the Apple documentation on HTTP Live Streaming.
The only way to achieve the type of switching you want, and the one the documentation describes, is to use m3u8 index files and TS files containing the video data.
You connect to the index file and store its contents, which will be multiple URLs along with bandwidth requirements. See the examples here. Then use the Reachability class to check the network status and connect to the appropriate stream. Start the Reachability notifier and react to its events by changing the stream you're connected to. This will cause the TS file that belongs to the new stream to be downloaded and buffered for playback, achieving the type of switching you want.
As I said before, the drawback is the requirement to use TS files. This means your video files would have to be downloaded from YouTube, prepared using the Apple-provided mediafilesegmenter command-line tool and then stored on an FTP server! Not ideal at all, but as far as I'm aware it's the only way to do this.
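A rough sketch of the Reachability-driven part, assuming Apple's Reachability sample class and that you have already parsed two variant URLs out of the index file (wifiStreamURL and cellularStreamURL are hypothetical properties); swapping the AVPlayerItem is one concrete way to change streams:
// During setup: start watching for connectivity changes.
Reachability *reachability = [Reachability reachabilityForInternetConnection];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(reachabilityChanged:)
                                             name:kReachabilityChangedNotification
                                           object:nil];
[reachability startNotifier];

// React to a change by switching to the stream that fits the connection.
- (void)reachabilityChanged:(NSNotification *)note
{
    Reachability *reachability = note.object;
    NSURL *streamURL = ([reachability currentReachabilityStatus] == ReachableViaWiFi)
                           ? self.wifiStreamURL
                           : self.cellularStreamURL;
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:streamURL];
    [self.player replaceCurrentItemWithPlayerItem:item];
}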
Check out the AVPlayer replaceCurrentItemWithPlayerItem: method. If I were you, I would use Reachability to observe the user's network status. When the network reachability degrades, you can do something like this:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:urlOfLowerQuality];
[item seekToTime:player.currentTime];
[player replaceCurrentItemWithPlayerItem:item];
Use ReachabilityManager to check whether the current connection is Wi-Fi or 3G, and switch the URL according to the connection type. While switching URLs, take the current playback time of the video and seek to that time in the new item.
I need to store and constantly update thousands of audio clips for use with an iOS app. I also need to store metadata with each audio clip. Based on user input, I need to query the database that the clips are held in and download several clips to a temporary folder in the app so they can be played.
It doesn't seem that a service like Parse can store large audio files.
What would be the best approach for something like this?
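Whichever backend you end up with, the download-to-temporary-folder part is fairly mechanical; a minimal NSURLSession sketch, assuming a hypothetical clip URL and file name:
// Hypothetical clip URL returned from your backend query.
NSURL *clipURL = [NSURL URLWithString:@"https://example.com/clips/clip123.caf"];
NSURLSessionDownloadTask *task =
    [[NSURLSession sharedSession] downloadTaskWithURL:clipURL
                                     completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
        if (error) {
            NSLog(@"Download failed: %@", error);
            return;
        }
        // Move the downloaded file into the app's temporary directory so it can be played.
        NSURL *tmpURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"clip123.caf"]];
        [[NSFileManager defaultManager] removeItemAtURL:tmpURL error:NULL];
        [[NSFileManager defaultManager] moveItemAtURL:location toURL:tmpURL error:NULL];
    }];
[task resume];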
I'm currently making an iOS application where you can search for a song. You will then be given a list of songs to choose from and can stream one from the app. It uses a combination of the tinysong API and I'm attempting to use a URL from Grooveshark, combined with the song ID given to me, to play the song... E.g.
https://grooveshark.com/facebookWidget.swf?songID=13963
I've already had a quick look at this post: How can i share a grooveshark music url to facebook with the music widget?
but I was wondering if I could get more information.
What I want to do is play the audio from that link with a changing songID...
My Objective-C code for this is:
// Opens the URL of the first song and plays it
NSString *SongID = [[jsonArray objectAtIndex:0] objectForKey:@"SongID"];
NSString *playString = [@"https://grooveshark.com/facebookWidget.swf?songID=" stringByAppendingString:SongID];
//Converts songURL into a playable NSURL
NSURL *playURL = [NSURL URLWithString:playString];
NSError *error = nil;
AVAudioPlayer *backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:playURL error:&error];
[backgroundMusicPlayer prepareToPlay];
[backgroundMusicPlayer play];
But it isn't playing. What should I do to make it play? I know iOS can't play Adobe Flash content, and since the link is in SWF format I figured that's why it can't play, but I don't know much about Flash or how it interacts with iOS. So any help or ideas would be much appreciated.
If there is another way to access and play songs from a massive database of songs, with their artists and albums too, from Objective-C (like the Spotify API, but free), please comment, as that may be another solution!
Thanks!
I have a streaming URL which is something like "http://myserver.com/master.m3u8" (this is a dummy URL).
This URL plays fine in the Safari browser on the iPhone.
But when I play the same URL within the app using the following code, I'm facing some issues:
NSURL* theURL = [NSURL URLWithString:@"http://myserver.com/master.m3u8"];
MPMoviePlayerViewController* moviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:theURL];
moviePlayerViewController.moviePlayer.movieSourceType = MPMovieSourceTypeStreaming;
[self presentMoviePlayerViewControllerAnimated:moviePlayerViewController];
The problem when playing within the app is that after some time the screen turns black, but I'm still able to hear the audio.
How can I debug where the issue is?
Can someone who has faced a similar issue please help?
If you create a standard m3u8 file, the lowest variant of the stream will include an audio-only version. So if the bandwidth is too low, the player may switch to this stream and play audio only.
I haven't yet found a solution for doing something meaningful in the app when this happens (for example, pausing the video and waiting until the bandwidth is sufficient to play the next-higher variant, which has video again). But if you can tweak the m3u8 or the encoding process, you can simply remove the audio-only version from your m3u8. Then the player will switch to the lowest video stream and pause if the bandwidth isn't enough to show it.
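For illustration only, a hypothetical master playlist without an audio-only entry might look like this (bandwidth values, resolutions and paths are made up):
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=480x270
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
high/index.m3u8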
Please keep in mind that you have to disclose this to the App Review team when submitting the app to the store. This is mentioned in this Technical Q&A from Apple: Resolving App Store Approval Issues for HTTP Live Streaming
Note: As the baseline 64 kbps maximum audio-only HTTP Live stream requirement is specifically for streaming over a cellular network, if your application is self-restricting to Wi-Fi only HTTP Live Streaming and you choose to not supply a baseline 64 kbps audio-only stream, you must provide this information to the App Review team. Developers can include this information in the Review Notes field for your application.