output.videoSettings = [NSDictionary dictionaryWithObject:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey];
I get this error on that particular line:
No known class method for selector 'dictionaryWithObject:forKey:'
Any reason for this error?
Try this:
videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
    nil];
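As a side note (not part of the original answer): if your compiler supports the modern Objective-C literal syntax, you can avoid the long selector entirely. An equivalent sketch:

```objc
// @{} builds the NSDictionary and @() boxes the enum value into an NSNumber;
// the cast to id is still needed because the key is a CFStringRef constant.
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
```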
I'm attempting to use AVAssetWriterInput to crop a video recorded from a screencast of my application. Here is my current configuration.
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:320], AVVideoCleanApertureWidthKey,
[NSNumber numberWithInt:480], AVVideoCleanApertureHeightKey,
[NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
[NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:3], AVVideoPixelAspectRatioHorizontalSpacingKey,
[NSNumber numberWithInt:3],AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:960000], AVVideoAverageBitRateKey,
[NSNumber numberWithInt:1],AVVideoMaxKeyFrameIntervalKey,
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
AVVideoProfileLevelH264BaselineAutoLevel, AVVideoProfileLevelKey,
nil];
NSDictionary *videoSettings = @{AVVideoCodecKey:AVVideoCodecH264,
AVVideoCompressionPropertiesKey:codecSettings,
AVVideoScalingModeKey:AVVideoScalingModeResizeAspectFill,
AVVideoWidthKey:[NSNumber numberWithInt:320],
AVVideoHeightKey:[NSNumber numberWithInt:480]};
_videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
I'm receiving the following error: "AVAssetWriterInput does not currently support AVVideoScalingModeFit"
This seems to be a common error with this framework, but I can't find an actual solution to it; I just see people saying "I figured it out eventually" without explaining how. The problem is definitely related to this line: "AVVideoScalingModeKey:AVVideoScalingModeResizeAspectFill," which tells the AVAssetWriter to crop the video while maintaining the aspect ratio. Does anyone know the solution?
There is no "solution" per se: it's simply unsupported. You'll need to scale the video frames yourself using Core Image, a VTPixelTransferSession, or whatever is appropriate for your pipeline.
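For the Core Image route, a minimal sketch might look like this (the 320x480 target size comes from the question; the helper name, context handling, and omission of error checks are my assumptions):

```objc
#import <CoreImage/CoreImage.h>
#import <CoreVideo/CoreVideo.h>

// Scale a source pixel buffer to 320x480 before appending it to the writer input.
// ciContext should be created once and reused, e.g. [CIContext contextWithOptions:nil].
static CVPixelBufferRef ScaledPixelBuffer(CVPixelBufferRef src, CIContext *ciContext)
{
    CIImage *image = [CIImage imageWithCVPixelBuffer:src];
    CGFloat sx = 320.0 / CVPixelBufferGetWidth(src);
    CGFloat sy = 480.0 / CVPixelBufferGetHeight(src);
    image = [image imageByApplyingTransform:CGAffineTransformMakeScale(sx, sy)];

    CVPixelBufferRef dst = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, 320, 480,
                        kCVPixelFormatType_32BGRA, NULL, &dst);
    [ciContext render:image toCVPixelBuffer:dst];
    return dst; // caller is responsible for CVPixelBufferRelease
}
```

Note that plain scaling changes the aspect ratio; to emulate ResizeAspectFill you would scale by the larger of sx/sy and crop the overflow.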
Scenario: the app I am working on is supposed to record voice memos and send them to a Windows client. Recording audio on iOS is not that hard, but the recorded file (in my case an .mp3) is not playable on Windows, while the same file plays fine on a Mac.
I strongly suspect the audio recording settings I am using in the following code,
_audioRecorder = [[AVAudioRecorder alloc] initWithURL:audioFileURL settings:_recordingSettings error:&error];
and the settings I am using are:
_recordingSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
[NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
[NSNumber numberWithInt:16], AVEncoderBitRateKey,
[NSNumber numberWithInt: 2], AVNumberOfChannelsKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
nil];
I have already tried various AVFormatIDKey values, such as
kAudioFormatLinearPCM - .lpcm
kAudioFormatMPEG4AAC - .aac
kAudioFormatMPEGLayer3 - .mp3
but with no luck.
I appreciate any help. If it is not possible, I would like to know why!
After some trial and error, I got it working with these settings:
_recordingSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsNonInterleaved,
[NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
[NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
[NSNumber numberWithFloat:2000.0], AVSampleRateKey,
nil];
Important: The filename should be "someName.wav"
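For context, a complete sketch of wiring those PCM settings into a recorder might look like this (the file path, sample rate, and session handling are my assumptions, not from the answer above):

```objc
#import <AVFoundation/AVFoundation.h>

// Record linear PCM into a .wav container so the file plays back on Windows.
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"someName.wav"];
NSURL *audioFileURL = [NSURL fileURLWithPath:path];

NSDictionary *recordingSettings = @{
    AVFormatIDKey             : @(kAudioFormatLinearPCM),
    AVLinearPCMBitDepthKey    : @16,
    AVLinearPCMIsBigEndianKey : @NO,
    AVLinearPCMIsFloatKey     : @NO,
    AVNumberOfChannelsKey     : @1,
    AVSampleRateKey           : @44100.0f  // a standard rate; the answer above used 2000.0
};

NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:audioFileURL
                                                        settings:recordingSettings
                                                           error:&error];
[recorder prepareToRecord];
[recorder record];
```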
This was the question I had asked in order to write data into Firebase through an Android app; now I want to represent the information the same way in an iPhone app. I am using a dictionary to represent the key-value pairs.
f = [[Firebase alloc] initWithUrl:@"https://ums-ios.firebaseio.com/"];
NSDictionary *dictionary = [NSDictionary dictionaryWithObjectsAndKeys:
@"latitude",[[NSNumber numberWithFloat:latitude] stringValue],
@"longitude",[[NSNumber numberWithFloat:longitude] stringValue],
@"timestamp",[[NSNumber numberWithDouble:timeStamp] stringValue],
nil];
Firebase* tempRef = [f childByAppendingPath:@"mobileNum"];
[tempRef setValue:dictionary];
I am getting an exception when I try to run this. When I replace the NSDictionary with an NSArray, my data gets mapped to array indices, which is not what I need.
Any suggestions?
I think it's unrelated, but your dictionary is backwards. I also find it terribly annoying, and I feel like I always have to double-check, but values come before keys in dictionaryWithObjectsAndKeys:
NSDictionary *dictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[[NSNumber numberWithFloat:latitude] stringValue], @"latitude",
[[NSNumber numberWithFloat:longitude] stringValue], @"longitude",
[[NSNumber numberWithDouble:timeStamp] stringValue], @"timestamp",
nil];
The full code would be this
float latitude = 30.472;
float longitude = 42.467;
double timeStamp = NSTimeIntervalSince1970;
NSDictionary *dictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[[NSNumber numberWithFloat:latitude] stringValue], @"latitude",
[[NSNumber numberWithFloat:longitude] stringValue], @"longitude",
[[NSNumber numberWithDouble:timeStamp] stringValue], @"timestamp",
nil];
Firebase * f = [[Firebase alloc] initWithUrl:@"https://ums-ios.firebaseio.com/"];
Firebase* tempRef = [f childByAppendingPath:@"mobileNum"];
[tempRef setValue:dictionary];
This will create the following URL scheme (I've added .json so you can view it easily):
https://ums-ios.firebaseio.com/mobileNum/.json = the dictionary
https://ums-ios.firebaseio.com/mobileNum/latitude/.json = value for latitude
https://ums-ios.firebaseio.com/mobileNum/longitude/.json = value for longitude
https://ums-ios.firebaseio.com/mobileNum/timestamp/.json = value for timestamp
A note on security: I just saved this test data to your Firebase, so make sure you update your security rules before releasing!
I successfully integrated OpenFeint into my app on iPhone/iPod Touch, but the layout is broken on the iPad.
I copy-pasted the code from the OpenFeint sample app.
- (void) performOpenfeintInitLogic
{
UIViewController * rootVC = [UIApplication sharedApplication].keyWindow.rootViewController;
NSDictionary* settings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:UIInterfaceOrientationPortrait], OpenFeintSettingDashboardOrientation,
#"asdasdasdas", OpenFeintSettingShortDisplayName,
[NSNumber numberWithBool:YES], OpenFeintSettingEnablePushNotifications,
[NSNumber numberWithBool:NO], OpenFeintSettingDisableUserGeneratedContent,
[NSNumber numberWithBool:NO], OpenFeintSettingAlwaysAskForApprovalInDebug,
#ifdef DEBUG
[NSNumber numberWithInt:OFDevelopmentMode_DEVELOPMENT], OpenFeintSettingDevelopmentMode,
#else
[NSNumber numberWithInt:OFDevelopmentMode_RELEASE], OpenFeintSettingDevelopmentMode,
#endif
window, OpenFeintSettingPresentationWindow,
nil
];
[OpenFeint initializeWithProductKey:@"hgghf"
andSecret:@"nbvnb"
andDisplayName:@"ncvnv"
andSettings:settings
andDelegates:nil];
[OpenFeint launchDashboard];
OFGameFeedView * gameFeed = [OFGameFeedView gameFeedView];
[rootVC.view addSubview:gameFeed];
}
Here is the broken layout:
It's running now with this configuration. The error was a wrong bundle; the right one is OFResources_Universal.bundle.
The config is now:
UIViewController * rootVC = [UIApplication sharedApplication].keyWindow.rootViewController;
NSDictionary* settings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:UIInterfaceOrientationLandscapeRight], OpenFeintSettingDashboardOrientation,
#"asdasdads", OpenFeintSettingShortDisplayName,
[NSNumber numberWithBool:YES], OpenFeintSettingGameCenterEnabled,
[NSNumber numberWithBool:YES], OpenFeintSettingEnablePushNotifications,
[NSNumber numberWithBool:NO], OpenFeintSettingDisableUserGeneratedContent,
[NSNumber numberWithBool:NO], OpenFeintSettingAlwaysAskForApprovalInDebug,
#ifdef DEBUG
[NSNumber numberWithInt:OFDevelopmentMode_DEVELOPMENT], OpenFeintSettingDevelopmentMode,
#else
[NSNumber numberWithInt:OFDevelopmentMode_RELEASE], OpenFeintSettingDevelopmentMode,
#endif
rootVC, OpenFeintSettingPresentationWindow,
nil
];
[OpenFeint initializeWithProductKey:@"asdasdas"
andSecret:@"asdasdasd"
andDisplayName:@"asdasdsad"
andSettings:settings
andDelegates:nil];
Is there any way to convert my recorded .WAV file to an .M4A file on iOS?
I also need to convert the .M4A file back to .WAV.
I tried Audio Queue Services, but I was not able to make it work.
This post, From iPod Library to PCM Samples in Far Fewer Steps Than Were Previously Necessary, describes how to load a file from the user's iPod library and write it to the file system as a linear PCM (.wav) file.
I believe the change you will need to make in order to load a file from the file system instead is in the NSURL that describes where the asset is:
-(IBAction) convertTapped:(id)sender {
    // set up an AVAssetReader to read from the file system
    NSURL *assetURL = [[NSURL alloc] initFileURLWithPath:@"your_m4a.m4a"];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    NSError *assetError = nil;
    AVAssetReader *assetReader = [[AVAssetReader assetReaderWithAsset:songAsset
                                                                error:&assetError] retain];
    if (assetError) {
        NSLog(@"error: %@", assetError);
        return;
    }
If you are going in the opposite direction, you will need to change the formatting on the output end:
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)],
AVChannelLayoutKey,
[NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
[NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
nil];
I am not sure of the exact settings that would go in here for m4a, but this should get you closer.
The other option would be to pull in the ffmpeg lib and do all your conversion there, but that seems different from what you want.
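For the WAV-to-M4A direction, commonly used AAC writer settings look something like this (the bit rate and channel count are my assumptions, not verified against the linked post):

```objc
// AAC output settings for an AVAssetWriterInput; pair this with an
// AVAssetWriter created with fileType AVFileTypeAppleM4A.
NSDictionary *aacSettings = @{
    AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
    AVSampleRateKey       : @44100.0f,
    AVNumberOfChannelsKey : @2,
    AVEncoderBitRateKey   : @128000  // 128 kbps is a common default
};
```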
TPAACAudioConverter works fine