GPUImage failing the CVPixelBufferGetPlaneCount check on iOS

I'm currently bringing a legacy project up from iOS 5/6 to iOS 6/7.
Part of this project involves taking a picture using the GPUImage library, processing it with a crop filter, then optionally adding some saturation and blur effects. I am currently using version 0.1.2, installed via CocoaPods.
The problem I am having is that when I try to capture an image from the camera, I hit the following assert at GPUImageStillCamera.m, line 254:
if (CVPixelBufferGetPlaneCount(cameraFrame) > 0)
{
    NSAssert(NO, @"Error: no downsampling for YUV input in the framework yet");
}
where cameraFrame is a CVImageBufferRef
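The assert fires when the camera delivers planar (YUV) frames rather than BGRA, since CVPixelBufferGetPlaneCount returns a non-zero count for planar formats. To confirm which format the session is actually producing, a quick check like this can go right before the assert (a debugging sketch, not part of the original code):

OSType format = CVPixelBufferGetPixelFormatType(cameraFrame);
if (format == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange ||
    format == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
    NSLog(@"Camera is delivering YUV (biplanar) frames");
} else if (format == kCVPixelFormatType_32BGRA) {
    NSLog(@"Camera is delivering BGRA frames");
}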
I reproduced the code where this is called and moved it to another project, where it works perfectly.
Once I moved this reproduced class back into the main project, I was hitting the assert every time.
Things I've ruled out with my own debugging:
- 64-bit (it happens on both 32-bit and 64-bit)
- a different library version
- initial object setup / code / usage
This has led me to believe it might be a project setting I've overlooked. Any help, or even a pointer in the right direction, would be very welcome. I've spent a good 1-2 days on this now and am still entirely lost!
I've included the stripped-down class below, which shows the general use.
#import "ViewController.h"
#import "GPUImage.h"
#import "ImageViewController.h"
#interface ViewController ()
#property (nonatomic, strong) IBOutlet GPUImageView *gpuImageView;
#property (nonatomic, strong) GPUImageStillCamera *camera;
#property (nonatomic, strong) GPUImageCropFilter *cropFilter;
#end
#implementation ViewController
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[self setupCameraCapture];
}
- (void)setupCameraCapture
{
if (self.camera) {
return;
}
self.cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0, 0, 1, 0.5625)];
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear]) {
self.camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionBack];
}
else {
self.camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionFront];
}
self.camera.outputImageOrientation = UIInterfaceOrientationPortrait;
NSError *error = nil;
[self.camera.inputCamera lockForConfiguration:&error];
[self.camera.inputCamera setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
[self.camera.inputCamera setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
if ([self.camera.inputCamera respondsToSelector:#selector(isLowLightBoostSupported)]) {
BOOL isSupported = self.camera.inputCamera.isLowLightBoostSupported;
if (isSupported) {
[self.camera.inputCamera setAutomaticallyEnablesLowLightBoostWhenAvailable:YES];
}
}
[self.camera.inputCamera unlockForConfiguration];
[self.camera addTarget:self.cropFilter];
[self.cropFilter addTarget:self.gpuImageView];
[self.camera startCameraCapture];
}
- (IBAction)capturePressed:(id)sender
{
[self.camera capturePhotoAsImageProcessedUpToFilter:self.cropFilter withCompletionHandler:^(UIImage *image, NSError *error) {
// do something with the image here
}];
}
#end

The actual culprit was a swizzled method, found by my colleague Marek, hidden away in the depths of the old codebase. The code above works fine.
Lesson: if you really have to swizzle something, make sure you leave proper documentation for the future devs.
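For anyone who inherits a similar mystery, a minimal sketch of what documented swizzling can look like (the category and method names here are hypothetical, not the actual method from our codebase):

#import <objc/runtime.h>

@implementation UIViewController (Tracking)

+ (void)load
{
    // SWIZZLED: exchanges viewDidAppear: with tracked_viewDidAppear: for analytics.
    // Documented here so a future GPUImage-style debugging hunt starts in the right place.
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        Method original = class_getInstanceMethod(self, @selector(viewDidAppear:));
        Method swizzled = class_getInstanceMethod(self, @selector(tracked_viewDidAppear:));
        method_exchangeImplementations(original, swizzled);
    });
}

- (void)tracked_viewDidAppear:(BOOL)animated
{
    // After the exchange, this call invokes the original viewDidAppear:.
    [self tracked_viewDidAppear:animated];
    // ... tracking code ...
}

@end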

Related

ReplayKit - keeping a reference to RPPreviewViewController in Objective-C

I have a problem keeping a reference to an RPPreviewViewController in ReplayKit with Objective-C, and I'm wondering what I'm doing wrong.
The .h file:
@interface ReplayKitHelper : NSObject <RPPreviewViewControllerDelegate, RPScreenRecorderDelegate>

- (void)startRecording;
- (void)stopRecording;
- (void)previewRecording;

@property (strong) RPPreviewViewController *previewViewControllerRef;

@end
The .mm file:
@implementation ReplayKitHelper

@synthesize previewViewControllerRef;

- (void)startRecording
{
    RPScreenRecorder *recorder = RPScreenRecorder.sharedRecorder;
    recorder.delegate = self;
    [recorder startRecordingWithMicrophoneEnabled:YES handler:^(NSError *error)
    {
    }];
}

- (void)stopRecording
{
    RPScreenRecorder *recorder = RPScreenRecorder.sharedRecorder;
    [recorder stopRecordingWithHandler:^(RPPreviewViewController *previewViewController, NSError *error)
    {
        if (error == nil)
        {
            if (previewViewController != nil)
            {
                previewViewControllerRef = previewViewController;
            }
        }
    }];
}

- (void)previewRecording
{
    if (previewViewControllerRef != nil)
    {
        previewViewControllerRef.modalPresentationStyle = UIModalPresentationFullScreen;
        previewViewControllerRef.previewControllerDelegate = self;
        // IOSController is my main UIViewController
        [[IOSAppDelegate GetDelegate].IOSController presentViewController:previewViewControllerRef animated:YES completion:nil];
    }
}

@end
At runtime I call the methods startRecording, stopRecording, and previewRecording, in that order. Everything goes fine until previewRecording, where it looks like previewViewControllerRef is no longer valid (it's not nil, but the app crashes when I try to refer to it).
When I call [self previewRecording] inside the stopRecordingWithHandler block, right after I assign the reference, everything works fine.
It looks like the previewViewController from the handler is released right after the app leaves the block.
Most of the examples are written in Swift; unfortunately I'm condemned to Objective-C. In the Swift examples the reference to previewViewController is simply assigned to a variable, but in Objective-C that doesn't seem to work.
Do you have any ideas what's wrong here?
I'm going to assume that you are using ARC; if so, there is no need to synthesize properties anymore.
Change your RPPreviewViewController property in the interface file to:
@property (nonatomic, strong) RPPreviewViewController *previewViewController;
Drop the @synthesize.
Then, in the stopRecording handler, you can keep a reference to the available RPPreviewViewController like so:
- (void)stopScreenRecording {
    RPScreenRecorder *sharedRecorder = RPScreenRecorder.sharedRecorder;
    [sharedRecorder stopRecordingWithHandler:^(RPPreviewViewController *previewViewController, NSError *error) {
        if (error) {
            NSLog(@"stopScreenRecording: %@", error.localizedDescription);
        }
        if (previewViewController) {
            self.previewViewController = previewViewController;
        }
    }];
}
In my experience ReplayKit is still buggy and there isn't a lot of documentation about it yet.

pebbleCentral:watchDidConnect never gets called on iOS

I have an old Pebble Classic watch, upgraded to the latest firmware (3.8.2), and I'm using the latest Pebble SDK.
I have followed the few simple steps to install the SDK, set up an Xcode project, and add the code to initialise and connect:
https://developer.getpebble.com/guides/mobile-apps/ios/
My problem is, that the delegate method pebbleCentral:watchDidConnect never gets called!
I am using the Pebble Time app on the iPad to install the watchApp in the watch, so I know the iPad is connected to the watch. The same iPad runs the iOS app, which apparently does not discover the watch.
I have tried to import an old test project from a colleague, who had it running a year or two ago. Same watch, same watchApp, but of course older firmware and SDK versions. Same result...
I think the documentation on the pebble site is quite simple and easy to follow. However, I feel I am missing some explanations of how and when this watchDidConnect is supposed to be triggered.
I am most likely missing some simple step somewhere, but I am quite lost in where to look!
Any ideas are welcome!
EDIT: My code looks like this:
ViewController.h:
#import <UIKit/UIKit.h>
@import PebbleKit;

@interface ViewController : UIViewController <PBPebbleCentralDelegate>
@end
ViewController.m:
#import "ViewController.h"
#interface ViewController ()
#property (weak, nonatomic) PBWatch* connectedWatch;
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
[PBPebbleCentral defaultCentral].delegate = self;
[[PBPebbleCentral defaultCentral] run];
NSLog(#"Pebble initialised");
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (void)pebbleCentral:(PBPebbleCentral*)central watchDidConnect (PBWatch*)watch isNew:(BOOL)isNew {
NSLog(#"Pebble connected: %#", [watch name]);
self.connectedWatch = watch;
}
- (void)pebbleCentral:(PBPebbleCentral*)central watchDidDisconnect:(PBWatch*)watch {
NSLog(#"Pebble disconnected: %#", [watch name]);
if ([watch isEqual:self.connectedWatch]) {
self.connectedWatch = nil;
}
}
#end
Are you calling the -run method after setting up your Pebble central? I noticed that the code snippet in the link you posted does not show the central's delegate being set.
[PBPebbleCentral defaultCentral].delegate = self; // Set your delegate
[PBPebbleCentral defaultCentral].appUUID = [[NSUUID alloc] initWithUUIDString:@"Your Pebble App UUID"]; // Set the app UUID
[[PBPebbleCentral defaultCentral] run]; // Call -run on the central
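Note the middle line: the viewDidLoad in the question sets the delegate and calls -run, but never sets appUUID. Folding the answer into the asker's viewDidLoad would look something like this (the UUID string below is a placeholder; it needs to be the watchapp's own UUID, the one in its appinfo.json):

- (void)viewDidLoad {
    [super viewDidLoad];
    PBPebbleCentral *central = [PBPebbleCentral defaultCentral];
    central.delegate = self;
    // Placeholder UUID - replace with the watchapp's UUID so the central knows which app to track.
    central.appUUID = [[NSUUID alloc] initWithUUIDString:@"00000000-0000-0000-0000-000000000000"];
    [central run];
    NSLog(@"Pebble initialised");
}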

AVAudioPlayer and Blocks

As Apple encourages the usage of blocks, and I wanted to do a series of animations with sound output in between them (which is basically like a to-do list), I wanted to implement this using blocks.
Unfortunately, AVAudioPlayer doesn't appear to support onCompletion blocks in the manner UIView animations do.
So I thought it would be cool to add that support to AVAudioPlayer.
What I've done is this.
The header:
#import <AVFoundation/AVFoundation.h>

typedef void (^AVPlaybackCompleteBlock)(void);

@interface AVAudioPlayer (AVAudioPlayer_blockSupport)

@property (nonatomic, copy) AVPlaybackCompleteBlock block;

- (id)initWithContentsOfURL:(NSURL *)pathURL error:(NSError **)error onCompletion:(AVPlaybackCompleteBlock)block;
- (void)setBlock:(AVPlaybackCompleteBlock)block;
- (AVPlaybackCompleteBlock)block;
- (void)executeBlock;

@end
and the .m file:

#import "AVAudioPlayer+blocks.h"

@implementation AVAudioPlayer (AVAudioPlayer_blockSupport)

- (id)initWithContentsOfURL:(NSURL *)pathURL error:(NSError **)error onCompletion:(AVPlaybackCompleteBlock)block {
    self = [[AVAudioPlayer alloc] initWithContentsOfURL:pathURL error:error];
    self.block = block;
    return self;
}

- (void)setBlock:(AVPlaybackCompleteBlock)block {
    self.block = block; // NOTE: this calls setBlock: again - infinite recursion
}

- (AVPlaybackCompleteBlock)block {
    return self.block; // NOTE: same problem - the getter calls itself
}

- (void)executeBlock {
    if (self.block != NULL) {
        self.block();
    }
}

@end
After doing this, I thought I should be able to create a new audio player like this:
player = [[AVAudioPlayer alloc] initWithContentsOfURL:pathURL error:&error onCompletion:block];
This appears to work.
Now, in the delegate, I try to execute the attached block:
if (localPlayer.block) {
    [localPlayer executeBlock];
}
Unfortunately, when I try to run the code, it appears to loop infinitely (the accessors above call themselves). I wanted to use @synthesize instead, but that's not available in a category...
If I don't implement the method, I'm stuck with '-[AVAudioPlayer setBlock:]: unrecognized selector sent to instance', which makes sense, since there is no method with that name.
I found this: Block references as instance vars in Objective-C. So I thought I should be able to attach the additional property (my block) to the audio player.
I figured it out: I needed to use
objc_setAssociatedObject(self, &defaultHashKey, blocked, OBJC_ASSOCIATION_COPY_NONATOMIC);
to store and access the property. Maybe that's what jere meant by saying I have to take care of the memory management myself.
#import <objc/runtime.h>

static char defaultHashKey; // its address serves as a unique key for the associated object

- (void)setBlock:(AVPlaybackCompleteBlock)blocked {
    objc_setAssociatedObject(self, &defaultHashKey, blocked, OBJC_ASSOCIATION_COPY_NONATOMIC);
}

- (AVPlaybackCompleteBlock)block {
    return objc_getAssociatedObject(self, &defaultHashKey);
}

ARC semantic issue "multiple methods named 'setRotation:'" while archiving only

My project in cocos2d v3 is throwing the ARC semantic issue
Multiple methods named 'setRotation:' found with mismatched result, parameter type or attributes
while archiving (release mode). It runs fine when deploying to the simulator/device (debug mode).
In release mode, the compiler gets confused between the implementations of rotation in UIRotationGestureRecognizer and CCNode.
When I got the error in CCBAnimationManager.m, I cast the object calling the selector setRotation: to (CCNode *), but then the error cropped up in CCActionInterval. I'm hoping there is a better solution than typecasting everywhere in the cocos2d library.
What am I doing wrong?
Thank you for your time.
EDIT
@interface CCAction : NSObject <NSCopying> {
    id __unsafe_unretained _originalTarget;
    id __unsafe_unretained _target;
    NSInteger _tag;
}

@property (nonatomic, readonly, unsafe_unretained) id target;
@property (nonatomic, readonly, unsafe_unretained) id originalTarget;
@property (nonatomic, readwrite, assign) NSInteger tag;

in CCAction.m:

@synthesize tag = _tag, target = _target, originalTarget = _originalTarget;

- (void)startWithTarget:(id)aTarget
{
    _originalTarget = _target = aTarget;
}
Class hierarchy:

@interface CCActionFiniteTime : CCAction <NSCopying>
@interface CCActionInterval : CCActionFiniteTime <NSCopying>
@interface CCBRotateTo : CCActionInterval <NSCopying>
In CCBRotateTo.m:

- (void)startWithTarget:(CCNode *)aTarget
{
    [super startWithTarget:aTarget];
    startAngle_ = [self.target rotation];
    diffAngle_ = dstAngle_ - startAngle_;
}

- (void)update:(CCTime)t
{
    [self.target setRotation:startAngle_ + diffAngle_ * t];
}
This problem gave me a big headache. Though I upgraded cocos2d to v2.2 for my old project (too complex to update to v3), I still got this warning, and any animation I created that used rotation in SpriteBuilder acted oddly, as I described here:
Rotation animation issue on iPhone5S with cocos2d 2.0
Finally I used type casting to solve it, as follows, in CCBAnimationManager.m:
@implementation CCBRotateTo

- (void)startWithTarget:(CCNode *)aTarget
{
    [super startWithTarget:aTarget];
    startAngle_ = [(CCNode *)self.target rotation];
    diffAngle_ = dstAngle_ - startAngle_;
}

- (void)update:(ccTime)t
{
    [(CCNode *)self.target setRotation:startAngle_ + diffAngle_ * t];
}

@end

With this change, I can now support arm64 too.
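For context on why the casts work: self.target is typed id, so when the compiler sees [self.target setRotation:...] it has to pick among every visible setRotation: declaration, and it complains when their signatures disagree. A minimal illustration with hypothetical classes, showing the same float-vs-CGFloat mismatch that bites when archiving for arm64 (where CGFloat is a double):

#import <UIKit/UIKit.h>

@interface Sprite : NSObject
- (void)setRotation:(float)rotation;   // cocos2d-style: takes a float
@end

@interface Recognizer : NSObject
- (void)setRotation:(CGFloat)rotation; // UIKit-style: takes a CGFloat (double on 64-bit)
@end

void spin(id target) {
    // Ambiguous: multiple setRotation: candidates with mismatched parameter types.
    [target setRotation:1.0f];
    // Casting picks a single declaration, so the ambiguity disappears:
    [(Sprite *)target setRotation:1.0f];
}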
Update your cocos2d v3 to the latest version (RC4 for now).
I was using Xcode 5.0 and cocos2d v3 RC1 with no problem, but after updating Xcode to 5.1 I had this problem.
So I updated cocos2d v3 to RC4 and now it's working fine.
You can find the latest cocos2d versions here.

Detect frequency value using iPhone

I'm developing an app that should detect the frequency of a certain sound. I based my app on Pitch Detector. I imported the files that are in the Pitch Detector example, then adapted my code to use these new classes. I'm posting my code here to explain my issue:
ViewController.h
#import <UIKit/UIKit.h>

@class RIOInterface;

@interface ViewController : UIViewController {
    BOOL isListening;
    float currentFrequency;
    RIOInterface *rioRef; // HERE I'M GETTING THE ISSUE
}

- (IBAction)startListenWatermark:(UIButton *)sender;

@property (nonatomic, assign) RIOInterface *rioRef;
@property (nonatomic, assign) float currentFrequency;
@property (assign) BOOL isListening;

#pragma mark Listener Controls
- (void)startListener;
- (void)stopListener;
- (void)frequencyChangedWithValue:(float)newFrequency;

@end
ViewController.m
@synthesize isListening;
@synthesize rioRef;
@synthesize currentFrequency;

- (IBAction)startListenWatermark:(UIButton *)sender {
    if (isListening) {
        [self stopListener];
    } else {
        [self startListener];
    }
    isListening = !isListening;
}

- (void)startListener {
    [rioRef startListening:self];
}

- (void)stopListener {
    [rioRef stopListening];
}

- (void)frequencyChangedWithValue:(float)newFrequency {
    NSLog(@"FREQUENCY: %f", newFrequency);
}
In the code you can see where my issue is, and Xcode says: Existing instance variable 'rioRef' with assign attribute must be __unsafe_unretained. If I delete the line that gives this error, the app doesn't call the methods [rioRef startListening:self] and [rioRef stopListening].
In the file RIOInterface.mm I'm getting another error on line 97, and Xcode suggested I change it from:
RIOInterface *THIS = (RIOInterface *)inRefCon;
to:
RIOInterface *THIS = (RIOInterface *)CFBridgingRelease(inRefCon);
Then it gives me this other error on line 283:
callbackStruct.inputProcRefCon = self;
It says: Assigning to 'void *' from incompatible type 'RIOInterface *const __strong', so I looked on the web and found this solution:
callbackStruct.inputProcRefCon = (__bridge void *)self;
I'm not sure whether it's right to do so or not. I hope you can help me solve these issues; thanks in advance.
I solved the 2nd and 3rd problems by disabling ARC for the file containing the code above (the -fno-objc-arc compiler flag). For the first problem, I solved it by writing this code:
rioRef = [RIOInterface sharedInstance];
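As an aside, the usual ARC-friendly alternative to disabling ARC is to bridge without an ownership transfer on both sides of the void * boundary, rather than the CFBridgingRelease the fix-it suggested (which transfers ownership and would over-release on repeated callbacks). A sketch of that pattern around an AudioUnit render callback (the RIOInterface type is from the Pitch Detector sample; the callback body is elided):

#import <AudioToolbox/AudioToolbox.h>

static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // __bridge: reinterpret the pointer with no ownership transfer either way.
    RIOInterface *THIS = (__bridge RIOInterface *)inRefCon;
    // ... perform the FFT / pitch detection with THIS ...
    return noErr;
}

// At setup time (the line-283 assignment from the question), hand self across the C boundary:
// callbackStruct.inputProcRefCon = (__bridge void *)self;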
