Using OSAtomicCompareAndSwapPtr on iOS with ARC enabled - ios

Before automatic reference counting (ARC), you could do the appropriate pointer casts in Objective-C so that you could use bool OSAtomicCompareAndSwapPtr(void* oldValue, void* newValue, void* volatile *theValue); to attempt to atomically swap pointers when dealing with multithreaded access.
Under ARC these pointer casts are not valid. Is there an equivalent atomic pointer swap available under ARC for iOS? I was hoping to avoid the more expensive locking if this alternative is still available.

Disclaimer: code in this answer is not tested!
First of all I'd like to mention that most pointer uses don't really need compare-and-swap. Plain aligned pointer reads and writes are atomic by themselves at the hardware level (see this SO answer for more detail); the same goes for ARM. So if you implement atomic getters and setters, you only need a memory barrier to guarantee that other threads see fully initialized objects:
NSObject * global;

// Reading the pointer is a single aligned load, hence atomic.
NSObject * get() { return global; }

// The barrier makes sure the object's initialization is visible to other
// threads before the pointer itself becomes visible.
void set(NSObject * value) { OSMemoryBarrier(); global = value; }
Now back to the question, because who knows, maybe there are real uses for compare-and-swapping objects. The casts are still possible, you just declare them differently now:
NSString * a = @"A";
NSObject * b = @"B";
OSAtomicCompareAndSwapPtrBarrier(
    (__bridge void *)a,
    (__bridge void *)b,
    (__bridge void * *)&a);
However this code has a problem: the string @"A" loses a reference and @"B" ends up referenced twice, without ARC knowing about either. Therefore @"A" will leak, and the program will likely crash when leaving the scope because @"B" will be released twice while only having a retain count of 1.
I think the only option is to use Core Foundation objects. You can use the fact that NSObject is toll-free bridged with CFType. I couldn't find any definitive documentation on this but it follows from common sense and practical evidence. So e.g. it is possible to implement a singleton:
CFTypeRef instance;

Thingamabob * getInstance() {
    if (!instance) {
        // Transfer ownership out of ARC so the reference can be managed manually.
        CFTypeRef t = (__bridge_retained CFTypeRef)[Thingamabob new];
        // Publish it only if nobody else got there first; otherwise drop our copy.
        if (!OSAtomicCompareAndSwapPtrBarrier(NULL, t, &instance)) {
            CFRelease(t);
        }
    }
    return (__bridge Thingamabob *)instance;
}

You may be able to do this fairly easily, if one condition is met, and maybe if you're willing to play games.
Create a new file, mark it as compiled without ARC in the Compile Sources build phase (the -fno-objc-arc flag), and put this swap into a small C function. At the top of the function get the objects' retain counts, and if they are equal (and you have reason to believe they are not sitting in an autorelease pool) you can just swap them, as ARC will ensure the proper releases for each.
If they are not equal, well, you can play games by changing the retain count.
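To make that concrete, here is a minimal sketch (untested, with a hypothetical file and function name) of such a helper; the file is compiled without ARC so the plain casts are legal again, and retain/release bookkeeping for both objects remains the caller's responsibility:
// AtomicSwapHelper.m - hypothetical helper, compiled with -fno-objc-arc
#import <Foundation/Foundation.h>
#import <libkern/OSAtomic.h>

// Attempts to atomically replace *location with newValue if it still contains oldValue.
// Because this translation unit is not compiled under ARC, no bridging casts are needed.
BOOL AtomicSwapObject(id oldValue, id newValue, id volatile *location)
{
    return OSAtomicCompareAndSwapPtrBarrier((void *)oldValue,
                                            (void *)newValue,
                                            (void * volatile *)location) ? YES : NO;
}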

I found these two comparable classes, a before and an after, which help illustrate the update (from the deprecated OSAtomic calls to C11 stdatomic):
BEFORE
#import "SBlockDisposable.h"
#import <libkern/OSAtomic.h>
#import <objc/runtime.h>
@interface SBlockDisposable ()
{
    void *_block;
}
@end

@implementation SBlockDisposable

- (instancetype)initWithBlock:(void (^)())block
{
    self = [super init];
    if (self != nil)
    {
        _block = (__bridge_retained void *)[block copy];
    }
    return self;
}

- (void)dealloc
{
    void *block = _block;
    if (block != NULL)
    {
        if (OSAtomicCompareAndSwapPtr(block, 0, &_block))
        {
            if (block != nil)
            {
                __strong id strongBlock = (__bridge_transfer id)block;
                strongBlock = nil;
            }
        }
    }
}

- (void)dispose
{
    void *block = _block;
    if (block != NULL)
    {
        if (OSAtomicCompareAndSwapPtr(block, 0, &_block))
        {
            if (block != nil)
            {
                __strong id strongBlock = (__bridge_transfer id)block;
                ((dispatch_block_t)strongBlock)();
                strongBlock = nil;
            }
        }
    }
}

@end
AFTER
//
// KAGRACDisposable.m
// ReactiveCocoa
//
// Created by Josh Abernathy on 3/16/12.
// Copyright (c) 2012 GitHub, Inc. All rights reserved.
//
#import "KAGRACDisposable.h"
#import "KAGRACScopedDisposable.h"
#import <stdatomic.h>
@interface KAGRACDisposable () {
    // A copied block of type void (^)(void) containing the logic for disposal,
    // a pointer to `self` if no logic should be performed upon disposal, or
    // NULL if the receiver is already disposed.
    //
    // This should only be used atomically.
    void * volatile _disposeBlock;
}
@end

@implementation KAGRACDisposable

#pragma mark Properties

- (BOOL)isDisposed {
    return _disposeBlock == NULL;
}

#pragma mark Lifecycle

- (id)init {
    self = [super init];
    if (self == nil) return nil;

    _disposeBlock = (__bridge void *)self;
    atomic_thread_fence(memory_order_seq_cst);

    return self;
}

- (id)initWithBlock:(void (^)(void))block {
    NSCParameterAssert(block != nil);

    self = [super init];
    if (self == nil) return nil;

    _disposeBlock = (void *)CFBridgingRetain([block copy]);
    atomic_thread_fence(memory_order_seq_cst);

    return self;
}

+ (instancetype)disposableWithBlock:(void (^)(void))block {
    return [[self alloc] initWithBlock:block];
}

- (void)dealloc {
    if (_disposeBlock == NULL || _disposeBlock == (__bridge void *)self) return;

    CFRelease(_disposeBlock);
    _disposeBlock = NULL;
}

#pragma mark Disposal

- (void)dispose {
    void (^disposeBlock)(void) = NULL;

    while (YES) {
        void *blockPtr = _disposeBlock;
        if (atomic_compare_exchange_strong((volatile _Atomic(void *) *)&_disposeBlock, &blockPtr, NULL)) {
            if (blockPtr != (__bridge void *)self) {
                disposeBlock = CFBridgingRelease(blockPtr);
            }
            break;
        }
    }

    if (disposeBlock != nil) disposeBlock();
}

#pragma mark Scoped Disposables

- (KAGRACScopedDisposable *)asScopedDisposable {
    return [KAGRACScopedDisposable scopedDisposableWithDisposable:self];
}

@end

Related

How do you have multiple UnityAppController overrides?

I'm using the Vuforia plugin within Unity 2018.1 and I have my own override for UnityAppController. When I do that the Vuforia one wipes out mine so it never gets called.
I found many possible solutions but the only one I've managed to get working is to manually replace the Vuforia one with one that also calls my own code too. Which is a total hack and not a good way forward.
I found this possible solution (Can you have more than one subclass of UnityAppController?) but I don't understand it.
#import "UnityAppController.h"
namespace {
typedef BOOL (*ApplicationDidFinishLaunchingWithOptionsImp)(UnityAppController *appController,
SEL selector,
UIApplication *application,
NSDictionary *launchOptions);
ApplicationDidFinishLaunchingWithOptionsImp OriginalApplicationDidFinishLaunchingWithOptions;
BOOL ApplicationDidFinishLaunchingWithOptions(UnityAppController *appController,
SEL selector,
UIApplication *application,
NSDictionary *launchOptions) {
// Initialize Google Play Games, etc
return OriginalApplicationDidFinishLaunchingWithOptions(appController, selector, application, launchOptions);
}
IMP SwizzleMethod(SEL selector, Class klass, IMP newImp) {
Method method = class_getInstanceMethod(klass, selector);
if (method != nil) {
return class_replaceMethod(klass, selector, newImp, method_getTypeEncoding(method));
}
return nil;
}
} // anonymous namespace
#interface AppController : UnityAppController
#end
#implementation AppController
+ (void)load {
OriginalApplicationDidFinishLaunchingWithOptions = (ApplicationDidFinishLaunchingWithOptionsImp)
SwizzleMethod(#selector(application:didFinishLaunchingWithOptions:),
[UnityAppController class],
(IMP)&ApplicationDidFinishLaunchingWithOptions);
}
#end
Where do you add your own code once you have swizzled the original?
This is the code I used to replace the Vuforia version:
@implementation VuforiaNativeRendererController

- (BOOL)application:(UIApplication*)application didFinishLaunchingWithOptions:(NSDictionary*)launchOptions
{
    //printf_console("Did Finish Launching with options\n");
    NSURL *URL = [launchOptions valueForKey:UIApplicationLaunchOptionsURLKey];
    if (URL)
    {
        const char *URLString = [URL.absoluteString UTF8String];
        //printf_console("Application started with URL:\n");
        //printf_console("%s\n", URLString);
        UnitySendMessage("Scripts", "openURLComplete", URLString);
    }
    BOOL ret = [super application:application didFinishLaunchingWithOptions:launchOptions];
    if (ret)
    {
        _unityView.backgroundColor = UIColor.clearColor;
    }
    return ret;
}
But how would I introduce that using a "swizzle" method as above?
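One way to approach it (a sketch only, based on the swizzling code above and not tested against Vuforia) is to move the body of your override into the replacement C function: do your own work first, then forward to the original implementation captured at swizzle time. The unityView property used below is assumed to expose the same view as the _unityView ivar in your override:
BOOL ApplicationDidFinishLaunchingWithOptions(UnityAppController *appController,
                                              SEL selector,
                                              UIApplication *application,
                                              NSDictionary *launchOptions) {
    // Custom logic that previously lived in the subclass override.
    NSURL *URL = [launchOptions valueForKey:UIApplicationLaunchOptionsURLKey];
    if (URL) {
        UnitySendMessage("Scripts", "openURLComplete", URL.absoluteString.UTF8String);
    }

    // Always forward to the original (Vuforia/Unity) implementation so it still runs.
    BOOL ret = OriginalApplicationDidFinishLaunchingWithOptions(appController, selector, application, launchOptions);
    if (ret) {
        // Assumes UnityAppController exposes the view through its unityView property.
        appController.unityView.backgroundColor = UIColor.clearColor;
    }
    return ret;
}
The +load swizzle itself stays exactly as shown above; only the replacement function body grows.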

Implementing a simple SuperpoweredAdvancedAudioPlayer in swift

I am trying to implement a simple SuperpoweredAdvancedAudioPlayer in swift. I successfully modified the SuperpoweredCrossExample project so that playerA plays the song on starting the application.
ViewController.mm now looks like this:
#import "ViewController.h"
#import "SuperpoweredAdvancedAudioPlayer.h"
#import "SuperpoweredFilter.h"
#import "SuperpoweredRoll.h"
#import "SuperpoweredFlanger.h"
#import "SuperpoweredIOSAudioIO.h"
#import "SuperpoweredSimple.h"
#import <stdlib.h>
#define HEADROOM_DECIBEL 3.0f
static const float headroom = powf(10.0f, -HEADROOM_DECIBEL * 0.025);
/*
This is a .mm file, meaning it's Objective-C++.
You can perfectly mix it with Objective-C or Swift, until you keep the member variables and C++ related includes here.
Yes, the header file (.h) isn't the only place for member variables.
*/
#implementation ViewController {
SuperpoweredAdvancedAudioPlayer *playerA;
SuperpoweredIOSAudioIO *output;
float *stereoBuffer, volA;
unsigned int lastSamplerate;
}
void playerEventCallbackA(void *clientData, SuperpoweredAdvancedAudioPlayerEvent event, void *value) {
if (event == SuperpoweredAdvancedAudioPlayerEvent_LoadSuccess) {
ViewController *self = (__bridge ViewController *)clientData;
self->playerA->setBpm(126.0f);
self->playerA->setFirstBeatMs(353);
self->playerA->setPosition(self->playerA->firstBeatMs, false, false);
};
}
// This is where the Superpowered magic happens.
static bool audioProcessing(void *clientdata, float **buffers, unsigned int inputChannels, unsigned int outputChannels, unsigned int numberOfSamples, unsigned int samplerate, uint64_t hostTime) {
__unsafe_unretained ViewController *self = (__bridge ViewController *)clientdata;
if (samplerate != self->lastSamplerate) { // Has samplerate changed?
self->lastSamplerate = samplerate;
self->playerA->setSamplerate(samplerate);
};
bool silence = !self->playerA->process(self->stereoBuffer, false, numberOfSamples, self->volA);
if (!silence) SuperpoweredDeInterleave(self->stereoBuffer, buffers[0], buffers[1], numberOfSamples); // The stereoBuffer is ready now, let's put the finished audio into the requested buffers.
return !silence;
}
- (void)viewDidLoad {
[super viewDidLoad];
[self f];
}
- (void) f {
volA = 1.0f * headroom;
if (posix_memalign((void **)&stereoBuffer, 16, 4096 + 128) != 0) abort(); // Allocating memory, aligned to 16.
playerA = new SuperpoweredAdvancedAudioPlayer((__bridge void *)self, playerEventCallbackA, 44100, 0);
playerA->open([[[NSBundle mainBundle] pathForResource:#"lycka" ofType:#"mp3"] fileSystemRepresentation]);
output = [[SuperpoweredIOSAudioIO alloc] initWithDelegate:(id<SuperpoweredIOSAudioIODelegate>)self preferredBufferSize:12 preferredMinimumSamplerate:44100 audioSessionCategory:AVAudioSessionCategoryPlayback channels:2 audioProcessingCallback:audioProcessing clientdata:(__bridge void *)self];
[output start];
playerA->play(false);
}
- (void)dealloc {
delete playerA;
free(stereoBuffer);
#if !__has_feature(objc_arc)
[output release];
[super dealloc];
#endif
}
- (void)interruptionStarted {}
- (void)recordPermissionRefused {}
- (void)mapChannels:(multiOutputChannelMap *)outputMap inputMap:(multiInputChannelMap *)inputMap externalAudioDeviceName:(NSString *)externalAudioDeviceName outputsAndInputs:(NSString *)outputsAndInputs {}
- (void)interruptionEnded { // If a player plays Apple Lossless audio files, then we need this. Otherwise unnecessary.
playerA->onMediaserverInterrupt();
}
#end
I am trying to use the same code in Swift, following the same method used in the SuperpoweredFrequencies project to import C++ files into Swift.
Superpowered.h:
#import <UIKit/UIKit.h>
@interface Superpowered : NSObject

- (void)f;

@end
Superpowered.mm:
#import "Superpowered.h"
#import "Superpowered/Headers/SuperpoweredAdvancedAudioPlayer.h"
#import "Superpowered/Headers/SuperpoweredFilter.h"
#import "Superpowered/Headers/SuperpoweredRoll.h"
#import "Superpowered/Headers/SuperpoweredFlanger.h"
#import "Superpowered/SuperpoweredIOSAudioIO.h"
#import "Superpowered/Headers/SuperpoweredSimple.h"
#import <stdlib.h>
#define HEADROOM_DECIBEL 3.0f
static const float headroom = powf(10.0f, -HEADROOM_DECIBEL * 0.025);
/*
This is a .mm file, meaning it's Objective-C++.
You can perfectly mix it with Objective-C or Swift, until you keep the member variables and C++ related includes here.
Yes, the header file (.h) isn't the only place for member variables.
*/
#implementation Superpowered {
SuperpoweredAdvancedAudioPlayer *playerA;
SuperpoweredIOSAudioIO *output;
float *stereoBuffer, volA;
unsigned int lastSamplerate;
}
void playerEventCallbackA(void *clientData, SuperpoweredAdvancedAudioPlayerEvent event, void *value) {
if (event == SuperpoweredAdvancedAudioPlayerEvent_LoadSuccess) {
Superpowered *self = (__bridge Superpowered *)clientData;
self->playerA->setBpm(126.0f);
self->playerA->setFirstBeatMs(353);
self->playerA->setPosition(self->playerA->firstBeatMs, false, false);
};
}
// This is where the Superpowered magic happens.
static bool audioProcessing(void *clientdata, float **buffers, unsigned int inputChannels, unsigned int outputChannels, unsigned int numberOfSamples, unsigned int samplerate, uint64_t hostTime) {
__unsafe_unretained Superpowered *self = (__bridge Superpowered *)clientdata;
if (samplerate != self->lastSamplerate) { // Has samplerate changed?
self->lastSamplerate = samplerate;
self->playerA->setSamplerate(samplerate);
};
bool silence = !self->playerA->process(self->stereoBuffer, false, numberOfSamples, self->volA);
if (!silence) SuperpoweredDeInterleave(self->stereoBuffer, buffers[0], buffers[1], numberOfSamples); // The stereoBuffer is ready now, let's put the finished audio into the requested buffers.
return !silence;
}
- (void)f {
volA = 1.0f * headroom;
if (posix_memalign((void **)&stereoBuffer, 16, 4096 + 128) != 0) abort(); // Allocating memory, aligned to 16.
playerA = new SuperpoweredAdvancedAudioPlayer((__bridge void *)self, playerEventCallbackA, 44100, 0);
playerA->open([[[NSBundle mainBundle] pathForResource:#"lycka" ofType:#"mp3"] fileSystemRepresentation]);
output = [[SuperpoweredIOSAudioIO alloc] initWithDelegate:(id<SuperpoweredIOSAudioIODelegate>)self preferredBufferSize:12 preferredMinimumSamplerate:44100 audioSessionCategory:AVAudioSessionCategoryPlayback channels:2 audioProcessingCallback:audioProcessing clientdata:(__bridge void *)self];
[output start];
playerA->play(false);
}
- (void)dealloc {
delete playerA;
free(stereoBuffer);
#if !__has_feature(objc_arc)
[output release];
[super dealloc];
#endif
}
- (void)interruptionStarted {}
- (void)recordPermissionRefused {}
- (void)mapChannels:(multiOutputChannelMap *)outputMap inputMap:(multiInputChannelMap *)inputMap externalAudioDeviceName:(NSString *)externalAudioDeviceName outputsAndInputs:(NSString *)outputsAndInputs {}
- (void)interruptionEnded { // If a player plays Apple Lossless audio files, then we need this. Otherwise unnecessary.
playerA->onMediaserverInterrupt();
}
#end
Project-Bridging-Header.h:
#import "Superpowered.h"
Controller.swift:
override func viewDidLoad() {
    super.viewDidLoad()
    let s = Superpowered();
    s.f();
}
When running the app it crashes.
let s = Superpowered(); should be declared outside viewDidLoad(). Declaring it as an instance variable (so the object is not deallocated as soon as viewDidLoad() returns) solved the problem.

Why is my object deallocated by itself when switching threads in Xcode without ARC?

I have an object like this:
typedef void (^ Completion) (Response *);

// Response class
@interface Response : NSObject {
    NSDictionary * kdata;
}
- (id)initWithJson:(NSDictionary *)data;
@property (nonatomic, assign) NSDictionary * data;
@end

@implementation Response
- (id)initWithJson:(NSDictionary *)data { kdata = data; }
- (NSDictionary *) data { return kdata; }
- (void) setData: (NSDictionary *)data { kdata = data; }
- (NSDictionary *) msg { return kdata[@"msg"]; }
@end

// inside a networking class X implementation
- (void) doSomething:(completionBlock)completion {
    NSDictionary * json = // get from networking function, which will always have key "msg".
    Response * responseObj = [[Response alloc] initWithJson:json];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (completion != nil) { completion (responseObj); }
    });
}

// inside caller method
[X doSomething:^(Response * response) {
    NSLog(@"%@", [response msg]);
}];
This code raises an error on accessing kdata[@"msg"], even though I'm sure from debugging that the object was initialised properly with a dictionary containing the key "msg". When I debug the object in the watch window, it shows that kdata's data type keeps changing, from NSArrayM, to NSSet, to NSDictionary, etc., and its contents keep changing as well. I even added a retain ([responseObj retain]) when calling the completion, but it still produces the error.
But if the code in class X is changed into like this:
// inside a networking class X implementation
- (void) doSomething:(completionBlock)completion {
    NSDictionary * json = // get from networking function, which will always have key "msg".
    Response * responseObj = [[Response alloc] initWithJson:json];
    if (completion != nil) { completion (responseObj); } // here is the change, no more switching to main thread
}

// inside caller method - no change here
[X doSomething:^(Response * response) {
    NSLog(@"%@", [response msg]);
}];
The code works perfectly. Why does that happen? This is built in Xcode without ARC.
EDIT: someone mentioned the init. My mistake: what was written above is not exactly my code, and I copied the init method wrong. This is my actual init method:
- (instancetype) initWithData:(NSDictionary *)freshData {
    NSParameterAssert(freshData); // make sure not nil
    self = [super init];
    if (self) {
        kdata = freshData;
    }
    return self;
}
The problem is that the object gets released right when you call the async dispatch.
The object, the way you declared it, is added to the autorelease pool. Control does not wait for the async block to complete; it returns at the end of doSomething and releases its local objects that were added to the autorelease pool. After that, the memory location is reused for other data, and that is the confusing data you see.
I think that by adding the __block specifier in front of your declaration you instruct the compiler to capture this object strongly in the following blocks and release it when the block has finished executing. Give it a try.
// inside a networking class X implementation
- (void) doSomething:(completionBlock)completion {
    NSDictionary * json = // get from networking function, which will always have key "msg".
    __block Response * responseObj = [[Response alloc] initWithJson:json];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (completion != nil) { completion (responseObj); }
    });
}
- (id)initWithJson:(NSDictionary *)data { kdata = data; }
You need to call super's init here and return self.
Start by learning the basics.
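For completeness, here is a minimal sketch (not taken from the answer above) of what a defensive MRC initializer and matching dealloc could look like; note that retaining the dictionary is an extra step beyond what the answer mentions, so the Response owns its data rather than relying on the caller's autorelease pool:
- (id)initWithJson:(NSDictionary *)data {
    self = [super init];        // call super's init
    if (self) {
        kdata = [data retain];  // take ownership under MRC (balanced in dealloc)
    }
    return self;                // and return self
}

- (void)dealloc {
    [kdata release];
    [super dealloc];
}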

FFMPEG: closing RTSP stream cleanly -- av_read_frame crash on avformat_close_input

I'm using KxMovie: https://github.com/kolyvan/kxmovie
It appears that to stop a stream and close the view controller one should use [pause];
However, I'm trying to receive a stream from a version of gstreamer that has a memory leak if a stream isn't closed properly (it's just left hanging).
So, just [pause]ing isn't an option for me.
I'm trying to use [closeFile] in the KxMovie decoder:
- (void)closeFile
{
    [self closeAudioStream];
    [self closeVideoStream];
    [self closeSubtitleStream];

    _videoStreams = nil;
    _audioStreams = nil;
    _subtitleStreams = nil;

    if (_formatCtx) {
        _formatCtx->interrupt_callback.opaque = NULL;
        _formatCtx->interrupt_callback.callback = NULL;
        avformat_close_input(&_formatCtx);
        _formatCtx = NULL;
    }
}
However, I usually get an EXC_BAD_ACCESS from av_read_frame after [closeFile] issues avformat_close_input.
Can anyone give me some advice on how to cleanly shutdown an RTSP stream using ffmpeg?
Thanks!
I was also confused by this, and I do not quite understand your solution.
I fixed it as below; could you give some advice?
_dispatchQueue is the same queue that the asyncDecodeFrames work runs on.
- (void)unSetup {
    _buffered = NO;
    _interrupted = YES;
    dispatch_async(_dispatchQueue, ^{
        if (_decoder) {
            [self pause];
            [self freeBufferedFrames];
            if (_moviePosition == 0 || _decoder.isEOF)
                [gHistory removeObjectForKey:_decoder.path];
            else if (!_decoder.isNetwork)
                [gHistory setValue:[NSNumber numberWithFloat:_moviePosition]
                            forKey:_decoder.path];
            [_decoder closeFile];
        }
    });
}
I needed to use the interrupt callbacks to interrupt av_read_frame:
_formatCtx->interrupt_callback.opaque
_formatCtx->interrupt_callback.callback
Wait for the callback to be called and return non-zero.
After the callback has returned an interrupt value, avformat_close_input can safely be called (after closing any codecs used).
The code snippets below are in Objective-C, and the implementation file (.m) is for the object that handles the RTSP work (RTSPProvider).
It is tested with Xcode Version 10.1 (10B61) and a manually built FFmpeg matching the current FFmpeg version to date (4.2.1 / 15.10.2019).
Should you need the build script configuration and/or the library versions used, just ask.
I had the same issue as the OP but couldn't use his solution.
The full version of the interrupt callback I used was:
int interruptCallBack(void *ctx) {
    RTSPProviderObject *whyFFmpeg = (__bridge RTSPProviderObject *)ctx;
    NSLog(@"What is this!");
    if (whyFFmpeg.whatIsHappeningSTR) {
        return 1;
    } else {
        return 0;
    }
}
The return value of 1 should have interrupted av_read_frame() and exited without a crash, as far as I currently understand it.
It still crashed. My solution was to let av_read_frame() finish reading, then terminate the session context (which will be freed) and not allow any more reading. This was easy since I hit this issue when deallocating my RTSPProviderObject, at which point no reading was being done.
The final usage was:
[self.rtspProvider cleanup];
self.rtspProvider = nil;
Below is the full code snippet:
#import "Don't forget the required ffmpeg headers or header file"
int interruptCallBack(void *ctx){
RTSPProviderObject *whyFFmpeg = (__bridge RTSPProviderObject*)ctx;
NSLog(#"What is this!");
if(whyFFmpeg.whatIsHappeningSTR) {
return 1;
} else {
return 0;
}
}
@interface RTSPProviderObject ()

@property (nonatomic, assign) AVFormatContext *sessionContext;
@property (nonatomic, assign) NSString *whatIsHappeningSTR;
@property (nonatomic, assign) AVDictionary *sessionOptions;
@property (nonatomic, assign) BOOL usesTcp;
@property (nonatomic, assign) BOOL isInputStreamOpen;
@property (nonatomic, strong) NSLock *audioPacketQueueLock;
@property (nonatomic, strong) NSLock *packetQueueLock;
@property (nonatomic, strong, readwrite) NSMutableArray *audioPacketQueue;
@property (nonatomic, assign) int selectedVideoStreamIndex;
@property (nonatomic, assign) int selectedAudioStreamIndex;

@end
@implementation RTSPProviderObject

- (id _Nullable)init
{
    self = [super init];
    if (!self)
    {
        return nil;
    }

    self.sessionContext = NULL;
    self.sessionContext = avformat_alloc_context();
    AVFormatContext *pFormatCtx = self.sessionContext;
    if (!pFormatCtx)
    {
        // Error handling code...
    }

    // MUST be called before avformat_open_input().
    av_dict_free(&_sessionOptions);
    self.sessionOptions = 0;
    if (self.usesTcp)
    {
        // "rtsp_transport" - Set RTSP transport protocols.
        // Allowed are: udp_multicast, tcp, udp, http.
        av_dict_set(&_sessionOptions, "rtsp_transport", "tcp", 0);
    }

    // Open an input stream and read the header with the demuxer options.
    // rtspURL - connection url to your remote ip camera which supports RTSP 2.0.
    if (avformat_open_input(&pFormatCtx, rtspURL.UTF8String, NULL, &_sessionOptions) != 0)
    {
        self.isInputStreamOpen = NO;
        // Error handling code...
    }
    self.isInputStreamOpen = YES;

    // user-supplied AVFormatContext pFormatCtx might have been modified.
    self.sessionContext = pFormatCtx;
    pFormatCtx->interrupt_callback.callback = interruptCallBack;
    pFormatCtx->interrupt_callback.opaque = (__bridge void *)(self);

    // ... Other needed but currently not relevant code for codec/stream and other setup.

    return self;
}
- (BOOL)prepareNextFrame
{
    NSLog(@"%s", __PRETTY_FUNCTION__);

    int isVideoFrameAvailable = 0;

    // The session context is needed to provide frame data. Frame data is provided for video and audio.
    // av_read_frame reads from pFormatCtx.
    AVFormatContext *pFormatCtx = self.sessionContext;
    if (!pFormatCtx) { return NO; }

    // Audio packet access is forbidden.
    [self.packetQueueLock lock];

    BOOL readResult = YES;

    // Calling av_read_frame while it is reading causes a bad_exception.
    // We read frames as long as the session context contains frames to be read and consumed (usually one).
    while (!isVideoFrameAvailable && self.isInputStreamOpen && readResult) {
        if (packet.buf == nil && self.whatIsHappeningSTR) {
            [self.packetQueueLock unlock];
            return NO;
        }

        NSLog(@"New frame will be read.");

        if (self.shouldTerminateStreams) {
            [self terminate];
            [self.packetQueueLock unlock];
            return NO;
        }

        readResult = av_read_frame(pFormatCtx, &packet) >= 0;

        // Video packet data decoding.
        // We need to make sure that the frame video data which is consumed matches the user selected stream.
        if (packet.stream_index == self.selectedVideoStreamIndex) {
            // DEPRECATED:
            // avcodec_decode_video2(self.videoCodecContext, self.rawFrameData, &isVideoFrameAvailable, &packet);
            // Replaced by this new implementation. Read more: https://blogs.gentoo.org/lu_zero/2016/03/29/new-avcodec-api/
            // *

            // We need the video context to decode video data.
            AVCodecContext *videoContext = self.videoCodecContext;
            if (!videoContext && videoContext->codec_type == AVMEDIA_TYPE_VIDEO) { isVideoFrameAvailable = 1; }

            int ret;

            // Supply raw packet data as input to a decoder.
            ret = avcodec_send_packet(videoContext, &packet);
            if (ret < 0)
            {
                NSLog(@"codec: sending video packet failed");
                [self.packetQueueLock unlock];
                return NO;
            }

            // Return decoded output data from a decoder.
            ret = avcodec_receive_frame(videoContext, self.rawFrameData);
            if (isVideoFrameAvailable < 0 && isVideoFrameAvailable != AVERROR(EAGAIN) && isVideoFrameAvailable != AVERROR_EOF)
            {
                [self.packetQueueLock unlock];
                return NO;
            }

            if (ret >= 0) { isVideoFrameAvailable = 1; }
            // *
        } else {
            // avcodec_decode_video2 unreferences all the buffers referenced by self.rawFrameData and resets the frame fields.
            // We must do this manually if we don't use the video frame or we will leak the frame data.
            av_frame_unref(self.rawFrameData);
            isVideoFrameAvailable = 1;
        }

        // Audio packet data consumption.
        // We need to make sure that the frame audio data which will be consumed matches the user selected stream.
        if (packet.stream_index == self.selectedAudioStreamIndex) {
            [self.audioPacketQueueLock lock];
            [self.audioPacketQueue addObject:[NSMutableData dataWithBytes:&packet length:sizeof(packet)]];
            [self.audioPacketQueueLock unlock];
        }
    }

    [self.packetQueueLock unlock];
    return isVideoFrameAvailable != 0;
}
- (void)cleanup
{
    NSLog(@"%s", __PRETTY_FUNCTION__);

    self.shouldTerminateStreams = YES;
    self.whatIsHappeningSTR = @"";
}

- (void)terminate
{
    avformat_close_input(&_sessionContext);
}

@end
Hope this helps anyone. Thank you for reading and contributing.

iOS singleton class crashes my app

I have a problem with a singleton pattern.
I have read the following tutorials about singleton classes and have created my own.
http://www.galloway.me.uk/utorials/singleton-classes/
http://www.johnwordsworth.com/2010/04/iphone-code-snippet-the-singleton-pattern/
The first time I build & run the app it works like it should. No problems at all!
But when I rebuild the app, the singleton class does not work properly anymore. The first init works like it should, but when I call it again after a button click it crashes my app.
My singleton class:
BPManager.h
@interface BPManager : NSObject {
    NSString *dbPath;
}

@property (nonatomic, retain) NSString *dbPath;

+ (id)bpManager;
- (void)initDatabase:(NSString *)dbName;
- (int)getQuestions;

@end
BPManager.m
static BPManager *sharedMyManager = nil;

@implementation BPManager

@synthesize dbPath;

- (void)initDatabase:(NSString *)dbName
{
    dbPath = dbName;
}

- (int)getQuestions
{
    NSLog(@"getQuestions");
}

- (id)init {
    if ((self = [super init])) {
    }
    return self;
}

+ (BPManager *) bpManager {
    @synchronized(self) {
        if (sharedMyManager != nil) return sharedMyManager;
        static dispatch_once_t pred; // Lock
        dispatch_once(&pred, ^{ // This code is called at most once per app
            sharedMyManager = [[BPManager alloc] init];
        });
    }
    return sharedMyManager;
}

- (void)dealloc {
    [dbPath release];
    [super dealloc];
}
When I call the following code while building my interface, the app creates the singleton:
BPManager *manager = [BPManager bpManager];
[manager initDatabase:@"database.db"];
Note: at this point I can create references to the class from other files as well. But when I click on a button it seems to lose its references.
When a button is clicked, the following code is executed:
BPManager *manager = [BPManager bpManager];
int count = [manager getQuestions];
The app should get the shared instance. That works, only the parameters (like dbPath) are not accessible. Why is that?
Edit:
After some research, I have changed the method to:
+ (BPManager *) bpManager {
    @synchronized(self) {
        if (sharedMyManager != nil) return sharedMyManager;
        static dispatch_once_t pred; // Lock
        dispatch_once(&pred, ^{ // This code is called at most once per app
            sharedMyManager = [[BPManager alloc] init];
        });
    }
    return sharedMyManager;
}
But the problem is not solved
How about
@interface BPManager : NSObject

@property (nonatomic, copy) NSString *dbName;
@property (nonatomic, assign) int questions;

- (id)initWithDBName:(NSString *)dbName;

@end

#import "BPManager.h"

@implementation BPManager

@synthesize dbName = _dbName, questions;

+ (BPManager *)singleton {
    static dispatch_once_t pred;
    static BPManager *shared = nil;
    dispatch_once(&pred, ^{
        shared = [[BPManager alloc] initWithDBName:@"database.db"];
    });
    return shared;
}

- (id)initWithDBName:(NSString *)dbName {
    self = [super init];
    if (self) self.dbName = dbName;
    return self;
}

- (void)dealloc {
    [_dbName release];
    [super dealloc];
}

@end

BPManager *manager = [BPManager singleton];
int count = [manager questions];
The static is private to the implementation file, and there is no reason it should even be accessible outside the singleton method. Your init overrides the default implementation with the default implementation, so it's useless. In Objective-C you name the getter after the variable (questions), not getQuestions. Initializing a class twice causes undefined behaviour. There is no need to @synchronized or check for nil when you are already using dispatch_once; see Care and Feeding of Singletons. NSString properties should always use copy instead of retain. You don't need the dealloc because the singleton stays alive for as long as your app is running, but it's there in case you want to use this class as a non-singleton. And you would probably be just as well served by making this an ivar of your app delegate instead of a singleton, but you can have it both ways.
I'm not sure whether it's the (complete) answer, but one major flaw is that you're using self and super in a class method, +(id)bpManager; I'm actually surprised it let you compile that at all. Change the @synchronized(self) to @synchronized(sharedMyManager), and the [[super alloc...] init] to [[BPManager alloc...] init]. And writing that just made me realize that the problem looks like accessing a subclassed method on an object instantiated as the superclass, but that should have been overwritten in the dispatch. Shouldn't you really only need one of those anyway; why double-init like that? (And while we're there, that's a memory leak: init'd in the if() and then overwritten in the closure without releasing it.)
Jano's solution should work well. I use this approach too to create singleton objects and I haven't had any problems.
For your code, I think that if you use @synchronized (which isn't necessary because you have dispatch_once, as Jano said), you should not return from inside the @synchronized block.
+ (BPManager *) bpManager {
    @synchronized(self) {
        if (sharedMyManager == nil) {
            static dispatch_once_t pred; // Lock
            dispatch_once(&pred, ^{ // This code is called at most once per app
                sharedMyManager = [[BPManager alloc] init];
            });
        }
    }
    return sharedMyManager;
}
