Objective-C method not working properly inside Objective-C++ - iOS

I'm calling an Objective-C method from an Objective-C++ class. The method creates a label and hides the status bar. The status bar hides fine, but the label is never created or shown. If I call the same method anywhere other than this Objective-C++ class, it works without any issue.
The Objective-C++ code is below:
void audioRouteChangeListenerCallback (void *inUserData, AudioSessionPropertyID inPropertyID, UInt32 inPropertyValueSize, const void *inPropertyValue)
{
    // Ensure that this callback was invoked for a route change.
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;

    // Determine the reason for the route change, to make sure it is not
    // because of a category change.
    CFDictionaryRef routeChangeDictionary = (CFDictionaryRef)inPropertyValue;
    CFNumberRef routeChangeReasonRef = (CFNumberRef)CFDictionaryGetValue(routeChangeDictionary, CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue(routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);

    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        // Handle headset unplugged
        NSLog(@"PluggedOut");
    }
    else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        // Handle headset plugged in
        NSLog(@"Something Plugged In");
        audiotest *test = [[audiotest alloc] init];
        ArmorController *armorcontroller = [[ArmorController alloc] init];
        NSLog(@"%d", [test Checkheadphonestatus]);
        if ([test Checkheadphonestatus] == 1) {
            [armorcontroller deviceconnectedalert];
        }
        else if ([test Checkheadphonestatus] == 0) {
            NSLog(@"Device not connected");
        }
    }
}
The Objective-C method that does not execute properly when called from the Objective-C++ class is below:
- (void)deviceconnectedalert
{
    [[UIApplication sharedApplication] setStatusBarHidden:YES withAnimation:UIStatusBarAnimationFade];
    self.titleLabel = [[UILabel alloc] initWithFrame:CGRectMake(20, 20, 320, 30)];
    [self.titleLabel setText:@"Device Connected Successfully!"];
    self.titleLabel.backgroundColor = [UIColor redColor];
    [self.view addSubview:self.titleLabel];
}
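One thing worth checking, though it may not be the whole story: the route-change callback is a plain C function that is not guaranteed to run on the main thread, and UIKit work such as adding a label must happen on the main thread. Also note that the ArmorController allocated inside the callback is a brand-new instance, so a label added to its view will only be visible if that controller's view is actually on screen. A minimal sketch (assuming the same test and armorcontroller objects created in the callback above) of hopping onto the main queue before touching the UI:
// Sketch only: dispatch the UI work to the main queue from inside the callback.
dispatch_async(dispatch_get_main_queue(), ^{
    if ([test Checkheadphonestatus] == 1) {
        [armorcontroller deviceconnectedalert];
    } else {
        NSLog(@"Device not connected");
    }
});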

Related

MPMoviePlayerController initialPlaybackTime property not working in iOS 8.4

After setting the initialPlaybackTime property, the video (HTTP streaming) still plays from the beginning.
The same code works well on iOS <= 8.3:
self.moviePlayer.initialPlaybackTime = self.lastPlaybackTime;
[self.moviePlayer play];
This works for me: basically, you need to call setCurrentPlaybackTime when the movie starts playing, but you also need a playbackDurationSet flag that is set to NO when you present the movie player and set to YES the first time the movie is seeked to the saved playback time.
NOTE: this flag is required because moviePlayerPlaybackStateChanged is also fired with a playbackState of MPMoviePlaybackStatePlaying when you seek the movie with the seek scrubber.
BOOL playbackDurationSet = NO;

- (void)moviePlayerPlaybackStateChanged:(NSNotification *)notification
{
    MPMoviePlayerController *player = (MPMoviePlayerController *)notification.object;
    switch (player.playbackState) {
        case MPMoviePlaybackStatePlaying:
            if (!playbackDurationSet) {
                [self.moviePlayer setCurrentPlaybackTime:yourStartTime];
                playbackDurationSet = YES;
            }
            break;
        default:
            break;
    }
}

- (void)moviePlayerPresented
{
    playbackDurationSet = NO;
}
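For the playback-state callback above to fire at all, the object also has to be registered as an observer of the playback-state notification; the registration is not shown in the answer, but it would look something like this:
// Assumed registration for the playback-state notification (not shown in the original answer).
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(moviePlayerPlaybackStateChanged:)
                                             name:MPMoviePlayerPlaybackStateDidChangeNotification
                                           object:self.moviePlayer];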
I've seen some of these issues as well. Since MPMoviePlayerController is deprecated in iOS 9, they're unlikely to be fixed.
hariszaman's answer worked for me:
- (void)moviePlaybackStateChanged:(NSNotification *)notif
{
    //...
    if (self.videoPlayer.playbackState == MPMoviePlaybackStatePlaying) {
        if (self.currentBookmark) {
            if ([[PSDeviceInfo sharedInstance] is_iOSatLeast84]) {
                NSTimeInterval toTime = [self.currentBookmark.seconds doubleValue];
                [self.videoPlayer setCurrentPlaybackTime:toTime];
                self.currentBookmark = nil;
            }
        }
    }
    if (self.videoPlayer.playbackState == MPMoviePlaybackStatePaused) {
        //...
    }
    if (self.videoPlayer.playbackState == MPMoviePlaybackStateStopped) {
        //...
    }
}
also:
- (void)configureMoviePlayer
{
    if (self.currentBookmark) {
        if ([[PSDeviceInfo sharedInstance] is_iOSatLeast84]) {
            // will set start after did start
        } else {
            NSTimeInterval toTime = [self.currentBookmark.seconds doubleValue];
            self.videoPlayer.initialPlaybackTime = toTime;
            self.currentBookmark = nil;
        }
    }
    else {
        self.videoPlayer.initialPlaybackTime = 0;
    }
    //...
}
the method is_iOSatLeast84:
- (BOOL)is_iOSatLeast84
{
    // version 8.4 (compared against 8.35 to allow for floating-point imprecision)
    NSString *version = [[UIDevice currentDevice] systemVersion];
    BOOL isAtLeast84 = [version floatValue] >= 8.35;
    return isAtLeast84;
}
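A sturdier way to do the same check, if the floatValue comparison ever becomes a problem, is a numeric string comparison (a sketch, not the code from the original answer):
// Sketch of an alternative version check using NSNumericSearch (assumption, not the original code).
- (BOOL)is_iOSatLeast84
{
    NSString *version = [[UIDevice currentDevice] systemVersion];
    return [version compare:@"8.4" options:NSNumericSearch] != NSOrderedAscending;
}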
Although all of this is a workaround, it gets the job done.

AVSpeechSynthesizer reads text just one time and then doesn't work anymore

I have a problem with AVSpeechSynthesizer: it reads the text just once and then doesn't work anymore.
Currently I'm using an AVSpeechSynthesizerFacade class to do it.
I have a map with some pins. If I tap on a pin, I see a details view with the text. I create the synthesizer in a singleton class in my project, so it is always the same instance.
+ (WikipediaConnectionManager *)sharedmanager {
    static WikipediaConnectionManager *sharedManager = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedManager = [WikipediaConnectionManager new];
        NSLocale *locale = [NSLocale currentLocale];
        NSString *language = [locale localeIdentifier];
        sharedManager.language = [language substringToIndex:2];
        sharedManager.synthetizer = [[AVSpeechSynthesizerFacade alloc] init];
    });
    return sharedManager;
}
I call the speakText method when I tap the play button to start speaking.
- (IBAction)didTapPlayButton:(id)sender {
    WikipediaConnectionManager *sharedManager = [WikipediaConnectionManager sharedmanager];
    if (self.playCount == 0) {
        [sharedManager.synthetizer speakText:self.textView.text];
        self.playCount = self.playCount + 1;
        [self.playButton setImage:[UIImage imageNamed:@"pausa.png"] forState:UIControlStateNormal];
        self.play = YES;
    }
    else {
        if (self.play == NO) {
            BOOL continueSpeaking = [sharedManager.synthetizer continueSpeak];
            if (continueSpeaking == NO) {
                [sharedManager.synthetizer speakText:self.textView.text];
            }
            self.play = YES;
            [self.playButton setImage:[UIImage imageNamed:@"pausa.png"] forState:UIControlStateNormal];
        }
        else {
            [sharedManager.synthetizer pause];
            self.play = NO;
            [self.playButton setImage:[UIImage imageNamed:@"play.png"] forState:UIControlStateNormal];
        }
    }
}
When I go back to the map and then return to the details view, for another pin or the same one, the synthesizer doesn't read the text.
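For reference, the facade itself is not shown; a minimal sketch of what it is assumed to look like, based only on the calls used above (speakText:, pause, continueSpeak), is:
// Assumed shape of AVSpeechSynthesizerFacade (the real class is not shown in the question).
#import <AVFoundation/AVFoundation.h>

@interface AVSpeechSynthesizerFacade : NSObject
@property (nonatomic, strong) AVSpeechSynthesizer *synthesizer;
- (void)speakText:(NSString *)text;
- (void)pause;
- (BOOL)continueSpeak;
@end

@implementation AVSpeechSynthesizerFacade
- (instancetype)init
{
    if ((self = [super init])) {
        _synthesizer = [[AVSpeechSynthesizer alloc] init];
    }
    return self;
}
- (void)speakText:(NSString *)text
{
    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:text];
    [self.synthesizer speakUtterance:utterance];
}
- (void)pause
{
    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
}
- (BOOL)continueSpeak
{
    // Returns YES if paused speech was resumed.
    return [self.synthesizer continueSpeaking];
}
@end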

How to detect WiFi or 3G signal strength in iOS?

I am working on a video player using the MediaPlayer framework in iOS, and I need to play two videos in my app: the first video plays when the user has a strong network signal, and the second plays when the network is weak. I need to play the video over WiFi or 3G, etc.
First of all, how can I detect the WiFi and 3G speed on my iPhone? I need to get the speed in Mbps. I tried to write some code, but it is not working for me. Thanks in advance.
#import "Reachability.h"
#interface NetworkViewController ()
#end
#implementation NetworkViewController
- (void)viewDidLoad
{
[super viewDidLoad];
self.view.backgroundColor = [UIColor whiteColor];
[self getDataCounters];
networkLabel = [[UILabel alloc]initWithFrame:CGRectMake(20, 100, 300, 40)];
networkLabel.textColor = [UIColor blackColor];
networkLabel.backgroundColor = [UIColor whiteColor];
networkLabel.userInteractionEnabled = NO;
networkLabel.text = myNewString;
[self.view addSubview:networkLabel];
}
- (NSArray *)getDataCounters
{
BOOL success;
struct ifaddrs *addrs;
const struct ifaddrs *cursor;
const struct if_data *networkStatisc;
int WiFiSent = 0;
int WiFiReceived = 0;
int WWANSent = 0;
int WWANReceived = 0;
NSString *name=[[NSString alloc]init];
success = getifaddrs(&addrs) == 0;
if (success)
{
cursor = addrs;
while (cursor != NULL)
{
name=[NSString stringWithFormat:#"%s",cursor->ifa_name];
NSLog(#"ifa_name %s == %#\n", cursor->ifa_name,name);
// names of interfaces: en0 is WiFi ,pdp_ip0 is WWAN
if (cursor->ifa_addr->sa_family == AF_LINK)
{
if ([name hasPrefix:#"en"])
{
networkStatisc = (const struct if_data *) cursor->ifa_data;
WiFiSent+=networkStatisc->ifi_obytes;
WiFiReceived+=networkStatisc->ifi_ibytes;
NSLog(#"WiFiSent %d ==%d",WiFiSent,networkStatisc->ifi_obytes);
NSLog(#"WiFiReceived %d ==%d",WiFiReceived,networkStatisc->ifi_ibytes);
NSLog(#"wifi data is %.2f",(float)WiFiReceived/1048576);
myNewString = [NSString stringWithFormat:#"%2f", (float)WiFiReceived/1048576];
networkLabel.text = myNewString;
}
if ([name hasPrefix:#"pdp_ip"])
{
networkStatisc = (const struct if_data *) cursor->ifa_data;
WWANSent+=networkStatisc->ifi_obytes;
WWANReceived+=networkStatisc->ifi_ibytes;
NSLog(#"WWANSent %d ==%d",WWANSent,networkStatisc->ifi_obytes);
NSLog(#"WWANReceived %d ==%d",WWANReceived,networkStatisc->ifi_ibytes);
}
}
cursor = cursor->ifa_next;
}
freeifaddrs(addrs);
}
return [NSArray arrayWithObjects:[NSNumber numberWithInt:WiFiSent], [NSNumber numberWithInt:WiFiReceived],[NSNumber numberWithInt:WWANSent],[NSNumber numberWithInt:WWANReceived], nil];
}
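Note that these counters are cumulative byte totals, not a speed. A rough, untested sketch of turning two samples into an approximate Mbit/s figure is below; it only measures traffic that happens to be flowing at the time, so it is not a real bandwidth test.
// Sketch: sample the cumulative WiFi byte counter twice and divide the delta by the elapsed time.
- (void)estimateWiFiThroughput
{
    long long startBytes = [[[self getDataCounters] objectAtIndex:1] longLongValue]; // WiFi received so far
    NSDate *startTime = [NSDate date];

    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(2.0 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        long long endBytes = [[[self getDataCounters] objectAtIndex:1] longLongValue];
        NSTimeInterval elapsed = -[startTime timeIntervalSinceNow];
        double mbps = ((double)(endBytes - startBytes) * 8.0) / (elapsed * 1000000.0);
        NSLog(@"Approximate WiFi throughput: %.2f Mbit/s", mbps);
    });
}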
You can play a high-resolution video on WiFi and a low-resolution video on cellular data.
Reachability *reachability = [Reachability reachabilityForInternetConnection];
[reachability startNotifier];
NetworkStatus status = [reachability currentReachabilityStatus];
if (status == NotReachable)
{
    // No internet
}
else if (status == ReachableViaWiFi)
{
    // WiFi: play the high-resolution video
}
else if (status == ReachableViaWWAN)
{
    // 3G: play the low-resolution video
}
Apple may reject your app if you stream high-resolution video over a 3G network.
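Putting that together, a minimal sketch (the URLs are placeholders, not from the question) might look like this:
// Sketch only; the two URLs are placeholders for the high- and low-resolution streams.
NSURL *highQualityURL = [NSURL URLWithString:@"https://example.com/video_hd.m3u8"];
NSURL *lowQualityURL  = [NSURL URLWithString:@"https://example.com/video_sd.m3u8"];

Reachability *reachability = [Reachability reachabilityForInternetConnection];
NetworkStatus status = [reachability currentReachabilityStatus];

if (status != NotReachable) {
    NSURL *urlToPlay = (status == ReachableViaWiFi) ? highQualityURL : lowQualityURL;
    MPMoviePlayerViewController *playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:urlToPlay];
    [self presentMoviePlayerViewControllerAnimated:playerVC];
}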

How to find a memory leak in iOS with Xcode?

This is my RTSP streaming iOS application with an FFmpeg decoder, and it streams fine, but the memory usage keeps increasing while it runs. Is it a memory leak, and how can I track it down?
Here is my video streaming class, RTSPPlayer.m:
#import "RTSPPlayer.h"
#import "Utilities.h"
#import "AudioStreamer.h"
@interface RTSPPlayer ()
@property (nonatomic, retain) AudioStreamer *audioController;
@end
@interface RTSPPlayer (private)
-(void)convertFrameToRGB;
-(UIImage *)imageFromAVPicture:(AVPicture)pict width:(int)width height:(int)height;
-(void)setupScaler;
@end
@implementation RTSPPlayer
@synthesize audioController = _audioController;
@synthesize audioPacketQueue,audioPacketQueueSize;
@synthesize _audioStream,_audioCodecContext;
@synthesize emptyAudioBuffer;
@synthesize outputWidth, outputHeight;
- (void)setOutputWidth:(int)newValue
{
if (outputWidth != newValue) {
outputWidth = newValue;
[self setupScaler];
}
}
- (void)setOutputHeight:(int)newValue
{
if (outputHeight != newValue) {
outputHeight = newValue;
[self setupScaler];
}
}
- (UIImage *)currentImage
{
if (!pFrame->data[0]) return nil;
[self convertFrameToRGB];
return [self imageFromAVPicture:picture width:outputWidth height:outputHeight];
}
- (double)duration
{
return (double)pFormatCtx->duration / AV_TIME_BASE;
}
- (double)currentTime
{
AVRational timeBase = pFormatCtx->streams[videoStream]->time_base;
return packet.pts * (double)timeBase.num / timeBase.den;
}
- (int)sourceWidth
{
return pCodecCtx->width;
}
- (int)sourceHeight
{
return pCodecCtx->height;
}
- (id)initWithVideo:(NSString *)moviePath usesTcp:(BOOL)usesTcp
{
if (!(self=[super init])) return nil;
AVCodec *pCodec;
// Register all formats and codecs
avcodec_register_all();
av_register_all();
avformat_network_init();
// Set the RTSP Options
AVDictionary *opts = 0;
if (usesTcp)
av_dict_set(&opts, "rtsp_transport", "tcp", 0);
if (avformat_open_input(&pFormatCtx, [moviePath UTF8String], NULL, &opts) !=0 ) {
av_log(NULL, AV_LOG_ERROR, "Couldn't open file\n");
goto initError;
}
// Retrieve stream information
if (avformat_find_stream_info(pFormatCtx,NULL) < 0) {
av_log(NULL, AV_LOG_ERROR, "Couldn't find stream information\n");
goto initError;
}
// Find the first video stream
videoStream=-1;
audioStream=-1;
for (int i=0; i<pFormatCtx->nb_streams; i++) {
if (pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
NSLog(#"found video stream");
videoStream=i;
}
if (pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_AUDIO) {
audioStream=i;
NSLog(#"found audio stream");
}
}
if (videoStream==-1 && audioStream==-1) {
goto initError;
}
// Get a pointer to the codec context for the video stream
pCodecCtx = pFormatCtx->streams[videoStream]->codec;
// Find the decoder for the video stream
pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
if (pCodec == NULL) {
av_log(NULL, AV_LOG_ERROR, "Unsupported codec!\n");
goto initError;
}
// Open codec
if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open video decoder\n");
goto initError;
}
if (audioStream > -1 ) {
NSLog(#"set up audiodecoder");
[self setupAudioDecoder];
}
// Allocate video frame
pFrame = avcodec_alloc_frame();
outputWidth = pCodecCtx->width;
self.outputHeight = pCodecCtx->height;
return self;
initError:
// [self release];
return nil;
}
- (void)setupScaler
{
// Release old picture and scaler
avpicture_free(&picture);
sws_freeContext(img_convert_ctx);
// Allocate RGB picture
avpicture_alloc(&picture, PIX_FMT_RGB24, outputWidth, outputHeight);
// Setup scaler
static int sws_flags = SWS_FAST_BILINEAR;
img_convert_ctx = sws_getContext(pCodecCtx->width,
pCodecCtx->height,
pCodecCtx->pix_fmt,
outputWidth,
outputHeight,
PIX_FMT_RGB24,
sws_flags, NULL, NULL, NULL);
}
- (void)seekTime:(double)seconds
{
AVRational timeBase = pFormatCtx->streams[videoStream]->time_base;
int64_t targetFrame = (int64_t)((double)timeBase.den / timeBase.num * seconds);
avformat_seek_file(pFormatCtx, videoStream, targetFrame, targetFrame, targetFrame, AVSEEK_FLAG_FRAME);
avcodec_flush_buffers(pCodecCtx);
}
- (void)dealloc
{
// Free scaler
sws_freeContext(img_convert_ctx);
// Free RGB picture
avpicture_free(&picture);
// Free the packet that was allocated by av_read_frame
av_free_packet(&packet);
// Free the YUV frame
av_free(pFrame);
// Close the codec
if (pCodecCtx) avcodec_close(pCodecCtx);
// Close the video file
if (pFormatCtx) avformat_close_input(&pFormatCtx);
[_audioController _stopAudio];
// [_audioController release];
_audioController = nil;
// [audioPacketQueue release];
audioPacketQueue = nil;
// [audioPacketQueueLock release];
audioPacketQueueLock = nil;
// [super dealloc];
}
- (BOOL)stepFrame
{
// AVPacket packet;
int frameFinished=0;
while (!frameFinished && av_read_frame(pFormatCtx, &packet) >=0 ) {
// Is this a packet from the video stream?
if(packet.stream_index==videoStream) {
// Decode video frame
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
}
if (packet.stream_index==audioStream) {
// NSLog(#"audio stream");
[audioPacketQueueLock lock];
audioPacketQueueSize += packet.size;
[audioPacketQueue addObject:[NSMutableData dataWithBytes:&packet length:sizeof(packet)]];
[audioPacketQueueLock unlock];
if (!primed) {
primed=YES;
[_audioController _startAudio];
}
if (emptyAudioBuffer) {
[_audioController enqueueBuffer:emptyAudioBuffer];
}
}
}
return frameFinished!=0;
}
- (void)convertFrameToRGB
{
sws_scale(img_convert_ctx,
pFrame->data,
pFrame->linesize,
0,
pCodecCtx->height,
picture.data,
picture.linesize);
}
- (UIImage *)imageFromAVPicture:(AVPicture)pict width:(int)width height:(int)height
{
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
CFDataRef data = CFDataCreateWithBytesNoCopy(kCFAllocatorDefault, pict.data[0], pict.linesize[0]*height,kCFAllocatorNull);
CGDataProviderRef provider = CGDataProviderCreateWithCFData(data);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef cgImage = CGImageCreate(width,
height,
8,
24,
pict.linesize[0],
colorSpace,
bitmapInfo,
provider,
NULL,
NO,
kCGRenderingIntentDefault);
CGColorSpaceRelease(colorSpace);
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGDataProviderRelease(provider);
CFRelease(data);
return image;
}
- (void)setupAudioDecoder
{
if (audioStream >= 0) {
_audioBufferSize = AVCODEC_MAX_AUDIO_FRAME_SIZE;
_audioBuffer = av_malloc(_audioBufferSize);
_inBuffer = NO;
_audioCodecContext = pFormatCtx->streams[audioStream]->codec;
_audioStream = pFormatCtx->streams[audioStream];
AVCodec *codec = avcodec_find_decoder(_audioCodecContext->codec_id);
if (codec == NULL) {
NSLog(#"Not found audio codec.");
return;
}
if (avcodec_open2(_audioCodecContext, codec, NULL) < 0) {
NSLog(#"Could not open audio codec.");
return;
}
if (audioPacketQueue) {
// [audioPacketQueue release];
audioPacketQueue = nil;
}
audioPacketQueue = [[NSMutableArray alloc] init];
if (audioPacketQueueLock) {
// [audioPacketQueueLock release];
audioPacketQueueLock = nil;
}
audioPacketQueueLock = [[NSLock alloc] init];
if (_audioController) {
[_audioController _stopAudio];
// [_audioController release];
_audioController = nil;
}
_audioController = [[AudioStreamer alloc] initWithStreamer:self];
} else {
pFormatCtx->streams[audioStream]->discard = AVDISCARD_ALL;
audioStream = -1;
}
}
- (void)nextPacket
{
_inBuffer = NO;
}
- (AVPacket*)readPacket
{
if (_currentPacket.size > 0 || _inBuffer) return &_currentPacket;
NSMutableData *packetData = [audioPacketQueue objectAtIndex:0];
_packet = [packetData mutableBytes];
if (_packet) {
if (_packet->dts != AV_NOPTS_VALUE) {
_packet->dts += av_rescale_q(0, AV_TIME_BASE_Q, _audioStream->time_base);
}
if (_packet->pts != AV_NOPTS_VALUE) {
_packet->pts += av_rescale_q(0, AV_TIME_BASE_Q, _audioStream->time_base);
}
[audioPacketQueueLock lock];
audioPacketQueueSize -= _packet->size;
if ([audioPacketQueue count] > 0) {
[audioPacketQueue removeObjectAtIndex:0];
}
[audioPacketQueueLock unlock];
_currentPacket = *(_packet);
}
return &_currentPacket;
}
- (void)closeAudio
{
[_audioController _stopAudio];
primed=NO;
}
@end
Presented as an answer for formatting and images.
Use Instruments to check both for leaks and for memory loss due to retained but not leaked memory. The latter is unused memory that is still pointed to. Use Mark Generation (Heapshot) in the Allocations instrument in Instruments.
For how to use Heapshot to find memory creep, see bbum's blog.
Basically, the method is to run the Allocations tool in Instruments, take a heapshot, run an iteration of your code, and take another heapshot, repeating three or four times. This will indicate memory that is allocated but not released during the iterations.
To interpret the results, disclose each generation to see the individual allocations.
If you need to see where retains, releases and autoreleases occur for an object, use Instruments:
Run the app in Instruments and, in Allocations, turn on "Record reference counts" (for Xcode 5 and lower you have to stop recording to set the option). Exercise the app, stop recording, drill down, and you will be able to see where all the retains, releases and autoreleases occurred.
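To make those iterations easy to trigger between heapshots, one option (a sketch only; streamURL is a placeholder for your RTSP address) is a debug-only button action that runs one complete create/use/release cycle of the suspect object:
// Debug-only: one full create/use/release cycle of RTSPPlayer, so a heapshot
// can be taken before and after each tap. streamURL is a placeholder.
- (IBAction)runLeakIteration:(id)sender
{
    @autoreleasepool {
        RTSPPlayer *player = [[RTSPPlayer alloc] initWithVideo:streamURL usesTcp:YES];
        [player stepFrame];                          // exercise the decode path once
        NSLog(@"decoded frame: %@", [player currentImage]);
        player = nil;                                // under ARC this drops the last reference
    }
}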

Matchmaking works on WiFi but doesn't work on 3G

For some reason, Game Center matchmaking works well when both devices are on the same WiFi network, but it does not work when one of the devices is on 3G (the devices keep searching and cannot find each other).
I am using:
1. an iPad 2 running iOS 7.0.4 with a sandbox account (matching the device's App Store account)
2. an iPhone 4S running iOS 7.0.4 with a sandbox account (matching the device's App Store account but different from the iPad's account).
The code that creates the match looks like this:
- (IBAction)continueButtonPressed:(id)sender {
GKMatchRequest *request = [[GKMatchRequest alloc] init];
request.minPlayers = 2;
request.maxPlayers = 4;
request.defaultNumberOfPlayers = 2;
if(([[self.gameTypeSegmentControl objectAtIndex:0] selectedSegmentIndex] == 0) ||
([[self.gameTypeSegmentControl objectAtIndex:1] selectedSegmentIndex] == 0))
{
int temp = [self.allLanguages indexOfObject:[[self.languages objectAtIndex:selectedRow] primaryLanguage]];
if ((temp > 0) && (temp <= self.allLanguages.count))
{
request.playerGroup = temp;
}else
{
request.playerGroup = ENGLISH_US_LANG;//50
}
}
if(([[self.gameTypeSegmentControl objectAtIndex:0] selectedSegmentIndex] == 1) ||
([[self.gameTypeSegmentControl objectAtIndex:1] selectedSegmentIndex] == 1))
{
request.playerGroup = 255;
}
if (isJoining)// Not the creator of the game
{
request.playerAttributes = JOIN_ATTRIBUTE;
[[GKMatchmaker sharedMatchmaker] findMatchForRequest:request withCompletionHandler:^(GKMatch *match, NSError *error) {
if (error)
{
NSLog(#"findMatchForRequest ended with error");
// Process the error.
}
else if (match != nil)
{
self.myMatch = match; // Use a retaining property to retain the match.
match.delegate = self;
if (!self.matchStarted && match.expectedPlayerCount == 0)
{
self.matchStarted = YES;
NSLog(#"match begin");
// Insert game-specific code to begin the match.
}
}
}];
}else //The creator of the game
{
request.playerAttributes = CREATE_ATTRIBUTE;
GKMatchmakerViewController *mmvc = [[GKMatchmakerViewController alloc] initWithMatchRequest:request];
mmvc.matchmakerDelegate = self;
[self presentViewController:mmvc animated:YES completion:nil];
}
}
Any idea what is causing the problem?
