How to view a txt/pdf file from Google Drive in a UIWebView - iOS

I am working on an app in which I need to access files stored on Google Drive, which works well.
Now I am trying to display the selected file on the device, using a UIWebView for that purpose. I have read Google's documentation, but I could not find anything about viewing the file (txt/pdf/spreadsheet, etc.).
Here is my code, which fires when the user taps a cell of the UITableView:
GTLDriveFile *file = [loadedFiles objectAtIndex:indexPath.row];
GTMHTTPFetcher *fetcher = [applicationDelegate.serviceDriveGtl.fetcherService fetcherWithURLString:file.downloadUrl];
[fetcher beginFetchWithCompletionHandler:^(NSData *data, NSError *error) {
    if (error != nil) {
        NSLog(@"ERROR ON LOADING FILE: %@", [error localizedDescription]);
    } else {
        NSString *link = [[NSString alloc] initWithData:data
                                               encoding:NSUTF32StringEncoding];
        DisplayViewController *displayView = [[DisplayViewController alloc] init];
        [displayView showDocumentOnLink:link];
        [self.navigationController pushViewController:displayView animated:YES];
    }
}];
Now, here is how I am populating the UIWebView:
I collect the "link" variable as "getLink" in "DisplayViewController".
displayWebView = [[UIWebView alloc] init];
[displayWebView setFrame:CGRectMake(posX, posY+230, width, 200)];
[displayWebView setDelegate:self];
[self.view addSubview:displayWebView];
NSURLRequest * request = [NSURLRequest requestWithURL:[NSURL URLWithString:getLink]];
[displayWebView loadRequest:request];
What am I missing here? Why does the WebView not display the file contents?
Any help is truly appreciated.
Thanks
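For reference, a minimal sketch assuming the fetched NSData holds the file bytes and the MIME type is known (e.g. a PDF): UIWebView can render downloaded data directly with loadData:MIMEType:textEncodingName:baseURL: instead of treating the bytes as a URL string.
// Hypothetical: 'data' is the NSData returned by the fetcher; adjust the MIME type per file.
[displayWebView loadData:data
                MIMEType:@"application/pdf"
        textEncodingName:@"utf-8"
                 baseURL:[NSURL URLWithString:@"about:blank"]];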

Related

Proper way to use SFSpeechRecognizer?

I'm trying to use SFSpeechRecognizer, but I don't have a way to test whether I'm implementing it correctly, and since it's a relatively new class I couldn't find sample code (I don't know Swift). Am I making any unforgivable mistakes/missing something?
[SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus status) {
    if (status == SFSpeechRecognizerAuthorizationStatusAuthorized) {
        SFSpeechRecognizer *recognizer = [[SFSpeechRecognizer alloc] init];
        recognizer.delegate = self;
        SFSpeechAudioBufferRecognitionRequest *request = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
        request.contextualStrings = @[@"data", @"bank", @"databank"];
        SFSpeechRecognitionTask *task = [recognizer recognitionTaskWithRequest:request resultHandler:^(SFSpeechRecognitionResult *result, NSError *error) {
            SFTranscription *transcript = result.bestTranscription;
            NSLog(@"%@", transcript);
        }];
    }
}];
I'm trying too, but this code works for me. After all, SFSpeechRecognizer and SFSpeechAudioBufferRecognitionRequest are not the same, so I think (haven't tested) you have to ask for different permissions (have you asked for permission before, to use the microphone and for speech recognition?). OK, here's the code, followed by a small permission sketch:
// Available on iOS 10+, limited to about 1 minute, needs an internet connection;
// can be sourced from a recorded audio file or from the microphone.
NSLocale *local = [[NSLocale alloc] initWithLocaleIdentifier:@"es-MX"];
speechRecognizer = [[SFSpeechRecognizer alloc] initWithLocale:local];
NSString *soundFilePath = [myDir stringByAppendingPathComponent:@"/sound.m4a"];
NSURL *url = [[NSURL alloc] initFileURLWithPath:soundFilePath];
if (!speechRecognizer.isAvailable)
    NSLog(@"speechRecognizer is not available, maybe it has no internet connection");
SFSpeechURLRecognitionRequest *urlRequest = [[SFSpeechURLRecognitionRequest alloc] initWithURL:url];
urlRequest.shouldReportPartialResults = YES; // YES to animate the writing
[speechRecognizer recognitionTaskWithRequest:urlRequest resultHandler:^(SFSpeechRecognitionResult * _Nullable result, NSError * _Nullable error)
{
    NSString *transcriptText = result.bestTranscription.formattedString;
    if (!error)
    {
        NSLog(@"%@", transcriptText);
    }
}];
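For completeness, a minimal sketch of requesting both permissions up front (assuming the Info.plist already contains NSMicrophoneUsageDescription and NSSpeechRecognitionUsageDescription entries):
#import <Speech/Speech.h>
#import <AVFoundation/AVFoundation.h>
// Ask for speech-recognition authorization first, then for microphone access.
[SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus status) {
    if (status != SFSpeechRecognizerAuthorizationStatusAuthorized) {
        NSLog(@"Speech recognition not authorized (status %ld)", (long)status);
        return;
    }
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        NSLog(@"Microphone permission %@", granted ? @"granted" : @"denied");
        // Only needed when feeding an SFSpeechAudioBufferRecognitionRequest from the mic;
        // a file-based SFSpeechURLRecognitionRequest does not use the microphone.
    }];
}];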

VPN On Demand in iOS 9.2

I want to use the VPN On Demand feature available in the Network Extension framework. I want to connect the VPN when a specific URL or site is opened in the browser.
For example: when I type www.google.com, the VPN should connect, but not for any other site.
After a lot of searching on the internet, I tried the following code:
NEVPNManager *manager = [NEVPNManager sharedManager];
[manager loadFromPreferencesWithCompletionHandler:^(NSError *error) {
    if (error) {
        NSLog(@"Load error: %@", error);
    } else {
        NEVPNProtocolIKEv2 *p = [[NEVPNProtocolIKEv2 alloc] init];
        p.username = @"Username";
        p.passwordReference = [res objectForKey:@"v_PersistentRef"];
        p.serverAddress = strAddress;
        p.authenticationMethod = NEVPNIKEAuthenticationMethodCertificate;
        p.serverCertificateIssuerCommonName = @"COMODO RSA Domain Validation Secure Server CA";
        p.serverCertificateCommonName = strAddress;
        p.identityData = [NSData dataWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"point-to-client2" ofType:@"p12"]];
        p.identityDataPassword = @"vpnuser";
        p.localIdentifier = strAddress;
        p.remoteIdentifier = strAddress;
        p.useExtendedAuthentication = YES;
        p.disconnectOnSleep = NO;
        [manager setProtocol:p];
        NSLog(@"password: %@", [manager protocol].passwordReference);
        [manager setOnDemandEnabled:YES];
        [manager setEnabled:YES];
Then I set the onDemandRules with the following code:
        NEEvaluateConnectionRule *ru = [[NEEvaluateConnectionRule alloc] initWithMatchDomains:@[@"google.com"] andAction:NEEvaluateConnectionRuleActionConnectIfNeeded];
        ru.probeURL = [[NSURL alloc] initWithString:@"http://www.google.com"];
        NSArray *arr = [[NSArray alloc] initWithObjects:ru, nil];
        NEOnDemandRuleEvaluateConnection *ec = [[NEOnDemandRuleEvaluateConnection alloc] init];
        ec.interfaceTypeMatch = 2;
        [ec setConnectionRules:arr];
        NSArray *arr2 = [[NSArray alloc] initWithObjects:ec, nil];
        NSLog(@"onDemandRules: %@", arr2);
        [manager setOnDemandRules:arr2];
        [manager setLocalizedDescription:@"VPN Profile"];
        [manager saveToPreferencesWithCompletionHandler:^(NSError *error) {
            if (error) {
                NSLog(@"Save error: %@", error);
            }
            else {
                NSLog(@"Saved");
            }
Here the VPN configuration profile is updated, and then I start the VPN connection with the following code:
NSError *startError;
[[NEVPNManager sharedManager].connection startVPNTunnelAndReturnError:&startError];
if (startError) {
    NSLog(@"Start error: %@", startError.localizedDescription);
} else {
    NSLog(@"Connection established!");
}
Now I face the following problems:
1) When the profile is updated, the On Demand feature does not work: when I type the URL (e.g. www.google.com) in the browser, the VPN does not connect. So I cannot understand what I did wrong in the code.
2) How can I give a dynamic domain or URL, like the one I placed in initWithMatchDomains:@[@"google.com"] or in the probeURL, so that it works for google.com as well as google.de or any other Google domain?
I know it's quite a long and detailed question, but I really need help.
Any help will be highly appreciated.
Thanks
OK, I am going to answer my own question, as I have almost solved the problem.
The VPN On Demand feature is working perfectly after I slightly modified the code as follows:
I just made an array for the domains:
NSArray *arrDomains = [[NSArray alloc] initWithObjects:@"youtube.com", @"google.com", nil];
Then I evaluate the connection rule:
NEEvaluateConnectionRule *ru = [[NEEvaluateConnectionRule alloc] initWithMatchDomains:arrDomains andAction:NEEvaluateConnectionRuleActionConnectIfNeeded];
Then I use the property:
ru.useDNSServers = arrDomains;
The remaining code is the same as in the question above.
NSArray *arrRules = [[NSArray alloc] initWithObjects:ru,nil];
NEOnDemandRuleEvaluateConnection *ec = [[NEOnDemandRuleEvaluateConnection alloc] init];
ec.interfaceTypeMatch = 2;
[ec setConnectionRules:arrRules];
NSArray *arr2 = [[NSArray alloc] initWithObjects:ec, nil];
NSLog(#"onDemandRules: %#", arr2);
[manager setOnDemandRules:arr2];
The above code creates a VPN connection for the domains defined in the array; this is the answer to my first question.
And to answer my second question: from what I found on the internet, to match a dynamic domain you can use the wildcard *, but it can only be used as the left-most label of the domain name, e.g. *.example.com; the * will not work in any other position.
I hope this helps others who face the same problem.
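A minimal sketch of that wildcard form (the domains are illustrative):
// Left-most-label wildcards only, as described above; * in any other position will not match.
NSArray *arrWildcardDomains = [[NSArray alloc] initWithObjects:@"*.google.com", @"*.youtube.com", nil];
NEEvaluateConnectionRule *wildcardRule = [[NEEvaluateConnectionRule alloc] initWithMatchDomains:arrWildcardDomains andAction:NEEvaluateConnectionRuleActionConnectIfNeeded];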

NSURLSession - use uploadTaskWithStreamedRequest with the AWS iOS SDK

I've searched for days for a way to upload an iOS asset without creating a copy of the file in the temp directory, without luck. I got the code working with a temp copy, but copying a video file that could be anywhere from 10 MB to 4 GB is not realistic.
The closest I have come to reading the asset in read-only mode is the code below. Per the Apple documentation this should work; see the following link:
https://developer.apple.com/library/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html
I have enabled these keys:
<key>com.apple.security.assets.movies.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.music.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.pictures.read-write</key>
<string>YES</string>
<key>com.apple.security.files.downloads.read-write</key>
<string>YES</string>
Here is the code:
// QueueController.h
#import <AVFoundation/AVFoundation.h>
#import <AWSS3.h>
#import <Foundation/Foundation.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import "Reachability1.h"
#import "TransferType.h"
#import "TransferModel.h"
#import "Util.h"
@interface QueueController : NSObject<NSURLSessionDelegate>
@property(atomic, strong) NSURLSession* session;
@property(atomic, strong) NSNumber* sessionCount;
@property(atomic, strong) NSURLSessionConfiguration* configuration;
+ (QueueController*)sharedInstance;
- (void)transferMediaViaQueue:(MediaItem*)mediaItem
withTransferType:(TransferType)transferType;
@end
@implementation QueueController {
NSOperationQueue* copyQueue;
NSOperationQueue* transferQueue;
NSMutableArray* inProcessTransferArray;
NSMutableArray* pendingTransferArray;
bool isTransferring;
}
static QueueController* sharedInstance = nil;
// Get the shared instance and create it if necessary.
+ (QueueController*)sharedInstance {
@synchronized(self) {
if (sharedInstance == nil) {
sharedInstance = [[QueueController alloc] init];
}
}
return sharedInstance;
}
- (id)init {
if (self = [super init]) {
appDelegate =
(RootViewControllerAppDelegate*)[UIApplication sharedApplication]
.delegate;
copyQueue = [[NSOperationQueue alloc] init];
transferQueue = [[NSOperationQueue alloc] init];
transferQueue.maxConcurrentOperationCount = MAX_CONCURRENT_TRANSFERS;
inProcessTransferArray = [[NSMutableArray alloc] init];
pendingTransferArray = [[NSMutableArray alloc] init];
isTransferring = false;
if (self.session == nil) {
self.configuration = [NSURLSessionConfiguration
backgroundSessionConfigurationWithIdentifier:@"transferQueue"];
self.session = [NSURLSession sessionWithConfiguration:self.configuration
delegate:self
delegateQueue:transferQueue];
}
}
return self;
}
- (void)transferMediaViaQueue:(MediaItem*)mediaItem
withTransferType:(TransferType)transferType {
// Create a transfer model
NSUserDefaults* defaultUser = [NSUserDefaults standardUserDefaults];
NSString* user_id = [defaultUser valueForKey:@"UserId"];
TransferModel* transferModel = [[TransferModel alloc] init];
transferModel.mediaItem = mediaItem;
transferModel.transferType = transferType;
transferModel.s3Path = user_id;
transferModel.s3file_name = mediaItem.mediaName;
transferModel.assetURL =
[[mediaItem.mediaLocalAsset defaultRepresentation] url];
ALAssetRepresentation* mediaRep =
[mediaItem.mediaLocalAsset defaultRepresentation];
transferModel.content_type =
(__bridge_transfer NSString*)UTTypeCopyPreferredTagWithClass(
(__bridge CFStringRef)[mediaRep UTI], kUTTagClassMIMEType);
@synchronized(pendingTransferArray) {
if ((!isTransferring) &&
(transferQueue.operationCount < MAX_CONCURRENT_TRANSFERS)) {
isTransferring = true;
if (transferModel.transferType == UPLOAD) {
/**
* Read ALAsset from NSURLRequestStream
*/
NSInvocationOperation* uploadOP = [[NSInvocationOperation alloc]
initWithTarget:self
selector:@selector(uploadMediaViaLocalPath:)
object:transferModel];
[transferQueue addOperation:uploadOP];
[inProcessTransferArray addObject:transferModel];
}
} else {
// Add to pending
[pendingTransferArray addObject:transferModel];
}
}
}
- (void)uploadMediaViaLocalPath:(TransferModel*)transferModel {
@try {
/**
* Fetch readable asset
*/
NSURL* assetURL =
[[transferModel.mediaItem.mediaLocalAsset defaultRepresentation] url];
NSData* fileToUpload = [[NSData alloc] initWithContentsOfURL:assetURL];
NSURLRequest* assetAsRequest =
[NSURLRequest requestWithURL:assetURL
cachePolicy:NSURLRequestUseProtocolCachePolicy
timeoutInterval:60.0];
/**
* Fetch signed URL
*/
AWSS3GetPreSignedURLRequest* getPreSignedURLRequest =
[AWSS3GetPreSignedURLRequest new];
getPreSignedURLRequest.bucket = BUCKET_NAME;
NSString* s3Key = [NSString stringWithFormat:@"%@/%@", transferModel.s3Path, transferModel.s3file_name];
getPreSignedURLRequest.key = s3Key;
getPreSignedURLRequest.HTTPMethod = AWSHTTPMethodPUT;
getPreSignedURLRequest.expires = [NSDate dateWithTimeIntervalSinceNow:3600];
// Important: must set contentType for PUT request
// getPreSignedURLRequest.contentType = transferModel.mediaItem.mimeType;
getPreSignedURLRequest.contentType = transferModel.content_type;
NSLog(#"mimeType: %#", transferModel.content_type);
/**
* Upload the file
*/
[[[AWSS3PreSignedURLBuilder defaultS3PreSignedURLBuilder]
getPreSignedURL:getPreSignedURLRequest]
continueWithBlock:^id(BFTask* task) {
NSURLSessionUploadTask* uploadTask;
transferModel.sessionTask = uploadTask;
if (task.error) {
NSLog(#"Error: %#", task.error);
} else {
NSURL* presignedURL = task.result;
NSLog(#"upload presignedURL is: \n%#", presignedURL);
NSMutableURLRequest* request =
[NSMutableURLRequest requestWithURL:presignedURL];
request.cachePolicy = NSURLRequestReloadIgnoringLocalCacheData;
[request setHTTPMethod:@"PUT"];
[request setValue:transferModel.content_type
forHTTPHeaderField:@"Content-Type"];
uploadTask =
[self.session uploadTaskWithStreamedRequest:assetAsRequest];
[uploadTask resume];
}
return nil;
}];
} @catch (NSException* exception) {
NSLog(@"exception: %@", exception);
} @finally {
}
}
- (void)URLSession:(NSURLSession*)session
task:(NSURLSessionTask*)task
didSendBodyData:(int64_t)bytesSent
totalBytesSent:(int64_t)totalBytesSent
totalBytesExpectedToSend:(int64_t)totalBytesExpectedToSend {
// Calculate progress
double progress = (double)totalBytesSent / (double)totalBytesExpectedToSend;
NSLog(#"UploadTask progress: %lf", progress);
}
- (void)URLSession:(NSURLSession*)session
task:(NSURLSessionTask*)task
didCompleteWithError:(NSError*)error {
NSLog(#"(void)URLSession:session task:(NSURLSessionTask*)task "
#"didCompleteWithError:error called...%#",
error);
}
- (void)URLSessionDidFinishEventsForBackgroundURLSession:
(NSURLSession*)session {
NSLog(#"URLSessionDidFinishEventsForBackgroundURLSession called...");
}
// NSURLSessionDataDelegate
- (void)URLSession:(NSURLSession*)session
dataTask:(NSURLSessionDataTask*)dataTask
didReceiveResponse:(NSURLResponse*)response
completionHandler:
(void (^)(NSURLSessionResponseDisposition disposition))completionHandler {
//completionHandler(NSURLSessionResponseAllow);
}
@end
But I'm receiving this error:
(void)URLSession:session task:(NSURLSessionTask*)task didCompleteWithError:error called...Error Domain=NSURLErrorDomain Code=-999 "cancelled" UserInfo=0x17166f840 {NSErrorFailingURLStringKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV, NSLocalizedDescription=cancelled, NSErrorFailingURLKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV}
userInfo: {
NSErrorFailingURLKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSErrorFailingURLStringKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSLocalizedDescription = cancelled;
}
Thanks in advance for your help.
Regards,
-J
A couple of comments regarding using NSURLSessionUploadTask:
If you implement didReceiveResponse, you must call the completionHandler.
If you call uploadTaskWithStreamedRequest, the documentation for the request parameter warns us that:
The body stream and body data in this request object are ignored, and NSURLSession calls its delegate’s URLSession:task:needNewBodyStream: method to provide the body data.
So you must implement needNewBodyStream if implementing an NSInputStream-based request.
Be forewarned that using a stream-based request like this creates a request with a "chunked" transfer encoding, and not all servers can handle that.
At one point in the code, you appear to try to load the contents of the asset into an NSData. If you have assets that large, you cannot reasonably load them into an NSData object. Besides, that's inconsistent with using uploadTaskWithStreamedRequest.
You either need to create an NSInputStream or upload from a file.
You appear to be using the asset URL for the NSURLRequest. That URL should be the URL for your web service.
When using image picker, you have access to two URL keys: the media URL (a file:// URL for movies, but not pictures) and the assets library reference URL (an assets-library:// URL). If you're using the media URL, you can use that for uploading movies. But you cannot use the assets library reference URL for uploading purposes. You can only use that in conjunction with ALAssetsLibrary.
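A minimal sketch of the needNewBodyStream delegate method mentioned above (the file path property is a hypothetical place to stash the upload source; use whatever your transfer model provides):
// NSURLSession calls this whenever it needs the body of a streamed upload task,
// including on retries, so it must be able to produce a fresh stream each time.
- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
 needNewBodyStream:(void (^)(NSInputStream *bodyStream))completionHandler {
    NSString *filePath = self.currentUploadFilePath; // hypothetical property
    completionHandler([NSInputStream inputStreamWithFileAtPath:filePath]);
}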
ALAssetPropertyURL is purely a URL identifier for the asset, i.e. it identifies assets and asset groups, and I don't think you can use it directly to upload to a service.
You could use AVAssetExportSession to export the asset to a temp URL if other methods are tedious.
i.e.:
[AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil] presetName:AVAssetExportPresetPassthrough];
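A slightly fuller sketch of that approach (the output path and file type are illustrative assumptions):
AVAssetExportSession *exportSession =
    [AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil]
                                      presetName:AVAssetExportPresetPassthrough];
// Hypothetical temp destination; pick a path you control.
NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.mov"];
exportSession.outputURL = [NSURL fileURLWithPath:tmpPath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // The exported file at outputURL can now be handed to uploadTaskWithRequest:fromFile:.
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];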

Playing a YouTube video with MPMoviePlayerController

I've been using an HTML string for a long time to play YouTube videos through a UIWebView; the problem is I want to get notifications when the playback state changes. I've decided to create an MPMoviePlayerController and play the YouTube video through it, but I can't seem to make it work. I'm using the following code in viewDidLoad:
NSString *urlAddress = @"http://www.youtube.com/watch?v=m01MYOpbdIk";
NSURL *url = [NSURL URLWithString:urlAddress];
CGFloat width = [UIScreen mainScreen].bounds.size.width;
CGFloat height = [UIScreen mainScreen].bounds.size.height;
movie = [[MPMoviePlayerController alloc] initWithContentURL:url];
movie.scalingMode=MPMovieScalingModeAspectFill;
movie.view.frame = CGRectMake(0.0, 0.0, width, height);
[self.view addSubview:movie.view];
[movie play];
Gives me this error:
_itemFailedToPlayToEnd: {
kind = 1;
new = 2;
old = 0;
}
For me, this library worked perfectly: https://github.com/hellozimi/HCYoutubeParser
moviePlayer = [[MPMoviePlayerController alloc] init];
moviePlayer.shouldAutoplay = YES;
moviePlayer.fullscreen = YES;
moviePlayer.repeatMode = MPMovieRepeatModeNone;
moviePlayer.controlStyle = MPMovieControlStyleDefault;
moviePlayer.movieSourceType = MPMovieSourceTypeFile;
moviePlayer.scalingMode = MPMovieScalingModeAspectFit;
After this, call the method:
[self callYouTubeURL:[NSString stringWithFormat:@"http://www.youtube.com/embed/%@", _urlcode]];
This method parses the YouTube link:
- (void)callYouTubeURL:(NSString *)urlLink
{
NSURL *url = [NSURL URLWithString:urlLink];
actvity.hidden = NO;
[HCYoutubeParser thumbnailForYoutubeURL:url thumbnailSize:YouTubeThumbnailDefaultHighQuality completeBlock:^(UIImage *image, NSError *error) {
if (!error) {
[HCYoutubeParser h264videosWithYoutubeURL:url completeBlock:^(NSDictionary *videoDictionary, NSError *error) {
NSDictionary *qualities = videoDictionary;
NSString *URLString = nil;
if ([qualities objectForKey:@"small"] != nil) {
URLString = [qualities objectForKey:@"small"];
}
else if ([qualities objectForKey:@"live"] != nil) {
URLString = [qualities objectForKey:@"live"];
}
else {
[[[UIAlertView alloc] initWithTitle:@"Error" message:@"Couldn't find youtube video" delegate:nil cancelButtonTitle:@"Close" otherButtonTitles: nil] show];
return;
}
_urlToLoad = [NSURL URLWithString:URLString];
[self urlLoadintoPlayer];
}];
}
else {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:[error localizedDescription] delegate:nil cancelButtonTitle:@"Dismiss" otherButtonTitles:nil];
[alert show];
}
}];
}
After this, load the newly parsed URL into the player:
-(void)urlLoadintoPlayer
{
moviePlayer.contentURL = _urlToLoad;
}
Make use of the custom LBYouTubePlayerViewController.
It is a subclass of MPMoviePlayerViewController.
LBYouTubeView is just a small view that is able to display YouTube videos in an MPMoviePlayerController. You even have the choice between a high-quality and a standard-quality stream.
It just loads the HTML code of YouTube's mobile website and looks for the data in the script tag.
LBYouTubeView doesn't use UIWebView, which makes it faster and cleaner.
Official ways to play YouTube videos: 1) a UIWebView with the YouTube embed tag as the UIWebView's content; 2) using youtube-ios-player-helper (Google's official way).
Unfortunately, there's no way to directly play a YouTube video with MPMoviePlayerController, because YouTube does not expose direct links to the video files.
There are libraries which run YouTube videos through MPMoviePlayerController, but they are against YouTube's terms of service. Hence the simplest method is to go with youtube-ios-player-helper.
In case the youtube-ios-player-helper pod doesn't work, you can add YTPlayerView.h/.m and the assets folder to your project, write a bridging header with #import "YTPlayerView.h", and follow the rest of the procedure at https://github.com/youtube/youtube-ios-player-helper
This definitely worked for me!
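For reference, a minimal sketch of the youtube-ios-player-helper route mentioned above (the video ID is just the one from the question):
#import "YTPlayerView.h"
// YTPlayerView embeds YouTube's iframe player in a web view internally.
YTPlayerView *playerView = [[YTPlayerView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:playerView];
[playerView loadWithVideoId:@"m01MYOpbdIk"];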

Control is not returned from the web view shouldStartLoad delegate method

In my application I am using a UIWebView to display the Vimeo authorization page. After the user has authorized it, I have to parse the URL for the OAuth token and dismiss the view; for that I am using the shouldStartLoadWithRequest delegate method, but after the process is over, when I return NO, control is not transferred back...
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType{
NSURL *url = [request mainDocumentURL];
NSString *str1 = [url absoluteString];
NSString *str = @"https://vimeo.com/oauth/confirmed";
//[webView setBackgroundColor:[UIColor colorWithPatternImage:[UIImage imageNamed:@"bg_cork.png"]]];
if([str isEqualToString:str1])
{
//removing the webview after the user approves.
//[webView removeFromSuperview];
return YES;
}
//parsing the redirected url to get the oauth_verifier.
URLParser *parser = [[URLParser alloc] initWithURLString:str1];
Oauth_verifier = [parser valueForVariable:@"oauth_verifier"];
//getting the final access token by giving the oauth verifier.
NSURL *url_access = [[NSURL alloc] initWithString:@"https://vimeo.com/oauth/access_token"];
OAMutableURLRequest *reques_access = [[OAMutableURLRequest alloc]initWithURL:url_access consumer:consumer token:acessToken realm:nil signatureProvider:nil];
OARequestParameter *p2 = [[OARequestParameter alloc] initWithName:@"oauth_verifier" value:Oauth_verifier];
NSArray *params2 = [NSArray arrayWithObject:p2];
[reques_access setParameters:params2];
[reques_access setHTTPMethod:@"GET"];
OADataFetcher *fetcher_access = [[OADataFetcher alloc]init];
[fetcher_access fetchDataWithRequest:reques_access delegate:self didFinishSelector:@selector(acessTokenTicket:didFinishWithData:) didFailSelector:@selector(acessTokenTicket:didFailWithError:)];
//if the access token is successfully generated then the control transferrd to acessTokenTicket did finish with data
// Return YES if you want to load the page, and NO if you don't.
NSLog(#"at return yes");
if (i==1) {
NSLog(#"returning no");
[webView removeFromSuperview];
return NO;
}
return YES;
}
I am sure that it is going to return NO, because the statement "returning no" is printed, but control is not returned. I have added the statement
NSLog(@"returned from web view delegate");
in the calling function to know whether control is returned; it is not returned, and the operations below it are not performed either.
