I'm trying to use SFSpeechRecognizer, but I don't have a way to test whether I'm implementing it correctly, and since it's a relatively new class I couldn't find any sample code (I don't know Swift). Am I making any unforgivable mistakes or missing something?
[SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus status) {
    if (status == SFSpeechRecognizerAuthorizationStatusAuthorized) {
        SFSpeechRecognizer *recognizer = [[SFSpeechRecognizer alloc] init];
        recognizer.delegate = self;
        SFSpeechAudioBufferRecognitionRequest *request = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
        request.contextualStrings = @[@"data", @"bank", @"databank"];
        SFSpeechRecognitionTask *task = [recognizer recognitionTaskWithRequest:request resultHandler:^(SFSpeechRecognitionResult *result, NSError *error) {
            SFTranscription *transcript = result.bestTranscription;
            NSLog(@"%@", transcript);
        }];
    }
}];
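One thing I'm unsure about (untested, my own assumption): a buffer-based request presumably only produces results once audio is actually appended to it, so I imagine wiring like the following is also needed, with request being the request created above:

// Untested sketch: feed microphone audio into the buffer-based request.
// Without appended buffers the recognizer has nothing to transcribe.
AVAudioEngine *audioEngine = [[AVAudioEngine alloc] init];
AVAudioInputNode *inputNode = audioEngine.inputNode;
[inputNode installTapOnBus:0
                bufferSize:1024
                    format:[inputNode outputFormatForBus:0]
                     block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    [request appendAudioPCMBuffer:buffer];
}];
[audioEngine prepare];
NSError *engineError = nil;
[audioEngine startAndReturnError:&engineError];
// Call [request endAudio] once capture is finished.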
I'm trying this too, and the code below works for me. After all, SFSpeechRecognizer and SFSpeechAudioBufferRecognitionRequest are not the same thing, so I think (though I haven't tested it) you have to ask for different permissions. Have you asked for permission to use the microphone and speech recognition before? OK, here's the code:
// Available on iOS 10+, limited to about 1 minute of audio, and needs an internet
// connection; input can come from a recorded audio file or from the microphone.
NSLocale *locale = [[NSLocale alloc] initWithLocaleIdentifier:@"es-MX"];
speechRecognizer = [[SFSpeechRecognizer alloc] initWithLocale:locale];
NSString *soundFilePath = [myDir stringByAppendingPathComponent:@"sound.m4a"];
NSURL *url = [[NSURL alloc] initFileURLWithPath:soundFilePath];
if (!speechRecognizer.isAvailable)
    NSLog(@"speechRecognizer is not available, maybe it has no internet connection");
SFSpeechURLRecognitionRequest *urlRequest = [[SFSpeechURLRecognitionRequest alloc] initWithURL:url];
urlRequest.shouldReportPartialResults = YES; // YES if you want to animate the text as it arrives
[speechRecognizer recognitionTaskWithRequest:urlRequest resultHandler:^(SFSpeechRecognitionResult * _Nullable result, NSError * _Nullable error) {
    NSString *transcriptText = result.bestTranscription.formattedString;
    if (!error) {
        NSLog(@"%@", transcriptText); // log the transcription itself, not a literal string
    }
}];
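Also, about the permissions: my understanding (an assumption, check the docs) is that you need NSSpeechRecognitionUsageDescription in Info.plist for the recognizer, plus NSMicrophoneUsageDescription and a runtime request if you capture from the microphone, roughly like this untested sketch:

// Untested sketch: request microphone access before starting any capture.
// Requires NSMicrophoneUsageDescription in Info.plist; the speech
// authorization itself is requested via SFSpeechRecognizer as above.
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    if (!granted) {
        NSLog(@"Microphone permission denied");
    }
}];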
This is my first time using the Baidu API, and I am having trouble implementing the Baidu places auto-complete API in my project. I am using the Baidu developers link http://lbsyun.baidu.com/index.php?title=iossdk.
Could someone point me to a tutorial on this?
I am following this tutorial (link), but with it I cannot receive the JSON file; it gives me this error:
{ "Status": 102, "message": "MCODE parameter is not present, mobile
type mcode required parameter"}
It seems you should use the POI Search module of BaiduMapKit. Try this:
BMKCitySearchOption *citySearchOption = [[BMKCitySearchOption alloc] init];
citySearchOption.pageIndex = curPage; // the page index; you can set it to 0
citySearchOption.pageCapacity = 10;
citySearchOption.city = @"上海";      // the city in which you want to search
citySearchOption.keyword = @"淮海路"; // the road name or place name you want to search for
BOOL flag = [_poisearch poiSearchInCity:citySearchOption];
if (flag) {
    _nextPageButton.enabled = true;
    NSLog(@"success");
} else {
    _nextPageButton.enabled = false;
    NSLog(@"fail");
}
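For completeness, the results come back through the BMKPoiSearchDelegate. A rough sketch of the callback, with the signature as I recall it from the Baidu SDK headers (verify against the SDK version you ship):

// Rough sketch of the BMKPoiSearchDelegate callback; names are per the
// Baidu SDK headers, so verify against your SDK version.
- (void)onGetPoiResult:(BMKPoiSearch *)searcher
                result:(BMKPoiResult *)poiResult
             errorCode:(BMKSearchErrorCode)errorCode {
    if (errorCode == BMK_SEARCH_NO_ERROR) {
        for (BMKPoiInfo *info in poiResult.poiInfoList) {
            NSLog(@"POI: %@", info.name);
        }
    }
}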
Implement autocomplete in Baidu Map using the Baidu Web API
- (void)viewDidLoad {
    BaseString = @"http://api.map.baidu.com/place/v2/suggestion?query=";
    ak = @"56dIEtBAp1CU7u8ZMcq8DyUH2mVsn38x";
    mcode = @"com.baidu.Baidu-Map-Demo";
    regionkey = @"中国";
    PathString = @"http://api.map.baidu.com/direction/v2/transit?origin=";
    self.mapView.userTrackingMode = BMKUserTrackingModeFollow;
    // 2. Set the map type
    self.mapView.mapType = BMKMapTypeStandard;
    // 3. Set the delegate
    self.mapView.delegate = self;
    [super viewDidLoad];
    mapView.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    mapView.delegate = self;
    anotation = [[BMKPointAnnotation alloc] init];
    destination = [[BMKPointAnnotation alloc] init];
    PathUrl = [[NSURL alloc] init];
    finalPathArray = [[NSMutableArray alloc] init];
    session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration]];
    downloadURL = [[NSURL alloc] init];
    path = [[BMKPolyline alloc] init];
    flag = 0;
}
- (void)GetSuggestion:(NSString *)query {
    NSString *stringUrl = [NSString stringWithFormat:@"%@%@&page_size=10&page_num=0&scope=1&region=%@&output=json&ak=%@&mcode=%@", BaseString, query, regionkey, ak, mcode];
    stringUrl = [stringUrl stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLFragmentAllowedCharacterSet]];
    downloadURL = [NSURL URLWithString:stringUrl];
    if (downloadURL != nil) {
        if (DownloadTask != nil) {
            [DownloadTask suspend];
        }
        DownloadTask = [session dataTaskWithURL:downloadURL completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
            NSDictionary *AutocompleteData = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:nil];
            resultArray = AutocompleteData[@"result"];
            // UI updates must happen on the main queue; this completion
            // handler runs on a background queue.
            dispatch_async(dispatch_get_main_queue(), ^{
                tbl_result.hidden = NO;
                [tbl_result reloadData];
            });
        }];
        [DownloadTask resume];
    }
}
The MCODE error means you must include your app's bundle ID (the mcode parameter) in the request URL. For autocomplete, build the URL and fire the request as in the GetSuggestion: function above.
I want to use the VPN On Demand feature available in the Network Extension framework. I want the VPN to connect when a specific URL or site is opened in the browser.
For example: when I type www.google.com the VPN should connect, but not on any other site.
After a lot of searching on the internet, I tried the following code:
NEVPNManager *manager = [NEVPNManager sharedManager];
[manager loadFromPreferencesWithCompletionHandler:^(NSError *error) {
    if (error) {
        NSLog(@"Load error: %@", error);
    } else {
        NEVPNProtocolIKEv2 *p = [[NEVPNProtocolIKEv2 alloc] init];
        p.username = @"Username";
        p.passwordReference = [res objectForKey:@"v_PersistentRef"];
        p.serverAddress = strAddress;
        p.authenticationMethod = NEVPNIKEAuthenticationMethodCertificate;
        p.serverCertificateIssuerCommonName = @"COMODO RSA Domain Validation Secure Server CA";
        p.serverCertificateCommonName = strAddress;
        p.identityData = [NSData dataWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"point-to-client2" ofType:@"p12"]];
        p.identityDataPassword = @"vpnuser";
        p.localIdentifier = strAddress;
        p.remoteIdentifier = strAddress;
        p.useExtendedAuthentication = YES;
        p.disconnectOnSleep = NO;
        [manager setProtocol:p];
        NSLog(@"password: %@", [manager protocol].passwordReference);
        [manager setOnDemandEnabled:YES];
        [manager setEnabled:YES];
Then I set the onDemandRules with the following code:
NEEvaluateConnectionRule *ru = [[NEEvaluateConnectionRule alloc] initWithMatchDomains:@[@"google.com"] andAction:NEEvaluateConnectionRuleActionConnectIfNeeded];
ru.probeURL = [[NSURL alloc] initWithString:@"http://www.google.com"];
NSArray *arr = [[NSArray alloc] initWithObjects:ru, nil];
NEOnDemandRuleEvaluateConnection *ec = [[NEOnDemandRuleEvaluateConnection alloc] init];
ec.interfaceTypeMatch = NEOnDemandRuleInterfaceTypeWiFi; // was the raw value 2
[ec setConnectionRules:arr];
NSArray *arr2 = [[NSArray alloc] initWithObjects:ec, nil];
NSLog(@"onDemandRules: %@", arr2);
[manager setOnDemandRules:arr2];
[manager setLocalizedDescription:@"VPN Profile"];
[manager saveToPreferencesWithCompletionHandler:^(NSError *error) {
    if (error) {
        NSLog(@"Save error: %@", error);
    } else {
        NSLog(@"Saved");
    }
}];
At this point the VPN configuration profile is updated, and then I start the VPN connection with the following code:
NSError *startError;
[[NEVPNManager sharedManager].connection startVPNTunnelAndReturnError:&startError];
if (startError) {
    NSLog(@"Start error: %@", startError.localizedDescription);
} else {
    NSLog(@"Connection established!");
}
Now I face the following problems:
1) When the profile is updated, the on-demand feature does not work: when I type the URL (www.google.com) in the browser, the VPN does not connect. I cannot understand what I did wrong in the code.
2) How can I give a dynamic domain or URL, like what I placed in initWithMatchDomains:@[@"google.com"] or in the probeURL, so that it works for google.com as well as google.de or any other Google domain?
I know it's quite a long and detailed question, but I really need help.
Any help will be highly appreciated.
Thanks
OK, I am going to answer my own question, as I have almost solved the problem.
The on-demand VPN feature works perfectly after I slightly modified the code as follows.
I just made an array of the domains:
NSArray *arrDomains = [[NSArray alloc] initWithObjects:@"youtube.com", @"google.com", nil];
Then I evaluate the connection rule:
NEEvaluateConnectionRule *ru = [[NEEvaluateConnectionRule alloc] initWithMatchDomains:arrDomains andAction:NEEvaluateConnectionRuleActionConnectIfNeeded];
Then I set this property:
ru.useDNSServers = arrDomains;
The remaining code is the same as in the question:
NSArray *arrRules = [[NSArray alloc] initWithObjects:ru, nil];
NEOnDemandRuleEvaluateConnection *ec = [[NEOnDemandRuleEvaluateConnection alloc] init];
ec.interfaceTypeMatch = NEOnDemandRuleInterfaceTypeWiFi; // was the raw value 2
[ec setConnectionRules:arrRules];
NSArray *arr2 = [[NSArray alloc] initWithObjects:ec, nil];
NSLog(@"onDemandRules: %@", arr2);
[manager setOnDemandRules:arr2];
The above code creates a VPN connection for the domains defined in the array; this is the answer to my first question.
And to answer my second question: from what I found on the internet, you can use the wildcard * to match domains dynamically, but it is only honored as the left-most label of the domain name, e.g. *.example.com; the * will not work in any other position.
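For example (an untested sketch of my understanding):

// Sketch: "*.google.com" matches maps.google.com but NOT google.de;
// different top-level domains still need their own entries.
NSArray *wildcardDomains = @[@"*.google.com", @"google.com", @"google.de"];
NEEvaluateConnectionRule *wildRule = [[NEEvaluateConnectionRule alloc] initWithMatchDomains:wildcardDomains andAction:NEEvaluateConnectionRuleActionConnectIfNeeded];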
I hope this helps others who face the same problem.
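One more gotcha that may bite others (an assumption based on my own testing, not something I can point to in the docs): after saveToPreferencesWithCompletionHandler: completes, reload the preferences before starting the tunnel, otherwise the start call can fail with a stale configuration:

// Sketch: reload the just-saved configuration before starting the tunnel.
[manager loadFromPreferencesWithCompletionHandler:^(NSError *loadError) {
    if (loadError == nil) {
        NSError *startError = nil;
        [manager.connection startVPNTunnelAndReturnError:&startError];
        if (startError) {
            NSLog(@"Start error: %@", startError.localizedDescription);
        }
    }
}];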
I've searched for days for a way to upload an iOS asset without creating a copy of the file in the temp directory, without luck. I got the code working with a temp copy, but copying a video file that could be anywhere from 10 MB to 4 GB is not realistic.
The closest I have come to reading the asset in read-only mode is the code below. Per the Apple documentation this should work; see the following link:
https://developer.apple.com/library/ios/documentation/Miscellaneous/Reference/EntitlementKeyReference/Chapters/EnablingAppSandbox.html
I have enabled these keys:
<key>com.apple.security.assets.movies.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.music.read-write</key>
<string>YES</string>
<key>com.apple.security.assets.pictures.read-write</key>
<string>YES</string>
<key>com.apple.security.files.downloads.read-write</key>
<string>YES</string>
Here is the code:
// QueueController.h
#import <AVFoundation/AVFoundation.h>
#import <AWSS3.h>
#import <Foundation/Foundation.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import "Reachability1.h"
#import "TransferType.h"
#import "TransferModel.h"
#import "Util.h"
@interface QueueController : NSObject <NSURLSessionDelegate>
@property(atomic, strong) NSURLSession *session;
@property(atomic, strong) NSNumber *sessionCount;
@property(atomic, strong) NSURLSessionConfiguration *configuration;
+ (QueueController *)sharedInstance;
- (void)transferMediaViaQueue:(MediaItem *)mediaItem
             withTransferType:(TransferType)transferType;
@end
@implementation QueueController {
    NSOperationQueue *copyQueue;
    NSOperationQueue *transferQueue;
    NSMutableArray *inProcessTransferArray;
    NSMutableArray *pendingTransferArray;
    bool isTransferring;
}

static QueueController *sharedInstance = nil;

// Get the shared instance and create it if necessary.
+ (QueueController *)sharedInstance {
    @synchronized(self) {
        if (sharedInstance == nil) {
            sharedInstance = [[QueueController alloc] init];
        }
    }
    return sharedInstance;
}
- (id)init {
    if (self = [super init]) {
        appDelegate = (RootViewControllerAppDelegate *)[UIApplication sharedApplication].delegate;
        copyQueue = [[NSOperationQueue alloc] init];
        transferQueue = [[NSOperationQueue alloc] init];
        transferQueue.maxConcurrentOperationCount = MAX_CONCURRENT_TRANSFERS;
        inProcessTransferArray = [[NSMutableArray alloc] init];
        pendingTransferArray = [[NSMutableArray alloc] init];
        isTransferring = false;
        if (self.session == nil) {
            self.configuration = [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"transferQueue"];
            self.session = [NSURLSession sessionWithConfiguration:self.configuration
                                                         delegate:self
                                                    delegateQueue:transferQueue];
        }
    }
    return self;
}
- (void)transferMediaViaQueue:(MediaItem *)mediaItem
             withTransferType:(TransferType)transferType {
    // Create a transfer model
    NSUserDefaults *defaultUser = [NSUserDefaults standardUserDefaults];
    NSString *user_id = [defaultUser valueForKey:@"UserId"];
    TransferModel *transferModel = [[TransferModel alloc] init];
    transferModel.mediaItem = mediaItem;
    transferModel.transferType = transferType;
    transferModel.s3Path = user_id;
    transferModel.s3file_name = mediaItem.mediaName;
    transferModel.assetURL = [[mediaItem.mediaLocalAsset defaultRepresentation] url];
    ALAssetRepresentation *mediaRep = [mediaItem.mediaLocalAsset defaultRepresentation];
    transferModel.content_type = (__bridge_transfer NSString *)UTTypeCopyPreferredTagWithClass(
        (__bridge CFStringRef)[mediaRep UTI], kUTTagClassMIMEType);
    @synchronized(pendingTransferArray) {
        if ((!isTransferring) &&
            (transferQueue.operationCount < MAX_CONCURRENT_TRANSFERS)) {
            isTransferring = true;
            if (transferModel.transferType == UPLOAD) {
                /**
                 * Read ALAsset from NSURLRequestStream
                 */
                NSInvocationOperation *uploadOP = [[NSInvocationOperation alloc]
                    initWithTarget:self
                          selector:@selector(uploadMediaViaLocalPath:)
                            object:transferModel];
                [transferQueue addOperation:uploadOP];
                [inProcessTransferArray addObject:transferModel];
            }
        } else {
            // Add to pending
            [pendingTransferArray addObject:transferModel];
        }
    }
}
- (void)uploadMediaViaLocalPath:(TransferModel *)transferModel {
    @try {
        /**
         * Fetch readable asset
         */
        NSURL *assetURL = [[transferModel.mediaItem.mediaLocalAsset defaultRepresentation] url];
        NSData *fileToUpload = [[NSData alloc] initWithContentsOfURL:assetURL];
        NSURLRequest *assetAsRequest =
            [NSURLRequest requestWithURL:assetURL
                             cachePolicy:NSURLRequestUseProtocolCachePolicy
                         timeoutInterval:60.0];
        /**
         * Fetch signed URL
         */
        AWSS3GetPreSignedURLRequest *getPreSignedURLRequest = [AWSS3GetPreSignedURLRequest new];
        getPreSignedURLRequest.bucket = BUCKET_NAME;
        NSString *s3Key = [NSString stringWithFormat:@"%@/%@", transferModel.s3Path, transferModel.s3file_name];
        getPreSignedURLRequest.key = s3Key;
        getPreSignedURLRequest.HTTPMethod = AWSHTTPMethodPUT;
        getPreSignedURLRequest.expires = [NSDate dateWithTimeIntervalSinceNow:3600];
        // Important: must set contentType for PUT request
        // getPreSignedURLRequest.contentType = transferModel.mediaItem.mimeType;
        getPreSignedURLRequest.contentType = transferModel.content_type;
        NSLog(@"mimeType: %@", transferModel.content_type);
        /**
         * Upload the file
         */
        [[[AWSS3PreSignedURLBuilder defaultS3PreSignedURLBuilder]
            getPreSignedURL:getPreSignedURLRequest]
            continueWithBlock:^id(BFTask *task) {
                NSURLSessionUploadTask *uploadTask;
                transferModel.sessionTask = uploadTask;
                if (task.error) {
                    NSLog(@"Error: %@", task.error);
                } else {
                    NSURL *presignedURL = task.result;
                    NSLog(@"upload presignedURL is: \n%@", presignedURL);
                    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:presignedURL];
                    request.cachePolicy = NSURLRequestReloadIgnoringLocalCacheData;
                    [request setHTTPMethod:@"PUT"];
                    [request setValue:transferModel.content_type
                        forHTTPHeaderField:@"Content-Type"];
                    uploadTask = [self.session uploadTaskWithStreamedRequest:assetAsRequest];
                    [uploadTask resume];
                }
                return nil;
            }];
    } @catch (NSException *exception) {
        NSLog(@"exception: %@", exception);
    } @finally {
    }
}
- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
   didSendBodyData:(int64_t)bytesSent
    totalBytesSent:(int64_t)totalBytesSent
totalBytesExpectedToSend:(int64_t)totalBytesExpectedToSend {
    // Calculate progress
    double progress = (double)totalBytesSent / (double)totalBytesExpectedToSend;
    NSLog(@"UploadTask progress: %lf", progress);
}

- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
didCompleteWithError:(NSError *)error {
    NSLog(@"(void)URLSession:session task:(NSURLSessionTask*)task "
          @"didCompleteWithError:error called...%@",
          error);
}

- (void)URLSessionDidFinishEventsForBackgroundURLSession:(NSURLSession *)session {
    NSLog(@"URLSessionDidFinishEventsForBackgroundURLSession called...");
}

// NSURLSessionDataDelegate
- (void)URLSession:(NSURLSession *)session
          dataTask:(NSURLSessionDataTask *)dataTask
didReceiveResponse:(NSURLResponse *)response
 completionHandler:(void (^)(NSURLSessionResponseDisposition disposition))completionHandler {
    //completionHandler(NSURLSessionResponseAllow);
}
@end
But I'm receiving this error:
(void)URLSession:session task:(NSURLSessionTask*)task didCompleteWithError:error called...Error Domain=NSURLErrorDomain Code=-999 "cancelled" UserInfo=0x17166f840 {NSErrorFailingURLStringKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV, NSLocalizedDescription=cancelled, NSErrorFailingURLKey=assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV}
userInfo: {
NSErrorFailingURLKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSErrorFailingURLStringKey = "assets-library://asset/asset.MOV?id=94F90EEB-BB6A-4E9D-B77E-CDD60173B60C&ext=MOV";
NSLocalizedDescription = cancelled;
}
Thanks in advance for your help.
Regards,
-J
A couple of comments regarding using NSURLSessionUploadTask:
If you implement didReceiveResponse, you must call the completionHandler.
If you call uploadTaskWithStreamedRequest, the documentation for the request parameter warns us that:
The body stream and body data in this request object are ignored, and NSURLSession calls its delegate’s URLSession:task:needNewBodyStream: method to provide the body data.
So you must implement needNewBodyStream if implementing an NSInputStream-based request.
Be forewarned that using a stream-based request like this creates a request with a "chunked" transfer encoding, and not all servers can handle that.
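A minimal sketch of that delegate method (bodyFileURL here is a hypothetical per-task property you would maintain yourself):

// Minimal sketch: NSURLSession asks for a brand-new, unopened stream every
// time the body must be (re)sent; never hand back a stream you already used.
- (void)URLSession:(NSURLSession *)session
              task:(NSURLSessionTask *)task
 needNewBodyStream:(void (^)(NSInputStream *bodyStream))completionHandler {
    NSInputStream *stream = [NSInputStream inputStreamWithURL:self.bodyFileURL];
    completionHandler(stream);
}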
At one point in the code, you appear to try to load the contents of the asset into an NSData. If you have assets that large, you cannot reasonably load them into an NSData object. Besides, that's inconsistent with using uploadTaskWithStreamedRequest.
You either need to create an NSInputStream or upload from a file.
You appear to be using the asset URL for the NSURLRequest. That URL should be the URL for your web service.
When using image picker, you have access to two URL keys: the media URL (a file:// URL for movies, but not pictures) and the assets library reference URL (an assets-library:// URL). If you're using the media URL, you can use that for uploading movies. But you cannot use the assets library reference URL for uploading purposes. You can only use that in conjunction with ALAssetsLibrary.
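If you do have the file:// media URL, the simplest path (and, as far as I know, the only body style a background session accepts anyway) is a file-based upload. A sketch, where request targets the presigned S3 URL built above and mediaFileURL is the picker's media URL:

// Sketch: upload straight from the file on disk; no NSData, no chunking.
NSURLSessionUploadTask *fileUploadTask = [self.session uploadTaskWithRequest:request fromFile:mediaFileURL];
[fileUploadTask resume];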
The ALAssetPropertyURL is purely a URL identifier for the asset, i.e., it identifies assets and asset groups, and I don't think you can use it directly to upload to a service.
You could use AVAssetExportSession to export the asset to a temp URL if the other methods are tedious, i.e.:
[AVAssetExportSession exportSessionWithAsset:[AVURLAsset URLAssetWithURL:assetURL options:nil] presetName:AVAssetExportPresetPassthrough];
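Fleshed out a little (untested sketch; the output path and file type are assumptions for a passthrough movie export):

// Untested sketch: export the asset to a temp file, then upload that file.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetPassthrough];
exporter.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"export.mov"]];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // Hand exporter.outputURL to an uploadTaskWithRequest:fromFile: here.
    }
}];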
I'm currently trying to use iOS 7's newest APIs to scan Code 39 barcodes, but it's driving me crazy. I have to hold the phone a specific way, really still, for about 10 seconds in order for it to detect the barcode. I compared it to RedLaser, ZBar, etc., and they could analyze it in one second even if it was a little skewed. I'm not sure if it's because of the way I load my capture session or something else. I'd appreciate the help. Any suggestions on how to improve performance?
Here's how I load the scanner in my viewDidLoad method:
// Initialize laser view
laserView = [[UIView alloc] init];
laserView.autoresizingMask = UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleBottomMargin;
laserView.layer.borderColor = [UIColor redColor].CGColor;
laserView.layer.borderWidth = 8;
laserView.layer.cornerRadius = 10;
[self.view addSubview:laserView];

// Start session
scannerSession = [[AVCaptureSession alloc] init];
scannerDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Define error messages
NSError *error = nil;

// Define input
scannerInput = [AVCaptureDeviceInput deviceInputWithDevice:scannerDevice error:&error];

// Check if device has a camera
if (scannerInput) {
    [scannerSession addInput:scannerInput];
} else {
    NSLog(@"Error: %@", error);
}

// Lock the configuration
BOOL success = [scannerDevice lockForConfiguration:nil];
if (success) {
    if ([scannerDevice isAutoFocusRangeRestrictionSupported]) {
        // Restrict the autofocus to near range (new in iOS 7)
        [scannerDevice setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionNear];
    }
}
// Unlock the configuration
[scannerDevice unlockForConfiguration];

// Define output & metadata object types
scannerOutput = [[AVCaptureMetadataOutput alloc] init];
[scannerOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[scannerSession addOutput:scannerOutput];
scannerOutput.metadataObjectTypes = [scannerOutput availableMetadataObjectTypes];

// Create video preview layer
scannerPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:scannerSession];
scannerPreviewLayer.frame = self.view.bounds;
scannerPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:scannerPreviewLayer];

// Start session
[scannerSession startRunning];
[self.view bringSubviewToFront:cancelButton];
[self.view bringSubviewToFront:laserView];
And:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    // Prepare laser view
    CGRect laser = CGRectZero;

    // Format date
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"M/d"];

    // Format time
    NSDateFormatter *timeFormatter = [[NSDateFormatter alloc] init];
    [timeFormatter setDateFormat:@"h:ma"];

    // Define barcode types to recognize
    AVMetadataMachineReadableCodeObject *barCodeObject;
    NSString *idNumber = nil;
    NSArray *barCodeTypes = @[AVMetadataObjectTypeCode39Code];
    if ([metadataObjects count] > 1) {
        NSLog(@"%lu Barcodes Found.", (unsigned long)[metadataObjects count]);
    }

    // Get string value for every barcode (that matches the type we're looking for)
    for (AVMetadataObject *metadata in metadataObjects) {
        for (NSString *type in barCodeTypes) {
            // If the barcode is the type we need, then get its data
            if ([metadata.type isEqualToString:type]) {
                barCodeObject = (AVMetadataMachineReadableCodeObject *)[scannerPreviewLayer transformedMetadataObjectForMetadataObject:(AVMetadataMachineReadableCodeObject *)metadata];
                laser = barCodeObject.bounds;
                idNumber = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                break;
            }
        }
        // If idNumber found
        if (idNumber != nil) {
            // Stop session
            [scannerSession stopRunning];
            [self vibrate];
            NSLog(@"ID: %@", idNumber);
            break;
        }
        // If idNumber is not found
        else {
            NSLog(@"No ID Found.");
        }
    }
    // Update laser
    laserView.frame = laser;
}
I had a similar problem with AVCaptureSession: the capture was very slow and sometimes took a long time to finish.
I don't know if my solution is the right one for you, but it can definitely be helpful to somebody else looking into this problem, like I did.
AVCaptureSession *captureSession = [AVCaptureSession new];
captureSession.sessionPreset = AVCaptureSessionPresetHigh;
With this code you force the camera to a high quality preset.
Hope this will help someone.
Try zooming in a little... videoDevice.videoZoomFactor = 2.0;
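Note that changing the zoom requires holding the configuration lock first; roughly (untested):

// Rough sketch: zoom changes require the configuration lock.
NSError *lockError = nil;
if ([videoDevice lockForConfiguration:&lockError]) {
    videoDevice.videoZoomFactor = 2.0;
    [videoDevice unlockForConfiguration];
}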
A lot of it has to do with the quality of the image (focus, glare, lighting, etc.). Under really good conditions, you should get scanning that is nearly instantaneous.
I suggest you put an NSLog in your delegate function captureOutput:.
You will see that it gets called many times just for scanning a barcode once. Initializing an NSDateFormatter is a very expensive operation, per Why is allocating or initializing NSDateFormatter considered "expensive"?.
I would suggest that you create the NSDateFormatter outside of the delegate function and reuse it instead of creating it every time the function is called. This should make your app more responsive.
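For example, a sketch of hoisting the formatters out of the hot path:

// Sketch: build the formatters once and reuse them on every callback.
static NSDateFormatter *dateFormatter = nil;
static NSDateFormatter *timeFormatter = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"M/d"];
    timeFormatter = [[NSDateFormatter alloc] init];
    [timeFormatter setDateFormat:@"h:ma"];
});

A related guess: restricting scannerOutput.metadataObjectTypes to just @[AVMetadataObjectTypeCode39Code], rather than all available types, may also cut per-frame work.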
I am working on an app that requires access to files stored on Google Drive, which works well.
Now I am trying to display the selected file on the device, and I am using a UIWebView for that purpose. I have read the documents from Google, but I could not find anything about viewing a file (txt/PDF/spreadsheet, etc.).
Here is my code, which I fire when the user taps a cell of the UITableView:
GTLDriveFile *file = [loadedFiles objectAtIndex:indexPath.row];
GTMHTTPFetcher *fetcher = [applicationDelegate.serviceDriveGtl.fetcherService fetcherWithURLString:file.downloadUrl];
[fetcher beginFetchWithCompletionHandler:^(NSData *data, NSError *error) {
    if (error != nil) {
        NSLog(@"ERROR ON LOADING FILE: %@", [error localizedDescription]);
    } else {
        NSString *link = [[NSString alloc] initWithData:data
                                               encoding:NSUTF32StringEncoding];
        DisplayViewController *displayView = [[DisplayViewController alloc] init];
        [displayView showDocumentOnLink:link];
        [self.navigationController pushViewController:displayView animated:YES];
    }
}];
Now, here is how I populate the WebView (I receive the "link" variable in "getLink" in "DisplayViewController"):
displayWebView = [[UIWebView alloc] init];
[displayWebView setFrame:CGRectMake(posX, posY + 230, width, 200)];
[displayWebView setDelegate:self];
[self.view addSubview:displayWebView];
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:getLink]];
[displayWebView loadRequest:request];
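One untested idea would be to hand the fetched bytes straight to the web view instead of converting them to a string first (the MIME type below is a placeholder; it would really have to come from the GTLDriveFile metadata):

// Untested idea: display the downloaded data directly.
[displayWebView loadData:data
                MIMEType:@"application/pdf"
        textEncodingName:@"utf-8"
                 baseURL:[NSURL URLWithString:@"about:blank"]];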
What am I missing here? Why does the WebView not populate the file contents?
Any help is truly appreciated.
Thanks