How do I generate a barcode on Apple Watch with watchOS 2? I can do it using libraries like ZXing on iOS, but I wonder whether there is a way to do the same in watchOS 2.
NSError *error = nil;
ZXMultiFormatWriter *writer = [ZXMultiFormatWriter writer];
// Generate a Code 128 barcode
ZXBitMatrix *result = [writer encode:barCodeNumber
                              format:kBarcodeFormatCode128
                               width:500
                              height:500
                               error:&error];
if (result) {
    CGImageRef image = [[ZXImage imageWithMatrix:result] cgimage];
    return [UIImage imageWithCGImage:image];
}
I figured it out by generating the image in the iOS app and passing it to watchOS using background transfers, creating NSData from the image, something like this:
- (void)viewDidLoad {
[super viewDidLoad];
if([WCSession isSupported]){
self.watchSession = [WCSession defaultSession];
self.watchSession.delegate = self;
[self.watchSession activateSession];
}
}
- (void)sendDataToAppleWatch
{
    NSMutableArray *barCodesArray = [[NSMutableArray alloc] init];
    UIImage *barCodeImage = [self generateBarCode];
    NSData *pngData = UIImagePNGRepresentation(barCodeImage);
    [barCodesArray addObject:pngData];
    if (self.watchSession) {
        NSError *error = nil;
        if (![self.watchSession
              updateApplicationContext:@{@"cardData" : barCodesArray}
              error:&error]) {
            NSLog(@"Updating the context failed: %@", error.localizedDescription);
            UIAlertView *errorAlert = [[UIAlertView alloc] initWithTitle:error.localizedDescription message:error.debugDescription delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [errorAlert show];
        }
    }
}
// *** Apple Watch Code ***
- (void)awakeWithContext:(id)context {
[super awakeWithContext:context];
if([WCSession isSupported]){
self.watchSession = [WCSession defaultSession];
self.watchSession.delegate = self;
[self.watchSession activateSession];
}
}
- (void)session:(WCSession *)session didReceiveApplicationContext:(NSDictionary<NSString *,id> *)applicationContext {
    // The context holds an array of PNG NSData blobs, matching what the iOS side sent
    NSData *imageData = [[applicationContext objectForKey:@"cardData"] objectAtIndex:0];
    [self.barcodeImageView setImage:[UIImage imageWithData:imageData]];
}
Generate the image in the iOS app, then transfer it to the watch. Good luck!
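If the PNG grows too large for an application context update, WCSession's file transfer API is another option. A minimal sketch, assuming the pngData from the code above and the same barcodeImageView outlet on the watch side:

// iOS side: write the PNG to disk and queue a background file transfer
NSURL *fileURL = [[NSURL fileURLWithPath:NSTemporaryDirectory()]
                  URLByAppendingPathComponent:@"barcode.png"];
[pngData writeToURL:fileURL atomically:YES];
[[WCSession defaultSession] transferFile:fileURL metadata:@{@"type" : @"barcode"}];

// watchOS side: WCSessionDelegate callback for completed transfers
- (void)session:(WCSession *)session didReceiveFile:(WCSessionFile *)file {
    NSData *imageData = [NSData dataWithContentsOfURL:file.fileURL];
    [self.barcodeImageView setImage:[UIImage imageWithData:imageData]];
}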
Check this library: EFQRCode.
According to its docs, the original implementation comes from swift_qrcodejs, a "Cross-appleOS SIMPLE QRCode generator for swift, without using CIFilter" (CoreImage is not available on watchOS).
This is the code that reads the QR code:
- (instancetype)init {
if (self = [super init]) {
if (self.session == nil)
self.session = [[AVCaptureSession alloc] init];
//device
if (self.device == nil)
self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
//output
if (self.output == nil)
self.output = [[AVCaptureMetadataOutput alloc] init];
}
return self;
}
- (void)createScanQR {
    NSError *error = nil;
    // input
    if (self.input == nil)
        self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:&error];
    if (self.input) {
        [self.session addInput:self.input];
    } else {
        NSLog(@"%@", error);
        return;
    }
    [self.session addOutput:self.output];
    [self.output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
for (AVMetadataMachineReadableCodeObject *metadata in metadataObjects) {
if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode]) {
NSLog(#"======%#=======",metadata.stringValue);
}
}
}
It works in a native app. But my app is built with Unity and uses Vuforia; when I use AVCapture to read the QR code, Vuforia shows a black screen, because there is only one camera and Vuforia is already using it. How can I use AVCaptureInput to read QR codes while Vuforia keeps working?
My plan B is to take the Vuforia view, render it into an image, and use iOS's CIDetector to read the QR code from that image, but I get a nil image. Why?
UIView *view = UnityGetGLView();
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image);
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode context:[CIContext contextWithOptions:nil] options:@{CIDetectorAccuracy : CIDetectorAccuracyLow}];
NSArray *features = [detector featuresInImage:[CIImage imageWithData:imageData]];
for (CIFeature *feature in features) {
    NSLog(@"%@", feature.type);
    if ([feature isKindOfClass:[CIQRCodeFeature class]]) {
        NSLog(@"?????? %@ ?????", ((CIQRCodeFeature *)feature).messageString);
        dispatch_sync(queue, ^{ dispatch_suspend(timer); });
    }
}
Sounds like your problem is caused by contention for the camera. The only solution may be to use exactly the same view for Vuforia and for QR code reading, i.e. work around the contention by sharing the camera images between Vuforia and the QR code reader.
I made some changes to plan B.
I use a GCD timer to grab an image of the root view every second; the snapshot method is
- (BOOL)drawViewHierarchyInRect:(CGRect)rect afterScreenUpdates:(BOOL)afterUpdates NS_AVAILABLE_IOS(7_0);
Then CIDetector reads the QR code from that image. It works, and it looks fine.
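A minimal sketch of that polling approach (assumptions: queue is a serial background queue you own, and the one-second interval matches the description above):

// Poll the Unity/Vuforia view once per second and scan the snapshot.
dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);
dispatch_source_set_timer(timer, DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC, NSEC_PER_SEC / 10);
dispatch_source_set_event_handler(timer, ^{
    __block UIImage *snapshot = nil;
    dispatch_sync(dispatch_get_main_queue(), ^{
        UIView *view = UnityGetGLView();
        UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
        // drawViewHierarchyInRect: captures GPU-rendered content,
        // unlike renderInContext:, which returns a blank image here.
        [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
        snapshot = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    });
    if (snapshot == nil) return;
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
    NSArray *features = [detector featuresInImage:[CIImage imageWithCGImage:snapshot.CGImage]];
    for (CIQRCodeFeature *feature in features) {
        NSLog(@"Found QR code: %@", feature.messageString);
    }
});
dispatch_resume(timer);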
I don't know how many bugs are in it, but there was no other way; my boss pushed me to use plan B. He wants Vuforia to read QR codes at the same time, and quickly. So the problem is solved, at least temporarily. If you have a better idea, I'd love to hear it.
I'm trying to mirror iOS device screen via USB connection to OSX. QuickTime does this fine, and I read this article with a code example: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/
However, the callback of CMIOStreamCopyBufferQueue is never called, and I'm wondering what I am doing wrong.
Has anyone faced this issue and can provide a working example?
Thanks.
Well, eventually I did what Nadav told me in his blog: discover DAL devices and capture their output using AVCaptureSession, like this:
-(id) init {
// Allow iOS Devices Discovery
CMIOObjectPropertyAddress prop =
{ kCMIOHardwarePropertyAllowScreenCaptureDevices,
kCMIOObjectPropertyScopeGlobal,
kCMIOObjectPropertyElementMaster };
UInt32 allow = 1;
CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
&prop, 0, NULL,
sizeof(allow), &allow );
// Get devices
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
BOOL deviceAttached = false;
for (int i = 0; i < [devices count]; i++) {
AVCaptureDevice *device = devices[i];
if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
deviceAttached = true;
[self startSession:device];
break;
}
}
NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
// Device not attached - subscribe to onConnect notifications
if (!deviceAttached) {
id deviceWasConnectedObserver = [notificationCenter addObserverForName:AVCaptureDeviceWasConnectedNotification
object:nil
queue:[NSOperationQueue mainQueue]
usingBlock:^(NSNotification *note) {
AVCaptureDevice *device = note.object;
[self deviceConnected:device];
}];
observers = [[NSArray alloc] initWithObjects:deviceWasConnectedObserver, nil];
}
return self;
}
- (void) deviceConnected:(AVCaptureDevice *)device {
if ([[device uniqueID] isEqualToString:/*deviceUDID*/]) {
[self startSession:device];
}
}
- (void) startSession:(AVCaptureDevice *)device {
// Init capturing session
session = [[AVCaptureSession alloc] init];
// Start session configuration
[session beginConfiguration];
// Add session input
NSError *error;
newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (newVideoDeviceInput == nil) {
dispatch_async(dispatch_get_main_queue(), ^(void) {
NSLog(#"%#", error);
});
} else {
[session addInput:newVideoDeviceInput];
}
// Add session output
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
[session addOutput:videoDataOutput];
// Finish session configuration
[session commitConfiguration];
// Start the session
[session startRunning];
}
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];
/*
* Here you can do whatever you need with the frame (e.g. convert to JPG)
*/
}
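The imageFromSampleBuffer: helper is referenced above but never shown; here is a minimal sketch of one way to write it on macOS, assuming the BGRA format configured in startSession::

- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Wrap the raw pixel buffer in a CIImage, then bridge it to NSImage
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:pixelBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];
    return image;
}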
Hi everyone, and thanks in advance for your support.
I just started working with watchOS 2 and Objective-C, and I am trying to send a message to an iPhone when a button is tapped. I don't know if the following approach is the best one, but from the Apple docs it seems to best fit my needs, as I need to send a request to a paired iOS device and receive some user info. I also read that this only works when the app is in the foreground, which is a downside :(
Update 1
- (IBAction)didTappedButton {
WCSession *session = [WCSession defaultSession];
session.delegate = self;
[session activateSession];
if ([session isReachable] == YES) {
NSDictionary *postDictionary = @{@"request" : @"retrieveAPISessionKey"};
[self.button setBackgroundColor:[UIColor blueColor]];
[session sendMessage:postDictionary
replyHandler:^(NSDictionary<NSString *,id> * _Nonnull replyMessage) {
[self postToServer];
}
errorHandler:^(NSError * _Nonnull error) {
[self.button setBackgroundColor:[UIColor redColor]];
[self showAlertViewwithTitle:#"Oops..." andMessage:#"Something went Wrong"];
}];
}else{
[self showAlertViewwithTitle:#"Oops..." andMessage:#"Please pair with a device"];
}
}
And in the AppDelegate I implemented the following code in the .h:
@import WatchConnectivity;
@interface AppDelegate : UIResponder <UIApplicationDelegate, UIGestureRecognizerDelegate, WCSessionDelegate>
and in the .m:
- (void)session:(nonnull WCSession *)session
didReceiveMessage:(NSDictionary<NSString *,id> *)message
replyHandler:(void(^)(NSDictionary<NSString *,id> *))replyHandler {
NSString *action = message[@"request"];
NSString *actionPerformed;
// more code here...
}
Note: I am testing on the simulator; I have issues getting tap gestures, and I also see a lot of spinners on the watch simulator.
Update 2
- (void)awakeWithContext:(id)context {
[super awakeWithContext:context];
// Configure interface objects here.
WCSession *session = [WCSession defaultSession];
session.delegate = self;
[session activateSession];
}
- (IBAction)didTappedButton {
if ([[WCSession defaultSession] isReachable] == YES) {
NSDictionary *postDictionary = @{@"request" : @"retrieveAPISessionKey"};
[self.button setBackgroundColor:[UIColor blueColor]];
[[WCSession defaultSession] sendMessage:postDictionary
replyHandler:^(NSDictionary<NSString *,id> * _Nonnull replyMessage) {
[self postData];
}
errorHandler:^(NSError * _Nonnull error) {
[self.button setBackgroundColor:[UIColor redColor]];
[self showAlertViewwithTitle:#"Oops..." andMessage:#"Something went Wrong"];
}];
}else{
[self showAlertViewwithTitle:#"Oops..." andMessage:#"Please pair with a device"];
}
}
- (void)postData{
//post stress signal to server
}
- (void)showAlertViewwithTitle:(NSString *)title andMessage:(NSString *)message{
WKAlertAction *act = [WKAlertAction actionWithTitle:@"OK" style:WKAlertActionStyleCancel handler:^(void){}];
NSArray *actions = @[act];
[self presentAlertControllerWithTitle:title message:message preferredStyle:WKAlertControllerStyleAlert actions:actions];
}
So, I just succeeded in sending the message and pairing the devices, but now I don't receive the sent dictionary in the iOS app.
You need to activate your WCSession in both the WatchKit App Extension and in the iPhone application. Based on the code you've shown us, it does not appear you are activating it in your iPhone application.
Add the following to your iPhone application:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
if ([WCSession isSupported]) {
WCSession* session = [WCSession defaultSession];
session.delegate = self;
[session activateSession];
}
...
You should then be able to receive the message via your existing session:didReceiveMessage:replyHandler: method.
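For completeness, a minimal sketch of that receiving method on the iPhone side; the dictionary keys mirror the ones used above, and the session-key value is a placeholder:

- (void)session:(nonnull WCSession *)session
didReceiveMessage:(NSDictionary<NSString *, id> *)message
   replyHandler:(void (^)(NSDictionary<NSString *, id> *))replyHandler {
    if ([message[@"request"] isEqualToString:@"retrieveAPISessionKey"]) {
        // Fetch the key however your app stores it; this literal is a placeholder.
        replyHandler(@{@"sessionKey" : @"YOUR_API_SESSION_KEY"});
    } else {
        // Always call the reply handler, or the watch's errorHandler will fire.
        replyHandler(@{});
    }
}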
I'm using WatchConnectivity to send a simple dictionary from an iPhone to Apple Watch.
On the Apple Watch side, to get around the fact that contexts may not be queued when the app is opened, the last received data is saved to UserDefaults and retrieved if there is nothing in the queue when setting up my WatchKit table. I implemented this in another WatchKit app and everything worked reasonably well, but in this one the data is never received by the Watch.
I've only tried it in the simulator, because on my physical Watch the app spins forever and never loads (the loading screen looks like a watchOS 1 screen?). The WatchConnectivity framework is included in each product (extension and iPhone app). Thanks for your help.
Here's the iPhone code (implemented in a ViewController):
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
if ([WCSession isSupported]) {
WCSession *session = [WCSession defaultSession];
session.delegate = self;
[session activateSession];
}
}
- (void)viewDidAppear:(BOOL)animated {
NSDictionary *toPass = [[NSDictionary alloc] initWithObjectsAndKeys:AppDelegate.profiles, @"profiles", nil];
[[WCSession defaultSession] updateApplicationContext:toPass error:nil];
NSLog(@"sent data");
[self.tableView reloadData];
}
And the Apple Watch Code:
- (void)awakeWithContext:(id)context {
[super awakeWithContext:context];
// Configure interface objects here.
self.profiles = [[NSMutableArray alloc] init];
if ([WCSession isSupported]) {
WCSession *session = [WCSession defaultSession];
session.delegate = self;
[session activateSession];
}
[self setupTable];
}
- (void)session:(WCSession *)session didReceiveApplicationContext:(NSDictionary<NSString *,id> *)applicationContext {
    // Context arrays arrive immutable, so take a mutable copy
    self.profiles = [[applicationContext objectForKey:@"profiles"] mutableCopy];
    NSData *arrayData = [NSKeyedArchiver archivedDataWithRootObject:self.profiles];
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    [defaults setObject:arrayData forKey:@"bookmarks"];
    [self setupTable];
    NSLog(@"new");
}
- (void)setupTable {
...
After some setup code
if (self.profiles.count == 0) {
NSLog(#"Nothing in the queue, checking defaults");
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
NSData *got = [defaults objectForKey:#"bookmarks"];
NSLog(#"Got Defaults!");
self.profiles = [[NSMutableArray alloc] init];
self.profiles = [NSKeyedUnarchiver unarchiveObjectWithData:got];
}
...
More setup code later
}
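(An aside on the workaround above: WCSession itself caches the most recent context in its receivedApplicationContext property, so the UserDefaults mirror may be redundant. A minimal sketch, assuming the session has already been activated:)

// The framework keeps the last application context it received.
NSDictionary *context = [[WCSession defaultSession] receivedApplicationContext];
if (context[@"profiles"] != nil) {
    self.profiles = [context[@"profiles"] mutableCopy];
}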
Change this line:
[[WCSession defaultSession] updateApplicationContext:toPass error:nil];
To be:
NSError *error = nil;
if (![[WCSession defaultSession] updateApplicationContext:toPass error:&error]) {
    NSLog(@"error: %@", error);
}
And I bet you'll see you are getting an error returned!
Also, what type of objects does AppDelegate.profiles contain?
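(If profiles contains custom model objects, that is the likely failure: updateApplicationContext: accepts only property-list types. A minimal sketch of one workaround, archiving to NSData first; assumes the model classes conform to NSCoding:)

// Application context dictionaries may only contain property-list types
// (NSString, NSNumber, NSData, NSDate, NSArray, NSDictionary).
// Custom objects must be flattened first, e.g. with NSKeyedArchiver:
NSData *profilesData = [NSKeyedArchiver archivedDataWithRootObject:AppDelegate.profiles];

NSError *error = nil;
if (![[WCSession defaultSession] updateApplicationContext:@{@"profiles" : profilesData}
                                                    error:&error]) {
    NSLog(@"error: %@", error);
}

// Watch side: unarchive back into model objects
// self.profiles = [[NSKeyedUnarchiver unarchiveObjectWithData:applicationContext[@"profiles"]] mutableCopy];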
I use ZBarReaderViewController to scan QR codes, and it worked perfectly on iOS 6.
But when I use iOS 7 with my project, it does not work properly with ZBarReaderViewController.
The issue is memory related: it takes more than 100 MB and my device hangs.
In general, in my project the user scans a QR code image, and I have a function that checks whether the scanned code matches a string I got from the server; if YES, I go to the next view controller, otherwise I remain in the current (continue QR scan) screen.
If the QR code matches my string, the next screen has a "cancel" button that lets me scan another code (i.e. I go back to the previous (QR scan) view controller).
Each time I go to the next view controller and back to the previous (QR scan) screen, ZBarReaderViewController is allocated again, so (maybe) that is where the memory issue comes from.
But I wrote this code:
if (self.ZBarReaderVC)
{
    for (UIView *subview in self.ZBarReaderVC.cameraOverlayView.subviews)
        [subview removeFromSuperview];
    for (UIView *subview in self.ZBarReaderVC.view.subviews)
        [subview removeFromSuperview];
    [self.ZBarReaderVC removeFromParentViewController];
    self.ZBarReaderVC = nil;
}
I run this after [self.ZBarReaderVC dismissModalViewControllerAnimated:YES]; to tear down the ZBarReaderViewController at the end, so why do I get a freshly allocated ZBarReaderViewController each time?
I also put [self.ZBarReaderVC.readerView stop]; before dismissing the ZBarReaderViewController to stop the reader's scanning stream,
but that did not work for me either.
I have spent hours trying to solve this problem, but I am not able to.
Please help me.
Also, I found similar problems:
Zbar SDK and ios7/xcode 5 - App is reaching 100% cpu use and memory more than 100MB
http://sourceforge.net/p/zbar/discussion/1072195/thread/df4c215a/
But no one could help me.
I found that on iOS 7 the issue occurs at
self.ZBarReaderVC.view.frame = self.view.bounds;
I put a breakpoint here, and whenever I come back from the previous view controller, this line takes more time and more memory (the issue).
So first I need to remove the view of self.ZBarReaderVC along with all its subviews, which means I first need to write:
if (self.ZBarReaderVC) // first check whether self.ZBarReaderVC exists
{
    [self.ZBarReaderVC.readerView stop]; // then stop the continuous scanning stream of self.ZBarReaderVC
    for (UIView *subview in self.ZBarReaderVC.view.subviews) // remove all subviews
        [subview removeFromSuperview];
    [self.ZBarReaderVC.view removeFromSuperview];
    self.ZBarReaderVC.view = nil;
}
I also found that on iOS 7 self.ZBarReaderVC keeps its QR code scanning stream running, so every time your QR code scanning is done and you need to dismiss self.ZBarReaderVC, first stop scanning with [self.ZBarReaderVC.readerView stop];.
Also, some users write/call (to implement some extra features)
[self.ZBarReaderVC viewDidLoad];
[self.ZBarReaderVC viewWillAppear:NO];
[self.ZBarReaderVC viewDidAppear:NO];
These methods of self.ZBarReaderVC are not needed on iOS 7, so if you call them, please comment them out.
I hope this suggestion is helpful for others.
Thanks :)
If you are targeting iOS 7 only: I ditched the ZBar component and used the native AVFoundation approach, making the view controller an AVCaptureMetadataOutputObjectsDelegate. It works perfectly with 3% CPU usage:
viewcontroller.h:
#interface viewcontroller : UIViewController <AVCaptureMetadataOutputObjectsDelegate> {
AVCaptureSession *_session;
AVCaptureDevice *_device;
AVCaptureDeviceInput *_input;
AVCaptureMetadataOutput *_output;
AVCaptureVideoPreviewLayer *_prevLayer;
UIView *_highlightView;
}
viewcontroller.m
- (IBAction)btnScan:(id)sender {
_session = [[AVCaptureSession alloc] init];
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
_input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
if (_input) {
[_session addInput:_input];
} else {
NSLog(#"Error: %#", error);
}
_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];
_output.metadataObjectTypes = [_output availableMetadataObjectTypes];
_prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_prevLayer.frame = self.view.bounds;
_prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:_prevLayer];
[_session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
fromConnection:(AVCaptureConnection *)connection {
AVMetadataMachineReadableCodeObject *barCodeObject;
NSString *detectionString = nil;
NSArray *barCodeTypes = @[AVMetadataObjectTypeCode39Code, AVMetadataObjectTypeCode128Code,
AVMetadataObjectTypeQRCode];
for (AVMetadataObject *metadata in metadataObjects) {
for (NSString *type in barCodeTypes) {
if ([metadata.type isEqualToString:type]) {
barCodeObject = (AVMetadataMachineReadableCodeObject *)
[_prevLayer transformedMetadataObjectForMetadataObject:
(AVMetadataMachineReadableCodeObject *)metadata];
detectionString = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
break;
}
}
if (detectionString != nil) {
NSLog(#"%#", detectionString);
[self buscarCarga:detectionString]; //Do whatever you want with the data
[_session stopRunning];
AVCaptureInput* input = [_session.inputs objectAtIndex:0];
[_session removeInput:input];
AVCaptureMetadataOutput *output = (AVCaptureMetadataOutput *)[_session.outputs objectAtIndex:0];
[_session removeOutput:output];
[_prevLayer removeFromSuperlayer];
}
else
NSLog(#"No data");
}
}
- (void)viewDidLoad
{
[super viewDidLoad];
self.ZBarReaderVC = [ZBarReaderViewController new];
self.ZBarReaderVC.readerDelegate=self;
self.ZBarReaderVC.supportedOrientationsMask = ZBarOrientationMaskAll;
ZBarImageScanner *scanner = self.ZBarReaderVC.scanner;
[scanner setSymbology: ZBAR_I25 config: ZBAR_CFG_ENABLE to: 0];
}
#pragma mark - Button click method
- (IBAction)startScanning:(id)sender {
NSLog(#"Scanning..");
resultTextView.text = #"Scanning..";
[self presentViewController:self.ZBarReaderVC animated:YES completion:nil];
}