ZXing iOS barcode scanning: ZXCapture custom camera size - ios

I would like to use the ZXing iOS framework to bring barcode scanning to my iOS application.
After downloading the Github project (https://github.com/TheLevelUp/ZXingObjC) I played around with the iOS demo project.
In the demo project the camera preview is fullscreen; for my needs I would like to adjust the size of the camera preview.
ZXing uses the class 'ZXCapture' for live barcode scanning.
The demo app has the following viewDidLoad function:
- (void)viewDidLoad {
    [super viewDidLoad];

    self.capture = [[ZXCapture alloc] init];
    self.capture.camera = self.capture.back;
    self.capture.focusMode = AVCaptureFocusModeContinuousAutoFocus;

    // here I tried to change the size of the camera
    self.capture.layer.frame = CGRectMake(0, 0, 200, 200);
    [self.view.layer addSublayer:self.capture.layer];

    [self.view bringSubviewToFront:self.scanRectView];
    [self.view bringSubviewToFront:self.decodedLabel];
}
As you can see, I added a line of code to change the size of the capture frame.
Without success... (I also tried adding the capture layer to a custom-sized UIView - also without success - the camera preview still has a fixed size.)
Has anybody ever had the same problem or use case with ZXing for iOS?
Or does anybody have an idea how to do what I want?
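A minimal sketch of one thing to try, assuming ZXCapture keeps exposing its preview through capture.layer as in the demo (the -viewDidLayoutSubviews override below is an illustration, not part of the demo project): re-apply the custom frame after layout, since the capture layer's frame can get reset once the session is running.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // Re-apply the custom preview size after Auto Layout has run.
    self.capture.layer.frame = CGRectMake(0, 0, 200, 200);
}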

Related

Draw overlay on camera iOS

Users need to be able to take a photo of their ID. I need to add a blue frame to the camera view as a guide. The guide should have the same aspect ratio on all device sizes and fit a label with instructions. Can I accomplish this using UIImagePicker?
Here is some incomplete code. Thanks for any help.
UIImageView *overlayImage = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@""]];
CGRect overlayRect = CGRectMake(0, 0, self.view.frame.size.height - 16, self.view.frame.size.width - 16);
[overlayImage setFrame:overlayRect];
[self.imagePicker setCameraOverlayView:overlayImage];
Use AVCaptureDevice, AVCaptureSession, AVCaptureVideoPreviewLayer and AVCapturePhotoOutput.
Set an AVCaptureDeviceInput as the input of the capture session, and the photo output as its output. Initialize AVCaptureVideoPreviewLayer with the AVCaptureSession and add it to your view.layer through addSublayer. You can use a UIViewController from a storyboard or a programmatically instantiated controller, and add an overlay image, or a cornered guide via view.layer.borderWidth or a UIBezierPath. Set up the controller as an AVCapturePhotoCaptureDelegate, add the delegate methods, and use the capturePhoto(with:delegate:) method. Enjoy.
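A minimal Objective-C sketch of that pipeline, assuming ARC and a full-screen view controller; the method name setUpCameraWithOverlay is illustrative, the blue guide is just a bordered UIView inset 16 points, and in a real controller the session and photo output should live in strong properties:
#import <AVFoundation/AVFoundation.h>

- (void)setUpCameraWithOverlay {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Camera input.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [session canAddInput:input]) {
        [session addInput:input];
    }

    // Photo output for taking the still picture later.
    AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
    if ([session canAddOutput:photoOutput]) {
        [session addOutput:photoOutput];
    }

    // Preview layer fills the view.
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];

    // Blue guide frame: a plain bordered view on top of the preview.
    UIView *guide = [[UIView alloc] initWithFrame:CGRectInset(self.view.bounds, 16.0, 16.0)];
    guide.backgroundColor = [UIColor clearColor];
    guide.layer.borderColor = [UIColor blueColor].CGColor;
    guide.layer.borderWidth = 2.0;
    [self.view addSubview:guide];

    [session startRunning];

    // Later, with the controller adopting AVCapturePhotoCaptureDelegate:
    // [photoOutput capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings] delegate:self];
}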

Is Apple using black magic to accomplish camera's preview orientation?

I have this AVFoundation camera app of mine. The camera preview is the result of a filter, applied in the didOutputSampleBuffer method.
When I set up the camera I follow what Apple did in one of their sample projects (CIFunHouse):
// setting up the camera
CGRect bounds = [self.containerOpenGL bounds];

_eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
_videoPreviewView = [[GLKView alloc] initWithFrame:bounds
                                           context:_eaglContext];
[self.containerOpenGL addSubview:_videoPreviewView];
[self.containerOpenGL sendSubviewToBack:_videoPreviewView];

id<MTLDevice> device = MTLCreateSystemDefaultDevice();
NSDictionary *options = @{kCIContextUseSoftwareRenderer : @(NO),
                          kCIContextPriorityRequestLow : @(YES),
                          kCIContextWorkingColorSpace : [NSNull null]};
_ciContext = [CIContext contextWithEAGLContext:_eaglContext options:options];

[_videoPreviewView bindDrawable];

_videoPreviewViewBounds = CGRectZero;
_videoPreviewViewBounds.size.width = _videoPreviewView.drawableWidth;
_videoPreviewViewBounds.size.height = _videoPreviewView.drawableHeight;

dispatch_async(dispatch_get_main_queue(), ^(void) {
    CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI);
    _videoPreviewView.transform = transform;
    _videoPreviewView.frame = bounds;
});
self.containerOpenGL is a full screen view and is constrained to the four corners of the screen. Autorotation is on.
But this is the problem...
When I set up the GLKView and self.ciContext, they are created assuming the device is in a particular orientation. If the device is in that orientation when I run the application, the previewView fits the entire self.containerOpenGL area, but when I rotate the device the previewView ends up off center.
I see that Apple's code works perfectly and they don't use any constraints. They do not use any autorotation method, no didLayoutSubviews, and when you rotate the device running their code, everything rotates except the preview view. And worse than that, my previewView appears to rotate but theirs does not.
Is this black magic? How do they do that?
They add their preview view to a UIWindow and that is why it does not rotate. I hope this answers the question. If not, I will continue to look through their source code.
Quote from source code.
we make our video preview view a subview of the window, and send it to the back; this makes FHViewController's view (and its UI elements) on top of the video preview, and also makes video preview unaffected by device rotation
They also add this
_videoPreviewView.enableSetNeedsDisplay = NO;
This may keep it from responding as well
Edit: It appears that now the preview rotates and the UI does as well. To combat this you can add a second window, send it to the back, make the main window clear, and add the previewView to the second window with a dummy view controller that disables autorotation by overriding the appropriate method. This allows the preview to stay fixed while the UI rotates.
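A rough sketch of that second-window idea, assuming ARC; videoPreviewView and mainWindow stand in for your existing preview view and main window, and the class name NonRotatingViewController is illustrative:
@interface NonRotatingViewController : UIViewController
@end

@implementation NonRotatingViewController
- (BOOL)shouldAutorotate {
    return NO; // keep this window's contents fixed while the main UI rotates
}
@end

// During setup (keep a strong reference to the extra window, e.g. in a property):
UIWindow *previewWindow = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
previewWindow.rootViewController = [[NonRotatingViewController alloc] init];
previewWindow.windowLevel = UIWindowLevelNormal - 1; // behind the main window
[previewWindow.rootViewController.view addSubview:videoPreviewView];
previewWindow.hidden = NO;

// The main window must be clear so the preview shows through.
mainWindow.backgroundColor = [UIColor clearColor];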

iOS YTPlayerView force video quality

I am currently using the iOS-youtube-player-helper library in our application. There is a view controller with a YTPlayerView that has an aspect ratio of 16:9, which means it takes up only part of the screen. The video loads in medium quality and, no matter what I tried, I could not get it to play in 720p or 1080p. I am certain these qualities are available; it's just the YTPlayerView forcing the quality based on the video player height. Because this is a library and not a direct iframe embed, I cannot use the "vq" parameter (specifying vq in playerVars does not seem to work), and setting the quality to small and then changing it later does not work either (refer to this issue on GitHub).
Given that I cannot make the YTPlayerView fill the whole screen because of UI design constraints, is it possible to force the YTPlayerView to play in at least 720p? (Workarounds, changing the library code, ...)
Because this is an app that will be on the App Store (and of course we don't want any legal disputes with Google either), please don't suggest libraries that are against the YouTube Terms of Service, such as XCDYouTubeKit.
Many thanks
I've found a workaround and it works well for me.
First of all, the problem depends on the size of the web view created inside the YTPlayerView. For example, if you have a 320x200 player view, forcing your video to 720hd doesn't work, because the YouTube iframe player class switches back to a resolution that matches your player size (in this case small quality, because you have 320x200).
You can see this SO answer that explains the issue.
When you have imported the YTPlayerView class into your project you have two files: YTPlayerView.h and YTPlayerView.m
YTPlayerView.m (updated to work also on iPads)
I've changed the function where the web view is initialized so that it uses a custom size (4K resolution), and at the end I've added the possibility to scale the contents back down to the original frame, like this:
- (UIWebView *)createNewWebView {
    CGRect frame = CGRectMake(0.0, 0.0, 4096.0, 2160.0); // 4K resolution
    UIWebView *webView = [[UIWebView alloc] initWithFrame:frame];
    //UIWebView *webView = [[UIWebView alloc] initWithFrame:self.bounds];
    webView.autoresizingMask = (UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight);
    webView.scrollView.scrollEnabled = NO;
    webView.scrollView.bounces = NO;

    if ([self.delegate respondsToSelector:@selector(playerViewPreferredWebViewBackgroundColor:)]) {
        webView.backgroundColor = [self.delegate playerViewPreferredWebViewBackgroundColor:self];
        if (webView.backgroundColor == [UIColor clearColor]) {
            webView.opaque = NO;
        }
    }

    webView.scalesPageToFit = YES;

    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        CGSize contentSize = webView.scrollView.contentSize;
        CGSize viewSize = self.bounds.size;
        float scale = viewSize.width / contentSize.width;
        webView.scrollView.minimumZoomScale = scale;
        webView.scrollView.maximumZoomScale = scale;
        webView.scrollView.zoomScale = scale;
        // center webView after scaling..
        [webView setFrame:CGRectMake(0.0, self.frame.origin.y/3, 4096.0, 2160.0)];
    } else {
        webView.frame = self.bounds;
    }

    [webView reload];
    return webView;
}
Hope this helps anyone trying to use the original YouTube player in their project.
P.S.: All my tests were done in a Swift project with the following vars:
var playerVars: Dictionary =
    ["playsinline": "1",
     "autoplay": "1",
     "modestbranding": "1",
     "rel": "0",
     "controls": "0",
     "fs": "0",
     "origin": "https://www.example.com",
     "enablejsapi": "1",
     "iv_load_policy": "3",
     "showinfo": "0"]
Using the functions:
self.playerView.load(withVideoId: "Rk6_hdRtJOE", playerVars: self.playerVars)
and the following method to force the resolution:
self.playerView.loadVideo(byId: "Rk6_hdRtJOE", startSeconds: 0.0, suggestedQuality: YTPlaybackQuality.HD720)
About iPads:
As reported by Edward in the comments, there was a small problem on iPads; these devices don't seem to apply scalesPageToFit. The goal is to check whether the device is an iPad and then scale (zoom out) the scrollView so it fits the smaller view bounds.
I've tested it on my iPad Air and it works. Let me know.

Cropping/zooming not working while setting iOS Wallpaper using PhotoLibrary private framework

I have managed (with the help of this post) to open up a PLStaticWallpaperImageViewController from the PhotoLibrary private framework, which allows setting the wallpaper and lock screen directly (using the same UI as the Photos app). Unfortunately, the image cropping/zooming features don't seem to work, as touches don't appear to reach the image view (the main view is also not dismissed properly after the cancel/set buttons are tapped, but this isn't so important).
I have an Xcode project demonstrating the wallpaper setting (can be run in simulator as well as a non-jailbroken device):
https://github.com/newenglander/WallpaperTest/
The code is quite basic, and involves a ViewController inheriting from PLStaticWallpaperImageViewController and implementing an init method similar to the following:
- (id)initWithCoder:(NSCoder *)aDecoder {
    self = [self initWithUIImage:[UIImage imageWithContentsOfFile:@"/System/Library/WidgetResources/ibutton/white_i@2x.png"]];
    self.allowsEditing = YES;
    self.saveWallpaperData = YES;
    return self;
}
(It will be necessary to allow access to the photo library after the first launch, and for some reason the popup for this comes up behind the app, rather than on top.)
Perhaps someone has insight as to why the cropping/zooming isn't working, or can give me an alternative way to set the wallpaper in an app (destined for Cydia rather than the App Store of course)?
Use this sample project; it works very well.
It has built-in camera controls and a custom layout, and it crops the image when taken or after choosing from your library. I used it for my project and it is very simple to customize.
https://github.com/yuvirajsinh/YCameraView
//---------- Answer improved----------//
I took a look at your project and I see 2 problems:
Here you have 3 semantic issue warnings:
- (id)initWithUIImage:(id)arg1 cropRect:(struct CGRect { struct CGPoint { float x_1_1_1; float x_1_1_2; } x1; struct CGSize { float x_2_1_1; float x_2_1_2; } x2; })arg2;
In your ViewController.m, where are you getting the image from?
- (id)initWithCoder:(NSCoder *)aDecoder
{
    // black_i
    // what directory is this?
    self = [self initWithUIImage:[UIImage imageWithContentsOfFile:@"/System/Library/WidgetResources/ibutton/white_i@2x.png"]];
    //--------------------
    self.allowsEditing = YES;
    self.saveWallpaperData = YES;
    return self;
}
I tried removing your
- (id)initWithUIImage:(id)arg1 cropRect:(struct CGRect { struct CGPoint { float x_1_1_1; float x_1_1_2; } x1; struct CGSize { float x_2_1_1; float x_2_1_2; } x2; })arg2;
and changing the image path to:
self = [self initWithUIImage:[UIImage imageNamed:@"myImage.png"]];
and everything works well, but the image can't be cropped. With my GitHub project YCameraView you first have to understand how its cropping function works if you want to use cropping. Or, more simply, you can create a fullscreen camera picker that lets the user take a photo or choose from the library, allow editing in the picker, and then load the new picture into your view like this:
self = [self initWithUIImage:imageSelected.image]; // pass the selected image directly (imageNamed: expects a file name, not a UIImage)
As for dismissing the view: you can't, because this is a full app that lets the user set up the background wallpaper, and you can't terminate the app to get back to SpringBoard. You have to create a first view > picker > detail view with settings for Home and Lock Screen > then dismiss and come back to the first view.
P.S.: I think that to enable editing directly in a view in your project, you have to extend your code with pinch and pan gestures on the UIView, for example as sketched below.
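A hedged sketch of that pinch/pan suggestion, assuming ARC and a self.imageView property holding the wallpaper candidate (the method names are illustrative):
- (void)enableCropGestures {
    self.imageView.userInteractionEnabled = YES;
    [self.imageView addGestureRecognizer:
        [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)]];
    [self.imageView addGestureRecognizer:
        [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)]];
}

- (void)handlePinch:(UIPinchGestureRecognizer *)gesture {
    // Apply the scale incrementally, then reset so the next callback starts from 1.
    gesture.view.transform = CGAffineTransformScale(gesture.view.transform, gesture.scale, gesture.scale);
    gesture.scale = 1.0;
}

- (void)handlePan:(UIPanGestureRecognizer *)gesture {
    // Move the view by the pan delta, then reset the translation.
    CGPoint translation = [gesture translationInView:gesture.view.superview];
    gesture.view.center = CGPointMake(gesture.view.center.x + translation.x,
                                      gesture.view.center.y + translation.y);
    [gesture setTranslation:CGPointZero inView:gesture.view.superview];
}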
Hope this helps you!

IOS5 - How To play videos within a programatically created View?

I'm creating a game for iPad using OpenGL. I have created a view (with a simple background) programmatically using the following code:
- (void)addNewView {
    UIWindow *window = [UIApplication sharedApplication].keyWindow;
    UIView *polygonView = [[UIView alloc] initWithFrame:CGRectMake(60, 0, 900, 900)];
    polygonView.backgroundColor = [UIColor blueColor];
    // here play a movie :-)
    [window addSubview:polygonView];
    [polygonView release];
}
and I'm displaying it using the following code within a case switch in a window:
[self addNewView];
This all works well.
However, I need help to implement a video feature in this view, when it is displayed.
Can anyone provide me with assistance in the form of code or a link to a relevant tutorial that can aid me with this?
Take a look at:
MPMoviePlayerController Class Reference
and
MPMoviePlayerViewController Class Reference
That should get you started.
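A minimal sketch of embedding playback in that view with MPMoviePlayerController (the framework current on iOS 5); movieURL is a placeholder for your video's file or stream URL, and the controller must be kept alive (e.g. in a retained property) or playback stops:
#import <MediaPlayer/MediaPlayer.h>

// Inside addNewView, after creating polygonView:
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
moviePlayer.controlStyle = MPMovieControlStyleEmbedded;
moviePlayer.view.frame = polygonView.bounds;
[polygonView addSubview:moviePlayer.view];
[moviePlayer prepareToPlay];
[moviePlayer play];
// Store moviePlayer in a retained ivar/property instead of releasing it here.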
