iPhone back camera cannot focus correctly - iOS

I've been building an iOS camera app and have been stuck on this problem for two days.
What I'm working on is changing the focus and exposure automatically depending on where the user taps. Sometimes it works fine (maybe about 20% of the time), but mostly it fails, especially when I try to focus on a far object (5+ meters away) or when there are two objects and I try to switch focus from one to the other. The image below is an example.
The yellow square shows where the user tapped. Even though I tapped the black cup in the first picture, the camera still focuses on the red cup.
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touchPoint = touches.first else { return }
    let focusPoint = touchPoint.location(in: lfView)
    print("focusPoint \(focusPoint)")
    showPointOfInterestViewAtPoint(point: focusPoint)
    setFocus(focusMode: .autoFocus, exposureMode: .autoExpose, atPoint: focusPoint, shouldMonitorSubjectAreaChange: true)
}
func setFocus(focusMode: AVCaptureDevice.FocusMode, exposureMode: AVCaptureDevice.ExposureMode, atPoint devicePoint: CGPoint, shouldMonitorSubjectAreaChange: Bool) {
    guard let captureDevice = captureDevice else { return }
    do {
        try captureDevice.lockForConfiguration()
    } catch {
        return
    }
    if captureDevice.isFocusPointOfInterestSupported, captureDevice.isFocusModeSupported(focusMode) {
        captureDevice.focusPointOfInterest = devicePoint
        captureDevice.focusMode = focusMode
        print("devicePoint: \(devicePoint)")
    }
    // other code in here...
    captureDevice.isSubjectAreaChangeMonitoringEnabled = shouldMonitorSubjectAreaChange
    captureDevice.unlockForConfiguration()
}
I call setFocus from touchesBegan, and both the focusPoint and devicePoint print statements show the same coordinate, e.g. (297.5, 88.0).
When I tap the black cup in the picture, I can see the camera zooming in and out a little, just like the default iPhone camera app does when it focuses on an object. So I guess my app is trying to focus on the black cup but failing.
Since no error is thrown, I'm not sure which code to change. Is there any clue as to what is going on here and what causes this problem?
Added this part later...
I also read this document, and it says:
This property’s CGPoint value uses a coordinate system where {0,0} is the top-left of the picture area and {1,1} is the bottom-right.
As I wrote above, the value of devicePoint is far greater than 1, like (297.5, 88.0). Does this cause the problem?

Thanks to @Artem I was able to solve the problem. All I needed to do was convert the absolute view coordinates into the normalized coordinate space used by focusPointOfInterest (from {0,0} at the top-left to {1,1} at the bottom-right).
Thank you, Artem!!
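For anyone hitting the same wall, here is a minimal sketch of that conversion, assuming the preview is backed by an AVCaptureVideoPreviewLayer (the previewLayer property name is an assumption):

import AVFoundation

// Map the tap from layer coordinates into the normalized {0,0}-{1,1}
// space that focusPointOfInterest expects; the preview layer accounts
// for videoGravity and orientation when converting.
let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: focusPoint)
setFocus(focusMode: .autoFocus, exposureMode: .autoExpose, atPoint: devicePoint, shouldMonitorSubjectAreaChange: true)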

Related

How to handle video overexposure in Swift

I'm working on a camera app, and the way my app handles overexposure seems very different from the iPhone's default camera app.
As in the image below, the default camera app compensates when it detects overexposure. (The whole screen gets slightly yellow-ish to tame the blown-out bright area, so I can still see the white keyboard even when dark stuff covers most of the screen.)
Here is my app: I set the exposure mode to continuous auto exposure, but it won't adjust the overexposed area.
I want to adjust the brightness so the displayed image doesn't include the overexposed part (in short, I just want my app to behave like the default camera does).
This is the code for adjusting the focus and exposure:
func setFocus(with focusMode: AVCaptureDevice.FocusMode, with exposureMode: AVCaptureDevice.ExposureMode, at point: CGPoint, monitorSubjectAreaChange: Bool, completion: @escaping (Bool) -> Void) {
    guard let captureDevice = captureDevice else {
        completion(false)
        return
    }
    do {
        try captureDevice.lockForConfiguration()
    } catch {
        completion(false)
        return
    }
    if captureDevice.isSmoothAutoFocusSupported, !captureDevice.isSmoothAutoFocusEnabled {
        captureDevice.isSmoothAutoFocusEnabled = true
    }
    if captureDevice.isFocusPointOfInterestSupported, captureDevice.isFocusModeSupported(focusMode) {
        captureDevice.focusPointOfInterest = point
        captureDevice.focusMode = focusMode
    }
    if captureDevice.isExposurePointOfInterestSupported, captureDevice.isExposureModeSupported(exposureMode) {
        captureDevice.exposurePointOfInterest = point
        captureDevice.exposureMode = exposureMode
    }
    captureDevice.isSubjectAreaChangeMonitoringEnabled = monitorSubjectAreaChange
    captureDevice.unlockForConfiguration()
    completion(true)
}
and this is how I call the function
func setFocusToCenter() {
    let center: CGPoint = CGPoint(x: cameraView.bounds.width / 2, y: cameraView.bounds.height / 2)
    let pointInCamera = cameraView.layer.captureDevicePointConverted(fromLayerPoint: center)
    setFocus(with: .continuousAutoFocus, with: .continuousAutoExposure, at: pointInCamera, monitorSubjectAreaChange: false, completion: { [weak self] success in
        guard let self = self, success else { return }
        // do some animation
    })
}
Even with the exposure mode set to continuous auto exposure, do I still need to handle overexposure in code?
Also, if you have experience adjusting overexposure, how did you achieve it?
Added this part later...
I took screenshots to compare my app's camera with the native iPhone camera app.
Here is my camera app with .continuousAutoExposure and the exposurePointOfInterest set to the center of the screen.
However, the native iPhone camera app doesn't overexpose when I shoot a dark subject from a similar distance...
I think the native app also stays in .continuousAutoExposure mode until you touch the screen to set focus to a point.
I reduced the image quality in order to paste the screenshots into this post, but I don't really see the blur on the originals. I configured the fps to 30 (the native iPhone camera also runs at 30).
So what could be the reason for this overexposure...?
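One way to tackle it (a sketch of one option, not necessarily what the native app does): stay in .continuousAutoExposure and apply a negative exposure target bias when the scene blows out. setExposureTargetBias(_:completionHandler:) works alongside the auto exposure modes; the -1.0 EV default here is an assumption to tune, not a known-good value.

import AVFoundation

func reduceExposure(on device: AVCaptureDevice, byEV bias: Float = -1.0) {
    do {
        try device.lockForConfiguration()
        // Clamp the bias to the range this device actually supports.
        let clamped = max(device.minExposureTargetBias, min(bias, device.maxExposureTargetBias))
        device.setExposureTargetBias(clamped) { _ in }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}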

Multiple touch events in a single image view

I would like to handle touch events in multiple regions of a single image view. For example, with an image of a map of India, I should be able to capture touch events for the different states. Can someone give me an idea of how to implement this in Objective-C?
Well, quite an interesting question. It can be solved in many ways.
Solution 1:
Get an image with the states in distinct colors. Then you can read the color at the touch point and compare it with each state's color.
So if the color at a point is (240, 125, 131) in 0-255 RGB, the state is Maharashtra.
Note that in the picture above some states share the same color, so that exact image would not work.
To get the color at a pixel you can refer to this link.
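For reference, a sketch of reading a pixel's color in Swift, which draws the image so the requested point lands on a 1x1 bitmap context (the extension name is an assumption):

import UIKit

extension UIImage {
    // Returns the RGBA color at `point`, given in the image's own
    // top-left-origin coordinate space (points, not pixels).
    func pixelColor(at point: CGPoint) -> UIColor? {
        var pixel: [UInt8] = [0, 0, 0, 0]
        guard let context = CGContext(data: &pixel, width: 1, height: 1,
                                      bitsPerComponent: 8, bytesPerRow: 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }
        // Shift the drawing so the requested point falls on the 1x1 context.
        context.translateBy(x: -point.x, y: -point.y)
        UIGraphicsPushContext(context)
        draw(at: .zero) // UIImage.draw uses a top-left origin, so no flip is needed
        UIGraphicsPopContext()
        return UIColor(red: CGFloat(pixel[0]) / 255,
                       green: CGFloat(pixel[1]) / 255,
                       blue: CGFloat(pixel[2]) / 255,
                       alpha: CGFloat(pixel[3]) / 255)
    }
}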
Solution 2 (coded in Swift, but it works just as well with Objective-C; a tested Obj-C version is linked below):
Get the images of the individual states.
Make a view a subclass of StateView in IB.
Set the image and the state name in IB.
If the touch is within the state, it prints the state name (for demo purposes, for now).
Link to the project.
References:
Mark Moeykens on YouTube
Hope this helps.
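A sketch of what such a StateView might look like, reusing the pixelColor(at:) helper above (the class and property names here are assumptions): it counts only the opaque pixels of its state image as "inside", so taps on transparent areas fall through to the state views stacked underneath.

import UIKit

class StateView: UIImageView {
    @IBInspectable var stateName: String = ""

    // Only treat a point as inside when it hits a visible pixel.
    // Remember to enable isUserInteractionEnabled on the image views.
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        guard let image = image, bounds.width > 0, bounds.height > 0 else { return false }
        // Map the view point into the image's coordinate space.
        let imagePoint = CGPoint(x: point.x * image.size.width / bounds.width,
                                 y: point.y * image.size.height / bounds.height)
        guard let color = image.pixelColor(at: imagePoint) else { return false }
        var alpha: CGFloat = 0
        color.getRed(nil, green: nil, blue: nil, alpha: &alpha)
        return alpha > 0.1
    }
}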
Yes, I did this in one of my applications. You need to keep track of the paths for the states; then, on tap, find the path in which the point lies.
In my case I divided the image into a set of rectangles, noting down the first and last point (CGPoint) and saving each rectangle's width and height. Based on this data, I then checked which region a point lies in.
How do you find a state's area?
You manually tap along the boundary of the area and collect the CGPoints from the default touch methods:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let point = touches.first?.location(in: self) else { return }
    print(point)
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    touches.forEach { touch in
        print(touch.location(in: self))
    }
}
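Once the boundary points are collected into paths, the lookup itself is short. A sketch, assuming one UIBezierPath per state (the statePaths dictionary is a hypothetical name):

import UIKit

// One traced outline per state, e.g. built from the collected CGPoints.
var statePaths: [String: UIBezierPath] = [:]

// UIBezierPath.contains(_:) performs the point-in-polygon test.
func state(at point: CGPoint) -> String? {
    return statePaths.first { $0.value.contains(point) }?.key
}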

ARKit - Object stuck to camera after tap on screen

I started out with the template project you get when you choose an ARKit project. When you run the app, you can see the ship and view it from any angle.
However, once I allow camera control and tap on the screen or zoom into the ship by panning, the ship gets stuck to the camera. Wherever I move the camera, the ship stays stuck to the screen.
I went through the Apple guide, and it seems they don't consider this unexpected behavior, as there is nothing about it there.
How do I keep the position of the ship fixed after I zoom or touch the screen?
Well, it looks like allowsCameraControl is not the answer at all. It's good for SceneKit but not for ARKit (maybe it's good for something in AR, but I'm not aware of it yet).
In order to zoom into the view, a UIPinchGestureRecognizer is required.
// 1. Find the touch location
// 2. Perform a hit test
// 3. From the results, take the first result
// 4. Take the node from that first result and change its scale
@objc private func handlePinch(recognizer: UIPinchGestureRecognizer) {
    if recognizer.state == .changed {
        // 1.
        let location = recognizer.location(in: sceneView)
        // 2.
        let hitTestResults = sceneView.hitTest(location, options: nil)
        // 3.
        if let hitTest = hitTestResults.first {
            let shipNode = hitTest.node
            let newScaleX = Float(recognizer.scale) * shipNode.scale.x
            let newScaleY = Float(recognizer.scale) * shipNode.scale.y
            let newScaleZ = Float(recognizer.scale) * shipNode.scale.z
            // 4.
            shipNode.scale = SCNVector3(newScaleX, newScaleY, newScaleZ)
            recognizer.scale = 1
        }
    }
}
Regarding #2: I got a little confused with another hit-test method, hitTest(_:types:). Note from the documentation:
This method searches for AR anchors and real-world objects detected by the AR session, not SceneKit content displayed in the view. To search for SceneKit objects, use the view's hitTest(_:options:) method instead.
So that method cannot be used if you want to scale a node that is SceneKit content.
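For completeness, a sketch of wiring up the recognizer, assuming the handler above lives in the same view controller as sceneView:

override func viewDidLoad() {
    super.viewDidLoad()
    // Route pinches to the handler defined above.
    let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(recognizer:)))
    sceneView.addGestureRecognizer(pinch)
}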

Draw straight line in Swift with Sprite Kit, creating multiple lines

I'm trying to make an iOS game using Swift and Sprite Kit. I have the following bit of code, but it creates multiple lines. What I want is a single line whose end follows the tip of the user's finger as it drags across the screen, and which stays in place once the finger lifts. I found a similar question on Stack Overflow, "Drawing straight line with spritekit and UITouch creates multiple lines", but it is written in Objective-C, and to be honest I didn't entirely understand the answer. I'm using Xcode 7 and Swift 2.
import SpriteKit

class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
    }

    let path = CGPathCreateMutable()

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        let touch = touches.first
        let position1 = touch!.locationInNode(self)
        CGPathMoveToPoint(path, nil, position1.x, position1.y)
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
        let touch = touches.first
        let position2 = touch!.locationInNode(self)
        CGPathAddLineToPoint(path, nil, position2.x, position2.y)
        CGPathCloseSubpath(path)
        let line = SKShapeNode()
        line.path = path
        line.strokeColor = UIColor.blackColor()
        line.lineWidth = 5
        self.addChild(line)
    }

    override func update(currentTime: CFTimeInterval) {
        /* Called before each frame is rendered */
    }
}
Any thoughts or suggestions would be greatly appreciated. Thanks.
Ryan
I may be wrong because I don't know too much about game development with Swift, but it appears you are creating a line every time you move your finger. Have you tried creating the line in touchesEnded?
Have you tried your code without the line
    CGPathCloseSubpath(path)
According to the documentation, this is what it does:
Appends a line from the current point to the starting point of the current subpath and ends the subpath.
I don't think you want an additional line from your last touch back to the start of the line with every new touch. Closing a CGPath is only needed if you want a line that ends where it started.
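To get a single line that follows the finger, one approach is to reuse one SKShapeNode and rebuild its path on each move instead of adding a new node per move. A sketch in current Swift syntax (the question uses Swift 2, so the API names differ):

import SpriteKit

class GameScene: SKScene {
    // One node reused for the whole drag, so only a single line ever exists.
    private var lineNode: SKShapeNode?
    private var startPoint = CGPoint.zero

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        startPoint = touch.location(in: self)
        let node = SKShapeNode()
        node.strokeColor = .black
        node.lineWidth = 5
        addChild(node)
        lineNode = node
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // Rebuild the path from the fixed start point to the current finger position.
        let path = CGMutablePath()
        path.move(to: startPoint)
        path.addLine(to: touch.location(in: self))
        lineNode?.path = path
    }
}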
I would cheat to do this:
Create a very thin sprite in the line colour you want, and stretch its end point to where you want the line to go.
This is effectively a rectangular box made out of an SKSpriteNode, stretched from the origin to the current touch point.
Rotating and then stretching an SKSpriteNode is best done (for me) by attaching it to an SKNode, rotating that node as needed, then stretching the nested SKSpriteNode as far as needed to reach the user's current touch point.
Put the origin of the SKNode and the SKSpriteNode at the same place, the middle of the left edge, and rotate around that, or whatever works for you mathematically. That makes the most sense to me, but it could be the centre of any edge of the rectangle, as long as you stretch accordingly.
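A rough sketch of that trick in current syntax (start and end are hypothetical CGPoints for the drag):

let line = SKSpriteNode(color: .black, size: CGSize(width: 1, height: 5))
line.anchorPoint = CGPoint(x: 0, y: 0.5)   // rotate and stretch around the left edge
line.position = start
let dx = end.x - start.x
let dy = end.y - start.y
line.zRotation = atan2(dy, dx)             // aim the sprite at the touch point
line.xScale = hypot(dx, dy)                // stretch the 1-point width out to the full distance
addChild(line)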

How do I detect if there is a tap on right or left side of the iPhone screen?

I'm a fairly new beginner in the iOS world, so forgive me if I leave out some details or am not clear enough. I have a ball placed at the bottom of the screen and would like to make it go left when the user taps the left half of the iPhone screen, and right when the user taps the right half.
The code that I'm trying to make work is this:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    for touch: AnyObject in touches {
        let location = touch.locationOfTouch(<#touchIndex: Int#>, inView: <#UIView?#>)
    }
    ball!.physicsBody!.applyImpulse(CGVectorMake(25.0, 40.0))
}
I know there is code missing, but I can't seem to understand how to approach it. Am I doing this right? I will deeply appreciate your help.
So I did figure out the code. This is what I did:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent) {
    let touch = touches.first as! UITouch
    let point = touch.locationInView(self.view)
    if point.x < size.width / 2 {
        ball!.physicsBody!.applyImpulse(CGVectorMake(-5.0, 10.0))
    } else {
        ball!.physicsBody!.applyImpulse(CGVectorMake(5.0, 10.0))
    }
}
Now I'm running into another problem, which might not be as complicated as I'm making it. Initially the ball is stationary; it moves in the (-5, 10) direction when I tap left and (5, 10) when I tap right. The problem: if my first tap is on the right and the ball starts moving in the (5, 10) direction, a left tap doesn't send it back along the mirrored direction. I want it to bounce left and right along the exact mirror-image directions as it moves, in a zig-zag, something like this format: /V, viewed in portrait. I hope that makes sense :)
I will keep trying to figure it out and hopefully have it solved by the time you read this.
I'm also fairly new, so I don't know of a more advanced solution.
What I would do is add an invisible line at the center of the screen. For example, I would use an SKSpriteNode and place it in the center:
var middleLine = SKSpriteNode(imageNamed: "CenterLine")
middleLine.alpha = 0 // you don't want the line to show
middleLine.position = CGPoint(x: CGRectGetMidX(self.frame), y: CGRectGetMidY(self.frame))
This places the line sprite at the center of the screen. Then, inside touchesBegan, I would write an if statement comparing the tap's x coordinate with the line's x coordinate: if it's less, the tap is on the left; if it's greater, the tap is on the right.
You need to get the location of the user's touch in order to compare.
EDIT: To make it easier, you don't even have to add the sprite; just compare the touch's x and y to CGRectGetMidX(self.frame) and CGRectGetMidY(self.frame).
I used this in my game to figure out whether an object was leaving the screen, by checking whether its x,y coordinates were greater (or less) than the coordinates inside the frame.
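In current Swift syntax, the whole midline comparison boils down to a few lines (a sketch; the impulse values are the asker's own numbers):

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let location = touches.first?.location(in: self) else { return }
    // Compare the tap's x position against the horizontal middle of the scene.
    let impulse = location.x < frame.midX
        ? CGVector(dx: -5.0, dy: 10.0) // left half
        : CGVector(dx: 5.0, dy: 10.0)  // right half
    ball?.physicsBody?.applyImpulse(impulse)
}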
