Augmented Reality app on iOS - ios

I'm trying to write an augmented reality app for an iOS tablet. For this I use the AVFoundation classes to stream video from the camera to the screen. The augmented objects are simple UILabels "flying in the air". It must look like the labels are in front of you and keep their position even when you rotate the tablet. As usual, I'm using CMMotionManager to capture the device's gyroscope data.
My problem is that I can't correctly apply the device rotation quaternion to the flying UILabels.
The labels rotate around their own axes, but they must rotate "around the tablet's axes".
How can I describe this mathematically? For now I only have the idea of an imaginary sphere around the tablet: the labels are positioned on this sphere, and when you rotate the tablet, the imaginary sphere rotates with the labels in the opposite direction around the same axis.
I don't want to use any third-party libraries.
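A minimal sketch of that imaginary-sphere idea, assuming CMMotionManager's attitude quaternion and a simple pinhole projection (the ARLabelOverlay class, the 500 pt focal length and the axis/sign conventions are illustrative assumptions, not a definitive implementation):

import CoreMotion
import UIKit
import simd

// Sketch only: anchor each label to a fixed direction in the world frame (a point on
// the imaginary sphere) and re-project it into screen space on every motion update.
final class ARLabelOverlay: UIView {
    private let motion = CMMotionManager()
    private var anchors: [(direction: simd_double3, label: UILabel)] = []

    func add(_ label: UILabel, pointingAt direction: simd_double3) {
        addSubview(label)
        anchors.append((simd_normalize(direction), label))
    }

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { [weak self] dm, _ in
            guard let self = self, let dm = dm else { return }
            self.project(attitude: dm.attitude.quaternion)
        }
    }

    private func project(attitude q: CMQuaternion) {
        // The attitude rotates device-frame vectors into the world frame, so its
        // inverse carries the fixed world directions into the current device frame.
        let deviceFromWorld = simd_quatd(ix: q.x, iy: q.y, iz: q.z, r: q.w).inverse
        let focalLength = 500.0 // pixels; tune to the camera's field of view

        for (direction, label) in anchors {
            let d = deviceFromWorld.act(direction)
            guard d.z < 0 else { label.isHidden = true; continue } // behind the camera
            label.isHidden = false
            // Simple pinhole projection onto the screen plane (device z points out of the screen).
            label.center = CGPoint(x: bounds.midX + CGFloat(focalLength * d.x / -d.z),
                                   y: bounds.midY - CGFloat(focalLength * d.y / -d.z))
        }
    }
}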

I have worked on a similar app, but in my case the scenario was a little different: I was using a UIImageView instead of a UILabel. What I did was make another UIView with a default/transparent background, and on that view I placed my UIImageView, which rotates when the UIView rotates. The camera-preview UIView, however, was causing problems with rotation. Luckily, per my requirements I had to force the device orientation to stay in landscape mode, for which I used:
- (NSUInteger)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskLandscapeRight;
}
and
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return YES;
}
So add a transparent UIView first. If the problem still remains, define your UIInterfaceOrientationMask for all four possible orientations and see whether it works.

In the end I decided to use this library: https://github.com/antonholmquist/rend-ios. It's not that easy to accomplish this task with pure UIKit.

Related

Dealing with different iOS device resolutions in SpriteKit

I'm playing around with SpriteKit in Xcode 6, iOS 8 beta 5. Everything is laid out and working perfectly on the iPhone 4S simulator, but when switching to the 5S, the elements at the bottom of the screen are cut off.
It was my understanding that the bottom-left corner of the iPhone screen should be CGPoint(0, 0), but after printing the touch coordinates to the console, the lowest point in the left corner I could tap was around (5, 44). Is there something wrong in my scene setup that's causing this?
No changes have been made to the GameViewController file, and even after I strip the GameScene file the problem persists.
Can anyone at least point me in the right direction with this?
Adding the following code will fix your problem (code is in Swift):
scene.scaleMode = SKSceneScaleMode.ResizeFill
Now if you want to know why this fixes your problem, what your problem actually is, and how to handle multiple resolutions – I suggest you continue reading.
There are three things that can impact the position of nodes in your scene.
1) Anchor Point
Make sure your scene's anchor point is set to (0,0), the bottom left. By default the scene's anchor point starts at (0,0), so I'm assuming that is not causing the issue.
2) Size
Check the size of your scene. I typically make my scene size match the size of the device (i.e. iPad, iPhone 4-inch, iPhone 3.5-inch), then I place another layer in the scene for storing my nodes. This lets me do a scrolling effect for devices with smaller resolutions, but it depends on your game, of course. My guess is that your scene size might be set to 320 x 480, which could be causing the positioning problems on your iPhone 5S.
3) Scale Mode
The scale mode has a huge effect on the positioning of nodes in your scene. Make sure you set the scale mode to something that makes sense for your game. The scale mode kicks in when your scene size does not match the size of the view, so its purpose is to tell Sprite Kit how to deal with that situation. My guess is that you have the scene size set to 320 x 480 and the scene is being scaled to match the iPhone 5 view, which will cause positioning problems identical to what you described. Below are the various scale modes you can set for your scene.
SKSceneScaleMode.AspectFill
The scaling factor of each dimension is calculated and the larger of the two is chosen. Each axis of the scene is scaled by the same scaling factor. This guarantees that the entire area of the view is filled, but may cause parts of the scene to be cropped.
SKSceneScaleMode.AspectFit
The scaling factor of each dimension is calculated and the smaller of the two is chosen. Each axis of the scene is scaled by the same scaling factor. This guarantees that the entire scene is visible, but may require letterboxing in the view.
SKSceneScaleMode.Fill
Each axis of the scene is scaled independently so that each axis in the scene exactly maps to the length of that axis in the view.
SKSceneScaleMode.ResizeFill
The scene is not scaled to match the view. Instead, the scene is automatically resized so that its dimensions always match those of the view.
Conclusion
It looks like you want to remove the scaling of your scene so that positions in the scene match the actual positions in the view. You can either set your scene's size to match the view's size, in which case no scaling will take place, or you can set your scene's scale mode to ResizeFill, which will always make the scene's size match your view's size and won't scale anything. In general I would stay away from any scaling and instead adjust the interface and the scene size to best suit each device. You may also want to add zoom and/or scrolling so that devices with smaller resolutions can achieve the same field of view.
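To make that concrete, here is a minimal sketch of both options, written with current Swift naming (.resizeFill is the same case written as SKSceneScaleMode.ResizeFill above); the controller body and comments are illustrative, not a drop-in file:

import SpriteKit
import UIKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        guard let skView = view as? SKView else { return }

        // Option 1: size the scene to the view so no scaling happens at all.
        // (Substitute your own SKScene subclass, e.g. GameScene, for SKScene;
        //  bounds are final by viewDidLayoutSubviews, shown here for brevity.)
        let scene = SKScene(size: skView.bounds.size)
        scene.anchorPoint = CGPoint(x: 0, y: 0)  // bottom-left origin, as discussed above
        scene.scaleMode = .aspectFill            // the mode is moot when scene size == view size

        // Option 2: keep a design size but let the scene always track the view's size.
        // scene.scaleMode = .resizeFill

        skView.presentScene(scene)
    }
}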
But what if I want to scale my scene?
If, however, you need to scale your scene but still want positions to be relative to the view (i.e. you want (0,0) to be the bottom left of the screen even when the scene is cut off), then see my answer here
Additional Info
See answer here for sample code showing how I layout nodes dynamically.
See answer here for more details about scaling to support multiple devices.
If you want to preserve the size of your scene (usually desired when you work with a fixed size and coordinate system), you might want to add padding to either side of your scene. This removes the letterboxing and preserves all the physics and dynamics of your app on any device.
I created a small Framework to help with this:
https://github.com/Tokuriku/tokuriku-framework-stash
Just:
Download the ZIP file for the Repository
Open the "SceneSizer" sub-folder
Drag the SceneSizer.framework "lego block" in your project
Make sure that the Framework is Embedded and not just Linked
Import the Framework somewhere in your code: import SceneSizer
And you're done, you can now call the sizer Class with:
SceneSizer.calculateSceneSize(#initialSize: CGSize, desiredWidth: CGFloat, desiredHeight: CGFloat) -> CGSize
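For illustration only, a hypothetical call based purely on the signature quoted above; the sizes are made-up values, and the exact parameter semantics are documented in the repository:

import SpriteKit
import SceneSizer  // after embedding the framework as described above

// Hypothetical call matching the quoted signature; the numbers are illustrative only.
let sceneSize = SceneSizer.calculateSceneSize(initialSize: CGSize(width: 1024, height: 768),
                                              desiredWidth: 320,
                                              desiredHeight: 568)
let scene = SKScene(size: sceneSize)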
Just in case, try pressing CMD+1 (100% scale) in the Simulator; that worked for me. Some of the elements were cut off because they simply were not displayed in the Simulator window. I stress that this is purely a Simulator quirk (and a bug, if you ask me; it wasted hours of my time). The scaled CMD+2 and CMD+3 views can sometimes hide parts of the scene.

How to rotate a view on iOS

I use CoreImage on iOS for face detection. I already did this using this helpful tutorial. My problem is that I added it in a view controller; I'm able to rotate the image to match the circle drawn over the eyes and mouth, but I can't rotate the whole view controller. Is there a better approach?
My image looks like this.
I want to rotate it upside down.
I'm using storyboards and iOS 7.
Actually, the coordinate systems of Core Image and UIKit are different: Core Image's origin is at the bottom-left while UIKit's is at the top-left, so you must convert the mouth and eye points to the UIKit coordinate system.
Refer to the 5th point here:
5) Adjust For The Coordinate System
http://maniacdev.com/2011/11/tutorial-easy-face-detection-with-core-image-in-ios-5
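As a rough sketch of that conversion (the function names here are mine), flipping the Y axis over the image height is usually all that's needed:

import UIKit

// Core Image uses a bottom-left origin and UIKit a top-left origin, so mirror the
// Y coordinate over the image height before positioning overlays in UIKit.
func convertToUIKitCoordinates(_ point: CGPoint, imageHeight: CGFloat) -> CGPoint {
    return CGPoint(x: point.x, y: imageHeight - point.y)
}

// Or build the equivalent transform once and apply it to every feature point or rect.
func coreImageToUIKitTransform(imageHeight: CGFloat) -> CGAffineTransform {
    return CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -imageHeight)
}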

Take a picture, choose between Full or Square Mode in camera

Is there a way to use the overlays in UIImagePickerController to show the square picture a user might capture, while having a toggle button in there somewhere to switch on the fly?
Currently the iOS 7 camera has this capability, but UIImagePickerController does not (thanks, Apple), so is there a way to add this functionality?
Would using AVCaptureSession be necessary? It seems like serious overkill, and I'd have to program flash/focus/etc. all over again, I think.
Failing that, is there a customizable class that already exists that I could just implement?
I'm banging my head against the wall trying to figure out the best course of action here, so any help would be appreciated.
You can adjust the camera view by transforming the scale
CGAffineTransform transform = CGAffineTransformMakeScale(1.0, 0.5);
imagePickerController.cameraViewTransform = transform;
with the arguments being the scale factors for the x and y axes of the camera view. You could add a button that re-presents the camera view within the view controller at the top of your hierarchy. It may not perfectly recreate the native Camera app, but it will be a pretty good approximation.
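A small sketch of that toggle-button idea, assuming you keep a reference to the picker (the CameraToggle class and its properties are illustrative):

import UIKit

// Flip between the full preview and a squashed "square-ish" preview by swapping the transform.
final class CameraToggle {
    let picker = UIImagePickerController()  // assumes sourceType == .camera is set elsewhere
    private var isSquare = false

    func toggle() {
        isSquare.toggle()
        // Same idea as the Objective-C snippet above; tune the scale factors to the
        // device's actual preview aspect ratio.
        picker.cameraViewTransform = isSquare
            ? CGAffineTransform(scaleX: 1.0, y: 0.5)
            : .identity
    }
}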
I would probably do this with a custom cameraOverlayView that letterboxes the viewfinder. Once the picture has been taken you'll have to crop the resulting image.
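For the cropping step, a minimal sketch (a centered square crop of the returned UIImage; the function name is mine, and you would call it from your picker delegate):

import UIKit

// Take a centered square out of the captured image and keep its scale/orientation metadata.
func squareCrop(_ image: UIImage) -> UIImage? {
    guard let cg = image.cgImage else { return nil }
    let side = min(cg.width, cg.height)
    let cropRect = CGRect(x: (cg.width - side) / 2,
                          y: (cg.height - side) / 2,
                          width: side, height: side)
    guard let cropped = cg.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}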

Showing a compass image that rotates with the map in usertrackingmodefollowwithheading (iOS 6)

In Apple's Maps app, when UserTrackingModeFollowWithHeading is enabled, there is a compass image that shows which way north is and rotates with the map. I have enabled UserTrackingModeFollowWithHeading in my app and would also like to show such a compass image.
I have tried adding a UIImageView with a compass image as a subview of the MKMapView, but the image does not rotate with the map. Is there a way to make it rotate with the rotation of the map?
I know that I can manually rotate the map using UpdatedHeading and rotate the compass image with the map; this works, but I would rather use Apple's UserTrackingModeFollowWithHeading. I tried using UserTrackingModeFollowWithHeading and UpdatedHeading together (rotating the compass image manually in UpdatedHeading), but that causes problems: the app crashes after a minute or so because of some conflict between the two.
Many thanks for any help. I am using MonoTouch 6.2 (but can translate any Objective-C code in the responses to MonoTouch).
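For reference, the manual approach described above usually looks something like the following sketch (shown in Swift; translate to MonoTouch as needed). The class and property names are illustrative, and it assumes the built-in tracking mode is not enabled at the same time:

import CoreLocation
import MapKit
import UIKit

// Rotate both the map and the compass image whenever the device heading changes.
final class CompassMapController: NSObject, CLLocationManagerDelegate {
    let mapView: MKMapView
    let compassImageView: UIImageView
    private let locationManager = CLLocationManager()

    init(mapView: MKMapView, compassImageView: UIImageView) {
        self.mapView = mapView
        self.compassImageView = compassImageView
        super.init()
        locationManager.delegate = self
        locationManager.headingFilter = 1  // degrees of change before a new callback
        locationManager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        guard newHeading.headingAccuracy >= 0 else { return }
        let radians = CGFloat(newHeading.magneticHeading * .pi / 180.0)
        // Counter-rotate the map so the facing direction points up, and give the compass
        // image the same rotation so it keeps pointing at map north.
        UIView.animate(withDuration: 0.25) {
            self.mapView.transform = CGAffineTransform(rotationAngle: -radians)
            self.compassImageView.transform = CGAffineTransform(rotationAngle: -radians)
        }
    }
}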

Synchronize COCOS2D Layer on top of a MKMapView

I'm trying to synchronize cocos2D layer objects with the map. I managed to get it working by adjusting the GL view to the visibleMapRect of the MKMapView: I can zoom and pan, and my objects follow the map. But there is a small and annoying lag between the MKMapView and the cocos2D layer.
I'm synchronizing them on every display-loop tick.
Method:
1) Retrieve MKMapView.visibleMapRect
2) Set the GL viewport
3) Apply an orthographic projection to align my layer with the map view.
I have already tried other methods, like moving the cocos2D layer with touches and then moving the map's coordinates according to the touch; it's still laggy.
Even disabling acceleration and deceleration of the map view doesn't remove the lag.
Thanks.
Shot in the dark: we know that iOS devices use optimizations to speed up rendering while scaling. This is true of the Safari browser: when you zoom in, you actually only zoom in on the image currently displayed as the browser window's contents; only after you stop the pinch gesture does the device re-render the view.
You'll see this specifically with text on older devices. When the device re-renders the contents with the new scale factor, the text suddenly becomes sharp and crisp again. I believe the same optimization is done in MKMapView.
You might want to check if the visibleMapRect values are actually updated during the zoom, and whether they accurately reflect the current zoom level or not.
The other issue I can imagine is that the framerate with MKMapView + Cocos2D is simply low. And specifically zooming might consume a lot of CPU power. You might want to enable the cocos2d FPS display to see what the framerate is.
Another trick that's necessary to allow smooth scrolling of views in cocos2d (particularly complex views like UITableView) is to reduce cocos2d's max framerate (animationInterval) and/or to run the rendering of the GL view on a separate thread. Your issue may simply be a variation of this one: UIScrollView pauses NSTimer until scrolling finishes
Note that this also occurs with the DisplayLink director. The info in that question did the trick for me.
