How can I use the iOS Vision framework from Objective-C? - ios

The example code on the Apple documentation for detecting still images only has Swift example code.
Most tutorials seem to be in Swift and say to just "import Vision" in the header, but do not explain how to get the compiler to recognize that Vision is available.
How can I use the Vision framework with Objective-C?

I figured it out. In Xcode, select your target, and under the General tab, in the Frameworks, Libraries, and Embedded Content section, press the add (+) button. From there you can add Vision.framework to your project.
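Once the framework is linked, you import its umbrella header rather than using a Swift-style `import`. Here is a minimal Objective-C sketch that runs a rectangle-detection request on a `CGImageRef`; the method name and the source of `cgImage` are placeholders for your own code.

```objectivec
#import <UIKit/UIKit.h>    // for NSStringFromCGRect
#import <Vision/Vision.h>  // Objective-C equivalent of "import Vision"

- (void)detectRectanglesInImage:(CGImageRef)cgImage {
    // Create a request; the completion handler receives the observations.
    VNDetectRectanglesRequest *request =
        [[VNDetectRectanglesRequest alloc] initWithCompletionHandler:
            ^(VNRequest *req, NSError *error) {
                for (VNRectangleObservation *observation in req.results) {
                    // boundingBox is in normalized (0–1) image coordinates.
                    NSLog(@"Found rectangle: %@",
                          NSStringFromCGRect(observation.boundingBox));
                }
            }];

    // A handler performs one or more requests against a single image.
    VNImageRequestHandler *handler =
        [[VNImageRequestHandler alloc] initWithCGImage:cgImage options:@{}];

    NSError *error = nil;
    if (![handler performRequests:@[request] error:&error]) {
        NSLog(@"Vision request failed: %@", error);
    }
}
```

The same pattern applies to the other `VNRequest` subclasses (face, text, barcode detection); only the request class and the observation type change.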

Related

How can I build an application using only a Swift playground?

I have a basic understanding of how to make iOS apps with Swift in Xcode projects (I can make apps like Flappy Bird and an adapted variation of Pong). My main target for this year is to at least apply for the Apple WWDC student scholarship. I have read online that the scholarship submission needs to be completed and uploaded as a Swift playground.
As I am self-taught, I have little to no experience with Swift playgrounds. So my main question is: how do I make an application in playgrounds? I am used to beginning with a main.storyboard, but I can't find that in Swift playgrounds.
How do you go about creating an interactive application in Swift playgrounds?
You cannot use storyboards directly in a playground, which is why Xcode doesn't support creating a storyboard inside a playground. You can, however, access a framework from a playground inside a workspace (including CocoaPods), so one solution is to make a framework that loads your storyboards. That doesn't fit your criterion of making everything in playgrounds, though. So you can either build all of your views programmatically, or drag compiled nib files into your playground's Resources folder and load the nibs from there (a playground cannot load a .xib, which is the XML source representation of a compiled nib).

How to implement auto complete feature using Swift 2

I want to implement auto complete feature like "Google" for example, I'm open for suggestions to use native controls or recommendations for third party controls.
I just want a text field that will take an array of strings and display suggestions according to what is written.
I couldn't find CocoaPods controls for what I want in Swift 2.
I have tried this control: Auto complete Cocoa pods, but no luck; it is implemented in a way that does not fit my needs (it depends on a xib file). I also tried UISearchBar with Content, but I couldn't find a proper tutorial for it in Swift 2.
Your support is highly appreciated

Image morphing by integrating OpenCV2.framework in iOS

I am new to iOS and I am planning to work on my first app, in which I have to morph one image into another smoothly. I have downloaded opencv2.framework and I am planning to integrate it into my code. I have gone through some links showing the mathematical calculations required, but I couldn't follow them. Can anyone suggest an approach or a sample tutorial demonstrating how to integrate and use opencv2.framework to morph one image into another in iOS? Thanks.

Applying annotations to map kit

My problem is that I cannot find any up-to-date tutorials for iOS 6 and Map Kit that use storyboards and show how to apply annotations/pins with the use of co-ordinates.
My homework tells me that the tutorials out there are xib- and iOS 5-related, which now defeats the point, because Apple implemented their own maps in iOS 6 and are no longer using Google's.
I've used Google and checked out Apple's developer guide.
I can't be the only one searching for this answer...?
Many thanks.
The MapKit API is essentially the same between iOS 5 and iOS 6 - what changed was the source of the mapping data, the graphic design of the maps, and the use of vector data instead of bitmap data. But most of this is opaque at the API level. Between iOS versions you continue to use the same framework and classes. There are a few additions (such as MKMapItem and routing) and a few simplifications, but any code written for Apple's iOS 5 MapKit SDK will work with iOS 6 MapKit.
Issues of xib vs. storyboard tutorials are not specific to MapKit, and nothing in MapKit relies on one or the other, so you shouldn't allow that to confuse you. However, here is an iOS6 tutorial to get you started:
http://www.raywenderlich.com/21365/introduction-to-mapkit-in-ios-6-tutorial
For an overview of the changes between iOS 5 and iOS 6 MapKit, take a look at the WWDC 2012 video, Getting Around Using Map Kit.
By using the code below, I've managed to place one pin by using coordinates and placing it in viewDidLoad, but when the code is copied, it fails to build. Any solutions for adding multiple co-ordinates?
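The original snippet isn't shown here, but the usual reason copy-pasting a single-pin snippet fails to build is duplicate variable names in the same scope. A common pattern is to loop over an array of coordinates and create one MKPointAnnotation per entry; the coordinates and titles below are placeholders.

```objectivec
#import <MapKit/MapKit.h>

- (void)addPinsToMapView:(MKMapView *)mapView {
    // Placeholder coordinates; substitute your own.
    CLLocationCoordinate2D coords[] = {
        CLLocationCoordinate2DMake(51.5074, -0.1278),
        CLLocationCoordinate2DMake(48.8566, 2.3522),
        CLLocationCoordinate2DMake(40.7128, -74.0060)
    };
    NSArray *titles = @[@"London", @"Paris", @"New York"];

    // One annotation object per coordinate, each added to the map.
    for (NSUInteger i = 0; i < titles.count; i++) {
        MKPointAnnotation *annotation = [[MKPointAnnotation alloc] init];
        annotation.coordinate = coords[i];
        annotation.title = titles[i];
        [mapView addAnnotation:annotation];
    }
}
```

Calling this from viewDidLoad (with `mapView` being your storyboard outlet) places all the pins at once; MKMapView also offers `addAnnotations:` to add an array in one call.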

How to implement Cocos2D?

I have downloaded the latest cocos2d from their site, and I have unzipped it. Now I am confused about exactly how to integrate it into an already existing project. What files do I drag over, and is there anything else I have to do besides that?
Thanks!
You can link cocos2d into your project as shown here. He uses a special version that supports ARC, but it works in exactly the same way with the original version. Then, as a next step, make sure that you link against all of the required frameworks:
QuartzCore.framework
OpenGLES.framework
AVFoundation.framework
UIKit.framework
Foundation.framework
CoreGraphics.framework
If you use sound, you will also need:
OpenAL.framework
AudioToolbox.framework
libcocosDenshion.a
Now you need to set up a root view that supports OpenGL. Look at the Hello World example to see how this is done. You should then be able to implement and display cocos2d scenes on screen.
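The root-view setup can be sketched roughly as follows, modeled on the cocos2d 2.x application template (in cocos2d 2.x, CCDirector is a UIViewController subclass; 1.x is set up differently, so check the template that ships with your version). `HelloWorldLayer` stands in for whatever first scene your game defines.

```objectivec
#import "cocos2d.h"

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // An OpenGL ES view for cocos2d to render into.
    CCGLView *glView = [CCGLView viewWithFrame:self.window.bounds
                                   pixelFormat:kEAGLColorFormatRGB565
                                   depthFormat:0];

    // Point the director at the GL view and make it the root view controller.
    CCDirector *director = [CCDirector sharedDirector];
    [director setView:glView];
    self.window.rootViewController = director;
    [self.window makeKeyAndVisible];

    // Run your first scene.
    [director runWithScene:[HelloWorldLayer scene]];
    return YES;
}
```

This mirrors what the Hello World template generates for you; if you are retrofitting cocos2d into an existing UIKit app, you can instead present the director's view controller from your own navigation flow.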
And here is a tutorial straight from the source:
Cocos2D tutorial
