I'm following this documentation to implement ML Kit for iOS, but I can't figure out which import I should use to create a VisionImage.
The pod I am using is GoogleMLKit/BarcodeScanning.
I have tried with the following imports without luck:
import MLImage
import MLKit
import MLKitBarcodeScanning
Which is the right import to create a VisionImage and then a BarcodeScanner?
According to the ML Kit release notes (as of June 29, 2021), the pod artifact names have changed. Update your pod declaration to the latest version like this:
pod 'MLKitBarcodeScanning', '1.3.0'
After this change, run pod install; you can then use the following two imports to work with ML Kit on iOS.
import MLKitBarcodeScanning
import MLKitVision
At the top of the documentation you referenced, there is a link to the ML Kit quickstart sample, where you can find sample code in both Swift and Objective-C showing how to import ML Kit. Specifically, in Swift you can write import MLKit, and in Objective-C @import MLKit;.
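With those imports in place, a minimal sketch of creating a VisionImage and a BarcodeScanner might look like the following (this assumes a UIImage named uiImage is already loaded; the calls follow the current MLKitBarcodeScanning API, but check the quickstart sample for your pod version):

```swift
import MLKitVision
import MLKitBarcodeScanning

// Wrap the UIImage in a VisionImage and record its orientation;
// detection can fail on rotated photos without this.
let visionImage = VisionImage(image: uiImage)
visionImage.orientation = uiImage.imageOrientation

// Restrict the scanner to the formats you need (QR codes here) for speed.
let options = BarcodeScannerOptions(formats: .qrCode)
let scanner = BarcodeScanner.barcodeScanner(options: options)

// Processing is asynchronous; the completion handler receives any barcodes found.
scanner.process(visionImage) { barcodes, error in
    guard error == nil, let barcodes = barcodes else { return }
    for barcode in barcodes {
        print(barcode.rawValue ?? "no value")
    }
}
```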
Related
Is it possible to import one pod that is equivalent to its dependencies?
I know I can do it in the Podfile, but what about the actual imports in code?
For example, instead of import SwiftyJSON and import RxSwift, I'd have only import AllMyPods, which is my own pod I created just to aggregate the pods I use.
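One way to get this effect is Swift's @_exported attribute, which re-exports a module to anyone who imports yours. Note the underscore: it is an unofficial, unsupported compiler attribute, so treat this as a sketch rather than a guaranteed-stable solution (the pod and file names here are just illustrations):

```swift
// AllMyPods.swift, inside the sources of your aggregate pod.
// @_exported makes these modules visible to any file that imports AllMyPods,
// so client code needs only a single import statement.
@_exported import SwiftyJSON
@_exported import RxSwift
```

A client file can then write import AllMyPods and use the SwiftyJSON and RxSwift types directly.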
I left Xcode unsupervised for the weekend, and when I came back it was telling me I have missing modules CoreLocation and SQLite at:
import SQLite
import CoreLocation
Both give me the error "Missing required module 'CSQLite'". I'm not sure where it's getting "CSQLite", especially in relation to CoreLocation.
I have the SQLite.xcodeproj added under the General tab to be linked. It was building fine a couple of days ago, and absolutely nothing has changed.
In Swift, if you want to use SQLite, you can either use the module map feature of LLVM/Clang to import the C library directly, or use CocoaPods and import any SQLite-based third-party pod.
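For the module map route, a minimal sketch looks like this (the CSQLite module name matches what the SQLite.swift package expects; the header path assumes the macOS system headers, so adjust it for your SDK):

```
// module.modulemap — place it in a directory listed in SWIFT_INCLUDE_PATHS.
module CSQLite [system] {
    header "/usr/include/sqlite3.h"  // use the iOS SDK's path for device builds
    link "sqlite3"
    export *
}
```

After that, import CSQLite works in Swift and exposes the sqlite3_* C functions.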
I'm working on a personal project and I've installed Firebase with Cocoapods and imported it in the bridging-header.
My question is why do I get an error if I try to use a Firebase method in my project if I haven't written import Firebase at the top? The same can be said for Google Analytics etc.
The reason I ask is because in this tutorial: Ray Wenderlich Firebase Tutorial
Firebase is configured the same way as mine - but none of the documents contain import Firebase, and the project recognizes any Firebase method universally without errors.
I know I can just use import but I don't understand what I am doing differently and would rather not have to.
I think you have this line in your Podfile:
use_frameworks!
Delete it and reinstall your pods; that should help. Without use_frameworks!, CocoaPods builds the pods as static libraries, so once Firebase is pulled in through the bridging header its symbols are visible everywhere without a per-file import Firebase. With use_frameworks!, each pod becomes a separate module that every file must import explicitly.
In the mentioned project, the import is done from the Grocr-Bridging-Header.h, it's actually an Objective-C import:
#import <Firebase/Firebase.h>
The reason the import was done from Objective-C is because the project is quite old, and CocoaPods didn't have Swift support at the time.
I am trying to implement the Venmo-iOS-SDK into my app using CocoaPods. Inside my Podfile, I have the use_frameworks! statement to make all dependencies into embedded frameworks. After running pod install, the Venmo-iOS-SDK appears to correctly be installed in CocoaPods, but I cannot figure out what to put at the top of my files as an import statement.
The other pods I have worked with do not have dashes in their names, so I am just able to write import PodName. However, import Venmo-iOS-SDK triggers a compile-time error stating "consecutive statements on a line must be separated by ';'".
I have tried all the following statements and none work:
import Venmo-iOS-SDK
import Venmo
import VenmoiOSSDK
import VenmoSDK
Does anyone know what to import for this framework in swift, or what the import statement looks like for other pods with a - in their name?
I'd try the option below. A Swift module name cannot contain dashes, so CocoaPods replaces them with underscores when it generates the module:
import Venmo_iOS_SDK
I downloaded the latest pre-built opencv2.framework from the OpenCV SourceForge page. Then, in Xcode 6 Beta 3, I added opencv2.framework as a required linked framework under the "General" tab of my Swift project settings:
This is the structure of the framework after the framework is added to the project:
If this were an Objective-C project, I could add the following import statement:
#ifdef __cplusplus
#import <opencv2/opencv.hpp>
#endif
In my ViewController.swift file, if I do something similar I get a "no such module" error:
I tried the following variations that result in an error:
import opencv2/opencv.hpp
import opencv2/opencv
import opencv
import "opencv2/opencv.hpp"
import <opencv2/opencv>
What is the correct way to import opencv in to my Swift project?
It's easiest if you start with a simple project all ready to compile and run out of the box.
This xcode project works with Swift, XCode 6.1, OpenCV 2, and is MIT licensed:
http://whitneyland.com/2014/10/opencv-swift.html
Here's one I haven't tried yet, but there are so few examples available that it's still worth looking at:
https://github.com/foundry/OpenCVSwiftStitch
You will need to create a bridge from C++ to Objective-C or plain C. After that, include/import the bridge's header in the Objective-C bridging header, which Xcode should have offered to generate when you added a Swift source file to an existing project.
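A minimal sketch of such a bridge follows. The class and method names here are my own illustration; only the OpenCV calls come from the standard C++ API. The key points are that the header stays pure Objective-C (so Swift can see it) and the implementation uses the .mm extension (so C++ is allowed):

```objc
// --- OpenCVWrapper.h (plain Objective-C; safe to expose to Swift) ---
#import <Foundation/Foundation.h>

@interface OpenCVWrapper : NSObject
+ (NSString *)openCVVersion;  // example method wrapping a C++ constant
@end

// --- OpenCVWrapper.mm (Objective-C++; note the .mm extension) ---
// Import OpenCV before any Apple headers to avoid macro clashes.
#import <opencv2/opencv.hpp>
#import "OpenCVWrapper.h"

@implementation OpenCVWrapper
+ (NSString *)openCVVersion {
    return [NSString stringWithUTF8String:CV_VERSION];
}
@end
```

Add #import "OpenCVWrapper.h" to your bridging header; Swift code can then call OpenCVWrapper.openCVVersion() without ever importing opencv2 directly.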