Metaio SDK: problems setting up a new project - iOS

I'm trying to set up a new project using the Metaio SDK, but I'm running into a few problems.
First of all, I followed the tutorial on this website, but it ended up not working: http://dev.metaio.com/sdk/getting-started/ios/creating-a-new-ar-application/index.html
I've tried both Objective-C and Swift with an Obj-C bridging header.
This is what I've done:
- Downloaded metaioSDK.framework and added it to the project
- Added the other required frameworks
- Imported the Metaio SDK (in the view controller or in the bridging header):
#import <MetaioSDK/MetaioSDKViewController.h>
- Subclassed MetiaoSDKViewController in my ViewController, like this:
import UIKit

class ViewController: MetiaoSDKViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
These images show the errors I'm getting:

The errors you're getting come from C++ code being fed to the Swift compiler.
You should compile it as Obj-C++ instead: either change the file extension of your ViewController from .m to .mm (Obj-C++), or change the build settings to compile your whole project as Obj-C++.

In the Metaio tutorial which I assume you're following, there is a spelling error in the subclassing step: MetaioSDKViewController is misspelled as MetiaoSDKViewController. So all you need to change is that line.
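With the typo fixed, the declaration matches the header you already import; a minimal sketch of the corrected file:

import UIKit

// "Metaio", not "Metiao": this now matches <MetaioSDK/MetaioSDKViewController.h>.
class ViewController: MetaioSDKViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
    }
}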

Related

iOS-Swift 5 After creating bridging header "MicrosoftCognitiveServicesSpeech-iOS/SPXSpeechApi.h" none of the classes are accessible

So I downloaded the MicrosoftCognitiveServicesSpeech-iOS SDK and created a bridging header for my Swift project. Everything works fine until I try to use classes like SPXSpeechConfiguration and SPXAudioConfiguration: the compiler is unable to find these classes.
I was able to use the Quickstart sample Swift code.
I am using CocoaPods to install the SDK, and Xcode 13.
My bridging header looks like this:
#ifndef MicrosoftCognitiveServicesSpeech_iOS_Bridging_Header_h
#define MicrosoftCognitiveServicesSpeech_iOS_Bridging_Header_h
#import "MicrosoftCognitiveServicesSpeech-iOS/SPXSpeechApi.h"
#endif /* MicrosoftCognitiveServicesSpeech_iOS_Bridging_Header_h */
Usage in my ViewController.swift file:

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        var speechConfig: SPXSpeechConfiguration?
        let audioConfig = SPXAudioConfiguration()
    }
}
Could you please try adding the framework you were importing in the bridging header to Target -> Build Phases -> Link Binary With Libraries?
It should help. Thanks.
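Once the framework is linked, the types from the bridging header should resolve. A minimal sketch of usage, with a placeholder subscription key and region (SPXSpeechConfiguration's initializer throws in Swift):

import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // "YourSubscriptionKey" and "westus" are placeholders, not real values.
        let speechConfig = try? SPXSpeechConfiguration(subscription: "YourSubscriptionKey",
                                                       region: "westus")
        let audioConfig = SPXAudioConfiguration()
        print(speechConfig != nil, audioConfig)
    }
}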

AudioKit 4.1 Mach-O Linker Error Swift 4

I am trying to follow a basic tutorial using AudioKit 4.1. I first imported the AudioKit framework into the project, as shown in the image below.
After importing the AudioKit framework, I added a few lines of code to the ViewController as follows:
import UIKit
import AudioKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let oscillator = AKOscillator()
        oscillator.amplitude = 0.1
        AudioKit.output = oscillator
        oscillator.start()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Upon running the code, I got 59 errors, as shown in the image below. How do I fix this?
Since version 4.1, AudioKit is shipped as a static framework. Because of all the internal C++ code, it depends on the C++ standard library. This dependency used to be resolved automatically by the dynamic linker, but that is no longer the case.
The easiest way to make these errors go away is simply to add the -lstdc++ linker flag in your target settings in Xcode (under "Other Linker Flags").
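As a side note, once the linker errors are gone, the snippet above still won't produce sound until the engine is started. A sketch against the AudioKit 4.1 API (in 4.1, AudioKit.start() does not throw; later 4.x releases changed it to a throwing call, try AudioKit.start()):

import UIKit
import AudioKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let oscillator = AKOscillator()
        oscillator.amplitude = 0.1
        AudioKit.output = oscillator
        AudioKit.start()   // start the engine before starting the oscillator
        oscillator.start()
    }
}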

CocoaPods Swift Unresolved Identifier

I am trying to create my own CocoaPod. I have updated cocoapods to the latest version (sudo gem install cocoapods) and then ran this: pod lib create MySwiftLibrary. I set the language to Swift, used the Quick testing framework, added an auto-generated example project, and did not include UI tests. I then navigated to MySwiftLibrary/Example and ran pod install.
That way, my example project was configured. I opened the *.xcworkspace in the auto-generated example project, navigated to Pods -> Development Pods/MySwiftLibrary/MySwiftLibrary/Classes/ReplaceMe.swift and added the following code:
import UIKit

public class ReplaceMe: UILabel {

    public func testMe() -> String {
        return "Hello World"
    }
}
I ran pod install again, and in the example part of the project (MySwiftLibrary -> Example for MySwiftLibrary/ViewController.swift) I added the line import MySwiftLibrary and modified the viewDidLoad method such that the file looked as follows:
import UIKit
import MySwiftLibrary

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let label = ReplaceMe()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
I ran pod install and pod update, just to be sure, but when trying to build the project, I get the following error: Use of unresolved identifier 'ReplaceMe'. What am I doing wrong? This is literally the barest of bare-bones project modifications.
UPDATE: I just found the issue, and it is something a responder could not have guessed: I had anonymized the name of the pod. In reality, I didn't call the pod MySwiftLibrary but MyLibrarySwift. (MyLibrary is still not what it really was.) I tried making that same pod again and again, and it didn't work. But then I had the idea to try a name that did not end with Swift, and it instantly started working.

Compile and runtime failures when importing interfaces with category extensions in Xcode 7

I'm trying to get an example of OpenEars with the RapidEars plugin running in Swift 2.2 (Xcode 7.3.1). However, I suspect I'm having a larger issue with using Objective-C interfaces with category extensions in a Swift project (or with my understanding of how that works).
The OpenEars code is Obj-C, but I was able to get it running in my Swift project through the standard Obj-C -> Swift translation techniques.
Abbreviated code follows. The full example is in a forked GitHub repo, updated to Swift 2.2: https://github.com/SuperTango/OpenEars-with-Swift-
The following example works fine. You can see the entire project by checking out the "working-opears-swift2.2" tag.
OpenEarsTest-Bridging-Header.h:
#import <OpenEars/OELanguageModelGenerator.h>
#import <OpenEars/OEAcousticModel.h>
#import <OpenEars/OEPocketsphinxController.h>
#import <OpenEars/OEEventsObserver.h>
ViewController.swift:
class ViewController: UIViewController, OEEventsObserverDelegate {

    var openEarsEventsObserver = OEEventsObserver()

    override func viewDidLoad() {
        super.viewDidLoad()
        loadOpenEars()
    }

    func loadOpenEars() {
        self.openEarsEventsObserver = OEEventsObserver()
        self.openEarsEventsObserver.delegate = self

        var lmGenerator: OELanguageModelGenerator = OELanguageModelGenerator()
        addWords()
        var name = "LanguageModelFileStarSaver"
        lmGenerator.generateLanguageModelFromArray(words, withFilesNamed: name, forAcousticModelAtPath: OEAcousticModel.pathToModel("AcousticModelEnglish"))

        lmPath = lmGenerator.pathToSuccessfullyGeneratedLanguageModelWithRequestedName(name)
        dicPath = lmGenerator.pathToSuccessfullyGeneratedDictionaryWithRequestedName(name)
    }

    func startListening() {
        do {
            try OEPocketsphinxController.sharedInstance().setActive(true)
            OEPocketsphinxController.sharedInstance().startListeningWithLanguageModelAtPath(lmPath, dictionaryAtPath: dicPath, acousticModelAtPath: OEAcousticModel.pathToModel("AcousticModelEnglish"), languageModelIsJSGF: false)
        } catch {
            NSLog("Error!")
        }
    }

    // A whole bunch more OEEventsObserverDelegate methods that are all working fine...
    func pocketsphinxDidStartListening() {
        print("Pocketsphinx is now listening.")
        statusTextView.text = "Pocketsphinx is now listening."
    }
Up until this point, everything is working great.
However, in order to use the RapidEars plugin, the documentation (http://www.politepix.com/rapidears/) says to:
- Add the framework to the project and ensure it's included properly.
- Import two new headers (both categories on existing OpenEars classes):
#import <RapidEarsDemo/OEEventsObserver+RapidEars.h>
#import <RapidEarsDemo/OEPocketsphinxController+RapidEars.h>
- Change methods that used startListeningWithLanguageModelAtPath to use startRealtimeListeningWithLanguageModelAtPath.
- Add two new OEEventsObserverDelegate methods:
func rapidEarsDidReceiveLiveSpeechHypothesis(hypothesis: String!, recognitionScore: String!)
func rapidEarsDidReceiveFinishedSpeechHypothesis(hypothesis: String!, recognitionScore: String!)
The new code can be found by checking out the rapidears-notworking-stackoverflow tag of the GitHub repo above.
Problem 1:
When autocompleting in the Xcode editor, the editor sees and will autocomplete the startRealtimeListeningWithLanguageModelAtPath method; however, when the code is run, it always fails with the error:
[OEPocketsphinxController startRealtimeListeningWithLanguageModelAtPath:dictionaryAtPath:acousticModelAtPath:]: unrecognized selector sent to instance 0x7fa27a7310e0
Problem 2:
When autocompleting in the Xcode editor, it doesn't see the two new delegate methods defined in RapidEarsDemo/OEEventsObserver+RapidEars.h.
I have a feeling that these problems are related, and also related to the fact that the failing methods are defined in categories on Objective-C classes. But that's only a guess at this point.
I've made sure that the RapidEars framework is imported and in the framework search path.
Can anyone tell me why this is happening? Or is there some Swift magic incantation that I missed?
The problem could be the one described in the link below, where category methods in a static library produce "unrecognized selector" runtime errors.
Technical Q&A QA1490: Building Objective-C static libraries with categories
The fix described there is to add the -ObjC flag (or, if that is not enough, -all_load) to the target's Other Linker Flags, so the linker also loads object files that only contain categories.
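If the linker flag resolves the unrecognized selector, nothing extra is needed on the Swift side: the two RapidEars callbacks are implemented like any other OEEventsObserverDelegate method (a sketch, assuming the signatures quoted in the question):

extension ViewController {

    // RapidEars live (partial) hypothesis callback.
    func rapidEarsDidReceiveLiveSpeechHypothesis(hypothesis: String!, recognitionScore: String!) {
        print("Live hypothesis: \(hypothesis) (score: \(recognitionScore))")
    }

    // RapidEars finished hypothesis callback.
    func rapidEarsDidReceiveFinishedSpeechHypothesis(hypothesis: String!, recognitionScore: String!) {
        print("Finished hypothesis: \(hypothesis) (score: \(recognitionScore))")
    }
}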

How to access a Freestreamer Objective C class from Swift class?

I am totally new to iOS development, and I decided to start directly with the new Swift programming language. But some Objective C knowledge is needed though, so this is a very basic question!
I am trying to use the FreeStreamer library, to play a shoutcast stream in my iOS app. I followed the basic steps (copied among others the needed file in my Xcode project) but how can I access to the FSAudioStream from my Swift class?
I tried this:
import UIKit

class FirstViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let test = FSAudioStream
    }

    // ...
}
But the FSAudioStream class is not found, which doesn't surprise me. Should I add an extra import to my file? In that case, which one?
OK, I found the solution on this Apple Developer page about mixing Swift and Objective-C.
But there was something important: when setting the "Objective-C Bridging Header" path (in the "Swift Compiler - Code Generation" section of the build settings), the path has to be set at the top level, so that it applies to both the "Debug" and "Release" configurations.
So I set that parameter to MyProject/MyProject-Header.h.
And in the MyProject/MyProject-Header.h file I added this:
#import "FSAudioStream.h"
After that, there is nothing to import in the Swift file.
Problem solved!
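For anyone following along, a minimal sketch of usage once the bridging header is set up. The stream URL is a placeholder, and the FSAudioStream is kept in a property so it isn't deallocated while playing (playFromURL: is FreeStreamer's Obj-C playback method):

import UIKit

class FirstViewController: UIViewController {

    // Strong reference so the stream survives past viewDidLoad.
    let stream = FSAudioStream()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Placeholder URL; FSAudioStream is visible via the bridging header,
        // so no Swift import is needed.
        stream.playFromURL(NSURL(string: "http://example.com/stream")!)
    }
}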
