How would I write this typedef enum from Objective-C to Swift? - ios

Below I have the Objective-C code, which is used for a Tinder-style animation effect, inspired by https://github.com/ngutman/TinderLikeAnimations/tree/master/TinderLikeAnimations .
Objective-C:
typedef NS_ENUM(NSUInteger, GGOverlayViewMode) {
    GGOverlayViewModeLeft,
    GGOverlayViewModeRight
};

- (void)setMode:(GGOverlayViewMode)mode
{
    if (_mode == mode) return;
    _mode = mode;
    if (mode == GGOverlayViewModeLeft) {
        self.imageView.image = [UIImage imageNamed:@"button1"];
    } else {
        self.imageView.image = [UIImage imageNamed:@"button2"];
    }
}
I am trying to replicate the same in Swift. This is what I have in Swift:
enum GGOverlayViewMode : Int {
    case GGOverlayViewModeLeft
    case GGOverlayViewModeRight
}

func setMode(mode: GGOverlayViewMode) {
//    if (_mode == mode) {
//        return
//    }
//
//    _mode = mode;
    if (mode == GGOverlayViewMode.GGOverlayViewModeLeft) {
        imageView.image = UIImage(named: "button1")
    } else {
        imageView.image = UIImage(named: "button2")
    }
}
But somehow it's not making sense to me how I would handle the typedefs here.
Any help is appreciated.
Thanks

In Swift each enumeration has its own member values, so you don't have to give
them a unique prefix as in (Objective-)C. A typical definition would be
enum GGOverlayViewMode {
    case Left
    case Right
}
Also you don't have to specify an underlying "raw type" (such as Int), unless
you have other reasons to do so.
Instead of a custom setter method you would implement a property observer.
didSet is called immediately after the new value is stored, and has an implicit
parameter oldValue containing the old property value:
var mode : GGOverlayViewMode = .Right {
    didSet {
        if mode != oldValue {
            switch mode {
            case .Left:
                imageView.image = UIImage(named: "button1")
            case .Right:
                imageView.image = UIImage(named: "button2")
            }
        }
    }
}
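As a quick usage sketch (assuming mode is declared on the overlay view that owns imageView, matching the Objective-C original; the overlay instance below is hypothetical), assigning the property replaces the old setMode: call:
let overlay = GGOverlayView()   // hypothetical instance of the overlay view declaring `mode` above
overlay.mode = .Left            // didSet fires and loads "button1"
overlay.mode = .Left            // didSet fires again, but the oldValue check skips the reload
overlay.mode = .Right           // loads "button2"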

I think in Swift your function will look like this:
enum GGOverlayViewMode : Int
{
    case GGOverlayViewModeLeft
    case GGOverlayViewModeRight
}

func setMode(mode: GGOverlayViewMode) {
    switch mode
    {
    case .GGOverlayViewModeLeft:
        imageView.image = UIImage(named: "button1")
    case .GGOverlayViewModeRight:
        imageView.image = UIImage(named: "button2")
    }
}
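For reference, the same translation written with current Swift naming conventions (Swift 3 and later prefer lowercase cases without the type prefix; this is only a sketch of the modern style, with imageView assumed on the same view class as above):
enum GGOverlayViewMode {
    case left
    case right
}

func setMode(_ mode: GGOverlayViewMode) {
    switch mode {
    case .left:
        imageView.image = UIImage(named: "button1")
    case .right:
        imageView.image = UIImage(named: "button2")
    }
}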

Related

Swift enum function with string parameter calling with a parameter of the enum class

I'm trying to convert a string to an enum value, but I couldn't find any simple method, so I hardcoded it; it still won't work.
I tried using an enum function to return an enum key value from a string, but I can't call it with the string even though I declared it with a String as a parameter.
I then tried to move it to a different class, but the same thing happened.
My relevant code is below.
enum pickedColor: String {
case green = "71D25E"
case red = "FF0000"
case maroon = "800000"
case yellow = "FFFF00"
case olive = "808000"
case lime = "00FF00"
case aqua = "00FFFF"
case teal = "008080"
case blue = "0000FF"
case navy = "000080"
case fuchsia = "FF00FF"
case purple = "800080"
func toEnum(_ colorName: String) -> pickedColor {
if colorName.elementsEqual("green") {
return .green
} else if colorName.elementsEqual("red") {
return .red
} else if colorName.elementsEqual("maroon") {
return .maroon
} else if colorName.elementsEqual("yellow") {
return .yellow
} else if colorName.elementsEqual("olive") {
return .olive
} else if colorName.elementsEqual("lime") {
return .lime
} else if colorName.elementsEqual("aqua") {
return .aqua
} else if colorName.elementsEqual("teal") {
return .teal
} else if colorName.elementsEqual("blue") {
return .blue
} else if colorName.elementsEqual("navy") {
return .navy
} else if colorName.elementsEqual("fuchsia") {
return .fuchsia
} else {
return .purple
}
}
}
The code that shows up when I try to call it is this (Picture 1).
When completed manually, it still shows the error "cannot convert String to pickedColor".
I then moved the code to a new class; however, it still does not work.
class Color {
func toEnum(_ colorName: String) -> pickedColor {
if colorName.elementsEqual("green") {
return .green
} else if colorName.elementsEqual("red") {
return .red
} else if colorName.elementsEqual("maroon") {
return .maroon
} else if colorName.elementsEqual("yellow") {
return .yellow
} else if colorName.elementsEqual("olive") {
return .olive
} else if colorName.elementsEqual("lime") {
return .lime
} else if colorName.elementsEqual("aqua") {
return .aqua
} else if colorName.elementsEqual("teal") {
return .teal
} else if colorName.elementsEqual("blue") {
return .blue
} else if colorName.elementsEqual("navy") {
return .navy
} else if colorName.elementsEqual("fuchsia") {
return .fuchsia
} else {
return .purple
}
}
}
The second picture, showing the wrong parameter, is here (Picture 2).
What is going on?
You need to make your toEnum function a static function since you are not calling it on a specific enum instance.
You should also name your enum starting with an uppercase letter.
And a switch is better than your long if/else. I would also consider dealing with an unknown color as well.
enum PickedColor: String {
case green = "71D25E"
case red = "FF0000"
case maroon = "800000"
case yellow = "FFFF00"
case olive = "808000"
case lime = "00FF00"
case aqua = "00FFFF"
case teal = "008080"
case blue = "0000FF"
case navy = "000080"
case fuchsia = "FF00FF"
case purple = "800080"
static func toEnum(_ colorName: String) -> PickedColor? {
switch colorName {
case "green":
return .green
case "red":
return .red
case "maroon":
return .maroon
case "yellow":
return .yellow
case "olive":
return .olive
case "lime":
return .lime
case "aqua":
return .aqua
case "teal":
return .teal
case "blue":
return .blue
case "navy":
return .navy
case "fuchsia":
return .fuchsia
case "purple":
return .purple
default:
return nil
}
}
}
Now you can call it as you tried:
if let color = PickedColor.toEnum(colorName) {
// use color as needed
}
I have played around with many variants of doing this over the last couple of years and have settled on the below. It is similar to @rmaddy's version, but the rawValue of the enum is a String so that it is easy to convert back and forth between the name and the enum.
I then have static var's for the actual colours themselves.
This allows me to refer to the colours in a similar way to UIColor
view.backgroundColor = Palette.blueColor
or
view.backgroundColor = Palette.color(named: "blue")
enum Palette : String
{
case white
case orange
case red
case pink
case purple
case blue
static func color( named:String) -> UIColor?
{
switch named
{
case white.rawValue : return whiteColor
case orange.rawValue : return orangeColor
case red.rawValue : return redColor
case pink.rawValue : return pinkColor
case purple.rawValue : return purpleColor
case blue.rawValue : return blueColor
default : return nil
}
}
static let whiteColor = UIColor( withHex:0xffffff)
static let orangeColor = UIColor( withHex:0xfc622f)
static let redColor = UIColor( withHex:0xdd202b)
static let pinkColor = UIColor( withHex:0xff2f7e)
static let purpleColor = UIColor( withHex:0x9166e6)
static let blueColor = UIColor( withHex:0x049edd)
}
extension UIColor
{
convenience init( withHexAlpha hex:UInt32)
{
let red = CGFloat((hex >> 24) & 0xff) / 255.0
let green = CGFloat((hex >> 16) & 0xff) / 255.0
let blue = CGFloat((hex >> 8) & 0xff) / 255.0
let alpha = CGFloat((hex >> 0) & 0xff) / 255.0
self.init( red:red, green:green, blue:blue, alpha:alpha)
}
convenience init( withHex hex:UInt32)
{
self.init( withHexAlpha:(hex << 8) | 0xff)
}
}
With both mine and @rmaddy's versions you don't need to do the if let, as most UIColor properties will also take a nil.
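To make the bit arithmetic in those initializers concrete, here is a worked example using one of the palette values above (just an illustration, not part of the original answer):
// Palette.blueColor is UIColor(withHex: 0x049edd).
// init(withHex:) shifts in a full alpha byte: (0x049edd << 8) | 0xff == 0x049eddff
// init(withHexAlpha:) then extracts each channel:
//   red   = (0x049eddff >> 24) & 0xff = 0x04 =   4  ->   4/255
//   green = (0x049eddff >> 16) & 0xff = 0x9e = 158  -> 158/255
//   blue  = (0x049eddff >>  8) & 0xff = 0xdd = 221  -> 221/255
//   alpha = (0x049eddff >>  0) & 0xff = 0xff = 255  -> 1.0
let blue = UIColor(withHex: 0x049edd)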
It would be much easier to add a computed property and a switch to convert your enumeration cases to hexa and use the default enumeration rawValue initializer:
enum PickedColor: String {
case green, red, maroon, yellow, olive, lime, aqua, teal, blue, navy, fuchsia, purple
}
extension PickedColor {
var hexa: String {
let hexa: String
switch self {
case .green:
hexa = "71D25E"
case .red:
hexa = "FF0000"
case .maroon:
hexa = "800000"
case .yellow:
hexa = "FFFF00"
case .olive:
hexa = "808000"
case .lime:
hexa = "00FF00"
case .aqua:
hexa = "00FFFF"
case .teal:
hexa = "008080"
case .blue:
hexa = "0000FF"
case .navy:
hexa = "000080"
case .fuchsia:
hexa = "FF00FF"
case .purple:
hexa = "800080"
}
return hexa
}
}
if let color = PickedColor(rawValue: "purple") {
print(color) // "purple\n"
print(color.hexa) // "800080"
}

How to check if a button has this image

I want to check if myButton has a named image.
I tried this, but it doesn't work:
if (myButton.currentImage?.isEqual(UIImage(named: "ButtonAppuyer.png")) != nil){
print("YES")
} else {
print("NO")
}
and this too doesn't work
if myButton.currentImage?.isEqual(UIImage(named: "ButtonAppuyer.png")){
print("YES")
} else {
print("NO")
}
Here is what I came up with in Swift 3.0.
if let myButtonImage = myButton.image(for: .normal),
let buttonAppuyerImage = UIImage(named: "ButtonAppuyer.png"),
UIImagePNGRepresentation(myButtonImage) == UIImagePNGRepresentation(buttonAppuyerImage)
{
print("YES")
} else {
print("NO")
}
This could be cleaned up a lot.
extension UIButton {
func hasImage(named imageName: String, for state: UIControlState) -> Bool {
guard let buttonImage = image(for: state), let namedImage = UIImage(named: imageName) else {
return false
}
return UIImagePNGRepresentation(buttonImage) == UIImagePNGRepresentation(namedImage)
}
}
Then use it
if myButton.hasImage(named: "ButtonAppuyer.png", for: .normal) {
print("YES")
} else {
print("NO")
}
In Swift 4.2 and Xcode 10.1:
buttonOne.image(for: .normal)!.pngData() == UIImage(named: "selected")!.pngData()
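The force unwraps there will crash if either image is missing; a slightly safer sketch of the same Swift 4.2 comparison:
if let buttonData = buttonOne.image(for: .normal)?.pngData(),
   let namedData = UIImage(named: "selected")?.pngData(),
   buttonData == namedData {
    print("YES")
} else {
    print("NO")
}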
You can convert the button's image and the named image to NSDatas and compare the two data objects:
let imgData1 = UIImagePNGRepresentation(buttonImage)
let imgData2 = UIImagePNGRepresentation(namedImage)
let equal = imgData1 == imgData2
Note that this will only work if the two images are completely identical (e.g. come from the same source file); if one is a scaled down version of the other, it won't work.
Also, it should be mentioned that this can be a pretty expensive operation and should not be run frequently.
There is no way to retrieve the image name from a UI element. You will need to store the value you want to retrieve in another location.
For instance, you could create a custom subclass of UIButton that contains an additional property to store the image name or other identifying data.
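A minimal sketch of that subclass idea (the NamedImageButton class and its imageName property are illustrative names, not part of UIKit; Swift 4.2 naming):
import UIKit

class NamedImageButton: UIButton {
    // Remembers the last image name set through the helper below.
    private(set) var imageName: String?

    func setImage(named name: String, for state: UIControl.State) {
        imageName = name
        setImage(UIImage(named: name), for: state)
    }
}

// let button = NamedImageButton()
// button.setImage(named: "ButtonAppuyer", for: .normal)
// button.imageName == "ButtonAppuyer"   // true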
if let buttonImage = myButton.image(for: .normal),
   let image = UIImage(named: "ButtonAppuyer.png"),
   UIImagePNGRepresentation(buttonImage) == UIImagePNGRepresentation(image) {
    print("YES")
} else {
    print("NO")
}
if myCell.followButton.currentImage?.isEqual(UIImage(named: "yourImageName")) == true {
    //do something here
}

How do I check if UIImage is empty in swift?

This is my code, but it seems the conditional "if" check for the UIImage being empty is not executing. Is this the wrong way of checking if a UIImage is empty? I have also tried
if allStyleArrays[i][ii][iii] as UIImage? == nil {
...
}
That is not working either. If neither of those checks is right, what is the correct way?
Thanks.
Code:
for var i = 0; i <= allStyleArrays.count; i++ {
    if i == allStyleArrays.count {
        self.performSegueWithIdentifier("toStylistReviewAndConfirm", sender: self)
    } else {
        for var ii = 0; ii < allStyleArrays[i].count; ii++ {
            for var iii = 0; iii < allStyleArrays[i][ii].count; iii++ {
                if allStyleArrays[i][ii][iii] as UIImage? == UIImage(named: "") {
                } else {
                    switch i {
                    case 0:
                        styleN[0].append(allStyleArrays[i][ii][iii])
                        //print("style 1 \(style1.count)")
                        break
                    case 1:
                        styleN[1].append(allStyleArrays[i][ii][iii])
                        //print("style 2 \(style2.count)")
                        break
                    default:
                        styleN[2].append(allStyleArrays[i][ii][iii])
                        //print("style 3 \(style3.count)")
                        break
                    }
                }
            }
        }
    }
}
try
if allStyleArrays[i][ii][iii].imageAsset == nil
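If the array elements really are optional UIImage values, a clearer check might look like this (a sketch in Swift 3+ syntax; note that a bare UIImage() placeholder reports a zero size):
if let image = allStyleArrays[i][ii][iii] as? UIImage, image.size != .zero {
    // a usable image: append it to styleN as before
} else {
    // nil, or an empty placeholder
}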

How do you get the original filename of a sprite/texture in Swift?

I create a sprite and assign an image file to it.
var logoImage = SKSpriteNode(imageNamed: "image1.png")
Then in some circumstances, I change the image.
logoImage.texture = SKTexture(imageNamed: "image2.png")
In another part of the app I want to check which image is currently being displayed. But I don't know how to get the filename.
Using:
print(logoImage.texture?.description)
Returns:
"<SKTexture> 'image2.png' (500 x 500)"
Which obviously contains the filename, but how do I get the filename on its own?
I suggest you avoid exposing the internal logic of your objects.
And you definitely should NOT build your game logic on the texture currently in use. The presentation should be a mere representation of the actual data.
So you should create your own class that wraps logic and data, like this.
class Logo: SKSpriteNode {
    enum Type: String {
        case A = "Image1", B = "Image2"

        var imageName: String {
            return self.rawValue
        }
    }

    private var type: Type {
        didSet {
            self.texture = SKTexture(imageNamed: type.imageName)
        }
    }

    init(type: Type) {
        self.type = type
        let texture = SKTexture(imageNamed: type.imageName)
        super.init(texture: texture, color: .clearColor(), size: texture.size())
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
Usage
Now you can easily create a Logo
let logo = Logo(type: .A)
You can change the texture for that sprite
logo.type = .B
And you can check which texture it is currently using
switch logo.type {
case .A: print("It's using Image1")
case .B: print("it's using Image2")
}
Last thing: I replaced Image1.png and Image2.png with Image1 and Image2. If you are using an Asset Catalog (and you should), then you don't need to specify the file extension.
You could use:
if logoImage.texture?.description.containsString("image1.png") == true {
    // your code for image1 here
} else {
    // your code for image2 here
}
You'll have to use:
import Foundation
in your code for containsString.
I'm pretty sure that there is a more elegant method, but it works:
if let rangeOfIndex = texture.description.rangeOfCharacterFromSet(NSCharacterSet(charactersInString: "'"), options: .BackwardsSearch) {
    let filename = texture.description.substringToIndex(rangeOfIndex.endIndex)
    if let r = filename.rangeOfCharacterFromSet(NSCharacterSet(charactersInString: "'"), options: .LiteralSearch) {
        print(filename.substringFromIndex(r.startIndex).stringByReplacingOccurrencesOfString("'", withString: "", options: NSStringCompareOptions.LiteralSearch, range: nil))
    }
}
You could use this:
func getTextureName(textureTmp: String) -> String {
    var texture: String = ""
    var startInput = false
    for char in textureTmp {
        if startInput {
            if char != "'" {
                texture += String(char)
            } else {
                return texture
            }
        }
        if char == "'" {
            startInput = true
        }
    }
    return texture
}
It's not beautiful, but it works.
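A shorter take on the same idea is to split the description on the quote characters (a sketch in Swift 3+ syntax; like the answers above, it relies on the description format, which is an implementation detail of SpriteKit):
import SpriteKit

// "<SKTexture> 'image2.png' (500 x 500)"  ->  "image2.png"
func textureFilename(of texture: SKTexture) -> String? {
    let parts = texture.description.components(separatedBy: "'")
    return parts.count >= 3 ? parts[1] : nil
}

// if let tex = logoImage.texture, textureFilename(of: tex) == "image2.png" { ... }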

Convert Microsoft Project Oxford Speech Recognition from Objective-C to SWIFT

Microsoft Project Oxford has a nice Speech Recognition API and instructions for Objective-C on iOS. I built it easily following the getting-started instructions. However, I am having a hard time converting it to Swift.
I created a Swift project first. I created the bridging header file (ProjectName-Bridging-Header.h) and inserted the following code into this file:
#import "SpeechRecognitionService.h"
I want to convert both the Objective-C header and implementation files into ViewController.swift.
contents of ViewController.h:
#import <UIKit/UIKit.h>
#import "SpeechRecognitionService.h"
@interface ViewController : UIViewController<SpeechRecognitionProtocol>
{
NSMutableString* textOnScreen;
DataRecognitionClient* dataClient;
MicrophoneRecognitionClient* micClient;
SpeechRecognitionMode recoMode;
bool isMicrophoneReco;
bool isIntent;
int waitSeconds;
}
@property (nonatomic, strong) IBOutlet UIButton* startButton;
/* In our UI, we have a text box to show the reco results.*/
@property (nonatomic, strong) IBOutlet UITextView* quoteText;
/* Action for pressing the "Start" button */
-(IBAction)startButtonTapped:(id)sender;
@end
contents of ViewController.m:
#import "ViewController.h"
#import <AVFoundation/AVAudioSession.h>
@interface ViewController (/*private*/)
/* Create a recognition request to interact with the Speech Service.*/
-(void)initializeRecoClient;
@end
NSString* ConvertSpeechRecoConfidenceEnumToString(Confidence confidence);
/* The Main App */
@implementation ViewController
/* Initialization to be done when app starts. */
-(void)viewDidLoad
{
[super viewDidLoad];
textOnScreen = [NSMutableString stringWithCapacity: 1000];
recoMode = SpeechRecognitionMode_ShortPhrase;
isMicrophoneReco = true;
isIntent = false;
waitSeconds = recoMode == SpeechRecognitionMode_ShortPhrase ? 20 : 200;
[self initializeRecoClient];
}
/* Called when a partial response is received. */
-(void)onPartialResponseReceived:(NSString*) response
{
dispatch_async(dispatch_get_main_queue(), ^{
[textOnScreen appendFormat:(@"%@\n"), response];
self.quoteText.text = response;
});
}
/* Called when a final response is received. */
-(void)onFinalResponseReceived:(RecognitionResult*)response
{
bool isFinalDicationMessage = recoMode == SpeechRecognitionMode_LongDictation &&
(response.RecognitionStatus == RecognitionStatus_EndOfDictation ||
response.RecognitionStatus == RecognitionStatus_DictationEndSilenceTimeout);
if (isMicrophoneReco && ((recoMode == SpeechRecognitionMode_ShortPhrase) || isFinalDicationMessage)) {
[micClient endMicAndRecognition];
}
if ((recoMode == SpeechRecognitionMode_ShortPhrase) || isFinalDicationMessage) {
dispatch_async(dispatch_get_main_queue(), ^{
[[self startButton] setEnabled:YES];
});
}
}
NSString* ConvertSpeechErrorToString(int errorCode)
{
switch ((SpeechClientStatus)errorCode) {
case SpeechClientStatus_SecurityFailed: return @"SpeechClientStatus_SecurityFailed";
case SpeechClientStatus_LoginFailed: return @"SpeechClientStatus_LoginFailed";
case SpeechClientStatus_Timeout: return @"SpeechClientStatus_Timeout";
case SpeechClientStatus_ConnectionFailed: return @"SpeechClientStatus_ConnectionFailed";
case SpeechClientStatus_NameNotFound: return @"SpeechClientStatus_NameNotFound";
case SpeechClientStatus_InvalidService: return @"SpeechClientStatus_InvalidService";
case SpeechClientStatus_InvalidProxy: return @"SpeechClientStatus_InvalidProxy";
case SpeechClientStatus_BadResponse: return @"SpeechClientStatus_BadResponse";
case SpeechClientStatus_InternalError: return @"SpeechClientStatus_InternalError";
case SpeechClientStatus_AuthenticationError: return @"SpeechClientStatus_AuthenticationError";
case SpeechClientStatus_AuthenticationExpired: return @"SpeechClientStatus_AuthenticationExpired";
case SpeechClientStatus_LimitsExceeded: return @"SpeechClientStatus_LimitsExceeded";
case SpeechClientStatus_AudioOutputFailed: return @"SpeechClientStatus_AudioOutputFailed";
case SpeechClientStatus_MicrophoneInUse: return @"SpeechClientStatus_MicrophoneInUse";
case SpeechClientStatus_MicrophoneUnavailable: return @"SpeechClientStatus_MicrophoneUnavailable";
case SpeechClientStatus_MicrophoneStatusUnknown: return @"SpeechClientStatus_MicrophoneStatusUnknown";
case SpeechClientStatus_InvalidArgument: return @"SpeechClientStatus_InvalidArgument";
}
return [[NSString alloc] initWithFormat:@"Unknown error: %d\n", errorCode];
}
/* Called when an error is received. */
-(void)onError:(NSString*)errorMessage withErrorCode:(int)errorCode
{
dispatch_async(dispatch_get_main_queue(), ^{
[[self startButton] setEnabled:YES];
[textOnScreen appendString:(@"********* Error Detected *********\n")];
[textOnScreen appendFormat:(@"%@ %@\n"), errorMessage, ConvertSpeechErrorToString(errorCode)];
self.quoteText.text = textOnScreen;
});
}
/* Event fired when the microphone recording status has changed. */
-(void)onMicrophoneStatus:(Boolean)recording
{
if (!recording) {
[micClient endMicAndRecognition];
}
dispatch_async(dispatch_get_main_queue(), ^{
if (!recording) {
[[self startButton] setEnabled:YES];
}
self.quoteText.text = textOnScreen;
});
}
/* Create a recognition request to interact with the Speech Recognition Service.*/
-(void)initializeRecoClient
{
NSString* language = @"en-us";
NSString* path = [[NSBundle mainBundle] pathForResource:@"settings" ofType:@"plist"];
NSDictionary* settings = [[NSDictionary alloc] initWithContentsOfFile:path];
NSString* primaryOrSecondaryKey = [settings objectForKey:(@"primaryKey")];
NSString* luisAppID = [settings objectForKey:(@"luisAppID")];
NSString* luisSubscriptionID = [settings objectForKey:(@"luisSubscriptionID")];
if (isMicrophoneReco) {
if (!isIntent) {
micClient = [SpeechRecognitionServiceFactory createMicrophoneClient:(recoMode)
withLanguage:(language)
withKey:(primaryOrSecondaryKey)
withProtocol:(self)];
}
else {
MicrophoneRecognitionClientWithIntent* micIntentClient;
micIntentClient = [SpeechRecognitionServiceFactory createMicrophoneClientWithIntent:(language)
withKey:(primaryOrSecondaryKey)
withLUISAppID:(luisAppID)
withLUISSecret:(luisSubscriptionID)
withProtocol:(self)];
micClient = micIntentClient;
}
}
else {
if (!isIntent) {
dataClient = [SpeechRecognitionServiceFactory createDataClient:(recoMode)
withLanguage:(language)
withKey:(primaryOrSecondaryKey)
withProtocol:(self)];
}
else {
DataRecognitionClientWithIntent* dataIntentClient;
dataIntentClient = [SpeechRecognitionServiceFactory createDataClientWithIntent:(language)
withKey:(primaryOrSecondaryKey)
withLUISAppID:(luisAppID)
withLUISSecret:(luisSubscriptionID)
withProtocol:(self)];
dataClient = dataIntentClient;
}
}
}
/* Take enum value and produce NSString */
NSString* ConvertSpeechRecoConfidenceEnumToString(Confidence confidence)
{
switch (confidence) {
case SpeechRecoConfidence_None:
return @"None";
case SpeechRecoConfidence_Low:
return @"Low";
case SpeechRecoConfidence_Normal:
return @"Normal";
case SpeechRecoConfidence_High:
return @"High";
}
}
/* Action for pressing the "Start" button */
-(IBAction)startButtonTapped:(id)sender
{
[textOnScreen setString:(@"")];
self.quoteText.text = textOnScreen;
[[self startButton] setEnabled:NO];
if (isMicrophoneReco) {
OSStatus status = [micClient startMicAndRecognition];
if (status) {
[textOnScreen appendFormat:(@"Error starting audio. %@\n"), ConvertSpeechErrorToString(status)];
}
}
}
/* Action for low memory */
-(void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
}
@end
I am new to iOS programming. I will appreciate any help on this. Thanks.
1. Convert your Objective-C view controller to Swift; don't import it via the bridging header.
2. Use the new converted class the same way you were using it in the Objective-C version.
3. Just import the framework header files in the bridging header.
To convert Objective-C code to Swift you can use Swiftify.
EDIT
Here is the converted code, with both files combined:
class ViewController: UIViewController, SpeechRecognitionProtocol {
//variable declaration.
var textOnScreen: NSMutableString!
var dataClient: DataRecognitionClient!
var micClient: MicrophoneRecognitionClient!
var recoMode: SpeechRecognitionMode!
var isMicrophoneReco = false
var isIntent = false
var waitSeconds = 0
//IBOutlets
@IBOutlet var startButton: UIButton!
/* In our UI, we have a text box to show the reco results.*/
@IBOutlet var quoteText: UITextView!
//IBAction
/* Action for pressing the "Start" button */
@IBAction func startButtonTapped(sender: AnyObject) {
textOnScreen.setString("")
self.quoteText.text = textOnScreen as String
self.startButton.enabled = false
if isMicrophoneReco {
let status: OSStatus = micClient.startMicAndRecognition()
if status != 0 {
textOnScreen.appendFormat("Error starting audio. %@\n", ConvertSpeechErrorToString(Int(status)))
}
}
}
/* Initialization to be done when app starts. */
override func viewDidLoad() {
super.viewDidLoad()
textOnScreen = NSMutableString(capacity: 1000)
recoMode = SpeechRecognitionMode_ShortPhrase
isMicrophoneReco = true
isIntent = false
waitSeconds = recoMode == SpeechRecognitionMode_ShortPhrase ? 20 : 200
self.initializeRecoClient()
}
/* Action for low memory */
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
}
/* Called when a partial response is received. */
func onPartialResponseReceived(response: String) {
dispatch_async(dispatch_get_main_queue(), {() -> Void in
self.textOnScreen.appendFormat("%@\n", response)
self.quoteText.text = response
})
}
/* Called when a final response is received. */
func onFinalResponseReceived(response: RecognitionResult) {
var isFinalDicationMessage: Bool = recoMode == SpeechRecognitionMode_LongDictation && (response.RecognitionStatus == RecognitionStatus_EndOfDictation || response.RecognitionStatus == RecognitionStatus_DictationEndSilenceTimeout)
if isMicrophoneReco && ((recoMode == SpeechRecognitionMode_ShortPhrase) || isFinalDicationMessage) {
micClient.endMicAndRecognition()
}
if (recoMode == SpeechRecognitionMode_ShortPhrase) || isFinalDicationMessage {
dispatch_async(dispatch_get_main_queue(), {() -> Void in
self.startButton.enabled = true
})
}
}
func ConvertSpeechErrorToString( errorCode :Int) -> String
{
switch errorCode as! SpeechClientStatus {
case SpeechClientStatus_SecurityFailed:
return "SpeechClientStatus_SecurityFailed"
case SpeechClientStatus_LoginFailed:
return "SpeechClientStatus_LoginFailed"
case SpeechClientStatus_Timeout:
return "SpeechClientStatus_Timeout"
case SpeechClientStatus_ConnectionFailed:
return "SpeechClientStatus_ConnectionFailed"
case SpeechClientStatus_NameNotFound:
return "SpeechClientStatus_NameNotFound"
case SpeechClientStatus_InvalidService:
return "SpeechClientStatus_InvalidService"
case SpeechClientStatus_InvalidProxy:
return "SpeechClientStatus_InvalidProxy"
case SpeechClientStatus_BadResponse:
return "SpeechClientStatus_BadResponse"
case SpeechClientStatus_InternalError:
return "SpeechClientStatus_InternalError"
case SpeechClientStatus_AuthenticationError:
return "SpeechClientStatus_AuthenticationError"
case SpeechClientStatus_AuthenticationExpired:
return "SpeechClientStatus_AuthenticationExpired"
case SpeechClientStatus_LimitsExceeded:
return "SpeechClientStatus_LimitsExceeded"
case SpeechClientStatus_AudioOutputFailed:
return "SpeechClientStatus_AudioOutputFailed"
case SpeechClientStatus_MicrophoneInUse:
return "SpeechClientStatus_MicrophoneInUse"
case SpeechClientStatus_MicrophoneUnavailable:
return "SpeechClientStatus_MicrophoneUnavailable"
case SpeechClientStatus_MicrophoneStatusUnknown:
return "SpeechClientStatus_MicrophoneStatusUnknown"
case SpeechClientStatus_InvalidArgument:
return "SpeechClientStatus_InvalidArgument"
}
return String(format: "Unknown error: %d\n", errorCode)
}
/* Called when an error is received. */
func onError(errorMessage: String, withErrorCode errorCode: Int) {
dispatch_async(dispatch_get_main_queue(), {() -> Void in
self.startButton.enabled = true
self.textOnScreen.appendString("********* Error Detected *********\n")
self.textOnScreen.appendFormat("%@ %@\n", errorMessage, self.ConvertSpeechErrorToString(errorCode))
self.quoteText.text = self.textOnScreen as String
})
}
/* Event fired when the microphone recording status has changed. */
func onMicrophoneStatus(recording: Bool) {
if !recording {
micClient.endMicAndRecognition()
}
dispatch_async(dispatch_get_main_queue(), {() -> Void in
if !recording {
self.startButton.enabled = true
}
self.quoteText.text = self.textOnScreen as String
})
}
func ConvertSpeechRecoConfidenceEnumToString( confidence:Confidence) -> String
{
switch confidence {
case SpeechRecoConfidence_None:
return "None"
case SpeechRecoConfidence_Low:
return "Low"
case SpeechRecoConfidence_Normal:
return "Normal"
case SpeechRecoConfidence_High:
return "High"
}
}
/* Create a recognition request to interact with the Speech Recognition Service.*/
func initializeRecoClient() {
let language = "en-us"
let path = NSBundle.mainBundle().pathForResource("settings", ofType: "plist")!
let settings = NSDictionary(contentsOfFile: path)!
let primaryOrSecondaryKey = settings["primaryKey"] as! String
let luisAppID = settings["luisAppID"] as! String
let luisSubscriptionID = settings["luisSubscriptionID"] as! String
if isMicrophoneReco {
if !isIntent {
micClient = SpeechRecognitionServiceFactory.createMicrophoneClient(recoMode, withLanguage: language, withKey: primaryOrSecondaryKey, withProtocol: self)
}
else {
let micIntentClient = SpeechRecognitionServiceFactory.createMicrophoneClientWithIntent(language, withKey: primaryOrSecondaryKey, withLUISAppID: luisAppID, withLUISSecret: luisSubscriptionID, withProtocol: self)
micClient = micIntentClient
}
}
else if !isIntent {
dataClient = SpeechRecognitionServiceFactory.createDataClient(recoMode, withLanguage: language, withKey: primaryOrSecondaryKey, withProtocol: self)
}
else {
let dataIntentClient = SpeechRecognitionServiceFactory.createDataClientWithIntent(language, withKey: primaryOrSecondaryKey, withLUISAppID: luisAppID, withLUISSecret: luisSubscriptionID, withProtocol: self)
dataClient = dataIntentClient
}
}
}
