Compiling with Swift 2.0, running iOS 9.3, working with Xcode 7.2.1 under OS X 10.11.3.
I'm trying to implement a Code 39 barcode scanner, but I can't figure out how to cast the result without crashing my code in the process, unless I use this hack.
AVFoundation returns both AVMetadataFaceObject & AVMetadataMachineReadableCodeObject objects when scanning my bar code.
If I try to cast the wrong object into the wrong type it crashes, and for the life of me I cannot find a way of figuring out which type of code it is looking at beyond this hack.
I tried a guard statement, I tried do/catch, and I tried testing the type, but nothing seems to work.
if metadataObjects == nil || metadataObjects.count == 0 {
    // print("No code is detected")
    return
} else {
    let A1 = String(metadataObjects[0])
    if A1.hasPrefix("<AVMetadataFaceObject:") {
        print("Face -> \(A1)")
    }
    if A1.hasPrefix("<AVMetadataMachineReadableCodeObject:") {
        let metadataObj = metadataObjects[0] as! AVMetadataMachineReadableCodeObject
        self.lblDataInfo.text = metadataObj.stringValue
        self.lblDataType.text = metadataObj.type
        print("Machine -> \(A1)")
    }
}
OK, this works, but I don't think it's good coding practice; I fear it's a hack.
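For reference, the usual way to avoid both the crash and the string inspection is a conditional cast. Here is a minimal sketch in Swift 2 syntax, assuming the same delegate callback and labels as above; as? yields nil instead of trapping when the type doesn't match:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
    for object in metadataObjects {
        // A conditional cast returns nil instead of crashing on a mismatch.
        if let machineReadable = object as? AVMetadataMachineReadableCodeObject {
            self.lblDataInfo.text = machineReadable.stringValue
            self.lblDataType.text = machineReadable.type
        } else if let face = object as? AVMetadataFaceObject {
            print("Face -> \(face)")
        }
    }
}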
I'm using the following code in my project to draw a fading blur effect on a view:
let customBlurClass: AnyObject.Type = NSClassFromString("_UICustomBlurEffect")!
let customBlurObject: NSObject.Type = customBlurClass as! NSObject.Type
self.blurEffect = customBlurObject.init() as! UIBlurEffect
self.blurEffect.setValue(1.0, forKeyPath: "scale")
self.blurEffect.setValue(radius, forKeyPath: "blurRadius")
super.init(effect: radius == 0 ? nil : self.blurEffect)
Sometimes I get a crash report on Fabric from the app on this line:
let customBlurClass: AnyObject.Type = NSClassFromString("_UICustomBlurEffect")!
which means that NSClassFromString returned a nil value.
I have searched a lot about this problem but found no useful answers.
Please help,
Thanks.
The most likely explanation is that those crashes occur on devices running iOS 8 or earlier. _UICustomBlurEffect was introduced in iOS 9.
You should do:
if let blurClass = NSClassFromString("_UICustomBlurEffect") {
// set up blur view
}
to avoid crashes on devices where it's not supported.
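Expanding that check into the original initializer, here is a minimal sketch (assuming, as the snippet suggests, that this code lives in a UIVisualEffectView subclass initializer; the .Light fallback style is an assumption):

if let customBlurClass = NSClassFromString("_UICustomBlurEffect") as? NSObject.Type {
    // The private class exists (iOS 9+): configure the custom radius.
    let blurEffect = customBlurClass.init() as! UIBlurEffect
    blurEffect.setValue(1.0, forKeyPath: "scale")
    blurEffect.setValue(radius, forKeyPath: "blurRadius")
    super.init(effect: radius == 0 ? nil : blurEffect)
} else {
    // iOS 8 and earlier: fall back to a stock blur with no custom radius.
    super.init(effect: UIBlurEffect(style: .Light))
}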
I want to detect some triangle patterns from camera input on an iPhone. I found some example code that detects QR/barcodes using AVFoundation. The main part seems to be the AVMetadataMachineReadableCodeObject class. Here is some sample code from AppCoda:
func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
    // Check if the metadataObjects array is not nil and it contains at least one object.
    if metadataObjects == nil || metadataObjects.count == 0 {
        qrCodeFrameView?.frame = CGRectZero
        messageLabel.text = "No barcode/QR code is detected"
        return
    }

    // Get the metadata object.
    let metadataObj = metadataObjects[0] as! AVMetadataMachineReadableCodeObject

    // Here we check if the type of metadataObj is supported.
    // Instead of hardcoding AVMetadataObjectTypeQRCode, we check if the type
    // can be found in the array of supported bar codes.
    if supportedBarCodes.contains(metadataObj.type) {
        // if metadataObj.type == AVMetadataObjectTypeQRCode {
        // If the found metadata is equal to the QR code metadata, update the status label's text and set the bounds.
        let barCodeObject = videoPreviewLayer?.transformedMetadataObjectForMetadataObject(metadataObj)
        qrCodeFrameView?.frame = barCodeObject!.bounds
        if metadataObj.stringValue != nil {
            messageLabel.text = metadataObj.stringValue
        }
    }
}
In the above code, once a QR code is detected, a boundary box is drawn and the text field is updated. Similarly, the AVMetadataFaceObject class is used in face-detection applications. I saw from the reference that both classes are subclasses of AVMetadataObject.
I'm wondering if it is possible to build a custom triangle detector by writing a subclass of AVMetadataObject, say AVMetadataTriangleObject. (I have a readily available detection algorithm written in Matlab; transcribing it into Swift shouldn't be tough.) If this approach is not possible, can anyone suggest an alternative way of achieving the above goal?
Thank you so much!
AVMetadataObject provides no hooks for implementing such a thing, and apart from AVAssetResourceLoader, AVFoundation does not allow much extension.
I think you should transcribe your algorithm to Swift and run it against the captured CMSampleBuffer images that you get when capturing with an AVCaptureVideoDataOutput.
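A minimal sketch of that capture setup, in Swift 2 syntax; TriangleFrameProcessor and detectTriangles are hypothetical names standing in for the transcribed Matlab algorithm:

import AVFoundation

class TriangleFrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func attachTo(session: AVCaptureSession) {
        let output = AVCaptureVideoDataOutput()
        // Deliver frames on a serial background queue.
        output.setSampleBufferDelegate(self, queue: dispatch_queue_create("video-frames", DISPATCH_QUEUE_SERIAL))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Lock the buffer while reading its raw pixel data.
        CVPixelBufferLockBaseAddress(pixelBuffer, 0)
        detectTriangles(pixelBuffer)
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
    }

    func detectTriangles(buffer: CVPixelBuffer) {
        // Placeholder for the ported detection algorithm.
    }
}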
This is the part of the code where the error happens:
class func randomWord() -> TBWord {
    let randomIndex = Int(arc4random_uniform(UInt32(TBAppSettings.wordsForCurrentGame.count)))
    let word = TBAppSettings.wordsForCurrentGame[randomIndex]
    TBAppSettings.wordsForCurrentGame.removeAtIndex(randomIndex)
    MagicalRecord.saveWithBlock { context in
        let word = TBWord.findWordWithIdentifier(word.identifier, inContext: context) // the error happens here
        word?.used = true
    }
    return word
}
How can I work around this? I know about other questions on this problem, but they are not enough for me.
(Besides the fact that MagicalRecord is a big misunderstanding of how to use Core Data properly...)
Have you tried to run your code with -com.apple.CoreData.ConcurrencyDebug 1 as a launch argument? This smells like a threading problem.
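If it is a threading problem, one plausible culprit is reading word.identifier (a managed object fetched on the calling queue) inside the background save block. A minimal sketch of that guess, capturing the identifier before entering the block:

// Capture the identifier on the calling queue so that no managed
// object crosses a queue boundary inside the save block.
let identifier = word.identifier
MagicalRecord.saveWithBlock { context in
    let localWord = TBWord.findWordWithIdentifier(identifier, inContext: context)
    localWord?.used = true
}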
Is there a better way in Swift to check if a specific flag is set on application.currentUserNotificationSettings().types?
For example, how would you check whether the application is allowed to update its badge?
Below is the method I'm currently using, but I thought there might be a better way in Swift, like some operator I don't know about.
func printUserNotificationSettings() {
    println("Notification Settings:")
    let notificationSettingsTypes = UIApplication.sharedApplication().currentUserNotificationSettings().types
    let badgeOn: Bool = (notificationSettingsTypes & UIUserNotificationType.Badge) == UIUserNotificationType.Badge
    let soundOn: Bool = (notificationSettingsTypes & UIUserNotificationType.Sound) == UIUserNotificationType.Sound
    let alertOn: Bool = (notificationSettingsTypes & UIUserNotificationType.Alert) == UIUserNotificationType.Alert
    println("\tBadge? \(badgeOn)")
    println("\tSound? \(soundOn)")
    println("\tAlert? \(alertOn)")
}
Looks like the only thing you can do to improve the code is make it more concise.
func printUserNotificationSettings() {
    println("Notification Settings:")
    let notificationSettingsTypes = UIApplication.sharedApplication().currentUserNotificationSettings().types
    let badgeOn = (notificationSettingsTypes & .Badge) != nil
    let soundOn = (notificationSettingsTypes & .Sound) != nil
    let alertOn = (notificationSettingsTypes & .Alert) != nil
    println("\tBadge? \(badgeOn)")
    println("\tSound? \(soundOn)")
    println("\tAlert? \(alertOn)")
}
UIUserNotificationType implements RawOptionSetType, which is the Swift mapping of NS_OPTIONS from Objective-C code. In earlier betas of Xcode, these types also implemented BooleanType, which would have let you write this code a little more concisely, but that seems to have been removed before release.
Also, from searching around, the most common way to do the check is != nil, so I have included that modification as well; it seems to improve readability a bit.
Here is a pretty robust StackOverflow post on the topic: Switch statement for imported NS_OPTIONS (RawOptionSetType) in Swift?
And another great article on the background of RawOptionSetType: http://nshipster.com/rawoptionsettype/
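For readers on Swift 2: UIUserNotificationType became an OptionSetType there, which removes the bitwise dance entirely. A brief sketch (note that currentUserNotificationSettings() returns an optional in that version):

// OptionSetType gives a contains(_:) membership test.
let types = UIApplication.sharedApplication().currentUserNotificationSettings()?.types ?? []
let badgeOn = types.contains(.Badge)
let soundOn = types.contains(.Sound)
let alertOn = types.contains(.Alert)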
I don't understand why this code doesn't work: the detector is always nil with the CIDetectorTypeQRCode constant, while everything works with CIDetectorTypeFace.
I suspect a bug in Apple's API. This is the official doc: Apple documentation
@IBAction func analyseTag(sender: AnyObject) {
    var detector: CIDetector = CIDetector(ofType: CIDetectorTypeQRCode, context: nil, options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    var decode = ""
    var ciImage: CIImage = CIImage(image: ImgChoosed.image)
    var message: String = ""
    let features = detector.featuresInImage(ciImage)
    for feature in features as [CIQRCodeFeature] {
        message += feature.messageString
    }
    if message == "" {
        println("nothing")
    } else {
        println("\(message)")
    }
}
Do you have a solution?
Thanks in advance, guys.
The code you provided can't have a nil detector because it's not an optional and the compiler would complain about several places in your code if it was.
If features is empty then you know it didn't find a QR code in your image. Try providing a better image or turning down the CIDetectorAccuracy.
If features isn't empty then your cast is failing.
Edit:
You can't pass a nil context in the constructor.
This happened to us as well. The iPhone 4s doesn't return a CIDetector of type QRCode; the other types (rectangle, face) work, though…
The same code works as expected on the iPhone 6. I haven't tested on a 5 or 5s yet.
But two weeks ago it was still working on the 4s, I believe. It was still on iOS 8 back then, I guess.
Make sure your ImgChoosed.image is not nil.
Try another input image for testing.
Try for feature in features as! [CIQRCodeFeature].
I found that running on a device resolved this issue. The simulator seemed to always return nil for me.
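Pulling these suggestions together, here is a minimal sketch in Swift 2 syntax that handles the optionals explicitly, so a missing image or a failed cast can't crash:

@IBAction func analyseTag(sender: AnyObject) {
    // Bail out early if there is no input image; CIImage(image:) is failable.
    guard let uiImage = ImgChoosed.image, ciImage = CIImage(image: uiImage) else {
        print("no input image")
        return
    }
    let detector = CIDetector(ofType: CIDetectorTypeQRCode, context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    var message = ""
    // "for case ... as" matches only the elements that really are CIQRCodeFeature.
    for case let feature as CIQRCodeFeature in detector.featuresInImage(ciImage) {
        message += feature.messageString
    }
    print(message.isEmpty ? "nothing" : message)
}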