I was able to display a subtitle track with AVPlayer on iOS 6, but I am not able to customize it. It just shows the same style (a small font size, in white).
Here's how I select the subtitle:
AVMediaSelectionGroup *subtitle = [asset mediaSelectionGroupForMediaCharacteristic: AVMediaCharacteristicLegible];
[self.videoPlayer.currentItem selectMediaOption:subtitle.options[0] inMediaSelectionGroup: subtitle];
And how I'm trying to customize the subtitle:
AVTextStyleRule *rule = [[AVTextStyleRule alloc] initWithTextMarkupAttributes:@{
    (id)kCMTextMarkupAttribute_ForegroundColorARGB : @[ @1, @1, @0, @0 ],
    (id)kCMTextMarkupAttribute_ItalicStyle : @(YES)}];
self.videoPlayer.currentItem.textStyleRules = @[rule];
Whether I put this snippet before or after selecting the subtitle, the result is the same.
The AVPlayer is created with a local file URL (an .mp4 file).
Any thoughts on how to do this?
I asked this question on Apple Developer Forums and I got an answer from an Apple employee:
The textStyleRules property only applies to WebVTT content. Your local file is probably carrying subtitles in TX3G format.
You're right that the documentation doesn't mention this limitation, so you should file a bug so we can get our documentation updated.
So, I'll open a radar to ask that they update the docs and I'll post its number here if someone wants to dupe it.
EDIT:
I created rdar://14923673 to ask Apple to update the docs about this current limitation. I also created rdar://14923755 to ask them to provide support to subtitles in TX3G format.
Please dupe them if you're affected by this issue.
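If you want to confirm which format your file's subtitles actually use before deciding whether textStyleRules can help, you can inspect the format descriptions of the asset's legible tracks. Below is a minimal Swift sketch (not from the original thread; the function name is just illustrative). TX3G tracks report the four-character code 'tx3g', while WebVTT tracks report 'wvtt':
import AVFoundation
import CoreMedia

// Minimal sketch: print the media subtype of each legible track so you can
// tell whether the file carries TX3G ('tx3g') or WebVTT ('wvtt') subtitles.
func logSubtitleFormats(of asset: AVAsset) {
    for track in asset.tracks(withMediaCharacteristic: .legible) {
        for case let description as CMFormatDescription in track.formatDescriptions {
            let subType = CMFormatDescriptionGetMediaSubType(description)
            // Convert the FourCharCode into a readable string such as "tx3g" or "wvtt".
            let bytes: [UInt8] = [
                UInt8((subType >> 24) & 0xFF),
                UInt8((subType >> 16) & 0xFF),
                UInt8((subType >> 8) & 0xFF),
                UInt8(subType & 0xFF)
            ]
            print("Track \(track.trackID): \(String(bytes: bytes, encoding: .ascii) ?? "????")")
        }
    }
}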
I found a workaround to modify the text foreground and background colors correctly:
just split the styles into multiple AVTextStyleRule objects.
func initSubtitleStyle() {
    // Background color rule
    let backgroundStyle = AVTextStyleRule(textMarkupAttributes: [
        kCMTextMarkupAttribute_CharacterBackgroundColorARGB as String: [0.2, 0.3, 0.0, 0.3]
    ])!

    // Foreground color rule
    let foregroundStyle = AVTextStyleRule(textMarkupAttributes: [
        kCMTextMarkupAttribute_ForegroundColorARGB as String: [0.2, 0.8, 0.4, 0.0]
    ])!

    // Font size and edge style rule
    let fontStyle = AVTextStyleRule(textMarkupAttributes: [
        kCMTextMarkupAttribute_BaseFontSizePercentageRelativeToVideoHeight as String: 20,
        kCMTextMarkupAttribute_CharacterEdgeStyle as String: kCMTextMarkupCharacterEdgeStyle_None
    ])!

    player.currentItem?.textStyleRules = [backgroundStyle, foregroundStyle, fontStyle]
}
Please do not ask me why; that solution came from trial and error. XD
Related
I'm trying to share a story with a background image and a sticker image via URL scheme in my iOS app. I am using the attached code and it does not work.
When I try to share just a background image or just a sticker, it works. But when I try to share both a background image and a sticker on top of it, it does not work.
Any ideas?
func shareToInstagram(deepLinkString: String) {
    let url = URL(string: "instagram-stories://share")!
    if UIApplication.shared.canOpenURL(url) {
        let backgroundData = UIImageJPEGRepresentation(UIImage(named: "shop_placeholder")!, 1.0)!
        let creditCardImage = UIImage(named: "share_instagram")!
        let stickerData = UIImagePNGRepresentation(creditCardImage)!
        let pasteBoardItems = [
            ["com.instagram.sharedSticker.backgroundImage": backgroundData],
            ["com.instagram.sharedSticker.stickerImage": stickerData],
        ]
        if #available(iOS 10.0, *) {
            UIPasteboard.general.setItems(pasteBoardItems, options: [.expirationDate: Date().addingTimeInterval(60 * 5)])
        } else {
            UIPasteboard.general.items = pasteBoardItems
        }
        UIApplication.shared.openURL(url)
    }
}
I copy-pasted the OP's code for use in my own app (only substituting different UIImages) and found just one issue: the pasteboard items should be contained in a single dictionary within the array, otherwise Instagram will render only the first item (in this case the background layer). To fix this, replace the declaration of pasteBoardItems with the following code:
let pasteBoardItems = [
    ["com.instagram.sharedSticker.backgroundImage": backgroundData,
     "com.instagram.sharedSticker.stickerImage": stickerData]
]
(Basically, just remove the closing and opening brackets separating the two items.)
Also, as a previous answer stated, make sure "instagram-stories" is included in LSApplicationQueriesSchemes in the Info.plist file.
I use this exact code in my app and it now works perfectly.
Everything is correct; my code is similar and it works on iOS 11+. I suggest the following:
Check the image data you pass to the pasteboard (a JPEG can't be converted with UIImagePNGRepresentation, and vice versa).
Check the Info.plist: you should enable the "instagram-stories" scheme in it (under the LSApplicationQueriesSchemes key).
1. Like Alec said, you need to put all of the Instagram data in a single pasteboard item (one dictionary), not multiple items. Look at the example from the Meta documentation:
NSArray *pasteboardItems = @[@{@"com.instagram.sharedSticker.stickerImage" : stickerImage,
                               @"com.instagram.sharedSticker.backgroundTopColor" : backgroundTopColor,
                               @"com.instagram.sharedSticker.backgroundBottomColor" : backgroundBottomColor}];
2. For more recent readers: as of Swift 4.2 and iOS 12, UIImageJPEGRepresentation is replaced by jpegData. Change
let backgroundData = UIImageJPEGRepresentation(yourImage, 1.0)
with
let backgroundData = yourImage.jpegData(compressionQuality: 1.0)
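Putting the thread's two fixes together (a single pasteboard item dictionary holding both keys, plus the modern jpegData/pngData APIs), a minimal sketch could look like the following; the asset names and the 5-minute expiration are carried over from the question, and the function name is just illustrative:
import UIKit

// Sketch only: one pasteboard item dictionary with both keys, modern image APIs.
// "shop_placeholder" and "share_instagram" are the asset names from the question.
func shareStoryToInstagram() {
    guard let url = URL(string: "instagram-stories://share"),
          UIApplication.shared.canOpenURL(url), // needs "instagram-stories" in LSApplicationQueriesSchemes
          let backgroundData = UIImage(named: "shop_placeholder")?.jpegData(compressionQuality: 1.0),
          let stickerData = UIImage(named: "share_instagram")?.pngData()
    else { return }

    // Both keys go into a single dictionary, otherwise Instagram renders only the first item.
    let pasteboardItems: [[String: Any]] = [[
        "com.instagram.sharedSticker.backgroundImage": backgroundData,
        "com.instagram.sharedSticker.stickerImage": stickerData
    ]]

    UIPasteboard.general.setItems(pasteboardItems,
                                  options: [.expirationDate: Date().addingTimeInterval(60 * 5)])
    UIApplication.shared.open(url)
}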
I am writing a test framework for an iOS app that requires importing an image from the Photos/gallery app for validation. I am using the XCTest framework for testing. I have looked over the Internet for resources but couldn't find any. Can anyone help me with how to approach the problem? Again, I have to pick the image not from inside the app but from the Photos library.
Yes.
You can access the photo library, but it requires XCUITest, and the recorder doesn't work inside Apple's UIRemoteViews such as the photo picker. What you have to do is get to the photo picker inside an XCUITest, set a breakpoint, then inspect the view hierarchy to find the elements that can be navigated with XCUITest. The example below works with the pictures that come with the simulator.
let app = XCUIApplication()
// get to the photo library
// set a breakpoint, po [[UIWindow keyWindow] recursiveDescription]
let tablesQuery = app.tables
app.sheets.buttons["Choose From Library"].tap()
app.cells["Camera Roll"].tap()
app.cells["Photo, Landscape, March 12, 2011, 7:17 PM"].tap()
let photosApp = XCUIApplication(bundleIdentifier: "com.apple.mobileslideshow")
photosApp.launch()
let continueButton = photosApp.buttons["Continue"]
if continueButton.waitForExistence(timeout: 2) {
continueButton.tap()
}
photosApp.collectionViews["PhotosGridView"].cells.firstMatch.tap()
This works reliably for me, derived from Steve Moser's helpful answer:
/*
LIKE: The left hand expression equals the right-hand expression: ? and * are allowed as wildcard characters, where ? matches 1 character and * matches 0 or more characters.
Resource: https://nshipster.com/nspredicate/
*/
let yellowLeafGreenBackground = NSPredicate(format: "label LIKE 'Photo, October 09, 2009, *:09*'")
let waterfallCloseUpOverRocks = NSPredicate(format: "label LIKE 'Photo, August 08, 2012, *:29*'")
let treeWithWaterfallBackground = NSPredicate(format: "label LIKE 'Photo, August 08, 2012, *:52*'")
let yellowFlowerSucculentBeach = NSPredicate(format: "label LIKE 'Photo, March 12, 2011, *:17*'")
let amazingWaterfall = NSPredicate(format: "label LIKE 'Photo, August 08, 2012, *:55*'")
// take note with this one, it is an HDR image
let beautyFlowers = NSPredicate(format: "label LIKE 'Photo, March 30, 2018, *:14*'")
func testTappingOnSpecificImage() {
    // ... test setup and navigation to get to the presented Camera Roll (PHPicker or UIImagePickerController)

    // trigger presentation of the camera roll
    app.buttons["Choose from Photos"].tap()
    XCTAssertTrue(app.buttons["Cancel"].waitForExistence(timeout: 3))

    let activeQuery = treeWithWaterfallBackground
    app.scrollViews.otherElements.images.matching(activeQuery).element.tap()
}
You can see the dates for the individual photos by going into the Photos app on the simulator and tapping into the detail view for a specific image. Like Mikkel Selsøe pointed out, the timestamps are localized for the current time zone.
Available in a gist here: https://gist.github.com/bitops/9182b5ee96c682aba57d9fa16ca6b987
XCTest provides a special method just for this case.
let galleryAccessMonitor = addUIInterruptionMonitor(withDescription: "Intercept Gallery Access") { alert -> Bool in
    alert.buttons.element(boundBy: 1).tap() // tap Accept
    return true // mark as handled
}
see https://useyourloaf.com/blog/handling-system-alerts-in-ui-tests/
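One caveat: the interruption handler is only invoked when the test's next UI interaction is blocked by the alert, so after registering the monitor you typically have to poke the app once, e.g.:
// The handler above runs only when an interaction is blocked by the alert,
// so trigger any interaction after the permission prompt is expected to appear.
let app = XCUIApplication()
app.tap()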
Xcode 14.1 with iOS 16.1 simulator solution:
let app = XCUIApplication()
app.scrollViews.otherElements.images.containing(NSPredicate(format: "label BEGINSWITH 'Photo'")).element(boundBy: 0).tap()
Here's my solution, which doesn't depend on the specific photos that have been added to the library:
let app = XCUIApplication()
app.launch()
app.buttons["add.photo.button"].tap()
let photosNavBar = app.navigationBars["Photos"]
if photosNavBar.waitForExistence(timeout: 2) {
    XCTAssert(app.navigationBars["Photos"].exists)
} else {
    XCTFail()
}
If you do not care which image you choose, then this worked for me:
let image = app.scrollViews.images.matching(NSPredicate(format: "label LIKE '*2012*'")).firstMatch
if image.waitForExistence(timeout: 5) {
    image.tap()
}
Note that similar answers that use the full name of one of the default photos tend to fail in many cases because, for example, the device/simulator language can be different from English, in which case the image names do not start with the word "Photo"; also, according to other answers, the time in those names can differ, which is why I used only the year in the code above.
You can't interact with an app outside your own app using XCTest. The tests have a reference to your app's bundle identifier, and that is the only app they are able to interact with.
XCTest requires a certain amount of access to your app's internals in order to report information about it and let you interact with it, and that access is not available for apps you did not make yourself.
I want to create an app that receives voice input using the iOS speech API.
In Google's API there is an option called speechContext with which I can provide hints or a bias towards some uncommon words.
Does the iOS API provide this feature? I've been searching the site for a while but didn't find anything.
There is no sample code online about implementing hints for Google Cloud Speech in Swift, so I made it up!
Open this class: SpeechRecognitionService.swift
You have to add your hint list array to the SpeechContext, add the SpeechContext to the RecognitionConfig, and finally add the RecognitionConfig to the streaming recognition config, like this:
let recognitionConfig = RecognitionConfig()
recognitionConfig.encoding = .linear16
recognitionConfig.sampleRateHertz = Int32(sampleRate)
recognitionConfig.languageCode = "en-US"
recognitionConfig.maxAlternatives = 3
recognitionConfig.enableWordTimeOffsets = true

let streamingRecognitionConfig = StreamingRecognitionConfig()
streamingRecognitionConfig.singleUtterance = true
streamingRecognitionConfig.interimResults = true

// Custom vocabulary (hints) code
let phraseArray = NSMutableArray(array: ["my donkey is yayeerobee", "my horse is tekkadan", "bet four for kalamazoo"])
let mySpeechContext = SpeechContext()
mySpeechContext.phrasesArray = phraseArray
recognitionConfig.speechContextsArray = NSMutableArray(array: [mySpeechContext])
streamingRecognitionConfig.config = recognitionConfig
// End custom vocabulary (hints) code

let streamingRecognizeRequest = StreamingRecognizeRequest()
streamingRecognizeRequest.streamingConfig = streamingRecognitionConfig
Bonus: adding your custom words inside a simple phrase, instead of adding each word alone, gave me better results.
I'm building a custom keyboard extension and want to implement autocompletion, like Apple does.
As far as I can see, the method completionsForPartialWordRange returns a list of words sorted alphabetically. How can I get results sorted by usage?
The docs for completionsForPartialWordRange:inString:language: say:
The strings in the array are in the order they should be presented to the user—that is, more probable completions come first in the array.
However, the results are very clearly sorted in alphabetical order, and it's not true that "more probable completions come first in the array." The below was tested with iOS 9:
NSString *partialWord = @"th";
UITextChecker *textChecker = [[UITextChecker alloc] init];
NSArray *completions = [textChecker completionsForPartialWordRange:NSMakeRange(0, partialWord.length) inString:partialWord language:@"en"];
iOS word completions for "th":
thalami,
thalamic,
thalamus,
thalassic,
thalidomide,
thallium,
...
the,
...
So, the results will need to be sorted again after obtaining the word completions.
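Since UITextChecker exposes no usage data of its own, one workaround is to re-rank the completions against your own word-frequency table (for example, counts loaded from a bundled corpus or gathered from the user's typing). Here is a rough Swift sketch; wordFrequencies and its values are purely hypothetical:
import UIKit

// Hypothetical frequency table: higher value = more common word.
// In a real keyboard you might load this from a bundled corpus file.
let wordFrequencies: [String: Int] = ["the": 100, "this": 60, "that": 55]

func rankedCompletions(for partialWord: String, language: String = "en") -> [String] {
    let checker = UITextChecker()
    let range = NSRange(location: 0, length: partialWord.utf16.count)
    let completions = checker.completions(forPartialWordRange: range,
                                          in: partialWord,
                                          language: language) ?? []
    // Sort by descending frequency; words missing from the table score 0.
    return completions.sorted { (wordFrequencies[$0] ?? 0) > (wordFrequencies[$1] ?? 0) }
}
Called with "th", this would float "the", "this", and "that" ahead of alphabetical entries like "thalami", assuming those words are present in your frequency table.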
The OS X NSSpellChecker version of this method does not have the same problem:
NSString *partialWord = @"th";
NSArray *completions = [[NSSpellChecker sharedSpellChecker] completionsForPartialWordRange:NSMakeRange(0, partialWord.length) inString:partialWord language:@"en" inSpellDocumentWithTag:0];
Its documentation describes the return value as a "List of complete words from the spell checker dictionary in the order they should be presented to the user."
Mac OS X word completions for "th":
the,
this,
that,
they,
thanks,
there,
that's,
...
Filing a radar bug report would be a good idea, so that the behavior will hopefully be fixed in a later version of iOS. I've reported this as rdar://24226582 if you'd like to duplicate.
Swift 4.0
func autoSuggest(_ word: String) -> [String]? {
    let textChecker = UITextChecker()
    let availableLanguages = UITextChecker.availableLanguages
    let preferredLanguage = availableLanguages.first ?? "en-US"
    // NSRange counts UTF-16 code units, so use utf16.count (utf8.count breaks for non-ASCII input)
    let completions = textChecker.completions(forPartialWordRange: NSRange(location: 0, length: word.utf16.count),
                                              in: word,
                                              language: preferredLanguage)
    return completions
}
The text that caused the crash is the following:
The error occurred at the following line:
let size = CGSize(width: 250, height: DBL_MAX)
let font = UIFont.systemFontOfSize(16.0)
let attributes = [
    NSFontAttributeName: font,
    NSParagraphStyleAttributeName: paraStyle
]
var rect = text.boundingRectWithSize(size, options:.UsesLineFragmentOrigin, attributes: attributes, context: nil)
where the text variable contains the input string.
paraStyle is declared as follows:
let paraStyle = NSMutableParagraphStyle()
paraStyle.lineBreakMode = NSLineBreakMode.ByWordWrapping
My initial idea is that the system font can't handle these characters and I need to use an NSCharacterSet, but I'm not sure how to either just ban the characters that would crash my app or, ideally, make it so I can handle this input. I don't want to ban emojis/emoticons either.
Thanks!
Not an answer, but some information that possibly provides a way to avoid it in code.
Updated with information from The Register:
The problem isn’t with the Arabic characters themselves, but in how the unicode representing them is processed by CoreText, which is a library of software routines to help apps display text on screens.
The bug causes CoreText to access memory that is invalid, which forces the operating system to kill off the currently running program: which could be your text message app, your terminal, or in the case of the notification screen, a core part of the OS.
From Reddit but this may not be completely correct:
It only works when the message has to be abbreviated with ‘…’. This is usually on the lock screen and main menu of Messages.app.
The words effective and power can be anything as long as they’re on two different lines, which forces the Arabic text farther down the message where some of the letters will be replaced with ‘…’
The crash happens when the first dot replaces part of one of the Arabic characters (they require more than one byte to store). Normally there are safety checks to make sure half characters aren't stored, but this replacement bypasses those checks for whatever reason.
My solution is the following category:
static NSString *const CRASH_STRING = @"\u0963h \u0963 \u0963";

@implementation NSString (CONEffectivePower)

- (BOOL)isDangerousStringForCurrentOS
{
    if (IS_IOS_7_OR_LESS || IS_IOS_8_4_OR_HIGHER) {
        return NO;
    }
    return [self containsEffectivePowerText];
}

- (BOOL)containsEffectivePowerText
{
    return [self containsString:CRASH_STRING];
}

@end
Filter all characters to have the same directionality. Unfortunately, I'm only aware of such an API in Java.
Don't even try. This is a bug in the operating system that will be fixed. It's not your problem. If you try to fix it, you are just wasting your time. And you are very likely to introduce bugs - when you say you "sanitise" input that means you cannot handle some perfectly fine input.
The company I work at develops a multiplatform group video chat.
In Crashlytics reports we started noticing that some users are "effectively" trolling iOS users with this famous Unicode sequence.
We can't just sit and wait for Apple to fix this bug.
So I've worked on this problem; this is the shortest crashing sequence I got:
// unichar representation
unichar crashChars[8] = {1585, 1611, 32, 2403, 32, 2403, 32, 2403};
// string representation
NSString *crashString = @"\u0631\u064b \u0963 \u0963 \u0963";
So, I decided to filter out all text messages that contain two U+0963 'ॣ' symbols with one symbol between them (I hope you are able to decipher this phrase).
Here is my code, from an NSString+Extensions category:
static const unichar kDangerousSymbol = 2403;

- (BOOL)isDangerousUnicode {
    NSUInteger distance = 0;
    NSUInteger charactersFound = 0;
    for (NSUInteger i = 0; i < self.length; i++) {
        unichar character = [self characterAtIndex:i];
        if (charactersFound) {
            distance++;
        }
        if (distance > 2) {
            charactersFound = 0;
        }
        if (kDangerousSymbol == character) {
            charactersFound++;
        }
        if (charactersFound > 1 && distance > 0) {
            return YES;
        }
    }
    return NO;
}
A lousy Specta test:
SpecBegin(NSStringExtensions)

describe(@"NSString+Extensions", ^{
    //....
    it(@"should detect dangerous Unicode sequences", ^{
        expect([@"\u0963 \u0963" isDangerousUnicode]).to.beTruthy();
        expect([@"\u0631\u064b \u0963 \u0963 \u0963" isDangerousUnicode]).to.beTruthy();
        // a single U+0963 on its own should not trigger the check
        expect([@"\u0631\u064b \u0963" isDangerousUnicode]).to.beFalsy();
    });
    //....
});

SpecEnd
I'm not sure if it's OK to "discriminate" against messages with too many occurrences of "devanagari vowel sign vocalic ll".
I'm open to corrections, suggestions, and criticism :).
I would love to see a better solution to this problem.