My purpose: draw the outline of every glyph.
Example 1:
input: text = "666棒"
display:
Attach: In the figure above, 1 is the displayView, 2 is the inputView.
Example 2:
input: text = "666😁棒"
display:
Attach: In the figure above, 1 is the displayView, 2 is the inputView, and 3 is where nothing is rendered.
The main idea is:
Use Core Text to obtain every CGGlyph.
Get every glyph's CGPath.
Use a CAShapeLayer to display the glyphs on screen.
Main method:
let letters = CGMutablePath()
let font = CTFontCreateWithName(fontName as CFString, fontSize, nil)
let attrString = NSAttributedString(string: text,
                                    attributes: [NSAttributedString.Key(kCTFontAttributeName as String): font])
let line = CTLineCreateWithAttributedString(attrString)
let runArray = CTLineGetGlyphRuns(line)

for runIndex in 0..<CFArrayGetCount(runArray) {
    let run = unsafeBitCast(CFArrayGetValueAtIndex(runArray, runIndex), to: CTRun.self)
    let attributes = CTRunGetAttributes(run) as NSDictionary
    let runFont = attributes[kCTFontAttributeName as String] as! CTFont

    for runGlyphIndex in 0..<CTRunGetGlyphCount(run) {
        let glyphRange = CFRangeMake(runGlyphIndex, 1)
        var glyph = CGGlyph()
        var position = CGPoint.zero
        CTRunGetGlyphs(run, glyphRange, &glyph)
        CTRunGetPositions(run, glyphRange, &position)

        // CTFontCreatePathForGlyph returns nil for bitmap glyphs such as emoji.
        if let letter = CTFontCreatePathForGlyph(runFont, glyph, nil) {
            let t = CGAffineTransform(translationX: position.x, y: position.y)
            letters.addPath(letter, transform: t)
        }
    }
}

let path = UIBezierPath(cgPath: letters)
let pathLayer = CAShapeLayer()
pathLayer.path = path.cgPath
self.layer.addSublayer(pathLayer)
...
Question:
How can I get an emoji's path, so that I can draw the emoji's outline instead of drawing the whole emoji? Another benefit is that I could animate the emoji path if needed.
Any help is appreciated!
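(For reference, stroking such an outline is a small CAShapeLayer exercise; a minimal sketch, assuming the pathLayer built above:)

// Stroke the glyph outlines progressively; assumes `pathLayer` from the code above.
pathLayer.fillColor = nil
pathLayer.strokeColor = UIColor.black.cgColor
pathLayer.lineWidth = 1
let stroke = CABasicAnimation(keyPath: "strokeEnd")
stroke.fromValue = 0
stroke.toValue = 1
stroke.duration = 2
pathLayer.add(stroke, forKey: "strokeEnd")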
************************ update 2.15.2017 ***********************
Thanks to KrishnaCA's suggestion.
I used bool supports = CTFontGetGlyphWithName(myFont, "😀") and found that none of the built-in fonts support emoji outlines.
Fortunately, Google's Noto project provides good emoji fonts.
You can find them here: Google's Noto.
I used the font Noto Emoji.
Display:
Only Noto Emoji and Noto Color Emoji seem to support emoji.
I hope this helps people who come here!
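A minimal sketch of plugging a bundled Noto Emoji file into the pipeline above (the file name "NotoEmoji-Regular" and font name "Noto Emoji" are assumptions; verify them against the file you actually download):

// Register a bundled copy of Noto Emoji, then use it as the CTFont
// for the outline-drawing code above.
if let url = Bundle.main.url(forResource: "NotoEmoji-Regular", withExtension: "ttf") {
    CTFontManagerRegisterFontsForURL(url as CFURL, .process, nil)
}
let emojiFont = CTFontCreateWithName("Noto Emoji" as CFString, fontSize, nil)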
I believe you need to check whether the CTFont has a glyph for the Unicode code point. If it doesn't, fall back to any default CTFont that has a glyph for it.
You can check that using the following code:
bool supports = CTFontGetGlyphWithName(myFont, "😀")
Here, myFont is a CTFontRef object.
Please let me know if this isn't what you're looking for.
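In Swift, one hedged way to express the fallback idea is CTFontCreateForString, which returns a font from the cascade list that can actually render the given string:

// Ask Core Text for a font, starting from `baseFont`, able to render "😀".
let baseFont = CTFontCreateWithName("Helvetica" as CFString, 12, nil)
let string = "😀" as CFString
let renderFont = CTFontCreateForString(baseFont, string, CFRangeMake(0, CFStringGetLength(string)))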
I believe you'll need CATextLayers to help you out.
I know it's a bit late, but sadly you cannot: emojis are actually bitmaps, drawn into the same context as the shapes representing regular characters. Your best bet is probably to draw the emoji characters separately, at the needed scale, into the context (see the sketch after this list). This won't give you access to the actual vector data.
If you really need it in vector form:
I'd go with finding the Apple emoji font redrawn in vector form (I remember seeing it on the internet, though I'm not sure it contains all the latest emojis)
Mapping the names of the individual vector images you found to the characters, and then drawing the vector images
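A minimal sketch of that bitmap route, rasterizing an emoji at a chosen point size (the function name is illustrative):

import UIKit

// Render an emoji into a bitmap at the given point size; its vector
// data is not accessible, so a raster image is the practical fallback.
func emojiImage(_ emoji: String, pointSize: CGFloat) -> UIImage? {
    let attributes: [NSAttributedString.Key: Any] = [.font: UIFont.systemFont(ofSize: pointSize)]
    let size = (emoji as NSString).size(withAttributes: attributes)
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    defer { UIGraphicsEndImageContext() }
    (emoji as NSString).draw(at: .zero, withAttributes: attributes)
    return UIGraphicsGetImageFromCurrentImageContext()
}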
Related
I would like to display math terms inside a text, in particular in an inline mode, i.e. inside a sentence.
Using LaTeX, this would for example look like:
"Given a right triangle having catheti of length \(a\) resp. \(b\) and a hypotenuse of length \(c\), we have
\[a^2 + b^2 = c^2.\]
This fact is known as the Pythagorean theorem."
Does anybody know how this can be achieved in Swift?
(I know that this example may be achieved in Swift without LaTeX-like tools. However, the expressions in my mind are in fact more complex than in this example, I do need the power of LaTeX.)
The optimal way would be a UITextView-like class which recognizes the math delimiters \(,\) resp. \[,\], recognizes LaTeX code inside these delimiters, and formats the text accordingly.
In the Khan Academy app, this problem seems to be solved as the screenshots in the Apple App Store/Google Play Store show inline (LaTeX) math.
I've found the package iosMath, which provides a UILabel-like class MTMathUILabel. As this class can display only formulas, it does not seem good enough for my purpose, unless there were a method that takes LaTeX source text such as the example above, formats expressions such as \(a\) into tiny MTMathUILabels, and sets these labels between the other text components. As I am new to Swift, I do not know whether and how this can be achieved. Moreover, this seems very difficult from a typographical point of view, as difficulties with line breaks will surely occur. And performance issues might arise if a large number of such labels are on the screen at the same time.
It is possible to achieve what I want using a WKWebView and MathJax or KaTeX, which is also a hack, of course. This leads to other difficulties, e.g. if one wants to set several of these WKWebViews on a screen, e.g. inside UITableViewCells.
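For what it's worth, a minimal sketch of the WKWebView route (the MathJax CDN URL is an assumption; you may prefer to pin a version or bundle KaTeX locally):

import WebKit

let html = """
<html><head>
<script src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js"></script>
</head><body>
Given a right triangle having catheti of length \\(a\\) resp. \\(b\\) and a
hypotenuse of length \\(c\\), we have \\[a^2 + b^2 = c^2.\\] This fact is
known as the Pythagorean theorem.
</body></html>
"""
let webView = WKWebView(frame: .zero)
webView.loadHTMLString(html, baseURL: nil)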
Using iosMath, my solution for getting a UILabel to show inline LaTeX is to include LATEX and ENDLATEX markers with no spaces. I replaced all marked ranges with an image of the MTMathUILabel, going from the last range to the first so the positions don't get shifted (this allows multiple markers). The image returned from my function is flipped, so I used the .downMirrored orientation, and I sized it to fit my text; you might need to tweak the numbers a little for the flip scale of 2.5 and the y value of attachment.bounds.
import UIKit
import iosMath

let question = UILabel()
let currentQuestion = "Given a right triangle having catheti of length LATEX(a)ENDLATEX resp. LATEX(b)ENDLATEX and a hypotenuse of length LATEX(c)ENDLATEX, we have LATEX[a^2 + b^2 = c^2]ENDLATEX. This fact is known as the Pythagorean theorem."
question.text = currentQuestion

if (question.text?.contains("LATEX"))! {
    let tempString = question.text!
    let tempMutableString = NSMutableAttributedString(string: tempString)
    let pattern = NSRegularExpression.escapedPattern(for: "LATEX")
    let regex = try? NSRegularExpression(pattern: pattern, options: [])
    // NSRange counts UTF-16 code units, so use utf16.count rather than count.
    if let matches = regex?.matches(in: tempString, options: [], range: NSRange(location: 0, length: tempString.utf16.count)) {
        var i = 0
        // Matches come in pairs: the opening "LATEX" and the "LATEX" inside "ENDLATEX".
        // Walk the pairs from last to first so earlier ranges stay valid.
        while i < matches.count {
            let range1 = matches.reversed()[i + 1].range
            let range2 = matches.reversed()[i].range
            let finalDistance = range2.location - range1.location + 5
            let finalRange = NSRange(location: range1.location, length: finalDistance)
            let startIndex = String.Index(utf16Offset: range1.location + 5, in: tempString)
            let endIndex = String.Index(utf16Offset: range2.location - 3, in: tempString)
            let substring = String(tempString[startIndex..<endIndex])
            let image = imageWithLabel(string: substring)
            let flip = UIImage(cgImage: image.cgImage!, scale: 2.5, orientation: .downMirrored)
            let attachment = NSTextAttachment()
            attachment.image = flip
            attachment.bounds = CGRect(x: 0, y: -flip.size.height / 2 + 10, width: flip.size.width, height: flip.size.height)
            let replacement = NSAttributedString(attachment: attachment)
            tempMutableString.replaceCharacters(in: finalRange, with: replacement)
            question.attributedText = tempMutableString
            i += 2
        }
    }
}

func imageWithLabel(string: String) -> UIImage {
    let label = MTMathUILabel()
    label.latex = string
    label.sizeToFit()
    UIGraphicsBeginImageContextWithOptions(label.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    label.layer.render(in: UIGraphicsGetCurrentContext()!)
    return UIGraphicsGetImageFromCurrentImageContext() ?? UIImage()
}
I'm building an iOS App, and Emojis play a big part in it.
In iOS 10.2, new emojis were released.
I'm pretty sure that if someone has iOS 8, for example, they wouldn't actually be able to see these emojis. Is there a way to detect this? I'm trying to dynamically build a list of all the Emojis that are supported on the user's iOS version, but I'm having a bit of trouble.
Clarification: an Emoji is merely a character in the Unicode Character space, so the present solution works for all characters, not just Emoji.
Synopsis
To know if a Unicode character (including an Emoji) is available on a given device or OS, run the unicodeAvailable() method below.
It works by rendering the given character to an image and comparing that image against the one produced by the known-undefined Unicode character U+1FFF.
unicodeAvailable(), a Character extension
extension Character {
    private static let refUnicodeSize: CGFloat = 8
    private static let refUnicodePng =
        Character("\u{1fff}").png(ofSize: Character.refUnicodeSize)

    func unicodeAvailable() -> Bool {
        if let refUnicodePng = Character.refUnicodePng,
           let myPng = self.png(ofSize: Character.refUnicodeSize) {
            return refUnicodePng != myPng
        }
        return false
    }
}
Discussion
All characters will be rendered as a png at the same size (8) as defined once in
static let refUnicodeSize: CGFloat = 8
The undefined character U+1FFF image is calculated once in
static let refUnicodePng = Character("\u{1fff}").png(ofSize: Character.refUnicodeSize)
A helper method optionally creates a png from a Character
func png(ofSize fontSize: CGFloat) -> Data?
1. Example: Test against 3 emoji
let codes: [Character] = ["\u{2764}", "\u{1f600}", "\u{1F544}"] // ❤️, 😀, undefined
for unicode in codes {
    print("\(unicode) : \(unicode.unicodeAvailable())")
}
2. Example: Test a range of Unicode characters
func unicodeRange(from: Int, to: Int) {
    for unicodeNumeric in from...to {
        if let scalar = UnicodeScalar(unicodeNumeric) {
            let unicode = Character(scalar)
            let avail = unicode.unicodeAvailable()
            let hex = String(format: "0x%x", unicodeNumeric)
            print("\(unicode) \(hex) is \(avail ? "" : "not ")available")
        }
    }
}
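For example, to scan the Emoticons block:
unicodeRange(from: 0x1F600, to: 0x1F64F)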
Helper function: Character to png
extension Character {
    func png(ofSize fontSize: CGFloat) -> Data? {
        let attributes = [NSAttributedString.Key.font:
                              UIFont.systemFont(ofSize: fontSize)]
        let charStr = "\(self)" as NSString
        let size = charStr.size(withAttributes: attributes)

        UIGraphicsBeginImageContext(size)
        defer { UIGraphicsEndImageContext() }
        charStr.draw(at: .zero, withAttributes: attributes)

        guard let charImage = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
        return charImage.pngData()
    }
}
► Find this solution on GitHub and a detailed article on Swift Recipes.
Just for future reference: after discovering my app had iOS 13.2 emojis that are not supported on 12.x versions, I used the answer from here: "How can I determine if a specific emoji character can be rendered by an iOS device?", which worked really well for me.
Some unicode characters cannot be displayed on iOS but are displayed correctly on macOS. Similarly, some unicode characters that iOS can display cannot be displayed on watchOS. This is due to different built-in fonts installed on these platforms.
When a character cannot be displayed it appears as a ? inside a box, like so:
I've also seen some characters display as an alien instead (not sure why the difference):
Is there a way to know when a specific unicode character will not be displayed properly given a string of the unicode character such as "ᄥ"?
I am in need of a solution that works for both iOS and watchOS.
You can use CTFontGetGlyphsForCharacters() to determine if a font has a glyph for a particular code point (note that supplementary characters need to be checked as surrogate pairs):
CTFontRef font = CTFontCreateWithName(CFSTR("Helvetica"), 12, NULL);
const UniChar code_point[] = { 0xD83C, 0xDCA1 }; // U+1F0A1
CGGlyph glyph[] = { 0, 0 };
bool has_glyph = CTFontGetGlyphsForCharacters(font, code_point, glyph, 2);
Or, in Swift:
let font = CTFontCreateWithName("Helvetica" as CFString, 12, nil)
var code_point: [UniChar] = [0xD83C, 0xDCA1]
var glyphs: [CGGlyph] = [0, 0]
let has_glyph = CTFontGetGlyphsForCharacters(font, &code_point, &glyphs, 2)
If you want to check the complete set of fallback fonts that the system will try to load a glyph from, you will need to check all of the fonts returned by CTFontCopyDefaultCascadeListForLanguages(). Check the answer to this question for information on how the fallback font list is created.
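A sketch of that fallback check; the helper name is mine, not an API:

// Walk the default cascade (fallback) list and report whether any font
// in it has glyphs for the given UTF-16 code units.
func anyFallbackHasGlyph(for codeUnits: [UniChar], startingFrom font: CTFont) -> Bool {
    guard let cascade = CTFontCopyDefaultCascadeListForLanguages(font, nil) as? [CTFontDescriptor] else {
        return false
    }
    for descriptor in cascade {
        let candidate = CTFontCreateWithFontDescriptor(descriptor, 12, nil)
        var glyphs = [CGGlyph](repeating: 0, count: codeUnits.count)
        if CTFontGetGlyphsForCharacters(candidate, codeUnits, &glyphs, codeUnits.count) {
            return true
        }
    }
    return false
}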
Compare against the known, undefined character U+1FFF:
/// - Parameter font: a UIFont
/// - Returns: true if a glyph exists
func glyphAvailable(forFont font: UIFont) -> Bool {
    if let refUnicodePng = Character("\u{1fff}").png(forFont: font),
       let myPng = self.png(forFont: font) {
        return refUnicodePng != myPng
    }
    return false
}
using a png bitmap:
/// - Parameter font: a UIFont
/// - Returns: an optional png representation
func png(forFont font: UIFont) -> Data? {
    let attributes = [NSAttributedString.Key.font: font]
    let charStr = "\(self)" as NSString
    let size = charStr.size(withAttributes: attributes)

    UIGraphicsBeginImageContext(size)
    defer { UIGraphicsEndImageContext() }
    charStr.draw(at: .zero, withAttributes: attributes)

    guard let charImage = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
    return charImage.pngData()
}
Answered here.
for example:
import Foundation
import UIKit

let str = "saldkjaskldjhf"
let font = UIFont.systemFont(ofSize: 14.0)
var attributes: [NSAttributedString.Key: Any] = [.font: font]

var attriStrWithoutParagraph = NSAttributedString(string: str, attributes: attributes)
var size = attriStrWithoutParagraph.boundingRect(with: CGSize(width: CGFloat.greatestFiniteMagnitude, height: CGFloat.greatestFiniteMagnitude),
                                                 options: .usesLineFragmentOrigin,
                                                 context: nil)

let paragraphStyle = NSMutableParagraphStyle()
paragraphStyle.firstLineHeadIndent = 20
attributes[.paragraphStyle] = paragraphStyle

attriStrWithoutParagraph = NSAttributedString(string: str, attributes: attributes)
size = attriStrWithoutParagraph.boundingRect(with: CGSize(width: CGFloat.greatestFiniteMagnitude, height: CGFloat.greatestFiniteMagnitude),
                                             options: .usesLineFragmentOrigin,
                                             context: nil)
Here is the output:
(0.0, 0.0, 87.276, 16.702)
(0.0, 0.0, 87.276, 16.702)
We can see the result is the same, so the firstLineHeadIndent is not taken into account. Why does it work like this?
You're specifying very large (effectively infinite) values (CGFloat.greatestFiniteMagnitude) for the size that you're passing to -boundingRectWithSize:options:. So, the text will never wrap; it will always be laid out in one long line.
Furthermore, the docs for -boundingRectWithSize:options: say:
The origin of the rectangle returned from this method is the first glyph origin.
So, the result is always relative to where the first glyph is placed. You're basically measuring the size of the line. The indent doesn't change the size of the line. It changes where the first glyph is placed, but the result is relative to the first glyph, so it doesn't change the result.
It would change the result if you were providing a real limit for the width and making the paragraph wrap. In that case, the second line would be "outdented" relative to the first line (and the first glyph), so the bounding rectangle would change as you change the firstLineHeadIndent.
You can simply apply the desired indent yourself. That is, after you get the bounding rect, add the indent distance to the X coordinate of the origin (edit: or to the width, if you want a rect encompassing the indent and not just the text positioned by the indent). (Although it's not clear to me what it could mean to indent text in an "infinite" space.)
You could also provide an actual bounding size for your desired destination for the text.
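A minimal sketch of that last point, reusing attriStrWithoutParagraph from the question but with a real width limit:

// With a finite width the text wraps, and firstLineHeadIndent now
// affects the measured bounding rect.
let limit = CGSize(width: 100, height: CGFloat.greatestFiniteMagnitude)
let wrapped = attriStrWithoutParagraph.boundingRect(with: limit,
                                                    options: .usesLineFragmentOrigin,
                                                    context: nil)
print(wrapped)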