boundingRectWithSize:options:context: does not take NSParagraphStyle.firstLineHeadIndent into account when calculating size - iOS

for example:
import Foundation
import UIKit

let str = "saldkjaskldjhf"
let font = UIFont.systemFontOfSize(14.0)
var attributes: [String: AnyObject] = [NSFontAttributeName: font]

// Measure without a paragraph style
var attributedString = NSAttributedString(string: str, attributes: attributes)
var rect = attributedString.boundingRectWithSize(CGSize(width: CGFloat.max, height: CGFloat.max), options: .UsesLineFragmentOrigin, context: nil)
print(rect)

// Measure again with firstLineHeadIndent = 20
let paragraphStyle = NSMutableParagraphStyle()
paragraphStyle.firstLineHeadIndent = 20
attributes[NSParagraphStyleAttributeName] = paragraphStyle
attributedString = NSAttributedString(string: str, attributes: attributes)
rect = attributedString.boundingRectWithSize(CGSize(width: CGFloat.max, height: CGFloat.max), options: .UsesLineFragmentOrigin, context: nil)
print(rect)
Here is the output:
(0.0,0.0,87.276,16.702)
(0.0,0.0,87.276,16.702)
We can see the results are the same, so the firstLineHeadIndent is not taken into account. Why does it work like this?

You're specifying very large (effectively infinite) values (CGFloat.max) for the size that you're passing to -boundingRectWithSize:options:. So, the text will never wrap. It will always be laid out in one long line.
Furthermore, the docs for -boundingRectWithSize:options: say:
The origin of the rectangle returned from this method is the first glyph origin.
So, the result is always relative to where the first glyph is placed. You're basically measuring the size of the line. The indent doesn't change the size of the line. It changes where the first glyph is placed, but the result is relative to the first glyph, so it doesn't change the result.
It would change the result if you were providing a real limit for the width and making the paragraph wrap. In that case, the second line would be "outdented" relative to the first line (and the first glyph), so the bounding rectangle would change as you change the firstLineHeadIndent.
You can simply apply the desired indent yourself. That is, after you get the bounding rect, add the indent distance to the X coordinate of the origin (edit: or to the width, if you want a rect encompassing the indent and not just the text positioned by the indent). (Although it's not clear to me what it could mean to indent text in an "infinite" space.)
You could also provide an actual bounding size for your desired destination for the text.
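To illustrate both points, here is a minimal sketch in current Swift syntax (the 200-point width and the sample text are arbitrary assumptions):

import UIKit

let text = "saldkjaskldjhf saldkjaskldjhf saldkjaskldjhf"
let style = NSMutableParagraphStyle()
style.firstLineHeadIndent = 20

let attributed = NSAttributedString(string: text, attributes: [
    .font: UIFont.systemFont(ofSize: 14),
    .paragraphStyle: style
])

// 1. With a finite width the text wraps, and the indent now changes the result,
//    because later lines are "outdented" relative to the first glyph.
let wrapped = attributed.boundingRect(with: CGSize(width: 200, height: .greatestFiniteMagnitude),
                                      options: .usesLineFragmentOrigin,
                                      context: nil)
print(wrapped)

// 2. With an unconstrained width, account for the indent yourself by widening the rect.
var singleLine = attributed.boundingRect(with: CGSize(width: .greatestFiniteMagnitude,
                                                      height: .greatestFiniteMagnitude),
                                         options: .usesLineFragmentOrigin,
                                         context: nil)
singleLine.size.width += style.firstLineHeadIndent
print(singleLine)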

Related

Aligning glyphs to the top of a UITextView after sizeToFit

The app I'm working on supports hundreds of different fonts. Some of these fonts, particularly the script fonts, have significant ascenders and descenders. When sizeToFit() is called on a UITextView with some of these fonts, I end up with significant top and bottom padding (the image on the left). The goal is to end up with the image on the right, such that the tallest glyph is aligned flush with the top of the text view's bounding box.
Here's the log for the image above:
Point Size: 59.0
Ascender: 70.21
Descender: -33.158
Line Height: 103.368
Leading: 1.416
TextView Height: 105.0
My first thought was to look at the height of each glyph in the first line of text, and then calculate the offset between the top of the container and the top of the tallest glyph. Then I could use textContainerInset to adjust the top margin accordingly.
I tried something like this in my UITextView subclass:
for location in 0 ..< lastGlyphIndexInFirstLine {
    let glyphRect = self.layoutManager.boundingRect(forGlyphRange: NSRange(location: location, length: 1), in: self.textContainer)
    print(glyphRect.size.height) // prints 104.78399999999999 for each glyph
}
Unfortunately, this doesn't work because boundingRect(forGlyphRange:in:) doesn't appear to return the rect of the glyph itself (I'm guessing this is always the same value because it's returning the height of the line fragment?).
Is this the simplest way to solve this problem? If it is, how can I calculate the distance between the top of the text view and the top of the tallest glyph in the first line of text?
This doesn't appear to be possible using TextKit, but it is possible using CoreText directly. Specifically, CGFont's getGlyphBBoxes returns the correct rect in glyph space units, which can then be converted to points relative to the font size.
Credit goes to this answer for making me aware of getGlyphBBoxes as well as documenting how to convert the resulting rects to points.
Below is the complete solution. This assumes you have a UITextView subclass with the following set beforehand:
self.contentInset = .zero
self.textContainerInset = .zero
self.textContainer.lineFragmentPadding = 0.0
This function will now return the distance from the top of the text view's bounds to the top of the tallest used glyph:
private var distanceToGlyphs: CGFloat {
    // sanity
    guard
        let font = self.font,
        let fontRef = CGFont(font.fontName as CFString),
        let attributedText = self.attributedText,
        let firstLine = attributedText.string.components(separatedBy: .newlines).first
        else { return 0.0 }
    // obtain the first line of text as an attributed string
    // (use the UTF-16 length, since NSRange counts UTF-16 code units, not Characters)
    let attributedFirstLine = attributedText.attributedSubstring(from: NSRange(location: 0, length: firstLine.utf16.count)) as CFAttributedString
    // create the line for the first line of attributed text
    let line = CTLineCreateWithAttributedString(attributedFirstLine)
    // get the runs within this line (there will typically only be one run when using a single font)
    let glyphRuns = CTLineGetGlyphRuns(line) as NSArray
    guard let runs = glyphRuns as? [CTRun] else { return 0.0 }
    // this will store the maximum distance from the baseline
    var maxDistanceFromBaseline: CGFloat = 0.0
    // iterate each run
    for run in runs {
        // get the total number of glyphs in this run
        let glyphCount = CTRunGetGlyphCount(run)
        // initialize empty arrays of rects and glyphs
        var rects = Array<CGRect>(repeating: .zero, count: glyphCount)
        var glyphs = Array<CGGlyph>(repeating: 0, count: glyphCount)
        // obtain the glyphs
        self.layoutManager.getGlyphs(in: NSRange(location: 0, length: glyphCount), glyphs: &glyphs, properties: nil, characterIndexes: nil, bidiLevels: nil)
        // obtain the rects per-glyph in "glyph space units", each of which needs to be scaled using units per em and the font size
        fontRef.getGlyphBBoxes(glyphs: &glyphs, count: glyphCount, bboxes: &rects)
        // iterate each glyph rect
        for rect in rects {
            // obtain the units per em from the font ref so we can convert the rect
            let unitsPerEm = CGFloat(fontRef.unitsPerEm)
            // sanity to prevent divide by zero
            guard unitsPerEm != 0.0 else { continue }
            // calculate the actual distance up or down from the glyph's baseline
            let glyphY = (rect.origin.y / unitsPerEm) * font.pointSize
            // calculate the actual height of the glyph
            let glyphHeight = (rect.size.height / unitsPerEm) * font.pointSize
            // calculate the distance from the baseline to the top of the glyph
            let glyphDistanceFromBaseline = glyphHeight + glyphY
            // store the max distance amongst the glyphs
            maxDistanceFromBaseline = max(maxDistanceFromBaseline, glyphDistanceFromBaseline)
        }
    }
    // the final top margin, calculated by taking the largest ascender of all the glyphs in the font and subtracting the max calculated distance from the baseline
    return font.ascender - maxDistanceFromBaseline
}
You can now set the text view's top contentInset to -distanceToGlyphs to achieve the desired result.
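For instance, a minimal usage sketch (alignGlyphsToTop is a hypothetical helper added to the same subclass; the answer above only states that the top contentInset should be set to -distanceToGlyphs):

// Hypothetical helper on the same UITextView subclass.
func alignGlyphsToTop() {
    // Size the view to its text first, then pull the tallest first-line glyph
    // flush against the top of the bounds.
    sizeToFit()
    contentInset.top = -distanceToGlyphs
}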

Inconsistent behavior of kerning in NSAttributedString

I need to use the kern attribute of NSAttributedString. As far as I can see in the documentation, the default value of that attribute is 0.0. But I ran into strange behaviour with the phrase "Hello, World" (with the phrase "Hello" everything is fine):
NSDictionary<NSString*, id>* attributes = @{NSFontAttributeName: [UIFont systemFontOfSize:12]};
NSString* text = @"Hello, World";
NSAttributedString* string = [[NSAttributedString alloc] initWithString:text attributes:attributes];
CGSize size1 = [string size];
NSMutableDictionary<NSString*, id>* attributesWithKern = [attributes mutableCopy];
attributesWithKern[NSKernAttributeName] = @(0.0);
NSAttributedString* stringWithKern = [[NSAttributedString alloc] initWithString:text attributes:attributesWithKern];
CGSize size2 = [stringWithKern size];
XCTAssertTrue(CGSizeEqualToSize(size1, size2)); // the test fails here
// size1 = (width = 68.8125, height = 14.3203125)
// size2 = (width = 69.515625, height = 14.3203125)
To make size1 and size2 equal, the kerning has to be -7.105427357601002e-15. I know that this is very close to 0.0, but it is strange that it changes the width by almost a pixel.
NSAttributedString behaves the same way in Objective-C and in Swift; here is the Swift example:
let text = "Hello, World"
let attributes : [NSAttributedString.Key : Any] = [NSAttributedString.Key.font: UIFont.systemFont(ofSize: UIFont.systemFontSize)]
let str = NSAttributedString(string: text, attributes: attributes)
let size = str.size()
var attributesWithKern = attributes
attributesWithKern[NSAttributedString.Key.kern] = NSNumber(value: 0.0)
let strWithKern = NSAttributedString(string: text, attributes: attributesWithKern)
let sizeWithKern = strWithKern.size()
XCTAssertTrue(size == sizeWithKern)
How can I fix this behaviour?
P.S.
For now I simply remove NSKernAttributeName from the attributed string when its value equals 0.0, but I do not think that is a good solution.
I believe the docs here are wrong, and it is worth opening a radar about that. When no value is set, it is interpreted as "normal kerning." When 0 is set, it is interpreted as "disable kerning," which is why the width is a little wider (kerning typically is slightly negative, bringing kern-pair characters, like "W" and "o" in this font, a little closer). I don't think there's any way to explicitly request "default kerning" without removing the attribute.
For your purposes, I believe you're doing the right thing by removing the value when it's zero, because you want default kerning, not to disable kerning.
The reason your tiny negative value is working is because it's not zero, so it's not disabling kerning, but it's so small that the behavior is very, very close to the default, and you're running into the precision of Double in the intermediate calculations (or possibly the precision of Float, depending on how it's implemented internally). You should find that your test passes for any value smaller (closer to zero) than this one, not just that value. In my tests, positive 7e-15 also works, for example.
I did some tests based on your code and I can confirm this behaviour, which looks like a bug. The point is that for some letters the sizes are equal and for some they are not. For instance, strings with "Hello Mo" or "Hello Oo" measure the same, but "Hello WoWoWo" differs a lot. So some kerning is being applied to "W" without any apparent reason. It may also depend on the chosen font, though I did not test that. The only solution I see is the one you already used: removing NSKernAttributeName when it equals 0.
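For completeness, here is a minimal sketch of that workaround in Swift (strippingZeroKern is a hypothetical helper; the enumeration and removal APIs are standard Foundation):

import UIKit

// Hypothetical helper: returns a copy of the attributed string with any
// zero-valued kern attributes removed, so default kerning applies again.
func strippingZeroKern(_ input: NSAttributedString) -> NSAttributedString {
    let output = NSMutableAttributedString(attributedString: input)
    let fullRange = NSRange(location: 0, length: input.length)
    input.enumerateAttribute(.kern, in: fullRange, options: []) { value, range, _ in
        if let kern = value as? NSNumber, kern.doubleValue == 0.0 {
            output.removeAttribute(.kern, range: range)
        }
    }
    return output
}

// Usage: let fixed = strippingZeroKern(strWithKern)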

How to get emoji path in iOS

My purpose: draw the outline of every glyph.
Example 1:
input: text = "666棒"
display:
Attach: In the figure above, 1 is the displayView, 2 is the inputView.
Example 2:
input: text = "666😁棒"
display:
Attach: In the figure above, 1 is the displayView, 2 is the inputView, and 3 is where nothing is rendered.
The main idea is:
Use Core Text to obtain every CGGlyph.
Get every glyph's CGPath.
Use a CAShapeLayer to display the glyphs on screen.
Main method:
let letters = CGMutablePath()
let font = CTFontCreateWithName(fontName as CFString?, fontSize, nil)
let attrString = NSAttributedString(string: text, attributes: [kCTFontAttributeName as String : font])
let line = CTLineCreateWithAttributedString(attrString)
let runArray = CTLineGetGlyphRuns(line)
for runIndex in 0..<CFArrayGetCount(runArray) {
let run : CTRun = unsafeBitCast(CFArrayGetValueAtIndex(runArray, runIndex), to: CTRun.self)
let dictRef : CFDictionary = unsafeBitCast(CTRunGetAttributes(run), to: CFDictionary.self)
let dict : NSDictionary = dictRef as NSDictionary
let runFont = dict[kCTFontAttributeName as String] as! CTFont
for runGlyphIndex in 0..<CTRunGetGlyphCount(run) {
let thisGlyphRange = CFRangeMake(runGlyphIndex, 1)
var glyph = CGGlyph()
var position = CGPoint.zero
CTRunGetGlyphs(run, thisGlyphRange, &glyph)
CTRunGetPositions(run, thisGlyphRange, &position)
let letter = CTFontCreatePathForGlyph(runFont, glyph, nil)
let t = CGAffineTransform(translationX: position.x, y: position.y)
if let letter = letter {
letters.addPath(letter, transform: t)
}
}
}
let path = UIBezierPath()
path.move(to: CGPoint.zero)
path.append(UIBezierPath(cgPath: letters))
let pathLayer = CAShapeLayer()
pathLayer.path = path.cgPath
self.layer.addSubLayer(pathLayer)
...
Question:
How can I get an emoji's path, so that I can draw the emoji's outline instead of drawing the whole emoji? Another benefit is that I could animate the emoji path if I need to.
Any help is appreciated!
************************ update 2.15.2017 ***********************
Thanks to @KrishnaCA's suggestion.
I used bool supports = CTFontGetGlyphWithName(myFont, "😀") and found that none of the fonts I tried support emoji outlines.
Fortunately, Google's Noto project provides good emoji fonts.
You can find it here: Google's Noto.
I used the font Noto Emoji.
Display:
Only Noto Emoji and Noto Color Emoji support emoji (I guess).
Hope this helps people who come here!
I believe you need to check whether a glyph for the Unicode character exists in the CTFont or not. If it doesn't, fall back to any default CTFont that has a glyph for that character.
You can check that using the following code:
bool supports = CTFontGetGlyphWithName(myFont, "😀")
Here, myFont is a CTFontRef object.
Please let me know if this is not what you're looking for.
I believe you'll need CATextLayers to help you out.
I know it's a bit late, but sadly you can not - emojis are actually bitmaps drawn into the same context as the shapes representing regular characters. Your best bet is probably to draw the emoji characters separately, at the needed scale, into the context. This won't give you access to the actual vector data.
If you really need it in vector form:
I'd go with finding the Apple emoji font redrawn as vectors (I remember seeing it on the internet, though I'm not sure it contains all the latest emojis).
Then map the names of the individual vector images you found to the characters and draw the vector images.
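Building on the update above, here is a hedged sketch of how the glyph-path loop could be pointed at a vector emoji font ("NotoEmoji" is a placeholder PostScript name; the exact name depends on the font file you bundle):

import UIKit
import CoreText

// Sketch: build the attributed string with a vector emoji font so that
// CTFontCreatePathForGlyph can return outlines instead of bitmap glyphs.
// Verify the real font name with UIFont.familyNames / fontNames(forFamilyName:)
// after adding the font file to the app bundle.
let emojiText = "666😁棒"
let emojiFont = CTFontCreateWithName("NotoEmoji" as CFString, 48, nil)
let fontKey = NSAttributedString.Key(kCTFontAttributeName as String)
let attributed = NSAttributedString(string: emojiText, attributes: [fontKey: emojiFont])
let line = CTLineCreateWithAttributedString(attributed as CFAttributedString)
// The per-run / per-glyph loop from the question can then be reused unchanged;
// CTFontCreatePathForGlyph(runFont, glyph, nil) should now return non-nil paths
// for the emoji glyphs as well, because this font contains real outlines for them.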

When do adjustsFontSizeToFitWidth or boundingRectWithSize change the context.actualScaleFactor?

When does the actualScaleFactor of an NSStringDrawingContext change?
The documentation says:
"If you specified a custom value in the minimumScaleFactor property, when drawing is complete, this property contains the actual scale factor value that was used to draw the string."
My code:
myButton.titleLabel!.font = UIFont(name: "AmericanTypewriter-Bold", size: 40)
myButton.titleLabel?.adjustsFontSizeToFitWidth = true
myButton.setTitle("\(textString)", forState: .Normal)
let attributes = [NSFontAttributeName : myButton.titleLabel!.font]
let attributedString = NSMutableAttributedString(string:textString, attributes:attributes)
let context = NSStringDrawingContext()
context.minimumScaleFactor = myButton.titleLabel!.minimumScaleFactor
print("context: \(context.actualScaleFactor)")
let resultingRect = attributedString.boundingRectWithSize(myButton.titleLabel!.bounds.size, options: .UsesLineFragmentOrigin, context: context)
print("actual context after drawing: \(context.actualScaleFactor)")
//want to get the font size after adjustsFontSizeToFitWidth has done its magic:
//let actualFontSize = myButton.titleLabel!.font.pointSize * context.actualScaleFactor
The console log for text that fits without being shrunk and for longer text that is adjusted to fit the label's width is the same in both cases:
context: 0.0
actual context after drawing: 1.0
Any idea what step I am missing to get a real scaleFactor from context after the text has been sized to fit the label?
Do this in viewDidAppear... that worked for me.
The same problem as @RanLearns.
After adding myButton.titleLabel!.minimumScaleFactor = 0.1 it worked for me.
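Combining those two suggestions, here is a minimal sketch in current Swift syntax (myButton and the 0.1 minimum scale factor are assumptions taken from the question and answer above; measuring happens after layout, in viewDidAppear, so the label's bounds are final):

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    guard let label = myButton.titleLabel, let text = label.text, let font = label.font else { return }

    label.adjustsFontSizeToFitWidth = true
    label.minimumScaleFactor = 0.1   // non-zero, as suggested above

    let attributed = NSAttributedString(string: text, attributes: [.font: font])
    let context = NSStringDrawingContext()
    context.minimumScaleFactor = label.minimumScaleFactor

    // Constrain the width to the label's final width; leave the height unbounded.
    let bounds = CGSize(width: label.bounds.width, height: .greatestFiniteMagnitude)
    _ = attributed.boundingRect(with: bounds, options: .usesLineFragmentOrigin, context: context)

    // After measuring, actualScaleFactor reflects the shrink that was applied (1.0 if none).
    let actualFontSize = font.pointSize * context.actualScaleFactor
    print("actualScaleFactor: \(context.actualScaleFactor), actualFontSize: \(actualFontSize)")
}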

Decimal point alignment in iOS UILabel during countdown?

Sorry I don't have any code yet, but would appreciate some advice!
I have a countdown timer showing seconds with one decimal place in a UILabel (10.0, 9.9, 9.8, etc.). It works fine, but the decimal point moves around slightly depending on the widths of the digits being displayed. Is there a way to align the text in the UILabel to the decimal point, or should I create two labels (one for the seconds value aligned right and one for the decimal value aligned left)?
Thanks!
I think your suggestion of multiple labels is perfectly valid.
Here is an attributed string solution (for m:ss but it should work with floating point numbers also):
let string = "\t43:21" // Sample min:sec. Note the leading tab is required in this code.
let countdownFont = UIFont.systemFontOfSize(13)
let terminators = NSCharacterSet(charactersInString: "\t:.") // in some locales '.' is used instead of ':'
let tabStop = NSTextTab(textAlignment: .Right, location: 40, options: [NSTabColumnTerminatorsAttributeName: terminators])
let paragraphStyle = NSMutableParagraphStyle()
paragraphStyle.tabStops = [tabStop]
let attributeDictionary: [String: AnyObject] = [NSParagraphStyleAttributeName: paragraphStyle, NSFontAttributeName: countdownFont]
let attributedString = NSAttributedString(string: string, attributes: attributeDictionary)
self.countdownLabel.attributedText = attributedString
Related resources:
https://www.objc.io/issues/9-strings/string-rendering/#tables-with-numbers
https://library.oreilly.com/book/0636920034261/programming-ios-8-1st-edition/314.xhtml?ref=toc#_tab_stops
Of course a fixed-width font, such as Courier or Menlo, could also solve this problem, but those fonts contrast fairly starkly with the system font.
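As a lighter-weight alternative to switching the whole label to Courier or Menlo, here is a hedged sketch using the system font's monospaced-digit variant (available since iOS 9), which keeps every digit the same width so the decimal point stays put:

import UIKit

// Sketch: monospaced digits keep the decimal point from shifting as the value changes.
let countdownLabel = UILabel()
countdownLabel.font = UIFont.monospacedDigitSystemFont(ofSize: 13, weight: .regular)
countdownLabel.textAlignment = .right
countdownLabel.text = String(format: "%.1f", 9.8)   // e.g. "9.8", "10.0", ...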

Resources