Calculated width and height of attributedText is not correct - iOS

I'm making a chat app and I have a problem with the size of the chat messages.
I calculate a message's height like this, and I get the correct height:
let maximumMessageWidth: CGFloat = 300.0
let size = CGSize(width: maximumMessageWidth, height: 1000.0)
let messageAttributeText = messageBody.attributedText
let height = messageAttributeText?.boundingRect(with: size, options: .usesLineFragmentOrigin, context: nil).height
To avoid a special case like the one in the first image, I also have to calculate the width.
There is a big space after my message.
I want it to look like the second image instead.
This is the code I use to calculate the message width (I pass in the correct height that I calculated above):
let size = CGSize(width: 1000.0, height: height)
let messageAttributeText = messageBody.attributedText
var width = messageAttributeText?.boundingRect(with: size, options: .usesLineFragmentOrigin, context: nil).width
But the width from this code doesn't seem right, because I don't think it takes the height into account. It assumes the text is laid out on a single line.
I want to calculate the width so that the text fills the whole label, just like in the second image.
Does anyone know how to calculate the width in my case?

You can do that by casting the String to NSString:
let string = "Hello, World"
let nsString = string as NSString
let size = nsString.size(withAttributes: [NSAttributedString.Key.font: UIFont.systemFont(ofSize: 14)])
The size will give you the width and height of the string.
You can apply the same idea to your NSAttributedString's string property.
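For an NSAttributedString you can also measure the attributed text directly with boundingRect, which respects the string's attributes rather than a single font (a minimal sketch; attributedMessage and the 300-point width are placeholder values):

let attributedMessage = NSAttributedString(
    string: "Hello, World",
    attributes: [.font: UIFont.systemFont(ofSize: 14)]
)
// Constrain the width and let the height grow as needed
let bounds = attributedMessage.boundingRect(
    with: CGSize(width: 300.0, height: .greatestFiniteMagnitude),
    options: .usesLineFragmentOrigin,
    context: nil
)
let measuredSize = bounds.size // width and height in points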

After a bit of research, I found out the following:
The width calculation code doesn't work correctly because it doesn't take the height into account.
Whether height = 100 or height = 1000, it returns the same result, because it assumes all the text is drawn on a single line.
So to calculate the width, I used a binary search:
var minWidth: CGFloat = 0.0
var maxWidth: CGFloat = maximumMessageWidth
var width: CGFloat = maximumMessageWidth
while true {
    if minWidth >= maxWidth {
        width = minWidth
        break
    }
    let testWidth = (maxWidth + minWidth) / 2.0
    if calculateMessageHeight(width: testWidth) > messageHeight {
        // Too narrow: the text wraps onto an extra line, so search wider widths
        minWidth = testWidth + 2.0
    } else {
        // Wide enough: try to shrink the width further
        maxWidth = testWidth - 2.0
    }
}
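The snippet above assumes a calculateMessageHeight(width:) helper. A minimal sketch based on the height calculation from the question could look like this (messageBody is the label from the question):

func calculateMessageHeight(width: CGFloat) -> CGFloat {
    guard let attributedText = messageBody.attributedText else { return 0.0 }
    // Fix the width and let boundingRect report the height needed for it
    let constraint = CGSize(width: width, height: .greatestFiniteMagnitude)
    return attributedText.boundingRect(with: constraint,
                                       options: .usesLineFragmentOrigin,
                                       context: nil).height
}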

Related

How to convert VNRectangleObservation item to UIImage in SwiftUI

I was able to identify squares in an image using VNDetectRectanglesRequest. Now I want to store those rectangles as separate images (UIImage or CGImage). Below is what I tried:
let rectanglesDetection = VNDetectRectanglesRequest { request, error in
    rectangles = request.results as! [VNRectangleObservation]
    rectangles.sort { $0.boundingBox.origin.y > $1.boundingBox.origin.y }
    for rectangle in rectangles {
        let rect = rectangle.boundingBox
        let imageRef = cgImage.cropping(to: rect)
        let image = UIImage(cgImage: imageRef!, scale: image!.scale, orientation: image!.imageOrientation)
        checkBoxImages.append(image)
    }
}
Can anybody point out what's wrong or what should be the best approach?
Update 1
At this stage, I'm testing with an image that I added to the assets.
With this image I get 7 rectangle observations: one for each cell and one for the table margin.
My task is to identify the text inside each rectangle, and my approach is to send a VNRecognizeTextRequest for each rectangle that has been identified. My real scenario is a little more complicated than this, but I want to at least achieve this before going forward.
Update 2
for rectangle in rectangles {
    let trueX = rectangle.boundingBox.minX * image!.size.width
    let trueY = rectangle.boundingBox.minY * image!.size.height
    let width = rectangle.boundingBox.width * image!.size.width
    let height = rectangle.boundingBox.height * image!.size.height
    print("x = ", trueX, " y = ", trueY, " width = ", width, " height = ", height)
    let cropZone = CGRect(x: trueX, y: trueY, width: width, height: height)
    guard let cutImageRef: CGImage = image?.cgImage?.cropping(to: cropZone)
    else {
        return
    }
    let croppedImage: UIImage = UIImage(cgImage: cutImageRef)
    croppedImages.append(croppedImage)
}
My image's width and height are:
width = 406.0 height = 368.0
I've attached my debug interface so you can get a proper understanding.
As @Lasse mentioned, this is my actual issue, with screenshots.
This is just a guess since you didn't state what the actual problem is, but you're probably getting a zero-sized image for each VNRectangleObservation.
The reason is that Vision uses a normalized coordinate space from 0.0 to 1.0 with a lower-left origin.
So in order to get the correct rectangle of your original image, you need to convert the rect from normalized space to image space. Luckily there is VNImageRectForNormalizedRect(_:_:_:) to do just that.
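For illustration, a minimal sketch of that conversion before cropping, assuming cgImage is the full CGImage from the question (the y-flip is my assumption, accounting for CGImage cropping using a top-left origin while Vision uses a bottom-left one):

for rectangle in rectangles {
    // Scale the normalized bounding box up to pixel coordinates
    let imageRect = VNImageRectForNormalizedRect(rectangle.boundingBox,
                                                 cgImage.width,
                                                 cgImage.height)
    // Flip y so the rect matches CGImage's top-left-origin coordinate space
    let cropRect = CGRect(x: imageRect.origin.x,
                          y: CGFloat(cgImage.height) - imageRect.origin.y - imageRect.height,
                          width: imageRect.width,
                          height: imageRect.height)
    if let croppedCGImage = cgImage.cropping(to: cropRect) {
        checkBoxImages.append(UIImage(cgImage: croppedCGImage))
    }
}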

danielgindi/Charts: get left yAxis width

Is there a way to get the left yAxis width? I'm drawing custom markers and would like to avoid drawing over the left yAxis.
There is a solution for the Android version of this library, but it doesn't work on iOS because of the different rendering methods: How to get the width of y-axis label in MPAndroidChart
You can retrieve the axis width with the requiredSize() method:
chartView.leftAxis.requiredSize()
This is what the requiredSize() method actually does:
@objc open func requiredSize() -> CGSize
{
    let label = getLongestLabel() as NSString
    var size = label.size(withAttributes: [NSAttributedString.Key.font: labelFont])
    size.width += xOffset * 2.0
    size.height += yOffset * 2.0
    size.width = max(minWidth, min(size.width, maxWidth > 0.0 ? maxWidth : size.width))
    return size
}
Use this only after you have set the chart data, because the chart automatically calculates the width from the longest label on the axis.
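For example, when positioning custom markers you could use that width as a minimum x offset (a minimal sketch; proposedMarkerX is a hypothetical value from your own marker-drawing code):

// Width of the left y-axis label area (call after setting the chart data)
let leftAxisWidth = chartView.leftAxis.requiredSize().width
// Clamp a custom marker's x position so it stays clear of the axis labels
let markerX = max(proposedMarkerX, leftAxisWidth)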
Hope it helps
Does this give you what you need?
let yourBarChartView = BarChartView()
let leftAxis = yourBarChartView.leftAxis
let width = leftAxis.axisLineWidth
Hope this helps!

Aligning glyphs to the top of a UITextView after sizeToFit

The app I'm working on supports hundreds of different fonts. Some of these fonts, particularly the script fonts, have significant ascenders and descenders. When sizeToFit() is called on a UITextView with some of these fonts, I end up with significant top and bottom padding (the image on the left). The goal is to end up with the image on the right, such that the tallest glyph is aligned flush with the top of the text view's bounding box.
Here's the log for the image above:
Point Size: 59.0
Ascender: 70.21
Descender: -33.158
Line Height: 103.368
Leading: 1.416
TextView Height: 105.0
My first thought was to look at the height of each glyph in the first line of text, and then calculate the offset between the top of the container and the top of the tallest glyph. Then I could use textContainerInset to adjust the top margin accordingly.
I tried something like this in my UITextView subclass:
for location in 0 ..< lastGlyphIndexInFirstLine {
    let glyphRect = self.layoutManager.boundingRect(forGlyphRange: NSRange(location: location, length: 1), in: self.textContainer)
    print(glyphRect.size.height) // prints 104.78399999999999 for each glyph
}
Unfortunately, this doesn't work because boundingRect(forGlyphRange:in:) doesn't appear to return the rect of the glyph itself (I'm guessing this is always the same value because it's returning the height of the line fragment?).
Is this the simplest way to solve this problem? If it is, how can I calculate the distance between the top of the text view and the top of the tallest glyph in the first line of text?
This doesn't appear to be possible using TextKit, but it is possible using CoreText directly. Specifically, CGFont's getGlyphBBoxes returns the correct rect in glyph space units, which can then be converted to points relative to the font size.
Credit goes to this answer for making me aware of getGlyphBBoxes as well as documenting how to convert the resulting rects to points.
Below is the complete solution. This assumes you have a UITextView subclass with the following set beforehand:
self.contentInset = .zero
self.textContainerInset = .zero
self.textContainer.lineFragmentPadding = 0.0
This function will now return the distance from the top of the text view's bounds to the top of the tallest used glyph:
private var distanceToGlyphs: CGFloat {
    // sanity
    guard
        let font = self.font,
        let fontRef = CGFont(font.fontName as CFString),
        let attributedText = self.attributedText,
        let firstLine = attributedText.string.components(separatedBy: .newlines).first
    else { return 0.0 }
    // obtain the first line of text as an attributed string
    let attributedFirstLine = attributedText.attributedSubstring(from: NSRange(location: 0, length: firstLine.count)) as CFAttributedString
    // create the line for the first line of attributed text
    let line = CTLineCreateWithAttributedString(attributedFirstLine)
    // get the runs within this line (there will typically only be one run when using a single font)
    let glyphRuns = CTLineGetGlyphRuns(line) as NSArray
    guard let runs = glyphRuns as? [CTRun] else { return 0.0 }
    // this will store the maximum distance from the baseline
    var maxDistanceFromBaseline: CGFloat = 0.0
    // iterate each run
    for run in runs {
        // get the total number of glyphs in this run
        let glyphCount = CTRunGetGlyphCount(run)
        // initialize empty arrays of rects and glyphs
        var rects = Array<CGRect>(repeating: .zero, count: glyphCount)
        var glyphs = Array<CGGlyph>(repeating: 0, count: glyphCount)
        // obtain the glyphs
        self.layoutManager.getGlyphs(in: NSRange(location: 0, length: glyphCount), glyphs: &glyphs, properties: nil, characterIndexes: nil, bidiLevels: nil)
        // obtain the rects per-glyph in "glyph space units", each of which needs to be scaled using units per em and the font size
        fontRef.getGlyphBBoxes(glyphs: &glyphs, count: glyphCount, bboxes: &rects)
        // iterate each glyph rect
        for rect in rects {
            // obtain the units per em from the font ref so we can convert the rect
            let unitsPerEm = CGFloat(fontRef.unitsPerEm)
            // sanity to prevent divide by zero
            guard unitsPerEm != 0.0 else { continue }
            // calculate the actual distance up or down from the glyph's baseline
            let glyphY = (rect.origin.y / unitsPerEm) * font.pointSize
            // calculate the actual height of the glyph
            let glyphHeight = (rect.size.height / unitsPerEm) * font.pointSize
            // calculate the distance from the baseline to the top of the glyph
            let glyphDistanceFromBaseline = glyphHeight + glyphY
            // store the max distance amongst the glyphs
            maxDistanceFromBaseline = max(maxDistanceFromBaseline, glyphDistanceFromBaseline)
        }
    }
    // the final top margin, calculated by taking the largest ascender of all the glyphs in the font and subtracting the max calculated distance from the baseline
    return font.ascender - maxDistanceFromBaseline
}
You can now set the text view's top contentInset to -distanceToGlyphs to achieve the desired result.
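For example, in the same UITextView subclass you might apply it after each layout pass (a minimal sketch; where you trigger the update depends on your layout code):

override func layoutSubviews() {
    super.layoutSubviews()
    // Pull the tallest glyph flush with the top of the text view's bounds
    contentInset.top = -distanceToGlyphs
}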

Show text at an exact height in a UITextField

I have to program a vision test for an optician, so the numbers need to have an exact, specific height.
He uses an iPad and streams it to a TV. I know I have to take the PPI of the TV into account. Here is my code:
func calcPoints() -> Float {
    // The visus value is entered in a text field
    let visus = Float(textFieldVisus.text!)
    // The PPI of the device is entered here
    let ppi = Float(textFieldDPI.text!)
    // Calculate the physical length of the number (in millimetres)
    let length = ((0.29 * 5) / visus!) * 5
    // Convert to points (because text fields work with points)
    let points = ((ppi! / 25.4) * length) * 0.75
    // Divide by 2 because the iPad has a Retina display
    return (points / 2)
}

func passeUIan() {
    // Now I can pass the points to the text field
    textField1.bounds.size.width = CGFloat(calcPoints())
    textField1.bounds.size.height = CGFloat(calcPoints())
    textField1.font = UIFont(name: (textField1.font?.fontName)!, size: CGFloat(calcPoints()))
}
But when I measure the length on the TV, it is wrong.
Normally it should be 7.25 mm, but it is approximately 9 mm.
I don't know what is wrong. I've been searching for this problem for 2 weeks...
You need to first familiarize yourself with the different font metrics. The font size is usually (but not always) the distance from the ascender to the descender. For your purpose, the height of an uppercase letter is called the "cap height" and the height of a lowercase letter is called the "x-height".
There's no formula to translate the font size to either cap height or x-height. Their relationship differs from font to font, and even between variants (bold, italic, small caps, display, book) within a font.
The function below uses binary search to look for a point size that matches your desired height (in inches):
// desiredHeight is in inches
func pointSize(inFontName fontName: String, forDesiredCapHeight desiredHeight: CGFloat, ppi: CGFloat) -> CGFloat {
    var minPointSize: CGFloat = 0
    var maxPointSize: CGFloat = 5000
    var pointSize = (minPointSize + maxPointSize) / 2
    // Finding an exact match may not be possible. UIFont may round off
    // the sizes. If it's within 0.01 in (0.26 mm) of the desired height,
    // we consider that good enough.
    let tolerance: CGFloat = 0.01
    while let font = UIFont(name: fontName, size: pointSize) {
        let actualHeight = font.capHeight / ppi * UIScreen.main.scale
        if abs(actualHeight - desiredHeight) < tolerance {
            return pointSize
        } else if actualHeight < desiredHeight {
            minPointSize = pointSize
        } else {
            maxPointSize = pointSize
        }
        pointSize = (minPointSize + maxPointSize) / 2
    }
    return 0
}
Example: find the point size that makes a capital letter 1 inch tall in Helvetica (326 is the PPI of the iPhone 6 / 6S / 7, which I used for testing):
let size = pointSize(inFontName: "Helvetica", forDesiredCapHeight: 1, ppi: 326)
label.font = UIFont(name: fontName, size: size)
label.text = "F"
(Tip: a UILabel handles font sizing much better than a UITextField.)

Creating Variable Constraints for a UIImageView in Swift?

I'm trying to make some UIImageViews in Xcode that fill the screen based on different conditions. For example, I might have a 3x3 grid of images that has to fill the screen, or a 4x4 grid that must fill the screen, depending on the initial conditions. Every time I try to accomplish this, the image views just end up being the same size for both conditions. I've tried many different solutions, but the one I'm currently trying is:
if fieldDimensions == 3 {
    let spacing = screenWidth / 16
    let boxsize = screenWidth / 4
    let xadjust = spacing / 2 // Value to help align the view
    let interval = spacing + boxsize
    Button1Image.translatesAutoresizingMaskIntoConstraints = false
    button1height.constant = boxsize
    button1width.constant = boxsize
    button2height.constant = boxsize
    button2width.constant = boxsize
} else if fieldDimensions == 4 {
    let spacing = screenWidth / 20
    let boxsize = screenWidth * 3/16
    let interval = spacing + boxsize
    button1height.constant = boxsize
    button1width.constant = boxsize
    button2height.constant = boxsize
    button2width.constant = boxsize
}
All of the button heights and widths are linked to the height and width constraints in the storyboard (I just control-dragged them). Any help would be really appreciated; I've been working on this problem for almost a week now. Thanks!
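For reference, a minimal sketch of the setup described above, using the outlet names from the question (the layoutIfNeeded() call is an assumption about applying the updated constants, not a confirmed fix for the sizing issue):

@IBOutlet weak var button1height: NSLayoutConstraint!
@IBOutlet weak var button1width: NSLayoutConstraint!

func applyBoxSize(_ boxsize: CGFloat) {
    button1height.constant = boxsize
    button1width.constant = boxsize
    // Ask Auto Layout to apply the updated constants in the next layout pass
    view.layoutIfNeeded()
}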
