iOS 10 Blur Effects Prominent and Regular not working

I didn't find any related question on the web. I am trying to get a blur view to display the new blur effect styles .prominent and .regular, but they are not showing. When I change the blur effect to .light, .extraLight or .dark, it works fine. The description says that the new blur effects adapt to the user interface. What does that mean, and why aren't those two new blur effects working?
I have iOS 10 in both simulator and in my iPhone and none of them are displaying the new blur effect. Print statements say that the if statement (instead of the else) is being called, as expected.
let blurEffect: UIBlurEffect
if #available(iOS 10.0, *) {
    blurEffect = UIBlurEffect(style: .prominent)
} else {
    // Fallback on earlier versions
    blurEffect = UIBlurEffect(style: .light)
}
let blurView = UIVisualEffectView(effect: blurEffect)
blurView.frame = CGRect(x: 100, y: 100, width: 200, height: 300)
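For the view to appear at all, it also has to be added to a visible hierarchy (assumed context: a view controller's viewDidLoad):
// Assumed context: inside a view controller, e.g. in viewDidLoad
view.addSubview(blurView)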

> What does that mean, and why aren't those two new blur effects working?
From WWDC 2016 Session 206, "What's New in tvOS":
We've also added two new blur styles to the API.
You can now use UIBlurEffectStyleRegular or UIBlurEffectStyleProminent.
And we call these automatic styles.
And they'll actually adjust the effective blur effect style based on what the system setting is.
So if you use UIBlurEffectStyleRegular and the system's in light, it will use UIBlurEffectStyleLight.
If you use regular and dark, you'll use dark.
If you use prominent, it will use .extraLight and .extraDark.
.extraDark will be coming in a later seed
Full session transcript: http://asciiwwdc.com/2016/sessions/206
Basically, these effects are for tvOS, which can be in a light or dark style. On iOS, these effects resolve to their light variants (.regular behaves like .light, and .prominent like .extraLight).
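To see this for yourself, a quick sketch (assuming iOS 10+, inside a view controller with content behind the views): place a .regular blur next to a .light one; on iOS they should render identically, since .regular resolves to .light.
let regularView = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
regularView.frame = CGRect(x: 20, y: 100, width: 150, height: 150)
view.addSubview(regularView)

let lightView = UIVisualEffectView(effect: UIBlurEffect(style: .light))
lightView.frame = CGRect(x: 190, y: 100, width: 150, height: 150)
view.addSubview(lightView)
// Both views should look the same on iOS; .prominent would likewise match .extraLight.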

Related

Color unselected segment of UISegmentedControl without hiding selection

I wanted to color each segment in my UISegmentedControl in a different color. I used the following Swift code (there were some changes to UISegmentedControl in iOS 13):
segmentedControl.selectedSegmentTintColor = .white
let col: UIColor = .yellow
let subViewOfSegment: UIView = segmentedControl.subviews[2]
subViewOfSegment.layer.backgroundColor = col.cgColor
This works in general; one segment is now coloured. However, the selected segment is not shown anymore. The selected segment is supposed to be white, but it seems to be overlaid by the colour. The following image shows how it looks when I select each segment from left to right:
I already tried subViewOfSegment.backgroundColor = col instead (same effect) and subViewOfSegment.tintColor = col (no effect at all in iOS 13), but I can't get the colors without hiding the selection. In other posts I only find this answer, which doesn't say how to color unselected segments.
iOS 13 (Xcode 13.4) Segment Control 100% Working
mySegmentedControl.selectedSegmentTintColor = .white

let subView = mySegmentedControl.subviews[1]
subView.layer.backgroundColor = UIColor.yellow.cgColor

let subViewOfSegment = mySegmentedControl.subviews[1]
subViewOfSegment.backgroundColor = UIColor.red
// Push the coloured subview behind the selection indicator
subViewOfSegment.layer.zPosition = -999
> The above solution is not ideal, but it is one way to achieve it. You need to manage one more label while the selected segment is red.
You can do it by using the following code:
init() {
    UISegmentedControl.appearance().backgroundColor = .yellow
}

var body: some View {
    // your code…
}
If you want to change the selected segment's color, you can use:
UISegmentedControl.appearance().selectedSegmentTintColor = .blue
I hope this code helps :D
For more detail, you can visit here.
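Putting it together, a minimal self-contained sketch (the view and state names here are hypothetical, assuming a segmented Picker):
import SwiftUI

struct SegmentedDemoView: View {
    @State private var selection = 0  // hypothetical selection state

    init() {
        // Appearance changes apply to every UISegmentedControl in the app,
        // including the ones backing SwiftUI's segmented Picker
        UISegmentedControl.appearance().backgroundColor = .yellow
        UISegmentedControl.appearance().selectedSegmentTintColor = .blue
    }

    var body: some View {
        Picker("Options", selection: $selection) {
            Text("First").tag(0)
            Text("Second").tag(1)
        }
        .pickerStyle(SegmentedPickerStyle())
        .padding()
    }
}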
Using Apple APIs, you could try drawing an image with the four colors and setting that as the background using setBackgroundImage(_:for:barMetrics:).
Here's some code to draw a 40x10 image with just one color to get you started:
let color: UIColor = .yellow  // any fill colour
let rect = CGRect(origin: CGPoint(x: 0, y: 0), size: CGSize(width: 40, height: 10))
UIGraphicsBeginImageContext(rect.size)
let context = UIGraphicsGetCurrentContext()!
context.setFillColor(color.cgColor)
context.fill(rect)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
To get an image with your four colors, you need to repeat context.setFillColor(<varying color>) and context.fill(<varying rect>) four times.
The image should stretch, which means it only needs to have four equally sized colored rectangles; you don't need to create it in the exact size of the segmented control.
You can then set the selected segment color using the selectedSegmentTintColor property.
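For illustration, a sketch that extends the snippet above to four stripes and applies the result (the segment colors are arbitrary, and segmentedControl is assumed to exist):
let colors: [UIColor] = [.red, .yellow, .green, .blue]  // arbitrary example colors
let stripeWidth: CGFloat = 10
let size = CGSize(width: stripeWidth * CGFloat(colors.count), height: 10)
UIGraphicsBeginImageContext(size)
let ctx = UIGraphicsGetCurrentContext()!
for (index, color) in colors.enumerated() {
    ctx.setFillColor(color.cgColor)
    ctx.fill(CGRect(x: CGFloat(index) * stripeWidth, y: 0,
                    width: stripeWidth, height: size.height))
}
let backgroundImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
// The image stretches, so four equal stripes are enough
segmentedControl.setBackgroundImage(backgroundImage, for: .normal, barMetrics: .default)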

iOS 12, Xcode 10: UIView setNeedsDisplay(_:) seems to be broken

After updating to Xcode 10 I realized that the draw(_ rect: CGRect) routine of my custom UIView (class derived from UIView) in my application was called with the wrong rect. Indeed it is always called with rect being the full frame of the underlying UIView, instead of the rect being specified by setNeedsDisplay(_ rect: CGRect).
Here is a code snippet that can be run as a playground, which at least in my setup shows the erroneous behavior described above in a minimalistic setting:
import Foundation
import UIKit
import PlaygroundSupport

class CustomView: UIView {
    override func draw(_ rect: CGRect) {
        print("rect = \(rect)")
    }
}

let customView = CustomView(frame: CGRect(origin: .zero, size: CGSize(width: 200.0, height: 200.0)))
PlaygroundPage.current.liveView = customView
print("test")
customView.setNeedsDisplay(CGRect(origin: .zero, size: CGSize(width: 100.0, height: 100.0)))
The output I get is
rect = (0.0, 0.0, 200.0, 200.0)
test
rect = (0.0, 0.0, 200.0, 200.0)
The first printed rect is the standard full redraw of the view; the second one, after "test", shows the problem. It comes from the redraw triggered by the earlier customView.setNeedsDisplay call and should be the smaller specified rectangle (0.0, 0.0, 100.0, 100.0).
So my obvious questions are:
Can you reproduce this behavior?
Am I missing something obvious?
Is this a bug?
This is actually intentional with iOS 12's new dynamic backing store feature.
What is a backing store
A backing store is what stores the drawn view, and it needs memory assigned to do so. The amount of memory depends on how big the view is, as the store is essentially a map between pixels and colours.
If you were to draw a grayscale image but the memory had been assigned for the wide colour gamut, that would result in lots of empty assigned memory (grayscale has a lower footprint than RGBA). To get around this, the dynamic backing store feature works by drawing the whole content of a view and THEN working out how much memory it needs, rather than assuming everything needs wide-colour backing from the start.
The knock-on effect of this is that you can't redraw a smaller subsection of the view, as that might then change this store.
How to get around it
This is a great new feature, but if you really do need to work around it, you can disable the dynamic backing store on your view. You do that by explicitly setting the contentsFormat property of the view's layer.
There are three options to choose from, corresponding to grayscale, 8-bit RGBA, and 16-bit RGBA (wide colour), so just call:
layer.contentsFormat = .RGBA16Float
and your setNeedsDisplay(_ rect: CGRect) will start working as expected again.
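For example, a minimal sketch (class name hypothetical) that opts a custom view out of the dynamic backing store:
import UIKit

class PartialRedrawView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        // Force a fixed wide-colour backing store, restoring partial redraws
        layer.contentsFormat = .RGBA16Float
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        layer.contentsFormat = .RGBA16Float
    }

    override func draw(_ rect: CGRect) {
        // rect should now match the rectangle passed to setNeedsDisplay(_:)
        print("rect = \(rect)")
    }
}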
You can read up on the property here: https://developer.apple.com/documentation/quartzcore/calayer/1792104-contentsformat
There's also a great talk from WWDC 18 that explains the new dynamic backing store and (very quietly) mentions this technique
https://developer.apple.com/videos/play/wwdc2018/219/?time=1451
I tested this in Xcode 9, 10 & 10.1.
The behaviour has definitely changed between iOS 11 and iOS 12 / 12.1
There's no indication in the documentation or header file that this was intentional.
Looks like a bug to me.

Is it possible to use custom iOS UI elements like UILabel in augmented reality app

I am wondering if I can use UI elements like UIButton, UILabel in an augmented reality app with ARKit.
If you are also interested in transparency modes for those UIView subclasses, try my sample: https://github.com/erikhric/ar-menu
You can use different blending modes. I guess .alpha will work for your purposes.
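For instance, a sketch assuming the UIView is hosted on an SCNPlane's material (plane here is hypothetical):
// Hypothetical: 'plane' is the SCNPlane whose material hosts the UIView
plane.firstMaterial?.blendMode = .alpha  // respect the hosted view's transparency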
Yes, you can use UIKit elements by adding them to a UIView that's positioned above the view displaying the AR scene (ARSKView or ARSCNView).
If you create a new project in Xcode and select the "Augmented Reality App" template, you can see that the AR content is just a view like any other UIKit view.
What worked best for me
In Main.storyboard:
- delete the SceneView
- add a regular UIView
- add an ARKit SceneKit View on top of that
- then you can add buttons, etc.
Yes you can place UI elements on top of the ARSKView or ARSCNView displaying the AR scene:
let scanningPanel = UIImageView()
scanningPanel.backgroundColor = UIColor(white: 0.33, alpha: 0.6)
scanningPanel.layer.masksToBounds = true
scanningPanel.frame = CGRect(x: -2,
                             y: self.sceneView.frame.height - 270,
                             width: 178,
                             height: 50)
scanningPanel.layer.cornerRadius = 10

let scanInfo = UILabel(frame: CGRect(x: 8,
                                     y: self.sceneView.frame.height - 268,
                                     width: 160,
                                     height: 45))
scanInfo.textAlignment = .left
scanInfo.font = scanInfo.font.withSize(15)
scanInfo.textColor = UIColor.white
scanInfo.text = "SCAN A SURFACE"
Adding:
self.sceneView.addSubview(scanningPanel)
self.sceneView.addSubview(scanInfo)
Removing:
if(scanInfo.isDescendant(of: self.sceneView)) {
scanInfo.removeFromSuperview()
}
You can insert the content of any view on a plane in ARKit like this:
let plane = SCNPlane(width: sceneView.bounds.width / 3000,
                     height: sceneView.bounds.height / 3000)
plane.firstMaterial?.diffuse.contents = self.anyView
Gestures and taps are automatically sent to that view.
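To make the plane visible, it still needs a node in the scene; a minimal sketch (position arbitrary):
let planeNode = SCNNode(geometry: plane)
planeNode.position = SCNVector3(0, 0, -0.5)  // half a metre in front of the camera
sceneView.scene.rootNode.addChildNode(planeNode)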
Try my example.

iOS 11 breaks row selection

I've recently tested my app in iOS 11 and for some reason I'm not able to select one of the first 12 rows in a dynamically populated table view. The didSelectRow isn't even triggered for these rows. The other rows work fine, but even when scrolling down and back up (the cells should have been re-used again by then) the first 12 rows don't work.
Even on a static table view all cells that appear on screen when switching to that view controller will not respond, neither will controls inside them, even when they are in different sections. Cells that are out of screen initially again work fine.
I'll be trying to test this in an app with boilerplate code, but is this a known bug? I couldn't find anything online about it.
I've tested this after updating the devices to iOS 11, then again from Xcode 9 beta 6 without changes to the code, and again after migrating to Swift 4. Same behaviour inside the simulator. Up to iOS 10 everything is fine, only with iOS 11 the problem occurs.
This will break my app for users in two weeks, I need to fix it, so any help or advice very much appreciated!
UPDATE: As Paulw11 suggested, there is indeed another view blocking the rows. This was notable as row 12 could only be selected in the lower part of the cell, but not in the upper part.
The cause for this issue is the following code:
extension UIViewController {
    func setBackgroundImage(forTableView tableView: UITableView) {
        let bgImage = UIImage(named: "Background Image.png")
        let bgImageView = UIImageView(image: bgImage)
        tableView.backgroundView = bgImageView
        let rect = bgImageView.bounds
        let effect = UIBlurEffect(style: UIBlurEffectStyle.dark)
        let blurView = UIVisualEffectView(effect: effect)
        let height: CGFloat
        switch screenSize.height {
        case 480, 568: height = 455
        case 736: height = 623
        default: height = 554
        }
        blurView.frame = CGRect(x: 0, y: 0, width: rect.width, height: height)
        let container = UIView(frame: rect)
        bgImageView.addSubview(blurView)
        let bgOverlay = UIImage(named: "Background Overlay.png")
        let bgOverlayImageView = UIImageView(image: bgOverlay)
        bgOverlayImageView.alpha = 0.15
        bgImageView.addSubview(bgOverlayImageView)
        self.view.insertSubview(container, at: 1)
    }
}
Somehow, since iOS 11 this background image seems to be rendered in front of the cells. Not inserting the container view into the view hierarchy solves the issue. I've tried setting the zPosition of the container's layer, but it does not help. How can I move the background image behind the cells again?
It's weird that this behaviour would change from iOS 10 to 11...
UPDATE 2: Inserting the container at index -1 fixes the issue:
self.view.insertSubview(container, at: -1)
I don't get why this works, though; shouldn't that index be out of range?
UPDATE 3: As Paulw11 pointed out below, the container is completely useless, it was left over from testing and removing it fixes the issue.
The container view seems to be appearing in front of the other views and preventing touches from making it through to the table view.
As an aside, I would see if you can refactor this to use constraints; it always worries me when I see hard-coded screen sizes, as that may break when new devices are released.
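For illustration, a hedged sketch of the constraint-based approach: pin the blur view to the background image view and replace the hard-coded heights with a bottom offset (the -113 offset reproduces the 568-, 667- and 736-point cases in the code above; adjust to taste):
blurView.translatesAutoresizingMaskIntoConstraints = false
bgImageView.addSubview(blurView)
NSLayoutConstraint.activate([
    blurView.topAnchor.constraint(equalTo: bgImageView.topAnchor),
    blurView.leadingAnchor.constraint(equalTo: bgImageView.leadingAnchor),
    blurView.trailingAnchor.constraint(equalTo: bgImageView.trailingAnchor),
    // A fixed offset instead of per-device heights (assumption: 113 points)
    blurView.bottomAnchor.constraint(equalTo: bgImageView.bottomAnchor, constant: -113)
])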

Using UIVisualEffectView to create a blur view: correct on simulator but not on iPhone & iPad

The goal: create a blur view in app.
The code I use:
func createBlurBackgroundView() {
    if !UIAccessibilityIsReduceTransparencyEnabled() {
        if blurredSubView == nil || blurredSubView.superview == nil {
            print("Create blurred background view")
            self.backgroundColor = UIColor.clearColor()
            let blurEffect = UIBlurEffect(style: UIBlurEffectStyle.Light)
            blurredSubView = UIVisualEffectView(effect: blurEffect)
            blurredSubView.frame = self.bounds
            self.insertSubview(blurredSubView, atIndex: 0)
        }
    } else {
        print("Transparency disabled!! no blur view")
    }
}
The result:
Everything works fine on the simulator:
But when I run it on the iPhone and iPads, it looks like this:
PLEASE NOTE I DIDN'T CHANGE THE "REDUCE TRANSPARENCY" SETTINGS!
Then, when I wanted to take a snapshot of this black background without blur, guess what? In the photo stream I saw exactly the correct blur view picture...
Also, when I double-clicked the home button and looked at the multitasking interface, I saw the blur effect!
More info:
I use an iPhone 6s and an iPad Air 2, both on iOS 9.3.1.
Xcode version 7.3
I'm tired of trying to solve this problem. I tried other methods, like taking a snapshot image and then applying a blur effect to the image, but they have other bugs and drawbacks.
UIVisualEffectView does not work with SpriteKit. I don't know what they do differently behind the scenes; if someone knows, please feel free to edit this answer. My guess is that the underlying implementation uses different APIs that don't work together. The simulator does all kinds of tricks to simulate the actual device, so it might use something different behind the scenes than the real devices, and that's why it does work on a simulator.
Remove this line in your code:
self.backgroundColor = UIColor.clearColor()
