iOS 15: SwiftUI Canvas/TimelineView terrible performance

I've been playing around with the new Canvas/TimelineView in iOS 15. I tried to create a particle system following the official WWDC tutorial, but I couldn't get the performance issues under control.
Here is the code:
import SwiftUI

struct MyView: View {
    @State private var count = 32*32
    var body: some View {
        TimelineView(.animation) { timeline in
            Canvas { context, size in
                // `now` would normally drive the particle animation; it is unused here
                let now = timeline.date.timeIntervalSinceReferenceDate
                for i in 0..<count {
                    context.fill(
                        Ellipse().path(in: CGRect(x: size.width * 0.01 * Double(i),
                                                  y: size.height * 0.01 * Double(i),
                                                  width: 20.0, height: 20.0)),
                        with: .color(.green))
                }
            }
        }
    }
}
This just draws 1024 circles, yet it already consumes about 20% CPU in the Simulator and 50% CPU on my iPhone 8. Considering the power of the iPhone and the claimed efficiency of these new frameworks, is this expected behavior? How should I fix it if I need far more than 1024 circles?
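One common mitigation (my own suggestion, not something confirmed in this thread) is to cut down the per-frame GraphicsContext work: accumulate all the circles into a single Path and fill it once, and optionally let the canvas render asynchronously. A minimal sketch, assuming the visual result of 1024 same-colored circles is all that matters:

import SwiftUI

// Sketch only: Canvas(rendersAsynchronously:) plus one combined Path are the only changes.
struct ManyCirclesView: View {
    let count = 32 * 32

    var body: some View {
        TimelineView(.animation) { _ in
            Canvas(rendersAsynchronously: true) { context, size in
                var path = Path()
                for i in 0..<count {
                    path.addEllipse(in: CGRect(x: size.width * 0.01 * Double(i),
                                               y: size.height * 0.01 * Double(i),
                                               width: 20.0, height: 20.0))
                }
                // One fill for all circles instead of 1024 separate fill calls.
                context.fill(path, with: .color(.green))
            }
        }
    }
}

Whether this is enough depends on how the real particle system animates; if each particle needs its own color or blend mode, batching everything into one path no longer applies.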

Related

UIScreen.main is deprecated, what are solutions other than GeometryReader? [closed]

I'm targeting iOS 16 for my app in which I access the screen height and width using UIScreen.main.bounds.width and UIScreen.main.bounds.height so I can draw views based on these two values. I'm assigning these two values to two CGFloat properties in the view struct as follows:
struct ContentView: View {
    var width: CGFloat = UIScreen.main.bounds.width
    var height: CGFloat = UIScreen.main.bounds.height
    var fontSize: CGFloat

    var body: some View {
        // draw views here using the width and height properties
    }
}
Xcode shows a warning saying: 'main' will be deprecated in a future version of iOS: use a UIScreen instance found through context instead: i.e., view.window.windowScene.screen
I'm not sure how to apply the answer here to my use case and I don't want to use GeometryReader since it just messes up the overall layout.
Any suggestions on how to obtain screen width and height in an app targeting iOS 16 and above without using GeometryReader?
Swift 5.5
import SwiftUI

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            GeometryReader { proxy in
                ContentView()
                    .environment(\.mainWindowSize, proxy.size)
            }
        }
    }
}

struct ContentView: View {
    @Environment(\.mainWindowSize) var windowSize

    var body: some View {
        ZStack {
            Text("Hello, world!")
                .frame(width: windowSize.width / 2, height: windowSize.height)
                .background(Color.blue)
        }
    }
}

private struct MainWindowSizeKey: EnvironmentKey {
    static let defaultValue: CGSize = .zero
}

extension EnvironmentValues {
    var mainWindowSize: CGSize {
        get { self[MainWindowSizeKey.self] }
        set { self[MainWindowSizeKey.self] = newValue }
    }
}
There is a new feature in iOS 16: users can duplicate an application. It can look as if the user has five apps even though only one is actually running. When that happens, things like UIScreen.main no longer make sense, because your code is effectively responsible for five apps at once.
A given view belongs to ONE of those five virtual apps, so view.window.windowScene.screen is the UIScreen that is actually responsible for what the user sees (not the four other instances of the app in the background). In iOS 17, code like UIScreen.main will be deprecated; state that used to be global in iOS 15 isn't global anymore.
Take an extreme case: two copies of the app, both running in split screen, one on the left third and one on the right two thirds of the screen. There is nothing "global" you could rely on.
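As an illustration of reading the screen through the view hierarchy (my own sketch, not part of the original answer), you can bridge into UIKit and ask the view's window scene for its screen once the view is attached to a window; ScreenReader and onResolve are hypothetical names:

import SwiftUI
import UIKit

// Sketch: resolves the UIScreen belonging to this view's window scene,
// instead of the process-global UIScreen.main.
struct ScreenReader: UIViewRepresentable {
    var onResolve: (UIScreen) -> Void

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        // The window/scene is only available after the view is attached,
        // so defer the lookup to the next run-loop pass.
        DispatchQueue.main.async {
            if let screen = view.window?.windowScene?.screen {
                onResolve(screen)
            }
        }
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {}
}

For pure layout, the environment-based mainWindowSize approach above is usually what you want; a helper like this only matters when an actual UIScreen instance is required.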

SwiftUI dragging an object lags on newer iPhones

I have this really simple code for dragging an object on the screen:
import SwiftUI

struct ContentView: View {
    @GestureState var translation = CGSize.zero

    var body: some View {
        Rectangle()
            .frame(width: 200, height: 200)
            .offset(translation)
            .gesture(
                DragGesture().updating($translation, body: { value, state, transaction in
                    state = value.translation
                })
            )
    }
}
When testing on device, it works perfectly smoothly on an older model (iPhone 7 Plus), but it's somewhat laggy on newer ones (iPhone 13 mini and iPhone 13). I could not reproduce it in the Simulator, only on a real device. Is there something I'm missing?
[Video: iPhone 13 mini]
[Video: iPhone 7 Plus]
The difference is only slightly visible in the videos, but in person the lag feels more pronounced on the 13.

iOS 15 bug? Overlapping touch areas on iPhone screens with some HStack()ed Picker() objects

Hello developers!
I've run into a situation where I'd like some feedback from experienced Swift developers who are up to date.
For an iOS app I need some user inputs, three to be precise. In the current state each of them can be treated as a whole number, so I thought I'd store them in integer variables. I decided to use Swift as the language and SwiftUI for the interface.
Table of contents:
Story
Situation (Problem)
Question
Hardware
Example Code
THE STORY (in short):
I tried to find a way to get the user input into the variable. TextField() handles String, and tapping a TextField() brings up an on-screen keyboard. After entering input, the user can hit the 'return' button and the keyboard disappears. My goal was to intercept all characters that are not part of an integer, so I added a filter function that only allows the digits 0 to 9. Fine, but with the default keyboard the user has to hit the symbols button to reach the number pad.
So I switched to a 'Number Pad'. Despite the unused space, it has no 'return' button, and the keyboard also blocks part of the interface. Perhaps implementing something like IQKeyboardManager by 'Iftekhar Qurashi' (see: https://cocoapods.org/pods/IQKeyboardManager) could work around this shortcoming, but I asked myself whether such a basic task really requires an external library. There should be a native way, right?
THE SITUATION:
After discarding that idea, I looked around and went with some Picker()s, which worked well... visually. If you grab a picker wheel that has a neighbor to its right, you end up moving the right neighbor's wheel. This is shown in a video (just below 'Step 4:'): https://blckbirds.com/post/swiftui-how-to-create-a-multi-component-picker/
The same happens on physical devices in portrait mode (iPhone 6S, X and 13). Landscape offers enough space.
The suggested cure for the unwanted behavior, namely adding '.compositingGroup().clipped()' at the end of each Picker()'s closure, does not work for me (iOS 15). As 'Dave Reed' mentioned in the comments, the fact that the pickers work fine in Xcode on my Intel MacBook (preview and simulators) could be because I'm currently running 13.1 there.
For the screenshot I commented out the 'compositingGroup().clipped()' lines, as shown in the example code. There are two gray shades where I think the touch area is (roughly?) positioned. The same applies if 'compositingGroup().clipped()' is active at positions 1.1 and 2.1 (see comments). If 'compositingGroup().clipped()' is active at positions 1.2 and 2.2, the gray bars get clipped but the touch area is still wide and overlaps the left neighbor.
See some screenshots:
[Screenshot: iPhone 13 portrait]
[Screenshot: iPhone 13 landscape]
THE QUESTION:
Is there a way of cutting or stretching the touch area for a Picker() object to fit its current stack?
Are there any ideas for using a number-only keyboard with a return button?
THE HARDWARE:
Development Platforms:
MacBook Pro (Retina 13", Early 2015, Dual-Core Intel Core i5): fine on simulators and Preview (Canvas) (iPhone 8, 13), Xcode 13.1
MacBook Pro (16", 2021, Apple M1 Pro): situation occurs on simulators and Preview (Canvas) (iPhone 8, 13), Xcode: now 13.2.1
Hardware Test Devices:
iPhone 6s, iOS Version: 15.2, Situation occurs
iPhone X, iOS Version: 15.2, Situation occurs
iPhone 13, iOS Version: 15.1.1, Situation occurs
THE FULL EXAMPLE CODE (incl. Preview):
// demonstration file for bad multiple picker touch area overlay
import SwiftUI
struct ContentView: View {
#State var rice: Int = 51
#State var corn: Int = 20
let range: ClosedRange<Int> = 0...99
var body: some View {
VStack{
HStack(alignment: .center, spacing: 0){
Spacer()
VStack(){//picker 1
Text("rice")
VStack(){
Picker("", selection: $rice) {
ForEach(range, id: \.self){ Text("\($0)").tag($0) }
}
//POSITION 1.1
.pickerStyle(.wheel)
//.compositingGroup()
//.clipped(antialiased: true)
}
//POSITION 1.2
.frame(width: 40, height: 100)
//.compositingGroup()
//.clipped(antialiased: true)
.border(Color.red)
}
Spacer()
VStack(){//picker 2
Text("corn")
VStack(){
Picker("", selection: $corn) {
ForEach(range, id: \.self){ Text("\($0)").tag($0) }
}
//POSITION 1.2
.pickerStyle(.wheel)
//.compositingGroup()
//.clipped(antialiased: true)
}
//POSITION 2.2
.frame(width: 40, height: 100)
//.compositingGroup()
//.clipped(antialiased: true)
.border(Color(UIColor.secondaryLabel))
}
Spacer()
}// End of HStack
.padding()
}.overlay(
RoundedRectangle(cornerRadius: 15)
.stroke(lineWidth: 2)
).padding()
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
ContentView()
}
}
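Regarding the second question (a number-only keyboard with a return button): one native option, offered as my own hedged suggestion rather than something from the original post, is to keep .keyboardType(.numberPad) and add a keyboard toolbar with a Done button plus a @FocusState to dismiss it (requires iOS 15). The names AmountEntryView and riceText are hypothetical:

import SwiftUI

struct AmountEntryView: View {
    @State private var riceText = ""
    @FocusState private var amountFieldIsFocused: Bool

    var body: some View {
        TextField("rice", text: $riceText)
            .keyboardType(.numberPad)
            .focused($amountFieldIsFocused)
            // The number pad has no return key, so provide a Done button above it.
            .toolbar {
                ToolbarItemGroup(placement: .keyboard) {
                    Spacer()
                    Button("Done") { amountFieldIsFocused = false }
                }
            }
    }
}

Combined with a digits-only filter like the one described in the story above, this avoids an external library such as IQKeyboardManager for this particular case.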

SwiftUI Eye Tracking: eye tracking performance is slow, but runs smoothly after device rotation

I have recently started coding in SwiftUI with no prior knowledge or experience in application programming.
I have a simple eye gaze tracking app using this Framework:
https://github.com/ukitomato/EyeTrackKit
Using the given examples, I have this code running:
import SwiftUI
import EyeTrackKit

struct ContentView: View {
    @ObservedObject var eyeTrackController: EyeTrackController = EyeTrackController(device: Device(type: .iPad), smoothingRange: 10, blinkThreshold: .infinity, isHidden: true)

    var body: some View {
        ZStack(alignment: .topLeading) {
            eyeTrackController.view
            Circle()
                .fill(Color.blue.opacity(0.5))
                .frame(width: 25, height: 25)
                .position(x: eyeTrackController.eyeTrack.lookAtPoint.x, y: eyeTrackController.eyeTrack.lookAtPoint.y)
        }
        .edgesIgnoringSafeArea(.all)
    }
}
On launch, the circle barely moves, updating its position only once every 3-5 seconds.
After rotating the device to landscape and then immediately back to portrait, the app runs perfectly fine, with no stutter and smooth transitions from point to point.
Any idea what can cause this issue?
Thanks in advance!
After a while, I found a solution to this problem.
On the line:
@ObservedObject var eyeTrackController: EyeTrackController = EyeTrackController(device: Device(type: .iPad), smoothingRange: 10, blinkThreshold: .infinity, isHidden: true)
the isHidden value should be false instead of true.
I don't know exactly why, but now the code runs smoothly.
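For clarity, the corrected declaration (the same call as above, with only isHidden flipped) would be:

// isHidden changed from true to false, per the answer above
@ObservedObject var eyeTrackController: EyeTrackController = EyeTrackController(device: Device(type: .iPad), smoothingRange: 10, blinkThreshold: .infinity, isHidden: false)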

Working around SwiftUI Path crash in Xcode 11 beta 5

Apple broke Path in Xcode 11 beta 5:
A known issue in Xcode 11 beta 5 causes your app to crash when you use the Path structure.
So I'm trying to work around this using CGMutablePath:
var body: some View {
    GeometryReader { geometry in
        let path = CGMutablePath()
        path.addRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        return Path(path)
    }
}
This draws a square.
When I try to change the color as follows:
var body: some View {
    GeometryReader { geometry in
        let path = CGMutablePath()
        path.addRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        return Path(path).fill(Color.purple)
    }
}
I get:
Cannot convert return expression of type 'GeometryReader<_>' to return type 'some View'
Function declares an opaque return type, but has no return statements in its body from which to infer an underlying type
I'm not sure what return type to use. I tried Path, but evidently fill doesn't return another Path.
I tried View but I get:
Protocol 'View' can only be used as a generic constraint because it has Self or associated type requirements
I tried some View but it didn't seem to even parse.
The iOS 13 beta 7 release notes say it's fixed in Xcode 11 beta 6, so we just have to wait for that to be released. Hopefully tomorrow!
From the release notes:
Resolved Issues
Using the Path structure no longer causes your app to crash if you’re using the SDKs included in Xcode 11 beta 6 and later. (53523206)
It will still crash... Path is just too broken! But if you're curious how to get this to compile successfully, you can do something like this:
struct MyShape: View {
    @State private var flag = false
    var body: some View {
        return GeometryReader { (geometry) -> AnyView in
            let path = CGMutablePath()
            path.addRect(CGRect(x: 0, y: 0, width: 100, height: 100))
            return AnyView(Path(path).fill(Color.purple))
        }
    }
}
It will crash on Path(path)
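For reference, once the Path crash was resolved in later betas, a cleaner pattern (my own sketch, not taken from the original answers) is to conform to Shape, which sidesteps the opaque-return-type problem because the fill is applied to the shape from outside the builder:

import SwiftUI

// Sketch of the Shape-based approach; PurpleSquare and SquareView are hypothetical names.
struct PurpleSquare: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.addRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        return path
    }
}

struct SquareView: View {
    var body: some View {
        PurpleSquare()
            .fill(Color.purple)
    }
}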
