I'm trying to build a chat view in SwiftUI and I want to attach my input views to the keyboard, so that when I dismiss the keyboard by dragging, my view moves with it.
When I was using UIKit, I overrode the inputAccessoryView of the view controller. Is something similar possible in SwiftUI?
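For reference, the UIKit pattern I mean looks roughly like this (a simplified sketch; the class and property names are only illustrative):
import UIKit

class ChatViewController: UIViewController {
    // Custom input bar, configured elsewhere; shown here only to illustrate the pattern.
    private let inputBar = UIView()

    // Returning the custom view here keeps it docked above the keyboard,
    // and it follows the keyboard when it is dismissed interactively.
    override var inputAccessoryView: UIView? { inputBar }
    override var canBecomeFirstResponder: Bool { true }
}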
EDIT:
I already saw that I can add a UIKit text field and attach an inputAccessoryView to that text field. However, that's not what I want. I want a global input accessory view for my SwiftUI view, with my custom input view added as a subview, so that it is always visible rather than tied to a single TextField.
I see two possible ways to get the behavior you want:
1. In some cases, SwiftUI views move out of the way of the keyboard automatically.
2. In iOS 15 and later, you can create an input accessory view in SwiftUI.
1: In SwiftUI, there are several safe areas that views lay themselves out inside of by default. One of these is the keyboard safe area. This area takes up the full screen of the device when the keyboard is hidden, but shrinks to the non-keyboard part of the screen when the keyboard is displayed. So in the example code below, the text field should move above the keyboard when it appears and drop back down when the keyboard disappears (this does not work on an iPad when the keyboard is in the smaller floating mode).
VStack {
    ScrollView {
        ForEach(0 ..< 50) { item in
            Text("Demo Text")
                .frame(maxWidth: .infinity)
        }
    }
    TextField("Enter Text", text: $messageText)
}
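As a side note, if you ever want the opposite behavior, a view that does not move with the keyboard, you can opt it out of the keyboard safe area. A minimal sketch (iOS 14+, with placeholder names):
import SwiftUI

struct FixedBottomFieldView: View {
    @State private var messageText = ""

    var body: some View {
        VStack {
            Spacer()
            TextField("Enter Text", text: $messageText)
        }
        // Without this modifier the TextField slides up with the keyboard;
        // with it, the view ignores the keyboard safe area and stays put.
        .ignoresSafeArea(.keyboard, edges: .bottom)
    }
}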
2: In iOS 15+, you can create a toolbar in the keyboard location. This essentially acts like an inputAccessoryView does in UIKit. The difference from method 1 is that a view placed here only appears when the keyboard is displayed. The one exception is when a hardware (wired or wireless) keyboard is attached to the iPhone or iPad; in that case the toolbar view is still displayed, just at the bottom of the screen.
.toolbar {
    ToolbarItemGroup(placement: .keyboard) {
        Text("Appears at top of keyboard")
    }
}
Putting 1 and 2 together, here is an example that implements both. You can run it in Xcode to see how the two methods behave.
import SwiftUI

// Demo view wrapping the snippets above so the example can be run directly.
struct KeyboardDemoView: View {
    @State private var messageText = ""

    var body: some View {
        VStack {
            ScrollView {
                ForEach(0 ..< 50) { item in
                    Text("Demo Text")
                        .frame(maxWidth: .infinity)
                }
            }
            // Method 1: sits in the keyboard safe area, so it moves with the keyboard.
            TextField("Enter Text", text: $messageText)
        }
        .toolbar {
            // Method 2: only visible while the keyboard is up (iOS 15+).
            ToolbarItemGroup(placement: .keyboard) {
                Text("Appears at top of keyboard")
            }
        }
    }
}
Related
I want to give my TextField some extra space while SwiftUI is automatically scrolling to it, as described in this question: How to add more padding below a TextField. With the answer from this forum thread it works fine: https://developer.apple.com/forums/thread/699111. But because I am manipulating the safe area of the view, my button touches are no longer accepted when they fall outside the safe area.
ScrollView {
    VStack {
        Spacer(minLength: 1000)
        TextField("Textfield 1", text: $text)
            .padding(30)
        Button {
            print("Button tapped")
        } label: {
            Text("Click this button")
        }
        Spacer(minLength: 1000)
    }
}
.keyboardAvoiding() // custom modifier from the linked forum thread, not built into SwiftUI
This example code shows the problem. When I select the TextField and the keyboard appears, there is enough space below it, but when I tap the button nothing happens. Any idea how to work around this behaviour, or any other way to add more padding below the TextField in SwiftUI?
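For context, `.keyboardAvoiding()` above is the custom modifier from the linked forum thread, not part of SwiftUI. A rough, hypothetical sketch of what such a modifier might look like, adding extra bottom inset while the keyboard is visible, is shown below; the names and implementation are assumptions, not the thread's actual code.
import SwiftUI
import UIKit
import Combine

// Hypothetical keyboard-avoiding modifier: adds extra space at the bottom while the
// keyboard is visible, which is the kind of safe-area manipulation described above.
struct KeyboardAvoiding: ViewModifier {
    @State private var keyboardVisible = false
    private let extraPadding: CGFloat = 30 // extra space below the focused text field

    func body(content: Content) -> some View {
        content
            .safeAreaInset(edge: .bottom, spacing: 0) {
                Color.clear.frame(height: keyboardVisible ? extraPadding : 0)
            }
            .onReceive(NotificationCenter.default.publisher(for: UIResponder.keyboardWillShowNotification)) { _ in
                keyboardVisible = true
            }
            .onReceive(NotificationCenter.default.publisher(for: UIResponder.keyboardWillHideNotification)) { _ in
                keyboardVisible = false
            }
    }
}

extension View {
    func keyboardAvoiding() -> some View {
        modifier(KeyboardAvoiding())
    }
}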
I'm working on a SwiftUI view with a navigation bar. The view is very simple: a full-page TextEditor:
struct NotesEditingScreen: View {
    @State var text: String

    var body: some View {
        TextEditor(text: $text)
            .padding(.horizontal)
            .navigationBarTitle("Editing")
    }
}
The issue I'm seeing is that when landing on this screen (via a NavigationLink), the top of the TextEditor is covered by the navigation bar:
My desired behavior is that the TextEditor content appears beneath the Navigation Bar, like it appears after you manually scroll to the top to reveal the text.
Is there a solution/workaround to this issue? I was hoping for either some offset, a setting on NavigationBar, or some programmatic scroll behavior that could be done onAppear. Any suggestions welcome.
I think this is unexpected behavior.
You can try this:
GeometryReader { geo in
    ScrollView {
        TextEditor(text: $text)
            .frame(height: geo.size.height)
    }
}
I have a SwiftUI toolbar with 4 buttons; however, the code I implemented is not correct, because the buttons end up in odd places when changing the device type in the simulator.
Even worse, when viewed on an iPhone 8 / 8 Plus, 2 of the buttons sit at the far edges of the window.
How do I properly apply spacing/padding to ToolBar buttons so they are consistent across different iOS devices?
Thank you!
// This code spaces the buttons but they change positions depending on the iOS device
ToolbarItem {
    HStack {
        HStack {
            ProfileUploadMediaButton()
        }.padding([.trailing], 85)
        HStack {
            ProfileUsernameButton()
        }.padding([.trailing], 84)
        HStack {
            ProfileLiveButton()
        }.padding([.trailing], 6)
        HStack {
            AccountButton()
        }.padding([.trailing], 12)
    }
}
}) // closes the enclosing .toolbar(content: { ... })
// I was thinking of code like this, but all buttons end up bunched together on the right side of the screen...
ToolbarItem {
    HStack {
        ProfileUploadMediaButton()
        ProfileUsernameButton()
        ProfileLiveButton()
        AccountButton()
    }
}
When you add ToolbarItems, there is an initializer that lets you explicitly set the placement of each item. For your case, you would add three ToolbarItems: one each for the leading, center (principal), and trailing positions. I'd also mention that the toolbar is meant to be adaptive, so it may intentionally look different on different devices.
struct ToolbarView: View {
    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
            }
            .navigationTitle("Test")
            .toolbar(content: {
                ToolbarItem(placement: .navigationBarLeading) {
                    Image(systemName: "camera.fill")
                }
                ToolbarItem(placement: .principal) {
                    Text("Username")
                }
                ToolbarItem(placement: .navigationBarTrailing) {
                    HStack {
                        Image(systemName: "dot.radiowaves.left.and.right")
                        Image(systemName: "heart.fill")
                    }
                }
            })
        }
    }
}
Per the documentation, here are the placement options. I'm guessing that when you don't explicitly set a placement, items default to .automatic. (A short sketch illustrating a few of these placements follows the list.)
automatic:
The item is placed automatically, depending on many factors including the platform, size class, or presence of other items.
bottomBar:
The item is placed in the bottom toolbar. Applies to iOS, iPadOS, and Mac Catalyst.
cancellationAction:
The item represents a cancellation action for a modal interface.
confirmationAction:
The item represents a confirmation action for a modal interface.
destructiveAction:
The item represents a destructive action for a modal interface.
navigation:
The item represents a navigation action.
navigationBarLeading:
The item is placed in the leading edge of the navigation bar. Applies to iOS, iPadOS, tvOS, and Mac Catalyst.
navigationBarTrailing:
The item is placed in the trailing edge of the navigation bar. Applies to iOS, iPadOS, tvOS, and Mac Catalyst.
primaryAction:
The item represents a primary action.
principal:
The item is placed in the principal item section.
status:
The item represents a change in status for the current context.
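As a quick illustration of a few of the placements listed above, here is a minimal sketch (the view and button labels are made up; .cancellationAction and .confirmationAction are meant for views presented modally, and @Environment(\.dismiss) requires iOS 15+):
import SwiftUI

struct PlacementDemoView: View {
    @Environment(\.dismiss) private var dismiss // iOS 15+

    var body: some View {
        NavigationView {
            Text("Sheet content")
                .toolbar {
                    // Leading cancel button when presented modally.
                    ToolbarItem(placement: .cancellationAction) {
                        Button("Cancel") { dismiss() }
                    }
                    // Trailing confirmation button when presented modally.
                    ToolbarItem(placement: .confirmationAction) {
                        Button("Save") { dismiss() }
                    }
                    // An item in the bottom toolbar.
                    ToolbarItem(placement: .bottomBar) {
                        Button("Bottom bar action") { }
                    }
                }
        }
    }
}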
First, I have looked at a similar question, but it does not address my use case.
Present ActionSheet in SwiftUI on iPad
My issue is that I have a NavigationBarItem in my NavigationView that will toggle an ActionSheet when pressed. This behavior works properly when used on an iPhone.
However, when I use this on an iPad, both buttons on my screen will gray out and nothing happens. Clicking the buttons again will make them active (blue), but again, no sheet is presented.
Finally, if I select the button in the middle of the screen (Show Button), then an ActionSheet is properly presented on an iPad.
I have tested with Xcode 11 & iOS 13.5 and Xcode 12 & iOS 14. There is no change in behavior.
import SwiftUI

struct ContentView: View {
    @State private var isButtonSheetPresented = false
    @State private var isNavButtonSheetPresented = false

    var body: some View {
        NavigationView {
            Button(action: {
                // Works on iPad & iPhone
                self.isButtonSheetPresented.toggle()
            }) {
                Text("Show Button")
            }
            .actionSheet(isPresented: $isButtonSheetPresented,
                         content: {
                             ActionSheet(title: Text("ActionSheet"))
                         })
            .navigationBarTitle(Text("Title"),
                                displayMode: .inline)
            .navigationBarItems(trailing:
                Button(action: {
                    // Works on iPhone, fails on iPad
                    self.isNavButtonSheetPresented.toggle()
                }) {
                    Text("Show Nav")
                }
                .actionSheet(isPresented: $isNavButtonSheetPresented,
                             content: {
                                 ActionSheet(title: Text("ActionSheet"))
                             })
            )
        }
        .navigationViewStyle(StackNavigationViewStyle())
    }
}
Finally, this is how it appears on an iPad when clicking on "Show Nav":
This is a simplified setup of the screen where the issue occurs. I will need to retain the navigation settings shown in the real app, but I've included them here for clarity.
*** UPDATED ***
While it is not an option for the real app behind this question, I did try removing the .navigationViewStyle(StackNavigationViewStyle()) setting, which did make an ActionSheet appear, although in the wrong spot, as seen below.
This also results in bizarre placement for the sheet accessed via "Show Button".
Yes, it is a bug, but probably a different one: Apple does not let you change the anchor and direction of the presented ActionSheet. The sheet is shown, but on iPad it always appears to the right of the originating control. To confirm this, it is enough to change the location of the button in the navigation bar.
Here is an example placing it at the .leading position. Tested with Xcode 12 / iOS 14.
.navigationBarItems(leading:
    Button(action: {
        // Works on iPhone, fails on iPad
        self.isNavButtonSheetPresented.toggle()
    }) {
        Text("Show Nav")
    }
    .actionSheet(isPresented: $isNavButtonSheetPresented,
                 content: {
                     ActionSheet(title: Text("ActionSheet"))
                 })
)
Note: the SwiftUI 2.0 .toolbar behaves in the same way, i.e. it has the same bug.
This is an old question, but in case someone is interested in a workaround that works on iOS 14:
I have two navigation bar trailing buttons inside .toolbar() and they should open action sheets. I placed an invisible "bar" at the top of the view to use as an anchor:
var body: some View {
    VStack {
        HStack {
            Spacer()
            Color.clear.frame(width: 1, height: 1, alignment: .center)
                .actionSheet(/*ActionSheet for first button*/)
            Spacer().frame(width: 40)
            Color.clear.frame(width: 1, height: 1, alignment: .center)
                .actionSheet(/*ActionSheet for second button*/)
            Spacer().frame(width: 40)
        }.frame(height: 1)
    }
}
Cons:
There's a tiny bar/extra space at the top, noticeable especially during scrolling (maybe putting the stack in the background, e.g. with a ZStack, could remove it?).
You might need to adjust the Spacers' widths to align each ActionSheet with its respective button.
You can't force the action sheet arrows to always point upwards; I tested this on another simulator and the rightmost ActionSheet had its arrow pointing to the right (the 'illusion' that it came from the button was still there).
Here's how it looks
I am creating a tooltip system.
I want to dismiss the tooltip if the user touches anywhere outside the tooltip.
I would like it so that a touch outside the tooltip both dismisses the tooltip and activates any controls the user tapped on. (So you could have a tooltip open and still click a button outside the tooltip and have it activate on the first tap.)
To do this, I have an invisible view handling the tap gesture and dismissing the tooltip, but I do not know how to make SwiftUI not intercept and cancel the tap gestures. On the web, it's the equivalent of not calling event.stopPropagation() and event.preventDefault(), or calling super in touchesBegan: in UIKit.
Any ideas?
Here is a demo of a possible approach. Tested with Xcode 11.4 / iOS 13.4.
struct ContentView: View {
    var body: some View {
        VStack {
            Button("Button") { print("> button tapped") }
        }
        .frame(width: 200, height: 200)
        .contentShape(Rectangle()) // makes the whole area tappable
        .simultaneousGesture(TapGesture().onEnded({
            print(">>> tooltip area here")
        }))
        .border(Color.red) // just for the demo, to show the area
    }
}
You need to use this modifier:
.allowsHitTesting(false)
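In the tooltip scenario, that modifier goes on the tooltip layer so touches pass straight through to the controls underneath. A minimal sketch with made-up names, assuming the tapped control also dismisses the tooltip:
import SwiftUI

// Minimal sketch: the tooltip no longer intercepts touches,
// so the button underneath activates on the first tap.
struct TooltipDemoView: View {
    @State private var showTooltip = true

    var body: some View {
        ZStack {
            Button("Tap me") {
                print("button tapped") // fires immediately, even while the tooltip is visible
                showTooltip = false    // dismiss the tooltip from the tapped control
            }

            if showTooltip {
                Text("Tooltip")
                    .padding(8)
                    .background(Color.yellow)
                    .offset(y: -50)
                    .allowsHitTesting(false) // let touches pass through the tooltip
            }
        }
    }
}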