Hide & Disable TabBar when in NavigationDestination Subview - Apple Watch - iOS

I'm trying to create a UI layout and navigation functionality similar to the Apple Fitness app on Apple Watch.
This is all watchOS 9, so no need for older API support, phew!
I want to have a NavigationTitle set for each view in a TabBar.
I want each selected NavigationDestination to hide the TabBar controls and disable swiping when opened.
I want the NavigationTitle to remain and update whilst navigating between tabs and child views.
I've created this sample code based on the Building a productivity app for Apple Watch sample code from Apple and other examples of navigation with the new NavigationStack in watchOS 9, iOS 16, etc.
import SwiftUI
@main
struct Test_Watch_App_Watch_AppApp: App {
    @SceneBuilder var body: some Scene {
        WindowGroup {
            TabView {
                NavigationStack {
                    ScrollView {
                        VStack {
                            NavigationLink("Mint", value: Color.mint)
                            NavigationLink("Pink", value: Color.pink)
                            NavigationLink("Teal", value: Color.teal)
                        }
                    }
                    .navigationDestination(for: Color.self) { color in
                        Text("Hello").background(color)
                    }
                    .navigationTitle("Colors")
                }
                NavigationStack {
                    ScrollView {
                        VStack {
                            NavigationLink("headline", value: Font.headline)
                            NavigationLink("title", value: Font.title)
                            NavigationLink("caption", value: Font.caption)
                        }
                    }
                    .navigationDestination(for: Font.self) { font in
                        Text("Hello").font(font)
                    }
                    .navigationTitle("Fonts")
                }
            }.tabViewStyle(.page)
        }
    }
}
The problem here is that when selecting any of the NavigationLinks, the child view still displays the TabBar page indicators and allows you to swipe between the tabs, which I don't want. The functionality I'd like exists in the Apple Fitness app on the Watch: the app launches in the Activity tab; you can swipe across to Sharing, select a person, and in that view it's not then possible to swipe straight back to Activity.
I've tried embedding the whole TabView in a NavigationStack and removing the NavigationStack per tab. This works as far as the child views hiding the TabBar page indicator controls and disabling swiping. However, it then breaks the NavigationTitles on launch and when moving in and out of child views, so I don't think it should be done this way.
Any help is very much appreciated.
ADDITIONAL DETAILS:
I've added some screenshots from the Apple Fitness Watch app on watchOS 9 that better illustrate what I'm trying to recreate. Note the tab bar in the first and second photos, and the lack of a tab bar in the third.

In the end I fixed this and got the behaviour I wanted without the need for a custom solution.
The solution was to wrap the whole thing in one NavigationStack rather than using individual ones per tab.
I had already tried this but hit the bug mentioned above with the navigationTitle not displaying properly. That was fixed by setting the navigationTitle on the TabView to the same title as the first tab item view's title. This is likely a bug, which I'll report.
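Roughly, the structure that ended up working looks like this. This is a sketch reconstructed from the description above and the original sample, not verified code:
@main
struct Test_Watch_App_Watch_AppApp: App {
    @SceneBuilder var body: some Scene {
        WindowGroup {
            NavigationStack {
                TabView {
                    ScrollView {
                        VStack {
                            NavigationLink("Mint", value: Color.mint)
                            NavigationLink("Pink", value: Color.pink)
                            NavigationLink("Teal", value: Color.teal)
                        }
                    }
                    .navigationTitle("Colors")
                    ScrollView {
                        VStack {
                            NavigationLink("headline", value: Font.headline)
                            NavigationLink("title", value: Font.title)
                            NavigationLink("caption", value: Font.caption)
                        }
                    }
                    .navigationTitle("Fonts")
                }
                .tabViewStyle(.page)
                .navigationDestination(for: Color.self) { color in
                    // Pushed destinations now cover the whole TabView, so the page
                    // indicators are hidden and swiping between tabs is disabled.
                    Text("Hello").background(color)
                }
                .navigationDestination(for: Font.self) { font in
                    Text("Hello").font(font)
                }
                // Workaround: match the first tab's title so it shows correctly on launch.
                .navigationTitle("Colors")
            }
        }
    }
}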
Thanks

Related

Flutter nested widget is not able to tap for Perfecto automation on iOS 16

In a Flutter application on iOS 16 devices, when tapping the screen the whole container is identified as a single element; the automation cannot identify the inner child element as hittable, visible, and enabled.
I tried the layout below, but the inner child image element is still not detected.
Container(
  child: Column(
    children: [
      Container(
        width: 100,
        height: 100,
        child: Image.asset(
          'assets/images/lake.jpg',
          fit: BoxFit.cover,
        ),
      ),
      Container(
        width: 100,
        height: 100,
        child: Image.asset(
          'assets/images/lake.jpg',
          fit: BoxFit.cover,
        ),
      ),
      Container(
        width: 100,
        height: 100,
        child: Image.asset(
          'assets/images/lake.jpg',
          fit: BoxFit.cover,
        ),
      ),
      Container(
        width: 100,
        height: 100,
        child: Image.asset(
          'assets/images/lake.jpg',
          fit: BoxFit.cover,
        ),
      ),
    ],
  ),
),

iOS app recompiled for macOS can't share on Catalina

I'm new to Swift, so I'm looking for help.
I'm working on an iOS app that shows a list of data that you can share through the system share sheet. I recompiled the app to work on macOS too. It works perfectly on iOS, and fine on my Ventura Mac as well... but when I try to make it work on Catalina, it just doesn't show the menu, and the button even gets grayed out, as if it were pushed and never released. The code is this:
    @IBAction func shareRecord(_ sender: Any) {
        if resultado == "" {
            print("No Item Selected")
        } else {
            print("Reenviando \(resultado)")
            let text = resultado
            let textToShare = [ text ]
            let activityViewController = UIActivityViewController(activityItems: textToShare, applicationActivities: nil)
            activityViewController.popoverPresentationController?.sourceView = self.view // so that iPads won't crash
            self.present(activityViewController, animated: true, completion: nil)
        }
    }
So, any ideas?
I've tried several times, but nothing shows; I expected a menu with options to appear.
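One thing worth checking (an assumption, not a confirmed fix): on Catalyst the share sheet is presented as a popover, so it may need an explicit anchor rectangle in addition to a source view. A sketch of what that could look like, using an arbitrary rect in the middle of the view:
    @IBAction func shareRecord(_ sender: Any) {
        guard resultado != "" else {
            print("No Item Selected")
            return
        }
        print("Reenviando \(resultado)")
        let activityViewController = UIActivityViewController(activityItems: [resultado], applicationActivities: nil)
        // Anchor the popover explicitly; the exact rect here is only illustrative.
        if let popover = activityViewController.popoverPresentationController {
            popover.sourceView = self.view
            popover.sourceRect = CGRect(x: self.view.bounds.midX, y: self.view.bounds.midY, width: 1, height: 1)
        }
        self.present(activityViewController, animated: true, completion: nil)
    }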

IB Outlet issues - Thread 1: EXC_BAD_ACCESS

I am making a phone number recognition app based off of Apple's example code. I am fairly new to Swift and coding in general. When I run the project I get the "Thread 1: EXC_BAD_ACCESS (code=257, address=0x7e700019ec0ad79)" error on line 68, "previewView.session = captureSession". I think it has something to do with line 26, "@IBOutlet weak var previewView: PreviewView!"?
I have a view controller with 2 views and one label. The IB Outlets seem fine for two of them, but line 26 doesn't look the same: the text PreviewView! is gray instead of blue like the others. Could this be the problem?
import UIKit
import AVFoundation
import Vision
class TextScanViewController: UIViewController {
    // MARK: - UI objects
    
    @IBOutlet weak var previewView: PreviewView!
    @IBOutlet weak var cutoutView: UIView!
    @IBOutlet weak var numberView: UILabel!
    var maskLayer = CAShapeLayer()
    // Device orientation. Updated whenever the orientation changes to a
    // different supported orientation.
    var currentOrientation = UIDeviceOrientation.portrait
    
    // MARK: - Capture related objects
    private let captureSession = AVCaptureSession()
    let captureSessionQueue = DispatchQueue(label: "com.example.apple-samplecode.CaptureSessionQueue")
    
    var captureDevice: AVCaptureDevice?
    
    var videoDataOutput = AVCaptureVideoDataOutput()
    let videoDataOutputQueue = DispatchQueue(label: "com.example.apple-samplecode.VideoDataOutputQueue")
    
    // MARK: - Region of interest (ROI) and text orientation
    // Region of video data output buffer that recognition should be run on.
    // Gets recalculated once the bounds of the preview layer are known.
    var regionOfInterest = CGRect(x: 0, y: 0, width: 1, height: 1)
    // Orientation of text to search for in the region of interest.
    var textOrientation = CGImagePropertyOrientation.up
    
    // MARK: - Coordinate transforms
    var bufferAspectRatio: Double!
    // Transform from UI orientation to buffer orientation.
    var uiRotationTransform = CGAffineTransform.identity
    // Transform bottom-left coordinates to top-left.
    var bottomToTopTransform = CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -1)
    // Transform coordinates in ROI to global coordinates (still normalized).
    var roiToGlobalTransform = CGAffineTransform.identity
    
    // Vision -> AVF coordinate transform.
    var visionToAVFTransform = CGAffineTransform.identity
    
    // MARK: - View controller methods
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Set up preview view.
        previewView.session = captureSession
        
        // Set up cutout view.
        cutoutView.backgroundColor = UIColor.gray.withAlphaComponent(0.5)
        maskLayer.backgroundColor = UIColor.clear.cgColor
        maskLayer.fillRule = .evenOdd
        cutoutView.layer.mask = maskLayer
        
        // Starting the capture session is a blocking call. Perform setup using
        // a dedicated serial dispatch queue to prevent blocking the main thread.
        captureSessionQueue.async {
            self.setupCamera()
            
            // Calculate region of interest now that the camera is setup.
            DispatchQueue.main.async {
                // Figure out initial ROI.
                self.calculateRegionOfInterest()
            }
        }
    }
    
    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        // Only change the current orientation if the new one is landscape or
        // portrait. You can't really do anything about flat or unknown.
        let deviceOrientation = UIDevice.current.orientation
        if deviceOrientation.isPortrait || deviceOrientation.isLandscape {
            currentOrientation = deviceOrientation
        }
        
        // Handle device orientation in the preview layer.
        if let videoPreviewLayerConnection = previewView.videoPreviewLayer.connection {
            if let newVideoOrientation = AVCaptureVideoOrientation(deviceOrientation: deviceOrientation) {
                videoPreviewLayerConnection.videoOrientation = newVideoOrientation
            }
        }
        
        // Orientation changed: figure out new region of interest (ROI).
        calculateRegionOfInterest()
    }
    
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        updateCutout()
    }
    
    // MARK: - Setup
    
    func calculateRegionOfInterest() {
        // In landscape orientation the desired ROI is specified as the ratio of
        // buffer width to height. When the UI is rotated to portrait, keep the
        // vertical size the same (in buffer pixels). Also try to keep the
        // horizontal size the same up to a maximum ratio.
        let desiredHeightRatio = 0.15
        let desiredWidthRatio = 0.6
        let maxPortraitWidth = 0.8
        
        // Figure out size of ROI.
        let size: CGSize
        if currentOrientation.isPortrait || currentOrientation == .unknown {
            size = CGSize(width: min(desiredWidthRatio * bufferAspectRatio, maxPortraitWidth), height: desiredHeightRatio / bufferAspectRatio)
        } else {
            size = CGSize(width: desiredWidthRatio, height: desiredHeightRatio)
        }
        // Make it centered.
        regionOfInterest.origin = CGPoint(x: (1 - size.width) / 2, y: (1 - size.height) / 2)
        regionOfInterest.size = size
        
        // ROI changed, update transform.
        setupOrientationAndTransform()
        
        // Update the cutout to match the new ROI.
        DispatchQueue.main.async {
            // Wait for the next run cycle before updating the cutout. This
            // ensures that the preview layer already has its new orientation.
            self.updateCutout()
        }
    }
    
    func updateCutout() {
        // Figure out where the cutout ends up in layer coordinates.
        let roiRectTransform = bottomToTopTransform.concatenating(uiRotationTransform)
        let cutout = previewView.videoPreviewLayer.layerRectConverted(fromMetadataOutputRect: regionOfInterest.applying(roiRectTransform))
        
        // Create the mask.
        let path = UIBezierPath(rect: cutoutView.frame)
        path.append(UIBezierPath(rect: cutout))
        maskLayer.path = path.cgPath
        
        // Move the number view down to under cutout.
        var numFrame = cutout
        numFrame.origin.y += numFrame.size.height
        numberView.frame = numFrame
    }
    
    func setupOrientationAndTransform() {
        // Recalculate the affine transform between Vision coordinates and AVF coordinates.
        
        // Compensate for region of interest.
        let roi = regionOfInterest
        roiToGlobalTransform = CGAffineTransform(translationX: roi.origin.x, y: roi.origin.y).scaledBy(x: roi.width, y: roi.height)
        
        // Compensate for orientation (buffers always come in the same orientation).
        switch currentOrientation {
        case .landscapeLeft:
            textOrientation = CGImagePropertyOrientation.up
            uiRotationTransform = CGAffineTransform.identity
        case .landscapeRight:
            textOrientation = CGImagePropertyOrientation.down
            uiRotationTransform = CGAffineTransform(translationX: 1, y: 1).rotated(by: CGFloat.pi)
        case .portraitUpsideDown:
            textOrientation = CGImagePropertyOrientation.left
            uiRotationTransform = CGAffineTransform(translationX: 1, y: 0).rotated(by: CGFloat.pi / 2)
        default: // We default everything else to .portraitUp
            textOrientation = CGImagePropertyOrientation.right
            uiRotationTransform = CGAffineTransform(translationX: 0, y: 1).rotated(by: -CGFloat.pi / 2)
        }
        
        // Full Vision ROI to AVF transform.
        visionToAVFTransform = roiToGlobalTransform.concatenating(bottomToTopTransform).concatenating(uiRotationTransform)
    }
    
    func setupCamera() {
        guard let captureDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back) else {
            print("Could not create capture device.")
            return
        }
        self.captureDevice = captureDevice
        
        // NOTE:
        // Requesting 4k buffers allows recognition of smaller text but will
        // consume more power. Use the smallest buffer size necessary to keep
        // down battery usage.
        if captureDevice.supportsSessionPreset(.hd4K3840x2160) {
            captureSession.sessionPreset = AVCaptureSession.Preset.hd4K3840x2160
            bufferAspectRatio = 3840.0 / 2160.0
        } else {
            captureSession.sessionPreset = AVCaptureSession.Preset.hd1920x1080
            bufferAspectRatio = 1920.0 / 1080.0
        }
        
        guard let deviceInput = try? AVCaptureDeviceInput(device: captureDevice) else {
            print("Could not create device input.")
            return
        }
        if captureSession.canAddInput(deviceInput) {
            captureSession.addInput(deviceInput)
        }
        
        // Configure video data output.
        videoDataOutput.alwaysDiscardsLateVideoFrames = true
        videoDataOutput.setSampleBufferDelegate(self, queue: videoDataOutputQueue)
        videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
        if captureSession.canAddOutput(videoDataOutput) {
            captureSession.addOutput(videoDataOutput)
            // NOTE:
            // There is a trade-off to be made here. Enabling stabilization will
            // give temporally more stable results and should help the recognizer
            // converge. But if it's enabled the VideoDataOutput buffers don't
            // match what's displayed on screen, which makes drawing bounding
            // boxes very hard. Disable it in this app to allow drawing detected
            // bounding boxes on screen.
            videoDataOutput.connection(with: AVMediaType.video)?.preferredVideoStabilizationMode = .off
        } else {
            print("Could not add VDO output")
            return
        }
        
        // Set zoom and autofocus to help focus on very small text.
        do {
            try captureDevice.lockForConfiguration()
            captureDevice.videoZoomFactor = 2
            captureDevice.autoFocusRangeRestriction = .near
            captureDevice.unlockForConfiguration()
        } catch {
            print("Could not set zoom level due to error: \(error)")
            return
        }
        
        captureSession.startRunning()
    }
    
    // MARK: - UI drawing and interaction
    
    func showString(string: String) {
        // Found a definite number.
        // Stop the camera synchronously to ensure that no further buffers are
        // received. Then update the number view asynchronously.
        captureSessionQueue.sync {
            self.captureSession.stopRunning()
            DispatchQueue.main.async {
                self.numberView.text = string
                self.numberView.isHidden = false
            }
        }
    }
    
    @IBAction func handleTap(_ sender: UITapGestureRecognizer) {
        captureSessionQueue.async {
            if !self.captureSession.isRunning {
                self.captureSession.startRunning()
            }
            DispatchQueue.main.async {
                self.numberView.isHidden = true
            }
        }
    }
}
// MARK: - AVCaptureVideoDataOutputSampleBufferDelegate
extension TextScanViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // This is implemented in VisionViewController.
    }
}
// MARK: - Utility extensions
extension AVCaptureVideoOrientation {
    init?(deviceOrientation: UIDeviceOrientation) {
        switch deviceOrientation {
        case .portrait: self = .portrait
        case .portraitUpsideDown: self = .portraitUpsideDown
        case .landscapeLeft: self = .landscapeRight
        case .landscapeRight: self = .landscapeLeft
        default: return nil
        }
    }
}
If you try to reference an @IBOutlet object that has not been properly connected, you will get this error:
Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value
Since that is not the error you are getting, it must be something else.
Almost certainly, you added a UIView and connected it to:
@IBOutlet weak var previewView: PreviewView!
but forgot to set the Custom Class.
When you select that UIView in the Storyboard, the Identity Inspector's Custom Class field (shown in the original screenshots) should be set to PreviewView rather than left blank.
(Ignore the "Module: qt122022" -- it will auto-fill with the name of your project. Mine just happens to be named qt122022.)
Check the Connections Inspector in your storyboard or xib and verify that previewView has exactly one reference; it may be that you have more than one, or no reference at all, to your IBOutlet.

Display AdMob again after closing

So I have followed this guide to implement AdMob ads in my app: https://developers.google.com/admob/ios/interstitial?hl=en-US
This is my viewDidLoad and the Delegates:
override func viewDidLoad() {
    super.viewDidLoad()
    let request = GADRequest()
    GADInterstitialAd.load(withAdUnitID:"ca-app-pub-3940256099942544/4411468910",
                                request: request,
                      completionHandler: { [self] ad, error in
                        if let error = error {
                          print("Failed to load interstitial ad with error: \(error.localizedDescription)")
                          return
                        }
                        interstitial = ad
                        interstitial?.fullScreenContentDelegate = self
                      }
    )
  }
  /// Tells the delegate that the ad failed to present full screen content.
  func ad(_ ad: GADFullScreenPresentingAd, didFailToPresentFullScreenContentWithError error: Error) {
    print("Ad did fail to present full screen content.")
  }
  /// Tells the delegate that the ad will present full screen content.
  func adWillPresentFullScreenContent(_ ad: GADFullScreenPresentingAd) {
    print("Ad will present full screen content.")
  }
  /// Tells the delegate that the ad dismissed full screen content.
  func adDidDismissFullScreenContent(_ ad: GADFullScreenPresentingAd) {
    print("Ad did dismiss full screen content.")
  }
This is the code I run when I want to display the ad:
if let interstitial = interstitial {
    interstitial.present(fromRootViewController: self)
} else {
    print("Ad wasn't ready")
}
It works the first time, but when I want to display the ad again, it doesn't show up. What am I missing here?
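For context, the interstitial guide linked above treats a GADInterstitialAd as a one-time-use object, so after it has been presented and dismissed a new one has to be loaded. A rough sketch of that pattern, assuming the load code from viewDidLoad is moved into a reusable helper:
func loadInterstitial() {
    GADInterstitialAd.load(withAdUnitID: "ca-app-pub-3940256099942544/4411468910",
                           request: GADRequest()) { [weak self] ad, error in
        if let error = error {
            print("Failed to load interstitial ad with error: \(error.localizedDescription)")
            return
        }
        self?.interstitial = ad
        self?.interstitial?.fullScreenContentDelegate = self
    }
}
/// Tells the delegate that the ad dismissed full screen content.
func adDidDismissFullScreenContent(_ ad: GADFullScreenPresentingAd) {
    print("Ad did dismiss full screen content.")
    // Each interstitial can only be shown once, so request the next one here.
    interstitial = nil
    loadInterstitial()
}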

onTapGesture action on a Text moving with animation

I am trying to implement an action on tap gesture on a Text which is moving with animation using SwiftUI.
When the Text is moving, the tap gesture seems to register only when I tap at the final destination of the Text, and not when I tap on the moving Text.
struct ContentView: View {
    @State var offsetY : CGFloat = 0
    var body: some View {
        VStack {
            Text("Animate")
                .onTapGesture {
                    withAnimation(.linear(duration: 15)){
                        offsetY = 200
                    }
                }
            Text("Hello, world!")
                .offset(y: offsetY)
                .onTapGesture {
                    print("Tap")
                }
        }
    }
}
Is there a way to make tapGesture register when I tap on the moving Text ?
By the way, it works when I encapsulate the Text in a NavigationLink, but I don't want to have a NavigationLink in my case.
struct ContentView: View {
    @State var offsetY : CGFloat = 0
    var body: some View {
        VStack {
            Text("Animate")
                .onTapGesture {
                    withAnimation(.linear(duration: 15)) {
                        offsetY = 100
                    }
                }
            NavigationLink(destination: EmptyView()) {
                Text("Hello, world!")
            }
            .offset(y: offsetY)
            .onTapGesture {
                print("Tap")
            }
        }
    }
}
You can work around that by placing a button at the same position as the text you want to tap while it is animating:
struct ContentView: View {
    @State var offsetY : CGFloat = 0
    @State var counter = 0 // just for counting taps
    var body: some View {
        VStack {
            Text("Animate")
                .onTapGesture {
                    withAnimation(.linear(duration: 15)) {
                        offsetY = 200
                    }
                }
            // The text that animates.
            Text("Hello, world!").offset(y: offsetY)
            // A button positioned to track the animating offset of the text.
            Button {
                counter += 1
                print("tap \(counter)")
            } label: {
                // Empty label so that the button stays hidden.
                Text("").position(y: offsetY)
            }
        }.frame(width: 500, height: 500, alignment: .center)
    }
}
Try replacing your
.onTapGesture {
    print("Tap")
}
with
.simultaneousGesture(
    TapGesture()
        .onEnded { _ in
            print("Tap")
        }
)
Works well for me on macOS 12.5, using Xcode 13.3, targeting iOS 15 / macCatalyst 12.3; tested on real devices only.
By default, the animatable value is applied instantly and we only observe the drawing animating toward that final value. In this case the frame jumps straight to the end, and the tap only works there. To change this we can use a custom animatable modifier, which processes the view progressively, so the frame is always updated during the animation and remains tappable.
Tested with Xcode 13.4 / iOS 15.5
Main part:
Text("Hello, world!")
.onTapGesture {
print("Tap")
}
.modifier(MovableViewModifier(value: offsetY)) // << here !!
Complete test module is here
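The complete modifier isn't reproduced in this excerpt; a minimal sketch of what it could look like, inferred from the usage above (applying the value as a y-offset is an assumption):
struct MovableViewModifier: AnimatableModifier {
    var value: CGFloat

    // Exposing the value as animatableData makes SwiftUI re-evaluate body for every
    // intermediate value, so the view's frame (and hit-testing) tracks the animation.
    var animatableData: CGFloat {
        get { value }
        set { value = newValue }
    }

    func body(content: Content) -> some View {
        content.offset(y: value)
    }
}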
