I am working on an app where data (custom structs) can be transferred between instances of the app running on different phones. The app should be able to open the share sheet and send the data to another device, where it will automatically open the app so the data can be imported. (Support for the share sheet is important because the app needs to work when there is no internet access, and AirDrop seems to be the only way to transfer data between phones without internet.)
So far I have made the struct I would like to transfer conform to the Transferable protocol. I have also defined a custom uniform type identifier in code and in the Info.plist. With this I am able to export the struct using the share sheet, and it sends a JSON file ending in .stageresult. However, when other devices receive the file they do not open it automatically, nor do they offer any way to do so manually. I have also been unable to find anything online about how to handle importing custom files. Is there a way I can call a function with the imported data to load it into my app? What is the proper way to handle importing custom uniform type identifiers using Swift/SwiftUI?
import UniformTypeIdentifiers
import SwiftUI
import Foundation

extension UTType {
    static var stageresult: UTType { UTType(exportedAs: "com.example.stageresult") }
}

struct StageResult: Codable {
    var name: String
    var start: Bool
    var recordings: [Recording]
}

struct Recording: Codable {
    var plate: String
    var timestamp: Double
}

extension StageResult: Transferable {
    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .stageresult)
    }
}
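On the import side, one possible approach (a sketch, not a confirmed solution) is to also declare the custom type under CFBundleDocumentTypes and set LSSupportsOpeningDocumentsInPlace in the Info.plist, so incoming .stageresult files are routed to the app, and then catch the delivered file URL with .onOpenURL and decode the JSON that CodableRepresentation produced. ContentView and the print statement below are placeholders, not code from the question:

import SwiftUI
import Foundation

// Sketch only: assumes the Info.plist declares the custom type under CFBundleDocumentTypes
// and sets LSSupportsOpeningDocumentsInPlace so AirDropped .stageresult files open this app.
@main
struct StageResultApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()   // placeholder for the app's real root view
                .onOpenURL { url in
                    // CodableRepresentation exports JSON, so decode it back directly.
                    guard let data = try? Data(contentsOf: url),
                          let result = try? JSONDecoder().decode(StageResult.self, from: data) else {
                        return
                    }
                    // Hand `result` off to whatever model object imports stage results.
                    print("Imported \(result.name) with \(result.recordings.count) recordings")
                }
        }
    }
}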
I have a SwiftUI app (Xcode 14 / iOS 15+) that tries to receive fmp12 files sent from the FileMaker app (using the share sheet). I got to the point where my app is shown in the share sheet and gets started. What I cannot achieve is access to the file.
Following some tutorials and Apple's documentation, I added "com.filemaker.fmp12" as an imported type identifier in my Info.plist and added it to Document Types.
first strategy: DocumentGroup
To access the file I have first tried to use a DocumentGroup in SwiftUI based on Apple's documentation here: https://developer.apple.com/documentation/swiftui/documentgroup
@main
struct KioskBridgeApp: App {
    var body: some Scene {
        DocumentGroup(viewing: FMP12File.self) { file in
            KioskBridgeView(file: file.fileURL?.absoluteString ?? "undefined")
        }
    }
}
While this works when I send an fmp12 file from the Files app, it does not work when I send it from FileMaker: then it always starts with the document browser.
The document browser also opens when I start the app without sending anything to it, and I could not find a hint of how to suppress that.
So I developed the feeling that I might be on the wrong track entirely with DocumentGroup, and tried strategy number 2:
second strategy: App delegate
Using an application delegate as suggested here https://stackoverflow.com/a/46572886, combined with the adaptor described in "SwiftUI app life cycle iOS14 where to put AppDelegate code?" and here: https://developer.apple.com/documentation/swiftui/uiapplicationdelegateadaptor
class MyAppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication, open url: URL, sourceApplication: String?, annotation: Any) -> Bool {
        print("application opened with \(url.absoluteString)")
        return true
    }
}

@main
struct DelegateTestApp: App {
    @UIApplicationDelegateAdaptor(MyAppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
But this delegate is never called, not even when I open my app from the Files app.
I am fairly new to iOS development, so even after hours of reading I am not sure that either of these strategies is the right approach. Any help would be highly appreciated.
While I have not figured out how to make the first two strategies work, more research finally led to the rather simple solution of .onOpenURL.
This was a useful source to get there: https://betterprogramming.pub/swiftui-3-ways-to-observe-apps-life-cycle-in-swiftui-e9be79e75fef.
So my code to catch the file sent to the app is now:
@main
struct KioskBridgeApp: App {
    @State var openedUrl: URL? = nil

    var body: some Scene {
        WindowGroup {
            KioskBridgeView(openedUrl: $openedUrl)
                .onOpenURL { url in
                    openedUrl = url
                }
        }
    }
}
Quite clear once one knows it. After that I also found a stackoverflow question on the topic I had overlooked so far: Handle incoming custom URL types with SwiftUI while app is closed
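One follow-up step the answer does not show: once .onOpenURL delivers the URL, the file itself still has to be read, and files handed over from another app may be security scoped and may live in a temporary inbox. A rough sketch of that part (the copy-to-Documents step and error handling are my own assumptions, not part of the original answer):

import Foundation

// Sketch: read and persist the file that .onOpenURL delivered.
func importFile(at url: URL) {
    // Files passed in from other apps can be security scoped.
    let accessing = url.startAccessingSecurityScopedResource()
    defer { if accessing { url.stopAccessingSecurityScopedResource() } }

    do {
        // Copy the file somewhere permanent before the system cleans up the inbox copy.
        let destination = try FileManager.default
            .url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
            .appendingPathComponent(url.lastPathComponent)
        if FileManager.default.fileExists(atPath: destination.path) {
            try FileManager.default.removeItem(at: destination)
        }
        try FileManager.default.copyItem(at: url, to: destination)
        print("Copied incoming file to \(destination.path)")
    } catch {
        print("Import failed: \(error)")
    }
}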
TL;DR
On iOS 13 and Xcode 11, how can I configure an Intent to run in the background and just return the result, so it can be used as the input for other actions in the Shortcuts app?
Details of what I'm trying to achieve
My app has a list of songs that I want to expose via Shortcuts (actually, metadata about the songs, not the songs themselves). The idea is to give advanced users a way to integrate this database of music with other apps or actions they want. For example, one may find it useful to get a list of upcoming music for the next month and use it to create Calendar events for each song. Having access to this list in the Shortcuts app can enable them to do this.
I have created an Intent called "List All Unread Music Releases" and defined its response as a list of objects that contains information about each song. The problem is, when I go to the Shortcuts app, create a new shortcut using this Intent, and run it, it opens my app instead of running in the background.
Steps I've done to create and configure Intents
Here's a high level definition of what I did to configure Intents in the project. The next section will have the actual source code and screenshots.
Created a new SiriKit Intent Definition File.
Created a new Intent.
Defined its Title, Description, and Parameters, and disabled the "Intent is eligible for Siri Suggestions" checkbox.
Defined the response property as an Array (because it's going to be a list of songs), and configured the Output to be this array property.
Created a new Intents Extension, with the checkbox "Include UI Extension" disabled. The idea here is to process the user request in the background and return a list with the results - no UI required.
In the Intents Extension target, defined the IntentsSupported array inside Info.plist with the name of the intent created in step 2.
Made the IntentHandler class implement the protocol generated for the intent created in step 2.
Code samples and screenshots
My SiriKit Intent Definition File and the GetUnreadReleases Intent:
The GetUnreadReleases Intent response:
The Intents Extension IntentHandler class:
import Intents

class IntentHandler: INExtension, GetUnreadReleasesIntentHandling {

    func handle(intent: GetUnreadReleasesIntent, completion: @escaping (GetUnreadReleasesIntentResponse) -> Void) {
        let response = GetUnreadReleasesIntentResponse(code: .success, userActivity: nil)

        let release1 = IntentRelease(identifier: "1", display: "Teste 1")
        release1.name = "Name test 1"
        release1.artist = "Artist test 1"

        response.releases = [release1]
        completion(response)
    }

    func resolveMediaType(for intent: GetUnreadReleasesIntent, with completion: @escaping (IntentMediaTypeResolutionResult) -> Void) {
        if intent.mediaType == .unknown {
            completion(.needsValue())
        } else {
            completion(.success(with: intent.mediaType))
        }
    }

    override func handler(for intent: INIntent) -> Any {
        // This is the default implementation. If you want different objects to handle different intents,
        // you can override this and return the handler you want for that particular intent.
        return self
    }
}
The Intents Extension Info.plist file:
Conclusion
So, I would like this intent to run in the background, assemble the list of songs based on the user defined parameters, and return this list to be used as an input to other actions in the Shortcuts app.
It looks like previous versions of the Intents editor (Xcode < 11 / iOS < 13.0) had a checkbox "Supports background execution" that did just that, but I can't find it anymore on Xcode 11.
Thanks to edford from Apple Developer Forums, I was able to make it work. In the intents definition file, the "Intent is eligible for Siri Suggestions" checkbox must be checked for the background execution to work.
I'm creating a game using SpriteKit, but I'm not using .sks files for levels. I don't want to go into details about the idea of the game before I release it, but basically each level is auto-generated based on a few numbers. So essentially what defines a level is those numbers, and I would like to know where I could store them. If I used .sks files I would just have a file per level, but in this case should I have them stored in an array of levels? Should the array be in the level selection view controller? Should it be in a singleton class?
Basically what would be a good way to go about storing these values?
So the levels are auto-generated at runtime?
You could use an array of levels, or a file per level. I would just write them to one or more files in your app's documents directory. (I'd probably use one file per level, just to keep it simple and make it so you can easily add more levels without rewriting the whole game layout file each time.)
If you build your level structures out of scalar types, arrays, and dictionaries (property list objects), then you can write the "object graph" to a property list using the NSArray or NSDictionary method write(to:).
Alternately you could make your level object conform to the Codable protocol, convert it to JSON, and save the JSON data to a file. The Codable protocol is easy to use, it's well documented by Apple, and there are tons of tutorials online.
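For example, here is a minimal sketch of that approach; the Level struct, its fields, and the file name are made up for illustration:

import Foundation

// Hypothetical level definition: the handful of numbers that drive generation.
struct Level: Codable {
    let seed: Int
    let difficulty: Double
    let enemyCount: Int
}

// Save all levels as one JSON file in the app's documents directory.
func saveLevels(_ levels: [Level]) throws {
    let url = try FileManager.default
        .url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
        .appendingPathComponent("levels.json")
    try JSONEncoder().encode(levels).write(to: url)
}

// Load them back; returns an empty array if the file doesn't exist yet.
func loadLevels() -> [Level] {
    guard let url = try? FileManager.default
            .url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
            .appendingPathComponent("levels.json"),
          let data = try? Data(contentsOf: url) else { return [] }
    return (try? JSONDecoder().decode([Level].self, from: data)) ?? []
}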
EDIT
Note that you could also write your data to a property list using the Codable protocol. Just like the JSONEncoder and JSONDecoder classes, there are PropertyListEncoder and PropertyListDecoder classes that will convert your object graph back and forth to property list format. (Binary property lists are more compact and faster to read and write than JSON.)
Below is a sample playground that defines a custom struct FooStruct, makes it Codable, and then uses a PropertyListEncoder to write the data to the playground's shared data directory (which you will have to set up if you want to test this code).
import UIKit
import PlaygroundSupport
struct FooStruct: Codable {
    let aString: String
    let anotherString: String
    let anInt: Int
}

let fooArray: [FooStruct] = [
    FooStruct(aString: "Foo 1", anotherString: "String 1", anInt: 4),
    FooStruct(aString: "Foo 2", anotherString: "String 2", anInt: 7)
]

let encoder = PropertyListEncoder()
encoder.outputFormat = .binary

do {
    print(fooArray)
    let data = try encoder.encode(fooArray)
    let plistURL = playgroundSharedDataDirectory.appendingPathComponent("property list.plist")
    try data.write(to: plistURL)
    print("Data written to \(plistURL.path)")
} catch {
    print(error)
}
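Reading the data back is the mirror image with a PropertyListDecoder; a short sketch continuing the playground above:

// Decode the array back from the property list written above.
let decoder = PropertyListDecoder()
do {
    let plistURL = playgroundSharedDataDirectory.appendingPathComponent("property list.plist")
    let data = try Data(contentsOf: plistURL)
    let decodedArray = try decoder.decode([FooStruct].self, from: data)
    print("Read \(decodedArray.count) items back: \(decodedArray)")
} catch {
    print(error)
}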
I do apologize if I'm not posting correctly, since I'm a little new to posting here. I'm currently attempting to add a Siri shortcut to my application. I've created the intent, and I'm able to handle it properly and create a response with dummy data.
I am, however, unable to access my service classes and other objects from the application, despite adding my app to the intent handler class's target.
import Intents

class IntentHandler: INExtension, TestIntentHandling {

    @available(iOS 12.0, *)
    func confirm(intent: TestIntent, completion: @escaping (TestIntentResponse) -> Void) {
        print("HERE")
        completion(TestIntentResponse(code: .ready, userActivity: nil))
    }

    @available(iOS 12.0, *)
    func handle(intent: TestIntent, completion: @escaping (TestIntentResponse) -> Void) {
        let response = TestIntentResponse(code: .success, userActivity: nil)
        // Trying to reach into a service here to get real values
        response.workout = "Bench Press"
        response.weight = 150
        completion(response)
    }
}
I would like to reach into my application's services to populate the workout and weight fields in my handle function, but I keep getting an error saying that my service classes do not exist. I was hoping someone could point me in the right direction. Thanks!
According to the documentation:
When a user makes a request of your app using Siri or Maps, SiriKit loads your Intents app extension and creates an instance of its INExtension subclass. The job of your extension object is to provide SiriKit with the handler objects that you use to handle specific intents. You provide these objects from the handler(for:) method of your extension object.
You need to implement the handler(for:) method and return the appropriate handler object (this will be a class you create). Your intent handler class, e.g. TestIntentHandler, will subclass NSObject and conform to your TestIntentHandling protocol. TestIntentHandler is where you would handle your intent.
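A rough sketch of that split, reusing the intent names from the question (the hard-coded values stand in for whatever the shared code would return):

import Intents

// The extension object only routes intents to their handlers.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        if intent is TestIntent {
            return TestIntentHandler()
        }
        return self
    }
}

// Separate handler class conforming to the generated protocol.
class TestIntentHandler: NSObject, TestIntentHandling {
    func handle(intent: TestIntent, completion: @escaping (TestIntentResponse) -> Void) {
        let response = TestIntentResponse(code: .success, userActivity: nil)
        // Real values would come from code shared with the app, as described below.
        response.workout = "Bench Press"
        response.weight = 150
        completion(response)
    }
}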
You need to create an app group and move any classes and methods you need to use in both the app and intent into a Framework shared between both. For things like small bits of data you can use a shared UserDefaults using UserDefaults(suiteName: "your.app.group").
From the docs:
If your app and app extension share services, consider structuring your code in the following way:
• Implement your core services in a private shared framework. A private shared framework lets you place the code for accessing your services in one code module and use that code from multiple targets. Shared frameworks minimize the size of both executables and make testing easier by ensuring that each executable uses the same code path.
• Use a shared container to store common resources. Put relevant images and data files into a shared container so your app and app extension can use them. You enable shared container support in the Capabilities tab of each target.
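For the shared-defaults route mentioned above, a minimal sketch; the app group identifier and keys are placeholders and must match the App Groups capability on both targets:

import Foundation

// Placeholder app group identifier.
let appGroupID = "group.com.example.myapp"

// In the app: write the values the extension will need.
func publishLatestWorkout(name: String, weight: Double) {
    let defaults = UserDefaults(suiteName: appGroupID)
    defaults?.set(name, forKey: "latestWorkoutName")
    defaults?.set(weight, forKey: "latestWorkoutWeight")
}

// In the intents extension: read them back when handling the intent.
func latestWorkout() -> (name: String, weight: Double)? {
    guard let defaults = UserDefaults(suiteName: appGroupID),
          let name = defaults.string(forKey: "latestWorkoutName") else { return nil }
    return (name, defaults.double(forKey: "latestWorkoutWeight"))
}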
I am working on an extension for Safari. I have seen that the host app and the extension can communicate when the extension runs or closes, but in my case I want to communicate with the host app without closing the extension.
var MyExtensionJavaScriptClass = function() {};

MyExtensionJavaScriptClass.prototype = {
    run: function(arguments) {
        arguments.completionFunction({"baseURI": document.documentElement.innerHTML});
    },
    test: function(arguments) {
        alert("Need to run without closing extension");
    },
    finalize: function(arguments) {
        alert("Test Done");
        // arguments contains the value the extension provides in [NSExtensionContext completeRequestReturningItems:expirationHandler:completion:].
        // In this example, the extension provides a color as a returning item.
        document.body.style.backgroundColor = arguments["bgColor"];
    }
};

var ExtensionPreprocessingJS = new MyExtensionJavaScriptClass;
In my JavaScript file above, the run function runs when the extension starts, and the finalize function runs when completeRequestReturningItems is called on the Objective-C side. I want to run my test function without closing the extension.
You don't.
To quote Apple's Extension Guidelines, from the section How an Extension Communicates.
There is no direct communication between a running extension and its containing app; typically, the containing app isn’t even running while its extension is running.
This isn't to say that you cannot, just that Apple doesn't want you to, and the ability to do so is probably either private or non-existent.
Quick terminology level set:
Containing App = "an app that contains one or more extensions is called a containing app"
Host App = "An app that can let users choose an extension to help them perform a task is called a host app."
That being said, Apple does not supply a communication stream from Host App to extension. In your case, you can load data initially with the run() in the JS Preprocessing file and then respond with data on exit of the extension with finalize().
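For completeness, a rough Swift sketch of the native side of that hand-off: whatever the extension returns when it completes its request is what arrives in finalize(). The class name and payload here are assumptions, not code from the question:

import UIKit
import MobileCoreServices

class ActionViewController: UIViewController {
    // Called when the extension is done; the returned items feed finalize() in the JS file.
    func finish(withBackgroundColor color: String) {
        let payload = [NSExtensionJavaScriptFinalizeArgumentKey: ["bgColor": color]] as NSDictionary
        let provider = NSItemProvider(item: payload, typeIdentifier: kUTTypePropertyList as String)
        let item = NSExtensionItem()
        item.attachments = [provider]
        // Completing the request is the only point where data flows back to the page.
        extensionContext?.completeRequest(returningItems: [item], completionHandler: nil)
    }
}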