Playing USDZ skeletal animation in RealityKit from a separate file - augmented-reality

I am trying to play a specific skeletal animation on my 3D object (loaded from a USDZ file).
The animation is also stored in a USDZ file.
I tried the following:
Entity.loadAsync(contentsOf: Bundle.main.url(forResource: "Character", withExtension: "usdz")!)
    .append(Entity.loadAsync(contentsOf: Bundle.main.url(forResource: "Animation", withExtension: "usdz")!))
    .collect()
    .sink(receiveCompletion: {
        if case .failure(let error) = $0 {
            print(error)
        }
    }, receiveValue: { data in
        let character = data[0]
        self.anchorEntity?.addChild(character)
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            let animationEntity = data[1]
            animationEntity.transform.matrix = character.transform.matrix
            if let animation = animationEntity.availableAnimations.first {
                character.playAnimation(animation, startsPaused: false)
            }
        }
    })
    .store(in: &self.cancellables)
I am seeing these in the console:
[Animation] Invalid bind path: Ann_Body_anim_Neutral.RootNode.root.Root_M_bnd.Spine1_M_bnd.Spine2_M_bnd.Spine3_M_bnd.Chest_M_bnd.Scapula_R_bnd.Shoulder_R_bnd.Elbow_R_bnd.Transform.transform
[Animation] Invalid bind path: Ann_Body_anim_Neutral.RootNode.root.Root_M_bnd.Spine1_M_bnd.Spine2_M_bnd.Spine3_M_bnd.Chest_M_bnd.Scapula_R_bnd.Shoulder_R_bnd.ShoulderPart1_R_bnd.Transform.transform
[Animation] Invalid bind path: Ann_Body_anim_Neutral.RootNode.root.Root_M_bnd.Spine1_M_bnd.Spine2_M_bnd.Spine3_M_bnd.Chest_M_bnd.Scapula_R_bnd.Shoulder_R_bnd.ShoulderPart2_R_bnd.Transform.transform
...
It seems that the transform paths differ between the joints/nodes in the animation file and those in the character.
Is there any way to fix this in code? If not, how can I make it work?
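Since the console output includes the full bind paths, one way to localize the mismatch is to diff a failing path from the log against the corresponding joint path in the character's skeleton. A small plain-Swift helper (firstDivergence is a made-up name, not a RealityKit API) can report where two dot-separated paths first differ:

```swift
// Compare two dot-separated bind paths and report the first component where
// they diverge; returns nil if they match or one is a prefix of the other.
func firstDivergence(_ a: String, _ b: String) -> (index: Int, left: String, right: String)? {
    let pa = a.split(separator: ".").map(String.init)
    let pb = b.split(separator: ".").map(String.init)
    for i in 0..<min(pa.count, pb.count) where pa[i] != pb[i] {
        return (i, pa[i], pb[i])
    }
    return nil
}
```

The first differing component is usually the joint that was renamed or re-parented somewhere along the FBX → glTF → USDZ conversion.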
I receive the animation as an FBX file, convert it to glTF with the fbx2gltf tool, and then convert the glTF to USDZ using usdzconvert.
let task = Process()
task.executableURL = Self.fbx2gltfBinURL
task.arguments = [
    "-i", url.path,
    "-o", temporaryURL.path,
    "-b",
    "--blend-shape-normals",
    "--blend-shape-tangents",
    "--fbx-temp-dir", temporaryDirectory.path,
]
try task.run()
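The glTF → USDZ step can reuse the same Process pattern. Below is a sketch of a generic wrapper; the usdzconvert path in the usage comment is an assumption, so point it at wherever the tool lives on your machine:

```swift
import Foundation

// Run an external converter and wait for it to finish, returning its exit code.
@discardableResult
func runTool(_ executable: URL, _ arguments: [String]) throws -> Int32 {
    let task = Process()
    task.executableURL = executable
    task.arguments = arguments
    try task.run()
    task.waitUntilExit()
    return task.terminationStatus
}

// Hypothetical usage, mirroring the fbx2gltf step:
// try runTool(URL(fileURLWithPath: "/usr/local/bin/usdzconvert"),
//             [gltfURL.path, usdzURL.path])
```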

Related

Is it possible to print the content of a PDF URL with StarPRNT-SDK-iOS-Swift? Or is there any other method to print data from a remote API response?

I'm following the demo sample for integration: https://github.com/star-micronics/StarPRNT-SDK-iOS-Swift
I have successfully installed the SDK and am able to print the samples in the class EnglishReceiptsImpl.
Now I'm trying to print the content I'm getting in an API response.
My questions are:
Is there a direct method to convert/pass a PDF URL to print its content?
How can I pass data (Data/NSData) to print via "commands"?
I'm working with the TSP650II printer model.
The pods I have installed are:
// Pods for Swift SDK
pod 'StarIO', '2.8.2'
pod 'StarIO_Extension', '1.15.0'
Here is my code. In the class SearchPortViewController:
func openStarPrinter() {
    var commands: Data
    let localizeReceipts: ILocalizeReceipts = LocalizeReceipts.createLocalizeReceipts(AppDelegate.getSelectedLanguage(), paperSizeIndex: .threeInch)
    // commands = PrinterFunctions.createTextReceiptData(emulation, localizeReceipts: localizeReceipts, utf8: false)
    // Method (trying to print data from pdf url): createPdf417Data
    commands = ApiFunctions.createPdf417Data(emulation)
    GlobalQueueManager.shared.serialQueue.async {
        _ = Communication.sendCommands(commands,
                                       portName: self.portName,
                                       portSettings: self.portSettings,
                                       timeout: 10000, // 10000 ms
                                       completionHandler: { (communicationResult: CommunicationResult) in
            DispatchQueue.main.async {
                self.showSimpleAlert(title: "Communication Result",
                                     message: Communication.getCommunicationResultMessage(communicationResult),
                                     buttonTitle: "OK",
                                     buttonStyle: .cancel)
                // self.navigationController!.popViewController(animated: true)
            }
        })
    }
}
And in the class ApiFunctions:
let globalPdfUrl: URL? = Bundle.main.url(forResource: "fake-store-receipt", withExtension: "pdf")

static func createPdf417Data(_ emulation: StarIoExtEmulation) -> Data {
    let otherData = try! Data(contentsOf: globalPdfUrl!)
    let builder: ISCBBuilder = StarIoExt.createCommandBuilder(emulation)
    builder.beginDocument()
    builder.appendPdf417Data(withAbsolutePosition: otherData, line: 0, column: 1, level: SCBPdf417Level.ECC0, module: 2, aspect: 2, position: 1)
    builder.endDocument()
    return builder.commands.copy() as! Data
}
But the printout is not in the correct format.
Hoping for a quick and detailed solution! Thanks in advance.
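As an aside, the try! and force-unwrap in createPdf417Data will crash the app if the bundled PDF is ever missing. A safer loading sketch (loadPayload is a made-up helper name, plain Foundation) would be:

```swift
import Foundation

// Load payload data from an optional URL without force-unwrapping;
// returns nil when the URL is missing or the file is unreadable.
func loadPayload(from url: URL?) -> Data? {
    guard let url = url else { return nil }
    return try? Data(contentsOf: url)
}
```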

SWIFTUI - File Not Found error when trying to import a file from a cloud file provider like OneDrive and GoogleDrive

I have the following SwiftUI code where a simple button brings up the iOS file manager and lets the user select a CSV file to import. It works well for files stored locally on my device, but if I select a file from Google Drive or OneDrive, I get a URL; when I then try to retrieve the data from it, an error is returned saying that the file was not found.
After a lot of head scratching, I've found that if, in the file browser, I long-press to bring up the context menu and then view the file's info (which I'm guessing pulls it down to the phone's local cache), the import then works as expected. This is shown in the following animated GIF:
Once I've done that caching trick, I can access the file without issue in other apps using the same code, and it keeps working even after I uninstall and reinstall my app.
Can anyone advise on an approach using SwiftUI where I can avoid this File Not Found error when trying to import the file from Google Drive or OneDrive?
The entire code that I've been using for testing is as follows:
import SwiftUI

struct ContentView: View {
    @State private var isImporting: Bool = false
    @State private var fileContentString = ""
    @State var alertMsg = ""
    @State var showAlert = false

    func reportError(error: String) {
        alertMsg = error
        showAlert.toggle()
    }

    var body: some View {
        VStack {
            Button(action: { isImporting = true }, label: {
                Text("Select CSV File")
            })
            .padding()
            Text(fileContentString) // This will display the imported CSV as text in the view.
        }
        .padding()
        .fileImporter(
            isPresented: $isImporting,
            allowedContentTypes: [.commaSeparatedText],
            allowsMultipleSelection: false
        ) { result in
            do {
                guard let selectedFileURL: URL = try result.get().first else {
                    alertMsg = "ERROR: Result.get() failed"
                    self.reportError(error: alertMsg)
                    return
                }
                print("selectedFileURL is \(selectedFileURL)")
                if selectedFileURL.startAccessingSecurityScopedResource() {
                    //print("startAccessingSecurityScopedResource passed")
                    do {
                        print("Getting Data from URL...")
                        let inputData = try Data(contentsOf: selectedFileURL)
                        print("Converting data to string...")
                        let inputString = String(decoding: inputData, as: UTF8.self)
                        print(inputString)
                        fileContentString = inputString
                    } catch {
                        alertMsg = "ERROR: \(error.localizedDescription)"
                        self.reportError(error: alertMsg)
                        print(alertMsg)
                    }
                    //defer { selectedFileURL.stopAccessingSecurityScopedResource() }
                } else {
                    // Handle denied access
                    alertMsg = "ERROR: Unable to read file contents - Access Denied"
                    self.reportError(error: alertMsg)
                    print(alertMsg)
                }
            } catch {
                // Handle failure.
                alertMsg = "ERROR: Unable to read file contents - \(error.localizedDescription)"
                self.reportError(error: alertMsg)
                print(alertMsg)
            }
        }
        .alert(isPresented: $showAlert, content: {
            Alert(title: Text("Message"), message: Text(alertMsg), dismissButton: .destructive(Text("OK"), action: {
            }))
        })
    }
}
The console log output is as follows:
selectedFileURL is file:///private/var/mobile/Containers/Shared/AppGroup/8F147702-8630-423B-9DA0-AE49667748EB/File%20Provider%20Storage/84645546/1aTSCPGxY3HzILlCIFlMRtx4eEWDZ2JAq/example4.csv
Getting Data from URL...
ERROR: The file “example4.csv” couldn’t be opened because there is no such file.
selectedFileURL is file:///private/var/mobile/Containers/Shared/AppGroup/8F147702-8630-423B-9DA0-AE49667748EB/File%20Provider%20Storage/84645546/1aTSCPGxY3HzILlCIFlMRtx4eEWDZ2JAq/example4.csv
Getting Data from URL...
Converting data to string...
First Name,Last Name
Luke,Skywalker
Darth,Vader
My testing has been done on a physical iPhone 12 Pro Max running iOS 14.2 and a physical iPad Air 2 running iPadOS 14.4.
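Incidentally, once the contents are in a string (fileContentString above), a naive split into rows and fields — enough for unquoted CSV like the sample output — could look like this sketch (no support for quoted fields or embedded commas):

```swift
import Foundation

// Naive CSV split: each line becomes a row, each comma a field boundary.
// Does not handle quoted fields, embedded commas, or embedded newlines.
func parseSimpleCSV(_ text: String) -> [[String]] {
    return text
        .split(separator: "\n", omittingEmptySubsequences: true)
        .map { line in
            line.split(separator: ",", omittingEmptySubsequences: false)
                .map { $0.trimmingCharacters(in: .whitespaces) }
        }
}
```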
I found an answer to my issue. The solution was to use an NSFileCoordinator to force the file to be downloaded.
With the code below, if I access a file in cloud storage that hasn't previously been downloaded to the local device, it prints "FILE NOT AVAILABLE" but now simply downloads the file rather than throwing a file-not-found error.
Ideally I would like to be able to download just the file property metadata first to check how big the file is and then decide if I want to download the full file. The NSFileCoordinator has a metadata only option but I haven't worked out how to retrieve and interpret the results from that. This will do for now...
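As a partial step toward that, a cheap pre-check with plain Foundation can at least tell whether the item is already local and how big it is (fileSizeIfLocal is a made-up helper name):

```swift
import Foundation

// Returns the size in bytes if the file already exists locally, or nil if it
// hasn't been materialized yet (e.g. a cloud item that still needs downloading).
func fileSizeIfLocal(_ url: URL) -> Int? {
    let fm = FileManager.default
    guard fm.fileExists(atPath: url.path) else { return nil }
    guard let attrs = try? fm.attributesOfItem(atPath: url.path) else { return nil }
    return (attrs[.size] as? NSNumber)?.intValue ?? (attrs[.size] as? Int)
}
```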
if selectedFileURL.startAccessingSecurityScopedResource() {
    let fileManager = FileManager.default
    if fileManager.fileExists(atPath: selectedFileURL.path) {
        print("FILE AVAILABLE")
    } else {
        print("FILE NOT AVAILABLE")
    }
    var error: NSError?
    NSFileCoordinator().coordinate(
        readingItemAt: selectedFileURL, options: .forUploading, error: &error) { url in
        print("coordinated URL", url)
        let coordinatedURL = url
        isShowingFileDetails = false
        importedFileURL = selectedFileURL
        do {
            let resources = try selectedFileURL.resourceValues(forKeys: [.fileSizeKey])
            let fileSize = resources.fileSize!
            print("File Size is \(fileSize)")
        } catch {
            print("Error: \(error)")
        }
    }
    do {
        print("Getting Data from URL...")
        let inputData = try Data(contentsOf: selectedFileURL)
        print("Do stuff with file....")
    } catch {
        print("Error reading file: \(error)")
    }
}

Why does my MLKit model always return an error when processing an image?

I have a Google MLKit model for labeling an image after capturing it, but every time I try to process the image, it gives me this error:
label process error:: Pipeline failed to fully start: Calculator::Open() for node "ClassifierClientCalculator" failed: #vk The TFLite Model Metadata must not contain label maps when text_label_map_file is used.
Here's my MLKit image labeler configuration code (this code is based on MLKit's documentation):
private func configureModelSource() { // Called in viewDidLoad()
    guard let manifestPath = Bundle.main.path(forResource: "filename", ofType: "json") else { return }
    guard let localModel = LocalModel(manifestPath: manifestPath) else { return }
    let options = CustomImageLabelerOptions(localModel: localModel)
    options.confidenceThreshold = NSNumber(value: 0.0)
    imageLabeler = ImageLabeler.imageLabeler(options: options)
}

private func processImage(with image: UIImage) { // Called after capturing an image
    guard imageLabeler != nil else { return }
    let visionImage = VisionImage(image: image)
    visionImage.orientation = image.imageOrientation
    imageLabeler?.process(visionImage) { labels, error in
        guard error == nil, let labels = labels, !labels.isEmpty else {
            print("label process error:: \(error?.localizedDescription ?? "nil")")
            return
        }
        for label in labels {
            // Do something...
        }
    }
}
Is there any way to solve this? For context, the model.tflite file was updated: the file from before the update works as expected, but the new model.tflite file gives me this error every time I run my app. Is this a file-related error, or did I do something wrong in my code that I now have to update as well?
Here's my understanding based on the error message:
Given that you are using the LocalModel(manifestPath: manifestPath) API, it expects the legacy TFLite model format, where the label map is provided through a separate text file and the model.tflite itself does not contain the label map. That's why the file from before your model update works.
To use your updated model.tflite (which seems to contain the label map inside its metadata), I think you can try the following to use the model.tflite file directly with the custom models API, without going through the filename.json manifest:
guard let modelPath = Bundle.main.path(forResource: "model", ofType: "tflite") else { return }
guard let localModel = LocalModel(path: modelPath) else { return }
You can check out the documentation about custom models here: https://developers.google.com/ml-kit/vision/image-labeling/custom-models/ios

how to add background music

The duplicate answer does not work at all.
import Cocoa
import AVFoundation

var error: NSError?
println("Hello, Audio!")
var url = NSURL(fileURLWithPath: "/Users/somebody/myfile.mid") // Change to a local midi file
var midi = AVMIDIPlayer(contentsOfURL: url, soundBankURL: nil, error: &error)
if midi == nil {
    if let e = error {
        println("AVMIDIPlayer failed: " + e.localizedDescription)
    }
}
midi.play(nil)
while midi.playing {
    // Spin (yeah, that's bad!)
}
I've made a couple of changes to your code but this seems to "work" (we'll get to that)
First off, import the MP3 file to your playground as described in this answer
Then you can use your file like so:
import UIKit
import AVFoundation

print("Hello, Audio!")
if let url = Bundle.main.url(forResource: "drum01", withExtension: "mp3") {
    do {
        let midi = try AVMIDIPlayer(contentsOf: url, soundBankURL: nil)
        midi.play(nil)
        while midi.isPlaying {
            // Spin (yeah, that's bad!)
        }
    } catch (let error) {
        print("AVMIDIPlayer failed: " + error.localizedDescription)
    }
}
Notice:
print instead of println
In Swift 3 a lot of things were renamed, and some of the "old" methods that took an &error parameter were changed to use do try catch instead. Therefore, the error has gone from your call and has been replaced with a try.
The above will fail! You will see error code -10870 which can be found in the AUComponent.h header file and which translates to:
kAudioUnitErr_UnknownFileType
If an audio unit uses external files as a data source, this error is returned
if a file is invalid (Apple's DLS synth returns this error)
So... this leads me to think you need to do one of two things, either:
find a .midi file and use that with the AVMidiPlayer
find something else to play your file, for instance AVFilePlayer or AVAudioEngine
(you can read more about error handling in Swift here).
Hope that helps you.
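The Swift 3 do/try/catch migration described in the notes above, reduced to a minimal generic form (all names here are made up for illustration):

```swift
enum PlayerError: Error {
    case unsupportedFile(String)
}

// Swift 3+ style: instead of filling an &error out-parameter, the call throws.
func makePlayer(forFile name: String) throws -> String {
    guard name.hasSuffix(".mid") else {
        throw PlayerError.unsupportedFile(name)
    }
    return "player for \(name)"
}

// Callers wrap the call in do/try/catch rather than checking error afterwards:
// do {
//     let player = try makePlayer(forFile: "song.mid")
// } catch {
//     print("failed: \(error)")
// }
```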
The mp3 file must be in the Resources folder.
You play an mp3 with code like this (not the MIDI player):
if let url = Bundle.main.url(forResource: "drum01", withExtension: "mp3") {
    let player = try? AVAudioPlayer(contentsOf: url)
    player?.prepareToPlay()
    player?.play()
}

ERROR: opening file <file name> for output when using wav_to_flac + FLACiOS framework for iOS

I am trying to convert audio from AVRecorder wav file to flac. I have added the FLACiOS framework as well as the wav_to_flac helper from jhurt.
I have set my recorder settings to the following:
// -------------------------------------------------+
// MARK: - AVFoundation Setup Methods
// -------------------------------------------------+
private func setupRecorder() {
    let recordSettings: [String: AnyObject] = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 16000.0,
        AVEncoderBitRateKey: 16,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVEncoderAudioQualityKey: AVAudioQuality.High.rawValue
    ]
    do {
        audioRecorder = try AVAudioRecorder(URL: recordFileURL, settings: recordSettings)
        audioRecorder.delegate = self
        audioRecorder.prepareToRecord()
    } catch {
        print("Error: attempt to create AVAudioRecorder failed miserably!")
    }
}
// =================================================+
I am handling the conversion in the AVRecorderDelegate method as follows:
func audioRecorderDidFinishRecording(recorder: AVAudioRecorder, successfully flag: Bool) {
    // Delegate housekeeping
    audioRecorderDelegate?.didFinishRecording(recorder.url)
    tempFiles.append(recorder.url)
    // Set up files for flac conversion
    let flacOutFile = (recorder.url.URLByDeletingPathExtension!.absoluteString as NSString).UTF8String
    let flacInFile = (recorder.url.absoluteString as NSString).UTF8String
    let outFiles = flac_file_array()
    // Internal housekeeping for memory freeing later
    outputFileBuffers.append((count: 1024, buffer: outFiles))
    convertWavToFlac(flacInFile, flacOutFile, 0, outFiles)
    recordFileURL = urlGen()
}
The wav files play back ok. Jhurt says the convertWavToFlac function expects Apple's undocumented wav format, rather than the "normal" wav. I am not sure if AVAudioRecorder is outputting that given the settings I am providing.
I am providing URLs to the recorder using a URL-generator function I wrote that returns a closure which generates a new URL:
func generateFileURL(withExtension fileExtension: String) -> URLGenerator {
    var number = 0
    let tempDir = NSTemporaryDirectory()
    let docPath = tempDir + "tmp"
    return { () in
        let url = NSURL(fileURLWithPath: docPath + "\(number)" + ".\(fileExtension)")
        number += 1
        return url
    }
}
with the wav extension. I read in the Audio Programming book that AVAudio infers the file type from the extension.
I get the following error when using convertWavToFlac:
ERROR: opening file:///Users/ShahtieShack/Library/Developer/CoreSimulator/Devices/15522B12-9026-44B0-922C-415433D72982/data/Containers/Data/Application/051442B5-639C-4654-8D53-14B245DE7FEB/tmp/tmp0.wav for output
I am running this in the simulator and have not gotten the framework to work on a device. I have searched all around for MODERN answers to this problem. ANY help would be welcome.
EDIT: I have run this app on both the Simulator AND a device and get the same error. I am feeding the convertWavToFlac function audio that I have successfully recorded from the microphone.
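For reference, the URL-generator function from the question, rewritten as a sketch against the modern value-type URL API (Swift 3+):

```swift
import Foundation

// Closure-based URL generator using the value-type URL API instead of NSURL.
func makeFileURLGenerator(withExtension fileExtension: String) -> () -> URL {
    var number = 0
    let tempDir = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true)
    return {
        defer { number += 1 }
        return tempDir.appendingPathComponent("tmp\(number).\(fileExtension)")
    }
}
```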
