In my iOS app, I am trying to transcribe prerecorded audio using the Speech framework (SFSpeechRecognizer) introduced in iOS 10.
Multiple sources, including the documentation, state that the audio duration limit for the Speech API (more specifically SFSpeechRecognizer) is one minute.
In my code, however, any audio file longer than about 15 seconds fails with the following error:
Error Domain=kAFAssistantErrorDomain Code=203 "SessionId=com.siri.cortex.ace.speech.session.event.SpeechSessionId#50a8e246, Message=Timeout waiting for command after 30000 ms" UserInfo={NSLocalizedDescription=SessionId=com.siri.cortex.ace.speech.session.event.SpeechSessionId#50a8e246, Message=Timeout waiting for command after 30000 ms, NSUnderlyingError=0x170248c40 {Error Domain=SiriSpeechErrorDomain Code=100 "(null)"}}
I have searched all over the internet and have not been able to find a solution. Other people have reported the same problem; some suspect it is an issue with Nuance.
It is also worth noting that I do get partial results from the transcription process.
Here's the code from my iOS app.
// Create a speech recognizer request object.
let srRequest = SFSpeechURLRecognitionRequest(url: location)
srRequest.shouldReportPartialResults = false
sr?.recognitionTask(with: srRequest) { (result, error) in
    if let error = error {
        // Something went wrong
        print(error.localizedDescription)
    } else {
        if let result = result {
            print(4)
            print(result.bestTranscription.formattedString)
            if result.isFinal {
                print(5)
                transcript = result.bestTranscription.formattedString
                print(result.bestTranscription.formattedString)
                // Store the transcript into the database.
                print("\nSiri-Transcript: " + transcript!)
                // Store the audio transcript into the Firebase Realtime Database
                self.firebaseRef = FIRDatabase.database().reference()
                let ud = UserDefaults.standard
                if let uid = ud.string(forKey: "uid") {
                    print("Storing the transcript into the database.")
                    let path = "users" + "/" + uid + "/" + "siri_transcripts" + "/" + date_recorded + "/" + filename.components(separatedBy: ".")[0]
                    print("transcript database path: \(path)")
                    self.firebaseRef.child(path).setValue(transcript)
                }
            }
        }
    }
}
Thank you for your help.
I haven't confirmed this beyond someone else running into the same problem, but I believe it is an undocumented limit on prerecorded audio.
Remove the result.isFinal check and do a nil check on the result instead. Reference: https://github.com/mssodhi/Jarvis-ios/blob/master/Jarvis-ios/HomeCell%2Bspeech.swift
I can confirm this. I extracted the audio from a video file, and if it exceeds 15 seconds, recognition fails with the following error:
Domain = kAFAssistantErrorDomain Code = 203 "Timeout" UserInfo = {
NSLocalizedDescription = Timeout,
NSUnderlyingError = 0x1c0647950 {Error Domain=SiriSpeechErrorDomain Code=100 "(null)"}
}
The key issue is that recognition of an audio file cuts off after about 15 seconds.
result.isFinal is always false, and, frustratingly, there is no accurate timestamp: although the error says "Timeout", the partial results already contain the complete recognized content, which feels odd.
If you print out each result as it arrives, you can see the restriction at around 15 seconds; the timestamps reported for the audio file stop at some limited value (such as 15, 4, or 9 seconds) before the task ends, and the timeout feedback is unstable.
Real-time speech recognition, however, can go past 15 seconds, up to the one minute described in the official documentation.
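Given the observations above, one workaround (my sketch, not from the original posts; the 50-second chunk length and the `transcribeInChunks` helper are assumptions) is to export the prerecorded file in sub-minute chunks with AVAssetExportSession and recognize each chunk separately:

```swift
import AVFoundation
import Speech

let recognizer = SFSpeechRecognizer()

/// Hypothetical helper: split `url` into chunks shorter than the documented
/// one-minute limit and recognize each chunk on its own.
func transcribeInChunks(url: URL, chunkSeconds: Double = 50,
                        completion: @escaping ([String]) -> Void) {
    let asset = AVURLAsset(url: url)
    let total = CMTimeGetSeconds(asset.duration)
    let chunkCount = Int(ceil(total / chunkSeconds))
    var transcripts = [String](repeating: "", count: chunkCount)
    let group = DispatchGroup()
    let syncQueue = DispatchQueue(label: "transcripts.sync")  // serialize array writes

    for i in 0..<chunkCount {
        guard let exporter = AVAssetExportSession(asset: asset,
                                                  presetName: AVAssetExportPresetAppleM4A) else { continue }
        let chunkURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("chunk-\(i).m4a")
        try? FileManager.default.removeItem(at: chunkURL)
        exporter.outputURL = chunkURL
        exporter.outputFileType = .m4a
        // Export only this chunk's time range; the last chunk is clamped
        // to the asset's end automatically.
        exporter.timeRange = CMTimeRange(
            start: CMTime(seconds: Double(i) * chunkSeconds, preferredTimescale: 600),
            duration: CMTime(seconds: chunkSeconds, preferredTimescale: 600))

        group.enter()
        exporter.exportAsynchronously {
            let request = SFSpeechURLRecognitionRequest(url: chunkURL)
            request.shouldReportPartialResults = false
            recognizer?.recognitionTask(with: request) { result, error in
                if let result = result, result.isFinal {
                    syncQueue.async {
                        transcripts[i] = result.bestTranscription.formattedString
                        group.leave()
                    }
                } else if error != nil {
                    group.leave()
                }
            }
        }
    }
    group.notify(queue: .main) {
        completion(transcripts)   // chunks in order; caller joins them
    }
}
```

Each chunk stays under the documented limit, so the per-file timeout observed above should not trigger; you still need to request speech-recognition authorization before calling this.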
I'm working on my iOS Swift code and have successfully installed all dependencies.
Now I'm trying to increase the timeoutInterval of a Firebase Functions call.
functions.httpsCallable("getData").call() { (result, error) in
    guard error == nil else {
        print(error)
        return
    }
    .........
}
You cannot do it from the client side. You will have to increase the timeout in your function like this:
const runtimeOpts = {
timeoutSeconds: 300,
memory: '1GB'
}
exports.getData = functions
    .runWith(runtimeOpts)
    .https.onCall((data, ctx) => {
        // the function
    });
The maximum value for timeoutSeconds is 540, or 9 minutes.
Detailed information can be found in the documentation.
I found the answer: first you have to set the timeout on the Firebase server side, then on the client side (Swift) use this code:
functions.httpsCallable("getData").timeoutInterval = 120
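Putting the two sides together, a minimal client-side sketch (the `getData` name comes from the question; `Functions.functions()` is the standard SDK entry point):

```swift
import FirebaseFunctions

let functions = Functions.functions()
let callable = functions.httpsCallable("getData")
// Client-side timeout: give up after 2 minutes. The server-side
// timeoutSeconds still has to be raised separately via runWith.
callable.timeoutInterval = 120
callable.call { result, error in
    if let error = error {
        print("getData failed: \(error.localizedDescription)")
        return
    }
    print("getData result: \(String(describing: result?.data))")
}
```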
In AWSKinesisRecorder (here), how can we check whether our records were submitted to the server / reached AWS, or whether there are records on disk that have not yet been submitted?
kinesisRecorder.submitAllRecords()?.continueOnSuccessWith(block: { (task: AWSTask<AnyObject>) -> Any? in
if let error = task.error as NSError? {
Logger.log(method: .error, "\(#function) \(#line) \(#file)", "Error: \(error)")
}
if let result = task.result {
Logger.log(method: .info, "\(#function) \(#line) \(#file)", "Result: \(result)")
}
print("FINISHED AWSTask kinesisRecorder", task, task.error, task.isCompleted, task.isFaulted, task.isCancelled)
return nil
})
The completion block never returns an error, and task.result is also nil, even when the internet is turned off on the device.
Not Possible
It seems there is no public API to fetch the records that are written to local mobile storage, nor can you read back the sent records from Kinesis.
Its aim is to stream data in a unidirectional way.
I had to create another API to get the details of the records received on the server end, and had to rely on Kinesis writing each record safely to local storage. So far I have not seen any data loss.
I'm using AWS AppSync in the app I'm currently developing and facing a serious issue: whenever I fire queries with the AppSync client on a slow internet connection, the request never ends with a callback. There is limited information on this topic online, but I found this issue, which is still open.
This is the code I used to get the response
func getAllApi(completion: @escaping DataCallback) {
    guard isInternetAvailabele() else {
        completion(nil)
        return
    }
    // AppSyncManager.Client() is an AWSAppSyncClient object
    AppSyncManager.Client().fetch(query: GetlAllPostQuery(input: allInputs), cachePolicy: .fetchIgnoringCacheData) {
        (result, error) in
        // Treat a missing response code as an error as well.
        let haveError = (error != nil) || (result?.data?.getAllPostings?.responseCode == nil)
        if haveError {
            print(error?.localizedDescription ?? "")
            completion(nil)
            return
        }
        if result != nil {
            completion(result)
        } else {
            completion(nil)
        }
    }
}
The code works fine with a normal internet connection, and I already check at the top whether there is no internet at all. But when the connection is slow, or the device is connected to a hotspot I created from a phone with mobile data disabled, the request never returns a callback; it should fail with a timeout alert like other APIs do.
Is there any support for a request timeout, or did I miss something?
Note: I received these logs in the terminal:
Task <06E9BBF4-5731-471B-9B7D-19E5E504E57F>.<45> HTTP load failed (error code: -1001 [1:60])
Task <D91CA952-DBB5-4DBD-9A90-98E2069DBE2D>.<46> HTTP load failed (error code: -1001 [1:60])
Task <06E9BBF4-5731-471B-9B7D-19E5E504E57F>.<45> finished with error - code: -1001
Task <D91CA952-DBB5-4DBD-9A90-98E2069DBE2D>.<46> finished with error - code: -1001
There are two possible ways to fix the issue:
1) While configuring AWSAppSyncClientConfiguration, provide a custom URLSessionConfiguration and set the request timeout to your needs,
extension URLSessionConfiguration {
    /// A `URLSessionConfiguration` with a request timeout of 1 minute.
    static let customDelayed: URLSessionConfiguration = {
        let secondsInOneMinute = 60
        let numberOfMinutesForTimeout = 1
        let timeoutInterval = TimeInterval(numberOfMinutesForTimeout * secondsInOneMinute)
        let configuration = URLSessionConfiguration.default
        configuration.timeoutIntervalForRequest = timeoutInterval
        configuration.timeoutIntervalForResource = timeoutInterval
        return configuration
    }()
}
And pass this session configuration, i.e. URLSessionConfiguration.customDelayed, when initializing AWSAppSyncClientConfiguration, as it accepts a URLSessionConfiguration in the constructor below:
public convenience init(url: URL,
serviceRegion: AWSRegionType,
credentialsProvider: AWSCredentialsProvider,
urlSessionConfiguration: URLSessionConfiguration = URLSessionConfiguration.default,
databaseURL: URL? = nil,
connectionStateChangeHandler: ConnectionStateChangeHandler? = nil,
s3ObjectManager: AWSS3ObjectManager? = nil,
presignedURLClient: AWSS3ObjectPresignedURLGenerator? = nil) throws {
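For completeness, a sketch of passing the custom configuration into that constructor (the API URL, region, and identity pool ID below are placeholders, not values from the question):

```swift
import AWSAppSync
import AWSCore

// Hypothetical setup; replace the URL, region and pool ID with your own.
func makeAppSyncClient() throws -> AWSAppSyncClient {
    let credentials = AWSCognitoCredentialsProvider(
        regionType: .USEast1,
        identityPoolId: "us-east-1:REPLACE-WITH-YOUR-POOL-ID")
    let config = try AWSAppSyncClientConfiguration(
        url: URL(string: "https://example.appsync-api.us-east-1.amazonaws.com/graphql")!,
        serviceRegion: .USEast1,
        credentialsProvider: credentials,
        urlSessionConfiguration: URLSessionConfiguration.customDelayed)
    return try AWSAppSyncClient(appSyncConfig: config)
}
```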
2) If the first option doesn't work, you have another option: edit the pod sources directly. There is a class AWSAppSyncRetryHandler where you can change the retry logic for requests. If you manage to fix the issue that way, fork the original repo, commit your changes to your fork, and point the pod in your Podfile at your repository. Editing pod files directly is the wrong approach unless you are really stuck and need a workaround.
Update: This issue has been fixed with AppSync SDK 2.7.0
Recently I've run into a problem that appears only rarely.
What I'm trying to do is get the number of steps per minute.
In my code:
let date = NSDate()
for i in 0...1000 {
    dispatch_async(mySerialQueue) {
        self.pedoMeter.queryPedometerDataFromDate(date.dateByAddingTimeInterval(Double(i + 1) * -60.0),
                                                  toDate: date.dateByAddingTimeInterval(Double(i) * -60.0),
                                                  withHandler: { (data, error) in
            if let data = data {
                print("\(data.numberOfSteps)")
            }
        })
    }
}
Sometimes the number of steps returned is a huge number, which can be >1000000. After tracing the device log, I found this error:
Sep 15 16:42:59 locationd[6315] <Error>: Steps were found to be non monotonically increasing - start:488825.000000, end:488825.000000
and that corresponds to the weird step number.
I am trying to avoid the problem, which is why I run the queries on a serial queue. However, that didn't help. Is there any way to avoid it?
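For what it's worth, a serial dispatch queue only serializes the calls that *start* the queries, not their asynchronous handlers, so all 1000 queries are still in flight at once. One way to truly run them one at a time (a sketch in modern Swift naming, not a confirmed fix for the non-monotonic error) is to chain each query from the previous handler:

```swift
import CoreMotion

let pedometer = CMPedometer()
let endDate = Date()

/// Start the query for minute `i` only after minute `i - 1`'s handler
/// has run, so at most one pedometer query is in flight at a time.
func queryMinute(_ i: Int, upTo count: Int) {
    guard i < count else { return }
    let from = endDate.addingTimeInterval(Double(i + 1) * -60.0)
    let to = endDate.addingTimeInterval(Double(i) * -60.0)
    pedometer.queryPedometerData(from: from, to: to) { data, _ in
        if let data = data {
            print("minute \(i): \(data.numberOfSteps) steps")
        }
        queryMinute(i + 1, upTo: count)   // chain the next query
    }
}

queryMinute(0, upTo: 1000)
```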
Updated code sample, console output and information
I'm trying to get the total duration of a video composition, divide it by 10, round up, then loop through and splice the video composition at those intervals.
It works, but after currentDuration exceeds 60, it throws "The requested URL was not found on this server."
Basically, if it determines it needs to produce 9 clips, it succeeds on the first 6 and fails on the other 3. Also, my loopNum isn't working as expected; it almost seems the while loop finishes before any of the 9 clips are attempted.
Would love some help/insight into getting all 9 clips exporting.
I've noticed that if the video is less than 60 seconds long, it has a 100% success rate. Any video over 60 seconds fails from the 7th clip onward.
Here's my method:
func splitVideo(videoComposition: AVMutableVideoComposition) {
    let fileManager = NSFileManager.defaultManager()
    let documentsPath: String = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0]
    // Grab total duration / number of splits
    let exporter: AVAssetExportSession = AVAssetExportSession(asset: asset!, presetName: AVAssetExportPresetHighestQuality)!
    let totalDuration = Float64(CMTimeGetSeconds(exporter.asset.duration))
    let totalDurationPieces = (totalDuration / 10)
    var numberOfSplits = Int(ceil(Double(totalDurationPieces)))
    // Prepare for loop
    var loopNum: Int = 0
    var currentDuration: Float64 = 0
    var sessionNumber = (arc4random() % 1000)
    let opQueue = NSOperationQueue()
    opQueue.maxConcurrentOperationCount = 1
    while loopNum < numberOfSplits {
        // New exporter
        var destinationPath: String = documentsPath + "/splitVideo-" + String(sessionNumber)
        destinationPath += "-" + String(Int(loopNum)) + ".mp4"
        let new_exporter = AVAssetExportSession(asset: asset!, presetName: AVAssetExportPresetHighestQuality)!
        new_exporter.outputURL = NSURL(fileURLWithPath: destinationPath as String)
        new_exporter.videoComposition = videoComposition
        new_exporter.outputFileType = AVFileTypeMPEG4
        new_exporter.shouldOptimizeForNetworkUse = false
        new_exporter.timeRange = CMTimeRangeMake(
            CMTimeMakeWithSeconds(currentDuration, framesPerSecond!),
            CMTimeMakeWithSeconds(Float64(10), framesPerSecond!))
        // Set up the exporter, then:
        opQueue.addOperationWithBlock { () -> Void in
            new_exporter.exportAsynchronouslyWithCompletionHandler({
                dispatch_async(dispatch_get_main_queue(), {
                    print("Exporting... \(loopNum)")
                    self.exportDidFinish(new_exporter, loopNum: loopNum)
                })
            }) // end completion handler
        } // end block
        // Prepare for next loop
        loopNum = loopNum + 1
        currentDuration = currentDuration + 10
        if loopNum >= numberOfSplits {
            self.allExportsDone(Int(numberOfSplits))
        }
    } // end while
}
Here's the exportDidFinish method:
func exportDidFinish(session: AVAssetExportSession, loopNum: Int) {
    let outputURL: NSURL = session.outputURL!
    let library: ALAssetsLibrary = ALAssetsLibrary()
    if library.videoAtPathIsCompatibleWithSavedPhotosAlbum(outputURL) {
        library.writeVideoAtPathToSavedPhotosAlbum(outputURL, completionBlock: { (url, error) in
            // done
            print("Success on \(Int(loopNum))")
        })
    }
}
Here's the console output:
Exporting... 9
2016-08-20 13:39:27.980 TrimVideo[4776:1576022] Video /var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-6.mp4 cannot be saved to the saved photos album: Error Domain=NSURLErrorDomain Code=-1100 "The requested URL was not found on this server." UserInfo={NSUnderlyingError=0x1457f43f0 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}, NSErrorFailingURLStringKey=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-6.mp4, NSErrorFailingURLKey=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-6.mp4, NSURL=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-6.mp4, NSLocalizedDescription=The requested URL was not found on this server.}
Exporting... 9
2016-08-20 13:39:27.984 TrimVideo[4776:1576022] Video /var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-7.mp4 cannot be saved to the saved photos album: Error Domain=NSURLErrorDomain Code=-1100 "The requested URL was not found on this server." UserInfo={NSUnderlyingError=0x1457f88c0 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}, NSErrorFailingURLStringKey=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-7.mp4, NSErrorFailingURLKey=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-7.mp4, NSURL=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-7.mp4, NSLocalizedDescription=The requested URL was not found on this server.}
Exporting... 9
2016-08-20 13:39:27.988 TrimVideo[4776:1576022] Video /var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-8.mp4 cannot be saved to the saved photos album: Error Domain=NSURLErrorDomain Code=-1100 "The requested URL was not found on this server." UserInfo={NSUnderlyingError=0x14687cb30 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}, NSErrorFailingURLStringKey=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-8.mp4, NSErrorFailingURLKey=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-8.mp4, NSURL=file:///var/mobile/Containers/Data/Application/BE125EB7-AFC4-48B5-95C3-941B420BB71F/Documents/splitVideo-972-8.mp4, NSLocalizedDescription=The requested URL was not found on this server.}
Exporting... 9
Success on 9
Exporting... 9
Success on 9
Exporting... 9
Success on 9
Exporting... 9
Success on 9
Exporting... 9
Ok, some news.
ALAssetsLibrary has been deprecated since iOS 8, with Apple moving everyone to the Photos framework for this. The good news is that AVAssetExportSession is not deprecated. While you could carry on with the deprecated API, it would be a good idea to rewrite the exportDidFinish function to use the new API.
The while loop in the splitVideo function kicks off all of the export operations concurrently. It's a bit of a guess, to be honest, but I suspect some resource contention kicks in once you get to clip 6.
So that needs to be redesigned to be a little friendlier. The best bet is to use an NSOperationQueue with maxConcurrentOperationCount set to one (i.e. a serial queue).
Something like:
let opQueue = NSOperationQueue()
opQueue.maxConcurrentOperationCount = 1
for loopNum in 0..<numberOfSplits {
    // Set up the exporter, then:
    opQueue.addOperationWithBlock { () -> Void in
        new_exporter.exportAsynchronouslyWithCompletionHandler({
            dispatch_async(dispatch_get_main_queue(), {
                print("Exporting... \(loopNum)")
                self.exportDidFinish(new_exporter, loopNum: loopNum)
            })
        }) // end completion handler
    } // end block
} // end for
The purpose of this is to ensure that the export operations run one at a time rather than all at once. If that succeeds, you can experiment with raising maxConcurrentOperationCount to get some multi-threading going.
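One caveat worth adding to the answer above (my note, not the original author's): exportAsynchronouslyWithCompletionHandler returns immediately, so each queued block finishes before its export does, and the serial operation queue alone does not actually limit how many exports run at once. A sketch, in modern Swift naming, that blocks each operation on a semaphore until its export completes (the `exporters` array is an assumption standing in for the configured sessions):

```swift
import AVFoundation

let opQueue = OperationQueue()
opQueue.maxConcurrentOperationCount = 1

// Assumed: one configured AVAssetExportSession per clip.
let exporters: [AVAssetExportSession] = []

for (loopNum, exporter) in exporters.enumerated() {
    opQueue.addOperation {
        // Block this operation until the asynchronous export finishes,
        // so the serial queue really does run one export at a time.
        let semaphore = DispatchSemaphore(value: 0)
        exporter.exportAsynchronously {
            DispatchQueue.main.async {
                print("Finished clip \(loopNum), status: \(exporter.status.rawValue)")
            }
            semaphore.signal()
        }
        semaphore.wait()
    }
}
```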