This has been asked before, but something must have changed in Swift since it was asked. I am trying to store CMSampleBuffer objects returned from an AVCaptureSession to be processed later. After some experimentation I discovered that AVCaptureSession must be reusing its CMSampleBuffer references. When I try to keep more than 15 the session hangs. So I thought I would make copies of the sample buffers. But I can't seem to get it to work. Here is what I have written:
var allocator: Unmanaged<CFAllocator>! = CFAllocatorGetDefault()
var bufferCopy: UnsafeMutablePointer<CMSampleBuffer?>
let err = CMSampleBufferCreateCopy(allocator.takeRetainedValue(), sampleBuffer, bufferCopy)
if err == noErr {
    bufferArray.append(bufferCopy.memory!)
} else {
    NSLog("Failed to copy buffer. Error: \(err)")
}
This won't compile; the error is "Variable 'bufferCopy' used before being initialized". I've looked at many examples, and they either won't compile or they compile but don't work.
Anyone see what I'm doing wrong here?
You can simply pass a CMSampleBuffer? variable (which, as an optional, is implicitly initialized with nil) as an inout argument with &:
var bufferCopy : CMSampleBuffer?
let err = CMSampleBufferCreateCopy(kCFAllocatorDefault, buffer, &bufferCopy)
if err == noErr {
// ...
}
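For context, here is a minimal sketch of how that might look inside an AVCaptureVideoDataOutput delegate callback (Swift 2-era signature; bufferArray is assumed to be a [CMSampleBuffer] property declared elsewhere, as in the question):

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Copy the buffer so the capture session can recycle the original.
    var bufferCopy: CMSampleBuffer?
    let err = CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer, &bufferCopy)
    if err == noErr {
        bufferArray.append(bufferCopy!)
    } else {
        NSLog("Failed to copy buffer. Error: \(err)")
    }
}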
As the error says, you're literally attempting to use the variable bufferCopy before it is initialized.
You've declared its type, but haven't allocated the memory it points to.
You should instead create the CMSampleBuffer using CMSampleBufferCreate: https://developer.apple.com/library/tvos/documentation/CoreMedia/Reference/CMSampleBuffer/index.html#//apple_ref/c/func/CMSampleBufferCreate
You should then be able to copy the buffer into it (as long as its format matches that of the buffer you're copying from).
Before I updated to iOS 14 on my iPhone, this code was working perfectly. After iOS 14, it weirdly does not run. It is very odd; I have not seen any solution online, and from my own investigation I have not been able to spot any relevant change.
This code is used to retrieve a videoURL for a video imported from the Camera Roll (I use import Photos...).
phResourceManager.writeData(for: resource.last!, toFile: newURL!, options: resourceRequestOptions) { (error) in
    if error != nil {
        print(error, "not c67omplted error?")
    } else {
        print("woah completedd 345?")
        newUserTakenVideo.videoURL = newURL
        print(newUserTakenVideo.videoURL, "<--?")
    }
}
EDIT:
To be clear, "it does not run" means that the completion block never runs... it doesn't even run and return an error; the completion block is simply never called (nothing prints, at least).
And here is a print statement printing out all the values I pass in to the parameters:
phResourceManager:
<PHAssetResourceManager: 0x282d352c0>
resource.last:
Optional(<PHAssetResource: 0x28128bc00> {
type: video
uti: public.mpeg-4
filename: v07044090000bu6n1nhlp4leque7r720.mp4
asset: C97B45D3-7039-4626-BA3E-BCA67912A2A9/L0/001
locallyAvailable: YES
fileURL: file:///var/mobile/Media/DCIM/113APPLE/IMG_3404.MP4
width: 576
height: 1024
fileSize: 4664955
analysisType: unavailable
cplResourceType: Original
isCurrent: YES
})
newURL:
Optional(file:///var/mobile/Containers/Data/Application/E2792F47-142E-4601-8D5B-F549D03C9AFE/Documents/Untitled%2027228354.MP4)
resourceRequestOptions:
<PHAssetResourceRequestOptions: 0x28230d480>
Note: this is the declaration of the resource variable:
let resource = PHAssetResource.assetResources(for: (cell?.assetPH)!)
I have a solution to this! Swift 4+, tested on iOS 14!
I looked into using a PHAssetResourceRequest, but the file names got mangled in the process, and it generally didn't work with my sandbox. Then I also tried requesting an AVPlayerItem from the PHAsset, but this, too, did not work with sandboxing...
But then I tried simply using PHAssetResourceManager.default().writeData(..., and it seemingly started working!
I tested a bit more and it seemed to work; here is the full code:
let resource = PHAssetResource.assetResources(for: (cell?.assetPH)!)
let resourceRequestOptions = PHAssetResourceRequestOptions()
let newURL = ExistingMediaVC.newFileUrl
PHAssetResourceManager.default().writeData(for: resource.last!, toFile: newURL!, options: resourceRequestOptions) { (error) in
    if error != nil {
        print(error, "error")
    } else {
        print("good")
        newUserTakenVideo.videoURL = newURL
    }
}
It is quite simple! Tell me if anything is not working, and note that I still use the ExistingMediaVC.newFileUrl variable from your original code as well :)
I'm trying to get the asset property from an AVAssetTrack object, but sometimes it's nil. The problem seems to occur only after I use DispatchQueue.main.async.
According to the documentation, it's necessary to use loadValuesAsynchronously(forKeys:completionHandler:) to avoid blocking the main thread, and to return to the main thread after loading is done.
let asset = AVURLAsset(url: videoInAppBundleURL)
let track = asset.tracks(withMediaType: .video).first!
assert(track.asset != nil) // passes
track.loadValuesAsynchronously(forKeys: [#keyPath(AVAssetTrack.asset)]) {
    assert(track.asset != nil) // passes
    DispatchQueue.main.async {
        assert(track.asset != nil) // FAILS
        // [...]
    }
}
What I found out is:
- It makes no difference whether I'm running on a device or the simulator.
- It seems not to be a problem with the video / videoURL.
- The video is part of the main bundle; I tried both .mp4 and .mov files, and I made sure the video works by displaying it via an AVPlayerViewController.
Here is a working demo project.
I'm also wondering: why is AVAssetTrack's asset property optional? (All the other properties are non-optional!)
Note: this question has been edited after reading Matt's helpful comments and further investigation.
I reproduced the issue, with some tweaking of your github example, like this:
let asset = AVURLAsset(url: videoInAppBundleURL)
let tracksKey = #keyPath(AVAsset.tracks)
asset.loadValuesAsynchronously(forKeys: [tracksKey]) {
    let track = asset.tracks(withMediaType: .video).first!
    DispatchQueue.main.async {
        assert(track.asset != nil) // fails
    }
}
Okay, but now watch closely as I perform an amazing trick:
let asset = AVURLAsset(url: videoInAppBundleURL)
let tracksKey = #keyPath(AVAsset.tracks)
asset.loadValuesAsynchronously(forKeys: [tracksKey]) {
    let track = asset.tracks(withMediaType: .video).first!
    DispatchQueue.main.async {
        print(asset) // <-- amazing trick
        assert(track.asset != nil) // passes!
    }
}
Whoa! All I did was add a print statement — and now suddenly the very same assertion passes. This in fact is parallel to your original statement (which you later edited out) that "Sometimes the problems are gone, when stepping through the code with the debugger."
So, now, my suspicions being thoroughly aroused, I did something unbelievably clever (even if I do say so myself). I removed the print(asset), but I switched the scheme’s configuration from Debug to Release. Presto, the assertion still passes.
So what you’ve found is a quirk of the compiler — dare I call it a bug?
But wait, there’s more. You asked, quite reasonably, why asset is Optional. It’s because it’s weak:
weak open var asset: AVAsset? { get }
So there’s your answer. The track has only a weak reference to its asset. If we pass the track down into an asynchronous queue, and we do not bring the asset itself along with us, then the weak reference lets go and the asset is lost — in a Debug build.
Hope this helps. You are probably waiting for me to make some grand conclusory statement about whether this constitutes a bug, but I’m not going to, sorry. I’ve provided two workarounds (use a Release build, or deliberately carry the asset reference down into the async queue) and that’s as far as I can go.
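For reference, a minimal sketch of the second workaround, i.e. deliberately carrying the asset down into the async block (the _ = asset line is just one explicit way to force the closure to capture it; any use of asset inside the closure should have the same effect):

let asset = AVURLAsset(url: videoInAppBundleURL)
let tracksKey = #keyPath(AVAsset.tracks)
asset.loadValuesAsynchronously(forKeys: [tracksKey]) {
    let track = asset.tracks(withMediaType: .video).first!
    DispatchQueue.main.async {
        _ = asset // keep a strong reference to the asset alive across the async hop
        assert(track.asset != nil) // should now pass even in a Debug build
        // [...]
    }
}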
I'm running into a compiler error when using the following code:
func saveImageToDisk() {
    let imageData = UIImagePNGRepresentation(imageView.image!)!
    let fileName = getDocumentsDirectory().appendingPathComponent("image.png")
    imageData.writeToFile(fileName, atomically: true)
}
The error is: Value of type 'Data' has no member 'writeToFile'
Could this be a compiler error, or something I'm missing? Thanks
SE-0005 proposed a better translation of Objective-C APIs into Swift and that affected NSData (or just Data now). Instead of writeToFile you'll have to use write(to:options:) (or even just write(to:)). Here is the documentation for the updated method.
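Applied to the saveImageToDisk() function from the question, a minimal sketch might look like this (assuming getDocumentsDirectory() returns a URL; write(to:options:) throws, so it needs try):

func saveImageToDisk() {
    guard let image = imageView.image,
          let imageData = UIImagePNGRepresentation(image) else { return }
    // Assumes getDocumentsDirectory() returns a URL rather than a String path.
    let fileURL = getDocumentsDirectory().appendingPathComponent("image.png")
    do {
        // .atomic replaces the old atomically: true parameter.
        try imageData.write(to: fileURL, options: .atomic)
    } catch {
        print("Failed to write image: \(error)")
    }
}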
I have a C char *cArray and its length, and I need to convert it to NSData.
I did it with:
var data: NSData? = NSData(bytesNoCopy: cArray, length: Int(length))
And it works. The problem is that it causes a memory leak. I don't know why, but I can see in the Allocations instrument that 64 bytes get malloc'd and are never freed when the function finishes or when I set the variable to nil.
This code gets called a lot, so I need it to be leak-free. What can I do to prevent the leak?
Edit: this is the code
func on_data_recv_fn(buf: UnsafeMutablePointer<CChar>, length: CInt, user_data: UnsafeMutablePointer<Void>) -> CInt {
    guard buf != nil else {
        NSLog("on_data_recv_fn buf is nil")
        return -1
    }

    //var data: NSData? = NSData(bytesNoCopy: buf, length: Int(length), freeWhenDone: true)
    var data: NSData? = NSData(bytesNoCopy: buf, length: Int(length))
    let succeededWriting = Int(PacketTunnelProvider.sendPackets(data!))
    data = nil
    return CInt(succeededWriting)
}
According to memory instruments, there is a leak here.
The sendPackets function does not hold on to the data, so the problem isn't there.
Edit: attached an image from instruments.
Well, it seems that if I use autoreleasepool, everything is OK for some reason.
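For illustration, a sketch of what that looks like in the callback above, keeping the question's original calls as-is; the explicit pool ensures the NSData is released as soon as the block ends rather than at some later, unpredictable pool drain:

func on_data_recv_fn(buf: UnsafeMutablePointer<CChar>, length: CInt, user_data: UnsafeMutablePointer<Void>) -> CInt {
    guard buf != nil else {
        NSLog("on_data_recv_fn buf is nil")
        return -1
    }
    var succeededWriting = -1
    autoreleasepool {
        // Same no-copy wrapper as in the question; it goes away when the pool drains.
        let data: NSData? = NSData(bytesNoCopy: buf, length: Int(length))
        succeededWriting = Int(PacketTunnelProvider.sendPackets(data!))
    }
    return CInt(succeededWriting)
}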
Memory management of types backed by Objective-C is a vast and interesting topic. See, for example, here:
https://developer.apple.com/library/ios/documentation/Swift/Conceptual/BuildingCocoaApps/WorkingWithCocoaDataTypes.html
You may also find this question useful:
Is it necessary to use autoreleasepool in a Swift program?
Also, I think there is a danger here if the buf passed to on_data_recv_fn was dynamically allocated by some C code, which later tries to free it. Another dangerous possibility: the function is a call-back implemented in Swift and called by C code. In this case the buf might be on the stack.
I haven't played with any of these scenarios, but according to NSData documentation, the bytesNoCopy initializer makes NSData take ownership of the memory and then de-allocate it; it assumes the memory was allocated using malloc(), so any memory that was not malloc'd should not be used to construct an NSData using this initializer. See https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSData_Class/#//apple_ref/occ/instm/NSData/initWithBytesNoCopy:length:
There are other NSData initializers that make a copy of the buffer and can be safer in those cases.
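For example, a one-line sketch of the copying alternative (NSData makes its own copy here, so ownership of buf stays with the C code):

// Copying initializer: safe even if the C side later frees or reuses buf.
let data = NSData(bytes: buf, length: Int(length))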
I'm trying to learn FFMpeg through this tutorial: http://dranger.com/ffmpeg/tutorial01.html
I was hoping that just translating the C code to Swift would get me up and running, but I guess I was mistaken.
I tried converting the following code:
AVFormatContext *pFormatCtx = NULL;
// Open video file
if(avformat_open_input(&pFormatCtx, argv[1], NULL, NULL)!=0) {}
to:
let pFormatCtx : UnsafeMutablePointer<UnsafeMutablePointer<AVFormatContext>> = nil
// Open video file
if avformat_open_input(pFormatCtx, path, nil, opaque) != 0 {}
This code breaks at if avformat_open_input(pFormatCtx, path, nil, opaque) != 0 {} with an EXC_BAD_ACCESS error.
Can anyone guess what's wrong here?
By the way, I have the FFmpeg library compiling without an issue, so I don't think there is a problem with the way I compiled or imported it. I'm probably passing the wrong arguments, I think :/ Any guesses?
First off, I'm using Swift 2 with Xcode 7.2...
The solution was to create the format context as an UnsafeMutablePointer<AVFormatContext> and then pass its address to the avformat_open_input function. Here's the code that worked for me:
var formatContext = UnsafeMutablePointer<AVFormatContext>()
if avformat_open_input(&formatContext, path, nil, nil) != 0 {
    print("Couldn't open file")
    return
}
Hope this helps.
The partial solution & background explanation can be found here: http://en.swifter.tips/pointer-memory/.
Basically, the UnsafeMutablePointer must be allocated before being used.
To make the code above work, try this:
let path = ...
let formatContext = UnsafeMutablePointer<UnsafeMutablePointer<AVFormatContext>>.alloc(1)
if (avformat_open_input(formatContext, path, nil, nil) != 0) {
    // TODO: Error handling
}
When you are done, do not forget to call formatContext.destroy().
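For completeness, a sketch of that cleanup under the same Swift 2-era assumptions (avformat_close_input is FFmpeg's own call for releasing the context that avformat_open_input allocated; destroy/dealloc then release the pointer storage obtained from alloc(1)):

// Let FFmpeg free the AVFormatContext it allocated.
avformat_close_input(formatContext)
// Then release the pointer storage itself.
formatContext.destroy()
formatContext.dealloc(1)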