Hi, I am using the Sync API to download images from Dropbox and to sync my app data to Dropbox. Now I want to download images from Dropbox with a progress bar. I tried the following, but I never get any progress value, and the file downloads with the warning shown below. This is my code:
DBFile *orignalImg = [[DBFilesystem sharedFilesystem] openFile:imgInfo.imgPath error:nil];
__weak DBFile *oFile = orignalImg;
if (oFile)
{
    [orignalImageArray addObject:oFile]; // keep a reference to the file
}
if (orignalImg.status.cached)
{
    // already displayed
}
else
{
    [orignalImg addObserver:self block:^(void)
    {
        DBFileStatus *fileStatus = oFile.status;
        DBFileStatus *newerStatus = oFile.newerStatus;
        UIImage *aImage = [UIImage imageWithData:[oFile readData:nil]];
        if (fileStatus.cached) // if image downloaded
        {
            // display image
        }
        else if (fileStatus.state == DBFileStateDownloading) // show progress bar
        {
            // show progress
            [self showProgress:strPath andProgressValue:fileStatus.progress];
        }
        else if (newerStatus && newerStatus.state == DBFileStateDownloading) // show progress bar
        {
            [self showProgress:strPath andProgressValue:fileStatus.progress];
        }
    }];
}
The warning is: dropbox_file_wait_for_ready should not be called on the main thread
@rmaddy is right... calling readData before the file has finished downloading will cause that call to block, and so you won't see any progress. (That's also presumably causing the warning.)
If you don't do that, you should be able to see progress as the file downloads, but it looks like you haven't implemented that part yet.
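For reference, here is a minimal sketch of how the observer block could look, assuming the same strPath variable and showProgress:andProgressValue: helper from the question. The idea is to call readData: only once the file is cached, so nothing blocks the main thread:
__weak typeof(self) weakSelf = self;
[orignalImg addObserver:self block:^{
    DBFileStatus *fileStatus = oFile.status;
    DBFileStatus *newerStatus = oFile.newerStatus;
    if (fileStatus.cached) {
        // Download finished; it is now safe to read the data.
        UIImage *aImage = [UIImage imageWithData:[oFile readData:nil]];
        // ...display aImage...
        [oFile removeObserver:weakSelf];
    } else if (fileStatus.state == DBFileStateDownloading) {
        [weakSelf showProgress:strPath andProgressValue:fileStatus.progress];
    } else if (newerStatus.state == DBFileStateDownloading) {
        [weakSelf showProgress:strPath andProgressValue:newerStatus.progress];
    }
}];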
I am creating a video in ARKit during the session. When I press the record button, the camera freezes. The code I wrote in the didUpdateFrame delegate causes the problem: there I save scene.snapshot into an array. Also, when I create the video from these images, the app crashes with the following message in the debugger:
Message from debugger: Terminated due to memory issue
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame
{
    if (_recordButton.state == UIControlStateSelected)
    {
        currentState = Recording;
        [self saveImage];
    }
    else if (previousState == Recording)
    {
        NSLog(@"Stop recording");
        currentState = NotRecording;
        recordTime = NULL;
        self.nextButton.enabled = YES;
    }
    // update recording state per frame update
    previousState = currentState;
}

- (void)saveImage
{
    UIImage *image = self.sceneView.snapshot;
    [self.bufferArray addObject:image];
    image = nil;
}
Do not use ARSCNView.snapshot from inside ARSessionDelegate.didUpdateFrame. I had the same issue, and the solution was to not implement ARSessionDelegate.didUpdateFrame at all. I used CADisplayLink together with ARSCNView.snapshot instead, and it works well.
I also tried ARFrame.capturedImage, but it does not contain the AR objects at all; ARSCNView.snapshot does contain them.
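For what it's worth, here is a minimal sketch of the CADisplayLink approach under the same assumptions as the question (a sceneView and a bufferArray); the startCapturing, stopCapturing, and captureFrame: names are hypothetical:
- (void)startCapturing {
    // Drive snapshotting from the display link instead of the ARSession delegate.
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(captureFrame:)];
    self.displayLink.preferredFramesPerSecond = 30; // capture at most 30 fps
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)captureFrame:(CADisplayLink *)link {
    // The snapshot is taken outside the ARSession delegate, so it no longer
    // stalls frame delivery.
    UIImage *image = self.sceneView.snapshot;
    [self.bufferArray addObject:image];
}

- (void)stopCapturing {
    [self.displayLink invalidate];
    self.displayLink = nil;
}
Note that the snapshots still accumulate in memory, so you may also want to drain bufferArray to the video writer periodically instead of keeping every frame; otherwise the memory termination from the question can still occur.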
I have this camera app.
I take the image from the camera, process it with a filter and at some point inside captureOutput:didOutputSampleBuffer:fromConnection: I take the final image and write it to a file using this:
CFRetain(sampleBuffer);
[myself writeToVideoImage:resultadoFinal
withSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
It works wonderfully, but if I put it inside a queue, like this:
CFRetain(sampleBuffer);
dispatch_async(_writeToVideoQueue, ^{
[myself writeToVideoImage:resultadoFinal
withSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
});
it crashes on the line
[_assetWriter startSessionAtSourceTime:presentationTime];
of
- (void)writeToVideoImage:(CIImage *)resultadoFinal
         withSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CFRetain(sampleBuffer);
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CGRect extent = [resultadoFinal extent];

    if (!_readyToWriteVideo) {
        _readyToWriteVideo = [self setupAssetWriterVideoInputWithSize:extent.size];
        return;
    }
    else if (_videoWritingStarted) {
        _videoWritingStarted = NO;
        // ***** CRASHES HERE ************
        [_assetWriter startSessionAtSourceTime:presentationTime];
    }

    CVPixelBufferRef renderedOutputPixelBuffer = NULL;
    OSStatus err = CVPixelBufferPoolCreatePixelBuffer(NULL,
                                                      _pixelBufferAdaptor.pixelBufferPool,
                                                      &renderedOutputPixelBuffer);
    if (err) return;

    [_ciContext render:resultadoFinal
       toCVPixelBuffer:renderedOutputPixelBuffer
                bounds:extent
            colorSpace:_sDeviceRgbColorSpace];

    [self writeToFile:renderedOutputPixelBuffer
        comSampleTime:presentationTime
                 size:extent.size];

    CFRelease(renderedOutputPixelBuffer);
    CFRelease(sampleBuffer);
}
I don't have a clue what is going on.
It appears that something is being deallocated. I first suspected sampleBuffer was being deallocated, so I retain it both inside and outside the function, just in case. I have also tried creating a copy of resultadoFinal inside the block, before calling the method, with no success.
Xcode shows the error:
[AVAssetWriter startSessionAtSourceTime:] Cannot call method when status is 0
There are questions on SO about that. I have tried all suggestions without success.
Any ideas?
The error "Cannot call method when status is 0" sounds like something isn't initialized properly - remember when you use dispatch_async you are running on a different thread, so you need to initialize your AVAssetWriter on that thread.
At the first start of the app I want to download all files from the server, and I want downloading to continue even when the user leaves the app (it's not in the foreground). The files I need to download are thumbnails, photos in original size, other files, and videos, and I want to download them in that order.
I am using Alamofire and I set up the session manager:
let backgroundManager: Alamofire.SessionManager = {
let bundleIdentifier = "com....."
return Alamofire.SessionManager(
configuration: URLSessionConfiguration.background(withIdentifier: bundleIdentifier + ".background")
)
}()
Then I am using it like this:
self.backgroundManager.download(fileUrl, to: destination)
.downloadProgress { progress in
//print("Download Progress: \(progress.fractionCompleted)")
}
.response(completionHandler: result)
This is in my downloadPhoto method, which I call like this:
for item in items {
self.downloadPhoto(item: item, isThumbnail: true, shouldReloadData: false, indexPath: nil)
self.downloadPhoto(item: item, isThumbnail: false, shouldReloadData: false, indexPath: nil)
}
Then I could add calls for file and video downloads and so on. But all these requests have the same priority, and I would like to download thumbnails first (because that's what the user sees first), then full-size images, and only after all images are downloaded the files and videos. Everything must go through a queue, because if the user starts the app, puts it in the background, and leaves it for several hours, everything must still get downloaded. Is this possible? And how can I do it?
I looked at Alamofire's companion library AlamofireImage, which has priority-based downloading, but images are only part of the files I want to prioritize. Thanks for the help.
Have a look at TWRDownloadManager.
It uses NSURLSessionDownloadTask and also supports background modes.
What you need to do is:
1. Set HTTPMaximumConnectionsPerHost to 1 to make sure downloads happen serially:
NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration defaultSessionConfiguration];
configuration.timeoutIntervalForRequest = 30.0;
configuration.HTTPMaximumConnectionsPerHost = 1; // Note this
2. The following method loops over the download list and downloads the media one by one:
- (void)downloadDocuments
{
    for (int i = 0; i < [arrDownloadList count]; i++)
    {
        Downloads *download = [arrDownloadList objectAtIndex:i];
        NSString *fileURL = download.documentURL;
        [[TWRDownloadManager sharedManager] downloadFileForURL:fileURL
                                                      withName:[fileURL lastPathComponent]
                                              inDirectoryNamed:kPATH_DOC_DIR_CACHE_DOCUMENTS
                                               completionBlock:^(BOOL completed)
        {
            if (completed)
            {
                /* Do some task like updating the download flag in the database, or whatever you want */
                downloadIndex++;
                if (downloadIndex < [arrDownloadList count]) {
                    [self updateDownloadingStatus];
                }
                else {
                    [self allDownloadCompletedWithStatus:TRUE];
                }
            }
            else
            {
                /* Cancel the download */
                [[TWRDownloadManager sharedManager] cancelDownloadForUrl:fileURL];
                downloadIndex++;
                if (downloadIndex < [arrDownloadList count]) {
                    [self updateDownloadingStatus];
                }
                else {
                    [self allDownloadCompletedWithStatus:TRUE];
                }
            }
        } enableBackgroundMode:YES];
    }
}
Don't forget to enable background mode: enableBackgroundMode:YES
3. Enable Background Modes in Xcode.
4. Add the following method to your AppDelegate:
- (void)application:(UIApplication *)application handleEventsForBackgroundURLSession:(NSString *)identifier completionHandler:(void (^)())completionHandler{
[TWRDownloadManager sharedManager].backgroundTransferCompletionHandler = completionHandler;
}
If you do this, the downloads will happen serially and will continue even if the app is in the background or the user locks the device.
Please add a comment if you have any questions or need more help with this.
I believe the standard iOS URLSessionDownloadTask will help you. It has a priority property, which sounds like what you need.
Here is the official documentation.
Here is a tutorial.
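As a rough sketch (the session identifier and the thumbnailURL/fullSizeURL variables are placeholders), per-task priority looks like this; note that priority is only a hint to the loading system, not a strict ordering guarantee:
NSURLSessionConfiguration *config =
    [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"com.example.background"];
// A background session also needs a delegate to receive completion callbacks.
NSURLSession *session = [NSURLSession sessionWithConfiguration:config
                                                      delegate:self
                                                 delegateQueue:nil];

NSURLSessionDownloadTask *thumbTask = [session downloadTaskWithURL:thumbnailURL];
thumbTask.priority = NSURLSessionTaskPriorityHigh;   // thumbnails first
[thumbTask resume];

NSURLSessionDownloadTask *photoTask = [session downloadTaskWithURL:fullSizeURL];
photoTask.priority = NSURLSessionTaskPriorityLow;    // full-size images later
[photoTask resume];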
Does anybody know how to set up a progress view for a PFImageView that is downloading a PFFile (from Parse.com)? I can't find any tutorials on how to do it. (I don't want to use an activity indicator, since I'm loading an image and I want the user to know how long he/she will have to wait.)
Thanks!
PFImageView's methods are quite limited: you can only use loadInBackground, with or without a completion block. None of these methods offers a progressBlock.
A workaround is not to assign the PFFile to a PFImageView, but instead to load it with a method that takes a progressBlock parameter:
[myImageFile getDataInBackgroundWithBlock:^(NSData *result, NSError *error)
{
    if (result != nil && error == nil)
    {
        [myImageView setImage:[UIImage imageWithData:result]];
    }
    else
    {
        // handle the error
    }
}
progressBlock:^(int percentDone)
{
    [progressView setProgress:(float)percentDone / 100.0f];
}];
This is not as simple as [myPFImageView loadInBackground], but I can't see any other way to get a progress indicator. Note that a PFImageView is no longer necessary in this case; a plain UIImageView is sufficient.
Maybe one day the Parse team will add a progressBlock to their PFFile methods!
My solution: show a loader.
In an extension to PFImageView, use the following code to show a loader:
let loader = UIActivityIndicatorView(frame: CGRectMake(self.frame.width/2 - 5, self.frame.height/2 - 5, 10, 10))
loader.activityIndicatorViewStyle = UIActivityIndicatorViewStyle.Gray
loader.startAnimating()
self.addSubview(loader)
loadInBackground { (image: UIImage?, error: NSError?) -> Void in
    loader.stopAnimating()
    if error == nil {
        self.image = image
    }
}
I have an application in which the user can select from local video files. When one of those thumbnails is tapped, the user is presented a new view containing a custom video player I've made that plays the video.
This works flawlessly, but only sometimes. The funny thing is that if the user selects a new video (thus being presented a new view and initializing a new custom video player object) exactly five times, the underlying AVPlayerLayer used to present the visuals renders black, even though the underlying asset still seems to load correctly (the player interface still shows the correct duration for the video and so forth).
When a new custom media player object is initialized (which happens when the view controller for the media player's containing view is loaded), this is the part of the initializer method that sets up the AVPlayer and its associated item:
// Start to load the specified asset
mediaAsset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
if (mediaAsset == nil)
    NSLog(@"The media asset is zero!!!");

// Now we need to asynchronously load the tracks of the specified asset (like audio and video tracks). We load them asynchronously to avoid having the entire app UI freeze while loading occurs
NSString *keyValueToLoad = @"tracks";

// When loading the tracks asynchronously we also specify a completionHandler, which is the block of code that should be executed once the loading has either completed or failed for some reason
[mediaAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:keyValueToLoad]
                          completionHandler:^
{
    // When this block gets executed we check for potential errors or see if the asset loaded successfully
    NSError *error = nil;
    AVKeyValueStatus trackStatus = [mediaAsset statusOfValueForKey:keyValueToLoad error:&error];
    if (error != nil)
    {
        NSLog(@"Error: %@", error.description);
    }

    if (trackStatus == AVKeyValueStatusLoaded)
    {
        NSLog(@"Did load properly!");
        mediaItem = [AVPlayerItem playerItemWithAsset:mediaAsset];
        if (mediaItem.error == nil)
            NSLog(@"Everything went fine!");
        if (mediaItem == nil)
            NSLog(@"THE MEDIA ITEM WAS NIL");
        [mediaItem addObserver:self forKeyPath:@"status" options:0 context:&itemStatusContext];
        mediaContentPlayer = [[AVPlayer alloc] initWithPlayerItem:mediaItem];
        [mediaContentView setPlayer:mediaContentPlayer];
        //mediaContentView = [AVPlayerLayer playerLayerWithPlayer:mediaContentPlayer];
        [activeModeViewBlocked configurePlaybackSliderWithDuration:mediaItem.duration];
        originalDuration = mediaItem.duration.value / mediaItem.duration.timescale;

        // We will subscribe to a timeObserver on the player to check the current playback time of the movie at a specified interval. Doing so allows us to frequently update the user interface with correct information
        playbackTimeObserver = [mediaContentPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 50) queue:dispatch_get_main_queue() usingBlock:^(CMTime time)
        {
            NSLog(@"TIME UPDATED!");
            [activeModeViewBlocked updatePlaybackSlider:time];
        }];
        [self syncUI];
    }
    if (trackStatus == AVKeyValueStatusFailed)
    {
        NSLog(@"Something failed!");
    }
    if (trackStatus == AVKeyValueStatusCancelled)
    {
        NSLog(@"Something was cancelled!");
    }
}];
Now, if I initialize this custom media player object exactly five times, it always starts to render black screens.
Does anyone have any idea why this could be happening?
This bit me too. There is a limit on the number of concurrent video players that AVFoundation will allow. That number is four for iOS 4.x; more recently the number seems to have increased (for example, on iOS 7 I've had up to eight on one screen with no issue). That is why it goes black on the fifth one. You can't even assume you'll get four, as other apps may need a 'render pipeline'.
This API causes a render pipeline:
+[AVPlayer playerWithPlayerItem:]
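One way to stay under the limit, sketched under the assumption that the view controller owns the ivars shown in the question (mediaContentPlayer, mediaItem, mediaContentView, playbackTimeObserver), is to tear the player down when its view goes away so the render pipeline is released before the next one is created; tearDownPlayer is a hypothetical name:
- (void)tearDownPlayer {
    [mediaContentPlayer pause];
    if (playbackTimeObserver) {
        [mediaContentPlayer removeTimeObserver:playbackTimeObserver];
        playbackTimeObserver = nil;
    }
    [mediaItem removeObserver:self forKeyPath:@"status"];
    [mediaContentView setPlayer:nil];   // detach the AVPlayerLayer from the player
    mediaContentPlayer = nil;
    mediaItem = nil;
}
Calling this from viewDidDisappear: (or wherever the player screen is dismissed) releases the pipeline each time, so the fifth presentation no longer renders black.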