Users who have little to no free storage space left on their device can't record a video in our app: once they start recording a video and the storage fills up, our app crashes.
We are using react-native-camera to record video, but it looks like this might fall into the CameraRoll module in React Native itself.
Is there a way to detect how much storage a user has available on their phone? We'd like to warn them if they are low on storage before they record their video.
This is a duplicate but I can't flag it as such since the answer isn't accepted.
You can use react-native-fs's getFSInfo() to get the totalSpace and freeSpace (both reported in bytes).
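If you ever need the same check from native code, getFSInfo() is, as far as I can tell, a thin wrapper over the volume attributes that FileManager exposes. A minimal Swift sketch, with an arbitrary 500 MB warning threshold:

```swift
import Foundation

// Native equivalent of react-native-fs's getFSInfo(): read the volume
// attributes for the app's home directory.
func volumeSpace() -> (free: Int64, total: Int64)? {
    guard
        let attrs = try? FileManager.default
            .attributesOfFileSystem(forPath: NSHomeDirectory()),
        let free = (attrs[.systemFreeSize] as? NSNumber)?.int64Value,
        let total = (attrs[.systemSize] as? NSNumber)?.int64Value
    else { return nil }
    return (free, total)
}

// Hypothetical threshold: warn the user if less than ~500 MB is free
// before letting them start a recording.
let minimumFreeBytes: Int64 = 500 * 1024 * 1024
if let space = volumeSpace(), space.free < minimumFreeBytes {
    print("Low storage: \(space.free) of \(space.total) bytes free")
    // Show your low-storage warning here instead of starting the camera.
}
```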
Related
Screen recording generates a huge amount of data and has a lot of limitations in use. I want to record the screen myself without using Apple's screen-recording API. Is there a solution or a related blog post to reference?
I am planning to develop an app that synchronizes all pictures taken by the iPhone camera.
I have searched quite a lot and can't find much about the hardware event for the camera shutter on the iPhone.
Is it possible, like Android's CAMERA_BUTTON BroadcastReceiver in the manifest, to listen for the camera button being pressed in general, without the app being specifically launched?
Or an overlay on the existing iOS camera app?
Update 02-05-2018
I didn't manage to get direct detection of the camera button, nor ongoing detection of pictures taken with the camera (PHPhotoLibraryChangeObserver). When the app is killed, all listeners are killed with it. I am, however, using this observer once the app has been woken up via the location-change mechanism.
In the end I used the Significant-Change Location Service to keep detecting changed pictures to synchronize, ongoing. I used NextCloud and OwnCloud as examples, which contain this mechanism.
Using the Significant-Change Location Service
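A minimal Swift sketch of that wake-up mechanism, assuming "Always" location authorization and the location background mode are configured:

```swift
import CoreLocation

// The significant-change location service relaunches a killed app on big
// location changes, giving it a chance to diff the Photos library.
final class SyncWaker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()
        manager.startMonitoringSignificantLocationChanges()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // The app is now running (possibly relaunched in the background):
        // check the Photos library for new pictures and synchronize them.
    }
}
```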
Capturing images and videos is entirely a software process, managed by classes in the AVFoundation framework. The iPhone's hardware is not accessible to applications, and you cannot monitor the use of the hardware directly. There are some system frameworks, but they won't help you here: AVFoundation doesn't post any notifications to registered observers.
All captured images and videos are placed in the Photos library, and the Photos library posts notifications when something in it changes. You can register your application as an observer for changes in the Photos library and specify the changes you want to observe. You can also collect the specific changes that have happened and have your application handle them.
What I don't know is whether you can use this as a remote change and have your app launched by iOS when the change is registered in the Photos library. I do know that you can program your app to launch on receipt of notifications, but I don't know whether this works with this change observer. I would suggest giving it a try.
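Here is a minimal Swift sketch of such an observer, assuming photo-library read permission has already been granted:

```swift
import Photos

final class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    // Keep a fetch result so each change can be diffed against it.
    private var assets = PHAsset.fetchAssets(with: .image, options: nil)

    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    // Called on a background queue whenever the library changes.
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: assets) else { return }
        assets = details.fetchResultAfterChanges
        for asset in details.insertedObjects {
            print("New asset: \(asset.localIdentifier)") // e.g. queue it for sync
        }
    }
}
```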
Hope this helps.
I need to develop an app that uses Vuforia cloud recognition of an object and then displays a video on top of that object. The video file needs to be downloaded from the internet via a separate web service, using the recognized object's identifier.
I looked at the Vuforia samples and was able to configure Cloud Recognition to use the correct target manager database; objects are recognized correctly. But I don't know how to show a loader view once an object is detected, and then display the video when it is ready to play. I don't know where and what to update in the code. I only found that a local dataset can be used, but I can't use a local dataset, because the videos I want to display are supposed to be downloaded from the internet after detection.
Can someone direct me to where in the Vuforia examples I can change what is shown on the target?
Vuforia cloud is only for markers.
As per your requirements:
1. You need to use native iOS code to download the video.
2. After that, you have to show the video using AVPlayer (see the sketch below).
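A minimal Swift sketch of step 2, assuming a hypothetical web service that maps target identifiers to video URLs:

```swift
import AVFoundation
import AVKit
import UIKit

// After Vuforia reports a recognized target, build the video URL from the
// target's identifier and hand it to AVPlayer. The web-service URL is
// hypothetical; AVPlayer streams the file progressively, so a separate
// download step is only needed if you want offline caching.
func playVideo(forTargetID targetID: String, from presenter: UIViewController) {
    guard let url = URL(string: "https://example.com/videos/\(targetID).mp4") else { return }
    // Show your loader view here; hide it once playback starts.
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: url)
    presenter.present(controller, animated: true) {
        controller.player?.play()
    }
}
```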
I'm making an iOS app in which I'd like to allow the user to save an audio file (a specific file the app uses internally, not just any arbitrary audio file) to their music library so they can play it from other apps on the device. Ideally I'd like to save the sound directly to the user's music library, but it seems from other similar questions that this is not possible. File sharing with iTunes seems to be the next best solution.
Is there anything about using the iTunes file sharing option for saving audio in this way that violates the app store terms?
Is this the path of least friction for the user, or is there another way to achieve this that I'm missing?
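For what it's worth, iTunes file sharing only exposes the app's Documents directory, and is enabled with the UIFileSharingEnabled key in Info.plist. A minimal Swift sketch of copying a bundled audio file there, with "clip.m4a" as a hypothetical resource name:

```swift
import Foundation

// With UIFileSharingEnabled set to YES in Info.plist, everything in the
// Documents directory becomes visible in iTunes file sharing.
func exportAudioForFileSharing() throws {
    guard let source = Bundle.main.url(forResource: "clip", withExtension: "m4a")
    else { return }
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let destination = documents.appendingPathComponent(source.lastPathComponent)
    if !FileManager.default.fileExists(atPath: destination.path) {
        try FileManager.default.copyItem(at: source, to: destination)
    }
}
```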
I want to create a custom video player with some more features and settings.
So my questions are:
1) Can I trigger my app when the user taps any video?
2) How can I get a list of all videos saved on the iPhone (not in the camera roll)? I want to play them inside the video player I created.
Please guide me, thanks.
I don't think you can intercept all video playback; external apps tend to use Apple's native video support.
The only exception is if you register your application to open certain file types; other applications can then let the user open a file in an external app such as yours. But this requires the other apps to cooperate and implement "Open In" functionality, and very few apps offer it because video is supported natively.
See here: Apple Documentation
The filesystem is locked down; if a video isn't in the camera roll or shared via "Open In", you can't access it.
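If you do go the "Open In" route, the file type is declared under CFBundleDocumentTypes in Info.plist, and incoming files arrive through the app delegate. A minimal Swift sketch:

```swift
import UIKit

// This method belongs in your UIApplicationDelegate; it is called when
// another app hands your app a file via "Open In".
func application(_ app: UIApplication,
                 open url: URL,
                 options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
    // `url` points at the incoming file (typically copied to Documents/Inbox).
    // Hand it to your custom video player here.
    return true
}
```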