Flutter won't initialise a video from disk - iOS

I've downloaded a video to disk but it won't initialise using Video Player (https://pub.dev/packages/video_player).
final future = Downloader.shared
    .getVideoPathFor(
      url: url,
      themeName: themeName,
    )
    .then(
      (value) {
        dLog('file path: $value');
        final file = File(value);
        final videoController = VideoPlayerController.file(file);
        return videoController.initialize().then(
          (value) {
            dLog('video controller initialized');
            return videoController;
          },
        );
      },
    );
It downloads the file fine and the file path becomes something like:
Application/9E6FD935-A424-4C1E-99CC-D5834448E45E/Library/Caches/videos/Clean water/clean_water_video.mp4
So I know the file exists and I can see it in Finder if I run this on Simulator.
If I use VideoPlayerController.contentUri() it appears to work, but it tells me that constructor is only allowed on Android, not on iOS.
I know I can use VideoPlayerController.network(), but I can't keep downloading the same video over and over across multiple screens like this.
What am I doing wrong?
Even when I do load the video like this (which I got from this video: https://youtu.be/uz4xRnE-UIw?t=596):
final filePath =
    "<path>/Caches/videos/Clean water/clean_water_video.mp4";
final file = File(filePath);
controller = VideoPlayerController.file(file)
  ..addListener(() {
    setState(() {});
  })
  ..setLooping(true)
  ..initialize().then((value) {
    controller.play();
  });
the initialization never completes.

It turns out that the player can't load a file if any folder in the path has a space in its name (for example the 'Clean water' folder above).
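A possible workaround is to sanitise the folder name before downloading, so the cached path never contains spaces. Below is a minimal Dart sketch, not part of the original answer; safeFolderName is a hypothetical helper and themeName is the variable from the question's snippet.
// Hypothetical helper: replace spaces (and other unsafe characters)
// with underscores, so 'Clean water' becomes 'Clean_water'.
String safeFolderName(String themeName) {
  return themeName.replaceAll(RegExp(r'[^A-Za-z0-9._-]'), '_');
}

// Use the sanitised name when building the download destination, so the
// resulting path (.../Caches/videos/Clean_water/clean_water_video.mp4)
// has no spaces and VideoPlayerController.file can open it.
final folder = safeFolderName(themeName);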

How to wait for code to be executed before executing next code in Dart?

I am working on a wallpaper app in Flutter & Dart. Currently I'm working on the set wallpaper button, where I need to check if the wallpaper file exists, download it if needed, and then change the wallpaper.
This is what I have right now and I think I've done it right. Please note that I'm an Android Java developer with only about 6 months of experience, so I'm past the basics in Dart but not much more.
DOWNLOAD WALLPAPER FUNCTION
static Future<int> downloadWallpaperFile(int wallpaperID,
    {String path}) async {
  ///Prepare a url for downloading the wallpaper using the getWallpaperURL method and passing in fullSizedWallpaper string constant
  String url = getWallpaperURL(WallpaperSize.fullWallpaper, wallpaperID);
  ///Log output
  print('CallingDownloadWallpaper : ' + url);
  ///Visual Feedback
  wallpaperDetailsPageScaffoldGlobalKey.currentState.showSnackBar(
      new SnackBar(content: new Text('Starting Wallpaper Download...')));
  ///Start downloading the wallpaper file from the url
  var data = http.readBytes(url);
  ///After download is completed
  data.then((buffer) async {
    ///If filePath is not passed in as parameter
    if (path == null) {
      ///Use getPathForWallpaperFile to get a path for a wallpaper file
      path = await getPathForWallpaperFile(url);
    }
    ///Create a new file at the path, the path also includes the name of the file which is the id of the wallpaper
    File newFile = new File(path);
    ///Get write access to the newly created wallpaper file
    RandomAccessFile rf = newFile.openSync(mode: FileMode.write);
    ///Write the downloaded data to the file synchronously
    rf.writeFromSync(buffer);
    ///Save the file to the disk synchronously
    rf.flushSync();
    ///Close access to file synchronously
    rf.closeSync();
    ///Log output
    print('DownloadWallpaperResult : Complete');
    ///Visual Feedback
    wallpaperDetailsPageScaffoldGlobalKey.currentState.showSnackBar(
        new SnackBar(content: new Text('Wallpaper Download Complete')));
  });
  return 0;
}
SET WALLPAPER FUNCTION
static setWallpaper(int wallpaperID) async {
  ///Prepare variables for setting wallpaper and download the wallpaper as well (if needed)
  String url = getWallpaperURL(WallpaperSize.fullWallpaper, wallpaperID);
  String path = await getPathForWallpaperFile(url);
  bool fileExists = checkIfFileExists(path);
  ///If wallpaper file does not exist then download it
  if (fileExists == false) {
    ///Download wallpaper then change wallpaper
    await downloadWallpaperFile(wallpaperID, path: path).then((result) {
      ///Check if download was successful
      if (result == 0) {
        ///Change wallpaper
        AndroidInterface.setWallpaper(path);
      }
    });
  } else {
    ///Wallpaper already downloaded
    ///Change wallpaper
    AndroidInterface.setWallpaper(path);
  }
}
The problem is that you are using then, which is non-blocking (basically the old way to use Futures without await), so downloadWallpaperFile returns 0 before the download has actually finished.
Instead, use await:
static Future<int> downloadWallpaperFile(int wallpaperID, {String path}) async {
  // ...
  // Start downloading the wallpaper file from the url
  final buffer = await http.readBytes(url);
  // After download is completed
  // If filePath is not passed in as parameter
  if (path == null) {
    // Use getPathForWallpaperFile to get a path for a wallpaper file
    path = await getPathForWallpaperFile(url);
  }
  // ...
  return 0;
}
Btw, /// is reserved for documentation on classes and fields, use // for in-method comments!
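A tiny illustration (hypothetical function, just to show the two comment styles):
/// Documentation comment: describes what doubleIt does, for readers and tooling.
int doubleIt(int x) {
  // In-method comment: explains a step inside the body.
  return x * 2;
}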
I'm also not sure it is a good idea to use synchronous IO calls: they will probably block the UI of the app. It would be better to use the async IO API (again with await).
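As a minimal sketch of that async alternative (assuming the same buffer and path as in the function above; saveWallpaperBytes is just an illustrative name):
import 'dart:io';
import 'dart:typed_data';

/// Async replacement for the openSync/writeFromSync/flushSync/closeSync block.
Future<void> saveWallpaperBytes(String path, Uint8List buffer) async {
  final newFile = File(path);
  // writeAsBytes is asynchronous; flush: true persists the data before
  // the Future completes, so no explicit flush or close calls are needed.
  await newFile.writeAsBytes(buffer, flush: true);
}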

React-native, how to get file-asset image absolute path?

I'm doing some image manipulation on ios on react-native.
The problem is one of the libraries I'm using only supports absolute paths, but I only have the file-asset uri.
Example
I have:
assets-library://asset/asset.HEIC?id=CE542E92-B1FF-42DC-BD89-D61BB70EB4BF&ext=HEIC
I need:
file:///Users/USERNAME/Library/Developer/CoreSimulator/Devices/########-####-####-####-############/data/Containers/Data/Application/########-####-####-####-############/Documents/########-####-####-####-############.jpg
Is there any way to easily get the image absolute path?
This is what I ended up doing, based on @ospfranco's answer.
I saved a copy of the asset in the temp folder. I also included a little snippet to generate a random string for the file name.
import RNFS from 'react-native-fs';

getAssetFileAbsolutePath = async (assetPath) => {
  // Copy the asset into the temp directory under a random .jpg name.
  const dest = `${RNFS.TemporaryDirectoryPath}${Math.random().toString(36).substring(7)}.jpg`;
  try {
    const absolutePath = await RNFS.copyAssetsFileIOS(assetPath, dest, 0, 0);
    console.log(absolutePath);
    return absolutePath;
  } catch (err) {
    console.log(err);
  }
};
So, the reason why you only get a URL is that the image might not be stored on the device (it could be in iCloud). iOS silently downloads the asset for you once you do any operation on it.
That will not help you if you are really trying to manipulate the image from your react-native code though, so here is one workaround:
import RNFS from 'react-native-fs';

getAssetFileAbsolutePath = async (assetPath) => {
  const dest = `${RNFS.TemporaryDirectoryPath}${Math.random().toString(36).substring(7)}.jpg`;
  try {
    const absolutePath = await RNFS.copyAssetsFileIOS(assetPath, dest, 0, 0);
    return absolutePath;
  } catch (err) {
    // ...
  }
};
Bear in mind this copies the file to a temporary directory, which means it is not permanent; you can also copy it to your application's document directory (RNFS.DocumentDirectoryPath) instead.
I got it to work using RNFS, but I had to prepend a little 'extra' (the file:// scheme) to the URI path.
<TouchableHighlight
  onPress={async () => {
    const destPath = RNFS.CachesDirectoryPath + '/MyPic.jpg';
    try {
      await RNFS.copyAssetsFileIOS(imageUri, destPath, 0, 0);
      console.log('destPath', destPath);
    } catch (error) {
      console.log(error);
    }
    navigation.navigate('SelectedPicture', {
      uri: 'file://' + destPath,
    });
  }}>
  <Image source={{uri: imageUri}} style={styles.image} />
</TouchableHighlight>
The question is old, but I'm answering it to help people like me who are new to React Native and run into the same issue.
I was struggling with this while trying to get images from the camera roll and process them with an OCR library. I was using react-native-photo-framework to get the images, and I found that you can get a fileUrl for an asset with the getImageMetadata method. I needed this fileUrl because the original URI, which has the format 'photo://...', wasn't recognized as a valid URL by the OCR library. I haven't tested it with real devices or with iCloud assets yet. Example:
const statusObj = await RNPhotosFramework.requestAuthorization();
if (statusObj.isAuthorized) {
  const assetList = await RNPhotosFramework.getAssets({
    includeMetadata: true,
    includeResourcesMetadata: true,
    fetchOptions: {
      mediaTypes: ['image'],
      sourceTypes: ['userLibrary', 'cloudShared', 'itunesSynced'],
    },
  });
  const asset = assetList.assets[0];
  const metadata = await asset.getImageMetadata();
  const uri = metadata.imageMetadata.fileUrl;
}

How to play a custom sound in Flutter?

I was able to play a simple sound with this line of code:
SystemSound.play(SystemSoundType.click);
How can I play a customized sound?
Let's say a short mp3
A simple solution for playing a file already bundled in your assets is AudioCache.
Library: https://pub.dartlang.org/packages/audioplayers.
More about AudioCache
After adding the library to pubspec.yaml, import the required class:
import 'package:audioplayers/audio_cache.dart';
Then declare the asset in the same pubspec.yaml and put the sound file in the assets folder (if you don't have this folder, create it):
assets:
  - assets/sound_alarm.mp3
then add this code:
static AudioCache player = new AudioCache();
const alarmAudioPath = "sound_alarm.mp3";
player.play(alarmAudioPath);
An example here
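Putting those pieces together, a minimal sketch (assuming the asset declared above and the pre-1.0 audioplayers API, where AudioCache resolves file names relative to the assets/ folder):
import 'package:audioplayers/audio_cache.dart';

final AudioCache player = AudioCache();
const alarmAudioPath = 'sound_alarm.mp3';

void playAlarm() {
  // AudioCache copies the asset to a local temp file on first use and plays it.
  player.play(alarmAudioPath);
}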
Thanks for checking out Flutter!
Flutter SDK today (as of May 5, 2017) doesn't have built-in support to play and control arbitrary audio. However, we designed our plugin system to support it.
This plugin adds audio support to Flutter: https://pub.dartlang.org/packages/audioplayer
From the plugin's README:
Future play() async {
  final result = await audioPlayer.play(kUrl);
  if (result == 1) setState(() => playerState = PlayerState.playing);
}

// add an isLocal parameter to play a local file
Future playLocal() async {
  final result = await audioPlayer.play(kUrl, isLocal: true);
  if (result == 1) setState(() => playerState = PlayerState.playing);
}

Future pause() async {
  final result = await audioPlayer.pause();
  if (result == 1) setState(() => playerState = PlayerState.paused);
}

Future stop() async {
  final result = await audioPlayer.stop();
  if (result == 1) {
    setState(() {
      playerState = PlayerState.stopped;
      position = new Duration();
    });
  }
}
The audioplayers package works (from https://medium.com/@bennett4/adding-custom-sound-effects-to-a-flutter-mobile-app-41594f1f3305):
(1) Add the library to your pubspec.yaml: audioplayers: ^0.15.1
(2) In pubspec.yaml under flutter add the reference to your assets file:
flutter:
  assets:
    - assets/yes.mp3
MAKE SURE it is under the assets folder. It does not work when it is in a subfolder. For example, something like: - assets/sounds/yes.mp3 will not work. Just put your audio file in the assets folder, not in its subfolder
(3) import the library in your app: import 'package:audioplayers/audioplayers.dart';
(4) then define this function:
Future<AudioPlayer> playLocalAsset() async {
  AudioCache cache = new AudioCache();
  // At the next line, DO NOT pass the entire reference such as assets/yes.mp3. This will not work.
  // Just pass the file name only.
  return await cache.play("yes.mp3");
}
(5) call the function whenever you need to play a sound: await playLocalAsset();
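For instance, a hypothetical widget that wires the function from step (4) to a button:
import 'package:flutter/material.dart';

class PlaySoundButton extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      // Calls the playLocalAsset() function defined in step (4).
      onPressed: () async => await playLocalAsset(),
      child: Text('Play sound'),
    );
  }
}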
Null-safe code:
Add the dependency to your pubspec.yaml file:
dependencies:
  audioplayers: ^0.19.0
Add the audio file path to your pubspec.yaml file:
flutter:
  assets:
    - assets/audio/my_audio.mp3
Run flutter pub get
Full code:
class HomePage extends StatelessWidget {
  final AudioCache _audioCache = AudioCache(
    prefix: 'audio/',
    fixedPlayer: AudioPlayer()..setReleaseMode(ReleaseMode.STOP),
  );

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: ElevatedButton(
        onPressed: () => _audioCache.play('my_audio.mp3'),
        child: Text('Play Audio'),
      ),
    );
  }
}
[Answer updated: this approach doesn't work, see comments]
You can use the video_player plugin maintained by the Flutter team. It can play many kinds of media across platforms, including sound files. More specifically, you may want to use the VideoPlayerController class.
e.g. _controller = VideoPlayerController.network('https://www.example.com/soundsFile.wav');
_controller.play();

Audio Blob not working in iOS / Safari

I am recording audio, sending it as a blob to a nodejs server. The nodejs server then sends it to all connected users that are not currently recording.
Sending the blob:
mediaRecorder.onstop = function(e) {
  var blob = new Blob(this.chunks, { 'type' : 'audio/ogg; codecs=opus' });
  socket.emit('radio', blob);
};
Server receiving the blob:
socket.on('radio', function(blob) {
  socket.broadcast.emit('voice', blob);
});
Listener receiving the blob:
socket.on('voice', function(arrayBuffer) {
  var blob = new Blob([arrayBuffer], { 'type' : 'audio/ogg; codecs=opus' });
  var audio = document.getElementById('audio');
  audio.src = window.URL.createObjectURL(blob);
  audio.play();
});
This works in all browsers/devices except Safari and any iOS device. Taking it further with Safari's inspector, I found blob-related errors.
Does Safari require something else in its headers for blob objects to be interpreted properly? I've researched the accepted audio types and tried AAC/MP3/Ogg without any success. Reading further, I've seen references to a bug with streaming blob audio/video data in Safari and iOS, though I'm not clear on the details.
Guidance in the right direction would be very helpful!
EDIT: It looks like the line audio.src = window.URL.createObjectURL(blob); in the receiving code is what is causing the blob errors (the image I linked).
EDIT 2: I tried to see whether a format other than a blob would work, opting for a base64-encoded string. This works on all devices and browsers except for iOS and Safari. I'm getting the impression it has something to do with how iOS interprets/loads the data...
For me the solution was to insert a source element into the audio element, and use the sourceElement.src attribute to refer to the blob. I didn't even need to attach the audio-element to the DOM. Example below, hope it helps someone.
var audioElement = document.createElement('audio')
var sourceElement = document.createElement('source')
audioElement.appendChild(sourceElement)
sourceElement.src = '<your blob url>'
sourceElement.type = 'audio/mp3' // or whatever
audioElement.load()
audioElement.play()
I haven't been able to find a solution using an audio element; however, the Web Audio API seems to do the trick: https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API
var audioContext = new (window.AudioContext || window.webkitAudioContext);
socket.on('voice', function(arrayBuffer) {
  audioContext.decodeAudioData(arrayBuffer, audioData => {
    var source = audioContext.createBufferSource();
    source.buffer = audioData;
    source.connect(audioContext.destination);
    source.start();
  });
});
You may still have an issue on iOS as any audio/video must be triggered by a user action.
I had a similar problem this week. I was recording the audio on Safari and the audio blob was being generated just fine. But when I tried to send the blob to the server, no data was getting there. The solution below reads the blob with a FileReader to convert it to base64 and then sends it to the server. Here is what worked for me:
const reader = new FileReader();
reader.readAsDataURL(audioBlob);
reader.onloadend = () => {
  const base64data = reader.result;
  const audioName = uuid.v4() + '.mp3';
  this.http.post(this.app.cloudStorageEndpoint + '/audios/local?audioName=' + audioName, base64data, header).subscribe(
    res => { resolve(audioName); },
    err => { reject(err); }
  );
};
I've tested with Safari 12 on both iPad and iPhone and it worked fine.

iOS screenshot with cordova is not in the photos library

I'm using the Cordova screenshot plugin (https://github.com/gitawego/cordova-screenshot) to take a screenshot on my iPhone with this code:
navigator.screenshot.save(function (error, res) {
  if (error) {
    console.log('Screenshot error');
    console.error(error);
  } else {
    console.log('screenshot ok', res.filePath);
  }
}, 'jpg', 50, 'project-X-result');
It seems to work (I get no error) but I can't find the screenshot in the Photos library. Is it possible to save it there?
How should I do it? By using another plugin to move the file (and where should it be moved exactly)? Or by editing the plugin to save it directly to the library (and where should it be saved exactly)?
I just ran into the same problem. It took several days but I figured out how to do it.
It does involve another plugin, the Canvas2Image plugin. I didn't think it would work, but I was desperate and it did work in the end. Here's how I did it.
If you are getting the console.log for screenshot ok, then you are in good shape. The next thing you will need to do is install Canvas2Image with your CLI like so:
cordova plugin add https://github.com/devgeeks/Canvas2ImagePlugin.git
(or replace 'cordova' with 'phonegap' if you use that instead.)
Next, you will need to add a function (in this case saveImageToPhone()) that calls the plugin you just added to your project. This function will be called from your navigator.screenshot.save() function you already have. We will add that function call to your screenshot.save success block, right after the console.log line.
The key here is using that filePath property that we get back in the success block; That's our absolute path to the image we just saved to the temp folder in iOS. We will simply pass that path to the second function and let it do its work.
Here's those two functions from my code:
function saveScreen() {
  navigator.screenshot.save(function (error, res) {
    if (error) {
      console.error(error);
    } else {
      console.log('ok', res.filePath);
      var MEsuccess = function (msg) {
        console.info(msg);
      };
      var MEerror = function (err) {
        console.error(err);
      };
      saveImageToPhone(res.filePath, MEsuccess, MEerror);
    }
  }, 'jpg', 90);
}

function saveImageToPhone(url, success, error) {
  var canvas, context, imageDataUrl, imageData;
  var img = new Image();
  img.onload = function () {
    canvas = document.createElement('canvas');
    canvas.width = img.width;
    canvas.height = img.height;
    context = canvas.getContext('2d');
    context.drawImage(img, 0, 0);
    try {
      imageDataUrl = canvas.toDataURL('image/jpeg', 1.0);
      imageData = imageDataUrl.replace(/data:image\/jpeg;base64,/, '');
      cordova.exec(
        success,
        error,
        'Canvas2ImagePlugin',
        'saveImageDataToLibrary',
        [imageData]
      );
    } catch (e) {
      error(e.message);
    }
  };
  try {
    img.src = url;
  } catch (e) {
    error(e.message);
  }
}
Now just call the first function from wherever you wish.
If it works, you'll get a console.log right after your filePath readout that says
IMAGE SAVED!
Be careful, you might overwrite the same screenshot if you use a name as a screenshot.save parameter (after your jpg and quality parameters). My app needs to save different screenshots and have them all available later; by removing the name parameter and allowing the OS to name the file I was able to achieve just that.
I hope that helps you out, I know it caused me a lot of trouble...
