I am developing a Flutter demo app. I want to use the metadata of a video stored on my phone. I am able to extract the path of that video, but I don't know how to extract its metadata in Dart/Flutter.
I need the following metadata:
Duration of video
Name of video
Size of video
When video was taken
You can use the VideoPlayerController.file constructor from the official video_player plugin (maintained by the Flutter team, so you don't have to worry about its future or stability) to access the file and read the following metadata once you install the package.
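To install it, add video_player to your pubspec.yaml (the version shown here is only an example; check pub.dev for the current one):
dependencies:
  video_player: ^2.0.0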
First, create your VideoPlayerController:
VideoPlayerController controller = VideoPlayerController.file(File(videoPath)); // your video file path here
1. Duration:
controller.value.duration;
2. Video name: you already have this, since you have the file path; just take the last path segment (see the sketch after this list).
3. Video size:
controller.value.size;
Note that this is the video's resolution (width x height), available after initialization; for the file size in bytes, read it from the File itself (also shown in the sketch below).
4. As for when the video was taken, I can't help you with that; you will have to find another way to get it.
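Putting these together, here is a minimal sketch, assuming a videoPath string that points at your file (a hypothetical name, not something from your code). Duration and resolution are only populated after initialize() completes, the name is the last path segment, and the byte size comes from the File itself:
import 'dart:io';

import 'package:video_player/video_player.dart';

Future<void> printVideoMetadata(String videoPath) async {
  final File videoFile = File(videoPath);

  // Name: the last segment of the file path.
  final String name = videoFile.uri.pathSegments.last;

  // Size on disk, in bytes.
  final int sizeInBytes = await videoFile.length();

  // Duration and resolution are only available after initialization.
  final VideoPlayerController controller = VideoPlayerController.file(videoFile);
  await controller.initialize();
  final Duration duration = controller.value.duration;
  final resolution = controller.value.size; // width x height

  print('name: $name, bytes: $sizeInBytes, duration: $duration, resolution: $resolution');

  await controller.dispose();
}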
One of the ways to get the creation time of a video in Flutter is to use the flutter_ffmpeg plugin.
Add it to the pubspec.yaml:
dependencies:
  flutter_ffmpeg: ^0.3.0
Get the file path of your video, for example with file_picker:
File pickedFile = await FilePicker.getFile();
Get the metadata of the video from its path using ffprobe:
final FlutterFFprobe flutterFFprobe = FlutterFFprobe();
MediaInformation mediaInformation = await flutterFFprobe.getMediaInformation(pickedFile.path);
Map<dynamic, dynamic> mp = mediaInformation.getMediaProperties();
String creationTime = mp["tags"]["creation_time"];
print("creationTime: $creationTime");
And in the console you'll get something like this:
I/flutter (13274): creationTime: 2020-09-24T17:59:24.000000Z
Along with the creation time, the properties map contains other useful details.
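If you want to see what else is available for your own files, a minimal sketch is to just dump the map, reusing mediaInformation from the snippet above (the exact keys vary by container and plugin version):
final Map<dynamic, dynamic> props = mediaInformation.getMediaProperties();

// Print every top-level property ffprobe reported for this file.
props.forEach((key, value) => print('$key: $value'));

// The "tags" sub-map is where metadata such as creation_time usually lives.
final dynamic tags = props['tags'];
if (tags is Map) {
  tags.forEach((key, value) => print('tag $key: $value'));
}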
Note: adding this plugin to your app increases the size of your final APK!
Your code is correct, Mazin Ibrahim, you just need to initialize the controller; once the initialization future completes, value will contain all the details.
Future<void> getVideo() async {
  await MultiMediaPicker.pickVideo(source: ImageSource.gallery).then((video) {
    File _video = video;
    VideoPlayerController fileVideocontroller = VideoPlayerController.file(_video)
      ..initialize().then((_) {
        debugPrint("========" + fileVideocontroller.value.duration.toString());
      });
  });
}
I have used PDFTron to update/edit PDF files. I have followed the documentation for opening a PDF file that comes from the server, but I am not sure how to save the edited PDF file with this SDK (PDFTron).
I have referred to the links below for saving the PDF, but did not succeed.
https://www.pdftron.com/documentation/ios/guides/features/forms/export-data/
https://www.pdftron.com/api/ios/Enums/PTSaveOptions.html
I want to send XFDF files to the server.
PDFTron saves the PDF with annotations automatically after some time interval, but I want it to be saved when a save button is pressed. I am stuck on this saving process.
I have the code below to import annotations, but I don't know how to import this XFDF file or where to get it.
// Import annotations from XFDF to FDF
let fdf_doc: PTFDFDoc = PTFDFDoc.create(fromXFDF: xfdf_filename)
// Optionally read XFDF from a string
let fdf_doc: PTFDFDoc = PTFDFDoc.create(fromXFDF: xfdf_string)
// Merge FDF data into PDF doc
let doc: PTPDFDoc = PTPDFDoc(filepath: filename)
doc.fdfMerge(fdf_doc)
I don't want to customise this myself; I just want the document to be saved when I press a button.
Below is my question:
How do I save the applied annotations on the PDF myself?
Once you've applied changes to the document data, you'll probably want to do something with the updated PDF, like letting the user download it or sending it back to your server.
If you just want to let the user download the edited file, then no extra changes are necessary, as pressing the download button will save the modified PDF to the user's computer.
To add a custom save button, here is a code sample.
If you want to get the modified PDF as an ArrayBuffer, you can use the getFileData function on Document.
For example:
WebViewer(...)
  .then(instance => {
    const { documentViewer, annotationManager } = instance.Core;

    documentViewer.addEventListener('documentLoaded', async () => {
      const doc = documentViewer.getDocument();
      const xfdfString = await annotationManager.exportAnnotations();
      const options = { xfdfString };
      const data = await doc.getFileData(options);
      const arr = new Uint8Array(data);
      const blob = new Blob([arr], { type: 'application/pdf' });
      // upload blob to your server
    });
  });
"I have followed the documentation for opening the PDF file which came from server"
There are a few ways to do this - could you share which API you are using?
The main point of your question seems to be how to save the PDF via a button press (after you've merged in XFDF annotation data). Is this the case?
You can control where a remote document is shared by implementing the relevant delegate methods, likely specifically https://www.pdftron.com/api/ios/Protocols/PTDocumentControllerDelegate.html#/c:objc(pl)PTDocumentControllerDelegate(im)documentController:destinationURLForDocumentAtURL:
You can then save the document using this method:
https://www.pdftron.com/api/ios/Classes/PTDocumentBaseViewController.html#/c:objc(cs)PTDocumentBaseViewController(im)saveDocument:completionHandler:
I'm working on a React Native app where I need to load an audio file from the backend and play it in the app.
For this I'm using the packages RNFetchBlob and react-native-audio-recorder-player.
The problem is that my implementation works perfectly on Android, but it doesn't work on iOS, not even for playing files that were recorded with react-native-audio-recorder-player on iOS itself.
When playing files downloaded using RNFetchBlob I get the following error:
FigFileForkOpenMainByCFURL signalled err=2 (errno) (open failed) at /Library/Caches/com.apple.xbs/Sources/EmbeddedCoreMediaFramework_Sim/EmbeddedCoreMedia-2765.6/Sources/Platform/Darwin/DarwinFile.c:576
The part of the code that matters:
import AudioRecorderPlayer from 'react-native-audio-recorder-player';
import RNFetchBlob from 'rn-fetch-blob';
import { Platform } from 'react-native';
// Base64 comes from whichever base64 helper you use (e.g. js-base64).

// THIS IS A SAMPLE, NOT THE REAL URL AND TOKEN.
const fileRemoteUrl = 'https://my-backend.com/files/file-id';
const authToken = 'my-authtoken';

// could be 'mp3' or 'aac', etc...
const fileExtension = 'm4a';

const dir = RNFetchBlob.fs.dirs.DocumentDir;
const path = `${dir}/${Base64.btoa(fileRemoteUrl)}.${fileExtension}`;

const res = await RNFetchBlob.config({
  fileCache: false,
  appendExt: fileExtension,
  path,
}).fetch('GET', fileRemoteUrl, { Authorization: `Bearer ${authToken}` });

const internalUrl = `${Platform.OS === 'android' ? 'file://' : ''}${res.path()}`;

const audioRecorderPlayer = new AudioRecorderPlayer();
await audioRecorderPlayer.startPlayer(internalUrl);
// here the audio should start playing,
// but I get the error on the device (simulator) console (Xcode output)
As I said before, the same code works like a charm on Android.
Any idea how to solve this problem? I'm stuck on this...
I appreciate any help!
In the end I found that the problem was in this line:
const internalUrl = `${Platform.OS === 'android' ? 'file://' : ''}${res.path()}`;
For both iOS and Android it is necessary to add the "file://" prefix to the file path before passing it to the startPlayer method.
The reason I was previously adding the prefix only on Android is that I use the same code snippet to load files of other types (not audio), and, for example, the <Image source={{ uri }} /> component on iOS does not accept a uri with that prefix.
So the solution was to add the prefix specifically when the platform is iOS, just before calling the startPlayer method.
I've been searching for a couple of days for how to get the album art of a song (or a frame capture of a video) from a file path. All I could find are things related to MediaStore, like this answer, which requires getting the album ID of the file.
Cursor cursor = getContentResolver().query(MediaStore.Audio.Albums.EXTERNAL_CONTENT_URI,
        new String[] {MediaStore.Audio.Albums._ID, MediaStore.Audio.Albums.ALBUM_ART},
        MediaStore.Audio.Albums._ID + "=?",
        new String[] {String.valueOf(albumId)},
        null);

if (cursor.moveToFirst()) {
    String path = cursor.getString(cursor.getColumnIndex(MediaStore.Audio.Albums.ALBUM_ART));
    // do whatever you need to do
}
But I can't find a guide on how it works, how to pass the file to MediaStore, how to get the album ID of the media, or anything else... Currently I get the media information using MediaMetadataRetriever, but I can't find a way to get the album art or a video thumbnail of a media file with it...
Update:
If MediaStore must be used to get the media files in the first place before using it to get their data, I can implement that instead of what I'm currently doing (iterating over the device's files to find the supported ones), and it may even be a better option, since MediaStore supports reading from external storage as well.
Any help is appreciated.
If you are using MediaMetadataRetriever, you can try the following sample code:
private void loadingCover(string mediaUri)
{
    MediaMetadataRetriever mediaMetadataRetriever = new MediaMetadataRetriever();
    mediaMetadataRetriever.SetDataSource(mediaUri);
    byte[] picture = mediaMetadataRetriever.GetEmbeddedPicture();
    Android.Graphics.Bitmap bitmap = BitmapFactory.DecodeByteArray(picture, 0, picture.Length);
    musicCover.SetImageBitmap(bitmap);
}
In addition, don't forget to add the permission:
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
Invoke it as follows:
File file = new File("/storage/sdcard/Movies/music1.mp4");
if (file.Exists())
{
    loadingCover(file.AbsolutePath);
}
I have a container and a blob of mp4 type in my Azure storage account.
I am able to stream it from the Blob.Uri (or using SAS token if required) in HTML5 video control.
But I need that blob URI to be converted to a "blob:" URL, so that copying the source from the developer tools will not play the video in a new window.
I don't want to download the full video first anywhere.
Thanks in advance!
I am showing/streaming the video like this:
List<string> blobPaths = new List<string>();

foreach (IListBlobItem item in container.ListBlobs(useFlatBlobListing: true))
{
    if (item.GetType() == typeof(CloudBlockBlob))
    {
        CloudBlockBlob blb = (CloudBlockBlob)item;
        string sasToken = blb.GetSharedAccessSignature(null, "spolicy");
        blobPaths.Add(string.Format(CultureInfo.InvariantCulture, "{0}{1}", blb.Uri, sasToken));
    }
}

ViewBag.Path1 = blobPaths[0];
The main point is that I want a Netflix-type feature, so that copying the video source from the web page's view-source won't work.
Also, the video should not be downloaded locally.
With the new Firebase API you can upload files into cloud storage from client code. The examples assume the file name is known or static during upload:
// Create a root reference
var storageRef = firebase.storage().ref();
// Create a reference to 'mountains.jpg'
var mountainsRef = storageRef.child('mountains.jpg');
// Create a reference to 'images/mountains.jpg'
var mountainImagesRef = storageRef.child('images/mountains.jpg');
or
// File or Blob, assume the file is called rivers.jpg
var file = ...
// Upload the file to the path 'images/rivers.jpg'
// We can use the 'name' property on the File API to get our file name
var uploadTask = storageRef.child('images/' + file.name).put(file);
With users uploading their own files, name conflicts are going to be an issue. How can you have Firebase create a filename instead of defining it yourself? Is there something like the push() feature in the database for creating unique storage references?
Firebase Storage Product Manager here:
TL;DR: Use a UUID generator (in Android (UUID) and iOS (NSUUID) they are built in, in JS you can use something like this: Create GUID / UUID in JavaScript?), then append the file extension if you want to preserve it (split the file.name on '.' and get the last segment)
We didn't know which flavor of unique file names developers would want (see below), since there are many, many use cases for this, so we decided to leave the choice up to developers.
images/uuid/image.png  // option 1: clean name, under a UUID "folder"
images/uuid.png        // option 2: unique name, same extension
images/uuid            // option 3: no extension
It seems to me like this would be a reasonable thing to explain in our documentation though, so I'll file a bug internally to document it :)
This is the solution for people using Dart.
Generate the current timestamp using:
var time = DateTime.now().millisecondsSinceEpoch.toString();
Now upload the file to Firebase Storage using:
await FirebaseStorage.instance.ref('images/$time.png').putFile(yourfile);
You can even get the download URL using:
var url = await FirebaseStorage.instance.ref('images/$time.png').getDownloadURL();
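If you prefer the UUID approach from the accepted answer over a timestamp (two uploads in the same millisecond would collide), here is a hedged Dart sketch using the uuid and firebase_storage packages; uploadWithUuidName is just an illustrative helper name:
import 'dart:io';

import 'package:firebase_storage/firebase_storage.dart';
import 'package:uuid/uuid.dart';

Future<String> uploadWithUuidName(File file) async {
  // Keep the original extension, as suggested in the accepted answer.
  final String extension = file.path.split('.').last;
  final String name = '${Uuid().v4()}.$extension';

  // Upload under a unique name and return the download URL.
  final Reference ref = FirebaseStorage.instance.ref('images/$name');
  await ref.putFile(file);
  return ref.getDownloadURL();
}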
First install uuid: npm i uuid
Then define the file reference like this:
import { v4 as uuidv4 } from "uuid";
const fileRef = storageRef.child(
  `${uuidv4()}-${Put your file or image name here}`
);
After that, upload the file with the fileRef:
fileRef.put(Your file)
In Android (Kotlin) I solved it by combining the user UID with the milliseconds since 1970:
val ref = storage.reference.child("images/${auth.currentUser!!.uid}-${System.currentTimeMillis()}")
The code below combines the file structure from @Mike McDonald's answer, the current timestamp from @Aman Kumar Singh's answer, and the user uid from @Damien's answer; I think it provides a unique id while making the Firebase Storage screen more readable.
Reference ref = firebaseStorage
    .ref()
    .child('videos')
    .child(authController.user.uid)
    .child(DateTime.now().millisecondsSinceEpoch.toString());
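For completeness, a short hedged sketch of using that reference afterwards (videoFile is a hypothetical, already-picked File):
// Upload the picked video to the reference built above.
final TaskSnapshot snapshot = await ref.putFile(videoFile);

// The permanent download URL of the uploaded video.
final String downloadUrl = await snapshot.ref.getDownloadURL();
print('uploaded to: $downloadUrl');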