When I use the download() method of the FileReference class, everything works fine on desktop and Android, but I get an error on iOS.
This is the code:
var req:URLRequest = new URLRequest(url);
var localRef:FileReference = new FileReference();
localRef.download(req);
On iOS I'm getting an Alert:
Download Error
File downloads not supported.
I already tried navigateToURL(), but that just asks to save the file in Dropbox or another app.
How can I fix this error?
You shouldn't use FileReference on mobile (or AIR, in general, though it does open the Load/Save dialog on desktop so there can be some use there). You should instead use File and FileStream, which give you far more control over the file system.
In this case, you could try File.download() and save to File.applicationStorageDirectory, but I don't know whether that will behave any differently, since File extends FileReference.
What I generally do is use URLStream instead of URLLoader. It gives you access to the raw bytes of the file you are downloading; you can then use File and FileStream to write those bytes to disk.
So something like (and this is untested off the top of my head, though I have used similar in the past):
var urlStream:URLStream = new URLStream();
urlStream.addEventListener(Event.COMPLETE, completeHandler);
urlStream.load(new URLRequest("url")); // pass a URLRequest, not a URLLoader

function completeHandler(e:Event):void {
    // pull the downloaded bytes out of the stream
    var bytes:ByteArray = new ByteArray();
    urlStream.readBytes(bytes);

    // write them to app storage
    var f:File = File.applicationStorageDirectory.resolvePath("filename");
    var fs:FileStream = new FileStream();
    fs.open(f, FileMode.WRITE);
    fs.writeBytes(bytes);
    fs.close();
}
Now, obviously, there is a lot more you want to account for (errors, progress, etc). That should be enough to point you in the right direction, however.
It's possible to create a full download manager using this method (something I did for an iOS project two years ago), since you can save as-you-go to the file system rather than waiting until Event.COMPLETE fires (using the ProgressEvent.PROGRESS event). That allows you to avoid having a 500MB file in memory, something most devices can't handle.
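For reference, here is a rough, untested sketch of that save-as-you-go approach; the URL and the "big.zip" file name are just placeholders:

var target:File = File.applicationStorageDirectory.resolvePath("big.zip");
var outStream:FileStream = new FileStream();
outStream.open(target, FileMode.WRITE);

var inStream:URLStream = new URLStream();
inStream.addEventListener(ProgressEvent.PROGRESS, progressHandler);
inStream.addEventListener(Event.COMPLETE, doneHandler);
inStream.load(new URLRequest("http://example.com/big.zip"));

function flushToDisk():void {
    // write whatever has arrived so far and release it from memory
    if (inStream.bytesAvailable > 0) {
        var chunk:ByteArray = new ByteArray();
        inStream.readBytes(chunk);
        outStream.writeBytes(chunk);
    }
}

function progressHandler(e:ProgressEvent):void {
    flushToDisk();
}

function doneHandler(e:Event):void {
    flushToDisk(); // catch any remaining bytes
    outStream.close();
    inStream.close();
}

This keeps only the most recent chunk in memory, which is what makes the download-manager scenario workable on mobile.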
Related
I've been reading a lot of StackOverflow posts that discuss copying data from FileSystemStorage to Storage in CodenameOne, such as the approach described in this answer from Shai, shown below:
InputStream stream = FileSystemStorage.getInstance().openInputStream(i);
OutputStream out = Storage.getInstance().createOutputStream("MyImage");
Util.copy(stream, out);
Util.cleanup(stream);
Util.cleanup(out);
I've been trying to do the reverse: save from Storage to FileSystemStorage in order to show a PDF in the BrowserComponent (while using iOS), but have not been able to do so. I need to show the PDF within the app (so I don't want to use Display.getInstance().execute()).
Basically, I'm trying to dynamically populate a Container with whatever files the user selects. I am using Steve Hannah's FileChooser library for CN1. (Disclaimer: I have made slight modifications to this library as it is used in the app I'm working on. However, when I choose images with this library and pull them from Storage into an Image via an InputStream, they display perfectly in an ImageViewer, so I know that all files are being saved correctly in Storage.)
Here is my code (with help from Steve Hannah's comment on GitHub):
// fileLocation and fileName are slightly different, but both end with the file extension
File file = new File(fileToUpload.getFileName());
FileSystemStorage fss = FileSystemStorage.getInstance();
InputStream is = Storage.getInstance().createInputStream(fileToUpload.getLocation());
OutputStream os = fss.openOutputStream(file.getAbsolutePath());
Util.copy(is, os);
ToastBar.Status status = ToastBar.getInstance().createStatus();
String message = file.exists() + " " + file.isFile() + file.getAbsolutePath();
status.setMessage(message);
status.setExpires(3000);
status.show();
NativeLogs.getNativeLogs();
if (Display.getInstance().getPlatformName().equals("ios")) {
//Log.p("in ios !!!!");
BrowserComponent browserComponent = new BrowserComponent();
browserComponent.setURL(file.getPath());
horizontalContainer.add(browserComponent);
}
The ToastBar displays true and true for file.exists() and file.isFile().
I stipulate iOS because, from what I've seen while researching how to preview PDFs within an app, Android needs a different implementation, such as adding a NativeInterface that wraps an Android library. I also saw in different answers on the Google Group that this functionality (using BrowserComponent to view PDFs) is only available on iOS and not in the simulator. In the simulator I see a blank space. My iPhone just freezes and/or crashes after displaying the ToastBar (and I work on a Windows machine, so I don't have much ability to see native logs...).
What can I do to access the file and show it in the BrowserComponent?
Thank you!
Simple solution: the file name had a space in it (e.g. "Test page.pdf"), so it didn't show. When I used files without spaces it worked, and after removing the spaces from the existing file names everything worked. I'll have to add code to handle this scenario.
Thanks for your help!
After searching around the net, I couldn't find a way to download an MP3 from a server, save it in local storage (iPad), and then load and play it, other than converting it to a ByteArray, saving that, and then reading it back into an MP3 so it can be played in the Flash application.
The problem is that, although this method works, the uncompressed files (in ByteArray format) saved in local storage are too large, and I suspect the app is wasting memory.
My question is: is there any way to save the MP3 directly, without any conversion, as a properly playable MP3? I can't use methods like download() or save() from FileReference.
Lots of thanks!!
SOLUTION!! (WORKING PERFECTLY):
I kept looking around the net for a way to do it, and some forums gave me a clue to what I was searching for. It turned out to be pretty simple, but I spent almost two weeks finding it out... here is my code:
var queue:LoaderMax = new LoaderMax({name:"mainQueue", onProgress:progressHandler, onComplete:completeHandler, onError:errorHandler});
queue.append( new DataLoader("http://" + **url**, {name:**"example.mp3"**, format:"binary"}));
queue.load();
// Complete Event:
private function completeHandler():void {
var file:File = new File(**your location**); // appData
var fr:FileStream = new FileStream();
fr.open(file.resolvePath(nameIn), FileMode.WRITE); // nameIn is your target file name, e.g. "example.mp3"
fr.writeObject(LoaderMax.getContent("example.mp3"));
fr.close();
fr = null;
// Now the mp3 is saved in local storage, we load it as Sound object so we can play it.
var loader:MP3Loader;
loader = new MP3Loader(**your location**, {autoPlay:false}) ;
loader.addEventListener(Event.COMPLETE, function(e:Event):void {
loader.content.play(); // This line is the only one not checked, but you should be able to do something like this to play the sound. For example, I put loader.content into an Array of Sound objects and can later play any sound I want.
loader.dispose(true); // This is very important to free memory. You should do the same thing with queue when all items are downloaded.
});
loader.load();
}
I hope this helps a lot of people!!
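If you would rather not pull in LoaderMax, the same "no conversion" result should be possible with the built-in URLStream and FileStream classes, writing the MP3 bytes straight to disk. This is an untested sketch; the URL and the "track.mp3" name are placeholders:

var mp3Stream:URLStream = new URLStream();
mp3Stream.addEventListener(Event.COMPLETE, onMp3Loaded);
mp3Stream.load(new URLRequest("http://example.com/track.mp3"));

function onMp3Loaded(e:Event):void {
    // grab the raw, untouched MP3 bytes
    var bytes:ByteArray = new ByteArray();
    mp3Stream.readBytes(bytes);

    // save them as-is to app storage
    var target:File = File.applicationStorageDirectory.resolvePath("track.mp3");
    var fs:FileStream = new FileStream();
    fs.open(target, FileMode.WRITE);
    fs.writeBytes(bytes);
    fs.close();

    // play it back straight from the saved file
    var sound:Sound = new Sound(new URLRequest(target.url));
    sound.play();
}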
I am aiming to create an application where the user can take a picture of their face, which includes an overlay of a face cutout. I need the user to be able to click the screen and for the application to save the picture, mask it with the same face cutout, and then save it to the applications storage.
This is my first time using AIR on iOS with ActionScript 3. I know there is a proper directory that you are supposed to save to on iOS, but I am not aware of what it is. I have been saving other variables using SharedObjects...
E.g:
var so:SharedObject = SharedObject.getLocal("applicationID");
and then writing to it
so.data['variableID'] = aVariable;
This is how I access the front camera and display it. For some reason, to display the whole video and not a narrow section of it, I have to add the video from the camera to a movie clip on the stage that is 50% of the stage's size.
import flash.media.Camera;
import flash.media.Video;
import flash.display.BitmapData;
import flash.utils.ByteArray;
import com.adobe.images.JPGEncoder;
var camera:Camera = Camera.getCamera("1");
camera.setQuality(0,100);
camera.setMode(1024,768, 30, false);
var video:Video = new Video();
video.attachCamera(camera);
videoArea.addChild(video);
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
Capture_Picture_BTN.addEventListener(TouchEvent.TOUCH_TAP, savePicture);
function savePicture(event:TouchEvent):void
{
trace("Saving Picture");
//Capture Picture BTN
var bitmapData:BitmapData = new BitmapData(1024,768);
bitmapData.draw(video);
}
I apologize if this is the wrong way of going about this; I am still fairly new to ActionScript as it is. If you need any more information, I will be happy to provide it.
You can only save roughly 100KB of data via SharedObject, so you can't use that. It is meant solely for saving application settings and, in my experience, is largely ignored by AIR devs because we have better control over the file system.
We have the File and FileStream classes. These classes allow you to read and write directly to and from the device's disk, something not quite possible on the web (the user is the one who has to save/open; can't be done automatically).
Before my example, I must stress that you should read the documentation. Adobe's LiveDocs are among the best language/SDK docs available, and they will point out many things that my quick usage example will not (such as an in-depth discussion of each directory, how to write various types, etc.).
So, here's an example:
// create the File and resolve it to the applicationStorageDirectory, which is where you should save files
var f:File = File.applicationStorageDirectory.resolvePath("name.png");
// this prevents iCloud backup. false by default. Apple will reject anything using this directory for large file saving that doesn't prevent iCloud backup. Can also use cacheDirectory, though certain aspects of AIR cannot access that directory
f.preventBackup = true;
// set up the filestream
var fs:FileStream = new FileStream();
fs.open(f, FileMode.WRITE); //open file to write
fs.writeBytes( BYTE ARRAY HERE ); // writes a byte array to file
fs.close(); // close connection
So that will save to disk. To read, you open the FileStream in READ mode.
var fs:FileStream = new FileStream();
var output:ByteArray = new ByteArray();
fs.open(f, FileMode.READ); // open file to read
fs.readBytes( output ); // reads the file into a byte array
fs.close(); // close connection
Again, please read the documentation. FileStream supports dozens of read and write methods of various types. You need to select the correct one for your situation (readBytes() and writeBytes() should work in all cases, though there are instances where you should use a more specific method).
Hope that helps.
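To connect this to your savePicture() function: the BitmapData from bitmapData.draw(video) has to be turned into a ByteArray before it can be written. Here is a rough, untested sketch using the JPGEncoder class you already import (the quality value and the "photo.jpg" name are just examples):

function saveBitmapAsJpg(bitmapData:BitmapData):void {
    // encode the captured frame to JPEG bytes (quality 0-100)
    var encoder:JPGEncoder = new JPGEncoder(85);
    var jpgBytes:ByteArray = encoder.encode(bitmapData);

    // write the bytes to app storage
    var f:File = File.applicationStorageDirectory.resolvePath("photo.jpg");
    f.preventBackup = true;
    var fs:FileStream = new FileStream();
    fs.open(f, FileMode.WRITE);
    fs.writeBytes(jpgBytes);
    fs.close();
}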
My understanding currently is that:
CameraUI
I can use the CameraUI to access the built-in camera for MediaType.VIDEO; that delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and click the "Use" button, I am returned to my app and theoretically I have a valid MediaPromise.
iOS does -not- provide a valid/usable URL/filename for the recorded video (or for photos), so I would have to use a Loader to bring in/use/access the 'recorded' video... AND... iOS does not actually create a file anywhere on the device, most importantly not in the Camera Roll, where one would expect it given the normal behavior of the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs, but nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can then pass to a Loader class (or similar) to read in the information created by the system camera and then manipulate it (upload it, save it to applicationStorageDirectory, and/or display it in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll, but the AS3/AIR 3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the AIR 3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that's been captured (no way to use the resulting MediaPromise successfully)?
I believe you can take video and access it on Android, but there's nothing in the documentation that says you cannot on iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI & CameraRoll)?
Am I wrong to think that these are rather obvious needs that one can meet via the Xcode/Objective-C route, but that the AIR mobile framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll, and of course I can share it, view it, or whatever... If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image since, the folder icon for the pictures folder uses the thumbnail of the last object added (in this case, the video I took), but if I then enter the folder, the video cannot be found. It's teasing us. It knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought that someone else may find this answer (partially) useful.
To save the movie you just took, you need to open the promise and read the data from it.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;
public function recordVideo():void
{
// Start the camera and ask for a video
camera = new CameraUI();
camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
camera.launch(MediaType.VIDEO);
}
private function onCameraComplete(event:MediaEvent):void
{
// event.data is a MediaPromise and MediaPromise.open() returns a IDataInput
// Let's cast it to a dispatcher and check when it's complete
dataInput = event.data.open();
var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}
private function onDataInputComplete(event:Event):void
{
// We can do whatever we want with the data, so we'll store it in a File
var file:File = File.applicationStorageDirectory.resolvePath("recording.mov"); // a File created with new File() has no path to write to; the name here is just an example
var bytes:ByteArray = new ByteArray();
var stream:FileStream = new FileStream();
// Reading the data from the opened MediaPromise
dataInput.readBytes(bytes);
stream.open(file, FileMode.WRITE);
stream.writeBytes(bytes, 0, bytes.bytesAvailable);
stream.close();
}
Also, I'm still looking for a way to put the movie into the CameraRoll.
I'm developing an iOS iPad app with Flash CS5.5 and AIR 3.1.
I'm using ShineMp3Encoder to encode a WAV to an MP3, from:
https://github.com/kikko/Shine-MP3-Encoder-on-AS3-Alchemy
Basically it converts a ByteArray (WAV) to a ByteArray (MP3), called "mp3encoder.mp3Data" in the code.
I have no trouble saving this using (new FileReference()).save(mp3Data, filename); but because this is an iOS AIR app, I wanted to switch to File.applicationStorageDirectory so that I could put the saved MP3 into its own folder to keep things organized. When I run the code it goes through all the steps, converts the WAV to MP3, and then says that it saved, but no file is created and no errors are thrown. The sound IS stored in memory, as it can be played back until the app is closed. I've changed the resolvePath to the root folder, to myApp/, and to /sounds, none of which work. I've never attempted this before, so I'm a little lost as to why no file is being created. Any suggestions would help a lot.
function onEncoded(e:Event):void{
myTI.text = "Mp3 encoded and saved. Press Play.";
mp3encoder.mp3Data.position = 0;
var myDate:Date = new Date();
var theDate:String = myDate.monthUTC.toString() + myDate.dayUTC.toString()
+ myDate.hoursUTC.toString() + myDate.minutesUTC.toString()
+ myDate.secondsUTC.toString();
var file:File = File.applicationStorageDirectory.resolvePath("myApp/sounds/myVoice+"+theDate+".mp3");
var fileStream:FileStream = new FileStream;
fileStream.open(file,FileMode.UPDATE);
fileStream.writeBytes(mp3encoder.mp3Data);
fileStream.close();
}
Untested but try this. If it works, modify the filename to be dynamic as you require.
var myfilename:File = File.applicationStorageDirectory;
myfilename = myfilename.resolvePath("myVoice.mp3");
var outputStream:FileStream = new FileStream();
outputStream.open(myfilename,FileMode.WRITE);
outputStream.writeBytes(mp3encoder.mp3Data,0,mp3encoder.mp3Data.length);
outputStream.close();
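If you still want the file inside a sounds subfolder like your original path, it may help to create that folder explicitly before writing. Here is an untested sketch along those lines, reusing your theDate string:

var soundsDir:File = File.applicationStorageDirectory.resolvePath("sounds");
soundsDir.createDirectory(); // does nothing if the folder already exists

var target:File = soundsDir.resolvePath("myVoice" + theDate + ".mp3");
var outputStream:FileStream = new FileStream();
outputStream.open(target, FileMode.WRITE);
outputStream.writeBytes(mp3encoder.mp3Data, 0, mp3encoder.mp3Data.length);
outputStream.close();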
I know my answer is a little bit late, but maybe it will be useful for somebody who finds this post in the future.
When you develop on Windows in Adobe AIR and try to save something to File.applicationStorageDirectory, it will NOT be saved in the same folder as your SWF or FLA file.
It will be saved here:
C:\Users\[WIN_LOGIN]\AppData\Roaming\[APP_ID]\Local Store
You will find and set your APP_ID in the Publish Settings for iOS.
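If you ever need to confirm where a file actually landed on the current platform, tracing the resolved native path is a quick check (the file name here is just an example):

var saved:File = File.applicationStorageDirectory.resolvePath("myVoice.mp3");
trace(saved.nativePath, saved.exists);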