I'm porting an AIR app to iOS. The app saves documents locally with File.browseForSave(), but that doesn't seem to work on iPad. How can I save files on the iPad?
P.S. Tracing File.url gives "app-storage:/New%20map.comap". Maybe names with % are not allowed on iOS?
Best wishes
You can only save to specific directories within the app's sandboxed area, e.g. the Documents directory.
Something like this saves an image to the documents directory. It uses the Adobe JPGEncoder to create a byte array that is then written to disk, and a crop method to take a snapshot of the stage.
private function createImages():void {
    // Grab a 1024x768 snapshot of the stage.
    var snapShot:Bitmap = crop(0, 0, 1024, 768);
    var f:File = File.documentsDirectory.resolvePath("test.jpg");
    var stream:FileStream = new FileStream();
    stream.open(f, FileMode.WRITE);
    // Encode the snapshot as a JPEG and write the bytes to the file.
    var j:JPGEncoder = new JPGEncoder(80);
    var bytes:ByteArray = j.encode(snapShot.bitmapData);
    stream.writeBytes(bytes, 0, bytes.length);
    stream.close();
    // Re-open asynchronously so Event.COMPLETE fires once the file has been read back.
    stream.openAsync(f, FileMode.READ);
    stream.addEventListener(Event.COMPLETE, imagewritten, false, 0, true);
}

private function imagewritten(e:Event):void {
    trace("done");
}

private function crop(_x:Number, _y:Number, _width:Number, _height:Number, displayObject:DisplayObject = null):Bitmap
{
    var cropArea:Rectangle = new Rectangle(0, 0, _width, _height);
    var croppedBitmap:Bitmap = new Bitmap(new BitmapData(_width, _height), PixelSnapping.ALWAYS, true);
    croppedBitmap.bitmapData.draw((displayObject != null) ? displayObject : stage, new Matrix(1, 0, 0, 1, -_x, -_y), null, null, cropArea, true);
    cropArea = null;
    return croppedBitmap;
}
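For the original question, the same pattern applies without File.browseForSave(): write the document straight into the app's sandbox (applicationStorageDirectory or documentsDirectory) and keep track of the file name yourself. The %20 in File.url is just the URL-encoded space, so a name like "New map.comap" is fine. Below is a minimal sketch under that assumption; saveMap and the XML payload are hypothetical stand-ins for however the .comap data is actually produced.
// Hypothetical helper: persist a .comap document inside the app sandbox on iOS.
private function saveMap(fileName:String, mapXml:XML):void {
    // Spaces in the name are legal; File.url simply shows them URL-encoded as %20.
    var target:File = File.applicationStorageDirectory.resolvePath(fileName);
    var stream:FileStream = new FileStream();
    stream.open(target, FileMode.WRITE);
    stream.writeUTFBytes(mapXml.toXMLString());
    stream.close();
    trace("Saved to " + target.url); // e.g. app-storage:/New%20map.comap
}
Usage would be something like saveMap("New map.comap", myMapXml), where myMapXml holds the document's content.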
I am using the following JavaScript to take a canvas shot of the page and turn it into a PDF. However, when it saves the file, I can't open it in a PDF reader; I get the error "There was an error processing a page. There was a problem reading this document (110)". I can open the same file within the browser, but not on my computer.
function exportPDF() {
    var pdf = new jsPDF('l', 'px'),
        source = $('body')[0];

    var canvasToImage = function (canvas) {
        var img = new Image();
        var dataURL = canvas.toDataURL('image/png');
        img.src = dataURL;
        return img;
    };

    var canvasShiftImage = function (oldCanvas, shiftAmt) {
        shiftAmt = parseInt(shiftAmt) || 0;
        if (!shiftAmt) { return oldCanvas; }

        var newCanvas = document.createElement('canvas');
        newCanvas.height = oldCanvas.height - shiftAmt;
        newCanvas.width = oldCanvas.width;
        var ctx = newCanvas.getContext('2d');

        var img = canvasToImage(oldCanvas);
        ctx.drawImage(img, 0, shiftAmt, img.width, img.height, 0, 0, img.width, img.height);
        return newCanvas;
    };

    var canvasToImageSuccess = function (canvas) {
        var pdf = new jsPDF('l', 'px'),
            pdfInternals = pdf.internal,
            pdfPageSize = pdfInternals.pageSize,
            pdfScaleFactor = pdfInternals.scaleFactor,
            pdfPageWidth = pdfPageSize.width,
            pdfPageHeight = pdfPageSize.height,
            totalPdfHeight = 0,
            htmlPageHeight = canvas.height,
            htmlScaleFactor = canvas.width / (pdfPageWidth * pdfScaleFactor),
            safetyNet = 0;

        while (totalPdfHeight < htmlPageHeight && safetyNet < 15) {
            var newCanvas = canvasShiftImage(canvas, totalPdfHeight);
            pdf.addImage(newCanvas, 'png', 0, 0, pdfPageWidth, 0, null, 'NONE');

            totalPdfHeight += (pdfPageHeight * pdfScaleFactor * htmlScaleFactor);

            if (totalPdfHeight < htmlPageHeight) {
                pdf.addPage();
            }
            safetyNet++;
        }

        pdf.save(address.innerHTML + 'test.PDF');
    };

    html2canvas(source, {
        onrendered: function (canvas) {
            canvasToImageSuccess(canvas);
        }
    });
}
Please follow the answer at the GitHub link below:
https://github.com/MrRio/jsPDF/issues/862
I've seen this question asked here already: Web Audio API Memory Leaks on Mobile Platforms, but there doesn't seem to be any response to it. I've tried lots of different variations (setting variables to null once I'm finished with them, or declaring the variables within scope), but it appears that the AudioContext (or OfflineAudioContext) is not being garbage collected after each operation. I'm using this with PhoneGap on iOS, so the browser is Safari. Any ideas how to solve this memory leak?
Here is my code:
function readVocalsToBuffer(file) {
    console.log('readVocalsToBuffer');
    var reader = new FileReader();
    reader.onloadend = function (evt) {
        var x = audioContext.decodeAudioData(evt.target._result, function (buffer) {
            if (!buffer) {
                console.log('error decoding file to Audio Buffer');
                return;
            }
            window.voiceBuffer = buffer;
            loadBuffers();
        });
    };
    reader.readAsArrayBuffer(file);
}

function loadBuffers() {
    console.log('loadBuffers');
    try {
        var bufferLoader = new BufferLoader(
            audioContext,
            [
                "." + window.srcSong
            ],
            createOffLineContext
        );
        bufferLoader.load();
    }
    catch (e) {
        console.log(e.message);
    }
}

function createOffLineContext(bufferList) {
    console.log('createOfflineContext');
    offline = new webkitOfflineAudioContext(2, window.voiceBuffer.length, 44100);

    var vocalSource = offline.createBufferSource();
    vocalSource.buffer = window.voiceBuffer;
    vocalSource.connect(offline.destination);

    var backing = offline.createBufferSource();
    backing.buffer = bufferList[0];
    backing.connect(offline.destination);

    vocalSource.start(0);
    backing.start(0);

    offline.oncomplete = function (ev) {
        vocalSource.stop(0);
        backing.stop(0);
        vocalSource.disconnect(0);
        backing.disconnect(0);
        delete vocalSource;
        delete backing;
        delete window.voiceBuffer;

        window.renderedFile = ev.renderedBuffer;
        var bufferR = ev.renderedBuffer.getChannelData(0);
        var bufferL = ev.renderedBuffer.getChannelData(1);
        var interleaved = interleave(bufferL, bufferR);
        var dataview = encodeWAV(interleaved);
        window.audioBlob = new Blob([dataview], { type: 'Wav' });
        saveFile();
    };
    offline.startRendering();
}

function interleave(inputL, inputR) {
    console.log('interleave');
    var length = inputL.length + inputR.length;
    var result = new Float32Array(length);
    var index = 0,
        inputIndex = 0;
    while (index < length) {
        result[index++] = inputL[inputIndex];
        result[index++] = inputR[inputIndex];
        inputIndex++;
    }
    return result;
}

function saveFile() {
    offline = null;
    console.log('saveFile');
    delete window.renderedFile;
    window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, onFSSuccess, fail);
}
I've been trying to save a file from a server to an iOS device using URLStream, but it doesn't work (it works fine on Android devices). I tried using documentsDirectory, but that doesn't work either. I've used many other methods, such as File.download, but none of them work. Any help, please?
I am using Flash Pro CS6.
Script sample:
import flash.filesystem.*;
import flash.events.ProgressEvent;

var urlString:String = "http://example.sample.mp3";
var urlReq:URLRequest = new URLRequest(urlString);
var urlStream:URLStream = new URLStream();
var fileData:ByteArray = new ByteArray();

urlStream.addEventListener(Event.COMPLETE, loaded);
urlStream.addEventListener(ProgressEvent.PROGRESS, progressHandler);
urlStream.load(urlReq);

function loaded(event:Event):void {
    urlStream.readBytes(fileData, 0, urlStream.bytesAvailable);
    writeAirFile();
}

function writeAirFile():void {
    var file:File = File.applicationStorageDirectory.resolvePath("sample.mp3");
    var fileStream:FileStream = new FileStream();
    fileStream.open(file, FileMode.WRITE);
    fileStream.writeBytes(fileData, 0, fileData.length);
    fileStream.close();
    trace("The file is written.");
}

function progressHandler(event:Event):void {
    trace("progressHandler: " + event);
}
Tested on iOS
_urlString = "http://example.sample.mp3";
_urlReq = new URLRequest(_urlString);
_urlStream = new URLStream();
_urlStream.addEventListener(flash.events.ProgressEvent.PROGRESS, progressHandler, false, 0, true);
_urlStream.addEventListener(flash.events.Event.COMPLETE, saveFileToDisc, false, 0, true);
_urlStream.addEventListener(flash.events.IOErrorEvent.IO_ERROR, errorHandler, false, 0, true);
_urlStream.load(_urlReq);
private function progressHandler(evt:flash.events.ProgressEvent):void {
trace("progress: " + event.target.progress);
}
private function errorHandler(evt:flash.events.IOErrorEvent):void {
//do something
}
private function saveFileToDisc(event:flash.events.Event):void {
_fileData = new ByteArray();
_urlStream.readBytes(_fileData, 0, _urlStream.bytesAvailable);
_file = File.applicationStorageDirectory.resolvePath("sample.mp3");
_file.preventBackup = true;
_writeFileStream.addEventListener(flash.events.IOErrorEvent.IO_ERROR, filestreamErrorHandler, false, 0, true);
_writeFileStream.addEventListener(flash.events.Event.CLOSE, fileSaved, false, 0, true);
_writeFileStream.openAsync(_file, FileMode.UPDATE);
_writeFileStream.writeBytes(_fileData, 0, _fileData.length);
_writeFileStream.close();
}
private function filestreamErrorHandler(evt:flash.events.IOErrorEvent):void {
//do something
}
private function fileSaved(closeEvent:flash.events.Event):void {
//trace("file saved");
_writeFileStream.removeEventListener(flash.events.IOErrorEvent.IO_ERROR, filestreamErrorHandler);
_writeFileStream.removeEventListener(flash.events.Event.CLOSE, fileSaved);
_urlString = null;
_urlReq = null;
_urlStream = null;
_file = null;
_fileData.length = 0;
_fileData = null;
}
Two obvious problems:
This line: urlStream..addEventListener(ProgressEvent.PROGRESS, progressHandler); has a double dot (' .. '), which is wrong.
Also, you never write anything into your ByteArray.
Set the 3rd parameter to 0 to make sure the entire data is read:
urlStream.readBytes(fileData, 0, 0); // 0 = read all
Besides, URLStream is not really meant for that type of operation (it is meant to stream the loading of binary data). I personally do this with URLLoader (loading in binary format) and everything works perfectly, saving a copy to a folder; a sketch of that approach follows.
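Here is a minimal sketch of that URLLoader approach, assuming the same sample.mp3 URL and target file name from the question; the handler names are made up for illustration.
import flash.events.Event;
import flash.events.IOErrorEvent;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.net.URLLoader;
import flash.net.URLLoaderDataFormat;
import flash.net.URLRequest;
import flash.utils.ByteArray;

var loader:URLLoader = new URLLoader();
loader.dataFormat = URLLoaderDataFormat.BINARY; // loader.data will be a ByteArray
loader.addEventListener(Event.COMPLETE, onDownloaded);
loader.addEventListener(IOErrorEvent.IO_ERROR, onDownloadError);
loader.load(new URLRequest("http://example.sample.mp3"));

function onDownloaded(e:Event):void {
    var bytes:ByteArray = URLLoader(e.target).data as ByteArray;
    var target:File = File.applicationStorageDirectory.resolvePath("sample.mp3");
    var stream:FileStream = new FileStream();
    stream.open(target, FileMode.WRITE); // synchronous write of the whole download
    stream.writeBytes(bytes, 0, bytes.length);
    stream.close();
    trace("Saved to " + target.nativePath);
}

function onDownloadError(e:IOErrorEvent):void {
    trace("Download failed: " + e.text);
}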
I have an AIR app for iOS that I have been developing. I am trying to capture a picture, save the file to the storage directory (not the Camera Roll), and save the file name in an SQLite db.
I have tried so many different variations of this, but when it comes to writing the file stream to save, the app hangs. I'm testing on an iPad 3. Does ANYONE have a suggestion? This has been driving me nuts for days. I have searched the web but I am stumped.
public var temp:File; // File object to save the name in the database

protected function selectPicture():void
{
    myCam = new CameraUI();
    myCam.addEventListener(MediaEvent.COMPLETE, onComplete);
    myCam.launch(MediaType.IMAGE);
}

protected function onComplete(event:MediaEvent):void {
    //imageProblem.source = event.data.file.url;
    var cameraUI:CameraUI = event.target as CameraUI;
    var mediaPromise:MediaPromise = event.data;
    var mpLoader:Loader = new Loader();
    mpLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    mpLoader.loadFilePromise(mediaPromise);
}

private function onMediaPromiseLoaded(e:Event):void
{
    var mpLoaderInfo:LoaderInfo = e.target as LoaderInfo;
    mpLoaderInfo.removeEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    this.imageProblem.source = mpLoaderInfo.loader;

    var stream:FileStream = new FileStream();
    stream.addEventListener(Event.COMPLETE, showComplete);
    stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
    try {
        this.messages.text = "Starting";
        stream.open(temp, FileMode.WRITE);
        stream.writeBytes(mpLoaderInfo.bytes);
        stream.close();
    } catch (err:Error) {
        this.messages.text = err.message;
    }
}

protected function showError(e:IOErrorEvent):void {
    this.messages.text = e.toString();
}

protected function showComplete(e:Event):void {
    this.messages.text = "Completed Writing";
    this.imgName.text = temp.url;
    imagefile = temp;
    deleteFlag = 1;
}
The application hangs because you are trying to use the file operation in synchronous mode.
You need to use an asynchronous file operation instead:
stream.openAsync( temp, FileMode.WRITE );
Try it like this:
var stream:FileStream = new FileStream();
stream.addEventListener(Event.COMPLETE, showComplete);
stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
stream.openAsync( temp, FileMode.WRITE );
stream.writeBytes(mpLoaderInfo.bytes);
stream.close();
Note that when using the async operation you do not need try/catch. For error handling, listen for IOErrorEvent; it will catch any error that occurs. A fuller sketch of the async write pattern is shown below.
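One more detail worth flagging: for an asynchronous write, FileStream dispatches Event.CLOSE once close() has finished flushing the data, while Event.COMPLETE is only dispatched when an asynchronous read reaches the end of the file, so a CLOSE listener is the more reliable completion signal. A minimal sketch of that async write pattern, reusing temp, mpLoaderInfo, showError and showComplete from the question:
var stream:FileStream = new FileStream();
stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
// Event.CLOSE fires after close() has flushed the queued bytes to disk.
stream.addEventListener(Event.CLOSE, showComplete);
stream.openAsync(temp, FileMode.WRITE);
stream.writeBytes(mpLoaderInfo.bytes); // queued; written on a background thread
stream.close();                        // completes asynchronously, then dispatches Event.CLOSE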
I finally got this to work. I've added comments in the code below to explain why it wasn't working.
public var temp:File;

protected function selectPicture():void
{
    myCam = new CameraUI();
    myCam.addEventListener(MediaEvent.COMPLETE, onComplete);
    myCam.launch(MediaType.IMAGE);
}

protected function onComplete(event:MediaEvent):void {
    //imageProblem.source = event.data.file.url;
    var cameraUI:CameraUI = event.target as CameraUI;
    var mediaPromise:MediaPromise = event.data;
    var mpLoader:Loader = new Loader();
    mpLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    mpLoader.loadFilePromise(mediaPromise);
}

private function onMediaPromiseLoaded(e:Event):void
{
    var mpLoaderInfo:LoaderInfo = e.target as LoaderInfo;
    mpLoaderInfo.removeEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    this.imageProblem.source = mpLoaderInfo.loader;

    /// Here was the solution
    var bitmapDataA:BitmapData = new BitmapData(mpLoaderInfo.width, mpLoaderInfo.height);
    bitmapDataA.draw(mpLoaderInfo.content, null, null, null, null, true);
    /// I had to cast the loaderInfo as BitmapData
    var bitmapDataB:BitmapData = resizeimage(bitmapDataA, int(mpLoaderInfo.width / 4), int(mpLoaderInfo.height / 4)); // function to shrink the image

    var c:CameraRoll = new CameraRoll();
    c.addBitmapData(bitmapDataB);

    var now:Date = new Date();
    var f:File = File.applicationStorageDirectory.resolvePath("IMG" + now.seconds + now.minutes + ".jpg");
    var stream:FileStream = new FileStream();
    stream.open(f, FileMode.WRITE);
    // Then had to redraw and encode as a JPEG before writing the file
    var bytes:ByteArray = new ByteArray();
    bytes = bitmapDataB.encode(new Rectangle(0, 0, int(mpLoaderInfo.width / 4), int(mpLoaderInfo.height / 4)), new JPEGEncoderOptions(80), bytes);
    stream.writeBytes(bytes, 0, bytes.bytesAvailable);
    stream.close();

    this.imgName.text = f.url;
    imagefile = f;
    deleteFlag = 1;
}
Is there a newer blob detection/tracking library?
Is it not a good library?
Isn't legacy supposed to be old and not useful code?
Does anybody know?
Here is a newer blob detector:
http://opencv.itseez.com/modules/features2d/doc/common_interfaces_of_feature_detectors.html#SimpleBlobDetector
Here is the code I used to track blobs with Emgu CV version 3.1:
//MCvFont font = new MCvFont(Emgu.CV.CvEnum.FontFace.HersheySimplex, 0.5, 0.5);
using (CvTracks tracks = new CvTracks())
using (ImageViewer viewer = new ImageViewer())
using (Capture capture = new Capture())
using (Mat fgMask = new Mat())
{
    //BGStatModel<Bgr> bgModel = new BGStatModel<Bgr>(capture.QueryFrame(), Emgu.CV.CvEnum.BG_STAT_TYPE.GAUSSIAN_BG_MODEL);
    BackgroundSubtractorMOG2 bgModel = new BackgroundSubtractorMOG2(0, 0, true);
    //BackgroundSubstractorMOG bgModel = new BackgroundSubstractorMOG(0, 0, 0, 0);

    capture.ImageGrabbed += delegate(object sender, EventArgs e)
    {
        Mat frame = new Mat();
        capture.Retrieve(frame);
        bgModel.Apply(frame, fgMask);

        using (CvBlobDetector detector = new CvBlobDetector())
        using (CvBlobs blobs = new CvBlobs())
        {
            detector.Detect(fgMask.ToImage<Gray, Byte>(), blobs);
            blobs.FilterByArea(100, int.MaxValue);
            tracks.Update(blobs, 20.0, 10, 0);

            Image<Bgr, Byte> result = new Image<Bgr, byte>(frame.Size);
            using (Image<Gray, Byte> blobMask = detector.DrawBlobsMask(blobs))
            {
                frame.CopyTo(result, blobMask);
            }
            //CvInvoke.cvCopy(frame, result, blobMask);

            foreach (KeyValuePair<uint, CvTrack> pair in tracks)
            {
                if (pair.Value.Inactive == 0) //only draw the active tracks.
                {
                    CvBlob b = blobs[pair.Value.BlobLabel];
                    Bgr color = detector.MeanColor(b, frame.ToImage<Bgr, Byte>());
                    result.Draw(pair.Key.ToString(), pair.Value.BoundingBox.Location, Emgu.CV.CvEnum.FontFace.HersheySimplex, 0.5, color);
                    result.Draw(pair.Value.BoundingBox, color, 2);
                    Point[] contour = b.GetContour();
                    result.Draw(contour, new Bgr(0, 0, 255), 1);
                }
            }

            viewer.Image = frame.ToImage<Bgr, Byte>().ConcateVertical(fgMask.ToImage<Bgr, Byte>().ConcateHorizontal(result));
        }
    };
    capture.Start();
    viewer.ShowDialog();
}