Problem loading an AS2 movie inside an AS3 movie - ActionScript

My scenario is that I have a host SWF written in AS3 and need to load an AS2 movie into it. I need to control the AS2 movie to keep the host AS3 SWF and the AS2 SWF in sync, and to do that I am using a LocalConnection. The problem I'm facing is that when the AS2 movie is loaded via the AS3 movie, I don't see the AS2 events being fired. For example, I put a trace in onEnterFrame in the AS2 movie: standalone, I can see the traces, but when it is loaded from the AS3 movie, I don't see any of them.
I have 3 movies:
1. the loader AS3 movie
2. an AS2 wrapper movie (generic for all AS2 movies; it holds the LocalConnection code, receives messages from the AS3 side, and internally does whatever we want)
3. the actual AS2 movie to be loaded into the AS3 one
Can anyone shed some light on this?

How are you loading (3) into the SWF? There's a known limitation with AVM1Movie (AS2 SWFs loaded into AS3 SWFs are exposed as AVM1Movie objects) and its loading of additional movies: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/display/AVM1Movie.html
Have you tried something like gskinner's SWFBridge? http://gskinner.com/blog/archives/2007/07/swfbridge_easie.html
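For reference, here's a minimal sketch of the kind of LocalConnection bridge the wrapper would need; the connection name, method name, and movie path are illustrative, not taken from your setup. The AS3 host sends commands:

// AS3 host: send a command to the AS2 wrapper over a LocalConnection.
import flash.net.LocalConnection;

var lc:LocalConnection = new LocalConnection();
lc.send("_as2bridge", "gotoFrame", 10); // "_as2bridge"/"gotoFrame" are placeholder names

And the AS2 wrapper (movie 2) listens:

// AS2 wrapper: receive commands from the AS3 host.
var lc:LocalConnection = new LocalConnection();
lc.gotoFrame = function (frame:Number):Void {
    _root.actualMovie.gotoAndPlay(frame); // _root.actualMovie is a placeholder path to (3)
};
lc.connect("_as2bridge");

The leading underscore in the connection name keeps Flash from prefixing it with the superdomain, so both sides can use the same literal string.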

Related

Does audio.loadSound load audio files synchronously?

I'm working on a game that depends heavily on the audio API of the Corona SDK. The game needs to load a couple of sounds (WAVs) and a relatively big background music file (15+ MB). I don't want to stream the files because of a statement that I read in the Corona guides:
Note that streamed files may have a slightly higher latency cost and CPU cost than files loaded with audio.loadSound().
I'm using the composer API and I'm planning to build a loading screen based on its event cycle.
My question is: can I rely on the audio API here? If I put all of my loadSound calls in the create event handler, are they all going to load synchronously, so that the show event is dispatched only after all of the audio files have loaded? Or should I use a different approach for my loading screen?
Yes, I believe they are loaded synchronously.
In my apps with a larger background music file, the loading screen takes longer on slower devices, so loadSound blocks until the file is loaded.
I would imagine this is the case, since there is no mechanism to query loading progress, nor a callback for when the audio is loaded, which you would expect if it were asynchronous.
But don't take my word for it; test it.
PS: Your solution does seem solid, though; that's about what I do.
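For what it's worth, here's a minimal sketch of that pattern, assuming loadSound really does block; the scene and file names are placeholders:

-- loading.lua: a composer scene that loads all audio in create
local composer = require("composer")
local scene = composer.newScene()

local sounds = {}

function scene:create(event)
    -- audio.loadSound() decodes the whole file into memory; if it blocks,
    -- every handle below is ready before composer dispatches the show event.
    sounds.click = audio.loadSound("click.wav")
    sounds.jump  = audio.loadSound("jump.wav")
    sounds.music = audio.loadSound("background.mp3") -- the big file: expect a pause here
end

function scene:show(event)
    if event.phase == "did" then
        audio.play(sounds.music, { loops = -1 })
    end
end

scene:addEventListener("create", scene)
scene:addEventListener("show", scene)
return scene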

MP3 stream recording: flat lines in waveform, playback glitches

I record audio streams as a byte sequence for playback via https://github.com/mattgallagher/AudioStreamer.
If I play the MP3 stream from the URL it plays fine, but if I play it from the local file it has glitches. I opened the file in an audio editor and saw flat lines in place of the glitches in the recording (see screenshot); when I remove these flat lines in the editor, the recording plays fine.
I also opened the recording in Audacity, which opens it without these lines and plays the audio without glitches, but when I open the recording in my app, or in any Mac audio player, it plays with the glitches.
My recording algorithm is simple: I just append the bytes from the stream to an NSMutableData and write it to a file, then open it via the same AudioStreamer, as a stream but from the local file system.
I also save the structures needed for playback, such as the AudioStreamBasicDescription, because without them the AudioQueue will not start playing.
As I understand it, I failed to save some structure that is needed to cut out these empty pieces during playback, because when I play from the online URL stream it works without glitches.
I found the bug: when I write the stream to a file, I also write the stream metadata, such as the song name. Once I started writing only the audio data, everything worked fine.
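For anyone hitting the same thing: if the metadata is SHOUTcast/Icecast-style in-band metadata, the server interleaves a metadata block into the audio every icy-metaint bytes, and those blocks are exactly what show up as flat lines. A minimal Objective-C sketch of stripping them before writing to the file follows; it assumes you requested metadata (Icy-MetaData: 1), read the icy-metaint response header, and feed every received buffer through it. The class and method names are made up for illustration:

// Strips in-band ICY metadata so only raw MP3 frames are recorded.
@interface ICYStrippingRecorder : NSObject {
    NSMutableData *_audioData;   // pure audio bytes, safe to write to disk
    NSUInteger _metaInterval;    // audio bytes between metadata blocks (icy-metaint)
    NSUInteger _audioBytesLeft;  // audio bytes remaining before the next metadata block
    NSUInteger _metaBytesLeft;   // metadata bytes still to skip
}
- (instancetype)initWithMetaInterval:(NSUInteger)metaInterval;
- (void)appendStreamBytes:(const uint8_t *)bytes length:(NSUInteger)length;
@end

@implementation ICYStrippingRecorder

- (instancetype)initWithMetaInterval:(NSUInteger)metaInterval
{
    if ((self = [super init])) {
        _audioData = [[NSMutableData alloc] init];
        _metaInterval = metaInterval; // value of the icy-metaint response header
        _audioBytesLeft = metaInterval;
    }
    return self;
}

- (void)appendStreamBytes:(const uint8_t *)bytes length:(NSUInteger)length
{
    NSUInteger i = 0;
    while (i < length) {
        if (_metaBytesLeft > 0) {
            // Inside a metadata block ("StreamTitle=...;"): skip, don't record.
            NSUInteger skip = MIN(_metaBytesLeft, length - i);
            _metaBytesLeft -= skip;
            i += skip;
        } else if (_audioBytesLeft == 0) {
            // A single length byte precedes each metadata block: size = byte * 16.
            _metaBytesLeft = (NSUInteger)bytes[i] * 16; // 0 means no metadata this time
            _audioBytesLeft = _metaInterval;
            i += 1;
        } else {
            // Plain audio data: this is the only part that goes into the file.
            NSUInteger chunk = MIN(_audioBytesLeft, length - i);
            [_audioData appendBytes:bytes + i length:chunk];
            _audioBytesLeft -= chunk;
            i += chunk;
        }
    }
}

@end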

Reuse audio player

I'm just playing around with trigger.io and need some clarification on native component usage. This question is specifically about the audio player, but I assume the other APIs work in the same manner, so it's probably valid for all the APIs.
To play an audio file the documentation states:
forge.file.getLocal("music.mp3", function (file) {
    forge.media.createAudioPlayer(file, function (player) {
        player.play();
    });
});
If you have multiple audio files that the user can play within the app, then with the above code a new audio player is created every time they play a file. This seems to be by design, since you can have multiple audio files playing at once.
Surely, over time, as the person uses the app, this is going to consume a lot of memory? There doesn't seem to be any way to reuse an existing player and replace the current audio file with a new one. Is this possible once you have the "player" instance? Or is there a way to dispose of the current instance when the user stops the audio, when it's finished, or when the user navigates away from that particular audio item?
Thanks
Tyrone.
Good spot, this is actually just an oversight in our documentation: the player instance has another method, player.destroy(), which will remove the related native instance.
I'll make sure the API docs are updated in the future.
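So a minimal sketch of the reuse pattern would be to keep one reference around and destroy it before creating the next player; the currentPlayer variable and playFile helper below are illustrative, not part of the forge API:

var currentPlayer = null;

function playFile(name) {
    // Dispose of the previous native player, if any, before creating a new one.
    if (currentPlayer) {
        currentPlayer.destroy();
        currentPlayer = null;
    }
    forge.file.getLocal(name, function (file) {
        forge.media.createAudioPlayer(file, function (player) {
            currentPlayer = player;
            player.play();
        });
    });
}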

Play socket-streamed h.264 movie on iOS using AVFoundation

I'm working on a small iPhone app which streams movie content over a network connection using regular sockets. The video is in H.264 format. I'm, however, having difficulties with playing/decoding the data. I've been considering using FFmpeg, but the license makes it unsuitable for the project. I've been looking into Apple's AVFoundation framework (AVPlayer in particular), which seems to be able to handle H.264 content; however, I'm only able to find methods to initiate the movie using a URL, not by providing a memory buffer streamed from the network.
I’ve been doing some tests to make this happen anyway, using the following approaches:
Play the movie using a regular AVPlayer. Every time data is received on the network, it's written to a file using fopen in append mode. The AVPlayer's asset is then reloaded/recreated with the updated data. There seem to be two issues with this approach: firstly, the screen goes black for a short moment while the first asset is unloaded and the new one loaded. Secondly, I do not know exactly where playback stopped, so I'm unsure how to find the right place to start playing the new asset from.
The second approach is to write the data to the file as in the first approach, but with the difference that the data is loaded into a second asset. An AVQueuePlayer is then used, with the second asset inserted/queued in the player and played once buffering is done. The first asset can then be unloaded without a black screen. However, with this approach it's even more troublesome (than in the first approach) to find out where to start playing the new asset.
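For concreteness, the queueing in the second approach looks roughly like this (the file path is a placeholder):

// Build the first item from the growing local file and start a queue player.
NSURL *localFileURL = [NSURL fileURLWithPath:@"/tmp/stream.mp4"]; // placeholder path
AVPlayerItem *firstItem = [AVPlayerItem playerItemWithURL:localFileURL];
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:@[firstItem]];
[queuePlayer play];

// Later, after more data has been appended to the file: build a second asset
// from the same file and queue it behind the item that is currently playing.
AVURLAsset *updatedAsset = [AVURLAsset URLAssetWithURL:localFileURL options:nil];
AVPlayerItem *nextItem = [AVPlayerItem playerItemWithAsset:updatedAsset];
if ([queuePlayer canInsertItem:nextItem afterItem:nil]) {
    [queuePlayer insertItem:nextItem afterItem:nil]; // nil appends to the end of the queue
}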
Has anyone done something like this and made it work? Is there a proper way of doing this using AVFoundation?
The official way to do this is HTTP Live Streaming, which supports multiple quality levels (among other things) and automatically switches between them (e.g. if the user moves from WiFi to cellular).
You can find the docs here: Apple HTTP Live Streaming docs
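The client side then becomes trivial. A minimal sketch of playing an HLS stream with AVPlayer follows; the playlist URL is a placeholder, and the server is assumed to segment the H.264 stream into an .m3u8 playlist (e.g. with Apple's segmenter tools):

#import <AVFoundation/AVFoundation.h>

// Point AVPlayer at an HLS playlist served over HTTP; the player handles
// buffering, decoding, and quality switching itself.
NSURL *playlistURL = [NSURL URLWithString:@"http://example.com/stream/index.m3u8"];
AVPlayer *player = [AVPlayer playerWithURL:playlistURL];

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds; // assumes this runs inside a view controller
[self.view.layer addSublayer:playerLayer];

[player play];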

ActionScript - Save webcam video to local disk without using AIR or FMS

I would like to save webcam-captured video to the local disk using ActionScript. The application runs in a standalone Flash Player 10. I can save pictures from ByteArrays using FileReference.save(), but I can't find a way to do this with video.
There is a nice implementation of this using AS and AIR at http://www.joristimmerman.be/wordpress/2008/12/18/flvrecorder-record-to-flv-using-air/, but I don't want to require installing AIR before running the app. Any ideas?
Thanks, Basti
Flash is designed to be secure, so you won't be able to save anything but SharedObject data to local storage.
I don't think it is possible, up to the current Flash Player (10), to save video directly to the local disk. You can only save pictures.
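To illustrate the picture case, which does work in Flash Player 10: a minimal sketch of grabbing a single webcam frame and handing it to the user via FileReference.save(). JPGEncoder here comes from the open-source as3corelib, not from the Player itself, and the dimensions and quality are placeholders:

import flash.display.BitmapData;
import flash.events.MouseEvent;
import flash.media.Camera;
import flash.media.Video;
import flash.net.FileReference;
import flash.utils.ByteArray;
import com.adobe.images.JPGEncoder; // from as3corelib

var camera:Camera = Camera.getCamera();
var video:Video = new Video(320, 240);
video.attachCamera(camera);
addChild(video);

// FileReference.save() must be called from a user-initiated event, e.g. a click.
function saveSnapshot(event:MouseEvent):void {
    var frame:BitmapData = new BitmapData(video.width, video.height);
    frame.draw(video); // copy the current webcam frame
    var jpg:ByteArray = new JPGEncoder(80).encode(frame);
    new FileReference().save(jpg, "snapshot.jpg");
}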
