XNA: The given key was not present in the dictionary

Hello, I'm working with XNA and have a little problem with XACT and playing some background music. Basically, I have followed this guide http://rbwhitaker.wikidot.com/playing-sound to play sound. I have declared all the variables:
private AudioEngine audioEngine;
private WaveBank waveBank;
private SoundBank soundBank;
and in LoadContent I have loaded their sound information:
audioEngine = new AudioEngine("Content\\XactMusic\\inGameMusic.xgs");
waveBank = new WaveBank(audioEngine, "Content\\XactMusic\\Wave Bank.xwb");
soundBank = new SoundBank(audioEngine, "Content\\XactMusic\\Sound Bank.xsb");
In the XACT tool I have created a wave bank and a sound bank and have loaded a .wav file.
The problem is that when I try to compile, an error pops up:
Building content threw KeyNotFoundException: The given key was not present in the dictionary.
What am I doing wrong??

Related

DASH Streaming with ExoPlayer (Android Studio)

I'm streaming video from a URL stored in Firebase Storage, and I'm using the following code to stream the video with ExoPlayer:
BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
LoadControl loadControl = new CustomLoadControl();
exoPlayer = ExoPlayerFactory.newSimpleInstance(SafetyTVHomeActivity.this, trackSelector, loadControl);
Uri videoUri = Uri.parse(videourl);
DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
MediaSource mediaSource = new ExtractorMediaSource(videoUri, dataSourceFactory, extractorsFactory, null, null);
exoPlayerView.setPlayer(exoPlayer);
exoPlayer.addListener(new PlayerEventListener());
exoPlayer.prepare(mediaSource, false, false);
exoPlayer.seekTo(0, 0);
Everything is fine and the video gets streamed, but the problem I'm facing is that the initial load time before the video starts is too long (5+ seconds). I want to reduce the initial loading time to 0-2 seconds. Is there a way to achieve this using ExoPlayer?
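For reference, here is a sketch of what a buffer-tuned LoadControl could look like (assuming ExoPlayer 2.9+, where DefaultLoadControl.Builder and setBufferDurationsMs are available; the class name LowLatencyLoadControl and the millisecond values are only illustrative, not my actual CustomLoadControl). Smaller "buffer for playback" values let playback start sooner, at the cost of more rebuffering on slow connections.
import com.google.android.exoplayer2.DefaultLoadControl;
import com.google.android.exoplayer2.LoadControl;

public class LowLatencyLoadControl {
    // Sketch only: illustrative values, not tuned for any particular network.
    public static LoadControl create() {
        return new DefaultLoadControl.Builder()
                .setBufferDurationsMs(
                        /* minBufferMs= */ 2000,                      // keep at least ~2 s buffered
                        /* maxBufferMs= */ 15000,                     // cap the buffer at ~15 s
                        /* bufferForPlaybackMs= */ 500,               // start playback after ~0.5 s of media
                        /* bufferForPlaybackAfterRebufferMs= */ 1000) // resume ~1 s after a rebuffer
                .createDefaultLoadControl();
    }
}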
I also tried using a DASH media source in ExoPlayer with the code below:
Uri videoUri = Uri.parse(videourl);
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory(Util.getUserAgent(SafetyTVHomeActivity.this, "app-name"));
MediaSource mediaSource = new DashMediaSource.Factory(dataSourceFactory).createMediaSource(videoUri);
exoPlayer = ExoPlayerFactory.newSimpleInstance(this);
exoPlayer.prepare(mediaSource);
exoPlayerView.setPlayer(exoPlayer);
exoPlayer.addListener(new PlayerEventListener());
I used the same Firebase Storage URL in the DASH media source, but I'm getting the following error:
ExoPlayerImplInternal: Source error.
com.google.android.exoplayer2.ParserException: org.xmlpull.v1.XmlPullParserException: Unexpected token (position:TEXT G#��B�%���������...#2:79 in java.io.InputStreamReader#c587547) at com.google.android.exoplayer2.source.dash.manifest.DashManifestParser.parse(DashManifestParser.java:105) at........
Could anyone please help me work around this?
My main objective is to stream video from a URL with an initial load time of 0-2 seconds (the way TikTok does it). Any help would be really helpful.

Sound Effect Won't Play (Xcode/Swift)

I am following an old tutorial and I think that, with the changes in Xcode and Swift, this code is no longer usable, but I am not sure. I would love some help.
Declaring the constant for the sound effect:
let cannonSound = SKAction.playSoundFileNamed("cannon.wav", waitForCompletion: false)
Calling the sound effect within my function:
let hotdogSequence = SKAction.sequence([cannonSound, moveHotdog, deleteHotdog])
hotdog.run(hotdogSequence)
For more info: I am using SpriteKit in Xcode, and this code is contained in the GameScene.swift file.
Update:
The error I receive is
2017-03-30 00:52:43.631 Ballpark Weiner[95999:1983181] SKAction: Error loading sound resource: "cannon.wav"
The game doesn't crash; it just plays no sound.
This message usually means that the file cannot be found in your project or might be corrupt.
You should check that the file is actually copied into your project and is spelled correctly. It's case sensitive, so if the actual file is called "Cannon.wav" it will not work.
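If the message persists, one quick way to confirm that the file really is in the app bundle (a minimal sketch, assuming Swift 3 or later; run it e.g. from didMove(to:)) is to look it up before creating the SKAction:
// Verify the sound file is bundled before relying on SKAction.playSoundFileNamed.
if let url = Bundle.main.url(forResource: "cannon", withExtension: "wav") {
    print("Found sound file at \(url)")
} else {
    print("cannon.wav is not in the app bundle - check target membership and spelling")
}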
Hope this helps

Is it possible to save an mp3 file downloaded from a server without converting it to a ByteArray? (AS3)

After searching around the net, I couldn't find a way to download an mp3 from a server, save it in local storage (iPad), and then load and play it, other than converting it to a ByteArray, saving it as an .mp3, and then loading it and converting it back to an mp3 so it can be played in the Flash application.
The problem is that, although this method works fine, the uncompressed files (in ByteArray format) saved in local storage are too heavy, and I suspect the app is wasting memory.
My question is: is there any way to save the mp3 directly, without any conversion, as a properly playable mp3? I can't use methods like download() or save() from FileReference.
Lots of thanks!!
SOLUTION!! (WORKING PERFECTLY):
I kept looking for a way to do it around the net, and some forums gave me a clue about what I was searching for. It turned out to be pretty simple, but I spent almost two weeks figuring it out... here is my code:
var queue:LoaderMax = new LoaderMax({name:"mainQueue", onProgress:progressHandler, onComplete:completeHandler, onError:errorHandler});
queue.append( new DataLoader("http://" + url, {name:"example.mp3", format:"binary"}));
queue.load();
// Complete Event:
private function completeHandler():void {
var file:File = new File(yourLocation); // appData, e.g. the application storage directory
var fr:FileStream = new FileStream();
fr.open(file.resolvePath(nameIn), FileMode.WRITE);
fr.writeObject(LoaderMax.getContent("example.mp3"));
fr.close();
fr = null;
// Now the mp3 is saved in local storage, we load it as Sound object so we can play it.
var loader:MP3Loader;
loader = new MP3Loader(yourLocation, {autoPlay:false});
loader.addEventListener(Event.COMPLETE, function(e:Event):void {
loader.content.play(); // This is the only line I have not verified, but I am fairly sure you can do something like this to play the sound. For example, I store loader.content in an Array of Sound objects and can later play any sound I want.
loader.dispose(true); // This is very important to free memory. You should do the same with the queue once all items are downloaded.
});
loader.load();
}
I hope this helps a lot of people!

No sound on BlackBerry with OpenFL

I'm trying to use Haxe (OpenFL) for BlackBerry development.
I tested the PlayingSound sample and it works.
But when I try to load a sound from a URL, it doesn't work.
Here is my code:
public function PlaySong(url:String):Void{
var _url:URLRequest = new URLRequest(url);
if (_soundChannel != null) _soundChannel.stop();
_song = new Sound();
_song.load(_url); // <-- does not work
//_song = Assets.getSound("assets/stars.mp3"); // <-- works
_soundChannel =_song.play(0);
}
On the Flash target this code plays my sound from the URL, but when I deploy the app to my device, there is no sound. On the device, sound plays correctly only if I load it from the assets folder.
Also, I see that the soundChannel position is always 0 (on the device).
I also tried loading the sound with a loader first and then playing it when loading completes, but that didn't help either.
Help me, please.
PS Sorry for my English.
Have you tried loading it using this:
var loader:URLLoader = new URLLoader();
loader.dataFormat = URLLoaderDataFormat.BINARY; // we need the raw bytes, not text
then try
loader.addEventListener(Event.COMPLETE, onComplete);
loader.load(new URLRequest("url"));
function onComplete(e:Event):Void
{
    var bytes:ByteArray = loader.data;
    var sound:Sound = new Sound();
    sound.loadCompressedDataFromByteArray(bytes, bytes.length);
    // the sound can now be played with sound.play()
}
Try to load bytes first, then create sound from that.
Anyway, if your code works on other mobile devices (or emulators), then create a new issue here:
https://github.com/openfl/openfl

Flex/Flash Builder/ActionScript/AIR Mobile iOS: How to take video using the camera and/or browse for & view/access video stored in the Camera Roll

My understanding currently is that:
CameraUI
I can use the CameraUI to access the built in camera for MediaType.VIDEO and that delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and click the "Use" button, I am returned to my app and theoretically I have a valid MediaPromise.
iOS does not provide a valid/usable URL/filename for the recorded video (or for photos), so I would have to use a Loader to bring in/use/access the recorded video... AND... iOS does not actually create a file anywhere on the device, most importantly not in the Camera Roll, where one would expect it based on the normal behavior of the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs, but nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can then pass to a Loader class or similar to read in the information created by the system camera and then manipulate it (upload, save to applicationStorageDirectory, and/or display in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll but the AS3/Air3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the Air3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that has been captured (no way to use the resulting MediaPromise successfully)?
I believe you can take video and access it on Android, but there's nothing in the documentation that says you cannot on iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI & CameraRoll)?
Am I wrong to think that these are rather obvious needs that one can meet via the Xcode/Objective-C route, but that the AIR Mobile framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll, and of course I can share it, view it, and so on. If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image since, the icon of the folder where the pictures are stored uses the thumbnail of the last object added... in this case, the video I took. But if I then enter the folder, the video cannot be found. It's teasing us. It knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought someone else might find this answer (partially) useful.
To save the movie you just took you need to open and read the data from the promise.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;
public function recordVideo():void
{
// Start the camera and ask for a video
camera = new CameraUI();
camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
camera.launch(MediaType.VIDEO);
}
private function onCameraComplete(event:MediaEvent):void
{
// event.data is a MediaPromise and MediaPromise.open() returns a IDataInput
// Let's cast it to a dispatcher and check when it's complete
dataInput = event.data.open();
var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}
private function onDataInputComplete(event:Event):void
{
// We can do whatever we want with the data, so we'll store it in a File
var file:File = File.applicationStorageDirectory.resolvePath("recordedVideo.mov"); // example destination; use any writable path
var bytes:ByteArray = new ByteArray();
var stream:FileStream = new FileStream();
// Reading the data from the opened MediaPromise
dataInput.readBytes(bytes);
stream.open(file, FileMode.WRITE);
stream.writeBytes(bytes, 0, bytes.bytesAvailable);
stream.close();
}
Also, I'm still looking for a way to put the movie in the Camera Roll.
