I want to make a simple "game". When the player starts the game, it plays a sound (for example a dog bark) and the player has to guess which animal (or musical instrument, etc.) the sound belongs to.
I don't know how to build the "logic" of this program - how to connect an image with its sound.
I was thinking about something like this:
if the app plays sound1 then the right image is image1, if it plays sound2 then the right image is image2... but what if I have 100 images?
It isn't practical to do it like this. Do you have any ideas, please?
You will have to map the sound files to the images.
One way, as you suggested, would be to prefix them with an ID, for example:
001_img.png - 001_sound.mp3
002_img.png - 002_sound.mp3
003_img.png - 003_sound.mp3
004_img.png - 004_sound.mp3
005_img.png - 005_sound.mp3
And use a code snippet like the one below to generate the filenames:
for (int assetIndex = 0; assetIndex < 5; assetIndex++)
{
    // Builds "001_img.png"/"001_sound.mp3" through "005_img.png"/"005_sound.mp3"
    NSString *imageFilename = [NSString stringWithFormat:@"%03i_img.png", assetIndex + 1];
    NSString *soundFilename = [NSString stringWithFormat:@"%03i_sound.mp3", assetIndex + 1];
}
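For the guessing logic itself, here is a minimal Swift sketch under the same naming scheme. The asset count of 100 and the bundled .png/.mp3 files are assumptions for illustration, not code from the question:

import UIKit
import AVFoundation

// Sketch: pick a random asset index, play its sound, and remember which
// image is the correct answer. No per-asset if/else chain is needed.
let assetCount = 100                               // assumed number of image/sound pairs
let answerIndex = Int.random(in: 1...assetCount)

let correctImage = UIImage(named: String(format: "%03d_img", answerIndex))

var player: AVAudioPlayer?
if let soundURL = Bundle.main.url(forResource: String(format: "%03d_sound", answerIndex),
                                  withExtension: "mp3") {
    player = try? AVAudioPlayer(contentsOf: soundURL)
    player?.play()
}

// A guess is correct when the index of the tapped image equals answerIndex.

The shared index is what replaces the long chain of if statements from the question.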
I have a video on my device and I want to extract the following details of the video (in Flutter/Dart):
1. When the video was taken
2. Duration of the video
3. Type of the video
Currently I am using the FFmpeg plugin, but it takes too long before it returns the metadata information.
You may need to use FFmpeg as follows:
import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

class VideoDetail {
  final FlutterFFmpeg _flutterFFmpeg = new FlutterFFmpeg();

  VideoDetail() {
    // Prints the media information (format, duration, streams, ...) for the file.
    _flutterFFmpeg
        .getMediaInformation("<file path or uri>")
        .then((info) => print(info));
  }
}
Do not forget to add the dependency in pubspec.yaml:
flutter_ffmpeg: ^0.1.1
For more information please visit
https://pub.dartlang.org/packages/flutter_ffmpeg
I would like to open the Google Maps app on iOS using its URL scheme to show directions with multiple stops.
The web URL for testing is:
https://google.com/maps/dir//49.54643774,22.28223445/49.54679476,22.28170513/49.54726735,22.28154318/49.54760869,22.28156607/49.54820312,22.2815506/49.54856556,22.28146425/49.54907329,22.28133231/49.54989807,22.28207924/49.55017454,22.2824851/49.55064392,22.28306989/49.5508548,22.28325003/49.55143275,22.28381447/49.55169439,22.28410868/49.5520271,22.28443534
I tried many configurations of the app scheme using comgooglemaps:// without success.
You can read the Google Maps documentation for opening the app.
To simplify it, this is the format you need to follow:
comgooglemaps://?saddr=Google,+1600+Amphitheatre+Parkway,+Mountain+View,+CA+94043&daddr=Google+Inc,+345+Spear+Street,+San+Francisco,+CA&center=37.422185,-122.083898&zoom=10
In Swift you would do something like this. Take note of the callback option; you can choose not to use it if you don't want to return to your app:
let testURL = URL(string: "comgooglemaps-x-callback://")!
if UIApplication.shared.canOpenURL(testURL) {
    let directionsRequest = "comgooglemaps-x-callback://" +
        "?daddr=John+F.+Kennedy+International+Airport,+Van+Wyck+Expressway,+Jamaica,+New+York" +
        "&x-success=sourceapp://?resume=true&x-source=AirApp"
    let directionsURL = URL(string: directionsRequest)!
    UIApplication.shared.openURL(directionsURL)
} else {
    NSLog("Can't use comgooglemaps-x-callback:// on this device.")
}
Edit: To use coordinates, use it this way:
comgooglemaps://?saddr=52.3668563,4.8890813&daddr=52.357516,4.902319&zoom=10
Edit 2: For more points on the map, append coordinates to the daddr parameter using +to:Latitude,Longitude:
comgooglemaps://?saddr=52.3668563,4.8890813&daddr=52.357516,4.902319+to:52.357786,4.891913&zoom=10
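Putting the question's waypoints together, a minimal Swift sketch could build the multi-stop URL like this. The coordinates are a subset of the route in the question, and directionsmode=driving is optional:

import UIKit

// Sketch: the first stop becomes saddr, the remaining stops are chained onto
// daddr with "+to:".
let stops = [
    (49.54643774, 22.28223445),
    (49.54679476, 22.28170513),
    (49.55169439, 22.28410868),
    (49.55202710, 22.28443534)
]

let saddr = "\(stops[0].0),\(stops[0].1)"
let daddr = stops.dropFirst()
    .map { "\($0.0),\($0.1)" }
    .joined(separator: "+to:")

// Note: comgooglemaps must be listed under LSApplicationQueriesSchemes in
// Info.plist for canOpenURL to return true.
if let url = URL(string: "comgooglemaps://?saddr=\(saddr)&daddr=\(daddr)&directionsmode=driving"),
   UIApplication.shared.canOpenURL(url) {
    UIApplication.shared.open(url)
}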
I've tried
StackExchange.uc.setBalance(99999999);
and it was nothing more than a visual thing.
Also, I bought the power-up and now I want an inspirational answer, because I feel that the unicorns would appreciate the gesture.
Q: Is there a faster way to mine unicoins, or a way to set them so I can get ALL the unicoins?
EDIT:
I feel like the unicorns aren't happy with the decision I've made of asking. They're outside my house.
Unicorn edit:
We've got him. Don't come looking, he's ours now.
Go to the mining page, execute the following code in your console, and just keep moving your mouse over the rocks.
$('#uc-rockcanvas').mousemove(function(event) {
    for (var i = 0; i < 10; i++) {
        var mousedownEvent = document.createEvent("MouseEvent");
        mousedownEvent.initMouseEvent("mousedown", true, true, window, 0,
            event.screenX, event.screenY, event.clientX, event.clientY,
            event.ctrlKey, event.altKey, event.shiftKey, event.metaKey,
            0, null);
        event.target.dispatchEvent(mousedownEvent);
    }
});
I am developing an iPad app which has equalizer functionality, meaning the sound plays with three settings (Low, High, Medium). I googled it and found this link: iPhoneMixerEQGraphTest
It basically mixes the sound, but I want to apply equalizer effects to my sound.
Please help.
Have a look at the Audio Unit Component Services Reference documentation. Specifically:
kAudioUnitType_Effect = 'aufx',
and Effect Audio Unit Subtypes:
enum {
    kAudioUnitSubType_PeakLimiter        = 'lmtr',
    kAudioUnitSubType_DynamicsProcessor  = 'dcmp',
    kAudioUnitSubType_Reverb2            = 'rvb2',
    kAudioUnitSubType_LowPassFilter      = 'lpas',
    kAudioUnitSubType_HighPassFilter     = 'hpas',
    kAudioUnitSubType_BandPassFilter     = 'bpas',
    kAudioUnitSubType_HighShelfFilter    = 'hshf',
    kAudioUnitSubType_LowShelfFilter     = 'lshf',
    kAudioUnitSubType_ParametricEQ       = 'pmeq',
    kAudioUnitSubType_Delay              = 'dely',
    kAudioUnitSubType_Distortion         = 'dist',
    kAudioUnitSubType_AUiPodEQ           = 'ipeq',
    kAudioUnitSubType_NBandEQ            = 'nbeq'
};
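For the EQ use case, kAudioUnitSubType_NBandEQ (or kAudioUnitSubType_AUiPodEQ with its presets) is the relevant subtype. Below is a minimal Swift sketch, not code from the linked sample, showing how such an effect unit could be located and instantiated; the 3 dB global gain is just an illustrative assumption:

import AudioToolbox

// Describe Apple's N-band EQ effect unit using the constants listed above.
var description = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                            componentSubType: kAudioUnitSubType_NBandEQ,
                                            componentManufacturer: kAudioUnitManufacturer_Apple,
                                            componentFlags: 0,
                                            componentFlagsMask: 0)

if let component = AudioComponentFindNext(nil, &description) {
    var eqUnit: AudioComponentInstance?
    if AudioComponentInstanceNew(component, &eqUnit) == noErr, let eq = eqUnit {
        AudioUnitInitialize(eq)
        // Band settings (frequency, gain, bypass) are set with
        // AudioUnitSetParameter and the kAUNBandEQParam_* constants, e.g.:
        AudioUnitSetParameter(eq, AudioUnitParameterID(kAUNBandEQParam_GlobalGain),
                              kAudioUnitScope_Global, 0, 3.0, 0)
        // The unit can then be inserted into an AUGraph between the mixer and
        // the output, alongside the graph shown in the linked sample.
    }
}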
I'm trying to get a UIWebView working with the following code:
NSString *urlAddress = [NSString stringWithFormat: @"http://www.wikipedia.org/wiki/%@", recipe.name];
recipe.name is a property of an entity from the Core Data model.
I used NSLog to test recipe.name and it outputs correctly to the console.
I tried this with just a plain URL and it works fine:
NSString *urlAddress = [NSString stringWithFormat: @"http://www.wikipedia.org/wiki/soup"];
1. Initialize your UIWebView.
2. Create an NSURL object containing your URL (using +[NSURL URLWithString:], for example).
3. Create an NSURLRequest object with the created NSURL object.
4. Load the NSURLRequest in your web view.
Voila :)
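A minimal Swift sketch of those steps, assuming webView is an existing UIWebView and recipeName stands in for the question's recipe.name (which may need percent-encoding if it contains spaces):

import UIKit

func loadWikipediaPage(named recipeName: String, into webView: UIWebView) {
    // Percent-encode the name so multi-word recipe names still form a valid URL.
    let encodedName = recipeName.addingPercentEncoding(withAllowedCharacters: .urlPathAllowed) ?? recipeName
    // Build the URL, wrap it in a request, and load it in the web view.
    guard let url = URL(string: "http://www.wikipedia.org/wiki/\(encodedName)") else { return }
    webView.loadRequest(URLRequest(url: url))
}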