How to download offline maps with SKMaps? - iOS

I tried downloading offline maps with SKMaps.
First, I create a region from a self-made SKTPackage like this:
SKTPackage* packageToDownload;
packageToDownload.type = 3;
packageToDownload.packageCode = @"DEBY";
SKTDownloadObjectHelper* region = [SKTDownloadObjectHelper downloadObjectHelperWithSKTPackage:packageToDownload];
Unfortunately the region is empty, and every attempt to add packageToDownload.languages leads to a crash. What can I do to initiate an offline map download with only the packageCode and packageType?
Thanks for your help!

In the provided code snippet, the packageToDownload object is never initialized. Replace the first line with:
SKTPackage *packageToDownload = [[SKTPackage alloc] init];

Related

Running AppRTC for iOS, RtcEventLog issue

I would like to add an interface to AppRTCMobile that starts the WebRTC Call module, in order to make an audio call between two phones on a LAN (both IP addresses and port numbers are already known). The build runs successfully, but the app crashes every time a method of RtcEventLog is called. I'm also not sure whether creating a Call directly like this is reasonable. I'd sincerely appreciate any help, as I haven't found a solution.
Below is the source code; please help me find the problem.
std::unique_ptr<RtcEventLog> event_log = webrtc::RtcEventLog::Create();
webrtc::Call::Config callConfig = webrtc::Call::Config(event_log.get());
callConfig.bitrate_config.max_bitrate_bps = 500*1000;
callConfig.bitrate_config.min_bitrate_bps = 100*1000;
callConfig.bitrate_config.start_bitrate_bps = 250*1000;
webrtc::AudioState::Config audio_state_config = webrtc::AudioState::Config();
cricket::VoEWrapper* g_voe = nullptr;
rtc::scoped_refptr<webrtc::AudioDecoderFactory> g_audioDecoderFactory;
g_audioDecoderFactory = webrtc::CreateBuiltinAudioDecoderFactory();
g_voe = new cricket::VoEWrapper();
audio_state_config.audio_processing = webrtc::AudioProcessing::Create();
g_voe->base()->Init(NULL,audio_state_config.audio_processing,g_audioDecoderFactory);
audio_state_config.voice_engine = g_voe->engine();
audio_state_config.audio_mixer = webrtc::AudioMixerImpl::Create();
callConfig.audio_state = AudioState::Create(audio_state_config);
std::unique_ptr<RtcEventLog> event_logg = webrtc::RtcEventLog::Create();
callConfig.event_log = event_logg.get();
g_call = webrtc::Call::Create(callConfig);
g_audioSendTransport = new AudioLoopbackTransport();
webrtc::AudioSendStream::Config config(g_audioSendTransport);
g_audioSendChannelId = g_voe->base()->CreateChannel();
config.voe_channel_id = g_audioSendChannelId;
g_audioSendStream = g_call->CreateAudioSendStream(config);
webrtc::AudioReceiveStream::Config AudioReceiveConfig;
AudioReceiveConfig.decoder_factory = g_audioDecoderFactory;
g_audioReceiveChannelId = g_voe->base()->CreateChannel();
AudioReceiveConfig.voe_channel_id = g_audioReceiveChannelId;
g_audioReceiveStream = g_call->CreateAudioReceiveStream(AudioReceiveConfig);
g_audioSendStream->Start();
g_audioReceiveStream->Start();
Here's a screenshot of the error from the crash. Please tell me if you need more information.
Your code crashed at event_log_->LogAudioPlayout()...
It's obvious that the event_log_ object has already been released.
Objects managed by unique_ptr or scoped_refptr are released when they go out of scope, but in your case they may still be used afterwards, which leads to the crash. Keep these objects alive, for example in global or member storage, or otherwise retain them.

How to provide hint to iOS speech recognition API?

I want to create an app that receives voice input using the iOS speech API.
In Google's API, there is a speechContext option through which I can provide hints or bias toward some uncommon words.
Does the iOS API provide this feature? I've been searching the site for a while but didn't find anything.
There is no sample code online about implementing hints for Google Cloud Speech in Swift, so I made it up!
Open this class: SpeechRecognitionService.swift
You have to add your hint list array to the SpeechContext, add the SpeechContext to the RecognitionConfig, and finally add the RecognitionConfig to the StreamingRecognitionConfig, like this:
let recognitionConfig = RecognitionConfig()
recognitionConfig.encoding = .linear16
recognitionConfig.sampleRateHertz = Int32(sampleRate)
recognitionConfig.languageCode = "en-US"
recognitionConfig.maxAlternatives = 3
recognitionConfig.enableWordTimeOffsets = true
let streamingRecognitionConfig = StreamingRecognitionConfig()
streamingRecognitionConfig.singleUtterance = true
streamingRecognitionConfig.interimResults = true
//Custom vocabulary (Hints) code
let phraseArray = NSMutableArray(array: ["my donkey is yayeerobee", "my horse is tekkadan", "bet four for kalamazoo"])
let mySpeechContext = SpeechContext()
mySpeechContext.phrasesArray = phraseArray
recognitionConfig.speechContextsArray = NSMutableArray(array: [mySpeechContext])
streamingRecognitionConfig.config = recognitionConfig
//Custom vocabulary (Hints) code
let streamingRecognizeRequest = StreamingRecognizeRequest()
streamingRecognizeRequest.streamingConfig = streamingRecognitionConfig
Bonus: adding your custom words inside a simple phrase, instead of adding each word alone, gave me better results.

Google Places Picker iOS customization

I'm using the Google Place Picker API and wanted to know if there is a way to remove the back button and the search button, and to prevent the map from moving around, in the picker created with _placePicker = [[GMSPlacePicker alloc] initWithConfig:config]; ?
If not, is there an alternative I can use that provides the same functionality? Basically, I want the closest points of interest near a user's location.
Thanks.
I'm also trying to figure that out. So far, what I've come up with is to combine GMSMapView and GMSPlacesClient into your own custom view controller.
To gather the nearby points of interest, you'll use the GMSPlacesClient:
placesClient = GMSPlacesClient.sharedClient()
likelyPlaces = [GMSPlace]()
placesClient.currentPlaceWithCallback({ (placeLikelihoods, error) in
    if let error = error {
        print("Error with current place: \(error.localizedDescription)")
    } else {
        if let likelihoodList = placeLikelihoods {
            for likelihood in likelihoodList.likelihoods {
                let place = likelihood.place
                self.likelyPlaces.append(place)
            }
        }
    }
})
This will put the nearby places in your likelyPlaces array. Then it's up to you how you want to display the contents, for example in a table view or as annotations on the map.
Hope this helps.

How to add call button to search result in CoreSpotlight?

In the WWDC session Introduction to Search APIs, they show a search result from the Airbnb app with a call button. From what I saw, I think the result was created with CSSearchableItemAttributeSet, not the Web Markup API.
I tried setting the ItemContentType of CSSearchableItemAttributeSet to kUTTypeItem, kUTTypeMessage, and kUTTypeEmailMessage, of course with a phoneNumbers value. None of them seems to work. All the details I put in appear correctly, except for the call button.
CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithItemContentType:(__bridge NSString *)kUTTypeItem];
attributeSet.title = @"Call me back";
attributeSet.contentDescription = @"Firstname Lastname\n14:36 - 30 January 2014";
attributeSet.phoneNumbers = @[@"+66827364538"];
attributeSet.accountHandles = @[@"+66827364538"];
If I use kUTTypeContent, the call button appears but the other details do not, just the name of the contact that I put in when creating the CSPerson object.
CSPerson *person = [[CSPerson alloc] initWithDisplayName:@"Theptai Intathep"
handles:@[@"+66827364538"]
handleIdentifier:CNContactPhoneNumbersKey];
attributeSet.authors = @[person];
Try this:
attributeSet.supportsPhoneCall = @(YES);

SKMaps fails to perform multi-level search

I want to make a multi level offline search in my app.
I followed the directions on the official Skobbler page; the only difference is that I did not download the map of France, but the map of Wyoming instead.
The offline package code for it is USWY, if I'm right.
-(void)prepareForSearch{
[SKSearchService sharedInstance].searchServiceDelegate = self;
[SKSearchService sharedInstance].searchResultsNumber = 500;
_listLevel = SKCountryList;
_searchSettings = [SKMultiStepSearchSettings multiStepSearchSettings];
_searchSettings.listLevel = _listLevel;
_searchSettings.offlinePackageCode = @"USWY";
_searchSettings.parentIndex=-1;
}
- (IBAction)searchAction:(UIButton *)sender {
_searchSettings.searchTerm = [NSString stringWithFormat:@"%@",_searchBar.text];
[[SKSearchService sharedInstance]startMultiStepSearchWithSettings:_searchSettings];
}
-(void)searchService:(SKSearchService *)searchService didRetrieveMultiStepSearchResults:(NSArray *)searchResults
{
if ([searchResults count] !=0 && _listLevel<SKInvalidListLevel){
if (_listLevel == SKCountryList) {
_listLevel = SKCityList;
}
else{
_listLevel++;
}
SKSearchResult *searchResult = searchResults[0];
SKMultiStepSearchSettings* multiStepSearchObject = [SKMultiStepSearchSettings multiStepSearchSettings];
multiStepSearchObject.listLevel = _listLevel++;
multiStepSearchObject.offlinePackageCode = _searchSettings.offlinePackageCode;
multiStepSearchObject.searchTerm = _searchBar.text;
multiStepSearchObject.parentIndex = searchResult.identifier;
[[SKSearchService sharedInstance]startMultiStepSearchWithSettings:multiStepSearchObject];
}
}
-(void)searchServiceDidFailToRetrieveMultiStepSearchResults:(SKSearchService *)searchService
{
NSLog(@"Multi Level Search failed");
}
Whatever I put as a searchTerm, I end up with "MultiLevel Search Failed".
From this screenshot, you can see that my map package for Wyoming is included in my SKMaps.bundle:
(Also, if anyone can answer this: the versioning was different in my app and in the simulator folder of the test app from which I downloaded the offline package. So, for testing purposes, I made two folders (20140807 and 20140910) and put the Wyoming package in both of them. Are there any rules regarding this?)
What could be the problem?
OK, after a few days I managed to find the source of the problem.
First, I found out which version I'm using: it's 20140910.
Second, for some reason the entire folder containing the maps was not recognized. So I took the entire SKMaps.bundle, together with some pre-bundled maps from the demo app provided by the Skobbler team, put it in my project, and now everything works fine.
