I have an MKMapView running fine in my iOS app. The app's purpose is to scan a Wi-Fi network for data continuously, many times every second. The scanning is a background task and the user doesn't need to do anything for it to work. But for some reason the MKMapView can't load new tiles while I'm scanning in the background. It would be pointless to show code from the scanning because I don't even know which part is interrupting the map view. So, is this a common problem that has been seen before, or is it something weird?
When I move around in the map view without scanning the Wi-Fi network in the background, the map loads fine, using both the cache and fetches from the mapping servers.
When I move around in the map view WHILE scanning in the background, no map tiles are shown, just empty boxes telling me the map is loading.
I'm using the library https://github.com/FuzzyLuke/OBD2Kit to fetch data from a Wi-Fi plug connected to a car while trying to show the user's current location on a map.
Suggestions?
I don't know how you're scanning, but my guess is it's saturating the internet connection, so the map view can't connect to the tile server.
When you say "many times every second" - how exactly are you 'scanning' the Wi-Fi network for data, and what data is it? It sounds like you're creating many dozens, if not more, distinct connections per second, which will totally cripple your networking stack and would explain why your maps can't load.
I would encourage you to share more information about what exactly it is you're doing on the network, because that's almost certainly where the problem is. I can guarantee there's a better way to achieve what you want than firing off tens or hundreds of network requests every second.
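One common fix for this class of problem is to coalesce the high-frequency scan triggers down to a bounded rate instead of firing a connection per tick. Here's a minimal sketch of that idea; the `ScanThrottle` type and its names are hypothetical, not part of OBD2Kit:

```swift
import Foundation

/// Coalesces high-frequency scan triggers down to at most one
/// request per `minInterval` seconds. Call `shouldFire` on every
/// tick and only issue a network request when it returns true.
final class ScanThrottle {
    private let minInterval: TimeInterval
    private var lastFired: Date?

    init(minInterval: TimeInterval) {
        self.minInterval = minInterval
    }

    /// Returns true if a scan should fire now; drop the tick otherwise.
    /// `now` is injectable to keep the logic testable.
    func shouldFire(now: Date = Date()) -> Bool {
        if let last = lastFired, now.timeIntervalSince(last) < minInterval {
            return false
        }
        lastFired = now
        return true
    }
}
```

With something like this in front of the scanner, the map view's tile requests are no longer competing with hundreds of scan connections per second.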
Related
Is it possible for the CLLocationManagerDelegate method locationManager:didUpdateLocations: to return cached values in the middle of a session?
Or is there a known bug where CoreLocation doesn't work well at high speeds or high altitudes?
I have an airplane-tracking application, and some users on older iPad devices are telling me that the application shows their position incorrectly (around a 3-4 minute delay). I also have a text log implemented, and it shows correct timestamps without any delay.
The application renders views and logs data as soon as it gets a new location, and no queueing is possible, so that shouldn't be the problem.
Can CoreLocation throttle so badly (CPU issues?) that it fires the delegate with such a huge delay? Can speed be the reason? As I said, it is used on small planes, so speeds are much higher than in cars.
Edit: To be clear, I'm checking the actual timestamps, and always taking the last element of the array from didUpdateLocations.
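One way to guard against rendering cached fixes is to compare each delivered location's timestamp against the wall clock before drawing it. A minimal sketch of that check (the function name and the 5-second cutoff are illustrative choices, not anything from CoreLocation):

```swift
import Foundation

/// Returns true when a location fix is recent enough to render.
/// A `maxAge` of a few seconds is a reasonable starting point for
/// a fast-moving aircraft; tune to taste. In the delegate you'd
/// pass `locations.last!.timestamp` from didUpdateLocations.
func isFresh(timestamp: Date,
             now: Date = Date(),
             maxAge: TimeInterval = 5) -> Bool {
    return now.timeIntervalSince(timestamp) <= maxAge
}
```

Dropping fixes that fail this check at least makes a delivery delay visible as a gap rather than a silently wrong position.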
You can't get real-time data without an operational need, and organizations which do receive real-time data cannot legally re-distribute this data publicly, except to other organizations approved by the FAA. An operational need pretty much means you have to be a flight dispatcher for an airline or commercial operator, not just have an interest in tracking flights. For more details
After some digging and help from Apple support (a big help), it looks like the cause was changes in iOS 12.
I had to set activityType to .otherNavigation prior to iOS 12 and to .airborne on iOS 12+.
Previously this option didn't seem to have any impact...
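For reference, a minimal sketch of the configuration described above (assuming a plain CLLocationManager; the accuracy setting is just a typical companion choice, not part of the fix itself):

```swift
import CoreLocation

let manager = CLLocationManager()
manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation

// .airborne was introduced in iOS 12; fall back on older systems.
if #available(iOS 12.0, *) {
    manager.activityType = .airborne
} else {
    manager.activityType = .otherNavigation
}
```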
I am kind of new to programming and wanted to know: what is the best way to get data from a server? To be more specific, my app receives GPS coordinates from a server (currently using Alamofire) and then shows them on the map.
Now, I don't know how to keep those coordinates updated on the device. Should I make a loop where the app downloads coordinates from the server, let's say, every 5 seconds and shows them on the map? Should the loop interval be longer? Ideally I would like the app to show a live location - that is, get updates from the server every second.
Maybe there is a library for my specific problem? Where could I read more about this?
I could upload some of my code, but right now it simply gets the data in viewDidLoad using Alamofire and shows it on the map.
EDIT: Why do I keep getting downvotes? Seriously, I would like to know what I did wrong.
The best way is to use socket programming and listen on a particular socket where coordinates will be pushed from the server. Rather than pulling periodically, this is a push mechanism where the server pushes from its side whenever it has a new coordinate available...
Here is Apple's guide on the topic:
https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/NetworkingTopics/Articles/UsingSocketsandSocketStreams.html#//apple_ref/doc/uid/CH73-SW4
If this seems like too much, you can use the PubNub library for iOS, which is free for up to 100 devices.
Here is the link
https://www.pubnub.com/
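Whichever transport you pick (socket push or periodic polling), the payload handling on the device looks the same: decode the pushed message and hand the coordinate to the map. A sketch, assuming a hypothetical JSON message shape (adjust to whatever your server actually sends):

```swift
import Foundation

/// Hypothetical shape of a pushed coordinate message.
struct CoordinateUpdate: Codable {
    let lat: Double
    let lon: Double
    let timestamp: TimeInterval
}

/// Decodes one pushed message; returns nil on malformed input
/// rather than crashing the receive loop.
func decodeUpdate(_ data: Data) -> CoordinateUpdate? {
    return try? JSONDecoder().decode(CoordinateUpdate.self, from: data)
}
```

In the socket's receive callback you would call `decodeUpdate` on each incoming frame and, on success, update the map annotation on the main thread.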
I am attempting to stream audio files from a server to iOS devices and play them completely synchronized. For example on my phone I might be 20 secs into a song and then my friend next to me should also be 20 secs into the song as well. I know this is not an easy problem to solve, but I am attempting to do so.
I can currently get them within one second of each other by calculating the time difference between the devices and then having them sync up, but that is not good enough: the human ear can easily detect a one-second offset, and this is over Wi-Fi.
My next approach is to unicast the one file from the server, have all devices pick it up directly, and then implement some type of buffer system, similar to Netflix, so that network connectivity would be less of a limiting factor. http://www.wowza.com/ is what I would use to help with that.
I know this can be done, because http://lysn.in/ does it with their app, and I want to be able to do something similar.
Any other recommendations after I try my unicast option?
Would implementing firebase help solve a lot of the heavy lifting problems?
(1) In answer to ONE of your questions (the final one):
Firebase is not "realtime" in that sense -- PubNub is probably (almost certainly) the fastest "realtime" messaging for and between apps/browsers/etc.
But neither means real-time in the sense that, say, race-game engineers mean it, or indeed in your use case.
So Firebase is not relevant to you here and won't help.
(2) Regarding your second general question: "how to synchronise time on two or more devices, given that we have communications delays."
Now, this is a really well-travelled problem in computer science.
It would be pointless to outline it here, because it is fully explained at http://www.ntp.org/ntpfaq/NTP-s-algo.htm if you click on "How is time synchronised?"
So, to get a good time base on both machines, you should use that! Have both machines set their time really accurately via the existing (perfected over decades) NTP synchronisation.
(So for example https://stackoverflow.com/a/6744978/294884 )
Are you in fact doing this?
It's possible that doing that will solve all your problems; then just agree to start at a certain exact time.
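For a feel of what the linked NTP material boils down to, here is the classic clock-offset calculation from the four timestamps of one request/response exchange. This is a sketch of the textbook formula, not a full NTP client:

```swift
import Foundation

/// Classic NTP clock-offset calculation.
/// t0 = client send time, t1 = server receive time,
/// t2 = server send time, t3 = client receive time.
/// `offset` is how far the client clock trails the server's;
/// `delay` is the round trip minus server processing time.
func ntpOffsetAndDelay(t0: Double, t1: Double, t2: Double, t3: Double)
    -> (offset: Double, delay: Double) {
    let offset = ((t1 - t0) + (t2 - t3)) / 2
    let delay = (t3 - t0) - (t2 - t1)
    return (offset, delay)
}
```

Once both devices know their offset from a common reference, "start playback at reference time T" gets each of them within the measurement error of the exchange, typically far under a second on a LAN.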
Hope it helps!
I would recommend against using the data movement itself to synchronize playback. This should be straightforward to do with a buffer and a periodic "sync" signal sent at a period of less than half the buffer size. Worst case, this should produce a small blip on devices that have drifted ahead of or behind the sync signal.
I'd like to infrequently open a Twitter streaming connection with TweetStream and listen for new statuses for about an hour.
How should I go about opening the connection, keeping it open for an hour, and then closing it gracefully?
Normally for background processes I would use Resque or Sidekiq, but from my understanding those are for completing tasks as quickly as possible, not chilling and keeping a connection open.
I thought about using a global variable like $twitter_client but that wouldn't horizontally scale.
I also thought about building a second application that runs on one box to handle this functionality, but that seems excessive if it can be integrated into the main app somehow.
To clarify, I have no trouble starting a process, capturing tweets, and using them appropriately. I'm just not sure what I should be starting. A new app? A daemon of some sort?
I've never encountered a problem like this, and am completely lost. Any direction would be much appreciated!
Although not a direct fix, this is what I would look at:
Time
You're working with time, so I'd look at what time-centric processes could be used to keep the connection open for an hour
Specifically, I'd look at running some sort of job on the server, which you could fire at specific times (programmatically if required), to open & close the connection. I only have experience with Resque, but as you say, it's probably not up to the job. If I find any better solutions, I'll certainly update the answer
Storage
Once you've connected to TweetStream, you'll want to look at how you can capture the tweets for that time period. It seems a waste to create a data table just for the job, so I'd be inclined to use something like Redis to store the tweets that you need
This can then be used to output the tweets you need, allowing you to simulate storing / capturing them, but then delete them after the hour-window has passed
Delivery
I don't know what context you're using this feature in, so I'll give you as generic a process idea as possible
To display the tweets, I'd personally create some sort of record in the DB to store the time you're pinging TweetStream that day (if it changes; if it's constant, just set a constant in an initializer), and then include some logic that tries to get the tweets from Redis. If you're able to collect them, show them as you wish; otherwise, don't print anything
Hope that gives you a broader spectrum of ideas?
So my application connects to a URL (via URLConnectionDelegate) and gathers data, which contains image URLs. It then connects to each and every image URL (again via URLConnectionDelegate) and downloads each image.
Everything works perfectly; couldn't be happier.
But the problem is that I can't really track the networkActivityIndicator. There are around 100 connections going off at once, so I don't know when or how to turn the networkActivityIndicator off once the last image is done loading.
Does anyone have any suggestions without me having to redo a bunch of code?
Thanks for the help guys
The typical solution is a singleton object that you call [NetworkMonitor increaseNetworkCount] and [NetworkMonitor decreaseNetworkCount] at the appropriate points.
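A minimal sketch of how that singleton might look, using the method names suggested above. The actual indicator toggle is platform code; here it's left as a comment, and the counting logic is kept pure so it's easy to reason about:

```swift
import Foundation

/// Thread-safe counter of in-flight connections. Call
/// `increaseNetworkCount` when a connection starts and
/// `decreaseNetworkCount` when it finishes or fails. In a real app,
/// whenever `isActive` flips you'd set
/// UIApplication.shared.isNetworkActivityIndicatorVisible
/// on the main thread to match.
final class NetworkMonitor {
    static let shared = NetworkMonitor()
    private let lock = NSLock()
    private var count = 0

    var isActive: Bool {
        lock.lock(); defer { lock.unlock() }
        return count > 0
    }

    func increaseNetworkCount() {
        lock.lock(); defer { lock.unlock() }
        count += 1
    }

    func decreaseNetworkCount() {
        lock.lock(); defer { lock.unlock() }
        count = max(0, count - 1)  // clamp so stray extra calls can't go negative
    }
}
```

The spinner then turns off exactly when the last of the ~100 connections finishes, with no need to know in advance which one that is.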
The nicer solution is a toolkit like MKNetworkKit, which will handle this and a bunch of similar things for you (like managing your download queue, since 100 simultaneous connections is actually very bad on iOS).