How to simulate a water flow sensor network in Contiki Cooja? - contiki

I'm very new to Contiki Cooja. I want to simulate a water flow meter in the Cooja simulator; is it possible? My aim is to simulate a water flow meter for tracking water usage. How can I do this in Cooja?

The only thing you have to simulate is the device collecting the flow data. It could be a very simple device driver that mocks a flow sensor by outputting a value that represents your flow volume. Pick this value up in your application and process it accordingly.
For ideas on how to proceed, have a look at these device drivers: "contiki dev".
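For example, a minimal mock driver plus a process that polls it could look like the sketch below, written against Contiki's sensors API (lib/sensors.h). The sensor name flow_sensor, the fake 0-99 litres/minute range, and the 5-second poll interval are illustrative assumptions, not part of any existing driver:

    /* mock-flow-sensor.c -- a fake flow meter for Cooja; no real hardware is read. */
    #include "contiki.h"
    #include "lib/sensors.h"
    #include "lib/random.h"
    #include <stdio.h>

    /* Pretend to read the meter: return a pseudo-random flow in litres/minute. */
    static int
    value(int type)
    {
      return (int)(random_rand() % 100);
    }

    static int
    configure(int type, int c)
    {
      return 0;   /* nothing to configure in a mock */
    }

    static int
    status(int type)
    {
      return 1;   /* always "ready" */
    }

    SENSORS_SENSOR(flow_sensor, "flow", value, configure, status);

    PROCESS(flow_meter_process, "Flow meter process");
    AUTOSTART_PROCESSES(&flow_meter_process);

    PROCESS_THREAD(flow_meter_process, ev, data)
    {
      static struct etimer et;

      PROCESS_BEGIN();
      SENSORS_ACTIVATE(flow_sensor);

      while(1) {
        etimer_set(&et, CLOCK_SECOND * 5);               /* sample every 5 seconds */
        PROCESS_WAIT_EVENT_UNTIL(etimer_expired(&et));
        printf("flow: %d l/min\n", flow_sensor.value(0));
      }

      PROCESS_END();
    }

Compile it as a normal Cooja mote firmware; to model realistic usage you would replace random_rand() with whatever consumption pattern you want to track.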
Please, in future, document the work you did before asking a question; as it stands, it looks like you didn't do any research on this topic.

Related

Possibility of HackRF One working in full-duplex mode

I am working with a HackRF One and GNU Radio. I have one HackRF One device and I am trying to transmit and receive signals continuously. Is there any module in GNU Radio which makes this possible?
Currently I have one flowgraph for receive and one for transmit.
I am using a selector block, but manual switching isn't working properly.
I need to send a signal, receive it back after reflection, and study the difference.
Any suggestion is welcome.
Since that is physically impossible for the device (the HackRF One is half-duplex hardware): no, no software on earth can do that for you.

Dynamic accuracy with Core Location

I would like my iOS app to get notified in the background whenever the user stops (or slows below some velocity threshold) at a place, while maintaining maximum battery life.
The catch is that I don't really care about accuracy while the user is moving, but I need as accurate a measurement as possible once the user stops or walks around the same spot.
There are many Core Location tools available:
Standard Location Service
Significant Change Location Service
Geofencing and Ranging Service
Integration with Core Motion and M7 Motion Coprocessor
Which one of them should I use? Is there a best practice for what I am attempting to do? Does anybody have experience with this sort of thing? I found this app, which does exactly what I want to incorporate into my own app, but I'm not permitted to use their API.
I've read the documentation but my case doesn't really fit any of the categories they discuss.
Thanks in advance.
Pete.
In iOS 8 there is a new technology that fits what it sounds like you are asking for. CLVisit objects are delivered to your app in the background when the user arrives at or departs from a place after stopping there. Power consumption is very low with this feature. You enable it by calling startMonitoringVisits on a CLLocationManager object.
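A minimal sketch of what that looks like in Swift (the class name VisitMonitor is made up; the CLLocationManager calls are the standard ones):

    import CoreLocation

    // Visit monitoring: CLVisit objects arrive in the background when the user
    // arrives at or departs from a place, at very low power cost.
    final class VisitMonitor: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func start() {
            manager.delegate = self
            manager.requestAlwaysAuthorization()   // visits require "Always" authorization
            manager.startMonitoringVisits()
        }

        func locationManager(_ manager: CLLocationManager, didVisit visit: CLVisit) {
            // departureDate == .distantFuture means the user is still at the place.
            print("Visit at \(visit.coordinate): arrived \(visit.arrivalDate), departed \(visit.departureDate)")
        }
    }

Note that visit monitoring needs "Always" location authorization and the matching usage-description key in Info.plist.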

Establishing synchronized music streaming across devices

I am attempting to stream audio files from a server to iOS devices and play them completely synchronized. For example, if I am 20 seconds into a song on my phone, then my friend next to me should also be 20 seconds into the song. I know this is not an easy problem to solve, but I am attempting to do so.
I can currently get the devices within one second of each other by calculating the time difference between them and then having them sync up; however, that is not good enough, because the human ear can easily detect an offset of a second, and this is over Wi-Fi.
My next approach is going to be to unicast the one file from the server, have all the devices pick it up directly from the server, and then implement some type of buffering system, similar to Netflix, so that network connectivity would not be the limiting factor. http://www.wowza.com/ is what I would use to help with that.
I know this can be done, because http://lysn.in/ does it with their app, and I want to be able to do something similar.
Any other recommendations for after I try my unicast option?
Would implementing Firebase help solve a lot of the heavy lifting?
(1) In answer to ONE of your questions (the final one):
Firebase is not "realtime" in that sense; PubNub is probably (almost certainly) the fastest "realtime" messaging for and between apps, browsers, etc.
But they don't mean real-time in the sense that, say, racing-game engineers mean it, or indeed in the sense your use case needs.
So Firebase is not relevant to you here and won't help.
(2) Regarding your second general question: "how to synchronise time on two or more devices, given that we have communications delays."
Now, this is a really well-travelled problem in computer science.
It would be pointless to outline it here, because it is fully explained at http://www.ntp.org/ntpfaq/NTP-s-algo.htm if you click on "How is time synchronised?".
So, to get a good time base on both machines, use exactly that: have both machines set their clocks really accurately against NTP, using the existing (refined over decades) NTP synchronisation.
(See for example https://stackoverflow.com/a/6744978/294884 )
Are you in fact already doing this?
It's possible that this alone will solve your problem; then you just agree to start playback at a certain exact time.
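As a sketch of how that could come together on iOS, assuming you have already measured an NTP-style clock offset yourself (the function name, parameters, and the use of AVAudioPlayer are illustrative assumptions, not a prescribed design):

    import AVFoundation

    // All devices agree on a wall-clock start time expressed in NTP time, convert it
    // to their local clock with their own measured NTP offset, and hand the exact
    // moment to AVAudioPlayer. The offset is assumed to come from your own SNTP
    // exchange: offset = ((t1 - t0) + (t2 - t3)) / 2, where t0/t3 are client
    // send/receive timestamps and t1/t2 are server receive/send timestamps.
    func schedulePlayback(player: AVAudioPlayer,
                          startAtNTPTime ntpStart: TimeInterval,
                          ntpOffset: TimeInterval) {
        _ = player.prepareToPlay()

        // Local wall-clock moment that corresponds to the agreed NTP start time.
        let localStart = ntpStart - ntpOffset
        let delay = max(localStart - Date().timeIntervalSince1970, 0)

        // play(atTime:) expects a time on the audio device's clock, so add the delay to it.
        _ = player.play(atTime: player.deviceCurrentTime + delay)
    }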
Hope it helps!
I would recommend against using the data movement itself to synchronize playback. This should be straightforward to do with a buffer and a periodic "sync" signal sent at a period of less than half the buffer size. Worst case, this generates a small blip on devices that have drifted ahead of or behind the sync signal.
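A rough sketch of what that correction step could look like on the client, continuing the AVAudioPlayer example above (the message format, names, and the 50 ms tolerance are all assumptions for illustration):

    import AVFoundation

    // Correction step for the periodic-sync approach: the server broadcasts
    // (mediaTime, sentAt) pairs in NTP time; each client estimates where playback
    // should be right now and nudges its player if it has drifted audibly.
    func applySync(player: AVAudioPlayer,
                   syncMediaTime: TimeInterval,   // playback position the server reported
                   sentAt: TimeInterval,          // NTP time at which the report was made
                   ntpOffset: TimeInterval) {     // this device's clock-to-NTP offset
        let ntpNow = Date().timeIntervalSince1970 + ntpOffset
        let expectedPosition = syncMediaTime + (ntpNow - sentAt)
        let drift = expectedPosition - player.currentTime
        if abs(drift) > 0.05 {                    // only correct audible drift (~50 ms)
            player.currentTime += drift           // the small "blip" mentioned above
        }
    }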

Is it possible to simulate low GPS accuracy in Xcode or on a device?

I need to allow manual positioning on a map if the user's location stays below a given accuracy for a certain amount of time.
I need a way to test this circumstance. It's not the same as disabling location services; I just want a bad GPS signal (for example, horizontalAccuracy of about 500 m or so). "Unfortunately", in the place where I work (my own house) I have a strong GPS signal, so I can't test and fine-tune the app's behavior.
I wonder if there's a way to do this, something like the Network Link Conditioner in the Developer menu of the iPhone's Settings.
As far as I know, there's no way to do it short of writing a test class to inject location data. (I.e., there's nothing in the location simulation functionality provided by the simulator and/or debugger that will allow you to set accuracy.)
This answer to a similar question has an example of the manual-test-class-hackery way.
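For reference, the injection approach can be as small as a CLLocationManager subclass that hands the delegate a fabricated fix with poor accuracy; the class name, coordinates, and 500 m figure below are made up for illustration:

    import CoreLocation

    // A CLLocationManager subclass that, instead of real fixes, delivers a fabricated
    // location with poor horizontal accuracy, so the "bad GPS" code path can be
    // exercised indoors.
    final class LowAccuracyLocationManager: CLLocationManager {
        var simulatedAccuracy: CLLocationAccuracy = 500   // metres

        override func startUpdatingLocation() {
            // Deliberately not calling super: only the fake fix is delivered.
            let fix = CLLocation(coordinate: CLLocationCoordinate2D(latitude: 40.4168,
                                                                    longitude: -3.7038),
                                 altitude: 0,
                                 horizontalAccuracy: simulatedAccuracy,
                                 verticalAccuracy: simulatedAccuracy,
                                 timestamp: Date())
            delegate?.locationManager?(self, didUpdateLocations: [fix])
        }
    }

Swap it in behind whatever abstraction your app uses for its location manager while testing.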

Accessing an AR2112

This is a little off the beaten path. I've got a DLink DWL-G520 card I'm using under OpenBSD and it works fine. What I want to do is be able to access the radio part of it. Why? I want to use it in a radio telescope. It's a 2.4 GHz receiver with an external antenna connector. I want to connect some coax, some amplifiers, and an old TV dish and point the dish at the sky. It has an RSSI signal and variable RF gain (which it adjusts, from what I can find) so all I'd need to do is record those over time while pointed at a certain spot in the sky. I don't need to control the frequency really since most natural events are broadband.
I'm poking through the OpenBSD ath driver, following nested structs, but I don't want any of the normal network stuff, which is most of what the driver does. dmesg identifies the card as an AR5212, which according to the Atheros PDF is always paired with an AR2112 radio. Is there any easier way than wading through PCI stuff to see what my options are? I also need to turn the transmitter off so it doesn't fry my amps. Trying to find low-level documentation is about impossible from what I've seen. Ultimately I'd like to have this work with other WiFi cards too, but I'll start with this one. I've got a Cistron with an external antenna connector also.
Alan, ab1jx
