I am creating an IoT simulation using the CupCarbon IoT simulator. I want to store the simulation data in some database. Can anyone please help me with how to do that?
A simple solution could be to use the printfile command.
With this command you will be able to store the data you choose in a results file. Every sensor node can have its own results file.
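If you then want the data in an actual database, one option is to parse each node's results file and load the lines into SQLite afterwards. Below is only a rough sketch (shown in Swift with the bundled SQLite3 module, but any language works); the "time;value" line layout, the table schema, and the importResults name are assumptions you would adapt to whatever you actually write with printfile.

    import Foundation
    import SQLite3

    // Sketch: read one node's results file and insert each line into an
    // SQLite table. The "time;value" layout per line is an assumption.
    func importResults(file path: String, nodeID: Int, dbPath: String) {
        var db: OpaquePointer?
        guard sqlite3_open(dbPath, &db) == SQLITE_OK else { return }
        defer { sqlite3_close(db) }

        sqlite3_exec(db,
            "CREATE TABLE IF NOT EXISTS results (node INTEGER, time TEXT, value TEXT)",
            nil, nil, nil)

        guard let contents = try? String(contentsOfFile: path, encoding: .utf8) else { return }
        for line in contents.split(separator: "\n") {
            let parts = line.split(separator: ";").map(String.init)
            guard parts.count == 2 else { continue }
            // String interpolation is fine for a one-off import script;
            // use prepared statements for anything more serious.
            sqlite3_exec(db,
                "INSERT INTO results (node, time, value) VALUES (\(nodeID), '\(parts[0])', '\(parts[1])')",
                nil, nil, nil)
        }
    }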
Hope this works for you.
I would love to read about it if you find another solution.
BR
I'm currently finishing up a project for iOS in Swift in which I've used an SQLite database to store data with SQLite.swift.
At the time of coding I didn't know that SQLite files are stored locally on the device/simulator, and I now need a way to run the app with the database synced between two devices, as soon as possible. I tried swapping over to Firebase since I heard that would be a solution, but I'm not at all familiar with it and am worried it might be risky given that all my functions are written for SQL tables. I also thought of keeping the SQL and adding a Firebase database on top to fetch the data from, but I'm not sure how to execute that.
The solution doesn't have to be 100% reliable for all cases; I just need to simulate running the app on more than one device with synchronised data.
Does anyone have any suggestions for a way to do this? Or a way to store the data on a computer so that it can be accessed by both devices?
Any advice is appreciated!
There is no elegant solution with a pure SQLite database. Just a few options:
Moving to Core Data (you will get everything you want for sharing data between devices)
Uploading the database file to iCloud from one device and downloading it on the other
Building up a classic client-server setup
Let iOS do the synchronisation via iCloud, e.g. use Core Data with CloudKit: https://developer.apple.com/documentation/coredata/mirroring_a_core_data_store_with_cloudkit/setting_up_core_data_with_cloudkit?changes=_1
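For the Core Data + CloudKit route, the setup itself is small. Here is a minimal sketch along the lines of that Apple guide; it assumes a data model named "Model" and that the iCloud and remote-notification capabilities are enabled for the app target.

    import CoreData

    // Minimal Core Data + CloudKit stack: CloudKit mirrors the local store
    // across the user's devices. "Model" is a placeholder model name.
    final class PersistenceController {
        static let shared = PersistenceController()

        let container: NSPersistentCloudKitContainer

        init() {
            container = NSPersistentCloudKitContainer(name: "Model")
            container.loadPersistentStores { _, error in
                if let error = error {
                    fatalError("Failed to load store: \(error)")
                }
            }
            // Merge changes mirrored down from iCloud into the view context.
            container.viewContext.automaticallyMergesChangesFromParent = true
        }
    }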
I am currently developing an app with a big database. Think of the app as similar to TripAdvisor in the following way: multiple cities with different databases, each between 5 and 30 MB. My whole app is based on this data: every view needs some part of it.
I'm having big trouble finding the correct way to handle this huge chunk of data. I am currently using Core Data: I think it is a great tool, and for offline mode it definitely works great.
The problem is, I can't really combine it with "online" mode. If the user doesn't want to store the data for whatever reason, I'm not sure how I should handle it.
Should I simply keep it in variables that will be released anyway, or is there a better way to handle the data?
Right now I am loading it into Core Data without saving it: I know this is absolutely not a good way to manage it. How could I achieve this properly?
What is the best way to handle online data and then simply store it if the user wants it offline too?
Option 1:
The data is static / does not change frequently:
Keep the data online. You can fetch it via a web service and store it in a local database (i.e. Core Data). For example: a Bible or Quran app.
Option 2:
The data changes frequently and the user needs frequent updates: same as above. Additionally, the server sends a notification (i.e. a push notification) to the app when it has new data. After the app receives the notification, download the new data and save it to your database.
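A rough Swift sketch of the fetch-and-store part; the URL, the City struct, and the "City" entity name are placeholders. The idea is that everything comes from the web service, and you only call save() when the user wants the data kept offline.

    import CoreData
    import Foundation

    // Placeholder model for one record coming from the (hypothetical) web service.
    struct City: Decodable {
        let name: String
        let rating: Double
    }

    func fetchCities(into context: NSManagedObjectContext,
                     keepOffline: Bool,
                     completion: @escaping ([City]) -> Void) {
        let url = URL(string: "https://example.com/api/cities.json")!   // placeholder
        URLSession.shared.dataTask(with: url) { data, _, _ in
            guard let data = data,
                  let cities = try? JSONDecoder().decode([City].self, from: data) else {
                completion([])
                return
            }
            context.perform {
                for city in cities {
                    let object = NSEntityDescription.insertNewObject(
                        forEntityName: "City", into: context)
                    object.setValue(city.name, forKey: "name")
                    object.setValue(city.rating, forKey: "rating")
                }
                // Persist only when the user opted in to offline storage;
                // otherwise discard the unsaved objects once they've been used.
                if keepOffline {
                    try? context.save()
                } else {
                    context.reset()
                }
                completion(cities)
            }
        }.resume()
    }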
I'm currently using the DataApi to keep the data items synchronized between the handheld and the wearable.
Still, I want to make sure that all the data is stored and nothing is lost in the process.
I'm currently reading GPS parameters while the wearable is not connected to the handheld, and when they connect, they sync the data items.
How reliable is the DataApi?
Is my idea of creating a local file doubling my effort?
How can I create a local file on my Wear device and then access it?
Syncing data using the DataApi is reliable and I recommend using it; if you come across a scenario where the sync is not happening reliably, that should be considered a bug and reported as such. One issue that folks run into is that they create the same data item and don't get the onDataChanged() callback, but that is by design: if the very same data is added multiple times, there is no change, hence no callback fires.
Another factor you might want to consider is whether the data you create on one node is for consumption by all other nodes or only a targeted one; the DataApi syncs data across all connected nodes, so if I create a data item on watch1 and want to sync it with my phone, and there is a watch2 in the picture as well, watch2 also gets the same data.
If you end up using the DataApi, I strongly recommend putting in place a policy that removes the data once it has been synced and consumed; otherwise data will accumulate with no supervision and you will eventually run out of space.
To answer your questions:
I don't know how reliable it actually is, but we had problems where data updates didn't trigger the appropriate listeners on the watch side. So I'm not sure. Maybe someone has an official statement on this?
I think it depends on the amount of data you want to store, so I suggest you first get clear about the amount and then choose the format. Keep in mind that there is also the possibility of storing data in SharedPreferences.
These guys here tried to save an image on the watch, but it makes no difference whether it is an image file, text, or any other kind of file.
I know this is a super beginner question, but I'm a super beginner, so please be kind and help me get this right.
For the past 4 months I've learned some C and Java to get my programming off the ground, then started to learn Objective-C, and I'm currently learning to develop simple apps with the Apple documentation guides. So I built a simple app and am now learning how to persist its data and save it to a database (which is probably the same thing :/).
Now, I've never used a database before, and I'm wondering: when I build an app that uses an SQLite database (I guess it's the default store in Core Data), is the database something that is created on the machine that developed the app, or does each iPhone have its own database on the device to persist its data?
Would love to have any information possible, thanks a bunch!
I am working on some Core Data stuff and yes, there is an SQLite database on my development machine. When you create the DB (you are really creating a managed document that holds the database), you specify a file name and path within the app's file space. You then have to find where that sits in the development machine's file system. Here's where one of mine sits; obviously yours will not be in the same place, but it gives you an idea...
/Users/leecea/Library/Application Support/iPhone Simulator/7.1/Applications/F4AC4CDC-4726-4D0E-B6C6-55F0F5C63BB4/Documents/CMDatabaseDir/CMDatabase/StoreContent/persistentStore
I am running SQLite Manager and can examine or update the database to help debug my app.
When you push the app out to a device, it will create an instance of the DB on each device, in a file in the app's sandbox. They are not connected back to the one on your machine.
Maybe there are more technically correct answers, but this visualization works for me :)
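If you want to locate the file yourself while running in the simulator, printing the app's Documents directory at launch is the quickest way. A minimal sketch in Swift (the original question is Objective-C, but the idea is the same):

    import Foundation

    // Print where this app's Documents directory lives on the simulator, so the
    // .sqlite file inside it can be opened with a tool such as SQLite Manager.
    // Call this from application(_:didFinishLaunchingWithOptions:), for example.
    func printDocumentsDirectory() {
        let documents = FileManager.default.urls(for: .documentDirectory,
                                                 in: .userDomainMask)[0]
        print("Documents directory: \(documents.path)")
    }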
I store some data in my iOS app directly in a local .sqlite file. I chose to do this instead of CoreData because the data will need to be compatible with non-Apple platforms.
Now, I'm trying to come up with the best way to sync this file over iCloud. I know you can't sync it directly, for many reasons. I know CoreData is able to sync its DBs, but even ignoring that using CD would essentially lock this file into Apple platforms (I think? I've only looked into CD a bit), I need the iCloud syncing of this file to work across ALL of iCloud's supported platforms - which is supposed to include Windows. I have to assume that there won't be any compatibility for the CoreData files in the Windows API. Planning out the best way to accomplish this would be a lot easier if Apple would tell us any more than "There will be a Windows API [eventually?]"
In addition, I'll eventually need to implement at least one more sync service to support platforms that iCloud does not. It would be helpful, though not required, if the method I use for iCloud can be mostly reused for future services.
For these reasons, I don't think CoreData can help me with this. Am I correct in thinking this?
Moving on from there, I need to devise an algorithm for this, or find an existing one or an existing 3rd party solution. I haven't stumbled across anything yet. However, I have been mulling over a couple possible methods I could implement:
Method 1:
Do something similar to how Core Data syncs SQLite DBs: send "transaction logs" to iCloud instead, and build each local SQLite file from those.
I'm thinking each device would send a (uniquely named) text file listing all the SQL commands that device has executed, with timestamps. Each device would store how far along in each list of commands it has executed, and continue from that point each time the file is updated. If it received updates to multiple log files at once, it would execute each command in timestamp order.
Things could get 'interesting' efficiency-wise once these files get large, but it seems like a solvable problem.
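Roughly what the replay step of Method 1 could look like, as a sketch only; LogEntry, applyLogs, and the cursor bookkeeping are made-up names, and error handling is omitted:

    import Foundation
    import SQLite3

    // One log entry: an SQL statement and when it ran on the originating device.
    struct LogEntry: Codable {
        let timestamp: Date
        let sql: String
    }

    // Replays every not-yet-applied entry from each device's log, oldest first,
    // so all devices converge on the same database state. `cursors` remembers
    // how many entries of each log have already been applied.
    func applyLogs(_ logs: [String: [LogEntry]],      // device ID -> its full log
                   cursors: inout [String: Int],
                   db: OpaquePointer?) {
        var pending: [LogEntry] = []
        for (device, entries) in logs {
            let alreadyApplied = cursors[device] ?? 0
            pending.append(contentsOf: entries[alreadyApplied...])
            cursors[device] = entries.count
        }
        // Global timestamp order keeps the replay deterministic across devices.
        for entry in pending.sorted(by: { $0.timestamp < $1.timestamp }) {
            sqlite3_exec(db, entry.sql, nil, nil, nil)
        }
    }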
Method 2:
Periodically sync a copy of the working database to iCloud. Have a modification timestamp field in every record. When an updated copy of the DB comes through, query all the records with newer timestamps than some reference time and update the record in the local DB from the new data.
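For reference, the merge step of Method 2 might boil down to something like this (sketch only; the items table, the modified_at column, and how the remote copy arrives are all assumptions):

    import SQLite3

    // Attach the incoming copy of the database and pull across every row whose
    // modification timestamp is newer than the reference time. Relies on the
    // table having a primary key so INSERT OR REPLACE overwrites older rows.
    func merge(remoteCopyPath: String, since referenceTime: Double, into db: OpaquePointer?) {
        let sql = """
        ATTACH DATABASE '\(remoteCopyPath)' AS remote;
        INSERT OR REPLACE INTO items
            SELECT * FROM remote.items WHERE modified_at > \(referenceTime);
        DETACH DATABASE remote;
        """
        sqlite3_exec(db, sql, nil, nil, nil)
    }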
I see many potential problems with this method:
- Have to implement something further to recognize record deletion.
- The DB file could get conflicts. It might be possible to deal with them by handling each conflict version in timestamp order.
- Determining the date to check each update from could be tricky, as it depends on which device the update is coming from.
There are a lot of potential problems with method 2, but method 1 seems doable to me...
Does anyone have any suggestions as to what might be the best course of action? Any better ideas than my "Method 1" (or reasons why it wouldn't work)?
Try these two solutions from Ray Wenderlich:
Exporting/importing data through email:
http://www.raywenderlich.com/1980/how-to-import-and-export-app-data-via-email-in-your-ios-app
File Sharing with iTunes:
http://www.raywenderlich.com/1948/how-integrate-itunes-file-sharing-with-your-ios-app
I found it quite complex, but it helped me a lot.
Both method 1 and method 2 seem doable. Perhaps a combination of the two, in fact: use iCloud to send a separate database file that is a subset of the data, i.e. just the changed items. Or maybe another file format instead of an SQLite DB: XML/JSON/CSV, etc.
Another alternative is to do it outside of iCloud, i.e. a simple custom web service for syncing: each change gets submitted to a central server as JSON/XML over HTTP, and other devices pull updates from it (sketched below).
Obviously it depends how much data and how many devices you want to sync across, and whether you have access to an appropriate server and/or budget to cover running such a server. iCloud will do that for "free" but all it really does is transfer files. A custom solution allows you to define your syncing model as you wish, but you have to develop and manage it and pay for it.
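A minimal sketch of the "push a change to a central server" half of that idea; the endpoint URL and the Change fields are placeholders, not a real API:

    import Foundation

    // One local change, serialized as JSON and posted to a (hypothetical) server.
    struct Change: Codable {
        let table: String
        let rowID: Int64
        let payload: [String: String]
        let modifiedAt: Date
    }

    func push(_ change: Change, completion: @escaping (Error?) -> Void) {
        var request = URLRequest(url: URL(string: "https://example.com/sync/changes")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONEncoder().encode(change)
        URLSession.shared.dataTask(with: request) { _, _, error in
            completion(error)   // other devices would poll a matching GET endpoint
        }.resume()
    }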
I've considered the possibility of transferring a database file through iCloud, but I think I would run into classic problems of timing (a slow start for the user) and corrupted databases if the app is run on multiple devices simultaneously (iPad and iPhone, for example).
Sooo. I've had to use the transaction logs method. It really is difficult to implement, but once in place, seems ok.
I am using Apple's SharedCoreData sample as the base for this work. This link requires an Apple Developer Account.
I did find a much, much better solution from Tim Roadley; however, it only works for iOS, and I needed both iOS and macOS.
<rant> iCloud development really has to get easier and more stable! </rant>