Best way to work with a small DB in a Swift iOS app from a performance point of view?

I'm using an SQLite DB to populate an iOS app with data about irregular verbs -- it's a language-learning app.
The DB has three tables: verb categories, verbs, and verb elements. The last has one entry per verb termination, so 50-odd per verb. The total DB is about 2,500 entries, each with 3-5 short string fields.
My question is whether the best thing to do from a performance point of view is:
(1) - write structs to model the data, each with appropriate functions, and load the db into memory in the form of structs when the app initialises, OR
(2) - write functions to query the sqlite db on the fly for each new view
My guess is that (1) is much better for a small db of this size, but I thought I would consult your collective wisdom as I'm a novice. Thanks in advance for any advice.
Btw, this is a follow up question to one I asked about how to get data into an ios app:
What's the best practice for storing and importing (non-user) data for an iOS app in swift/swiftUI/Xcode?

Both solutions should be sufficient for your case.
When it comes to performance, the question is what you are measuring:
Quick loading time
Quick operations in background
Smooth user interface and interactions
Energy consumption
Memory consumption
...
For instance, when you increase memory consumption, understand that your app becomes more likely to be terminated while in the background. If the user takes five minutes off and opens a game that loads an immense amount of memory, your application is more likely to be terminated so that memory can go to the other application.
But again: both solutions should be fine for the size you are describing here. If the data grows over time, you may need to reconsider.
There is also an option (3): hardcode these values directly into structures, so you lose even the initial load time. Just make a script or something that transforms your files of strings into source code.
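A generated source file in that vein might look something like this -- a minimal sketch, where the type, field, and verb names are all hypothetical stand-ins for whatever your script emits:

```swift
// Hypothetical generated output: one static array per table,
// compiled into the binary so there is no load step at launch.
struct Verb {
    let id: Int
    let categoryID: Int
    let infinitive: String
}

enum VerbData {
    // A build-time script would emit these literals from your string files.
    static let verbs: [Verb] = [
        Verb(id: 1, categoryID: 1, infinitive: "ser"),
        Verb(id: 2, categoryID: 1, infinitive: "ir"),
        // ... ~2,500 entries total
    ]
}
```

Lookups then become plain array/dictionary operations over `VerbData.verbs`, with no database or decoding cost at startup.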

Have you thought about Core Data? I use it for a small DB and it works fine. There is of course the learning curve of Core Data, but once you get over that, it does a lot of the work for you and has some integrations with SwiftUI as well. I am not saying it is more or less performant in any way, just that once over the learning curve it is pretty easy to use, and of course it is optimized by the Apple Core Data team.
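For a sense of scale, a predicate-based fetch in Core Data is only a few lines. A minimal sketch, assuming a hypothetical `Verb` entity with an `infinitive` string attribute in your model:

```swift
import CoreData

// Sketch: fetch verbs whose infinitive starts with a given prefix.
// `Verb` and `infinitive` are hypothetical entity/attribute names.
func verbs(startingWith prefix: String,
           in context: NSManagedObjectContext) throws -> [Verb] {
    let request = NSFetchRequest<Verb>(entityName: "Verb")
    request.predicate = NSPredicate(format: "infinitive BEGINSWITH[cd] %@", prefix)
    request.sortDescriptors = [NSSortDescriptor(key: "infinitive",
                                                ascending: true)]
    return try context.fetch(request)
}
```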

Related

SQLite vs Core Data: saving large amounts of data

I'm making a statistical analysis app which needs to store large amounts of data locally. To give you an idea of what I'm talking about, I've made this illustration (not exactly the right information, but very similar):
The app will keep track of several thousand destinations, with each destination's population, temperature, number of cars, etc. These numbers will be represented in a graph so that you can look at the numbers' development over time. This will run over a long period of time; in other words, thousands of dates for each data type for each of thousands of cities.
To achieve this I need to save large amounts of data, and it is preferred to be done locally (is that crazy?). I'm stuck between digging deep into the foundations of Core Data, or using my already decent skills in SQLite.
If you suggest I should use SQLite, could you show how you would implement this in your app (some code, perhaps)?
If you suggest I should use Core Data (mainly for performance), please show me how you would implement this type of data model with entities, attributes, relationships, etc. I imagine dictionaries saved in Core Data would be a good solution?
Thank you in advance.
If you're going with SQLite with Swift, I highly recommend using this project. I am using it in my current project and it is absolutely fantastic and works perfectly (I'm not affiliated in any way with the project or author). You actually drag that project into your project and it becomes a subproject; then you just set it up as 1. a target dependency, 2. a framework to link with, 3. a copy-framework build phase, and it just works. Then you can handle your database with brilliantly constructed Swift interfaces rather than ugly, cumbersome libsqlite calls.
I have used it for modest amounts of data. A few databases and multiple tables. Clean and intuitive. So far I haven't found a single bug of any kind. And Stephen Celis, the author, was responsive when I asked a question about a feature that wasn't documented (but actually is present and works, it turns out). It's a prodigious effort.
It's so clean and tightly integrated with Swift that, if I didn't know better, I'd think Apple itself had added SQLite support to the Swift language.
https://github.com/stephencelis/SQLite.swift
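To give a rough idea of what queries look like with SQLite.swift -- the table and column names below are made up, and the API may have shifted since (recent versions, for instance, may require qualifying `Expression` as `SQLite.Expression`), so treat this as a sketch and check the project's README:

```swift
import SQLite

// Open the database read-only and run a typed query; no raw SQL strings.
let db = try Connection("path/to/verbs.sqlite3", readonly: true)

let verbs = Table("verbs")                     // hypothetical table name
let id = Expression<Int64>("id")               // hypothetical columns
let infinitive = Expression<String>("infinitive")

// Iterate over all verbs whose infinitive starts with "s".
for verb in try db.prepare(verbs.filter(infinitive.like("s%"))) {
    print(verb[id], verb[infinitive])
}
```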
Core Data is an object persistence model, and there's your answer really: every object has a little overhead, so having thousands of objects in memory at one time is problematic. If you already know SQL, then that's another plus for SQLite.
(Discussion of the overall merits of Core Data is outside the scope of this question. The "Music" app pulls it off using Core Data with thousands of items, I think, but only because it needs to display a subset of items at a time. Map Kit drops down to C, and handles itself quite impressively with tens of thousands of single-digit-byte items, as you can see by running Instruments with a Map Kit app.)
I've used SQLite in iOS and it's not a problem, SQLite being C-based and Objective-C a strict superset of C. There are various "wrappers", but look at these carefully, in case they take you back to every-row-an-object. I didn't use one, and found the SQLite C setup tricky but fine. EDIT: I used this tutorial, which is dated now on the Objective-C side but is a clear overview of how to integrate SQLite.
The only catch is that the bundle/documents distinction in iOS will catch you if you ship with a large amount of data and want to write to that database: you can't modify files in the bundle, so you need to create or copy the SQL database to the Documents folder once, perhaps on first app launch, before you can use it. Since you can't delete from the bundle, you then have your database twice. That's a problem inherent to the way iOS is set up.
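That one-time copy is straightforward with FileManager. A sketch, where the bundled file name `verbs.sqlite3` is an assumption:

```swift
import Foundation

// Copy the read-only database shipped in the app bundle into the
// Documents folder on first launch, so it can be opened read-write later.
func installDatabaseIfNeeded() throws -> URL {
    let fm = FileManager.default
    let docs = try fm.url(for: .documentDirectory,
                          in: .userDomainMask,
                          appropriateFor: nil,
                          create: true)
    let dest = docs.appendingPathComponent("verbs.sqlite3")
    if !fm.fileExists(atPath: dest.path) {
        guard let source = Bundle.main.url(forResource: "verbs",
                                           withExtension: "sqlite3") else {
            throw CocoaError(.fileNoSuchFile)   // db missing from the bundle
        }
        try fm.copyItem(at: source, to: dest)
    }
    return dest
}
```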

Restkit + Rails API or TouchDB + CouchDB?

Does any of you have the answer to which model is more trivial for iOS cloud-data-driven apps?
Restkit + RESTful API or TouchDB + CouchDB?
It is really hard to answer your question without knowing what you want to do. You can certainly make both work for different use cases.
There are a couple of things to consider:
If offline mode is important.
TouchDB is a fully functional NoSQL datastore running on the device, and it allows the user to read and write data even without a connection. RestKit needs a connection to fully work.
Size of the dataset to replicate
TouchDB will replicate data to the device, which is easier if you have a relatively small dataset. The size is measured by how many documents are in your database, and also by the size of the documents.
Also, most of the time the device only needs to do a full replication when the app first starts (the initial replication), so you can get around this (for example, by embedding most of the data in the app bundle itself) and only replicate the delta.
By the way, you can certainly use both, and get the benefits of both.
Employing CouchDB+TouchDB completely takes the hassle of sync away from you. You don't need to care about sync, it just works. You get a notification upon sync, update your UI, that's it.
Replacing the Core Data stack with TouchDB is also fairly easy. The model objects basically stay the same, only that they now inherit from CouchModel instead of NSManagedObject. It's nearly trivial.
Querying is slightly different from Core Data. You define a set of views (indexes) that slice and sort your data along different criteria, and then query these indexes with a start and end key. So, there's no explicit query language, but that's no inconvenience, really.
I've moved a Core Data app to TouchDB, and it was completely painless. In about 3 days, I had CRUD and sync up and running.

Core Data: should I use a binary store for little data (less than 1 MB) to improve query speed?

My iOS app requires little local data (the average user won't use more than 1 MB), but it does many queries (fetches) with predicates, so I'm thinking of loading the whole SQLite file into memory at launch to improve query speed, but I didn't see a way to do that.
So I'm thinking of using NSBinaryStoreType, which will be loaded into memory when the app launches and queried much faster. Am I doing this right?
user465191,
Rather than speculate on the performance differences, perhaps you should just try both types? It is extremely easy to create and use both store types.
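Trying both really is just a matter of the store type you pass when adding the persistent store. A minimal sketch using the coordinator API (the function and its parameters are illustrative, not a fixed recipe):

```swift
import CoreData

// Sketch: attach the same model as either store type, so the two
// (NSBinaryStoreType vs. NSSQLiteStoreType) can be benchmarked.
func makeCoordinator(storeType: String,
                     model: NSManagedObjectModel,
                     at url: URL) throws -> NSPersistentStoreCoordinator {
    let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)
    try coordinator.addPersistentStore(ofType: storeType,
                                       configurationName: nil,
                                       at: url,
                                       options: nil)
    return coordinator
}
```

Run the same fetches against each coordinator under Instruments and compare.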
In reality, every app has a working set. Core Data's managed object contexts and store coordinators (and, I would add, SQLite itself) are quite good at caching information. I doubt you will see much difference on a database of your size.
As in all engineering endeavors, use data to guide your hand. Your app is different than mine. I would love to know the results of your comparison. Your mileage will vary.
Andrew

Storing facebook friend list in coredata?

I am developing an app that uses a user's Facebook friends for some sort of interaction.
I am using Core Data to store some user data, and I am not sure whether I want to store the user's friends in the database as well, for caching.
It's a speed-over-storage kind of situation: storage-wise it's O(n), versus the connection cost of fetching the friends list each time and then manipulating it as I need to.
Of course there has to be a handler to check if the friend list got bigger or smaller but let's assume that I have that validation happening lazily and in the background while the application loads.
Any thoughts on whether it would be wise to save it to the Core Data database, or should I just be fetching it and re-populating the database every time the application runs?
Your question is for thoughts pertaining to what is "wise" in this situation. Actually, my answer is the same for every situation.
Write code that is simple for humans to understand.
Then, do lots of performance analysis to determine where you may need to focus on performance. Fortunately, Xcode ships with a pretty nice tool for that purpose (Instruments).
So, IMO, it would be wise to implement it in the way that is easiest and most straightforward. Then run performance analysis, and address the needs that the performance tools tell you need to be addressed.

Achieving 25K Concurrent connections in RubyOnRails Application

I am trying to build a suggestion-board application, where a user raises a query and multiple people post at the same time. It is expected to support at least 25k concurrent users. The question format also includes checkboxes and radio buttons; in that case the users will be writing to the DB.
Please let me know how I can achieve this in Ruby on Rails:
- hardware support (specific Hardware LB)
- software support like (DB clustering/App server clustering/ Web traffic resolution)
I think your best plan is to worry about scaling to this level once you have that many users. There's nothing stopping you from achieving this in Rails, or indeed any other framework/language.
The problem with trying to design your architecture up front to scale to this level is that, at this point, you have absolutely no idea where the pain points are going to be. Are there specific pages which are going to hit the database harder? Are some of your pages heavy on HTML and images? There are so many questions to ask that simply cannot be answered effectively until you've got something out there.
This doesn't mean that you shouldn't worry about scaling - by all means, try to design your data structures in such a way as to allow you to scale later. But put off any major decisions, and think about them later when you have some hard data to work with.