I'm very new to Objective-C and I inherited a project to learn from and expand on. The end goal is to move to Swift in the long run, but for now I need to continue with Objective-C.
I have a very large and complex JSON document of roughly 4000 lines of variables and sub-objects.
On Android I can easily deserialise this JSON into an object and save it with Room, essentially by converting the complex parts into separate JSON strings and deserialising them again when needed.
How should I approach this in Objective-C? I found https://quicktype.io, which let me easily generate a class for the JSON structure, so that part is done.
I need to save this data in Core Data and then retrieve all of it when needed. I don't need to query specific parts of the data, so it doesn't matter if it comes down to saving the whole thing as a string.
Is there a way to easily generate table entities based on a class, or should I just mash it all together as a string? What is the best way here?
Sorry that I can't share any real code, because I don't have much to go on, but the sketch below shows roughly what I mean by saving the whole thing as a string.
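Something like this, where the entity name "CachedPayload" and the attribute name "json" are placeholders I made up rather than anything from the real project:

    #import <CoreData/CoreData.h>

    // Sketch only: assumes a "CachedPayload" entity with one string attribute "json".
    - (void)savePayload:(NSDictionary *)payload inContext:(NSManagedObjectContext *)context {
        NSError *error = nil;
        NSData *data = [NSJSONSerialization dataWithJSONObject:payload options:0 error:&error];
        if (!data) {
            NSLog(@"Could not serialise payload: %@", error);
            return;
        }

        NSManagedObject *record = [NSEntityDescription insertNewObjectForEntityForName:@"CachedPayload"
                                                                inManagedObjectContext:context];
        [record setValue:[[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding] forKey:@"json"];

        if (![context save:&error]) {
            NSLog(@"Core Data save failed: %@", error);
        }
    }

    - (NSDictionary *)loadPayloadFromContext:(NSManagedObjectContext *)context {
        NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"CachedPayload"];
        NSManagedObject *record = [[context executeFetchRequest:request error:NULL] firstObject];
        NSData *data = [[record valueForKey:@"json"] dataUsingEncoding:NSUTF8StringEncoding];
        return data ? [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL] : nil;
    }

The quicktype-generated classes would still be used for working with the data in memory; Core Data would only ever see the raw string.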
Related
In my project I have to parse a JSON schema that comes from the server.
It has an object called "Properties", which is essentially a dictionary in curly braces, and of course JSONSerialization.jsonObject parses it as a Dictionary.
Everything looks OK, BUT: I use these Properties to build my view (they define the fields to be filled in by the user), and in the end I have to preserve the order of those fields! As we know, as soon as the object is parsed into a Dictionary, the key order is lost. Does anybody know how I can parse this object while keeping the field order?
Additional information:
The structure of Properties is built by the user on the web, so their count is completely arbitrary from the mobile client's point of view. Furthermore, every object in Properties (e.g. a Group) can have its own properties containing other objects, so we end up with an arbitrary tree of nested objects, and their order matters to us.
If you don't care about interoperability, meaning third parties also being able to rely on the order, you can try to find a parser that preserves order (such as by reading it into an OrderedMap in Python instead of a regular dict; obviously this will differ by language).
If you care about 3rd parties, it's trickier. As the last person to respond noted, JSON itself does not support this, and JSON Schema is just JSON as far as parsing goes.
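If you can't switch parsers, one pragmatic workaround (a heuristic, not a true order-preserving parser) is to keep the raw JSON text around and sort the parsed keys by where each one first appears in it. A rough Objective-C sketch, assuming the raw schema bytes are still available; it can misfire if the same quoted key name also appears earlier inside a nested object or a string value, so treat it as a starting point only:

    // Heuristic: order the keys of a parsed dictionary by the position of their
    // first quoted occurrence in the raw JSON text.
    static NSArray *OrderedKeys(NSDictionary *parsed, NSData *rawJSON) {
        NSString *raw = [[NSString alloc] initWithData:rawJSON encoding:NSUTF8StringEncoding];
        return [parsed.allKeys sortedArrayUsingComparator:^NSComparisonResult(NSString *a, NSString *b) {
            NSUInteger posA = [raw rangeOfString:[NSString stringWithFormat:@"\"%@\"", a]].location;
            NSUInteger posB = [raw rangeOfString:[NSString stringWithFormat:@"\"%@\"", b]].location;
            if (posA == posB) return NSOrderedSame;
            return posA < posB ? NSOrderedAscending : NSOrderedDescending;
        }];
    }

The same idea carries over to Swift.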
I hope this isn't an inappropriate post, but I wanted to make sure my first steps implementing Parse as my backend are in the right direction, to save some time. I'm new to both iOS programming and the Parse SDK, so please bear with me!
In my app, users are able to create various polygon shape overlays on a Google Maps mapView, stored as a GMSMutablePath, which is basically a list of coordinates. Users will have at least one group of paths, each with at least one path. There will also be some information stored with each group, stored as strings or numbers. This information is specific to a single group of paths.
I'm trying to figure out the best way to store this data. My first basic question is 1) Can I store the GMSMutablePath as a whole in the Object data type? Or does the Object data type refer to a class that is created through Parse? This link (https://www.parse.com/questions/what-is-data-type-of-object-in-data-browser) is the 'best' explanation I found of the Object data type, and it isn't very clear to me.
My gut instinct is no, I can't store the GMSMutablePath object, and that Object refers to a Parse object. Which leads me to 2) How should I store this data, then? I can get the individual lat/long values of the coordinates that make up each path, and I can store those as numbers, and use the numbers to recreate the paths elsewhere. None of the paths should use too many coordinates, and there shouldn't be too many paths in each group.
Playing around a little bit in the data browser, I see that I can store arrays, but I'm not sure how those are formatted, as I'd need an array (of groups) of arrays (of paths) of arrays (of lat/long values). A little bit of googling tells me it can be done, but doesn't show me how. Can any data type be stored in any array, or is a data type specified? I'm used to C++ programming, so I'm used to an array containing a single type of element.

What I'm thinking is that I'd need an array of objects, which would be the groups of paths. Each of those objects would have the string/number information associated with the group, as well as an array for the paths within the group. Each of those paths would then have to be either an array or an object. Since for a path I just need the coordinate lat/long values, I think I can get away with each path being an array of numbers, and I can write my program to use one array with odd indexes being lat values and even indexes being long values.

That all being said, I'm not sure how to create all of that. I'm not looking for somebody to write my implementation for me, but all of the examples I can find are much simpler... if anybody could point me in the right direction, or has a better idea of how to do it, I'd love some pointers.
Each user is going to have their own groups, but that data is going to be shared with others at some point. The data will be associated with the user it belongs to. With that in mind, my last question is 3) Should I store all of this information specific to a user and their groups on the User class, or make it all a separate class entirely? My guess is that I should add an Object to the User class and store the groups within that Object. I just want to make sure I have that right, with future scalability in mind.

For example, when I pull the group data, am I going to have to pull the entire User data from another user, and if so, is that going to slow things down significantly? I'm thinking that I do have to send the entire user data, but I don't know if that poses any security risks. Would it be best to have a separate class for the groups and store the user ID associated with the groups? If I do this, should I also store the groups as an object on the User class?
Sorry for the wall of text, but thank you for any guidance you can provide!
If you need any clarification, let me know.
Thanks,
Jake
Creating a separate class to hold all the objects turned out to be unnecessary. It would only have held a few extra details that were just as easy to add to the user object directly, along with an array of objects on the user.
Some of the main things I learned: use addObject:forKey: to append to an array field, rather than setObject:forKey:, which sets a single object on a PFObject/User.
Parse fetching and saving happen on background threads, so if you're loading the data to do something specific with it, make sure the code that uses the data runs inside the block passed to [PFObject fetchInBackgroundWithBlock:].
Also, as you change the structure of your data on a Parse user/object, make sure you sign out of the current user and create a new one in your app, or you may run into lots of undefined behaviour that could crash your app.
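To make that concrete, here is roughly the shape the saving code ended up taking. The class name "PathGroup" and the field names ("name", "owner", "paths") are placeholders rather than the actual names from my project:

    #import <Parse/Parse.h>
    #import <GoogleMaps/GoogleMaps.h>

    - (void)saveGroupWithPath:(GMSMutablePath *)path name:(NSString *)groupName {
        // Flatten the path into [lat, lng, lat, lng, ...] NSNumbers, since a
        // GMSMutablePath can't be stored in Parse directly.
        NSMutableArray *coords = [NSMutableArray array];
        for (NSUInteger i = 0; i < path.count; i++) {
            CLLocationCoordinate2D c = [path coordinateAtIndex:i];
            [coords addObject:@(c.latitude)];
            [coords addObject:@(c.longitude)];
        }

        PFObject *group = [PFObject objectWithClassName:@"PathGroup"];
        group[@"name"]  = groupName;
        group[@"owner"] = [PFUser currentUser];
        [group addObject:coords forKey:@"paths"]; // appends this path to the array field

        [group saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
            if (!succeeded) {
                NSLog(@"Save failed: %@", error);
            }
        }];
    }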
I'm attempting to submit a large database containing many tables to a web service by sending the data as JSON. Extracting the data and converting it to a JSON string is working fine, but so far I have only implemented it to send one table at a time, each with its own ASIHTTPRequest. My question is whether concatenating the JSON strings generated from each table is a good idea, or whether I should first combine the tables in their abstract data form before converting them all to JSON together?
Alternatively if there is any other suggestion that would be good too.
It entirely depends on your needs. If the tables are unrelated, multiple requests may be more appropriate, because if a request fails (timeout or loss of connection), it won't affect any other requests. However, if you have tables with associations to one another, it would be better to send it all in one go, so that either all the data is transmitted or none of it is, and you don't end up with broken associations.
You can't just "concatenate" JSON strings. The result will not be legal JSON. You need to somehow "splice" them.
And, of course, the server on the other end must be capable of parsing the resulting JSON; it may only expect one table at a time, for example.
I don't see any issue with either of the two choices you proposed.
But I would suggest combining the tables before converting, so that you don't have to deal with string concatenation and other extra processing.
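To illustrate the second suggestion, something like the following keeps the whole submission as one legal JSON document instead of concatenating per-table strings. The table names and rows here are invented for the example:

    // Build one top-level dictionary keyed by table name, then serialise once.
    NSArray *users  = @[ @{ @"id": @1,  @"name": @"Alice" } ];
    NSArray *orders = @[ @{ @"id": @10, @"userId": @1 } ];
    NSDictionary *payload = @{ @"users": users, @"orders": orders };

    NSError *error = nil;
    NSData *body = [NSJSONSerialization dataWithJSONObject:payload options:0 error:&error];
    // Attach "body" as the POST data of a single ASIHTTPRequest, rather than
    // issuing one request per table.

The server then only has to parse one document, and related tables either arrive together or not at all.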
Would NSDictionary be a good data type for storing long string descriptions as values, with the names of those descriptions as keys, or would a different data type be more effective? I am using it for animals: an array holds all the data, and a dictionary maps each animal's name to its description. I'm just curious whether dictionaries are only used for smaller data sets, like states and capitals.
Or should I just use a #define with @"rhino description"?
[Animal animalObj:@"rhino" location:@"the water" description:[[self setGenericAnimals] valueForKey:@"Rhino"]]
NSDictionary is OK for this. What's great about using NSDictionary is that you can save your data as JSON in a separate file and then deserialize it into an NSDictionary when you need it. This makes it easier to manage all your data and keeps it separate from your application code.
This is a good starting point on how to convert JSON into an NSDictionary:
http://www.raywenderlich.com/5492/working-with-json-in-ios-5
Remember, though, that the entire NSDictionary must fit into memory, so if you're going to have thousands of strings you might want to split them across several JSON files and only deserialize the ones you need, when you need them.
Another thing to remember is that if you want to do comparisons and sorting on your objects, you are better off using Core Data, as it lets you store lots of strings and access them efficiently.
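For reference, loading a bundled JSON file into an NSDictionary is only a few lines; the file name "animals.json" and the key "Rhino" are just examples:

    // Example only: assumes an "animals.json" file in the app bundle whose top
    // level is an object mapping animal names to description strings.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"animals" withExtension:@"json"];
    NSData *data = [NSData dataWithContentsOfURL:url];
    NSError *error = nil;
    NSDictionary *animals = [NSJSONSerialization JSONObjectWithData:data options:0 error:&error];
    NSString *rhinoDescription = animals[@"Rhino"];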
I'm trying to convert an XML document into a dataset that I can import into a database (like SQLite or MySQL) that I can query from.
It's an XML file that holds most of the stuff in attributes. This is part of a Rails project so I'm very inclined to use Ruby (and that's the language I'm most comfortable with at the moment).
I'm not sure how to go about doing that and I'd welcome both high-level and low-level contributions.
xmlsimple can convert your XML into a Ruby object (or nested objects) which you can then iterate over and do whatever you like with. It makes working with XML in Ruby really easy. As Jim says, though, it depends on your XML complexity and your needs.
There are three basic approaches:
Use Ruby's XML stream parsing facilities to process the data with Ruby code and write the appropriate rows to the database.
Transform the XML using XSLT into a non-XML stream format and feed that into a Ruby program that updates the database.
Transform the XML with XSLT into a format acceptable to the bulk-loading tool for whatever database you are using.
Only you can determine the best approach depending on the XML schema complexity and the type of mapping you have to perform to get it into relational format.
It might help if you could post a sample of the XML and the DB schema you have to populate.
Will it load model data? If you're on *nix take a look at libxml-ruby. =)
With it you can load the XML, and by iterating through the nodes you can create your AR objects.
You can have a look at the XMLMapping gem. It lets you define different classes depending upon the structure of your XML. Now you can create objects from those classes.
Now you will have to write some module which actually converts these XMLMapping objects into ActiveRecord objects. Once those are converted to AR objects you can simply call save to save those objects into the corresponding tables.
It is a long solution but it will let you create objects out of your XML without iterating over it. XMLMapping will do it for you.
Have you considered loading the data into an XML database?
Without knowing what the structure of the data is, I have no idea what the benefits of an RDBMS over an XML DB are.