Improve performance of bulk upload in Ruby - ruby-on-rails

I want to improve the database performance of a bulk upload, which is currently taking a lot of time.
Is there any way I can improve the DB performance for bulk uploads?
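If the upload currently creates records one at a time through Active Record, most of the time usually goes to per-row INSERTs, validations, and callbacks. Below is a minimal sketch of batching the rows into multi-row INSERTs with insert_all (built into Rails 6+); the Upload model, CSV file name, and column names are hypothetical placeholders for whatever your upload actually writes.

```ruby
require "csv"

# Hypothetical example: rows arrive as an array of hashes parsed from a CSV.
rows = CSV.read("bulk_upload.csv", headers: true).map(&:to_h)

rows.each_slice(1_000) do |batch|
  # insert_all issues one multi-row INSERT per batch and skips per-row
  # instantiation, validations, and callbacks -- so timestamps must be
  # filled in by hand.
  now = Time.current
  Upload.insert_all(
    batch.map { |r| r.merge("created_at" => now, "updated_at" => now) }
  )
end
```

On Rails versions before 6, the activerecord-import gem offers a similar Model.import API; either way, the gain comes from issuing one INSERT per thousand rows instead of one per row, and from skipping Active Record object creation for each record.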

Related

Hosting <10gb of read-only data with reasonably fast but cheap access [closed]

tl;dr
I've currently got a PostgreSQL database with about 10 GB of data. This data is "archived" data, so it won't ever change, but I do need it to be queryable/searchable/available for reading in the cheapest way possible for my Rails app.
Details:
I'm running a Digital Ocean server, but this is a non-profit project, so keeping costs low is essential. I'm currently using a low-end droplet: 4 GB Memory / 40 GB Disk / SFO2 - Ubuntu 16.04.1 x64.
Querying this data/loading the pages it's used on can occasionally take a significant amount of time. Some pages time out because they take over a minute to load. (Granted, those are very large pages, but still.)
I've been looking at moving the database over to Amazon Redshift, but the base prices seem high, as Redshift is aimed at MUCH larger projects than mine.
Is my best bet to keep working on making the queries smaller and only rendering small bits at a time? Even basic pages have long query times because the server is slowed down so much. Or is there a method similar to Redshift that will let me query the data quickly while also storing it externally for a reasonable price?
You can try Amazon S3 and Amazon Athena. S3 is a super simple storage service where you can dump your data in text files, and Athena is a service that provides a SQL interface to data stored on S3. S3 is super cheap, and Athena charges per query, based on the amount of data scanned. Since you said your data isn't going to change and will be queried rarely, it's a good fit. Check this out: 9 Things to Consider When Choosing Amazon Athena
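If the Rails app needs to run those occasional queries itself, a rough sketch with the aws-sdk-athena gem looks like this; the region, database, table, and results bucket names are made up, and Athena queries are asynchronous, so you have to poll for completion:

```ruby
require "aws-sdk-athena"

client = Aws::Athena::Client.new(region: "us-west-2")

# Kick off the query; results land as CSV in the given S3 location.
execution = client.start_query_execution(
  query_string: "SELECT * FROM archived_events WHERE user_id = 42 LIMIT 100",
  query_execution_context: { database: "archive" },
  result_configuration: { output_location: "s3://my-athena-results/" }
)
query_id = execution.query_execution_id

# Poll until Athena reports a terminal state.
loop do
  state = client.get_query_execution(query_execution_id: query_id)
                .query_execution.status.state
  break if %w[SUCCEEDED FAILED CANCELLED].include?(state)
  sleep 1
end

# Fetch the result rows (the first row is the header).
results = client.get_query_results(query_execution_id: query_id)
results.result_set.rows.each do |row|
  puts row.data.map(&:var_char_value).inspect
end
```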

Aurora vs MemSQL for low throughput but low latency purpose [closed]

We are changing our current user-side MySQL database. We have a website + mobile app whose users around the US query our database. The relevant data is contained in three tables, and a join query against those three tables is needed to send the relevant results to the users.
The results sent back to the users are small (<6 KB). If our objective is low latency and throughput is a low priority, which of the two following databases would perform better:
MemSQL or AWS Aurora?
They both have the same starting cost for hardware (~$0.28/hr). We are only considering these two databases at this stage so that we can keep using our in-house MySQL knowledge.
I like that I can outsource the DB headache to Aurora. But surely MemSQL's ability to read/write in memory is a lower-latency solution?
Nothing beats in-memory for speed, and that is what MemSQL is built for. It stores tables (in rowstore mode) in memory and uses a custom query engine that compiles queries to an intermediate language so they execute as fast as possible. Aurora is more like a classic disk-based MySQL instance, but with lots of infrastructure changes and optimizations to make the most of Amazon's services.
Before deciding, though, you need to figure out what "low latency" means: is it within seconds or milliseconds?
MemSQL will be faster, most likely in milliseconds, depending on your query. Aurora will be slower but can probably deliver sub-second responses, again depending on your query, the resources allocated, and how the data is structured.
Without any more details, the answer is to judge by what your performance tolerance is and then experiment.
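If it helps to make that experiment concrete: both Aurora MySQL and MemSQL speak the MySQL wire protocol, so the same small harness (sketched here with the mysql2 gem) can measure end-to-end latency of your actual three-table join against either candidate. The host, credentials, schema, and query below are placeholders:

```ruby
require "mysql2"
require "benchmark"

client = Mysql2::Client.new(
  host:     ENV.fetch("DB_HOST"),
  username: ENV.fetch("DB_USER"),
  password: ENV.fetch("DB_PASS"),
  database: "app"
)

# Stand-in for the real three-table join.
JOIN_QUERY = <<~SQL
  SELECT u.id, p.plan_name, l.last_seen_at
  FROM users u
  JOIN profiles p ON p.user_id = u.id
  JOIN logins   l ON l.user_id = u.id
  WHERE u.id = 12345
SQL

# Run the query repeatedly and report median and tail latency.
latencies = Array.new(100) { Benchmark.realtime { client.query(JOIN_QUERY).to_a } }.sort
puts "p50: #{(latencies[49] * 1000).round(2)} ms"
puts "p99: #{(latencies[98] * 1000).round(2)} ms"
```

Run it from the same region and availability zone your app servers will use, since for a <6 KB result the network round trip is likely to dominate.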

Is it bad to have a page with a lot of queries? [closed]

I have a page with activities of users and their friends, like social networks do with posts, likes, comments, and reshares, which results in a lot of queries to render this page: sometimes 15 queries, sometimes more.
Is this a bad thing? I'm trying to optimize a lot of things, but the number of queries is still high... I need your opinions.
Slow responses to requests are bad, and lots of queries make for slow responses. But you can have your cake and eat it too with prudent use of caching. Here's a guide to caching in Rails, although you can also cache before the request gets to Rails, e.g. with Varnish.
It's generally a good idea to keep the number of database requests to a minimum; they tend to be slow. The database also tends to be the most difficult thing to scale, so keeping the load down helps. You can reduce database calls by rewriting your queries or (more likely) by caching in your application or using HTTP caching.
An important thing to do before any of this though, is to define what level of performance you require. Depending on your load it could be that 15 queries is perfectly acceptable. It could also be possible that greater performance gains could be achieved by fixing something else in the code base.
Try to define your requirements, measure to find bottlenecks, then fix.
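As a concrete illustration of the caching route, here is a minimal sketch of low-level caching in Rails, assuming a hypothetical Activity model and friend_ids association; the expensive per-friend counts are computed once and then served from the cache until the entry expires:

```ruby
# Returns { friend_user_id => activity_count }, recomputed at most
# every 10 minutes; the model and association names are hypothetical.
def friend_activity_counts(user)
  Rails.cache.fetch(["friend_activity_counts", user.id], expires_in: 10.minutes) do
    Activity.where(user_id: user.friend_ids).group(:user_id).count
  end
end
```

Fragment caching in the views (the cache helper around each feed item) works the same way and is covered in the Rails caching guide mentioned above; it is also worth checking for N+1 queries (e.g. a missing includes) before reaching for caching at all.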

Convert Asp.Net MVC/ EF application to NoSQL application [closed]

We are thinking about converting an ASP.NET MVC/EF/SQL application to a NoSQL application to improve development speed and application speed. There seem to be mixed responses about RavenDB; if you have used NoSQL in a .NET environment, I'd like to know what you are using.
Also, our business still relies on SSRS for reports, and it is important that we can still export data from the NoSQL store to a SQL environment. What would you suggest for exporting data from NoSQL to SQL Server?
Thanks for your comments.
My 2 cents:
RavenDB is a good document database. I'm using it in a .NET environment and it integrated nicely.
Move to a NoSQL database only if your data makes sense in such a structure or the foreseen performance improvement is compelling enough. RavenDB is a document database so it works great with documents but it's much harder to work with relational data. You'll likely find that keeping relational data in a SQL database is more efficient from a development perspective, but perhaps you'll find better performance with a NoSQL database (probably not RavenDB) at the expense of some developer efficiency.
Be open to a mix of SQL and NoSQL in your environment. For example you may find that your relational data fits best in SQL and your document data fits best in RavenDB. Or perhaps you'll want your document data in both places which would require some SQL-RavenDB syncing.
For exporting from RavenDB to SQL, check out RavenDB's Index Replication Bundle. Please see Matt's comment about the latest bundle to use.
You will find the open-source BrightstarDB the closest fit to a DbContext-style solution for Entity Framework. As for pulling data out, you can export it either by writing your own app/query or by using its own query language, so you have a few options there.
Check it out at BrightStarDb.com

Ruby on Rails and PostgreSQL scaling and load balancing (hardware and setup recommendations)? [closed]

What we want:
I am looking for hardware, Ruby version, and PostgreSQL replication method recommendations for scaling a Ruby on Rails application that currently sits on one server.
What we use it for:
The company I work for currently has one server that handles both the database and our Ruby on Rails application. Our application takes in GPS and other data from many different sources (mobile devices, etc.) at a rate of about one update every 5 seconds. After extensive testing we are ready to deploy it, but we want to prepare for several hundred to a few thousand devices sending data to the server. As far as user interaction goes, we have reports, etc. available based on the posted data.
I am currently leaning towards PostgreSQL master-master replication and JRuby, using separate servers for PostgreSQL and Ruby.
I know this has been asked a lot, but I figured I should take all research paths possible. I plan on documenting the entire process to share with others, so any input you can provide would be great.
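One note on the replication choice: PostgreSQL's built-in streaming replication is single-primary, so a common pattern is one write primary plus one or more read replicas rather than true master-master (which needs third-party tooling). If you end up on a recent Rails (6+), read/write splitting is built in; below is a minimal sketch, assuming a config/database.yml that defines a primary entry for the write server and a primary_replica entry (with replica: true) for the standby. The host and database names are hypothetical.

```ruby
# app/models/application_record.rb -- route writes to the primary and
# allow reads to be served from the replica (Rails 6+ multiple databases).
class ApplicationRecord < ActiveRecord::Base
  self.abstract_class = true

  connects_to database: { writing: :primary, reading: :primary_replica }
end

# Explicitly force a block of heavy report queries onto the replica:
ActiveRecord::Base.connected_to(role: :reading) do
  # report queries here
end
```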
