Can we perform load/stress testing using the New Relic tool? - load-testing

We have a requirement to perform stress testing on a website with 1,000 concurrent users. Is it possible using New Relic (without using JMeter)?

New Relic is a tool for monitoring applications (on the server and the end-user experience), transactions, and servers. You could use it to measure the effects of a large load, but it doesn't apply the load.
You could use something like Blitz, SOASTA, Cloud Assault, or BlazeMeter to apply load and run tests.

Yes, it is possible. But with 1,000 concurrent users, I think you need hosting that provides cloud servers. I used BlazeMeter, and after testing I had very good reporting. I followed their tutorial on the New Relic plugin.
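If you would rather generate the load yourself while New Relic only does the monitoring, even a short script can drive concurrent requests. Below is a minimal sketch in Python using asyncio and aiohttp; the target URL and the user/request counts are placeholders you would need to adjust, and a hosted tool will still give you much better reporting.

```python
# Minimal concurrent-load sketch: drives N simultaneous "users" against a URL
# while a monitoring tool such as New Relic observes the server side.
# The URL and the counts below are placeholders, not from the original answer.
import asyncio
import time

import aiohttp

TARGET_URL = "https://example.com/"   # placeholder target
CONCURRENT_USERS = 1000               # the concurrency asked about in the question
REQUESTS_PER_USER = 10

async def user(session: aiohttp.ClientSession, results: list) -> None:
    for _ in range(REQUESTS_PER_USER):
        start = time.monotonic()
        async with session.get(TARGET_URL) as resp:
            await resp.read()
            results.append((resp.status, time.monotonic() - start))

async def main() -> None:
    results: list = []
    connector = aiohttp.TCPConnector(limit=CONCURRENT_USERS)
    async with aiohttp.ClientSession(connector=connector) as session:
        await asyncio.gather(*(user(session, results) for _ in range(CONCURRENT_USERS)))
    latencies = sorted(t for _, t in results)
    print(f"{len(results)} requests, p95 latency: {latencies[int(len(latencies) * 0.95)]:.3f}s")

if __name__ == "__main__":
    asyncio.run(main())
```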

Related

Continuous delivery of a docker container to Google Cloud

Goal:
I have an application packaged in a Docker container. I want to use continuous integration with push-to-deploy, using Bitbucket Pipelines to deploy to Google Cloud. I need access to an SQL database (MariaDB preferably) and some kind of caching system (be it memcache, redis, or something else).
Problem:
I'm entirely unsure which services I need from the cloud provider to facilitate this as simply as possible while still being cost-effective. I looked into using Google's App Engine, but (unless I'm doing something odd) 1 vCPU with 1 GB of RAM and 10 GB of storage came to $55/month USD, which is far more than I really want to pay, and I'm not even sure it's what I need. I don't need much power: these are very small apps used by a very small number of people.
Also, again, I may be doing something wrong, but I was unable to find a caching solution with Google that wasn't insanely expensive (Memorystore). Basically, I'm completely overwhelmed by the number of options and am just looking for a cost-effective, simple solution for continuous delivery of a Docker application to Google Cloud.
Using the tools discussed here, you can set up an end-to-end continuous delivery pipeline covering the code, build, test, deploy, and monitor phases of software development across multi-cloud, hybrid, or on-premise environments. This is likely a good way to start.

What Load Test tools are available that can consume AWS ALB logs from S3

Are there any recommended load test tools / services that can cycle through AWS Application Load Balancer logs stored in S3, preferably utilising the timestamps to provide piano-roll-type replay functionality?
aws-log-replay seems to be what you're looking for; it can replay requests with a defined concurrency.
As for more popular load testing tools, I can only think of Apache JMeter with the Access Log Sampler, which supports access log files from Tomcat, WebLogic, Resin, and SunOne out of the box. However, you can provide your own implementation of the Generator class, or dynamically populate HTTP Request sampler fields using a JSR223 PreProcessor, as described in the Stop Making Assumptions! Learn How to Replay Your Production Traffic With JMeter guide.
Actually, I don't think you will be able to produce a realistic load by replaying your access logs. It might work for something simple like static content, but if your application involves authentication, sessions, complex workflows, etc., I'm afraid your "replay" attempt will get stuck at the login page.
So instead of trying to replay complex scenarios from the logs, I would suggest sticking to the load testing tool of your choice and creating the scenarios from scratch. The access logs can be used to identify the workload distribution (e.g. X % of users normally do this, Y % do that) and the anticipated concurrency (e.g. at time X we had Y online users).
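To illustrate that last point, here is a hedged sketch (the bucket name and key are made up for the example) that pulls a gzipped ALB access log from S3 with boto3 and tallies requests per URL and per minute, i.e. the kind of workload-distribution summary you would then feed into your load test design:

```python
# Sketch: derive workload distribution from an ALB access log stored in S3.
# Bucket and key are illustrative placeholders; adjust to your own log location.
import gzip
from collections import Counter

import boto3

BUCKET = "my-alb-logs"                     # placeholder bucket name
KEY = "alb/2024/01/01/my-lb.log.gz"        # placeholder object key

s3 = boto3.client("s3")
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

per_url = Counter()
per_minute = Counter()

for line in gzip.decompress(body).decode("utf-8").splitlines():
    # In ALB access logs, the second space-separated field is the ISO timestamp
    # and the first double-quoted field is the request line ("GET https://... HTTP/1.1").
    timestamp = line.split(" ", 2)[1]
    request = line.split('"')[1]
    url = request.split(" ")[1].split("?")[0]
    per_url[url] += 1
    per_minute[timestamp[:16]] += 1        # truncate timestamp to the minute

print("Top URLs:", per_url.most_common(5))
print("Peak minute:", per_minute.most_common(1))
```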

Stress Testing with Postman

I am building a website with Rails on AWS, and I am trying to determine the best way to stress-test it while also getting a rough idea of the cost I will be paying per user. I have looked at tools like Selenium, and I am curious whether I could do something similar with Postman.
My objectives are:
Observe what kind of load the server would be under during the test, and how the CPU and memory are affected.
See how the generated load would affect the CPU cycles on the system, and what cost that would generate for me from AWS.
Through Postman I can easily generate REST calls to my Rails server and simulate user interaction. If I created some kind of multithreaded application that made many calls like this to the server, would that be an efficient way to measure these objectives?
If not, is there a tool that would help me with either (or both) of these objectives?
thanks,
You can use BlazeMeter to do the load test.
This AWS blog post shows you how to do it.
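If you want to try the multithreaded-caller idea from the question before adopting a dedicated tool, something along these lines is enough to get a first impression. This is only a sketch (the endpoint, thread count, and request count are placeholders), and the CPU, memory, and billing impact would still be read from the AWS/monitoring side:

```python
# Rough sketch of the "multithreaded application making many calls" idea from
# the question. Endpoint, thread count, and request count are placeholders.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

ENDPOINT = "https://my-rails-app.example.com/api/items"  # placeholder endpoint
THREADS = 50
TOTAL_REQUESTS = 1000

def call_once(_: int) -> float:
    start = time.monotonic()
    resp = requests.get(ENDPOINT, timeout=10)
    resp.raise_for_status()
    return time.monotonic() - start

with ThreadPoolExecutor(max_workers=THREADS) as pool:
    latencies = list(pool.map(call_once, range(TOTAL_REQUESTS)))

latencies.sort()
print(f"requests: {len(latencies)}, "
      f"median: {latencies[len(latencies) // 2]:.3f}s, "
      f"p95: {latencies[int(len(latencies) * 0.95)]:.3f}s")
```

A dedicated tool like BlazeMeter or JMeter will handle ramp-up, reporting, and distributed load for you, which is why the answer above recommends it over a hand-rolled client.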

Sharing data between Elastic Beanstalk web and worker tiers

I have a platform (based on Rails 4/Postgres) running on an auto scaling Elastic Beanstalk web environment. I'm planning on offloading long running tasks (sync with 3rd parties, delivering email etc) to a Worker tier, which appears simple enough to get up and running.
However, I also want to run periodic batch processes. I've looked into using cron.yml and the scheduling seems pretty simple, however the batch process I'm trying to build needs to access the data from the web application to be able to work.
Does anybody have an opinion on the best way of doing this? Either a shared RDS database between the web and worker tiers, or perhaps a web service that the worker tier can access?
Thanks,
Dan
Note: I've added an extra question, which more broadly describes my requirements, as it struck me that this might not be the best approach.
What's the best way to implement this shared batch process with Elastic Beanstalk?
Unless you need a full relational database management system (RDBMS), consider using S3 for shared persistent data storage across your instances.
Also consider Amazon Simple Queue Service (SQS):
SQS is a fast, reliable, scalable, fully managed message queuing service. SQS makes it simple and cost-effective to decouple the components of a cloud application. You can use SQS to transmit any volume of data, at any level of throughput, without losing messages or requiring other services to be always available.
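To make the SQS suggestion concrete, here is a hedged sketch of the web tier enqueuing a job and a worker polling for it. The queue name and payload are placeholders; the Ruby aws-sdk exposes equivalent calls, and in an Elastic Beanstalk worker environment the platform's daemon normally delivers the messages to your application over HTTP, so you would usually only write the sending side yourself.

```python
# Sketch of decoupling web and worker tiers with SQS.
# Queue name and payload are illustrative only.
import json

import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="my-app-jobs")["QueueUrl"]  # placeholder queue

# Web tier: enqueue a reference to the work (e.g. a record id), not the data itself.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"task": "sync_third_party", "record_id": 123}),
)

# Worker: poll, process, then delete the message.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=20)
for message in resp.get("Messages", []):
    payload = json.loads(message["Body"])
    print("processing", payload)   # do the batch work here
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```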

Monitor Multiple Rails Applications

Are there any tools that I can run on my server to monitor multiple rails applications?
I need to monitor the number of requests each application receives, how much memory each application is using, how much of the cpu is being used and other stats similar to those. I need to see the stats for each individual rails application.
I recommend you try NewRelic RPM.
The free version:
RPM Lite is the most widely used solution for basic web application monitoring. RPM Lite provides application monitoring for unlimited Java, Ruby or JRuby applications, for unlimited users, for an unlimited time. What a deal! With RPM Lite you can identify overall app health, app response time, throughput, Apdex SLA scoring, cluster breakdown, and Notes. You can also see where web transactions are spending the most time, isolate the worst offenders, and determine where to focus your remediation efforts.
Later edit:
An alternative to NewRelic RPM is ScoutApp, which has a lot of plugins covering all your required features.
If you need something that can be run on your server, there is also the munin plugins gem you could try. If you need a user-monitoring tool (kind of like Google Analytics), you can use the RailStat gem.
The Request Log Analyzer gem can be useful; it's free and works by analyzing Rails log files, so there's no chance of it having any negative impact on your application's performance.
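If all you need first is a rough per-application request count, the same log-analysis idea fits in a few lines. Here is a sketch assuming each app writes the standard Rails "Started GET ..." lines to its own production.log; the log paths are placeholders:

```python
# Quick-and-dirty request counts per Rails application from their log files.
# Log paths (and the "Started " line-format assumption) are illustrative.
from pathlib import Path

APP_LOGS = {
    "app_one": Path("/var/www/app_one/log/production.log"),   # placeholder paths
    "app_two": Path("/var/www/app_two/log/production.log"),
}

for app, log_path in APP_LOGS.items():
    # Rails logs one 'Started GET "/path" ...' line per incoming request.
    count = sum(1 for line in log_path.read_text(errors="ignore").splitlines()
                if line.startswith("Started "))
    print(f"{app}: {count} requests")
```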
