Software to help with log analysis?

Hey guys, I have a quick question.
Does anyone know of any software that can help with understanding client-server logs?
I have three huge time-stamped logs: one from a server and two from clients. If I could arrange all three of them side by side in a UI, in chronological order, they would be much easier to understand.
Thanks for any tips on such software. (Or maybe some ideas to help me with this.)
Edit: the mockup image has blue line partitions to separate out time in milliseconds. That is just for visualization.

You didn't say what platform, but Microsoft Log Parser can be useful. Here are some samples.

Google BigQuery is a helpful tool for analyzing massive datasets, including logs. It doesn't give you a UI, but it does provide a SQL-like language for querying the logs.
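If no ready-made tool fits, a short script can do the interleaving. Here is a minimal Python sketch (it assumes each log line begins with a lexicographically sortable timestamp such as ISO-8601, and the filenames are placeholders) that merges the three files chronologically and labels each line with its source:

    # Merge time-stamped log files into one chronological stream,
    # tagging each line with the file it came from.
    # Assumes each file is already in time order and every line starts with a
    # sortable timestamp (e.g. ISO-8601 like "2013-05-01T10:32:05.123").
    import heapq

    def keyed_lines(path):
        """Yield (timestamp, source, line) tuples for one log file."""
        with open(path) as f:
            for line in f:
                if line.strip():
                    yield line.split()[0], path, line.rstrip("\n")

    def merged_view(paths):
        # heapq.merge lazily interleaves already-sorted streams; tuples
        # compare by their first element, so the timestamp decides the order.
        for _, source, line in heapq.merge(*(keyed_lines(p) for p in paths)):
            yield f"{source:>12} | {line}"

    if __name__ == "__main__":
        for row in merged_view(["server.log", "client1.log", "client2.log"]):
            print(row)

That won't give you the side-by-side columns of the mockup, but it does give you the single chronological view, and the source tag makes it easy to see which machine said what.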

Related

How can I write code to purposefully eat up RAM and CPU resources?

I have written code in Ruby and C before that has accomplished similar purposes, but I think my situation is a bit trickier now. As I said, I need to purposefully write code that ties up the resources on a machine, and measure where and when these events happen.
Are there any libraries/modules out there that do this, and if not, what would be the best way to accomplish this? Should I flood the processor with huge numbers and make it try to find primes? Should I shoot tons of packets to one of the machines and wait till it crashes?
I have spent the past 2 days writing code to accomplish various tasks related to this, but I'm dissatisfied because even when I do accomplish my task, I can't get any real data out on where, why, and when these events are happening. ANY suggestions would be greatly appreciated.
Please consider using the stress utility on Linux (I'm assuming this is a Linux system). Either way, stress allows you to target the subsystem of your choice and load it accordingly.
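If you want something more scriptable than stress, here is a minimal Python sketch of the same idea (the worker count, memory size, and duration are arbitrary placeholders); it spins several CPU cores and holds a block of RAM for a fixed time, roughly like "stress --cpu 4 --vm 1 --vm-bytes 512M --timeout 30":

    # Minimal load generator: burns CPU on a few cores and holds a chunk of RAM.
    import multiprocessing
    import time

    def burn_cpu(seconds):
        """Busy-wait until the deadline, keeping one core at ~100%."""
        deadline = time.time() + seconds
        while time.time() < deadline:
            pass

    def hog_ram(n_bytes, seconds):
        """Allocate a zero-filled buffer (touching real pages) and hold it."""
        block = bytearray(n_bytes)
        time.sleep(seconds)
        del block

    if __name__ == "__main__":
        duration = 30  # seconds of load
        workers = [multiprocessing.Process(target=burn_cpu, args=(duration,))
                   for _ in range(4)]  # one spinner per core you want loaded
        workers.append(multiprocessing.Process(
            target=hog_ram, args=(512 * 1024 * 1024, duration)))  # ~512 MB
        for w in workers:
            w.start()
        for w in workers:
            w.join()

While it runs, you can watch where the pain shows up with tools like top or vmstat, which gives you the "where and when" data the question asks about.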

Dynamics CRM: use LINQ to display a list of MemberWorks

I am trying to run a LINQ query within a website plugin for hosted Dynamics CRM. I need to use LINQ to retrieve a list of all the currently registered members, as shown in the MemberWorks tab. But to be honest, I have no idea where to start.
I've really jumped in at the deep end with this one to help out a friend, and I find that necessity and crushing time demands are the best way to challenge my brain and learn something new. So if you can give me a relevant pointer, I'd really appreciate it.
To clarify: my LINQ knowledge is at beginner level, and my knowledge of the hosted Dynamics CRM data structure is at a similar level. So I've not really tried anything, as I simply don't know where to start at this stage. But hopefully some kind folk can give me direction and I'll see where that takes me.
Thanks in advance!
If you look at Hosk's blog you will probably find your answer; if not, you will have other questions to ask :)
It's quite hard to give an answer to a question this general, so excuse me if it's a bit fuzzy and not exactly what you expected.

Can anyone recommend best practices for profiling Ruby on Rails software under a Passenger/nginx combination in a live server environment?

I am attempting to determine what could possibly be causing 20+ second response times from a Rails 3 application running on EC2 and using ElastiCache. I have reason to believe the problem is in fact cache-related, but I have no numbers to prove it. I'd like to get those numbers. For the sake of completeness, we're running the applications atop Ubuntu 12.04.
Searching Google, I found nothing directly relevant to my situation, and no Stack Overflow topics I could find were even remotely relevant. If anyone can point me to some documentation on the matter, I'd be quite appreciative. Thank you!
I've found the best tool for this to be New Relic.
http://newrelic.com/
I don't work for them and get no benefit from you trying them.
They have a free level that you can start with. If you go up to the non-free version, you can trace all of your requests through the different models and into the database, telling you how long the app spent in each section. It's a great tool for profiling.
Do you, by any chance, have access to standard web logs, including URLs and response times?
I faced a similar situation, searched the web, found nothing relevant, and eventually decided to roll my own, which I shared in this SO post:
Profiling a multi-tiered, distributed, web application (server side)
While it is far from perfect and may be too high-level for some use cases, it gave me quick and broad insight into where the application I was trying to profile was spending most of its time, and what the slowest parts were (a sketch of the core idea follows below). HTH.
The best parts of it are that:
It is 100% platform and programming language independent.
It is a 100% free software solution.
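For reference, the core of that log-based approach can be sketched in a few lines of Python. This assumes a whitespace-separated access log where the request path is the seventh field (as in the common combined log format) and the response time is the last field (e.g. nginx's $request_time); the field positions are assumptions to adapt:

    # Aggregate response times per URL from a web access log.
    # Assumed layout: combined log format with the response time appended
    # as the last field (e.g. nginx's $request_time). Adjust indices to taste.
    import sys
    from collections import defaultdict

    def profile_log(path):
        times = defaultdict(list)
        with open(path) as f:
            for line in f:
                fields = line.split()
                if len(fields) < 8:
                    continue  # skip malformed lines
                url = fields[6]  # request path inside the quoted request line
                try:
                    times[url].append(float(fields[-1]))
                except ValueError:
                    continue     # last field wasn't a number
        # Report the slowest endpoints by mean response time.
        ranked = sorted(times.items(), key=lambda kv: -sum(kv[1]) / len(kv[1]))
        for url, ts in ranked[:20]:
            print(f"{sum(ts)/len(ts):8.3f}s avg {max(ts):8.3f}s max "
                  f"{len(ts):6d} hits  {url}")

    if __name__ == "__main__":
        profile_log(sys.argv[1])

Twenty or so lines of output are usually enough to show whether the slow requests cluster around cache-heavy endpoints.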

Erlang, membase and comet

I was wondering if someone has already used these three technologies together. I know Erlang and Comet are widely used, but I can't find anything relating Comet + Membase or Erlang + Membase. Is combining them a bad idea for some reason?
I'm doing research using Freemind to map the ideas about these three technologies (Erlang, Membase and Comet). As I'm new to all three, I am not certain whether they are a good combination.
Basically, the application I have in mind will have many clients ("clients A") sending small amounts of data to the server. The server needs to save this data as fast as possible, and send it on request to another set of clients ("clients B"), where clients B are quite fewer than clients A.
This application is just an idea I have had for a while (nothing new, it's been done already), but I would like to experiment with Erlang and Comet and they seem to fit.
If anyone can provide me with some hints I would appreciate it very much. This is my first question on this site, and it may be too open. If so, please let me know and I will post it somewhere else.
Thank you!
For membase, you just need to use any of several memcached clients. It should work quite well for what you're describing.
Give it a try. Describe what doesn't work for you. :)
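For illustration, Membase speaks the memcached protocol, so any memcached client library will do. Here is a minimal Python sketch using the pymemcache library (the host, port, key, and payload are placeholders for your setup):

    # Talk to a Membase node over the memcached protocol.
    # pymemcache is just one of several memcached clients that would work.
    from pymemcache.client.base import Client

    client = Client(("localhost", 11211))  # point this at your Membase node

    # A "client A" writes a small payload as fast as possible...
    client.set("reading:42", b"temp=21.5;ts=1360000000")

    # ...and a "client B" fetches it on request.
    print(client.get("reading:42"))  # b'temp=21.5;ts=1360000000'

The Erlang side would do the same thing through an Erlang memcached client; the protocol is identical regardless of language.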

Web Server Log Analysis Tool

Any suggestions for an accurate web log analysis tool to generate reports on IIS logs? We used WebTrends, but I don't feel it was accurate.
To analyze web logs, I don't think you can go wrong with Analog: http://www.analog.cx/
If you are analyzing your own logs, which are often huge files, you will want the fastest analyzer you can find. Analog is fast.
You'll want one that's been around a while and is still supported. Analog just celebrated its 10th birthday.
Analog claims to be the most popular logfile analyser in the world.
It supports multiple languages.
Did I say it's free and open source?
As far as accuracy goes, no tool gives perfect results. JavaScript often fails to catch hits. Trying to track individual people's paths through a website (i.e. for analytics purposes) is fraught with problems. And even trying to differentiate hits versus visits and screening out the bots is more of a black art than a science.
What is best is simply to have a tool that gives decent basic statistics that tell you what you need to know.
I've looked at other tools, such as Deep Log Analyzer: http://www.deep-software.com/, which attempts to do analytics from your web logs. But speed was a problem. They claim their new version 3.5 (April 2008), which I didn't try, has improved performance. The big advantage of a program like this is the advanced reporting you can do, including custom SQL requests. You have to purchase their professional version ($200) to do most of the analytics and custom queries. If Analog is too simple for you, then try the free version of Deep Log Analyzer.
And you can also try Microsoft's own Log Parser, as was the recommended answer in: https://stackoverflow.com/questions/157677/a-good-iis-log-viewer-for-large-log-files.
But you will need some extra skills to use it.
What are you wanting to analyze from your logs? There are a bunch of tools out there - free or paid for - that will go through the logs and spit out a great variety of figures. Some have real meaning, others are best used with a grain of salt.
What none will show you is "How many people are actually reading my wonderful web pages". Those that attempt to show "distinct site visitors" or any detailed metrics are at best a rough approximation to an indication of a vague trend...
But for what it's worth, we use Analog.
SHORT ANSWER:
You are correct to question the results; log analysis is not adequate for reporting actual traffic.
LONGER ANSWER:
WebTrends is a great tool for what it delivers. But as a previous administrator of a WebTrends installation, I found that web logs are notoriously bad at capturing metrics of interest.
For instance, if there is any caching in your web delivery stack (or on the consumer's side; I'm shaking my fist at YOU, AOL!), then your web logs instantly stop reflecting your site's actual activity. This is because log analysis assumes that all user consumption will translate to an HTTP request back to the web server, and thus be recorded in the IIS logs. With a cache in the way, that is not the case.
In the future, if you want more reliable results, you ultimately need to ensure that there is a way to bust any caching strategy. The obvious answer is dynamic content. But if you do not want to rewrite all of your content in such a fashion, just ensure your web traffic analysis uses a dynamic call (see the sketch after this answer).
WebTrends actually offers a solution to this problem, called SDC server. This is exactly what Google Analytics offers as well: it's a JavaScript call back to the analysis server.
...I could go for days on this. If you want more specific information, comment back. ;)
EDIT: With WebTrends specifically, it is quite important to configure session tracking beyond the default IP/user-agent configuration. If your web server assigns a session cookie, you will find this increases reliability, especially for differentiating between users who may sit behind the same NAT.
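To make the "dynamic call" idea concrete, here is a minimal Python sketch of a hit-tracking beacon (the port, the log format, and the 1x1-image trick are illustrative assumptions, not anything WebTrends or Google Analytics specific). Every page embeds a request to it, and the no-store header forces each page view back to the server, so hits get logged even when the page itself is cached:

    # Hypothetical hit-tracking beacon: pages embed
    # <img src="http://host:8080/beacon?page=..."> and the Cache-Control
    # header forces every page view back to this server to be logged.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from datetime import datetime, timezone

    class Beacon(BaseHTTPRequestHandler):
        def do_GET(self):
            # Log the hit: UTC timestamp, client IP, and the query string.
            print(datetime.now(timezone.utc).isoformat(),
                  self.client_address[0], self.path, flush=True)
            # Respond with a 1x1 transparent GIF that must never be cached.
            gif = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
                   b"!\xf9\x04\x01\x00\x00\x00\x00"
                   b",\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;")
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("Cache-Control", "no-store, max-age=0")
            self.send_header("Content-Length", str(len(gif)))
            self.end_headers()
            self.wfile.write(gif)

    if __name__ == "__main__":
        HTTPServer(("", 8080), Beacon).serve_forever()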
I have had really good luck with SmarterStats, from SmarterTools.
There is a free logging package from MSFT for viewing this information using SQL Reporting Services. Google it.
Doing it with the logs is only a good idea if it's internal. I'd use Google Analytics for anything on the internet.
I have been using Summary, which is paid-for software, for years, and love it. The cost of updates is getting to me, and paying for an update just to get user-agent string updates is getting bothersome. Not that there aren't other fixes; I just tend not to need them.
Anyone care to share if they have used Summary compared to Analog?
Look at the XpoLog log analysis platform for web application servers and web server logs. It's a log management and analysis platform that integrates with web server logs, creates reports, provides search and a log viewer, and also monitors for problems.
