How do I delete the log files that Elmah generates on the server?
Is there a setting within Elmah that I can use to delete log files? I would prefer to specify some criteria (e.g. log files that are older than 30 days).
Or should I write my own code for that?
You can set the maximum number of log entries, but there isn't a native function for clearing out logs older than a given date. It's a good feature request, though!
If you are storing your error logs in memory, the maximum number stored is 500 by default, and this requires no additional configuration. Alternatively, you can define the number using the size attribute:
<elmah>
<errorLog type="Elmah.MemoryErrorLog, Elmah" size="50" />
</elmah>
Setting a fixed size is obviously more important for in-memory or XML-based logging, where resources need to be closely managed, but you can define a fixed size for any log type.
This SQL script keeps the newest "size" rows and deletes the rest:
declare @size INTEGER = 50;
declare @countno INTEGER;
SELECT @countno = count([ErrorId]) FROM [dbo].[ELMAH_Error];
/*
-- Preview the rows that would be deleted:
SELECT TOP (@countno-@size)
    [ErrorId]
    ,[TimeUtc]
    ,[Sequence]
FROM [dbo].[ELMAH_Error]
order by [Sequence] asc
*/
IF @countno > @size
DELETE FROM [dbo].[ELMAH_Error]
WHERE [ErrorId] IN (
    SELECT t.[ErrorId] FROM (
        SELECT TOP (@countno-@size)
            [ErrorId]
            ,[TimeUtc]
            ,[Sequence]
        FROM [dbo].[ELMAH_Error]
        order by [Sequence] asc
    ) t
)
GO
/*
-- Print remaining rows
SELECT * FROM [dbo].[ELMAH_Error]
order by [Sequence] desc
*/
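Since the original question asked about age-based criteria: the standard ELMAH_Error schema stores a TimeUtc column, so a date-based variant is a short sketch like this (adjust the 30-day window as needed):
DELETE FROM [dbo].[ELMAH_Error]
WHERE [TimeUtc] < DATEADD(DAY, -30, GETUTCDATE());
Scheduled as a SQL Agent job, that would cover the "older than 30 days" case without any application code.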
In response to what Phil Wheeler mentioned above: on a current project I am doing in MVC 4, we have an MVC area called SiteAdmin. This area is responsible for all site administration duties, including Elmah.
To get over the lack of delete functionality, I implemented a function to delete all of the current log entries in Elmah (we're using the XML-based version).
The SiteAdmin index view offers two actions:
View Error Log - Opens the Elmah UI in a new window.
Clear Error Log - Presents a confirmation popup, then deletes all entries if the user confirms.
If anyone needs the code as an example, I'd be happy to send it along.
My mechanics could be modified fairly easily to provide selective deletion by criteria if needed (by date, by status code, etc.).
The point of my answer here is that you can provide the delete functionality on your own AND not fork the open source code of the Elmah project.
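For illustration, here is a minimal sketch of what such a clear action could look like, assuming the default XML log location under App_Data (the action name and path are illustrative, not the exact code from my project):
// Inside an MVC controller; requires using System.IO;
[HttpPost]
public ActionResult ClearErrorLog()
{
    // Assumed path: adjust to the logPath of your XmlFileErrorLog.
    string logDir = Server.MapPath("~/App_Data/Elmah.Errors");
    if (Directory.Exists(logDir))
    {
        foreach (string file in Directory.GetFiles(logDir, "*.xml"))
        {
            // For selective deletion, filter here instead, e.g. only delete when
            // File.GetLastWriteTimeUtc(file) < DateTime.UtcNow.AddDays(-30).
            System.IO.File.Delete(file); // fully qualified to avoid Controller.File
        }
    }
    return RedirectToAction("Index");
}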
At least within an MVC project, you can go to the App_Data/Elmah.Errors directory and delete the error XML files that you want to remove.
I have an nFeed plugin on my Jira instance that I'm trying to learn to use. I have a select list in my create screen and I have it configured to JNDI. When I go to my select list, it has 3 options (matching the number of rows in my table), but they're all blank.
Here is my query:
SELECT p.PRODUCT_NAME, p.PRODUCT_ID FROM NIRD_Product_Groups p
and my key is 0
I have native filter checked and the display template is {1}.
Anyone have any ideas what the problem could be?
It looks like a problem with your SQL query.
- Have you tried enabling the Debug mode of this nFeed field to figure out if there is any error raised?
- What type of database are you pulling data from? Depending on its type, you may need to put double quotes around the attribute and table names in your query (see the sketch after this list).
- You set the key as 0, which refers to the PRODUCT_NAME attribute. Wouldn't it be safer to set the key to PRODUCT_ID? That sounds more accurate to me.
- Have you tried to test this configuration in a different web browser?
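For example, on a database that requires quoted identifiers (PostgreSQL, or Oracle with case-sensitive names), the query would look something like this sketch:
SELECT p."PRODUCT_NAME", p."PRODUCT_ID"
FROM "NIRD_Product_Groups" p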
If you need further help, Valiantys provides free support for its plugin here: https://jira.valiantys.com
I am trying to use Macros in FireDAC to Preprocess my SQL Queries. I have a TADQuery object on a Data Module with the SQL set to something like:
Select * from MyTable
join OtherTable on MyTable.Key = OtherTable.Key
&Where
Then in my code I do this:
WhereClause := 'stuff based on my form';
Query.MacroByName('Where').AsRaw := WhereClause;
Query.Open;
This has worked great for complicated queries because it lets me make sure my fields and join conditions are correct using the SQL Property editor.
My problem is when the SQL statement ends up invalid because of my where clause. Is there any way to see the SQL, after preprocessing, that is going to be executed? Right now I am catching the FireDAC errors and showing the SQL that is on the EADDBEngineException object. However, that still shows my original SQL with the macros. If I can't get to it after the error happens, is there any way to force the macro replacement to take place so I can look at the SQL in the debugger and see what is wrong?
If it matters I am connecting to a MS Access database with the goal of moving to SQL Server in the near future.
Apart from using the Text property, to monitor what SQL is actually going to the database engine, consider using the "FDMonitor" FireDAC utility. According to the DocWiki pages (linked below):
Drop a TFDMoniRemoteClientLink component on your form,
Set its Tracing property to True,
Add the MonitorBy=Xxx connection definition parameter to your existing FDConnection component. You can do this in the IDE object inspector, by selecting your FDConnection component, expanding the Params property, and setting MonitorBy to mbRemote.
Note that the TFDMoniXxxxClientLink should come before TFDConnection in the data module or form creation order, so adjust this by right clicking on the form or data module, then Creation Order, and moving the TFDMoni.. component above the FDConnection.
Also, it's helpful in the options of the TFDMoniXxxxClientLink, to disable most of the events being recorded, otherwise all the data returned is also shown in the FireDAC monitor. Expand the EventKinds property, and turn all the event kinds off, except for perhaps ekConnConnect, ekConnPrepare, and ekCmdExecute.
Then open the FireDAC Monitor from the IDE, (Tools > FireDAC Monitor). Start your app only once the monitor is running. Double click on a trace event (in the Trace Output tab), and you will see the actual SQL sent to the database in the bottom pane.
It also seems likely that enabling the event kind ekConnPrepare, as mentioned above, would show you when the query's Prepare is called, but I haven't played with it enough to say for sure.
Please see the following pages on the DocWiki for more information:
Overview: FDMonitor
How to: Tracing and Monitoring (FireDAC)
Other FireDAC utilities: Utilities (FireDAC)
(Just to remove this question from the list of unanswered questions.)
From comments:
Well, I've roughly checked what's happening there and I'm still not sure if calling Prepare (which is useless for you, as I get it) is the minimal requirement to trigger that preprocessing. Though, the preprocessed SQL, the one which is sent to the DBMS, you can access through the Text property (quite an uncommon name for such a property). – TLama, Feb 21 '14 at 8:18
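Putting that comment into code, a rough sketch (assuming the TADQuery from the question; the exception handling is only illustrative):
// Expand the macro, then inspect the preprocessed SQL via the Text property.
Query.MacroByName('Where').AsRaw := WhereClause;
try
  Query.Open;
except
  on E: EADDBEngineException do
  begin
    // Query.Text holds the SQL with the macros already expanded,
    // i.e. what was actually sent to the DBMS.
    ShowMessage('Preprocessed SQL:' + sLineBreak + Query.Text);
    raise;
  end;
end;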
We ran the Fortify scan and had some Access Control: Database issues. The code is getting the textbox value and setting it to a string variable. In this case, it's passing the value from the TextBox to the stored procedure in a database. Any ideas on how I can get around this Access Control: Database issue?
Without proper access control, the method ExecuteNonQuery() in DataBase.cs can execute a SQL statement on line 320 that contains an attacker-controlled primary key, thereby allowing the attacker to access unauthorized records.
Source: Tool.ascx.cs:591 System.Web.UI.WebControls.TextBox.get_Text()
rptItem.FindControl("lblClmInvalidEntry").Visible = false;
ToolDataAccess.UpdateToolData(strSDN, strSSNum, strRANC, strAdvRecDate, strAdvSubDate, strClmRecDate, strClmAuth, strClmSubDate, strAdvAuth, txtNoteEntry.Text);
Sink: DataBase.cs:278
System.Data.SqlClient.SqlParameterCollection.Add()
// Add parameters
foreach (SqlParameter parameter in parameters)
cmd.Parameters.Add(parameter);
An "Access Control: Database" finding flags a query that isn't specific enough about which records the current user may access, and so could potentially allow a user to see information they're not supposed to.
An easy example of this vulnerability would be a payroll database with a textbox that takes an employee ID and displays that employee's salary; a user could potentially change the ID and see the salaries of other employees.
Another example, where this is often intended functionality, is a website URL that carries the product ID as a parameter, meaning a user could step through every product on your site. But as this only allows users to see information they're supposed to be able to, it's not particularly a security issue.
For instance:
"SELECT account_balance FROM accounts WHERE account_number = " + $input_from_attacker + ";"
// Even if we safely build the query above, preventing changes to the query structure,
// the attacker can still send someone else's account number and read Grandma's balance!
As this is pretty context-based, it's difficult to determine statically, so there are lots of examples where Fortify may catch this when it's actually intended functionality. That's not to say the tool is broken; it's just one of the limitations of static analysis, and depending on what your program is supposed to be doing, it may or may not be intended.
If this is intended to work like this, then I would suggest auditing it as not an issue or suppressing the issue.
If you can see that this is definitely an issue and users can see information that they shouldn't be able to, then the stored procedure needs to be more specific so that users can only see information they are entitled to. However, SCA will likely still pick this up in a later scan, so you would then need to audit it as fixed and no longer an issue.
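Where it is a real issue, the usual remedy is to constrain the query (or stored procedure) by the authenticated user's identity instead of trusting the client-supplied key alone. A hedged sketch with illustrative names (accounts, owner_id, and currentUserId are not from the original code):
// Illustrative only; connection is an open SqlConnection (System.Data.SqlClient).
using (var cmd = new SqlCommand(
    "SELECT account_balance FROM accounts " +
    "WHERE account_number = @accountNumber AND owner_id = @ownerId", connection))
{
    cmd.Parameters.AddWithValue("@accountNumber", accountNumber); // from the request
    cmd.Parameters.AddWithValue("@ownerId", currentUserId);       // from the auth context, never the request
    object balance = cmd.ExecuteScalar();
}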
I use Log Parser 2.2 to read the IIS log and copy the log into a database. Initially the IIS log had the default fields and I was able to copy the log into the database. Now I have included one more field in the IIS log, but Log Parser does not return the details of the new column. Can anyone help me make Log Parser read the additional field along with the old log files?
The following query is used to read the IIS log:
select * from C:\inetpub\logs\LogFiles\W3SVC3\*.*
If the field is newly added, then I think Log Parser stops checking for newly defined fields after the first 50 entries it finds (perhaps in total, or per log). Try using just the IIS log that has the new field in it to determine whether it's working or not. Also, make sure that the header lines of that log (the #Fields: directive) list the new field you're looking for.
i.e.:
select * from C:\inetpub\logs\LogFiles\W3SVC3\todays.log
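For reference, a sketch of how a database import might look from the command line; the switch names are from Log Parser 2.2 as I remember them, so verify with LogParser.exe -h -o:SQL before relying on them:
LogParser.exe "SELECT * INTO IISLog FROM C:\inetpub\logs\LogFiles\W3SVC3\todays.log" -i:IISW3C -o:SQL -server:MYSERVER -database:LogDB -createTable:ON
Note that -createTable:ON only creates the target table if it doesn't already exist; if the table was created before the new field was added, you may need to recreate or alter it so the new field has a column to land in.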
I want to save the results of the MVC MiniProfiler to a SQL Server database. I am profiling a high-load MVC 4 page to spot a tricky performance problem that is not reproducible on our test or development systems and only happens sporadically on the production server.
What is the best way to hook into the mini profiler? Is there an existing extension to do that?
I've just been setting up MiniProfiler to save the results to SQL Azure; it's fairly easy. We're using MiniProfiler.MVC3 to take care of all the wiring up, as described here.
The table create script is embedded in the assembly as the SqlServerStorage.TableCreationScript static field, so you can use that, but while digging into the code I found that the latest development branch has enhanced the script slightly by adding some indexes. The table structure is otherwise unchanged, so it still works with the latest package available on NuGet.
At the time of writing, here is a direct link to the latest table create script.
After that, the only thing you need to do is set up MiniProfiler to use SQL Server storage with a single line of code:
MiniProfiler.Settings.Storage = new SqlServerStorage("<your connection string>");
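(For completeness: if you're not using MiniProfiler.MVC3 to do the wiring, the profiler is typically started and stopped per request in Global.asax. A sketch based on the MiniProfiler samples of that era, as I remember them:)
// In Global.asax.cs; requires using StackExchange.Profiling;
protected void Application_BeginRequest()
{
    MiniProfiler.Start(); // consider only profiling for certain users/requests
}
protected void Application_EndRequest()
{
    MiniProfiler.Stop(); // results are persisted to the configured Storage
}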
If you're not using SQL Azure, that's it, but I found one issue when we tried to use it in Azure. I received the following exception (thank you ELMAH) when profiling tried to save:
System.Data.SqlClient.SqlException
Tables without a clustered index are not supported in this version of SQL Server. Please create a clustered index and try again.
To solve this I had to add an additional (unused) column to the MiniProfilers table. Here is the beginning of the create table script in question:
create table MiniProfilers
(
RowId integer not null identity constraint PK_MiniProfilers primary key clustered, -- Need a clustered primary key for SQL azure
Id uniqueidentifier not null,
Name nvarchar(200) not null,
And since the Guid Id column was no longer the primary key, I added an additional index to ensure lookups are still fast:
create unique nonclustered index IX_MiniProfilers_Id on MiniProfilers (Id)
Hope that helps.
Update
The change to support SQL Azure has been provided as a Pull Request and accepted. Thanks Jarrod.