I'd like to set up a RollingFileAppender in log4net such that the current (i.e. today's) log file always has a static name (like app.log), but upon rollover at the end of the day, it should be renamed to app.<date>.log. Here's the closest I've gotten so far (note that I'm using every-minute rollover rather than every-day rollover since it's easier to debug):
<appender name="applog" type="log4net.Appender.RollingFileAppender">
  <file value="app.log" />
  <staticLogFileName value="false" />
  <datePattern value=".yyyy-MM-dd-hh-mm" />
  <preserveLogFileNameExtension value="true" />
  <appendToFile value="true" />
  <rollingStyle value="Date" />
  <maxSizeRollBackups value="5" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>
The problem with this is that I see the following when a request begins:
app.2016-02-01-05-00.log
And by the time the request ends, I have these files:
app.2016-02-01-05-00.log
app.2016-02-01-05-00.log.2016-02-01-05-00.log
Notice that the minute hasn't rolled over yet, but it appears to have created a rollover file of some kind anyway. Also, today's file isn't ever called just 'app.log' as I want, it always starts with the timestamp in the name. Lastly, it doesn't appear to honor my maxSizeRollBackups of 5, as far as I can tell the backups grow indefinitely without ever getting deleted.
I tried removing the staticLogFileName tag, and that makes today's name 'app.log' like I want, but then it rolls over in place, overwriting itself and not creating backup files.
After breaking down and downloading the source code, it turned out to be a permission issue with the rollover's System.IO.File.Move() call. I needed to grant the folder's Modify permission as well, not just Read and Write (which is strange, because isn't a move technically a write operation?).
I also discovered that you should NOT set staticLogFileName to false, so I had to remove that element from the xml.
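Putting both findings together, a working appender might look like the sketch below. Assumptions: staticLogFileName is simply omitted so its default of true applies, and the process account has Modify permission on the log folder. Note also that, per the log4net docs, maxSizeRollBackups is honored only for size-based rolling, which would explain why the date-rolled backups were never pruned.

```xml
<appender name="applog" type="log4net.Appender.RollingFileAppender">
  <!-- staticLogFileName defaults to true: the active file stays app.log,
       and on rollover it is renamed to app.<date>.log -->
  <file value="app.log" />
  <datePattern value=".yyyy-MM-dd" />
  <preserveLogFileNameExtension value="true" />
  <appendToFile value="true" />
  <rollingStyle value="Date" />
  <!-- note: maxSizeRollBackups applies to size-based rolling only,
       so date-rolled backups are not deleted by it -->
  <maxSizeRollBackups value="5" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>
```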
ELMAH generates an exception like below when a known process visits a non-existing URL on our website:
System.Web.HttpException: The controller for path '/manager/' was not
found or does not implement IController.
Whereas going to that non-existent URL from a browser generates a typical IIS 404.
The resource cannot be found. Description: HTTP 404. The resource you
are looking for (or one of its dependencies) could have been removed,
had its name changed, or is temporarily unavailable. Please review
the following URL and make sure that it is spelled correctly.
Requested URL: /manager
I know the process accessing these addresses is harmless, and I want to stop receiving these specific emails, which are generated from a range of IP addresses. This is what I have in web.config, but the ELMAH email still comes through.
The URL filter seems to be working fine.
<elmah>
<security allowRemoteAccess="false" />
<errorMail from="elmah@mydomain.com" to="elmahlog@mydomain.com" async="true" smtpServer="mail.mydomain.com" smtpPort="25" useSsl="false" />
<errorFilter>
<test>
<and>
<equal binding="HttpStatusCode" value="500" type="Int32" />
<or>
<!--TrustWave scans our website with intentional bad addresses-->
<regex binding="Context.Request.ServerVariables['REMOTE_ADDR']" pattern="64.37.231.\d{1,3}" type="String" />
<!--Google looking for Digital Asset Links - well known statements the website wants to make-->
<regex binding="Context.Request.ServerVariables['URL']" pattern="/.well-known/assetlinks.json" type="String" />
<!--Apple devices searching for universal links - app-site association. we dont have an app.-->
<regex binding="Context.Request.ServerVariables['URL']" pattern="/.well-known/apple-app-site-association" type="String" />
</or>
</and>
</test>
</errorFilter>
</elmah>
Even though the ELMAH email message makes it look like a 500 error occurred, the web.config filter really needed a 404 status code to match.
<equal binding="HttpStatusCode" value="404" type="Int32" />
instead of
<equal binding="HttpStatusCode" value="500" type="Int32" />
I need to call a stored procedure which gathers data during a specific window. For instance, I send in a Timestamp representing now and a window of 15 minutes, and it returns all data within the last 15 minutes. Each time this procedure is called, I need to update the Timestamp representing now so that I avoid old data.
Is there any way to achieve this?
My attempt with the "now" bean below is a failure: even though that bean is a prototype, its value is only ever retrieved once, when the channel adapter is created.
The anonymised stored-proc-inbound-channel-adapter I currently have configured is below:
<int-jdbc:stored-proc-inbound-channel-adapter
channel="storedProcOutboundChannel" stored-procedure-name="dataWindowRetrieval"
data-source="dataSource" auto-startup="true" id=""
ignore-column-meta-data="false" is-function="false"
skip-undeclared-results="true" return-value-required="false">
<int:poller fixed-rate="60" time-unit="SECONDS"></int:poller>
<int-jdbc:parameter name="CurrentDateTime" type="java.sql.Timestamp" value="#{now}" />
<int-jdbc:parameter name="MinuteOffset" type="java.lang.Integer" value="3" />
<int-jdbc:parameter name="SomeOtherParameter" type="java.lang.Integer" value="4" />
<int-jdbc:parameter name="YetAnotherParameter" type="java.lang.Integer" value="15" />
<int-jdbc:returning-resultset name="theResults" row-mapper="org.springframework.jdbc.core.ColumnMapRowMapper" />
</int-jdbc:stored-proc-inbound-channel-adapter>
<bean id="now" scope="prototype" class="java.sql.Timestamp">
<constructor-arg value="#{ T(java.lang.System).currentTimeMillis()}" />
</bean>
Instead of value, use expression="@now" (a SpEL bean reference). The value attribute is evaluated only once, at initialization time, whereas an expression is evaluated on every poll.
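As a sketch of the corrected parameter (assuming the prototype "now" bean from the question; in SpEL, @now is a bean reference, resolved afresh on each poll because the bean is prototype-scoped):

```xml
<!-- 'expression' is evaluated at each poll, so the prototype-scoped
     'now' bean supplies a fresh Timestamp per invocation -->
<int-jdbc:parameter name="CurrentDateTime" type="java.sql.Timestamp" expression="@now" />

<!-- alternatively, drop the bean entirely and build the timestamp inline in SpEL -->
<int-jdbc:parameter name="CurrentDateTime" type="java.sql.Timestamp"
    expression="new java.sql.Timestamp(T(java.lang.System).currentTimeMillis())" />
```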
I am trying to change a log file name after deploying, so transform this:
<log4net>
...
<appender name="GeneralAppender" type="log4net.Appender.RollingFileAppender, log4net">
<file value="c:\logs\Co.App.log" />
...
</appender>
</log4net>
to this:
<log4net>
...
<appender name="GeneralAppender" type="log4net.Appender.RollingFileAppender, log4net">
<file value="c:\logs\Co.App.localhost.log" />
...
</appender>
</log4net>
The actual file node doesn't have a uniquely identifying attribute, so I am trying to locate it via its parent node:
<runtime>
<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
<log4net>
<appender >
<file value="c:\logs\Co.App.localhost.log" xdt:Transform="Replace" xdt:Locator="XPath(../appender[@name='GeneralAppender'])" />
</appender>
</log4net>
</assemblyBinding>
</runtime>
I've also tried all permutations of absolute and relative XPaths, but I don't see any effect in the transform preview.
I tried:
xdt:Locator="XPath(//appender[@name='GeneralAppender']/file)"
and even:
xdt:Transform="Remove" xdt:Locator="XPath(//file)"
Found it!
<file value="c:\logs\Co.App.localhost.log" xdt:Transform="Replace" xdt:Locator="Condition(../@name='GeneralAppender')" />
Slight extension:
If the parent node has two (or more) children, then the above solution is not enough.
This is the case in log4net when the EventLogAppender is used, which has:
<appender name="EventLogAppender" type="log4net.Appender.EventLogAppender" >
...
<param name="LogName" value="MyLog"/>
<param name="ApplicationName" value="MyApplication"/>
...
Then you need to use an 'and' plus one more attribute to hit the right node, like:
xdt:Locator="Condition(../@name='EventLogAppender' and @name='LogName')"
Example for the EventLogAppender, where both the params are replaced:
<param name="LogName" value="MyNewLog" xdt:Transform="Replace" xdt:Locator="Condition(../@name='EventLogAppender' and @name='LogName')" />
<param name="ApplicationName" value="MyNewApplication" xdt:Transform="Replace" xdt:Locator="Condition(../@name='EventLogAppender' and @name='ApplicationName')" />
The MSDN documentation mentions that if you use XPath, the expression you pass is appended to the current context in the transform file. So XPath is good for cases where you want to modify the current element or one of its descendants.
If, on the other hand, you want to traverse up to parents in a relative manner, there is no appended XPath expression that does that: the appended XPath starts at the current element and lets you traverse descendants, not ancestors. That's where Condition works.
I tried xdt:Locator="XPath(.)" and it worked perfectly to replace the current element and, if needed, its descendants. But it doesn't work for ancestors.
Due to the url length for js resources on my local dev site, it's really annoying to try to find the script I'm looking for.
I'm using combres 2.2.2.4. Here's my relevant combres.xml section:
<jsMinifiers>
<minifier name="msajax" type="Combres.Minifiers.MSAjaxJSMinifier, Combres"
binderType="Combres.Binders.SimpleObjectBinder, Combres">
<param name="CollapseToLiteral" type="bool" value="true" />
<param name="EvalsAreSafe" type="bool" value="true" />
<param name="MacSafariQuirks" type="bool" value="true" />
<param name="CatchAsLocal" type="bool" value="true" />
<param name="LocalRenaming" type="string" value="KeepAll" />
<param name="OutputMode" type="string" value="SingleLine" />
<param name="RemoveUnneededCode" type="bool" value="true" />
<param name="StripDebugStatements" type="bool" value="true" />
</minifier>
<minifier name="yui" type="Combres.Minifiers.YuiJSMinifier, Combres">
<param name="IsVerboseLogging" type="bool" value="false" />
<param name="IsObfuscateJavascript" type="bool" value="true" />
<param name="PreserveAllSemicolons" type="bool" value="false" />
<param name="DisableOptimizations" type="bool" value="false" />
<param name="LineBreakPosition" type="int" value="80" />
</minifier>
</jsMinifiers>
<resourceSets url="~/combres.axd" defaultDuration="30"
defaultVersion="auto"
defaultIgnorePipelineWhenDebug="true"
defaultDebugEnabled="true"
defaultJSMinifierRef="msajax"
defaultCssMinifierRef="yui"
defaultCompressionEnabled="true" >
Any thoughts?
Updates:
I'm still not sure where those hash numbers are coming from. I've jiggled the defaultVersion, defaultVersionGenerator, and version tags of the resources, but I can't seem to see a change.
I've just turned debug off, and I notice that in Chrome it shows
/scripts
/1
indicating that when debug is turned off, the defaultVersion works just as documented, but with debug on, the huge hashes are back. I'm looking into whether it's the version of Combres (we recently updated).
Well, rolled back to 2.1.0.0, and I found the same behavior. It actually might not be Combres here, even though it looks like it should be. I'll continue hunting.
It looks like Combres is configured to use Sha512VersionGenerator (it doesn't show in your config segment, so I suppose it is set elsewhere). Try setting versionGenerator on the resourceSet (or defaultVersionGenerator on resourceSets, and remember to remove versionGenerator from the resourceSet) to:
Combres.VersionGenerators.HashCodeVersionGenerator
Alternatively, remove the versionGenerator attribute and set the version manually.
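For example, on the resourceSets element (a sketch against the config from the question; only the version-related attribute is added, everything else is unchanged):

```xml
<resourceSets url="~/combres.axd" defaultDuration="30"
              defaultVersion="auto"
              defaultVersionGenerator="Combres.VersionGenerators.HashCodeVersionGenerator"
              defaultIgnorePipelineWhenDebug="true"
              defaultDebugEnabled="true"
              defaultJSMinifierRef="msajax"
              defaultCssMinifierRef="yui"
              defaultCompressionEnabled="true" >
```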
Proposed by Buu Nguyen:
In Debug mode, the hash generation cannot be "turned off" via Combres configuration. However, it may be possible to modify GetResourceUrl to remove the hash from the Url.
I have a weird problem with the configuration of TFS 2010 work items. It seems to be impossible to change the case of characters in the allowed-values collection of a field, e.g. change "Works for me" to "Works For Me". Every other string, e.g. "Works For Me 123", is valid.
Even if I try to change the name to another string first (since I know of the similar case problem with files in Visual Studio projects), it just doesn't accept the upper-case version and always reverts to the lower-case string.
Background information:
We have a custom WIT file to define the "Bug" work item. This includes the definition of the allowed values for the field "Resolved Reason". Initially our list contained lower case words e.g. "Works for me". Since we want to synchronize the TFS work items with HP Quality Center we need an exact match of the state names now.
The desired version:
<FIELD name="Resolved Reason" refname="Microsoft.VSTS.Common.ResolvedReason" type="String" reportable="dimension">
<HELPTEXT>The reason why the bug was resolved</HELPTEXT>
<ALLOWEDVALUES expanditems="true">
<LISTITEM value="Duplicate" />
<LISTITEM value="Fixed" />
<LISTITEM value="Wont Fix" />
<LISTITEM value="Invalid" />
<LISTITEM value="Works For Me" />
<LISTITEM value="Forwarded" />
</ALLOWEDVALUES>
</FIELD>
The actual version:
<FieldDefinition reportable="dimension" refname="Microsoft.VSTS.Common.ResolvedReason" name="Resolved Reason" type="String">
<ALLOWEDVALUES>
<LISTITEM value="Duplicate" />
<LISTITEM value="Fixed" />
<LISTITEM value="Wont fix" />
<LISTITEM value="Invalid" />
<LISTITEM value="Works for me" />
<LISTITEM value="Forwarded" />
</ALLOWEDVALUES>
<HELPTEXT>The reason why the bug was resolved</HELPTEXT>
</FieldDefinition>
Any ideas are welcome.
Thanks,
Robert
As Grant explained, the old work items are stuck with the old casing.
A manual workaround would be to create a new ListItem with the desired case (leaving the old one in the definition for now), edit the existing work items that contain the undesired case to the newly created ResolvedReason, and finish by removing the undesired item from the definition. I have done a similar thing in the past, but not specifically a case change.
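As a sketch of that workaround, the transitional ALLOWEDVALUES list would carry both casings side by side (based on the definitions above) until every existing work item has been edited over to the new value:

```xml
<ALLOWEDVALUES expanditems="true">
  <LISTITEM value="Duplicate" />
  <LISTITEM value="Fixed" />
  <LISTITEM value="Wont Fix" />
  <LISTITEM value="Invalid" />
  <!-- keep the old casing until all existing work items are migrated -->
  <LISTITEM value="Works for me" />
  <LISTITEM value="Works For Me" />
  <LISTITEM value="Forwarded" />
</ALLOWEDVALUES>
```

Once no work item uses "Works for me" any longer, remove that LISTITEM and re-import the definition.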
If you are familiar with the TFS API (I'm not), you can programmatically update the Microsoft.VSTS.Common.ResolvedReason field values on the server. If you have access to the SQL Server 2008 instance, you might be able to edit the field values there to the new case (many layers of bureaucracy prevent me from testing this for you).
Once a string in a work item type is created with a particular casing, it is stuck with that.