We are building a revamped version of our old site in DotNetNuke. There are many pages that link to pages on our old site and we would like those old URLs to still lead to relevant information on the new site. The old URLs end in a variety of extensions, and sometimes in no extension (our old site is a mishmash of several platforms as well as static files). Does a DNN plugin exist that allows for such redirects? Friendly URLs aren't entirely adequate.
Note: I realize that this could be handled in IIS, but we would like our non-coder, non-admin site manager to be able to manage these redirects dynamically.
You will probably want to use a module like this one:
http://www.dnnsoftware.com/forge/open-url-rewriter-for-dnn-dotnetnuke/view/extensiondetail/project/openurlrewriter
There is another option, though: you can put URL records into the database directly. I believe you would simply add a record to the TabUrls table, where TabId is the DNN page you want to point to; then you put in the Url (the old address) and an HttpStatus of 301.
You could do that for all the old pages if you know where they need to be mapped to in DNN.
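For illustration only, here is a raw SQL sketch of that approach. The column list is an assumption (the TabUrls schema varies between DNN versions), so check your own schema and back up the database before inserting anything:
-- Redirect a legacy URL to the DNN page with TabId 123 (example values throughout)
INSERT INTO TabUrls (TabId, SeqNum, Url, QueryString, HttpStatus, CultureCode, IsSystem)
VALUES (
123,                   -- TabId of the target page in DNN
0,                     -- SeqNum: first custom URL entry for this tab
'/old-site/page.asp',  -- the old URL you want to catch
'',                    -- no extra query string
'301',                 -- permanent redirect
'en-US',               -- portal culture (adjust to yours)
0                      -- not a system-generated URL
);
After inserting rows you may need to clear the DNN cache (or recycle the application pool) before the redirects take effect.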
You have to force all requests through the ASP.NET pipeline, and you can do that by adding this single setting to the web.config of your application:
<system.webServer>
<modules runAllManagedModulesForAllRequests="true" />
</system.webServer>
Related
I'm creating a Reporting application in MVC that I want to use in multiple websites. I want to be able to simply create an application in IIS under each of the consuming websites and point them to the same directory where the Reporting application is located.
When I tried doing this in an MVC website it worked fine. However, when I tried adding this application under a Webforms website I got a "403.14 - Forbidden" error because it's trying to use the Static File handler.
How can I correct it to use the right handler to route to the Home controller?
The problem was that the base website's app pool was using the 2.0 CLR. Once I realized this, I made the necessary web.config adjustments, recompiled the app to run under the 4.0 CLR, and then everything worked.
I also have a bunch of <location path="." inheritInChildApplications="false"> sections. Not sure if those are necessary now, but I'm leaving them since it works and the sub app doesn't need to inherit anything from the base website.
This post (Expression of type 'System.Web.Mvc.MvcWebRazorHostFactory' cannot be used for return type 'System.Web.WebPages.Razor.WebRazorHostFactory') also helped me get the web.config sorted out while converting from 2.0 to 4.0. That was kind of a pain because there were several errors I had to work through, but I think I mostly just had to remove some sections of the web.config that are no longer necessary (because they're now included in the machine config).
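For anyone hitting the same thing, here is a minimal sketch of the two adjustments described above. Treat it as an illustration rather than a drop-in config, since the exact sections differ from site to site:
<!-- Child MVC application's web.config: target the 4.0 runtime -->
<system.web>
<compilation debug="false" targetFramework="4.0" />
</system.web>
<!-- Parent WebForms site's web.config: stop its sections being inherited by child applications -->
<location path="." inheritInChildApplications="false">
<system.web>
<!-- the parent-only configuration that used to sit at the top level goes here -->
</system.web>
</location>
The application pool the sites run under also has to target the .NET Framework 4.0 CLR in IIS, which was the original culprit here.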
I have a web app that uses MvcSiteMapProvider, but I don't want it to serve /sitemap.xml: every page but the login page requires authentication, so there is no need for the public to see my sitemap.
Is there a way to turn off the /sitemap.xml file in config? Or a way to do it in RouteConfig?
As per the documentation, if using internal DI, you can disable the /sitemap.xml endpoint using the MvcSiteMapProvider_EnableSitemapsXml setting.
<appSettings>
<add key="MvcSiteMapProvider_EnableSitemapsXml" value="false"/>
</appSettings>
If using external DI, you need to remove (or comment out) this line in the /App_Start/MvcSiteMapProviderConfig.cs file (or anywhere else it may exist in your application startup code).
// Register the Sitemaps routes for search engines
//XmlSiteMapController.RegisterRoutes(RouteTable.Routes);
FYI: although this setting does what you asked, there really was no problem to begin with. Search engines do not scan web sites for XML sitemap files; they have to be explicitly submitted. According to the sitemap protocol, they can be submitted via an HTTP request, via a search engine's control panel, or by adding the location to the /robots.txt file. None of these happens without explicit intervention on the part of the webmaster. In all cases, the webmaster chooses the URL the XML sitemap will be hosted at; unlike /robots.txt, there is no default location for it. We chose the most reasonable logical path, /sitemap.xml, but technically it could be anything.
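For reference, the robots.txt option mentioned above is a single directive (the URL below is just a placeholder):
Sitemap: https://www.example.com/sitemap.xml
If you never add that line and never submit the URL anywhere, crawlers have no standard way of discovering the file.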
I am using Umbraco 7.1.3.
My requirement is to create additional sub-domains under the main site dynamically, as per user requests. For example, I have implemented the Umbraco CMS for my site "www.xyz.com" and I update its content through the Umbraco login. Now I want to create sub-domains for different clients on request, like "www.xyz.com/client1", "www.xyz.com/client2" and so on...
Each sub-domain site should have its own Umbraco installation, so each client (sub-domain owner) can log in and update their own information.
To achieve this I implemented the following steps...
First I registered an Umbraco website in IIS and configured it, and that worked properly.
Then I registered another Umbraco website in IIS and configured it, and that also worked properly.
Now to implement sub-domain logic...
I simply copied the second website's folder into the first website's folder, then converted that folder to an application through IIS.
I expected this to work, as I have already done the same with plain ASP.NET sites and it worked.
But with Umbraco I am getting an error like "Invalid key value".
I think the issue is related to some Umbraco configuration, but I am not able to figure out which.
Thanks & Regards
A bit of an open door, but since I don't see it mentioned in any of the comments and it's a bit hidden away in Umbraco 8: have you tried setting the URLs in the Cultures and Hostnames section?
Note: you get to this by going to "Content" and right-clicking your homepage in the content tree. That reveals several extra options which are normally hidden away, including the very useful Cultures and Hostnames option, which allows you to support multiple URLs.
I have a project where I want to use Umbraco only as a backend CMS. I want to stop it from handling my aspx pages on the frontend entirely and instead use the API to get the content I want. In other words, I want to create an aspx page manually which will not be handled by the Umbraco engine. Right now, if you create e.g. a test.aspx page and put it in the root folder, it returns 404 because Umbraco will look for a node with this alias.
How do I disable the .aspx handling by Umbraco, but still be able to use NodeFactory etc. in code-behind to access the content?
Thanks
Themos
There are a few ways you can override pages so that their URLs are not "caught" by Umbraco; you can do this by modifying the following appSettings items in the web.config file.
To add single files:
<add key="umbracoReservedUrls" value="~/myfile.aspx,~/config/splashes/booting.aspx,~/install/default.aspx,~/config/splashes/noNodes.aspx" />
To add whole subdirectories:
<add key="umbracoReservedPaths" value="~/myfolder,~/umbraco,~/install/" />
You should also be able to add ~/ to umbracoReservedPaths, which would disable all URL mappings; I just tried it and it seemed to work, but I can't vouch that it will have no unintended side-effects on the running of the Umbraco CMS.
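For example, something along these lines (keep the default Umbraco entries in the list so the back office and installer keep working):
<add key="umbracoReservedPaths" value="~/,~/umbraco,~/install/" />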
I have MOSS 2007 installed at, let's say, http://localhost:4999/ and I want to have my custom ASP.NET MVC (1.0) application at http://localhost:4999/mvcapp/. Logic dictates that, in IIS, I should create a new application virtual directory under my MOSS site and point it at my custom MVC app.
I've done this and it works for executing my controllers etc.; however, none of my /Content content is being returned! None of the referenced images, JavaScript or CSS are retrieved.
If I put this app into its own site, or into a virtual application within a non-SharePoint site, it works fine and pulls down the images, JS and CSS as normal.
Note: I'm creating a new application in IIS, not just a virtual directory, and I have no requirement for integrating with SharePoint; I just want it to have the same domain and port number.
Any ideas?
Cheers
Tony
** EDIT **
To clarify: the URLs that are being generated aren't the issue. They are correct and are generated in the same way as they would be if this weren't hosted under SharePoint, e.g. /mvcapp/Content/Scripts/jquery.js.
** EDIT 2 **
More clarification: the MVC app has its own web.config file, but it appears that when using a virtual directory within a SharePoint site, many of the handler mappings still get pushed down to the child application. (Note: this is a virtual directory configured as a separate application, not just a virtual directory.)
Whilst I don't want or need SharePoint integration, I need my MVC app to be served from the same domain and port to overcome some cross-domain issues (a lot of MVC content is iframed into SharePoint in various ways). So SharePoint would be at http://site and my app at http://site/mvc.
I would keep them on separate web sites (MVC and SharePoint, that is). You could create a new website entirely for your MVC app, then in IIS right-click your MVC web site, edit the bindings, and redirect the traffic for your MVC website to the URL you want.
I set up a couple of WebForms apps to run in much the same way you say you want: a separate virtual application with its own web.config, etc. I had to tweak the web.config to make it work, though. My app uses things like session state and view state, but I reckon those aren't applicable to your MVC app. Looking at my web.config, I think this section might be applicable to what you're trying to do:
<location>
<system.web>
<xhtmlConformance mode="Legacy" />
<trust level="Full" />
<httpModules>
<remove name="PublishingHttpModule" />
</httpModules>
</system.web>
</location>
Hope that helps. I also have an <authorization> section in there, but it wasn't necessary to make the app work.
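If the SharePoint site happens to run in the IIS 7 integrated pipeline rather than classic mode, inherited modules live under system.webServer instead of system.web, so the equivalent removal in the MVC app's web.config would look something like this (assuming SharePoint registered the module under the same name there):
<system.webServer>
<modules>
<remove name="PublishingHttpModule" />
</modules>
</system.webServer>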
Look at the following:
Configuring Specific Files and Subdirectories (MSDN)
Disabling Configuration Inheritance For ASP.NET Child Applications (Blog)
HTH