Tricky one.
Browser Security says "No".
Scenario:
Forms and templates are used throughout the business; these can be Word, Excel, or PDF files.
We have an intranet system, and I would like to offer users the ability to open files this way (not download them).
However, the browser has other ideas (and I understand why). Is there a workaround to achieve this?
SharePoint 2007:
The Core.css file is located in the 12 folder:
12\TEMPLATE\LAYOUTS\1033\STYLES
My question is: is this file shared by all web applications on the SharePoint server?
I think it is not, because when I make changes to core.css, only one of the applications shows the changes. So where are the other core.css files located?
I searched my C: drive and found only one core.css!
Thanks in advance
This file is shared by all web applications. What you are experiencing is most likely a result of aggressive client-side browser caching. Try clearing your browser's cache and refreshing the page.
Having said that...
This is not the way you are supposed to customize SharePoint. You should never touch the OOB files in the 12 hive. The next time you apply an update or service pack, all of your changes will be gone. Also, by touching the OOB files, you automatically lose all support from Microsoft.
Rather, you should either a) develop a SharePoint feature that publishes a custom master page with your CSS changes, or b) make the master page changes using SharePoint Designer.
Core.css is always shared by all web applications.
As Magnus said, what you have seen is due to browser caching or something similar. Clear the browser cache or restart the browser and then try again.
Hope this helps.
Thanks
I want to build a simple site with MVC but then render the "pages" and corresponding "assets" (JS, CSS, images, etc.) to what one might call a "static site".
In other words, I don't want to deploy to an IIS server that supports MVC. I simply want to build the site in MVC, then somehow render those pages into static HTML/CSS/etc. files and upload the site to a regular LAMP host.
Is there an easy way to automate this? A NuGet package? A binary? An MVC extension, like maybe a handler add-on, that can render out the static site in a single pass?
About 10 years back, I used to download whole websites for offline use with HTTrack Website Copier. Maybe you could download your own website, which would give you a nice hierarchy of static web pages. If all your web pages are reachable through homepage links, menu links, and so on, you can download most of your website. Basically, you can google for web crawlers/offline browsers/website downloaders and run one of them to get the job done.
Alternatively, if you know the pattern of the URLs, you could feed them to a download manager. Not sure if that works with your website, but I do it sometimes.
HTH
If your site depends on a database or some other dynamic source, it will be close to impossible to dump all possible combinations of pages into static files. If, on the other hand, your site is pretty much static, saving the rendered HTML/JS/CSS source into files and uploading it to a LAMP server won't be too hard.
You may want to look at Pretzel, a .NET static site generator.
Update: Apparently it doesn't work on ASP.NET projects: Issue #123. It only supports the Razor language for authoring content pages.
If the reason for doing this is performance-related, why not just use output caching and the like? That way the pages will be extremely fast (you could set the cache timeout to a very long period of time), and you don't need to run a tool to do the conversion or store your HTML separately from your source code.
Of course, you will still need to run IIS/.NET.
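For illustration, here is a minimal sketch of output caching on an MVC controller action; the controller name, action, and one-day duration are hypothetical, not from the question:

using System.Web.Mvc;

public class PagesController : Controller
{
    // Cache the rendered HTML for 24 hours. ASP.NET serves the cached copy
    // without re-running the action, so the page behaves much like a static file.
    [OutputCache(Duration = 86400, VaryByParam = "none")]
    public ActionResult About()
    {
        return View();
    }
}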
You have three options:
Create your website using plain HTML, CSS, jQuery, and images. You can use Visual Studio Code as the IDE to create the files. One issue might be managing a common header/footer across the site, but you can solve that by injecting the header/footer HTML with jQuery.
Use a CMS (content management system) like Umbraco to host your site. Umbraco indexes and caches pages to improve performance, and it gives you great control over what to publish on your website.
Create the website using .NET + MVC and use a tool like HTTrack to download a static copy of the website. You can even automate the process with a command triggered after every deployment or build (see the sketch below).
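As a rough sketch of that automation in C#, assuming HTTrack's command-line client is installed and on the PATH (the site URL and output folder are placeholders; check HTTrack's documentation for the exact flags):

using System.Diagnostics;

class StaticSnapshot
{
    static void Main()
    {
        // Mirror the deployed MVC site into a folder of static files.
        // -O sets HTTrack's output directory.
        var httrack = new ProcessStartInfo
        {
            FileName = "httrack",
            Arguments = "\"http://localhost:8080/\" -O \"C:\\static-site\"",
            UseShellExecute = false
        };

        using (var process = Process.Start(httrack))
        {
            process.WaitForExit(); // block until the mirror is complete
        }
    }
}

You could run this as a post-build step or as the last stage of a deployment script, then upload the output folder to the LAMP host.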
It's pretty much known that publishing to a remote location using VS2008 is an exercise in great patience and faith.
Once a publish begins (using VS2008 to publish an MVC site), the site might be down from the first file that is successfully transferred. The problem is that an unreliable internet connection, or an interesting error message, can break the publish and require restarting it.
It's understood that there is little to be done from the VS2008 end. The question, then:
What strategy can I use to ensure that there is an acceptable user experience during the 'downtime'? (e.g. "This site is currently under maintenance...")
A lovely feature of ASP.NET/IIS is that if you place a file named app_offline.htm in the root of the web application, all requests will be answered with that file. This includes requests for images, stylesheets, scripts, etc., so you'll need to condense all media for the page into the page itself.
In fact, while Visual Studio is in the process of publishing your web application, it will place this file in the root of the application and remove it when the publish is complete. While Visual Studio doesn't allow you to customize the contents of its app_offline.htm, you can take the application offline yourself simply by uploading that page.
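As a minimal sketch, assuming you can copy files into the server's web root (the share path and maintenance.htm file below are placeholders), you could bracket a manual publish like this:

using System;
using System.IO;

class MaintenanceToggle
{
    // Hypothetical paths; adjust to your environment.
    const string SiteRoot = @"\\webserver\wwwroot\myapp";
    const string MaintenancePage = "maintenance.htm";

    static void Main()
    {
        string offlineFile = Path.Combine(SiteRoot, "app_offline.htm");

        // Take the application offline: IIS answers every request with this page.
        File.Copy(MaintenancePage, offlineFile, true);

        Console.WriteLine("Site is offline. Publish now, then press Enter...");
        Console.ReadLine();

        // Bring the application back online.
        File.Delete(offlineFile);
    }
}

Remember to keep the maintenance page self-contained (inline CSS, no external images), since requests for those assets would also be answered with the offline page.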
I am considering using .NET MVC for my next web app, but one of the requirements is that there should be minimal work involved on the client's side (they will be maintaining the site).
They are used to simple HTML sites, where all they have to do in order to make a minor change is edit an HTML file in Notepad and upload it.
What parts of a .NET web app need to be compiled? Is it only the .cs files? Can all the rest be updated freely by modifying files with, e.g., Notepad?
Also, in an MVC environment, is more of the view-related code in compiled files?
How is this kind of maintenance usually done in cases where the client takes over the site on delivery (and is not interested in installing VS and compiling)?
If you really need a web application, then in order to change the 'application' part, they're going to need to be able to recompile.
If they're going to make visual changes, your best bet is to provide a way for them to edit the HTML of the site. You can change the views (.aspx files) in ASP.NET MVC without having to recompile. If you change your controllers or your model, then you'll have to recompile.
If this is a major requirement for your client, you can build the site using ASP.NET Web Forms instead of ASP.NET MVC, in which case changes to the .cs files will be compiled on the fly when the page is first accessed. Note that this only applies to the .cs files in your Web Forms project; any .cs files in referenced assemblies will need to be pre-compiled.
That said, I suspect your client is primarily interested in modifying the look/feel/content of a page, so they would probably be satisfied modifying the .aspx files in either a Web Forms or an MVC app.
If they have the budget for it, it sounds like the best solution is to build a Content Management System, so they never have to edit files again.
It is not uncommon for our intranet web applications to link to publications, documents, or other resources from our shared network file servers.
In the past, we've had little trouble fashioning links such as the following:
file://fileserver1/folderofgoodies/rules.pdf
\\fileserver1\folderofgoodies\rules.pdf
The reason we had no trouble is that everyone in the building uses IE6 or IE7 (very few have IE8), and both styles of URL seem to work fine in Microsoft browsers.
But if you try clicking such links in other browsers, specifically Firefox, nothing happens!
On a new intranet web app I'm developing, I've been trying to ensure cross-browser support, but any links to local-computer or local-network resources seem to be ignored, at least in Firefox 3.5.3 (I admit I haven't checked other browsers yet).
Is there any way I can change how I link to these files so that browsers like Firefox will accept them? I cannot do anything that requires installing scripts, software, extensions, or any other solution on a per-user/per-computer basis.
I realize the suppression of such links is a security measure, but these links would originate only from trusted local intranet locations, so...
If this is an intranet, you can build a little helper server/page/webservice/whatever to which you will link, passing the file name as a parameter:
http://server/getlocalfile?path=file://fileserver1/folderofgoodies/rules.pdf
And you will benefit from extended security, by the way.
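A minimal sketch of such a helper as an ASP.NET MVC controller action, assuming MVC 3 or later (the controller name and query shape are hypothetical; the \\fileserver1\folderofgoodies share is taken from the question):

using System.IO;
using System.Web.Mvc;

public class FilesController : Controller
{
    // Only serve files from this whitelisted share.
    private const string Root = @"\\fileserver1\folderofgoodies";

    // e.g. GET /Files/GetLocalFile?path=rules.pdf
    public ActionResult GetLocalFile(string path)
    {
        // Resolve against the share and reject "..\" escapes.
        string fullPath = Path.GetFullPath(Path.Combine(Root, path));
        if (!fullPath.StartsWith(Root, System.StringComparison.OrdinalIgnoreCase))
            return new HttpStatusCodeResult(403);

        if (!System.IO.File.Exists(fullPath))
            return HttpNotFound();

        // Stream the file back; the browser decides whether to open or save it.
        return File(fullPath, "application/octet-stream", Path.GetFileName(fullPath));
    }
}

This is also where the extended security comes in: the helper can whitelist shares, check the caller's permissions, and log access, none of which a raw file:// link allows.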
I think your only option is reconfiguring Firefox, but unfortunately you said you can't do that.
You could just map the file server path as a virtual directory in your intranet site and link to it via http.
Mozilla applications block links to local files. The only way around it is to install a plugin (or several) in Firefox. This link describes some of them.