Bundling: A bundle is a logical group of files that can be referenced by a unique name and loaded with one HTTP request.
Minification: The process of removing unnecessary whitespace, line breaks, and comments from code to reduce its size, thereby improving load times.
Here is my idea:
Basically, I use multiple CSS, JS, and image files for modularity, readability, and maintainability of the code. Because each JS and CSS file requires a separate HTTP request from the browser, this degrades the load time of my web page and, in some cases, the overall performance of the website.
I would like to store all my static content in AWS S3, serve it through CloudFront distribution links, and use those CDN paths across my multiple projects with bundling & minification.
I have been trying to bundle all the JS files from the CDN into a single bundle (for bundling & minification), like the code below, but this doesn't work!
var myCDN = "http://cdn.myawsdomain.com/";
bundles.Add(new ScriptBundle("~/bundles/js", myCDN)
    .Include(
        "~/MyS3BucketName/Scripts/jquery.cookie.js",
        "~/MyS3BucketName/Scripts/bootstrap.min.js",
        "~/MyS3BucketName/Scripts/wow.min.js"
    ));
I also tried the code below, but this doesn't work either!
bundles.Add(new ScriptBundle("~/bundles/js")
    .Include(
        "http://cdn.myawsdomain.com/MyS3BucketName/Scripts/jquery.cookie.js",
        "http://cdn.myawsdomain.com/MyS3BucketName/Scripts/bootstrap.min.js",
        "http://cdn.myawsdomain.com/MyS3BucketName/Scripts/wow.min.js"
    ));
Any help will be appreciated.
I am answering my own question because it may be helpful to someone.
I generated compressed and minified versions of the JS and CSS files using the ASP.NET MVC bundle config. Note that we can't combine multiple CDN files in the bundle config; a bundle can point to only one CDN script.
I performed the following steps to generate the compressed and minified JS & CSS files:
a. Include the necessary JS files in the bundle config with the script bundle virtual path ("~/scripts/bundle") and check that the web page loads with no errors in the browser.
BundleTable.EnableOptimizations = true;
bundles.UseCdn = true;
bundles.Add(new ScriptBundle("~/scripts/bundle")
    .Include("~/Yourscriptfile1.js")
    .Include("~/Yourscriptfile2.js")
    .Include("~/Yourscriptfile3.js")
);
b. To compress and minify all those JS files into one file, send an HTTP request to the virtual path (http://localhost:254/scripts/bundle) from a browser on your local machine and save the response as a file named "output.min.js".
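For example, one way to do this without a browser (assuming curl is available) is:

curl -o output.min.js http://localhost:254/scripts/bundle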
c. Upload the "output.min.js" file to the S3 bucket, make the object publicly readable, add cache headers with a far-future expiration date, and configure the S3 bucket as a CDN origin.
Key="Cache-Control", Value="max-age=1814400" - [3 weeks]
Key="Expires", Value="Thu, 30 Dec 2021 16:00:00 GMT" - [far-future expiration date]
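For reference, a rough sketch of that upload using a recent AWS SDK for .NET (the bucket name, object key, and local file path below are placeholders; this assumes the Amazon.S3 package):

using Amazon.S3;
using Amazon.S3.Model;

var client = new AmazonS3Client(); // credentials/region come from the environment
var request = new PutObjectRequest
{
    BucketName = "MyS3BucketName",            // placeholder bucket name
    Key = "Scripts/compress/output.min.js",   // placeholder object key
    FilePath = @"C:\build\output.min.js",     // placeholder local path
    CannedACL = S3CannedACL.PublicRead        // public read-only
};
request.Headers.CacheControl = "max-age=1814400"; // 3 weeks
client.PutObject(request);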
d. Now configure your CDN in the bundle config file by changing the code from step (a) to the code below:
BundleTable.EnableOptimizations = true;
bundles.UseCdn = true;
string CDN = "http://cdn.mydomain.io/Scripts/compress/output.min.js";
bundles.Add(new ScriptBundle("~/scripts/bundle", CDN)
    .Include("~/Yourscriptfile1.js")
    .Include("~/Yourscriptfile2.js")
    .Include("~/Yourscriptfile3.js")
);
In the above code, scripts are requested from the CDN in release mode, while in debug mode the debug versions of the scripts are served locally.
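For completeness, the bundle is rendered from the Razor layout in the usual way; with bundles.UseCdn = true and optimizations enabled, this emits the CDN URL, and in debug mode it emits individual local script tags:

@* In _Layout.cshtml *@
@Scripts.Render("~/scripts/bundle")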
Related
There are cases where an application might need a relatively large resource as a strict requirement, but only in certain cases that are not easily detectable from a service worker. For example:
Safari has a prefixed and non-conforming implementation of the Web Audio API. I found a great shim, but it's over 300kb. It's critical for the web app to function in Safari but unnecessary in other browsers.
Some media are available in multiple formats that may not always be supported. Video is typically too large to precache and has problems with range requests, but it could apply to WebP images or short audio files (e.g. Opus vs. AAC). If you include all formats in the precache manifest, by default it will download all of them.
One approach would be to manually exclude certain files from the precache manifest and then to conditionally load those files from the scripts on the main thread to be stored in the runtime cache. But then those files are not precached - they're only loaded after the new version activates, by which point you may no longer be online.
Is there a solution that allows the following:
Have the service worker send a message to the main thread with the URL of a "test" script that checks the various conditions.
Load and run that script on the main thread and send the service worker the list of required conditional assets
Add those assets to the precache manifest to be diff'ed against the previous version and downloaded as necessary
The service worker should not switch over to the new version until all precached assets are loaded, including the conditional ones.
I ran into a similar issue with i18n files. I don't want my users to precache all strings for all available languages.
Some context for my setup: I'm using the injectManifest approach for Vite. The webpack plugin should expose the same config options for this.
Language files in my case have this shape in the build dir:
assets/messages.XXXX.js
So the first step was to tell the injectManifest config to ignore these files:
injectManifest: {
  globIgnores: ["**/node_modules/**/*", "assets/messages.*.js"],
}
The second step was going into the sw.js file and telling Workbox that all scripts that aren't precached should be requested via the CacheFirst strategy (i.e. first attempt to load the script from the cache; if it's not present, load it from the network and put it into the cache).
So here's the adapted part of the sw.js:
/* ... */
import {registerRoute, NavigationRoute, Route} from "workbox-routing";
import {CacheFirst} from "workbox-strategies";
import {precacheAndRoute, cleanupOutdatedCaches, createHandlerBoundToURL} from "workbox-precaching";

precacheAndRoute(self.__WB_MANIFEST);

// Handle non-precached scripts: try the cache first,
// fall back to the network and cache the response.
const scriptsRoute = new Route(
  ({request}) => {
    return request.destination === "script";
  },
  new CacheFirst({
    cacheName: "scripts",
  })
);

cleanupOutdatedCaches();
registerRoute(new NavigationRoute(createHandlerBoundToURL("index.html")));
registerRoute(scriptsRoute);
/* ... */
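With that route in place, the main thread can conditionally load heavy assets (such as the Safari shim from the question), and the first load will populate the "scripts" runtime cache. A hypothetical example, where the feature test and file path are placeholders:

// main.js (main thread, not the service worker)
const needsShim = !window.AudioContext && !!window.webkitAudioContext; // crude feature test
if (needsShim) {
  // Served from the network on first load; the CacheFirst route
  // in sw.js caches it under "scripts" for subsequent loads.
  import("./assets/webaudio-shim.js");
}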
I'm using ASP.NET MVC v4 for my application, and I'm using the web optimization features (bundling and minification of scripts and styles).
Now, what I understand is (please correct me if I'm wrong) that the optimization framework looks at the included files at compile time and configures them. It creates a version number (v=something) based on the contents. Every time the contents change, it recreates the version hash, and the client gets the updated files.
Now, is there a way to get the following done?
[1] Update something inside a JS file on my server, and serve the updated file to clients without rebuilding & restarting the application (I'm not changing the bundle configuration here, just updating file content inside a script)?
[2] Update the script configuration itself (e.g. adding a new script to a bundle), and get that served to clients without recompiling & restarting the application? Or, at least without recompiling? (I know we generally define the bundles inside cs files, but wondering if there is a way out!)
[3] Is there a way to use my own version number (say from a config file, v=myCustomScriptVersion) rather than the auto-generated version hash?
It's a bit late, but I'm just sharing my experience on my own questions here.
As discussed in the comments on the question, bundles are defined in a cs file (generally BundleConfig.cs inside App_Start). So the bundles are defined at compile time, and at application start they get added to the bundle collection and become usable.
Now, the interesting bit. At run-time, the optimization framework looks into the included files, creates a hash of the contents, and appends it as a version query string to the bundle request. So when the bundle is rendered, the generated URI looks like the one below.
http://example.com/Bundles/MyBundledScripts?v=ILpm9GTTPShzteCf85dcR4x0msPpku-QRNlggE42QN81
This version number (v=...) is completely dynamic. If any file content within the bundle changes, this version is regenerated; otherwise it remains the same.
Now to answer the questions,
[1] This is done automatically by the framework; no need to do anything extra. Every time a file's content changes, a new version number is generated and the clients get the updated scripts.
[2] Not possible. If the files included in a bundle are changed, the application has to be recompiled.
[3] Yes, it can be used. The custom version number can be added as below.
@Scripts.Render("~/Bundles/MyBundledScripts?v=" + ConfigurationManager.AppSettings["ScriptVersion"])
But caution! This removes the automatic versioning based on file contents.
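The corresponding entry lives in web.config's appSettings; the key just has to match the one read above (the value here is only an example):

<appSettings>
    <add key="ScriptVersion" value="1.0.5" />
</appSettings>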
Additionally, if there are multiple versions of the same file available and we always want to include the latest one, that can be achieved easily by including a {version} wildcard in the bundle configuration, like below.
bundles.Add(new ScriptBundle("~/Bundles/MyBundledScripts")
    .Include(
        "~/Scripts/Vendor/someScript-{version}.js"
    ));
So, if there are 2 scripts in the /Scripts/Vendor folder
someScript-2.3.js
someScript-3.4.js
Then the file someScript-3.4.js (the higher version) gets included automatically. And when a new file someScript-4.0.js is added to the folder, it will be served to clients without any need to recompile or restart.
I've been trying to reduce the amount of copying and pasting of content files across some of my projects, so I decided to go down the route of adding files as links from a central project.
The problem I have now is that System.Web.Optimization.Bundle.AddFile(string virtualPath, bool throwIfNotExist = true) does not work, as the file doesn't really exist in the directory I specified.
Does anyone have any pointers? Or maybe an alternative to linking content files?
Thanks.
I think you cannot access files outside of your web project with the virtual path system, and it might hurt when you want to deploy your app.
I suggest making a separate project for your static content with a custom domain, e.g. static.yourcompany.com, and referencing all these files from that domain. This also has the advantage that the browser does not have to attach the cookies used for authentication and tracking to these requests, which might be faster if you have a lot of traffic. You can also set up your own CDN (http://www.maxcdn.com/pricing/) or store the files in Azure or Amazon AWS (which is more or less free for small amounts of files and traffic).
Another approach is to make some batch files to keep your content files synced.
I'm looking to integrate SquishIt with our web app. What I have noticed from testing locally is that SquishIt generates the file only once. Based on other SO answers and on reading the SquishIt code, I gather that file generation happens only if the HttpCache doesn't contain a value for the generated hash key.
If I delete the generated minified file without restarting the app or clearing the HttpCache, SquishIt doesn't recreate the file.
Is there any way to force SquishIt to recreate the file, if it doesn't exist?
Earlier we were using RequestReduce, and we noticed that it didn't always pick up CSS/JS changes if only the CSS/JS files were edited (i.e., web.config was not edited and the app was not restarted). To ensure that changes are picked up, we always delete all generated files when deploying.
Will SquishIt ALWAYS detect the changed code, even if web.config is not modified, the app is not restarted and the HttpCache is not cleared?
The BundleCache.Add method's code (https://github.com/jetheredge/SquishIt/blob/master/SquishIt.Framework/BundleCache.cs#L40-54) helps answer this question.
Can I force SquishIt to generate the files by simply deleting the generated files?
After thinking about the scenario I need to handle, this is the wrong question to ask.
EDIT:
What are the cache headers sent to the client for these generated files?
My scenario is as follows. I switched from the default JS minifier to JsMinMinifier. After deleting the files (RenderOnlyIfOutputFileIsMissing is set) and restarting the app, the minified files got generated. However, they had the same names as the previous files (I wrongly assumed they would have different names).
Refreshing my browser showed that the newly generated files were sent by the server. How did this happen? If the assets had a long-expiration cache header set on them, the browser shouldn't have requested the new files from the server. (Inspecting the assets in Firebug, I am unable to understand the cache policy; to me it looks like it's set to cache for a couple of minutes.)
EDIT 2:
My takeaway is that there is no need to delete the generated file to cause regeneration. If the corresponding source files change, SquishIt WILL generate an appropriate file.
It should - we are adding cache dependencies for the source files (not the generated ones), so if one of them is edited, the entry in the bundle cache should be invalidated. See BundleCache.Add.
No - once an entry is in the bundle cache, we assume the output file will be there, so you'd end up with the file not being found. This is by design; we haven't really heard a compelling case against it.
Deleting generated files when deploying should be fine though, even if not strictly necessary - don't you need to restart the app then anyway?
If you are really concerned about files lingering, you may want to consider using SquishIt without the file system.
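For reference, a rough sketch of SquishIt's cached (in-memory) rendering, based on its AsCached/RenderCached API; the bundle name and route here are illustrative:

@* In the view/layout: build the bundle and cache its content in memory *@
@MvcHtmlString.Create(
    Bundle.JavaScript()
        .Add("~/Scripts/script1.js")
        .Add("~/Scripts/script2.js")
        .AsCached("combined", "~/assets/js/combined"))

// In a controller action mapped to ~/assets/js/combined: serve the cached content
public ActionResult Combined()
{
    return Content(Bundle.JavaScript().RenderCached("combined"), "application/javascript");
}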
I built a photo gallery which uses Paperclip and validates the content-type using validates_attachment_content_type.
The application runs on a shared host with Passenger.
Is it possible to bypass the validation and run malicious scripts from the public/pictures directory? If so, is there anything that I can do to avoid evil scripts from running or from being uploaded?
Is it possible to bypass the validation and run malicious scripts from the public/pictures directory?
Yes. You can have a perfectly valid renderable image file that also contains HTML with script injection. Thanks for the bogus content-sniffing, IE, you have ruined everything.
See http://webblaze.cs.berkeley.edu/2009/content-sniffing/ for a summary.
If so, is there anything that I can do to avoid evil scripts from running or from being uploaded?
Not really. In theory you can check the first 256 bytes for HTML tags, but then you have to know the exact details of what browsers content-sniff for, and keeping that comprehensive and up-to-date is a non-starter.
If you are processing the images and re-saving them yourself that can protect you. Otherwise, do one or both of:
only serve user-uploaded files from a different hostname, so they don't have access to the cookie/auth details that would allow an injected script to XSS into your site. (but look out for non-XSS attacks like general JavaScript/plugin exploits)
serve user-uploaded files through a server-side script that includes the 'Content-Disposition: attachment' header, so browsers don't attempt to view the page inline. (but look out for old versions of Flash ignoring it for Flash files) This approach also means you don't have to store files on your server filesystem under the filename the user submits, which saves you some heavy and difficult-to-get-right filename validation work.
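In Rails terms, that second option might look something like the sketch below (the model and attachment names are hypothetical, and authorization/path handling are omitted):

class PicturesController < ApplicationController
  def download
    picture = Picture.find(params[:id])
    # Content-Disposition: attachment stops browsers from rendering
    # the file inline, so sniffed HTML/scripts never execute in-page.
    send_file picture.image.path,
              :disposition => "attachment",
              :type => picture.image_content_type
  end
end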