Thomas Herold asks about my Share This Plugin:
The prototype.js is about 56k in size. My concern is that this slows down the loading time of my site. Any comments about this?
Yep, comments I’ve got because this was a carefully considered decision. First, let’s look at the ways this isn’t as big a deal as it seems (What, 56k per page load!?!), then we’ll look at the reasons for using it.
There are a number of reasons why 56k of additional download isn’t as bad as it used to be. For one, folks are more frequently on faster internet connections these days. For another, browsers and web servers have the whole “ya need this file?”, “nope, I’ve still got it from last time” two-step down pretty well these days – the file isn’t downloaded every time. Smart webmasters will also set up their web servers to gzip plain text output for web browsers that can accept gzipped content – which is pretty much all desktop browsers these days. Gzipped, the 56k becomes a mere 12k, which is a lot less offensive.
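The gzip savings are easy to verify for yourself. Here’s a minimal Python sketch – the repeated sample line is just a stand-in for a real JavaScript file, since library code is similarly repetitive and compresses well:

```python
import gzip

# Stand-in for a JS library: source code is repetitive text,
# which is exactly what gzip compresses well.
sample = (
    "function addEvent(el, type, fn) { el.addEventListener(type, fn, false); }\n"
    * 800
).encode("utf-8")

compressed = gzip.compress(sample)
ratio = len(compressed) / len(sample)
print(f"original: {len(sample)} bytes, gzipped: {len(compressed)} bytes ({ratio:.0%})")
```

Real-world library code won’t compress quite as dramatically as a repeated line, but the roughly 56k-to-12k figure for prototype.js is typical of what gzip does to JavaScript source.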
So that covers the ways that the 56k shouldn’t be so scary, but that is really only half the story.
In the 2.1 release, WordPress will include Prototype.js as part of the default distribution. This is something plugin developers will likely take advantage of, and it’s a smart move. It’s much better to have plugin developers using an included, popular library than to have each plugin distributing its own. Does this put Share This a bit ahead of the curve? Sure. But it also makes it a good citizen and an example for other plugins that are built with 2.1 in mind.
So yes, it’s a trade-off. I could have gotten the same functionality in fewer bytes by using a different library or custom code, etc. But with a 2.1 release looking like it’s right around the corner, it seemed like a better decision to use Prototype.
Hi Alex. I agree with you on the idea that 56K isn’t really that big a deal – hopefully the rest of the plugin developers will use the same included copy of Prototype.
I also hope Matt and gang look at something like Dojo ShrinkSafe to shrink the js files. I was able to shrink my prototype js file down by 65%. Your mileage may vary – I wish the idea of taking all js files and gzip’ing them would become the standard. Browsers support it today.
By the way, great job on the Share This plugin. Love it. Cheers
I’m surprised how many times I tell people this, but enabling gzip at the server level only takes a few minutes. You can use it for .js, .html and pretty much anything that isn’t already compressed (well, no point in recompressing gif files anyway :-))
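For Apache (the IIS6 steps differ – see the how-to mentioned below), the server-level setup is roughly a few lines of mod_deflate configuration. Treat this as a sketch rather than a drop-in config:

```apache
# Compress text-ish content types; already-compressed formats
# like GIF/JPEG/ZIP gain nothing from recompression.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript

# Classic exclusions for old browsers with broken gzip handling:
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
```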
This way you don’t have to continually compress individual files, and it works better than the gzip option in WordPress, which causes problems with some plug-ins.
I did a how-to on this for IIS6, for any interested.
I agree with the whole concept of using a bundled Prototype. However, gzipping might be a bad idea, since MSIE sometimes incorrectly decompresses the requested file. I’d rather trim blanks, comments, etc., which can greatly reduce the file size too.
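The trimming idea is easy to approximate. Here’s a naive Python sketch that strips comments and blank lines – real minifiers like ShrinkSafe are far more careful (this version ignores string literals and regexes, so it would mangle code containing `//` inside a string; it’s illustrative only):

```python
import re

def naive_trim(js: str) -> str:
    """Crudely strip /* */ and // comments plus blank lines.
    NOT production-safe: it does not parse string literals or regexes."""
    js = re.sub(r"/\*.*?\*/", "", js, flags=re.DOTALL)  # block comments
    js = re.sub(r"//[^\n]*", "", js)                    # line comments
    lines = (line.rstrip() for line in js.splitlines())
    return "\n".join(line for line in lines if line.strip())

# A small Prototype-style snippet as sample input:
source = """\
/* shorthand helper */
function $(id) {
  // wraps getElementById
  return document.getElementById(id);
}
"""
trimmed = naive_trim(source)
print(f"{len(source)} -> {len(trimmed)} bytes")
```

On library-sized files with generous comments and whitespace, even this crude pass accounts for a meaningful chunk of the reductions people report.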
Has it ever been discussed to have a centralized public version of key open source libraries like prototype.js? Could it benefit the world if unmodified stock libraries were served from a centralized server network?
The concept being that many sites would use the same file, so each time you need that file most likely it would have been downloaded by a previous site.
One of the key arguments against mod_gzip is simply CPU. It’s rather cycle-expensive to compress everything. I personally just do pages, not CSS or JS.
That said, I just modified my comments to not use prototype.js for now. I was able to cut out 2 http requests in the mod and file size is down ~ 65k.
The HTTP requests kill. Doesn’t matter how much bandwidth you have, http requests harm performance.
I’d personally like to see a WordPress API for plugins to include their CSS/JS into the main files, rather than each adding its own. Yes, that would require your style.css and scripts.js to be dynamic, but it would stop the insanity of 15 JS and 12 CSS files for each blog. Before counting images, some sites are hitting 20+ requests. And that’s not even the worst of them.
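No such API existed in WordPress at the time, but the bundling step itself is trivial to script. A hypothetical Python sketch (the file names and the idea of plugins registering scripts are made up for illustration):

```python
def bundle(files):
    """Concatenate several (name, source) pairs into one payload,
    so the browser makes a single HTTP request instead of many.
    A separator comment marks where each file begins."""
    return "\n".join(f"/* --- {name} --- */\n{src}" for name, src in files)

# Hypothetical plugin scripts; in the imagined WordPress API,
# each plugin would register its file and the blog would serve
# one combined scripts.js.
payload = bundle([
    ("share-this.js", "var shareThis = {};"),
    ("tag-cloud.js",  "var tagCloud = {};"),
])
print(payload)
```

Concatenation order matters when scripts depend on one another, which is one reason a registration API (rather than blind globbing) is the right shape for this.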
Don’t forget the impact this has on web server performance as well.
IMHO it’s a design flaw for each to include their own JS/CSS.
I think you may have overlooked the caching portion of the discussion above.
Robert, for IIS, I find that setting the compression level to 9 is (subjectively) hardly noticeable. Level 9 produces 3:1 or 4:1 compression on text files.
I agree that combining files for fewer requests is better, but I think that shouldn’t matter much on most servers because of HTTP-Keep-Alive. Am I wrong here?
Ozh and Vinny, keeping multiple copies – compressed/uncompressed of the dozens of .js, .css, .html files is a pain in the @ss. I’d rather take the risk of IE problem. However, prototype is common, so if you’re going to bundle it in WP or anywhere else, at least ship a trimmed version.
Of course, as Alex says, caching plays a huge role. I agree with iolare and suppose it would be cool if there were central servers to cache common scripts, but maybe we’re dreaming.
BTW Alex, love the “new” sticker. How’s about a preview comment feature? 😛
Oh, back on topic – bundle prototype, trim it, and also use gzip. IMHO.
I think you are all mistaken that most people are on broadband these days. Yes, the majority of the US maybe. Most of the Nordic countries in Europe too.
But what about Latin America, or Spain and France? Believe me when I say that most of the visits I get on my site are still from dial-ups.
So to keep my site responsive I try to keep the HTTP requests to a minimum, and if a feature is nice but 20k extra in size I will not add it, simply because the page would load too slowly for a large part of my visitors.
My 2 cents.
I didn’t say that:
I said that:
Megamuch, France is among the top DSLed countries in Europe… we get ADSL for much cheaper than Americans do :p
Hmmm… I’m kinda lazy and prefer the small files jQuery.js creates.
If you are including Prototype.js, might as well include Moo.fx along with it then… 🙁
You’re missing the point, son. I’m not including Prototype, WordPress is.
Caching is great, but it doesn’t help with the first page view, and first impressions are important.
According to research, 4 seconds is considered acceptable. Considering the large number of 56k users on the web… yea. It’s ugly.
[…] On the technical side, I’ve recently added special handling support for FeedBurner redirect URLs and the ability to backfill page titles in the event some pages don’t load properly during the initial harvest operation. Like my Share This plugin, Link Harvest also uses Prototype for the reasons I’ve explained previously. […]
[…] memory leak caused by the infamous prototype.js. I’m not too fond of the said script as it is too large for its own good and wish more people would use moo.fx and its relatives like mootools, which have a tinier version […]