#1 Member (eiswebhosting.com, mastddl.com)

[Howto] Optimize Your Site With GZIP Compression and Cache

    Part 1

    Compression is a simple, effective way to save bandwidth and speed up your site.

Before we start, I should explain what content encoding is. When you request a file like mysite.com/index.html, your browser talks to a web server. The conversation goes a little like this:
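(The original post showed this exchange as an image; here's a rough sketch of it in text form, assuming a plain 100KB index.html.)

Code: 
Browser: "GET /index.html please."
Server:  "Found it. 200 OK, here it comes."
         Content-Type: text/html
         (sends index.html - all 100KB of plain text)
Browser: (downloads the full 100KB, then renders the page)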


    So what's the problem?

    Well, the system works, but it's not that efficient. 100KB is a lot of text, and frankly, HTML is redundant. Every <html>, <table> and <div> tag has a closing tag that's almost the same. Words are repeated throughout the document. Any way you slice it, HTML (and its beefy cousin, XML) is not lean.
    And what's the plan when a file's too big? Zip it!
If we could send a .zip file to the browser (index.html.zip) instead of plain old index.html, we'd save on bandwidth and download time. The browser could download the zipped file, extract it, and then show it to the user, who's in a good mood because the page loaded quickly. The browser-server conversation might look like this:
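(Again, the original showed this as an image; sketched out, the compressed version of the exchange looks something like this.)

Code: 
Browser: "GET /index.html please. By the way, I can handle compressed content."
         Accept-Encoding: gzip, deflate
Server:  "You take gzip? Great, I'll send it compressed. 200 OK."
         Content-Type: text/html
         Content-Encoding: gzip
         (sends the gzipped index.html - roughly 10KB)
Browser: (downloads 10KB, decompresses it, and renders the page quickly)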



The tricky part of this exchange is the browser and server knowing it's ok to send a zipped file over. The agreement has two parts:

    • The browser sends a header telling the server it accepts compressed content (gzip and deflate are two compression schemes): Accept-Encoding: gzip, deflate


• The server sends a response header telling the browser the content is actually compressed: Content-Encoding: gzip

If the server doesn't send the Content-Encoding response header, it means the file is not compressed (the default on many servers). The "Accept-Encoding" header is just a request by the browser, not a demand. If the server doesn't want to send back compressed content, the browser has to make do with the heavy regular version.
    Setting up the server

    The "good news" is that we can't control the browser. It either sends the Accept-encoding: gzip, deflate header or it doesn't.
    Our job is to configure the server so it returns zipped content if the browser can handle it, saving bandwidth for everyone (and giving us a happy user).
    For IIS, enable compression in the settings.
In Apache, enabling output compression is fairly straightforward. Add the following to your .htaccess file:
    Code: 
# compress text, html, javascript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

# Or, compress certain file types by extension:
<Files *.html>
  SetOutputFilter DEFLATE
</Files>
    Apache actually has two compression options:

    • mod_deflate is easier to set up and is standard.
    • mod_gzip seems more powerful: you can pre-compress content.

    Deflate is quick and works, so I use it; use mod_gzip if that floats your boat. In either case, Apache checks if the browser sent the "Accept-encoding" header and returns the compressed or regular version of the file. However, some older browsers may have trouble (more below) and there are special directives you can add to correct this.
    If you can't change your .htaccess file, you can use PHP to return compressed content. Give your HTML file a .php extension and add this code to the top:
    In PHP:
    PHP Code: 
<?php
if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip'))
    ob_start("ob_gzhandler");
else
    ob_start();
?>
We check the "Accept-Encoding" header and return a gzipped version of the file (otherwise the regular version). This is almost like building your own web server (what fun!). But really, try to use Apache to compress your output if you can help it; you don't want to monkey with your files.
    Verify Your Compression

    Once you've configured your server, check to make sure you're actually serving up compressed content.

    • Online: Use the online gzip test to check whether your page is compressed.
• In your browser: Use the Web Developer Toolbar > Information > View Document Size to see whether the page is compressed.
• View the headers: Use Live HTTP Headers to examine the response, or check from the command line (see the example below). Look for a line that says "Content-Encoding: gzip".
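If you prefer the command line, a quick curl check works too. This is just a sketch; swap in your own URL (mysite.com below is a placeholder):

Code: 
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip,deflate" http://mysite.com/index.html | grep -i "content-encoding"

If compression is on, this prints a "Content-Encoding: gzip" line; if it prints nothing, the server sent the plain version.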

Be prepared to marvel at the results. MastDDL shrank from 90KB to 10KB, an 88% reduction in size.

    Caveats


    As exciting as it may appear, HTTP Compression isn't all fun and games. Here's what to watch out for:

• Older browsers: Yes, some browsers may still have trouble with compressed content (they say they can accept it, but really they can't). If your site absolutely must work with Netscape 1.0 on Windows 95, you may not want to use HTTP compression. Apache mod_deflate has some rules to avoid compressing content for older browsers (see the directives after this list).
• Already-compressed content: Most images, music and videos are already compressed. Don't waste time compressing them again. In fact, you probably only need to compress the "big 3" (HTML, CSS and JavaScript).
    • CPU-load: Compressing content on-the-fly uses CPU time and saves bandwidth. Usually this is a great tradeoff given the speed of compression. There are ways to pre-compress static content and send over the compressed versions. This requires more configuration; even if it's not possible, compressing output may still be a net win. Using CPU cycles for a faster user experience is well worth it, given the short attention spans on the web.
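For reference, the BrowserMatch directives commonly suggested for mod_deflate to work around the old Netscape 4.x problems look like the following. They come from Apache's own documentation examples, so double-check them against the docs for your Apache version:

Code: 
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html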

    Enabling compression is one of the fastest ways to improve your site's performance. Go forth, set it up, and let your users enjoy the benefits.

Part 2 continues in the next post.


    Source: betterexplained




#2 Member (eiswebhosting.com, mastddl.com)
    Part 2
Add far-future Expires and Cache-Control headers

A first-time visitor to your page will make several HTTP requests to download all of your site's files, but with Expires and Cache-Control headers you can make those files cacheable. This avoids unnecessary HTTP requests on subsequent page views.

To set your Expires headers, add these lines to your .htaccess:

    Code: 
    <ifModule mod_expires.c>
      ExpiresActive On
      ExpiresDefault "access plus 1 seconds"
      ExpiresByType text/html "access plus 1 seconds"
      ExpiresByType image/gif "access plus 2592000 seconds"
      ExpiresByType image/jpeg "access plus 2592000 seconds"
      ExpiresByType image/png "access plus 2592000 seconds"
      ExpiresByType text/css "access plus 604800 seconds"
      ExpiresByType text/javascript "access plus 216000 seconds"
      ExpiresByType application/x-javascript "access plus 216000 seconds"
    </ifModule>
(For reference: 2592000 seconds is 30 days, 604800 is one week, and 216000 is 60 hours.) To set Cache-Control headers, add:
    Code: 
    <ifModule mod_headers.c>
      <filesMatch "\\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)$">
        Header set Cache-Control "max-age=2592000, public"
      </filesMatch>
      <filesMatch "\\.(css)$">
        Header set Cache-Control "max-age=604800, public"
      </filesMatch>
      <filesMatch "\\.(js)$">
        Header set Cache-Control "max-age=216000, private"
      </filesMatch>
      <filesMatch "\\.(xml|txt)$">
        Header set Cache-Control "max-age=216000, public, must-revalidate"
      </filesMatch>
      <filesMatch "\\.(html|htm|php)$">
        Header set Cache-Control "max-age=1, private, must-revalidate"
      </filesMatch>
    </ifModule>
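With both blocks in place, a response for, say, a PNG image should carry headers along these lines (a sketch; the exact Expires date depends on when the file is requested):

Code: 
HTTP/1.1 200 OK
Content-Type: image/png
Cache-Control: max-age=2592000, public
Expires: (the access time plus 30 days)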
Now all your files should have the right headers and be cacheable, except the CSS and JavaScript files processed by JSmart. This is because JSmart overrides the cache headers when gzipping these files.

To fix this, edit the /jsmart/load.php file and change this block of code:
Code: 
if (JSMART_CACHE_ENABLED) {
  if (isset($headers['If-Modified-Since']) && $headers['If-Modified-Since'] == $mtimestr)
    header_exit('304 Not Modified');

  header("Last-Modified: " . $mtimestr);
  header("Cache-Control: must-revalidate", false);
} else header_nocache();
to:
    Code: 
    if (JSMART_CACHE_ENABLED) {
      if (isset($headers['If-Modified-Since']) && $headers['If-Modified-Since'] == $mtimestr)
        header_exit('304 Not Modified');
     
      if ($file_type=='js') {
        header("Expires: " . gmdate("D, d M Y H:i:s", $mtime + 216000) . " GMT");
        header("Cache-Control: max-age=216000, private, must-revalidate", true);
      } else {
        header("Expires: " . gmdate("D, d M Y H:i:s", $mtime + 604800) . " GMT");
        header("Cache-Control: max-age=604800, public, must-revalidate", true);
      }
    } else header_nocache();
With these settings your site should be a lot faster and your files' sizes greatly reduced.

    Resources
Some descriptions are based on the .htaccess (Hypertext Access) articles from AskApache.
The mod_gzip settings are taken from the Highub Web Development Blog.



#3 Member (vbulletin.org)
Great article.

#4 Member (zcinema.in, bxmovies.com, stoneshells.net)
Yeah, it's a good one. Thank you!

#5 Member
    Thank You Mate!

#6 Member
This is great if your web host allows gzip compression. Any suggestions for a shared web host that allows gzip compression?

#7 Member (tribupinoy.net, pinoyddl.org)
worldwidexs.com.au provides gzip compression, and the prices are reasonable too.

#8 Member
    @cyber-cliff, thanks for your suggestion, I'll check that one.

#9 Member (i-fresh.net)
Thanks for the nice post, and thanks KWWHQuicko for the nice bump. This is a useful topic that I had missed.

#10 Banned
Can you do all of this with the W3 Total Cache plugin in WordPress?
