I am trying to set up Varnish 4 + Nginx with mod_pagespeed.
I am using the following configuration for Varnish 4, based on this document:
link
The configuration I am using does not produce any errors, but I get almost no Varnish cache hits:
MAIN.cache_hit 80
MAIN.cache_miss 347
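Those counters are from varnishstat; I am reading them with something like:
varnishstat -1 | grep -E "MAIN\.cache_(hit|miss)"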
The complete .vcl configuration:
# Marker to tell the VCL compiler that this VCL has been adapted to the
# new 4.0 format.
vcl 4.0;
import std;
# Block 1: Define upstream server's host and port. Set this to point to your
# content server.
backend default {
.host = "127.0.0.1";
.port = "8080";
}
# Block 2: Define a key based on the User-Agent which can be used for hashing.
# Also set the PS-CapabilityList header for PageSpeed server to respect.
sub generate_user_agent_based_key {
# Define placeholder PS-CapabilityList header values for large and small
# screens with no UA dependent optimizations. Note that these placeholder
# values should not contain any of ll, ii, dj, jw or ws, since these
# codes will end up representing optimizations to be supported for the
# request.
set req.http.default_ps_capability_list_for_large_screens = "LargeScreen.SkipUADependentOptimizations:";
set req.http.default_ps_capability_list_for_small_screens = "TinyScreen.SkipUADependentOptimizations:";
# As a fallback, the PS-CapabilityList header that is sent to the upstream
# PageSpeed server should be for a large screen device with no browser
# specific optimizations.
set req.http.PS-CapabilityList = req.http.default_ps_capability_list_for_large_screens;
# Cache-fragment 1: Desktop User-Agents that support lazyload_images (ll),
# inline_images (ii) and defer_javascript (dj).
# Note: Wget is added for testing purposes only.
if (req.http.User-Agent ~ "(?i)Chrome/|Firefox/|MSIE |Safari|Wget") {
set req.http.PS-CapabilityList = "ll,ii,dj:";
}
# Cache-fragment 2: Desktop User-Agents that support lazyload_images (ll),
# inline_images (ii), defer_javascript (dj), webp (jw) and lossless_webp
# (ws).
if (req.http.User-Agent ~
"(?i)Chrome/[2][3-9]+\.|Chrome/[[3-9][0-9]+\.|Chrome/[0-9]{3,}\.") {
set req.http.PS-CapabilityList = "ll,ii,dj,jw,ws:";
}
# Cache-fragment 3: This fragment contains (a) Desktop User-Agents that
# match fragments 1 or 2 but should not because they represent older
# versions of certain browsers or bots and (b) Tablet User-Agents, which get
# only optimizations that work on all browsers and use image compression qualities applicable to large
# screens. Note that even tablets that are capable of supporting inline or
# webp images, e.g. Android 4.1.2, will not get these advanced
# optimizations.
if (req.http.User-Agent ~ "(?i)Firefox/[1-2]\.|MSIE [5-8]\.|bot|Yahoo!|Ruby|RPT-HTTPClient|(Google \(\+https\:\/\/developers\.google\.com\/\+\/web\/snippet\/\))|Android|iPad|TouchPad|Silk-Accelerated|Kindle Fire") {
set req.http.PS-CapabilityList = req.http.default_ps_capability_list_for_large_screens;
}
# Cache-fragment 4: Mobiles and small screen tablets will use image
# compression qualities applicable to small screens, but all other
# optimizations will be those that work on all browsers.
if (req.http.User-Agent ~ "(?i)Mozilla.*Android.*Mobile*|iPhone|BlackBerry|Opera Mobi|Opera Mini|SymbianOS|UP.Browser|J-PHONE|Profile/MIDP|portalmmm|DoCoMo|Obigo|Galaxy Nexus|GT-I9300|GT-N7100|HTC One|Nexus [4|7|S]|Xoom|XT907") {
set req.http.PS-CapabilityList = req.http.default_ps_capability_list_for_small_screens;
}
# Remove placeholder header values.
unset req.http.default_ps_capability_list_for_large_screens;
unset req.http.default_ps_capability_list_for_small_screens;
}
sub vcl_hash {
# Block 3: Use the PS-CapabilityList value for computing the hash.
hash_data(req.http.PS-CapabilityList);
}
# Block 3a: Define ACL for purge requests
acl purge {
# Purge requests are only allowed from localhost.
"localhost";
"127.0.0.1";
}
# Block 4: In vcl_recv, on receiving a request, call the method responsible for
# generating the User-Agent based key for hashing into the cache.
sub vcl_recv {
# We want to support beaconing filters, i.e., one or more of inline_images,
# lazyload_images, inline_preview_images or prioritize_critical_css are
# enabled. We define a placeholder constant called ps_should_beacon_key_value
# so that some percentages of hits and misses can be sent to the backend
# with this value used for the PS-ShouldBeacon header to force beaconing.
# This value should match the value of the DownstreamCacheRebeaconingKey
# pagespeed directive used by your backend server.
# WARNING: Do not use "random_rebeaconing_key" for your configuration, but
# instead change it to something specific to your site, to keep it secure.
set req.http.ps_should_beacon_key_value = "random_rebeaconing_key";
# Incoming PS-ShouldBeacon headers should not be allowed since this will allow
# external entities to force the server to instrument pages.
unset req.http.PS-ShouldBeacon;
call generate_user_agent_based_key;
# Block 3d: Verify the ACL for an incoming purge request and handle it.
if (req.method == "PURGE") {
if (!client.ip ~ purge) {
return (synth(405,"Not allowed."));
}
return (purge);
}
# Blocks which decide whether cache should be bypassed or not go here.
# Block 5a: Bypass the cache for .pagespeed. resource. PageSpeed has its own
# cache for these, and these could bloat up the caching layer.
if (req.url ~ "\.pagespeed\.([a-z]\.)?[a-z]{2}\.[^.]{10}\.[^.]+") {
# Skip the cache for .pagespeed. resource. PageSpeed has its own
# cache for these, and these could bloat up the caching layer.
return (pass);
}
# Block 5b: Only cache responses to clients that support gzip. Most clients
# do, and the cache holds much more if it stores gzipped responses.
if (req.http.Accept-Encoding !~ "gzip") {
return (pass);
}
}
# Block 6: Mark HTML uncacheable by caches beyond our control.
sub vcl_backend_response {
if (beresp.http.Content-Type ~ "text/html") {
# Hide the upstream cache control header.
unset beresp.http.Cache-Control;
# Add no-cache Cache-Control header for html.
set beresp.http.Cache-Control = "no-cache, max-age=0";
}
return (deliver);
}
sub vcl_hit {
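# Send 5% of the HITs to the backend for instrumentation.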
if (std.random(0, 100) < 5) {
set req.http.PS-ShouldBeacon = req.http.ps_should_beacon_key_value;
return (pass);
}
}
sub vcl_miss {
# Send 25% of the MISSes to the backend for instrumentation.
if (std.random(0, 100) < 25) {
set req.http.PS-ShouldBeacon = req.http.ps_should_beacon_key_value;
return (pass);
}
}
# Block 7: Add a header for identifying cache hits/misses.
sub vcl_deliver {
if (obj.hits > 0) {
set resp.http.X-Cache = "HIT";
} else {
set resp.http.X-Cache = "MISS";
}
}
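With the X-Cache header added in Block 7, I spot-check hits and misses from the command line, and the purge ACL can be exercised from the Varnish host itself (the URLs below are just examples):
curl -sI http://origindomain.com/ | grep -iE "^(x-cache|x-varnish|age):"
curl -X PURGE -H "Host: origindomain.com" http://127.0.0.1/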
And the complete PageSpeed configuration:
pagespeed on;
# Needs to exist and be writable by nginx. Use tmpfs for best performance.
pagespeed FileCachePath /tmp/pagespeed;
location ~ "\.pagespeed\.([a-z]\.)?[a-z]{2}\.[^.]{10}\.[^.]+" {
add_header "" "";
}
location ~ "^/pagespeed_static/" { }
location ~ "^/ngx_pagespeed_beacon$" { }
#pagespeed ModifyCachingHeaders on;
pagespeed RewriteLevel CoreFilters;
pagespeed EnableFilters rewrite_domains;
#Testing CDN
#Authorize Domains
pagespeed Domain origindomain.com;
pagespeed Domain cdn.domain.com;
#Rewrite to CDN
pagespeed MapRewriteDomain cdn.domain.com origindomain.com;
#Disallow wp-admin / login
pagespeed Disallow "*/wp-admin/*";
pagespeed Disallow "*/checkout/*";
The response headers returned when requesting the home page:
Accept-Ranges: bytes
Access-Control-Allow-Headers: X-Requested-With
Access-Control-Allow-Methods: GET, HEAD, OPTIONS
Access-Control-Allow-Origin: *
Age: 0
Cache-Control: no-cache, max-age=0
Connection: keep-alive
Content-Encoding: gzip
Content-Length: 8473
Content-Type: text/html; charset=UTF-8
Date: Wed, 15 Oct 2014 08:38:49 GMT
Link: <http://origindomain.com/>; rel=shortlink
Server: nginx/1.7.6
Set-Cookie: _icl_current_language=en; expires=Thu, 16-Oct-2014 08:38:49 GMT; Max-Age=86400; path=/
Vary: Accept-Encoding
Via: 1.1 varnish-v4
X-Cache: MISS
X-Page-Speed: 1.9.32.1-4238
X-Pingback: http://origindomain.com/xmlrpc.php
X-Powered-By: PHP/7.0.0-dev
X-Varnish: 393416
It looks like it never hits the cache while still serving the optimized version of the site. Early in my testing it did hit occasionally, but I could not tell whether it was serving an optimized version or just a regular one.
So whether it is a miss or a pass, Nginx serves the PageSpeed-optimized site, but Varnish has no effect at all.