I have a Django application and I'm having trouble configuring my server for high traffic. It is very slow even with few accesses. Here is my uWSGI setup:
[uwsgi]
chdir = /home/meuser/public/meusite.com.br/public_html/
wsgi-file = meusite/wsgi.py
processes = 4
max-requests = 6000
chmod-socket = 666
master = True
vacuum = True
socket = /tmp/uwsgi.sock
Even with Varnish and Nginx in front of it, the site is very slow.
How can I fix this?
I have one VPS with 16 GB of RAM, and another with 4 GB dedicated exclusively to the database.
See the settings:
Varnish:
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv
{
    # if (req.restarts == 0) {
    #     if (req.http.x-forwarded-for) {
    #         set req.http.X-Forwarded-For =
    #             req.http.X-Forwarded-For + ", " + client.ip;
    #     } else {
    #         set req.http.X-Forwarded-For = client.ip;
    #     }
    # }

    remove req.http.X-Forwarded-For;
    set req.http.X-Forwarded-For = client.ip;

    # Do not cache example.com, the admin area,
    # logged-in users or POST requests
    if (req.http.host ~ "patoshoje.com.br" ||
        req.url ~ "^/rocha" ||
        req.url ~ "^/admin" ||
        req.http.Cookie ~ "sessionid" ||
        req.request == "POST")
    {
        return (pass);
    }

    # Don't allow cookies to affect cachability
    unset req.http.Cookie;

    # Set Grace Time to one hour
    set req.grace = 1h;
}
sub vcl_fetch
{
    # set bereq.http.X-Real-IP = client.ip;
    # set bereq.http.X-Forwarded-For = req.http.X-Forwarded-For;
    # set bereq.http.host = req.http.host;

    # Set the TTL for cache objects to three minutes
    set beresp.ttl = 3m;

    # Set Grace Time to one hour
    set beresp.grace = 1h;
}
Nginx:
upstream mysite {
    server unix:///tmp/uwsgi.sock;
}

server {
    listen 8080;
    server_name www.mysite.com.br mysite.com.br;

    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    access_log /home/mysite/public/mysite.com.br/log/access.log;
    error_log /home/mysite/public/mysite.com.br/log/error.log;

    root /home/mysite/public/mysite.com.br/public_html;
    charset utf-8;

    gzip on;
    gzip_disable "msie6";
    gzip_types text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript;
    gzip_vary on;

    keepalive_timeout 0;
    client_max_body_size 10m;
    large_client_header_buffers 4 16k;

    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /home/mysite/public/mysite.com.br/public_html/mysite/public/static/;
    }

    location /ckfinder {
        alias /home/mysite/public/mysite.com.br/public_html/ckfinder;
        expires 7d;
        add_header pragma public;
        add_header cache-control "public";
    }

    location /media {
        alias /home/mysite/public/mysite.com.br/public_html/mysite/public/media;
        expires 7d;
        add_header pragma public;
        add_header cache-control "public";
    }

    location /static {
        alias /home/mysite/public/mysite.com.br/public_html/mysite/public/static;
        expires 7d;
        add_header pragma public;
        add_header cache-control "public";
    }

    location /robots.txt {
        root /home/mysite/public/mysite.com.br/public_html/mysite/public/static/robots.txt;
        access_log off;
        log_not_found off;
    }

    location /favicon.ico {
        root /home/mysite/public/mysite.com.br/public_html/mysite/public/static/imgs/favicon.png;
        access_log off;
        log_not_found off;
    }

    location / {
        uwsgi_pass mysite;
        include /etc/nginx/uwsgi_params;
    }
}
High traffic, little access, and slow are relative; put numbers on them. Also post the settings of the other components involved, in this case Nginx and Varnish. If it is slow even with Varnish, the problem may be in the network, or Varnish may not be caching.
– tovmeod
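
Following that comment, one quick way to check whether Varnish is actually caching is to time a few repeated requests and look at the Age and X-Varnish response headers. The sketch below assumes Python 3 and uses the www.mysite.com.br host from the nginx config as a placeholder URL; it is only a diagnostic sketch, not part of the original setup.

import time
import urllib.request

# Placeholder URL taken from the nginx server_name above;
# point it at a real, cacheable page of the site being tested.
URL = "http://www.mysite.com.br/"

for i in range(3):
    start = time.monotonic()
    with urllib.request.urlopen(URL) as resp:
        resp.read()
        age = resp.headers.get("Age")
        x_varnish = resp.headers.get("X-Varnish")
    elapsed = time.monotonic() - start
    # On a Varnish cache hit, Age is normally greater than zero and
    # X-Varnish carries two transaction IDs; Age stuck at 0 (or missing)
    # on every repeat request suggests everything is being passed to
    # the uWSGI backend.
    print(f"request {i + 1}: {elapsed:.3f}s | Age: {age} | X-Varnish: {x_varnish}")
    time.sleep(1)

Note that the vcl_recv above always passes requests whose Host matches patoshoje.com.br, so a cache hit would only ever show up for a host that does not match that rule.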