I’ve been working on a web app using Django, and I’m curious if there is a way to schedule a job to run periodically.
Basically I just want to run through the database and make some calculations/updates on an automatic, regular basis, but I can’t seem to find any documentation on doing this.
Does anyone know how to set this up?
To clarify: I know I can set up a cron job to do this, but I’m curious if there is some feature in Django that provides this functionality. I’d like people to be able to deploy this app themselves without having to do much config (preferably zero).
I’ve considered triggering these actions “retroactively” by simply checking if a job should have been run since the last time a request was sent to the site, but I’m hoping for something a bit cleaner.
One solution I have employed is this: 1) Write a custom management command that performs the calculations/updates, and 2) use cron (on Linux) or at (on Windows) to run that command at the required times.
This is a simple solution that doesn’t require installing a heavy AMQP stack. There are, however, nice advantages to using something like Celery, mentioned in the other answers; in particular, with Celery it is nice not to have to spread your application logic out into crontab files. Still, the cron solution works quite nicely for a small-to-medium-sized application where you don’t want a lot of external dependencies.
EDIT:
In later versions of Windows the at command is deprecated (Windows 8, Server 2012 and above). You can use schtasks.exe instead.
**** UPDATE ****
This is the new link to the Django documentation for writing a custom management command.
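A minimal sketch of such a management command; the command name, app name, and paths below are illustrative, not from the original answer:
# yourapp/management/commands/update_stats.py
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Run the periodic database calculations/updates."

    def handle(self, *args, **options):
        # Put the periodic work here, e.g. loop over your models and update them.
        self.stdout.write("update_stats finished")
You would then schedule it from cron with an entry along these lines:
*/30 * * * * /path/to/venv/bin/python /path/to/project/manage.py update_stats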
Celery is a distributed task queue, built on AMQP (RabbitMQ). It also handles periodic tasks in a cron-like fashion (see periodic tasks). Depending on your app, it might be worth a gander.
Celery is pretty easy to set up with django (docs), and periodic tasks will actually skip missed tasks in case of a downtime. Celery also has built-in retry mechanisms, in case a task fails.
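As a rough illustration, a Celery beat schedule defined in your project’s celery.py might look like this (the app name, broker URL, and task path are assumptions, not from the answer):
from celery import Celery
from celery.schedules import crontab

app = Celery('myproj', broker='amqp://localhost')

app.conf.beat_schedule = {
    'nightly-db-update': {
        'task': 'myapp.tasks.update_stats',   # dotted path to a registered task
        'schedule': crontab(hour=3, minute=0),  # run every day at 03:00
    },
}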
That does the job. Each cron is modeled as a class (so it’s all OO), each cron runs at a different frequency, and we make sure the same cron type doesn’t run in parallel (in case the crons themselves take longer to run than their frequency!).
Look at Django Poor Man’s Cron, which is a Django app that makes use of spambots, search engine indexing robots, and the like to run scheduled tasks at approximately regular intervals.
Brian Neal’s suggestion of running management commands via cron works well, but if you’re looking for something a little more robust (yet not as elaborate as Celery) I’d look into a library like Kronos:
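A hedged sketch of what registering a task with Kronos can look like, based on django-kronos’s documented usage (the cron expression and function are illustrative); registered tasks are installed into the crontab with ./manage.py installtasks:
# app/cron.py
import kronos

@kronos.register('0 0 * * *')  # every day at midnight
def nightly_update():
    # periodic work goes here
    print('running nightly update')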
RabbitMQ and Celery have more features and task handling capabilities than Cron. If task failure isn’t an issue, and you think you will handle broken tasks in the next call, then Cron is sufficient.
Celery & AMQP will let you handle the broken task, and it will get executed again by another worker (Celery workers listen for the next task to work on), until the task’s max_retries attribute is reached. You can even invoke tasks on failure, like logging the failure, or sending an email to the admin once the max_retries has been reached.
And you can distribute Celery and AMQP servers when you need to scale your application.
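For illustration, a Celery task using the retry mechanism described above might look like this (the task name and retry values are assumptions, and a configured Celery app is presumed):
from celery import shared_task

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def update_stats(self):
    try:
        # Illustrative periodic work; replace with your real updates.
        from django.contrib.auth.models import User
        print(User.objects.count())
    except Exception as exc:
        # Re-queue the task; after max_retries Celery gives up and re-raises.
        raise self.retry(exc=exc)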
Although not part of Django, Airflow is a more recent project (as of 2016) that is useful for task management.
Airflow is a workflow automation and scheduling system that can be used to author and manage data pipelines. A web-based UI provides the developer with a range of options for managing and viewing these pipelines.
Airflow is written in Python and is built using Flask.
Airflow was created by Maxime Beauchemin at Airbnb and open sourced in the spring of 2015. It joined the Apache Software Foundation’s incubation program in the winter of 2016. Here is the Git project page and some additional background information.
Put the following at the top of your cron.py file:
#!/usr/bin/python
import os, sys
sys.path.append('/path/to/') # the parent directory of the project
sys.path.append('/path/to/project') # these lines only needed if not on path
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproj.settings'
# imports and code below
After that part of the code, I can write anything just like in my views.py :)
#######################################
import os, sys
sys.path.append('/home/administrator/development/store')
os.environ['DJANGO_SETTINGS_MODULE'] = 'store.settings'
from django.core.management import setup_environ  # note: setup_environ was removed in Django 1.6; on newer versions set DJANGO_SETTINGS_MODULE and call django.setup() instead
from store import settings
setup_environ(settings)
#######################################
You should definitely check out django-q!
It requires no additional configuration and has quite possibly everything needed to handle any production issues on commercial projects.
It’s actively developed and integrates very well with django, django ORM, mongo, redis. Here is my configuration:
# django-q
# -------------------------------------------------------------------------
# See: http://django-q.readthedocs.io/en/latest/configure.html
Q_CLUSTER = {
# Match recommended settings from docs.
'name': 'DjangoORM',
'workers': 4,
'queue_limit': 50,
'bulk': 10,
'orm': 'default',
# Custom Settings
# ---------------
# Limit the amount of successful tasks saved to Django.
'save_limit': 10000,
# See https://github.com/Koed00/django-q/issues/110.
'catch_up': False,
# Number of seconds a worker can spend on a task before it's terminated.
'timeout': 60 * 5,
# Number of seconds a broker will wait for a cluster to finish a task before presenting it again. This needs to be
# longer than `timeout`, otherwise the same task will be processed multiple times.
'retry': 60 * 6,
# Whether to force all async() calls to be run with sync=True (making them synchronous).
'sync': False,
# Redirect worker exceptions directly to Sentry error reporter.
'error_reporter': {
'sentry': RAVEN_CONFIG,
},
}
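For completeness, a hedged usage sketch showing how a recurring task can be scheduled with django-q (the dotted function path is an illustrative assumption, not from the original answer):
from django_q.tasks import schedule
from django_q.models import Schedule

schedule(
    'myapp.tasks.update_stats',    # any importable function
    schedule_type=Schedule.DAILY,  # run once a day
)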
Use Django APScheduler for scheduled jobs. Advanced Python Scheduler (APScheduler) is a Python library that lets you schedule your Python code to be executed later, either just once or periodically. You can add new jobs or remove old ones on the fly as you please.
note: I’m the author of this library
Install APScheduler
pip install apscheduler
Create the function to be called, in a file named scheduler_jobs.py:
def FirstCronTest():
    print("")
    print("I am executed..!")
Configuring the scheduler: make an execute.py file and add the code below.
from apscheduler.schedulers.background import BackgroundScheduler
scheduler = BackgroundScheduler()
Your functions go here; the scheduler functions written in scheduler_jobs are registered with the scheduler, as shown in the sketch below.
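A plausible completion of execute.py, assuming FirstCronTest should run on a fixed interval (the 10-second interval is illustrative, not from the original answer):
from apscheduler.schedulers.background import BackgroundScheduler

import scheduler_jobs

scheduler = BackgroundScheduler()
# Register FirstCronTest to run every 10 seconds and start the scheduler in the background.
scheduler.add_job(scheduler_jobs.FirstCronTest, 'interval', seconds=10)
scheduler.start()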
It has great documentation and is easy to grok. Windows support is lacking, because Windows does not support process forking. But it works fine if you create your dev environment using the Windows Subsystem for Linux.
I use Celery for my periodic tasks. First, you need to install it as follows:
pip install django-celery
Don’t forget to register django-celery in your settings, and then you can do something like this:
from celery import task
from celery.decorators import periodic_task
from celery.task.schedules import crontab
from celery.utils.log import get_task_logger
@periodic_task(run_every=crontab(minute="0", hour="23"))
def do_every_midnight():
    # your code
    pass
I am not sure whether this will be useful for anyone, but since I had to let other users of the system schedule jobs without giving them access to the actual (Windows) Task Scheduler on the server, I created this reusable app.
Please note that users have access to one shared folder on the server where they can create the required command/task/.bat file. This task can then be scheduled using this app.
For simple dockerized projects, I could not really see any existing answer that fit.
So I wrote a very barebones solution that needs no external libraries or triggers and runs on its own. No external OS cron is needed, and it should work in every environment.
It works by adding a middleware: middleware.py
import threading


def should_run(name, seconds_interval):
    from application.models import CronJob
    from django.utils.timezone import now

    try:
        c = CronJob.objects.get(name=name)
    except CronJob.DoesNotExist:
        CronJob(name=name, last_ran=now()).save()
        return True
    if (now() - c.last_ran).total_seconds() >= seconds_interval:
        c.last_ran = now()
        c.save()
        return True
    return False


class CronTask:
    def __init__(self, name, seconds_interval, function):
        self.name = name
        self.seconds_interval = seconds_interval
        self.function = function


def cron_worker(*_):
    if not should_run("main", 60):
        return

    # customize this part:
    from application.models import Event
    tasks = [
        CronTask("events", 60 * 30, Event.clean_stale_objects),
        # ...
    ]

    for task in tasks:
        if should_run(task.name, task.seconds_interval):
            task.function()


def cron_middleware(get_response):
    def middleware(request):
        response = get_response(request)
        threading.Thread(target=cron_worker).start()
        return response

    return middleware
models/cron.py:
from django.db import models


class CronJob(models.Model):
    name = models.CharField(max_length=10, primary_key=True)
    last_ran = models.DateTimeField()
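A hedged sketch of how the pieces above might be wired up, assuming the middleware lives in application/middleware.py (adjust the dotted path to your project layout); you would also run the usual makemigrations/migrate so the CronJob table exists:
# settings.py
MIDDLEWARE = [
    # ... Django's default middleware entries ...
    'application.middleware.cron_middleware',
]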
A simple way is to write a custom management command (see the Django documentation) and execute it using a cron job on Linux. However, I would highly recommend using a message broker like RabbitMQ coupled with Celery. Maybe you can have a look at this tutorial.
I’m building a web application with Django. The reasons I chose Django were:
I wanted to work with free/open-source tools.
I like Python and feel it’s a long-term language, whereas regarding Ruby I wasn’t sure, and PHP seemed like a huge hassle to learn.
I’m building a prototype for an idea and wasn’t thinking too much about the future. Development speed was the main factor, and I already knew Python.
I knew the migration to Google App Engine would be easier should I choose to do so in the future.
I heard Django was “nice”.
Now that I’m getting closer to thinking about publishing my work, I’m starting to be concerned about scale. The only information I found about Django’s scaling capabilities is provided by the Django team (I’m not saying anything to disregard them, but this is clearly not objective information…).
My questions:
What’s the “largest” site that’s built on Django today? (I measure size mostly by user traffic)
Can Django deal with 100,000 users daily, each visiting the site for a couple of hours?
“What are the largest sites built on Django today?”
There isn’t any single place that collects information about traffic on Django built sites, so I’ll have to take a stab at it using data from various locations. First, we have a list of Django sites on the front page of the main Django project page and then a list of Django built sites at djangosites.org. Going through the lists and picking some that I know have decent traffic we see:
pownce.com (no longer active): alexa rank about 65k.
Mike Malone of Pownce, in his EuroDjangoCon presentation on Scaling Django Web Apps says “hundreds of hits per second”. This is a very good presentation on how to scale Django, and makes some good points including (current) shortcomings in Django scalability.
HP had a site built with Django 1.5: ePrint Center. However, as of November 2015 the entire website had been migrated and this link is just a redirect. The website was a worldwide service handling subscriptions to Instant Ink and related services HP offered.
“Can Django deal with 100,000 users daily, each visiting the site for a couple of hours?”
Yes, see above.
“Could a site like Stack Overflow run on Django?”
My gut feeling is yes but, as others answered and Mike Malone mentions in his presentation, database design is critical. Strong proof might also be found at www.cnprog.com if we can find any reliable traffic stats. Anyway, it’s not just something that will happen by throwing together a bunch of Django models :)
There are, of course, many more sites and bloggers of interest, but I have got to stop somewhere!
We’re doing load testing now. We think we can support 240 concurrent requests (a sustained rate of 120 hits per second 24×7) without any significant degradation in the server performance. That would be 432,000 hits per hour. Response times aren’t small (our transactions are large) but there’s no degradation from our baseline performance as the load increases.
We’re using Apache front-ending Django and MySQL. The OS is Red Hat Enterprise Linux (RHEL). 64-bit. We use mod_wsgi in daemon mode for Django. We’ve done no cache or database optimization other than to accept the defaults.
We’re all in one VM on a 64-bit Dell with (I think) 32Gb RAM.
Since performance is almost the same for 20 or 200 concurrent users, we don’t need to spend huge amounts of time “tweaking”. Instead we simply need to keep our base performance up through ordinary SSL performance improvements, ordinary database design and implementation (indexing, etc.), ordinary firewall performance improvements, etc.
What we do observe is our load-test laptops struggling under the insane workload of 15 processes, each running 16 concurrent request threads.
What’s the “largest” site that’s built on Django today? (I measure size mostly by user traffic)
In the US, it was Mahalo. I’m told they handle roughly 10 million uniques a month. Now, in 2019, Mahalo is powered by Ruby on Rails.
Abroad, the Globo network (a network of news, sports, and entertainment sites in Brazil); Alexa ranks them in to top 100 globally (around 80th currently).
Other notable Django users include PBS, National Geographic, Discovery, NASA (actually a number of different divisions within NASA), and the Library of Congress.
Can Django deal with 100k users daily, each visiting the site for a couple of hours?
Yes — but only if you’ve written your application right, and if you’ve got enough hardware. Django’s not a magic bullet.
Could a site like StackOverflow run on Django?
Yes (but see above).
Technology-wise, easily: see soclone for one attempt. Traffic-wise, Compete pegs Stack Overflow at under 1 million uniques per month. I can name at least a dozen Django sites with more traffic than SO.
Scaling web apps is not about web frameworks or languages; it’s about your architecture.
It’s about how you handle your browser cache, your database cache, how you use non-standard persistence providers (like CouchDB), how well-tuned your database is, and a lot of other stuff…
You should check the DjangoCon 2008 Keynote, delivered by Cal Henderson, titled “Why I hate Django” where he pretty much goes over everything Django is missing that you might want to do in a high traffic website. At the end of the day you have to take this all with an open mind because it is perfectly possible to write Django apps that scale, but I thought it was a good presentation and relevant to your question.
The largest django site I know of is the Washington Post, which would certainly indicate that it can scale well.
Good design decisions probably have a bigger performance impact than anything else. Twitter is often cited as a site which embodies the performance issues with another dynamic interpreted language based web framework, Ruby on Rails – yet Twitter engineers have stated that the framework isn’t as much an issue as some of the database design choices they made early on.
Django works very nicely with memcached and provides some classes for managing the cache, which is where you would resolve the majority of your performance issues. What you deliver on the wire is, in reality, almost more important than your backend; using a tool like YSlow is critical for a high-performance web application. You can always throw more hardware at your backend, but you can’t change your users’ bandwidth.
I was at the EuroDjangoCon conference the other week, and this was the subject of a couple of talks – including from the founders of what was the largest Django-based site, Pownce (slides from one talk here). The main message is that it’s not Django you have to worry about, but things like proper caching, load balancing, database optimisation, etc.
Django actually has hooks for most of those things – caching, in particular, is made very easy.
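As a rough illustration of those hooks, per-view caching with a memcached backend can be set up along these lines (the backend class, location, and timeout are illustrative, and the memcached backend name varies across Django versions):
# settings.py
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.PyMemcacheCache',
        'LOCATION': '127.0.0.1:11211',
    }
}

# views.py
from django.http import HttpResponse
from django.views.decorators.cache import cache_page

@cache_page(60 * 15)  # cache the rendered response for 15 minutes
def expensive_view(request):
    return HttpResponse("expensive result")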
I’m sure you’re looking for a more solid answer, but the most obvious objective validation I can think of is that Google pushes Django for use with its App Engine framework. If anybody knows about and deals with scalability on a regular basis, it’s Google. From what I’ve read, the most limiting factor seems to be the database back-end, which is why Google uses their own…
It’s not uncommon to hear people say “Django doesn’t scale”. Depending on how you look at it, the statement is either completely true or patently false. Django, on its own, doesn’t scale.
The same can be said of Ruby on Rails, Flask, PHP, or any other language used by a database-driven dynamic website.
The good news, however, is that Django interacts beautifully with a suite of caching and load-balancing tools that will allow it to scale to as much traffic as you can throw at it. Contrary to what you may have read online, it can do so without replacing core components often labeled as “too slow”, such as the database ORM or the template layer.
Disqus serves over 8 billion page views per month. Those are some huge numbers.
These teams have proven Django most certainly does scale.
Our experience here at Lincoln Loop backs it up.
We’ve built big Django sites capable of spending the day on the Reddit homepage without breaking a sweat.
Django’s scaling success stories are almost too numerous to list at this point.
It backs Disqus, Instagram, and Pinterest. Want some more proof? Instagram was able to sustain over 30 million users on Django with only 3 engineers (2 of whom had no back-end development experience).
The Washington Post’s website is a hugely popular online news source to accompany their daily paper. Its huge amount of views and traffic is easily handled by the Django web framework.
Washington Post - 52.2 million unique visitors (March, 2015)
The National Aeronautics and Space Administration’s official website is the place to find news, pictures, and videos about their ongoing space exploration. This Django website can easily handle huge amounts of views and traffic.
2 million visitors monthly
The Guardian is a British news and media website owned by the Guardian Media Group. It contains nearly all of the content of the newspapers The Guardian and The Observer. This huge data is handled by Django.
The Guardian (commenting system) - 41.6 million unique visitors (October 2014)
We all know YouTube as the place to upload cat videos and fails. As one of the most popular websites in existence, it provides us with endless hours of video entertainment. The Python programming language powers it and the features we love.
DropBox started the online document storing revolution that has become part of daily life. We now store almost everything in the cloud. Dropbox allows us to store, sync, and share almost anything using the power of Python.
Quora is the number one place online to ask a question and receive answers from a community of individuals. On their Python website relevant results are answered, edited, and organized by these community members.
A majority of the code for Bitly’s URL shortening service and analytics is built with Python. Their service can handle hundreds of millions of events per day.
Reddit is known as the front page of the internet. It is the place online to find information or entertainment based on thousands of different categories. Posts and links are user generated and are promoted to the top through votes. Many of Reddit’s capabilities rely on Python for their functionality.
Hipmunk is an online consumer travel site that compares the top travel sites to find you the best deals. This Python website’s tools allow you to find the cheapest hotels and flights for your destination.
Yes it can. It could be Django with Python or Ruby on Rails. It will still scale.
There are a few different techniques. First, caching is not scaling. You could have several application servers balanced with nginx as the front end, in addition to hardware load balancer(s).
To scale on the database side, you can go pretty far with read slaves in MySQL/PostgreSQL if you go the RDBMS way.
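A hedged sketch of what read/write splitting can look like with Django’s multi-database support; the hostnames, database names, and router module path are illustrative assumptions, not from the answer:
# settings.py
DATABASES = {
    'default': {  # primary, takes all writes
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'appdb',
        'HOST': 'db-primary.internal',
    },
    'replica': {  # read slave
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'appdb',
        'HOST': 'db-replica.internal',
    },
}
DATABASE_ROUTERS = ['myproj.routers.ReplicaRouter']

# myproj/routers.py
class ReplicaRouter:
    def db_for_read(self, model, **hints):
        return 'replica'

    def db_for_write(self, model, **hints):
        return 'default'

    def allow_relation(self, obj1, obj2, **hints):
        return True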
Good examples of heavy-traffic websites built with Django include the ones mentioned above, such as Disqus, Instagram, and Pinterest.
It’s a little old, but someone from the LA Times gave a basic overview of why they went with Django.
The Onion’s AV Club was recently moved from (I think Drupal) to Django.
I imagine a number of these sites probably get well over 100k+ hits per day. Django can certainly do 100k hits/day and more, but YMMV in getting your particular site there, depending on what you’re building.
There are caching options at the Django level (for example, caching querysets and views in memcached can work wonders) and beyond (upstream caches like Squid). Database server specifications will also be a factor (and usually the place to splurge), as is how well you’ve tuned it. Don’t assume, for example, that Django’s going to set up indexes properly. Don’t assume that the default PostgreSQL or MySQL configuration is the right one.
Furthermore, you always have the option of having multiple application servers running Django if that is the slow point, with a software or hardware load balancer in front.
Finally, are you serving static content on the same server as Django? Are you using Apache or something like nginx or lighttpd? Can you afford to use a CDN for static content? These are things to think about, but it’s all very speculative. 100k hits/day isn’t the only variable: how much do you want to spend? How much expertise do you have managing all these components? How much time do you have to pull it all together?
I have been using Django for over a year now, and am very impressed with how it manages to combine modularity, scalability and speed of development. Like with any technology, it comes with a learning curve. However, this learning curve is made a lot less steep by the excellent documentation from the Django community. Django has been able to handle everything I have thrown at it really well. It looks like it will be able to scale well into the future.
BidRodeo Penny Auctions is a moderately sized Django powered website. It is a very dynamic website and does handle a good number of page views a day.
Note that if you’re expecting 100K users per day, that are active for hours at a time (meaning max of 20K+ concurrent users), you’re going to need A LOT of servers. SO has ~15,000 registered users, and most of them are probably not active daily. While the bulk of traffic comes from unregistered users, I’m guessing that very few of them stay on the site more than a couple minutes (i.e. they follow google search results then leave).
For that volume, expect at least 30 servers … which is still a rather heavy 1,000 concurrent users per server.
Can Django deal with 100,000 users daily, each visiting the site for a couple of hours?
Yes, but you need proper architecture and database design, use of caching, load balancers, and multiple servers or nodes.
Could a site like Stack Overflow run on Django?
Yes, you just need to follow the approach outlined in the answer to the second question.
If you have a site with some static content, then putting a Varnish server in front will dramatically increase your performance. Even a single box can then easily spit out 100 Mbit/s of traffic.
Note that with dynamic content, using something like Varnish becomes a lot more tricky.
My experience with Django is minimal but I do remember in The Django Book they have a chapter where they interview people running some of the larger Django applications. Here is a link. I guess it could provide some insights.
It says curse.com is one of the largest Django applications with around 60-90 million page views in a month.
I develop high-traffic sites using Django for the national broadcaster in Ireland. It works well for us. Developing a high-performance site is about more than just choosing a framework. A framework will only be one part of a system that is as strong as its weakest link. Using the latest framework ‘X’ won’t solve your performance issues if the problem is slow database queries or a badly configured server or network.
Even though there have been a lot of great answers here, I just feel like pointing out that nobody has put emphasis on:
It depends on the application
If your application is light on writes, as in you are reading a lot more data from the DB than you are writing, then scaling Django should be fairly trivial; heck, it comes with some fairly decent output/view caching straight out of the box. Make use of that with, say, Redis as a cache provider, put a load balancer in front of it, spin up n instances, and you should be able to deal with a VERY large amount of traffic.
Now, if you have to do thousands of complex writes a second? That’s a different story. Is Django going to be a bad choice? Well, not necessarily; it really depends on how you architect your solution and on what your requirements are.
If you want to use open source, then there are many options for you. But Python is the best among them, as it has many libraries and a super awesome community.
These are a few reasons which might change your mind:
Python is very good, but it is an interpreted language, which makes it slow. However, there are many accelerators and caching services that partly solve this problem.
If you are thinking about rapid development, then Ruby on Rails is the best of all. The main motto of this (RoR) framework is to give developers a comfortable experience. If you compare Ruby and Python, both have nearly the same syntax.
Google App Engine is a very good service, but it will bind you to its scope; you don’t get a chance to experiment with new things. Instead, you can use a DigitalOcean droplet, which costs only $5/month for the simplest tier. Heroku is another free service where you can deploy your product.
Yes! Yes! What you heard is totally correct, but here are some examples of large sites that use other technologies:
Rails: Github, Twitter(previously), Shopify, Airbnb, Slideshare, Heroku etc.
PHP: Facebook, Wikipedia, Flickr, Yahoo, Tumblr, Mailchimp, etc.
The conclusion is that a framework or language won’t do everything for you. Better architecture, design, and strategy will give you a scalable website. Instagram is the biggest example: that small team manages a huge amount of data. Here is one blog about its architecture; you must read it.
I don’t think the issue is really about Django scaling.
I really suggest you look into your architecture; that is what will help you with your scaling needs. If you get that wrong, it doesn’t matter how well Django performs. Performance != scale. You can have a system with amazing performance that does not scale, and vice versa.
Is your application database bound? If it is, then your scaling issues lie there as well. How are you planning on interacting with the database from Django? What happens when your database cannot process requests as fast as Django accepts them? What happens when your data outgrows one physical machine? You need to account for how you plan on dealing with those circumstances.
Moreover, what happens when your traffic outgrows one app server? How you handle sessions in this case can be tricky; more often than not you will probably require a shared-nothing architecture. Again, that depends on your application.
In short, language is not what determines scale; a language affects performance (and again, depending on your application, different languages perform differently). It is your design and architecture that make scaling a reality.
I hope it helps, would be glad to help further if you have questions.
Spreading the tasks evenly (in short, optimizing each and every aspect, including DBs, files, images, CSS, etc.) and balancing the load across several other resources is necessary once your site/application starts growing, or else you make more room for it to grow. Implementing technologies like CDNs and the cloud is a must for huge sites. Just developing and tweaking an application won’t give you complete satisfaction; other components also play an important role.