Commit d6108022 authored by Jamesie Pic's avatar Jamesie Pic

Project modernization

- continuous integration with Travis CI,
- continuous deployment with OpenShift,
- add update_score cron,
- Sentry for exception tracking,
- continuous documentation build with Read the Docs (rtfd),
- configuration made 12factor-ish,
- make use of the new django-representatives version for database
  optimization.
parent 40829d7d
*.sqlite3
celerybeat-*
core/static/libs/*
deploy
# libs
static/libs
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
.dpl
db.sqlite
log/
The OpenShift `python` cartridge documentation can be found at:
http://openshift.github.io/documentation/oo_cartridge_guide.html#python
For information about the .openshift directory, consult the documentation:
http://openshift.github.io/documentation/oo_user_guide.html#the-openshift-directory
For information about action hooks, consult the documentation:
http://openshift.github.io/documentation/oo_user_guide.html#action-hooks
export OPENSHIFT_PYTHON_WSGI_APPLICATION=memopol/wsgi.py
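Note that the same WSGI entry point can also be configured from the client
side rather than in the repository; a sketch, assuming the rhc client's
"env" command (this mirrors the -e flag passed to rhc app-create in the
deployment documentation):

    rhc env set OPENSHIFT_PYTHON_WSGI_APPLICATION=memopol/wsgi.py -a yourappname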
#!/bin/bash
# This deploy hook gets executed after dependencies are resolved and the
# build hook has been run but before the application has been started back
# up again. This script gets executed directly, so it could be python, php,
# ruby, etc.
set -xe
source ${OPENSHIFT_HOMEDIR}app-root/runtime/dependencies/python/virtenv/bin/activate
cat ${OPENSHIFT_REPO_DIR}requirements.txt
pip install -U pip
pip install -r ${OPENSHIFT_REPO_DIR}requirements.txt
# We don't have sentry yet
# python ${OPENSHIFT_REPO_DIR}manage.py raven test
python ${OPENSHIFT_REPO_DIR}manage.py migrate --noinput
pushd ${OPENSHIFT_DATA_DIR}
if ! [ -d node ]; then
wget https://nodejs.org/dist/v4.2.2/node-v4.2.2-linux-x64.tar.gz
tar xvzf node-v4.2.2-linux-x64.tar.gz
ln -sfn node-v4.2.2-linux-x64 node
fi
popd
pushd ${OPENSHIFT_REPO_DIR}
if [ -f ${OPENSHIFT_DATA_DIR}sentry ]; then
pip install raven
./manage.py raven test
else
echo ${OPENSHIFT_DATA_DIR}sentry does not exist, not setting up raven.
fi
PATH="${OPENSHIFT_DATA_DIR}node/bin:$PATH"
HOME=$OPENSHIFT_DATA_DIR
export CI=true
npm install bower
npm install
node_modules/.bin/bower install
node_modules/gulp/bin/gulp.js less
mkdir -p wsgi
./manage.py collectstatic --noinput
popd
mkdir -p ${OPENSHIFT_DATA_DIR}media
mkdir -p ${OPENSHIFT_REPO_DIR}wsgi/static/media
ln -sf ${OPENSHIFT_DATA_DIR}media ${OPENSHIFT_REPO_DIR}wsgi/static/media
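The raven/Sentry part of this hook is opt-in: it only runs when a file named
"sentry" exists in the data directory. A sketch of enabling it, reusing the
same kind of rhc ssh invocation shown in the deployment documentation
(yourappname is a placeholder):

    rhc ssh -a yourappname 'touch ${OPENSHIFT_DATA_DIR}sentry'

On the next deployment the hook will then install raven and run the test
command.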
Run scripts or jobs on a periodic basis
=======================================
Any scripts or jobs added to the minutely, hourly, daily, weekly or monthly
directories will be run on a scheduled basis (frequency is as indicated by the
name of the directory) using run-parts.
run-parts ignores any files that are hidden or dotfiles (.*) or backup
files (*~ or *,) or named *.{rpmsave,rpmorig,rpmnew,swp,cfsaved}
The presence of two specially named files jobs.deny and jobs.allow controls
how run-parts executes your scripts/jobs.
jobs.deny ===> Prevents specific scripts or jobs from being executed.
jobs.allow ===> Only execute the named scripts or jobs (all other/non-named
scripts that exist in this directory are ignored).
The principles of jobs.deny and jobs.allow are the same as those of cron.deny
and cron.allow and are described in detail at:
http://docs.redhat.com/docs/en-US/Red_Hat_Enterprise_Linux/6/html/Deployment_Guide/ch-Automating_System_Tasks.html#s2-autotasks-cron-access
See: man crontab or the above link for more details and see the weekly/
directory for an example.
PLEASE NOTE: The Cron cartridge must be installed in order to run the configured jobs.
For more information about cron, consult the documentation:
http://openshift.github.io/documentation/oo_cartridge_guide.html#cron
http://openshift.github.io/documentation/oo_user_guide.html#cron
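The frequencies listed above map directly onto directories under
.openshift/cron/; a sketch of the expected layout (the exact set of
directories present in this repository may differ):

    ls .openshift/cron/
    # README.cron  daily  hourly  minutely  monthly  weekly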
#!/bin/bash
set -x
cmd=$1
cd $OPENSHIFT_REPO_DIR
export CLEAN=1
nohup bin/update_all > $OPENSHIFT_LOG_DIR/update_all.log 2>&1 &
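Because the job detaches with nohup and redirects its output, its progress can
be followed from an SSH session on the gear; a minimal sketch using the log
path from the redirection above:

    tail -f $OPENSHIFT_LOG_DIR/update_all.log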
Run scripts or jobs on a weekly basis
=====================================
Any scripts or jobs added to this directory will be run on a scheduled basis
(weekly) using run-parts.
run-parts ignores any files that are hidden or dotfiles (.*) or backup
files (*~ or *,) or named *.{rpmsave,rpmorig,rpmnew,swp,cfsaved} and handles
the files named jobs.deny and jobs.allow specially.
In this specific example, the chronograph script is the only script or job file
executed on a weekly basis (because it is white-listed in jobs.allow). The
README and chrono.dat files are ignored, either because they are black-listed
in jobs.deny or because they are NOT white-listed in the jobs.allow file.
For more details, please see ../README.cron file.
Time And Relative D...n In Execution (Open)Shift!
#!/bin/bash
echo "`date`: `cat $(dirname \"$0\")/chrono.dat`"
#
# Script or job files listed in here (one entry per line) will be
# executed on a weekly-basis.
#
# Example: The chronograph script will be executed weekly but the README
# and chrono.dat files in this directory will be ignored.
#
# The README file is actually ignored due to the entry in the
# jobs.deny which is checked before jobs.allow (this file).
#
chronograph
#
# Any script or job files listed in here (one entry per line) will NOT be
# executed (read as ignored by run-parts).
#
README
For information about markers, consult the documentation:
http://openshift.github.io/documentation/oo_user_guide.html#markers
sudo: false
env:
matrix:
- DEBUG=True
language: python
python:
- "2.7"
- '2.7'
install:
- pip install django
- pip install -U pip
- pip install -r requirements.txt
- cp memopol/config.json.sample memopol/config.json
before_script:
- npm install -g bower
- bower install
- npm install
script:
- ./manage.py migrate
- node_modules/gulp/bin/gulp.js less
deploy:
- provider: openshift
user: jamespic@gmail.com
password:
secure: W7hQDKAtmpOfwLjBuss6NEKqPSrRhsbgH8a8eV+/Oo6HZxMi1mbNFSi+6WRNSs3Cil0ZZV+awoqC61jIzV4oTwEYcy5bv9NWNSY1QO34DECMS5sY00wA0zKhkdsdTr9Pc3TLRp1cw6x2KNCF356FKZojFTRbjtfJ79rqBc5k5ww=
app: dev
domain: memopol
skip_cleanup: true
deployment_branch: pr
on:
repo: political-memory/political_memory
branch: pr
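The secure value above is the OpenShift password encrypted against this
repository's Travis key. A sketch of how such a value can be regenerated,
assuming the Travis CLI gem is installed (the password shown is a
placeholder):

    travis encrypt "your-openshift-password" --add deploy.password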
if [ -n "$OPENSHIFT_HOMEDIR" ]; then
source ${OPENSHIFT_HOMEDIR}app-root/runtime/dependencies/python/virtenv/bin/activate
fi
function pipe_download_to_command() {
if [ -n "$OPENSHIFT_DATA_DIR" ]; then
cd $OPENSHIFT_DATA_DIR
fi
[ -n "$CLEAN" ] && rm -rf $1
[ -f "$1" ] || wget http://parltrack.euwiki.org/dumps/$1 || exit 1
if [ -n "$OPENSHIFT_REPO_DIR" ]; then
cd $OPENSHIFT_REPO_DIR
fi
export DJANGO_SETTINGS_MODULE=memopol.settings
unxz -c ${OPENSHIFT_DATA_DIR}$1 | $2
[ -n "$CLEAN" ] && rm -rf $1
}
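A minimal usage sketch, mirroring the import scripts below: the first argument
is the name of a parltrack dump, the second the command the decompressed JSON
is piped into:

    source ${OPENSHIFT_REPO_DIR}bin/lib.sh
    pipe_download_to_command ep_meps_current.json.xz parltrack_import_representatives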
#!/bin/bash
bin/update_representatives
# grace time for pg
sleep 120
bin/update_dossiers
sleep 120
bin/update_votes
sleep 120
bin/update_scores
#!/bin/bash
set -ex
source ${OPENSHIFT_REPO_DIR}bin/lib.sh
pipe_download_to_command ep_dossiers.json.xz parltrack_import_dossiers
#!/bin/bash
set -ex
source ${OPENSHIFT_REPO_DIR}bin/lib.sh
pipe_download_to_command ep_meps_current.json.xz parltrack_import_representatives
#!/bin/bash
set -ex
source ${OPENSHIFT_REPO_DIR}bin/lib.sh
[ -n "$OPENSHIFT_REPO_DIR" ] && cd $OPENSHIFT_REPO_DIR
./manage.py update_score
#!/bin/bash
set -ex
source ${OPENSHIFT_REPO_DIR}bin/lib.sh
pipe_download_to_command ep_votes.json.xz parltrack_import_votes
{% extends "contrib/admin/templates/admin/index.html" %}
{% block branding %}
<h1 id="site-name">Administration for Memopol</h1>
{% endblock %}
{% block content %}
{{ block.super }}
{% endblock %}
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest coverage gettext
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
clean:
rm -rf $(BUILDDIR)/*
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Memopol.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Memopol.qhc"
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
@echo "N.B. You won't be able to view it unless you put it in" \
"~/Library/Documentation/Help or install it in your application" \
"bundle."
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/Memopol"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Memopol"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
Deployment on OpenShift
~~~~~~~~~~~~~~~~~~~~~~~
OpenShift is an open-source Platform-as-a-Service (PaaS) by Red Hat. It is
also available as a hosted service known as "OpenShift Online", where the
first three gears (containers) are free.
Clone the repository
====================
You should fork the project on GitHub and use your fork's clone URL. For the
sake of the demo, we'll use the main repository URL::
$ git clone https://github.com/political-memory/political_memory.git
Cloning into 'political_memory'...
remote: Counting objects: 2516, done.
remote: Compressing objects: 100% (109/109), done.
remote: Total 2516 (delta 44), reused 0 (delta 0), pack-reused 2402
Receiving objects: 100% (2516/2516), 4.40 MiB | 79.00 KiB/s, done.
Resolving deltas: 100% (1103/1103), done.
Checking connectivity... done.
$ cd political_memory/
Create your own branch, e.g.::
$ git checkout -b yourbranch origin/pr
Branch yourbranch set up to track remote branch pr from origin.
Switched to a new branch 'yourbranch'
Create an app on OpenShift
==========================
To deploy the website, use a command like::
$ rhc app-create \
python-2.7 \
"http://cartreflect-claytondev.rhcloud.com/reflect?github=smarterclayton/openshift-redis-cart" \
cron-1.4 \
postgresql-9.2 \
-a yourappname \
-e OPENSHIFT_PYTHON_WSGI_APPLICATION=memopol/wsgi.py \
--from-code https://github.com/political-memory/political_memory.git
This should create an app on OpenShift. Other commands would deploy it all at
once, but in this tutorial we're going to see how to manage it partly by hand
for development.
Add the git remote created by OpenShift
=======================================
Add the git remote that OpenShift created for you; you can see it with
``rhc app-show``, e.g.::
$ rhc app-show -a yourappname
[snip]
Git URL: ssh://569f5cf500045f6a1839a0a4@yourappname-yourdomain.rhcloud.com/~/git/yourappname.git/
Initial Git URL: https://github.com/political-memory/political_memory.git
SSH: 569f5cf500045f6a1839a0a4@yourappname-yourdomain.rhcloud.com
[snip]
$ git remote add oo_yourappname ssh://569f5cf500045f6a1839a0a4@yourappname-yourdomain.rhcloud.com/~/git/yourappname.git/
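You can double-check that the remote was recorded correctly::

$ git remote -v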
Activate OpenShift's git post-receive hook
==========================================
Activate OpenShift's post-receive hook on your branch::
$ rhc app-configure -a yourappname --deployment-branch yourbranch
Deploy your branch
==================
OpenShift will deploy whenever it receives commits on the deployment branch;
to deploy, just run::
$ git push oo_yourappname yourbranch
If something goes wrong and you want to retry, use the ``rhc app-deploy``
command, e.g.::
$ rhc app-deploy yourbranch -a yourappname
Data provisioning
==================
To fill the representatives database table, either wait for the cron script
to run, or do it manually::
$ rhc ssh -a yourappname 'cd app-root/repo/ && bin/update_all'
OpenShift is fun: log in with ssh and look around if you're curious; if you
break something, you'll be able to recreate your app without much effort
anyway.
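If the import runs through the cron job instead, its output is redirected to
the gear's log directory (see the minutely cron hook); assuming the ``rhc``
client, the logs can be followed with, e.g.::

$ rhc tail -a yourappname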
Local development tutorial
~~~~~~~~~~~~~~~~~~~~~~~~~~
.. warning:: I reverse-engineered this from the source code I inherited; I
   might not be doing it the right way, nor be able to defend all of the
   technical decisions.
This tutorial walks through a local installation of the project for
development on Linux. It requires git, a fairly recent version of nodejs (see
:file:`.openshift/action_hooks/deploy` for a way to install it), python2 and
virtualenv.
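A quick way to check that the required tools are available (any reasonably
recent versions should do)::

$ git --version && python2 --version && virtualenv --version && node --version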
Make a virtual environment
==========================
For the sake of the tutorial, we'll do this in the temporary directory, but you
could do it anywhere::
$ cd /tmp
Create a python virtual environment and activate it::
$ virtualenv memopol_env
Using real prefix '/usr'
New python executable in memopol_env/bin/python2
Also creating executable in memopol_env/bin/python
Installing setuptools, pip, wheel...done.
$ source memopol_env/bin/activate
Clone the repository
====================
You should fork the project on GitHub and use your fork's clone URL. For the
sake of the demo, we'll use the main repository URL::
$ git clone https://github.com/political-memory/political_memory.git
Cloning into 'political_memory'...
remote: Counting objects: 2516, done.
remote: Compressing objects: 100% (109/109), done.
remote: Total 2516 (delta 44), reused 0 (delta 0), pack-reused 2402
Receiving objects: 100% (2516/2516), 4.40 MiB | 79.00 KiB/s, done.
Resolving deltas: 100% (1103/1103), done.
Checking connectivity... done.
$ cd political_memory/
Create your own branch, e.g.::
$ git checkout -b yourbranch origin/pr
Branch yourbranch set up to track remote branch pr from origin.
Switched to a new branch 'yourbranch'
Install Python dependencies
===========================
Then, install the package for development::
$ pip install -e .
Obtaining file:///tmp/political_memory
Collecting django (from political-memory==0.0.1)
Using cached Django-1.9-py2.py3-none-any.whl
[output snipped for readability]
Installing collected packages: django, sqlparse, django-debug-toolbar, django-pdb, six, django-extensions, werkzeug, south, pygments, markdown, hamlpy, django-coffeescript, ijson, python-dateutil, pytz, political-memory
Running setup.py develop for political-memory
Successfully installed django-1.9 django-coffeescript-0.7.2 django-debug-toolbar-1.4 django-extensions-1.5.9 django-pdb-0.4.2 hamlpy-0.82.2 ijson-2.2 markdown-2.6.5 political-memory pygments-2.0.2 python-dateutil-2.4.2 pytz-2015.7 six-1.10.0 south-1.0.2 sqlparse-0.1.18 werkzeug-0.11.2
And install the requirements::
$ pip install -r requirements.txt
Collecting django<1.9,>=1.8 (from -r requirements.txt (line 1))
[output snipped for readability]
Using cached Django-1.8.7-py2.py3-none-any.whl
Running setup.py develop for django-representatives
Running setup.py develop for django-representatives-votes
Successfully installed amqp-1.4.8 anyjson-0.3.3 billiard-3.3.0.22 celery-3.1.19 django-1.8.7 django-adminplus-0.5 django-appconf-1.0.1 django-autocomplete-light-2.2.10 django-bootstrap3-6.2.2 django-celery-3.1.17 django-compressor-1.6 django-constance-1.1.1 django-datetime-widget-0.9.3 django-denorm-0.2.0 django-filter-0.11.0 django-picklefield-0.3.2 django-representatives django-representatives-votes django-taggit-0.17.5 django-uuidfield-0.5.0 djangorestframework-3.3.1 kombu-3.0.30 py-dateutil-2.2 pyprind-2.9.3 requests-2.8.1 slugify-0.0.1
Install NodeJS dependencies
===========================
We'll also need to install bower for the staticfiles::
$ npm install bower
memopol@3.0.0 /tmp/political_memory
└── bower@1.7.0 extraneous
As well as all the requirements from :file:`package.json`::
$ npm install
memopol@3.0.0 /tmp/political_memory
├── bower@1.7.0 extraneous
├─┬ gulp@3.9.0
[output snipped for readability]
npm WARN In bower@1.7.0 replacing bundled version of configstore with configstore@0.3.2
npm WARN In bower@1.7.0 replacing bundled version of latest-version with latest-version@1.0.1
npm WARN In bower@1.7.0 replacing bundled version of update-notifier with update-notifier@0.3.2
Don't worry about the warnings; like all warnings, they are non-critical.
Then, install the bower packages::
$ node_modules/.bin/bower install
bower bootstrap#~3.3.5 cached git://github.com/twbs/bootstrap.git#3.3.6
bootstrap#3.3.6 static/libs/bootstrap
└── jquery#2.1.4
[output snipped for readability]
jquery#2.1.4 static/libs/jquery
Build the static files with gulp::
$ node_modules/gulp/bin/gulp.js less
[22:26:42] Using gulpfile /tmp/political_memory/gulpfile.js
[22:26:42] Starting 'less'...
[22:26:44] Finished 'less' after 1.54 s
.. note:: The ``node_modules/gulp/bin/gulp.js watch`` command may be used to
have gulp watching for changes and rebuilding static files
automatically.
Activate ``DEBUG``
==================
``DEBUG`` is disabled by default, so the development server won't run properly
out of the box; to enable it, export the ``DEBUG`` variable in the current
shell::
$ export DEBUG=True
Database migrations
===================
Run the database migrations; a file-based SQLite database is used by default::
$ ./manage.py migrate
Operations to perform:
Synchronize unmigrated apps: django_filters, staticfiles, datetimewidget, autocomplete_light, messages, adminplus, compressor, humanize, django_extensions, constance, bootstrap3
Apply all migrations: legislature, votes, database, admin, positions, sessions, representatives, auth, contenttypes, representatives_votes, taggit
Synchronizing apps without migrations:
Creating tables...
Running deferred SQL...
Installing custom SQL...
Running migrations:
Rendering model states... DONE
Applying contenttypes.0001_initial... OK
[output snipped for readability]
Applying taggit.0002_auto_20150616_2121... OK
Run the development server
==========================
Run the development server::
$ ./manage.py runserver
Performing system checks...
System check identified no issues (0 silenced).
December 09, 2015 - 21:26:47
Django version 1.8.7, using settings 'memopol.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
[09/Dec/2015 21:26:48] "GET / HTTP/1.1" 200 13294