django request.POST vs request.method

Properly processing requests in your views matters, and one of the most important parts of that is knowing which HTTP verb is being used.

There are a few ways to determine whether you are getting a GET or a POST. The two most common ways people check are:

if request.POST:


if request.method == 'POST':

It is important to know the difference between the two. While they do get you to the same place there are a couple of caveats to note.


request.method

request.method returns a string naming the HTTP method used for the request, and nothing else. This is important because a request can use any HTTP verb without sending any data.


request.POST

A boolean check of request.POST only checks whether there is data in the POST QueryDict. If there is data, it evaluates as true and you treat the request as a POST; if there is no data, it evaluates as false as if no POST happened.

The problem is you can do a POST even without data, and if you do, you get the following result (captured from the shell):

<QueryDict: {}>

That means there is no data, so your code says “this is not a POST” when it really is one.


If you don’t know how these two parts of the framework work, you can get results you don’t expect and end up with a lot of headaches. I recommend always using request.method instead of checking whether the QueryDict has data in it from a GET or a POST. It avoids logic problems in edge cases like the one above, and it is more explicit about what is being evaluated.
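
As a minimal sketch (the view itself is illustrative, not from the original post), branching on request.method stays correct even when the POST body is empty:

from django.http import HttpResponse

def my_view(request):
    # Correct: branch on the HTTP verb itself.
    if request.method == 'POST':
        return HttpResponse("handled a POST, even with an empty body")

    # Risky alternative: an empty POST body makes request.POST falsy,
    # so a real POST would fall through and be treated like a GET.
    return HttpResponse("handled a GET")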


Quick and Dirty Write Your Own Bash Autocomplete

Typing a command on the command line and hitting tab to get the rest of what you need is very nice. It is almost magical. Fortunately it is quite easy to write your own script to handle this for any application you have, or write.

This is a quick and dirty intro into how to do this.

All the Code You Need.

To get started, here is some code you can just add, without needing to fully understand it, so you can get something done (the command name on the last line is assumed from the django sub-commands used in the example):

_djangoadmin()
{
  local cur=${COMP_WORDS[COMP_CWORD]}
  COMPREPLY=( $(compgen -W "runserver collectstatic" -- $cur) )
}
complete -F _djangoadmin django-admin.py
  • First you are creating a new function called _djangoadmin to be called later.
  • Line 3 is boilerplate you just add to your function.
  • Line 4 is where the magic is. After the -W, inside the quotes, is where you put your autocompleted commands.
  • Line 6 says that when you type the command and hit tab, bash runs the _djangoadmin function.
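
To try it out, one approach (the file name here is just an example) is to save the function in its own file and source it from your shell startup file:

# save the function above in ~/djangoadmin-completion.bash, then add
# this line to ~/.bashrc or ~/.bash_profile:
source ~/djangoadmin-completion.bash

Open a new shell, type the command followed by r, and hit tab; it should complete to runserver.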


After seeing this I quickly realized that, since it is fairly simple to get started, I should offer an autocomplete script if I ever write a CLI utility for other people to use. It is also useful for writing your own autocomplete for utilities that don’t have it.


5 Project Management Lessons I Re-Learned

For the last few weeks I have spent a lot of my free time rebuilding the GoDjango site. GoDjango has been a lot of fun, and I have felt bad putting it on hiatus to get a new job and get married. So I decided I’m going to relaunch the site and really do things more intentionally over the haphazard way I did them before.

In order to relaunch, and make sure I don’t burn out and overstress, I decided to lay out a plan, a long one. I stepped back and set up a timeline of things to do, from coding to video production. With the relaunch coming tomorrow, and a long and fruitful future ahead, I wanted to reflect on the 5 biggest things I re-learned doing this project.

If You Don’t Define Scope You’re Sunk

Once I decided to go full steam ahead reworking GoDjango to make sure it was a success, I wrote down 10 steps and gave myself 3 weeks to do them in. My biggest problem was that three of the steps were things I had never done before, so I set about breaking down two of those. My steps ballooned to roughly 40. I learned that I should have broken the work down before I set a deadline, so I wouldn’t have had to push it back to make sure I got it all done.

However, the question is would I have pushed as hard with 40 steps as I did with 10?

Minimum Viable Product is Smaller than You Think

I set out to have three key things done by launch. Only one of them will be fully ready, with the others getting turned on in the coming couple of weeks. As I got closer to the deadline I really started thinking about what I actually needed vs. what would be nice to have. It turns out I didn’t need as much as I thought. So at launch tomorrow two features I really wanted won’t be available, but they will be soon.

To mitigate this problem in the future I think I really need to take “launch and iterate” to heart. To be honest, though, it is hard: you want everyone to have everything from day one, even knowing you will have more users later who won’t know the difference.

Proper Deployments Take Time

I haven’t done a full deploy in a long time. I have used heroku a lot, but I have felt like I lost some of the control I really wanted. Therefore, I set up a VPS just for the new GoDjango and decided to do a best-practices deployment. Unfortunately, because I haven’t deployed in so long, it took a lot longer to set up than I thought it would. I should have planned my deployment better and built more than a couple of hours into my schedule for it.

Write Down Everything

As part of the planning process you should write down everything you think about. I went through and planned a large portion of what I was going to code and how I was going to structure things. Unfortunately, I did a portion of it in my head, thinking “Oh, I won’t forget.” Well, I forgot, and had to spend time re-planning portions of the GoDjango rewrite. So going forward, if nothing else, I need to write down on my whiteboard whatever I plan out, and take a picture of it.

Don’t Put off What You Don’t Know

Finally, don’t put off things you don’t know about. I have never purchased an SSL certificate, or set up a server to use one. I have never had the need before. As part of keeping GoDjango alive for a long time, I am going to add a premium subscription model to the site. In order to take payments I need SSL; I don’t want to use paypal. Unfortunately I put this off, and off, and off again. Now I won’t be launching tomorrow with the ability for people to subscribe, because I am having to argue with the State of Oklahoma about why my business isn’t showing up for my certificate authority to verify.

If I had taken care of this sooner I could have had this all wrapped up by now. Now I have a bit of egg on my face, and a valuable lesson learned.


Being the sole developer, designer, architect and project manager of a site has a lot of challenges, and falling down in one area can have adverse consequences in other areas. In the future I need to be more diligent in all areas. Maybe even make some checklists for everything to make sure it is all covered.


Adding Git Data to Your Bash Prompt


A well set up bash prompt can save you a lot of time and be amazingly useful, giving you all the information you need quickly. Unfortunately, you need to set it up properly, which is a bit tricky. We will walk through setting up your prompt with git data.

Bash 4+

The first thing you need to make sure of is that you have bash 4.0 or higher.
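
A quick way to check which version you are running:

bash --version

The first line of the output includes the version number.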


On Linux this should already be the default.

Mac OS X

Mac OS X is a bit more complicated, but not by much; please follow a previous blog post: Upgrade Bash to 4+ on OS X.

Download git-prompt and Activate

Next you need the bash functions available in your environment. The easiest thing to do is download the file and source it in your `~/.bash_profile` or `~/.bashrc` file.


Similar to setting up git autocomplete, you download the file and then source it in your `~/.bashrc` or `~/.bash_profile`. The URL below assumes the script's location in the contrib/completion directory of the git repository:

cd ~/
curl -O https://raw.githubusercontent.com/git/git/master/contrib/completion/git-prompt.sh

Then you need to source the file by adding the following to your `~/.bashrc` or `~/.bash_profile`:

source ~/git-prompt.sh

Adding Git Data to Your Prompt

One of the tricks to setting your prompt is setting the PS1 environment variable, which we will cover in more detail in another post. Since you set PS1 in your `~/.bashrc` or `~/.bash_profile`, you can do some fun things with bash scripting to build very dynamic prompts. Most notably, the __git_ps1() function was designed to get the current branch, plus more information, and bring that data back for your bash prompt.

The easiest thing to do is add the following line to your `~/.bashrc` or `~/.bash_profile`:

export PS1='$(__git_ps1 "(%s)") \W $'

This will produce a prompt similar to:

(master) Programming $ 

What you are seeing is that the __git_ps1 function adds the (master) segment to the prompt when you are inside a git repository. The \W shows only the current folder you are in, not the full path, which is convenient when you are deep in nested folders.

What About Staged and Non-Staged Changes?

One of the good things about git-prompt is that it can show whether you have staged and unstaged changes. To use this feature, add the following line to your `~/.bash_profile` or `~/.bashrc` file (GIT_PS1_SHOWDIRTYSTATE is the variable git-prompt.sh checks for this):

export GIT_PS1_SHOWDIRTYSTATE=1

This flag makes the prompt show a * and + after the branch name: * means unstaged changes and + means staged changes. If neither appears, you have no uncommitted changes.
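
With that flag set, a repository with both staged and unstaged changes produces a prompt along these lines (illustrative):

(master *+) Programming $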


Adding this to your prompt is extremely useful. There have been a number of times I have forgotten which branch I am on and pushed the wrong one to a remote. I have also tried to rebase without realizing I had uncommitted changes. Now my prompt easily ‘prompts’ me about what I am doing. This saves me time every day.


Quick Intro to Python Requests Library

Calling 3rd party services is an essential part of web development these days. I did a quick little python article on Basic urllib GET and POST With and Without Data. It was a good look into how python natively handles doing GET and POST HTTP actions. However, there is a better way, and that is with the requests library.


Most of what you will be doing is using different HTTP Verbs. GET is probably the one you will use the most, and it is simple to do. Look at the following code example:

import requests
r = requests.get('https://localhost/user/buddylindsey/')

You can even pass data with your GET request via the `params` keyword argument (the URL and values below are placeholders):

import requests
r = requests.get(
    'https://localhost/user/',
    params={'username': 'buddylindsey'})

This builds up the request and adds the data as a query-string auto-magically. Makes things a bit easier don’t you think?


POST works much the same way; you pass your form data with the `data` keyword argument (again, the URL is a placeholder):

import requests
r = requests.post(
    'https://localhost/login/',
    data={'username': 'buddylindsey', 'password': 'password'})

Other Verbs

OPTIONS, HEAD, PUT, PATCH and DELETE are also available. To be honest I haven’t used OPTIONS, PATCH, or HEAD before, so while I have a basic understanding of how they work I won’t attempt to explain them to you. Instead, please visit the requests HTTP verbs docs to get a better understanding of their usage.
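
For reference, they follow the same pattern as get and post (the URLs here are placeholders):

import requests

r = requests.put('https://localhost/user/1/', data={'username': 'buddylindsey'})
r = requests.delete('https://localhost/user/1/')
r = requests.head('https://localhost/user/1/')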

More on Response Object

The great thing about the response object you get back after making a request is all of the attributes and methods it gives you for finding out what happened. Here are some of the ones you have access to:

  • headers
  • status_code
  • text
  • json()
  • encoding

There are many more, but these are probably the ones you will use the most.
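
A quick sketch of how you might use them (the URL is a placeholder, and .json() assumes the response body is actually JSON):

import requests

r = requests.get('https://localhost/user/buddylindsey/')
print(r.status_code)              # numeric status, e.g. 200
print(r.headers['Content-Type'])  # headers behave like a dictionary
print(r.encoding)                 # the encoding requests guessed for the body
print(r.text)                     # the body decoded to a string
data = r.json()                   # the body parsed as JSON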


The requests library is one of the best, if not the best, libraries for calling 3rd party web services and acting on the results. It is a powerful tool to have in your toolbelt, so learn it, and learn it well.


Upgrade Bash to 4+ on OS X

Unfortunately, Apple has decided to ship an old version of the bash shell. When I go back and forth between Linux and OS X I sometimes hit minor inconsistencies because of this. One big one is the git-prompt scripts. So I finally decided to upgrade to version 4 of bash.

It is a very easy process: two minutes and you are on your way. First, though, you need homebrew installed and up-to-date, then install bash with it:

brew update
brew install bash

Add it to Your Shells

You need to add the new bash to the list of allowed shells, which is easy. Add the following line to `/etc/shells`:

/usr/local/bin/bash

Change Your Shell


chsh

Running chsh with no arguments opens your account information in an editor. Change the line that has Shell to the location of your newly brew-installed bash:

Shell: /usr/local/bin/bash

Restart Terminal

Close your terminal and open it again.


To see your current version of bash, echo the $BASH_VERSION variable:

echo $BASH_VERSION

It should echo something like `4.2.45(2)-release`.


It is that simple. I thought it would be hard until I did it.


Adding Git Autocomplete to Bash on OS X


If you use git on the command line a lot then git bash autocomplete is an amazing tool to have in your toolbelt. Unfortunately, just by installing git you don’t get to use it. You need to add command line functionality to your environment. The easiest way to add autocomplete is by downloading a file and sourcing it in your `~/.bash_profile`.

Download Script

The bash script lives in the contrib part of the git repository. You need to download it to wherever you want it to live; I just put it in my home directory. The URL below assumes the script's location in the contrib/completion directory:

cd ~/
curl -O https://raw.githubusercontent.com/git/git/master/contrib/completion/git-completion.bash

Source Scripts

Now you need to source the script to add it to your environment. Add the following to your ~/.bash_profile.

if [ -f ~/git-completion.bash ]; then
  . ~/git-completion.bash
fi

This simply checks whether the file is there and, if so, sources it, which adds the completions to your environment. Now run `source ~/.bash_profile` and type `git b<tab>` to see the auto-complete starting to work.


Now you should be able to type `git checkout t` and, when you hit tab, it will fill in the rest of the branch name that starts with `t`. Sub-commands auto-complete too, along with many other things. I find this particularly nice since we have some long, awkward branch names, so auto-complete makes switching branches quick and easy.


Jenkins and Github Pull Requests


One of the things people love about travis-ci is that it will build pull requests, but most people don’t realize Jenkins can build pull requests as well. It is also very simple to configure, using the correct plugin.

Install the Plugin

Install the GitHub pull request builder plugin through the Manage Jenkins admin section.

Configure Github User

This is fairly simple. Follow the trail of links, then add the relevant information. Make sure you have a user you want to access GitHub with, and that it has the appropriate permissions for the repos.

  1. Manage Jenkins
  2. Configure System
  3. Find the GitHub pull request builder configuration section
  4. Fill in the relevant information similar to the image below

[Image: System Configuration]

Configure Your Git Pull

You need to configure where Jenkins pulls from GitHub to build pull requests. Specifically, GitHub stores all the pull requests in a “secret” location. It isn’t really secret, but most people don’t explore where refs are stored, so adding the pull request refs to the refspec is a must. In the advanced tab of your git configuration add the following refspec (this is the value the pull request builder plugin documents; treat the exact value as an assumption for your plugin version):

+refs/pull/*:refs/remotes/origin/pr/*

In the branch specifier section add:

${sha1}

Here is what that section should look like:

[Image: Git Configuration]

You also need to set build triggers and add users to the whitelist so people who submit pull requests can have them built automatically. If you don’t add people to the whitelist, you will need to tell Jenkins to build each PR.

[Image: Build Triggers]

You’re Done

If everything was configured correctly, your pull requests will look like this at the bottom of the PR:

[Image: Successful Build]


Pull requests create an awesome workflow and really help teams collaborate on a lot of different parts of the code. The real problem has been, for a while, that pull requests live outside the main refspec and will not build on a push. The pull request builder plugin targets the location the PRs live at, and the best part is it reports the build status inside the PR, similar to travis-ci. If you aren’t using Jenkins with GitHub PRs, I recommend starting.


Python Date and Datetime Objects – Getting to Know Them

Properly dealing with dates can be hard, but it doesn’t have to be as long as you understand the basics. I have had to deal a lot with dates lately in a few Django applications, and life got a whole lot easier once I figured out the basics in python. In this post we are going to talk about getting dates, the difference between date and datetime, and how to add and subtract time. Timezones I want to discuss in a blog post of their own.


Datetime Module

The datetime module houses all of the objects/classes for dealing with dates. You will need to import datetime whenever you want to deal with dates. It is super convenient to use and is built right into python. The key classes are:

  • date – Just a date. (Month, Day, Year)
  • time – Time independent of day. (Hour, Minute, Second, Microsecond)
  • datetime – Combination of date and time. (Month, Day, Year, Hour, Minute, Second, Microsecond)
  • timedelta – A duration of time used for manipulating dates
  • tzinfo – An abstract class for dealing with timezones

NOTE 1: All of these objects are immutable.
NOTE 2: Objects of type date are naive, meaning they are not aware of a timezone.

Naive and aware are two states you need to know about when dealing with dates. Naive dates basically assume there is no timezone. Aware dates carry timezone information you can use to make adjustments as needed. Again, we are going to discuss timezone-aware dates in another post.

date vs datetime


The date class just gets you the actual date, no time. The best/easiest way to get a date is to create a new object and pass in the appropriate numbers.

from datetime import date

d = date(2013, 12, 22)
print(d.strftime("%Y"))
print(d.strftime("%m"))
print(d.strftime("%d"))
print(d.strftime("%Y %m %d"))

This will output:

2013
12
22
2013 12 22


Datetime is fairly similar, and with some of the same code you can generate a combined date and time object to work with.

from datetime import datetime

d = datetime(2013, 12, 22, 11, 30, 59)
print(d.hour)
print(d.minute)
print(d.second)
print(d.strftime("%m/%d/%Y %I:%M:%S"))

This will output:

11
30
59
12/22/2013 11:30:59

You may have noticed that at the end I used the `strftime` function. This is a convenient function for outputting the date/time in string format. All you need to do is pass it a string with format symbols and the function interprets them and outputs the date properly. There are reference sites for these symbols that are nicer to look at and easier to remember than the python docs.
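
For a quick reference, here are a few of the codes used above plus a couple of other common ones (all standard strftime codes):

from datetime import datetime

d = datetime(2013, 12, 22, 11, 30, 59)
print(d.strftime("%Y-%m-%d"))  # 2013-12-22 (year, month, day)
print(d.strftime("%H:%M:%S"))  # 11:30:59 (24-hour time)
print(d.strftime("%A %B %d"))  # Sunday December 22 (weekday and month names)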

Manipulating Dates

The final thing we shall look at is adding and subtracting time. Before I looked in the docs to find the proper way, I used to manually pull out the hours, minutes, and seconds properties and create a new object, adding or subtracting appropriately. After getting frustrated I found the `timedelta` object, and it is amazing.

Let's add 3 days to a date object.

from datetime import timedelta, date

d1 = date(2013, 12, 20)
print(d1.strftime("%Y %m %d"))
d2 = d1 + timedelta(days=3)
print(d2.strftime("%Y %m %d"))

This will output:

2013 12 20
2013 12 23

You can experiment from here with timedelta objects. They kind of make dealing with dates fun and easy.
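
As another quick experiment (not from the original post), subtracting two date objects gives you a timedelta back, which is handy for “how many days between” questions:

from datetime import date

d1 = date(2013, 12, 20)
d2 = date(2013, 12, 25)
delta = d2 - d1
print(delta.days)

This will output:

5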


Dealing with dates while programming is a science in itself, and an annoying one at times. Fortunately, the python devs have solved the problem, at least as far as I have used it. The great thing about doing date math with python date objects is that when adding and subtracting days, years, hours, etc. you get back an accurate day and month. Subtracting 53 days from the 14th of March can be a bit complicated if you do it by hand, but with python it’s super simple. So if you aren’t using date objects in python, you should definitely start today.


Minimum Grep You Need to Know

You need to find a phrase in over 200 files worth of code. Manual searching is not a feasible option. If you are like me you know about grep, but it has always made you nervous. It is so powerful that reading the man page is like reading a tech manual for an engine. Fortunately, getting the benefits of grep with little pain is easy once you finally figure it out.

Over the last few months I have had to use grep more and more, and I would say I use the same type of search 80% of the time. It gets me what I need quickly and efficiently without much fuss.

What is Grep

The best full explanation comes from the grep man page.

Grep searches the named input FILEs (or standard input if no files are named, or the file name - is given) for lines containing a match to the given pattern. By default, grep prints the matching lines.

My explanation is that it finds stuff in files and shows you where it is. It is amazingly useful because of its speed, and because it shows the matching line along with the file name, you can more easily tell whether it is what you need than from a file name alone.

Using Grep

Grep is very powerful, but I get by with 3 variations day to day for most of my needs. First, let's look at how to structure your grep:

grep <options> <search-term> <location>

This is important to remember as it can be frustrating when you forget and nothing works.

  • options – the different flags that help you get more robust or targeted results back.
  • search-term – any pattern/regular expression to match against the files you are searching.
  • location – a directory or list of files to search; leave it blank to read from standard input (see the example below).
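
For example, a command like this (the search term and path are just illustrative) maps onto that structure:

grep -rn "TODO" ./src
# options:     -rn (recursive, show line numbers)
# search-term: "TODO"
# location:    ./src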

Grep with Other Commands

If you don’t give a location, grep reads from standard input. That is useful when you pipe (|) a bunch of data to grep to search it. A mundane example is:

ls -lha | grep buddy

This does a normal ls -lha and passes the result to grep. From there it only returns lines that have the word “buddy” in them.

To show the regular expression usage you can do:

ls -lha | grep ^d

This returns only results where the line starts with d. In the case of ls it means only directories are returned.

Grep’ing Files

Where you will probably spend most of your time is searching for text inside of files. Mostly you will need to know the file and line number of where the word you are looking for is located.

grep -rn hello .

This searches for hello in every file in the current directory and its subdirectories. It then shows you the matching line, the file it is in, and the line number it is on. The options are fairly easy to remember as well:

  • r – recursively search files
  • n – display line numbers

Excluding Directories

Sometimes you get too many results, or results from folders you don’t want to search. One of the projects I work on at work has a .svn folder that needs to stay, so I usually have to exclude that directory. Fortunately it is easy:

grep -rn --exclude-dir=.svn hello .


That is about all you need to know to get started using grep. It is an awesome tool with a lot more features, and you can do some crazy cool searches with it. It also actually helps you find elusive pieces of code.
