Deploying Userify with Fabric


Userify provisions SSH keys, server users, and sudo (root) permissions to your datacenter servers, EC2, Azure, and other public and hybrid clouds.

To get Userify working on all your servers, you'll need to install the Userify "daemon" on each one. Rather than doing that by hand everywhere, you can automate the setup. Enter Fabric...


Fabric is a Python (2.5-2.7) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
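The task below switches on Fabric roles, so your fabfile also needs env.roledefs mapping each role name to its hosts. A minimal sketch (the hostnames here are placeholders, not from the original fabfile):

from fabric.api import env

# hypothetical role-to-host mapping -- substitute your own servers
env.roledefs = {
    'some_cores':   ['core1.example.com', 'core2.example.com'],
    'some_clients': ['client1.example.com'],
    'other_apps':   ['app1.example.com'],
}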

Here's what you could define in your fabfile:

from fabric.api import env, sudo

def deploy_userify():
    #env.sudo_prefix = "sudo -E -S -p '%(sudo_prompt)s' " % env
    print('env.effective_roles=%s' % env.effective_roles)
    if env.effective_roles[0] == 'some_cores':
        sudo('curl -k "" | api_id="MY_API_ID1" api_key="MY_API_KEY1" /bin/bash')
    elif env.effective_roles[0] == 'some_clients':
        sudo('curl -k "" | api_id="MY_API_ID2" api_key="MY_API_KEY2" /bin/bash')
    elif env.effective_roles[0] == 'other_apps':
        sudo('curl -k "" | api_id="MY_API_ID3" api_key="MY_API_KEY3" /bin/bash')
    elif env.effective_roles[0] == 'some_apphosts':
        sudo('curl -k "" | api_id="MY_API_ID4" api_key="MY_API_KEY4" /bin/bash')
    elif env.effective_roles[0] == 'other_apphosts':
        sudo('curl -k "" | api_id="MY_API_ID5" api_key="MY_API_KEY5" /bin/bash')
    elif env.effective_roles[0] == 'some_qa':
        sudo('curl -k "" | api_id="MY_API_ID6" api_key="MY_API_KEY6" /bin/bash')
    elif env.effective_roles[0] == 'webhosts':
        sudo('curl -k "" | api_id="MY_API_ID7" api_key="MY_API_KEY7" /bin/bash')
    else:
        print('Usage: fab deploy_userify -R <some_cores|some_clients|other_apps>')
        print('   Only one role at a time.')
        return
    sudo('nohup /opt/userify/ >/dev/null &', pty=False)
    print('Deployed Userify to %s' % env.effective_roles[0])

Then call it with fab deploy_userify -R <a_role>.

This makes it easy to deploy Userify and the server group keys to the servers they need to be on.

Check an SSL Certificate with OpenSSL

To check a website's SSL certificate in one easy step from the command line:

echo "quit" \
    | openssl s_client -connect <host>:<port> 2>&1 \
    | sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' \
    | openssl x509 -noout -subject -issuer -enddate

subject= / Organization/O=Thawte, Inc./serialNumber=3898261/C=US/ST=California/L=Mountain View/OU=Infrastructure Operations/
issuer= /C=US/O=thawte, Inc./OU=Terms of use at (c)06/CN=thawte Extended Validation SSL CA
notAfter=Aug 31 23:59:59 2014 GMT

Thrown into a small shell script, we get:


[ -z "$1" ] && { echo "Usage: $0 <host> <port>"; exit; }
[ -z "$2" ] && { echo "Usage: $0 <host> <port>"; exit; }

echo "quit" \
| openssl s_client -connect ${host}:${port} 2>/dev/null \
| sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' \
| openssl x509 -text -noout
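If you'd rather do the same check from Python, the standard library's ssl module can pull those fields directly. A rough sketch: it assumes a reasonably recent Python (one that has ssl.create_default_context), the certificate must validate against the system CA bundle, and example.com is just a placeholder host.

import socket
import ssl

def cert_summary(host, port=443):
    # open a TLS connection and pull the validated peer certificate
    ctx = ssl.create_default_context()
    sock = socket.create_connection((host, port))
    tls = ctx.wrap_socket(sock, server_hostname=host)
    try:
        cert = tls.getpeercert()
    finally:
        tls.close()
    return cert['subject'], cert['issuer'], cert['notAfter']

print(cert_summary('example.com'))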

20 Lines of Python

Querying a Database and POSTing data in 20 lines

I love Python and its many libraries. The requests HTTP library is particularly awesome.

Note: I have not used any authentication, basic or otherwise, but it's easy enough to add.

StringIO is used because the csv library expects to write data to a file and I don't want to write any files. StringIO, as I understand it, fakes a file so the data is written to a file-like object in memory. Sort of.
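In other words, a StringIO object behaves like an open file but lives entirely in memory, e.g.:

import StringIO

buf = StringIO.StringIO()
buf.write('foo,bar,baz\r\n')   # same interface as file.write()
print(buf.getvalue())          # the accumulated data, no file on disk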

#!/usr/bin/env python

import csv
import StringIO
import requests
import sqlalchemy as sa

sql = "select foo, bar, baz from table where something='blecch'"
posturl = ''
headers = {'content-type': 'application/csv'}
dburi = 'mysql://username:password@localhost:3307/db'

engine = sa.create_engine(dburi)
ret = engine.execute(sql)

# create csv data in pseudo file
outstring = StringIO.StringIO()
writer = csv.writer(outstring, quoting=csv.QUOTE_NONNUMERIC)
writer.writerows(ret)

# POST the data
r = requests.post(posturl, data=outstring.getvalue(), headers=headers)
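As mentioned above, bolting on HTTP basic auth is a one-line change to the POST; for example (the credentials are placeholders):

r = requests.post(posturl, data=outstring.getvalue(),
                  headers=headers, auth=('username', 'password'))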


MySQL Backup Script


#!/bin/bash

# set these for your environment (placeholder values)
user="backupuser"
pass="secret"
backuproot="/backup/mysql"

date=$(date +%Y.%m.%d)

# echo and exit
function die {
    echo "$1"
    exit 1
}

# dump per database
function per_db {
    for db in $(mysql -u ${user} -p${pass} -Bse "show databases"); do
        [ ${db} = "information_schema" ] && continue
        [ ${db} = "performance_schema" ] && continue
        echo ${db}
        mysqldump -u ${user} -p${pass} --master-data=2 --hex-blob ${db} | gzip > ${backuproot}/${db}-${date}.sql.gz
    done
}

# dump per table per database
function per_db_table {
    for db in $(mysql -u ${user} -p${pass} -Bse "show databases"); do
        [ ${db} = "information_schema" ] && continue
        [ ${db} = "performance_schema" ] && continue
        echo ${db}
        for table in $(mysql -u ${user} -p${pass} -Bse "show tables" ${db}); do
            echo "  ${table}"
            mkdir -p ${backuproot}/${db}
            mysqldump -u ${user} -p${pass} --master-data=2 --hex-blob ${db} ${table} | gzip > ${backuproot}/${db}/${db}-${table}-${date}.sql.gz
        done
    done
}

# mkdirs, etc
function prep {
    mkdir -p ${backuproot} 2>/dev/null || die "cannot mkdir ${backuproot}"
    touch ${backuproot}/.foo 2>/dev/null || die "${backuproot} not writeable" && rm -v ${backuproot}/.foo
    find ${backuproot}/* -maxdepth 0 -exec rm -rv {} \; 2>/dev/null
}

# run the appropriate functions
[ "$1" = "--per_db" ]       && prep && per_db          && exit 0
[ "$1" = "--per_db_table" ] && prep && per_db_table    && exit 0

# if you reach here...
echo "Usage: $(basename $0) --per_db | --per_db_table" && exit 0

Discovering the Seagate Central


A quick nmap scan shows what the Seagate Central NAS is running out of the box:

ryant@spitfire:~$ nmap

Starting Nmap 6.00 ( ) at 2013-06-29 13:17 SAST
Nmap scan report for
Host is up (0.0098s latency).
Not shown: 990 closed ports
21/tcp   open  ftp
22/tcp   open  ssh
80/tcp   open  http
139/tcp  open  netbios-ssn
443/tcp  open  https
445/tcp  open  microsoft-ds
548/tcp  open  afp
631/tcp  open  ipp
3689/tcp open  rendezvous
9000/tcp open  cslistener

Nmap done: 1 IP address (1 host up) scanned in 1.89 seconds
