
User:Statsrick/PHP modules

From Wikipedia, the free encyclopedia

Python Modules I recommend and use

  1. import boto3 - the AWS SDK for Python (successor to boto). Boto is a Python package that provides interfaces to Amazon Web Services. (sudo pip install boto3)
  2. import botocore.exceptions - to capture aws sdk exceptions (example: except botocore.exceptions.ClientError)
  3. import urllib – interface for fetching data over HTTP and working with URLs
  4. import gzip – gzip interface (reading/writing)
  5. from datetime import datetime, timedelta - datetime manipulations (get today, yesterday)
  6. import sys – system module, I just use it to exit the script in case important files are missing
  7. import os – os interface – I use it to remove local files from disk
  8. import re – regex
  9. import urlparse – parse a URL into components (Python 2; in Python 3 use urllib.parse)
  10. import requests – HTTP requests, handy for web crawling
  11. import pgpasslib – password library for Redshift etc.
  12. import psycopg2 – PostgreSQL client library, works for embedding Redshift SQL statements
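A quick sketch of several of the standard-library modules above in one place (datetime/timedelta for yesterday's date, urllib.parse for URL parsing, gzip for compressed files, os to remove files, re for regex). The file path and URL here are just placeholders:

```python
import gzip
import os
import re
import tempfile
from datetime import datetime, timedelta
from urllib.parse import urlparse  # Python 3; on Python 2 this was 'import urlparse'

# Yesterday's date, e.g. for building a daily log path
yesterday = (datetime.now() - timedelta(days=1)).strftime('%Y/%m/%d')
assert re.match(r'\d{4}/\d{2}/\d{2}', yesterday)

# Pull the host out of a URL
host = urlparse('https://example.com/path?x=1').netloc

# Write and re-read a gzipped file, then remove it from disk
path = os.path.join(tempfile.gettempdir(), 'demo.txt.gz')
with gzip.open(path, 'wt') as f:
    f.write('hello gzip')
with gzip.open(path, 'rt') as f:
    text = f.read()
os.remove(path)

print(yesterday, host, text)
```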

For analytics

  1. import pandas as pd (sudo pip install matplotlib pandas)
  2. import numpy as np
  3. import scipy as sp
  4. import matplotlib.pyplot as plt
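A minimal taste of pandas and numpy together, using a made-up visits/clicks table (the column names and numbers are just for illustration):

```python
import numpy as np
import pandas as pd

# Toy daily traffic table
df = pd.DataFrame({'visits': [10, 20, 30], 'clicks': [1, 4, 9]},
                  index=['mon', 'tue', 'wed'])

# Derive a click-through-rate column from two existing columns
df['ctr'] = df['clicks'] / df['visits']

print(df['ctr'].mean())      # average CTR across the three days
print(np.log(df['visits']))  # numpy ufuncs apply element-wise to columns
```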

Install psycopg2 on a Mac

#ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
#brew install postgresql
#gem install pg
#sudo -H pip install psycopg2
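Once psycopg2 is installed, it pairs nicely with pgpasslib from the list above: look the password up in ~/.pgpass instead of hard-coding it. This is only a sketch; the cluster host, port, database, and user below are hypothetical placeholders:

```python
def redshift_connect(host, port, dbname, user):
    """Look up the password in ~/.pgpass via pgpasslib and open a connection.

    Redshift speaks the PostgreSQL wire protocol, so psycopg2 works as-is.
    """
    import pgpasslib
    import psycopg2
    password = pgpasslib.getpass(host, port, dbname, user)
    return psycopg2.connect(host=host, port=port, dbname=dbname,
                            user=user, password=password)

# Example call (placeholder values -- substitute your own cluster):
# conn = redshift_connect('mycluster.example.us-west-2.redshift.amazonaws.com',
#                         5439, 'analytics', 'rick')
```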

Using boto3 to read and write to S3 is easy, and your credentials are stored in ~/.aws/credentials

import boto3
s3_client = boto3.client('s3', region_name='us-west-2')
# Upload the file to S3 
s3_client.upload_file('hello-local.txt', 'MyBucket', 'hello-remote.txt') 
# Download the file from S3 
s3_client.download_file('MyBucket', 'hello-remote.txt', 'hello2-local.txt') 
print(open('hello2-local.txt').read())
#real example
s3_client.download_file('hearstlogfiles', 'adobe/omni-mgweb/2016/02/12/lookup_data.tar.gz', '/tmp/lookup_data.tar.gz')


Return to Rick's Library