Python and AWS Cookbook

Mitch Garnaat

Language: English

Pages: 78

ISBN: 144930544X

Format: PDF / Kindle (mobi) / ePub

If you intend to use Amazon Web Services (AWS) for remote computing and storage, Python is an ideal programming language for developing applications and controlling your cloud-based infrastructure. This cookbook gets you started with more than two dozen recipes for using Python with AWS, based on the author’s boto library.

You’ll find detailed recipes for working with the S3 storage service as well as EC2, the service that lets you design and build cloud applications. Each recipe includes a code solution you can use immediately, along with a discussion of why and how the recipe works. You also get detailed advice for using boto with AWS and other cloud services.

This book’s recipes include methods to help you:

  • Launch instances on EC2, and keep track of them with tags
  • Associate an Elastic IP address with an instance
  • Restore a failed Elastic Block Store volume from a snapshot
  • Store and monitor your own custom metrics in CloudWatch
  • Create a bucket in S3 to contain your data objects
  • Reduce the cost of storing noncritical data
  • Prevent accidental deletion of data in S3

data could include data and/or scripts that are run when the instance is launched, as shown in Executing Custom Scripts upon Instance Startup.

Example 2-11. Clone a running instance

    import boto
    from boto.ec2.blockdevicemapping import BlockDeviceMapping, BlockDeviceType
    import os
    import base64

    def clone_instance(instance):
        """
        Make a clone of an existing Instance object.

        instance  The Instance object to clone.
        """
        new_bdm = None
        ec2 = instance.connection
        if instance.block_device_mapping:
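The excerpt above cuts off just as the clone begins handling the block device mapping. As a hedged sketch of what copying a mapping involves — FakeBlockDeviceType and copy_block_device_mapping are illustrative names, not the book's code; with boto you would build boto.ec2.blockdevicemapping.BlockDeviceType objects the same way:

```python
# Hypothetical stand-in for boto2's BlockDeviceType, so the copy logic
# can run without AWS credentials. It models only the attributes a
# clone would plausibly carry over.
class FakeBlockDeviceType:
    def __init__(self, snapshot_id=None, size=None, delete_on_termination=False):
        self.snapshot_id = snapshot_id
        self.size = size
        self.delete_on_termination = delete_on_termination

def copy_block_device_mapping(bdm):
    """Return a fresh device -> block-device dict so the new instance
    gets its own mapping objects rather than sharing the original's."""
    new_bdm = {}
    for device, bdt in bdm.items():
        new_bdm[device] = FakeBlockDeviceType(
            snapshot_id=bdt.snapshot_id,
            size=bdt.size,
            delete_on_termination=bdt.delete_on_termination,
        )
    return new_bdm

original = {"/dev/sda1": FakeBlockDeviceType(snapshot_id="snap-1234",
                                             size=8,
                                             delete_on_termination=True)}
cloned = copy_block_device_mapping(original)
```

Each entry is copied attribute by attribute rather than by reference, so mutating the clone's mapping cannot disturb the running instance's.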

combination of the two approaches. Big packages like databases and web servers can be built into the image, because they take a long time to install and don’t change that often, whereas application code and data can be configured through scripts. This is a complicated topic and very application-specific; however, I do want to show a couple of examples of passing data and scripts to newly started instances to give you some idea of the power and flexibility that are available to you. Our first
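The script-passing idea above can be sketched in a few lines. The AMI ID and key name below are placeholders, and the script contents are illustrative; only the base64 round trip is exercised here, since launching a real instance needs AWS credentials:

```python
import base64

# A small startup script to pass as EC2 "user data". Images that run
# cloud-init (most stock AMIs) execute it once at first boot.
startup_script = """#!/bin/bash
apt-get update -y
apt-get install -y nginx
echo "configured at launch" > /var/tmp/launch-marker
"""

# EC2 transmits user data base64-encoded. boto2's run_instances() does
# the encoding for you, so in practice you would just pass the raw string:
#
#   ec2 = boto.connect_ec2()
#   ec2.run_instances('ami-xxxxxxxx',          # placeholder AMI ID
#                     key_name='mykey',         # placeholder key pair
#                     instance_type='t1.micro',
#                     user_data=startup_script)
#
# Here we perform the encoding explicitly to show what goes over the wire.
encoded = base64.b64encode(startup_script.encode("utf-8"))
decoded = base64.b64decode(encoded).decode("utf-8")
```

The instance can read the decoded script back from the metadata service at http://169.254.169.254/latest/user-data.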

    of the object containing the data in S3.
    """
    s3 = boto.connect_s3()
    bucket = s3.lookup(bucket_name)
    key = bucket.lookup(key_name)
    print key.metadata

Computing Total Storage Used by a Bucket

Problem

You want to find out how much storage a bucket is using.

Solution

Iterate through all of the keys in the bucket and total the bytes used.

Discussion

Because you are charged for the amount of storage you use in S3, you may want to know how much data is stored in a
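The iterate-and-total solution is a one-line sum once you have the keys. FakeKey below is a stand-in so the logic runs without AWS credentials; with boto2 you would pass bucket.list(), which yields Key objects whose .size attribute is the object's length in bytes:

```python
def total_bucket_size(keys):
    """Sum the size in bytes of every key (object) in a bucket.

    With boto2:
        bucket = boto.connect_s3().lookup('mybucket')
        print(total_bucket_size(bucket.list()))
    """
    return sum(key.size for key in keys)

# Stand-in keys, just to demonstrate the totaling logic locally.
class FakeKey:
    def __init__(self, size):
        self.size = size

total = total_bucket_size(FakeKey(s) for s in [1024, 2048, 512])
```

Listing every key costs one GET request per 1,000 keys, so on very large buckets this walk is slow and slightly costly; it is best run occasionally rather than on every request.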

String Authentication (QSA) to generate an expiring URL to your data in S3.

Discussion

One neat feature of S3 is the ability to generate self-expiring URLs pointing to data in S3. This allows you to share private data in S3 without changing the permissions of the object. It also means that you can control how long the URL you pass on to your collaborator will work. You can have it expire in 5 seconds, 5 days, 5 months, or any other time period that seems appropriate. The example below
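A QSA URL carries its deadline in an Expires query parameter (seconds since the epoch). The check below demonstrates that mechanism locally; the URL is constructed by hand with dummy credentials, whereas in practice boto2 generates the whole signed URL for you with Key.generate_url:

```python
import time
from urllib.parse import urlparse, parse_qs

def is_expired(signed_url, now=None):
    """Return True if a Query String Authentication URL's Expires
    parameter (seconds since the epoch) has already passed."""
    now = time.time() if now is None else now
    params = parse_qs(urlparse(signed_url).query)
    return now > int(params["Expires"][0])

# With boto2 the signed URL would come from the Key object, e.g.:
#   url = key.generate_url(expires_in=300)   # valid for 5 minutes
# Here we build one by hand (dummy access key and signature) just to
# exercise the expiry check:
url = ("https://mybucket.s3.amazonaws.com/report.csv"
       "?AWSAccessKeyId=AKIDEXAMPLE&Expires=1000000000&Signature=abc")
```

S3 performs the same comparison server-side, so once the deadline passes the URL returns Access Denied even though the object's permissions never changed.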

and version ID. To actually delete an object, you need to perform a versioned DELETE operation that specifies both the object name and the version ID. This makes accidental deletion much less likely.

MFA Delete

MFA (Multi-Factor Authentication) Delete extends the protection of objects even further. Once MFA Delete is configured on a bucket, an object cannot be deleted without first providing an authentication token from a security device associated with your account. This makes accidental
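The two DELETE behaviors described above can be modeled in a few lines. VersionedBucket is a toy in-memory model, not boto's API: a plain delete only adds a delete marker, while a versioned delete (name plus version ID) permanently removes one version:

```python
class VersionedBucket:
    """Toy model of S3 versioning semantics (illustrative, not boto)."""

    def __init__(self):
        self.versions = {}   # key name -> list of (version_id, data)
        self.next_id = 0

    def put(self, name, data):
        vid = "v%d" % self.next_id
        self.next_id += 1
        self.versions.setdefault(name, []).append((vid, data))
        return vid

    def delete(self, name):
        # Plain DELETE: hides the object behind a delete marker
        # (data=None), but every stored version survives.
        return self.put(name, None)

    def delete_version(self, name, version_id):
        # Versioned DELETE: permanently removes exactly one version.
        self.versions[name] = [(vid, d) for vid, d in self.versions[name]
                               if vid != version_id]

    def get(self, name):
        # None means the latest "version" is a delete marker.
        return self.versions[name][-1][1]

bucket = VersionedBucket()
v0 = bucket.put("report.csv", "draft")
v1 = bucket.put("report.csv", "final")
bucket.delete("report.csv")   # marker added; both versions still exist
```

Because a plain delete never destroys data, recovering from a mistake is just a matter of removing the marker or reading an older version; only the two-argument form (which MFA Delete additionally gates behind a hardware token) is irreversible.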
