Boto3
Key Concepts in Boto3
Clients
Clients are used to make low-level service calls to AWS APIs.
The client gives you access to a set of methods for each AWS service, which correspond directly to AWS API operations.
Example
```python
import boto3

client = boto3.client('s3')       # S3 client
response = client.list_buckets()  # List all S3 buckets
print(response)
```
Resources
Resources are higher-level abstractions that wrap around AWS service objects, providing an object-oriented way of interacting with AWS.
Resources hide much of the complexity of working with AWS APIs.
Example
```python
s3 = boto3.resource('s3')
bucket = s3.Bucket('my-bucket')
for obj in bucket.objects.all():
    print(obj.key)
```
Sessions
A session manages your AWS credentials, region, and configurations. If no session is explicitly created, Boto3 creates a default one.
You can create a session to manage different profiles, regions, or credentials.
Example
```python
session = boto3.Session(profile_name='myprofile')
s3 = session.resource('s3')
```
Paginator
Paginators automatically handle the pagination of results, which is required when an API call returns too many results to fit into a single response.
Example
```python
client = boto3.client('s3')
paginator = client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='my-bucket'):
    for obj in page.get('Contents', []):  # 'Contents' is absent on empty pages
        print(obj['Key'])
```
Waiters
Waiters provide a convenient way to wait for a resource to reach a specific state, such as an EC2 instance becoming available or an S3 object being fully uploaded.
Example
```python
ec2 = boto3.client('ec2')
ec2.start_instances(InstanceIds=['i-1234567890abcdef0'])

waiter = ec2.get_waiter('instance_running')
waiter.wait(InstanceIds=['i-1234567890abcdef0'])
```
Common AWS Services with Boto3
Amazon S3 (Simple Storage Service)
S3 is used for scalable object storage. With Boto3, you can create, read, and manage S3 buckets and objects.
Create a bucket
```python
s3 = boto3.client('s3')
s3.create_bucket(Bucket='my-bucket')
```
Upload a file
```python
s3 = boto3.client('s3')
s3.upload_file('localfile.txt', 'my-bucket', 'remote.txt')
```
Download a file
```python
s3 = boto3.client('s3')
s3.download_file('my-bucket', 'remote.txt', 'localfile.txt')
```
List buckets
```python
s3 = boto3.client('s3')
response = s3.list_buckets()
for bucket in response['Buckets']:
    print(bucket['Name'])
```
Delete a bucket
```python
s3 = boto3.client('s3')
s3.delete_bucket(Bucket='my-bucket')  # the bucket must be empty first
```
Amazon EC2 (Elastic Compute Cloud)
EC2 is a service that provides resizable compute capacity. With Boto3, you can manage instances, security groups, and key pairs.
Launch an EC2 instance
```python
ec2 = boto3.resource('ec2')
instances = ec2.create_instances(  # returns a list of Instance objects
    ImageId='ami-0abcdef1234567890',
    MinCount=1,
    MaxCount=1,
    InstanceType='t2.micro'
)
```
Stop an instance
```python
ec2 = boto3.client('ec2')
ec2.stop_instances(InstanceIds=['i-1234567890abcdef0'])
```
Terminate an instance
```python
ec2 = boto3.client('ec2')
ec2.terminate_instances(InstanceIds=['i-1234567890abcdef0'])
```
List all instances
```python
ec2 = boto3.resource('ec2')
for instance in ec2.instances.all():
    print(instance.id, instance.state)
```
Amazon DynamoDB
DynamoDB is a fully managed NoSQL database service. Boto3 lets you interact with tables, items, and queries.
Create a table
```python
dynamodb = boto3.resource('dynamodb')
table = dynamodb.create_table(
    TableName='Users',
    KeySchema=[
        {
            'AttributeName': 'user_id',
            'KeyType': 'HASH'
        }
    ],
    AttributeDefinitions=[
        {
            'AttributeName': 'user_id',
            'AttributeType': 'S'
        }
    ],
    ProvisionedThroughput={
        'ReadCapacityUnits': 10,
        'WriteCapacityUnits': 10
    }
)
```
Put an item in a table
```python
table = boto3.resource('dynamodb').Table('Users')
table.put_item(Item={'user_id': '123', 'name': 'John Doe'})
```
Get an item from a table
```python
table = boto3.resource('dynamodb').Table('Users')
response = table.get_item(Key={'user_id': '123'})
print(response['Item'])  # 'Item' is absent if no matching item exists
```
Delete an item
```python
table = boto3.resource('dynamodb').Table('Users')
table.delete_item(Key={'user_id': '123'})
```
AWS Lambda
AWS Lambda allows you to run code in response to events without provisioning servers. With Boto3, you can manage Lambda functions.
Create a Lambda function
```python
lambda_client = boto3.client('lambda')

with open('my-deployment-package.zip', 'rb') as f:
    zipped_code = f.read()

lambda_client.create_function(
    FunctionName='MyLambdaFunction',
    Runtime='python3.8',
    Role='arn:aws:iam::123456789012:role/service-role/MyLambdaRole',
    Handler='lambda_function.lambda_handler',
    Code=dict(ZipFile=zipped_code),
    Timeout=300
)
```
Invoke a Lambda function
```python
lambda_client = boto3.client('lambda')
response = lambda_client.invoke(
    FunctionName='MyLambdaFunction',
    InvocationType='RequestResponse',
    Payload=b'{"key": "value"}'
)
print(response['Payload'].read())
```
Delete a Lambda function
```python
lambda_client = boto3.client('lambda')
lambda_client.delete_function(FunctionName='MyLambdaFunction')
```
Amazon SNS (Simple Notification Service)
SNS allows you to send notifications via email, SMS, or to other AWS services.
Create an SNS topic
```python
sns = boto3.client('sns')
response = sns.create_topic(Name='MyTopic')
topic_arn = response['TopicArn']
```
Subscribe to an SNS topic
```python
sns = boto3.client('sns')
sns.subscribe(
    TopicArn=topic_arn,
    Protocol='email',
    Endpoint='example@example.com'
)
```
Publish a message to an SNS topic
```python
sns = boto3.client('sns')
sns.publish(
    TopicArn=topic_arn,
    Message='This is a test message.'
)
```
Handling Credentials in Boto3
Boto3 automatically looks for credentials in the following order:
- Environment Variables: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN (optional).
- Shared Credentials File (~/.aws/credentials): profiles can be configured here, like [default].
- EC2 Instance Metadata: for instances running in AWS, Boto3 can retrieve credentials from the instance metadata service.
You can also pass credentials explicitly:

```python
session = boto3.Session(
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY',
    region_name='us-west-2'
)
s3 = session.resource('s3')
```
Error Handling in Boto3
Boto3 includes exceptions that you can catch to handle errors gracefully.
```python
import boto3
from botocore.exceptions import NoCredentialsError, PartialCredentialsError

try:
    s3 = boto3.client('s3')
    s3.list_buckets()
except NoCredentialsError:
    print("Credentials not available.")
except PartialCredentialsError:
    print("Incomplete credentials provided.")
```
Optimizations in Boto3
- Multi-threading with concurrent.futures: You can speed up Boto3 operations (especially with S3) by using multi-threading or multi-processing.
```python
import boto3
from concurrent.futures import ThreadPoolExecutor

def upload_file(bucket, filename):
    # Creating the client inside the worker keeps each task self-contained.
    s3 = boto3.client('s3')
    s3.upload_file(filename, bucket, filename)

with ThreadPoolExecutor(max_workers=5) as executor:
    executor.map(upload_file, ['my-bucket'] * 10, ['file1.txt', 'file2.txt', ...])
```
- Session Caching: Reuse the same session across multiple Boto3 resource or client invocations to avoid repeatedly initializing the session.
- Using Paginators: For services like S3 that return paginated results, using paginators avoids issues with large datasets.