Following our automatic bucket creator tutorial, this guide shows you how to build a service that uploads files to S3. We'll walk through creating the service, look at a completed template, and provide a video tutorial companion for this guide.
Watch this tutorial
Building the service
There are a few requirements before we can start building the service. We'll need:
- An IAM user's credentials with access to S3 permissions
- A bucket to upload to
If you need assistance creating IAM users, our previous write-up has a step-by-step guide with pictures.
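Before wiring the keys into the service, it can help to confirm that they actually work and have S3 access. The snippet below is a minimal, optional sanity check; it assumes the keys are exported in your shell as ACCESS_KEY and SECRET_KEY and that the IAM user is allowed to call s3:ListAllMyBuckets.

# check_credentials.py - optional sanity check, not part of the service
import os
import boto3

client = boto3.client(
    's3',
    aws_access_key_id=os.environ.get('ACCESS_KEY'),
    aws_secret_access_key=os.environ.get('SECRET_KEY'),
)

# A successful list_buckets call means the keys are valid and have S3 access
print([bucket['Name'] for bucket in client.list_buckets()['Buckets']])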
WayScript
- Create a Lair that will house your upload service
- Create an app.py and a requirements.txt (a minimal requirements.txt is sketched below)
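The requirements.txt only needs the packages the service imports. The unpinned list below is just a starting point; pin versions as you see fit.

# requirements.txt
flask
boto3
wayscript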
# app.py
from flask import Flask, request
import boto3
from datetime import date
from wayscript import context
import os
app = Flask(__name__)
# AWS SETTINGS
# The secret key and access key come from an IAM user
# These secrets should be placed inside the .secrets file and then accessed using os
# For help creating an IAM user with permissions view the AWS docs
# https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html
region = 'us-east-1'
ACCESS_KEY = os.environ.get('ACCESS_KEY')
SECRET_KEY = os.environ.get('SECRET_KEY')
# WayScript specific functions
# WayScript provides detailed information about the user when they access WayScript-hosted endpoints
# This is done using the WayScript SDK
# For more information view:
# https://docs.wayscript.com/building-tools/sdk
def get_user_details():
    # Strip the "Bearer " prefix (7 characters) from the Authorization header
    application_key = request.headers.get('Authorization')[7:]
    user = context.get_user_by_application_key(application_key)
    return user
# AWS Resource Calls
# Boto3 is used in this example to access AWS resources
# https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html
def create_s3_client():
    client = boto3.client(
        's3',
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY
    )
    return client
@app.route('/create-bucket', methods=['GET', 'POST'])
def create_aws_resource_s3_bucket():
    if request.method != 'POST':
        return {'error': 'endpoint expected POST request with payload "project"'}
    if not request.json:
        return {'error': 'Missing Lair Context'}
    try:
        user = get_user_details()
    except:
        return {'data': 'logged out user.'}
    # Name Bucket
    # Name-Project-Env
    env_type = request.json.get('environment')
    project_name = request.json.get('name')
    bucket_name = str(user.get('first_name')) + "-" + project_name + "-" + env_type
    client = create_s3_client()
    response = client.create_bucket(Bucket=bucket_name.lower())
    return response
@app.route('/upload-file', methods=['GET', 'POST'])
def create_aws_resource_s3_bucket_file():
    if request.method != 'POST':
        return {'error': 'endpoint expected POST request with payload'}
    if not request.form:
        return {'error': 'Missing Lair Context'}
    try:
        user = get_user_details()
    except:
        return {'data': 'logged out user.'}
    # Name Bucket
    # Name-Project-Env
    # The file arrives as multipart/form-data, so the Lair context
    # comes through as form fields rather than a JSON body
    env_type = request.form.get('environment')
    project_name = request.form.get('name')
    bucket_name = str(user.get('first_name')) + "-" + project_name + "-" + env_type
    today = str(date.today())
    client = create_s3_client()
    # request.files['file'] is already a file-like object, so it can be
    # handed straight to upload_fileobj; today's date is used as the object key
    file_to_upload = request.files['file']
    client.upload_fileobj(file_to_upload, bucket_name.lower(), today)
    return {'data': 'uploaded to ' + bucket_name.lower() + ' as ' + today}
if __name__ == '__main__':
    app.run()
Above, note that ACCESS_KEY and SECRET_KEY need to be defined in the .secrets file of your WayScript Lair.
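For reference, the two entries in the .secrets file would look something like the following; the values are placeholders, and you should check the WayScript docs for the exact format your Lair expects.

# .secrets
ACCESS_KEY=<your IAM access key id>
SECRET_KEY=<your IAM secret access key>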
- Once this code is placed in your app.py, you'll need to deploy your service.
- From there, you can send files to your new endpoint via an HTTP request, as in the example below.
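As a quick test, you can exercise the upload endpoint with the requests library. This is a sketch, not part of the service: the URL and application key below are placeholders, the key is sent as a Bearer token (which is what get_user_details expects), and the environment and name form fields supply the Lair context used to build the bucket name.

# upload_test.py - a hypothetical client for the /upload-file endpoint
import requests

ENDPOINT = 'https://your-lair-endpoint.wayscript.cloud/upload-file'  # placeholder URL
APP_KEY = 'your-application-key'  # placeholder application key

response = requests.post(
    ENDPOINT,
    headers={'Authorization': 'Bearer ' + APP_KEY},
    data={'environment': 'prod', 'name': 'uploader'},  # Lair context fields
    files={'file': open('report.csv', 'rb')},          # file to upload
)
print(response.json())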