AWS S3 Storage

AWS S3 Storage is a Type element for global cloud storage. Built on Amazon Simple Storage Service, it provides large-scale data storage, global CDN acceleration, and enterprise-level security management. The element offers standardized file upload, download, and delete operations, integrates with AWS access control and permission management, and supports multiple storage types and data backup strategies to keep data secure and highly available.

The AWS S3 storage element hierarchy is Meta (storages.Meta) → Type (storages.AwsS3Type) → Instance. Developers can quickly create AWS S3 storage instance elements through JitAi's visual development tools.

Developers can also create their own Type elements, or override the storages.AwsS3Type element officially provided by JitAi in their own applications, to implement custom encapsulation.

Quick Start

Creating Instance Elements

Directory Structure

Recommended Directory Structure
storages/
├── MyS3/                  # Custom instance element name
│   ├── e.json             # Element declaration file
│   └── MyS3.json          # AWS S3 configuration file

e.json File

Basic Configuration
{
  "title": "My AWS S3 Storage",
  "type": "storages.AwsS3Type",
  "backendBundleEntry": ".",
  "variables": []
}

Business Configuration File

AWS S3 Connection Configuration
{
  "accessKeyId": "your_access_key_id",
  "accessKeySecret": "your_secret_access_key",
  "endPoint": "s3.amazonaws.com",
  "bucketName": "your_bucket_name",
  "region": "us-east-1"
}

Usage Example

Basic Usage
# Get AWS S3 storage instance
s3 = app.getElement("storages.MyS3")

# Upload file
with open("example.txt", "rb") as file:
result = s3.uploadByFile("folder/example.txt", file.read(), "text/plain")
print(f"Upload successful, URL: {result['url']}")

# Download file
file_data = s3.download("folder/example.txt")

# Delete file
s3.delete("folder/example.txt")

Element Configuration

e.json Configuration

Configuration Item | Type | Required | Description
title | string | Yes | Storage element display name
type | string | Yes | Fixed value: storages.AwsS3Type
backendBundleEntry | string | Yes | Fixed value: "."
variables | array | No | Custom variable configuration, usually an empty array

Business Configuration File

Configuration Item | Type | Required | Description
accessKeyId | string | Yes | AWS access key ID
accessKeySecret | string | Yes | AWS access key secret
endPoint | string | Yes | S3 service access domain, e.g. s3.amazonaws.com
bucketName | string | Yes | S3 bucket name
region | string | Yes | AWS region code, e.g. us-east-1, us-west-2

Methods

uploadByFile

Upload file to AWS S3 storage.

Parameter Details

Parameter | Type | Native Type | Required | Description
name | Stext | str | Yes | Storage file path name
data | - | bytes | Yes | File binary data
contentType | Stext | str | No | File MIME type, defaults to "application/octet-stream"

Return Value

Returns a dictionary containing the upload result; the url field holds the file's access address.

Usage Example

File Upload
# Upload image file
with open("avatar.jpg", "rb") as file:
result = s3.uploadByFile("users/avatar.jpg", file.read(), "image/jpeg")
avatar_url = result["url"]

# Upload document file
with open("document.pdf", "rb") as file:
result = s3.uploadByFile("docs/document.pdf", file.read(), "application/pdf")

getSignUrl

Get a pre-signed access URL for a file, used to grant temporary, controlled access.

Parameter Details

Parameter | Type | Native Type | Required | Description
file | Stext | str | Yes | File path name
contentType | Stext | str | Yes | File MIME type
expires | Numeric | int | No | Expiration time (seconds), defaults to 300

Return Value

Returns pre-signed URL string.

Usage Example

Get Pre-signed URL
# Get temporary access URL for image
temp_url = s3.getSignUrl("users/avatar.jpg", "image/jpeg", 600)

# Get temporary download URL for document
download_url = s3.getSignUrl("docs/document.pdf", "application/pdf")
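If the pre-signed URL grants GET access, it can be fetched like any other HTTP URL. A minimal sketch using the third-party requests library (an assumption; any HTTP client works):

# Fetch a file through the pre-signed URL (assumes it grants temporary GET access)
import requests

temp_url = s3.getSignUrl("docs/document.pdf", "application/pdf", 600)
response = requests.get(temp_url, timeout=30)
response.raise_for_status()

with open("document.pdf", "wb") as file:
    file.write(response.content)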

download

Download file data.

Parameter Details

Parameter | Type | Native Type | Required | Description
name | Stext | str | Yes | File path name to download

Return Value

Returns file binary data.

Usage Example

File Download
# Download file
file_data = s3.download("docs/document.pdf")

# Save to local
with open("downloaded_document.pdf", "wb") as file:
file.write(file_data)

getObject

Get file object information.

Parameter Details

Parameter | Type | Native Type | Required | Description
name | Stext | str | Yes | File path name

Return Value

Returns file object information.

Usage Example

Get File Information
# Get file object
file_obj = s3.getObject("users/avatar.jpg")

delete

Delete specified file.

Parameter Details

Parameter | Type | Native Type | Required | Description
name | Stext | str | Yes | File path name to delete

Return Value

Returns delete operation result.

Usage Example

File Deletion
# Delete user avatar
s3.delete("users/avatar.jpg")

# Delete temporary file
s3.delete("temp/upload_cache.tmp")

Advanced Features

Exception Handling Mechanism

AWS S3 storage has a built-in, comprehensive exception handling mechanism: all methods automatically catch and convert exceptions.

Exception Handling Example
try:
    s3.uploadByFile("test.txt", b"test data", "text/plain")
except Exception as e:
    # The exception will contain detailed error information, including storage type, element name, etc.
    print(f"Upload failed: {e}")

Batch Operations

Batch File Management
# Batch upload files
files = [
    ("file1.txt", b"content1", "text/plain"),
    ("file2.txt", b"content2", "text/plain")
]

for name, data, content_type in files:
    try:
        result = s3.uploadByFile(f"batch/{name}", data, content_type)
        print(f"Upload successful: {name}")
    except Exception as e:
        print(f"Upload failed {name}: {e}")

Cross-Region Access

Cross-Region Configuration
# AWS S3 supports multi-region storage, you can choose the best region based on business needs
# Different regions can be specified in the configuration file:
# "region": "us-west-2" # US West
# "region": "eu-west-1" # Europe West
# "region": "ap-northeast-1" # Asia Pacific Northeast (Tokyo)

Storage Type Optimization

Storage Type Configuration
# AWS S3 provides multiple storage types for different access patterns:
# - Standard: Frequently accessed data
# - Infrequent Access (IA): Infrequently accessed data
# - Glacier: Archive storage
# - Deep Archive: Deep archive

# Storage type can be specified during upload
def upload_with_storage_class(name, data, storage_class="STANDARD"):
    # Note: uploadByFile does not currently expose a storage class parameter, so
    # storage_class is unused here; actual support may require additional parameters.
    return s3.uploadByFile(name, data, "application/octet-stream")
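If explicit storage class control is needed today, one option is to call the AWS SDK (boto3) directly alongside the element. This is a hedged sketch, not part of the storages.AwsS3Type API; it assumes boto3 is installed and credentials are available from the environment:

# Hedged sketch: uploading with an explicit storage class via boto3.
# Bucket, key, and region are placeholders; adjust to your configuration.
import boto3

client = boto3.client("s3", region_name="us-east-1")
client.put_object(
    Bucket="your_bucket_name",
    Key="archive/report.pdf",
    Body=b"archived content",
    StorageClass="STANDARD_IA"  # e.g. STANDARD, STANDARD_IA, GLACIER, DEEP_ARCHIVE
)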

Security Best Practices

Security Configuration Recommendations
# 1. Use IAM roles instead of hardcoded keys
# 2. Enable S3 bucket versioning
# 3. Configure appropriate bucket policies
# 4. Use pre-signed URLs for temporary access
# 5. Regularly rotate access keys

# Use environment variables to store sensitive information
import os
config = {
    "accessKeyId": os.getenv("AWS_ACCESS_KEY_ID"),
    "accessKeySecret": os.getenv("AWS_SECRET_ACCESS_KEY"),
    "region": os.getenv("AWS_REGION", "us-east-1")
}
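These keys mirror the business configuration file fields; how the environment-derived values are injected into the instance element's configuration depends on your deployment setup.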