Introduction
Accessing Amazon S3 (Simple Storage Service) buckets programmatically is a common requirement for many applications. However, setting up and managing AWS accounts can be daunting and expensive, especially for small-scale projects or local development environments. In this article, we'll explore how to overcome this hurdle by using LocalStack to simulate AWS services. LocalStack mimics most AWS services, meaning you can develop and test applications without incurring any costs or relying on an internet connection, which is incredibly useful for rapid development and debugging. We'll use ObjectScript with embedded Python to communicate with InterSystems IRIS and AWS simultaneously. Before beginning, ensure you have Python and Docker installed on your system. Once LocalStack is set up and running, the bucket can be created and used.
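Before calling any of the methods below, it helps to confirm that LocalStack is actually reachable. Here is a minimal sketch, assuming LocalStack's default edge port of 4566 and its /_localstack/health endpoint (adjust if you changed the port in your Docker setup):

```python
import urllib.request
import urllib.error

def localstack_is_up(url="http://localhost:4566/_localstack/health", timeout=2):
    """Return True if the LocalStack health endpoint answers, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("LocalStack reachable:", localstack_is_up())
```

If this prints False, check that the LocalStack container is running before trying the S3 calls that follow.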
Creating an S3 Bucket from ObjectScript with Embedded Python
Now that LocalStack is running, let's create an S3 bucket programmatically. We'll use Boto3, the AWS SDK for Python. Take a look at the MakeBucket method provided in the S3UUtil class. This method uses Boto3 to create an S3 bucket:
ClassMethod MakeBucket(inboundfromiris As %String) As %Status [ Language = python ]
{
    import boto3
    # LocalStack listens on port 4566 by default and accepts any dummy credentials
    s3 = boto3.client(
        service_name='s3',
        region_name='us-east-1',
        aws_access_key_id='test',
        aws_secret_access_key='test',
        endpoint_url='http://localhost:4566'
    )
    try:
        s3.create_bucket(Bucket=inboundfromiris)
        print("Bucket created successfully")
        return 1
    except Exception as e:
        print("Error:", e)
        return 0
}
To create a bucket, you would call this method with the desired bucket name:
Do ##class(S3.S3UUtil).MakeBucket("mybucket")
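Note that create_bucket will reject names that violate S3's naming rules, and LocalStack enforces most of them too. A small hypothetical helper (not part of the S3UUtil class) that pre-checks the most common rules: 3-63 characters, lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit:

```python
import re

# Hypothetical helper: pre-validate the most common S3 bucket-naming rules
# (3-63 chars; lowercase letters, digits, dots, hyphens; must start and end
# with a letter or digit). The full rule set is longer; see the S3 docs.
_BUCKET_RE = re.compile(r'^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$')

def is_valid_bucket_name(name):
    return bool(_BUCKET_RE.match(name))

print(is_valid_bucket_name("mybucket"))   # True
print(is_valid_bucket_name("MyBucket"))   # False: uppercase not allowed
```

Validating up front gives a clearer error message than the exception Boto3 raises after the request is made.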
Uploading Objects to the Bucket from ObjectScript with Embedded Python
Once the bucket is created, you can upload objects to it programmatically. The PutObject method demonstrates how to achieve this:
ClassMethod PutObject(inboundfromiris As %String, objectKey As %String) As %Status [ Language = python ]
{
    import boto3
    try:
        content = "Hello, World!".encode('utf-8')
        # LocalStack listens on port 4566 by default and accepts any dummy credentials
        s3 = boto3.client(
            service_name='s3',
            region_name='us-east-1',
            aws_access_key_id='test',
            aws_secret_access_key='test',
            endpoint_url='http://localhost:4566'
        )
        s3.put_object(Bucket=inboundfromiris, Key=objectKey, Body=content)
        print("Object uploaded successfully!")
        return 1
    except Exception as e:
        print("Error:", e)
        return 0
}
Call this method to upload an object:
Do ##class(S3.S3UUtil).PutObject("inboundfromiris", "hello-world-test")
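For a single-part put_object upload, S3 (and LocalStack) report an ETag that is the MD5 hex digest of the body, so you can verify an upload client-side by comparing digests. A sketch of that check, independent of the class above:

```python
import hashlib

def expected_etag(body: bytes) -> str:
    # For non-multipart uploads, the S3 ETag is the MD5 hex digest of the
    # object body (the service wraps it in double quotes in the response).
    return hashlib.md5(body).hexdigest()

content = "Hello, World!".encode('utf-8')
print(expected_etag(content))
```

Comparing this digest against the ETag in the put_object response is a cheap integrity check; note it does not hold for multipart uploads, where the ETag is computed differently.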
Listing Objects in the Bucket from ObjectScript with Embedded Python
To list objects in the bucket, you can use the FetchBucket method:
ClassMethod FetchBucket(inboundfromiris As %String) As %Status [ Language = python ]
{
    import boto3
    # LocalStack listens on port 4566 by default and accepts any dummy credentials
    s3 = boto3.client(
        service_name='s3',
        region_name='us-east-1',
        aws_access_key_id='test',
        aws_secret_access_key='test',
        endpoint_url='http://localhost:4566'
    )
    try:
        response = s3.list_objects(Bucket=inboundfromiris)
        if 'Contents' in response:
            print("Objects in bucket", inboundfromiris)
            for obj in response['Contents']:
                print(obj['Key'])
            return 1
        else:
            print("Error: Bucket is empty or does not exist")
            return 0
    except Exception as e:
        print("Error:", e)
        return 0
}
Call the FetchBucket method to list objects from the bucket:
Do ##class(S3.S3UUtil).FetchBucket("inboundfromiris")
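The dict that list_objects returns carries the objects under a 'Contents' key, where each entry includes at least 'Key', 'Size', and 'LastModified'. The extraction logic from FetchBucket, shown against an illustrative response (the field shapes follow the Boto3 docs; the values here are made up):

```python
# Illustrative response in the shape boto3's list_objects returns;
# the keys and sizes below are made up for the example.
response = {
    'Contents': [
        {'Key': 'hello-world-test', 'Size': 13},
        {'Key': 'reports/2024.csv', 'Size': 2048},
    ]
}

# Using .get() with a default avoids a KeyError when the bucket is empty,
# since an empty bucket's response has no 'Contents' key at all.
keys = [obj['Key'] for obj in response.get('Contents', [])]
print(keys)
```

This is why FetchBucket checks for 'Contents' before iterating: an empty bucket omits the key entirely rather than returning an empty list.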
Retrieving Objects from the Bucket from ObjectScript with Embedded Python
Finally, to retrieve objects from the bucket, you can use the PullObjectFromBucket method:
ClassMethod PullObjectFromBucket(inboundfromiris As %String, objectKey As %String) As %Status [ Language = python ]
{
    import boto3
    # LocalStack listens on port 4566 by default and accepts any dummy credentials
    s3 = boto3.client(
        service_name='s3',
        region_name='us-east-1',
        aws_access_key_id='test',
        aws_secret_access_key='test',
        endpoint_url='http://localhost:4566'
    )
    try:
        obj_response = s3.get_object(Bucket=inboundfromiris, Key=objectKey)
        content = obj_response['Body'].read().decode('utf-8')
        print("Content of object with key '" + objectKey + "':", content)
        return 1
    except Exception as e:
        print("Error:", e)
        return 0
}
Call this method:
Do ##class(S3.S3UUtil).PullObjectFromBucket("inboundfromiris", "hello-world-test")
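Note that get_object returns the payload as a streaming body that must be read and then decoded; it is not a plain string. io.BytesIO exposes the same read() interface, so you can rehearse the decoding step locally without any S3 connection:

```python
import io

# io.BytesIO mimics the read() interface of the streaming Body that
# boto3's get_object returns, so the read-then-decode step can be
# exercised locally.
body = io.BytesIO("Hello, World!".encode('utf-8'))
content = body.read().decode('utf-8')
print(content)
```

One caveat carried over from the real streaming body: read() consumes the stream, so a second read() returns an empty result unless you keep the first value around.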
This discussion is just the beginning; there's plenty more ground to cover. I invite readers to dive deeper into this subject and share their insights. Let's keep the conversation going, and I'm eager to hear your thoughts and contributions.