Article
· May 3, 2024 · 6 min read

Demo: Connecting Locally to an S3 Bucket without an AWS Account

Introduction

Accessing Amazon S3 (Simple Storage Service) buckets programmatically is a common requirement for many applications. However, setting up and managing an AWS account can be daunting and costly, especially for small projects or local development environments. In this article, we'll explore how to sidestep that hurdle by using LocalStack to simulate AWS services. LocalStack mimics most AWS services, so you can develop and test applications without incurring any costs or relying on an internet connection, which is incredibly useful for rapid development and debugging. We use ObjectScript with embedded Python to communicate with InterSystems IRIS and AWS simultaneously. Before beginning, ensure you have Python and Docker installed on your system. Once LocalStack is set up and running, the bucket can be created and used.
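As a sketch of the setup step, one common way to start LocalStack is via its official Docker image (the image name, edge port 4566, and health endpoint below are LocalStack's documented defaults; adjust to your environment):

```shell
# Pull and run LocalStack in the background, exposing its edge port 4566.
# All emulated AWS services (including S3) are served through this one port.
docker run --rm -d -p 4566:4566 --name localstack localstack/localstack

# Optional: confirm the container is up before creating buckets.
curl -s http://localhost:4566/_localstack/health
```

Note that the code in this article targets `http://host.docker.internal:4566`, which is how a containerized IRIS instance reaches the host; from the host itself you would use `http://localhost:4566`.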

Creating an S3 Bucket from ObjectScript with Embedded Python

Now that LocalStack is running, let's create an S3 bucket programmatically. We'll use Python and the Boto3 library, the Python SDK for AWS services. Take a look at the MakeBucket method in the S3UUtil class, which uses Boto3 to create an S3 bucket:

ClassMethod MakeBucket(inboundfromiris As %String) As %Status [ Language = python ]
{
    import boto3
    s3 = boto3.client(
        service_name='s3',
        region_name="us-east-1",
        endpoint_url='http://host.docker.internal:4566',
    )
    try:
        s3.create_bucket(Bucket=inboundfromiris)
        print("Bucket created successfully")
        return 1
    except Exception as e:
        print("Error:", e)
        return 0
}

To create a bucket, call this method with the desired bucket name:

Set status = ##class(S3.S3UUtil).MakeBucket("mybucket")
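Bucket names must satisfy S3's naming rules (3–63 characters; lowercase letters, digits, dots, and hyphens; starting and ending with a letter or digit), and LocalStack enforces much of this too. A small illustrative pre-check — the helper name is my own, not part of the S3UUtil class — could look like:

```python
import re

# Illustrative helper (not part of S3UUtil): rejects names that S3 or
# LocalStack would refuse, so the error surfaces before any API call.
_BUCKET_NAME_RE = re.compile(r'^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$')

def is_valid_bucket_name(name: str) -> bool:
    """Return True if `name` satisfies the core S3 bucket-naming rules."""
    return bool(_BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name("mybucket"))  # True
print(is_valid_bucket_name("MyBucket"))  # False: uppercase not allowed
```

This covers the core character and length rules only; AWS imposes a few further restrictions (e.g. no IP-address-shaped names) documented in its bucket-naming reference.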

Uploading Objects to the Bucket from ObjectScript with Embedded Python

Once the bucket is created, you can upload objects to it programmatically. The PutObject method demonstrates how to achieve this:

ClassMethod PutObject(inboundfromiris As %String, objectKey As %String) As %Status [ Language = python ]
{
    import boto3
    try:
        content = "Hello, World!".encode('utf-8')
        s3 = boto3.client(
            service_name='s3',
            region_name="us-east-1",
            endpoint_url='http://host.docker.internal:4566'
        )
        s3.put_object(Bucket=inboundfromiris, Key=objectKey, Body=content)
        print("Object uploaded successfully!")
        return 1
    except Exception as e:
        print("Error:", e)
        return 0
}

Call this method to upload an object:

Do ##class(S3.S3UUtil).PutObject("inboundfromiris", "hello-world-test")
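The Body argument passed to put_object above is a byte string, which is why the method encodes its text first; boto3 also accepts a binary file-like object for Body. A short sketch of preparing both forms (pure Python, no AWS calls involved):

```python
import io

# put_object's Body may be raw bytes...
content_bytes = "Hello, World!".encode("utf-8")

# ...or a binary file-like object, e.g. an in-memory buffer
# (for a file on disk you would use open(path, "rb") instead).
content_stream = io.BytesIO(content_bytes)

print(len(content_bytes))      # 13 bytes
print(content_stream.read(5))  # b'Hello'
```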

 

Listing Objects in the Bucket from ObjectScript with Embedded Python

To list objects in the bucket, you can use the FetchBucket method:

ClassMethod FetchBucket(inboundfromiris As %String) As %Status [ Language = python ]
{
    import boto3
    s3 = boto3.client(
        service_name='s3',
        region_name="us-east-1",
        endpoint_url='http://host.docker.internal:4566',
    )
    try:
        response = s3.list_objects(Bucket=inboundfromiris)
        if 'Contents' in response:
            print("Objects in bucket", inboundfromiris)
            for obj in response['Contents']:
                print(obj['Key'])
            return 1
        else:
            print("Error: Bucket is empty or does not exist")
            return 0
    except Exception as e:
        print("Error:", e)
        return 0
}

Call the FetchBucket method to list objects from the bucket:

do ##class(S3.S3UUtil).FetchBucket("inboundfromiris")
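list_objects returns a dict whose 'Contents' key is present only when the bucket has objects, which is why FetchBucket guards for it. Extracting the keys can be sketched against a sample response shaped like boto3's (the payload below is invented for illustration; a real response carries more metadata per entry):

```python
# A response shaped like boto3's list_objects output (values invented
# for illustration only).
response = {
    "Name": "inboundfromiris",
    "Contents": [
        {"Key": "hello-world-test", "Size": 13},
        {"Key": "second-object", "Size": 42},
    ],
}

# 'Contents' is absent for an empty bucket, so default to an empty list,
# mirroring the 'Contents' check in FetchBucket.
keys = [obj["Key"] for obj in response.get("Contents", [])]
print(keys)  # ['hello-world-test', 'second-object']
```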


 

Retrieving Objects from the Bucket from ObjectScript with Embedded Python

Finally, to retrieve objects from the bucket, you can use the PullObjectFromBucket method:

ClassMethod PullObjectFromBucket(inboundfromiris As %String, objectKey As %String) As %Status [ Language = python ]
{
    import boto3
    def pull_object_from_bucket(bucket_name, object_key):
        try:
            s3 = boto3.client(
                service_name='s3',
                region_name="us-east-1",
                endpoint_url='http://host.docker.internal:4566',
            )
            obj_response = s3.get_object(Bucket=bucket_name, Key=object_key)
            content = obj_response['Body'].read().decode('utf-8')
            print("Content of object with key '", object_key, "':", content)
            return 1
        except Exception as e:
            print("Error:", e)
            return 0
    # Return the helper's result so callers can check the %Status value
    return pull_object_from_bucket(inboundfromiris, objectKey)
}

Call this method:

Do ##class(S3.S3UUtil).PullObjectFromBucket("inboundfromiris", "hello-world-test")
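get_object's 'Body' value is a streaming object whose read() yields raw bytes, which is why PullObjectFromBucket decodes with UTF-8. The read-and-decode step can be sketched with an in-memory stand-in for the stream:

```python
import io

# Stand-in for obj_response['Body']: boto3 returns a StreamingBody, but
# any binary file-like object behaves the same for a single read().
body = io.BytesIO(b"Hello, World!")

content = body.read().decode("utf-8")
print(content)  # Hello, World!
```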

 

The discussion here is just the beginning, as it's clear there's plenty more ground to cover. I invite readers to dive deeper into this subject and share their insights. Let's keep the conversation going and continue advancing our understanding of this topic.

I'm eager to hear thoughts and contributions.

Announcement
· May 3, 2024

[Video] Understanding HL7 FHIR Profiles

Hey Developers,

Watch the latest video on InterSystems Developers YouTube:

⏯ Understanding HL7 FHIR Profiles

See how to profile, or customize, HL7® FHIR® resources for a specific use case by providing extensions and constraints. Profiled FHIR resources make up Implementation Guides that describe how to use HL7 FHIR in different ways, while also improving data consistency and manageability.    

Enjoy and check out more videos! 👍

Article
· May 3, 2024 · 1 min read

Reviews on Open Exchange - #42

If one of your packages on OEX receives a review, OEX notifies you only about YOUR own package.
The rating reflects the reviewer's experience with the status found at the time of review.
It is a kind of snapshot and might have changed since then.
Reviews by other members of the community are marked with * in the last column.

I also placed a bunch of Pull Requests on GitHub when I found a problem I could fix.
Some were accepted and merged, and some were just ignored.
So if you made a major change and expect an updated review, just let me know.

| # | Package | Review | Stars | IPM | Docker | * |
|---|---------|--------|-------|-----|--------|---|
| 1 | hl7v2-to-kafka | 6* super demo | 6.0 | | y | |
| 2 | workshop-workflow | well working demo | 5.8 | | y | |
| 3 | Demo-Pandas-Analytics | very professional | 5.0 | y | y | |
| 4 | deployed-code-template | a very useful feature | 5.0 | y | y | |
| 5 | fhirpro | nice animated visualization | 5.0 | | | |
| 6 | foreign-tables | useful example | 5.0 | y | y | * |
| 7 | geo-vector-search | easy to follow starter | 5.0 | y | y | * |
| 8 | mini-docker | great idea to start with the smallest IRIS image | 5.0 | | y | * |
| 9 | AriticleSimilarity | not convincing | 3.5 | y | y | |
| 10 | Database Growth - Data Collection and Analysis | a lot of typing | 3.5 | | | |
Announcement
· May 3, 2024

VS Code release April 2024 (version 1.89)

 

Visual Studio Code releases new updates every month with new features and bug fixes, and the April 2024 release is now available. 

Version 1.89 includes a range of new features and bug fixes.

The release also includes contributions from our very own @John Murray through pull requests that address open issues.

Find out more about these features in the release notes here > https://code.visualstudio.com/updates/v1_89

For those with VS Code, your environment should auto-update. You can manually check for updates by running Help > Check for Updates on Linux and Windows or running Code > Check for Updates on macOS.

If you're thinking about migrating from Studio to VS Code but need some help, take a look at the training courses George James Software offers > https://georgejames.com/migration-from-studio/
