AWS Aurora Postgres and IAM Auth.

Hi Folks! I recently worked with AWS Aurora Postgres and IAM authentication. This removes the substantial chore of credential maintenance and gives you better control over access management when you have multiple database instances.

AWS Aurora is a fully managed Relational Database Service (RDS), compatible with MySQL and PostgreSQL; more details can be found in the AWS documentation.

We are going to use Postgres in this context. A point to note here is that IAM addresses only the authentication part; authorization is still managed by the database (PostgreSQL) engine's roles and privileges! We will do it in the following steps.

  1. Create an Aurora Postgres database instance and enable IAM auth for the DB instance during creation, either from the AWS console or via Terraform.
  2. Create an IAM policy that allows a database user to connect, using the JSON document below. Note that lucifer is simply a DB user name; the other values come from your database resources. To allow all users to connect, use "Resource": ["*"], which is of course not recommended.

{
   "Version": "2012-10-17",
   "Statement": [
      {
         "Effect": "Allow",
         "Action": [
             "rds-db:connect"
         ],
         "Resource": [
             "arn:aws:rds-db:us-east-1:1234567890:dbuser:cluster-ABCDEFGHIJKL01234/lucifer"
         ]
      }
   ]
}        

In the above policy, we allow the DB user lucifer to connect to the database. The Resource ARN follows the pattern arn:aws:rds-db:{region}:{account-id}:dbuser:{cluster-resource-id}/{db-user-name}; note that it uses the cluster resource ID, not the cluster name. More details can be found in the AWS documentation.
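For the Terraform route mentioned in step 1, enabling IAM authentication is a single flag on the cluster resource. Here is a minimal sketch using the Terraform AWS provider's aws_rds_cluster resource (the identifier, username, and password variable are placeholders):

```hcl
resource "aws_rds_cluster" "aurora_pg" {
  cluster_identifier                  = "aurora-db-postgres"   # placeholder name
  engine                              = "aurora-postgresql"
  master_username                     = "postgres_admin"       # placeholder
  master_password                     = var.db_master_password # placeholder variable
  iam_database_authentication_enabled = true                   # the IAM auth switch
}
```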

3. Let's connect to the database and create user lucifer with its role and privileges. As you can see in the last statement, we grant the extra role rds_iam to lucifer to make it eligible for IAM auth.

### Create role with read write access

CREATE ROLE readwrite;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO readwrite;

### Create DB user 

CREATE USER lucifer WITH LOGIN;

### Assign role to user 

GRANT readwrite TO lucifer;

### Make user IAM login eligible 

GRANT rds_iam TO lucifer;
        

4. To test the user login via an IAM token, create a shell script with the content below. You will also need to download the SSL root certificate bundle for your region from the AWS documentation.

hostname=aurora-db-postgres.cluster-XXXXXXXXX.us-west-2.rds.amazonaws.com
portNumber=5432
dbName=your_db_name
user="lucifer"

export PGPASSWORD="$(aws rds generate-db-auth-token \
  --hostname "$hostname" \
  --port "$portNumber" \
  --region us-west-2 \
  --username "$user")"


psql "host=${hostname} port=${portNumber} sslmode=verify-full sslrootcert=./us-west-2-bundle.pem dbname=${dbName} user=${user}" \
      -c 'select * from public.jdbc_test'
        

As you can see, we are using an AWS IAM token as the password to connect to the database and perform a select.

If some runtime in the AWS ecosystem needs a JDBC connection to the database, it can generate a token via DefaultAWSCredentialsProviderChain like below.

public String generateToken() {
    RdsIamAuthTokenGenerator generator = RdsIamAuthTokenGenerator.builder()
        .credentials(new DefaultAWSCredentialsProviderChain())
        .region(region)
        .build();

    return generator.getAuthToken(
        GetIamAuthTokenRequest.builder()
            .hostname(hostnamePort.getFirst())
            .port(hostnamePort.getSecond())
            .userName(userName)
            .build());
}

However, to allow that, we need to create an AWS IAM role with the policy from step 2 attached. This role can then be assumed by any runtime that needs to access the database.
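What the trust policy for such a role looks like depends on the runtime. Here is a minimal sketch assuming the role is assumed by an EC2 instance (swap the principal for ecs-tasks.amazonaws.com, lambda.amazonaws.com, etc. as appropriate):

```json
{
   "Version": "2012-10-17",
   "Statement": [
      {
         "Effect": "Allow",
         "Principal": { "Service": "ec2.amazonaws.com" },
         "Action": "sts:AssumeRole"
      }
   ]
}
```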

A key point to note is that the IAM token is valid for only 15 minutes, so the service needs logic to refresh it periodically. In Spring Boot it can be handled like below.

@Component
public class RefreshDatabaseCreds {
  private final HikariDataSource dataSource;
  private final AuthTokenProvider authTokenProvider;

  public RefreshDatabaseCreds(DataSource dataSource, AuthTokenProvider authTokenProvider) {
    this.dataSource = (HikariDataSource) dataSource;
    this.authTokenProvider = authTokenProvider;
  }

  // Refresh every 4 minutes by default, well inside the 15-minute token lifetime.
  @Scheduled(fixedRateString = "${cloud.aws.refresh-db-credentials.rate:240000}",
      initialDelayString = "${cloud.aws.refresh-db-credentials.initial-delay:240000}")
  public void run() {
    dataSource.getHikariConfigMXBean().setPassword(authTokenProvider.generateToken());
  }
}
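The AuthTokenProvider injected above is not shown in the article; here is a minimal sketch, assuming it is just a thin abstraction over the generateToken() logic shown earlier (the interface name and the delegating implementation are illustrative, not part of the AWS SDK):

```java
// Illustrative abstraction over IAM token generation: the scheduled
// refresher only needs a way to obtain a fresh token as a String.
interface AuthTokenProvider {
    String generateToken();
}

// Example implementation that delegates to any token source, e.g. the
// RdsIamAuthTokenGenerator-based generateToken() method shown earlier.
class DelegatingAuthTokenProvider implements AuthTokenProvider {
    private final java.util.function.Supplier<String> tokenSource;

    DelegatingAuthTokenProvider(java.util.function.Supplier<String> tokenSource) {
        this.tokenSource = tokenSource;
    }

    @Override
    public String generateToken() {
        return tokenSource.get();
    }
}
```

Note that HikariConfigMXBean.setPassword only applies to connections created after the call; connections already in the pool keep working, which is fine because Postgres validates the token only at connection time.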

AWS has detailed documentation, which you can find in the references below.

Summary: We have learned how to create an IAM policy that allows a DB user to connect to the database via IAM auth, how to use that policy from different runtimes via an IAM role, and how to refresh the AWS IAM token periodically in Spring Boot.

References:

https://aws.amazon.com/blogs/database/managing-postgresql-users-and-roles/

https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/UsingWithRDS.IAMDBAuth.IAMPolicy.html

https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/UsingWithRDS.IAMDBAuth.Enabling.html

#rds #aurora

Thanks,

Deepak Dabi

Happy learning!

Any hiccups you have seen while using Aurora Postgres?