Passing event data to an AWS ECS task
This short document shows how to configure an AWS Events target to accept event data.
Scenario: a CloudWatch Events (now called EventBridge) rule is configured to listen for an S3 bucket "PutObject" event. The event target is an ECS task that accepts two event parameters (the S3 key name and the S3 event type) and uses them as environment variables in the ECS container task definition.
How to configure s3 events?
Go to the AWS Events console or create the event pattern programmatically, as below.
{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["PutObject"],
    "requestParameters": {
      "bucketName": ["mybucketxxx"]
    }
  }
}
//replace mybucketxxx with your bucket.
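If you create the rule programmatically, the pattern above has to be passed as a JSON string. A minimal sketch of building it in Python (the dict can then be handed to `events.put_rule(Name='my-test-rule', EventPattern=event_pattern_json)`; the rule name and bucket name are this post's placeholders):

```python
import json

# Same event pattern as above, built as a Python dict so it can be
# serialized and passed as the EventPattern argument of put_rule.
pattern = {
    "source": ["aws.s3"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["s3.amazonaws.com"],
        "eventName": ["PutObject"],
        "requestParameters": {"bucketName": ["mybucketxxx"]},
    },
}

event_pattern_json = json.dumps(pattern)
```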
Note: make sure CloudTrail is configured to record data events for this bucket and that a CloudWatch log group is configured for the trail. The log group receives the full JSON event, and CloudWatch Events (a.k.a. EventBridge) uses this event to evaluate the rule.
How to configure the target for the rule?
Do not configure this from the AWS UI: there is no "Input Transformer" option for an ECS task target there, so you cannot use the event data and pass it to the ECS task. If you need to do that, use a Lambda function to create the target (or CloudFormation or Terraform).
import boto3

def create_eventbridge_target():
    try:
        client = boto3.client('events')
        response = client.put_targets(
            Rule='my-test-rule',
            Targets=[
                {
                    'Id': '1',
                    'Arn': 'arn:aws:ecs:us-east-1:123456789:cluster/test',
                    'RoleArn': 'arn:aws:iam::123456789:role/eventstargetsRole',
                    'InputTransformer': {
                        'InputPathsMap': {
                            'keyname': '$.detail.requestParameters.key',
                            'eventname': '$.detail.eventName'
                        },
                        'InputTemplate': "{\"containerOverrides\": [{\"name\":\"start-container\",\"environment\":[{\"name\":\"EVENTNAME\",\"value\":<eventname>},{\"name\":\"S3_KEY\",\"value\":<keyname>}]}]}"
                    },
                    'EcsParameters': {
                        'TaskDefinitionArn': 'arn:aws:ecs:us-east-1:123456789:task-definition/start-td:11',
                        'TaskCount': 1,
                        'LaunchType': 'FARGATE',
                        'NetworkConfiguration': {
                            'awsvpcConfiguration': {
                                'Subnets': ['subnet-12341'],
                                'SecurityGroups': ['sg-12345'],
                                'AssignPublicIp': 'DISABLED'
                            }
                        }
                    }
                }
            ]
        )
        return response
    except Exception as e:
        print(e)
Rule: should be the name of the rule you already created the event source for. The Lambda will simply update the existing rule, or create a new one if it does not exist.
Id: a unique id. Use "1" for the first target; you can have multiple targets.
Arn: the ECS cluster ARN.
RoleArn: the ARN of the role that the events.amazonaws.com principal uses to execute the target.
The policy attached to this role is as follows:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"ecs:RunTask"
],
"Resource": [
"*"
],
"Condition": {
"ArnLike": {
"ecs:cluster": "arn:aws:ecs:us-east-1:123456789:cluster/test"
}
}
},
{
"Effect": "Allow",
"Action": "iam:PassRole",
"Resource": [
"*"
],
"Condition": {
"StringLike": {
"iam:PassedToService": "ecs-tasks.amazonaws.com"
}
}
}
]
}
Trust relationship document:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": [
"ecs-tasks.amazonaws.com",
"events.amazonaws.com"
]
},
"Action": "sts:AssumeRole"
}
]
}
What is InputTransformer?
The InputTransformer takes the event data, massages it, and sends it to the target. It is the link between the event source and the target.
'InputTransformer':
{
'InputPathsMap': {"keyname": "$.detail.requestParameters.key","eventname": "$.detail.eventName"},
'InputTemplate': "{\"containerOverrides\": [{\"name\":\"start-container\",\"environment\":[{\"name\":\"EVENTNAME\",\"value\":<eventname>},{ \"name\":\"S3_KEY\",\"value\":<keyname> }]}]}"
},
InputPathsMap can have up to 10 key/value pairs. You can use any value from the S3 source event, which looks like the last code snippet below, as seen in the CloudWatch log group. (Once PutObject is triggered, you should see that event in the CloudWatch log group within a few minutes; log delivery may be delayed, but the target ECS task will usually start in near real time.) For testing, we are taking $.detail.requestParameters.key and $.detail.eventName (both in JSONPath notation of the event data).
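To make the extraction step concrete, here is a rough local illustration of what InputPathsMap does: pull values out of the event JSON by dotted path. EventBridge does this natively; the helper below only mimics the two simple paths used in this post and is not a general JSONPath implementation.

```python
# Minimal sketch of InputPathsMap-style extraction for simple dotted paths.
def extract(event, path):
    node = event
    for part in path.lstrip("$.").split("."):
        node = node[part]
    return node

# Trimmed-down version of the sample CloudTrail event shown later.
sample = {"detail": {"eventName": "PutObject",
                     "requestParameters": {"key": "fortesting.txt"}}}

paths = {"keyname": "$.detail.requestParameters.key",
         "eventname": "$.detail.eventName"}
values = {name: extract(sample, p) for name, p in paths.items()}
# values == {"keyname": "fortesting.txt", "eventname": "PutObject"}
```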
'InputTemplate' is where the data extracted with InputPathsMap is transformed and fed to the ECS task. If you check https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.run_task there is an overrides={} section. So essentially we are using containerOverrides to set up environment variables with the "keyname" and "eventname" values. All the magic happens in the InputTransformer section.
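A local sketch of the template substitution may help. EventBridge replaces the &lt;eventname&gt; and &lt;keyname&gt; placeholders with the extracted values (for string values it supplies the surrounding quotes, which is why the template above leaves the placeholders unquoted); the snippet below imitates that step so you can see the resulting containerOverrides JSON:

```python
import json

# The same template as in the target config (quotes added here manually,
# since we substitute with json.dumps instead of EventBridge).
template = ('{"containerOverrides": [{"name":"start-container",'
            '"environment":[{"name":"EVENTNAME","value":<eventname>},'
            '{"name":"S3_KEY","value":<keyname>}]}]}')

values = {"eventname": "PutObject", "keyname": "fortesting.txt"}
rendered = template
for name, value in values.items():
    rendered = rendered.replace(f"<{name}>", json.dumps(value))

overrides = json.loads(rendered)
env = overrides["containerOverrides"][0]["environment"]
# env[1] == {"name": "S3_KEY", "value": "fortesting.txt"}
```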
All other configuration in the target is self-explanatory.
Once executed, the event rule is ready to use, with both source and target configured.
Now to test: when you put an object into the S3 bucket, you should see an ECS task come up. Check the task details and you should see the two environment variables with their values.
You can now use these values in the Docker application, which is cool.
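Inside the container, the two overridden variables arrive as ordinary environment variables. A minimal sketch of reading them in a hypothetical Python entrypoint (the function name is illustrative, not part of this post's setup):

```python
import os

def describe_event(environ=os.environ):
    """Read the two injected variables; returns None if either is absent."""
    key = environ.get("S3_KEY")
    name = environ.get("EVENTNAME")
    if key and name:
        return f"{name}: {key}"
    return None
```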
Sample event JSON:
{
"eventVersion": "1.07",
"userIdentity": {
"type": "Root",
"principalId": "123456789",
"arn": "arn:aws:iam::123456789:root",
"accountId": "123456789",
"accessKeyId": "xxxxxxxxxxx",
"sessionContext": {
"attributes": {
"creationDate": "xxxxxxxxxxZ",
"mfaAuthenticated": "false"
}
}
},
"eventTime": "xxxxxxxxxxxx",
"eventSource": "s3.amazonaws.com",
"eventName": "PutObject",
"awsRegion": "us-east-1",
"sourceIPAddress": "xxxxxxxxxxx",
"userAgent": "[Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36]",
"requestParameters": {
"X-Amz-Date": "xxxxxxxxx1Z",
"bucketName": "mybucketxxx",
"X-Amz-Algorithm": "AWS4-HMAC-SHA256",
"x-amz-acl": "private",
"X-Amz-SignedHeaders": "content-md5;content-type;host;x-amz-acl;x-amz-storage-class",
"Host": "mybucketxxx.s3.us-east-1.amazonaws.com",
"X-Amz-Expires": "300",
"key": "fortesting.txt",
"x-amz-storage-class": "STANDARD"
},
"responseElements": null,
"additionalEventData": {
"SignatureVersion": "SigV4",
"CipherSuite": "ExxxM-SHA256",
"bytesTransferredIn": 2298,
"AuthenticationMethod": "QueryString",
"x-amz-id-2": "4iDkGfxxxxxxxxxxqB9kjEWs6Wg3Bypd4=",
"bytesTransferredOut": 0
},
"requestID": "2Axxx04D",
"eventID": "24f6xxx132c214b268",
"readOnly": false,
"resources": [
{
"type": "AWS::S3::Object",
"ARN": "arn:aws:s3:::mybucketxx/fortesting.txt"
},
{
"accountId": "123456789",
"type": "AWS::S3::Bucket",
"ARN": "arn:aws:s3:::mybucketxx"
}
],
"eventType": "AwsApiCall",
"managementEvent": false,
"recipientAccountId": "123456789",
"eventCategory": "Data"
}
Hi, thanks for the post. I am currently facing an issue while working with CFT. Despite my attempts, I have been unable to update the environment variable with the event data. This data is crucial for the proper functioning of the task, and I need some guidance to resolve this issue. Can you please help with this? Below is the section of the rule target configuration that I have been working on:
"InputTransformer": { "InputPathsMap": { "payload": "$.detail" }, "InputTemplate": "{\"containerOverrides\":[{\"environment\":[{\"name\":\"TASK_INPUT_PAYLOAD\",\"value\":\"<payload>\"}],\"name\": \"auto-cv\"}]}" }
I would greatly appreciate your assistance in troubleshooting this matter. Thank you.
Roshan - you saved me from a lot of frustration here! Thank you!
How can I resolve this issue? Error: botocore.exceptions.ClientError: An error occurred (AccessDeniedException) when calling the PutTargets operation: User: arn:aws:sts::973*******:assumed-role/S3AccessRole/S3EventFunction is not authorized to perform: iam:PassRole on resource: arn:aws:iam::973*******:role/service-role/S3AccessRole because no identity-based policy allows the iam:PassRole action
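For the AccessDeniedException above: the error means the identity calling put_targets (here the Lambda's execution role) lacks iam:PassRole permission for the role it hands to EventBridge via RoleArn. A sketch of a statement to attach to the calling role's identity policy (the Resource ARN is a placeholder; use the ARN of the role you pass in RoleArn):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": "arn:aws:iam::123456789:role/eventstargetsRole",
      "Condition": {
        "StringEquals": {
          "iam:PassedToService": "events.amazonaws.com"
        }
      }
    }
  ]
}
```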