File this one under: AWS tips and tricks.
While this may seem like an obscure configuration to set up, there are specific cases where it is desirable to run multiple services or APIs within the same ECS task and have them served through separate load balancers. In my scenario, I was serving a GraphQL API and a WebSocket API through separate ports on the same server. One solution is to set up multiple tasks, one for each API, but this would require a separate data store remotely accessible to both, such as Redis. Due to some limitations of the framework being used, and for simplicity's sake, I wanted to start with an in-memory data store within the application, which left me with two APIs in the same application served through two separate ports.
Whatever the motivation for this setup, it's important to first understand what ECS tasks and load balancers support, both in terms of configuration and in terms of deployment limitations.
The good news is that it is possible. The bad news is, it’s not apparent how to achieve it, and there are indeed some limitations.
The most important thing to note is that it is not possible to set this up through the AWS Console; the ECS service will need to be created using the AWS CLI. The second limitation is that, at the time of writing at least, this configuration cannot be deployed with the blue/green deployment mechanism offered within ECS. Both of these limitations were confirmed to me by AWS Support.
If we are fine with these two (albeit significant) limitations, we can dig into the steps required to get it working. Firstly, the load balancers can be created as normal through the AWS Console. Create a target group for each of the two ports that will be exposed on the single ECS task, and set up a listener on each load balancer that forwards to its target group. Copy the ARNs of the two target groups; these (not the load balancer ARNs themselves) are what the service definition below expects.
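If you would rather script this part as well, the equivalent elbv2 CLI calls look roughly like the following sketch. The target group names, ports, and VPC/load balancer identifiers are placeholders for illustration, and target-type "ip" assumes a task using the awsvpc network mode (e.g. Fargate):

# Create one target group per container port
aws elbv2 create-target-group --name graphql-tg --protocol HTTP --port 4000 \
    --vpc-id [VPC ID] --target-type ip
aws elbv2 create-target-group --name websocket-tg --protocol HTTP --port 4001 \
    --vpc-id [VPC ID] --target-type ip

# Add a listener on each load balancer that forwards to its target group
aws elbv2 create-listener --load-balancer-arn [Load Balancer 1 ARN] --protocol HTTP --port 80 \
    --default-actions Type=forward,TargetGroupArn=[Target Group 1 ARN]
aws elbv2 create-listener --load-balancer-arn [Load Balancer 2 ARN] --protocol HTTP --port 80 \
    --default-actions Type=forward,TargetGroupArn=[Target Group 2 ARN]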
In order to create the ECS service with two separate load balancers, create a service definition JSON file that will be used when creating the service through the AWS CLI. Modify it as needed to suit your specific needs:
{
  "serviceName": "[Service Name]",
  "cluster": "[Cluster Name]",
  "loadBalancers": [
    {
      "targetGroupArn": "[Target Group ARN for Load Balancer 1]",
      "containerName": "[Container Name]",
      "containerPort": [Port used for Load Balancer 1]
    },
    {
      "targetGroupArn": "[Target Group ARN for Load Balancer 2]",
      "containerName": "[Container Name]",
      "containerPort": [Port used for Load Balancer 2]
    }
  ],
  "serviceRegistries": [],
  "desiredCount": 1,
  "taskDefinition": "[Task Definition ARN]",
  "deploymentConfiguration": {
    "maximumPercent": 200,
    "minimumHealthyPercent": 100
  },
  "placementConstraints": [],
  "placementStrategy": [],
  "healthCheckGracePeriodSeconds": 250,
  "schedulingStrategy": "REPLICA",
  "deploymentController": {
    "type": "ECS"
  },
  "enableECSManagedTags": true,
  "propagateTags": "NONE",
  "enableExecuteCommand": false
}
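Note that the two containerPort values must match ports that the container exposes in the task definition. As a rough sketch (the port numbers here are only illustrative), the container definition's portMappings would look something like this:

"portMappings": [
  { "containerPort": 4000, "protocol": "tcp" },
  { "containerPort": 4001, "protocol": "tcp" }
]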
Ensure that you have the AWS CLI installed. Then run the ecs create-service command, passing in the definition file, to create the ECS service:
aws ecs create-service --service-name [Service Name] --cli-input-json [Location of definition file, e.g. file://def.json]
It should take a little time to create your service. Open the service in the Console and note that it now has two load balancers associated with it; you can also confirm this from the CLI, as shown below. Hopefully AWS will eventually enable creating this kind of configuration from the Console, as well as the ability to deploy it with its blue/green deployment mechanism.
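To double-check without opening the Console, a describe-services call along these lines (the cluster and service names are placeholders) will list both target groups under loadBalancers:

aws ecs describe-services --cluster [Cluster Name] --services [Service Name] \
    --query "services[0].loadBalancers"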