
I have an AWS Auto Scaling group set up with an ELB attached. In general, the APIs deployed on the group's EC2 instances receive requests from the load balancer and the responses reach clients without problems. However, I have now developed a new API that requires the client's IP address, and in the current setup the load balancer replaces the source IP address.

I have gone through this document http://docs.aws.amazon.com/ElasticLoadBalancing/latest/DeveloperGuide/enable-proxy-protocol.html and I understand that this is possible with Proxy Protocol.

I have created the policy with this AWS CLI command:

aws elb create-load-balancer-policy --load-balancer-name LB-autoscale --policy-name my-ProxyProtocol-policy --policy-type-name ProxyProtocolPolicyType --policy-attributes AttributeName=ProxyProtocol,AttributeValue=true

Question: How do I apply this load balancer policy to the backend EC2 servers created by the Auto Scaling group automatically? Whenever the Auto Scaling group launches a new EC2 instance, Proxy Protocol should be enabled for that instance, and the API deployed on that instance should receive the client's original IP address.
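
The same document also shows a second command that attaches the policy to the load balancer's backend port; a sketch of that step, assuming my instances serve the API on instance port 80:

aws elb set-load-balancer-policies-for-backend-server --load-balancer-name LB-autoscale --instance-port 80 --policy-names my-ProxyProtocol-policy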


1 Answer


Proxy Protocol should be enabled for that instance

The problem with the question you have asked is that you don't enable the proxy protocol "on the instance."

The proxy protocol has to be understood by the web server software that is running on the instance, and the software has to be configured to use it.

In the nginx web server, for example, instead of this...

server {
    listen 80;
...

...you would use this...

server {
    listen 80 proxy_protocol;
...

...and the $proxy_protocol_addr built-in variable would contain the client IP, which you could use to set a header to pass the address to downstream services.
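
As a rough sketch of that idea (the listen port, the upstream address, and the header name here are illustrative assumptions, not part of the original setup):

server {
    listen 80 proxy_protocol;

    location / {
        # pass the real client address to the backend application in a header
        proxy_set_header X-Real-IP $proxy_protocol_addr;
        proxy_pass http://127.0.0.1:8080;
    }
}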

According to the standard, if a service is expecting the proxy protocol preamble, it is required to reject any request that does not include it.

The receiver MUST be configured to only receive the protocol described in this specification and MUST not try to guess whether the protocol header is present or not.

http://www.haproxy.org/download/1.5/doc/proxy-protocol.txt

This means that a compliant service configured to use the proxy protocol cannot work without it, while a service that is not configured for it (or is unaware of it, or incompatible with it) will, at best, ignore it and, more commonly, fail entirely, since the payload will be unexpected.

So, without support in your stack, the proxy protocol won't work for you. It isn't handled on your side by the instance or by any AWS component.

On the other hand, for a web service API, you typically shouldn't need it.

An Elastic Load Balancer in HTTP mode will inject an X-Forwarded-For header into each request, containing the client's IP address. Most applications appear to use this mechanism.

If the incoming request already has such a header, the client's IP address is appended to the end, with values comma-separated. When your code finds multiple values, only the rightmost value should be trusted; values to the left of it should be considered "informational only" -- they may be accurate, or they may be forged... but the last one, appended by the ELB itself, can't be tampered with by the client.
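
In nginx, for example, the realip module can apply exactly that rule; a minimal sketch, assuming the ELB nodes fall within 10.0.0.0/8 (substitute your VPC's actual CIDR):

server {
    listen 80;

    # trust X-Forwarded-For only when the request arrives from the load balancer
    set_real_ip_from 10.0.0.0/8;
    real_ip_header X-Forwarded-For;
    real_ip_recursive off;    # off = use the rightmost address, the one the ELB appended

    # $remote_addr now holds the client IP for logging and for the application
}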