amazon s3 - How to download a file from GitHub Enterprise using Terraform?
Here is s3_policy.json:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "mybucket",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::${bucket_name}/*"],
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": [
            "10.xx.xxx.x",
            "172.168.xx.x",
            ........
          ]
        }
      }
    }
  ]
}
```
I have a common repo used by several different projects. The common repo holds a CIDR IP list in YAML format.

I want each Terraform project to pull in and reuse that same file instead of hardcoding the IP addresses, but I'm unable to figure out a way to automate this.
You can consume the IP addresses from a data source and use that instead.

Your policy document would then look something like this:
```hcl
resource "aws_iam_policy" "whitelist_ips" {
  name        = "whitelist_ips"
  description = "${var.policy_description}"

  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "mybucket",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::${var.bucket_name}/*"],
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": ${jsonencode(split(",", data.external.ip_addresses.result["ips"]))}
        }
      }
    }
  ]
}
EOF
}
```
You then need to create an external data source whose program fetches the IP addresses from wherever they live and returns them as a comma-separated string:
```hcl
data "external" "ip_addresses" {
  program = ["python", "${path.module}/get_ips.py"]
}
```
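Note that Terraform's external data source protocol is strict: the program must print a single JSON object whose values are all strings to stdout; a bare string or a JSON list will be rejected. A minimal sketch of the output side (the `build_result` helper and the `ips` key are arbitrary names chosen here, not anything Terraform mandates):

```python
import json


def build_result(cidrs):
    # The external data source requires a flat JSON object whose
    # values are all strings, so the list is joined into one string
    # under a single key.
    return json.dumps({"ips": ",".join(cidrs)})


print(build_result(["1.2.3.4/32", "1.3.0.0/16"]))
```

Terraform then exposes that string as `data.external.ip_addresses.result["ips"]`, which can be split back into a list where it is consumed.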
where get_ips.py might look like this:
```python
#!/usr/bin/env python
from __future__ import print_function

import json
import re

yaml_string = """
- 1.2.3.4/32
- 1.2.3.5/32
- 1.3.0.0/16
"""

result = []
lines = yaml_string.split("\n")

for line in lines:
    # skip empty lines and strip the leading "- " list marker
    if line != "":
        result.append(re.sub(r'\s*-\s*', '', line))

print(json.dumps({"ips": ','.join(result)}))
```
But you would need the script to go and fetch the YAML list from GitHub instead of pointlessly hardcoding it in the data source.
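One way to do that fetch, as a rough sketch: request the raw file over HTTPS from the GitHub Enterprise contents API and parse the flat YAML list with the same regex approach. The host name, repo path, and `GHE_TOKEN` environment variable below are placeholders for illustration, not values from the question:

```python
#!/usr/bin/env python
from __future__ import print_function

import json
import os
import re

try:
    from urllib.request import Request, urlopen  # Python 3
except ImportError:
    from urllib2 import Request, urlopen  # Python 2


def parse_cidrs(yaml_text):
    """Strip the '- ' list markers from a flat YAML list of CIDRs."""
    return ",".join(
        re.sub(r'\s*-\s*', '', line)
        for line in yaml_text.split("\n")
        if line.strip() != ""
    )


def fetch_yaml(url, token):
    """Fetch the raw file contents from a GitHub Enterprise instance.

    Requesting GitHub's raw media type returns the file body directly
    instead of a JSON envelope with base64-encoded content.
    """
    request = Request(url, headers={
        "Authorization": "token %s" % token,
        "Accept": "application/vnd.github.v3.raw",
    })
    return urlopen(request).read().decode("utf-8")


# Usage from the Terraform external data source (placeholder URL):
#   url = "https://github.example.com/api/v3/repos/ops/common/contents/cidrs.yaml"
#   print(json.dumps({"ips": parse_cidrs(fetch_yaml(url, os.environ["GHE_TOKEN"]))}))
```

Keeping the fetch and the parsing in separate functions also makes the parsing step easy to test without network access.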