Depending on your data, I see several options for exposing it to AWS Batch jobs / containers:
- For key/value pairs, expose the data via environment variables: use the containerOverrides parameter of the SubmitJob API call, or the AWS CLI:
aws batch submit-job --container-overrides vcpus=integer,memory=integer,command=string,string,environment=[{name=EnvVariableName,value=EnvVariableValue},{name=string,value=string}]
Or just write your parameters to a JSON file and pass it with --cli-input-json file://my_file.json (note the file:// prefix).
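To illustrate the two variants above, here is a small sketch that converts a plain dict into the name/value list shape AWS Batch expects for containerOverrides.environment, and writes a skeleton file usable with --cli-input-json. The function names and the job name/queue/definition values are made up for the example:

```python
import json

def to_batch_environment(env):
    """Convert a plain dict to the [{name, value}, ...] list AWS Batch expects."""
    return [{"name": k, "value": str(v)} for k, v in sorted(env.items())]

def write_submit_job_json(path, job_name, job_queue, job_definition, env):
    """Write a skeleton for `aws batch submit-job --cli-input-json file://<path>`."""
    payload = {
        "jobName": job_name,
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        "containerOverrides": {"environment": to_batch_environment(env)},
    }
    with open(path, "w") as f:
        json.dump(payload, f, indent=2)
    return payload

# Example: write_submit_job_json("job.json", "my-job", "my-queue", "my-def",
#                                {"STAGE": "prod", "DB_HOST": "db.internal"})
```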
- If you have large data, store it in S3, hand over the S3 address at submit time (e.g. via an environment variable), and pull the data during startup; the AWS Batch job's role needs permission to fetch the data from S3.
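A minimal startup sketch for that pattern: read an S3 URI from an environment variable, split it into bucket and key, and download the object with boto3. The env var name INPUT_S3_URI and the destination path are assumptions for the example; the boto3 import is deferred so the parsing helper works without AWS credentials:

```python
import os
from urllib.parse import urlparse

def parse_s3_uri(uri):
    """Split s3://bucket/key into (bucket, key)."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an S3 URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")

def fetch_input(env_var="INPUT_S3_URI", dest="/tmp/input.dat"):
    """Download the job's input at container startup (needs s3:GetObject on the bucket)."""
    bucket, key = parse_s3_uri(os.environ[env_var])
    import boto3  # deferred: only needed inside the Batch job
    boto3.client("s3").download_file(bucket, key, dest)
    return dest
```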
- For secrets, store them in AWS Secrets Manager, allow your AWS Batch job role the secretsmanager:GetSecretValue action, and pull the secret from AWS at startup.
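As a sketch of that last option: the IAM policy statement the job role needs, plus a startup helper that fetches and decodes a JSON secret. The resource ARN pattern and secret layout are placeholders, and the boto3 import is again deferred:

```python
import json

# Minimal IAM policy statement for the job role (resource ARN is a placeholder)
SECRETS_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "secretsmanager:GetSecretValue",
        "Resource": "arn:aws:secretsmanager:*:*:secret:my-app/*",
    }],
}

def load_secret(secret_id):
    """Fetch a secret at container startup and decode its JSON SecretString."""
    import boto3  # deferred: only needed inside the Batch job
    resp = boto3.client("secretsmanager").get_secret_value(SecretId=secret_id)
    return json.loads(resp["SecretString"])
```

Attach the policy to the job's execution/job role rather than baking credentials into the image.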