4
votes

How does one run an AWS Lambda function with layers locally?

My environment:

    +---.aws-sam
        ....
    +---test
    |       app.py
    |       requirements.txt
    |       
    +---dependencies
    |   \---python
    |           constants.py
    |           requirements.txt
    |           sql.py
    |           utils.py
and a deployment template like:
  testFunc:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: test/
      Handler: app.test
      Runtime: python3.6
      FunctionName: testFunc
      Events:
        test:
          Type: Api
          Properties:
            Path: /test
            Method: ANY
      Layers:
        - !Ref TempConversionDepLayer

  TempConversionDepLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: Layer1
      Description: Dependencies
      ContentUri: dependencies/
      CompatibleRuntimes:
        - python3.6
        - python3.7
      LicenseInfo: 'MIT'
      RetentionPolicy: Retain

I can deploy the function correctly and it runs fine on AWS, but whenever I try to run the function locally, it fails with the error message:

`Unable to import module 'app': No module named 'sql'`

I've read every resource I could find about layers and PyCharm, but nothing really helped.

Can anybody lend a hand, please?

Thank you,

1
What does the import statement in app.py look like? – Peter Halverson
@PeterHalverson, I have, for example, `from utils import *`, where utils is dependencies/python/utils.py; on AWS it gets expanded under /opt and is imported correctly. The error is: `Unable to import module 'app': No module named 'utils'` (yesterday I had a similar import, but from the sql.py file; I was hoping to.) – devplayer

1 Answer

3
votes

I was able to get around this issue in PyCharm by adding a symbolic link to the directory that contains the code for the layer, so the layer modules resolve on the local import path.
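
For example, here is a minimal sketch of that workaround, assuming the project tree from the question (layer modules in dependencies/python/, the handler in test/). The script name, link placement, and per-module linking are my own illustrative choices, not part of the original answer; it creates one symlink per layer module next to app.py so that `from utils import *` resolves locally the same way it does from /opt/python on AWS:

    # link_layer.py -- hypothetical helper, run once from the project root
    from pathlib import Path

    root = Path(__file__).resolve().parent
    layer_dir = root / "dependencies" / "python"   # layer source (maps to /opt/python on AWS)
    func_dir = root / "test"                       # directory containing app.py

    # Create a symlink for each layer module next to app.py so local imports resolve.
    for module in layer_dir.glob("*.py"):
        link = func_dir / module.name
        if not link.exists():
            link.symlink_to(module)
            print(f"Linked {link} -> {module}")

Note that creating symlinks on Windows may require administrator rights or Developer Mode. Marking dependencies/python as a Sources Root in PyCharm is another way to get the IDE itself to resolve the imports.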