
My workflow is email → SendGrid → Griddler, and the Rails app runs on Heroku. All inbound emails have attachments, and some are rather large. I keep getting H12 timeouts on Heroku because the attachments take more than 30 seconds to upload to Google Cloud Storage.

I have used Delayed::Job for everything I can, but I don't think I can pass an attachment from Griddler to a delayed job, since the attachment is ephemeral. A friend suggested moving to retrieving the emails from Gmail instead of using SendGrid and Griddler, but that would be more of a rewrite than I am up for at the moment. In an ideal world, I would be able to pass the attachments to a delayed job, but I don't know if that is possible.

    email_processor.rb

        # select always returns an array, so filter first and iterate
        pdfs = attachments.select { |attachment| attachment.content_type == 'application/pdf' }
        pdfs.each do |att|
          # att is an ActionDispatch::Http::UploadedFile
          # content_type = MIME::Types.type_for(att.to_path).first.content_type
          # later, if we use multiple attachments in a single fax, change num_pages
          fax = @email_address.phone_number.faxes.create(
            sending_email: @email_address.address,
            twilio_to:     result,
            twilio_from:   @email_address.phone_number.number,
            twilio_num_pages: 1
          )
          attachment = fax.create_attachment
          # next two calls should run in a delayed job
          attachment.upload_file!(att)
          # moved to attachment model for testing
          # fax.send!
        end


    file upload from another model

    def upload_file!(file)
      # file should be an ActionDispatch::Http::UploadedFile
      filename = file.original_filename.gsub(/\s+/, '_')
      filename = filename[0..15] if filename.size > 16
      path = "fax/#{fax.id}/att-#{id}-#{filename}"
      upload_out!(file.open, path)

      # self.fax.send!
      # make_thumbnail_pdf!(file.open)
    end

    def upload_out!(file, path)
      upload = StorageBucket.files.new(key: path, body: file, public: true)
      upload.save # upload the file
      update_columns(url: upload.public_url)
      fax.update(status: 'outbound-uploaded')
      fax.process!
    end
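For what it's worth, one way around the ephemeral-file problem is to read the attachment's bytes while the request is still alive and hand the *encoded contents* to the job as a plain argument, instead of the `UploadedFile` itself. The `EncodedAttachment` module below is a hypothetical sketch, not part of the app above; whether it is practical depends on attachment size, since the payload ends up serialized into the jobs table:

```ruby
require 'base64'
require 'stringio'

# Hypothetical helper: turns an uploaded file into job-safe arguments and
# back. In the real app the hash would be passed to a delayed job, e.g.
#   Attachment.delay.process(EncodedAttachment.serialize(name, att))
module EncodedAttachment
  def self.serialize(filename, io)
    {
      'filename' => filename,
      'body'     => Base64.strict_encode64(io.read)
    }
  end

  def self.deserialize(payload)
    # Returns the filename plus an IO-like object the upload code can read.
    [payload['filename'], StringIO.new(Base64.strict_decode64(payload['body']))]
  end
end

# Stand-in for the ActionDispatch::Http::UploadedFile Griddler hands over:
fake_upload = StringIO.new("%PDF-1.4 fake pdf bytes")
payload = EncodedAttachment.serialize('scan 1.pdf', fake_upload)
name, io = EncodedAttachment.deserialize(payload)
puts name     # => scan 1.pdf
puts io.read  # => %PDF-1.4 fake pdf bytes
```

The trade-off is that large PDFs bloat the job arguments, which is exactly why the answers below suggest getting the bytes off the web dyno some other way.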

2 Answers


If you cannot receive and upload the attachment in 30 seconds, then Heroku won't work for receiving emails directly. You are right: the ephemeral storage on the web dyno is not accessible from the worker dyno running Delayed::Job.

Even if it were possible for the worker dyno to read data from the web dyno's ephemeral storage, there is no guarantee the web dyno could handle the POST from SendGrid within 30 seconds if the attachments were large enough.

One option is to configure SendGrid to forward the emails directly to Google App Engine: https://cloud.google.com/appengine/docs/standard/python/mail/receiving-mail-with-mail-api

Your App Engine script could write the attachments into Google Cloud Storage and then POST to your Heroku app with the location of each attachment; the web app could then queue a delayed job to download and handle it.
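To make that hand-off concrete, the Heroku side only needs to accept a small payload naming the stored object and enqueue a job from it. The JSON shape and the `parse_attachment_notice` helper below are my own assumptions for illustration, not App Engine's actual format:

```ruby
require 'json'

# Hypothetical callback payload POSTed by the App Engine script after it
# has written the attachment to Google Cloud Storage.
def parse_attachment_notice(raw_body)
  data = JSON.parse(raw_body)
  {
    bucket: data.fetch('bucket'),
    object: data.fetch('object'),
    # Public GCS objects are reachable under this URL scheme.
    url:    "https://storage.googleapis.com/#{data.fetch('bucket')}/#{data.fetch('object')}"
  }
end

# In a controller this would become something like:
#   notice = parse_attachment_notice(request.raw_post)
#   Attachment.delay.download_and_process(notice[:url])
notice = parse_attachment_notice('{"bucket":"my-faxes","object":"fax/42/att-7.pdf"}')
puts notice[:url]
# => https://storage.googleapis.com/my-faxes/fax/42/att-7.pdf
```

Because the download now happens in the worker dyno, the web request returns well inside Heroku's 30-second window.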


I ended up rewriting my email processing completely. I set up Gmail as the mail target, then used a scheduled task on Heroku to process the emails (looking for unread ones) and upload the attachments to Google Cloud Storage. Using a scheduled task avoids the H12 issues entirely, since no web request is involved.
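For anyone taking this route, the scheduled task can be outlined roughly as below. The IMAP steps are a sketch in comments (Gmail also exposes a REST API), and the `{content_type:, filename:, body:}` part shape is an assumption of this example; only the filtering helper is meant literally:

```ruby
# Pure helper: keep only the PDF parts out of whatever the mail library
# hands back after parsing a fetched message.
def pdf_parts(parts)
  parts.select { |p| p[:content_type] == 'application/pdf' }
end

# Outline of the scheduled task (not executed here -- needs credentials):
#
#   require 'net/imap'  # Gmail supports IMAP at imap.gmail.com:993
#   imap = Net::IMAP.new('imap.gmail.com', 993, true)
#   imap.login(ENV['GMAIL_USER'], ENV['GMAIL_PASSWORD'])
#   imap.select('INBOX')
#   imap.search(['UNSEEN']).each do |message_id|
#     raw = imap.fetch(message_id, 'RFC822')[0].attr['RFC822']
#     # parse raw with a mail library, upload pdf_parts(...) to GCS,
#     # then mark the message read so the next run skips it:
#     imap.store(message_id, '+FLAGS', [:Seen])
#   end

parts = [
  { content_type: 'application/pdf', filename: 'fax.pdf',  body: '...' },
  { content_type: 'image/png',       filename: 'logo.png', body: '...' }
]
puts pdf_parts(parts).map { |p| p[:filename] }.join(', ')
# => fax.pdf
```

Marking messages as seen is what makes the task idempotent: a run that crashes mid-way simply leaves the unprocessed messages unread for the next run.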