
I have the following problem: I'm trying to create a local cache for a Maven repository that is hosted at a different site within the same company. I set up Artifactory OSS, created a couple of remote repositories pointing at the off-site server, and, generally speaking, everything works as expected — except for a few dependencies that are unusually large. Mind you, these are not JARs with a couple of megabytes of compiled code, but ZIP archives filled with image data; the largest one is 8 GB on its own.

My Gradle build fails every time: my local Artifactory reports that the artifact exists, tries to download it, and then hits a WebSocketTimeout. This does not happen when I access the off-site server directly. It also works fine when I uncheck "store locally", but that is the exact opposite of what I'm trying to achieve.

Any ideas?


1 Answer


There are several timeouts that can be at play here. When you ask Artifactory to download and cache something for you, it will:

  1. Fully download the file
  2. Once complete, it will stream the file to you

This means that for larger files, Artifactory will take a while before sending anything back to the client. This can be a problem if the client (or something in between, like a reverse proxy) has a small timeout. There is also a socket timeout setting on Artifactory itself.
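If the client is Gradle, its HTTP read timeout can be raised through Gradle's internal system properties in `gradle.properties`. A minimal sketch — note these are internal, not officially documented properties, and the names and values here are assumptions to verify against your Gradle version:

```properties
# ~/.gradle/gradle.properties (or the project's gradle.properties)
# Raise Gradle's internal HTTP timeouts; values are in milliseconds.
# Assumed values: 2 minutes to connect, 10 minutes to wait for data.
systemProp.org.gradle.internal.http.connectionTimeout=120000
systemProp.org.gradle.internal.http.socketTimeout=600000
```

Because these properties are internal, they may change between Gradle releases; treat them as a workaround rather than a stable configuration surface.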

If the file appears in Artifactory at some point after the download has failed, the timeout is most likely in the reverse proxy or the client itself, and you should try increasing their timeout settings. If the file never makes it to Artifactory, go to the remote repository settings (UI -> Admin -> Repositories -> Remote -> [repo name] -> Advanced) and increase the Socket Timeout (MS); doubling or tripling the current value is a reasonable start. Make it just large enough to fetch your largest file, but not excessively high, as a very long timeout can let connections to an unresponsive server pile up.
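For reference, the same setting can also be changed without the UI via Artifactory's Update Repository Configuration REST endpoint (`POST /api/repositories/{repoKey}`), which accepts a partial JSON payload; `socketTimeoutMillis` is the JSON counterpart of the Socket Timeout (MS) field. A sketch with an assumed 15-minute value:

```json
{
  "socketTimeoutMillis": 900000
}
```

You could send this with something like `curl -u user:password -X POST -H "Content-Type: application/json" -d @timeout.json http://<artifactory-host>/artifactory/api/repositories/<repo-name>` — host, repository name, and credentials here are placeholders for your own setup.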