360
votes

Well, this one seems quite simple, and it is. All you have to do to download a file to your server is:

file_put_contents("Tmpfile.zip", file_get_contents("http://someurl/file.zip"));

There is only one problem: what if you have a large file, like 100 MB? Then you will run out of memory and not be able to download the file.

What I want is a way to write the file to the disk as I am downloading it. That way, I can download bigger files without running into memory problems.

12
That's set in your server configuration; PHP can't really get around it as far as I know (except for a direct .ini edit). – Ben

12 Answers

538
votes

Since PHP 5.1.0, file_put_contents() supports writing piece-by-piece by passing a stream-handle as the $data parameter:

file_put_contents("Tmpfile.zip", fopen("http://someurl/file.zip", 'r'));

From the manual:

If data [that is, the second argument] is a stream resource, the remaining buffer of that stream will be copied to the specified file. This is similar to using stream_copy_to_stream().

(Thanks Hakre.)
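If you want to guard against the remote fopen() failing, a minimal sketch along the same lines (the URL and file names are placeholders) is:

// Sketch: only write the file if the remote stream opened successfully.
$stream = fopen('http://someurl/file.zip', 'rb');
if ($stream !== false) {
    file_put_contents('Tmpfile.zip', $stream);
    fclose($stream);
}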

137
votes
private function downloadFile($url, $path)
{
    $newfname = $path;
    $newf = false;
    $file = fopen($url, 'rb');
    if ($file) {
        $newf = fopen($newfname, 'wb');
        if ($newf) {
            // copy the remote stream to disk in 8 KB chunks so memory use stays constant
            while (!feof($file)) {
                fwrite($newf, fread($file, 1024 * 8));
            }
        }
    }
    if ($file) {
        fclose($file);
    }
    if ($newf) {
        fclose($newf);
    }
}
70
votes

Try using cURL

set_time_limit(0); // unlimited max execution time
$options = array(
  CURLOPT_FILE    => '/path/to/download/the/file/to.zip',
  CURLOPT_TIMEOUT =>  28800, // set this to 8 hours so we don't time out on big files
  CURLOPT_URL     => 'http://remoteserver.com/path/to/big/file.zip',
);

$ch = curl_init();
curl_setopt_array($ch, $options);
curl_exec($ch);
curl_close($ch);

I'm not sure, but I believe that with the CURLOPT_FILE option cURL writes the data as it pulls it in, i.e. it is not buffered.

23
votes

prodigitalson's answer didn't work for me: I got a missing fopen in CURLOPT_FILE error.

This worked for me, including local urls:

function downloadUrlToFile($url, $outFileName)
{
    if (is_file($url)) {
        // local path: a plain copy is enough
        copy($url, $outFileName);
    } else {
        $fp = fopen($outFileName, 'wb');
        $options = array(
          CURLOPT_FILE    => $fp,    // cURL writes each chunk straight to this handle
          CURLOPT_TIMEOUT =>  28800, // set this to 8 hours so we don't time out on big files
          CURLOPT_URL     => $url
        );

        $ch = curl_init();
        curl_setopt_array($ch, $options);
        curl_exec($ch);
        curl_close($ch);
        fclose($fp);
    }
}
19
votes
  1. Create a folder called "downloads" on the destination server
  2. Save this code into a .php file and run it on the destination server

Downloader:

<html>
<form method="post">
<input name="url" size="50" />
<input name="submit" type="submit" />
</form>
<?php
    // maximum execution time in seconds
    set_time_limit (24 * 60 * 60);

    if (!isset($_POST['submit'])) die();

    // folder to save downloaded files to. must end with slash
    $destination_folder = 'downloads/';

    $url = $_POST['url'];
    $newfname = $destination_folder . basename($url);

    $file = fopen($url, "rb");
    $newf = false;
    if ($file) {
      $newf = fopen($newfname, "wb");

      if ($newf) {
        // copy in 8 KB chunks to keep memory use low
        while (!feof($file)) {
          fwrite($newf, fread($file, 1024 * 8));
        }
      }
    }

    if ($file) {
      fclose($file);
    }

    if ($newf) {
      fclose($newf);
    }
?>
</html> 
18
votes
set_time_limit(0);
// note: file_get_contents() loads the whole file into memory before it is written out,
// so this approach needs a memory limit larger than the file being downloaded
$file = file_get_contents('path of your file');
file_put_contents('file.ext', $file);
9
votes

There are 3 ways:

  1. file_get_contents and file_put_contents
  2. CURL
  3. fopen

You can find examples here; a minimal stream-based sketch is shown below.
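For instance, here is a minimal sketch of the fopen/stream approach (the URL and file names are placeholders), which copies the download to disk chunk by chunk instead of buffering the whole file in memory:

// Sketch: stream a remote file to disk using the fopen wrappers.
// 'http://someurl/file.zip' and 'Tmpfile.zip' are placeholder names.
$src = fopen('http://someurl/file.zip', 'rb');
$dst = fopen('Tmpfile.zip', 'wb');
if ($src && $dst) {
    // stream_copy_to_stream() copies in internal chunks, so memory use stays small
    stream_copy_to_stream($src, $dst);
}
if ($src) fclose($src);
if ($dst) fclose($dst);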

9
votes

Use PHP's simple copy() function:

copy($source_url, $local_path_with_file_name);

Note: if the destination file already exists, it will be overwritten.

PHP copy() Function

Note: You need to set permission 777 for the destination folder. Use this method when you are downloading to your local machine.

Special Note: 777 is a permission in Unix-based systems that grants full read/write/execute permission to the owner, the group, and everyone else. In general, this permission is given to assets that do not need to be hidden from the public on a web server, for example an images folder.
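A minimal usage sketch (the URL and path values here are placeholders), checking copy()'s boolean return value:

// Sketch: copy() returns true on success and false on failure.
$source_url = 'http://someurl/file.zip';            // placeholder URL
$local_path_with_file_name = 'downloads/file.zip';  // placeholder path
if (copy($source_url, $local_path_with_file_name)) {
    echo 'Download complete';
} else {
    echo 'Download failed';
}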

5
votes

I use this to download a file:

function cURLcheckBasicFunctions()
{
  // all four functions must be available for the download to work
  if( !function_exists("curl_init") ||
      !function_exists("curl_setopt") ||
      !function_exists("curl_exec") ||
      !function_exists("curl_close") ) return false;
  else return true;
}

/*
 * Returns string status information.
 * Can be changed to int or bool return types.
 */
function cURLdownload($url, $file)
{
  if( !cURLcheckBasicFunctions() ) return "UNAVAILABLE: cURL Basic Functions";

  // defaults for redirect handling and header output
  $redirects = 5;
  $curlopt_header = 0;

  $ch = curl_init();
  if($ch)
  {
    $fp = fopen($file, "wb");
    if($fp)
    {
      if( !curl_setopt($ch, CURLOPT_URL, $url) )
      {
        fclose($fp); // to match fopen()
        curl_close($ch); // to match curl_init()
        return "FAIL: curl_setopt(CURLOPT_URL)";
      }
      curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.11) Gecko/20071204 Ubuntu/7.10 (gutsy) Firefox/2.0.0.11');
      //curl_setopt($ch, CURLOPT_REFERER, 'http://domain.com/');
      if( !curl_setopt($ch, CURLOPT_HEADER, $curlopt_header) ) return "FAIL: curl_setopt(CURLOPT_HEADER)";
      // CURLOPT_FILE makes cURL write the response straight to the file handle
      if( !curl_setopt($ch, CURLOPT_FILE, $fp) ) return "FAIL: curl_setopt(CURLOPT_FILE)";
      // only follow redirects when open_basedir/safe_mode allow it
      if( !ini_get('open_basedir') && !ini_get('safe_mode') && $redirects > 0 )
      {
        if( !curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true) ) return "FAIL: curl_setopt(CURLOPT_FOLLOWLOCATION)";
        if( !curl_setopt($ch, CURLOPT_MAXREDIRS, $redirects) ) return "FAIL: curl_setopt(CURLOPT_MAXREDIRS)";
      }
      else
      {
        if( !curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false) ) return "FAIL: curl_setopt(CURLOPT_FOLLOWLOCATION)";
      }
      if( !curl_exec($ch) ) return "FAIL: curl_exec()";
      curl_close($ch);
      fclose($fp);
      return "SUCCESS: $file [$url]";
    }
    else return "FAIL: fopen()";
  }
  else return "FAIL: curl_init()";
}
4
votes

A PHP 4 & 5 Solution:

readfile() will not present any memory issues, even when sending large files, on its own. A URL can be used as a filename with this function if the fopen wrappers have been enabled.

http://php.net/manual/en/function.readfile.php
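If you need to save the download to disk rather than send it to the browser, one minimal sketch (the URL and file names are placeholders) is to pair readfile() with a chunked output-buffer callback, so each chunk is written to a local file instead of being echoed:

// Sketch: write readfile()'s output to a local file in roughly 8 KB chunks.
// 'http://someurl/file.zip' and 'Tmpfile.zip' are placeholder names.
$fp = fopen('Tmpfile.zip', 'wb');
ob_start(function ($chunk) use ($fp) {
    fwrite($fp, $chunk); // persist each buffered chunk to disk
    return '';           // return an empty string so nothing is sent to the browser
}, 8192);
readfile('http://someurl/file.zip'); // requires allow_url_fopen
ob_end_flush(); // pushes the final chunk through the callback
fclose($fp);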

0
votes

Simple solution:

<?php 
exec('wget http://someurl/file.zip');
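If the URL comes from a variable, a small sketch (the URL, output file name, and variable names are placeholders) that escapes it before passing it to the shell and checks the exit code:

// Sketch: shell out to wget, writing to a chosen file and checking the result.
$url = 'http://someurl/file.zip'; // placeholder URL
exec('wget -O Tmpfile.zip ' . escapeshellarg($url), $output, $exitCode);
if ($exitCode !== 0) {
    echo "wget failed with exit code $exitCode";
}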
0
votes

Best solution:

Install aria2c on the system and run:

echo exec("aria2c \"$url\"");