22 votes

Hello, I'm looking for a PowerShell script that will merge all CSV files in a directory into one text file (.txt). All CSV files have the same header, which is always stored in the first row of every file. So I need to take the header from the first file, and in the rest of the files the first row should be skipped. I was able to find a batch file that does exactly what I need, but I have more than 4,000 CSV files in a single directory and it takes more than 45 minutes to do the job.

@echo off
ECHO Set working directory
cd /d %~dp0
ECHO Deleting existing combined file
del summary.txt
setlocal ENABLEDELAYEDEXPANSION
set cnt=1
for %%i in (*.csv) do (
  if !cnt!==1 (
    for /f "delims=" %%j in ('type "%%i"') do echo %%j >> summary.txt
  ) else (
    for /f "skip=1 delims=" %%j in ('type "%%i"') do echo %%j >> summary.txt
  )
  set /a cnt+=1
)

Any suggestions on how to create a PowerShell script that would be more efficient than this batch code?

Thank you.

John


11 Answers

49 votes

This will append all the files together, reading them one at a time:

Get-ChildItem "YOUR_DIRECTORY\*.txt" | ForEach-Object {
    [System.IO.File]::AppendAllText("YOUR_DESTINATION_FILE",
        [System.IO.File]::ReadAllText($_.FullName))
}

# Split across separate lines for readability

This one will place a newline at the end of each file's contents, if you need it:

Get-ChildItem "YOUR_DIRECTORY\*.txt" | ForEach-Object {
    [System.IO.File]::AppendAllText("YOUR_DESTINATION_FILE",
        [System.IO.File]::ReadAllText($_.FullName) + [System.Environment]::NewLine)
}

Skipping the first line:

$getFirstLine = $true

Get-ChildItem "YOUR_DIRECTORY\*.txt" | ForEach-Object {
    $filePath = $_

    $lines = Get-Content $filePath
    $linesToWrite = switch ($getFirstLine) {
        $true  { $lines }
        $false { $lines | Select-Object -Skip 1 }
    }

    $getFirstLine = $false
    Add-Content "YOUR_DESTINATION_FILE" $linesToWrite
}

50 votes

If you're after a one-liner, you can pipe each CSV to Import-Csv and then immediately pipe that to Export-Csv. This retains the initial header row and excludes the header rows of the remaining files. It also processes each CSV one at a time rather than loading all of them into memory and then dumping them into your merged CSV.

Get-ChildItem -Filter *.csv | Select-Object -ExpandProperty FullName | Import-Csv | Export-Csv .\merged\merged.csv -NoTypeInformation -Append
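One small caveat: Export-Csv won't create the .\merged folder if it doesn't exist, so you may need something like this first:

New-Item -ItemType Directory -Force -Path .\merged | Out-Null

Writing the output into a subfolder also conveniently keeps merged.csv out of the *.csv glob being read.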

8 votes

Try this; it worked for me:

Get-Content *.csv | Add-Content output.csv

4 votes

This is pretty trivial in PowerShell.

$CSVFolder = 'C:\Path\to\your\files';
$OutputFile = 'C:\Path\to\output\file.txt';

$CSV = Get-ChildItem -Path $CSVFolder -Filter *.csv | ForEach-Object {
    Import-Csv -Path $_.FullName
}

$CSV | Export-Csv -Path $OutputFile -NoTypeInformation -Force;

The only drawback to this approach is that it parses every file. It also loads all the files into memory, so if we're talking about 4,000 files at 100 MB each, you'll obviously run into problems.

You might get better performance with System.IO.File and System.IO.StreamWriter.
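A rough sketch of that idea (PowerShell 5+ syntax, reusing the placeholder paths from above, and assuming every file starts with the same header row):

$writer = [System.IO.StreamWriter]::new($OutputFile)
$first = $true
try {
    foreach ($file in Get-ChildItem -Path $CSVFolder -Filter *.csv) {
        $reader = [System.IO.StreamReader]::new($file.FullName)
        try {
            # The first line of every file is the header
            $header = $reader.ReadLine()
            if ($first) { $writer.WriteLine($header); $first = $false }
            # Stream the remaining lines straight into the open output file
            while ($null -ne ($line = $reader.ReadLine())) {
                $writer.WriteLine($line)
            }
        }
        finally { $reader.Close() }
    }
}
finally { $writer.Close() }

This keeps the output file open for the whole run and holds only one line in memory at a time.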

2 votes

Your batch file is pretty inefficient! Try this one (you'll be surprised :)

@echo off
ECHO Set working directory
cd /d %~dp0
ECHO Deleting existing combined file
del summary.txt
setlocal
for %%i in (*.csv) do set /P "header=" < "%%i" & goto continue
:continue

(
   echo %header%
   for %%i in (*.csv) do (
      for /f "usebackq skip=1 delims=" %%j in ("%%i") do echo %%j
   )
) > summary.txt

How this is an improvement:

  1. for /f ... in ('type "%%i"') requires loading and executing cmd.exe in order to run the type command, capturing its output in a temporary file, and then reading the data back from it, and this is done for each input file. for /f ... in ("%%i") reads the data directly from the file.
  2. The >> redirection opens the file, appends the data at the end, and closes the file, and this is done for each output *line*. The > redirection keeps the file open the whole time.

1 vote

Here is a version that also uses System.IO.File:

$result = "c:\temp\result.txt"
$csvs = Get-ChildItem "c:\temp\*.csv"
# Read the CSV header from the first file and write it out
[System.IO.File]::WriteAllLines($result, [System.IO.File]::ReadAllLines($csvs[0].FullName)[0])
# Read and append each file's contents minus the header
foreach ($csv in $csvs) {
    $lines = [System.IO.File]::ReadAllLines($csv.FullName)
    [System.IO.File]::AppendAllText($result, ($lines[1..$lines.Length] | Out-String))
}

0 votes

I found the previous solutions quite inefficient for large CSV files in terms of performance, so here is a performant alternative that simply appends the files:

cmd /c copy  ((gci "YOUR_DIRECTORY\*.csv" -Name) -join '+') "YOUR_OUTPUT_FILE.csv" 

Afterwards, you will probably want to get rid of the multiple CSV headers; one way is sketched below.
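A minimal sketch of that cleanup (assuming the merged file fits in memory and every header line is byte-for-byte identical):

$merged = "YOUR_OUTPUT_FILE.csv"
$lines = Get-Content $merged
$header = $lines[0]
# Keep the first header line; drop every later line that repeats it verbatim
Set-Content $merged -Value (@($header) + @($lines | Select-Object -Skip 1 | Where-Object { $_ -ne $header }))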

0 votes

The following batch script is very fast. It should work well as long as none of your CSV files contain tab characters, and all source CSV files have fewer than 64k lines.

@echo off
set "skip="
>summary.txt (
  for %%F in (*.csv) do if defined skip (
    more +1 "%%F"
  ) else (
    more "%%F"
    set skip=1
  )
)

The reason for the restrictions is that MORE converts tabs into a series of spaces, and redirected MORE hangs at 64k lines.

-1 votes

$pathin = 'c:\Folder\With\CSVs'
$pathout = 'c:\exported.txt'
$list = Get-ChildItem -Path $pathin -Filter *.csv | Select-Object FullName
foreach ($file in $list) {
    Import-Csv -Path $file.FullName | Export-Csv -Path $pathout -Append -NoTypeInformation
}

-1 votes

# Get the header from the first CSV file and write it to input.csv
Get-ChildItem *.csv | Select-Object -First 1 | Get-Content | Select-Object -First 1 | Out-File -FilePath .\input.csv -Force
# Append each file's content, excluding its first line, to input.csv (skipping input.csv itself)
Get-ChildItem *.csv -Exclude input.csv | ForEach-Object { Get-Content $_ | Select-Object -Skip 1 | Out-File -FilePath .\input.csv -Append }

-4 votes

type *.csv >> folder\combined.csv