234 votes

This is similar to this question, but I want to include the path relative to the current directory in Unix. If I do the following:

ls -LR | grep .txt

It doesn't include the full paths. For example, I have the following directory structure:

test1/file.txt
test2/file1.txt
test2/file2.txt

The code above will return:

file.txt
file1.txt
file2.txt

How can I get it to include the paths relative to the current directory using standard Unix commands?

This just shows that ls is missing this feature. - Arjun Singri
It's a shame all of these solutions require find or tree. I'm ssh'ing into an android device where I appear to only have ls, and none of these other tools :/ - Ponkadoodle

14 Answers

320 votes

Use find:

find . -name \*.txt -print

On systems that use GNU find, like most GNU/Linux distributions, you can leave out the -print.
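With the directory layout from the question, for example, this prints the paths relative to the current directory (order may vary):

$ find . -name \*.txt -print
./test1/file.txt
./test2/file1.txt
./test2/file2.txt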

74 votes

Use tree, with -f (full path) and -i (no indentation lines):

tree -if --noreport .
tree -if --noreport directory/

You can then use grep to filter out the ones you want.
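For example, to keep only the .txt files from the question, you could pipe it through grep like this:

tree -if --noreport . | grep '\.txt$'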


If the command is not found, you can install it:

Type the following command to install tree on RHEL/CentOS and Fedora Linux:

# yum install tree -y

If you are using Debian/Ubuntu or Mint Linux, type the following command in your terminal:

$ sudo apt-get install tree -y
25 votes

Try find. You can look it up exactly in the man page, but it's sorta like this:

find [start directory] -name [what to find]

so for your example

find . -name "*.txt"

should give you what you want.

9 votes

You could use find instead:

find . -name '*.txt'
5 votes

That does the trick:

ls -R1 $PWD | while read l; do case $l in *:) d=${l%:};; "") d=;; *) echo "$d/$l";; esac; done | grep -i ".txt"

But it does that by "sinning" with the parsing of ls, which is considered bad form by the GNU and Ghostscript communities.
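The same parsing, spread over several lines for readability (a sketch of the identical logic; ls -R1 prints each directory as a "dir:" header followed by its entries):

ls -R1 "$PWD" | while read l; do
  case $l in
    *:) d=${l%:} ;;      # a directory header such as "/path/to/dir:" -- remember it
    "") d= ;;            # a blank line separates directory blocks
    *)  echo "$d/$l" ;;  # prefix the remembered directory to the file name
  esac
done | grep -i ".txt"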

4 votes
DIR=your_path
find "$DIR" | sed "s:$DIR::"

'sed' will strip 'your_path' from every 'find' result, and you receive paths relative to 'DIR'.
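A quick sketch with the question's layout, assuming it lives under the hypothetical path /home/user/project; adding -name '*.txt' keeps the directories themselves out of the output (note the stripped paths keep a leading slash):

$ DIR=/home/user/project
$ find "$DIR" -name '*.txt' | sed "s:$DIR::"
/test1/file.txt
/test2/file1.txt
/test2/file2.txt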

4 votes

To get the actual full path names of the desired files using the find command, combine it with the pwd command:

find $(pwd) -name \*.txt -print
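With the question's layout and a hypothetical working directory of /home/user, $(pwd) expands to /home/user and the output becomes absolute:

$ find $(pwd) -name \*.txt -print
/home/user/test1/file.txt
/home/user/test2/file1.txt
/home/user/test2/file2.txt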
1 vote

Here is a Perl script:

use strict;
use warnings;

# Reconstruct relative paths from `ls -LR` output: remember the most recent
# "directory:" header and prefix it to each file name that follows.
sub format_lines($)
{
    my $refonlines = shift;
    my @lines = @{$refonlines};
    my $tmppath = "-";

    foreach (@lines)
    {
        next if ($_ =~ /^\s+/);        # skip blank and indented lines
        if ($_ =~ /(^\w+(\/\w*)*):/)   # a directory header such as "test2:"
        {
            $tmppath = $1 if defined $1;
            next;
        }
        print "$tmppath/$_";
    }
}

sub main()
{
    my @lines = ();

    while (<>)
    {
        push(@lines, $_);
    }
    format_lines(\@lines);
}

main();

usage:

ls -LR | perl format_ls-LR.pl
1 vote

You could create a shell function, e.g. in your .zshrc or .bashrc:

filepath() {
    echo "$PWD/$1"
}

filepath2() {
    for i in "$@"; do
        echo "$PWD/$i"
    done
}

The first one would work on single files only, obviously.
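A quick usage sketch (the directory is hypothetical; both functions simply prefix $PWD, so the arguments must be relative to the current directory):

$ cd /home/user
$ filepath test1/file.txt
/home/user/test1/file.txt
$ filepath2 test2/*.txt
/home/user/test2/file1.txt
/home/user/test2/file2.txt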

1 vote

Find the file called "filename" on your filesystem, starting the search from the root directory "/":

find / -name "filename" 
1 vote

If you want to preserve the details that come with ls (file size, etc.) in your output, then this should work:

sed "s|<OLDPATH>|<NEWPATH>|g" input_file > output_file
1 vote

In the fish shell, you can do this to list all pdfs recursively, including the ones in the current directory:

$ ls **pdf

Just remove 'pdf' if you want files of any type.
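For the .txt files from the question, the equivalent would be something like this (assuming fish's recursive ** globbing, which is enabled by default):

$ ls **.txt
test1/file.txt  test2/file1.txt  test2/file2.txt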

0 votes

You can implement this functionality with the find command alone: point it at the targeted directory and filter the results by name. From your case, it sounds like the filenames always start with "file" and end with ".txt":

find /some/path/here -name 'file*.txt'   (* represents a wildcard)
0 votes

In my case, with the tree command:

Relative path

tree -ifF ./dir | grep -v '^./dir$' | grep -v '.*/$' | grep '\./.*' | while read file; do
  echo $file
done

Absolute path

tree -ifF ./dir | grep -v '^./dir$' | grep -v '.*/$' | grep '\./.*' | while read file; do
  echo $file | sed -e "s|^.|$PWD|g"
done