142
votes

I'd like to be able to use the result of the last executed command in a subsequent command. For example,

$ find . -name foo.txt
./home/user/some/directory/foo.txt

Now let's say I want to be able to open the file in an editor, or delete it, or do something else with it, e.g.

mv <some-variable-that-contains-the-result> /some/new/location

How can I do it? Maybe using some bash variable?

Update:

To clarify, I don't want to assign things manually. What I'm after is something like built-in bash variables, e.g.

ls /tmp
cd $_

$_ holds the last argument of the previous command. I want something similar, but with the output of the last command.

Final update:

Seth's answer has worked quite well. Couple of things to bear in mind:

  • don't forget to touch /tmp/x when trying the solution for the very first time
  • the result will only be stored if the last command exited successfully
After seeing your edit I thought to delete my answer. I wonder whether there is anything built-in that you are looking for. — taskinoor
I couldn't find anything built-in. I was wondering if it would be possible to implement it, maybe through .bashrc? I think it'd be a pretty handy feature. — armandino
I am afraid all you can do is either redirect the output to a file or a pipe, or capture it; otherwise it won't be saved. — bandi
You can't do that without the cooperation of the shell and the terminal, and they generally don't cooperate. See also How do I reuse the last output from the command line? and Using text from previous commands' output on Unix Stack Exchange. — Gilles 'SO- stop being evil'
One of the main reasons why the output of commands is not captured is because the output can be arbitrarily large - many megabytes at a time. Granted, it's not always that large, but big outputs cause problems. — Jonathan Leffler

22 Answers

72
votes

This is a really hacky solution, but it seems to work most of the time. During testing, I noted it sometimes didn't behave well when a ^C arrived on the command line, though I did tweak it a bit to behave better.

This hack is for interactive mode only, and I am pretty confident that I would not recommend it to anyone. Background commands are likely to cause even less defined behavior than usual. The other answers are a better way of getting at results programmatically.


That being said, here is the "solution":

PROMPT_COMMAND='LAST="`cat /tmp/x`"; exec >/dev/tty; exec > >(tee /tmp/x)'

Set this bash environment variable and issue commands as desired. $LAST will usually have the output you are looking for:

startide seth> fortune
Courtship to marriage, as a very witty prologue to a very dull play.
                -- William Congreve
startide seth> echo "$LAST"
Courtship to marriage, as a very witty prologue to a very dull play.
                -- William Congreve
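
For reference, here is the one-liner spread out with comments (the same code, just annotated with my reading of it):

PROMPT_COMMAND='
  LAST="`cat /tmp/x`"       # before each prompt, pick up what the previous command wrote
  exec >/dev/tty            # point stdout back at the terminal, ending the previous tee
  exec > >(tee /tmp/x)      # then route stdout through a fresh tee that also writes /tmp/x
'
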
92
votes

I don't know of any variable that does this automatically. To do something other than just copy-pasting the result, you can re-run whatever you just did, e.g.

vim $(!!)

Where !! is history expansion meaning 'the previous command'.

If you expect there to be a single filename with spaces or other characters in it that might prevent proper argument parsing, quote the result (vim "$(!!)"). Leaving it unquoted will allow multiple files to be opened at once as long as they don't include spaces or other shell parsing tokens.

19
votes

Bash is kind of an ugly language. Yes, you can assign the output to a variable

MY_VAR="$(find -name foo.txt)"
echo "$MY_VAR"

But you'd better hope that find returned exactly one result and that the result doesn't contain any "odd" characters such as newlines: command substitution strips trailing newlines, and unquoted expansion will mangle any embedded whitespace.

And be careful to quote the variable correctly when you use it!

It's better to act on the file directly, e.g. with find's -execdir (consult the manual).

find -name foo.txt -execdir vim '{}' ';'

or

find -name foo.txt -execdir rename 's/\.txt$/.xml/' '{}' ';'
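
If you really do need the result in a shell variable, a more robust sketch that sidesteps the whitespace pitfalls above (this assumes bash ≥ 4.4 for mapfile -d '' and GNU find for -print0):

# read NUL-delimited results into an array: no word splitting, no glob surprises
mapfile -t -d '' files < <(find . -name foo.txt -print0)
# act on the first match, if there was one
[ "${#files[@]}" -gt 0 ] && mv -- "${files[0]}" /some/new/location
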
13
votes

There is more than one way to do this. One is to use v=$(command), which assigns the output of command to v. For example:

v=$(date)
echo $v

And you can use backquotes too.

v=`date`
echo $v

From Bash Beginners Guide,

When the old-style backquoted form of substitution is used, backslash retains its literal meaning except when followed by "$", "`", or "\". The first backquote not preceded by a backslash terminates the command substitution. When using the "$(COMMAND)" form, all characters between the parentheses make up the command; none are treated specially.
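
A quick illustration of that difference (abc is just an arbitrary test string):

$ echo `echo \\abc`      # the backquote parser consumes one level of backslashes first
abc
$ echo $(echo \\abc)     # $(...) passes the text through untouched; only the inner shell parses it
\abc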

EDIT: After the edit in the question, it seems that this is not the thing that the OP is looking for. As far as I know, there is no special variable like $_ for the output of last command.

12
votes

It's quite easy. Use back-quotes:

var=`find . -name foo.txt`

And then you can use it any time in the future:

echo $var
mv $var /somewhere
12
votes

Disclaimers:

  • This answer is half a year late :D
  • I'm a heavy tmux user
  • You have to run your shell in tmux for this to work

When running an interactive shell in tmux, you can easily access the data currently displayed on a terminal. Let's take a look at some interesting commands:

  • tmux capture-pane: this one copies the displayed data to one of tmux's internal buffers. It can also copy history that's currently not visible, but we're not interested in that now
  • tmux list-buffers: this displays the info about the captured buffers. The newest one will have the number 0.
  • tmux show-buffer -b (buffer num): this prints the contents of the given buffer on a terminal
  • tmux paste-buffer -b (buffer num): this pastes the contents of the given buffer as input

Yeah, this gives us a lot of possibilities now :) As for me, I set up a simple alias: alias L="tmux capture-pane; tmux showb -b 0 | tail -n 3 | head -n 1" and now every time I need to access the last line I simply use $(L) to get it.
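
For example, assuming the alias above is defined and the shell is running inside tmux (the output lines here are illustrative):

$ which python
/usr/bin/python
$ file "$(L)"            # $(L) expands to the last displayed line, /usr/bin/python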

This is independent of the output stream the program uses (be it stdout or stderr), the printing method (ncurses, etc.) and the program's exit code - the data just needs to be displayed.

9
votes

I think you might be able to hack out a solution that involves setting your shell to a script containing:

#!/bin/sh
bash | tee /var/log/bash.out.log

Then if you set $PROMPT_COMMAND to output a delimiter, you can write a helper function (maybe called _) that gets you the last chunk of that log, so you can use it like:

% find lots*of*files
...
% echo "$(_)"
... # same output, but doesn't run the command again
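
I have not tested this end to end, but the delimiter-plus-helper part could look roughly like this (the sentinel string and log path are arbitrary choices):

# in ~/.bashrc: print a sentinel line before every prompt
PROMPT_COMMAND='echo "===CMD-BOUNDARY==="'

# helper: print the text between the last two sentinels,
# i.e. the output of the most recent command
_() {
  awk '/^===CMD-BOUNDARY===$/ {prev = buf; buf = ""; next}
       {buf = buf $0 "\n"}
       END {printf "%s", prev}' /var/log/bash.out.log
}
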
7
votes

You could set up the following alias in your bash profile:

alias s='it=$($(history | tail -2 | head -1 | cut -d" " -f4-))'

Then, by typing 's' after an arbitrary command you can save the result to a shell variable 'it'.

So example usage would be:

$ which python
/usr/bin/python
$ s
$ file $it
/usr/bin/python: symbolic link to `python2.6'
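
For reference, my reading of the alias, piece by piece:

history | tail -2 | head -1    # grab the previous history entry (the line before 's' itself)
cut -d" " -f4-                 # strip the history number, keeping just the command text
it=$( ... )                    # run that text again and capture its output in $it

Note that this re-runs the previous command rather than reusing its earlier output, and since the recalled text is only word-split, not re-parsed, commands containing pipes, quotes or redirections won't re-run faithfully.
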
6
votes

Capture the output with backticks:

output=`program arguments`
echo $output
emacs $output
6
votes

I just distilled this bash function from the suggestions here:

grab() {
  grab=$("$@")
  echo $grab
}

Then, you just do:

> grab date
Do 16. Feb 13:05:04 CET 2012
> echo $grab
Do 16. Feb 13:05:04 CET 2012

Update: an anonymous user suggested replacing echo with printf '%s\n', which has the advantage that it doesn't treat text like -e in the grabbed output as options. So, if you expect or experience such peculiarities, consider this suggestion. Another option is to use cat <<<$grab instead.
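
For completeness, here is the function with that suggestion applied (also quoting the expansion, so embedded newlines in the grabbed output survive):

grab() {
  grab=$("$@")
  printf '%s\n' "$grab"   # unlike echo, printf won't treat grabbed text such as -e as options
}
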

5
votes

By saying "I'd like to be able to use the result of the last executed command in a subsequent command", I assume - you mean the result of any command, not just find.

If thats the case - xargs is what you are looking for.

find . -name foo.txt -print0 | xargs -0 -I{} mv {} /some/new/location/

OR if you are interested to see the output first:

find . -name foo.txt -print0

!! | xargs -0 -I{} mv {} /some/new/location/

This deals with multiple files and works even if the path and/or filename contains spaces.

Notice the mv {} /some/new/location/ part of the command. A command line is built and executed for each item printed by the earlier command, with {} replaced by that item.

Excerpt from man page of xargs:

xargs - build and execute command lines from standard input

For more detail see man page: man xargs

4
votes

I usually do what the others here have suggested ... without the assignment:

$ find . -iname '*.cpp' -print
./foo.cpp
./bar.cpp
$ vi `!!`
2 files to edit

You can get fancier if you like:

$ grep -R "some variable" * | grep -v tags
./foo/bar/xxx
./bar/foo/yyy
$ vi `!!`
2
votes

If all you want is to rerun your last command and get the output, a simple bash variable would work:

LAST=`!!`

So then you can run your command on the output with:

yourCommand $LAST

This will spawn a new process and rerun your command, then give you the output. It sounds like what you would really like is a bash history file for command output. That means you need to capture the output that bash sends to your terminal. You could write something to watch the relevant /dev or /proc entries, but that's messy. You could also just create a "special pipe" between your terminal and bash, with a tee command in the middle that redirects to your output file.
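
The tee idea could look roughly like this in a running shell (a sketch only; the log path is made up):

# route the current shell's stdout through tee, appending a copy to a log file
exec > >(tee -a /tmp/bash.out.log)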

But both of those are kind of hacky solutions. I think the best option would be Terminator, a more modern terminal emulator with output logging. Just check your log file for the results of the last command. A bash variable similar to the one above would make this even simpler.

1
votes

Here's one way to do it after you've executed your command and decided that you want to store the result in a variable:

$ find . -name foo.txt
./home/user/some/directory/foo.txt
$ OUTPUT=`!!`
$ echo $OUTPUT
./home/user/some/directory/foo.txt
$ mv $OUTPUT somewhere/else/

Or if you know ahead of time that you'll want the result in a variable, you can use backticks:

$ OUTPUT=`find . -name foo.txt`
$ echo $OUTPUT
./home/user/some/directory/foo.txt
1
votes

As an alternative to the existing answers: use a while loop if your file names can contain blanks, like this:

find . -name foo.txt | while IFS= read -r var; do
  echo "$var"
done

As I wrote, the difference is only relevant if you have to expect blanks in the file names.

NB: the only built-in of this kind, $?, holds the exit status of the last command, not its output.

1
votes

You can use !!:1. Example:

~]$ ls *~
class1.cpp~ class1.h~ main.cpp~ CMakeList.txt~

~]$ rm !!:1
rm *~


~]$ ls file_to_remove1 file_to_remove2
file_to_remove1 file_to_remove2

~]$ rm !!:1
rm file_to_remove1

~]$ rm !-2:2
rm file_to_remove2
1
votes

I had a similar need: I wanted to use the output of the last command in the next one, much like a pipe (|). E.g. turning

$ which gradle 
/usr/bin/gradle
$ ls -alrt /usr/bin/gradle

into something like

$ which gradle |: ls -altr {}

Solution: I created this custom pipe. It's really simple, using xargs:

$ alias :='xargs -I{}'

It's basically nothing but a shorthand for xargs; it works like a charm and is really handy. I just added the alias to my .bash_profile file.

1
votes

It can be done using the magic of file descriptors and the lastpipe shell option.

It has to be done with a script: the lastpipe option only takes effect when job control is not active, and job control is on by default in interactive shells.
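
(If you want to experiment in an interactive shell anyway, turning job control off first should in principle let lastpipe apply; a sketch I have not verified:)

set +m             # disable job control; lastpipe has no effect while it is active
shopt -s lastpipe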

Here's the script I've tested with:

$ cat shorttest.sh 
#!/bin/bash
shopt -s lastpipe

exit_tests() {
    EXITMSG="$(cat /proc/self/fd/0)"
}

ls /bloop 2>&1 | exit_tests

echo "My output is \"$EXITMSG\""


$ bash shorttest.sh 
My output is "ls: cannot access '/bloop': No such file or directory"

What I'm doing here is:

  1. setting the shell option shopt -s lastpipe. Without it, the last segment of the pipeline runs in a subshell, so the variable assignment is lost.

  2. making sure my stderr also gets captured with 2>&1

  3. piping the output into a function so that the stdin file descriptor can be referenced.

  4. setting the variable by getting the contents of the /proc/self/fd/0 file descriptor, which is stdin.

I'm using this for capturing errors in a script so if there is a problem with a command I can stop processing the script and exit right away.

shopt -s lastpipe

exit_tests() {
    MYSTUFF="$(cat /proc/self/fd/0)"
    BADLINE=$BASH_LINENO
}

error_msg () {
    echo -e "$0: line $BADLINE\n\t $MYSTUFF"
    exit 1
}

ls /bloop 2>&1 | exit_tests ; [[ "${PIPESTATUS[0]}" == "0" ]] || error_msg

In this way I can add 2>&1 | exit_tests ; [[ "${PIPESTATUS[0]}" == "0" ]] || error_msg behind every command I care to check on.

Now you can enjoy your output!

0
votes

This is not strictly a bash solution, but you can pipe through sed to get the last row of the previous command's output.

First, let's see what I have in folder "a":

rasjani@helruo-dhcp022206::~$ find a
a
a/foo
a/bar
a/bat
a/baz
rasjani@helruo-dhcp022206::~$ 

Then, your example with ls and cd turns into something like this with sed and piping:

rasjani@helruo-dhcp022206::~$ cd `find a |sed '$!d'`
rasjani@helruo-dhcp022206::~/a/baz$ pwd
/home/rasjani/a/baz
rasjani@helruo-dhcp022206::~/a/baz$

So the actual magic happens in sed: you pipe the output of whatever command into it, and sed '$!d' deletes every line except the last ($ addresses the last line, ! negates the delete), leaving just the last row, which you can use as a parameter via backticks. You can also combine this with xargs ("man xargs" in your shell is your friend).

0
votes

The shell doesn't have Perl-like special variables that store the output of the last command.

Learn to use the pipe symbol with awk.

find . | awk '{ print "FILE:" $0 }'

For the example in the question, you could do:

find . -name "foo.txt" | awk '{ print "mv "$0" ~/bar/" | "sh" }'
0
votes
find . -name foo.txt 1> tmpfile && mv `cat tmpfile` /path/to/some/dir/

is yet another way, albeit dirty.

0
votes

I find remembering to pipe the output of my commands into a specific file a bit annoying. My solution is a function in my .bash_profile that catches the output in a file and returns the result when you need it.

The advantage of this one is that you don't have to rerun the whole command (with find or other long-running commands, that can be critical).

Simple enough, paste this in your .bash_profile:

Script

# catch stdin, pipe it to stdout and save to a file
catch () { cat - | tee /tmp/catch.out; }
# print whatever output was saved to a file
res () { cat /tmp/catch.out; }

Usage

$ find . -name 'filename' | catch
/path/to/filename

$ res
/path/to/filename

At this point, I tend to just add | catch to the end of all of my commands, because there's no cost to doing it and it saves me having to rerun commands that take a long time to finish.

Also, if you want to open the file output in a text editor you can do this:

# vim or whatever your favorite text editor is
$ vim <(res)