My Perl script
while (blah)
{
system ("wget $blah");
}
does not die when I press Ctrl-C. Instead, the child wget process dies and the while loop continues.
How can the parent Perl process detect this and terminate?
I prefer qx (backticks) over system. Here is an example using backticks; you can do the same with system.
use English; ## so I can use $CHILD_ERROR rather than $?; see perlvar for more info
qx(exit 1); ## any exit status other than 0 is an error on Linux
print "failed\n" if $CHILD_ERROR; ## you can die here
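To answer the Ctrl-C part directly: perldoc -f system documents that when the child is killed by a signal, the low seven bits of $? hold the signal number, so the parent can check for SIGINT (signal 2) after each system/qx call and bail out. A minimal sketch; a perl one-liner that sends itself SIGINT stands in for wget being interrupted, so the example is self-contained:

```perl
use strict;
use warnings;

## Stand-in for "wget $blah" dying to Ctrl-C: a child that signals itself.
## List form of system avoids the shell, so $? reflects the child directly.
system($^X, "-e", q{kill "INT", $$});

if ($? == -1) {
    warn "failed to execute: $!\n";
}
elsif (my $sig = $? & 127) {      ## low 7 bits of $?: signal that killed the child
    print "child killed by signal $sig\n";
    ## in the real loop you would: die "interrupted\n" if $sig == 2;  ## 2 == SIGINT
}
```

Checking `$? & 127` is why this works where checking the exit status alone does not: a wget killed by Ctrl-C never gets to set a normal exit code.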
However, wget may not always fail: you may be served a 404 page, which will not be picked up as a failure. In that case it is better to make the request from Perl and check the response yourself.
use Mojo::UserAgent; ## use Mojo
my $url = "http://madeup.com/hdhd";
my $res = Mojo::UserAgent->new->get($url)->result;
unless ($res->code == 200) {
print "ERROR\n".$res->message."\n"; ## you can use die here
}
The same as above, using HTTP::Tiny:
use HTTP::Tiny; ## use http tiny
my $url = "https://makeup.com/hdhd";
my $res = HTTP::Tiny->new->get($url);
unless ($res->{success}) {
print "ERROR\n".$res->{status}."\n"; ## you can die here
}
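The same ->{success} check also catches transport-level failures (connection refused, DNS errors), which HTTP::Tiny reports with the pseudo-status 599 and the exception text in ->{content}. A sketch against a deliberately unreachable address, chosen so the example fails fast without touching the network:

```perl
use strict;
use warnings;
use HTTP::Tiny;

## Port 1 on the loopback interface is almost certainly closed,
## so the connection is refused immediately.
my $res = HTTP::Tiny->new(timeout => 5)->get("http://127.0.0.1:1/");

unless ($res->{success}) {
    print "ERROR\n$res->{status}\n";   ## 599 indicates an internal/transport error
}
```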
I'm inclined to do my own fork and manage the process more closely if I want to actually think about the child. Set up various signal handlers, exit with appropriate values, and so on. That gives you fine-grained control over what's happening. That forked process might even exec to become wget. And, since it's a different process, you can send signals to it directly.
But, if wget is the command you want to use, I'd wonder why you weren't using a module to do that work. You'd get a lot more control when you have access to the request and response info.
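A sketch of the fork/exec approach. To keep the example self-contained, a perl one-liner stands in for the exec of wget; in real code the child would `exec "wget", $url` instead, and the SIGINT handler shown is one reasonable policy, not the only one:

```perl
use strict;
use warnings;

my $pid = fork();
die "fork failed: $!\n" unless defined $pid;

if ($pid == 0) {
    ## Child: replace ourselves with the real command.
    ## In real code: exec "wget", $url or die "exec failed: $!\n";
    exec $^X, "-e", "exit 3" or die "exec failed: $!\n";
}

## Parent: forward Ctrl-C to the child, reap it, then quit ourselves.
$SIG{INT} = sub {
    kill "INT", $pid;
    waitpid $pid, 0;
    die "interrupted\n";
};

waitpid $pid, 0;
my $status = $? >> 8;
print "child exited with status $status\n";
```

Because the parent holds the child's PID, it can signal, wait on, or time out the child however it likes, which is exactly the control system() hides from you.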