7
votes

I have a script that works great when run manually. However, when I schedule it in Task Scheduler, the script never ends, so the next time the task tries to run it fails because the prior instance is still running. The script itself takes only a few seconds to complete, whether on its first run or when run manually. Here is the script:

$source = "\\server1\upload"
$destination = "\\server2\upload"
$logfile = "c:\Scripts\fileMover\log.txt"
$table = Get-ChildItem $source -include *
foreach ($file in $table){
    $filename = $file.FullName
    #Write-Host $filename
    try
    {
        Move-Item -LiteralPath $filename -Destination $destination -Force
        $body = "Successfully moved $filename to $destination"
        $subject = "fileMover Succeeded"
    }
    catch
    {
        $body = "Failed to move $filename to $destination"
        $subject = "fileMover Failed"
    }
    finally
    {
        $body | Out-File $logfile -Append -Width 1000 -Encoding ascii
        Send-MailMessage -To "[email protected]" -From "[email protected]" -Subject $subject -SmtpServer "10.1.10.1" -Body $body
        exit
    }
}
exit

The script is scheduled with the following settings:

  • Run whether user is logged on or not (the user account has been granted the "Log on as a batch job" right)
  • Run with highest privileges
  • Trigger: Daily, repeating every 2 minutes
  • Action: Start a program: powershell -file c:\Scripts\upload.ps1

As a workaround, I configured the task to stop automatically after 1 minute. However, I'm concerned that in some circumstances, such as a large number of large files, the script may be terminated before it finishes.

The script needs to run every 2 minutes.

3
Try adding Start-Transcript "C:\Scripts\Upload-Transcript.txt" to the start of your script and see if it's throwing errors when running via Task Scheduler; it may be prompting for email credentials or something and getting stuck there. - colsw
Thanks for the suggestion, Connor. I did as suggested, but no errors were thrown. - Doug
I've narrowed the problem down to Send-MailMessage. When run manually, the script finishes within a few seconds, but when run as a task it takes around a minute per file. Rather than sending a confirmation for each file processed, I rewrote the script to send a single email at the end of the process. That doesn't explain the delay when run as a task, but it has fixed my problem for now. - Doug
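
For reference, a minimal sketch of that single-summary-email rewrite, reusing the placeholder paths and addresses from the question (the $results variable and the "fileMover summary" subject are just examples):

$source = "\\server1\upload"
$destination = "\\server2\upload"
$logfile = "c:\Scripts\fileMover\log.txt"
$results = @()
foreach ($file in Get-ChildItem $source){
    try
    {
        # -ErrorAction Stop makes a failed move raise a catchable error
        Move-Item -LiteralPath $file.FullName -Destination $destination -Force -ErrorAction Stop
        $results += "Successfully moved $($file.FullName) to $destination"
    }
    catch
    {
        $results += "Failed to move $($file.FullName) to $destination"
    }
}
if ($results)
{
    $body = $results -join "`r`n"
    $body | Out-File $logfile -Append -Width 1000 -Encoding ascii
    # One summary message instead of one email per file
    Send-MailMessage -To "[email protected]" -From "[email protected]" -Subject "fileMover summary" -SmtpServer "10.1.10.1" -Body $body
}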

3 Answers

8
votes

I had this problem when trying to execute the .ps1 file directly. Execute PowerShell and feed it your script as a parameter.

Scheduled Task Actions Tab:

Program\script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Add Arguments (optional):
-command & 'E:\PowerShell_scripts\Script_name.ps1'
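
If you'd rather create the task from PowerShell than from the GUI (this assumes the ScheduledTasks module available on Windows 8 / Server 2012 and later), something like the following should produce an equivalent task; the task name and repetition window are just examples:

# Point the action at powershell.exe and pass the script via -Command
$action = New-ScheduledTaskAction -Execute "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -Argument "-Command `"& 'E:\PowerShell_scripts\Script_name.ps1'`""
# Repeat every 2 minutes over a long window (adjust to taste)
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 2) -RepetitionDuration (New-TimeSpan -Days 3650)
Register-ScheduledTask -TaskName "Script_name" -Action $action -Trigger $trigger -RunLevel Highest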

2
votes

For the "Start a program" action, list powershell as the "Program/script" and put the path to the script you want to run in the "Add arguments (optional)" box.

The action never finishes if you list a PowerShell script directly as the "Program/script". Notepad is the default application associated with the .ps1 extension (to help users avoid accidentally running a script by double-clicking it), so the task opens the script in Notepad in the background instead of executing it, and Notepad never closes on its own.
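
To confirm what the "Open" verb for .ps1 actually launches on a given machine (on a default install it usually points at notepad.exe, though custom associations may differ), a quick registry check from PowerShell:

# Look up the ProgID registered for .ps1, then the command its Open verb runs
$progId = (Get-ItemProperty 'Registry::HKEY_CLASSES_ROOT\.ps1').'(default)'
(Get-ItemProperty "Registry::HKEY_CLASSES_ROOT\$progId\Shell\Open\Command").'(default)'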

2
votes

This is going to sound silly, but I initially had the same problem detailed in the other answers: trying to run the .ps1 script directly instead of via powershell.exe. I worked that part out once I saw Notepad opening, but after correcting my mistake the task was still reporting Running.

The solution? Refresh the Task Scheduler view. The task had actually finished; the status column just hadn't updated.