I use zsh, but bash-compatible would be nice.
I know I can do command1 && command2.
Is it possible to start command1, realise it's going to take a while to complete, but specify a job to run immediately after it finishes (presumably successfully)?
i.e., I want to run command1 && command2, at the point that I've started command1 on its way and forgot I wanted to specify command2.
Or is this not possible?
Thanks guys.
Uh, I'm not entirely sure what you mean.
command1 && command2 will execute command2 after command1 finishes successfully. If you want command2 to run whether command1 failed or not, you can:
command1 ; command2
Ogion
(my-dotfiles)
"People willing to trade their freedom for temporary security deserve neither and will lose both." - Benjamin Franklin
"Enlightenment is man's leaving his self-caused immaturity." - Immanuel Kant
You can try pgrep
if ! pgrep command1; then command2; fi
You can specify the PID or the name of the command, depending on what it actually is.
Hackish, but it may work if you run only one command1 at a time - if you have multiple gzip processes running, you need a slightly different approach.
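For what it's worth, the `if` above only checks once; to actually block until the process disappears, you'd loop. A sketch (`wait_for_name` is just a made-up helper name, and it assumes a single instance of the command, as noted above):

```shell
# Poll until no process with the given name remains, then continue.
# pgrep -x matches the exact process name; sleep keeps the loop cheap.
wait_for_name() {
  while pgrep -x "$1" >/dev/null; do
    sleep 1
  done
}

# e.g. wait for a running gzip to finish, then run the follow-up:
wait_for_name gzip && echo "command2 would run here"
```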
nevermind
Last edited by saline (2010-09-07 15:59:12)
Ogion wrote:Uh, I'm not entirely sure what you mean.
command1 && command2 will execute command2 after command1 finishes successfully. If you want command2 to run whether command1 failed or not, you can:
command1 ; command2
But you still have to run
command1 ; command2
as one - if you've run just command1, you either need to abort it and start anew (which may be a mess) or ...?
command1
Ctrl+Z
fg; command2
Ah, now I understand what you want, after reading Procyon's answer. Never mind my post then.
Ogion
I still think he wants command2 to wait for command1 to finish. I'd use similar job control plus a small function (which could live in your bashrc if you do this sort of thing often):
//blue/0/~/ when_done() { while true; do ps $1 &>/dev/null || return 0; done; }
//blue/0/~/ sleep 20
^Z
[1]+ Stopped sleep 20
//blue/148/~/ bg
[1]+ sleep 20 &
//blue/0/~/ jobs -l
[1]+ 30771 Running sleep 20 &
//blue/0/~/ when_done 30771 && echo 'command2'
[1]+ Done sleep 20
command2
//blue/0/~/
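A variation on that function, in case anyone copies it: the `while true` loop above spins at full CPU between checks. Using `kill -0` (which sends no signal, only tests that the PID exists) plus a one-second sleep behaves the same without the busy-wait. `when_done2` is just a placeholder name for this sketch:

```shell
# kill -0 succeeds while the PID exists and fails once it is gone;
# polling once per second keeps CPU usage near zero.
when_done2() {
  while kill -0 "$1" 2>/dev/null; do
    sleep 1
  done
}

# Usage, mirroring the transcript above:
sleep 2 &
when_done2 "$!" && echo 'command2'
```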
Last edited by brisbin33 (2010-09-07 16:41:32)
//github/
brisbin33 wrote:I still think he wants command2 to wait for command1 to finish. I'd use similar job control plus a small function (which could live in your bashrc if you do this sort of thing often):
//blue/0/~/ when_done() { while true; do ps $1 &>/dev/null || return 0; done; }
//blue/0/~/ sleep 20
^Z
[1]+ Stopped sleep 20
//blue/148/~/ bg
[1]+ sleep 20 &
//blue/0/~/ jobs -l
[1]+ 30771 Running sleep 20 &
//blue/0/~/ when_done 30771 && echo 'command2'
[1]+ Done sleep 20
command2
//blue/0/~/
But it's tied to a PID, so if command1 actually means 'commandA && commandB', that will fail, won't it?
brisbin33 wrote:I still think he wants command2 to wait for command1 to finish. I'd use similar job control plus a small function (which could live in your bashrc if you do this sort of thing often):
//blue/0/~/ when_done() { while true; do ps $1 &>/dev/null || return 0; done; }
//blue/0/~/ sleep 20
^Z
[1]+ Stopped sleep 20
//blue/148/~/ bg
[1]+ sleep 20 &
//blue/0/~/ jobs -l
[1]+ 30771 Running sleep 20 &
//blue/0/~/ when_done 30771 && echo 'command2'
[1]+ Done sleep 20
command2
//blue/0/~/
But it's tied to a PID, so if command1 actually means 'commandA && commandB', that will fail, won't it?
/edit:
I think you're right; after some CLI testing, it seems both my over-complicated way and Procyon's identical, simpler solution (I misread it initially) will fail in the case of two initial commands.
//blue/0/~/ sleep 10 && sleep 10
^Z
[1]+ Stopped sleep 10
//blue/148/~/ fg; echo yay
sleep 10
yay
But I don't think that was the original question, was it?
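For the record, one likely workaround for the compound case (a sketch, not something from the thread): run the whole list in a subshell, so both commands belong to a single job. Ctrl+Z then suspends the entire subshell, and `fg; command2` resumes both parts before command2 runs. The single-job behaviour can also be seen non-interactively:

```shell
# ( ... ) forks one subshell process for the whole list, so job
# control (Ctrl+Z, fg, bg) acts on both commands at once instead of
# abandoning the second one. $! below is a single PID for the pair.
( sleep 1 && echo 'both parts ran' ) &
pid=$!
wait "$pid"     # returns the compound's exit status
echo "compound finished with status $?"
```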
Last edited by brisbin33 (2010-09-07 17:07:31)
//github/
command1
Ctrl+Z
fg; command2
This seems to do exactly what I want, thanks.
Uh, I'm not entirely sure what you mean.
This is my scenario: I've started some long-running command, and now want to leave the computer doing its job. Once it completes, I want it to run another command. I don't really care what the exit status of the first command is.
Suspending the first command then bringing it back to the foreground and executing another command seems to do the job though.
Thanks everyone!
The exit status will tell you whether it completed or aborted. command2 will run after command1 exits - but the work command1 was supposed to do may not actually be complete, so you may want to run some checks against it.
The exit status will tell you whether it completed or aborted. command2 will run after command1 exits - but the work command1 was supposed to do may not actually be complete, so you may want to run some checks against it.
You're right, of course, but for what I want right now I don't really care whether or not it was successful.
If I did, I'd turn to the other solutions given here. I've learned some new stuff today, thanks.
You may find it useful to read this. (It's in French, but there is a fairly self-explanatory diagram if you scroll down.)
You know, as a really hackish way: if command1 only writes output to your terminal and doesn't read anything from it, you can just type the next command over its output and hit Enter. At least my shell will receive what you typed and interpret it once it gets control back (i.e. once command1 is done):
ogion@Gont ~ % sleep 10
echo yay
ogion@Gont ~ % echo yay
yay
It may sound weird, and it certainly is a hack and not elegant.
Ogion
Last edited by Ogion (2010-09-07 19:38:59)
Overkill for this, but for future reference, look into coprocesses (the "coproc" builtin) and the "wait" builtin in bash - I don't know how bash-specific they are.
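For the `wait` half of that suggestion (`wait` itself is POSIX, so it works in zsh too), a minimal sketch: start command1 in the background, or Ctrl+Z it and `bg` it, then `wait` blocks until it exits and hands back its exit status, so `&&` chaining still works:

```shell
# Start the long-running command in the background.
sleep 2 &
pid=$!          # PID of the most recent background job

# wait blocks until that job exits and returns its exit status,
# so you can still chain with && or inspect $? afterwards.
wait "$pid" && echo 'command2'
```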
"...one cannot be angry when one looks at a penguin." - John Ruskin
"Life in general is a bit shit, and so too is the internet. And that's all there is." - scepticisle