Question: Kill nohup bash process
genomes_and_MGEs wrote (9 months ago):

Hey guys,

I ran this

nohup bash -c 'for next in $(cat all_bacteria_links.txt); do wget -P all_bacteria "$next"/*genomic.fna.gz; done'

and now I want to kill the process. I ran the command over an SSH session, and whenever I try kill PID or kill -9 PID, the server connection closes. Can you please let me know how I can kill this process? Thanks!

Tags: sequence, genome
modified 9 months ago by Santosh Anand • written 9 months ago by genomes_and_MGEs

Not a bioinformatics question. Still, this might help you: https://stackoverflow.com/questions/17385794/how-to-get-the-process-id-to-kill-a-nohup-process

written 9 months ago by Santosh Anand

You're killing the wrong bash process.

written 9 months ago by Devon Ryan

Do ps -ef | grep all_bacteria_links and kill the right process.

written 9 months ago by WouterDeCoster
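To make the suggestion concrete, here is a runnable sketch of the find-then-kill sequence. A dummy sleep job stands in for the real download loop, so it is safe to try:

```shell
# Start a dummy background job standing in for the nohup'd loop.
sleep 300 &
pid=$!

# Find it as suggested: ps -ef, grep for a distinctive string from its
# command line, drop the grep itself, and take the PID (column 2).
found=$(ps -ef | grep 'sleep 300' | grep -v grep | awk '{print $2}' | head -n1)

# Kill the right process and reap it.
kill "$found"
wait "$pid" 2>/dev/null
```

With the real loop, the distinctive string would be all_bacteria_links, as suggested above.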

Pro tip: use a decoy character class in the grep that follows the ps, to avoid listing the grep itself as one of the candidates.

ps -ef | grep "[a]ll_bacteria_links"

The pattern [a]ll doesn't match the literal string "[a]ll", so the grep process won't show up in its own results.

modified 9 months ago • written 9 months ago by RamRS
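The trick is easy to see in isolation. Here printf stands in for the ps output, feeding grep two illustrative lines: one from the real process and one from the grep itself (whose command line carries the bracketed pattern):

```shell
# Two candidate lines: the real process, and the grep's own command line.
printf 'bash -c wget all_bacteria_links.txt\ngrep [a]ll_bacteria_links\n' \
    | grep "[a]ll_bacteria_links"
# Prints only the first line: the regex [a]ll_bacteria_links matches
# "all_bacteria_links" but not the literal "[a]ll_bacteria_links".
```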

The OP ran the command in a rather convoluted way, wrapping a bash shell in nohup, so the process to kill is that bash process.

written 9 months ago by Santosh Anand

I believe there is a specific corner in hell for people who do this, no?

written 9 months ago by WouterDeCoster

It doesn't work, because the PID keeps changing on the server (each wget in the loop is a new process)... So, I found a solution:

pkill -STOP -P the_ppid

Thanks anyway guys!

written 9 months ago by genomes_and_MGEs
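For reference, pkill -P matches by parent PID, so it catches the children no matter how often they are replaced. A small sketch of what -STOP actually does (note it only pauses the processes; a later SIGTERM/SIGKILL, or -CONT to resume, is still needed):

```shell
# A child standing in for one of the loop's wget processes.
sleep 300 &
pid=$!

# Stop (pause) all children of this shell named "sleep".
pkill -STOP -P $$ sleep
sleep 1
state=$(ps -o stat= -p "$pid")   # contains "T": the child is stopped
echo "$state"

# Finish the job: actually terminate the stopped child and reap it.
pkill -KILL -P $$ sleep
wait "$pid" 2>/dev/null
```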
shenwei356 (China) wrote, 9 months ago:

Hi, I'd recommend using screen instead of nohup; screen is safer.

Also, you'd better use a batch tool like parallel to accelerate thousands of jobs, which here are wget downloads.

Using wget -c to resume unfinished downloads is also recommended.

Here's my method for this case, from Manipulation on NCBI refseq bacterial assembly summary:

$ cat mt.tsv | csvtk cut -t -f ftp_path | sed 1d \
    | rush -v prefix='{}/{%}' \
        '   wget -c {prefix}_genomic.fna.gz; \
            wget -c {prefix}_genomic.gbff.gz; \
            wget -c {prefix}_genomic.gff.gz; \
            wget -c {prefix}_cds_from_genomic.fna.gz; \
            wget -c {prefix}_protein.faa.gz; \
        ' \
        -j 10 -c -C download.rush

If the process dies for some reason, like a PC reboot or a network interruption, just re-run the command; it will skip the finished jobs recorded in the file download.rush.

modified 9 months ago • written 9 months ago by shenwei356

THIS is the right solution!

written 9 months ago by Santosh Anand
Powered by Biostar version 2.3.0