Question: Error using Perl module Parallel::ForkManager
josquin.daron wrote:

Hi everyone,

I am getting the following error message when using the Perl module Parallel::ForkManager, and I can't figure out what is wrong in my code.

What is weird is that the program runs fine for the first 70 BAM files it sends to the different threads, and then it outputs the error below.

Any help would be much appreciated.

Thanks a lot. :)

Here are the errors:

Cannot fork: Cannot allocate memory at /usr/local/perl-5.24.0/lib/site_perl/5.24.0/Parallel/ForkManager.pm line 52, line 8260.

The storable module was unable to store the child's data structure to the temp file "/tmp/1341650.1.longjob.q/dZnCgF1MiR/Parallel-ForkManager-28852-28855.txt": can't create /tmp/1341650.1.longjob.q/dZnCgF1MiR/Parallel-ForkManager-28852-28855.txt: No such file or directory at /usr/local/perl-5.24.0/lib/site_perl/5.24.0/Parallel/ForkManager.pm line 84.

Here is my code:

if ($threads == 1) {
    for my $id (0 .. $nbInputFiles) {
        $self->{$id}->{_currentFile} = $self->{inputFiles}[$id] ;
        $self->readBamInputFile($id) ;
        $self->correctMatrice($id) ;
        print STDERR "INFO $prog $datestring: $self->{$id}->{_currentFile} -> correcting matrice done !\n" ;
    }
}
else {
    my $pm = new Parallel::ForkManager($threads);
    $pm->set_waitpid_blocking_sleep(0);

    $pm->run_on_finish(sub {
        my ($pid, $exit, $id, $signal, $core, $data) = @_;

        $self->{$id}->{cov} = $data->{ret}->{$id}->{cov} ;
        $self->{$id}->{_currentSample} = $data->{ret}->{$id}->{_currentSample} ;
        $self->correctMatrice($id) ;
        print STDERR "INFO $prog $datestring: $data->{ret}->{$id}->{_currentFile} -> correcting matrice done !\n" ;

        $ret[$id] = delete $data->{ret};
        $err[$id] = delete $data->{err};
    });

    for my $id (0 .. $nbInputFiles) {
        $self->{$id}->{_currentFile} = $self->{inputFiles}[$id] ;
        $pm->start($id) and next;

        # here do the job
        # my $res = eval { $task[0]->($self) };
        my $res = eval { &readBamInputFile($self, $id) };
        $pm->finish(0, { ret => $self, err => $@ });
    }
    $pm->wait_all_children;
}
Tags: multithreading, perl

h.mon wrote:

Cross-posted at StackExchange: https://stackoverflow.com/questions/51113439/

@josquin.daron: please don't cross-post; it is generally discouraged because it risks annoying people in both communities.
Nitin Narwade wrote:

What is the size of your system's /tmp/ folder?

It seems that while processing the BAM files, Parallel::ForkManager is by default storing its temporary files in the system's /tmp/ folder. As you said, it runs fine for 70 BAM files but throws an error for the rest of the files.

Because the program is still running, it does not wipe the temporary data it has already generated, and it becomes unable to store the temporary files needed to process the remaining BAM files once there is no space left in the /tmp/ folder.

If you could remove the temporary files (stored in the /tmp/ folder), either in the loop or in your subroutine, space would be freed and hopefully the error would no longer be thrown.
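
To illustrate that suggestion, here is a minimal sketch (not the poster's actual pipeline; the /scratch path, the BAM file list, and the summary hash are assumptions made for the example). Parallel::ForkManager's constructor accepts an optional second argument: the directory where it writes the Storable temp files that carry each child's return data. Pointing that at a filesystem with enough free space, letting File::Temp clean it up at exit, and returning only a small structure from each child keeps /tmp from filling up:

#!/usr/bin/env perl
use strict;
use warnings;
use Parallel::ForkManager;
use File::Temp qw(tempdir);

my $threads   = 4;                  # hypothetical values
my @bam_files = glob("*.bam");

# Scratch directory on a filesystem with free space (here assumed to be
# /scratch); CLEANUP => 1 deletes it, and everything in it, at exit.
my $scratch = tempdir("forkmgr_XXXXXX", DIR => "/scratch", CLEANUP => 1);

# Second constructor argument = directory for the Storable temp files
# that carry each child's return data back to the parent.
my $pm = Parallel::ForkManager->new($threads, $scratch);

$pm->run_on_finish(sub {
    my ($pid, $exit, $id, $signal, $core, $data) = @_;
    # Only a small per-file summary comes back, so the parent never has
    # to deserialize or hold a large object per child.
    print STDERR "file $id done: $data->{summary}\n" if $data;
});

for my $id (0 .. $#bam_files) {
    $pm->start($id) and next;                      # parent moves on to the next file
    my $summary = "processed $bam_files[$id]";     # placeholder for the real work
    $pm->finish(0, { summary => $summary });       # return a small hashref only
}
$pm->wait_all_children;

In the code from the question, finish() passes back the entire $self object, so each child's temp file (and the parent's copy of the data) can be large; returning only what the parent actually needs, as above, may also help with the "Cannot allocate memory" part of the error.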


vsanderl wrote:

How would one accomplish this?