The error message "std::bad_alloc" indicates that bedtools has run out of memory during the intersection operation. This is likely because your file B is approximately 8 GB in size, and bedtools loads the -b file into memory by default when not using the -sorted option.
To resolve this, first ensure that both files A.bed and B.bed are sorted by chromosome and then by start position. You can sort them using the following commands:
sort -k1,1 -k2,2n A.bed > A.sorted.bed
sort -k1,1 -k2,2n B.bed > B.sorted.bed
Then, rerun the intersection with the -sorted option, which enables a memory-efficient algorithm:
intersectBed -a A.sorted.bed -b B.sorted.bed -wa -wb -sorted > out.bed
This should prevent the memory issue, since bedtools will no longer load the entire -b file into RAM. If your system still runs out of memory, consider splitting the large B file by chromosome, or using an alternative tool such as bedops, which may handle large files more efficiently.
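One way to do the per-chromosome split is with awk. A rough sketch (the tiny demo files stand in for your real sorted A and B; the intersect step is guarded so the snippet runs even where bedtools is not installed, and the B_chr*.bed naming is my own):

```shell
# Demo stand-ins for the real sorted inputs.
printf 'chr1\t100\t200\nchr2\t50\t150\n' > A.sorted.bed
printf 'chr1\t120\t180\nchr1\t300\t400\nchr2\t60\t90\n' > B.sorted.bed

# Split the large -b file into one file per chromosome (B_chr1.bed, B_chr2.bed, ...).
awk '{ out = "B_" $1 ".bed"; print > out }' B.sorted.bed

# Intersect A against each per-chromosome chunk, concatenating the results.
# (Guarded: only runs where intersectBed is on the PATH.)
if command -v intersectBed >/dev/null; then
    for chunk in B_chr*.bed; do
        intersectBed -a A.sorted.bed -b "$chunk" -wa -wb -sorted
    done > out.bed
fi
```

Each chunk keeps the sort order of the original file, so the -sorted option remains valid per chunk.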
Kevin
Well, I guess you don't. Try -sorted and find out.

I did it and got "terminate called after throwing an instance of 'std::bad_alloc' what(): std::bad_alloc" this time.

I agree with Carambakaracho that the error is likely due to the process filling up your RAM. Have a look at what happens to the RAM while the task is executing. Another good test is to repeat the task with a substantially reduced dataset (e.g. the first 2000 lines of the B file) and see what happens. If it is just a RAM issue, you could split the second file into two or four chunks, do the work separately, and then merge the resulting files: I don't see a downside to doing this.
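The reduced-dataset test above can be sketched like this (the demo B file stands in for your real sorted input; /usr/bin/time -v is GNU time on Linux, which reports "Maximum resident set size", and the step is guarded so the snippet runs where bedtools is not installed):

```shell
# Demo stand-in for the real sorted B file (3000 intervals on one chromosome).
awk 'BEGIN { for (i = 1; i <= 3000; i++) print "chr1\t" i*10 "\t" (i*10+5) }' > B.sorted.bed
printf 'chr1\t100\t200\n' > A.sorted.bed

# Take a small test slice: the first 2000 lines of the big -b file.
head -n 2000 B.sorted.bed > B.head2000.bed

# Re-run the intersection on the slice while measuring peak memory use.
# (Guarded: only runs where intersectBed is on the PATH.)
if command -v intersectBed >/dev/null; then
    /usr/bin/time -v intersectBed -a A.sorted.bed -b B.head2000.bed -wa -wb > test.out.bed
fi
```

If the slice succeeds where the full file fails, that points squarely at RAM rather than malformed input.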
Thanks for your reply. It works when I use a subset of the B file, so I will split that file into 2-3 files and then run the command.
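A sketch of that split-and-merge workflow, assuming GNU coreutils (split -n l/3 divides a file into 3 roughly equal chunks without breaking lines; the demo B file and the B.part./out. names are my own, and the intersect step is guarded so the snippet runs where bedtools is not installed):

```shell
# Demo stand-in for the real sorted B file (300 intervals on one chromosome).
awk 'BEGIN { for (i = 1; i <= 300; i++) print "chr1\t" i*10 "\t" (i*10+5) }' > B.sorted.bed
printf 'chr1\t100\t200\n' > A.sorted.bed

# Split into 3 roughly equal chunks without splitting any line (GNU split).
split -n l/3 B.sorted.bed B.part.

# Intersect against each chunk, then merge the per-chunk results.
# (Guarded: only runs where intersectBed is on the PATH.)
if command -v intersectBed >/dev/null; then
    for part in B.part.*; do
        intersectBed -a A.sorted.bed -b "$part" -wa -wb > "out.$part"
    done
    cat out.B.part.* > out.bed
fi
```

Because intersect output is independent per -b interval, concatenating the per-chunk outputs gives the same records as one big run.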
std::bad_alloc is the C++ exception thrown when the new operator fails to allocate memory, so a lack of free memory is the most common cause. So far you haven't disclosed anything about the machine you're working on, so I'd guess at the most common cause, too. If you're working on an HPC node with 1 TB of RAM allocated, something else went wrong. For a start, Fabio Marroni gave good troubleshooting advice.
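To see what the machine actually allows before blaming bedtools, two quick checks (free is part of Linux procps and may be absent elsewhere, hence the guard):

```shell
# Per-process virtual-memory limit in KiB, or "unlimited" if none is set.
mem_limit=$(ulimit -v)
echo "ulimit -v: $mem_limit"

# Total vs. available RAM (Linux only; 'free' comes with procps).
if command -v free >/dev/null; then
    free -h
fi
```

A tight ulimit on a cluster job can trigger std::bad_alloc long before physical RAM is exhausted.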
In my case, I was doing something different but got the same error (std::bad_alloc), and the memory was the problem - thanks!