akh22 · 4.1 years ago
Hi,
I installed the following Docker image:
REPOSITORY TAG IMAGE ID CREATED SIZE
zymoresearch/bcl2fastq latest 037f216c2523 13 months ago 117MB
and ran the following command:
docker run -d --name bcl2fastq -v /Volumes/Aura2/bcl_NU/170720_NB501488_0132_AH5V32BGX3:/mnt/run -v /Volumes/Aura2/output:/mnt/out zymoresearch/bcl2fastq:2.20 -R /mnt/run -o /mnt/out/Data/Intensities/BaseCalls/Alignment_1 --barcode-mismatches 0 --with-failed-reads --no-lane-splitting -p 24
This generates fastq files, but they are all 0 KB. I looked at the log file, but nothing stands out as a major error. Since I am really stuck troubleshooting this issue, I'd appreciate any input and suggestions.
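(As a quick sketch for confirming the outputs really are empty, the files can be checked by counting fastq records; the path is the host-side output location from the command above and may need adjusting.)

```shell
# Count reads per output fastq.gz; a fastq record is exactly 4 lines.
count_reads() { echo "$1: $(( $(gzip -cd "$1" 2>/dev/null | wc -l) / 4 )) reads"; }

# Host path corresponding to -o /mnt/out/Data/Intensities/BaseCalls/Alignment_1
for f in /Volumes/Aura2/output/Data/Intensities/BaseCalls/Alignment_1/*.fastq.gz; do
  if [ -e "$f" ]; then count_reads "$f"; fi
done
```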
Are you not getting any fastq files at all? If the data is not being demultiplexed, you should still get files called
Undetermined * R1 *.fastq.gz and Undetermined * R2 *.fastq.gz
You also don't seem to be providing a SampleSheet.csv file, which is what defines the Sample_ID --> index associations. Without that there is no way to demux the data.
--with-failed-reads is a bad option to choose. Those reads fail the initial filters for a reason and should be left alone.
@genomax, it did generate all the fastqs for entries in the samplesheet.csv except one.
I don't know where this Undetermined_S0_R1_001.fastq.gz came from; it was not in the samplesheet.csv. I forgot to mention that these are single reads.
Also, I did try running this without
--with-failed-reads
and --barcode-mismatches 0
but it did not make any difference. The following is the samplesheet.csv used in the run.
Those are all zero-byte files. Any reads that can't be classified using the indexes provided in SampleSheet.csv are put into the Undetermined* file. You can look in that file to see which indexes ended up there. Don't be surprised to see a smattering of indexes you don't expect; this is normal, as long as they are < 5% of reads. Use the code I have here: C: Demultiplexing reads with index present in the labels
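(As a sketch, assuming indexes are recorded in the last colon-separated field of each header line, the file name taken from this thread: the indexes in the Undetermined file can be tallied like this.)

```shell
# Tally the most common index sequences seen in a fastq.gz file.
# Header lines (every 4th line, starting at line 1) end in the index field.
top_indexes() {
  [ -e "$1" ] || return 0
  gzip -cd "$1" |
    awk 'NR % 4 == 1 { n = split($0, a, ":"); c[a[n]]++ }
         END { for (i in c) print c[i], i }' |
    sort -rn | head
}

top_indexes Undetermined_S0_R1_001.fastq.gz
```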
I think the easiest mistake is providing an index sequence as its reverse complement. Once you compare the results from my code with the indexes you have, it should be easy to figure out.
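(A quick sketch for checking that possibility; the index shown is just an example, not one from this run.)

```shell
# Reverse-complement a DNA index: reverse the string, then swap base pairs.
revcomp() { echo "$1" | rev | tr ACGT TGCA; }

revcomp ATCACG   # prints CGTGAT
```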
Why are you using a SampleSheet for 2D indexes then? The file does not follow Illumina's samplesheet format either.
I wish I could use your script, but the "Undetermined" file is also 0 KB. Also, the single reads were done with dual indexes. So is the entry of the second index in the samplesheet wrong in this case?
My apologies. Single reads with dual indexes are certainly fine. I should have considered that.
Note: Depending on which sequencer this run was done on, you may need to reverse complement the second index.
Can you create your samplesheet in this format? Save it as a comma-separated values (.csv) file.
If you don't have multiple lanes, you will only have entries for lane 1. If the pool ran on multiple lanes, you will need to create entries for each lane (column 1).
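(For reference, a minimal [Data]-style sheet of the kind bcl2fastq2 accepts might look like the sketch below; the sample IDs and index sequences are placeholders, not values from this run.)

```
[Data]
Lane,Sample_ID,Sample_Name,index,index2
1,Sample01,Sample01,ATCACGTT,AGGCTATA
1,Sample02,Sample02,CGATGTTT,GCCTCTAT
```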
Also make sure you run the
bcl2fastq2
command with these options. If you have more than one core available, you could also add
-r N -p N -w N
to the command; the three N's added together should not exceed the number of cores you have available.
I modified the samplesheet.csv as follows:
Again, it generated 26 fastqs, but they are all empty. The bcl2fastq log file is found here: bcl2fastq log
Do you actually have access to the complete flowcell data folder? For
bcl2fastq2
to work, that is a requirement. Looking at the log, it appears that you don't have the full FC folder. Right after the fastq files are created is where the program should start reading the bcl
files and converting them to sequence; your log ends at that point. Are there any errors in any other log?
The input folder appears to be intact, containing everything original except the samplesheet.csv, which I had to create based on a sample submission sheet. I think the issue may be incorrect samplesheet entries, or some corrupted files that would generate errors, but I don't see them. At this point, I have to throw in the towel.
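(A minimal sanity check along those lines, assuming the run folder is mounted at the example path from the docker command above; the exact set of required files varies by sequencer, so this only covers the basics.)

```shell
# Check that the run folder has the basic pieces bcl2fastq2 needs.
check_run_folder() {
  run="$1"
  for p in RunInfo.xml Data/Intensities/BaseCalls; do
    if [ -e "$run/$p" ]; then echo "ok: $p"; else echo "MISSING: $p"; fi
  done
}

check_run_folder /mnt/run   # example mount path; adjust as needed
```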
I encountered exactly the same problem and solved it by assigning more memory to my Docker app.