Seven days ago we started a new sequencing run on our MGI G400 sequencer. This time, however, there seems to be an issue with the run time: the time the sequencer spends taking the image for each cycle is unusually long, so the run is taking much longer than expected. The G400 reports a real-time raw Q30 value, and to our surprise the Q30 is still high (around 90%).
My question is: could anyone explain how the Q30 is calculated? I know that fluorescence intensity and signal separation are taken into account, but is it really possible for the Q30 to remain this high despite the imaging delay?
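For reference, here is my understanding of what Q30 means in general, independent of whatever proprietary model MGI uses to estimate it in real time: each base call gets a Phred quality score Q = -10 * log10(P_error), and the Q30 metric is simply the percentage of base calls with Q >= 30 (i.e. an estimated error rate of at most 0.1%). A minimal sketch (function names are my own, not from any MGI tool):

```python
import math

def phred_q(p_error: float) -> float:
    """Phred quality score: Q = -10 * log10(P_error)."""
    return -10 * math.log10(p_error)

def q30_percent(quals: list[int]) -> float:
    """Percentage of base calls with Q >= 30."""
    return 100 * sum(q >= 30 for q in quals) / len(quals)

# A base with an estimated 0.1% error probability scores Q30:
print(round(phred_q(0.001)))  # 30

# Six of these eight hypothetical per-base scores are >= 30:
print(q30_percent([35, 32, 28, 40, 30, 25, 36, 31]))  # 75.0
```

So the instrument's real-time Q30 reflects the predicted error probabilities of the base calls, not the run time directly, which may be why a slow imaging step does not automatically drag it down.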
We will check the data very carefully once the run completes, but any ideas or suggestions on how to approach this issue in the meantime would be greatly appreciated.