I am listing below the various flags used by wgsim. Wgsim simulates sequence reads from a reference genome.
-e FLOAT base error rate [0.020]
-d INT outer distance between the two ends 
-s INT standard deviation 
-N INT number of read pairs 
-1 INT length of the first read 
-2 INT length of the second read 
-r FLOAT rate of mutations [0.0010]
-R FLOAT fraction of indels [0.15]
-X FLOAT probability an indel is extended [0.30]
-S INT seed for random generator [-1]
-A FLOAT discard if the fraction of ambiguous bases is higher than FLOAT [0.05]
-h haplotype mode
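To make the relationship between `-d`, `-s`, `-1` and `-2` concrete, here is a small sketch in Python (this is not wgsim's own code, and the values 500/50/70 are hypothetical, chosen only for illustration): wgsim draws the fragment (outer) size of each pair from a normal distribution with mean `-d` and standard deviation `-s`, and the unsequenced inner gap is what remains after subtracting the two read lengths.

```python
import random

# Illustration only (not wgsim's actual implementation).
# The outer distance is the span from the 5' end of read 1 to the
# 5' end of read 2 on the opposite strand, i.e. the whole fragment.
d = 500              # -d: mean outer distance (hypothetical value)
s = 50               # -s: standard deviation of the outer distance (hypothetical)
len1, len2 = 70, 70  # -1 / -2: read lengths (hypothetical values)

random.seed(11)  # analogous in spirit to -S, fixing the random generator

fragment = round(random.gauss(d, s))  # outer distance for one simulated pair
inner = fragment - len1 - len2        # unsequenced gap between the two reads
print(fragment, inner)
```

So a larger `-s` spreads the simulated fragment sizes more widely around `-d`, while the reads themselves stay at the fixed lengths given by `-1` and `-2`.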
What do you understand by standard deviation with respect to generating reads? Similarly, what is the difference between the rate of mutations and the base error rate? And how is the outer distance between the two ends of a read pair calculated? These are some of the questions that trouble me.