Problem With Awk Printing
11.3 years ago
RandManP ▴ 10

Hi all, I would like to replace the numbers that are less than 0.04 with "N", but keep 0 as-is. So, given:

0         0.1    9     0.004      98
23        2      89    0.002      9
0.001     0.03   45    9          10

awk '{for(i=1; i<=5; i++) {if ($i==0 || $i>0.04) print $i; else print "N"; }}' in.txt > out.txt

The output values are correct, but they are printed in one column instead of five columns like the input file. Thank you so much.

awk • 2.2k views

Please indicate the relevance of this question to a bioinformatics research problem.


I want to apply a cutoff to RNA-seq data. This is only an example, not the real data; I want to learn how to manipulate the data.


print always appends the output record separator (ORS, a newline by default).


That is why you need to use printf here, as I did in my answer below.
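A minimal illustration of the difference, using a hypothetical one-line input:

```shell
# print appends ORS (a newline by default) after each call,
# so printing fields in a loop yields one value per line:
printf '1 2\n' | awk '{for(i=1; i<=NF; i++) print $i}'

# printf emits only what its format string specifies,
# so the fields can stay on one row:
printf '1 2\n' | awk '{for(i=1; i<=NF; i++) printf("%s\t", $i); print ""}'
```

The first command prints 1 and 2 on separate lines; the second keeps them on one tab-separated line.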

11.3 years ago

With awk, you can do the following (you just need to tune the printf format to get exactly the output you want):

awk '{for(i=1; i<=4; i++) {if($i==0 || $i>0.04) printf("%f\t", $i); else printf("N\t");} if($5==0 || $5>0.04) print $5; else print "N";}' eggs
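An alternative sketch that avoids hard-coding the column count and keeps each value's original formatting (%f would print 0 as 0.000000): assign "N" to the field in place and then print the whole record, which awk rebuilds using OFS. The tab separator and the 0.04 cutoff are taken from the question; in.txt and out.txt are the question's file names.

```shell
awk 'BEGIN{OFS="\t"} {
  for(i=1; i<=NF; i++)
    if($i != 0 && $i <= 0.04) $i = "N"  # below cutoff but not zero
  print                                 # re-emits the modified record with OFS
}' in.txt > out.txt
```

Assigning any field forces awk to reconstruct the record, so the output is tab-separated regardless of the input's original spacing.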
11.3 years ago
Neilfws 49k

Can't help you with awk, but here's a solution in R, assuming the file is tab-delimited:

# read file
mydata <- read.table("in.txt", sep = "\t", header = F)
# transform values
mydata.t <- as.data.frame(sapply(mydata, function(x) ifelse(x < 0.04 & x > 0, "N", x)))

Result:

  V1  V2 V3 V4 V5
1  0 0.1  9  N 98
2 23   2 89  N  9
3  N   N 45  9 10

You can write the result back to a tab-delimited file using:

write.table(mydata.t, "out.txt", row.names = F, col.names = F, quote = F, sep = "\t")

thank you so much :)

11.3 years ago
RandManP ▴ 10

thanks, very helpful
