User: mikysyc2016
Reputation: 10
Status: New User
Last seen: 27 minutes ago
Joined: 1 month ago
Email: m**********@gmail.com

Posts by mikysyc2016

71 results • page 1 of 8
0 votes • 1 answer • 97 views
Comment: C: have trouble uploading data to the WashU browser
... Can I add type=bedGraph directly to the file without opening it? The file is too big to open easily. What command can I use? Thanks. ...
written 4 hours ago by mikysyc2016
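One way to do this, assuming a GNU sed environment and a hypothetical filename (`merged.bedgraph` is a placeholder, not from the thread), is to prepend the track line in place without ever loading the file into an editor:

```shell
# Prepend a track-definition line to a large bedGraph file in place.
# GNU sed's '1i' inserts the text before line 1; the filename is a placeholder.
sed -i '1i track type=bedGraph' merged.bedgraph

# Check that the header is now the first line:
head -n 1 merged.bedgraph
```

This touches only the start of the file's contents, so it stays practical even for multi-gigabyte bedGraph files.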
0 votes • 0 answers • 31 views
(Closed) cannot upload bedGraph and bigWig data into the WashU browser
... Hi all, I want to use the WashU browser to make figures for my ChIP-seq data, but I cannot upload my bedGraph and bigWig files (which are on my own computer) into the browser. The files open fine in IGV. Does anyone know how to deal with this? Thanks. When I upload the bigWig it shows as below: > Sho ...
software error chip-seq written 18 hours ago by mikysyc2016
0 votes • 0 answers • 129 views
Comment: C: my file does not have duplicates, but it still says duplicates are not allowed
... You are right, I get two: NM_001001130 22 16 14 12 25 18 2218 NM_001001130. How can I remove the second one? Thanks! ...
written 1 day ago by mikysyc2016 • updated 1 day ago by Ram 15k
0 votes • 0 answers • 129 views
Comment: C: my file does not have duplicates, but it still says duplicates are not allowed
... When I run cut -f1 merged_6_rd.txt | sort | uniq -d I get: ID NM_001001130 NM_001001144 NM_001001152 NM_001001160 NM_001001176 NM_001001177 NM_001001178 NM_001001180 NM_001001181 NM_001001182 NM_001001183 NM_00100118 .......... ...
written 1 day ago by mikysyc2016 • updated 1 day ago by Ram 15k
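Since `uniq -d` shows the first column does contain repeated IDs, one common fix (a sketch, using the `merged_6_rd.txt` filename from the thread; the output name is a placeholder) is an awk one-liner that keeps only the first row seen for each ID:

```shell
# Keep only the first row for each value in column 1;
# any later row whose ID was already seen is dropped.
awk '!seen[$1]++' merged_6_rd.txt > merged_6_dedup.txt

# Re-check: this prints nothing if no duplicate IDs remain.
cut -f1 merged_6_dedup.txt | sort | uniq -d
```

Unlike `sort | uniq`, this preserves the original row order, and it deduplicates on the ID column alone rather than requiring whole lines to be identical.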
3 votes • 0 answers • 129 views
my file does not have duplicates, but it still says duplicates are not allowed
... Hi all, I checked my file with which(duplicated(file)) and already removed the duplicates. But when I read the file into R, it still shows as below: x <- read.delim("merged_6_rd.txt", row.names = 1, stringsAsFactors = FALSE) Error in read.table(file = file, header = header, sep = sep, ...
R rna-seq written 1 day ago by mikysyc2016 • updated 20 hours ago by WouterDeCoster 29k
0 votes • 1 answer • 93 views
Comment: C: combine RNA-seq data count and len
... Thank you for your reply. My case is a little different: the transcript IDs in the two files are in different orders, and one file has around ~20,000 IDs while the other has ~30,000. ...
written 2 days ago by mikysyc2016
0 votes • 1 answer • 93 views
Comment: C: combine RNA-seq data count and len
... It looks like this. One file is: Transcript KO1 KO2 KO3 WT1 WT2 WT3 78 79 81 66 68 70 27 28 29 NM_001011874 3 0 0 2 3 0 1 0 0 1 0 0 1 3 2 NM_001195662 0 0 0 2 1 0 0 0 0 0 0 0 0 0 0 NM_011283 0 0 0 2 1 0 0 0 0 0 0 0 0 0 0 NM_011441 769 153 314 871 158 399 289 224 888 275 270 1031 285 1360 821 .... another ...
written 2 days ago by mikysyc2016
0 votes • 0 answers • 118 views
Comment: C: how to compare two TF binding sites
... Thank you for your nice reply! ...
written 2 days ago by mikysyc2016
0 votes • 1 answer • 93 views
combine RNA-seq data count and len
... I have two txt files. One contains transcript IDs and gene lengths; the other contains transcript IDs and per-sample read counts. I want to combine them into one file with transcript ID, read counts, and length. How can I do it? I know VLOOKUP can, but it is not good for big data. Thanks! ...
rna-seq written 2 days ago by mikysyc2016 • updated 2 days ago by c.chakraborty 70
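A common command-line approach to this kind of merge is `sort` plus `join` on the ID column. A minimal sketch, assuming tab-separated files with no header line and placeholder names (`lengths.txt` and `counts.txt` are illustrative, not from the thread):

```shell
# join requires both inputs sorted on the join field (column 1, the transcript ID).
sort -k1,1 lengths.txt > lengths.sorted
sort -k1,1 counts.txt  > counts.sorted

# Merge on transcript ID; output is <id> <length> <counts...>.
# IDs present in only one file are dropped by default
# (add -a1/-a2 to keep unmatched lines), which matters here
# since one file has ~20,000 IDs and the other ~30,000.
join -t "$(printf '\t')" -j 1 lengths.sorted counts.sorted > combined.txt
```

Because `join` works on sorted streams, it scales to large files where a spreadsheet VLOOKUP becomes impractical; the differing row order of the two inputs is handled by the sorting step.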
0 votes • 1 answer • 97 views
Comment: C: have trouble uploading data to the WashU browser
... How can I add type=bedGraph to my bedGraph file? ...
written 4 days ago by mikysyc2016

Latest awards to mikysyc2016

Rising Star 5 days ago, created 50 posts within first three months of joining.
Scholar 12 days ago, created an answer that has been accepted. For A: how to use igvtools on ubuntu

Powered by Biostar version 2.3.0