Hi, I have been scratching my head trying to understand the memory usage of one of my more complex R scripts, and what I found is quite surprising. Calling rtracklayer::readGFF() occupies quite a bit of memory (about 400 MB) in the R session even without attaching the rtracklayer package via library(). See the step-by-step use case and output below.
NOTE: Though the question is more about R programming, I find it worth posting to Biostars as rtracklayer is widely used in the bioinformatics community.
Amount of memory used in a fresh R session (NOTE: no packages or objects have been loaded beforehand):
## fresh R session memory usage 
pryr::mem_used()
39.4 MB
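For later comparison, I also snapshot which namespaces are loaded in the fresh session (a small diagnostic sketch; note that pryr itself gets loaded by the mem_used() call above):
## snapshot of namespaces loaded in the fresh session, for comparison later
ns_before <- loadedNamespaces()
ns_before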
Amount of memory used once the rtracklayer::readGFF() function has been executed:
## read a gff using rtracklayer::readGFF() (NOTE: no file supplied to the function)
rtracklayer::readGFF()
pryr::mem_used()
416 MB
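For reference, comparing against the snapshot taken above shows that the call pulled in a whole stack of namespaces, even though it returned nothing (a diagnostic sketch; ns_before comes from the fresh-session snapshot):
## `::` triggers loadNamespace("rtracklayer"), which in turn loads its
## Bioconductor dependencies (S4Vectors, IRanges, GenomicRanges, ...)
setdiff(loadedNamespaces(), ns_before)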
As we can see, memory usage is roughly 10X higher even though nothing was returned by the rtracklayer::readGFF() call. Can anyone explain why this is, and how to prevent R from occupying an additional ~400 MB of memory when using rtracklayer::readGFF()?
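The only mitigations I can think of are sketched below; they are untested, and example.gff3 is just a placeholder path. One option is to unload the namespace after use; the other is to avoid loading rtracklayer at all and read the tab-delimited GFF3 columns with base R.
## attempt to unload after use; this can fail if other loaded namespaces
## import rtracklayer, and its dependencies stay loaded regardless
unloadNamespace("rtracklayer")
gc()

## for simple cases, read the 9 standard GFF3 columns with base R instead
gff <- read.delim("example.gff3", header = FALSE, comment.char = "#",
                  col.names = c("seqid", "source", "type", "start", "end",
                                "score", "strand", "phase", "attributes"))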
This post is more suitable for GitHub issues; see:
I did, but I think it got deleted :(
What do you mean "got deleted"? The author deleted it?
Sorry!! Now it appears there. GitHub is behaving weirdly. See the link in the comment above.
They're having data consistency issues, so things will be weird for the next few hours at least.