Rust and Go are the best languages I have tried. But Rust proved to be too low-level for daily use, and it's hard to find excuses for using Go when Python already has everything I need. However, as the complexity of your production applications grows, the advantages of languages like Rust and Go start to outweigh the "convenience" of Python. Being able to compile and deploy a single static binary to run your apps, without even needing a container for it, is huge. With interpreted languages like R and Python you pretty much need to ship an entire operating system for them to work, especially when using a large number of third-party libraries (and don't get me started on library management for these two...).
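To make the deployment point concrete, here is a minimal sketch on the Rust side (the program is just a placeholder; the musl target is one common way to get a fully static Linux binary):

```rust
// main.rs: the program itself is trivial; the point is the artifact.
// `cargo build --release` produces a single native executable, and
// building against the musl target
//   rustup target add x86_64-unknown-linux-musl
//   cargo build --release --target x86_64-unknown-linux-musl
// yields a fully static binary you can copy to a bare Linux server
// and run with no interpreter, runtime, or container around it.
fn main() {
    println!("one self-contained binary, nothing else to ship");
}
```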
The strong static typing and more verbose function and class signatures in Rust & Go make the source code so much easier to understand. I also really appreciate how Rust lets you embed your unit tests directly into your program's source code, in the same file as the code they exercise.
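For anyone who hasn't seen it, a minimal sketch of what that looks like (`gc_content` is a made-up example function, not from any real crate): the `#[cfg(test)]` module lives in the same file as the code it tests and is only compiled when you run `cargo test`.

```rust
/// Fraction of G/C bases in a DNA sequence (assumes ASCII input).
pub fn gc_content(seq: &str) -> f64 {
    if seq.is_empty() {
        return 0.0;
    }
    let gc = seq
        .chars()
        .filter(|&c| matches!(c, 'G' | 'C' | 'g' | 'c'))
        .count();
    gc as f64 / seq.len() as f64
}

// The tests sit right next to the implementation, in the same file.
// This module is compiled only for `cargo test`, never into the
// release binary.
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn half_gc() {
        assert!((gc_content("ATGC") - 0.5).abs() < 1e-12);
    }

    #[test]
    fn empty_sequence_is_zero() {
        assert_eq!(gc_content(""), 0.0);
    }
}
```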
What I use/like most: R. Why? Because it is the only language I know well enough to actually do productive work with (I do not count bash as a language), and since I do both the wetlab work and the subsequent analysis of the data I generate, I simply do not have the time to get up to speed in another language, even though I would really like to learn a second one. Something like Rust, which seems to be what the cool kids learn these days; certainly not Perl. But at the end of the day the biology/science has to be done, and for that I would rather spend my time keeping up with the ever-increasing stream of new papers than learning a new language. That would be different, of course, if I were a software developer rather than an analyst.

That said, I catch myself far too often coding stupid little things instead of reading papers, because coding is fun and reading papers is hard work, especially when you read several in a row only to find that none of them has a real biological message. Unfortunately, that is becoming more and more common as high-throughput data pile up and far too many papers remain merely descriptive.
Coming from a statistical background, I almost entirely use R when I can get away with it. Its capacity for manipulating data into any format is unrivalled as far as I am aware. This is mostly down to the incredible tidyverse libraries, which make all kinds of complex operations clean and easy. I also enjoy the Rcpp functionality, which makes it very easy to embed snippets of high-performance C++ code in your R code base.
Some coding snobs/purists will tell you that R isn't a real programming language, etc. Whether or not this is true is mostly irrelevant to my view that R tends to be exceptionally good at the things it was designed for: complex data manipulation, statistical analysis and general analysis of data in some kind of text format.
That said, I cringe when I see people analysing massive genomic datasets in R. It is a terrible memory hog, and loading entire VCFs into memory is just a non-starter for me.
When I absolutely have to, I will write something in C from scratch. But fortunately this doesn't happen too frequently any more.
I use Python the most, but I love Perl more. Unfortunately, Perl has gone out of fashion, so new bioinformatics libraries are usually written for Python, and that is the reason I use it.
Regarding the future direction: I think that bioinformaticians will continue to use Python. It is one of the most popular languages in the world, and as of February 2021 its popularity is still increasing.
It is not set in stone, but academia is moving to Python in many fields.
The main reasons for this shift are:
1) Python is easy to learn and use for someone with no prior programming experience.
2) Again, it is easy to obtain and use libraries created by other users, which makes it easy for the community to share its knowledge.
Also, I would add R for statistical purposes.
And lastly, bash is an absolute must in my honest opinion.