In addition to this we have a powerful HPC server on which we usually perform the more important computational work.
This is really the most important part of the situation.
The ideal configuration, in my experience, has been to use a macOS system to SSH into your institution's HPC (or cloud). iTerm2 + Cyberduck + VS Code is pretty much all I have needed for many years. I have some notes here with the typical extra software I ended up using on my macOS systems, among other things.
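If you go that route, a small SSH config entry makes the day-to-day connection painless. The host alias, hostname, and username below are placeholders for whatever your institution actually uses:

```
# ~/.ssh/config -- hypothetical alias for your institution's cluster
Host hpc
    HostName login.hpc.example.edu    # placeholder login node
    User your_username                # placeholder account
    # optional: reuse one authenticated connection across terminals
    ControlMaster auto
    ControlPath ~/.ssh/sockets/%r@%h-%p   # create ~/.ssh/sockets first
    ControlPersist 10m
```

After that, `ssh hpc` from iTerm2 drops you onto the cluster, and tools that read your SSH config (VS Code's remote editing, for instance) can reuse the same alias.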
I think the question in the OP is itself a little misguided, because in real life you will pretty much never want to be using Windows for bioinformatics (genomics) work. If you had a Windows PC you could just install Linux (such as Ubuntu) on it and have a much more appropriate system. But even then, I would have a hard time recommending it for a lab's machine, simply because you want your lab workstation to "just work" without needing a tech-head around to manage a local Linux install. Not that it's hard, especially if you are going to be using Linux on the HPC anyway, but if you are requisitioning something it needs to be as easy and reliable to use as possible. macOS is really what you want here. Others mentioned the ability to use MS Office; I think that is a critical consideration as well, along with how easily the system integrates into your institution's IT infrastructure. I run a lot of Linux systems at home for personal use, and despite the huge strides Ubuntu has made in its out-of-the-box experience and driver support, there are still occasional hiccups that I would loathe to deal with in a lab setting.
use it for analysis in python/bash/R and
Python and bash are far easier to work with on macOS than on Windows. R (with RStudio) is surprisingly painless on Windows, and the experience is about the same on both. Linux makes all three trivial.
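On macOS, getting that toolchain in place is only a few Homebrew commands. This is just one plausible setup and assumes you already have Homebrew installed:

```
# hypothetical macOS setup via Homebrew
brew install python           # Python 3
brew install r                # R
brew install --cask rstudio   # RStudio IDE
brew install --cask iterm2    # the terminal mentioned above
```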
I want to free myself from server work for basic/medium-weight tasks.
I do not think this is really a worthwhile expectation to have. Just about the only work I have found feasible on the local system, rather than on the HPC, is downstream prototyping with local R and RStudio, which ultimately gets ported back to the HPC for final execution. No matter how beefy a workstation you get, Mac or PC, you will simply never get the kind of throughput that is available on the HPC, to the point that it is not worth your time trying to do much work outside of it. Also consider the headaches involved in keeping your analysis software configurations in sync between two different systems (a sketch of one way to handle that is further below). I think it is far more worthwhile to do two things:
conda has pretty much got you covered, especially if you stick with newer package versions (some older packages don't have M1/ARM builds available for macOS), and you can pull and run pretty much all Docker containers flawlessly. Building containers on macOS ARM (M1, etc.) to run on an x86 Intel HPC is a bit of a headache, however. Singularity does not run natively on macOS or Windows, but that is not really an issue if you just stick with Docker (and convert to Singularity from within the HPC environment).
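To make that Apple-silicon-to-x86 step concrete, one approach that works is cross-building with Docker's buildx and converting on the cluster. The image and file names here are made up, and the cross-build runs under emulation, which is a big part of why it is slow and occasionally painful:

```
# on the Mac: cross-build an x86_64 image and push it to a registry
docker buildx build --platform linux/amd64 -t myuser/mytool:latest --push .

# on the HPC (where Singularity is available): convert the pushed image
singularity build mytool.sif docker://myuser/mytool:latest
```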
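And for the keep-configurations-in-sync point above, a minimal conda round trip (environment name and paths are hypothetical) looks something like this:

```
# on the laptop: record the packages you explicitly installed
conda env export --from-history -n myproject > myproject.yml

# on the HPC: rebuild the same environment from that file
scp myproject.yml hpc:~/
ssh hpc "conda env create -f ~/myproject.yml"
```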
Ultimately my best suggestion is to get a MacBook Pro with the most memory and storage you can budget (32 GB / 1 TB or 2 TB would be good to shoot for), use the HPC for nearly everything possible, and use network storage for your lab's data archives (don't store all your research data on the laptop). It is also worth considering that, if you want to be mobile, the 14" models are significantly lighter and easier to take with you, e.g. to a conference or to the break room to do work while you drink the professors' coffee in the faculty lounge.
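For the network-storage point, moving finished work off the laptop can be as simple as an rsync to wherever the lab share is mounted; the paths below are placeholders:

```
# copy finished results from the laptop to the lab's network archive
rsync -avh --progress ~/projects/my_experiment/results/ \
    /Volumes/lab_share/archive/my_experiment/results/
```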