Docker runs anywhere. Windows, macOS, Linux desktop, Linux server, Windows Server. x86, ARM, you name it, and chances are Docker works on it.
Not only does Docker work on all platforms, it works the same on all platforms, with one caveat: if you run an image built for x86 on an ARM machine and there is no native ARM variant, Docker has to emulate the x86 instructions. Emulation is slower; it has gotten better recently, but it is still noticeable. You may also need some minor tweaks to get a container built for Linux to run on Windows. The good news is that most of the time these changes amount to a quick config tweak.
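One way to sidestep the emulation penalty is to publish a multi-architecture image. Here is a rough sketch, assuming a hypothetical image name (myorg/myapp) and a Docker install with the buildx plugin:

```sh
# Build and push one image that covers both architectures, so ARM hosts
# pull a native variant instead of emulating x86.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t myorg/myapp:latest \
  --push .

# If only an amd64 image exists, you can still run it on an ARM machine by
# forcing the platform; Docker will emulate it (slower, as noted above).
docker run --rm --platform linux/amd64 myorg/myapp:latest
```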
Docker also isolates your code: nothing inside a container touches "the outside" unless you specifically mount a folder into it. You can run your crazy experimental code inside a container, and if it accidentally self-destructs and deletes all files, you can rest assured it will only delete what the container was explicitly given access to.
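To make that concrete, here is a minimal sketch: only the folder mounted with -v is visible to the container, so even a destructive command stays contained.

```sh
# Only ./sandbox is exposed to the container; the rest of the host
# filesystem is out of reach. Wiping /data only affects ./sandbox.
mkdir -p sandbox
docker run --rm -v "$(pwd)/sandbox:/data" alpine:3 sh -c 'rm -rf /data/*'
```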
Without Docker you need all dependencies installed on your local machine. If the version of one dependency is wrong, you may get an error and be stuck debugging it. A Docker image bundles every dependency needed to run your application. You can then write simple scripts that execute commands in sequence inside your containers to do things like fetching a database or handling environment variables securely through a password manager.
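As a rough illustration of the kind of script I mean, here is a sketch with placeholder names throughout (the myapp-db container, the dump URL, the password entry), using pass as a stand-in for whatever password manager you prefer:

```sh
#!/usr/bin/env sh
# Hypothetical helper: start a throwaway Postgres container and load a dump.
set -e

# Read the database password from a password manager instead of hard-coding it.
DB_PASSWORD="$(pass show myapp/db-password)"

# Start a disposable Postgres container with that password.
docker run -d --name myapp-db \
  -e POSTGRES_PASSWORD="$DB_PASSWORD" \
  -p 5432:5432 postgres:16

# Wait until Postgres accepts connections before restoring.
until docker exec myapp-db pg_isready -U postgres >/dev/null 2>&1; do sleep 1; done

# Fetch a dump and load it by executing psql inside the running container.
curl -fsSL https://example.com/dumps/latest.sql -o latest.sql
docker exec -i myapp-db psql -U postgres < latest.sql
```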
As a full-stack engineer, the amount of time Docker has saved me is significant. Not only have I saved time by not setting up every local server from scratch, it has also saved me a lot of time when helping colleagues. I can write automation once, upload it to GitHub, and from there every developer knows how to run it.