The ULTIMATE setup for (Web)Developers with WSL2 and Docker – Part 2

This is the second part of the ULTIMATE Setup for WSL2 and Docker, in which I’ll focus on how to set up Docker inside WSL2 to allow for seamless network integration across all devices in your network.

Part 1

In case you missed it, check out the first part of the series.

https://dnmc.in/2021/02/01/the-ultimate-setup-for-webdevelopers-with-wsl2-and-docker-part-1

The problem

Let’s say you’re working on a NodeJS server app and you’d like to access it from other devices within your network, for example a mobile app that you’re developing alongside your server-side code.

Now, your app runs on your mobile phone, while your server runs inside WSL2 on port 3000, started with node server.js. Although you can access localhost:3000 on your Windows machine, you unfortunately can’t reach that service at 192.168.1.2:3000 from your mobile app, since Windows does not forward ports to WSL2 automatically. This is frustrating, but it can easily be solved.
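To illustrate what we’re avoiding: the usual manual workaround is to proxy each port yourself from an elevated prompt on Windows, where <WSL-IP> is whatever IP your WSL2 instance currently has (it changes on every reboot, you can check it with wsl hostname -I):

netsh interface portproxy add v4tov4 listenport=3000 listenaddress=0.0.0.0 connectport=3000 connectaddress=<WSL-IP>

Doing that for every port, after every reboot, gets old fast. The Docker-based approach below avoids it entirely.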

Dockerize everything

The solution to the previously mentioned problem is to run everything inside a Docker container, and by everything I literally mean everything: node, npm, python, etc…

Since I doubt that you want to docker build every server.js you’re working on, I’m about to show you how to point the node and npm commands to run inside Docker, so that you can seamlessly run your scripts as if you had node and npm installed locally.

I have a ~/Docker folder where I store my docker-compose and all the Docker mounts and configs.
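For reference, the layout I’m assuming throughout this article (folder names taken from the compose file and the bin scripts below) looks roughly like this:

~/Docker
├── docker-compose.yaml
├── bin/              # custom node/npm/yarn wrapper scripts (see below)
├── .npm-global/      # global node modules, persisted on the host
├── conf/
│   └── mysql/mysql.cnf
└── db_data/
    └── mysql/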

Docker-compose

I have a huge docker-compose file, which starts all the services I’m relying on: Nginx, databases, python, nodejs, etc… For the sake of simplicity, I’ll focus only on NodeJS and MySQL in this article, but you can find the complete docker-compose.yaml on GitHub, which is configured to run python the same way as nodejs and has additional settings.

version: "3.8"

services:

  nodejs:
    image: node:alpine
    container_name: nodejs
    tty: true
    user: 1000:1000 # run as the default WSL user (uid:gid 1000:1000), so files created in the mounts aren't owned by root
    ports:
      - "3000-3100:3000-3100" # expose a whole range, so any dev server within it is reachable from the host
    environment:
      # $PWD and $PATH are expanded by docker-compose from the shell where you run it (so $PWD resolves to ~/Docker)
      - NPM_CONFIG_PREFIX=$PWD/.npm-global # install global npm packages into the mounted folder below
      - PATH=$PATH:$PWD/.npm-global/bin # make those global binaries resolvable inside the container
    volumes:
      - ~:${HOME}:cached # mount your whole WSL home at the same path inside the container
      - ./.npm-global:$PWD/.npm-global # persist global node modules on the host
    command: ["tail", "-f", "/dev/null"] # keep the container alive, there is no long-running service

  mysql:
    image: mysql:8
    container_name: mysql
    user: 1000:1000
    ports:
      - 3306:3306
    environment:
      - MYSQL_ROOT_PASSWORD=${MYSQL_ROOT_PASSWORD} # taken from your environment or an .env file (see below)
    volumes:
      - ./db_data/mysql:/var/lib/mysql:cached # persist the database data on the host
      - ./conf/mysql/mysql.cnf:/etc/mysql/conf.d/mysql.cnf:cached # custom MySQL config
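The compose file references ${MYSQL_ROOT_PASSWORD} (and ${HOME}), which docker-compose substitutes from your shell environment or from a .env file next to docker-compose.yaml. A minimal sketch, with a placeholder value you’d obviously change:

# ~/Docker/.env
MYSQL_ROOT_PASSWORD=changeme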

So, as you can guess, running docker-compose up spins up the two containers. Unlike mysql, the node container has no service that keeps it running in the background, which is why the tail -f /dev/null command is provided to prevent the container from exiting. So far so good.

The volumes part mounts the $HOME directory into the container, so /home/<username>/nodeprojects/projectA is also present in the container, which is very important. Additionally, there’s a folder provided for persisting global node modules and accessing them from inside WSL2, more on that later.

Notice how we expose the port range 3000-3100. This is necessary so that a running server.js which listens on, say, port 3002 is exposed to the host. This is just temporary and only applies to this article. In the next part of this series, I will show you how to get rid of that line and make everything work flawlessly and dynamically.
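To sanity-check the stack before wiring up the wrapper scripts, something along these lines should work (container names as defined in the compose file above):

cd ~/Docker
docker-compose up -d
docker ps --format '{{.Names}}'     # should list nodejs and mysql
docker exec nodejs node --version   # node runs inside the container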

Custom bin files

So now that nodejs is running inside Docker, we need to call our server.js from within that container. To do so, we will create a custom node shell script, which does exactly that.

The custom bin files reside in ~/Docker/bin, which is appended to the $PATH variable. In order to run globally installed node packages, you’ll also want to add the .npm-global/bin path to your $PATH variable.
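Assuming the folder layout from above and a bash or zsh shell, that boils down to something like this in your ~/.bashrc or ~/.zshrc:

# make the wrapper scripts and the global npm binaries available in WSL2
export PATH="$PATH:$HOME/Docker/bin:$HOME/Docker/.npm-global/bin"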

#!/bin/bash

docker exec -w "$PWD" nodejs node "$@"
exit $?

As you can see, the script executes node inside the nodejs container, tells it to use the current directory as the working directory, and passes all arguments.
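For example, with a (hypothetical) project at ~/projects/api, running node server.js there effectively becomes a docker exec into the container:

cd ~/projects/api
node server.js
# under the hood: docker exec -w /home/<username>/projects/api nodejs node server.js

Since $HOME is mounted into the container at the same path, the working directory resolves identically on both sides.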

We can do the same for npm and yarn, as two separate files in ~/Docker/bin.

#!/bin/bash

docker exec -w "$PWD" nodejs npm "$@"
exit $?

#!/bin/bash

docker exec -w "$PWD" nodejs yarn "$@"
exit $?
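Don’t forget to make the three scripts executable, otherwise the shell won’t pick them up from $PATH (I’m assuming they are saved as node, npm and yarn inside ~/Docker/bin):

chmod +x ~/Docker/bin/node ~/Docker/bin/npm ~/Docker/bin/yarn
which node   # should now point to ~/Docker/bin/node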

You’re ready to go!

Running node server.js should now successfully start your server, which is accessible via localhost:3002 as well as via 192.168.1.2:3002 from other devices within your network.

Also, if you want to use global packages, you can do that with npm i -g <package>. You will notice that the package is installed into the .npm-global folder mounted by docker-compose. Since the .npm-global/bin folder is appended to the $PATH variable, you can execute those binaries without problems.
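For example (nodemon is just an arbitrary package picked for illustration, any package with a CLI should behave the same way):

npm i -g nodemon
ls ~/Docker/.npm-global/bin   # the nodemon binary now lives here, on the host
nodemon server.js             # its shebang resolves to our node wrapper, so it runs inside the container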

GitHub

You can find my full Docker setup on my GitHub.

As always, I hope you learned something. Check back for the next part, where I’ll show you how to properly set up Nginx to run all services dynamically.

https://github.com/ad-on-is/ultimate-wsl2-docker-setup
