How To Use .env Variables In package.json

 

Ray Knight

Ray Knight practices software engineering at GenUI.

Updated Nov 19, 2019

The following is a copy of an article originally posted here: https://medium.com/@arrayknight/how-to-use-env-variables-in-package-json-509b9b663867

I was setting up a new project while following a tutorial. The tutorial introduced a multi-step Docker setup as a shell script (.sh) file. My preference is to break these kinds of scripts up so that each step can be run individually in the case of a partial setup or failure. It's an approach I've seen before, and one that some of my co-workers use as a standard. But is it the best, or only, option?

Environment in Package

The Problem

It's common to need access to your environment variables in multiple places: your code, your build scripts, and your pipeline may all need them. Propagating those variables from one end to the other isn't always simple; it often involves a fair amount of exporting and importing, and sometimes a degree of duplication.

Example setup script with a series of steps and some variables:

#!/bin/bash

set -e

SERVER="my_database_server";
PW="mysecretpassword";
DB="my_database";

echo "stop & remove old docker [$SERVER] and start a fresh instance of [$SERVER]";

# stop and remove any existing container, then start a fresh postgres instance
(docker kill $SERVER || :) && \
  (docker rm $SERVER || :) && \
  docker run --name $SERVER -e POSTGRES_PASSWORD=$PW \
  -e PGPASSWORD=$PW \
  -p 5432:5432 \
  -d postgres

# wait for pg to start

echo "sleep wait for pg-server [$SERVER] to start";

sleep 3;

# create the db

echo "CREATE DATABASE $DB ENCODING 'UTF-8';" | docker exec -i $SERVER psql -U postgres

# list the databases to confirm it was created
echo "\l" | docker exec -i $SERVER psql -U postgres

Yes, I could do what has been done before and break each step out into its own shell script (.sh) file. But most of my projects don't need that many scripts, and the simpler I can keep things, the better.

This got me thinking:

  • Do I need external script files? Is this the most valuable approach? Does having more files just make it harder to find where things are happening?
  • How would it be possible to skip writing these files at all?
  • How can we have a single source of secret variables that are used everywhere?

The Solution

Dotenv

Dotenv is a well-known and widely used tool that makes loading environment variables easy. It's built into many of the tools/bundlers/frameworks you may already be using, and it has a series of plugins/extensions that make it even more useful, like dotenv-safe, dotenv-expand, and dotenv-cli, which make the development experience smoother and more robust.
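For orientation, here's what dotenv-cli does on its own: it loads a .env file and runs whatever command follows with those variables in its environment. A minimal sketch (server.js is just a hypothetical entry point):

# load the default .env, then run any command with those variables available to it
dotenv -- node server.js

# or load a specific file instead of the default
dotenv -e .env.staging -- node server.js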

Cross-env/var

Admittedly, these two are not the same tool and aren't interchangeable, but they're both useful enough to be worth mentioning.

Cross-env makes setting environment variables work across platforms.

Cross-var makes substituting environment variables work across platforms. We’ll be using this in our example.
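A quick illustration of the difference (webpack here is just a stand-in for any build command):

# cross-env: set a variable the same way on Windows and POSIX shells
cross-env NODE_ENV=production webpack

# cross-var: substitute an existing variable into a command on any platform
cross-var echo %NODE_ENV%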

With our powers combined…

By using dotenv and cross-var together, we can read in whichever .env files we want (or consume existing environment variables from the CLI, .bash_profile, CI, etc.), then easily substitute them into our package.json scripts, and it works across development platforms!

Dependencies being used in this example:

"cross-var": "1.1.0",
"dotenv-cli": "3.0.0",

An example .env file:

DOCKER_NAME="my-project"
POSTGRES_HOST=127.0.0.1
POSTGRES_PORT=5432
POSTGRES_DATABASE="my-db"
POSTGRES_USERNAME="foo"
POSTGRES_PASSWORD="bar"

Recreating the docker scripts in our package.json:

"docker:kill": "dotenv cross-var docker kill %DOCKER_NAME%",

"docker:remove": "dotenv cross-var docker rm %DOCKER_NAME%",

"docker:run": "dotenv -- cross-var docker run --name %DOCKER_NAME% -e POSTGRES_USER=%POSTGRES_USER% -e POSTGRES_PASSWORD=%POSTGRES_PASSWORD% -p 5432:5432 -d postgres",

"docker:setup": "dotenv -- cross-var \"echo CREATE DATABASE $DB ENCODING ‘UTF-8’; | docker exec -i %DOCKER_NAME% psql -d %POSTGRES_DATABASE% -U %POSTGRES_USER%\"",

For my project, I was able to simplify this further by utilizing more features/flags with docker:

"docker:kill": "dotenv cross-var docker kill %DOCKER_NAME%",

"docker:run": "dotenv -- cross-var docker run --rm --name %DOCKER_NAME% -e POSTGRES_DB=%POSTGRES_DATABASE% -e POSTGRES_USER=%POSTGRES_USERNAME% -e POSTGRES_PASSWORD=%POSTGRES_PASSWORD% -p 5432:5432 -d postgres",

This is in no way specific to docker. You can use this with any of your scripts if it makes sense for your project.
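For example, the same pattern can feed connection details to a local psql client. The db:psql name is made up here, and psql needs to be installed on the host:

"db:psql": "dotenv -- cross-var psql -h %POSTGRES_HOST% -p %POSTGRES_PORT% -U %POSTGRES_USERNAME% -d %POSTGRES_DATABASE%",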

Break It Down

Combine dotenv, dotenv-cli, and cross-var.

Simple command:

dotenv cross-var foobar %MY_VARIABLE%

Commands with flag(s):

dotenv -- cross-var foobar -F --b %MY_VARIABLE%

dotenv -e .env.staging -- cross-var foobar -F --b %MY_VARIABLE%

Complex command:

dotenv -- cross-var \"echo \\l; | foobar -F --b %MY_VARIABLE%\"

You’ll want to take note of three features that become necessary:

  • All variables need to be wrapped with %, like: %MY_VARIABLE%. This prevents the shell from substituting the variable before our script runs
  • -- comes from dotenv-cli and separates the internal command from the dotenv call so it knows which flags belong to dotenv and which belong to the internal command
  • \"...\" comes from cross-var and wraps more complicated commands

Thoughts/Next Steps

There will always be use cases, such as larger build/CI/CD systems, where this solution is not a good fit.

But in an age of build systems that aim for extreme simplicity, like Parcel, it's nice to have an approach to handling secrets that feels consistently easy. When you can keep things simple, why not?

Is this a common enough scenario to justify adding the feature to dotenv-cli? Or should a new tool (dotenv-var?) be developed to combine the two into a single command?
