React Applications: Build Once, Deploy Anywhere in React

As a DevOps Engineer at adjoe, I’ve recently had to deal with the deployments of multiple React applications from one monorepo. These React applications get deployed to multiple environments – for example, development, sandbox, staging, and production.

These React apps then run on 

  • dev.example.com/react-app-1
  • dev.example.com/react-app-2
  • prod.example.com/react-app-1

And so on.

What Was the Problem?

All of these React applications were created following the create-react-app (CRA) approach, which, at the time, was the recommended way to start developing these applications.

The problem was that the builds for these React applications were environment-specific. This meant that when deploying to dev, we had to build the application for dev and could not deploy this same build to staging or production.

Each deployment of our application required a new build, which costs our developers time and unnecessarily increases the load on our pipeline infrastructure.

Diagram showing the initial pipeline with tightly coupled build and deploy stage

In this process, we are also violating the Twelve-Factor App principle of strictly separating the build and run stages.

This also led to more clutter in our container registry, as there needed to be a separate image of the application for each environment. A deployment to sandbox then had to grab the app-sandbox image from the registry; the app-staging image, for example, would not work when deployed to sandbox.

Diagram showing the initial state of our ECR container registry

Why Were the Builds Environment-Specific?

Have you ever wondered how environment variables work for client-side (JavaScript) applications? Well, they don’t – there simply are no environment variables. Environment variables are defined by the environment an application runs in – most of the time, some Linux server – but for a client-side application, that environment is the browser, and there are no environment variables we can set there.

That might leave you wondering: “What about the REACT_APP_-prefixed environment variables mentioned in the official create-react-app documentation?” Inside the source code, they are even accessed via process.env. So these are clearly environment variables, right? Wrong!

In reality, the process object does not exist inside the browser; it is Node-specific. Because create-react-app (CRA) does no server-side rendering by default, it cannot inject environment variables while serving content. Instead, during the build, Webpack replaces every occurrence of process.env with the literal string values present at that moment. This means the application can only be configured at build time.
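
As a hypothetical illustration (the file name and variable are assumptions, not taken from our codebase), this is what that replacement amounts to:

// src/api.js – before the build, this looks like it reads an environment variable
export const apiUrl = process.env.REACT_APP_ENV_URL;

// After building with REACT_APP_ENV_URL=https://dev.example.com, the bundle
// effectively contains the hard-coded string instead:
// export const apiUrl = "https://dev.example.com";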

The build and deploy stages in this pipeline are tightly coupled because we include npm run build in our Dockerfile, and Webpack hard-codes the environment variables present at that moment into the JavaScript bundle. Thus, when building one of these applications for sandbox, we hard-code sandbox-specific environment variables into the bundle. As a result, we cannot reuse the same image for different environments, even though it contains exactly the same code.
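
To make the coupling concrete, here is a minimal sketch of such an environment-specific image build; the base images, build arguments, and paths are assumptions rather than our exact Dockerfile:

# Sketch of an environment-specific build – not our exact Dockerfile
FROM node:18 AS build
WORKDIR /app
COPY . .

# The target environment is baked in at build time:
# Webpack inlines these values into the JavaScript bundle
ARG REACT_APP_ENV
ARG REACT_APP_ENV_URL
ENV REACT_APP_ENV=$REACT_APP_ENV \
    REACT_APP_ENV_URL=$REACT_APP_ENV_URL
RUN npm install && npm run build

# The resulting image only works for the environment it was built for
FROM nginx
COPY --from=build /app/build /usr/share/nginx/html/react-app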

How Can We Work Around That?

Instead of incorporating these variables into the build process, we define them at the very last possible moment, at the start of the container.

To do this, we change the entry point of the Docker container to a bash script. This script reads the variables from the environment the container runs in – in our case, AWS ECS – and writes them into a JavaScript file.

#!/bin/bash

# Start a fresh env-config.js that exposes the variables on window.env
echo "window.env = {" > ./react-app/env-config.js

# Read .env line by line (|| [[ -n "$line" ]] also catches a missing trailing newline)
while read -r line || [[ -n "$line" ]];
do
  # Split lines of the form NAME=default into the variable name and its default value
  if printf '%s\n' "$line" | grep -q -e '='; then
    varname=$(printf '%s\n' "$line" | sed -e 's/=.*//')
    varvalue=$(printf '%s\n' "$line" | sed -e 's/^[^=]*=//')
  fi
  # Prefer the value set in the container environment; fall back to the .env default
  value=$(printf '%s\n' "${!varname}")
  [[ -z $value ]] && value=${varvalue}
  echo "  $varname: \"$value\"," >> ./react-app/env-config.js
done < .env

echo "}" >> ./react-app/env-config.js
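
The .env file the script iterates over simply lists the variable names together with default values. A hypothetical example (the values are placeholders, not our real configuration):

REACT_APP_ENV_URL=http://localhost:3001
REACT_APP_ENV=development

Because the script treats every NAME=value line as a variable definition, the file should contain only lines of that form.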

Running this script with the environment variables REACT_APP_ENV_URL and REACT_APP_ENV present creates the file env-config.js with content such as the following, depending on the values of the variables:

// env-config.js
window.env = {
  REACT_APP_ENV_URL: "https://sandbox.example.com",
  REACT_APP_ENV: "sandbox",
}
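
In the Dockerfile, the script then becomes the container's entry point, so it runs before the web server starts serving the static files. The following is a minimal sketch assuming an nginx-based image and paths that mirror the script above; our actual Dockerfile differs in its details:

# Sketch only – base image and paths are assumptions
FROM nginx
WORKDIR /usr/share/nginx/html

# The environment-agnostic production build, created once in CI
COPY build/ ./react-app/
COPY env.sh .env ./
RUN chmod +x env.sh

# Generate react-app/env-config.js from the container's environment variables
# (e.g. from the ECS task definition), then start serving
ENTRYPOINT ["/bin/bash", "-c", "./env.sh && exec nginx -g 'daemon off;'"]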

We then load these variables in our index.html file as the very first thing, which makes them available to the rest of the application.

<script src="%PUBLIC_URL%/env-config.js"></script>


Throughout the rest of the code, we can now access these variables via window.env.REACT_APP_ENV. This way, the variables are present when the application starts, but we no longer need to set them during the build process.
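
For convenience, access to these values can also be centralized in a small module – a minimal sketch, where the file name and the empty-object fallback are assumptions rather than our exact setup:

// src/config.js – one central place to read the runtime variables
// window.env is populated by env-config.js, which the entrypoint script generates
const env = (typeof window !== "undefined" && window.env) || {};

export const ENV_URL = env.REACT_APP_ENV_URL;
export const ENVIRONMENT = env.REACT_APP_ENV;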

What About Local Development?

During development, our developers run the application locally using Docker Compose. Here, we have to find an alternative way to set these variables, as we can’t rely on the ECS task definition.

The following code snippet shows a part of our docker-compose.yml file, where you can see that we also set the environment variables here.

react-app:
    build:
      context: .
      dockerfile: ./react-app/docker/Dockerfile
    command: yarn run start:dev
    ports:
      - 5040:3001
    environment:
      REACT_APP_ENV_URL: 'https://local.dsp-dashboard.adjoe.zone'
      REACT_APP_ENV: 'development'

We then make sure that our env.sh script from before is the first thing that runs once the container starts by adjusting the start:dev command in our package.json file.

Before this change, start:dev executed the following:

"start:dev": "yarn install; yarn start"

Now we’ve modified the command to first execute the new env.sh script and to copy the result into the public directory.

"start:dev": "chmod +x ./env.sh && ./env.sh && cp env-config.js ./public/ && yarn install; yarn start"

What Is the Result?

This change benefits us in two ways:

  • We have automatic builds on push
  • We can deploy to multiple environments

Automatic builds on push: Since builds were previously part of the deployment, they were not triggered automatically. With only the tests running on incoming commits, a change that broke the build could still make it into our main branch. Now builds run automatically as well, so a change that fails to build is caught long before it reaches the main branch.

Deploying to multiple environments: When a build passes, it can now be deployed to multiple environments seamlessly. This saves precious time for our developers.

Diagram showing the final architecture of the deployment pipeline

And, of course, the container registry is now also cleaner, as there is only one image per React application – albeit still with different tags based on the branch.

Diagram showing the state of our container registry after the clean-up

Lessons Learned

To sum it all up, we learned that there are generally no environment variables in the browser. This makes it complicated to build client-side applications that can be deployed to multiple environments, such as development, staging, and production. We also found a way around this limitation: reading the environment variables at the start of the container, writing them into a JavaScript file, and loading that file before the rest of the application.

Our team’s key takeaways were as follows:

  • Always separate build and deploy stages
  • Environment variables don’t exist in the browser
  • Define environment-specific variables at (container) startup time
