System Design - Online Education Platform Part 3

August 1st 2021


Table of Contents

  1. Deployment

Deployment

Deployment is a big step forward for this project. The app will be deployed on an AWS EC2 t2.micro instance.

Overview of the Proxy

I’ve discussed what a proxy is in the first post of this series. To review briefly: it serves static content and the content generated by the services. With deployment and scalability in mind, we will deploy the app with a minimum of three servers: one for the proxy, one for the service, and one for the service’s database.

Web Sequence Example

Overview of AWS

An EC2 instance is a virtual server. The simplest way to deploy onto the instance is to SSH into the server via the terminal, pull the project repo, npm install, and npm start. To keep the instance running in case it crashes for any reason, I used pm2 instead.
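As a minimal sketch of the pm2 workflow (assuming pm2 is installed globally and the app’s entry point is app.js — the process name below is just an illustrative label):

```shell
# One-time: install pm2 globally on the instance.
npm install -g pm2

# Start the app under pm2; pm2 restarts the process automatically if it crashes.
pm2 start app.js --name instructors-service

# Check process state and tail logs.
pm2 status
pm2 logs instructors-service

# Optional: generate a startup script and save the process list
# so the app also comes back after an instance reboot.
pm2 startup
pm2 save
```

The main win over plain npm start is the automatic restart on crash, which matters on an unattended t2.micro.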

Stress Test 1: Initial Performance

These initial stress tests were run from my local machine with k6, and tracked with New Relic.
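For reference, a run at one of the tested rates can be sketched as follows. The endpoint path, port, and numbers are illustrative assumptions, not the project’s exact test script; k6’s constant-arrival-rate executor is what holds the request rate steady regardless of response latency:

```shell
# Write a minimal k6 script targeting a hypothetical instructors GET endpoint.
cat > get-test.js <<'EOF'
import http from 'k6/http';

export const options = {
  scenarios: {
    constant_rps: {
      executor: 'constant-arrival-rate',
      rate: 100,            // target requests per second (1, 10, 100, 1000 per run)
      timeUnit: '1s',
      duration: '30s',
      preAllocatedVUs: 200, // virtual users available to sustain the rate
    },
  },
};

export default function () {
  http.get('http://localhost:3000/instructors/1'); // example endpoint
}
EOF

# Run it and read dropped requests / latency from the summary.
k6 run get-test.js
```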

Instructors GET requests

RPS    % Dropped Requests
1      0.00%
10     0.00%
100    2.05%
1000   46.12%

Instructors POST requests

RPS    % Dropped Requests
1      0.00%
10     0.00%
100    0.20%
1000   44.17%

Currently, as peers work on their own services, the proxy is tested while serving only the instructor service.

Proxy GET requests

RPS    % Dropped Requests
1      0.00%
10     0.00%
100    0.07%
1000   20.28%

At 1000 RPS, the dropped requests rise sharply. It is possible that the server times out and drops many requests at once.

Deploy the Service and Proxy

When creating a new EC2 instance, I chose t2.micro. Security inbound rules allow HTTP (port 80) and HTTPS access from anywhere. This will also allow the proxy to make requests to the service instance.
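The same inbound rule can also be expressed with the AWS CLI instead of the console. This is a sketch with a placeholder security group ID, not the exact group used here:

```shell
# Allow inbound HTTP (port 80) from anywhere.
# sg-0123456789abcdef0 is a placeholder -- substitute your instance's security group.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 80 \
  --cidr 0.0.0.0/0
```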

No special configurations needed to be made when deploying the proxy or instructors service.

  1. SSH into the instance.
  2. Clone the project repo into the instance.
  3. Update any environment variables.
  4. npm install.
  5. If using pm2, pm2 start app.js, otherwise npm start will work just fine.

Other notes: remember to record the private IPs of the service instances, as we need to give the database servers these IPs to grant access to the data.

Deploy the DBMS Server

Deploying the database is simple, as we only need to set up the database and its schema (and dummy data if needed).

This means I installed PostgreSQL and set up the project’s schema in the database.

The primary configurations needed are regarding security. Security rules for the DB instance will need to allow connections from the service that will make requests directly to the database.

One file that needs to be updated is pg_hba.conf, specifically the # IPv4 local connections and # IPv6 local connections entries. To apply the changes, the Postgres server needs to be restarted: sudo service postgresql stop, then sudo service postgresql start.
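As a sketch, an added pg_hba.conf entry might look like the following, where the address is a placeholder for the service instance’s private IP range, not this project’s actual value:

```
# TYPE  DATABASE  USER  ADDRESS         METHOD
host    all       all   172.31.0.0/16   md5
```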

I also open up the database server PostgreSQL port to the service instance.

Somehow, the environment variables from my .env file were not being read properly. Instead, I moved the variables into .bash_profile from root. For the changes to apply, source .bash_profile must be run first.
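The .bash_profile entries are plain exports. The variable names and values below are placeholders for illustration, not the project’s actual config:

```shell
# Hypothetical connection settings -- substitute your own values.
export PGHOST=172.31.0.10   # private IP of the DB instance
export PGPORT=5432
export PGUSER=postgres
export PGDATABASE=instructors

# Sanity check that the current shell can see them.
echo "connecting to $PGHOST:$PGPORT"
```

After appending lines like these, run source ~/.bash_profile (or reconnect) so the service process inherits them.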

Stress Test 2: Deployed Performance

Instructors GET requests

Instructors service GET at 1000RPS

RPS    Latency   % Dropped Requests
1      29ms      0.00%
10     29ms      0.00%
100    27ms      0.00%
1000   6022ms    71.14%

Instructors POST requests

Instructors service POST at 1000RPS

RPS    Latency   % Dropped Requests
1      16ms      0.00%
10     15ms      0.00%
100    14ms      0.00%
1000   2848ms    41.04%

Proxy GET requests

Proxy at 1000RPS

RPS    Latency   % Dropped Requests
1      28ms      0.00%
10     26ms      0.00%
100    24ms      0.00%
1000   25ms      0.00%

As you may note, the app has a lot of trouble handling 1000 RPS, but generally has no problem up to 100 RPS.

The proxy requests show no drops at all, primarily because the proxy does not need to make any server requests itself. It only needs to request the bundle.js served by the service.

When programming with React, the component lifecycle allows an app to first load with static content only, then quickly update that content with backend data via state updates. This benefits the proxy, as it can serve the “dumb” bundle.js without needing to wait for data.

Next, I will slowly improve the performance of this app through horizontal scaling. Look forward to server-side rendering, caching, and load balancing.