How to Create a Real-Time Communication Web App with React and the Firebase API

Introduction

A real-time communication web application facilitates live audio and video communication using a peer-to-peer protocol exposed through simple APIs. In this article, we will build a 1-to-1 video chat whose peer-to-peer connection involves a third-party server used for signalling and for storing the shared data needed for stream negotiation. We will use React to build our UI and Firebase's Firestore database to listen for and apply updates in real time.

User Guide

  • Open the project
  • Start your webcam
  • Confirm video and audio permissions
  • Create a new call
  • Join a call (answer the call from a different browser window or device)
  • Hangup

Prerequisites

  • Basic knowledge of React and JavaScript
  • Basic understanding of the Firebase API

Requirements

  • Linux, or Windows 8 or any higher version
  • Node.js
  • React
  • Visual Studio Code (VS Code)
  • This article uses a Git Bash terminal and the Linux operating system.

What you will learn:

  • How to set up a React project.
  • How to authenticate and integrate React with the Firebase API.
  • How to create a peer-to-peer web Real-Time Communication (RTC) app using React and Firebase.

INSTALL REQUIRED APPS

1. Install node.

In your browser, go to the official Node.js website and download the Node.js installer for your operating system.

  • In your terminal, type the following command to confirm the Node.js installation: node -v

2. Install Visual Studio Code

  • Use this link to download VS Code and install the app.

3. Install React

  • Create your project's root folder and access it via your terminal
  • Create your React app; we will name ours ‘videocall’: npx create-react-app videocall. This command downloads and installs React for you, so a separate npm install react is not needed. The project's first instance is successfully built at this point
  • Run the project using: npm start
  • The project uses port 3000 as your local server. Open http://localhost:3000 to view your project's first instance

To open the directory via VsCode, type this command in your terminal: code .


The project's first page renders the DOM element from the index component in the ‘src/index.js’ file, which will serve as our root component. That component in turn renders the components exported from ‘src/App.js’.

FIREBASE INTEGRATION

4. Create a firebase account

  • Use this link in your browser to access the official Firebase website
  • Click the ‘Go to console’ option at the top right

You will be directed to a page requiring you to create a project. Click to create a project and complete the three steps that follow. This article uses the following details:

  • Project name – video call
  • Enable Google Analytics
  • Configure Google Analytics with – Default account for firebase.

Once the procedure is complete, your Firebase project should be up and running. We will begin by accessing our authentication parameters: go to the settings page through the settings icon at the top right of the sidebar.


On the following page, scroll down to ‘Your apps’ option, where you will be prompted to select a platform for your project.

We are using Node.js to create a web application, so from the options given, select ‘Web’ to proceed.

Type an app nickname and click ‘Register app’. Our app will be named ‘webrtc’.


The second step will display your Firebase configuration keys. Click ‘Continue to console’ at the bottom.

Through the ‘Your apps’ option on the ‘Project settings’ page, you will always be able to access these keys and use them for your project's environment variables.

5. Integrate Firebase

  • To integrate Firebase with our React app, we need to add environment variables in our project's root directory.
  • In the above-mentioned directory, create a file called ‘.env’ and paste the following inside.
  • Fill in each blank parenthesis with the respective key from Firebase.
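The original snippet is not reproduced here; below is a sketch of a ‘.env’ file following the Create React App convention that browser-visible variables are prefixed with REACT_APP_. The exact variable names are an assumption and must match what your index.js reads:

```
REACT_APP_API_KEY=()
REACT_APP_AUTH_DOMAIN=()
REACT_APP_PROJECT_ID=()
REACT_APP_STORAGE_BUCKET=()
REACT_APP_MESSAGING_SENDER_ID=()
REACT_APP_APP_ID=()
REACT_APP_MEASUREMENT_ID=()
```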

At this point, we need to add Firebase to our React project as a dependency. We can achieve this through: npm install firebase. To ensure Firebase is installed, open the file named ‘package.json’ in your root directory. You should see firebase listed under ‘dependencies’.


6. Measuring Performance

By default, Create React App includes a performance relayer that allows you to measure and analyze the performance of your application using different metrics. With web vitals, we have a clear metric for user experience.

Install web vitals via your terminal: npm install web-vitals

Go to your src directory and create a file called ‘reportWebVitals.js’. Paste in the following code:
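This is the standard reportWebVitals helper that Create React App generates; if your scaffolded project already contains a copy, prefer that one:

```javascript
const reportWebVitals = (onPerfEntry) => {
  // Only load the web-vitals package when a callback is actually supplied
  if (onPerfEntry && onPerfEntry instanceof Function) {
    import('web-vitals').then(({ getCLS, getFID, getFCP, getLCP, getTTFB }) => {
      getCLS(onPerfEntry);  // Cumulative Layout Shift
      getFID(onPerfEntry);  // First Input Delay
      getFCP(onPerfEntry);  // First Contentful Paint
      getLCP(onPerfEntry);  // Largest Contentful Paint
      getTTFB(onPerfEntry); // Time To First Byte
    });
  }
};

export default reportWebVitals;
```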

The above code measures all the web vital metrics on real users, accurately matching how Chrome measures them and how they are reported to other Google tools.

With our web vitals ready, let’s import them to our root component.

7. Create Root Component

Go to the ‘src/index.js’ file and replace the contents with the following:
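The original listing is not shown here, so the sketch below reconstructs it from the code guide that follows (React, ReactDOM, CSS, App, web vitals and firebase imports, env-based configuration, a Firestore persistence setting, and a StrictMode render). It assumes the Firebase v8 namespaced SDK, and both the env variable names and the enablePersistence({ synchronizeTabs: true }) call are assumptions based on the guide's description; line numbers may differ slightly from the guide:

```javascript
import React from 'react';
import ReactDOM from 'react-dom';
import './index.css';
import App from './App';
import reportWebVitals from './reportWebVitals';
import firebase from 'firebase/app';
import 'firebase/firestore';

// Authentication keys are read from the .env file via process.env
firebase.initializeApp({
  apiKey: process.env.REACT_APP_API_KEY,
  authDomain: process.env.REACT_APP_AUTH_DOMAIN,
  projectId: process.env.REACT_APP_PROJECT_ID,
  storageBucket: process.env.REACT_APP_STORAGE_BUCKET,
  messagingSenderId: process.env.REACT_APP_MESSAGING_SENDER_ID,
  appId: process.env.REACT_APP_APP_ID,
  measurementId: process.env.REACT_APP_MEASUREMENT_ID,
});
// Share local persistence and query execution across all open tabs
firebase.firestore().enablePersistence({ synchronizeTabs: true });

ReactDOM.render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
  document.getElementById('root')
);

reportWebVitals();
```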

Code guide:

  • Line 1: imports React to use Hooks or other exports that React provides.
  • Line 2: imports the react-dom package, which provides methods used at the top level of our React application and an escape hatch to get outside the React model if necessary.
  • Line 3: imports CSS for our component's styling
  • Line 4: imports the App component from the same folder
  • Line 5: imports web vitals
  • Line 6: imports firebase
  • Lines 8-16: introduce our authentication keys using process.env.

**NOTE** process.env is injected by Node at runtime for your application to use, and it represents the state of the system environment your application is in when it starts.

  • Line 17: passes settings to Firestore to configure Firestore persistence. Enabling synchronized tabs gives all open tabs shared access to local persistence, shared execution of queries and latency-compensated local document updates across all connected instances.
  • Line 20: ReactDOM.render controls the contents of the container node you pass in. Any existing DOM elements inside are replaced when it is first called; later calls use React's DOM diffing algorithm for efficient updates.
  • Line 21: StrictMode is a tool for highlighting potential problems in an application. Like Fragment, StrictMode does not render any visible UI; it activates additional checks and warnings for its descendants.
  • Line 22: passes in the App component imported on line 4
  • Line 24: we call this a “root” DOM node because React DOM will manage everything inside it.
  • Line 27: if you want to start measuring performance in your app, pass a function to log the results (for example: reportWebVitals(console.log))

With our index page set up, we need to create another component, named ‘streamer’, in which we shall implement our video call components.

8. Create Streamer component

From the src directory, create a folder and name it components. Inside the components directory, create a file and name it ‘streamer.js.’

We will have to ensure that our component can run when instructed. Paste the following code inside the streamer.js component
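A minimal sketch of a component that satisfies this step might look like:

```javascript
import React from 'react';

function Streamer() {
  // Placeholder output to confirm the component renders
  return <h1>Streamer component works</h1>;
}

export default Streamer;
```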

The function above will display the string “Streamer component works” when called.

In the ‘src/App.js’ file, replace the contents with the following:
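A sketch of App.js matching the code guide below, which imports the styles, imports Streamer from the components folder, and returns it:

```javascript
import './App.css';
import Streamer from './components/streamer';

function App() {
  return (
    <div className="App">
      {/* Render the Streamer component created above */}
      <Streamer />
    </div>
  );
}

export default App;
```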

Code guide:

  • Line 1: includes CSS styling in the component
  • Line 2: imports the Streamer component
  • Line 5: creates the function App
  • Lines 6-11: pass the Streamer component in the function's return statement.
  • Run the project: npm start

We can now populate our streamer component to achieve a video call.

Let us begin with adding our desired element to a css component.

In the components folder, create a new file, name it Streamer.module.css, and paste in the following styles:
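The original styles are not reproduced here; below is a minimal sketch that gives the two video elements a visible size. The class names .videos and .video are assumptions that the markup must match:

```css
.videos {
  display: flex;
  justify-content: center;
  flex-wrap: wrap;
}

.video {
  width: 40vw;
  height: 30vw;
  margin: 1rem;
  background: #2c3e50; /* visible box before the stream starts */
  border-radius: 5px;
}
```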

Back in the streamer component, start by adding the following imports above your Streamer function:
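A sketch of the four imports described in the guide below, assuming the Firebase v8 namespaced SDK:

```javascript
import styles from './Streamer.module.css'; // component styles
import firebase from 'firebase/app';        // firebase core
import 'firebase/firestore';                // firestore side effects
import React, { useRef } from 'react';      // React and the useRef Hook
```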

Code guide:

  • Line 1: imports styles from the Streamer.module.css file
  • Line 2: imports firebase so we can update Firebase data
  • Line 3: imports firebase/firestore to trigger the Firebase side effects
  • Line 4: imports React to use Hooks or other exports that React provides.

9. Include required variables

Below the Imports, replace the Streamer function with the following

Code guide:

  • Under the streamer function, create the variables
  • webcamButtonRef: Reference to the ‘start Webcam’ Button
  • webcamVideoRef: Reference to the Local stream video element.
  • callButtonRef: Reference to the ‘Create call(offer)’ button
  • callInputRef: Reference to caller id input element
  • answerButtonRef: Reference to the ‘answer call’ button
  • remoteVideoRef: Reference to Remote stream video element
  • hangupButtonRef: Reference to the ‘hangup call’ button.
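The references described above could be declared inside the Streamer function with React's useRef Hook; a sketch:

```javascript
// Inside the Streamer function:
const webcamButtonRef = useRef(null); // 'Start webcam' button
const webcamVideoRef = useRef(null);  // local stream video element
const callButtonRef = useRef(null);   // 'Create call (offer)' button
const callInputRef = useRef(null);    // caller id input element
const answerButtonRef = useRef(null); // 'Answer call' button
const remoteVideoRef = useRef(null);  // remote stream video element
const hangupButtonRef = useRef(null); // 'Hangup call' button
```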

Proceed by adding the following code below the variables.

Code guide:

  • Line 1: since we will be interacting with Firestore, we grab a reference to the Firestore database object.
  • Line 2: creates an array named ‘recordedChunks’ (take note of this array; we will use it in a moment).
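A sketch of these two lines:

```javascript
// Reference to the Firestore database object
const firestore = firebase.firestore();
// Recorded video data is collected here (used by the optional recorder)
const recordedChunks = [];
```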

10. Create peer connection

The RTCPeerConnection interface allows communication between peers.

Add the following code below your variables.

Code guide:

Our video elements will stream using global variables for the peer connections. Our newly created RTCPeerConnection references STUN servers hosted by Google. These servers are effective in discovering a suitable IP address and port when setting up a P2P connection. The iceCandidatePoolSize is an unsigned 16-bit integer value that specifies the size of the prefetched ICE candidate pool. Connections can be established more quickly by allowing the ICE agent to start fetching ICE candidates before you start trying to connect, so that they are already available for inspection when RTCPeerConnection.setLocalDescription() is called. By changing the size of the ICE candidate pool, we trigger the beginning of ICE gathering.

  • The stream objects (localStream and remoteStream) hold the local and remote media streams and define their behaviour.
  • The options variable is assigned a MIME type for WebM videos. A Multipurpose Internet Mail Extensions (MIME) type is a standard that indicates the nature and format of a document, file, or assortment of bytes.
  • The MediaRecorder object acts as the interface to the media stream and provides functionality to record media easily.
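A sketch of the setup described above, using Google's public STUN servers; the pool size of 10 and the exact mimeType string are assumptions:

```javascript
const servers = {
  iceServers: [
    // Google-hosted STUN servers for discovering a public IP/port
    { urls: ['stun:stun1.l.google.com:19302', 'stun:stun2.l.google.com:19302'] },
  ],
  iceCandidatePoolSize: 10, // prefetch ICE candidates to speed up connection
};

const pc = new RTCPeerConnection(servers);
let localStream = null;   // our webcam feed
let remoteStream = null;  // the other peer's feed

const options = { mimeType: 'video/webm' }; // format for the optional recorder
let mediaRecorder = null;
```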

11. Video Stream

Now we create a webcam feed using the MediaStream interface.

Below the previous code, add the following.

Code guide:

  • Line 1: creates an async function called ‘webCamHandler’. We will use this function to provide the video feed from our webcam using the MediaStream interface.
  • Line 2 (optional): notifies with a string in console.log when the function fires.
  • Lines 3-5: activate your device's audio and video
  • Line 8: creates the remote MediaStream
  • Lines 11-13: connect the stream received from ‘localStream’ to the RTCPeerConnection. The MediaStream should consist of at least one media track, each individually added to the RTCPeerConnection, when transmitting media to the remote peer.
  • Lines 16-20: create a listener on our local peer connection (pc), which listens for the incoming track event and then populates the remote stream.
  • Line 22: populates the local stream video element using the webcamVideoRef reference.
  • Line 23: populates the remote stream video element using the remoteVideoRef reference.

**NOTE**: The following lines act as an optional head start for readers interested in developing a recording feature.

  • Line 26: creates a MediaRecorder instance, which provides an easy interface for recording media.
  • Lines 27-32: register a data event listener that populates the ‘recordedChunks’ array with event data.
  • Line 34: activates the MediaRecorder.
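Putting the guide together, webCamHandler might look like the following sketch (it assumes the pc, localStream, remoteStream, options, mediaRecorder and recordedChunks variables described earlier; the recording lines at the end are the optional part):

```javascript
const webCamHandler = async () => {
  console.log('webCamHandler fired'); // optional

  // Activate the device's audio and video
  localStream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  remoteStream = new MediaStream();

  // Push each local track to the peer connection
  localStream.getTracks().forEach((track) => {
    pc.addTrack(track, localStream);
  });

  // Pull tracks from the remote peer as they arrive
  pc.ontrack = (event) => {
    event.streams[0].getTracks().forEach((track) => {
      remoteStream.addTrack(track);
    });
  };

  // Populate the video elements through the refs
  webcamVideoRef.current.srcObject = localStream;
  remoteVideoRef.current.srcObject = remoteStream;

  // Optional recording head start
  mediaRecorder = new MediaRecorder(localStream, options);
  mediaRecorder.ondataavailable = (event) => {
    if (event.data.size > 0) recordedChunks.push(event.data);
  };
  mediaRecorder.start();
};
```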

We now create our call handler. Add the following code below the function above.

Code guide:

  • Line 1: creates the async call handler function
  • Line 2 (optional): notifies with a string in console.log when the callHandler function fires.
  • Lines 3-6: create variables that reference Firestore collections. The Firestore database will use this information to allow peers to relay the information required to establish a connection.
  • Lines 11-13: considering each call document contains a subcollection of offer candidates, we get each offerCandidate and save it to the database.
  • Lines 16-24: create a call offer. We use the offer method to initiate the creation of an SDP offer to start a new WebRTC connection to a remote peer.
  • Lines 27-33: use the onSnapshot method to listen to a document. An initial call using the provided callback creates a document snapshot immediately with the current contents of the single document. Each time the contents change, another call updates the document snapshot.
  • Lines 36-43: use the logic above, when a call is answered, to add each candidate to the peer connection.
  • Line 45: enables the hangup button once the callHandler function has run.
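A sketch of the callHandler flow described above, using the Firestore v8 namespaced API; the collection names ('calls', 'offerCandidates', 'answerCandidates') are assumptions that both peers must share:

```javascript
const callHandler = async () => {
  console.log('callHandler fired'); // optional

  // Firestore references for this call's signalling data
  const callDoc = firestore.collection('calls').doc();
  const offerCandidates = callDoc.collection('offerCandidates');
  const answerCandidates = callDoc.collection('answerCandidates');

  callInputRef.current.value = callDoc.id; // show the call id to share

  // Save each local ICE candidate to the database
  pc.onicecandidate = (event) => {
    event.candidate && offerCandidates.add(event.candidate.toJSON());
  };

  // Create the SDP offer and store it for the remote peer
  const offerDescription = await pc.createOffer();
  await pc.setLocalDescription(offerDescription);
  await callDoc.set({
    offer: { sdp: offerDescription.sdp, type: offerDescription.type },
  });

  // Listen for the remote answer
  callDoc.onSnapshot((snapshot) => {
    const data = snapshot.data();
    if (!pc.currentRemoteDescription && data?.answer) {
      pc.setRemoteDescription(new RTCSessionDescription(data.answer));
    }
  });

  // When answered, add each answer candidate to the peer connection
  answerCandidates.onSnapshot((snapshot) => {
    snapshot.docChanges().forEach((change) => {
      if (change.type === 'added') {
        pc.addIceCandidate(new RTCIceCandidate(change.doc.data()));
      }
    });
  });

  hangupButtonRef.current.disabled = false; // enable hangup once a call exists
};
```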

We now work on our answerHandler function. Paste the following code below the function above.

Code guide:

  • Line 1: creates the async answer handler function
  • Line 2 (optional): notifies with a string in console.log when the answerHandler function fires.
  • Line 3: reads the provided id value from the caller id input element
  • Lines 4-6: reference Firestore collections using the respective variables. The Firestore database will use this information to allow peers to relay the information required to establish a connection.
  • Lines 8-10: considering each call document contains a subcollection of answer candidates, we save each answerCandidate to the database.
  • Line 12: assigns callerID the caller id data from the database
  • Lines 14-27: create the answer. We use the answer method to respond to the SDP offer and complete the WebRTC connection to the remote peer.
  • Lines 27-33: use the onSnapshot method to listen to a document. An initial call using the provided callback creates a document snapshot immediately with the current contents of the single document. Each time the contents change, another call updates the document snapshot.
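A sketch of the answerHandler flow described above, mirroring the callHandler's Firestore references (the same assumed collection names apply):

```javascript
const answerHandler = async () => {
  console.log('answerHandler fired'); // optional

  // Read the call id pasted into the input element
  const callId = callInputRef.current.value;
  const callDoc = firestore.collection('calls').doc(callId);
  const answerCandidates = callDoc.collection('answerCandidates');
  const offerCandidates = callDoc.collection('offerCandidates');

  // Save each local ICE candidate as an answer candidate
  pc.onicecandidate = (event) => {
    event.candidate && answerCandidates.add(event.candidate.toJSON());
  };

  // Fetch the caller's offer from the database and apply it
  const callData = (await callDoc.get()).data();
  await pc.setRemoteDescription(new RTCSessionDescription(callData.offer));

  // Create the SDP answer and store it for the caller
  const answerDescription = await pc.createAnswer();
  await pc.setLocalDescription(answerDescription);
  await callDoc.update({
    answer: { type: answerDescription.type, sdp: answerDescription.sdp },
  });

  // Add each of the caller's candidates to the peer connection
  offerCandidates.onSnapshot((snapshot) => {
    snapshot.docChanges().forEach((change) => {
      if (change.type === 'added') {
        pc.addIceCandidate(new RTCIceCandidate(change.doc.data()));
      }
    });
  });
};
```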

Our handler functions are complete! Proceed by adding the following code below them.

Code guide:

  • Line 1: creates the function hangupHandler
  • Line 2 (optional): notifies with a string in console.log when the hangupHandler function fires.
  • Lines 4-5: obtain the localStream video element and get its track list by calling the getTracks() method. We iterate over the track list using the forEach() method and call each track's stop() method.
  • Lines 6-9: the MediaRecorder.onstop event handler (part of the MediaRecorder API) handles the stop event, allowing you to run code in response to a MediaRecorder recording being stopped.
  • Lines 11-15 (optional): still part of the recording feature, you can use these lines to download your video stream and upload it to your cloud storage website of choice. This project will only focus on achieving a video chat stream.
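A sketch of hangupHandler; the blob handling inside onstop is the optional download part:

```javascript
const hangupHandler = () => {
  console.log('hangupHandler fired'); // optional

  // Stop every local track (this also turns off the webcam light)
  const tracks = webcamVideoRef.current.srcObject.getTracks();
  tracks.forEach((track) => track.stop());

  // Optional: once recording stops, expose the captured video as a blob URL
  mediaRecorder.onstop = () => {
    const blob = new Blob(recordedChunks, { type: 'video/webm' });
    const downloadUrl = URL.createObjectURL(blob);
    console.log('recording stopped, download url:', downloadUrl);
  };
  mediaRecorder.stop();
};
```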

12. User Interface

Our final stage will be to create a user-friendly interface for our functions.

An optional feature is to set a background for your UI. You can do this by going to ‘src/index.css’ and adding the following to your body selector: background: url('https://wallpapercave.com/uwp/uwp565863.jpeg');

The background is dark themed, so add the following below the background to ensure visibility of the text objects.

color: #fff; /* colours the text white */

We will also use Material-UI to customize our components with material design.

Install Material-UI's source files; the module will take care of all the CSS injection needed.

npm install @material-ui/core

Head to the ‘src/index.css’ file and add the following.

Head back to your streamer component. At the top, import the Material-UI Button: import { Button } from '@material-ui/core'

In your Streamer function's return statement, paste the following code:
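The original markup is not shown here; the sketch below wires the refs and handlers defined earlier to Material-UI buttons. The class names from Streamer.module.css and the exact headings are assumptions, the optional download button is omitted, and line numbers may not match the code guide exactly:

```javascript
return (
  <div className={styles.app}>
    <h3>1. Start your Webcam</h3>
    <div className={styles.videos}>
      <span>
        <h3>Local Stream</h3>
        <video ref={webcamVideoRef} className={styles.video} autoPlay playsInline></video>
      </span>
      <span>
        <h3>Remote Stream</h3>
        <video ref={remoteVideoRef} className={styles.video} autoPlay playsInline></video>
      </span>
    </div>

    <Button ref={webcamButtonRef} onClick={webCamHandler}>Start webcam</Button>

    <h3>2. Create a new Call</h3>
    <Button ref={callButtonRef} onClick={callHandler}>Create Call (offer)</Button>

    <h3>3. Join a Call</h3>
    <input ref={callInputRef} placeholder="paste the call id here" />
    <Button ref={answerButtonRef} onClick={answerHandler}>Answer</Button>

    <h3>4. Hangup</h3>
    <Button ref={hangupButtonRef} onClick={hangupHandler}>Hangup</Button>
  </div>
);
```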

Code guide:

  • Line 1: beginning of the return statement
  • Line 3: subheading containing the first instruction (start webcam)
  • Line 4: video stream container
  • Lines 5-13: create a span containing the local stream video element
  • Lines 14-22: create a span containing the remote stream video element
  • Lines 25-27: create the button that initiates the local user's webcam
  • Lines 29-31: create a button to initiate a call offer (get the video call offer id)
  • Line 36: creates the input element that receives the call offer id
  • Lines 37-39: create the answer call button
  • Line 41: creates the hangup button
  • Lines 46-49 (optional): create a download button to download the streamed video for upload to your cloud storage server of choice.

That’s it!

Our UI is fully set up, and its layout is as shown below

(UI screenshots: top and bottom of the page)

Optional Approach to this project:

  • Create project directory
  • Use the repository link to clone your app: git clone https://github.com/apeli23/reactwebrtc.git
  • Install project dependencies with: npm install
  • Run the project: npm run dev

This completes the project. The application provided should be able to achieve the following:

  • Create a functioning React application and include other functional components.
  • Integrate a React application with the Firebase API
  • Achieve Real-Time Communication using a React application

**STEPS TO NOTE**

Remember to:

  • Set up your Firebase env variables as you integrate your project with Firebase
  • Exclude your .env file when uploading your files to a repository (recommended practice)
  • Use a laptop or computer that fulfils the required hardware expectations (audio and video streamable)

We have successfully built our video chat streaming application.

Happy coding!
