Building a Real-Time Video Chat App with WebRTC, Socket.io, Node.js, and React

Full Implementation of Real-Time Video Chat

This post provides a complete implementation of a real-time video chat application using WebRTC and Socket.IO. The app enables video and audio communication between two users, with controls for muting audio, pausing video, and handling call events.

Key Features:

  • Video and Audio Communication: Real-time streaming using WebRTC.
  • Call Management: Handle incoming and outgoing calls, including offer/answer and ICE candidate exchange.
  • User Controls: Toggle audio and video on/off, start and end calls.
  • Notifications: Alerts for call status and call management actions.

Implementation Details:

  • Socket.IO Integration: Manages real-time signaling and event handling.
  • RTCPeerConnection: Establishes peer-to-peer connections and handles ICE candidates.
  • Media Stream Handling: Manages local and remote media streams for video and audio.
  • UI Controls: Provides intuitive buttons for controlling video and audio.

Components:

  • App Component: The core component for initializing and managing video calls.
  • Event Handlers: Functions to manage offers, answers, candidates, and hangups.
  • User Interface: Responsive layout with controls for call actions.

This implementation aims to deliver a robust and interactive experience for real-time communication, with clear UI elements for managing video calls.

Core WebRTC APIs:
WebRTC consists of several key APIs that enable its functionality:

  • getUserMedia API: This API allows web applications to access the user's camera and microphone, enabling real-time audio and video streaming.

  • RTCPeerConnection API: This API establishes and manages the peer-to-peer connection, facilitating audio and video communication between browsers. It handles network protocols, codecs, and security features like encryption.

  • RTCDataChannel API: In addition to audio and video streams, this API provides a peer-to-peer data channel for exchanging arbitrary data (files, text, etc.) directly between browsers.
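
To make these concrete, here is a minimal, illustrative sketch of the three APIs working together. The function and channel names are placeholders, and the offer/answer signaling is deliberately omitted since it is covered in the sections below.

async function webrtcApiSketch() {
  // getUserMedia: capture the local camera and microphone
  const localStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  // RTCPeerConnection: create the peer connection and add the captured tracks
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun1.l.google.com:19302" }],
  });
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));

  // RTCDataChannel: open a channel for arbitrary data (text, files, etc.)
  const channel = pc.createDataChannel("chat");
  channel.onopen = () => channel.send("hello over the data channel");
}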

To learn more, see the official WebRTC documentation.

Signaling Server with Socket.IO:

WebRTC has no built-in way to exchange offers, answers, and ICE candidates between peers, so a separate signaling channel is required. In this app, a Node.js server with Socket.IO fills that role.

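The snippets in this section assume an existing Socket.IO io instance. A minimal Node.js bootstrap might look like the following sketch; the port matches the client code later in this post, while the CORS setting is an assumption you should tighten for production.

// server.js — minimal Socket.IO signaling server (sketch)
const http = require("http");
const { Server } = require("socket.io");

const httpServer = http.createServer();
const io = new Server(httpServer, {
  cors: { origin: "*" }, // assumption: restrict to your client's origin in production
});

// connection handlers (shown below) are registered on this io instance

httpServer.listen(3000, () => console.log("Signaling server listening on :3000"));

The simplest relay then broadcasts every "calling" event to every other connected client: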

io.on("connection", (socket) => {
  console.log("Connected");

  socket.on("calling", (message) => {

    socket.broadcast.emit("calling", message);
  });

  socket.on("disconnect", () => {
    console.log("Disconnected");
  });
});




To establish a direct connection between two specific users, you can use Socket.IO to emit the calling event from one user only to the intended recipient instead of broadcasting it to all connected clients. Socket.IO makes this easy with rooms: if each client joins a room named after its user ID, the server can target that room with io.to(userId), so the connection request reaches only the intended user (see the room-join sketch after the next snippet).



io.on("connection", (socket) => {
  console.log("Connected");


  socket.on("calling", (message) => {

    io.to(message.userId).emit("calling", message);

  });

  socket.on("disconnect", () => {
    console.log("Disconnected");
  });
});



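For io.to(message.userId) to reach the right client, each socket has to join a room named after its own user ID when it connects. How the ID gets to the server is up to you; the handshake query used in this sketch is an assumption.

io.on("connection", (socket) => {
  // Assumption: the client passes its user ID in the connection query, e.g.
  // io("http://localhost:3000", { query: { userId } }) on the client side
  const { userId } = socket.handshake.query;
  if (userId) {
    socket.join(userId); // now io.to(userId) reaches this socket
  }
});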

Peer-to-Peer Connection:

The RTCPeerConnection is created with a configuration that lists public STUN servers, which let each peer discover how it can be reached across NATs:



const configuration = {
  iceServers: [
    {
      urls: ["stun:stun1.l.google.com:19302", "stun:stun2.l.google.com:19302"],
    },
  ],
  iceCandidatePoolSize: 10,
};
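
One caveat about this configuration: STUN only helps peers discover their public addresses. If either peer sits behind a restrictive (symmetric) NAT, media cannot flow directly and a TURN relay is needed as a fallback; the URL and credentials below are placeholders for your own TURN deployment.

const configurationWithTurn = {
  iceServers: [
    { urls: ["stun:stun1.l.google.com:19302", "stun:stun2.l.google.com:19302"] },
    {
      // Placeholder TURN server: replace with your own relay and credentials
      urls: "turn:turn.example.com:3478",
      username: "webrtc-user",
      credential: "webrtc-pass",
    },
  ],
  iceCandidatePoolSize: 10,
};

With the configuration in place, the calling side creates the peer connection, wires up the ICE candidate and remote track handlers, adds its local tracks, and sends an offer through the signaling channel: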



try {
    pc.current = new RTCPeerConnection(configuration);
    // Forward each ICE candidate to the other peer via the signaling server
    pc.current.onicecandidate = (e) => {
      const message = {
        type: "candidate",
        candidate: null,
        id: userInfo,
      };
      if (e.candidate) {
        message.candidate = e.candidate.candidate;
        message.sdpMid = e.candidate.sdpMid;
        message.sdpMLineIndex = e.candidate.sdpMLineIndex;
      }
      socket.emit("calling", message);
    };
    // Render the remote stream as soon as tracks arrive
    pc.current.ontrack = (e) => (remoteVideo.current.srcObject = e.streams[0]);
    // Send the local camera/microphone tracks to the remote peer
    localStream.current.getTracks().forEach((track) => pc.current.addTrack(track, localStream.current));
    // Create an offer and pass it to the callee over the signaling channel
    const offer = await pc.current.createOffer();
    socket.emit("calling", { id: userInfo, type: "offer", sdp: offer.sdp });
    await pc.current.setLocalDescription(offer);
  } catch (e) {
    console.log(e);
  }






Handling Offers, Answers, and Candidates:

Handle Offer:

When an offer arrives, the callee creates its own RTCPeerConnection, attaches its local tracks, sets the remote description, and replies with an answer:



  try {
      pc.current = new RTCPeerConnection(configuration);
      pc.current.onicecandidate = (e) => {
        const message = {
          type: "candidate",
          id: userInfo,
          candidate: e.candidate ? e.candidate.candidate : null,
          sdpMid: e.candidate ? e.candidate.sdpMid : undefined,
          sdpMLineIndex: e.candidate ? e.candidate.sdpMLineIndex : undefined,
        };
        socket.emit("calling", message);
      };
      pc.current.ontrack = (e) => (remoteVideo.current.srcObject = e.streams[0]);
      localStream.current.getTracks().forEach((track) => pc.current.addTrack(track, localStream.current));
      await pc.current.setRemoteDescription(offer);
      const answer = await pc.current.createAnswer();
      socket.emit("calling", { id: userInfo, type: "answer", sdp: answer.sdp });
      await pc.current.setLocalDescription(answer);
    } catch (e) {
      console.log(e);
    }




Handle Answer:

Back on the caller's side, the incoming answer is applied as the remote description:



async function handleAnswer(answer) {
  if (!pc.current) {
    console.error("no peerconnection");
    return;
  }
  try {
    await pc.current.setRemoteDescription(answer);
  } catch (e) {
    console.log(e);
  }
}




Handle Candidate:

Incoming ICE candidates are added to the peer connection; an empty candidate field marks the end of candidates:



 async function handleCandidate(candidate) {
    try {
      if (!pc.current) {
        console.error("no peerconnection");
        return;
      }
      // An empty candidate field signals the end of candidates
      await pc.current.addIceCandidate(candidate.candidate ? candidate : null);
    } catch (e) {
      console.log(e);
    }
  }




Hangup Call:



async function hangup() {
  if (pc.current) {
    pc.current.close();
    pc.current = null;
  }
  localStream.current.getTracks().forEach((track) => track.stop());
  localStream.current = null;
  startButton.current.disabled = false;
  hangupButton.current.disabled = true;
  muteAudButton.current.disabled = true;
  muteVideo.current.disabled = true;

}




React Component Implementation:





import { useRef, useEffect, useState } from "react";
import { FiVideo, FiVideoOff, FiMic, FiMicOff } from "react-icons/fi";
import { FaMicrophone, FaMicrophoneSlash, FaVideo, FaVideoSlash, FaPhone, FaTimes } from 'react-icons/fa';
import { io } from "socket.io-client";
import Swal from 'sweetalert2';

const socket = io("http://localhost:3000", { transports: ["websocket"] });

const configuration = {
  iceServers: [
    { urls: ["stun:stun1.l.google.com:19302", "stun:stun2.l.google.com:19302"] },
  ],
  iceCandidatePoolSize: 10,
};

function App() {

  const userInfo = 123456; // Assume the user's ID is provided elsewhere (e.g. from auth)

  const pc = useRef(null);
  const localStream = useRef(null);
  const startButton = useRef(null);
  const hangupButton = useRef(null);
  const muteAudButton = useRef(null);
  const localVideo = useRef(null);
  const remoteVideo = useRef(null);
  const muteVideo = useRef(null);

  socket.on("calling", (e) => {
    if (!localStream.current) {
      console.log("not ready yet");
      return;
    }
    switch (e.type) {
      case "offer":
        handleOffer(e);
        break;
      case "answer":
        handleAnswer(e);
        break;
      case "candidate":
        handleCandidate(e);
        break;
      case "ready":
        if (pc.current) {
          alert("already in call ignoring");
          return;
        }
        makeCall();
        break;
      case "bye":
        if (pc.current) {
          hangup();
        }
        break;
      default:
        console.log("unhandled", e);
        break;
    }
  });



  async function makeCall() {
    try {
      pc.current = new RTCPeerConnection(configuration);
      pc.current.onicecandidate = (e) => {
        const message = {
          type: "candidate",
          candidate: e.candidate ? e.candidate.candidate : null,
          sdpMid: e.candidate ? e.candidate.sdpMid : undefined,
          sdpMLineIndex: e.candidate ? e.candidate.sdpMLineIndex : undefined,
          id: userInfo,
        };
        socket.emit("calling", message);
      };
      pc.current.ontrack = (e) => (remoteVideo.current.srcObject = e.streams[0]);
      localStream.current.getTracks().forEach((track) => pc.current.addTrack(track, localStream.current));
      const offer = await pc.current.createOffer();
      socket.emit("calling", { id: userInfo, type: "offer", sdp: offer.sdp });
      await pc.current.setLocalDescription(offer);
    } catch (e) {
      console.log(e);
    }
  }

  async function handleOffer(offer) {
    if (pc.current) {
      console.error("existing peerconnection");
      return;
    }
    try {
      pc.current = new RTCPeerConnection(configuration);
      pc.current.onicecandidate = (e) => {
        const message = {
          type: "candidate",
          id: userInfo,
          candidate: e.candidate ? e.candidate.candidate : null,
          sdpMid: e.candidate ? e.candidate.sdpMid : undefined,
          sdpMLineIndex: e.candidate ? e.candidate.sdpMLineIndex : undefined,
        };
        socket.emit("calling", message);
      };
      pc.current.ontrack = (e) => (remoteVideo.current.srcObject = e.streams[0]);
      localStream.current.getTracks().forEach((track) => pc.current.addTrack(track, localStream.current));
      await pc.current.setRemoteDescription(offer);
      const answer = await pc.current.createAnswer();
      socket.emit("calling", { id: userInfo, type: "answer", sdp: answer.sdp });
      await pc.current.setLocalDescription(answer);
    } catch (e) {
      console.log(e);
    }
  }

  async function handleAnswer(answer) {
    if (!pc.current) {
      console.error("no peerconnection");
      return;
    }
    try {
      await pc.current.setRemoteDescription(answer);
    } catch (e) {
      console.log(e);
    }
  }

  async function handleCandidate(candidate) {
    try {
      if (!pc.current) {
        console.error("no peerconnection");
        return;
      }
      // An empty candidate field signals the end of candidates
      await pc.current.addIceCandidate(candidate.candidate ? candidate : null);
    } catch (e) {
      console.log(e);
    }
  }

  async function hangup() {
    if (pc.current) {
      pc.current.close();
      pc.current = null;
    }
    localStream.current.getTracks().forEach((track) => track.stop());
    localStream.current = null;
    startButton.current.disabled = false;
    hangupButton.current.disabled = true;
    muteAudButton.current.disabled = true;
    muteVideo.current.disabled = true;

    // Clear the video elements so stale frames are not left on screen
    if (localVideo.current) localVideo.current.srcObject = null;
    if (remoteVideo.current) remoteVideo.current.srcObject = null;
  }

  useEffect(() => {
    hangupButton.current.disabled = true;
    muteAudButton.current.disabled = true;
    muteVideo.current.disabled = true;
  }, []);

  const [audioState, setAudio] = useState(true);
  const [videoState, setVideoState] = useState(true);

  const startB = async () => {
    try {
      localStream.current = await navigator.mediaDevices.getUserMedia({
        video: true,
        audio: { echoCancellation: true },
      });
      localVideo.current.srcObject = localStream.current;
    } catch (err) {
      console.log(err);
      return; // Don't enable the call controls if camera/mic access failed
    }

    startButton.current.disabled = true;
    hangupButton.current.disabled = false;
    muteAudButton.current.disabled = false;
    muteVideo.current.disabled = false;

    socket.emit("calling", { id: userInfo, type: "ready" });
  };

  const hangB = async () => {
    Swal.fire({
      title: 'Are you sure you want to end the call?',
      showCancelButton: true,
      confirmButtonText: 'Yes',
      cancelButtonText: 'No',
    }).then((res) => {
      if (res.isConfirmed) {
        hangup();
        socket.emit("calling", { id: userInfo, type: "bye" });
      }
    });
  };

  function muteAudio() {
    if (localStream.current) {
      localStream.current.getAudioTracks().forEach(track => {
        track.enabled = !track.enabled; // Toggle mute/unmute
      });
      setAudio(!audioState); // Update state for UI toggle
    }
  }

  function pauseVideo() {
    if (localStream.current) {
      localStream.current.getVideoTracks().forEach(track => {
        track.enabled = !track.enabled; // Toggle video track
      });
      setVideoState(!videoState); // Update state for UI toggle
    }
  }

  return (
    <div className='bg-white w-screen h-screen fixed top-0 left-0 z-50 flex justify-center items-center'>
      <div className='flex flex-col md:flex-row space-y-4 md:space-y-0 md:space-x-4'>
        <div className='flex-1 p-4'>
          <div className='bg-gray-200 h-96 w-full md:w-96 rounded-lg shadow-md'>
            <video ref={localVideo} className='w-full h-full rounded-lg object-cover' autoPlay playsInline></video>
          </div>
        </div>
        <div className='flex-1 p-4'>
          <div className='bg-gray-200 h-96 w-full md:w-96 rounded-lg shadow-md'>
            <video ref={remoteVideo} className='w-full h-full rounded-lg object-cover' autoPlay playsInline></video>
          </div>
        </div>
      </div>
      <div className='absolute bottom-8 flex justify-center space-x-4'>
        <button className='p-2 rounded-full bg-gray-300 hover:bg-gray-400' ref={muteAudButton} onClick={muteAudio}>
          {audioState ? <FiMic /> : <FiMicOff />}
        </button>
        <button className='p-2 rounded-full bg-gray-300 hover:bg-gray-400' ref={startButton} onClick={startB}>
          <FaPhone className='text-gray-600' />
        </button>
        <button className='p-2 rounded-full bg-gray-300 hover:bg-gray-400' ref={muteVideo} onClick={pauseVideo}>
          {videoState ? <FaVideo className='text-gray-600' /> : <FaVideoSlash className='text-gray-600' />}
        </button>
        <button className='p-2 rounded-full bg-gray-300 hover:bg-gray-400' ref={hangupButton} onClick={hangB}>
          <FaTimes className='text-gray-600' />
        </button>
      </div>
    </div>
  );
}

export default App;








Feel free to adjust the tone and content based on your audience and specific requirements.

Top comments (6)

Matt Lewandowski

If you're going to post AI content at least remove the last part of this article 😂

"Feel free to adjust the tone and content based on your audience and specific requirements."

emilshiju

While I used AI to assist with some parts of the article, I want to clarify that the content was not entirely AI-generated. I actually wrote that particular statement myself. I included it to encourage readers to make changes according to their needs.

Matt Lewandowski

You absolutely did not 😂

Axorax

🔥👍

MTECH TV

This looks great, man 👍. But does the system support more than one peer of users, please?

Aditya Parmar

No, 2 at a time only.