Introduction
Integrate Real-Time Messaging Protocol (RTMP) livestream functionality into your Flutter video call application for a dynamic and engaging user experience. With RTMP support, users can broadcast their video feeds in real time, fostering interactive communication and creating vibrant, immersive virtual gatherings. Implementing RTMP in your Flutter app ensures smooth and efficient live streaming, enhancing the overall quality of video calls.
Benefits of RTMP Livestream:
- Improved Communication: RTMP enhances communication by allowing users to share live video content, fostering collaboration and information sharing.
- High-Quality Streaming: RTMP ensures high-quality video streaming with minimal latency, providing users with a seamless viewing experience.
- Scalability: With RTMP support, your app can accommodate a large number of simultaneous viewers, making it suitable for hosting virtual events of varying sizes.
- Enhanced User Engagement: RTMP livestreaming facilitates real-time interaction, keeping users engaged and connected during video calls.
Use Cases of RTMP Livestream:
- Virtual Events: Host virtual conferences, concerts, or seminars where participants can stream live video content to engage with the audience in real time.
- Educational Platforms: Enable teachers to conduct live lectures and interactive sessions, allowing students to stream video content and ask questions in real time.
- Remote Work: Facilitate remote meetings and conferences where participants can share live video presentations and collaborate effectively.
In this tutorial, VideoSDK provides clear instructions and code examples to help you seamlessly integrate RTMP live streaming into your Flutter video-calling app.
Getting Started with VideoSDK
To integrate RTMP, you will use the capabilities that VideoSDK offers. Before diving into the implementation steps, let's ensure you complete the necessary prerequisites.
Create a VideoSDK Account
Go to your VideoSDK dashboard and sign up if you don't have an account. This account gives you access to the required Video SDK token, which acts as an authentication key that allows your application to interact with VideoSDK functionality.
Generate your Auth Token
Visit your VideoSDK dashboard and navigate to the "API Key" section to generate your auth token. This token is crucial in authorizing your application to use VideoSDK features.
For a more visual understanding of the account creation and token generation process, consider referring to the provided tutorial.
Prerequisites
Before proceeding, ensure that your development environment meets the following requirements:
- Video SDK Developer Account (if you do not have one, follow the VideoSDK Dashboard to create one).
- A basic understanding of Flutter.
- Flutter VideoSDK
- Have Flutter installed on your device.
Install VideoSDK
Install the VideoSDK package using the Flutter command below. Make sure you are in your Flutter app directory before you run this command.
$ flutter pub add videosdk
# Run this command to add the http library, used to make the network call that generates a roomId
$ flutter pub add http
VideoSDK Compatibility
VideoSDK works with Android and iOS apps, the Web, desktop apps, and the Safari browser.
Structure of the project
Your project structure should look like this.
root
├── android
├── ios
└── lib
    ├── api_call.dart
    ├── join_screen.dart
    ├── main.dart
    ├── meeting_controls.dart
    ├── meeting_screen.dart
    └── participant_tile.dart
We are going to create Flutter widgets (JoinScreen, MeetingScreen, MeetingControls, and ParticipantTile).
App Structure
The app widget will contain the JoinScreen and MeetingScreen widgets. MeetingScreen will have the MeetingControls and ParticipantTile widgets.
Configure Project
For Android
- Update /android/app/src/main/AndroidManifest.xml with the permissions we will be using to implement the audio and video features.
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
<uses-permission android:name="android.permission.WAKE_LOCK" />
AndroidManifest.xml
- Also, you will need to set your build settings to Java 8, because the official WebRTC jar now uses static methods in the EglBase interface. Just add this to your app-level /android/app/build.gradle:
android {
//...
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
}
- If necessary, in the same build.gradle, increase the minSdkVersion of defaultConfig up to 23 (the default Flutter generator currently sets it to 16).
- If necessary, in the same build.gradle, increase compileSdkVersion and targetSdkVersion up to 33 (the default Flutter generator currently sets them to 30). The sketch below shows roughly where these values go.
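If your project does need those changes, the relevant parts of the same android block might look roughly like this (a sketch only; keep the rest of your generated configuration as-is and adjust the values to your project):
android {
    //...
    compileSdkVersion 33 // per the note above (default is 30)
    defaultConfig {
        //...
        minSdkVersion 23 // per the note above (Flutter's default is 16)
        targetSdkVersion 33
    }
}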
For iOS
- Add the following entries to your /ios/Runner/Info.plist file, which allow your app to access the camera and microphone:
- Uncomment the following line in /ios/Podfile to define a global platform for your project:
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) Camera Usage!</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) Microphone Usage!</string>
# platform :ios, '12.0'
For macOS
- Add the following entries to your /macos/Runner/Info.plist file to allow your app to access the camera and microphone:
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) Camera Usage!</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) Microphone Usage!</string>
- Add the following entries to your /macos/Runner/DebugProfile.entitlements file to allow your app to access the camera and microphone and open outgoing network connections:
<key>com.apple.security.network.client</key>
<true/>
<key>com.apple.security.device.camera</key>
<true/>
<key>com.apple.security.device.microphone</key>
<true/>
- Add the following entries to your /macos/Runner/Release.entitlements file to allow your app to access the camera and microphone and open outgoing network connections:
<key>com.apple.security.network.server</key>
<true/>
<key>com.apple.security.network.client</key>
<true/>
<key>com.apple.security.device.camera</key>
<true/>
<key>com.apple.security.device.microphone</key>
<true/>
Essential Steps to Implement Video Calling Functionality
Before diving into the specifics of the RTMP livestream implementation, it's crucial to ensure you have VideoSDK properly installed and configured within your Flutter project. Refer to VideoSDK's documentation for detailed installation instructions. Once you have a functional video calling setup, you can proceed with adding the livestream feature.
Step 1: Get started with api_call.dart
Before jumping into anything else, you will write a function to generate a unique meetingId. You will require an authentication token; you can generate it either by using videosdk-rtc-api-server-examples or from the VideoSDK Dashboard for development.
import 'dart:convert';
import 'package:http/http.dart' as http;
//Auth token we will use to generate a meeting and connect to it
String token = "<Generated-from-dashboard>";
// API call to create meeting
Future<String> createMeeting() async {
final http.Response httpResponse = await http.post(
Uri.parse("https://api.videosdk.live/v2/rooms"),
headers: {'Authorization': token},
);
//Destructuring the roomId from the response
return json.decode(httpResponse.body)['roomId'];
}
api_call.dart
Step 2: Creating the JoinScreen
Let's create a join_screen.dart file in the lib directory and create the JoinScreen StatelessWidget.
The JoinScreen will consist of:
- Create Meeting Button: This button will create a new meeting for you.
- Meeting ID TextField: This text field will contain the meeting ID you want to join.
- Join Meeting Button: This button will join the meeting whose ID you have provided.
- Update the home screen of the app in main.dart.
import 'package:flutter/material.dart';
import 'api_call.dart';
import 'meeting_screen.dart';
class JoinScreen extends StatelessWidget {
final _meetingIdController = TextEditingController();
JoinScreen({super.key});
void onCreateButtonPressed(BuildContext context) async {
// call api to create meeting and then navigate to MeetingScreen with meetingId,token
await createMeeting().then((meetingId) {
if (!context.mounted) return;
Navigator.of(context).push(
MaterialPageRoute(
builder: (context) => MeetingScreen(
meetingId: meetingId,
token: token,
),
),
);
});
}
void onJoinButtonPressed(BuildContext context) {
String meetingId = _meetingIdController.text;
var re = RegExp("\\w{4}\\-\\w{4}\\-\\w{4}");
// check that the meeting ID is not empty and matches the expected format
// if the meeting ID is valid, navigate to MeetingScreen with meetingId and token
if (meetingId.isNotEmpty && re.hasMatch(meetingId)) {
_meetingIdController.clear();
Navigator.of(context).push(
MaterialPageRoute(
builder: (context) => MeetingScreen(
meetingId: meetingId,
token: token,
),
),
);
} else {
ScaffoldMessenger.of(context).showSnackBar(const SnackBar(
content: Text("Please enter valid meeting id"),
));
}
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: const Text('VideoSDK QuickStart'),
),
body: Padding(
padding: const EdgeInsets.all(12.0),
child: Column(
mainAxisAlignment: MainAxisAlignment.center,
children: [
ElevatedButton(
onPressed: () => onCreateButtonPressed(context),
child: const Text('Create Meeting'),
),
Container(
margin: const EdgeInsets.fromLTRB(0, 8.0, 0, 8.0),
child: TextField(
decoration: const InputDecoration(
hintText: 'Meeting Id',
border: OutlineInputBorder(),
),
controller: _meetingIdController,
),
),
ElevatedButton(
onPressed: () => onJoinButtonPressed(context),
child: const Text('Join Meeting'),
),
],
),
),
);
}
}
join_screen.dart
import 'package:flutter/material.dart';
import 'join_screen.dart';
void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({super.key});
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'VideoSDK QuickStart',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: JoinScreen(),
);
}
}
main.dart
Step 3: Creating the MeetingControls
Let's create a meeting_controls.dart file and create the MeetingControls StatelessWidget.
The MeetingControls will consist of:
- Leave Button: This button will leave the meeting.
- Toggle Mic Button: This button will unmute or mute the mic.
- Toggle Camera Button: This button will enable or disable the camera.
MeetingControls will accept 3 functions in the constructor:
- onLeaveButtonPressed: invoked when the Leave button is pressed.
- onToggleMicButtonPressed: invoked when the toggle mic button is pressed.
- onToggleCameraButtonPressed: invoked when the toggle Camera button is pressed.
import 'package:flutter/material.dart';
class MeetingControls extends StatelessWidget {
final void Function() onToggleMicButtonPressed;
final void Function() onToggleCameraButtonPressed;
final void Function() onLeaveButtonPressed;
const MeetingControls(
{super.key,
required this.onToggleMicButtonPressed,
required this.onToggleCameraButtonPressed,
required this.onLeaveButtonPressed});
@override
Widget build(BuildContext context) {
return Row(
mainAxisAlignment: MainAxisAlignment.spaceEvenly,
children: [
ElevatedButton(
onPressed: onLeaveButtonPressed, child: const Text('Leave')),
ElevatedButton(
onPressed: onToggleMicButtonPressed, child: const Text('Toggle Mic')),
ElevatedButton(
onPressed: onToggleCameraButtonPressed,
child: const Text('Toggle WebCam')),
],
);
}
}
meeting_controls.dart
Step 4: Creating the ParticipantTile
Let's create a participant_tile.dart file and create the ParticipantTile StatefulWidget.
The ParticipantTile will consist of:
- RTCVideoView: This will show the participant's video stream.
ParticipantTile will accept a Participant in the constructor:
- participant: the participant of the meeting.
import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';
class ParticipantTile extends StatefulWidget {
final Participant participant;
const ParticipantTile({super.key, required this.participant});
@override
State<ParticipantTile> createState() => _ParticipantTileState();
}
class _ParticipantTileState extends State<ParticipantTile> {
Stream? videoStream;
@override
void initState() {
// initial video stream for the participant
widget.participant.streams.forEach((key, Stream stream) {
setState(() {
if (stream.kind == 'video') {
videoStream = stream;
}
});
});
_initStreamListeners();
super.initState();
}
_initStreamListeners() {
widget.participant.on(Events.streamEnabled, (Stream stream) {
if (stream.kind == 'video') {
setState(() => videoStream = stream);
}
});
widget.participant.on(Events.streamDisabled, (Stream stream) {
if (stream.kind == 'video') {
setState(() => videoStream = null);
}
});
}
@override
Widget build(BuildContext context) {
return Padding(
padding: const EdgeInsets.all(8.0),
child: videoStream != null
? RTCVideoView(
videoStream?.renderer as RTCVideoRenderer,
objectFit: RTCVideoViewObjectFit.RTCVideoViewObjectFitCover,
)
: Container(
color: Colors.grey.shade800,
child: const Center(
child: Icon(
Icons.person,
size: 100,
),
),
),
);
}
}
participant_tile.dart
Step 5: Creating the MeetingScreen
Let's create a meeting_screen.dart file and create the MeetingScreen StatefulWidget.
MeetingScreen will accept meetingId and token in the constructor.
- meetingId: the meeting ID you want to join.
- token: VideoSDK Auth token.
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';
import './participant_tile.dart';
import './meeting_controls.dart';
class MeetingScreen extends StatefulWidget {
final String meetingId;
final String token;
const MeetingScreen(
{super.key, required this.meetingId, required this.token});
@override
State<MeetingScreen> createState() => _MeetingScreenState();
}
class _MeetingScreenState extends State<MeetingScreen> {
late Room _room;
var micEnabled = true;
var camEnabled = true;
Map<String, Participant> participants = {};
@override
void initState() {
// create room
_room = VideoSDK.createRoom(
roomId: widget.meetingId,
token: widget.token,
displayName: "John Doe",
micEnabled: micEnabled,
camEnabled: camEnabled
);
setMeetingEventListener();
// Join room
_room.join();
super.initState();
}
// listening to meeting events
void setMeetingEventListener() {
_room.on(Events.roomJoined, () {
setState(() {
participants.putIfAbsent(
_room.localParticipant.id, () => _room.localParticipant);
});
});
_room.on(
Events.participantJoined,
(Participant participant) {
setState(
() => participants.putIfAbsent(participant.id, () => participant),
);
},
);
_room.on(Events.participantLeft, (String participantId) {
if (participants.containsKey(participantId)) {
setState(
() => participants.remove(participantId),
);
}
});
_room.on(Events.roomLeft, () {
participants.clear();
Navigator.popUntil(context, ModalRoute.withName('/'));
});
}
// on back button pressed, leave the room
Future<bool> _onWillPop() async {
_room.leave();
return true;
}
// Build the meeting screen UI.
@override
Widget build(BuildContext context) {
return WillPopScope(
onWillPop: () => _onWillPop(),
child: Scaffold(
appBar: AppBar(
title: const Text('VideoSDK QuickStart'),
),
body: Padding(
padding: const EdgeInsets.all(8.0),
child: Column(
children: [
Text(widget.meetingId),
//render all participant
Expanded(
child: Padding(
padding: const EdgeInsets.all(8.0),
child: GridView.builder(
gridDelegate: const SliverGridDelegateWithFixedCrossAxisCount(
crossAxisCount: 2,
crossAxisSpacing: 10,
mainAxisSpacing: 10,
mainAxisExtent: 300,
),
itemBuilder: (context, index) {
return ParticipantTile(
key: Key(participants.values.elementAt(index).id),
participant: participants.values.elementAt(index));
},
itemCount: participants.length,
),
),
),
MeetingControls(
onToggleMicButtonPressed: () {
micEnabled ? _room.muteMic() : _room.unmuteMic();
micEnabled = !micEnabled;
},
onToggleCameraButtonPressed: () {
camEnabled ? _room.disableCam() : _room.enableCam();
camEnabled = !camEnabled;
},
            onLeaveButtonPressed: () {
              _room.leave();
            },
),
],
),
),
      ),
    );
}
}
meeting_screen.dart
CAUTION
If you get a webrtc/webrtc.h file not found error at runtime on iOS, check the solution here.
TIP
You can check out the complete quick start example here.
Integrate RTMP Livestream Feature
RTMP is a popular protocol for live streaming video content from a VideoSDK meeting to platforms such as YouTube, Twitch, Facebook, and others.
VideoSDK allows you to live stream your meeting to any platform that supports RTMP ingestion. By providing the platform-specific stream key and stream URL, VideoSDK can connect to that platform's RTMP server and transmit the live video stream.
VideoSDK also allows you to configure the livestream layout in numerous ways: you can set one of the prebuilt layouts in the configuration, or provide your own custom template to live stream according to your layout choice.
After installing VideoSDK, you can unlock the power of live streaming by following these steps:
Start Livestream
startLivestream() can be used to start an RTMP live stream of the meeting and can be accessed from the Room object. This method accepts two parameters:
1. outputs: This parameter accepts an array of objects that contain the RTMP url and streamKey of the platforms on which you want to start the live stream.
2. config (optional): This parameter defines what the live stream layout should look like.
var outputs = [
  {
    "url": "rtmp://a.rtmp.youtube.com/live2",
    "streamKey": "<STREAM_KEY>",
  },
  {
    "url": "rtmps://",
    "streamKey": "<STREAM_KEY>",
  },
];
Map<String, dynamic> config = {
  // Layout Configuration
  'layout': {
    'type': "GRID", // "SPOTLIGHT" | "SIDEBAR", Default: "GRID"
    'priority': "SPEAKER", // "PIN", Default: "SPEAKER"
    'gridSize': 4, // MAX: 4
  },
  // Theme of livestream
  'theme': "DARK", // "LIGHT" | "DEFAULT"
};
room.startLivestream(outputs, config: config);
Stop Livestream
stopLivestream() is used to stop the meeting live stream and can be accessed from the Room object.
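For example, a small helper inside your meeting screen state could stop the stream only when one is actually running (a minimal sketch; _room is assumed to be the joined Room instance, and liveStreamState a status string tracked from the livestream state event shown below):
// Stop the RTMP livestream only if one is currently running.
void stopLivestreamIfRunning() {
  if (liveStreamState == "LIVESTREAM_STARTED") {
    _room.stopLivestream();
  }
}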
Event associated with Livestream
livestreamStateChanged: Whenever the livestream state changes, the livestreamStateChanged event will trigger.
import 'dart:developer';
import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';
class MeetingScreen extends StatefulWidget {
...
}
class _MeetingScreenState extends State<MeetingScreen> {
late Room room;
@override
void initState() {
...
setupRoomEventListener();
}
@override
Widget build(BuildContext context) {
return YourMeetingWidget();
}
void setupRoomEventListener() {
//...
room.on(Events.livestreamStateChanged, (String status) {
//Status can be :: LIVESTREAM_STARTING
//Status can be :: LIVESTREAM_STARTED
//Status can be :: LIVESTREAM_STOPPING
//Status can be :: LIVESTREAM_STOPPED
log("Meeting Livestream status : $status");
});
}
}
Example
- Let's create a LiveStreamControls widget which will help toggle the RTMP livestream.
import 'package:flutter/material.dart';
class LiveStreamControls extends StatelessWidget {
final String liveStreamState;
final void Function() onToggleLiveStreamButtonPressed;
const LiveStreamControls({
super.key,
required this.liveStreamState,
required this.onToggleLiveStreamButtonPressed,
});
@override
Widget build(BuildContext context) {
return Wrap(
children: [
ElevatedButton(
onPressed: onToggleLiveStreamButtonPressed,
child: Text(liveStreamState == "LIVESTREAM_STOPPED"
? 'Start LiveStream'
: liveStreamState == "LIVESTREAM_STARTING"
? "Starting LiveStream"
: liveStreamState == "LIVESTREAM_STARTED"
? "Stop LiveStream"
: liveStreamState == "LIVESTREAM_STOPPING"
? "Stopping LiveStream"
: "Start LiveStream")),
],
);
}
}
- Now, we will add the LiveStreamControls widget in MeetingScreen after the MeetingControls widget.
- Also, we will listen to the liveStreamStateChanged event, which fires when the meeting's live stream status changes.
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:videosdk/videosdk.dart';
import './participant_tile.dart';
import './livestream_controls.dart';
class MeetingScreen extends StatefulWidget {
//...
}
class _MeetingScreenState extends State<MeetingScreen> {
late Room _room;
var micEnabled = true;
var camEnabled = true;
String liveStreamState = "LIVESTREAM_STOPPED";
Map<String, Participant> participants = {};
@override
void initState() {
// create room
//...
}
// listening to meeting events
void setMeetingEventListener() {
// ...
    _room.on(
      Events.liveStreamStateChanged,
      (String status) {
        setState(
          () => liveStreamState = status,
        );
      },
    );
}
// on back button pressed, leave the room
Future<bool> _onWillPop() async {
_room.leave();
return true;
}
// Build the meeting screen UI.
@override
Widget build(BuildContext context) {
return WillPopScope(
onWillPop: () => _onWillPop(),
child: Scaffold(
appBar: AppBar(
title: const Text('VideoSDK QuickStart'),
),
body: Padding(
padding: const EdgeInsets.all(8.0),
child: Column(
children: [
// ...,
LiveStreamControls(
onToggleLiveStreamButtonPressed: () {
if (liveStreamState == "LIVESTREAM_STOPPED") {
var outputs = [
{
"url": "<URL>",
"streamKey": "<Stream-Key>",
}
];
var liveStreamConfig = {
'layout': {
'type': 'GRID',
'priority': 'SPEAKER',
'gridSize': 4,
},
'theme': "LIGHT",
};
                  _room.startLivestream(outputs, config: liveStreamConfig);
                } else if (liveStreamState == "LIVESTREAM_STARTED") {
                  _room.stopLivestream();
}
},
liveStreamState: liveStreamState,
),
],
),
),
      ),
    );
}
}
Custom Template
With VideoSDK, you can also use your own custom-designed layout template to live stream the meetings. To use a custom template, you first need to create one, for which you can follow this guide. Once you have set up the template, you can use the REST API to start the live stream with the templateURL parameter.
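As a rough illustration, such a call could be made with the same http package used in api_call.dart. The endpoint path and body fields below are assumptions for illustration only; confirm the exact endpoint, parameter names, and payload against the VideoSDK REST API reference before using them.
import 'dart:convert';
import 'package:http/http.dart' as http;
import 'api_call.dart'; // reuse the auth token defined earlier

// Hypothetical sketch: start a livestream with a custom layout template via the REST API.
// The endpoint and field names are assumptions -- verify them in the VideoSDK REST API docs.
Future<void> startLivestreamWithTemplate(String roomId, String templateUrl) async {
  final http.Response response = await http.post(
    Uri.parse("https://api.videosdk.live/v2/livestreams/start"), // assumed endpoint
    headers: {
      'Authorization': token,
      'Content-Type': 'application/json',
    },
    body: json.encode({
      'roomId': roomId,
      'templateUrl': templateUrl, // URL of your hosted custom layout template
      'outputs': [
        {'url': "<RTMP_URL>", 'streamKey': "<STREAM_KEY>"},
      ],
    }),
  );
  print(response.body); // inspect the response while testing
}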
Conclusion
Integrating RTMP livestream functionality into your Flutter video call app offers numerous benefits and opens up a wide range of use cases. With seamless RTMP support, users can engage in high-quality, real-time video streaming, enhancing communication and collaboration within your application. Embrace the versatility and scalability of RTMP in your Flutter app to elevate user engagement, foster meaningful connections, and provide a platform for diverse multimedia experiences.
Unlock the full potential of VideoSDK and craft seamless video experiences effortlessly. Sign up with VideoSDK today and receive 10,000 free minutes to propel your video app to the next level.