Streaming from the browser with Nuxt using sockets and ffmpeg
Streaming from the browser directly to an RTMP server, the industry standard for publishing live streams, is simply not possible: browsers don't speak that protocol. You are going to need a server in between.
Why from the browser?
Good question. You can always use specialized software such as OBS Studio, but that introduces an extra step. If you are building a streaming platform, you probably want to reduce friction and let users go live in a few clicks without leaving your app. Once users have that quick option in the browser, they can switch to a desktop app later if they want better performance and quality.
On top of that, building your own solution lets you cut the noise that big streaming apps create: configs and settings the user will never touch.
What are we going to use?
- getUserMedia() for asking the browser for permission to access the webcam and the microphone.
- MediaRecorder for recording chunks of video and firing a callback when each chunk is ready. In our case we'll send each chunk to the server via socket.
- Socket.io for emitting the chunks on the client and receiving them on the server.
- ffmpeg running on the server, connected to the RTMP server, processing every video chunk that arrives over the socket.
The client
Accessing the media
The first thing we are going to do is request access to the camera and the microphone using getUserMedia. If the user accepts the dialog, the promise resolves and we can save the stream in our component's data.
Then this stream becomes the source of a video element. As soon as the metadata is loaded, we play the video. The video is muted on purpose to avoid audio feedback.
<template>
  <div>
    <video ref="video" width="100%" muted />
  </div>
</template>

<script>
export default {
  data() {
    return {
      video: null,
      cameraStream: null,
    }
  },
  async mounted() {
    this.cameraStream = await navigator.mediaDevices.getUserMedia({
      audio: true,
      video: true,
    })
    this.video = this.$refs.video
    this.video.srcObject = this.cameraStream
    this.video.onloadedmetadata = () => {
      this.video.play()
    }
  },
}
</script>
Capturing chunks of video
Now that we have our video stream, we are going to record it. For that, we create a MediaRecorder with the stream as the first param and an options object as the second. MediaRecorder.start() accepts an optional timeslice param: the length in milliseconds of each recorded chunk.
<script>
export default {
  data() {
    return {
      // ...
      mediaRecorder: null,
    }
  },
  async mounted() {
    // ...
    this.mediaRecorder = new MediaRecorder(this.cameraStream, {
      mimeType: 'video/webm',
      videoBitsPerSecond: 3000000,
    })
    this.mediaRecorder.start(1000)
  },
  // ...
}
</script>
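Not every browser supports the same container and codec combination, so it's safer to probe MediaRecorder.isTypeSupported before constructing the recorder. A minimal sketch, where pickSupportedMimeType and the candidate list are my own illustrative names (the predicate is injectable so the helper can be exercised outside a browser):

```javascript
// Returns the first MIME type the current browser's MediaRecorder supports,
// or null if none of the candidates work.
function pickSupportedMimeType(
  candidates,
  isSupported = type => MediaRecorder.isTypeSupported(type)
) {
  return candidates.find(type => isSupported(type)) || null
}

// Example preference order: VP9, then VP8, then plain WebM.
const preferred = [
  'video/webm;codecs=vp9',
  'video/webm;codecs=vp8',
  'video/webm',
]
// In the browser: new MediaRecorder(stream, { mimeType: pickSupportedMimeType(preferred) })
```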
Emitting the events
First of all we need to add socket.io to our project, plus socket.io-client for the browser side
yarn add socket.io socket.io-client
And create a plugin for it
import io from 'socket.io-client'
const socket = io(process.env.WS_URL)
export default socket
Remember to add the WebSocket URL to your .env
WS_URL=http://localhost:3000
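Depending on your setup, you may also need to register the plugin and expose WS_URL to the client build through nuxt.config.js. A sketch, assuming a Nuxt 2 project:

```javascript
export default {
  // ...
  plugins: ['~/plugins/socket.io.js'],
  // Makes process.env.WS_URL available in client-side code
  env: {
    WS_URL: process.env.WS_URL || 'http://localhost:3000',
  },
}
```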
If only there were a way to get notified when a chunk of video is ready... Wait a minute, MediaRecorder has an ondataavailable handler we can assign!
Let's use this hook to push an event to the server.
<script>
import socket from '~/plugins/socket.io.js'

export default {
  // ...
  async mounted() {
    // ...
    this.mediaRecorder.ondataavailable = e => {
      socket.emit('stream-video-chunk', e.data)
    }
  },
  // ...
}
</script>
The Server
For our demo we are going to use the socket example in the Nuxt repo as a boilerplate.
Create a module called io/index.js
and import it in nuxt.config.js
export default {
  // ...
  modules: ['~/io'],
  // ...
}
For processing the video we are going to use ffmpeg. If you already have it on your system, that's enough. But for convenience and compatibility reasons we are going to use an npm package that ships the binaries.
yarn add @ffmpeg-installer/ffmpeg
Then, when the socket connects, we are going to spawn an ffmpeg child process in Node. The command has the following structure:
- path to the binary, available in the path property of the installer package.
- input: -i pipe:0 (we are going to push every chunk that arrives on the socket into this pipe).
- settings: additional ffmpeg settings for encoding the video and so on.
- output: the RTMP server.
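The four pieces above can be assembled into a single command string. Here is a small helper to make the structure explicit (buildFfmpegCommand is an illustrative name, not part of any package):

```javascript
// Builds the shell command: binary, input pipe, extra flags, RTMP output URL.
function buildFfmpegCommand(ffmpegPath, settings, streamUrl) {
  return [ffmpegPath, '-i pipe:0', settings, `"${streamUrl}"`]
    .filter(Boolean) // skip `settings` when there are none
    .join(' ')
}

// buildFfmpegCommand('/usr/bin/ffmpeg', '-f flv', 'rtmp://host/channel')
// → '/usr/bin/ffmpeg -i pipe:0 -f flv "rtmp://host/channel"'
```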
// ...
const spawn = require('child_process').spawn
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path

export default function() {
  this.nuxt.hook('render:before', () => {
    // ...
    io.on('connection', socket => {
      const ffmpegSettings = 'additional settings you may pass to ffmpeg'
      const streamUrl = 'rtmp://your.server:port/channel'
      const command = `${ffmpegPath} -i pipe:0 ${ffmpegSettings} "${streamUrl}"`
      const ffmpeg = spawn(command, { shell: true })
    })
  })
}
And the final step 🎉!
Listen to the event the client is emitting and push each video chunk into ffmpeg's stdin.
// ...
export default function() {
  this.nuxt.hook('render:before', () => {
    // ...
    io.on('connection', socket => {
      // ...
      const ffmpeg = spawn(command, { shell: true })

      socket.on('stream-video-chunk', function(chunk) {
        ffmpeg.stdin.write(chunk)
      })
    })
  })
}
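One caveat: the code above never closes ffmpeg's stdin, so the child process lingers after the viewer leaves. A hedged sketch of the extra wiring, where wireStream is my own name and socket/ffmpeg are the objects created above:

```javascript
// Wires a connected socket to an ffmpeg child process: forwards chunks to
// stdin, and closes the pipe on disconnect so the process can exit cleanly.
function wireStream(socket, ffmpeg) {
  socket.on('stream-video-chunk', chunk => {
    // Skip writes once the pipe has closed (e.g. ffmpeg crashed)
    if (ffmpeg.stdin.writable) ffmpeg.stdin.write(chunk)
  })
  socket.on('disconnect', () => {
    ffmpeg.stdin.end()
  })
}
```

Inside the connection handler, this replaces the bare socket.on('stream-video-chunk', ...) call with wireStream(socket, ffmpeg).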
And that's all! You now have a broadcasting platform in your Nuxt app.
If this post helped you or your company build that feature the client is requesting, consider buying me a coffee on GitHub Sponsors. Thank you 🥰