How do you play a raw H.264 stream in HTML5?
At first this sounds far-fetched: why parse a raw stream in the front end instead of pushing it in real time in a format such as m3u8?
The reason is that some business scenarios have very strict latency requirements. If the raw stream is first sent to a server and then re-streamed from there, the delay is fairly large, typically around 5-10 s. In a security scenario, if a thief sneaks in and the monitoring feed only reacts 10 seconds later, the consequences are easy to imagine.
Since 2020, mainstream browsers have been disabling the Flash plugin, so the days of decoding raw streams with Flash are over.
After some research, there are two approaches:
- Transmit the H.264-encoded data over WebSocket, decode it in the front end with an open-source library such as Broadway, and draw the frames on an HTML5 canvas. In testing, Broadway's decoding efficiency is poor; a higher-resolution main stream stutters badly. (Tested on a MacBook.)
- Use MSE (Media Source Extensions) to stream live video through the HTML5 video tag. In testing, a single main stream plays without stutter, four sub-streams play without stutter, and CPU usage stays around 25%. (Tested on a MacBook; see the MSE sketch after this list.)
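To make approach 2 concrete: everything below ultimately ends up as fMP4 segments appended to a SourceBuffer. Here is a minimal sketch of the MSE side; the element id and the codec string are assumptions and must match your page and the profile/level signalled in the stream's SPS:

const video = document.getElementById('player'); // assumed <video id="player"> element
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
    // the codec string has to match the profile/level of the actual stream
    const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    // each fMP4 initialization/media segment produced by the muxer is appended here:
    // sourceBuffer.appendBuffer(fmp4Segment);
});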
Implementation details:
Approach 2 is used: parse the raw H.264 data and package it into fMP4 fragments.
The JMuxer plugin, which is built on code fragments from hls.js, is recommended; you can of course write your own packaging if you are interested.
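For reference, a minimal sketch of how JMuxer is usually wired up; the element id, fps and flushingTime values here are assumptions, so check the library's README for the exact options:

import JMuxer from 'jmuxer';

const jmuxer = new JMuxer({
    node: 'player',   // id of the <video> element on the page
    mode: 'video',    // video-only, no audio track
    flushingTime: 0,  // flush as soon as data arrives, for low latency
    fps: 25,          // assumed frame rate of the camera stream
    debug: false
});

// whenever a chunk of raw H.264 arrives (e.g. over WebSocket), feed it in:
// jmuxer.feed({ video: new Uint8Array(h264Chunk) });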
Notes on the source code:
Locating NAL units (scanning for start codes):
// split an Annex-B byte stream into NAL unit payloads (start codes removed)
static extractNALu(buffer) {
    let i = 0;
    const length = buffer.byteLength;
    let value;
    let state = 0;
    const result = [];
    let lastIndex;
    // debug.log('length='+length);
    while (i < length) {
        value = buffer[i++];
        // finding 3 or 4-byte start codes (00 00 01 OR 00 00 00 01)
        // debug.log('state='+state);
        switch (state) {
            case 0:
                if (value === 0) {
                    state = 1;
                }
                break;
            case 1:
                if (value === 0) {
                    state = 2;
                } else {
                    state = 0;
                }
                break;
            case 2:
            case 3:
                if (value === 0) {
                    state = 3;
                } else if (value === 1 && i < length) {
                    if (lastIndex) {
                        // close the previous NAL unit (it ends just before this start code)
                        result.push(buffer.subarray(lastIndex, i - state - 1));
                    }
                    lastIndex = i;
                    state = 0;
                } else {
                    state = 0;
                }
                break;
            default:
                break;
        }
    }
    if (lastIndex) {
        // the final NAL unit runs to the end of the buffer
        result.push(buffer.subarray(lastIndex, length));
    }
    return result;
}
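The NAL unit type sits in the low five bits of the first payload byte; the types used below are 7 (SPS), 8 (PPS), 5 (IDR/key slice) and 1 (non-IDR slice). A quick way to inspect what extractNALu returns, assuming `buffer` is a Uint8Array of Annex-B data:

const units = H264Parser.extractNALu(buffer); // start codes (00 00 01 / 00 00 00 01) are stripped
units.forEach((nal) => {
    const type = nal[0] & 0x1f; // 7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice
    console.log('NAL type', type, 'length', nal.byteLength);
});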
Once a video chunk has been parsed, find the type 7, 8 and 5 NAL units (SPS, PPS and the IDR keyframe) and use them to package the first initialization fragment; this reduces the green-screen and smearing artifacts caused by feeding useless frames before a keyframe:
nalus = H264Parser.extractNALu(data.video);
const nalarr = [];
const nal = nalus.shift();
const nalType = nal[0] & 0x1f; // NAL unit type: low 5 bits of the header byte
if (this.spspps === false) {
    if (nalType === 7) {
        this.spsnal = nal;
        debug.log('found SPS');
        return;
    }
    if (nalType === 8) {
        this.ppsnal = nal;
        debug.log('found PPS');
        return;
    }
    if (nalType === 5) {
        this.ipsnal = nal;
        debug.log('found IDR frame (type 5)');
        return;
    }
    if (this.spsnal != null && this.ppsnal != null) {
        nalarr.push(this.spsnal);
        nalarr.push(this.ppsnal);
        nalarr.push(this.ipsnal);
        this.spspps = true;
        debug.log('packaging the initialization fragment from NAL types 7, 8, 5');
    }
} else if (nalType !== 7 && nalType !== 8) {
    nalarr.push(nal);
}
if (nalarr.length > 0) {
    chunks.video = this.getVideoFrames(nalarr, duration); // nalarr nalus
    remux = true;
}
Parsing information out of the binary video stream: a DataView lets you read from and write to a binary ArrayBuffer.
export class ByteArray {
    constructor(ab) {
        this.buffer = ab;
        this.dataView = new DataView(ab);
        // current read/write offset into the buffer
        this.offset = 0;
    }
    GetArrayBuffer() {
        return this.buffer;
    }
    get bytesAvailable() {
        let diff = this.buffer.byteLength - this.offset;
        if (diff < 0) {
            diff = 0;
        }
        return diff;
    }
    WriteString(str) {
        for (let i = 0; i < str.length; i++) {
            this.dataView.setUint8(this.offset + i, str.charCodeAt(i)); // one byte per character
        }
        this.offset += str.length;
    }
    WriteUint32(value) {
        this.dataView.setUint32(this.offset, value); // big-endian byte order
        this.offset += 4;
    }
    ReadUint32(little = false) {
        const result = this.dataView.getUint32(this.offset, little);
        this.offset += 4;
        return result;
    }
    ReadUint16(little = false) {
        const result = this.dataView.getUint16(this.offset, little);
        this.offset += 2;
        return result;
    }
    ReadUint8() {
        const result = this.dataView.getUint8(this.offset);
        this.offset += 1;
        return result;
    }
    // copy `len` bytes into a new ArrayBuffer, starting `num` bytes before the current offset
    SliceNewAB(len, num = 0) {
        const ab = this.buffer.slice((this.offset - num), (this.offset - num) + len);
        this.offset += len - num;
        return ab;
    }
    // advances (rather than sets) the offset by `num` bytes
    SetOffset(num) {
        this.offset += num;
    }
    ReadStringBytes(len) {
        let str = '';
        for (let i = 0; i < len; i++) {
            str += String.fromCharCode(this.dataView.getUint8(this.offset + i));
        }
        this.offset += len;
        return str;
    }
}
Parsing a video chunk (the logic differs depending on how the packet format is defined):
const socketBA = new ByteArray(abdata);
while (socketBA.bytesAvailable > 0) {
    // read past header fields (specific to this packet format)
    socketBA.ReadUint32();
    socketBA.ReadUint32();
    socketBA.ReadUint32();
    // once the video payload is located, hand it to the plugin to be remuxed into fMP4
    const h264buf = socketBA.SliceNewAB(biSize); // biSize is the byte length of the video payload
    this.feed({
        video: new Uint8Array(h264buf)
    });
}
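For completeness, a sketch of how the binary packets would typically arrive over WebSocket before being handed to the parsing loop above; the URL and the parse() wrapper are assumptions for illustration:

const ws = new WebSocket('wss://example.com/live'); // assumed streaming endpoint
ws.binaryType = 'arraybuffer'; // receive binary frames as ArrayBuffer rather than Blob

ws.onmessage = (event) => {
    // event.data is the ArrayBuffer (abdata) consumed by the ByteArray loop above
    parse(event.data);
};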
In testing, the end-to-end latency is around 100 ms.
Feel free to leave a comment if you have any questions.