Section 3: Camera Live Streaming System Design (Stream)


Section 1: Understanding Camera Streaming Media
Section 2: Plugin-Free RTSP Playback in the Browser


Camera Live Streaming System Design

Camera live streaming

  • Step 1: The user operates the camera from the front-end page.
  • Step 2: The command is relayed to the Core service over a TCP-based protocol such as HTTP, MQTT, or AMQP. If HTTP is used, the address the Core service exposes must be reachable from the public network; if MQTT or AMQP is used, Core only needs to be able to reach the public MQTT/AMQP broker address.
  • Step 3: The Core service is the bridge between the camera and the Stream service, so it must have network connectivity to the camera (cameras on a construction site, at home, or in an office usually sit on a LAN, so Core must also run on a host in the same LAN as the camera). Core receives commands and runs FFmpeg to pull and push the stream, take and upload snapshots, and upload recordings.
  • Step 4: The media data (photos and recordings) is transmitted over HTTP.
  • Step 5: The Stream service handles the requests sent in step 4 and pushes the media data to the front end over WebSocket, where jsmpeg.js plays it.

The Stream Media Relay Service

The Stream service needs a controller dedicated to receiving the media-carrying HTTP requests; it then delegates the byte data those requests carry to the WsHandler class. WsHandler holds the WebSocketSessions that the front end establishes via jsmpeg.js, and sends the camera's bytes out through those sessions. The collaboration diagram is shown below:

(Figure: collaboration diagram of the Stream service)

1. Create a Maven project named Stream in IntelliJ IDEA and add the Spring Boot dependencies to pom.xml

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.4.RELEASE</version>
</parent>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-websocket</artifactId>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <version>2.3.4.RELEASE</version>
            <!-- the fully qualified name of the class containing the main method -->
            <configuration>
                <mainClass>ning.zhou.stream.StreamApplication</mainClass>
            </configuration>
            <executions>
                <execution>
                    <goals>
                        <goal>repackage</goal>
                        <goal>build-info</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

2. Write the startup class StreamApplication and add the WebSocket configuration

@SpringBootApplication
public class StreamApplication {
    public static void main(String[] args) {
        SpringApplication.run(StreamApplication.class, args);
    }
}
@Configuration
@EnableWebSocket
public class WsConfiguration implements WebSocketConfigurer {

    @Autowired
    private WsIntercept wsIntercept;

    @Autowired
    private WsHandler wsHandler;

    @Override
    public void registerWebSocketHandlers(WebSocketHandlerRegistry webSocketHandlerRegistry) {
        //register the handler on the WebSocket URI
        webSocketHandlerRegistry.addHandler(wsHandler, "/videoplay")
                //configure the interceptor and allow cross-origin requests
                .addInterceptors(wsIntercept).setAllowedOrigins("*");
    }

    @Bean
    public ServletServerContainerFactoryBean createWebSocketContainer() {
        //set the maximum message size to 10 MB; an oversized frame would otherwise close the connection
        ServletServerContainerFactoryBean container = new ServletServerContainerFactoryBean();
        container.setMaxTextMessageBufferSize(10 * 1024 * 1024);
        container.setMaxBinaryMessageBufferSize(10 * 1024 * 1024);
        return container;
    }
}
@Component
public class WsIntercept extends HttpSessionHandshakeInterceptor {

    @Override
    public void afterHandshake(ServerHttpRequest serverHttpRequest, ServerHttpResponse serverHttpResponse, WebSocketHandler webSocketHandler, Exception e) {
        //echo the sub-protocol header back to the client for cross-browser compatibility
        HttpServletRequest request = ((ServletServerHttpRequest) serverHttpRequest).getServletRequest();
        HttpServletResponse response = ((ServletServerHttpResponse) serverHttpResponse).getServletResponse();
        String header = request.getHeader("sec-websocket-protocol");
        if (StringUtils.isNotEmpty(header)) {
            response.addHeader("sec-websocket-protocol",header);
        }
        super.afterHandshake(serverHttpRequest,serverHttpResponse,webSocketHandler,e);
    }
}

The WsConfiguration and WsIntercept classes are standard boilerplate for integrating WebSocket with Spring Boot; the one line the reader needs to focus on is where the WebSocketHandler is registered on its URI: webSocketHandlerRegistry.addHandler(wsHandler, "/videoplay")

public class RtspUtils {

    static final int MIN_ARRAY_LEN = 2;
    static final int DIVIDE_INTO_PAIRS = 2;

    public static Map<String, String> parseRequestParam(String url) {
        Map<String, String> map = new HashMap<>();
        if (!url.contains("?")) {
            return null;
        }
        String[] parts = url.split("\\?", DIVIDE_INTO_PAIRS);
        if (parts.length < MIN_ARRAY_LEN) {
            return null;
        }
        String parsedStr = parts[1];
        if (parsedStr.contains("&")) {
            String[] multiParamObj = parsedStr.split("&");
            for (String obj : multiParamObj) {
                parseBasicParam(map, obj);
            }
            return map;
        }
        parseBasicParam(map, parsedStr);
        return map;
    }

    private static void parseBasicParam(Map<String, String> map, String str) {
        String[] paramObj = str.split("=");
        if (paramObj.length < MIN_ARRAY_LEN) {
            return;
        }
        map.put(paramObj[0], paramObj[1]);
    }

}
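As a quick sanity check of the parsing above, the same logic can be run standalone (RtspUtilsDemo is our illustrative mirror of RtspUtils.parseRequestParam, not part of the project):

```java
import java.util.HashMap;
import java.util.Map;

// Self-contained mirror of RtspUtils.parseRequestParam, for a quick sanity check.
public class RtspUtilsDemo {

    public static Map<String, String> parseRequestParam(String url) {
        Map<String, String> map = new HashMap<>();
        if (!url.contains("?")) {
            return null;
        }
        String[] parts = url.split("\\?", 2);
        if (parts.length < 2) {
            return null;
        }
        // Split the query string into key=value pairs on '&'.
        for (String pair : parts[1].split("&")) {
            String[] kv = pair.split("=");
            if (kv.length >= 2) {
                map.put(kv[0], kv[1]);
            }
        }
        return map;
    }

    public static void main(String[] args) {
        Map<String, String> params =
                parseRequestParam("ws://127.0.0.1:8080/videoplay?id=1&user=admin");
        System.out.println(params.get("id"));   // prints "1"
        System.out.println(params.get("user")); // prints "admin"
    }
}
```

Note that a URI without a query string yields null, which is why the handshake handler should check for it before reading the camera id.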
@Component
public class WsHandler extends BinaryWebSocketHandler {

    private static final Logger logger = LogManager.getLogger(WsHandler.class);

    /**
     * camera id mapped to the list of subscribed client sessions
     */
    private Map<String, CopyOnWriteArrayList<WebSocketSession>> cameraClientsMap = new ConcurrentHashMap<>();

    @Override
    public void afterConnectionEstablished(WebSocketSession session) throws Exception {
        Map<String, String> paramMap = RtspUtils.parseRequestParam(session.getUri().toString());
        if (paramMap == null || paramMap.get("id") == null) {
            //no camera id in the handshake URI, refuse the connection
            session.close(CloseStatus.BAD_DATA);
            return;
        }
        String cameraId = paramMap.get("id");
        //thread-safe put-if-absent followed by add
        cameraClientsMap.computeIfAbsent(cameraId, k -> new CopyOnWriteArrayList<>()).add(session);
        logger.info(session.getId() + " connected, watching camera " + cameraId);
    }

    @Override
    public void afterConnectionClosed(WebSocketSession session, CloseStatus status) throws Exception {
        cameraClientsMap.values().forEach(webSocketSessions -> webSocketSessions.remove(session));
    }

    @Override
    public void handleTransportError(WebSocketSession session, Throwable exception) throws Exception {
        cameraClientsMap.values().forEach(webSocketSessions -> webSocketSessions.remove(session));
    }
    
    /**
     * Send media data to every client watching the given camera.
     *
     * @param data media bytes
     * @param id camera id
     */
    public void sendVideo(byte[] data, String id) {
        try {
            CopyOnWriteArrayList<WebSocketSession> webSocketSessions = cameraClientsMap.get(id);
            if (webSocketSessions != null && !webSocketSessions.isEmpty()) {
                for (WebSocketSession session : webSocketSessions) {
                    if (session.isOpen()) {
                        //brief pause so a burst of frames does not overwhelm the session
                        Thread.sleep(1);
                        session.sendMessage(new BinaryMessage(data));
                    }
                }
            }
        } catch (Exception e) {
            logger.error("failed to send media data for camera " + id, e);
        }
    }

}

WsHandler extends BinaryWebSocketHandler and overrides its afterConnectionEstablished, afterConnectionClosed, and handleTransportError methods, maintaining the list of WebSocketSessions for each camera id. A WebSocketSession represents one client watching a camera and is the channel through which the server sends data back to it; our sendVideo method sends each camera's media stream to every client watching that camera.
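The subscription bookkeeping described above reduces to a ConcurrentHashMap of CopyOnWriteArrayLists. The same pattern can be exercised in isolation, with String stand-ins for WebSocketSession (SubscriptionDemo is ours, for illustration only):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// Demonstrates the camera-id -> subscriber-list bookkeeping used by WsHandler,
// with String stand-ins for WebSocketSession.
public class SubscriptionDemo {

    private final Map<String, CopyOnWriteArrayList<String>> cameraClients = new ConcurrentHashMap<>();

    // Equivalent of afterConnectionEstablished: thread-safe put-if-absent, then add.
    public void subscribe(String cameraId, String session) {
        cameraClients.computeIfAbsent(cameraId, k -> new CopyOnWriteArrayList<>()).add(session);
    }

    // Equivalent of afterConnectionClosed: drop the session from every camera's list.
    public void unsubscribe(String session) {
        cameraClients.values().forEach(sessions -> sessions.remove(session));
    }

    public int subscriberCount(String cameraId) {
        CopyOnWriteArrayList<String> sessions = cameraClients.get(cameraId);
        return sessions == null ? 0 : sessions.size();
    }

    public static void main(String[] args) {
        SubscriptionDemo demo = new SubscriptionDemo();
        demo.subscribe("1", "session-a");
        demo.subscribe("1", "session-b");
        demo.subscribe("2", "session-a");
        demo.unsubscribe("session-a"); // the client disconnects from all cameras
        System.out.println(demo.subscriberCount("1")); // prints 1
        System.out.println(demo.subscriberCount("2")); // prints 0
    }
}
```

CopyOnWriteArrayList makes iteration in sendVideo safe while other threads add or remove sessions; it trades write cost for lock-free reads, which fits this read-heavy fan-out.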

3. The controller that receives the media stream

@RestController
public class RtspController {

    @Resource
    private WsHandler wsHandler;

    @PostMapping("/rtsp/receive")
    @ResponseBody
    public void receive(HttpServletRequest request, String id) throws Exception {
        ServletInputStream inputStream = request.getInputStream();
        //read the request body in chunks until EOF
        byte[] buffer = new byte[4096];
        int len;
        while ((len = inputStream.read(buffer)) != -1) {
            if (len > 0) {
                //forward only the bytes actually read to WsHandler.sendVideo,
                //which fans them out to the subscribed WebSocket sessions
                wsHandler.sendVideo(java.util.Arrays.copyOf(buffer, len), id);
            }
        }
    }
}
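The controller's read-and-forward loop can be checked against an in-memory stream. This sketch has the same shape (the names are ours; the Consumer stands in for wsHandler.sendVideo):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.function.Consumer;

// Reads an InputStream in fixed-size chunks until EOF and forwards each chunk,
// mirroring the loop in RtspController.receive.
public class ChunkRelayDemo {

    public static long relay(InputStream in, Consumer<byte[]> sink) throws IOException {
        byte[] buffer = new byte[4096];
        long total = 0;
        int len;
        while ((len = in.read(buffer)) != -1) {
            if (len > 0) {
                // Copy only the bytes actually read before forwarding.
                sink.accept(Arrays.copyOf(buffer, len));
                total += len;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] fakeStream = new byte[10000]; // stand-in for MPEG-TS bytes from FFmpeg
        long forwarded = relay(new ByteArrayInputStream(fakeStream), chunk -> { });
        System.out.println(forwarded); // prints 10000
    }
}
```

Blocking on read (rather than polling available()) guarantees the loop ends exactly when FFmpeg closes the HTTP request body, and that partial reads are never padded with stale buffer bytes.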

4. Put jsmpeg.min.js under the resources/static directory and create index.html

<html>
<head>
</head>
<body>
<canvas id="video"></canvas>
<script type="text/javascript" src="jsmpeg.min.js"></script>
<script type="text/javascript">
    var canvas = document.getElementById('video');
    var url = 'ws://127.0.0.1:8080/videoplay?id=1';
    var player = new JSMpeg.Player(url, {canvas: canvas});
</script>
</body>
</html>

Adjust the url as needed: videoplay is the WebSocket URI configured in the WsConfiguration.registerWebSocketHandlers method above, and id is the camera's id; change this parameter to preview a different camera's stream.

Testing the Stream Service

1. Run the main method of StreamApplication.java

2. In a command prompt, run the FFmpeg push command; you can adapt the command mentioned in Section 1, substituting the parameters

D:\ffmpeg\bin\ffmpeg -hwaccel auto -rtsp_transport tcp -i <rtsp-url> -f mpegts -codec:v mpeg1video -bf 0 -codec:a mp2 -r 25 -b:v 1000k -s 960x520 -an <relay-http-url>
  • 2.1 <rtsp-url>: the camera's RTSP playback address; here the author uses the RTSP address of an EZVIZ (萤石云) camera at home.
  • 2.2 <relay-http-url>: in this article, the REST address of the Stream service's RtspController.receive method. Be sure to append the camera id parameter: index.html plays the camera with id 1, so the relay address is http://127.0.0.1:8080/rtsp/receive?id=1
  • -r sets the frame rate (fps), -b:v the video bitrate, -s the resolution, and -an disables the audio track.
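When this command later moves from a manual console window into the Core service, launching it from Java via ProcessBuilder is the natural route. A hedged sketch that only builds the argument list (the class name, the example URLs, and the decision to drop the redundant -codec:a mp2 alongside -an are ours):

```java
import java.util.ArrayList;
import java.util.List;

// Builds the FFmpeg pull-and-push command as an argument list suitable for ProcessBuilder.
public class FfmpegCommandDemo {

    public static List<String> buildCommand(String ffmpegPath, String rtspUrl, String relayHttpUrl) {
        List<String> cmd = new ArrayList<>();
        cmd.add(ffmpegPath);
        cmd.add("-hwaccel"); cmd.add("auto");
        cmd.add("-rtsp_transport"); cmd.add("tcp"); // TCP avoids UDP packet loss artifacts
        cmd.add("-i"); cmd.add(rtspUrl);
        cmd.add("-f"); cmd.add("mpegts");           // jsmpeg expects an MPEG-TS stream
        cmd.add("-codec:v"); cmd.add("mpeg1video");
        cmd.add("-bf"); cmd.add("0");
        cmd.add("-r"); cmd.add("25");
        cmd.add("-b:v"); cmd.add("1000k");
        cmd.add("-s"); cmd.add("960x520");
        cmd.add("-an");                             // audio disabled
        cmd.add(relayHttpUrl);                      // FFmpeg POSTs the stream to this URL
        return cmd;
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand("ffmpeg",
                "rtsp://192.168.1.10:554/stream",          // illustrative RTSP URL
                "http://127.0.0.1:8080/rtsp/receive?id=1");
        System.out.println(String.join(" ", cmd));
        // To actually launch the relay:
        // Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
    }
}
```

Passing the arguments as a list (rather than one shell string) sidesteps quoting issues, and redirectErrorStream(true) keeps FFmpeg's log output on a single stream that a supervising thread can drain.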

3. Open http://127.0.0.1:8080/index.html in the browser; the camera's video stream plays successfully.

(Figure: the camera stream playing in the browser)

Further Thoughts

The Stream service in this section now replaces everything websocket-relay.js used to provide, but pulling the stream by manually running commands in a console window is hard for a Java developer to accept. Other problems remain as well: the command needs a retry mechanism, because an FFmpeg encode/decode error will make it exit; we must avoid duplicate pushes, since once a camera's stream is already being pushed the same command must not be launched again; and so on. In the coming chapters we will tackle these problems one by one in the Core service and build a highly available camera monitoring system.