Java: receive and decode an RTSP stream, then push it to the frontend for real-time playback

We received a requirement: a device provides us with an RTSP stream, and we need to play it on the web. The real-time stream cannot be played directly in the browser, and we are not just playing the audio; we also need to render a spectrogram from the signal. As shown below, the web side uses plugins such as wavesurfer.

(Figure 1: the web-side spectrogram playback view)

So how do we handle the RTSP stream? We grab the RTSP stream in Java and decode it. Because latency matters, we accumulate roughly one second of data before each push. The code is as follows:

public void decodeWave(String token, String url) throws Exception {
        // the pull flag in Redis controls whether we keep grabbing
        boolean pullAction = (Boolean) redisService.get("pullwav" + token);
        log.info("audio stream url: {}", url);
        // open the RTSP source with JavaCV's FFmpegFrameGrabber
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(url);
        try {
            grabber.start();
            Frame frame = null;
            long startTime = System.currentTimeMillis();
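            // accumulates decoded PCM until roughly one second is buffered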
            ByteArrayOutputStream wavBuffer = new ByteArrayOutputStream();
            while ((frame = grabber.grabFrame()) != null && pullAction) {
                log.info("解流");
                Buffer[] samples = frame.samples;
                int sampleRate = frame.sampleRate;
                int audioChannels = frame.audioChannels;
                int sampleSizeInBit = 16;
                // bytes in one second of PCM: sampleRate * (bits per sample / 8) * channels,
                // e.g. 8000 Hz * 2 bytes * 1 channel = 16000 bytes
                int oneSecondBytes = sampleRate * sampleSizeInBit / 8 * audioChannels;

                List<byte[]> sampleBytes = ByteUtil.convertByteBuffer(samples);
                if (EmptyUtil.isNotEmpty(sampleBytes)) {
                    byte[] bytes = sampleBytes.get(0);
                    wavBuffer.write(bytes);

                    // flush once the buffer holds at least one second of audio
                    if (wavBuffer.size() >= oneSecondBytes) {
                        byte[] pcmBytes = wavBuffer.toByteArray();
                        wavBuffer.reset();
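                        // pace the pushes: sleep out the remainder of the current
                        // second so chunks go to the frontend roughly once per second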
                        long delay = 1000 - (System.currentTimeMillis() - startTime);

                        if (delay > 0) {
                            Thread.sleep(delay);
                        }
                        byte[] temp = WaveHeaderKit.buildHeader(sampleRate, audioChannels, sampleSizeInBit, Arrays.copyOf(pcmBytes, Math.min(oneSecondBytes,pcmBytes.length)));
                        if (temp != null) {
                            webSocketRTServer.sendBroadcast(temp);
                        }
                        startTime = System.currentTimeMillis();
                    }


                }
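                // re-read the Redis flag so an onClose on the frontend side stops the loop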
                pullAction = (Boolean) redisService.get("pullwav" + token);
                if(!pullAction){
                    break;
                }
            }
        } catch (FrameGrabber.Exception e) {
            log.error("failed to pull the stream!", e);
        } catch (Exception e) {
            log.error("error while playing the real-time stream!", e);
        } finally {
            grabber.stop();
        }
    }
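ByteUtil.convertByteBuffer is a small project helper that flattens JavaCV's sample buffers into byte arrays. Roughly, it does something like the sketch below, assuming the grabber delivers 16-bit samples as ShortBuffers (the class and method names here are illustrative, not the project's actual code):

import java.nio.Buffer;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;
import java.util.ArrayList;
import java.util.List;

public final class SampleConversionSketch {

    /** Flattens each ShortBuffer plane into a little-endian byte array. */
    public static List<byte[]> toByteArrays(Buffer[] samples) {
        List<byte[]> result = new ArrayList<>();
        if (samples == null) {
            return result;
        }
        for (Buffer buffer : samples) {
            if (buffer instanceof ShortBuffer) {
                // duplicate so we do not disturb the grabber's buffer position
                ShortBuffer src = ((ShortBuffer) buffer).duplicate();
                ByteBuffer dst = ByteBuffer.allocate(src.remaining() * 2)
                        .order(ByteOrder.LITTLE_ENDIAN);
                while (src.hasRemaining()) {
                    dst.putShort(src.get());
                }
                result.add(dst.array());
            }
        }
        return result;
    }
}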
The line byte[] temp = WaveHeaderKit.buildHeader(sampleRate, audioChannels, sampleSizeInBit, Arrays.copyOf(pcmBytes, Math.min(oneSecondBytes, pcmBytes.length))); wraps the decoded PCM data in a WAV header, and the resulting chunk is pushed to the frontend over WebSocket.
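WaveHeaderKit is likewise project-specific; what a buildHeader-style helper produces is the standard 44-byte RIFF/WAVE header followed by the PCM payload. A minimal sketch of the idea, assuming plain 16-bit PCM (names are illustrative):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

public final class WavChunkSketch {

    /** Prepends a standard 44-byte RIFF/WAVE header to raw PCM data. */
    public static byte[] wrapPcm(int sampleRate, int channels, int bitsPerSample, byte[] pcm) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        int blockAlign = channels * bitsPerSample / 8;

        ByteBuffer out = ByteBuffer.allocate(44 + pcm.length).order(ByteOrder.LITTLE_ENDIAN);
        out.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        out.putInt(36 + pcm.length);                 // remaining chunk size
        out.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        out.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        out.putInt(16);                              // fmt sub-chunk size for PCM
        out.putShort((short) 1);                     // audio format 1 = uncompressed PCM
        out.putShort((short) channels);
        out.putInt(sampleRate);
        out.putInt(byteRate);
        out.putShort((short) blockAlign);
        out.putShort((short) bitsPerSample);
        out.put("data".getBytes(StandardCharsets.US_ASCII));
        out.putInt(pcm.length);                      // data sub-chunk size
        out.put(pcm);
        return out.array();
    }
}

Because every chunk carries its own header, each WebSocket frame is an independently decodable one-second WAV clip, which is what lets the frontend decode and render chunks as they arrive.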

推送动作的java代码如下:

@ServerEndpoint("/ws/rt/{token}")
@Component
public class WebSocketRTServer {


    private static RedisService redisService;

    private static PullWaveAction pullWaveAction;

    // the container creates a new endpoint instance per connection, so Spring
    // beans are injected into static fields via setter injection
    @Autowired
    public void setRedisService(RedisService redisService) {
        WebSocketRTServer.redisService = redisService;
    }

    @Autowired
    public void setPullWaveAction(PullWaveAction pullWaveAction) {
        WebSocketRTServer.pullWaveAction = pullWaveAction;
    }

    private Log LOG = LogFactory.get();
    private static ConcurrentHashMap<String, Session> sessionMap = new ConcurrentHashMap<>();

    /**
     * Send a text message to a single session.
     *
     * @param session the target WebSocket session
     * @param message the text payload
     * @throws IOException if the send fails
     */
    public void sendMessage(Session session, String message) throws IOException {
        if (session != null) {
            synchronized (session) {
                session.getBasicRemote().sendText(message);
            }
        }
    }

    /**
     * Send a binary message (a WAV chunk) to a single session.
     *
     * @param session the target WebSocket session
     * @param message the binary payload
     */
    public void sendBytes(Session session, byte[] message) {
        if (session != null) {
            synchronized (session) {
                try {
                    session.getBasicRemote().sendBinary(ByteBuffer.wrap(message));
                } catch (Exception e) {
                    LOG.info("socket closed; stopping transmission!");
                }
            }
        }
    }

    /**
     * Broadcast a binary message to every connected session.
     *
     * @param message the binary payload
     */
    public void sendBroadcast(byte[] message) {
        sessionMap.forEach((token, session) -> sendBytes(session, message));
    }

    /**
     * Look up the session registered for a token.
     *
     * @param token the client token
     * @return the session, or null if none is registered
     */
    public Session getSession(String token) {
        return sessionMap.get(token);
    }

    @OnOpen
    public void onOpen(Session session, @PathParam("token") String token) throws IOException {
        sessionMap.put(token, session);
        LOG.info("{}加入websocket", token);

        sendMessage(session,"hello");

        /* Start the RTSP pull:
           store a pull flag in Redis when pulling begins;
           once the flag turns false, the decode loop stops. */
        try {
            LOG.info("start pulling the stream");
            redisService.set("pullwav" + token, true, 3600000);
            // PullWaveAction presumably resolves the stream URL for this token
            // and delegates to decodeWave(token, url) shown above
            pullWaveAction.decodeWave(token);
        } catch (Exception e) {
            LOG.info("pulling ended!");
        }

    }

    @OnClose
    public void onClose(@PathParam("token") String token) {
        sessionMap.remove(token);
        // stop the RTSP pull: once the flag is false the decode loop exits
        redisService.set("pullwav" + token, false, 3600000);

        LOG.info("{} left the WebSocket", token);
    }

    @OnMessage
    public void onMessage(byte[] data, Session session){
    }
}
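One caveat: decodeWave loops until the Redis flag flips, and invoking it directly inside onOpen ties up the WebSocket container thread for the whole lifetime of the pull. A common refinement (a sketch under the same assumptions, not the article's actual code) is to hand the loop to an executor so that onOpen returns immediately:

// requires java.util.concurrent.ExecutorService and Executors
private static final ExecutorService PULL_EXECUTOR = Executors.newCachedThreadPool();

@OnOpen
public void onOpen(Session session, @PathParam("token") String token) {
    sessionMap.put(token, session);
    redisService.set("pullwav" + token, true, 3600000);
    // run the blocking pull loop off the container thread
    PULL_EXECUTOR.submit(() -> {
        try {
            pullWaveAction.decodeWave(token);
        } catch (Exception e) {
            LOG.info("pulling ended!");
        }
    });
}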

Once the Java side pushes the stream out, the frontend can receive it over the WebSocket and play it.
