srs SRT: error slicing to HLS while publishing a stream

os8fio9y · posted 2022-10-27 in Other

vMix publishes a stream to an SRS lightweight server; the error log is as follows:

docker logs -f srs-server|grep t33g4163
03:33:29.679184/srs*E:SRT.d: SND-DROPPED 4 packets - lost delaying for 1022ms
03:35:23.059938/SRT:RcvQ:worker*E:SRT.c: IPE: ACK node overwritten when acknowledging 1162 (ack extracted: 1004308388)
03:36:52.965435/SRT:RcvQ:worker*E:SRT.c: IPE: ACK node overwritten when acknowledging 5908 (ack extracted: 1004325252)
[2022-03-04 03:37:48.565][Trace][1][t33g4163] RTMP client ip=127.0.0.1:48220, fd=17
[2022-03-04 03:37:48.568][Trace][1][t33g4163] complex handshake success
[2022-03-04 03:37:48.568][Trace][1][t33g4163] connect app, tcUrl=rtmp://127.0.0.1/live, pageUrl=, swfUrl=, schema=rtmp, vhost=127.0.0.1, port=1935, app=live, args=(obj)
[2022-03-04 03:37:48.569][Trace][1][t33g4163] edge-srs ip=172.17.0.4, version=4.0.241, pid=1, id=0
[2022-03-04 03:37:48.569][Trace][1][t33g4163] protocol in.buffer=0, in.ack=0, out.ack=0, in.chunk=128, out.chunk=128
[2022-03-04 03:37:48.570][Trace][1][t33g4163] client identified, type=flash-publish, vhost=127.0.0.1, app=live, stream=livestream, param=?secret=d6d2be37&vhost=127.0.0.1, duration=0ms
[2022-03-04 03:37:48.578][Trace][1][t33g4163] http: on_connect ok, client_id=t33g4163, url=http://mgmt.srs.local:2022/terraform/v1/hooks/srs/verify, request={"server_id":"vid-1odw09d","action":"on_connect","client_id":"t33g4163","ip":"127.0.0.1","vhost":"__defaultVhost__","app":"live","stream":"livestream","param":"?secret=d6d2be37&vhost=127.0.0.1","tcUrl":"rtmp://127.0.0.1/live","pageUrl":""}, response={"code":0}
[2022-03-04 03:37:48.578][Trace][1][t33g4163] connected stream, tcUrl=rtmp://127.0.0.1/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, stream=livestream, param=?secret=d6d2be37&vhost=127.0.0.1, args=(obj)
[2022-03-04 03:37:48.578][Trace][1][t33g4163] source url=/live/livestream, ip=127.0.0.1, cache=1, is_edge=0, source_id=/3890cc79
[2022-03-04 03:37:48.590][Trace][1][t33g4163] http: on_publish ok, client_id=t33g4163, url=http://mgmt.srs.local:2022/terraform/v1/hooks/srs/verify, request={"server_id":"vid-1odw09d","action":"on_publish","client_id":"t33g4163","ip":"127.0.0.1","vhost":"__defaultVhost__","app":"live","tcUrl":"rtmp://127.0.0.1/live","stream":"livestream","param":"?secret=d6d2be37&vhost=127.0.0.1"}, response={"code":0}
[2022-03-04 03:37:48.592][Trace][1][t33g4163] RTC bridge from RTMP, rtmp2rtc=1, keep_bframe=0, merge_nalus=0
[2022-03-04 03:37:48.600][Trace][1][t33g4163] hls: win=60000ms, frag=10000ms, prefix=, path=./objs/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=1, dispose=0ms, dts_directly=1
[2022-03-04 03:37:48.600][Trace][1][t33g4163] ignore disabled exec for vhost=__defaultVhost__
[2022-03-04 03:37:48.601][Trace][1][t33g4163] start publish mr=0/350, p1stpt=20000, pnt=5000, tcp_nodelay=0
[2022-03-04 03:37:48.606][Warn][1][t33g4163][11] aac ignore type=1 for no sequence header
[2022-03-04 03:37:48.607][Trace][1][t33g4163] Drop ts segment, sequence_no=14, uri=livestream-14.ts, duration=0ms
[2022-03-04 03:37:48.617][Trace][1][t33g4163] cleanup when unpublish
[2022-03-04 03:37:48.617][Trace][1][t33g4163] cleanup when unpublish, created=1, deliver=1
[2022-03-04 03:37:48.625][Trace][1][t33g4163] http: on_unpublish ok, client_id=t33g4163, url=http://mgmt.srs.local:2022/terraform/v1/hooks/srs/verify, request={"server_id":"vid-1odw09d","action":"on_unpublish","client_id":"t33g4163","ip":"127.0.0.1","vhost":"__defaultVhost__","app":"live","stream":"livestream","param":"?secret=d6d2be37&vhost=127.0.0.1"}, response={"code":0}
[2022-03-04 03:37:48.635][Trace][1][t33g4163] http: on_close ok, client_id=t33g4163, url=http://mgmt.srs.local:2022/terraform/v1/hooks/srs/verify, request={"server_id":"vid-1odw09d","action":"on_close","client_id":"t33g4163","ip":"127.0.0.1","vhost":"__defaultVhost__","app":"live","send_bytes":3865,"recv_bytes":3702}, response={"code":0}
[2022-03-04 03:37:48.635][Trace][1][t33g4163] TCP: before dispose resource(RtmpConn)(0x1ece1b0), conns=2, zombies=0, ign=0, inz=0, ind=0
[2022-03-04 03:37:48.635][Error][1][t33g4163][11] serve error code=5011 : service cycle : rtmp: stream service : rtmp: receive thread : handle publish message : rtmp: consume message : rtmp: consume audio : bridger consume audio : aac append header : adts
thread [1][t33g4163]: do_cycle() [src/app/srs_app_rtmp_conn.cpp:217][errno=11]
thread [1][t33g4163]: service_cycle() [src/app/srs_app_rtmp_conn.cpp:414][errno=11]
thread [1][t33g4163]: do_publishing() [src/app/srs_app_rtmp_conn.cpp:910][errno=11]
thread [1][t33g4163]: consume() [src/app/srs_app_recv_thread.cpp:380][errno=11]
thread [1][t33g4163]: handle_publish_message() [src/app/srs_app_rtmp_conn.cpp:1037][errno=11]
thread [1][t33g4163]: process_publish_message() [src/app/srs_app_rtmp_conn.cpp:1058][errno=11]
thread [1][t33g4163]: on_audio_imp() [src/app/srs_app_source.cpp:2223][errno=11]
thread [1][t33g4163]: on_audio() [src/app/srs_app_rtc_source.cpp:843][errno=11]
thread [1][t33g4163]: aac_raw_append_adts_header() [src/app/srs_app_rtc_source.cpp:85][errno=11](Resource temporarily unavailable)
03:37:52.910634/SRT:RcvQ:worker*E:SRT.c: IPE: ACK node overwritten when acknowledging 9105 (ack extracted: 1004336704)
[2022-03-04 03:37:48.635][Trace][1][t33g4163] TCP: disposing #0 resource(RtmpConn)(0x1ece1b0), conns=2, disposing=1, zombies=0
[2022-03-04 03:37:48.662][Trace][1][r02gu55i] source url=/live/livestream, ip=127.0.0.1, cache=1, is_edge=0, source_id=/t33g4163
03:38:22.886780/SRT:RcvQ:worker*E:SRT.c: IPE: ACK node overwritten when acknowledging 10711 (ack extracted: 1004342323)

It looks like HLS segmenting failed, which in turn caused the publish to fail.

kdfy810k #1

I looked into it. The RTMP AAC stream is expected to deliver its aac_sequence_header first, but raw AAC data arrived instead, which produces this output:

[2022-03-04 03:37:48.606][Warn][1][t33g4163][11] aac ignore type=1 for no sequence header
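For reference, here is a minimal sketch (in Python, not SRS's actual C++ code) of how the FLV/RTMP audio tag distinguishes an AAC sequence header from raw AAC data: per the FLV spec, the high nibble of the first byte is SoundFormat (10 means AAC), and the second byte is AACPacketType (0 = sequence header carrying the AudioSpecificConfig, 1 = raw frame). SRS warns and drops raw frames until it has seen a sequence header.

```python
def classify_rtmp_aac_packet(audio_payload: bytes) -> str:
    """Classify the AAC payload of an RTMP/FLV audio message."""
    if len(audio_payload) < 2:
        return "invalid"
    sound_format = audio_payload[0] >> 4  # high nibble of first byte
    if sound_format != 10:  # 10 == AAC in the FLV spec
        return "not-aac"
    aac_packet_type = audio_payload[1]
    if aac_packet_type == 0:
        return "sequence-header"  # AudioSpecificConfig; must come first
    if aac_packet_type == 1:
        return "raw-data"  # raw AAC frame; dropped if no header seen yet
    return "unknown"
```

A publisher that sends `raw-data` packets before any `sequence-header` packet triggers exactly the "aac ignore type=1 for no sequence header" warning above.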

Later, when aac_raw_append_adts_header is called, it finds that the raw AAC data's nb_samples is not 1 and returns the error.
So the root cause is that the incoming RTMP AAC stream did not send the header first.
Why srt2rtmp failed to forward the header cannot be determined from the current logs; most likely SRT packet loss prevented the AAC metadata from being parsed correctly.
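To illustrate what "append ADTS header" means here: each raw AAC frame extracted from the stream must be prefixed with a 7-byte ADTS header before it can be muxed into MPEG-TS for HLS, and building that header requires the profile, sample-rate index, and channel count from the AAC metadata, which is exactly what goes missing when the sequence header is lost. Below is a simplified sketch of the header layout per the MPEG-4 ADTS format; it is not SRS's implementation, and the default parameter values (AAC-LC, 44.1 kHz, stereo) are assumptions for illustration.

```python
def adts_header(frame_len: int, profile: int = 2,
                srate_idx: int = 4, channels: int = 2) -> bytes:
    """Build the 7-byte ADTS header for one raw AAC frame.

    profile: AAC object type (2 = AAC-LC); srate_idx: sampling-frequency
    index (4 = 44100 Hz); channels: channel configuration.
    frame_len is the raw frame size; ADTS counts the header in its length.
    """
    full_len = frame_len + 7
    hdr = bytearray(7)
    hdr[0] = 0xFF                                 # syncword high byte
    hdr[1] = 0xF1                                 # syncword low, MPEG-4, layer 0, no CRC
    hdr[2] = ((profile - 1) << 6) | (srate_idx << 2) | (channels >> 2)
    hdr[3] = ((channels & 0x3) << 6) | ((full_len >> 11) & 0x3)
    hdr[4] = (full_len >> 3) & 0xFF               # middle bits of 13-bit length
    hdr[5] = ((full_len & 0x7) << 5) | 0x1F       # length low bits + fullness high bits
    hdr[6] = 0xFC                                 # fullness low bits, 1 raw block
    return bytes(hdr)
```

Without the sequence header, profile/srate_idx/channels are unknown, so a muxer cannot build a valid ADTS header, which is consistent with the error path ending at aac_raw_append_adts_header.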
