diff --git a/docs/SellySDK_直播推拉流接入文档_Android.md b/docs/SellySDK_直播推拉流接入文档_Android.md
index d99972c..bd59e63 100644
--- a/docs/SellySDK_直播推拉流接入文档_Android.md
+++ b/docs/SellySDK_直播推拉流接入文档_Android.md
@@ -12,13 +12,16 @@ Selly Live SDK 提供完整的音视频直播能力,支持 **推流(直播
 ### 主要能力
 
 - 支持 **RTMP / RTC** 推流与播放模式
+- 支持 **SurfaceView / TextureView** 两套渲染后端
+- 直播播放器与点播播放器支持 **SurfaceTexture** 高级渲染接入
 - 高性能音视频采集与编码
 - 灵活的视频参数配置(分辨率 / 帧率 / 码率)
 - 推流状态与统计回调
 - 拉流播放状态与错误回调
 - 支持视频帧处理(美颜 / 滤镜 / 水印)
 - 基于 **Token 的安全鉴权机制**
-- 支持 **RTMP H264 + AAC payload XOR 保护(可选)**
+- 支持 **RTMP Payload XOR 保护(可选)**
+- 支持 **RTC(WHEP/WHIP)WebRTC Frame XOR 加解密(可选)**
 - 支持 **外部代理地址注入**(如洋葱盾等第三方安全代理)
 
 ---
@@ -44,7 +47,7 @@ Selly Live SDK 提供完整的音视频直播能力,支持 **推流(直播
 ```gradle
 dependencies {
-    implementation files("libs/sellycloudsdk-1.0.0.aar")
+    implementation files("libs/sellycloudsdk-1.0.1.aar")
 }
 ```
@@ -177,17 +180,18 @@ val proxy = SellyCloudManager.getProxyAddress() // null 表示未设置
 2. 调用 `pusher.token = newToken` / `player.token = newToken`
 3. 停止并重新开始推流 / 拉流流程
 
-### 5.4 RTMP Payload XOR 保护(可选)
+### 5.4 RTMP / WebRTC XOR 保护(可选)
 
 用途:
 
-- 防止他人拿到 RTMP 地址后直接播放、转码或截图
+- 提高流地址泄露后被直接播放、转码或抓流的门槛
 
 生效范围与约束:
 
-- 仅对 **RTMP** 生效
-- 仅支持 **H264 + AAC**(当前版本)
-- 只处理 payload,配置帧(SPS/PPS、AAC Sequence Header)保持不变
+- **RTMP** 推拉流:支持 payload XOR,当前仅支持 **H264 + AAC**
+- **RTC(WHEP/WHIP)** 推拉流:支持 WebRTC frame XOR 加解密
+- 当前这里的 WebRTC 指直播 RTC 推拉流,不包含互动通话高层 API
+- RTMP 只处理 payload,配置帧(SPS/PPS、AAC Sequence Header)保持不变
 - 推流端与播放端必须使用**同一个 key**
 
 Key 格式:
 
 - `hex` 字符串,建议 16 或 32 字节(即 32/64 个 hex 字符)
 - 支持 `0x` 前缀
 - 长度必须为偶数
-- 非法 key 会被忽略并关闭 XOR(会输出 warning 日志)
+- 非法 key 会直接抛出 `IllegalArgumentException`,不会静默降级
 
 时机要求:
 
-- 推流:请在 `startLiveWithStreamId(...)` / `startLiveWithUrl(...)` 之前设置 key
+- 推流:请在 `startLiveWithStreamId(...)` / `startLiveWithUrl(...)` 之前调用 `setXorKey(...)`
 - 拉流:请在 `initWithStreamId(...)` / `initWithUrl(...)` 创建播放器时传入 `xorKeyHex`
+- 运行中修改 key 不会影响当前连接,需重启推流或重建播放器实例
 
 ---
@@ -231,7 +236,7 @@ val config = SellyLiveVideoConfiguration.defaultConfiguration().apply {
     outputImageOrientation = SellyLiveOrientation.PORTRAIT
 }
 
-pusher.attachPreview(previewContainer)
+pusher.attachPreview(previewContainer, useTextureView = false)
 pusher.startRunning(
     cameraPosition = SellyLiveCameraPosition.FRONT,
     videoConfig = config,
@@ -239,13 +244,44 @@ pusher.startRunning(
 )
 ```
 
+### 6.2.1 预览后端选择
+
+推流预览支持两种接入方式:
+
+- `attachPreview(container, useTextureView = false)`:SDK 创建预览 View,默认走旧的 `Surface/OpenGL` 预览链路
+- `attachPreview(container, useTextureView = true)`:SDK 创建 `TextureView` 预览,适合需要普通 View 层级混排的场景
+- `setPreviewView(view)`:手动传入预览 View
+- `setPreviewView(view, mode)`:当传入 `TextureView` 时,建议使用这个显式协议版本
+
+示例:
+
+```kotlin
+// 默认旧路径
+pusher.attachPreview(previewContainer, useTextureView = false)
+
+// TextureView 路径
+pusher.attachPreview(previewContainer, useTextureView = true)
+```
+
+```kotlin
+// 手动指定 TextureView 时,建议显式传入 liveMode
+val textureView = com.sellycloud.sellycloudsdk.widget.AspectRatioTextureView(this)
+pusher.setPreviewView(textureView, SellyLiveMode.RTMP)
+```
+
+说明:
+
+- `RTMP` 模式下,SDK 内部会根据预览 View 类型自动选择 `OpenGlView` 或 `TextureView`
+- `RTC/WHIP` 预览也支持 `TextureView`
+- 当前版本建议在 **开始采集/推流前** 选定预览后端;不保证运行中热切换预览后端
+
 ### 6.3 设置推流 Token(使用 streamId 时)
 
 ```kotlin
 pusher.token = pushToken
 ```
 
-#### RTMP Payload XOR(可选)
+#### 推流 XOR(RTMP / RTC-WHIP,可选)
 
 ```kotlin
 val xorKeyHex = "A1B2C3D4E5F6A7B8C9D0E1F2A3B4C5D6"
@@ -254,7 +290,7 @@ val xorKeyHex = "A1B2C3D4E5F6A7B8C9D0E1F2A3B4C5D6"
 pusher.setXorKey(xorKeyHex)
 ```
 
-> 若在推流中修改 key,需停止并重新开始推流后才会使用新 key。
+> `setXorKey(...)` 同时作用于 RTMP 推流与 RTC/WHIP 推流。若在推流中修改 key,需停止并重新开始推流后才会使用新 key。
 
 ### 6.4 开始/停止推流
@@ -295,10 +331,118 @@ pusher.stopLive { error ->
 - `setCameraEnabled(true/false)`:关闭/开启摄像头
 - `setStreamOrientation(...)`:切换推流方向
 - `setVideoConfiguration(...)` + `changeResolution(...)`:动态调整分辨率
+- `setAutoFramingEnabled(...)` / `getAutoFramingCapability()` / `getAutoFramingState()`:自动取景
 - `setBeautyEngine(...)` + `setBeautyEnabled(...)`:接入美颜
 - `setBeautyLevel(level)`:设置美颜强度
 - `setBitmapAsVideoSource(...)` / `restoreCameraVideoSource()`:背景图推流
 
+### 6.5.1 美颜引擎接入
+
+当前版本推荐通过 `BeautyEngine` + `VideoProcessor` 接入美颜。Demo 使用 `FaceUnityBeautyEngine`,位于:
+
+- `example/src/main/java/com/demo/SellyCloudSDK/beauty/FaceUnityBeautyEngine.kt`
+
+接入示例:
+
+```kotlin
+val beautyEngine = FaceUnityBeautyEngine()
+
+pusher.setBeautyEngine(beautyEngine)
+pusher.setBeautyEnabled(true)
+pusher.setBeautyLevel(3.0f)
+```
+
+说明:
+
+- `BeautyEngine.createProcessor()` 返回的是 SDK V2 `VideoProcessor`
+- 当前 Demo 的美颜实现走 `TEXTURE_2D + READ_WRITE`
+- 美颜属于“完整重写输出”的场景,建议在 `VideoProcessorConfig` 中设置 `fullRewrite = true`
+- `RTC/WHIP` 路径优先推荐 `TEXTURE_2D`,避免对 texture-backed 帧做额外的 texture-to-CPU 转换
+
+### 6.5.2 推流前帧处理与观察
+
+直播推流支持:
+
+- 一个可写 `VideoProcessor`
+- 多个只读 `VideoFrameObserver`
+
+只读观测示例:
+
+```kotlin
+val disposable = pusher.addVideoFrameObserver(object : VideoFrameObserver {
+    override val config = VideoFrameObserverConfig(
+        preferredFormat = VideoProcessFormat.TEXTURE_2D
+    )
+
+    override fun onTextureFrame(frame: VideoTextureFrame) {
+        // 只读观测,不修改输出
+    }
+})
+```
+
+可写处理示例:
+
+```kotlin
+pusher.setVideoProcessor(object : VideoProcessor {
+    override val config = VideoProcessorConfig(
+        preferredFormat = VideoProcessFormat.TEXTURE_2D,
+        mode = VideoProcessMode.READ_WRITE
+    )
+
+    override fun processTexture(input: VideoTextureFrame, outputTextureId: Int) {
+        // 将滤镜/水印直接写入 SDK 提供的 outputTextureId
+    }
+})
+```
+
+当前 SDK / Demo 的处理建议:
+
+- `RTC/WHIP` 路径优先使用 `TEXTURE_2D`
+- `RTMP` 在确实需要 CPU 像素时,可使用 `I420` / `RGBA`
+- `READ_WRITE` 模式下,SDK 会准备输出缓冲;只有“完整覆盖输出”的场景才建议 `fullRewrite = true`
+- `outputTextureId` 由 SDK 管理,处理器不应转移所有权,也不应在回调里主动删除纹理
+- `VideoFrameObserverConfig` 的默认值仍为 `I420` 以兼容旧接入;新接入建议显式声明 `preferredFormat`
+
+Demo 中当前可直接验证的模式:
+
+- `帧回调纹理`:`TEXTURE_2D` observer
+- `帧回调空CPU`:声明 `I420`,不处理像素
+- `帧回调单CPU`:单个 `I420` observer
+- `帧回调双CPU`:两个 `I420` observer,共享同一次 CPU 转换
+- `改帧`:`RTC` 下走 `TEXTURE_2D`,`RTMP` 示例走 `RGBA`
+
+### 6.5.3 自动取景(Auto Framing)
+
+当前高层 API 已暴露:
+
+- `setAutoFramingEnabled(enabled)`:开启 / 关闭自动取景
+- `getAutoFramingCapability()`:查询当前是否支持及原因
+- `getAutoFramingState()`:读取当前状态
+- `delegate.onAutoFramingStateChanged(state)`:接收状态变化回调
+
+状态枚举:
+
+- `OFF`
+- `INACTIVE`
+- `FRAMING`
+- `CONVERGED`
+- `UNSUPPORTED`
+
+当前约束:
+
+- 当前自动取景只在 **RTMP 推流** 路径可用
+- `RTC / WHIP` 推流当前会返回 `UNSUPPORTED`
+- 需要摄像头已启动后再查询 capability;相机关闭、背景图推流等场景也会返回不支持
+
+示例:
+
+```kotlin
+val capability = pusher.getAutoFramingCapability()
+if (capability.supported) {
+    pusher.setAutoFramingEnabled(true)
+}
+```
+
 ### 6.6 生命周期建议
 
 在宿主 Activity 中对齐生命周期:
@@ -324,21 +468,25 @@ pusher.stopLive { error ->
 - videoBitrateKbps / audioBitrateKbps
 - rttMs
 - cpu 使用率(Demo 通过 `CpuUsage` 读取)
+- auto framing state(通过 `onAutoFramingStateChanged` / `getAutoFramingState()` 获取)
 
 ### 6.8 推流 API 速览(含 Demo 未覆盖)
 
 初始化与预览:
 
 - `initWithLiveMode(context, liveMode)`:创建推流实例
-- `setPreviewView(view)`:设置预览 View
-- `attachPreview(container)`:将预览 View 添加到容器
+- `setPreviewView(view)`:设置预览 View;`TextureView` 会按当前 `liveMode` 选择协议
+- `setPreviewView(view, mode)`:显式设置预览 View 与协议,`TextureView` 推荐使用
+- `attachPreview(container)`:将默认预览 View 添加到容器
+- `attachPreview(container, useTextureView)`:创建并绑定 `Surface/OpenGL` 或 `TextureView` 预览
 - `getPreviewView()`:获取当前预览 View
 
 采集与推流:
 
 - `startRunning(cameraPosition, videoConfig, audioConfig)`:开始采集预览
 - `setVideoConfiguration(config)`:更新视频参数
-- `setXorKey(hexKey)`:设置 RTMP payload XOR key(可选)
+- `setXorKey(hexKey)`:设置推流 XOR key(RTMP payload / RTC-WHIP frame,可选)
+- `setAutoFramingEnabled(enabled)` / `getAutoFramingCapability()` / `getAutoFramingState()`:自动取景控制与状态查询
 - `startLiveWithStreamId(streamId)`:使用 streamId 推流
 - `startLiveWithUrl(url)`:使用完整 URL 推流
 - `stopLive()` / `stopLive(callback)`:停止推流
@@ -356,6 +504,7 @@ pusher.stopLive { error ->
 - `setBeautyEngine(engine)`:设置美颜引擎
 - `setBeautyEnabled(true/false)`:启用 / 关闭美颜
 - `setBeautyLevel(level)`:设置美颜强度
+- `onAutoFramingStateChanged(state)`:自动取景状态回调
 - `setStreamOrientation(orientation)`:设置推流方向
 - `changeResolution(width, height)`:动态调整分辨率
 - `setBitmapAsVideoSource(bitmap)` / `restoreCameraVideoSource()`:背景图推流
@@ -375,7 +524,7 @@ val player = SellyLiveVideoPlayer.initWithStreamId(
     context = this,
     streamId = streamId,
     liveMode = SellyLiveMode.RTC,
-    xorKeyHex = "" // RTC 场景可留空
+    xorKeyHex = "" // 加密流传入同一 key,明文流可留空
 )
 // 或直接使用完整 URL
 // val player = SellyLiveVideoPlayer.initWithUrl(this, playUrl, xorKeyHex = "A1B2...")
@@ -394,7 +543,7 @@ val player = SellyLiveVideoPlayer.initWithStreamId(
 )
 ```
 
-> 使用 RTMP 加密流时,请在创建播放器时传入 `xorKeyHex`;后续如需换 key,请重建播放器实例。
+> 使用 RTMP 或 RTC/WHEP 加密流时,请在创建播放器时传入 `xorKeyHex`;后续如需换 key,请重建播放器实例。
 
 ### 7.2 设置拉流 Token(使用 streamId 时)
@@ -406,11 +555,35 @@ player.token = playToken
 ### 7.3 播放流程
 
 ```kotlin
-player.attachRenderView(renderContainer)
+player.attachRenderView(renderContainer, com.sellycloud.sellycloudsdk.render.RenderBackend.SURFACE_VIEW)
 player.prepareToPlay()
 player.play()
 ```
 
+### 7.3.1 播放渲染后端选择
+
+直播播放器支持以下渲染接入方式:
+
+- `attachRenderView(container, RenderBackend.SURFACE_VIEW)`:默认旧路径
+- `attachRenderView(container, RenderBackend.TEXTURE_VIEW)`:使用 `TextureView`
+- `setRenderView(view)`:手动传入 `SurfaceView`、`SurfaceViewRenderer` 或 `TextureView`
+- `setRenderSurfaceTexture(surfaceTexture, width, height)`:高级场景下直接绑定 `SurfaceTexture`(调用方负责 SurfaceTexture 生命周期)
+
+示例:
+
+```kotlin
+val backend = com.sellycloud.sellycloudsdk.render.RenderBackend.TEXTURE_VIEW
+player.attachRenderView(renderContainer, backend)
+player.prepareToPlay()
+player.play()
+```
+
+说明:
+
+- `RTMP` 播放支持 `SurfaceView`、`TextureView`、`SurfaceTexture`
+- `RTC/WHEP` 播放支持 `SurfaceViewRenderer`、`TextureView`,以及高级场景下的 `SurfaceTexture`
+- 当前版本建议在 **开始播放前** 选定渲染后端;当前 Demo 在首页设置中统一选择,进入页面后不再暴露热切换
+
 控制接口:
 
 - `pause()`
@@ -422,6 +595,8 @@ player.play()
 补充接口(Demo 未覆盖):
 
 - `setRenderView(view)`:手动指定渲染 View
+- `setRenderSurfaceTexture(surfaceTexture, width, height)`:直接绑定 `SurfaceTexture`(调用方负责 SurfaceTexture 生命周期)
+- `clearRenderTarget()`:解绑当前渲染面,播放会话可继续存活
 - `seekBy(deltaMs)`:播放进度跳转(仅在流支持快进/回放时有效)
 
 ### 7.4 播放回调
@@ -453,7 +628,11 @@ player.delegate = object : SellyLiveVideoPlayerDelegate {
 - `initWithStreamId(context, streamId, liveMode, vhost, appName, xorKeyHex)`:使用 streamId 创建播放器
 - `initWithUrl(context, url, xorKeyHex)`:使用完整 URL 创建播放器
-- `attachRenderView(container)` / `setRenderView(view)`:设置渲染 View
+- `attachRenderView(container)`:创建默认 `SurfaceView` 渲染 View
+- `attachRenderView(container, backend)`:创建指定 backend 的渲染 View
+- `setRenderView(view)`:手动设置渲染 View
+- `setRenderSurfaceTexture(surfaceTexture, width, height)`:绑定 `SurfaceTexture`(调用方负责 SurfaceTexture 生命周期)
+- `clearRenderTarget()`:解绑当前渲染面
 - `getRenderView()`:获取当前渲染 View
 
 播放控制:
@@ -468,6 +647,17 @@ player.delegate = object : SellyLiveVideoPlayerDelegate {
 - `setStatsListener { snapshot -> }`:播放统计回调
 - `release()`:释放播放器资源
 
+### 7.6 点播播放器渲染说明
+
+`SellyVodPlayer` 与直播播放器在渲染后端模型上保持一致:
+
+- `attachRenderView(container, backend)`:支持 `SURFACE_VIEW` / `TEXTURE_VIEW`
+- `setRenderView(surfaceView)` / `setRenderView(textureView)`:手动绑定现有 View
+- `setRenderSurfaceTexture(surfaceTexture, width, height)`:高级场景使用 `SurfaceTexture`(调用方负责 SurfaceTexture 生命周期)
+- `clearRenderTarget()`:解绑当前渲染面但不一定立即销毁播放实例
+
+因此 Demo 中点播页的 `SurfaceView / TextureView` 选择,也与直播播放页保持一致,均在首页设置中统一生效。
+
 ---
 
 ## 8. 错误处理与重试建议
@@ -490,6 +680,10 @@ player.delegate = object : SellyLiveVideoPlayerDelegate {
 ## 9. 最佳实践
 
 - 推流前先完成采集预览
+- `SurfaceView / TextureView` backend 建议在开始推流或播放前选定
+- `RTC/WHIP` 的美颜、滤镜、水印、观测优先使用 `TEXTURE_2D`
+- `I420 / RGBA` 仅在算法必须访问 CPU 像素时再使用
+- 完整重写输出的 GPU 处理器设置 `fullRewrite = true`;叠加类处理保留默认值
 - Token 即将过期前提前刷新
 - 使用统计回调做质量监控
 - 拉流失败避免无限重试
@@ -521,10 +715,32 @@ SDK 不解析 URL 中的鉴权信息,所有鉴权均通过 `token` 属性完
 - 推流端与播放端 `xorKeyHex` 是否完全一致
 - key 格式是否为合法 hex(偶数长度,支持 `0x` 前缀)
-- 当前是否为 RTMP + H264 + AAC
+- 当前是 `RTMP` 还是 `RTC/WHEP`,两端是否都走了对应的加密流配置
 - 变更 key 后是否已重启推流 / 重建播放器
 
-### Q5:如何接入代理/加速服务(如洋葱盾)?
+### Q5:什么时候选择 `SurfaceView`,什么时候选择 `TextureView`?
+**A:**
+
+- 普通原生 Android 页面,优先使用默认 `SurfaceView`,性能最优
+- 需要与按钮、封面、弹层等普通 View 正常混排时,优先使用 `TextureView`
+- Flutter 场景通过 `setRenderSurfaceTexture()` 接入,走 `TextureView` 同一套渲染管线
+- 当前版本建议在开始推流/播放前选定 backend;当前 Demo 在首页设置中统一选择,进入页面后不支持切换
+
+### Q5.1:`TextureView` 模式下,VOD/RTMP 播放的 `BufferQueueProducer timeout` 日志是什么?
+**A:**
+
+SDK 内部使用 GL Bridge 将 MediaCodec 硬解输出通过 OpenGL 中转渲染到 TextureView,大幅减少此类日志。如在极端场景下仍偶现,属于 Android 系统 BufferQueue 机制限制,不影响播放功能。`SurfaceView` 路径不存在此问题。
+
+### Q5.2:`attach` 和 `set` 两套 API 的区别?
+**A:**
+
+| API | 谁创建 View | 谁释放 |
+|---|---|---|
+| `attachRenderView()` / `attachPreview()` | SDK 创建 | SDK 在 `release()` 时自动释放 |
+| `setRenderView()` / `setPreviewView()` | 调用方创建并传入 | 调用方负责释放,SDK 只做绑定/解绑 |
+| `setRenderSurfaceTexture()` | 调用方传入 SurfaceTexture | 调用方负责 SurfaceTexture 生命周期 |
+
+### Q6:如何接入代理/加速服务(如洋葱盾)?
 **A:**
 SDK 本身不集成任何第三方代理 SDK。业务方需在 SDK 外部完成代理初始化与地址获取,然后通过 `SellyCloudManager.setProxyAddress(proxyUrl)` 注入。SDK 内部会自动通过代理地址解析真实服务器 IP。
diff --git a/docs/SellySDK_音视频通话接入文档_Android.md b/docs/SellySDK_音视频通话接入文档_Android.md
index db6f892..7141092 100644
--- a/docs/SellySDK_音视频通话接入文档_Android.md
+++ b/docs/SellySDK_音视频通话接入文档_Android.md
@@ -4,6 +4,12 @@
 SDK 核心以 `InteractiveRtcEngine` 为中心,通过 `InteractiveRtcEngineEventHandler` 回调通话状态、用户事件、音视频状态及异常。
 
+当前版本的互动渲染模型已经从“仅 `SurfaceViewRenderer`”扩展为“`RtcRenderTarget` 抽象 + 多种后端实现”:
+
+- `SurfaceViewRenderer` 旧路径仍可用
+- `TextureView` 已可用于本地/远端视频渲染
+- 推荐在 **加入频道前** 选定本地渲染后端
+
 ---
 
 ## 目录
@@ -105,16 +111,60 @@ val rtcEngine = InteractiveRtcEngine.create(
 ### 4. 设置本地/远端画布
 
+推荐使用 `InteractiveVideoCanvas(renderTarget, userId)` 新接口。
+
+#### 4.1 SurfaceViewRenderer 旧路径
+
 ```kotlin
 val localRenderer = SurfaceViewRenderer(this)
-rtcEngine.setupLocalVideo(InteractiveVideoCanvas(localRenderer, userId))
+val localCanvas = InteractiveVideoCanvas(
+    com.sellycloud.sellycloudsdk.render.SurfaceViewRtcTarget(localRenderer),
+    userId
+)
+rtcEngine.setupLocalVideo(localCanvas)
 ```
 
 ```kotlin
 val remoteRenderer = SurfaceViewRenderer(this)
-rtcEngine.setupRemoteVideo(InteractiveVideoCanvas(remoteRenderer, remoteUserId))
+val remoteCanvas = InteractiveVideoCanvas(
+    com.sellycloud.sellycloudsdk.render.SurfaceViewRtcTarget(remoteRenderer),
+    remoteUserId
+)
+rtcEngine.setupRemoteVideo(remoteCanvas)
 ```
 
+#### 4.2 TextureView 路径
+
+```kotlin
+val localTextureView = com.sellycloud.sellycloudsdk.widget.AspectRatioTextureView(this)
+val localCanvas = InteractiveVideoCanvas(
+    com.sellycloud.sellycloudsdk.render.TextureViewRtcTarget(localTextureView),
+    userId
+)
+rtcEngine.setupLocalVideo(localCanvas)
+```
+
+```kotlin
+val remoteTextureView = com.sellycloud.sellycloudsdk.widget.AspectRatioTextureView(this)
+val remoteCanvas = InteractiveVideoCanvas(
+    com.sellycloud.sellycloudsdk.render.TextureViewRtcTarget(remoteTextureView),
+    remoteUserId
+)
+rtcEngine.setupRemoteVideo(remoteCanvas)
+```
+
+兼容说明:
+
+- `InteractiveVideoCanvas(view: SurfaceViewRenderer, userId)` 旧构造仍可用(deprecated)
+- 推荐新接入统一走 `RtcRenderTarget`
+- 当前高层互动 API 还没有直接暴露 `SurfaceTexture` 入口;Android 场景推荐 `SurfaceViewRenderer` 或 `TextureView`
+
+所有权说明:
+
+- 调用方自己创建的 `SurfaceViewRenderer` / `TextureView`,由调用方负责释放
+- SDK 只在 `setupLocalVideo` / `setupRemoteVideo` 中绑定 target,在 `leaveChannel` 时解绑
+- 调用方应在 `leaveChannel` 之后、Activity 销毁前释放自己创建的 View
+
 ### 5. 加入通话
 
 ```kotlin
@@ -174,9 +224,9 @@ val options = InteractiveChannelMediaOptions(
 3. 创建 `InteractiveRtcEngine`
 4. 设置 `EventHandler`
 5. 配置 `InteractiveVideoEncoderConfig`
-6. 设置本地画布 `setupLocalVideo`
+6. 设置本地画布 `setupLocalVideo`(建议在 `joinChannel` 前完成,并在此阶段确定 backend)
 7. `joinChannel` 加入频道
-8. `onUserJoined` 后设置远端画布
+8. `onUserJoined` 后设置远端画布;也可以提前为某个 `userId` 调用 `setupRemoteVideo`,SDK 会在用户真正上线后自动 attach
 9. 通话中进行音视频控制
 10. `leaveChannel` 并释放资源
@@ -268,21 +318,45 @@ val isSharing = rtcEngine.isScreenSharing()
 ## 视频帧前后处理
 
 ```kotlin
-rtcEngine.setCaptureVideoFrameInterceptor { frame ->
-    // 在此处理美颜/滤镜,返回新的 frame
-    frame
-}
+rtcEngine.setCaptureVideoProcessor(object : VideoProcessor {
+    override val config = VideoProcessorConfig(
+        preferredFormat = VideoProcessFormat.TEXTURE_2D,
+        mode = VideoProcessMode.READ_WRITE,
+        fullRewrite = true
+    )
+
+    override fun processTexture(input: VideoTextureFrame, outputTextureId: Int) {
+        // 推荐在 GPU texture 上处理采集前帧,美颜/滤镜直接写入 outputTextureId
+    }
+})
 ```
 
 ```kotlin
-rtcEngine.setRenderVideoFrameInterceptor { frame, userId ->
-    // 远端渲染前处理,返回 true 表示继续渲染
-    true
-}
+val renderObserver = rtcEngine.addRenderVideoFrameObserver(object : VideoFrameObserver {
+    override val config = VideoFrameObserverConfig(
+        preferredFormat = VideoProcessFormat.TEXTURE_2D,
+        stage = VideoStage.RENDER_PRE_DISPLAY
+    )
+
+    override fun onTextureFrame(frame: VideoTextureFrame) {
+        // 远端渲染前只读观测
+        val userId = frame.sourceId
+    }
+})
 ```
 
-> Demo 中的美颜示例见:
-> `example/src/main/java/com/demo/SellyCloudSDK/beauty/FuVideoFrameInterceptor.kt`
+> 推荐优先使用 `TEXTURE_2D`:
+> - `TEXTURE_2D` 适合美颜、滤镜、AR、水印等 GPU 处理链路。
+> - `I420` / `RGBA` 仅在算法必须访问 CPU 像素时再使用。
+> - 对 RTC / WHIP 的 texture-backed 帧,走 CPU observer / processor 会触发额外的 texture-to-CPU 转换。
+> - `VideoFrameObserverConfig` 默认仍为 `I420` 以兼容旧接入;新 RTC / WHIP 接入建议显式写 `preferredFormat = TEXTURE_2D`。
+> - 完整重写输出的处理器建议设置 `fullRewrite = true`;水印/叠加类处理保留默认值即可。
+>
+> Demo 中的采集前美颜示例见:
+> `example/src/main/java/com/demo/SellyCloudSDK/beauty/FaceUnityBeautyEngine.kt`
+>
+> 当前 Demo 的互动页接入见:
+> `example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveLiveActivity.kt`
 
 ---
@@ -421,6 +495,7 @@ SDK 初始化与代理:
 本地与远端控制:
 
 - `setupLocalVideo(canvas)` / `setupRemoteVideo(canvas)`:设置画布
+- `InteractiveVideoCanvas(renderTarget, userId, renderMode)`:推荐画布模型
 - `clearRemoteVideo(userId)`:清理远端画面
 - `enableLocalVideo(true/false)` / `enableLocalAudio(true/false)`:开关本地音视频
 - `muteRemoteAudioStream(userId, true/false)` / `muteRemoteVideoStream(userId, true/false)`:按用户静音
@@ -429,8 +504,9 @@ SDK 初始化与代理:
 帧处理与屏幕共享:
 
-- `setCaptureVideoFrameInterceptor(...)`:采集前帧处理
-- `setRenderVideoFrameInterceptor(...)`:渲染前帧处理
+- `setCaptureVideoProcessor(...)`:采集前可写处理
+- `addCaptureVideoFrameObserver(...)`:采集前只读观测
+- `addRenderVideoFrameObserver(...)`:远端渲染前只读观测
 - `startScreenShare(...)` / `stopScreenShare()` / `isScreenSharing()`:屏幕共享
 
 消息与 Token:
@@ -447,6 +523,20 @@ SDK 初始化与代理:
 2. 是否在 `onUserJoined` 后调用 `setupRemoteVideo`
 3. 远端是否关闭了视频
 
+### Q:互动直播可以用 `TextureView` 吗?
+可以。
+
+推荐用法是:
+
+- 本地:`InteractiveVideoCanvas(TextureViewRtcTarget(textureView), userId)`
+- 远端:`InteractiveVideoCanvas(TextureViewRtcTarget(textureView), remoteUserId)`
+
+注意:
+
+- 建议在 `joinChannel` 前确定本地 backend
+- 当前 Demo 在首页设置中统一选择本地 backend,进入互动页面后不再暴露切换入口
+- 高层互动 API 当前未直接暴露 `SurfaceTexture` 入口
+
 ### Q:加入频道失败?
 1. 检查 `signaling_app_id` 是否正确
 2. Token 是否为空或已过期
@@ -457,5 +547,11 @@
 1. 是否已获取 `MediaProjection` 授权
 2. Android 14+ 是否启动前台服务
 
+### Q:互动通话支持 XOR 吗?
+当前高层互动 API 还没有暴露 `xorKeyHex` 一类的配置入口。
+
+- 目前已支持 XOR 的 WebRTC 路径,是直播 RTC 的 `WHIP / WHEP` 推拉流
+- 互动通话如需接入 XOR,需要后续在互动链路单独暴露配置并挂载 FrameCrypto
+
 ### Q:如何接入代理/加速服务?
 SDK 本身不集成任何第三方代理 SDK。业务方需在外部完成代理初始化,获取本地代理地址后,通过 `SellyCloudManager.setProxyAddress()` 注入。详见「代理地址配置」章节。
diff --git a/example/build.gradle b/example/build.gradle
index 8bc321a..03a9fff 100644
--- a/example/build.gradle
+++ b/example/build.gradle
@@ -3,7 +3,7 @@ plugins {
     id 'org.jetbrains.kotlin.android'
 }
 
-def sdkAarPath = "libs/${findProperty("sellySdkArtifactId") ?: "sellycloudsdk"}-${findProperty("sellySdkVersion") ?: "1.0.0"}.aar"
+def sdkAarPath = "libs/${findProperty("sellySdkArtifactId") ?: "sellycloudsdk"}-${findProperty("sellySdkVersion") ?: "1.0.1"}.aar"
 def releaseStorePath = project.rootProject.file(findProperty("MY_STORE_FILE") ?: "release.keystore")
 def hasReleaseKeystore = releaseStorePath.exists()
@@ -64,11 +64,11 @@ android {
 }
 
 dependencies {
+    implementation files(sdkAarPath)
     implementation files(
-        sdkAarPath,
+        "libs/Kiwi.aar",
         "libs/fu_core_all_feature_release.aar",
-        "libs/fu_model_all_feature_release.aar",
-        "libs/Kiwi.aar"
+        "libs/fu_model_all_feature_release.aar"
     )
     implementation fileTree(dir: "libs", include: ["*.jar"])
     implementation 'androidx.appcompat:appcompat:1.7.0-alpha03'
diff --git a/example/libs/sellycloudsdk-1.0.0.aar b/example/libs/sellycloudsdk-1.0.1.aar
similarity index 80%
rename from example/libs/sellycloudsdk-1.0.0.aar
rename to example/libs/sellycloudsdk-1.0.1.aar
index f72d939..a99eebd 100644
Binary files a/example/libs/sellycloudsdk-1.0.0.aar and b/example/libs/sellycloudsdk-1.0.1.aar differ
diff --git a/example/src/main/java/com/demo/SellyCloudSDK/FeatureHubActivity.kt b/example/src/main/java/com/demo/SellyCloudSDK/FeatureHubActivity.kt
index 9ce0eb5..f85f210 100644
--- a/example/src/main/java/com/demo/SellyCloudSDK/FeatureHubActivity.kt
+++ b/example/src/main/java/com/demo/SellyCloudSDK/FeatureHubActivity.kt
@@ -642,6 +642,12 @@ class FeatureHubActivity : AppCompatActivity() {
             AvDemoSettings.Resolution.P540 -> binding.rgSettingsResolution.check(R.id.rbSettingsRes540p)
             AvDemoSettings.Resolution.P720 -> binding.rgSettingsResolution.check(R.id.rbSettingsRes720p)
         }
+        binding.rgSettingsRenderBackend.check(
+            when (settings.renderBackendPreference) {
+                AvDemoSettings.RenderBackendPreference.SURFACE_VIEW -> R.id.rbSettingsRenderSurface
+                AvDemoSettings.RenderBackendPreference.TEXTURE_VIEW -> R.id.rbSettingsRenderTexture
+            }
+        )
         restoreEnvSettingsToUi()
     }
@@ -681,13 +687,18 @@ class FeatureHubActivity : AppCompatActivity() {
             R.id.rbSettingsRes540p -> AvDemoSettings.Resolution.P540
             else -> AvDemoSettings.Resolution.P720
         }
+        val renderBackendPreference = when (binding.rgSettingsRenderBackend.checkedRadioButtonId) {
+            R.id.rbSettingsRenderTexture -> AvDemoSettings.RenderBackendPreference.TEXTURE_VIEW
+            else -> AvDemoSettings.RenderBackendPreference.SURFACE_VIEW
+        }
         val current = settingsStore.read()
         return current.copy(
             streamId = streamId,
             resolution = res,
             fps = fps,
             maxBitrateKbps = maxKbps,
-            minBitrateKbps = minKbps
+            minBitrateKbps = minKbps,
+            renderBackendPreference = renderBackendPreference
         )
     }
diff --git a/example/src/main/java/com/demo/SellyCloudSDK/avdemo/AvDemoSettingsStore.kt b/example/src/main/java/com/demo/SellyCloudSDK/avdemo/AvDemoSettingsStore.kt
index fba3bc0..a8a7f3e 100644
--- a/example/src/main/java/com/demo/SellyCloudSDK/avdemo/AvDemoSettingsStore.kt
+++ b/example/src/main/java/com/demo/SellyCloudSDK/avdemo/AvDemoSettingsStore.kt
@@ -2,6 +2,7 @@ package com.demo.SellyCloudSDK.avdemo
 
 import android.content.Context
 import androidx.core.content.edit
+import com.sellycloud.sellycloudsdk.render.RenderBackend
 
 data class AvDemoSettings(
     val streamId: String,
@@ -12,9 +13,20 @@ data class AvDemoSettings(
     val xorKeyHex: String = "",
     val useUrlMode: Boolean = false,
     val pushUrl: String = "",
+    val renderBackendPreference: RenderBackendPreference = RenderBackendPreference.SURFACE_VIEW,
 ) {
     enum class Resolution { P360, P480, P540, P720 }
 
+    enum class RenderBackendPreference {
+        SURFACE_VIEW,
+        TEXTURE_VIEW;
+
+        fun isTextureView(): Boolean = this == TEXTURE_VIEW
+
+        fun toSdkBackend(): RenderBackend =
+            if (this == TEXTURE_VIEW) RenderBackend.TEXTURE_VIEW else RenderBackend.SURFACE_VIEW
+    }
+
     fun resolutionSize(): Pair<Int, Int> = when (resolution) {
         Resolution.P360 -> 640 to 360
         Resolution.P480 -> 854 to 480
@@ -34,6 +46,13 @@ class AvDemoSettingsStore(context: Context) {
             AvDemoSettings.Resolution.P540.name -> AvDemoSettings.Resolution.P540
             else -> AvDemoSettings.Resolution.P720
         }
+        val renderBackendPreference = when (
+            prefs.getString(KEY_RENDER_BACKEND, AvDemoSettings.RenderBackendPreference.SURFACE_VIEW.name)
+        ) {
+            AvDemoSettings.RenderBackendPreference.TEXTURE_VIEW.name ->
+                AvDemoSettings.RenderBackendPreference.TEXTURE_VIEW
+            else -> AvDemoSettings.RenderBackendPreference.SURFACE_VIEW
+        }
         return AvDemoSettings(
             streamId = prefs.getString(KEY_STREAM_ID, DEFAULT_STREAM_ID).orEmpty(),
             resolution = resolution,
@@ -42,7 +61,8 @@ class AvDemoSettingsStore(context: Context) {
             minBitrateKbps = prefs.getInt(KEY_MIN_KBPS, DEFAULT_MIN_KBPS),
             xorKeyHex = prefs.getString(KEY_XOR_KEY_HEX, "").orEmpty(),
             useUrlMode = prefs.getBoolean(KEY_USE_URL_MODE, false),
-            pushUrl = prefs.getString(KEY_PUSH_URL, "").orEmpty()
+            pushUrl = prefs.getString(KEY_PUSH_URL, "").orEmpty(),
+            renderBackendPreference = renderBackendPreference
         )
     }
@@ -56,6 +76,7 @@ class AvDemoSettingsStore(context: Context) {
             putString(KEY_XOR_KEY_HEX, settings.xorKeyHex)
             putBoolean(KEY_USE_URL_MODE, settings.useUrlMode)
             putString(KEY_PUSH_URL, settings.pushUrl)
+            putString(KEY_RENDER_BACKEND, settings.renderBackendPreference.name)
         }
     }
@@ -74,5 +95,6 @@ class AvDemoSettingsStore(context: Context) {
         private const val KEY_XOR_KEY_HEX = "xor_key_hex"
         private const val KEY_USE_URL_MODE = "use_url_mode"
        private const val KEY_PUSH_URL = "push_url"
+        private const val KEY_RENDER_BACKEND = "render_backend"
     }
 }
diff --git a/example/src/main/java/com/demo/SellyCloudSDK/beauty/BeautyControlDialog.kt b/example/src/main/java/com/demo/SellyCloudSDK/beauty/BeautyControlDialog.kt
deleted file mode 100644
index 8db7f4f..0000000
--- a/example/src/main/java/com/demo/SellyCloudSDK/beauty/BeautyControlDialog.kt
+++ /dev/null
@@ -1,185 +0,0 @@
-package com.demo.SellyCloudSDK.beauty
-//
-//import android.app.Dialog
-//import android.content.Context
-//import android.os.Bundle
-//import android.widget.SeekBar
-//import android.widget.TextView
-//import android.widget.Switch
-//import android.widget.Button
-//import android.view.Window
-//
-///**
-// * 美颜参数控制对话框
-// */
-//class BeautyControlDialog(
-//    context: Context,
-//) : Dialog(context) {
-//
-//    private lateinit var switchBeautyEnable: Switch
-//    private lateinit var seekBarBeautyIntensity: SeekBar
-//    private lateinit var seekBarFilterIntensity: SeekBar
-//    private lateinit var seekBarColorIntensity: SeekBar
-//    private lateinit var seekBarRedIntensity: SeekBar
-//    private lateinit var seekBarEyeBrightIntensity: SeekBar
-//    private lateinit var seekBarToothIntensity: SeekBar
-//
-//    private lateinit var tvBeautyValue: TextView
-//    private lateinit var tvFilterValue: TextView
-//    private lateinit var tvColorValue: TextView
-//    private lateinit var tvRedValue: TextView
-//    private lateinit var tvEyeBrightValue: TextView
-//    private lateinit var tvToothValue: TextView
-//    private lateinit var btnClose: Button
-//
-//    override fun onCreate(savedInstanceState: Bundle?) {
-//        super.onCreate(savedInstanceState)
-//        requestWindowFeature(Window.FEATURE_NO_TITLE)
-//        setContentView(R.layout.dialog_beauty_control)
-//
-//        initViews()
-//        setupListeners()
-//        updateUI()
-//    }
-//
-//    private fun initViews() {
-//        switchBeautyEnable = findViewById(R.id.switchBeautyEnable)
-//        seekBarBeautyIntensity = findViewById(R.id.seekBarBeautyIntensity)
-//        seekBarFilterIntensity = findViewById(R.id.seekBarFilterIntensity)
-//        seekBarColorIntensity = findViewById(R.id.seekBarColorIntensity)
-//        seekBarRedIntensity = findViewById(R.id.seekBarRedIntensity)
-//        seekBarEyeBrightIntensity = findViewById(R.id.seekBarEyeBrightIntensity)
-//        seekBarToothIntensity = findViewById(R.id.seekBarToothIntensity)
-//
-//        tvBeautyValue = findViewById(R.id.tvBeautyValue)
-//        tvFilterValue = findViewById(R.id.tvFilterValue)
-//        tvColorValue = findViewById(R.id.tvColorValue)
-//        tvRedValue = findViewById(R.id.tvRedValue)
-//        tvEyeBrightValue = findViewById(R.id.tvEyeBrightValue)
-//        tvToothValue = findViewById(R.id.tvToothValue)
-//        btnClose = findViewById(R.id.btnClose)
-//    }
-//
-//    private fun setupListeners() {
-//        // 美颜开关
-//        switchBeautyEnable.setOnCheckedChangeListener { _, isChecked ->
-//            streamingService?.enableBeauty(isChecked)
-//            // 根据开关状态启用/禁用参数调节
-//            updateSeekBarsEnabled(isChecked)
-//        }
-//
-//        // 美颜强度调节 (0-100, 转换为0.0-10.0)
-//        seekBarBeautyIntensity.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
-//            override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
-//                val intensity = progress / 10.0
-//                tvBeautyValue.text = String.format("%.1f", intensity)
-//                streamingService?.setBeautyIntensity(intensity)
-//            }
-//            override fun onStartTrackingTouch(seekBar: SeekBar?) {}
-//            override fun onStopTrackingTouch(seekBar: SeekBar?) {}
-//        })
-//
-//        // 滤镜强度调节 (0-10, 转换为0.0-1.0)
-//        seekBarFilterIntensity.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
-//            override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
-//                val intensity = progress / 10.0
-//                tvFilterValue.text = String.format("%.1f", intensity)
-//                streamingService?.setFilterIntensity(intensity)
-//            }
-//            override fun onStartTrackingTouch(seekBar: SeekBar?) {}
-//            override fun onStopTrackingTouch(seekBar: SeekBar?) {}
-//        })
-//
-//        // 美白强度调节
-//        seekBarColorIntensity.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
-//            override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
-//                val intensity = progress / 10.0
-//                tvColorValue.text = String.format("%.1f", intensity)
-//                streamingService?.setColorIntensity(intensity)
-//            }
-//            override fun onStartTrackingTouch(seekBar: SeekBar?) {}
-//            override fun onStopTrackingTouch(seekBar: SeekBar?) {}
-//        })
-//
-//        // 红润强度调节
-//        seekBarRedIntensity.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
-//            override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
-//                val intensity = progress / 10.0
-//                tvRedValue.text = String.format("%.1f", intensity)
-//                streamingService?.setRedIntensity(intensity)
-//            }
-//            override fun onStartTrackingTouch(seekBar: SeekBar?) {}
-//            override fun onStopTrackingTouch(seekBar: SeekBar?) {}
-//        })
-//
-//        // 亮眼强度调节
-//        seekBarEyeBrightIntensity.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
-//            override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
-//                val intensity = progress / 10.0
-//                tvEyeBrightValue.text = String.format("%.1f", intensity)
-//                streamingService?.setEyeBrightIntensity(intensity)
-//            }
-//            override fun onStartTrackingTouch(seekBar: SeekBar?) {}
-//            override fun onStopTrackingTouch(seekBar: SeekBar?) {}
-//        })
-//
-//        // 美牙强度调节
-//        seekBarToothIntensity.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
-//            override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
-//                val intensity = progress / 10.0
-//                tvToothValue.text = String.format("%.1f", intensity)
-//                streamingService?.setToothIntensity(intensity)
-//            }
-//            override fun onStartTrackingTouch(seekBar: SeekBar?) {}
-//            override fun onStopTrackingTouch(seekBar: SeekBar?) {}
-//        })
-//
-//        // 关闭按钮
-//        btnClose.setOnClickListener {
-//            dismiss()
-//        }
-//    }
-//
-//    private fun updateUI() {
-//        // 获取当前美颜状态并更新UI
-//        val isBeautyEnabled = streamingService?.isBeautyEnabled() ?: true
-//        switchBeautyEnable.isChecked = isBeautyEnabled
-//
-//        // 获取当前美颜参数
-//        val params = streamingService?.getCurrentBeautyParams() ?: mapOf()
-//
-//        // 设置各项参数的当前值
-//        val blurIntensity = params["blurIntensity"] as? Double ?: 6.0
-//        val filterIntensity = params["filterIntensity"] as? Double ?: 0.7
-//        val colorIntensity = params["colorIntensity"] as? Double ?: 0.5
-//        val redIntensity = params["redIntensity"] as? Double ?: 0.5
-//        val eyeBrightIntensity = params["eyeBrightIntensity"] as? Double ?: 1.0
-//        val toothIntensity = params["toothIntensity"] as? Double ?: 1.0
-//
-//        seekBarBeautyIntensity.progress = (blurIntensity * 10).toInt()
-//        seekBarFilterIntensity.progress = (filterIntensity * 10).toInt()
-//        seekBarColorIntensity.progress = (colorIntensity * 10).toInt()
-//        seekBarRedIntensity.progress = (redIntensity * 10).toInt()
-//        seekBarEyeBrightIntensity.progress = (eyeBrightIntensity * 10).toInt()
-//        seekBarToothIntensity.progress = (toothIntensity * 10).toInt()
-//
-//        tvBeautyValue.text = String.format("%.1f", blurIntensity)
-//        tvFilterValue.text = String.format("%.1f", filterIntensity)
-//        tvColorValue.text = String.format("%.1f", colorIntensity)
-//        tvRedValue.text = String.format("%.1f", redIntensity)
-//        tvEyeBrightValue.text = String.format("%.1f", eyeBrightIntensity)
-//        tvToothValue.text = String.format("%.1f", toothIntensity)
-//
-//        // 根据开关状态启用/禁用参数调节
-//        updateSeekBarsEnabled(isBeautyEnabled)
-//    }
-//
-//    private fun updateSeekBarsEnabled(enabled: Boolean) {
-//        seekBarBeautyIntensity.isEnabled = enabled
-//        seekBarFilterIntensity.isEnabled = enabled
-//        seekBarColorIntensity.isEnabled = enabled
-//        seekBarRedIntensity.isEnabled = enabled
-//        seekBarEyeBrightIntensity.isEnabled = enabled
-//        seekBarToothIntensity.isEnabled = enabled
-//    }
-//}
diff --git a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FUBeautyFilterRender.kt b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FUBeautyFilterRender.kt
deleted file mode 100644
index 4940a36..0000000
--- a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FUBeautyFilterRender.kt
+++ /dev/null
@@ -1,264 +0,0 @@
-package com.demo.SellyCloudSDK.beauty
-
-import android.content.Context
-import android.opengl.GLES20
-import android.opengl.Matrix
-import android.util.Log
-import com.demo.SellyCloudSDK.R
-import com.pedro.encoder.input.gl.render.filters.BaseFilterRender
-import com.pedro.encoder.utils.gl.GlUtil
-import java.nio.ByteBuffer
-import java.nio.ByteOrder
-
-/**
- * FaceUnity beauty filter that plugs into RootEncoder's GL filter chain.
- * 优化后台兼容性,避免依赖Activity上下文
- */
-class FUBeautyFilterRender(
-    private val fuRenderer: FURenderer
-) : BaseFilterRender() {
-
-    private val TAG = "FUBeautyFilterRender"
-
-    // 美颜开关状态
-    private var isBeautyEnabled = true
-
-    // 添加摄像头朝向跟踪
-    private var currentCameraFacing: com.pedro.encoder.input.video.CameraHelper.Facing =
-        com.pedro.encoder.input.video.CameraHelper.Facing.BACK
-
-    // Standard vertex data following pedro's pattern (X, Y, Z, U, V)
-    private val squareVertexDataFilter = floatArrayOf(
-        // X, Y, Z, U, V
-        -1f, -1f, 0f, 0f, 0f, // bottom left
-        1f, -1f, 0f, 1f, 0f, // bottom right
-        -1f, 1f, 0f, 0f, 1f, // top left
-        1f, 1f, 0f, 1f, 1f // top right
-    )
-
-    private var frameW = 0
-    private var frameH = 0
-    private lateinit var appContext: Context
-
-    // GLSL program and handles
-    private var program = -1
-    private var aPositionHandle = -1
-    private var aTextureHandle = -1
-    private var uMVPMatrixHandle = -1
-    private var uSTMatrixHandle = -1
-    private var uSamplerHandle = -1
-
-    // 添加初始化状态检查
-    private var isInitialized = false
-
-    init {
-        squareVertex = ByteBuffer.allocateDirect(squareVertexDataFilter.size * FLOAT_SIZE_BYTES)
-            .order(ByteOrder.nativeOrder())
-            .asFloatBuffer()
-        squareVertex.put(squareVertexDataFilter).position(0)
-        Matrix.setIdentityM(MVPMatrix, 0)
-        Matrix.setIdentityM(STMatrix, 0)
-    }
-
-    override fun initGl(
-        width: Int,
-        height: Int,
-        context: Context,
-        previewWidth: Int,
-        previewHeight: Int
-    ) {
-        // GL 上下文可能重建:确保滤镜和 FaceUnity 资源重新初始化
-        isInitialized = false
-        program = -1
-        // 先保存 ApplicationContext,避免 super.initGl 内部触发 initGlFilter 时为空
-        this.appContext = context.applicationContext
-        super.initGl(width, height, context, previewWidth, previewHeight)
-        // 确保使用 ApplicationContext,避免Activity依赖
-        frameW = width
-        frameH = height
-        // 刷新 FaceUnity GL 资源绑定到新的上下文
-        fuRenderer.reinitializeGlContextBlocking()
-        Log.d(TAG, "initGl: width=$width, height=$height, context=${context.javaClass.simpleName}")
-    }
-
-    override fun
initGlFilter(context: Context?) { - if (isInitialized) { - Log.d(TAG, "Filter already initialized. Skipping initGlFilter.") - return - } - try { - // 使用 ApplicationContext 避免Activity依赖 - val safeContext = context?.applicationContext ?: appContext - - val vertexShader = GlUtil.getStringFromRaw(safeContext, R.raw.simple_vertex) - val fragmentShader = GlUtil.getStringFromRaw(safeContext, R.raw.fu_base_fragment) - - program = GlUtil.createProgram(vertexShader, fragmentShader) - aPositionHandle = GLES20.glGetAttribLocation(program, "aPosition") - aTextureHandle = GLES20.glGetAttribLocation(program, "aTextureCoord") - uMVPMatrixHandle = GLES20.glGetUniformLocation(program, "uMVPMatrix") - uSTMatrixHandle = GLES20.glGetUniformLocation(program, "uSTMatrix") - uSamplerHandle = GLES20.glGetUniformLocation(program, "uSampler") - - isInitialized = true - Log.d(TAG, "initGlFilter completed - program: $program") - } catch (e: Exception) { - Log.e(TAG, "initGlFilter failed", e) - isInitialized = false - } - } - - /** - * 设置摄像头朝向(供外部调用) - */ - fun setCameraFacing(facing: com.pedro.encoder.input.video.CameraHelper.Facing) { - currentCameraFacing = facing - fuRenderer.setCameraFacing(facing) - Log.d(TAG, "Camera facing updated: $facing") - } - - /** - * Core render step called by BaseFilterRender every frame. 
- */ - override fun drawFilter() { - // 增加初始化检查 - if (!isInitialized) { - Log.w(TAG, "Filter not initialized, skipping draw") - return - } - - // 如果美颜被禁用,使用简单的纹理透传渲染 - if (!isBeautyEnabled) { - drawPassThrough() - return - } - - if (!fuRenderer.isAuthSuccess || fuRenderer.fuRenderKit == null) { - // Fallback: 使用透传渲染而不是直接return - drawPassThrough() - return - } - - if (previousTexId <= 0 || frameW <= 0 || frameH <= 0) { - return - } - - try { - // 保存当前 FBO 与 viewport,避免外部库改写 - val prevFbo = IntArray(1) - val prevViewport = IntArray(4) - GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, prevFbo, 0) - GLES20.glGetIntegerv(GLES20.GL_VIEWPORT, prevViewport, 0) - - // 使用带朝向的渲染方法 - val processedTexId = fuRenderer.onDrawFrame(previousTexId, frameW, frameH, currentCameraFacing) - - // 还原 FBO 与 viewport,避免黑屏 - GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, prevFbo[0]) - GLES20.glViewport(prevViewport[0], prevViewport[1], prevViewport[2], prevViewport[3]) - - // Use processed texture if available, otherwise fallback to original - val textureIdToDraw = if (processedTexId > 0) processedTexId else previousTexId - - // Now draw using our own shader program - GLES20.glUseProgram(program) - - // Set vertex position - squareVertex.position(SQUARE_VERTEX_DATA_POS_OFFSET) - GLES20.glVertexAttribPointer(aPositionHandle, 3, GLES20.GL_FLOAT, false, - SQUARE_VERTEX_DATA_STRIDE_BYTES, squareVertex) - GLES20.glEnableVertexAttribArray(aPositionHandle) - - // Set texture coordinates - squareVertex.position(SQUARE_VERTEX_DATA_UV_OFFSET) - GLES20.glVertexAttribPointer(aTextureHandle, 2, GLES20.GL_FLOAT, false, - SQUARE_VERTEX_DATA_STRIDE_BYTES, squareVertex) - GLES20.glEnableVertexAttribArray(aTextureHandle) - - // Set transformation matrices - GLES20.glUniformMatrix4fv(uMVPMatrixHandle, 1, false, MVPMatrix, 0) - GLES20.glUniformMatrix4fv(uSTMatrixHandle, 1, false, STMatrix, 0) - - // Bind texture and draw - GLES20.glUniform1i(uSamplerHandle, 0) - GLES20.glActiveTexture(GLES20.GL_TEXTURE0) - 
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIdToDraw) - - // Draw the rectangle - GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4) - - } catch (e: Exception) { - Log.e(TAG, "Error in beauty processing", e) - // Fallback: 使用透传渲染 - drawPassThrough() - } - } - - /** - * 透传渲染:直接渲染原始纹理,不进行美颜处理 - */ - private fun drawPassThrough() { - if (previousTexId <= 0 || !isInitialized) { - return - } - - try { - // 使用原始纹理进行渲染 - GLES20.glUseProgram(program) - - // Set vertex position - squareVertex.position(SQUARE_VERTEX_DATA_POS_OFFSET) - GLES20.glVertexAttribPointer(aPositionHandle, 3, GLES20.GL_FLOAT, false, - SQUARE_VERTEX_DATA_STRIDE_BYTES, squareVertex) - GLES20.glEnableVertexAttribArray(aPositionHandle) - - // Set texture coordinates - squareVertex.position(SQUARE_VERTEX_DATA_UV_OFFSET) - GLES20.glVertexAttribPointer(aTextureHandle, 2, GLES20.GL_FLOAT, false, - SQUARE_VERTEX_DATA_STRIDE_BYTES, squareVertex) - GLES20.glEnableVertexAttribArray(aTextureHandle) - - // Set transformation matrices - GLES20.glUniformMatrix4fv(uMVPMatrixHandle, 1, false, MVPMatrix, 0) - GLES20.glUniformMatrix4fv(uSTMatrixHandle, 1, false, STMatrix, 0) - - // Bind original texture and draw - GLES20.glUniform1i(uSamplerHandle, 0) - GLES20.glActiveTexture(GLES20.GL_TEXTURE0) - GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, previousTexId) - - // Draw the rectangle - GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4) - - } catch (e: Exception) { - Log.e(TAG, "Error in pass-through rendering", e) - } - } - - override fun disableResources() { - GlUtil.disableResources(aTextureHandle, aPositionHandle) - } - - override fun release() { - isInitialized = false - if (program != -1) { - GLES20.glDeleteProgram(program) - program = -1 - } - isInitialized = false - Log.d(TAG, "FUBeautyFilterRender released") - } - - /** - * 设置美颜开关状态 - */ - fun setBeautyEnabled(enabled: Boolean) { - isBeautyEnabled = enabled - Log.d(TAG, "Beauty enabled: $enabled") - } - - /** - * 获取美颜开关状态 - */ - fun isBeautyEnabled(): Boolean 
= isBeautyEnabled -} diff --git a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FURenderer.kt b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FURenderer.kt index 1f89ec2..5d0db2b 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FURenderer.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FURenderer.kt @@ -1,6 +1,7 @@ package com.demo.SellyCloudSDK.beauty import android.content.Context +import android.opengl.GLES20 import android.util.Log import com.faceunity.core.callback.OperateCallback import com.faceunity.core.entity.FUBundleData @@ -19,9 +20,10 @@ import com.faceunity.wrapper.faceunity import com.pedro.encoder.input.video.CameraHelper import java.io.File import java.io.IOException -import java.util.concurrent.CountDownLatch +import java.nio.ByteBuffer +import java.nio.ByteOrder +import java.nio.FloatBuffer import java.util.concurrent.Executors -import java.util.concurrent.TimeUnit /** @@ -52,14 +54,18 @@ class FURenderer(private val context: Context) { private val BUNDLE_AI_HUMAN = "model" + File.separator + "ai_human_processor.bundle" private val BUNDLE_FACE_BEAUTY = "graphics" + File.separator + "face_beautification.bundle" - @Volatile - private var workerThreadRef: Thread? = null private val workerThread = Executors.newSingleThreadExecutor { task -> - Thread(task, "FURenderer-Worker").also { workerThreadRef = it } + Thread(task, "FURenderer-Worker") } // 添加摄像头朝向管理 private var currentCameraFacing: CameraHelper.Facing = CameraHelper.Facing.BACK + private var blitProgram = 0 + private var blitFramebuffer = 0 + private var blitPositionLoc = 0 + private var blitTexCoordLoc = 0 + private var blitTextureLoc = 0 + private var blitQuadBuffer: FloatBuffer? 
= null /** * 初始化美颜SDK @@ -80,7 +86,7 @@ class FURenderer(private val context: Context) { // 初始化成功后,在后台线程加载所需资源 workerThread.submit { try { - faceunity.fuSetUseTexAsync(1) + applyTextureOutputMode() // 获取 FURenderKit 实例 fuRenderKit = FURenderKit.getInstance() @@ -142,8 +148,7 @@ class FURenderer(private val context: Context) { // 重新应用美颜参数与道具 if (faceBeauty == null) loadBeautyBundle() fuRenderKit?.faceBeauty = faceBeauty - // 再次开启异步纹理模式(稳妥起见) - try { faceunity.fuSetUseTexAsync(1) } catch (_: Throwable) {} + applyTextureOutputMode() Log.d(TAG, "onGlContextRecreated: done") } catch (e: Exception) { Log.e(TAG, "onGlContextRecreated error", e) @@ -206,6 +211,53 @@ class FURenderer(private val context: Context) { } } + fun renderProcessedTextureToOutput( + inputTex: Int, + outputTextureId: Int, + width: Int, + height: Int, + facing: CameraHelper.Facing + ) { + if (outputTextureId <= 0) return + val renderedTextureId = onDrawFrame(inputTex, width, height, facing) + val sourceTextureId = when { + renderedTextureId == outputTextureId -> return + renderedTextureId > 0 -> renderedTextureId + else -> inputTex + } + ensureBlitResources() + if (blitProgram <= 0 || blitFramebuffer <= 0) return + + val previousFramebuffer = IntArray(1) + val previousViewport = IntArray(4) + GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, previousFramebuffer, 0) + GLES20.glGetIntegerv(GLES20.GL_VIEWPORT, previousViewport, 0) + GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, blitFramebuffer) + GLES20.glFramebufferTexture2D( + GLES20.GL_FRAMEBUFFER, + GLES20.GL_COLOR_ATTACHMENT0, + GLES20.GL_TEXTURE_2D, + outputTextureId, + 0 + ) + GLES20.glViewport(0, 0, width, height) + drawRgbTexture(sourceTextureId) + GLES20.glFramebufferTexture2D( + GLES20.GL_FRAMEBUFFER, + GLES20.GL_COLOR_ATTACHMENT0, + GLES20.GL_TEXTURE_2D, + 0, + 0 + ) + GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, previousFramebuffer[0]) + GLES20.glViewport( + previousViewport[0], + previousViewport[1], + previousViewport[2], + 
previousViewport[3] + ) + } + /** * 加载美颜道具并设置默认参数 */ @@ -235,19 +287,7 @@ class FURenderer(private val context: Context) { if (!isAuthSuccess) return workerThread.execute { - try { - Log.d(TAG, "Releasing GL context resources for protocol switch") - isGlInitialized = false - - // 释放渲染器的 GL 资源 - fuRenderKit?.release() - fuRenderKit = null - - // 注意:不清空 faceBeauty,保留美颜参数配置 - Log.d(TAG, "GL context resources released successfully") - } catch (e: Exception) { - Log.e(TAG, "Error releasing GL context", e) - } + releaseGlContextOnCurrentThread() } } @@ -256,33 +296,36 @@ class FURenderer(private val context: Context) { */ fun reinitializeGlContext() { if (!isAuthSuccess) return - workerThread.execute { doReinitializeGlContext() } + workerThread.execute { reinitializeGlContextOnCurrentThread() } } /** - * 重新初始化 GL 上下文(同步等待完成,用于避免美颜空窗) + * 供 RTC texture processor 使用:必须在当前持有 GL context 的线程上执行。 */ - fun reinitializeGlContextBlocking(timeoutMs: Long = 2000L) { + fun reinitializeGlContextOnCurrentThread() { if (!isAuthSuccess) return - if (Thread.currentThread() === workerThreadRef) { - doReinitializeGlContext() - return - } - val latch = CountDownLatch(1) - workerThread.execute { - try { - doReinitializeGlContext() - } finally { - latch.countDown() - } - } try { - if (!latch.await(timeoutMs, TimeUnit.MILLISECONDS)) { - Log.w(TAG, "GL context reinit timeout: ${timeoutMs}ms") - } - } catch (_: InterruptedException) { - Thread.currentThread().interrupt() - Log.w(TAG, "GL context reinit interrupted") + doReinitializeGlContext() + } catch (e: Exception) { + Log.e(TAG, "Error reinitializing GL context on current thread", e) + isGlInitialized = false + } + } + + /** + * 供 RTC texture processor 使用:必须在当前持有 GL context 的线程上执行。 + */ + fun releaseGlContextOnCurrentThread() { + if (!isAuthSuccess) return + try { + Log.d(TAG, "Releasing GL context resources on current thread") + isGlInitialized = false + releaseBlitResources() + fuRenderKit?.release() + fuRenderKit = null + Log.d(TAG, "GL 
context resources released successfully") + } catch (e: Exception) { + Log.e(TAG, "Error releasing GL context on current thread", e) } } @@ -293,8 +336,7 @@ class FURenderer(private val context: Context) { // 重新获取 FURenderKit 实例(绑定到新的 GL 上下文) fuRenderKit = FURenderKit.getInstance() - // 重新设置异步纹理模式 - faceunity.fuSetUseTexAsync(1) + applyTextureOutputMode() // 如果之前有美颜配置,重新应用 if (faceBeauty != null) { @@ -316,6 +358,9 @@ class FURenderer(private val context: Context) { fun release() { Log.d(TAG, "Releasing FURenderer resources") isGlInitialized = false + try { + releaseBlitResources() + } catch (_: Exception) {} try { fuRenderKit?.release() } catch (_: Exception) {} @@ -327,4 +372,132 @@ class FURenderer(private val context: Context) { workerThread.shutdown() } catch (_: Exception) {} } + + private fun ensureBlitResources() { + if (blitProgram > 0 && blitFramebuffer > 0 && blitQuadBuffer != null) return + blitProgram = createProgram(BLIT_VERTEX_SHADER, BLIT_FRAGMENT_SHADER) + if (blitProgram <= 0) return + blitPositionLoc = GLES20.glGetAttribLocation(blitProgram, "aPosition") + blitTexCoordLoc = GLES20.glGetAttribLocation(blitProgram, "aTextureCoord") + blitTextureLoc = GLES20.glGetUniformLocation(blitProgram, "uTexture") + if (blitQuadBuffer == null) { + blitQuadBuffer = ByteBuffer.allocateDirect(BLIT_QUAD.size * 4) + .order(ByteOrder.nativeOrder()) + .asFloatBuffer() + .put(BLIT_QUAD) + .also { it.position(0) } + } + if (blitFramebuffer <= 0) { + val framebuffers = IntArray(1) + GLES20.glGenFramebuffers(1, framebuffers, 0) + blitFramebuffer = framebuffers[0] + } + } + + private fun drawRgbTexture(textureId: Int) { + val quad = blitQuadBuffer ?: return + GLES20.glUseProgram(blitProgram) + quad.position(0) + GLES20.glVertexAttribPointer(blitPositionLoc, 2, GLES20.GL_FLOAT, false, 16, quad) + GLES20.glEnableVertexAttribArray(blitPositionLoc) + quad.position(2) + GLES20.glVertexAttribPointer(blitTexCoordLoc, 2, GLES20.GL_FLOAT, false, 16, quad) + 
GLES20.glEnableVertexAttribArray(blitTexCoordLoc) + GLES20.glActiveTexture(GLES20.GL_TEXTURE0) + GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId) + GLES20.glUniform1i(blitTextureLoc, 0) + GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4) + GLES20.glDisableVertexAttribArray(blitPositionLoc) + GLES20.glDisableVertexAttribArray(blitTexCoordLoc) + GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0) + GLES20.glUseProgram(0) + } + + private fun releaseBlitResources() { + if (blitProgram > 0) { + GLES20.glDeleteProgram(blitProgram) + blitProgram = 0 + } + if (blitFramebuffer > 0) { + GLES20.glDeleteFramebuffers(1, intArrayOf(blitFramebuffer), 0) + blitFramebuffer = 0 + } + blitQuadBuffer = null + } + + private fun createProgram(vertexSource: String, fragmentSource: String): Int { + val vertexShader = compileShader(GLES20.GL_VERTEX_SHADER, vertexSource) + val fragmentShader = compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource) + if (vertexShader <= 0 || fragmentShader <= 0) { + if (vertexShader > 0) GLES20.glDeleteShader(vertexShader) + if (fragmentShader > 0) GLES20.glDeleteShader(fragmentShader) + return 0 + } + val program = GLES20.glCreateProgram() + if (program <= 0) return 0 + GLES20.glAttachShader(program, vertexShader) + GLES20.glAttachShader(program, fragmentShader) + GLES20.glLinkProgram(program) + val status = IntArray(1) + GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, status, 0) + GLES20.glDeleteShader(vertexShader) + GLES20.glDeleteShader(fragmentShader) + if (status[0] != GLES20.GL_TRUE) { + Log.w(TAG, "Failed to link blit program: ${GLES20.glGetProgramInfoLog(program)}") + GLES20.glDeleteProgram(program) + return 0 + } + return program + } + + private fun compileShader(type: Int, source: String): Int { + val shader = GLES20.glCreateShader(type) + if (shader <= 0) return 0 + GLES20.glShaderSource(shader, source) + GLES20.glCompileShader(shader) + val status = IntArray(1) + GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0) + if 
(status[0] != GLES20.GL_TRUE) { + Log.w(TAG, "Failed to compile shader: ${GLES20.glGetShaderInfoLog(shader)}") + GLES20.glDeleteShader(shader) + return 0 + } + return shader + } + + private fun applyTextureOutputMode() { + try { + faceunity.fuSetUseTexAsync(1) + } catch (t: Throwable) { + Log.w(TAG, "Failed to configure texture output mode", t) + } + } + + companion object { + private val BLIT_QUAD = floatArrayOf( + -1f, -1f, 0f, 0f, + 1f, -1f, 1f, 0f, + -1f, 1f, 0f, 1f, + 1f, 1f, 1f, 1f, + ) + + private const val BLIT_VERTEX_SHADER = """ + attribute vec4 aPosition; + attribute vec2 aTextureCoord; + varying vec2 vTextureCoord; + void main() { + gl_Position = aPosition; + vTextureCoord = aTextureCoord; + } + """ + + private const val BLIT_FRAGMENT_SHADER = """ + precision mediump float; + uniform sampler2D uTexture; + varying vec2 vTextureCoord; + void main() { + gl_FragColor = texture2D(uTexture, vTextureCoord); + } + """ + } } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FaceUnityBeautyEngine.kt b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FaceUnityBeautyEngine.kt index d0d4841..d9f96ed 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FaceUnityBeautyEngine.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FaceUnityBeautyEngine.kt @@ -2,9 +2,12 @@ package com.demo.SellyCloudSDK.beauty import android.content.Context import android.util.Log -import com.pedro.encoder.input.gl.render.filters.BaseFilterRender import com.pedro.encoder.input.video.CameraHelper -import com.sellycloud.sellycloudsdk.VideoFrameInterceptor +import com.sellycloud.sellycloudsdk.VideoProcessFormat +import com.sellycloud.sellycloudsdk.VideoProcessMode +import com.sellycloud.sellycloudsdk.VideoProcessor +import com.sellycloud.sellycloudsdk.VideoProcessorConfig +import com.sellycloud.sellycloudsdk.VideoTextureFrame import com.sellycloud.sellycloudsdk.beauty.BeautyEngine /** @@ -16,8 +19,6 @@ class FaceUnityBeautyEngine : BeautyEngine { private 
val tag = "FaceUnityBeautyEng" private var renderer: FURenderer? = null - private var filter: FUBeautyFilterRender? = null - private var whipInterceptor: FuVideoFrameInterceptor? = null private var initialized = false private var enabled = true @@ -31,15 +32,6 @@ class FaceUnityBeautyEngine : BeautyEngine { val fuRenderer = FURenderer(appCtx).also { it.setup() } renderer = fuRenderer - filter = FUBeautyFilterRender(fuRenderer).apply { - setBeautyEnabled(enabled) - setCameraFacing(currentFacing) - } - - whipInterceptor = FuVideoFrameInterceptor(fuRenderer).apply { - setFrontCamera(currentFacing == CameraHelper.Facing.FRONT) - } - applyIntensity() initialized = true Log.d(tag, "FaceUnity beauty engine initialized") @@ -49,19 +41,40 @@ class FaceUnityBeautyEngine : BeautyEngine { } } - override fun obtainFilter(): BaseFilterRender? { + override fun createProcessor(): VideoProcessor? { applyIntensity() - return filter - } + val textureRenderer = renderer ?: return null + return object : VideoProcessor { + override val config: VideoProcessorConfig = VideoProcessorConfig( + preferredFormat = VideoProcessFormat.TEXTURE_2D, + mode = VideoProcessMode.READ_WRITE, + fullRewrite = true + ) - override fun obtainWhipInterceptor(): VideoFrameInterceptor? 
{ - applyIntensity() - return whipInterceptor + override fun onGlContextCreated() { + textureRenderer.reinitializeGlContextOnCurrentThread() + applyIntensity() + } + + override fun onGlContextDestroyed() { + textureRenderer.releaseGlContextOnCurrentThread() + } + + override fun processTexture(input: VideoTextureFrame, outputTextureId: Int) { + if (!enabled || outputTextureId <= 0) return + textureRenderer.renderProcessedTextureToOutput( + inputTex = input.textureId, + outputTextureId = outputTextureId, + width = input.width, + height = input.height, + facing = currentFacing + ) + } + } } override fun setEnabled(enabled: Boolean) { this.enabled = enabled - filter?.setBeautyEnabled(enabled) } override fun setIntensity(intensity: Double) { @@ -71,8 +84,6 @@ class FaceUnityBeautyEngine : BeautyEngine { override fun onCameraFacingChanged(facing: CameraHelper.Facing) { currentFacing = facing - filter?.setCameraFacing(facing) - whipInterceptor?.setFrontCamera(facing == CameraHelper.Facing.FRONT) } override fun onBeforeGlContextRelease() { @@ -90,11 +101,8 @@ class FaceUnityBeautyEngine : BeautyEngine { } override fun release() { - kotlin.runCatching { filter?.release() } kotlin.runCatching { renderer?.release() } - filter = null renderer = null - whipInterceptor = null initialized = false } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FuVideoFrameInterceptor.kt b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FuVideoFrameInterceptor.kt index ec4f944..464c28b 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/beauty/FuVideoFrameInterceptor.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/beauty/FuVideoFrameInterceptor.kt @@ -6,157 +6,215 @@ import com.faceunity.core.enumeration.CameraFacingEnum import com.faceunity.core.enumeration.FUExternalInputEnum import com.faceunity.core.enumeration.FUInputBufferEnum import com.faceunity.core.enumeration.FUTransformMatrixEnum -import com.sellycloud.sellycloudsdk.VideoFrameInterceptor +import 
com.sellycloud.sellycloudsdk.SellyVideoFrame import org.webrtc.JavaI420Buffer import org.webrtc.VideoFrame +import java.nio.ByteBuffer /** - * 将 WebRTC 采集的 I420 帧交给 FaceUnity 进行美颜,返回处理后的 NV21 帧。 - * 最小化侵入:当 SDK 未就绪或出错时,返回 null 让上游透传原始帧。 - * - * 重要:此拦截器不管理传入帧的生命周期,只负责创建新的处理后帧。 + * 将 I420 帧交给 FaceUnity 进行美颜处理。 + * live 推流走 SDK 的 [SellyVideoFrame];互动 RTC 仍保留 WebRTC [VideoFrame] 的便捷重载。 */ class FuVideoFrameInterceptor( private val fuRenderer: FURenderer -) : VideoFrameInterceptor { +) { private val tag = "FuVideoFrameInt" @Volatile private var isFrontCamera: Boolean = true @Volatile private var enabled: Boolean = true + fun setFrontCamera(front: Boolean) { isFrontCamera = front } fun setEnabled(enable: Boolean) { enabled = enable } - override fun process(frame: VideoFrame): VideoFrame? { + fun process(frame: SellyVideoFrame): SellyVideoFrame? { if (!enabled) return null val kit = fuRenderer.fuRenderKit if (!fuRenderer.isAuthSuccess || kit == null) return null - val src = frame.buffer - // 兼容部分 webrtc 版本中 toI420 可能标注为可空的情况 - val i420Maybe = try { src.toI420() } catch (_: Throwable) { null } - val i420 = i420Maybe ?: return null + val i420 = frame.buffer as? SellyVideoFrame.I420Buffer ?: return null + val width = i420.width + val height = i420.height + if (width == 0 || height == 0) return null + return try { + val i420Bytes = toI420Bytes( + width = width, + height = height, + dataY = i420.dataY, + strideY = i420.strideY, + dataU = i420.dataU, + strideU = i420.strideU, + dataV = i420.dataV, + strideV = i420.strideV + ) + val outI420 = renderI420(width, height, i420Bytes) ?: return null + SellyVideoFrame(fromI420BytesToSellyI420(outI420, width, height), frame.rotation, frame.timestampNs) + } catch (t: Throwable) { + Log.w(tag, "beauty failed: ${t.message}") + null + } + } + + fun process(frame: VideoFrame): VideoFrame? 
{ + if (!enabled) return null + val kit = fuRenderer.fuRenderKit + if (!fuRenderer.isAuthSuccess || kit == null) return null + + val i420 = try { frame.buffer.toI420() } catch (_: Throwable) { null } ?: return null return try { val width = i420.width val height = i420.height if (width == 0 || height == 0) return null - val i420Bytes = toI420Bytes(i420) - - val inputData = FURenderInputData(width, height).apply { - imageBuffer = FURenderInputData.FUImageBuffer( - FUInputBufferEnum.FU_FORMAT_I420_BUFFER, - i420Bytes - ) - renderConfig.apply { - externalInputType = FUExternalInputEnum.EXTERNAL_INPUT_TYPE_IMAGE - if (isFrontCamera) { - cameraFacing = CameraFacingEnum.CAMERA_FRONT - inputTextureMatrix = FUTransformMatrixEnum.CCROT0_FLIPVERTICAL - inputBufferMatrix = FUTransformMatrixEnum.CCROT0_FLIPVERTICAL - outputMatrix = FUTransformMatrixEnum.CCROT0 - } else { - cameraFacing = CameraFacingEnum.CAMERA_BACK - inputTextureMatrix = FUTransformMatrixEnum.CCROT0 - inputBufferMatrix = FUTransformMatrixEnum.CCROT0 - outputMatrix = FUTransformMatrixEnum.CCROT0_FLIPVERTICAL - } - isNeedBufferReturn = true - } - } - - val output = kit.renderWithInput(inputData) - val outImage = output.image ?: return null - val outI420 = outImage.buffer ?: return null - if (outI420.isEmpty()) return null - - // 安全:将 I420 字节填充到 JavaI420Buffer,避免手写 NV21 转换越界 - val jbuf = fromI420BytesToJavaI420(outI420, width, height) - VideoFrame(jbuf, frame.rotation, frame.timestampNs) + val i420Bytes = toI420Bytes( + width = width, + height = height, + dataY = i420.dataY, + strideY = i420.strideY, + dataU = i420.dataU, + strideU = i420.strideU, + dataV = i420.dataV, + strideV = i420.strideV + ) + val outI420 = renderI420(width, height, i420Bytes) ?: return null + VideoFrame(fromI420BytesToJavaI420(outI420, width, height), frame.rotation, frame.timestampNs) } catch (t: Throwable) { Log.w(tag, "beauty failed: ${t.message}") null } finally { - // 只释放我们创建的 I420Buffer,不释放原始 frame try { i420.release() } catch (_: 
Throwable) {} } } - private fun toI420Bytes(i420: VideoFrame.I420Buffer): ByteArray { - val w = i420.width - val h = i420.height - val ySize = w * h - val uvW = (w + 1) / 2 - val uvH = (h + 1) / 2 - val uSize = uvW * uvH - val vSize = uSize - val out = ByteArray(ySize + uSize + vSize) - val yBuf = i420.dataY - val uBuf = i420.dataU - val vBuf = i420.dataV - val yStride = i420.strideY - val uStride = i420.strideU - val vStride = i420.strideV - // copy Y + private fun renderI420(width: Int, height: Int, i420Bytes: ByteArray): ByteArray? { + val inputData = FURenderInputData(width, height).apply { + imageBuffer = FURenderInputData.FUImageBuffer( + FUInputBufferEnum.FU_FORMAT_I420_BUFFER, + i420Bytes + ) + renderConfig.apply { + externalInputType = FUExternalInputEnum.EXTERNAL_INPUT_TYPE_IMAGE + if (isFrontCamera) { + cameraFacing = CameraFacingEnum.CAMERA_FRONT + inputTextureMatrix = FUTransformMatrixEnum.CCROT0_FLIPVERTICAL + inputBufferMatrix = FUTransformMatrixEnum.CCROT0_FLIPVERTICAL + outputMatrix = FUTransformMatrixEnum.CCROT0 + } else { + cameraFacing = CameraFacingEnum.CAMERA_BACK + inputTextureMatrix = FUTransformMatrixEnum.CCROT0 + inputBufferMatrix = FUTransformMatrixEnum.CCROT0 + outputMatrix = FUTransformMatrixEnum.CCROT0_FLIPVERTICAL + } + isNeedBufferReturn = true + } + } + + val output = fuRenderer.fuRenderKit?.renderWithInput(inputData) ?: return null + val outImage = output.image ?: return null + val outI420 = outImage.buffer ?: return null + return outI420.takeIf { it.isNotEmpty() } + } + + private fun toI420Bytes( + width: Int, + height: Int, + dataY: ByteBuffer, + strideY: Int, + dataU: ByteBuffer, + strideU: Int, + dataV: ByteBuffer, + strideV: Int + ): ByteArray { + val ySize = width * height + val uvWidth = (width + 1) / 2 + val uvHeight = (height + 1) / 2 + val uSize = uvWidth * uvHeight + val out = ByteArray(ySize + uSize * 2) + var dst = 0 - for (j in 0 until h) { - val srcPos = j * yStride - yBuf.position(srcPos) - yBuf.get(out, dst, w) - 
dst += w + for (row in 0 until height) { + val srcBase = row * strideY + for (col in 0 until width) { + out[dst++] = dataY.get(srcBase + col) + } } - // copy U - for (j in 0 until uvH) { - val srcPos = j * uStride - uBuf.position(srcPos) - uBuf.get(out, ySize + j * uvW, uvW) + for (row in 0 until uvHeight) { + val srcBase = row * strideU + for (col in 0 until uvWidth) { + out[dst++] = dataU.get(srcBase + col) + } } - // copy V - for (j in 0 until uvH) { - val srcPos = j * vStride - vBuf.position(srcPos) - vBuf.get(out, ySize + uSize + j * uvW, uvW) + for (row in 0 until uvHeight) { + val srcBase = row * strideV + for (col in 0 until uvWidth) { + out[dst++] = dataV.get(srcBase + col) + } } return out } - // 将连续 I420 字节拷贝到 JavaI420Buffer - private fun fromI420BytesToJavaI420(i420: ByteArray, width: Int, height: Int): JavaI420Buffer { + private fun fromI420BytesToSellyI420(i420: ByteArray, width: Int, height: Int): SellyVideoFrame.I420Buffer { val ySize = width * height - val uvW = (width + 1) / 2 - val uvH = (height + 1) / 2 - val uSize = uvW * uvH + val uvWidth = (width + 1) / 2 + val uvHeight = (height + 1) / 2 + val uSize = uvWidth * uvHeight val vSize = uSize require(i420.size >= ySize + uSize + vSize) { "I420 buffer too small: ${i420.size}" } - val buf = JavaI420Buffer.allocate(width, height) - val y = buf.dataY - val u = buf.dataU - val v = buf.dataV - val yStride = buf.strideY - val uStride = buf.strideU - val vStride = buf.strideV - // 拷贝 Y + + val buffer = SellyVideoFrame.allocateI420Buffer(width, height) + val y = buffer.dataY + val u = buffer.dataU + val v = buffer.dataV + var src = 0 - for (j in 0 until height) { - y.position(j * yStride) + for (row in 0 until height) { + y.position(row * buffer.strideY) y.put(i420, src, width) src += width } - // 拷贝 U - var uSrc = ySize - for (j in 0 until uvH) { - u.position(j * uStride) - u.put(i420, uSrc, uvW) - uSrc += uvW + for (row in 0 until uvHeight) { + u.position(row * buffer.strideU) + u.put(i420, src, 
uvWidth) + src += uvWidth } - // 拷贝 V - var vSrc = ySize + uSize - for (j in 0 until uvH) { - v.position(j * vStride) - v.put(i420, vSrc, uvW) - vSrc += uvW + for (row in 0 until uvHeight) { + v.position(row * buffer.strideV) + v.put(i420, src, uvWidth) + src += uvWidth } - return buf + return buffer + } + + private fun fromI420BytesToJavaI420(i420: ByteArray, width: Int, height: Int): JavaI420Buffer { + val ySize = width * height + val uvWidth = (width + 1) / 2 + val uvHeight = (height + 1) / 2 + val uSize = uvWidth * uvHeight + val vSize = uSize + require(i420.size >= ySize + uSize + vSize) { "I420 buffer too small: ${i420.size}" } + + val buffer = JavaI420Buffer.allocate(width, height) + val y = buffer.dataY + val u = buffer.dataU + val v = buffer.dataV + + var src = 0 + for (row in 0 until height) { + y.position(row * buffer.strideY) + y.put(i420, src, width) + src += width + } + for (row in 0 until uvHeight) { + u.position(row * buffer.strideU) + u.put(i420, src, uvWidth) + src += uvWidth + } + for (row in 0 until uvHeight) { + v.position(row * buffer.strideV) + v.put(i420, src, uvWidth) + src += uvWidth + } + return buffer } } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveForegroundService.kt b/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveForegroundService.kt index 651984b..72d0005 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveForegroundService.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveForegroundService.kt @@ -65,17 +65,15 @@ class InteractiveForegroundService : Service() { } private fun ensureChannel() { - if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { - val manager = getSystemService(NotificationManager::class.java) ?: return - val existing = manager.getNotificationChannel(CHANNEL_ID) - if (existing == null) { - val channel = NotificationChannel( - CHANNEL_ID, - "Interactive Call", - NotificationManager.IMPORTANCE_LOW - ) - 
manager.createNotificationChannel(channel) - } + val manager = getSystemService(NotificationManager::class.java) ?: return + val existing = manager.getNotificationChannel(CHANNEL_ID) + if (existing == null) { + val channel = NotificationChannel( + CHANNEL_ID, + "Interactive Call", + NotificationManager.IMPORTANCE_LOW + ) + manager.createNotificationChannel(channel) } } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveLiveActivity.kt b/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveLiveActivity.kt index 2129fee..d4cc3da 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveLiveActivity.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/interactive/InteractiveLiveActivity.kt @@ -17,10 +17,10 @@ import androidx.core.view.isVisible import androidx.lifecycle.lifecycleScope import com.demo.SellyCloudSDK.KiwiHelper import com.demo.SellyCloudSDK.R -import kotlinx.coroutines.launch -import com.demo.SellyCloudSDK.beauty.FURenderer -import com.demo.SellyCloudSDK.beauty.FuVideoFrameInterceptor +import com.demo.SellyCloudSDK.avdemo.AvDemoSettingsStore +import com.demo.SellyCloudSDK.beauty.FaceUnityBeautyEngine import com.demo.SellyCloudSDK.databinding.ActivityInteractiveLiveBinding +import com.pedro.encoder.input.video.CameraHelper import com.sellycloud.sellycloudsdk.interactive.CallType import com.sellycloud.sellycloudsdk.interactive.InteractiveCallConfig import com.sellycloud.sellycloudsdk.interactive.InteractiveChannelMediaOptions @@ -33,18 +33,28 @@ import com.sellycloud.sellycloudsdk.interactive.InteractiveStreamStats import com.sellycloud.sellycloudsdk.interactive.InteractiveVideoCanvas import com.sellycloud.sellycloudsdk.interactive.InteractiveVideoEncoderConfig import com.sellycloud.sellycloudsdk.interactive.RemoteState +import com.sellycloud.sellycloudsdk.render.RtcRenderTarget +import com.sellycloud.sellycloudsdk.render.SurfaceViewRtcTarget +import 
com.sellycloud.sellycloudsdk.render.TextureViewRtcTarget
+import android.view.TextureView
+import android.view.View
+import kotlinx.coroutines.launch
 import org.webrtc.SurfaceViewRenderer
+import java.util.Locale
 
 class InteractiveLiveActivity : AppCompatActivity() {
 
     private lateinit var binding: ActivityInteractiveLiveBinding
+    private lateinit var settingsStore: AvDemoSettingsStore
+    private var useTextureView: Boolean = false
     private var rtcEngine: InteractiveRtcEngine? = null
     private var lockedCallType: CallType? = null
-    private var localRenderer: SurfaceViewRenderer? = null
+    private var localRenderTarget: RtcRenderTarget? = null
+    private var localRenderView: View? = null
     private lateinit var localSlot: VideoSlot
     private lateinit var remoteSlots: List<VideoSlot>
-    private val remoteRendererMap = mutableMapOf<String, SurfaceViewRenderer>()
+    private val remoteRenderMap = mutableMapOf<String, Pair<View, RtcRenderTarget>>()
     private var isLocalPreviewEnabled = true
     private var isLocalAudioEnabled = true
     private var isSpeakerOn = true
@@ -55,8 +65,14 @@ class InteractiveLiveActivity : AppCompatActivity() {
     private var currentConnectionState: InteractiveConnectionState = InteractiveConnectionState.Disconnected
     private var callDurationSeconds: Long = 0
     private var lastMessage: String? = null
-    private var beautyRenderer: FURenderer? = null
-    private var fuFrameInterceptor: FuVideoFrameInterceptor? = null
+    private var beautyEngine: FaceUnityBeautyEngine? = null
+    private val defaultCameraVideoConfig = InteractiveVideoEncoderConfig(
+        640,
+        480,
+        fps = 20,
+        minBitrateKbps = 150,
+        maxBitrateKbps = 850
+    )
     @Volatile private var isFrontCamera = true
     @Volatile private var beautyEnabled: Boolean = true
     @Volatile private var isLocalVideoEnabled: Boolean = true
@@ -71,6 +87,7 @@
     private var currentCallId: String? = null
     @Volatile private var selfUserId: String? 
= null
     private var isScreenSharing: Boolean = false
+    @Volatile private var leaveInProgress: Boolean = false
 
     private val permissionLauncher = registerForActivityResult(
         ActivityResultContracts.RequestMultiplePermissions()
@@ -107,6 +124,8 @@
             setDisplayHomeAsUpEnabled(true)
         }
 
+        settingsStore = AvDemoSettingsStore(this)
+        useTextureView = settingsStore.read().renderBackendPreference.isTextureView()
         setupVideoSlots()
         initRtcEngine()
         setupUiDefaults()
@@ -123,12 +142,12 @@
 
         binding.btnSwitchCamera.setOnClickListener {
             isFrontCamera = !isFrontCamera
-            fuFrameInterceptor?.setFrontCamera(isFrontCamera)
+            beautyEngine?.onCameraFacingChanged(currentCameraFacing())
             rtcEngine?.switchCamera()
         }
         binding.btnToggleBeauty.setOnClickListener {
             beautyEnabled = !beautyEnabled
-            fuFrameInterceptor?.setEnabled(beautyEnabled)
+            ensureBeautySessionReady()
             updateControlButtons()
         }
     }
@@ -153,26 +172,26 @@
 
     override fun onDestroy() {
         super.onDestroy()
-        rtcEngine?.setCaptureVideoFrameInterceptor(null)
-        fuFrameInterceptor = null
+        rtcEngine?.setCaptureVideoProcessor(null)
         remoteMediaState.clear()
         // Capture the references to release up front, so blocking the main thread cannot cause an ANR
         val engine = rtcEngine
-        val local = localRenderer
-        val remotes = remoteRendererMap.values.toList()
-        val beauty = beautyRenderer
+        val localTarget = localRenderTarget
+        val remoteTargets = remoteRenderMap.values.map { it.second }
+        val beauty = beautyEngine
        rtcEngine = null
-        localRenderer = null
-        remoteRendererMap.clear()
-        beautyRenderer = null
+        localRenderTarget = null
+        localRenderView = null
+        remoteRenderMap.clear()
+        beautyEngine = null
 
        // Release heavyweight resources on a background thread
         Thread {
             try { engine?.leaveChannel() } catch (_: Exception) {}
             try { InteractiveRtcEngine.destroy(engine) } catch (_: Exception) {}
-            try { local?.release() } catch (_: Exception) {}
-            remotes.forEach { try { it.release() } catch (_: Exception) {} }
+            
try { localTarget?.release() } catch (_: Exception) {}
+            remoteTargets.forEach { try { it.release() } catch (_: Exception) {} }
             try { beauty?.release() } catch (_: Exception) {}
         }.start()
     }
@@ -183,17 +202,23 @@
     }
 
     private fun initRtcEngine() {
+        rtcEngine?.setCaptureVideoProcessor(null)
+        rtcEngine?.destroy()
+        rtcEngine = null
+        beautyEngine?.release()
+        beautyEngine = null
+
         val appId = getString(R.string.signaling_app_id)
         val token = getString(R.string.signaling_token).takeIf { it.isNotBlank() }
         // The Kiwi proxy is fetched from the backend; clear any leftover proxy when rsName is blank
         val kiwiRsName = getString(R.string.signaling_kiwi_rsname).trim()
         KiwiHelper.startProxySetup(kiwiRsName.isNotBlank(), kiwiRsName)
-        beautyRenderer = FURenderer(this).also { it.setup() }
-        fuFrameInterceptor = beautyRenderer?.let { FuVideoFrameInterceptor(it).apply {
-            setFrontCamera(isFrontCamera)
-            setEnabled(beautyEnabled)
-        } }
+        beautyEngine = FaceUnityBeautyEngine().also {
+            it.initialize(this)
+            it.setEnabled(beautyEnabled)
+            it.onCameraFacingChanged(currentCameraFacing())
+        }
         rtcEngine = InteractiveRtcEngine.create(
             InteractiveRtcEngineConfig(
                 context = applicationContext,
@@ -203,14 +228,10 @@
         ).apply {
             setEventHandler(rtcEventHandler)
             setClientRole(InteractiveRtcEngine.ClientRole.BROADCASTER)
-//            setVideoEncoderConfiguration(InteractiveVideoEncoderConfig()) 使用默认值
-            setVideoEncoderConfiguration(InteractiveVideoEncoderConfig(640, 480 , fps = 20, minBitrateKbps = 150, maxBitrateKbps = 850))
+            setVideoEncoderConfiguration(defaultCameraVideoConfig)
             setDefaultAudioRoutetoSpeakerphone(true)
-            setCaptureVideoFrameInterceptor { frame ->
-                if (!beautyEnabled) return@setCaptureVideoFrameInterceptor frame
-                fuFrameInterceptor?.process(frame) ?: frame
-            }
         }
+        ensureBeautySessionReady()
     }
 
     private val rtcEventHandler = object : InteractiveRtcEngineEventHandler {
@@ -232,6 +253,8 @@ 
override fun onLeaveChannel(durationSeconds: Int) { Log.d(TAG, "回调onLeaveChannel duration=${durationSeconds}s") runOnUiThread { + leaveInProgress = false + releaseLocalRenderTargetAsync() resetUiAfterLeave() } } @@ -325,12 +348,12 @@ class InteractiveLiveActivity : AppCompatActivity() { runOnUiThread { handleRemoteAudioState(enabled, userId) } } - override fun onStreamStateChanged(peerId: String, state: RemoteState, code: Int, message: String?) { + override fun onStreamStateChanged(userId: String, state: RemoteState, code: Int, message: String?) { runOnUiThread { - val tip = "onStreamStateChanged[$peerId] state=$state code=$code ${message ?: ""}" + val tip = "onStreamStateChanged[$userId] state=$state code=$code ${message ?: ""}" Log.d(TAG, tip) Toast.makeText(this@InteractiveLiveActivity, tip, Toast.LENGTH_SHORT).show() - if (peerId == currentUserId && message?.contains("screen_share_stopped") == true) { + if (userId == currentUserId && message?.contains("screen_share_stopped") == true) { isScreenSharing = false updateControlButtons() } @@ -345,11 +368,13 @@ class InteractiveLiveActivity : AppCompatActivity() { VideoSlot(binding.flRemote2, TileType.REMOTE), VideoSlot(binding.flRemote3, TileType.REMOTE) ) - if (localRenderer == null) { - localRenderer = createRenderer() + if (localRenderView == null) { + val (view, target) = createRenderTarget() + localRenderView = view + localRenderTarget = target } - localRenderer?.let { renderer -> - localSlot.layout.attachRenderer(renderer) + localRenderView?.let { view -> + localSlot.layout.attachRenderer(view) } resetVideoSlots(releaseRemotes = false) binding.videoContainer.isVisible = false @@ -481,9 +506,9 @@ class InteractiveLiveActivity : AppCompatActivity() { } private fun applyLocalPreviewVisibility() { - val renderer = localRenderer ?: createRenderer().also { localRenderer = it } if (isLocalPreviewEnabled) { - localSlot.layout.attachRenderer(renderer) + val view = localRenderView ?: return + 
localSlot.layout.attachRenderer(view) } else { localSlot.layout.detachRenderer() } @@ -513,7 +538,15 @@ class InteractiveLiveActivity : AppCompatActivity() { if (stopped) { isScreenSharing = false ensureBeautySessionReady() - fuFrameInterceptor?.setEnabled(beautyEnabled) + binding.root.post { + // The active call keeps the local preview target inside the SDK. + // During a live session we must not swap/release that target from the demo side. + applyDefaultCameraVideoConfig() + if (!isLocalVideoEnabled) { + rtcEngine?.enableLocalVideo(false) + } + applyLocalPreviewVisibility() + } } else if (showToast) { Toast.makeText(this, "停止屏幕共享失败", Toast.LENGTH_SHORT).show() } @@ -610,11 +643,15 @@ class InteractiveLiveActivity : AppCompatActivity() { } private fun executeJoinInternal(request: JoinRequest) { - val renderer = localRenderer ?: createRenderer().also { - localRenderer = it + applyDefaultCameraVideoConfig() + val target = localRenderTarget ?: run { + val (view, t) = createRenderTarget() + localRenderView = view + localRenderTarget = t + t } currentUserId = request.userId - rtcEngine?.setupLocalVideo(InteractiveVideoCanvas(renderer, request.userId)) + rtcEngine?.setupLocalVideo(InteractiveVideoCanvas(target, request.userId)) ensureBeautySessionReady() rtcEngine?.joinChannel( request.token, @@ -634,10 +671,13 @@ class InteractiveLiveActivity : AppCompatActivity() { private fun ensureBeautySessionReady() { try { - beautyRenderer?.releaseGlContext() - beautyRenderer?.reinitializeGlContext() - fuFrameInterceptor?.setEnabled(beautyEnabled) - fuFrameInterceptor?.setFrontCamera(isFrontCamera) + val engine = rtcEngine + val beauty = beautyEngine + beauty?.setEnabled(beautyEnabled) + beauty?.onCameraFacingChanged(currentCameraFacing()) + engine?.setCaptureVideoProcessor( + if (beautyEnabled) beauty?.createProcessor() else null + ) } catch (_: Exception) { } } @@ -672,8 +712,8 @@ class InteractiveLiveActivity : AppCompatActivity() { private fun addRemoteTile(userId: String) { 
remoteSlots.firstOrNull { it.userId == userId }?.let { existing -> - val renderer = ensureRemoteRenderer(userId) - existing.layout.attachRenderer(renderer) + val view = ensureRemoteRenderView(userId) + existing.layout.attachRenderer(view) remoteSlots.filter { it.userId == userId && it !== existing }.forEach { extra -> extra.userId = null extra.layout.detachRenderer() @@ -690,17 +730,19 @@ class InteractiveLiveActivity : AppCompatActivity() { return } slot.userId = userId - val renderer = ensureRemoteRenderer(userId) - slot.layout.attachRenderer(renderer) + val view = ensureRemoteRenderView(userId) + slot.layout.attachRenderer(view) updateSlotOverlay(slot) binding.videoContainer.isVisible = true } - private fun ensureRemoteRenderer(userId: String): SurfaceViewRenderer { - return remoteRendererMap[userId] ?: createRenderer().also { renderer -> - remoteRendererMap[userId] = renderer - rtcEngine?.setupRemoteVideo(InteractiveVideoCanvas(renderer, userId)) - } + private fun ensureRemoteRenderView(userId: String): View { + val existing = remoteRenderMap[userId] + if (existing != null) return existing.first + val (view, target) = createRenderTarget() + remoteRenderMap[userId] = view to target + rtcEngine?.setupRemoteVideo(InteractiveVideoCanvas(target, userId)) + return view } private fun removeRemoteTile(userId: String) { @@ -711,27 +753,27 @@ class InteractiveLiveActivity : AppCompatActivity() { updateSlotOverlay(slot) } val engine = rtcEngine - val renderer = remoteRendererMap.remove(userId) + val removed = remoteRenderMap.remove(userId) remoteStats.remove(userId) - // SurfaceViewRenderer.release() 会死锁主线程,移到后台 + // RtcRenderTarget.release() may block the main thread, move to background Thread { try { engine?.clearRemoteVideo(userId) } catch (_: Exception) {} - try { renderer?.release() } catch (_: Exception) {} + try { removed?.second?.release() } catch (_: Exception) {} }.start() } - private fun resetVideoSlots(releaseRemotes: Boolean = true) { + private fun 
resetVideoSlots(releaseRemotes: Boolean = true, reattachLocal: Boolean = true) { if (releaseRemotes) { val engine = rtcEngine - val remoteIds = remoteRendererMap.keys.toList() - val renderersToRelease = remoteIds.mapNotNull { remoteRendererMap.remove(it) } + val remoteIds = remoteRenderMap.keys.toList() + val targetsToRelease = remoteIds.mapNotNull { remoteRenderMap.remove(it)?.second } remoteStats.clear() - // SurfaceViewRenderer.release() 会死锁主线程,移到后台 + // RtcRenderTarget.release() may block the main thread, move to background Thread { remoteIds.forEach { userId -> try { engine?.clearRemoteVideo(userId) } catch (_: Exception) {} } - renderersToRelease.forEach { try { it.release() } catch (_: Exception) {} } + targetsToRelease.forEach { try { it.release() } catch (_: Exception) {} } }.start() } remoteSlots.forEach { slot -> @@ -740,9 +782,19 @@ class InteractiveLiveActivity : AppCompatActivity() { updateSlotOverlay(slot) } localSlot.userId = currentUserId - val renderer = localRenderer ?: createRenderer().also { localRenderer = it } + if (!reattachLocal) { + localSlot.layout.detachRenderer() + updateSlotOverlay(localSlot) + return + } + val view = localRenderView ?: run { + val (v, t) = createRenderTarget() + localRenderView = v + localRenderTarget = t + v + } if (isLocalPreviewEnabled) { - localSlot.layout.attachRenderer(renderer) + localSlot.layout.attachRenderer(view) } else { localSlot.layout.detachRenderer() } @@ -757,23 +809,29 @@ class InteractiveLiveActivity : AppCompatActivity() { private fun displayId(userId: String): String = userId private fun leaveChannel() { - // SDK 的 leaveChannel() 会同步停止 Whip/Whep 客户端,阻塞主线程 + if (leaveInProgress) return + leaveInProgress = true val engine = rtcEngine - Thread { try { engine?.leaveChannel() } catch (_: Exception) {} }.start() - resetUiAfterLeave() + currentConnectionState = InteractiveConnectionState.Disconnected + updateCallInfo() + setJoinButtonEnabled(false) + Thread { + try { + engine?.leaveChannel() + } catch 
(_: Exception) {
+            } finally {
+                runOnUiThread {
+                    if (!leaveInProgress) return@runOnUiThread
+                    leaveInProgress = false
+                    releaseLocalRenderTargetAsync()
+                    resetUiAfterLeave()
+                }
+            }
+        }.start()
     }
 
     private fun resetUiAfterLeave() {
         currentCallId = null
-        resetVideoSlots()
-        binding.videoContainer.isVisible = false
-        binding.btnJoin.text = getString(R.string.join)
-        setJoinButtonEnabled(true)
-        isLocalPreviewEnabled = true
-        isLocalAudioEnabled = true
-        isSpeakerOn = true
-        beautyEnabled = true
-        fuFrameInterceptor?.setEnabled(true)
         selfUserId = null
         localStats = null
         remoteStats.clear()
@@ -781,6 +839,16 @@
         currentConnectionState = InteractiveConnectionState.Disconnected
         callDurationSeconds = 0
         lastMessage = null
+        resetVideoSlots(reattachLocal = false)
+        binding.videoContainer.isVisible = false
+        binding.btnJoin.text = getString(R.string.join)
+        setJoinButtonEnabled(!leaveInProgress)
+        isLocalPreviewEnabled = true
+        isLocalAudioEnabled = true
+        isSpeakerOn = true
+        isFrontCamera = true
+        isLocalVideoEnabled = true
+        beautyEnabled = true
         binding.tvMessageLog.text = getString(R.string.message_none)
         isScreenSharing = false
         updateControlButtons()
@@ -788,16 +856,33 @@
         updateCallInfo()
         setJoinInputsVisible(true)
         InteractiveForegroundService.stop(this)
+        initRtcEngine()
     }
 
-    private fun createRenderer(): SurfaceViewRenderer = SurfaceViewRenderer(this).apply {
-        setZOrderMediaOverlay(false)
+    private fun currentCameraFacing(): CameraHelper.Facing {
+        return if (isFrontCamera) CameraHelper.Facing.FRONT else CameraHelper.Facing.BACK
     }
 
-    private fun releaseRenderer(renderer: SurfaceViewRenderer) {
-        try {
-            renderer.release()
-        } catch (_: Exception) {}
+    private fun createRenderTarget(): Pair<View, RtcRenderTarget> {
+        return if (useTextureView) {
+            // Interactive demo owns these targets and releases them in onDestroy(). 
+ val tv = com.sellycloud.sellycloudsdk.widget.AspectRatioTextureView(this) + tv to TextureViewRtcTarget(tv, ownedBySdk = false) + } else { + val svr = SurfaceViewRenderer(this).apply { setZOrderMediaOverlay(false) } + svr to SurfaceViewRtcTarget(svr, ownedBySdk = false) + } + } + + private fun releaseLocalRenderTargetAsync() { + val target = localRenderTarget ?: return + localRenderTarget = null + localRenderView = null + Thread { try { target.release() } catch (_: Exception) {} }.start() + } + + private fun applyDefaultCameraVideoConfig() { + rtcEngine?.setVideoEncoderConfiguration(defaultCameraVideoConfig) } private fun hideKeyboard() { @@ -852,7 +937,7 @@ class InteractiveLiveActivity : AppCompatActivity() { val duration = if (callDurationSeconds > 0) { val minutes = callDurationSeconds / 60 val seconds = callDurationSeconds % 60 - String.format(" | 时长 %02d:%02d", minutes, seconds) + String.format(Locale.getDefault(), " | 时长 %02d:%02d", minutes, seconds) } else { "" } @@ -863,7 +948,9 @@ class InteractiveLiveActivity : AppCompatActivity() { val lines = mutableListOf(header) val width = stats?.width?.takeIf { it > 0 }?.toString() ?: "--" val height = stats?.height?.takeIf { it > 0 }?.toString() ?: "--" - val fpsText = stats?.fps?.takeIf { it > 0 }?.let { String.format("%.1f fps", it.toDouble()) } ?: "-- fps" + val fpsText = stats?.fps?.takeIf { it > 0 }?.let { + String.format(Locale.getDefault(), "%.1f fps", it.toDouble()) + } ?: "-- fps" lines += "Res:${width}x${height} $fpsText" val videoCodec = stats?.videoCodec?.takeIf { it.isNotBlank() } val audioCodec = stats?.audioCodec?.takeIf { it.isNotBlank() } @@ -874,10 +961,16 @@ class InteractiveLiveActivity : AppCompatActivity() { else -> null } codecLine?.let { lines += it } - val videoBitrate = stats?.videoBitrateKbps?.takeIf { it > 0 }?.let { String.format("%.0f", it.toDouble()) } ?: "--" - val audioBitrate = stats?.audioBitrateKbps?.takeIf { it > 0 }?.let { String.format("%.0f", it.toDouble()) } ?: "--" + val 
videoBitrate = stats?.videoBitrateKbps?.takeIf { it > 0 }?.let { + String.format(Locale.getDefault(), "%.0f", it.toDouble()) + } ?: "--" + val audioBitrate = stats?.audioBitrateKbps?.takeIf { it > 0 }?.let { + String.format(Locale.getDefault(), "%.0f", it.toDouble()) + } ?: "--" lines += "Video:${videoBitrate}kbps Audio:${audioBitrate}kbps" - val rtt = stats?.rttMs?.takeIf { it > 0 }?.let { String.format("%.0fms", it.toDouble()) } ?: "--" + val rtt = stats?.rttMs?.takeIf { it > 0 }?.let { + String.format(Locale.getDefault(), "%.0fms", it.toDouble()) + } ?: "--" lines += "RTT:$rtt" return lines.joinToString("\n") } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayActivity.kt b/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayActivity.kt index c62a3dd..d1b966b 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayActivity.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayActivity.kt @@ -8,6 +8,7 @@ import android.content.Intent import android.content.pm.PackageManager import android.content.res.Configuration import android.graphics.Bitmap +import com.sellycloud.sellycloudsdk.render.RenderBackend import android.graphics.Color import android.graphics.Typeface import android.graphics.drawable.GradientDrawable @@ -33,6 +34,7 @@ import androidx.core.content.ContextCompat import coil.load import com.demo.SellyCloudSDK.KiwiHelper import com.demo.SellyCloudSDK.R +import com.demo.SellyCloudSDK.avdemo.AvDemoSettingsStore import com.demo.SellyCloudSDK.databinding.ActivityLivePlayBinding import com.demo.SellyCloudSDK.live.auth.LiveAuthHelper import com.demo.SellyCloudSDK.live.auth.LiveTokenSigner @@ -71,6 +73,7 @@ class LivePlayActivity : AppCompatActivity() { private lateinit var playerClient: SellyLiveVideoPlayer private lateinit var pipController: SellyPipController + private var useTextureView: Boolean = false private var isPlaying: Boolean = false private var isMuted: Boolean = false private var 
previewImageUrl: String? = null
@@ -85,6 +88,7 @@ class LivePlayActivity : AppCompatActivity() {
     private var lastLatencyChasingSpeed: Float? = null
     private var lastLatencyChasingUpdate: SellyLatencyChasingUpdate? = null
     private var hasReleasedPlayer: Boolean = false
+    private var logEnabled: Boolean = true
 
     private val logLines: ArrayDeque<String> = ArrayDeque()
     private val logTimeFormat = SimpleDateFormat("HH:mm:ss.SSS", Locale.getDefault())
@@ -109,10 +113,12 @@
         addLogFloatingButton()
 
         envStore = LiveEnvSettingsStore(this)
+        useTextureView = AvDemoSettingsStore(this).read().renderBackendPreference.isTextureView()
         pipController = SellyPipController(this)
         val env = envStore.read().also { it.applyToSdkRuntimeConfig(this) }
+        logEnabled = env.logEnabled
         args = Args.from(intent, env)
-        Log.d(TAG, "init liveMode=${args.liveMode} input=${args.streamIdOrUrl} autoStart=${args.autoStart}")
+        debugLog("init liveMode=${args.liveMode} input=${args.streamIdOrUrl} autoStart=${args.autoStart}")
 
         setupPreview(args.previewImageUrl)
         playerClient = createPlayerForArgs(args).also { client ->
@@ -192,6 +198,13 @@
         }
     }
 
+    override fun onReconnectStateChanged(isReconnecting: Boolean, detail: String?) 
{ + runOnUiThread { + val suffix = detail?.takeIf { it.isNotBlank() }?.let { ": $it" }.orEmpty() + logEvent(if (isReconnecting) "重连开始$suffix" else "重连结束$suffix") + } + } + override fun onError(error: com.sellycloud.sellycloudsdk.SellyLiveError) { runOnUiThread { logEvent("错误: ${error.message}") @@ -208,7 +221,8 @@ class LivePlayActivity : AppCompatActivity() { binding.actionScreenshot.setOnClickListener { captureCurrentFrame() } binding.actionPip.setOnClickListener { enterPipMode() } - playerClient.attachRenderView(binding.renderContainer) + val backend = if (useTextureView) RenderBackend.TEXTURE_VIEW else RenderBackend.SURFACE_VIEW + playerClient.attachRenderView(binding.renderContainer, backend) if (args.autoStart) { lifecycleScope.launch { @@ -278,7 +292,6 @@ class LivePlayActivity : AppCompatActivity() { } private fun enterPipMode() { - if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) return if (!isPlaying) return val renderView = playerClient.getRenderView() ?: binding.renderContainer pipController.enterPictureInPictureMode(renderView) @@ -322,7 +335,7 @@ class LivePlayActivity : AppCompatActivity() { Toast.makeText(this, "生成 token 失败", Toast.LENGTH_SHORT).show() return } - Log.d(TAG, "startPlayback params liveMode=${args.liveMode} streamId=$channelId tokenPreview=${auth.tokenResult.tokenPreview}") + debugLog("startPlayback params liveMode=${args.liveMode} streamId=$channelId tokenPreview=${auth.tokenResult.tokenPreview}") playerClient.token = auth.tokenResult.token beginPlayback() return @@ -334,7 +347,7 @@ class LivePlayActivity : AppCompatActivity() { return } if (input.contains("://")) { - Log.d(TAG, "startPlayback directUrl=$input") + debugLog("startPlayback directUrl=$input") playerClient.token = null beginPlayback() return @@ -354,7 +367,7 @@ class LivePlayActivity : AppCompatActivity() { Toast.makeText(this, "生成 token 失败", Toast.LENGTH_SHORT).show() return } - Log.d(TAG, "startPlayback liveMode=${args.liveMode} streamId=$channelId 
tokenPreview=${auth.tokenResult.tokenPreview}") + debugLog("startPlayback liveMode=${args.liveMode} streamId=$channelId tokenPreview=${auth.tokenResult.tokenPreview}") playerClient.token = auth.tokenResult.token beginPlayback() } @@ -443,8 +456,22 @@ class LivePlayActivity : AppCompatActivity() { Toast.makeText(this, "视图尚未布局完成,稍后再试", Toast.LENGTH_SHORT).show() return } + if (view is android.view.TextureView) { + val bmp = view.getBitmap() + if (bmp == null) { + Toast.makeText(this, "TextureView 尚未渲染画面", Toast.LENGTH_SHORT).show() + return + } + uiScope.launch(Dispatchers.IO) { + val ok = saveBitmapToGallery(bmp, prefix) + launch(Dispatchers.Main) { + Toast.makeText(this@LivePlayActivity, if (ok) "截图已保存到相册" else "保存失败", Toast.LENGTH_SHORT).show() + } + } + return + } if (view !is android.view.SurfaceView) { - Toast.makeText(this, "当前视图不支持截图", Toast.LENGTH_SHORT).show() + Toast.makeText(this, "当前视图类型不支持截图", Toast.LENGTH_SHORT).show() return } val bmp = Bitmap.createBitmap(view.width, view.height, Bitmap.Config.ARGB_8888) @@ -761,6 +788,10 @@ class LivePlayActivity : AppCompatActivity() { } } + private fun debugLog(message: String) { + if (logEnabled) Log.d(TAG, message) + } + private fun dpToPx(dp: Int): Int { return (dp * resources.displayMetrics.density + 0.5f).toInt() } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayForegroundService.kt b/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayForegroundService.kt index 290260f..83950e9 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayForegroundService.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/live/LivePlayForegroundService.kt @@ -67,17 +67,15 @@ class LivePlayForegroundService : Service() { } private fun ensureChannel() { - if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { - val manager = getSystemService(NotificationManager::class.java) ?: return - val existing = manager.getNotificationChannel(CHANNEL_ID) - if (existing == null) { - val channel = 
NotificationChannel( - CHANNEL_ID, - "Live Playback", - NotificationManager.IMPORTANCE_LOW - ) - manager.createNotificationChannel(channel) - } + val manager = getSystemService(NotificationManager::class.java) ?: return + val existing = manager.getNotificationChannel(CHANNEL_ID) + if (existing == null) { + val channel = NotificationChannel( + CHANNEL_ID, + "Live Playback", + NotificationManager.IMPORTANCE_LOW + ) + manager.createNotificationChannel(channel) } } @@ -88,14 +86,10 @@ class LivePlayForegroundService : Service() { fun start(context: Context) { val appContext = context.applicationContext val intent = Intent(appContext, LivePlayForegroundService::class.java) - if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { - try { - appContext.startService(intent) - } catch (_: IllegalStateException) { - ContextCompat.startForegroundService(appContext, intent) - } - } else { + try { appContext.startService(intent) + } catch (_: IllegalStateException) { + ContextCompat.startForegroundService(appContext, intent) } } diff --git a/example/src/main/java/com/demo/SellyCloudSDK/live/LivePushActivity.kt b/example/src/main/java/com/demo/SellyCloudSDK/live/LivePushActivity.kt index e3d441d..19e6993 100644 --- a/example/src/main/java/com/demo/SellyCloudSDK/live/LivePushActivity.kt +++ b/example/src/main/java/com/demo/SellyCloudSDK/live/LivePushActivity.kt @@ -12,9 +12,12 @@ import android.graphics.Color import android.graphics.Matrix import android.graphics.Paint import android.graphics.Rect +import android.graphics.RectF import android.net.Uri +import android.opengl.GLES20 import android.os.Build import android.os.Bundle +import android.os.SystemClock import android.view.View import android.view.WindowManager import android.widget.Toast @@ -37,6 +40,7 @@ import com.demo.SellyCloudSDK.live.env.applyToSdkRuntimeConfig import com.demo.SellyCloudSDK.live.env.toLiveMode import com.demo.SellyCloudSDK.live.util.GalleryImageSaver import com.sellycloud.sellycloudsdk.CpuUsage +import 
com.sellycloud.sellycloudsdk.Disposable import com.sellycloud.sellycloudsdk.SellyLiveCameraPosition import com.sellycloud.sellycloudsdk.SellyLiveMode import com.sellycloud.sellycloudsdk.SellyLiveOrientation @@ -44,8 +48,16 @@ import com.sellycloud.sellycloudsdk.SellyLivePusherStats import com.sellycloud.sellycloudsdk.SellyLiveStatus import com.sellycloud.sellycloudsdk.SellyLiveVideoConfiguration import com.sellycloud.sellycloudsdk.SellyLiveVideoPusher +import com.sellycloud.sellycloudsdk.SellyVideoFrame import com.sellycloud.sellycloudsdk.SellyLiveVideoPusherDelegate import com.sellycloud.sellycloudsdk.SellyLiveVideoResolution +import com.sellycloud.sellycloudsdk.VideoFrameObserver +import com.sellycloud.sellycloudsdk.VideoFrameObserverConfig +import com.sellycloud.sellycloudsdk.VideoProcessFormat +import com.sellycloud.sellycloudsdk.VideoProcessMode +import com.sellycloud.sellycloudsdk.VideoProcessor +import com.sellycloud.sellycloudsdk.VideoProcessorConfig +import com.sellycloud.sellycloudsdk.VideoTextureFrame import kotlinx.coroutines.CoroutineScope import kotlinx.coroutines.Dispatchers import kotlinx.coroutines.SupervisorJob @@ -53,7 +65,13 @@ import kotlinx.coroutines.cancel import kotlinx.coroutines.delay import kotlinx.coroutines.isActive import kotlinx.coroutines.launch +import java.nio.ByteBuffer +import java.nio.ByteOrder +import java.nio.FloatBuffer +import java.util.concurrent.atomic.AtomicInteger +import java.util.concurrent.atomic.AtomicLong import kotlin.math.max +import kotlin.math.min import kotlin.math.roundToInt class LivePushActivity : AppCompatActivity() { @@ -65,18 +83,38 @@ class LivePushActivity : AppCompatActivity() { private lateinit var args: Args private lateinit var pusherClient: SellyLiveVideoPusher + private var useTextureView: Boolean = false private var isPublishing: Boolean = false private var isStatsCollapsed: Boolean = false private var latestStats: SellyLivePusherStats? 
= null
     private var isMuted: Boolean = false
     private var beautyEnabled: Boolean = true
     private var beautyAvailable: Boolean = true
+    private var autoFramingEnabled: Boolean = false
+    private var autoFramingAvailable: Boolean = false
     private var beautyEngine: FaceUnityBeautyEngine? = null
     private var videoSourceMode: VideoSourceMode = VideoSourceMode.Camera
     private var streamOrientation: SellyLiveOrientation = SellyLiveOrientation.PORTRAIT
     private var currentFacing: SellyLiveCameraPosition = SellyLiveCameraPosition.FRONT
     private val backgroundPaint = Paint(Paint.ANTI_ALIAS_FLAG)
     private var hasNavigatedHome: Boolean = false
+    private var latestLiveStatus: SellyLiveStatus = SellyLiveStatus.Idle
+    private var lastStateMessage: String? = null
+    private var lastStateExtras: Bundle? = null
+    private var logEnabled: Boolean = true
+    private var frameInterceptorMode: FrameInterceptorMode = FrameInterceptorMode.OFF
+    @Volatile private var latestFrameCallbackFps: Int = 0
+    @Volatile private var lastFrameCallbackMeta: String = "off"
+    @Volatile private var lastFrameCallbackLogAtMs: Long = 0L
+    private val frameCallbackCounter = AtomicInteger(0)
+    private val frameCallbackTimingSamples = AtomicInteger(0)
+    private val frameCallbackTotalNs = AtomicLong(0L)
+    private val frameCallbackEditNs = AtomicLong(0L)
+    private val frameCallbackSlowCount = AtomicInteger(0)
+    private val frameCallbackMaxNs = AtomicLong(0L)
+    @Volatile private var diagnosticWatermarkOverlay: DiagnosticWatermarkOverlay? = null
+    private val frameObserverDisposables = mutableListOf<Disposable>()
+    private var diagnosticTextureOverlayRenderer: DiagnosticTextureOverlayRenderer? 
= null private val uiScope = CoroutineScope(SupervisorJob() + Dispatchers.Main.immediate) @@ -115,10 +153,11 @@ class LivePushActivity : AppCompatActivity() { supportActionBar?.hide() settingsStore = AvDemoSettingsStore(this) + useTextureView = settingsStore.read().renderBackendPreference.isTextureView() envStore = LiveEnvSettingsStore(this) val env = envStore.read().also { it.applyToSdkRuntimeConfig(this) } + logEnabled = env.logEnabled args = Args.from(intent, defaultMode = env.protocol.toLiveMode()) - window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON) binding.btnClose.setOnClickListener { @@ -166,9 +205,11 @@ class LivePushActivity : AppCompatActivity() { binding.actionFlip.setOnClickListener { switchCameraAndRemember() } binding.actionMute.setOnClickListener { toggleMute() } binding.actionCamera.setOnClickListener { toggleCamera() } - binding.actionScreenshot.setOnClickListener { captureCurrentFrame() } + binding.actionAutoFraming.setOnClickListener { toggleAutoFraming() } binding.actionBackground.setOnClickListener { toggleOrPickBackground() } binding.actionBeauty.setOnClickListener { toggleBeauty() } + binding.actionFrameCallback.setOnClickListener { toggleFrameInterceptor() } + binding.actionFrameModify.setOnClickListener { toggleFrameModify() } binding.btnQuickFlip.setOnClickListener { switchCameraAndRemember() } binding.btnQuickOrientation.setOnClickListener { toggleStreamOrientation() } @@ -177,6 +218,7 @@ class LivePushActivity : AppCompatActivity() { binding.tvStatsTitle.setOnClickListener { toggleStats() } renderToolStates() + updateFrameCallbackStats() updateStreamOrientationUi() updatePublishingUi() updateLayoutForOrientationAndState() @@ -190,6 +232,7 @@ class LivePushActivity : AppCompatActivity() { override fun onDestroy() { super.onDestroy() + clearFrameObservers() pusherOrNull()?.release() uiScope.cancel() } @@ -219,16 +262,41 @@ class LivePushActivity : AppCompatActivity() { runOnUiThread { onStateUpdated(status) } } + override fun 
onStatusDetailChanged(status: SellyLiveStatus, message: String?, extras: Bundle?) { + runOnUiThread { + latestLiveStatus = status + lastStateMessage = message + lastStateExtras = extras?.let(::Bundle) + updateStatusPanel() + } + } + override fun onStatisticsUpdate(stats: SellyLivePusherStats) { latestStats = stats runOnUiThread { updateStatsFromStats(stats) } } override fun onError(error: com.sellycloud.sellycloudsdk.SellyLiveError) { - runOnUiThread { Toast.makeText(this@LivePushActivity, error.message, Toast.LENGTH_SHORT).show() } + runOnUiThread { + latestLiveStatus = SellyLiveStatus.Failed + lastStateMessage = error.message + lastStateExtras = null + updateStatusPanel() + Toast.makeText(this@LivePushActivity, error.message, Toast.LENGTH_SHORT).show() + } + } + + override fun onAutoFramingStateChanged(state: com.sellycloud.sellycloudsdk.SellyLiveAutoFramingState) { + runOnUiThread { + autoFramingAvailable = state != com.sellycloud.sellycloudsdk.SellyLiveAutoFramingState.UNSUPPORTED + autoFramingEnabled = state != com.sellycloud.sellycloudsdk.SellyLiveAutoFramingState.OFF + && state != com.sellycloud.sellycloudsdk.SellyLiveAutoFramingState.UNSUPPORTED + renderToolStates() + } } } client.setMuted(isMuted) + applyFrameInterceptorState(client) } setupBeautyEngine(pusherClient) @@ -236,13 +304,21 @@ class LivePushActivity : AppCompatActivity() { try { val videoConfig = buildVideoConfig(settings) pusherClient.setVideoConfiguration(videoConfig) - pusherClient.attachPreview(binding.previewContainer) + pusherClient.attachPreview(binding.previewContainer, useTextureView) pusherClient.startRunning(currentFacing, videoConfig, null) applyStreamConfig(settings) } catch (t: Throwable) { Toast.makeText(this, "初始化预览失败: ${t.message}", Toast.LENGTH_LONG).show() } + // 延迟探测 Auto Framing 能力(需要相机已启动) + binding.previewContainer.postDelayed({ + val cap = pusherOrNull()?.getAutoFramingCapability() ?: return@postDelayed + debugLog("AutoFraming capability: supported=${cap.supported}, 
reason=${cap.reason}") + autoFramingAvailable = cap.supported + renderToolStates() + }, 1000) + startCpuLoop() } @@ -251,8 +327,8 @@ class LivePushActivity : AppCompatActivity() { val config = buildVideoConfig(settings) val (width, height) = resolveStreamSize(settings) pusher.setVideoConfiguration(config) - pusher.changeResolution(width, height) pusher.setStreamOrientation(streamOrientation) + pusher.changeResolution(width, height) } private fun resolveStreamSize(settings: AvDemoSettings): Pair { @@ -276,15 +352,16 @@ class LivePushActivity : AppCompatActivity() { } private fun onStateUpdated(state: SellyLiveStatus) { + latestLiveStatus = state isPublishing = state == SellyLiveStatus.Publishing || state == SellyLiveStatus.Connecting || state == SellyLiveStatus.Reconnecting binding.btnStartStopLive.text = getString(if (isPublishing) com.demo.SellyCloudSDK.R.string.push_stop_live else com.demo.SellyCloudSDK.R.string.push_start_live) updatePublishingUi() updateLayoutForOrientationAndState() - updateStreamOrientationUi() + updateStatusPanel() updateStatsFromStats(latestStats) - if (state == SellyLiveStatus.Stopped || state == SellyLiveStatus.Failed) { + if (state == SellyLiveStatus.Stopped) { navigateHomeAfterStop() } } @@ -369,6 +446,7 @@ class LivePushActivity : AppCompatActivity() { binding.tvStatsCpuSys.text = "Sys CPU: $percent%" binding.tvStatsCpuSys.setTextColor(ContextCompat.getColor(this@LivePushActivity, com.demo.SellyCloudSDK.R.color.av_stats_green)) updateStatsFromStats(latestStats) + updateFrameCallbackStats() delay(1000) } } @@ -448,6 +526,56 @@ class LivePushActivity : AppCompatActivity() { binding.actionBeauty.isEnabled = beautyAvailable binding.actionBeauty.alpha = if (beautyAvailable) 1f else 0.5f + val observeActive = frameInterceptorMode.isObserverMode + val modifyActive = frameInterceptorMode == FrameInterceptorMode.MODIFY + + val frameColor = if (observeActive) { + ContextCompat.getColor(this, com.demo.SellyCloudSDK.R.color.av_stats_green) + } 
else { + normal + } + binding.tvToolFrameCallbackLabel.text = if (observeActive) { + getString(frameInterceptorMode.toolLabelRes) + } else { + getString(com.demo.SellyCloudSDK.R.string.push_tool_frame_callback_off) + } + binding.ivToolFrameCallback.setColorFilter(frameColor) + binding.tvToolFrameCallbackLabel.setTextColor(frameColor) + + val modifyColor = if (modifyActive) { + ContextCompat.getColor(this, com.demo.SellyCloudSDK.R.color.av_stats_green) + } else { + normal + } + binding.tvToolFrameModifyLabel.setText( + if (modifyActive) { + com.demo.SellyCloudSDK.R.string.push_tool_frame_modify_on + } else { + com.demo.SellyCloudSDK.R.string.push_tool_frame_modify_off + } + ) + binding.ivToolFrameModify.setColorFilter(modifyColor) + binding.tvToolFrameModifyLabel.setTextColor(modifyColor) + + // Auto Framing + val afLabelRes = if (autoFramingEnabled) { + com.demo.SellyCloudSDK.R.string.push_tool_autoframing_on + } else { + com.demo.SellyCloudSDK.R.string.push_tool_autoframing_off + } + val afColor = if (!autoFramingAvailable) { + muted + } else if (autoFramingEnabled) { + ContextCompat.getColor(this, com.demo.SellyCloudSDK.R.color.av_stats_green) + } else { + normal + } + binding.tvToolAutoFramingLabel.setText(if (autoFramingAvailable) afLabelRes else com.demo.SellyCloudSDK.R.string.push_tool_not_supported) + binding.ivToolAutoFraming.setColorFilter(afColor) + binding.tvToolAutoFramingLabel.setTextColor(afColor) + binding.actionAutoFraming.isEnabled = autoFramingAvailable + binding.actionAutoFraming.alpha = if (autoFramingAvailable) 1f else 0.5f + val canSwitchCamera = videoSourceMode !is VideoSourceMode.Background binding.actionFlip.isEnabled = canSwitchCamera binding.actionFlip.alpha = if (canSwitchCamera) 1f else 0.4f @@ -468,6 +596,254 @@ class LivePushActivity : AppCompatActivity() { renderToolStates() } + private fun clearFrameObservers() { + frameObserverDisposables.forEach { disposable -> + runCatching { disposable.dispose() } + } + 
frameObserverDisposables.clear() + } + + private fun addFrameObserver( + pusher: SellyLiveVideoPusher, + observer: VideoFrameObserver + ) { + frameObserverDisposables += pusher.addVideoFrameObserver(observer) + } + + private fun toggleFrameInterceptor() { + frameInterceptorMode = when (frameInterceptorMode) { + FrameInterceptorMode.OFF -> FrameInterceptorMode.OBSERVE + FrameInterceptorMode.OBSERVE -> FrameInterceptorMode.CPU_EMPTY + FrameInterceptorMode.CPU_EMPTY -> FrameInterceptorMode.CPU_SINGLE + FrameInterceptorMode.CPU_SINGLE -> FrameInterceptorMode.CPU_DOUBLE + FrameInterceptorMode.CPU_DOUBLE -> FrameInterceptorMode.OFF + FrameInterceptorMode.MODIFY -> FrameInterceptorMode.OBSERVE + } + resetFrameCallbackWindow(if (frameInterceptorMode == FrameInterceptorMode.OFF) "off" else frameInterceptorMode.label) + applyFrameInterceptorState() + } + + private fun toggleFrameModify() { + frameInterceptorMode = if (frameInterceptorMode == FrameInterceptorMode.MODIFY) { + FrameInterceptorMode.OFF + } else { + FrameInterceptorMode.MODIFY + } + resetFrameCallbackWindow(if (frameInterceptorMode == FrameInterceptorMode.OFF) "off" else frameInterceptorMode.label) + applyFrameInterceptorState() + } + + private fun applyFrameInterceptorState(pusher: SellyLiveVideoPusher? 
= pusherOrNull()) { + val activePusher = pusher ?: return + clearFrameObservers() + if (frameInterceptorMode == FrameInterceptorMode.OFF) { + runCatching { activePusher.setVideoProcessor(null) } + resetFrameCallbackWindow("off") + updateFrameCallbackStats() + renderToolStates() + return + } + + resetFrameCallbackWindow(frameInterceptorMode.label) + + val ok = runCatching { + activePusher.setVideoProcessor(null) + when (frameInterceptorMode) { + FrameInterceptorMode.OFF -> Unit + FrameInterceptorMode.OBSERVE -> { + addFrameObserver(activePusher, object : VideoFrameObserver { + override val config: VideoFrameObserverConfig = VideoFrameObserverConfig( + preferredFormat = VideoProcessFormat.TEXTURE_2D + ) + + override fun onTextureFrame(frame: VideoTextureFrame) { + val callbackStartNs = System.nanoTime() + val mode = frameInterceptorMode + val bufferKind = "GL_TEXTURE_2D" + val now = SystemClock.elapsedRealtime() + val totalNs = System.nanoTime() - callbackStartNs + recordFrameCallbackMetrics(totalNs, 0L) + flushFrameCallbackWindowIfNeeded( + mode = mode, + nowMs = now, + fps = frameCallbackCounter.incrementAndGet(), + width = frame.width, + height = frame.height, + rotation = frame.rotation, + bufferKind = bufferKind, + patchLabel = null + ) + } + }) + } + + FrameInterceptorMode.CPU_EMPTY -> { + addFrameObserver(activePusher, object : VideoFrameObserver { + override val config: VideoFrameObserverConfig = VideoFrameObserverConfig( + preferredFormat = VideoProcessFormat.I420 + ) + }) + } + + FrameInterceptorMode.CPU_SINGLE -> { + addFrameObserver(activePusher, object : VideoFrameObserver { + override val config: VideoFrameObserverConfig = VideoFrameObserverConfig( + preferredFormat = VideoProcessFormat.I420 + ) + + override fun onFrame(frame: SellyVideoFrame) { + recordCpuObserverFrame(mode = frameInterceptorMode, frame = frame) + } + }) + } + + FrameInterceptorMode.CPU_DOUBLE -> { + addFrameObserver(activePusher, object : VideoFrameObserver { + override val config: 
VideoFrameObserverConfig = VideoFrameObserverConfig( + preferredFormat = VideoProcessFormat.I420 + ) + + override fun onFrame(frame: SellyVideoFrame) { + recordCpuObserverFrame(mode = frameInterceptorMode, frame = frame) + } + }) + addFrameObserver(activePusher, object : VideoFrameObserver { + override val config: VideoFrameObserverConfig = VideoFrameObserverConfig( + preferredFormat = VideoProcessFormat.I420 + ) + + override fun onFrame(frame: SellyVideoFrame) { + // Intentionally empty. This verifies that multiple CPU observers + // sharing the same format do not force duplicate conversion work. + } + }) + } + + FrameInterceptorMode.MODIFY -> { + if (args.liveMode == SellyLiveMode.RTC) { + activePusher.setVideoProcessor(object : VideoProcessor { + override val config: VideoProcessorConfig = VideoProcessorConfig( + preferredFormat = VideoProcessFormat.TEXTURE_2D, + mode = VideoProcessMode.READ_WRITE + ) + + override fun onGlContextDestroyed() { + diagnosticTextureOverlayRenderer?.release() + diagnosticTextureOverlayRenderer = null + } + + override fun processTexture(input: VideoTextureFrame, outputTextureId: Int) { + val callbackStartNs = System.nanoTime() + val mode = frameInterceptorMode + var patchLabel: String? 
= null + val now = SystemClock.elapsedRealtime() + if (outputTextureId > 0) { + runCatching { + val overlay = obtainDiagnosticWatermarkOverlay(input.width, input.height) + obtainDiagnosticTextureOverlayRenderer().renderOverlay( + outputTextureId = outputTextureId, + frameWidth = input.width, + frameHeight = input.height, + overlay = overlay + ) + patchLabel = overlay.label + }.onFailure { + debugLog("Frame texture modify failed: ${it.message}") + } + } + val totalNs = System.nanoTime() - callbackStartNs + recordFrameCallbackMetrics(totalNs, totalNs) + flushFrameCallbackWindowIfNeeded( + mode = mode, + nowMs = now, + fps = frameCallbackCounter.incrementAndGet(), + width = input.width, + height = input.height, + rotation = input.rotation, + bufferKind = "GL_TEXTURE_2D", + patchLabel = patchLabel + ) + } + }) + } else { + activePusher.setVideoProcessor(object : VideoProcessor { + override val config: VideoProcessorConfig = VideoProcessorConfig( + preferredFormat = VideoProcessFormat.RGBA, + mode = VideoProcessMode.READ_WRITE + ) + + override fun processCpu(frame: SellyVideoFrame): SellyVideoFrame? { + val callbackStartNs = System.nanoTime() + val mode = frameInterceptorMode + var editNs = 0L + var patchLabel: String? 
= null + var bufferKind = defaultFrameBufferKind(frame) + val now = SystemClock.elapsedRealtime() + val result = runCatching { createDiagnosticFrame(frame) } + .onFailure { debugLog("Frame modify failed: ${it.message}") } + .getOrNull() + ?.also { + editNs = it.editNs + patchLabel = it.patchLabel + bufferKind = it.bufferKind + } + ?.frame + val totalNs = System.nanoTime() - callbackStartNs + recordFrameCallbackMetrics(totalNs, editNs) + flushFrameCallbackWindowIfNeeded( + mode = mode, + nowMs = now, + fps = frameCallbackCounter.incrementAndGet(), + width = frame.width, + height = frame.height, + rotation = frame.rotation, + bufferKind = bufferKind, + patchLabel = patchLabel + ) + return result + } + }) + } + } + } + }.isSuccess + + if (!ok) { + frameInterceptorMode = FrameInterceptorMode.OFF + resetFrameCallbackWindow("off") + Toast.makeText(this, com.demo.SellyCloudSDK.R.string.push_tool_not_supported, Toast.LENGTH_SHORT).show() + } + updateFrameCallbackStats() + renderToolStates() + } + + private fun defaultFrameBufferKind(frame: SellyVideoFrame): String { + return frame.buffer.javaClass.simpleName.ifBlank { + frame.buffer.javaClass.name.substringAfterLast('.') + } + } + + private fun recordCpuObserverFrame( + mode: FrameInterceptorMode, + frame: SellyVideoFrame + ) { + val callbackStartNs = System.nanoTime() + val now = SystemClock.elapsedRealtime() + val totalNs = System.nanoTime() - callbackStartNs + val bufferKind = defaultFrameBufferKind(frame) + recordFrameCallbackMetrics(totalNs, 0L) + flushFrameCallbackWindowIfNeeded( + mode = mode, + nowMs = now, + fps = frameCallbackCounter.incrementAndGet(), + width = frame.width, + height = frame.height, + rotation = frame.rotation, + bufferKind = bufferKind, + patchLabel = null + ) + } + private fun toggleBeauty() { val pusher = pusherOrNull() ?: return if (beautyEngine == null) { @@ -489,6 +865,28 @@ class LivePushActivity : AppCompatActivity() { renderToolStates() } + private fun toggleAutoFraming() { + val pusher 
= pusherOrNull() ?: return + val cap = pusher.getAutoFramingCapability() + debugLog("AutoFraming toggle: supported=${cap.supported}, reason=${cap.reason}") + if (!cap.supported) { + val reason = cap.reason?.name ?: "UNKNOWN" + Toast.makeText(this, "Auto Framing 不可用: $reason", Toast.LENGTH_SHORT).show() + autoFramingAvailable = false + renderToolStates() + return + } + val target = !autoFramingEnabled + val ok = pusher.setAutoFramingEnabled(target) + if (!ok) { + Toast.makeText(this, "Auto Framing 开启失败", Toast.LENGTH_SHORT).show() + return + } + autoFramingEnabled = target + autoFramingAvailable = true + renderToolStates() + } + private fun toggleStreamOrientation() { if (isPublishing) return streamOrientation = if (streamOrientation == SellyLiveOrientation.PORTRAIT) { @@ -505,14 +903,8 @@ class LivePushActivity : AppCompatActivity() { private fun updateStreamOrientationUi() { val isPortrait = streamOrientation == SellyLiveOrientation.PORTRAIT - val labelRes = if (isPortrait) { - com.demo.SellyCloudSDK.R.string.push_stream_portrait - } else { - com.demo.SellyCloudSDK.R.string.push_stream_landscape - } binding.btnQuickOrientation.rotation = if (isPortrait) 0f else 90f - val protocolLabel = if (args.liveMode == SellyLiveMode.RTC) "rtc" else "rtmp" - binding.tvStatsProtocol.text = "$protocolLabel | ${getString(labelRes)}" + updateStatusPanel() } private fun applyUiOrientation() { @@ -836,6 +1228,7 @@ class LivePushActivity : AppCompatActivity() { companion object { const val EXTRA_LIVE_MODE = "push_live_mode" + private const val TAG = "LivePushActivity" fun createIntent(context: Context, liveMode: SellyLiveMode): Intent = Intent(context, LivePushActivity::class.java) @@ -854,6 +1247,602 @@ class LivePushActivity : AppCompatActivity() { private fun pusherOrNull(): SellyLiveVideoPusher? 
= if (this::pusherClient.isInitialized) pusherClient else null + private fun updateStatusPanel() { + val protocolLabel = if (args.liveMode == SellyLiveMode.RTC) "rtc" else "rtmp" + val orientationLabel = getString( + if (streamOrientation == SellyLiveOrientation.PORTRAIT) { + com.demo.SellyCloudSDK.R.string.push_stream_portrait + } else { + com.demo.SellyCloudSDK.R.string.push_stream_landscape + } + ) + binding.tvStatsProtocol.text = "$protocolLabel | $orientationLabel | ${stateLabel(latestLiveStatus)}" + + val parts = mutableListOf() + // Use unified stateLabel instead of raw SDK message (which differs between RTMP/RTC) + parts += stateLabel(latestLiveStatus) + lastStateExtras?.getString("phase")?.takeIf { it.isNotBlank() }?.let { parts += "phase=$it" } + lastStateExtras?.getNumber("attempt")?.toInt()?.takeIf { it > 0 }?.let { parts += "attempt=$it" } + lastStateExtras?.getNumber("elapsed_ms")?.toLong()?.takeIf { it > 0 }?.let { parts += "elapsed=${it}ms" } + val detail = parts.joinToString(" • ") + + if (detail.isBlank()) { + binding.tvStatsDetail.visibility = View.GONE + } else { + binding.tvStatsDetail.visibility = View.VISIBLE + binding.tvStatsDetail.text = detail + val colorRes = when (latestLiveStatus) { + SellyLiveStatus.Failed -> com.demo.SellyCloudSDK.R.color.av_stats_red + SellyLiveStatus.Reconnecting -> com.demo.SellyCloudSDK.R.color.av_stats_yellow + SellyLiveStatus.Publishing -> com.demo.SellyCloudSDK.R.color.av_stats_green + else -> com.demo.SellyCloudSDK.R.color.brand_primary_text_sub + } + binding.tvStatsDetail.setTextColor(ContextCompat.getColor(this, colorRes)) + } + } + + private fun updateFrameCallbackStats() { + val text = when (frameInterceptorMode) { + FrameInterceptorMode.OFF -> + getString(com.demo.SellyCloudSDK.R.string.live_stats_frame_callback_off) + FrameInterceptorMode.MODIFY -> + getString( + com.demo.SellyCloudSDK.R.string.live_stats_frame_callback_modify, + latestFrameCallbackFps, + lastFrameCallbackMeta + ) + else -> + getString( 
+ com.demo.SellyCloudSDK.R.string.live_stats_frame_callback_generic, + frameInterceptorMode.label, + latestFrameCallbackFps, + lastFrameCallbackMeta + ) + } + val colorRes = when { + frameInterceptorMode == FrameInterceptorMode.OFF -> com.demo.SellyCloudSDK.R.color.brand_primary_text_sub + latestFrameCallbackFps > 0 -> com.demo.SellyCloudSDK.R.color.av_stats_green + else -> com.demo.SellyCloudSDK.R.color.av_stats_yellow + } + binding.tvStatsFrameCallback.text = text + binding.tvStatsFrameCallback.setTextColor(ContextCompat.getColor(this, colorRes)) + } + + private fun resetFrameCallbackWindow(meta: String) { + frameCallbackCounter.set(0) + frameCallbackTimingSamples.set(0) + frameCallbackTotalNs.set(0L) + frameCallbackEditNs.set(0L) + frameCallbackSlowCount.set(0) + frameCallbackMaxNs.set(0L) + latestFrameCallbackFps = 0 + lastFrameCallbackMeta = meta + lastFrameCallbackLogAtMs = 0L + } + + private fun recordFrameCallbackMetrics(totalNs: Long, editNs: Long) { + frameCallbackTimingSamples.incrementAndGet() + frameCallbackTotalNs.addAndGet(totalNs) + frameCallbackEditNs.addAndGet(editNs) + if (totalNs >= 33_000_000L) { + frameCallbackSlowCount.incrementAndGet() + } + while (true) { + val current = frameCallbackMaxNs.get() + if (totalNs <= current || frameCallbackMaxNs.compareAndSet(current, totalNs)) break + } + } + + private fun flushFrameCallbackWindowIfNeeded( + mode: FrameInterceptorMode, + nowMs: Long, + fps: Int, + width: Int, + height: Int, + rotation: Int, + bufferKind: String, + patchLabel: String? 
+ ) { + val lastLogAt = lastFrameCallbackLogAtMs + if (lastLogAt == 0L) { + lastFrameCallbackLogAtMs = nowMs + return + } + if (nowMs - lastLogAt < 1000L) return + + lastFrameCallbackLogAtMs = nowMs + latestFrameCallbackFps = fps + frameCallbackCounter.set(0) + + val samples = frameCallbackTimingSamples.getAndSet(0).coerceAtLeast(1) + val avgTotalMs = nanosToMillis(frameCallbackTotalNs.getAndSet(0L) / samples) + val avgEditMs = nanosToMillis(frameCallbackEditNs.getAndSet(0L) / samples) + val maxMs = nanosToMillis(frameCallbackMaxNs.getAndSet(0L)) + val slowCount = frameCallbackSlowCount.getAndSet(0) + val patchSuffix = patchLabel?.let { " patch=$it" }.orEmpty() + + lastFrameCallbackMeta = + "${mode.label} ${width}x${height} rot=$rotation avg=${"%.1f".format(avgTotalMs)}ms " + + "edit=${"%.1f".format(avgEditMs)}ms buf=$bufferKind$patchSuffix" + + debugLog( + "Frame callback protocol=${args.liveMode} testMode=${mode.label} fps=$fps " + + "size=${width}x${height} rotation=$rotation buffer=$bufferKind$patchSuffix " + + "avgTotal=${"%.2f".format(avgTotalMs)}ms avgEdit=${"%.2f".format(avgEditMs)}ms " + + "max=${"%.2f".format(maxMs)}ms slow33ms=$slowCount" + ) + runOnUiThread { updateFrameCallbackStats() } + } + + private fun createDiagnosticFrame(frame: SellyVideoFrame): ModifiedFrameTrace? 
{ + val buffer = frame.buffer + val bufferKind = buffer.javaClass.simpleName.ifBlank { + buffer.javaClass.name.substringAfterLast('.') + } + if (buffer.width == 0 || buffer.height == 0) return null + + val editStartNs = System.nanoTime() + val patchLabel = when (buffer) { + is SellyVideoFrame.RgbaBuffer -> applyDiagnosticPatch(buffer) + is SellyVideoFrame.I420Buffer -> applyDiagnosticPatch(buffer) + else -> return null + } + val editNs = System.nanoTime() - editStartNs + + return ModifiedFrameTrace( + frame = frame, + bufferKind = bufferKind, + patchLabel = patchLabel, + editNs = editNs + ) + } + + private fun applyDiagnosticPatch(i420: SellyVideoFrame.I420Buffer): String { + + val overlay = obtainDiagnosticWatermarkOverlay(i420.width, i420.height) + val startX = (i420.width - overlay.margin - overlay.width).coerceAtLeast(0).let { + if (it % 2 == 0) it else (it - 1).coerceAtLeast(0) + } + val startY = overlay.margin.coerceAtLeast(0).let { + if (it % 2 == 0) it else it - 1 + } + val yPlane = i420.dataY + val uPlane = i420.dataU + val vPlane = i420.dataV + + for (row in 0 until overlay.height) { + val dstRowBase = (startY + row) * i420.strideY + startX + val srcRowBase = row * overlay.width + for (col in 0 until overlay.width) { + val idx = srcRowBase + col + if (overlay.alphaMask[idx].toInt() != 0) { + yPlane.put(dstRowBase + col, overlay.yPlane[idx]) + } + } + } + + val uvStartX = startX / 2 + val uvStartY = startY / 2 + for (row in 0 until overlay.uvHeight) { + val dstURowBase = (uvStartY + row) * i420.strideU + uvStartX + val dstVRowBase = (uvStartY + row) * i420.strideV + uvStartX + val srcRowBase = row * overlay.uvWidth + for (col in 0 until overlay.uvWidth) { + if (overlay.uvMask[srcRowBase + col].toInt() != 0) { + uPlane.put(dstURowBase + col, 128.toByte()) + vPlane.put(dstVRowBase + col, 128.toByte()) + } + } + } + return overlay.label + } + + private fun applyDiagnosticPatch(rgba: SellyVideoFrame.RgbaBuffer): String { + val overlay = 
obtainDiagnosticWatermarkOverlay(rgba.width, rgba.height) + val startX = (rgba.width - overlay.margin - overlay.width).coerceAtLeast(0) + val startY = overlay.margin.coerceAtLeast(0) + val rgbaData = rgba.data + val overlayBytes = overlay.rgbaBytes + + for (row in 0 until overlay.height) { + val dstRowBase = (startY + row) * rgba.stride + startX * 4 + val srcRowBase = row * overlay.width + for (col in 0 until overlay.width) { + val idx = srcRowBase + col + if (overlay.alphaMask[idx].toInt() == 0) continue + val srcOffset = idx * 4 + val dstOffset = dstRowBase + col * 4 + rgbaData.put(dstOffset, overlayBytes[srcOffset]) + rgbaData.put(dstOffset + 1, overlayBytes[srcOffset + 1]) + rgbaData.put(dstOffset + 2, overlayBytes[srcOffset + 2]) + rgbaData.put(dstOffset + 3, overlayBytes[srcOffset + 3]) + } + } + return overlay.label + } + + private fun obtainDiagnosticWatermarkOverlay(frameWidth: Int, frameHeight: Int): DiagnosticWatermarkOverlay { + val cached = diagnosticWatermarkOverlay + if (cached != null && cached.frameWidth == frameWidth && cached.frameHeight == frameHeight) { + return cached + } + return createDiagnosticWatermarkOverlay(frameWidth, frameHeight).also { + diagnosticWatermarkOverlay = it + } + } + + private fun createDiagnosticWatermarkOverlay(frameWidth: Int, frameHeight: Int): DiagnosticWatermarkOverlay { + val margin = (min(frameWidth, frameHeight) / 36).coerceIn(8, 24) + val maxWidth = (frameWidth - margin * 2).coerceAtLeast(32) + val maxHeight = (frameHeight - margin * 2).coerceAtLeast(24) + val overlayWidth = if (maxWidth <= 120) maxWidth else (frameWidth / 5).coerceIn(120, maxWidth) + val overlayHeight = if (maxHeight <= 42) maxHeight else (overlayWidth * 0.32f).roundToInt().coerceIn(42, maxHeight) + + val bitmap = Bitmap.createBitmap(overlayWidth, overlayHeight, Bitmap.Config.ARGB_8888) + val canvas = Canvas(bitmap) + val backgroundPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply { + color = Color.argb(215, 0, 0, 0) + style = Paint.Style.FILL + } + 
val strokePaint = Paint(Paint.ANTI_ALIAS_FLAG).apply { + color = Color.argb(235, 255, 255, 255) + style = Paint.Style.STROKE + strokeWidth = (overlayHeight * 0.035f).coerceAtLeast(1.5f) + } + val textPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply { + color = Color.WHITE + textSize = overlayHeight * 0.34f + isFakeBoldText = true + textAlign = Paint.Align.CENTER + } + val radius = overlayHeight * 0.22f + val bounds = RectF(0f, 0f, overlayWidth.toFloat(), overlayHeight.toFloat()) + canvas.drawRoundRect(bounds, radius, radius, backgroundPaint) + canvas.drawRoundRect(bounds, radius, radius, strokePaint) + + val baseline = overlayHeight / 2f - (textPaint.ascent() + textPaint.descent()) / 2f + canvas.drawText("SDK TEST", overlayWidth / 2f, baseline, textPaint) + + val pixels = IntArray(overlayWidth * overlayHeight) + bitmap.getPixels(pixels, 0, overlayWidth, 0, 0, overlayWidth, overlayHeight) + bitmap.recycle() + + val rgbaBytes = ByteArray(overlayWidth * overlayHeight * 4) + val yPlane = ByteArray(overlayWidth * overlayHeight) + val alphaMask = ByteArray(overlayWidth * overlayHeight) + for (index in pixels.indices) { + val color = pixels[index] + val alpha = Color.alpha(color) + val red = Color.red(color) + val green = Color.green(color) + val blue = Color.blue(color) + val rgbaOffset = index * 4 + rgbaBytes[rgbaOffset] = red.toByte() + rgbaBytes[rgbaOffset + 1] = green.toByte() + rgbaBytes[rgbaOffset + 2] = blue.toByte() + rgbaBytes[rgbaOffset + 3] = alpha.toByte() + if (alpha <= 12) continue + alphaMask[index] = 1 + yPlane[index] = rgbToYByte(red, green, blue) + } + + val uvWidth = (overlayWidth + 1) / 2 + val uvHeight = (overlayHeight + 1) / 2 + val uvMask = ByteArray(uvWidth * uvHeight) + for (row in 0 until overlayHeight step 2) { + for (col in 0 until overlayWidth step 2) { + val idx0 = row * overlayWidth + col + val idx1 = idx0 + 1 + val idx2 = if (row + 1 < overlayHeight) idx0 + overlayWidth else idx0 + val idx3 = if (row + 1 < overlayHeight && col + 1 < 
overlayWidth) idx2 + 1 else idx2
+                val covered =
+                    alphaMask[idx0].toInt() != 0 ||
+                        (col + 1 < overlayWidth && alphaMask[idx1].toInt() != 0) ||
+                        (row + 1 < overlayHeight && alphaMask[idx2].toInt() != 0) ||
+                        (row + 1 < overlayHeight && col + 1 < overlayWidth && alphaMask[idx3].toInt() != 0)
+                if (covered) {
+                    uvMask[(row / 2) * uvWidth + (col / 2)] = 1
+                }
+            }
+        }
+
+        return DiagnosticWatermarkOverlay(
+            frameWidth = frameWidth,
+            frameHeight = frameHeight,
+            width = overlayWidth,
+            height = overlayHeight,
+            uvWidth = uvWidth,
+            uvHeight = uvHeight,
+            margin = margin,
+            label = "wm=${overlayWidth}x${overlayHeight}",
+            rgbaBytes = rgbaBytes,
+            yPlane = yPlane,
+            alphaMask = alphaMask,
+            uvMask = uvMask
+        )
+    }
+
+    private fun obtainDiagnosticTextureOverlayRenderer(): DiagnosticTextureOverlayRenderer {
+        return diagnosticTextureOverlayRenderer ?: DiagnosticTextureOverlayRenderer().also {
+            diagnosticTextureOverlayRenderer = it
+        }
+    }
+
+    private fun nanosToMillis(nanos: Long): Double = nanos / 1_000_000.0
+
+    private fun rgbToYByte(red: Int, green: Int, blue: Int): Byte {
+        val y = ((66 * red + 129 * green + 25 * blue + 128) shr 8) + 16
+        return y.coerceIn(16, 235).toByte()
+    }
+
+    private data class ModifiedFrameTrace(
+        val frame: SellyVideoFrame,
+        val bufferKind: String,
+        val patchLabel: String,
+        val editNs: Long
+    )
+
+    private data class DiagnosticWatermarkOverlay(
+        val frameWidth: Int,
+        val frameHeight: Int,
+        val width: Int,
+        val height: Int,
+        val uvWidth: Int,
+        val uvHeight: Int,
+        val margin: Int,
+        val label: String,
+        val rgbaBytes: ByteArray,
+        val yPlane: ByteArray,
+        val alphaMask: ByteArray,
+        val uvMask: ByteArray
+    )
+
+    private class DiagnosticTextureOverlayRenderer {
+        private var program = 0
+        private var framebuffer = 0
+        private var overlayTexture = 0
+        private var positionLoc = -1
+        private var texCoordLoc = -1
+        private var textureLoc = -1
+        private var quadBuffer: FloatBuffer? = null
+        private var uploadedFrameWidth = 0
+        private var uploadedFrameHeight = 0
+
+        fun renderOverlay(
+            outputTextureId: Int,
+            frameWidth: Int,
+            frameHeight: Int,
+            overlay: DiagnosticWatermarkOverlay
+        ) {
+            if (outputTextureId <= 0 || frameWidth <= 0 || frameHeight <= 0) return
+            ensureGlResources()
+            ensureOverlayTexture(overlay)
+            val quad = quadBuffer ?: return
+            if (program <= 0 || framebuffer <= 0 || overlayTexture <= 0) return
+
+            val leftPx = (frameWidth - overlay.margin - overlay.width).coerceAtLeast(0)
+            val topPx = overlay.margin.coerceAtLeast(0)
+            val rightPx = (leftPx + overlay.width).coerceAtMost(frameWidth)
+            val bottomPx = (topPx + overlay.height).coerceAtMost(frameHeight)
+            val left = leftPx.toFloat() / frameWidth * 2f - 1f
+            val right = rightPx.toFloat() / frameWidth * 2f - 1f
+            val top = 1f - topPx.toFloat() / frameHeight * 2f
+            val bottom = 1f - bottomPx.toFloat() / frameHeight * 2f
+            val vertices = floatArrayOf(
+                left, bottom, 0f, 1f,
+                right, bottom, 1f, 1f,
+                left, top, 0f, 0f,
+                right, top, 1f, 0f
+            )
+            quad.clear()
+            quad.put(vertices)
+            quad.position(0)
+
+            val previousFramebuffer = IntArray(1)
+            val previousViewport = IntArray(4)
+            val blendEnabled = GLES20.glIsEnabled(GLES20.GL_BLEND)
+            GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, previousFramebuffer, 0)
+            GLES20.glGetIntegerv(GLES20.GL_VIEWPORT, previousViewport, 0)
+
+            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer)
+            GLES20.glFramebufferTexture2D(
+                GLES20.GL_FRAMEBUFFER,
+                GLES20.GL_COLOR_ATTACHMENT0,
+                GLES20.GL_TEXTURE_2D,
+                outputTextureId,
+                0
+            )
+            GLES20.glViewport(0, 0, frameWidth, frameHeight)
+            GLES20.glEnable(GLES20.GL_BLEND)
+            GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA)
+            GLES20.glUseProgram(program)
+            quad.position(0)
+            GLES20.glVertexAttribPointer(positionLoc, 2, GLES20.GL_FLOAT, false, 16, quad)
+            GLES20.glEnableVertexAttribArray(positionLoc)
+            quad.position(2)
+            GLES20.glVertexAttribPointer(texCoordLoc, 2, GLES20.GL_FLOAT, false, 16, quad)
+            GLES20.glEnableVertexAttribArray(texCoordLoc)
+            GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
+            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, overlayTexture)
+            GLES20.glUniform1i(textureLoc, 0)
+            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4)
+            GLES20.glDisableVertexAttribArray(positionLoc)
+            GLES20.glDisableVertexAttribArray(texCoordLoc)
+            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0)
+            GLES20.glUseProgram(0)
+            if (!blendEnabled) {
+                GLES20.glDisable(GLES20.GL_BLEND)
+            }
+            GLES20.glFramebufferTexture2D(
+                GLES20.GL_FRAMEBUFFER,
+                GLES20.GL_COLOR_ATTACHMENT0,
+                GLES20.GL_TEXTURE_2D,
+                0,
+                0
+            )
+            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, previousFramebuffer[0])
+            GLES20.glViewport(
+                previousViewport[0],
+                previousViewport[1],
+                previousViewport[2],
+                previousViewport[3]
+            )
+        }
+
+        fun release() {
+            if (program > 0) {
+                GLES20.glDeleteProgram(program)
+                program = 0
+            }
+            if (framebuffer > 0) {
+                GLES20.glDeleteFramebuffers(1, intArrayOf(framebuffer), 0)
+                framebuffer = 0
+            }
+            if (overlayTexture > 0) {
+                GLES20.glDeleteTextures(1, intArrayOf(overlayTexture), 0)
+                overlayTexture = 0
+            }
+            quadBuffer = null
+            uploadedFrameWidth = 0
+            uploadedFrameHeight = 0
+        }
+
+        private fun ensureGlResources() {
+            if (program <= 0) {
+                program = createProgram(VERTEX_SHADER, FRAGMENT_SHADER)
+                if (program > 0) {
+                    positionLoc = GLES20.glGetAttribLocation(program, "aPosition")
+                    texCoordLoc = GLES20.glGetAttribLocation(program, "aTextureCoord")
+                    textureLoc = GLES20.glGetUniformLocation(program, "uTexture")
+                }
+            }
+            if (framebuffer <= 0) {
+                val framebuffers = IntArray(1)
+                GLES20.glGenFramebuffers(1, framebuffers, 0)
+                framebuffer = framebuffers[0]
+            }
+            if (overlayTexture <= 0) {
+                val textures = IntArray(1)
+                GLES20.glGenTextures(1, textures, 0)
+                overlayTexture = textures[0]
+                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, overlayTexture)
+                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
+                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)
+                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE)
+                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE)
+                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0)
+            }
+            if (quadBuffer == null) {
+                quadBuffer = ByteBuffer.allocateDirect(16 * 4)
+                    .order(ByteOrder.nativeOrder())
+                    .asFloatBuffer()
+            }
+        }
+
+        private fun ensureOverlayTexture(overlay: DiagnosticWatermarkOverlay) {
+            if (
+                overlay.frameWidth == uploadedFrameWidth &&
+                overlay.frameHeight == uploadedFrameHeight
+            ) {
+                return
+            }
+            val rgbaBuffer = ByteBuffer.allocateDirect(overlay.rgbaBytes.size)
+                .order(ByteOrder.nativeOrder())
+            rgbaBuffer.put(overlay.rgbaBytes)
+            rgbaBuffer.position(0)
+            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, overlayTexture)
+            GLES20.glTexImage2D(
+                GLES20.GL_TEXTURE_2D,
+                0,
+                GLES20.GL_RGBA,
+                overlay.width,
+                overlay.height,
+                0,
+                GLES20.GL_RGBA,
+                GLES20.GL_UNSIGNED_BYTE,
+                rgbaBuffer
+            )
+            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0)
+            uploadedFrameWidth = overlay.frameWidth
+            uploadedFrameHeight = overlay.frameHeight
+        }
+
+        private fun createProgram(vertexSource: String, fragmentSource: String): Int {
+            val vertexShader = compileShader(GLES20.GL_VERTEX_SHADER, vertexSource)
+            val fragmentShader = compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource)
+            if (vertexShader <= 0 || fragmentShader <= 0) {
+                if (vertexShader > 0) GLES20.glDeleteShader(vertexShader)
+                if (fragmentShader > 0) GLES20.glDeleteShader(fragmentShader)
+                return 0
+            }
+            val shaderProgram = GLES20.glCreateProgram()
+            if (shaderProgram <= 0) return 0
+            GLES20.glAttachShader(shaderProgram, vertexShader)
+            GLES20.glAttachShader(shaderProgram, fragmentShader)
+            GLES20.glLinkProgram(shaderProgram)
+            val status = IntArray(1)
+            GLES20.glGetProgramiv(shaderProgram, GLES20.GL_LINK_STATUS, status, 0)
+            GLES20.glDeleteShader(vertexShader)
+            GLES20.glDeleteShader(fragmentShader)
+            if (status[0] != GLES20.GL_TRUE) {
+                GLES20.glDeleteProgram(shaderProgram)
+                return 0
+            }
+            return shaderProgram
+        }
+
+        private fun compileShader(type: Int, source: String): Int {
+            val shader = GLES20.glCreateShader(type)
+            if (shader <= 0) return 0
+            GLES20.glShaderSource(shader, source)
+            GLES20.glCompileShader(shader)
+            val status = IntArray(1)
+            GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0)
+            if (status[0] != GLES20.GL_TRUE) {
+                GLES20.glDeleteShader(shader)
+                return 0
+            }
+            return shader
+        }
+
+        companion object {
+            private const val VERTEX_SHADER = """
+                attribute vec4 aPosition;
+                attribute vec2 aTextureCoord;
+                varying vec2 vTextureCoord;
+                void main() {
+                    gl_Position = aPosition;
+                    vTextureCoord = aTextureCoord;
+                }
+            """
+
+            private const val FRAGMENT_SHADER = """
+                precision mediump float;
+                uniform sampler2D uTexture;
+                varying vec2 vTextureCoord;
+                void main() {
+                    gl_FragColor = texture2D(uTexture, vTextureCoord);
+                }
+            """
+        }
+    }
+
+    private fun stateLabel(status: SellyLiveStatus): String = when (status) {
+        SellyLiveStatus.Idle -> "空闲"
+        SellyLiveStatus.Connecting -> "连接中"
+        SellyLiveStatus.Publishing -> "推流中"
+        SellyLiveStatus.Reconnecting -> "重连中"
+        SellyLiveStatus.Stopped -> "已停止"
+        SellyLiveStatus.Failed -> "失败"
+    }
+
+    private fun debugLog(message: String) {
+        if (logEnabled) android.util.Log.d(TAG, message)
+    }
+
+    @Suppress("DEPRECATION")
+    private fun Bundle.getNumber(key: String): Number? = get(key) as? Number
+
     private data class Args(val liveMode: SellyLiveMode) {
         companion object {
             fun from(intent: Intent, defaultMode: SellyLiveMode): Args {
@@ -869,4 +1858,19 @@ class LivePushActivity : AppCompatActivity() {
         object CameraOff : VideoSourceMode()
         data class Background(val uri: Uri) : VideoSourceMode()
     }
+
+    private enum class FrameInterceptorMode(
+        val label: String,
+        val toolLabelRes: Int
+    ) {
+        OFF("off", com.demo.SellyCloudSDK.R.string.push_tool_frame_callback_off),
+        OBSERVE("observe", com.demo.SellyCloudSDK.R.string.push_tool_frame_callback_texture),
+        CPU_EMPTY("cpu-empty", com.demo.SellyCloudSDK.R.string.push_tool_frame_callback_cpu_empty),
+        CPU_SINGLE("cpu-single", com.demo.SellyCloudSDK.R.string.push_tool_frame_callback_cpu_single),
+        CPU_DOUBLE("cpu-double", com.demo.SellyCloudSDK.R.string.push_tool_frame_callback_cpu_double),
+        MODIFY("modify", com.demo.SellyCloudSDK.R.string.push_tool_frame_modify_on);
+
+        val isObserverMode: Boolean
+            get() = this == OBSERVE || this == CPU_EMPTY || this == CPU_SINGLE || this == CPU_DOUBLE
+    }
 }
diff --git a/example/src/main/java/com/demo/SellyCloudSDK/live/PkPlayActivity.kt b/example/src/main/java/com/demo/SellyCloudSDK/live/PkPlayActivity.kt
index d9e8058..ca05e40 100644
--- a/example/src/main/java/com/demo/SellyCloudSDK/live/PkPlayActivity.kt
+++ b/example/src/main/java/com/demo/SellyCloudSDK/live/PkPlayActivity.kt
@@ -82,6 +82,7 @@ class PkPlayActivity : AppCompatActivity() {
     // Shared state
     private var isMuted: Boolean = false
     private var hasReleasedPlayers: Boolean = false
+    private var logEnabled: Boolean = true
 
     // Log system
     private val logLines: ArrayDeque<String> = ArrayDeque()
@@ -100,6 +101,7 @@
 
         envStore = LiveEnvSettingsStore(this)
         val env = envStore.read().also { it.applyToSdkRuntimeConfig(this) }
+        logEnabled = env.logEnabled
         args = Args.from(intent, env) ?: run {
             Toast.makeText(this, "缺少 PK 播放参数", Toast.LENGTH_SHORT).show()
             finish()
@@ -109,7 +111,7 @@ class PkPlayActivity : AppCompatActivity() {
         binding.tvMainStreamName.text = args.mainStreamName
         binding.tvPkStreamName.text = args.pkStreamName
 
-        Log.d(TAG, "初始化主播放器:streamId=${args.mainStreamName}, 协议: RTC")
+        debugLog("初始化主播放器:streamId=${args.mainStreamName}, 协议: RTC")
         mainPlayer = SellyLiveVideoPlayer.initWithStreamId(
             this,
             args.mainStreamName,
@@ -144,7 +146,7 @@ class PkPlayActivity : AppCompatActivity() {
         )
         mainPlayer.setMuted(isMuted)
 
-        Log.d(TAG, "初始化 PK 播放器:streamId=${args.pkStreamName}")
+        debugLog("初始化 PK 播放器:streamId=${args.pkStreamName}")
         pkPlayer = SellyLiveVideoPlayer.initWithStreamId(
             this,
             args.pkStreamName,
@@ -280,6 +282,13 @@ class PkPlayActivity : AppCompatActivity() {
             }
         }
 
+        override fun onReconnectStateChanged(isReconnecting: Boolean, detail: String?) {
+            runOnUiThread {
+                val suffix = detail?.takeIf { it.isNotBlank() }?.let { ": $it" }.orEmpty()
+                logEvent(if (isReconnecting) "$prefix: 重连开始$suffix" else "$prefix: 重连结束$suffix")
+            }
+        }
+
         override fun onError(error: com.sellycloud.sellycloudsdk.SellyLiveError) {
             runOnUiThread {
                 logEvent("$prefix: 错误: ${error.message}")
@@ -660,6 +669,10 @@ class PkPlayActivity : AppCompatActivity() {
         }
     }
 
+    private fun debugLog(message: String) {
+        if (logEnabled) Log.d(TAG, message)
+    }
+
     private fun dpToPx(dp: Int): Int {
         return (dp * resources.displayMetrics.density + 0.5f).toInt()
     }
diff --git a/example/src/main/java/com/demo/SellyCloudSDK/vod/VodPlayActivity.kt b/example/src/main/java/com/demo/SellyCloudSDK/vod/VodPlayActivity.kt
index e5ea9a6..94ba349 100644
--- a/example/src/main/java/com/demo/SellyCloudSDK/vod/VodPlayActivity.kt
+++ b/example/src/main/java/com/demo/SellyCloudSDK/vod/VodPlayActivity.kt
@@ -10,6 +10,7 @@ import android.graphics.Bitmap
 import android.graphics.Color
 import android.graphics.Typeface
 import android.graphics.drawable.GradientDrawable
+import com.sellycloud.sellycloudsdk.render.RenderBackend
 import android.os.Build
 import android.os.Bundle
 import android.os.Looper
@@ -28,6 +29,7 @@ import androidx.appcompat.app.AppCompatActivity
 import androidx.appcompat.widget.AppCompatTextView
 import androidx.core.content.ContextCompat
 import com.demo.SellyCloudSDK.R
+import com.demo.SellyCloudSDK.avdemo.AvDemoSettingsStore
 import com.demo.SellyCloudSDK.databinding.ActivityVodPlayBinding
 import com.demo.SellyCloudSDK.live.util.GalleryImageSaver
 import com.sellycloud.sellycloudsdk.SellyCloudManager
@@ -55,6 +57,7 @@ class VodPlayActivity : AppCompatActivity() {
 
     private var player: SellyVodPlayer? = null
     private var renderView: View? = null
+    private var useTextureView = false
     private var isPlaying = false
     private var isMuted = false
     private var currentState: SellyPlayerState = SellyPlayerState.Idle
@@ -90,6 +93,7 @@ class VodPlayActivity : AppCompatActivity() {
         binding = ActivityVodPlayBinding.inflate(layoutInflater)
         setContentView(binding.root)
         supportActionBar?.hide()
+        useTextureView = AvDemoSettingsStore(this).read().renderBackendPreference.isTextureView()
 
         addLogFloatingButton()
         binding.btnClose.setOnClickListener { finish() }
@@ -249,7 +253,8 @@ class VodPlayActivity : AppCompatActivity() {
             client.setMuted(isMuted)
         }
 
-        renderView = vodPlayer.attachRenderView(binding.renderContainer)
+        val backend = if (useTextureView) RenderBackend.TEXTURE_VIEW else RenderBackend.SURFACE_VIEW
+        renderView = vodPlayer.attachRenderView(binding.renderContainer, backend)
         player = vodPlayer
         startPlayAttempt()
         vodPlayer.prepareAsync()
@@ -331,8 +336,22 @@ class VodPlayActivity : AppCompatActivity() {
             Toast.makeText(this, "视图尚未布局完成,稍后再试", Toast.LENGTH_SHORT).show()
             return
         }
+        if (view is android.view.TextureView) {
+            val bmp = view.getBitmap()
+            if (bmp == null) {
+                Toast.makeText(this, "TextureView 尚未渲染画面", Toast.LENGTH_SHORT).show()
+                return
+            }
+            uiScope.launch(Dispatchers.IO) {
+                val ok = saveBitmapToGallery(bmp, prefix)
+                launch(Dispatchers.Main) {
+                    Toast.makeText(this@VodPlayActivity, if (ok) "截图已保存到相册" else "保存失败", Toast.LENGTH_SHORT).show()
+                }
+            }
+            return
+        }
         if (view !is android.view.SurfaceView) {
-            Toast.makeText(this, "当前视图不支持截图", Toast.LENGTH_SHORT).show()
+            Toast.makeText(this, "当前视图类型不支持截图", Toast.LENGTH_SHORT).show()
             return
         }
         val bmp = Bitmap.createBitmap(view.width, view.height, Bitmap.Config.ARGB_8888)
diff --git a/example/src/main/res/drawable/bg_av_sheet.xml b/example/src/main/res/drawable/bg_av_sheet.xml
deleted file mode 100644
index 30e3f34..0000000
--- a/example/src/main/res/drawable/bg_av_sheet.xml
+++ /dev/null
@@ -1,9 +0,0 @@
diff --git a/example/src/main/res/drawable/bg_login_logo.xml b/example/src/main/res/drawable/bg_login_logo.xml
deleted file mode 100644
index 6c74a64..0000000
--- a/example/src/main/res/drawable/bg_login_logo.xml
+++ /dev/null
@@ -1,5 +0,0 @@
diff --git a/example/src/main/res/drawable/ic_live_auto_framing.xml b/example/src/main/res/drawable/ic_live_auto_framing.xml
new file mode 100644
index 0000000..10a436e
--- /dev/null
+++ b/example/src/main/res/drawable/ic_live_auto_framing.xml
@@ -0,0 +1,14 @@
diff --git a/example/src/main/res/layout/activity_feature_hub.xml b/example/src/main/res/layout/activity_feature_hub.xml
index a4141f2..4500666 100644
--- a/example/src/main/res/layout/activity_feature_hub.xml
+++ b/example/src/main/res/layout/activity_feature_hub.xml
@@ -476,6 +476,57 @@
         android:textColorHint="@color/av_text_hint"
         android:textSize="14sp" />
diff --git a/example/src/main/res/layout/activity_vod_play.xml b/example/src/main/res/layout/activity_vod_play.xml
index 52632e5..1d1aa63 100644
--- a/example/src/main/res/layout/activity_vod_play.xml
+++ b/example/src/main/res/layout/activity_vod_play.xml
@@ -25,7 +25,8 @@
         android:src="@drawable/ic_av_close"
         app:tint="@color/av_text_primary"
         app:layout_constraintEnd_toEndOf="parent"
-        app:layout_constraintTop_toTopOf="parent" />
+        app:layout_constraintTop_toTopOf="parent"
+        app:layout_constraintStart_toStartOf="parent" />