diff --git a/README.en.md b/README.en.md
new file mode 100644
index 0000000000000000000000000000000000000000..2335c79778e8934ee8b1ae0bdf50cca511341b66
--- /dev/null
+++ b/README.en.md
@@ -0,0 +1,202 @@
+# AVCodec-Based Video Codec
+
+### Overview
+
+This sample demonstrates video playback and recording implemented based on AVCodec.
+
+- The main process of video playback is as follows: demuxing, decoding, displaying, and playing video files.
+- The main process of video recording is as follows: camera capture, encoding, and muxing into MP4 files.
+
+### Atomic Capability Specifications Supported for Playback
+
+| Media | Muxing Format | Stream Format                                          |
+|-------|:--------------|:-------------------------------------------------------|
+| Video | mp4           | Video stream: H.264/H.265; audio stream: Audio Vivid   |
+| Video | mkv           | Video stream: H.264/H.265; audio stream: AAC/MP3/OPUS  |
+| Video | mpeg-ts       | Video stream: H.264; audio stream: Audio Vivid         |
+
+### Atomic Capability Specifications Supported for Recording
+
+| Muxing Format | Video Codec Type |
+|---------------|------------------|
+| mp4           | H.264/H.265      |
+| m4a           | AVC (H.264)      |
+
+This sample supports only video recording. It does not integrate the audio capability.
+
+### Preview
+
+| Home page                                                      | App usage example                                             |
+|----------------------------------------------------------------|---------------------------------------------------------------|
+| ![AVCodec_Index.png](screenshots/device/AVCodec_Index.en.png)  | ![AVCodecSample.gif](screenshots/device/AVCodecSample.en.gif) |
+
+### How to Use
+
+1. A message is displayed, asking whether to allow **AVCodecVideo** to use the camera. Touch **Allow**.
+
+#### Recording
+
+1. Touch **Record**.
+
+2. Confirm that the recorded file is allowed to be saved to **Gallery**.
+
+3. Touch **Stop** to finish recording.
+
+#### Playback
+
+1. Push a video file to **Gallery**, or touch **Record** to record a video file (without audio).
+
+2. Touch **Play** and select a file to play.
+ +### Project Directory + +``` +├──entry/src/main/cpp // Native layer +│ ├──capbilities // Capability interfaces and implementation +│ │ ├──include // Capability interfaces +│ │ ├──AudioDecoder.cpp // Audio decoder implementation +│ │ ├──Demuxer.cpp // Demuxer implementation +│ │ ├──Muxer.cpp // Muxer implementation +│ │ ├──VideoDecoder.cpp // Video decoder implementation +│ │ └──VideoEncoder.cpp // Video encoder implementation +│ ├──common // Common modules +│ │ ├──dfx // Logs +│ │ ├──SampleCallback.cpp // Codec callback implementation +│ │ ├──SampleCallback.h // Codec callback definition +│ │ └──SampleInfo.h // Common classes for function implementation +│ ├──render // Interfaces and implementation of the display module +│ │ ├──include // Display module interfaces +│ │ ├──EglCore.cpp // Display parameter settings +│ │ ├──PluginManager.cpp // Display module management implementation +│ │ └──PluginRender.cpp // Display logic implementation +│ ├──sample // Native layer +│ │ ├──player // Player interfaces and implementation at the native layer +│ │ │ ├──Player.cpp // Player implementation at the native layer +│ │ │ ├──Player.h // Player interfaces at the native layer +│ │ │ ├──PlayerNative.cpp // Player entry at the native layer +│ │ │ └──PlayerNative.h +│ │ └──recorder // Recorder interface and implementation at the native layer +│ │ ├──Recorder.cpp // Recorder implementation at the native layer +│ │ ├──Recorder.h // Recorder interfaces at the native layer +│ │ ├──RecorderNative.cpp // Recorder entry at the native layer +│ │ └──RecorderNative.h +│ ├──types // Interfaces exposed by the native layer +│ │ ├──libplayer // Interfaces exposed by the player to the UI layer +│ │ └──librecorder // Interfaces exposed by the recorder to the UI layer +│ └──CMakeLists.txt // Compilation entry +├──ets // UI layer +│ ├──common // Common modules +│ │ ├──utils // Common utility class +│ │ │ ├──CameraCheck.ets // Check whether camera parameters are supported +│ │ │ ├──DateTimeUtils.ets // Used to obtain the current time +│ │ │ └──Logger.ets // Log utility +│ │ └──CommonConstants.ets // Common constants +│ ├──entryability // App entry +│ │ └──EntryAbility.ets +│ ├──entrybackupability +│ │ └──EntryBackupAbility.ets +│ ├──model +│ │ └──CameraDataModel.ets // Camera parameter data class +│ └──pages // Pages contained in the EntryAbility +│ ├──Index.ets // Home page/Playback page +│ └──Recorder.ets // Recording page +├──resources // Static resources +│ ├──base // Resource files in this directory are assigned unique IDs. +│ │ ├──element // Fonts and colors +│ │ ├──media // Images +│ │ └──profile // Home page of the app entry +│ ├──en_US // Resources in this directory are preferentially matched when the device language is English (US). +│ └──zh_CN // Resources in this directory are preferentially matched when the device language is simplified Chinese. +└──module.json5 // Module configuration information +``` + +### How to Implement + +#### Recording + +##### UI Layer + +1. On the **Index** page at the UI layer, touching **Record** triggers the BindSheet, confirming to save the recording + file to the gallery. +2. After the file path is selected, the encoder calls **initNative** at the ArkTS layer by using the FD of the file and + the preset recording parameters. After the initialization is complete, the encoder calls * + *OH_NativeWindow_GetSurfaceId** to obtain the surface ID of the NativeWindow and return it to the UI layer through a + callback. +3. 
After obtaining the surface ID from the encoder, the UI layer invokes the page route, carrying the surface ID, to
+   redirect to the recording page.
+4. During the construction of the **XComponent** on the recording page, the **onLoad()** method is called to obtain the
+   surface ID of the **XComponent**. Then **createDualChannelPreview()** is called to create a production-consumption
+   model in which the camera produces frames, and both the **XComponent** and the encoder's surface consume them.
+
+##### Native Layer
+
+1. On the recording page, the encoder starts to encode the camera preview stream coming from the UI layer.
+2. Each time the encoder successfully encodes a frame, the callback function **OnNewOutputBuffer()** in
+   **sample_callback.cpp** is invoked once, and the AVCodec framework provides an **OH_AVBuffer**.
+3. In the output callback, you need to manually store the frame buffer and index in the output queue and wake up the
+   output thread.
+4. The output thread stores the frame information in the previous step as bufferInfo and pops it out of the queue.
+5. The output thread uses the bufferInfo obtained in the previous step to call **WriteSample** to mux the frame into the
+   MP4 format.
+6. The output thread calls **FreeOutputBuffer** to return the buffer of this frame to the AVCodec framework, achieving
+   buffer cycling.
+
+#### Playback
+
+##### UI Layer
+
+1. On the **Index** page at the UI layer, touching **Play** triggers a click event and the **selectFile()** method. This
+   method invokes the file selection module of Gallery and obtains the path of the file selected by the user.
+2. After the file is selected, **play()** is invoked to open the file, obtain its size, and change the button status to
+   unavailable. Then the UI layer invokes **playNative()** exposed by the ArkTS layer.
+3. Based on the **playNative** field, **PlayerNative::Play()** is called. The callback function for ending the playback
+   is registered in this method.
+4. When the playback ends, **napi_call_function()** in the callback is invoked to instruct the UI layer to change the
+   button status back to available.
+
+##### ArkTS Layer
+
+1. Call **Export()** of **PluginManager()** in **Init()** of **PlayerNative.cpp** and register the callback function
+   **OnSurfaceCreatedCB()**. When a new **XComponent** appears on the page, convert it and assign it to
+   **pluginWindow_** in the singleton class **PluginManager**.
+
+##### Native Layer
+
+1. The working principles are as follows.
+   - After the decoder is started, **OnNeedInputBuffer** is invoked each time the decoder needs a frame of input data,
+     and the AVCodec framework provides an **OH_AVBuffer**.
+   - In the input callback, you need to manually store the frame buffer and index in the input queue and wake up the
+     input thread.
+   - The input thread stores the frame information in the previous step as bufferInfo and pops it out of the queue.
+   - The input thread uses the bufferInfo obtained in the previous step to call **ReadSample** to demux the frame.
+   - The input thread uses the demuxed bufferInfo to call **PushInputData** of the decoder. When the buffer is no
+     longer needed, the input thread returns it to the framework, achieving buffer cycling.
+   - After **PushInputData** is called, the decoder starts frame decoding. Each time a frame is decoded, the output
+     callback function is invoked. You need to manually store the frame buffer and index in the output queue, as shown
+     in the sketch below.
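+
+     A minimal sketch of such an output callback is shown below. It assumes a user-defined container similar to the one
+     declared in **sample_info.h**; all names here are illustrative rather than the sample's exact implementation.
+
+     ```cpp
+     #include <condition_variable>
+     #include <cstdint>
+     #include <mutex>
+     #include <queue>
+     #include <utility>
+     #include "native_avcodec_base.h" // OH_AVCodec, OH_AVBuffer; the include path depends on the SDK layout
+
+     // Hypothetical container shared between the codec callbacks and the output thread.
+     struct CodecUserData {
+         std::mutex outputMutex;
+         std::condition_variable outputCond;
+         std::queue<std::pair<uint32_t, OH_AVBuffer *>> outputBufferQueue; // frame index + buffer pairs
+     };
+
+     // Output-buffer callback registered with the decoder; invoked once per decoded frame.
+     static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
+     {
+         (void)codec;
+         auto *user = static_cast<CodecUserData *>(userData);
+         {
+             std::lock_guard<std::mutex> lock(user->outputMutex);
+             user->outputBufferQueue.emplace(index, buffer); // keep the frame's buffer and index together
+         }
+         user->outputCond.notify_all(); // wake the output thread, which renders the frame and frees the buffer
+     }
+     ```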
+ - The output thread stores the frame information in the previous step as bufferInfo and pops out of the queue. + - After calling **FreeOutputData**, the output thread displays the frame and releases the buffer. The released + buffer is returned to the framework, achieving buffer cycling. +2. In the decoder configuration, the input parameter **OHNativeWindow*** of **OH_VideoDecoder_SetSurface** is * + *pluginWindow_** in **PluginManager**. +3. In the decoder configuration, **SetCallback** is used. The input and output callbacks in **sample_callback.cpp** must + store the callback frame buffer and index to a user-defined container, named **sample_info.h**, for subsequent + operations. +4. **Start()** in **Player.cpp** is used to start the input thread and output thread. + +### Required Permissions + +**ohos.permission.CAMERA**: allows an app to use the camera. + +### Dependencies + +N/A + +### Constraints + +1. The sample app is supported only on Huawei phones running the standard system. + +2. The HarmonyOS version must be HarmonyOS NEXT Beta1 or later. + +3. The DevEco Studio version must be DevEco Studio NEXT Beta1 or later. + +4. The HarmonyOS SDK version must be HarmonyOS NEXT Beta1 or later. diff --git a/README.md b/README.md index 13f59ef2db2007d459e6530e3a3638c171260c12..3187996256e17ca8fbde1002bf6f14300ba4396f 100644 --- a/README.md +++ b/README.md @@ -33,13 +33,13 @@ 1. 点击“录制” -2. 选取视频输出路径,默认为【我的手机】文件夹下 +2. 确认允许录制文件保存到图库 3. 录制完成后点击“停止录制” #### 播放 -1. 推送视频文件至storage/media/100/local/files/Docs下或点击下方“开始录制”,录制一个视频文件(无音频) +1. 推送视频文件至图库下或点击下方“开始录制”,录制一个视频文件(无音频) 2. 点击播放按钮,选择文件,开始播放 @@ -82,14 +82,16 @@ ├──ets // UI层 │ ├──common // 公共模块 │ │ ├──utils // 共用的工具类 +│ │ │ ├──CameraCheck.ets // 检查相机参数是否支持 │ │ │ ├──DateTimeUtils.ets // 获取当前时间 -│ │ │ ├──Logger.ets // 日志工具 -│ │ │ └──SaveAsset.ets // 选取文件保持位置 +│ │ │ └──Logger.ets // 日志工具 │ │ └──CommonConstants.ets // 参数常量 │ ├──entryability // 应用的入口 │ │ └──EntryAbility.ets │ ├──entrybackupability -│ │ └──EntryBackupAbility.ets +│ │ └──EntryBackupAbility.ets +│ ├──model +│ │ └──CameraDataModel.ets // 相机参数数据类 │ └──pages // EntryAbility 包含的页面 │ ├──Index.ets // 首页/播放页面 │ └──Recorder.ets // 录制页面 @@ -107,7 +109,7 @@ #### *录制* ##### UI层 -1. 在UI层Index页面,用户点击“录制”后,会调起文件管理,用户选择一个输出地址。录制结束后,文件会存放于此。 +1. 在UI层Index页面,用户点击“录制”后,会拉起半模态界面,用户确认保存录制文件到图库。录制结束后,文件会存放于图库。 2. 选择好文件后,会用刚刚打开的fd,和用户预设的录制参数,掉起ArkTS的initNative,待初始化结束后,调用OH_NativeWindow_GetSurfaceId接口,得到NativeWindow的surfaceId,并把surfaceId回调回UI层。 3. UI层拿到编码器给的surfaceId后,调起页面路由,携带该surfaceId,跳转到Recorder页面; 4. 录制页面XComponent构建时,会调起.onLoad()方法,此方法首先会拿到XComponent的surfaceId,然后调起createDualChannelPreview(),此函数会建立一个相机生产,XComponent和编码器的surface消费的生产消费模型。 @@ -122,7 +124,7 @@ #### *播放* ##### UI层 -1. 在UI层Index页面,用户点击播放按钮后,触发点击事件,调起selectFile()函数,该函数会调起文件管理的选择文件模块,拿到用户选取文件的路径; +1. 在UI层Index页面,用户点击播放按钮后,触发点击事件,调起selectFile()函数,该函数会调起图库的选择文件模块,拿到用户选取文件的路径; 2. 用户选择文件成功后,调起play()函数,该函数会根据上一步获取到的路径,打开一个文件,并获取到该文件的大小,改变按钮状态为不可用,之后调起ArkTS层暴露给应用层的playNative()接口; 3. 根据playNative字段,调起PlayerNative::Play()函数,此处会注册播放结束的回调。 4. 播放结束时,Callback()中napi_call_function()接口调起,通知应用层,恢复按钮状态为可用。 @@ -156,8 +158,8 @@ 1. 本示例仅支持标准系统上运行,支持设备:华为手机; -2. HarmonyOS系统:HarmonyOS NEXT Developer Beta1及以上; +2. HarmonyOS系统:HarmonyOS NEXT Beta1及以上; -3. DevEco Studio版本:DevEco Studio NEXT Developer Beta1及以上; +3. DevEco Studio版本:DevEco Studio NEXT Beta1及以上; -4. HarmonyOS SDK版本:HarmonyOS NEXT Developer Bata1 SDK及以上。 \ No newline at end of file +4. 
HarmonyOS SDK版本:HarmonyOS NEXT Bata1 SDK及以上。 \ No newline at end of file diff --git a/entry/src/main/cpp/sample/player/Player.cpp b/entry/src/main/cpp/sample/player/Player.cpp index 46d3c3dd69c191653c7f3308f0a4611da20d8a59..7f89adc503c715ca3c92e2bb94f04dfdec7dcc9b 100644 --- a/entry/src/main/cpp/sample/player/Player.cpp +++ b/entry/src/main/cpp/sample/player/Player.cpp @@ -133,7 +133,8 @@ int32_t Player::Start() { audioDecInputThread_ = std::make_unique(&Player::AudioDecInputThread, this); audioDecOutputThread_ = std::make_unique(&Player::AudioDecOutputThread, this); #ifdef DEBUG_DECODE - // for debug The decoded data is written to the sandbox address, and the physical address is /data/app/el2/100/base/com.example.avcodecsample/haps/entry/files/ + // for debug The decoded data is written to the sandbox address, and the physical address is + // /data/app/el2/100/base/com.example.avcodecsample/haps/entry/files/ audioOutputFile_.open("/data/storage/el2/base/haps/entry/files/audio_decode_out.pcm", std::ios::out | std::ios::binary); #endif @@ -224,6 +225,7 @@ void Player::Release() { audioDecContext_ = nullptr; } OH_AudioStreamBuilder_Destroy(builder_); + builder_ = nullptr; doneCond_.notify_all(); // Trigger the callback sampleInfo_.playDoneCallback(sampleInfo_.playDoneCallbackData); diff --git a/entry/src/main/ets/common/CommonConstants.ets b/entry/src/main/ets/common/CommonConstants.ets index 3e2801ad66affff142a6ee578e319a46f718cace..e6e1af7bf7254d66f1e03a43dbc58dd3e5ba1b68 100644 --- a/entry/src/main/ets/common/CommonConstants.ets +++ b/entry/src/main/ets/common/CommonConstants.ets @@ -13,174 +13,153 @@ * limitations under the License. */ +import { camera } from '@kit.CameraKit'; + export class CommonConstants { /** * Index page Tag. */ static readonly INDEX_TAG: string = 'INDEX'; - /** * Recorder page Tag. */ static readonly RECORDER_TAG: string = 'RECORDER'; - /** * Default ID. */ static readonly DEFAULT_ID: string = '-1'; - /** * Default time. */ static readonly DEFAULT_TIME: string = '00:00'; - /** * PX. */ static readonly PX: string = 'px'; - /** * Default value. */ static readonly DEFAULT_VALUE: number = 0; - + /** + * Default profile. + */ + static readonly DEFAULT_PROFILE: camera.Profile = { + format: camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP, + size: { + width: 1920, + height: 1080 + } + }; /** * Video avc mime type. */ static readonly MIME_VIDEO_AVC: string = 'video/avc'; - /** * Video hevc mime type. */ static readonly MIME_VIDEO_HEVC: string = 'video/hevc'; - /** * Default width. */ static readonly DEFAULT_WIDTH: number = 1920; - /** * Default height. */ static readonly DEFAULT_HEIGHT: number = 1080; - /** * 4K video width. */ static readonly VIDEO_WIDTH_4K: number = 3840; - /** * 4K video height. */ static readonly VIDEO_HEIGHT_4K: number = 2160; - /** * 1080P video width. */ static readonly VIDEO_WIDTH_1080P: number = 1920; - /** * 1080P video height. */ static readonly VIDEO_HEIGHT_1080P: number = 1080; - /** * 720P video width. */ static readonly VIDEO_WIDTH_720P: number = 1280; - /** * 720P video height. */ static readonly VIDEO_HEIGHT_720P: number = 720; - /** * 10M bitrate. */ static readonly BITRATE_VIDEO_10M: number = 10 * 1024 * 1024; - /** * 20M bitrate. */ static readonly BITRATE_VIDEO_20M: number = 20 * 1024 * 1024; - /** * 30M bitrate. */ static readonly BITRATE_VIDEO_30M: number = 30 * 1024 * 1024; - /** * 30 FPS. */ static readonly FRAMERATE_VIDEO_30FPS: number = 30; - /** * 60 FPS. */ static readonly FRAMERATE_VIDEO_60FPS: number = 60; - /** * Duration. 
*/ static readonly DURATION: number = 2000; - /** * The distance between toast dialog box and the bottom of screen. */ - static readonly BOTTOM: number = 200; - + static readonly BOTTOM: number = 80; /** * Default picker item height. */ static readonly DEFAULT_PICKER_ITEM_HEIGHT: number = 30; - /** * Selected text style font size. */ static readonly SELECTED_TEXT_STYLE_FONT_SIZE: number = 15; - /** * Video mime type. */ static readonly VIDEO_MIMETYPE: string[] = ['HDRVivid', 'H264', 'H265']; - /** * Video resolution. */ static readonly VIDEO_RESOLUTION: string[] = ['4K', '1080P', '720P']; - /** * Video framerate. */ static readonly VIDEO_FRAMERATE: string[] = ['30Fps', '60Fps']; - /** * Video recorderInfo. */ static readonly RECORDER_INFO: string[][] = [ CommonConstants.VIDEO_MIMETYPE, CommonConstants.VIDEO_RESOLUTION, CommonConstants.VIDEO_FRAMERATE ]; - /** * The number corresponding to true. */ static readonly TRUE: number = 1; - /** * The number corresponding to false. */ static readonly FALSE: number = 0; - /** * Min range. */ static readonly MIN_RANGE: number = 1; - /** * Max range. */ static readonly MAX_RANGE: number = 30; - /** * Full size. */ diff --git a/entry/src/main/ets/common/utils/CameraCheck.ets b/entry/src/main/ets/common/utils/CameraCheck.ets new file mode 100644 index 0000000000000000000000000000000000000000..0f102517a616caa2b4924281bb922db492a3ba3d --- /dev/null +++ b/entry/src/main/ets/common/utils/CameraCheck.ets @@ -0,0 +1,76 @@ +/* + * Copyright (c) 2024 Huawei Device Co., Ltd. + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +import { camera } from '@kit.CameraKit'; +import Logger from './Logger'; +import { CameraDataModel } from '../../model/CameraDateModel'; +import { CommonConstants as Const } from '../CommonConstants'; + +const TAG = 'CAMERA_CHECK'; + +export function cameraCheck(cameraManager: camera.CameraManager, + cameraData: CameraDataModel): undefined | camera.VideoProfile { + let cameraDevices = cameraManager.getSupportedCameras(); + if (cameraDevices !== undefined && cameraDevices.length <= 0) { + Logger.error(TAG, 'cameraManager.getSupportedCameras error!'); + return; + } + + let profiles: camera.CameraOutputCapability = + cameraManager.getSupportedOutputCapability(cameraDevices[0], camera.SceneMode.NORMAL_VIDEO); + if (!profiles) { + Logger.error(TAG, 'cameraManager.getSupportedOutputCapability error!'); + return; + } + + let videoProfiles: Array = profiles.videoProfiles; + if (!videoProfiles) { + Logger.error(TAG, 'Get videoProfiles error!'); + return; + } + + let videoProfile: undefined | camera.VideoProfile = videoProfiles.find((profile: camera.VideoProfile) => { + if (cameraData.isHDRVivid) { + if (cameraData.frameRate === Const.FRAMERATE_VIDEO_30FPS) { + return profile.size.width === cameraData.cameraWidth && + profile.size.height === cameraData.cameraHeight && + profile.format === camera.CameraFormat.CAMERA_FORMAT_YCBCR_P010 && + profile.frameRateRange.min === 1 && + profile.frameRateRange.max === 30; + } else { + return profile.size.width === cameraData.cameraWidth && + profile.size.height === cameraData.cameraHeight && + profile.format === camera.CameraFormat.CAMERA_FORMAT_YCBCR_P010 && + profile.frameRateRange.min === cameraData.frameRate && + profile.frameRateRange.max === cameraData.frameRate; + } + } else { + if (cameraData.frameRate === Const.FRAMERATE_VIDEO_30FPS) { + return profile.size.width === cameraData.cameraWidth && + profile.size.height === cameraData.cameraHeight && + profile.format === camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP && + profile.frameRateRange.min === 1 && + profile.frameRateRange.max === 30; + } else { + return profile.size.width === cameraData.cameraWidth && + profile.size.height === cameraData.cameraHeight && + profile.format === camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP && + profile.frameRateRange.min === cameraData.frameRate && + profile.frameRateRange.max === cameraData.frameRate; + } + } + }); + return videoProfile; +} \ No newline at end of file diff --git a/entry/src/main/ets/common/utils/SaveCameraAsset.ets b/entry/src/main/ets/common/utils/SaveCameraAsset.ets deleted file mode 100644 index d3a91beb3026019aaa1307d553965e2d79df2f0d..0000000000000000000000000000000000000000 --- a/entry/src/main/ets/common/utils/SaveCameraAsset.ets +++ /dev/null @@ -1,64 +0,0 @@ -/* - * Copyright (C) 2024 Huawei Device Co., Ltd. - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ - -import { fileIo, picker } from '@kit.CoreFileKit'; -import { BusinessError } from '@kit.BasicServicesKit'; -import DateTimeUtil from './DateTimeUtils'; -import Logger from './Logger'; - -const TAG: string = 'SaveAsset'; - -export default class SaveAsset { - private tag: string; - private lastSaveTime: string = ''; - private saveIndex: number = 0; - - constructor(tag: string) { - this.tag = tag; - } - - public async createVideoFd(): Promise { - Logger.info(TAG, 'get Recorder File Fd'); - const mDateTimeUtil = new DateTimeUtil(); - const displayName = this.checkName(`AVCodec_${mDateTimeUtil.getDate()}_${mDateTimeUtil.getTime()}`) + '.mp4'; - Logger.info(TAG, 'get Recorder display name is: ' + displayName); - let photoSaveOptions = new picker.PhotoSaveOptions(); - photoSaveOptions.newFileNames = [displayName]; - let photoPicker = new picker.PhotoViewPicker(); - let result = await photoPicker.save(photoSaveOptions); - - let videoFd: number = 0; - await fileIo.open(result[0], fileIo.OpenMode.READ_WRITE | fileIo.OpenMode.CREATE).then((file) => { - videoFd = file.fd; - Logger.info(TAG, 'getRawFileDescriptor success fileName: ' + result[0] + ', fd: ' + videoFd); - }).catch((err: BusinessError) => { - Logger.error(TAG, 'open file failed with error message: ' + err.message + ', error code: ' + err.code); - }) - - Logger.info(TAG, 'leave get Recorder File Fd'); - return videoFd; - } - - private checkName(name: string): string { - if (this.lastSaveTime == name) { - this.saveIndex += 1; - return `${name}_${this.saveIndex}`; - } - this.lastSaveTime = name; - this.saveIndex = 0; - Logger.info(this.tag, 'get Recorder File name is: ' + name); - return name; - } -} \ No newline at end of file diff --git a/entry/src/main/ets/model/CameraDateModel.ets b/entry/src/main/ets/model/CameraDateModel.ets new file mode 100644 index 0000000000000000000000000000000000000000..458dbbe2ba5c477bac8f7a97908fb7e251ea1f12 --- /dev/null +++ b/entry/src/main/ets/model/CameraDateModel.ets @@ -0,0 +1,40 @@ +/* + * Copyright (c) 2024 Huawei Device Co., Ltd. + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +import { camera } from '@kit.CameraKit'; +import { CommonConstants as Const } from '../common/CommonConstants'; + +export class CameraDataModel { + surfaceId: string = ''; + cameraWidth: number = Const.DEFAULT_WIDTH; + cameraHeight: number = Const.DEFAULT_HEIGHT; + isHDRVivid: number = Const.DEFAULT_VALUE; + outputfd: number = -1; + frameRate: number = Const.FRAMERATE_VIDEO_30FPS; + previewProfile: camera.Profile = Const.DEFAULT_PROFILE; + videoCodecMime: string | null = Const.MIME_VIDEO_AVC; + bitRate: number = Const.BITRATE_VIDEO_20M; + + setCodecFormat(isHDR: number, codecMime: string) { + this.isHDRVivid = isHDR; + this.videoCodecMime = codecMime; + } + + setResolution(width: number, height: number, bit: number) { + this.cameraWidth = width; + this.cameraHeight = height; + this.bitRate = bit; + } +} \ No newline at end of file diff --git a/entry/src/main/ets/pages/Index.ets b/entry/src/main/ets/pages/Index.ets index e6355c15d930c69360c13f75ef2275088f922cbb..a5647999e0ff402ca8ef14865051f74bc0083abb 100644 --- a/entry/src/main/ets/pages/Index.ets +++ b/entry/src/main/ets/pages/Index.ets @@ -13,48 +13,40 @@ * limitations under the License. */ -import { picker, fileIo } from '@kit.CoreFileKit'; +import { fileIo } from '@kit.CoreFileKit'; import { display, promptAction, router } from '@kit.ArkUI'; +import { camera } from '@kit.CameraKit'; +import { photoAccessHelper } from '@kit.MediaLibraryKit'; import player from 'libplayer.so'; import recorder from 'librecorder.so'; import Logger from '../common/utils/Logger'; -import SaveAsset from '../common/utils/SaveCameraAsset'; +import DateTimeUtil from '../common/utils/DateTimeUtils'; import { CommonConstants as Const } from '../common/CommonConstants'; +import { CameraDataModel } from '../model/CameraDateModel'; +import { cameraCheck } from '../common/utils/CameraCheck'; const TAG: string = Const.INDEX_TAG; - -class DataModel { - surfaceId: string = ''; - cameraWidth: number = Const.DEFAULT_VALUE; - cameraHeight: number = Const.DEFAULT_VALUE; - isHDRVivid: number = Const.DEFAULT_VALUE; - outputfd: number | null = null; - frameRate: number = Const.DEFAULT_VALUE; -} +const DATETIME: DateTimeUtil = new DateTimeUtil(); @Entry @Component struct Player { @State buttonEnabled: boolean = true; + @State isShow: boolean = false; + private cameraData: CameraDataModel = new CameraDataModel(); private selectFilePath: string | null = null; - private mSaveAsset: SaveAsset = new SaveAsset(TAG); - private videoCodecMime: string | null = Const.MIME_VIDEO_AVC; - private isHDRVivid: number = Const.DEFAULT_VALUE; - private cameraWidth: number = Const.DEFAULT_WIDTH; - private cameraHeight: number = Const.DEFAULT_HEIGHT; - private bitRate: number = Const.BITRATE_VIDEO_20M; private display = display.getDefaultDisplaySync(); - private heightPx = (this.display.width * this.cameraHeight / this.cameraWidth) + Const.PX; - private frameRate: number = Const.FRAMERATE_VIDEO_30FPS; - private outputfd: number | null = null; + private heightPx = (this.display.width * Const.DEFAULT_HEIGHT / Const.DEFAULT_WIDTH) + Const.PX; selectFile() { - let documentSelectOptions = new picker.DocumentSelectOptions; - let documentViewPicker = new picker.DocumentViewPicker; - documentViewPicker.select(documentSelectOptions) - .then((documentSelectResult) => { - this.selectFilePath = documentSelectResult[0]; - if (this.selectFilePath == null) { + let photoPicker = new photoAccessHelper.PhotoViewPicker(); + photoPicker.select({ + MIMEType: photoAccessHelper.PhotoViewMIMETypes.VIDEO_TYPE, + 
maxSelectNumber: 1 + }) + .then((photoSelectResult) => { + this.selectFilePath = photoSelectResult.photoUris[0]; + if (this.selectFilePath === null) { promptAction.showToast({ message: $r('app.string.alert'), duration: Const.DURATION, @@ -67,32 +59,6 @@ struct Player { }); } - async selectOutputFilePath(): Promise { - this.outputfd = await this.mSaveAsset.createVideoFd(); - if (this.outputfd !== null) { - recorder.initNative(this.outputfd, this.videoCodecMime, this.cameraWidth, - this.cameraHeight, this.frameRate, this.isHDRVivid, this.bitRate) - .then((data) => { - if (data.surfaceId !== null) { - let paramsInfo: DataModel = { - surfaceId: data.surfaceId, - cameraWidth: this.cameraWidth, - cameraHeight: this.cameraHeight, - isHDRVivid: this.isHDRVivid, - outputfd: this.outputfd, - frameRate: this.frameRate - }; - router.pushUrl({ - url: 'pages/Recorder', - params: paramsInfo - }); - } - }) - } else { - Logger.error(TAG, 'get outputfd failed'); - } - } - play() { let inputFile = fileIo.openSync(this.selectFilePath, fileIo.OpenMode.READ_ONLY); if (!inputFile) { @@ -110,6 +76,27 @@ struct Player { }) } + async checkIsProfileSupport() { + let cameraManager: camera.CameraManager = camera.getCameraManager(getContext(this)); + if (!cameraManager) { + Logger.error(TAG, 'camera.getCameraManager error!'); + } + + let videoProfile: undefined | camera.VideoProfile = cameraCheck(cameraManager, this.cameraData); + if (!videoProfile) { + Logger.error(TAG, 'videoProfile is not found'); + promptAction.showToast({ + message: $r('app.string.alert_setting'), + duration: Const.DURATION, + bottom: Const.BOTTOM, + backgroundColor: Color.White, + backgroundBlurStyle: BlurStyle.NONE + }) + this.cameraData = new CameraDataModel(); + return; + } + } + @Builder SettingButton() { Button() { @@ -138,18 +125,15 @@ struct Player { onAccept: (value: TextPickerResult) => { switch (value.value[0]) { case Const.VIDEO_MIMETYPE[0]: { - this.videoCodecMime = Const.MIME_VIDEO_HEVC; - this.isHDRVivid = Const.TRUE; + this.cameraData.setCodecFormat(Const.TRUE, Const.MIME_VIDEO_HEVC); break; } case Const.VIDEO_MIMETYPE[1]: { - this.videoCodecMime = Const.MIME_VIDEO_AVC; - this.isHDRVivid = Const.FALSE; + this.cameraData.setCodecFormat(Const.FALSE, Const.MIME_VIDEO_AVC); break; } case Const.VIDEO_MIMETYPE[2]: { - this.videoCodecMime = Const.MIME_VIDEO_HEVC; - this.isHDRVivid = Const.FALSE; + this.cameraData.setCodecFormat(Const.FALSE, Const.MIME_VIDEO_HEVC); break; } default: @@ -158,21 +142,15 @@ struct Player { switch (value.value[1]) { case Const.VIDEO_RESOLUTION[0]: { - this.cameraWidth = Const.VIDEO_WIDTH_4K; - this.cameraHeight = Const.VIDEO_HEIGHT_4K; - this.bitRate = Const.BITRATE_VIDEO_30M; + this.cameraData.setResolution(Const.VIDEO_WIDTH_4K, Const.VIDEO_HEIGHT_4K, Const.BITRATE_VIDEO_30M); break; } case Const.VIDEO_RESOLUTION[1]: { - this.cameraWidth = Const.VIDEO_WIDTH_1080P; - this.cameraHeight = Const.VIDEO_HEIGHT_1080P; - this.bitRate = Const.BITRATE_VIDEO_20M; + this.cameraData.setResolution(Const.VIDEO_WIDTH_1080P, Const.VIDEO_HEIGHT_1080P, Const.BITRATE_VIDEO_20M); break; } case Const.VIDEO_RESOLUTION[2]: { - this.cameraWidth = Const.VIDEO_WIDTH_720P; - this.cameraHeight = Const.VIDEO_HEIGHT_720P; - this.bitRate = Const.BITRATE_VIDEO_10M; + this.cameraData.setResolution(Const.VIDEO_WIDTH_720P, Const.VIDEO_HEIGHT_720P, Const.BITRATE_VIDEO_10M); break; } default: @@ -181,21 +159,77 @@ struct Player { switch (value.value[2]) { case Const.VIDEO_FRAMERATE[0]: { - this.frameRate = Const.FRAMERATE_VIDEO_30FPS; + 
this.cameraData.frameRate = Const.FRAMERATE_VIDEO_30FPS; break; } case Const.VIDEO_FRAMERATE[1]: { - this.frameRate = Const.FRAMERATE_VIDEO_60FPS; + this.cameraData.frameRate = Const.FRAMERATE_VIDEO_60FPS; break; } default: break; } + this.checkIsProfileSupport(); } }); }) } + @Builder + Authorized() { + Column() { + Text($r('app.string.saveButtonNote')) + .width('360.42vp') + .fontSize('16vp') + .margin({ bottom: '12vp' }) + + Row() { + Button($r('app.string.saveButtonCancel')) + .onClick(() => { + this.isShow = false; + }) + .width('174.38vp') + .margin({ right: '12vp' }) + + SaveButton({ text: SaveDescription.SAVE }) + .onClick(async () => { + const context = getContext(this); + let helper = photoAccessHelper.getPhotoAccessHelper(context); + let uri = await helper.createAsset(photoAccessHelper.PhotoType.VIDEO, 'mp4', { + title: `AVCodecVideo_${DATETIME.getDate()}_${DATETIME.getTime()}` + }); + let file = await fileIo.open(uri, fileIo.OpenMode.READ_WRITE | fileIo.OpenMode.CREATE); + this.cameraData.outputfd = file.fd; + if (this.cameraData.outputfd !== null) { + recorder.initNative(this.cameraData.outputfd, this.cameraData.videoCodecMime, this.cameraData.cameraWidth, + this.cameraData.cameraHeight, this.cameraData.frameRate, this.cameraData.isHDRVivid, + this.cameraData.bitRate).then((data) => { + if (data.surfaceId !== null) { + this.cameraData.surfaceId = data.surfaceId; + router.pushUrl({ + url: 'pages/Recorder', + params: this.cameraData + }); + } + }) + } else { + Logger.error(TAG, 'get outputfd failed!'); + } + }) + .width('174.38vp') + .height('40vp') + } + .justifyContent(FlexAlign.Center) + .alignItems(VerticalAlign.Bottom) + .margin({ bottom: '44vp' }) + .width('100%') + .height('52vp') + } + .justifyContent(FlexAlign.End) + .width('100%') + .height('100%') + } + @Builder Window() { Row() { @@ -244,7 +278,13 @@ struct Player { Button($r('app.string.record')) .onClick(() => { - this.selectOutputFilePath(); + this.isShow = true; + }) + .bindSheet($$this.isShow, this.Authorized, { + height: 210, + title: { + title: $r('app.string.saveButtonTitle') + } }) .size({ width: $r('app.float.index_button_width'), diff --git a/entry/src/main/ets/pages/Recorder.ets b/entry/src/main/ets/pages/Recorder.ets index df42603e2ed1eecf86337770443cfba2c20462ca..8cc78759bd33542d0407435020541f342102ba46 100644 --- a/entry/src/main/ets/pages/Recorder.ets +++ b/entry/src/main/ets/pages/Recorder.ets @@ -21,31 +21,18 @@ import recorder from 'librecorder.so'; import Logger from '../common/utils/Logger'; import { dateTime } from '../common/utils/DateTimeUtils'; import { CommonConstants as Const } from '../common/CommonConstants'; +import { CameraDataModel } from '../model/CameraDateModel'; +import { cameraCheck } from '../common/utils/CameraCheck'; const TAG: string = Const.RECORDER_TAG; -const params: RouTmp = router.getParams() as RouTmp; -const encoderSurfaceId: string = params.surfaceId; -const cameraWidth: number = params.cameraWidth; -const cameraHeight: number = params.cameraHeight; -const isHDRVivid: number = params.isHDRVivid; -const frameRate: number = params.frameRate; -const outputfd: number = params.outputfd; +const params: CameraDataModel = router.getParams() as CameraDataModel; let cameraInput: camera.CameraInput; let XComponentPreviewOutput: camera.PreviewOutput; let encoderVideoOutput: camera.VideoOutput; let videoSession: camera.VideoSession; -class RouTmp { - surfaceId: string = Const.DEFAULT_ID; - cameraWidth: number = Const.DEFAULT_VALUE; - cameraHeight: number = Const.DEFAULT_VALUE; - 
isHDRVivid: number = Const.DEFAULT_VALUE; - frameRate: number = Const.DEFAULT_VALUE; - outputfd: number = Const.DEFAULT_VALUE; -} - async function releaseCamera() { // Stop the video output stream if (encoderVideoOutput) { @@ -60,7 +47,7 @@ async function releaseCamera() { // Stop the Session. videoSession.stop(); // Close file fd. - fileIo.close(outputfd); + fileIo.close(params.outputfd); // Close camera input stream. cameraInput.close(); // Release preview output stream. @@ -72,19 +59,6 @@ async function releaseCamera() { videoSession.release(); } -function getCameraDevices(cameraManager: camera.CameraManager): Array { - let cameraArray: Array = cameraManager.getSupportedCameras(); - if (cameraArray !== undefined && cameraArray.length <= 0) { - Logger.error(TAG, 'cameraManager.getSupportedCameras error'); - return []; - } - for (let index = 0; index < cameraArray.length; index++) { - Logger.info(TAG, 'getCameraDevices -- cameraId :' + cameraArray[index].cameraId); - Logger.info(TAG, 'getCameraDevices -- cameraPosition :' + cameraArray[index].cameraPosition); - } - return cameraArray; -} - @Entry @Component struct Recorder { @@ -99,6 +73,7 @@ struct Recorder { private XComponentController: XComponentController = new XComponentController(); private display = display.getDefaultDisplaySync(); private heightPx = (this.display.width * this.cameraWidth / this.cameraHeight) + Const.PX; + private widthPx = this.display.width + Const.PX; private timer: number = Const.DEFAULT_VALUE; private seconds: number = Const.DEFAULT_VALUE; private isReleased: boolean = false; @@ -149,82 +124,27 @@ struct Recorder { } // Get supported camera devices. - let camerasDevices: Array = getCameraDevices(cameraManager); - - // Get profile object. - let profiles: camera.CameraOutputCapability = cameraManager.getSupportedOutputCapability(camerasDevices[0], - camera.SceneMode.NORMAL_VIDEO); - if (!profiles) { - Logger.error(TAG, 'cameraManager.getSupportedOutputCapability error'); - return; - } - - // Get the preview stream profile - let previewProfiles: Array = profiles.previewProfiles; - if (!previewProfiles) { - Logger.error(TAG, 'createOutput previewProfiles == null || undefined'); + let cameraDevices: Array = cameraManager.getSupportedCameras(); + if (cameraDevices !== undefined && cameraDevices.length <= 0) { + Logger.error(TAG, 'cameraManager.getSupportedCameras error!'); return; } - // Get the video stream profile. - let videoProfiles: Array = profiles.videoProfiles; - if (!videoProfiles) { - Logger.error(TAG, 'createOutput videoProfiles == null || undefined'); + let videoProfile: undefined | camera.VideoProfile = cameraCheck(cameraManager, params); + if (!videoProfile) { + Logger.error(TAG, 'videoProfile is not found!'); return; } //The preview stream of XComponent. - let XComponentPreviewProfile: camera.Profile = previewProfiles[0]; + let XComponentPreviewProfile: camera.Profile = params.previewProfile; if (XComponentPreviewProfile === undefined) { Logger.error(TAG, 'XComponentPreviewProfile is not found'); return; } - // The videoProfile's width and height need to be the same as the encoder's width and height. - let videoSize: camera.Size = { - width: cameraWidth, - height: cameraHeight - }; - - // Matches the videoProfile selected by the user. 
- let videoProfile: undefined | camera.VideoProfile = videoProfiles.find((profile: camera.VideoProfile) => { - if (isHDRVivid) { - if (frameRate === Const.FRAMERATE_VIDEO_30FPS) { - return profile.size.width === videoSize.width && - profile.size.height === videoSize.height && - profile.format === camera.CameraFormat.CAMERA_FORMAT_YCBCR_P010 && - profile.frameRateRange.min === Const.MIN_RANGE && - profile.frameRateRange.max === Const.MAX_RANGE; - } else { - return profile.size.width === videoSize.width && - profile.size.height === videoSize.height && - profile.format === camera.CameraFormat.CAMERA_FORMAT_YCBCR_P010 && - profile.frameRateRange.min === frameRate && - profile.frameRateRange.max === frameRate; - } - } else { - if (frameRate == Const.FRAMERATE_VIDEO_30FPS) { - return profile.size.width === videoSize.width && - profile.size.height === videoSize.height && - profile.format === camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP && - profile.frameRateRange.min === Const.MIN_RANGE && - profile.frameRateRange.max === Const.MAX_RANGE; - } else { - return profile.size.width === videoSize.width && - profile.size.height === videoSize.height && - profile.format === camera.CameraFormat.CAMERA_FORMAT_YUV_420_SP && - profile.frameRateRange.min === frameRate && - profile.frameRateRange.max === frameRate; - } - } - }); - if (!videoProfile) { - Logger.error(TAG, 'videoProfile is not found'); - return; - } - //Create the encoder output object - encoderVideoOutput = cameraManager.createVideoOutput(videoProfile, encoderSurfaceId); + encoderVideoOutput = cameraManager.createVideoOutput(videoProfile, params.surfaceId); if (encoderVideoOutput === undefined) { Logger.error(TAG, 'encoderVideoOutput is undefined'); return; @@ -240,7 +160,7 @@ struct Recorder { // Create the cameraInput object. try { - cameraInput = cameraManager.createCameraInput(camerasDevices[0]); + cameraInput = cameraManager.createCameraInput(cameraDevices[0]); } catch (error) { let err = error as BusinessError; Logger.error(TAG, `Failed to createCameraInput. error: ${JSON.stringify(err)}`); @@ -333,7 +253,6 @@ struct Recorder { XComponent({ id: 'recorderXComponent', type: XComponentType.SURFACE, - libraryname: '', controller: this.XComponentController }) .onLoad(() => { @@ -344,7 +263,7 @@ struct Recorder { this.isRecorderTimeTextHide = false; this.getRecordTime(); }) - .width(Const.FULL_SIZE) + .width(this.widthPx) .height(this.heightPx) .gesture( PinchGesture() diff --git a/entry/src/main/resources/base/element/string.json b/entry/src/main/resources/base/element/string.json index 6d06831f566672589195b5900d8f14c734678dac..6be5677fe5b289fa6016e306a0667098c35ad31c 100644 --- a/entry/src/main/resources/base/element/string.json +++ b/entry/src/main/resources/base/element/string.json @@ -39,6 +39,22 @@ { "name": "record", "value": "Record" + }, + { + "name": "alert_setting", + "value": "The setting is not supported by the current device." + }, + { + "name": "saveButtonNote", + "value": "Allow AVCodecVideo to save captured video to gallery?" 
+ }, + { + "name": "saveButtonCancel", + "value": "Cancel" + }, + { + "name": "saveButtonTitle", + "value": "Confirm the video storage location" } ] } \ No newline at end of file diff --git a/entry/src/main/resources/en_US/element/string.json b/entry/src/main/resources/en_US/element/string.json index 6d06831f566672589195b5900d8f14c734678dac..6be5677fe5b289fa6016e306a0667098c35ad31c 100644 --- a/entry/src/main/resources/en_US/element/string.json +++ b/entry/src/main/resources/en_US/element/string.json @@ -39,6 +39,22 @@ { "name": "record", "value": "Record" + }, + { + "name": "alert_setting", + "value": "The setting is not supported by the current device." + }, + { + "name": "saveButtonNote", + "value": "Allow AVCodecVideo to save captured video to gallery?" + }, + { + "name": "saveButtonCancel", + "value": "Cancel" + }, + { + "name": "saveButtonTitle", + "value": "Confirm the video storage location" } ] } \ No newline at end of file diff --git a/entry/src/main/resources/zh_CN/element/string.json b/entry/src/main/resources/zh_CN/element/string.json index 35c12a541d5f9622a0d6369f8b777a527ab37534..91621817b7d2729e91d7fe6fb5ee12f4fd9768e1 100644 --- a/entry/src/main/resources/zh_CN/element/string.json +++ b/entry/src/main/resources/zh_CN/element/string.json @@ -39,6 +39,22 @@ { "name": "record", "value": "录制" + }, + { + "name": "alert_setting", + "value": "当前设备不支持该设置。" + }, + { + "name": "saveButtonNote", + "value": "是否允许AVCodecVideo保存拍摄的视频到图库?" + }, + { + "name": "saveButtonCancel", + "value": "取消" + }, + { + "name": "saveButtonTitle", + "value": "视频保存位置确认" } ] } \ No newline at end of file diff --git a/screenshots/device/AVCodecSample.en.gif b/screenshots/device/AVCodecSample.en.gif new file mode 100644 index 0000000000000000000000000000000000000000..d4374f9bd14b32e50ca732999b6432373cb48741 Binary files /dev/null and b/screenshots/device/AVCodecSample.en.gif differ diff --git a/screenshots/device/AVCodecSample.gif b/screenshots/device/AVCodecSample.gif index 4ec67594f26280460cab759398b981917ca40e0b..04e0f77b71a2c786ab86caf3d727a92ed13c0af4 100644 Binary files a/screenshots/device/AVCodecSample.gif and b/screenshots/device/AVCodecSample.gif differ diff --git a/screenshots/device/AVCodec_Index.en.png b/screenshots/device/AVCodec_Index.en.png new file mode 100644 index 0000000000000000000000000000000000000000..4d04a4c8b6f0474107f745144473fe457ade61f3 Binary files /dev/null and b/screenshots/device/AVCodec_Index.en.png differ