
Using the FFmpeg Library in an Android App

Contents

Preface

I. Environment

II. Creating the App

III. Adding the FFmpeg Library Files to the App

1. Copy the FFmpeg headers and .so libraries into the app

2. Edit CMakeLists.txt

3. Edit ffmpeglib.cpp

4. Edit NativeLib.kt to add the methods and load the library

5. Calling it

IV. Adding a Video-Parsing Feature

Summary


Preface

        An earlier post recorded how to build the FFmpeg libraries for Android on Windows 👉; readers who are curious can check it out 😄. This post records how to use those libraries in an Android app.


I. Environment

  1. Install Android Studio. Installation guides are easy to find online, so I won't repeat them.
  2. Install the NDK and CMake directly through the SDK Manager. If you read my FFmpeg build post, you know the NDK version used is:
    25.1.8937393

CMake version: 3.22.1
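These versions are usually pinned in the module's build script together with the native-build wiring. A minimal sketch, assuming the Gradle Kotlin DSL and the paths used later in this post (adjust names and paths to your module):

```kotlin
android {
    defaultConfig {
        ndk {
            // Only arm64-v8a libraries were built, so restrict packaging to that ABI
            abiFilters += "arm64-v8a"
        }
    }
    externalNativeBuild {
        cmake {
            // Path to the CMakeLists.txt described in section III
            path = file("src/main/cpp/CMakeLists.txt")
            version = "3.22.1"
        }
    }
    ndkVersion = "25.1.8937393"
}
```

Without the abiFilters restriction, the APK would be installable on other ABIs but crash at System.loadLibrary time, since only arm64-v8a .so files are bundled.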

II. Creating the App

Create a module in Android Studio.

After it is created, the directory structure looks roughly like the one below, except that the assets and cpp folders and the NativeLib file do not exist yet. What they are for is explained later.

 

III. Adding the FFmpeg Library Files to the App

1. Copy the FFmpeg headers and .so libraries into the app

Anyone who has worked on an NDK project knows that the cpp folder holds CMakeLists.txt and all the C++ source files.

Under cpp, create an ffmpeg folder for the FFmpeg headers and .so libraries. Since only the arm64-v8a architecture was built, create an arm64-v8a folder under lib to hold the .so files. The directory structure is shown below:

2. Edit CMakeLists.txt

Edit CMakeLists.txt so that it builds ffmpeglib.cpp.

The full CMakeLists.txt is below; it is commented throughout, so I won't say much more. If anything is unclear, create a Native Library module yourself and compare.

# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html.
# For more examples on how to use CMake, see https://github.com/android/ndk-samples.

# Sets the minimum CMake version required for this project.
cmake_minimum_required(VERSION 3.22.1)

# Declares the project name. The project name can be accessed via ${ PROJECT_NAME},
# Since this is the top level CMakeLists.txt, the project name is also accessible
# with ${CMAKE_PROJECT_NAME} (both CMake variables are in-sync within the top level
# build script scope).
project("ffmpeglib")

# Store the FFmpeg header and .so library directories in variables
set(FFMPEG_INCLUDE_DIR ${CMAKE_SOURCE_DIR}/../cpp/ffmpeg/include)
set(FFMPEG_LIB_DIR ${CMAKE_SOURCE_DIR}/../cpp/ffmpeg/lib/${ANDROID_ABI})

# Print debug info to check that the paths are correct
message(STATUS "FFMPEG_INCLUDE_DIR: ${FFMPEG_INCLUDE_DIR}")
message(STATUS "FFMPEG_LIB_DIR: ${FFMPEG_LIB_DIR}")

# Check that the library files exist
file(GLOB FFMPEG_LIB_FILES "${FFMPEG_LIB_DIR}/*.so")
if(NOT FFMPEG_LIB_FILES)
    message(FATAL_ERROR "No FFmpeg library files found in ${FFMPEG_LIB_DIR}. Please check the paths and ensure the libraries exist.")
endif()

# Add the FFmpeg include directory so the headers resolve correctly from C++
include_directories(${FFMPEG_INCLUDE_DIR})

# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.
#
# In this top level CMakeLists.txt, ${CMAKE_PROJECT_NAME} is used to define
# the target library name; in the sub-module's CMakeLists.txt, ${PROJECT_NAME}
# is preferred for the same purpose.
#
# In order to load a library into your app from Java/Kotlin, you must call
# System.loadLibrary() and pass the name of the library defined here;
# for GameActivity/NativeActivity derived applications, the same library name must be
# used in the AndroidManifest.xml file.
add_library(${CMAKE_PROJECT_NAME} SHARED
        # List C/C++ source files with relative paths to this CMakeLists.txt.
        ffmpeglib.cpp)

# Explicitly set the path of each FFmpeg library file
set(avformat_LIBRARY ${FFMPEG_LIB_DIR}/libavformat.so)
set(avcodec_LIBRARY ${FFMPEG_LIB_DIR}/libavcodec.so)
set(avutil_LIBRARY ${FFMPEG_LIB_DIR}/libavutil.so)
set(swresample_LIBRARY ${FFMPEG_LIB_DIR}/libswresample.so)
set(swscale_LIBRARY ${FFMPEG_LIB_DIR}/libswscale.so)
set(avdevice_LIBRARY ${FFMPEG_LIB_DIR}/libavdevice.so)
set(avfilter_LIBRARY ${FFMPEG_LIB_DIR}/libavfilter.so)

# Verify each .so exists and print the path found. (When linking by name rather
# than by path, the lib prefix and .so suffix are omitted.)
foreach (LIB avformat avcodec avutil swresample swscale avdevice avfilter)
    if(EXISTS ${${LIB}_LIBRARY})
        message(STATUS "${LIB}_LIBRARY: ${${LIB}_LIBRARY}")
    else()
        message(FATAL_ERROR "${LIB}_LIBRARY not found at ${${LIB}_LIBRARY}. Please check the paths and ensure the libraries exist.")
    endif()
endforeach()

# Link the libraries
target_link_libraries(${CMAKE_PROJECT_NAME}
        ${avutil_LIBRARY}
        ${swresample_LIBRARY}
        ${swscale_LIBRARY}
        ${avcodec_LIBRARY}
        ${avdevice_LIBRARY}
        ${avfilter_LIBRARY}
        ${avformat_LIBRARY}
        # List libraries link to the target library
        c++_shared
        android
        log)
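As an aside, an equivalent and arguably more idiomatic way to link prebuilt .so files is to declare each one as an IMPORTED target instead of passing raw paths to target_link_libraries. A sketch for one library, assuming the same layout as above:

```cmake
# Sketch: declare libavformat as an IMPORTED shared library, then link the
# target name instead of the full path. Repeat for the other FFmpeg libs.
add_library(avformat SHARED IMPORTED)
set_target_properties(avformat PROPERTIES
        IMPORTED_LOCATION ${FFMPEG_LIB_DIR}/libavformat.so
        INTERFACE_INCLUDE_DIRECTORIES ${FFMPEG_INCLUDE_DIR})

# target_link_libraries(${CMAKE_PROJECT_NAME} avformat ...)
```

With INTERFACE_INCLUDE_DIRECTORIES on the imported target, the separate include_directories() call becomes unnecessary for consumers of that target.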

3. Edit ffmpeglib.cpp

Add an initFfmpeg method to initialize FFmpeg:

extern "C" JNIEXPORT void JNICALL
Java_com_bob_ffmpegdemo_NativeLib_initFfmpeg(JNIEnv *env, jobject /* this */) {
    // Initialize FFmpeg's network components
    avformat_network_init();

    // Log the FFmpeg build configuration (avformat_configuration() returns
    // the configure options, not a version string)
    const char *config = avformat_configuration();
    LOGD("FFmpeg configuration: %s", config);
}

4. Edit NativeLib.kt to add the methods and load the library

package com.bob.ffmpegdemo

class NativeLib {
    /**
     * A native method that is implemented by the 'ffmpeglib' native library,
     * which is packaged with this application.
     */
    external fun stringFromJNI(): String
    external fun initFfmpeg()

    companion object {
        // Used to load the 'ffmpeglib' library on application startup.
        init {
            System.loadLibrary("ffmpeglib")
        }
    }
}

5. Calling it

The methods of the NativeLib class can be called directly from an Activity:

    override fun onResume() {
        super.onResume()
        testFfmpeg()
    }

    private fun testFfmpeg() {
        val nativeLib = NativeLib()
        Log.d(TAG, "-------- ${nativeLib.stringFromJNI()}")
        nativeLib.initFfmpeg()
    }

Run the app directly; on success it prints output like the following:

IV. Adding a Video-Parsing Feature

After the previous three sections, the FFmpeg libraries are in the app, but so far they only print FFmpeg's build configuration; we don't yet know whether FFmpeg actually works. This section adds a feature that parses a video file.

  •  Add a testOpenVideo method to ffmpeglib.cpp to parse the video:
extern "C" JNIEXPORT jstring JNICALL
Java_com_bob_ffmpegdemo_NativeLib_testOpenVideo(JNIEnv *env, jobject /* this */, jstring filePath) {
    const char *path = env->GetStringUTFChars(filePath, NULL);

    // Prepend 'file://' so the path goes through FFmpeg's file protocol
    std::string full_path = "file://" + std::string(path);
    LOGD("Attempting to open video file: %s", full_path.c_str());

    AVFormatContext *pFormatCtx = nullptr;
    int ret = avformat_open_input(&pFormatCtx, full_path.c_str(), NULL, NULL);
    if (ret < 0) {
        char errbuf[AV_ERROR_MAX_STRING_SIZE];
        av_strerror(ret, errbuf, sizeof(errbuf));
        LOGE("Failed to open video file: %s", errbuf);
        env->ReleaseStringUTFChars(filePath, path);
        return env->NewStringUTF("Failed to open video file.");
    }

    if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
        LOGE("Failed to retrieve stream information.");
        avformat_close_input(&pFormatCtx);
        env->ReleaseStringUTFChars(filePath, path);
        return env->NewStringUTF("Failed to retrieve stream information.");
    }

    // Dump the container info (av_dump_format writes to FFmpeg's log, not logcat)
    av_dump_format(pFormatCtx, 0, full_path.c_str(), 0);

    // Compute the duration (in ms) and bitrate (in kbps)
    int64_t duration = pFormatCtx->duration != AV_NOPTS_VALUE ?
                       av_rescale_q(pFormatCtx->duration, AV_TIME_BASE_Q, {1, 1000}) : -1;
    int64_t bitrate = pFormatCtx->bit_rate / 1000;

    // Walk the streams, handling the case where there is no audio stream
    bool hasAudioStream = false;
    for (unsigned int i = 0; i < pFormatCtx->nb_streams; ++i) {
        AVStream *stream = pFormatCtx->streams[i];
        AVCodecParameters *codecpar = stream->codecpar;

        if (codecpar->codec_type == AVMEDIA_TYPE_VIDEO) {
            LOGD("Video Stream: Codec %s, Resolution %dx%d",
                 avcodec_get_name(codecpar->codec_id),
                 codecpar->width, codecpar->height);
        } else if (codecpar->codec_type == AVMEDIA_TYPE_AUDIO) {
            hasAudioStream = true;

            // Channel count: FFmpeg 5.1+ (libavcodec 59.24.100) moved it into
            // the AVChannelLayout struct; older releases still expose
            // channel_layout / av_get_channel_layout_nb_channels().
            int channels = 2; // fall back to stereo if nothing is reported
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(59, 24, 100)
            if (codecpar->ch_layout.nb_channels > 0) {
                channels = codecpar->ch_layout.nb_channels;
            }
#else
            if (codecpar->channel_layout) {
                channels = av_get_channel_layout_nb_channels(codecpar->channel_layout);
            }
#endif

            LOGD("Audio Stream: Codec %s, Sample Rate %d Hz, Channels %d",
                 avcodec_get_name(codecpar->codec_id),
                 codecpar->sample_rate,
                 channels);
        }
    }

    if (!hasAudioStream) {
        LOGD("No audio streams found in the video file.");
    }

    char info[1024];
    snprintf(info, sizeof(info), "Duration: %lld ms, Bitrate: %lld kbps",
             static_cast<long long>(duration),
             static_cast<long long>(bitrate));

    avformat_close_input(&pFormatCtx);
    env->ReleaseStringUTFChars(filePath, path);

    return env->NewStringUTF(info);
}

Because the mp4 I used was one I recorded myself, with no audio track, the hasAudioStream check was added.

  •  Edit NativeLib.kt

Add:

external fun testOpenVideo(path: String): String

  •  Calling the method:
        val outputDir = getExternalFilesDir(null)
        val outputFile = File(outputDir, FILENAME)

        if (!outputFile.exists()) {
            Log.e(TAG, "File does not exist at path: ${outputFile.absolutePath}")
            return
        } else if (!outputFile.canRead()) {
            Log.e(TAG, "File is not readable at path: ${outputFile.absolutePath}")
            return
        }

        val result = nativeLib.testOpenVideo(outputFile.absolutePath)
        Log.d(TAG, "-------- $result")

After a successful run:

 


The CMakeLists.txt shown above is already complete. Below are the full ffmpeglib.cpp, NativeLib.kt, and MainActivity.kt files.

ffmpeglib.cpp

#include <jni.h>
#include <string>
#include <android/log.h>

extern "C" {
#include "ffmpeg/include/libavformat/avformat.h"
}

#define LOG_TAG "NativeLib"
#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, LOG_TAG, __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__)

extern "C" JNIEXPORT jstring JNICALL
Java_com_bob_ffmpegdemo_NativeLib_stringFromJNI(
        JNIEnv *env,
        jobject /* this */) {
    std::string hello = "Hello from FFmpeg";
    return env->NewStringUTF(hello.c_str());
}

extern "C" JNIEXPORT void JNICALL
Java_com_bob_ffmpegdemo_NativeLib_initFfmpeg(JNIEnv *env, jobject /* this */) {
    // Initialize FFmpeg's network components
    avformat_network_init();

    // Log the FFmpeg build configuration (avformat_configuration() returns
    // the configure options, not a version string)
    const char *config = avformat_configuration();
    LOGD("FFmpeg configuration: %s", config);
}


extern "C" JNIEXPORT jstring JNICALL
Java_com_bob_ffmpegdemo_NativeLib_testOpenVideo(JNIEnv *env, jobject /* this */, jstring filePath) {
    const char *path = env->GetStringUTFChars(filePath, NULL);

    // Prepend 'file://' so the path goes through FFmpeg's file protocol
    std::string full_path = "file://" + std::string(path);
    LOGD("Attempting to open video file: %s", full_path.c_str());

    AVFormatContext *pFormatCtx = nullptr;
    int ret = avformat_open_input(&pFormatCtx, full_path.c_str(), NULL, NULL);
    if (ret < 0) {
        char errbuf[AV_ERROR_MAX_STRING_SIZE];
        av_strerror(ret, errbuf, sizeof(errbuf));
        LOGE("Failed to open video file: %s", errbuf);
        env->ReleaseStringUTFChars(filePath, path);
        return env->NewStringUTF("Failed to open video file.");
    }

    if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
        LOGE("Failed to retrieve stream information.");
        avformat_close_input(&pFormatCtx);
        env->ReleaseStringUTFChars(filePath, path);
        return env->NewStringUTF("Failed to retrieve stream information.");
    }

    // Dump the container info (av_dump_format writes to FFmpeg's log, not logcat)
    av_dump_format(pFormatCtx, 0, full_path.c_str(), 0);

    // Compute the duration (in ms) and bitrate (in kbps)
    int64_t duration = pFormatCtx->duration != AV_NOPTS_VALUE ?
                       av_rescale_q(pFormatCtx->duration, AV_TIME_BASE_Q, {1, 1000}) : -1;
    int64_t bitrate = pFormatCtx->bit_rate / 1000;

    // Walk the streams, handling the case where there is no audio stream
    bool hasAudioStream = false;
    for (unsigned int i = 0; i < pFormatCtx->nb_streams; ++i) {
        AVStream *stream = pFormatCtx->streams[i];
        AVCodecParameters *codecpar = stream->codecpar;

        if (codecpar->codec_type == AVMEDIA_TYPE_VIDEO) {
            LOGD("Video Stream: Codec %s, Resolution %dx%d",
                 avcodec_get_name(codecpar->codec_id),
                 codecpar->width, codecpar->height);
        } else if (codecpar->codec_type == AVMEDIA_TYPE_AUDIO) {
            hasAudioStream = true;

            // Channel count: FFmpeg 5.1+ (libavcodec 59.24.100) moved it into
            // the AVChannelLayout struct; older releases still expose
            // channel_layout / av_get_channel_layout_nb_channels().
            int channels = 2; // fall back to stereo if nothing is reported
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(59, 24, 100)
            if (codecpar->ch_layout.nb_channels > 0) {
                channels = codecpar->ch_layout.nb_channels;
            }
#else
            if (codecpar->channel_layout) {
                channels = av_get_channel_layout_nb_channels(codecpar->channel_layout);
            }
#endif

            LOGD("Audio Stream: Codec %s, Sample Rate %d Hz, Channels %d",
                 avcodec_get_name(codecpar->codec_id),
                 codecpar->sample_rate,
                 channels);
        }
    }

    if (!hasAudioStream) {
        LOGD("No audio streams found in the video file.");
    }

    char info[1024];
    snprintf(info, sizeof(info), "Duration: %lld ms, Bitrate: %lld kbps",
             static_cast<long long>(duration),
             static_cast<long long>(bitrate));

    avformat_close_input(&pFormatCtx);
    env->ReleaseStringUTFChars(filePath, path);

    return env->NewStringUTF(info);
}

NativeLib.kt

package com.bob.ffmpegdemo

class NativeLib {

    /**
     * A native method that is implemented by the 'ffmpeglib' native library,
     * which is packaged with this application.
     */
    external fun stringFromJNI(): String
    external fun initFfmpeg()
    external fun testOpenVideo(path: String): String

    companion object {
        // Used to load the 'ffmpeglib' library on application startup.
        init {
            System.loadLibrary("ffmpeglib")
        }
    }
}

MainActivity.kt

package com.bob.ffmpegdemo

import android.os.Bundle
import android.util.Log
import androidx.activity.enableEdgeToEdge
import androidx.appcompat.app.AppCompatActivity
import androidx.core.view.ViewCompat
import androidx.core.view.WindowInsetsCompat
import com.bob.ffmpegdemo.databinding.ActivityMainBinding
import java.io.File
import java.io.FileOutputStream
import java.io.IOException
import java.io.InputStream
import java.io.OutputStream

class MainActivity : AppCompatActivity() {
    companion object {
        const val FILENAME = "abc.mp4"
        const val TAG = "TAG"
    }

    private val binding: ActivityMainBinding by lazy {
        ActivityMainBinding.inflate(layoutInflater)
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        enableEdgeToEdge()
        setContentView(binding.root)
        ViewCompat.setOnApplyWindowInsetsListener(binding.main) { v, insets ->
            val systemBars = insets.getInsets(WindowInsetsCompat.Type.systemBars())
            v.setPadding(systemBars.left, systemBars.top, systemBars.right, systemBars.bottom)
            insets
        }
        copyAssetToFile()
    }

    private fun copyAssetToFile() {
        val dir = getExternalFilesDir(null)
        // Step 1: Create a file object in that directory
        val outFile = File(dir, FILENAME)

        // Step 2: Copy the asset file to the external files directory.
        // use {} closes both streams even if the copy throws.
        try {
            assets.open(FILENAME).use { input: InputStream ->
                FileOutputStream(outFile).use { output: OutputStream ->
                    input.copyTo(output)
                }
            }
            Log.d(TAG, "Successfully copied $FILENAME to external files directory.")
        } catch (e: IOException) {
            Log.e(TAG, "Failed to copy asset file: " + e.message)
        }
    }

    override fun onResume() {
        super.onResume()
        testFfmpeg()
    }

    private fun testFfmpeg() {
        val nativeLib = NativeLib()
        Log.d(TAG, "-------- ${nativeLib.stringFromJNI()}")
        nativeLib.initFfmpeg()

        val outputDir = getExternalFilesDir(null)
        val outputFile = File(outputDir, FILENAME)

        if (!outputFile.exists()) {
            Log.e(TAG, "File does not exist at path: ${outputFile.absolutePath}")
            return
        } else if (!outputFile.canRead()) {
            Log.e(TAG, "File is not readable at path: ${outputFile.absolutePath}")
            return
        }

        val result = nativeLib.testOpenVideo(outputFile.absolutePath)
        Log.d(TAG, "-------- $result")
    }
}

Summary

That's all for this post: adding the prebuilt FFmpeg libraries to an Android app and verifying they work by parsing a video file.