[android] FFmpeg on Android

I have FFmpeg compiled (libffmpeg.so) on Android. Now I have to either build an application like RockPlayer or use the existing Android multimedia framework to invoke FFmpeg.

  1. Do you have steps/procedures/code/examples for integrating FFmpeg on Android/StageFright?

  2. Can you please guide me on how I can use this library for multimedia playback?

  3. I have a requirement where I already have audio and video transport streams, which I need to feed to FFmpeg and get decoded/rendered. How can I do this on Android, since the IOMX APIs are OMX-based and I cannot plug FFmpeg in there?

  4. Also, I could not find documentation on the FFmpeg APIs that need to be used for playback.

Tags: android, ffmpeg, stagefright, android-ffmpeg

The answers:


After a lot of research, this is the most up-to-date compiled library for Android that I have found:

https://github.com/bravobit/FFmpeg-Android

  • At the moment it uses FFmpeg release n4.0-39-gda39990
  • Includes FFmpeg and FFprobe
  • Contains a Java interface to launch the commands (see the sketch below)
  • FFprobe or FFmpeg can be removed from the APK; check the wiki: https://github.com/bravobit/FFmpeg-Android/wiki
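
For reference, a minimal sketch of launching a command through that Java interface. It is based on the ExecuteBinaryResponseHandler-style callback API this family of libraries shares (see the WritingMinds examples further down); treat the exact class and method names as assumptions and check the project's README:

FFmpeg ffmpeg = FFmpeg.getInstance(context);
if (ffmpeg.isSupported()) {
    // Placeholder paths; remuxes the input without re-encoding, as a smoke test.
    String[] cmd = {"-i", inputPath, "-c", "copy", outputPath};
    ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            Log.d(TAG, "ffmpeg finished: " + message);
        }

        @Override
        public void onFailure(String message) {
            Log.e(TAG, "ffmpeg failed: " + message);
        }
    });
}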

Strange that this project hasn't been mentioned: AndroidFFmpeg from Appunite.

It has quite detailed step-by-step instructions to copy/paste into the command line, for lazy people like me ))


The easiest-to-build, easiest-to-use implementation I have found is made by the Guardian Project team: https://github.com/guardianproject/android-ffmpeg


For various reasons, multimedia was and is never easy in terms of achieving a task without compromising on efficiency. FFmpeg is an effort to improve this day by day. It supports different codecs and container formats.

Now, to answer the question of how to use this library: it is not so simple to write it all here, but I can guide you in the following ways.

1) Inside the ffmpeg source directory, you have output_example.c or api_example.c. There you can see the code where encoding/decoding is done, and you will get an idea of which APIs inside ffmpeg you should call. This would be your first step.

2) Dolphin Player is an open-source project for Android. It currently has bugs, but the developers are working on it continuously. In that project you have the whole setup ready, which you can use to continue your investigation. Here is a link to the project on code.google.com, or run the command "git clone https://code.google.com/p/dolphin-player/" in a terminal. You can see two projects, named P and P86; you can use either of them.

An extra tip I would like to offer: when you are building the ffmpeg code, inside build.sh you need to enable the muxers/demuxers/encoders/decoders of the formats you want to use, otherwise the corresponding code will not be included in the libraries. It took me a lot of time to realize this, so I thought I'd share it with you.

A few basics: when we say a video file, e.g. AVI, it is a combination of both audio and video:

Video file = Video + Audio


Video = Codec + Muxer + Demuxer

Codec = Encoder + Decoder

=> Video = Encoder + Decoder + Muxer + Demuxer (MPEG-4 + MPEG-4 + AVI + AVI, for an AVI container)


Audio = Codec + Muxer + Demuxer

Codec = Encoder + Decoder

=> Audio = Encoder + Decoder + Muxer + Demuxer (MP2 + MP2 + AVI + AVI, for an AVI container)


A codec (the name is derived from enCOder/DECoder) is just the part of a format that defines the algorithms used to encode/decode a frame. AVI is not a codec; it is a container, which here uses the MPEG-4 video codec and the MP2 audio codec.

A muxer/demuxer is used to combine/separate the encoded streams into/from a container file while encoding/decoding.

So if you want to use the AVI format, you need to enable both the video components and the audio components.

For example, for AVI you need to enable the following: the mpeg4 encoder, mpeg4 decoder, mp2 encoder, mp2 decoder, avi muxer, and avi demuxer.

phewwwwwww...

Concretely, build.sh should contain the following flags:

--enable-muxer=avi --enable-demuxer=avi          (generic for both audio/video; specific to the container)
--enable-encoder=mpeg4 --enable-decoder=mpeg4    (for video support)
--enable-encoder=mp2 --enable-decoder=mp2        (for audio support)

Hope I did not confuse you more after all this...

Thanks. If you need any assistance, please let me know.


Inspired by many other FFmpeg-on-Android implementations out there (mainly the guardianproject), I found a solution (with LAME support as well).

(LAME and FFmpeg: https://github.com/intervigilium/liblame and http://bambuser.com/opensource)

To call FFmpeg:

new Thread(new Runnable() {

    @Override
    public void run() {

        Looper.prepare();

        FfmpegController ffmpeg = null;

        try {
            ffmpeg = new FfmpegController(context);
        } catch (IOException ioe) {
            Log.e(DEBUG_TAG, "Error loading ffmpeg. " + ioe.getMessage());
            return; // ffmpeg is unusable if the controller failed to load
        }

        ShellDummy shell = new ShellDummy();
        String mp3BitRate = "192";

        try {
            // in, out and audio are defined by the enclosing class
            ffmpeg.extractAudio(in, out, audio, mp3BitRate, shell);
        } catch (IOException e) {
            Log.e(DEBUG_TAG, "IOException running ffmpeg: " + e.getMessage());
        } catch (InterruptedException e) {
            Log.e(DEBUG_TAG, "InterruptedException running ffmpeg: " + e.getMessage());
        }

        Looper.loop();

    }

}).start();

And to handle the console output:

private class ShellDummy implements ShellCallback {

    @Override
    public void shellOut(String shellLine) {
        if (someCondition) {
            doSomething(shellLine);
        }
        Utils.logger("d", shellLine, DEBUG_TAG);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue == 0) {
            // Audio job OK, do your stuff:
            // e.g. write ID3 tags, call the media scanner, etc.
        }
    }

    @Override
    public void processNotStartedCheck(boolean started) {
        if (!started) {
            // Audio job error; handle as above.
        }
    }
}

I've done a little project to configure and build X264 and FFmpeg using the Android NDK. The main thing that's missing is a decent JNI interface to make it accessible via Java, but that is the (relatively) easy part. When I get round to making the JNI interface good for my own uses, I'll push that too.

The benefit over olvaffe's build system is that it doesn't require Android.mk files to build the libraries; it just uses the regular makefiles and the toolchain. This makes it much less likely to stop working when you pull new changes from FFmpeg or X264.

https://github.com/halfninja/android-ffmpeg-x264


I had the same issue. I found most of the answers here outdated, so I ended up writing a wrapper around FFmpeg that can be accessed from Android with a single line of code.

https://github.com/madhavanmalolan/ffmpegandroidlibrary


First, add the dependency on the FFmpeg library:

implementation 'com.writingminds:FFmpegAndroid:0.3.2'

Then load the binary in your activity (see loadFFMpegBinary() below) and run your command; for example, trimming a video:

FFmpeg ffmpeg;

private void trimVideo(ProgressDialog progressDialog) {

    outputAudioMux = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES).getAbsolutePath()
            + "/VidEffectsFilter" + "/" + new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date())
            + "filter_apply.mp4";

    if (startTrim.equals("")) {
        startTrim = "00:00:00";
    }

    if (endTrim.equals("")) {
        endTrim = timeTrim(player.getDuration());
    }

    // Note: -t expects a duration; if endTrim holds an absolute end time, use -to instead.
    String[] cmd = new String[]{"-ss", startTrim + ".00", "-t", endTrim + ".00", "-noaccurate_seek", "-i", videoPath, "-codec", "copy", "-avoid_negative_ts", "1", outputAudioMux};

    execFFmpegBinary1(cmd, progressDialog);
}



private void execFFmpegBinary1(final String[] command, ProgressDialog prpg) {

    final ProgressDialog progressDialog = prpg;

    try {
        ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
            @Override
            public void onFailure(String s) {
                progressDialog.dismiss();
                Toast.makeText(PlayerTestActivity.this, "Failed to generate video", Toast.LENGTH_SHORT).show();
                Log.d(TAG, "FAILED with output : " + s);
            }

            @Override
            public void onSuccess(String s) {
                Log.d(TAG, "SUCCESS with output : " + s);

                String finalPath = outputAudioMux;
                videoPath = outputAudioMux;
                Toast.makeText(PlayerTestActivity.this, "Storage Path = " + finalPath, Toast.LENGTH_SHORT).show();

                Intent intent = new Intent(PlayerTestActivity.this, ShareVideoActivity.class);
                intent.putExtra("pathGPU", finalPath);
                startActivity(intent);
                finish();
                MediaScannerConnection.scanFile(PlayerTestActivity.this, new String[]{finalPath}, new String[]{"video/mp4"}, null);
            }

            @Override
            public void onProgress(String s) {
                Log.d(TAG, "Progress : " + s);
                progressDialog.setMessage("Please wait, trimming video...");
            }

            @Override
            public void onStart() {
                Log.d(TAG, "Started command : ffmpeg " + TextUtils.join(" ", command));
            }

            @Override
            public void onFinish() {
                Log.d(TAG, "Finished command : ffmpeg " + TextUtils.join(" ", command));
                progressDialog.dismiss();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // The library only runs one command at a time; ignore for now.
    }
}

  private void loadFFMpegBinary() {
    try {
        if (ffmpeg == null) {
            ffmpeg = FFmpeg.getInstance(this);
        }
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                showUnsupportedExceptionDialog();
            }

            @Override
            public void onSuccess() {
                Log.d("dd", "ffmpeg : correct Loaded");
            }
        });
    } catch (FFmpegNotSupportedException e) {
        showUnsupportedExceptionDialog();
    } catch (Exception e) {
        Log.d("dd", "EXception no controlada : " + e);
    }
}

private void showUnsupportedExceptionDialog() {
    new AlertDialog.Builder(this)
            .setIcon(android.R.drawable.ic_dialog_alert)
            .setTitle("Not Supported")
            .setMessage("Device Not Supported")
            .setCancelable(false)
            .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    finish();
                }
            })
            .create()
            .show();

}
    public String timeTrim(long milliseconds) {
        String finalTimerString = "";
        String minutString = "";
        String secondsString = "";

        // Convert total duration into time
        int hours = (int) (milliseconds / (1000 * 60 * 60));
        int minutes = (int) (milliseconds % (1000 * 60 * 60)) / (1000 * 60);
        int seconds = (int) ((milliseconds % (1000 * 60 * 60)) % (1000 * 60) / 1000);
        // Prepend 0 to hours if it is one digit
        if (hours < 10) {
            finalTimerString = "0" + hours + ":";
        } else {
            finalTimerString = hours + ":";
        }

        // Prepend 0 to minutes if it is one digit
        if (minutes < 10) {
            minutString = "0" + minutes;
        } else {
            minutString = "" + minutes;
        }

        // Prepending 0 to seconds if it is one digit
        if (seconds < 10) {
            secondsString = "0" + seconds;
        } else {
            secondsString = "" + seconds;
        }

        finalTimerString = finalTimerString + minutString + ":" + secondsString;

        // return timer string
        return finalTimerString;
    }
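
A quick sanity check of timeTrim(), worked out by hand from the arithmetic above:

// timeTrim(83000)   returns "00:01:23"  (83,000 ms = 1 min 23 s)
// timeTrim(3723000) returns "01:02:03"  (3,723,000 ms = 1 h 2 min 3 s)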

You can also use other FFmpeg features:

===> Merge audio into video:
String[] cmd = new String[]{"-i", yourRealPath, "-i", arrayList.get(posmusic).getPath(), "-map", "1:a", "-map", "0:v", "-codec", "copy", "-shortest", outputcrop};


===> Flip vertically:
String[] cm = new String[]{"-i", yourRealPath, "-vf", "vflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Flip horizontally:
String[] cm = new String[]{"-i", yourRealPath, "-vf", "hflip", "-codec:v", "libx264", "-preset", "ultrafast", "-codec:a", "copy", outputcrop1};


===> Rotate 90 degrees clockwise (this only sets the rotation metadata flag that players honor; the video itself is not re-encoded):
String[] cm = new String[]{"-i", yourRealPath, "-c", "copy", "-metadata:s:v:0", "rotate=90", outputcrop1};


===> Compress video:
String[] complexCommand = {"-y", "-i", yourRealPath, "-strict", "experimental", "-vcodec", "libx264", "-preset", "ultrafast", "-crf", "24", "-acodec", "aac", "-ar", "22050", "-ac", "2", "-b", "360k", "-s", "1280x720", outputcrop1};


===> Speed video up/down (half speed, normal, 1.5x, 2x; see the helper sketch below):
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=1.0*PTS[v];[0:a]atempo=1.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.667*PTS[v];[0:a]atempo=1.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};
String[] complexCommand = {"-y", "-i", yourRealPath, "-filter_complex", "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputcrop1};



===> Concatenate two mp3 files:

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0]concat=n=2:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()




===> Concatenate three mp3 files:

StringBuilder sb = new StringBuilder();
sb.append("-i ");
sb.append(firstSngname);
sb.append(" -i ");
sb.append(textSngname);
sb.append(" -i ");
sb.append(mAudioFilename);
sb.append(" -filter_complex [0:0][1:0][2:0]concat=n=3:v=0:a=1[out] -map [out] ");
sb.append(finalfile);
---> ffmpeg.execute(sb.toString().split(" "), new ExecuteBinaryResponseHandler()
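
One caveat with the StringBuilder/split(" ") pattern in the two snippets above: the command breaks as soon as a file path contains a space, because the path itself gets split into several arguments. Building the String[] directly avoids this; a sketch of the same three-file concat, using the variables from above:

// Safe equivalent of the three-file concat: each path stays a single
// argument, so spaces in file names survive.
String[] cmd = new String[]{
        "-i", firstSngname,
        "-i", textSngname,
        "-i", mAudioFilename,
        "-filter_complex", "[0:0][1:0][2:0]concat=n=3:v=0:a=1[out]",
        "-map", "[out]",
        finalfile};
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler());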

Here are the steps I went through to get ffmpeg working on Android:

  1. Build static libraries of ffmpeg for Android. This was achieved by building olvaffe's ffmpeg Android port (libffmpeg) using the Android build system. Simply place the sources under /external and make away. You'll need to extract bionic (libc) and zlib (libz) from the Android build as well, as the ffmpeg libraries depend on them.
  2. Create a dynamic library wrapping the ffmpeg functionality, using the Android NDK. There's a lot of documentation out there on how to work with the NDK. Basically, you'll need to write some C/C++ code to export the functionality you need out of ffmpeg into a library Java can interact with through JNI. The NDK allows you to easily link against the static libraries you generated in step 1; just add a line similar to this to Android.mk: LOCAL_STATIC_LIBRARIES := libavcodec libavformat libavutil libc libz

  3. Use the ffmpeg-wrapping dynamic library from your Java sources (a minimal sketch follows these steps). There's enough documentation on JNI out there; you should be fine.
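
To make steps 2 and 3 concrete, here is a rough sketch of the Java side of such a wrapper. All names are illustrative (not from olvaffe's port or any particular project); the matching C function would live in the dynamic library you built in step 2:

// Java side of a hypothetical JNI bridge around the native ffmpeg code.
public class FfmpegBridge {
    static {
        // Loads libffmpegbridge.so, the wrapper library built in step 2.
        System.loadLibrary("ffmpegbridge");
    }

    // Implemented in C via JNI, e.g. by calling avformat_open_input()
    // and returning 0 on success.
    public static native int openMedia(String path);
}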

Regarding using ffmpeg for playback: there are many examples (the ffmpeg binary itself is a good one), and here's a basic tutorial. The best documentation can be found in the headers.

Good luck :)


To make my FFmpeg application I used this project (https://github.com/hiteshsondhi88/ffmpeg-android-java), so I didn't have to compile anything. I think it's the easiest way to use FFmpeg in our Android applications.

More info at http://hiteshsondhi88.github.io/ffmpeg-android-java/