
How to use native C libraries in Android Studio

I built a project a few years back based on https://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/. My project was built in the version of Eclipse provided directly by Google at the time, and it worked fine with a copy of the compiled ffmpeg libraries built under my application's name.

Now I am trying to create a new application based on my old one. Since Google no longer supports Eclipse, I downloaded Android Studio and imported my project. With a few fixes I was able to compile the old version of the project. So I changed the name, copied the new set of ".so" files into app\src\main\jniLibs\armeabi (where I assumed they should go), and tried running the application on my phone again without any other changes.
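
For clarity, here is a sketch of the layout I assumed, using the prebuilt library names from my Android.mk below plus libvideo.so, the wrapper built from my video.c (roughly the file set from my old Eclipse build):

    app/src/main/jniLibs/armeabi/
        libavcodec.so
        libavcore.so
        libavfilter.so
        libavformat.so
        libavutil.so
        libswscale.so
        libvideo.so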

The NDK reports no errors. Gradle compiles the app without errors and installs it on my phone. The app appears in my list of live wallpapers and I can click it to open the preview. But instead of the video I get an error, and logcat reports:

02-26 21:50:31.164 18757-18757/? E/AndroidRuntime﹕ FATAL EXCEPTION: main 
java.lang.ExceptionInInitializerError 
     at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165) 
     at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81) 
     at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273) 
     at android.app.ActivityThread.access$1600(ActivityThread.java:127) 
     at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212) 
     at android.os.Handler.dispatchMessage(Handler.java:99) 
     at android.os.Looper.loop(Looper.java:137) 
     at android.app.ActivityThread.main(ActivityThread.java:4441) 
     at java.lang.reflect.Method.invokeNative(Native Method) 
     at java.lang.reflect.Method.invoke(Method.java:511) 
     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:823) 
     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:590) 
     at dalvik.system.NativeStart.main(Native Method) 
Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1936]: 144 could not load needed library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' for 'libavcore.so' (load_library[1091]: Library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' not found) 
     at java.lang.Runtime.loadLibrary(Runtime.java:370) 
     at java.lang.System.loadLibrary(System.java:535) 
     at com.nightscapecreations.anim3free.NativeCalls.<clinit>(NativeCalls.java:64) 
     at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165) 
     at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81) 
     at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273) 
     at android.app.ActivityThread.access$1600(ActivityThread.java:127) 
     at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212) 
     at android.os.Handler.dispatchMessage(Handler.java:99) 
     at android.os.Looper.loop(Looper.java:137) 
     at android.app.ActivityThread.main(ActivityThread.java:4441) 
     at java.lang.reflect.Method.invokeNative(Native Method) 
     at java.lang.reflect.Method.invoke(Method.java:511) 

I am a beginning Android/Java/C++ developer and I'm not sure what this error means, but Google leads me to believe my new libraries are not being found. In my Eclipse project I had this set of libraries in "libs\armeabi", and another copy of them in a more complicated folder structure at "jni\ffmpeg-android\build\ffmpeg\armeabi\lib". Android Studio seems to have kept everything the same, apart from renaming "libs" to "jniLibs", but I'm hitting a brick wall with this error and I'm not sure how to proceed.

How can I compile this new app, with its new name, using Android Studio?

In case it helps, here is my Android.mk file:

    LOCAL_PATH := $(call my-dir) 

    include $(CLEAR_VARS) 
    MY_LIB_PATH := ffmpeg-android/build/ffmpeg/armeabi/lib 
    LOCAL_MODULE := bambuser-libavcore 
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcore.so 
    include $(PREBUILT_SHARED_LIBRARY) 

    include $(CLEAR_VARS) 
    LOCAL_MODULE := bambuser-libavformat 
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavformat.so 
    include $(PREBUILT_SHARED_LIBRARY) 

    include $(CLEAR_VARS) 
    LOCAL_MODULE := bambuser-libavcodec 
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcodec.so 
    include $(PREBUILT_SHARED_LIBRARY) 

    include $(CLEAR_VARS) 
    LOCAL_MODULE := bambuser-libavfilter 
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavfilter.so 
    include $(PREBUILT_SHARED_LIBRARY) 

    include $(CLEAR_VARS) 
    LOCAL_MODULE := bambuser-libavutil 
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavutil.so 
    include $(PREBUILT_SHARED_LIBRARY) 

    include $(CLEAR_VARS) 
    LOCAL_MODULE := bambuser-libswscale 
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libswscale.so 
    include $(PREBUILT_SHARED_LIBRARY) 

    #local_PATH := $(call my-dir) 

    include $(CLEAR_VARS) 

    LOCAL_CFLAGS := -DANDROID_NDK \ 
        -DDISABLE_IMPORTGL 

    LOCAL_MODULE := video 
    LOCAL_SRC_FILES := video.c 

    LOCAL_C_INCLUDES := \ 
     $(LOCAL_PATH)/include \ 
     $(LOCAL_PATH)/ffmpeg-android/ffmpeg \ 
     $(LOCAL_PATH)/freetype/include/freetype2 \ 
     $(LOCAL_PATH)/freetype/include \ 
     $(LOCAL_PATH)/ftgl/src \ 
     $(LOCAL_PATH)/ftgl 
    LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -L$(LOCAL_PATH)/ffmpeg-android/build/ffmpeg/armeabi/lib/ -lGLESv1_CM -ldl -lavformat -lavcodec -lavfilter -lavutil -lswscale -llog -lz -lm 

    include $(BUILD_SHARED_LIBRARY) 

And here is my NativeCalls.java:

    package com.nightscapecreations.anim3free; 

    public class NativeCalls { 
     //ffmpeg 
     public static native void initVideo(); 
     public static native void loadVideo(String fileName); // 
     public static native void prepareStorageFrame(); 
     public static native void getFrame(); // 
     public static native void freeConversionStorage(); 
     public static native void closeVideo();// 
     public static native void freeVideo();// 
     //opengl 
     public static native void initPreOpenGL(); // 
     public static native void initOpenGL(); // 
     public static native void drawFrame(); // 
     public static native void closeOpenGL(); // 
     public static native void closePostOpenGL();// 
     //wallpaper 
     public static native void updateVideoPosition(); 
     public static native void setSpanVideo(boolean b); 
     //getters 
     public static native int getVideoHeight(); 
     public static native int getVideoWidth(); 
     //setters 
     public static native void setWallVideoDimensions(int w,int h); 
     public static native void setWallDimensions(int w,int h); 
     public static native void setScreenPadding(int w,int h); 
     public static native void setVideoMargins(int w,int h); 
     public static native void setDrawDimensions(int drawWidth,int drawHeight); 
     public static native void setOffsets(int x,int y); 
     public static native void setSteps(int xs,int ys); 
     public static native void setScreenDimensions(int w, int h); 
     public static native void setTextureDimensions(int tx, 
           int ty); 
     public static native void setOrientation(boolean b); 
     public static native void setPreviewMode(boolean b); 
     public static native void setTonality(int t); 
     public static native void toggleGetFrame(boolean b); 
     //fps 
     public static native void setLoopVideo(boolean b); 

     static { 
     System.loadLibrary("avcore"); 
     System.loadLibrary("avformat"); 
     System.loadLibrary("avcodec"); 
     //System.loadLibrary("avdevice"); 
     System.loadLibrary("avfilter"); 
     System.loadLibrary("avutil"); 
     System.loadLibrary("swscale"); 
     System.loadLibrary("video"); 
     } 

    } 

EDIT

Here is the first part of my video.c file:

    #include <GLES/gl.h> 
    #include <GLES/glext.h> 

    #include <GLES2/gl2.h> 
    #include <GLES2/gl2ext.h> 

    #include <stdlib.h> 
    #include <time.h> 

    #include <libavcodec/avcodec.h> 
    #include <libavformat/avformat.h> 
    #include <libswscale/swscale.h> 

    #include <jni.h> 
    #include <string.h> 
    #include <stdio.h> 
    #include <android/log.h> 

    //#include <FTGL/ftgl.h> 

    //ffmpeg video variables 
    int  initializedVideo=0; 
    int  initializedFrame=0; 
    AVFormatContext *pFormatCtx=NULL; 
    int    videoStream; 
    AVCodecContext *pCodecCtx=NULL; 
    AVCodec   *pCodec=NULL; 
    AVFrame   *pFrame=NULL; 
    AVPacket  packet; 
    int    frameFinished; 
    float   aspect_ratio; 

    //ffmpeg video conversion variables 
    AVFrame   *pFrameConverted=NULL; 
    int    numBytes; 
    uint8_t   *bufferConverted=NULL; 

    //opengl 
    int textureFormat=PIX_FMT_RGBA; // PIX_FMT_RGBA PIX_FMT_RGB24 
    int GL_colorFormat=GL_RGBA; // Must match the colorspace specified for textureFormat 
    int textureWidth=256; 
    int textureHeight=256; 
    int nTextureHeight=-256; 
    int textureL=0, textureR=0, textureW=0; 
    int frameTonality; 

    //GLuint textureConverted=0; 
    GLuint texturesConverted[2] = { 0,1 }; 
    GLuint dummyTex = 2; 
    static int len=0; 


    static const char* BWVertexSrc = 
      "attribute vec4 InVertex;\n" 
      "attribute vec2 InTexCoord0;\n" 
      "attribute vec2 InTexCoord1;\n" 
      "uniform mat4 ProjectionModelviewMatrix;\n" 
      "varying vec2 TexCoord0;\n" 
      "varying vec2 TexCoord1;\n" 

      "void main()\n" 
      "{\n" 
      " gl_Position = ProjectionModelviewMatrix * InVertex;\n" 
      " TexCoord0 = InTexCoord0;\n" 
      " TexCoord1 = InTexCoord1;\n" 
      "}\n"; 
    static const char* BWFragmentSrc = 

      "#version 110\n" 
      "uniform sampler2D Texture0;\n" 
      "uniform sampler2D Texture1;\n" 

      "varying vec2 TexCoord0;\n" 
      "varying vec2 TexCoord1;\n" 

      "void main()\n" 
      "{\n" 
      " vec3 color = texture2D(m_Texture, texCoord).rgb;\n" 
      " float gray = (color.r + color.g + color.b)/3.0;\n" 
      " vec3 grayscale = vec3(gray);\n" 

      " gl_FragColor = vec4(grayscale, 1.0);\n" 
      "}"; 
    static GLuint shaderProgram; 


    //// Create a pixmap font from a TrueType file. 
    //FTGLPixmapFont font("/home/user/Arial.ttf"); 
    //// Set the font size and render a small text. 
    //font.FaceSize(72); 
    //font.Render("Hello World!"); 

    //screen dimensions 
    int screenWidth = 50; 
    int screenHeight= 50; 
    int screenL=0, screenR=0, screenW=0; 
    int dPaddingX=0,dPaddingY=0; 
    int drawWidth=50,drawHeight=50; 

    //wallpaper 
    int wallWidth = 50; 
    int wallHeight = 50; 
    int xOffSet, yOffSet; 
    int xStep, yStep; 
    jboolean spanVideo = JNI_TRUE; 

    //video dimensions 
    int wallVideoWidth = 0; 
    int wallVideoHeight = 0; 
    int marginX, marginY; 
    jboolean isScreenPortrait = JNI_TRUE; 
    jboolean isPreview = JNI_TRUE; 
    jboolean loopVideo = JNI_TRUE; 
    jboolean isGetFrame = JNI_TRUE; 

    //file 
    const char * szFileName; 

    #define max(a, b) (((a) > (b)) ? (a) : (b)) 
    #define min(a, b) (((a) < (b)) ? (a) : (b)) 

    //test variables 
    #define RGBA8(r, g, b) (((r) << (24)) | ((g) << (16)) | ((b) << (8)) | 255) 
    int sPixelsInited=JNI_FALSE; 
    uint32_t *s_pixels=NULL; 

    int s_pixels_size() { 
     return (sizeof(uint32_t) * textureWidth * textureHeight * 5); 
    } 

    void render_pixels1(uint32_t *pixels, uint32_t c) { 
     int x, y; 
     /* fill in a square of 5 x 5 at s_x, s_y */ 
     for (y = 0; y < textureHeight; y++) { 
      for (x = 0; x < textureWidth; x++) { 
       int idx = x + y * textureWidth; 
       pixels[idx++] = RGBA8(255, 255, 0); 
      } 
     } 
    } 

    void render_pixels2(uint32_t *pixels, uint32_t c) { 
     int x, y; 
     /* fill in a square of 5 x 5 at s_x, s_y */ 
     for (y = 0; y < textureHeight; y++) { 
      for (x = 0; x < textureWidth; x++) { 
       int idx = x + y * textureWidth; 
       pixels[idx++] = RGBA8(0, 0, 255); 
      } 
     } 
    } 

    void Java_com_nightscapecreations_anim3free_NativeCalls_initVideo (JNIEnv * env, jobject this) { 
     initializedVideo = 0; 
     initializedFrame = 0; 
    } 

    /* list of things that get loaded: */ 
    /* buffer */ 
    /* pFrameConverted */ 
    /* pFrame */ 
    /* pCodecCtx */ 
    /* pFormatCtx */ 
    void Java_com_nightscapecreations_anim3free_NativeCalls_loadVideo (JNIEnv * env, jobject this, jstring fileName) { 
     jboolean isCopy; 
     szFileName = (*env)->GetStringUTFChars(env, fileName, &isCopy); 
     //debug 
     __android_log_print(ANDROID_LOG_DEBUG, "NDK: ", "NDK:LC: [%s]", szFileName); 
     // Register all formats and codecs 
     av_register_all(); 
     // Open video file 
     if(av_open_input_file(&pFormatCtx, szFileName, NULL, 0, NULL)!=0) { 
     __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't open file"); 
     return; 
     } 
     __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Successfully loaded file"); 
     // Retrieve stream information */ 
     if(av_find_stream_info(pFormatCtx)<0) { 
     __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't find stream information"); 
     return; 
     } 
     __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found stream info"); 
     // Find the first video stream 
     videoStream=-1; 
     int i; 
     for(i=0; i<pFormatCtx->nb_streams; i++) 
      if(pFormatCtx->streams[i]->codec->codec_type==CODEC_TYPE_VIDEO) { 
       videoStream=i; 
       break; 
      } 
     if(videoStream==-1) { 
      __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Didn't find a video stream"); 
      return; 
     } 
     __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found video stream"); 
     // Get a pointer to the codec context for the video stream 
     pCodecCtx=pFormatCtx->streams[videoStream]->codec; 
     // Find the decoder for the video stream 
     pCodec=avcodec_find_decoder(pCodecCtx->codec_id); 
     if(pCodec==NULL) { 
      __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Unsupported codec"); 
      return; 
     } 
     // Open codec 
     if(avcodec_open(pCodecCtx, pCodec)<0) { 
      __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Could not open codec"); 
      return; 
     } 
     // Allocate video frame (decoded pre-conversion frame) 
     pFrame=avcodec_alloc_frame(); 
     // keep track of initialization 
     initializedVideo = 1; 
     __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Finished loading video"); 
    } 

    //for this to work, you need to set the scaled video dimensions first 
    void Java_com_nightscapecreations_anim3free_NativeCalls_prepareStorageFrame (JNIEnv * env, jobject this) { 
     // Allocate an AVFrame structure 
     pFrameConverted=avcodec_alloc_frame(); 
     // Determine required buffer size and allocate buffer 
     numBytes=avpicture_get_size(textureFormat, textureWidth, textureHeight); 
     bufferConverted=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t)); 
     if (pFrameConverted == NULL || bufferConverted == NULL) 
      __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Out of memory"); 
     // Assign appropriate parts of buffer to image planes in pFrameRGB 
     // Note that pFrameRGB is an AVFrame, but AVFrame is a superset 
     // of AVPicture 
     avpicture_fill((AVPicture *)pFrameConverted, bufferConverted, textureFormat, textureWidth, textureHeight); 
     __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Created frame"); 
     __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "texture dimensions: %dx%d", textureWidth, textureHeight); 
     initializedFrame = 1; 
    } 

    jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoWidth (JNIEnv * env, jobject this) { 
     return pCodecCtx->width; 
    } 

    jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoHeight (JNIEnv * env, jobject this) { 
     return pCodecCtx->height; 
    } 

    void Java_com_nightscapecreations_anim3free_NativeCalls_getFrame (JNIEnv * env, jobject this) { 
     // keep reading packets until we hit the end or find a video packet 
     while(av_read_frame(pFormatCtx, &packet)>=0) { 
      static struct SwsContext *img_convert_ctx; 
      // Is this a packet from the video stream? 
      if(packet.stream_index==videoStream) { 
       // Decode video frame 
       /* __android_log_print(ANDROID_LOG_DEBUG, */ 
       /*   "video.c", */ 
       /*   "getFrame: Try to decode frame" */ 
       /*   ); */ 
       avcodec_decode_video(pCodecCtx, pFrame, &frameFinished, packet.data, packet.size); 
       // Did we get a video frame? 
       if(frameFinished) { 
        if(img_convert_ctx == NULL) { 
         /* get/set the scaling context */ 
         int w = pCodecCtx->width; 
         int h = pCodecCtx->height; 
         img_convert_ctx = sws_getContext(w, h, pCodecCtx->pix_fmt, textureWidth,textureHeight, textureFormat, SWS_FAST_BILINEAR, NULL, NULL, NULL); 
         if(img_convert_ctx == NULL) { 
          return; 
         } 
        } 
        /* if img convert null */ 
        /* finally scale the image */ 
        /* __android_log_print(ANDROID_LOG_DEBUG, */ 
        /*   "video.c", */ 
        /*   "getFrame: Try to scale the image" */ 
        /*   ); */ 

        //pFrameConverted = pFrame; 
        sws_scale(img_convert_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameConverted->data, pFrameConverted->linesize); 
        //av_picture_crop(pFrameConverted->data, pFrame->data, 1, pCodecCtx->height, pCodecCtx->width); 
        //av_picture_crop(); 
        //avfilter_vf_crop(); 

        /* do something with pFrameConverted */ 
        /* ... see drawFrame() */ 
        /* We found a video frame, did something with it, now free up 
         packet and return */ 
        av_free_packet(&packet); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.age: %d", pFrame->age); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.buffer_hints: %d", pFrame->buffer_hints); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.display_picture_number: %d", pFrame->display_picture_number); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.hwaccel_picture_private: %d", pFrame->hwaccel_picture_private); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.key_frame: %d", pFrame->key_frame); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.palette_has_changed: %d", pFrame->palette_has_changed); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.pict_type: %d", pFrame->pict_type); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.qscale_type: %d", pFrame->qscale_type); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.age: %d", pFrameConverted->age); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.buffer_hints: %d", pFrameConverted->buffer_hints); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.display_picture_number: %d", pFrameConverted->display_picture_number); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.hwaccel_picture_private: %d", pFrameConverted->hwaccel_picture_private); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.key_frame: %d", pFrameConverted->key_frame); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.palette_has_changed: %d", pFrameConverted->palette_has_changed); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.pict_type: %d", pFrameConverted->pict_type); 
    //    __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.qscale_type: %d", pFrameConverted->qscale_type); 
        return; 
       } /* if frame finished */ 
      } /* if packet video stream */ 
      // Free the packet that was allocated by av_read_frame 
      av_free_packet(&packet); 
     } /* while */ 
     //reload video when you get to the end 
     av_seek_frame(pFormatCtx,videoStream,0,AVSEEK_FLAG_ANY); 
    } 

    void Java_com_nightscapecreations_anim3free_NativeCalls_setLoopVideo (JNIEnv * env, jobject this, jboolean b) { 
     loopVideo = b; 
    } 

    void Java_com_nightscapecreations_anim3free_NativeCalls_closeVideo (JNIEnv * env, jobject this) { 
     if (initializedFrame == 1) { 
      // Free the converted image 
      av_free(bufferConverted); 
      av_free(pFrameConverted); 
      initializedFrame = 0; 
      __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed converted image"); 
     } 
     if (initializedVideo == 1) { 
      /* // Free the YUV frame */ 
      av_free(pFrame); 
      /* // Close the codec */ 
      avcodec_close(pCodecCtx); 
      // Close the video file 
      av_close_input_file(pFormatCtx); 
      initializedVideo = 0; 
      __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures"); 
     } 
    } 

    void Java_com_nightscapecreations_anim3free_NativeCalls_freeVideo (JNIEnv * env, jobject this) { 
     if (initializedVideo == 1) { 
      /* // Free the YUV frame */ 
      av_free(pFrame); 
      /* // Close the codec */ 
      avcodec_close(pCodecCtx); 
      // Close the video file 
      av_close_input_file(pFormatCtx); 
      __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures"); 
      initializedVideo = 0; 
     } 
    } 

    void Java_com_nightscapecreations_anim3free_NativeCalls_freeConversionStorage (JNIEnv * env, jobject this) { 
     if (initializedFrame == 1) { 
      // Free the converted image 
      av_free(bufferConverted); 
      av_freep(pFrameConverted); 
      initializedFrame = 0; 
     } 
    } 

    /*--- END OF VIDEO ----*/ 

    /* disable these capabilities. */ 
    static GLuint s_disable_options[] = { 
     GL_FOG, 
     GL_LIGHTING, 
     GL_CULL_FACE, 
     GL_ALPHA_TEST, 
     GL_BLEND, 
     GL_COLOR_LOGIC_OP, 
     GL_DITHER, 
     GL_STENCIL_TEST, 
     GL_DEPTH_TEST, 
     GL_COLOR_MATERIAL, 
     0 
    }; 

    // For stuff that opengl needs to work with, 
    // like the bitmap containing the texture 
    void Java_com_nightscapecreations_anim3free_NativeCalls_initPreOpenGL (JNIEnv * env, jobject this) { 

    } 
    ... 

Did you add it in build.gradle? – Paritosh


@Paritosh Thanks for the reply. I added ndk { moduleName "bambuser-libavcore" } in defaultConfig. It wasn't necessary for my first compile using the old libraries, but I wanted to try it for the new ones. – Nicholas
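
For reference, a minimal sketch of the defaultConfig change described in the comment above (the placement of the ndk block follows the Gradle Android plugin DSL of that era and is an assumption):

    android {
        defaultConfig {
            // hypothetical sketch: names the single module that the
            // (deprecated) automatic NDK integration would try to build
            ndk {
                moduleName "bambuser-libavcore"
            }
        }
    }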

Answer


If you want to reuse your previously compiled libraries and not compile anything with the NDK, you can simply drop all the .so files inside jniLibs/<abi>.

Otherwise, since your ndk build depends on prebuilts, you can't set it up properly to work directly with the gradle configuration (ndk{}). In any case, as NDK support is deprecated for now, the cleanest way to make it work is to have Gradle call ndk-build and use your existing Makefiles:

import org.apache.tools.ant.taskdefs.condition.Os 

... 

android { 
    ... 
    sourceSets.main { 
     jniLibs.srcDir 'src/main/libs' //set .so files location to libs instead of jniLibs 
     jni.srcDirs = [] //disable automatic ndk-build call 
    } 

    // add a task that calls regular ndk-build(.cmd) script from app directory 
    task ndkBuild(type: Exec) { 
     if (Os.isFamily(Os.FAMILY_WINDOWS)) { 
      commandLine 'ndk-build.cmd', '-C', file('src/main').absolutePath 
     } else { 
      commandLine 'ndk-build', '-C', file('src/main').absolutePath 
     } 
    } 

    // add this task as a dependency of Java compilation 
    tasks.withType(JavaCompile) { 
     compileTask -> compileTask.dependsOn ndkBuild 
    } 
} 
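
With this setup, a Gradle build first runs ndk-build against the existing Android.mk under src/main/jni, then packages the .so files that ndk-build leaves under src/main/libs/<abi>, instead of attempting its own automatic NDK compilation.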

Thank you for the answer. The code above errors during gradle sync with "Error:(24, 0) Could not find property 'Os' on task ':app:ndkBuild'." After removing the check and assuming Windows it synced correctly, but the build then fails with "C:/Users/Nicholas/AndroidstudioProjects/FFvideoLiveWallpaper2/app/src/main//jni/video.c:22:21: fatal error: GLES/gl.h: No such file or directory". This is the same c file that worked in Eclipse. I'm adding it to the original question. Is there something special you have to do for Gradle to recognize relative paths? Sorry for the newbie question... – Nicholas


Regarding the first error, I think you simply forgot the import declaration for *Os*. I'm not sure about the GLES/gl.h include error. Try removing '-L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib' from LOCAL_LDLIBS inside Android.mk, and create an *Application.mk* file containing 'APP_PLATFORM := android-9' (or higher than 9, depending on what your minimum target SDK is). – ph0b
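
A minimal Application.mk along those lines might look like this (a sketch; the platform level and ABI list are assumptions to adjust for your minimum SDK and target architectures):

    # src/main/jni/Application.mk (hypothetical)
    APP_PLATFORM := android-9
    APP_ABI := armeabi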


Thank you; that helped, though I don't yet understand how :). I then got "Error: duplicate files during packaging of APK C:\Users\Nicholas\AndroidstudioProjects\FFvideoLiveWallpaper2\app\build\outputs\apk\app-debug-unaligned.apk Path in archive: lib/armeabi/libavcodec.so". My newbie guess was that it was trying to compile the c files into those .so files even though I already had compiled versions of them. So I removed everything from my jniLibs folder and it started working! Thanks so much; you saved me weeks of stressful aggravation. – Nicholas
