AMR file converted from WAV differs from the original

I am converting a WAV file to AMR using the AmrInputStream class. The original WAV file is 15 s 387 ms long with a size of 1.3 MB, but the generated AMR file is 133 kB yet 1 m 25 s long, and playback is slow and distorted. What am I doing wrong? Here is the code of the AmrInputStream class:

package android.media;

import java.io.InputStream;
import java.io.IOException;


/**
 * AmrInputStream
 * @hide
 */
public final class AmrInputStream extends InputStream
{
    static {
        System.loadLibrary("media_jni");
    }

    private final static String TAG = "AmrInputStream";

    // frame is 20 msec at 8.000 khz
    private final static int SAMPLES_PER_FRAME = 8000 * 20 / 1000;

    // pcm input stream
    private InputStream mInputStream;

    // native handle
    private long mGae;

    // result amr stream
    private final byte[] mBuf = new byte[SAMPLES_PER_FRAME * 2];
    private int mBufIn = 0;
    private int mBufOut = 0;

    // helper for bytewise read()
    private byte[] mOneByte = new byte[1];

    /**
     * Create a new AmrInputStream, which converts 16 bit PCM to AMR
     * @param inputStream InputStream containing 16 bit PCM.
     */
    public AmrInputStream(InputStream inputStream) {
        mInputStream = inputStream;
        mGae = GsmAmrEncoderNew();
        GsmAmrEncoderInitialize(mGae);
    }

    @Override
    public int read() throws IOException {
        int rtn = read(mOneByte, 0, 1);
        return rtn == 1 ? (0xff & mOneByte[0]) : -1;
    }

    @Override
    public int read(byte[] b) throws IOException {
        return read(b, 0, b.length);
    }

    @Override
    public int read(byte[] b, int offset, int length) throws IOException {
        if (mGae == 0) throw new IllegalStateException("not open");

        // local buffer of amr encoded audio empty
        if (mBufOut >= mBufIn) {
            // reset the buffer
            mBufOut = 0;
            mBufIn = 0;

            // fetch a 20 msec frame of pcm
            for (int i = 0; i < SAMPLES_PER_FRAME * 2; ) {
                int n = mInputStream.read(mBuf, i, SAMPLES_PER_FRAME * 2 - i);
                if (n == -1) return -1;
                i += n;
            }

            // encode it
            mBufIn = GsmAmrEncoderEncode(mGae, mBuf, 0, mBuf, 0);
        }

        // return encoded audio to user
        if (length > mBufIn - mBufOut) length = mBufIn - mBufOut;
        System.arraycopy(mBuf, mBufOut, b, offset, length);
        mBufOut += length;

        return length;
    }

    @Override
    public void close() throws IOException {
        try {
            if (mInputStream != null) mInputStream.close();
        } finally {
            mInputStream = null;
            try {
                if (mGae != 0) GsmAmrEncoderCleanup(mGae);
            } finally {
                try {
                    if (mGae != 0) GsmAmrEncoderDelete(mGae);
                } finally {
                    mGae = 0;
                }
            }
        }
    }

    @Override
    protected void finalize() throws Throwable {
        if (mGae != 0) {
            close();
            throw new IllegalStateException("someone forgot to close AmrInputStream");
        }
    }

    //
    // AudioRecord JNI interface
    //
    private static native long GsmAmrEncoderNew();
    private static native void GsmAmrEncoderInitialize(long gae);
    private static native int GsmAmrEncoderEncode(long gae,
                                                  byte[] pcm, int pcmOffset, byte[] amr, int amrOffset) throws IOException;
    private static native void GsmAmrEncoderCleanup(long gae);
    private static native void GsmAmrEncoderDelete(long gae);

}

Here is the code of the Recorder class that I use to record the microphone audio, generate the WAV file, and convert it to AMR:

package iupi.com.br.calls;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import android.util.Log;

import android.media.AmrInputStream;

/**
 * Created by Bernardo on 29/06/2015.
 */
public class Recorder {

    //Begin private fields for this class
    private AudioRecord recorder;

    private static final int RECORDER_BPP = 16;
    private static final String AUDIO_RECORDER_FILE_EXT_WAV = "Audio";
    private static final String AUDIO_RECORDER_FILE_EXT = ".wav";
    private static final String AUDIO_ENCODED_FILE_EXT = ".amr";
    private static final String AUDIO_RECORDER_FOLDER = "files";
    private static final String AUDIO_RECORDER_TEMP_FILE = "record_temp.3gp";
    private static final int RECORDER_SAMPLERATE = 44100;
    private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
    private static final int RECORDER_CHANNELS_INT = 1;

    private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;

    private int bufferSize = 200000;
    short[] buffer;
    private Thread recordingThread = null;
    private boolean isRecording = false;
    private String path;

    //Constructor
    public Recorder(String path)
    {
           int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE,
                RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
        this.path = path;

        System.out.println("BUFFER SIZE VALUE IS " + bufferSize);

        int buffercount = 4088 / bufferSize;
        if (buffercount < 1)
            buffercount = 1;
        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                RECORDER_SAMPLERATE, RECORDER_CHANNELS,
                RECORDER_AUDIO_ENCODING, 44100);

        //recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        //recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        //recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);


    }

    public void start() throws IllegalStateException, IOException
    {

        buffer = new short[4088];

        recorder.startRecording();

        isRecording = true;

        recordingThread = new Thread(new Runnable()
        {
            @Override
            public void run() {
                writeAudioDataToFile();
            }
        }, "AudioRecorder Thread");

        recordingThread.start();
    }

    public void stop()
    {
        Log.i("SmartMontor","Parando a gravacao do audio");
        stopRecording();
    }

    public boolean isRecording()
    {
        if(recorder.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING)
            return true;
        else
            return false;
    }


    private void stopRecording() {
        // stops the recording activity

        if (null != recorder) {
            isRecording = false;

            recorder.stop();


            recorder.release();

            recorder = null;
            recordingThread = null;}
        // copy the recorded file to original copy & delete the recorded copy
        copyWaveFile(getTempFilename(), getFilename()+AUDIO_RECORDER_FILE_EXT);
        deleteTempFile();
        convertWaveToAmr(getFilename()+AUDIO_RECORDER_FILE_EXT);
    } // stores the file into the SDCARD
    private String getFilename() {
        System.out.println("---3---");
        //String filepath = Environment.getExternalStorageDirectory().getPath();
        File file = new File(path, AUDIO_RECORDER_FOLDER);

        if (!file.exists()) {
            file.mkdirs();
        }

        return (file.getAbsolutePath() + "/" + AUDIO_RECORDER_FILE_EXT_WAV);
    }

    private void deleteTempFile() {
        File file = new File(getTempFilename());
        Log.i("SmartMontor","Deletando o arquivo temporario");

        file.delete();
    }


    private void copyWaveFile(String inFilename, String outFilename) {
        System.out.println("---8---");
        Log.i("SmartMontor", "Copiando o arquivo Wav na pasta");
        FileInputStream in = null;
        FileOutputStream out = null;
        long totalAudioLen = 0;
        long totalDataLen = totalAudioLen + 36;
        long longSampleRate = RECORDER_SAMPLERATE;
        int channels = RECORDER_CHANNELS_INT;
        long byteRate = RECORDER_BPP * RECORDER_SAMPLERATE * channels / 8;

        byte[] data = new byte[bufferSize];

        try {
            in = new FileInputStream(inFilename);
            out = new FileOutputStream(outFilename);
            totalAudioLen = in.getChannel().size();
            totalDataLen = totalAudioLen + 36;

            //Controller.doDoc("File size: " + totalDataLen, 4);

            WriteWaveFileHeader(out, totalAudioLen, totalDataLen,
                    longSampleRate, channels, byteRate);
            byte[] bytes2 = new byte[buffer.length * 2];
            ByteBuffer.wrap(bytes2).order(ByteOrder.LITTLE_ENDIAN)
                    .asShortBuffer().put(buffer);
            while (in.read(bytes2) != -1) {
                out.write(bytes2);
            }

            in.close();
            out.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // stores the file into the SDCARD
    private String getTempFilename() {
        // Creates the temp file to store buffer
        System.out.println("---4-1--");
        //String filepath = Environment.getExternalStorageDirectory().getPath();
        System.out.println("---4-2--");
        File file = new File(path, AUDIO_RECORDER_FOLDER);
        System.out.println("---4-3--");

        if (!file.exists()) {
            file.mkdirs();
        }

        File tempFile = new File(path, AUDIO_RECORDER_TEMP_FILE);
        System.out.println("---4-4--");

        if (tempFile.exists())
            tempFile.delete();
        System.out.println("---4-5--");
        return (file.getAbsolutePath() + "/" + AUDIO_RECORDER_TEMP_FILE);
    }
    private void writeAudioDataToFile() {

        // Write the output audio in byte
        byte data[] = new byte[bufferSize];

        String filename = getTempFilename();
        //
        FileOutputStream os = null;
        //
        try {
            //
            os = new FileOutputStream(filename);
            //
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        int read = 0;


        // if (null != os) {
        while (isRecording) {
            // gets the voice output from microphone to byte format
            recorder.read(buffer, 0, buffer.length);
            // read = recorder.read(data, 0, 6144);

            if (AudioRecord.ERROR_INVALID_OPERATION != read) {
                try {
                    // // writes the data to file from buffer
                    // // stores the voice buffer

                    // short[] shorts = new short[bytes.length/2];
                    // to turn bytes to shorts as either big endian or little
                    // endian.
                    // ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts);

                    // to turn shorts back to bytes.
                    byte[] bytes2 = new byte[buffer.length * 2];
                    ByteBuffer.wrap(bytes2).order(ByteOrder.LITTLE_ENDIAN)
                            .asShortBuffer().put(buffer);

                    os.write(bytes2);
                    //  ServerInteractor.SendAudio(buffer);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }

        try {
            os.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void WriteWaveFileHeader(FileOutputStream out, long totalAudioLen,
                                     long totalDataLen, long longSampleRate, int channels, long byteRate)
            throws IOException {
        System.out.println("---9---");
        byte[] header = new byte[4088];

        header[0] = 'R'; // RIFF/WAVE header
        header[1] = 'I';
        header[2] = 'F';
        header[3] = 'F';
        header[4] = (byte) (totalDataLen & 0xff);
        header[5] = (byte) ((totalDataLen >> 8) & 0xff);
        header[6] = (byte) ((totalDataLen >> 16) & 0xff);
        header[7] = (byte) ((totalDataLen >> 24) & 0xff);
        header[8] = 'W';
        header[9] = 'A';
        header[10] = 'V';
        header[11] = 'E';
        header[12] = 'f'; // 'fmt ' chunk
        header[13] = 'm';
        header[14] = 't';
        header[15] = ' ';
        header[16] = 16; // 4 bytes: size of 'fmt ' chunk
        header[17] = 0;
        header[18] = 0;
        header[19] = 0;
        header[20] = 1; // format = 1
        header[21] = 0;
        header[22] = (byte) RECORDER_CHANNELS_INT;
        header[23] = 0;
        header[24] = (byte) (longSampleRate & 0xff);
        header[25] = (byte) ((longSampleRate >> 8) & 0xff);
        header[26] = (byte) ((longSampleRate >> 16) & 0xff);
        header[27] = (byte) ((longSampleRate >> 24) & 0xff);
        header[28] = (byte) (byteRate & 0xff);
        header[29] = (byte) ((byteRate >> 8) & 0xff);
        header[30] = (byte) ((byteRate >> 16) & 0xff);
        header[31] = (byte) ((byteRate >> 24) & 0xff);
        header[32] = (byte) (RECORDER_CHANNELS_INT * RECORDER_BPP / 8); // block align
        header[33] = 0;
        header[34] = RECORDER_BPP; // bits per sample
        header[35] = 0;
        header[36] = 'd';
        header[37] = 'a';
        header[38] = 't';
        header[39] = 'a';
        header[40] = (byte) (totalAudioLen & 0xff);
        header[41] = (byte) ((totalAudioLen >> 8) & 0xff);
        header[42] = (byte) ((totalAudioLen >> 16) & 0xff);
        header[43] = (byte) ((totalAudioLen >> 24) & 0xff);

        out.write(header, 0, 4088);
    }




    public void convertWaveToAmr(String wavFilename)
    {
        Log.i("SmartMonitor","Convertendo o arquivo para AMR");
        AmrInputStream aStream = null ;
        InputStream inStream = null;
        OutputStream out = null;

        try {
            inStream = new FileInputStream(wavFilename);
            aStream= new AmrInputStream(inStream);
            File file = new File(getFilename()+AUDIO_ENCODED_FILE_EXT);
            file.createNewFile();
            out= new FileOutputStream(file);

            // #!AMR\n
            out.write(0x23);
            out.write(0x21);
            out.write(0x41);
            out.write(0x4D);
            out.write(0x52);
            out.write(0x0A);

            byte[] x = new byte[1024];
            int len;
            while ((len=aStream.read(x)) > 0) {
                out.write(x,0,len);
            }
        } catch (FileNotFoundException e) {
            Log.e("SmartMonitor","Erro FileNotFoundException "+e.getMessage());
            e.printStackTrace();
        } catch (IOException e) {
            Log.e("SmartMonitor","Erro IOException "+e.getMessage());
            e.printStackTrace();
        }
        finally
        {
            try {
                out.close();
                aStream.close();
                inStream.close();
            } catch (IOException e) {
                Log.e("SmartMonitor","Erro IOException 2 "+e.getMessage());
                e.printStackTrace();
            }
        }
    }
}

Here is a sample of the original audio (Audio Wav) and of the audio after conversion (Audio AMR).

1 answer

Your problem is the sample rate: you are recording the audio at 44100 Hz, but the AMR encoder treats its PCM input as 8000 Hz.

You need to decide which rate to use and stick to it. If you are working only with voice, recording at 44100 Hz can be considered a waste of resources: the human voice reaches only about 3400 Hz, so a sample rate of 8000 Hz is more than sufficient to cover that spectrum.

We can show that the two files are at different sample rates using only the information you provided:

Multiplying 8000 Hz by 1 m 25 s (85 s):

    8000 * 85 = 680000

Multiplying 44100 Hz by 15.4 s (the original is 15 s 387 ms):

    44100 * 15.4 = 679140

Both give almost the same number of samples! Equivalently, the duration ratio matches the sample-rate ratio: 44100 / 8000 = 5.5125, and 15.387 s * 5.5125 ≈ 84.8 s, which is roughly the 1 m 25 s you are seeing.

My advice is to change your WAV recording to 8000 Hz by changing the line:

private static final int RECORDER_SAMPLERATE = 8000;

And also the line:

recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            RECORDER_SAMPLERATE, RECORDER_CHANNELS,
            RECORDER_AUDIO_ENCODING, 8000);
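
Note that the last argument of the AudioRecord constructor is the buffer size in bytes, not a sample rate, so the 8000 there only acts as a buffer size. A minimal sketch of setting up the recorder at 8000 Hz, mono, 16-bit PCM, sizing the buffer from AudioRecord.getMinBufferSize (the variable names here are illustrative, not taken from your code):

    int sampleRate = 8000; // AMR-NB expects 8000 Hz, 16-bit, mono PCM
    int minBuf = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    // use a buffer a few times the minimum to avoid overruns while writing to disk
    AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);

With the recorder, the WAV header (which uses RECORDER_SAMPLERATE) and the AMR encoder all at 8000 Hz, the durations should match.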
  • Thank you for the reply, but after the suggested change the two files are almost the same length; however, playback is now very fast. The result can be heard here (https://soundcloud.com/la-rcio-bernardo/sets/teste-amrinputstream).

  • In the Recorder class the WAV only plays back normally at 44100 Hz; the AMR still does not. I also tried changing private final static int SAMPLES_PER_FRAME = 8000 * 20 / 1000; in the AmrInputStream class to private final static int SAMPLES_PER_FRAME = 44100 * 20 / 1000;, but that did not work either. Excuse my ignorance, but I do not understand audio conversion; I am implementing this because my software will work as an automatic backup of calls.

  • Have you tried setting SAMPLES_PER_FRAME = 44100? Your problem is still the sample rate ... (if you need to keep the WAV at 44100 Hz, see the downsampling sketch below).
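
If you want to keep recording the WAV at 44100 Hz so it plays back normally, another way to reconcile the rates is to downsample the PCM to 8000 Hz before handing it to AmrInputStream. A rough sketch follows, not the code from this question: it assumes a standard 44-byte WAV header (the Recorder above actually writes 4088 header bytes, so adjust the offset to match whatever header you write), 16-bit little-endian mono PCM, and crude nearest-sample decimation just to illustrate the idea. It needs the java.io classes plus android.media.AmrInputStream; the method and helper names are illustrative.

    // Downsamples a 44100 Hz, 16-bit, mono, little-endian WAV to 8000 Hz PCM
    // and encodes it to AMR.
    public static void convertWav44kToAmr(String wavPath, String amrPath) throws IOException {
        final int inRate = 44100;
        final int outRate = 8000;
        final int headerBytes = 44;               // assumed standard WAV header size

        byte[] wav = readFileFully(wavPath);
        int sampleCount = (wav.length - headerBytes) / 2;

        // nearest-sample decimation: pick roughly every 5.5th input sample
        int outSamples = (int) ((long) sampleCount * outRate / inRate);
        byte[] pcm8k = new byte[outSamples * 2];
        for (int i = 0; i < outSamples; i++) {
            int src = headerBytes + 2 * (int) ((long) i * inRate / outRate);
            pcm8k[2 * i]     = wav[src];          // low byte (little-endian)
            pcm8k[2 * i + 1] = wav[src + 1];      // high byte
        }

        AmrInputStream amrIn = new AmrInputStream(new ByteArrayInputStream(pcm8k));
        OutputStream out = new FileOutputStream(amrPath);
        try {
            out.write("#!AMR\n".getBytes("US-ASCII")); // AMR file magic number
            byte[] buf = new byte[1024];
            int n;
            while ((n = amrIn.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
        } finally {
            out.close();
            amrIn.close();
        }
    }

    // Reads a whole file into memory; fine for short recordings like these.
    private static byte[] readFileFully(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                bos.write(buf, 0, n);
            }
            return bos.toByteArray();
        } finally {
            in.close();
        }
    }

A proper resampler would low-pass filter before decimating; plain decimation can alias, but for speech it is usually intelligible and keeps the sketch short.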
