In my app the user can take a photo with the camera, and I need to save it in the database. To do that, I convert the image into a byte array. However, I noticed a problem in this process: when the camera resolution is high (4:3, 16 MB) it does not work, but when I use a lower setting such as 16:9 (6 MB) it works. I am using ORMLite to save, and it does not throw any exception when the camera is at the high resolution. Does anyone know what the problem might be?
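For context, a byte array in ORMLite is usually mapped to a BLOB column. A minimal sketch of what the persisted entity might look like, assuming field and class names based on the setter used in the code below (the id field is hypothetical):

import com.j256.ormlite.field.DataType;
import com.j256.ormlite.field.DatabaseField;
import com.j256.ormlite.table.DatabaseTable;

@DatabaseTable
public class CheckListResposta {
    @DatabaseField(generatedId = true)
    private int id;

    // BLOB column holding the compressed JPEG bytes.
    @DatabaseField(dataType = DataType.BYTE_ARRAY)
    private byte[] imageBytes;

    public void setImageBytes(byte[] imageBytes) {
        this.imageBytes = imageBytes;
    }
}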
Bitmap photo = null;
File file = new File(mCurrentPhotoPath);
photo = MediaStore.Images.Media.getBitmap(
        this.getActivity().getContentResolver(), Uri.fromFile(file));

// Compress the full-resolution bitmap to a JPEG byte array.
ByteArrayOutputStream stream = new ByteArrayOutputStream();
photo.compress(Bitmap.CompressFormat.JPEG, 100, stream);
bytesImage = stream.toByteArray();

CheckListPendente CheckListPendente2 = new CheckListPendente();
CheckListPendente2.setId(checkListPendenteId);

CheckListResposta resposta = new CheckListResposta();
if (bytesImage != null) {
    resposta.setImageBytes(bytesImage);
}

// Persist the answer (including the image bytes) with ORMLite.
checkListDao = new CheckListRespostaDao(helper.getConnectionSource());
checkListDao.create(resposta);
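A likely cause is that a quality-100 JPEG of a full-resolution photo produces a byte array too large to hold in memory or to store as a single BLOB. A minimal sketch of decoding the photo at a reduced size before compressing; decodeScaledPhoto and maxDimension are hypothetical names, not part of the original code:

// Hypothetical helper: decode the photo at a reduced size so the
// resulting JPEG byte array stays small. Assumes path points to the
// captured file (e.g. mCurrentPhotoPath).
private byte[] decodeScaledPhoto(String path, int maxDimension) {
    // First pass: read only the image bounds.
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(path, opts);

    // Power-of-two sample size keeping both sides <= maxDimension.
    int sampleSize = 1;
    while (opts.outWidth / sampleSize > maxDimension
            || opts.outHeight / sampleSize > maxDimension) {
        sampleSize *= 2;
    }

    // Second pass: decode the downsampled bitmap.
    opts = new BitmapFactory.Options();
    opts.inSampleSize = sampleSize;
    Bitmap scaled = BitmapFactory.decodeFile(path, opts);

    // Moderate JPEG quality keeps the byte array far smaller than quality 100.
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    scaled.compress(Bitmap.CompressFormat.JPEG, 80, out);
    scaled.recycle();
    return out.toByteArray();
}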
I changed the line to photo.compress(Bitmap.CompressFormat.JPEG, 0, stream); and it worked. Do you know why?
– Zica
The third parameter you passed is the compression quality. There is probably a memory overflow; don't you have any stack trace?
– wryel
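To check whether the size of the compressed image is really the issue, a small diagnostic sketch that logs the byte array length at a few quality levels (assumes photo is the bitmap decoded above; the "PhotoSize" log tag is hypothetical):

// Diagnostic only: log how large the JPEG byte array gets at different
// quality settings, to see if the high-resolution photo blows up the size.
int[] qualities = {100, 80, 50, 0};
for (int q : qualities) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    photo.compress(Bitmap.CompressFormat.JPEG, q, out);
    Log.d("PhotoSize", "quality=" + q + " -> " + out.size() + " bytes");
}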