Size of captured image
I'm not sure if this is a bug, but it looks a little strange.
I use the camera to take a photo and then upload the resulting ByteArray to the server. The server has a file upload limit of 10 MB.
I have several Android phones and an iPhone SE (2016). It works correctly on all the Android devices, but when I try to upload a photo from the iPhone, the server reports that the photo is 15 MB.
I converted the array to a bitmap; the resolution is about the same on both platforms, roughly 3000×4000.
But the byte array itself is 5-8 times larger on iOS, for example 3,806,630 bytes on Android versus 21,843,377 bytes on iOS.
If the photo resolution is the same and no compression is involved, I don't quite understand where such a huge difference comes from.
I'm also not sure whether there is a multiplatform way to compress the image without saving it to disk.
As far as I understand, the problem is that an uncompressed byte array is returned on iOS, while on Android the byte array already contains JPEG-encoded data.
I fixed my problem with the following:
import kotlinx.cinterop.ExperimentalForeignApi
import kotlinx.cinterop.allocArrayOf
import kotlinx.cinterop.memScoped
import kotlinx.cinterop.refTo
import platform.Foundation.NSData
import platform.Foundation.create
import platform.UIKit.UIImage
import platform.UIKit.UIImageJPEGRepresentation
import platform.posix.memcpy

@OptIn(ExperimentalForeignApi::class)
actual fun saveImage(byteArrays: ByteArray?): ByteArray? {
    // Wrap the raw bytes in NSData, decode them into a UIImage,
    // and re-encode as JPEG (quality 1.0 keeps maximum quality;
    // a lower value would shrink the output further).
    val data = byteArrays?.toData()
    val image = data?.let { UIImage.imageWithData(it) }
    val jpegData = image?.let { UIImageJPEGRepresentation(it, 1.0) }
    // Copy the NSData contents back into a Kotlin ByteArray.
    val jpegByteArray = jpegData?.length?.let {
        ByteArray(it.toInt()).apply {
            memcpy(this.refTo(0), jpegData.bytes, jpegData.length)
        }
    }
    return jpegByteArray
}

@OptIn(ExperimentalForeignApi::class)
fun ByteArray.toData(): NSData = memScoped {
    // Copy the ByteArray into a native buffer and wrap it in NSData.
    NSData.create(
        bytes = allocArrayOf(this@toData),
        length = this@toData.size.toULong()
    )
}
However, the difference in behavior is a little confusing.
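
For completeness, here is a minimal sketch of what the matching common declaration and an in-memory Android counterpart could look like. This assumes a standard KMP source-set layout; the Android side is untested and the quality value 90 is just an example, not something from my project:

import android.graphics.Bitmap
import android.graphics.BitmapFactory
import java.io.ByteArrayOutputStream

// commonMain:
// expect fun saveImage(byteArrays: ByteArray?): ByteArray?

// androidMain: decode and re-encode entirely in memory, no file on disk involved.
actual fun saveImage(byteArrays: ByteArray?): ByteArray? {
    if (byteArrays == null) return null
    val bitmap: Bitmap = BitmapFactory.decodeByteArray(byteArrays, 0, byteArrays.size)
        ?: return null
    val output = ByteArrayOutputStream()
    // Lowering the quality here is one way to stay under a server-side upload limit.
    bitmap.compress(Bitmap.CompressFormat.JPEG, 90, output)
    return output.toByteArray()
}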