How do I convert camera data to Image?
Hello,
I have an M30.
I am running this code:
private val streamSourcesListener = StreamSourceListener {
    it?.let {
        upload_toWebSocket(it.data.toByteString())
    }
}
Please don't just point me to your sample code; I'd appreciate a direct answer to this question.
The event fires and I get a byte stream about 23 kB long. The camera is sending data at 30 FPS. All of this is fine.
Here is sample data. It is simply it.data converted to a ByteArray and saved as a .dat file.
I want to convert this to a PNG image.
How can I convert the output of streamSourcesListener to a PNG image?
This has to happen at a rate of at least 30 FPS.
Thank you
-
I assume you are referring to YuvDataListener or CameraFrameListener. If you run into any errors, please point them out to us. The push frequency of these listeners cannot be adjusted; they receive video frames from the camera at the corresponding YUV data rate.

Although you asked us not to refer to the sample code, it does show how to convert YUV data to a PNG image and can be used directly as a reference. If you are using CameraFrameListener to retrieve the data, you can set the data format to RGBA_8888 when registering the listener, then use bitmap.copyPixelsFromBuffer to copy the frame into a Bitmap, and finally call bitmap.compress to output it in the desired image format.
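A minimal sketch of that last step, assuming the RGBA_8888 frame arrives as a ByteArray together with its width and height from the frame listener callback (the listener setup itself is omitted); rgbaFrameToPng is an illustrative helper name, not an SDK API:

import android.graphics.Bitmap
import java.io.ByteArrayOutputStream
import java.nio.ByteBuffer

// Converts one RGBA_8888 frame (width * height * 4 bytes) into PNG-encoded bytes.
// Frame data, width and height are assumed to come from the frame listener callback.
fun rgbaFrameToPng(rgba: ByteArray, width: Int, height: Int): ByteArray {
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(rgba))    // raw pixels -> Bitmap
    val out = ByteArrayOutputStream()
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, out)  // lossless PNG encode
    bitmap.recycle()
    return out.toByteArray()
}

Note that PNG-encoding a full-resolution frame 30 times per second is demanding; if it turns out to be too slow, the same bitmap.compress call also accepts other formats such as Bitmap.CompressFormat.JPEG.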