It would really help if you provided a screenshot of your relevant blocks, so we can see what you are trying to do, and where the problem may be.
To get an image of your blocks, right click in the Blocks Editor and select "Download Blocks as Image". You might want to use an image editor to crop etc. if required. Then post it here in the community.
To be honest I just tried getting the basic code running before integrating it into my app. So I opened a new project with the code.
I didn't change anything except the "file://" part.
I do find the recorded video(s) in /storage/emulated/0/DCIM/100ANDRO/Name of Video.mp4
But I can't play any video after recording.
Hi there!
I am currently working on a project based on an emotion detector, in which I need to send an audio file from MIT App Inventor to my machine learning model via MQTT. My mentor suggested I use base64 for this, because it makes things easier. My doubt is: we are converting the audio file into a string, right? That is the encoding. I then want to send that string to my machine learning model as an audio file. Can you please help me with this? By the way, it's great work, I follow it keenly.
I have not tried to send files by MQTT; it is usually used to send small pieces of information.
Do you need to use MQTT? Can you use Firebase or CloudDB? What is your "machine learning model"? What is your MQTT broker?
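To answer the encoding question: yes, base64 turns the raw audio bytes into a plain text string that can travel as an MQTT payload, and the receiver decodes it back into the original bytes before passing them to the model. Here is a minimal Python sketch of both sides (the function names and file paths are made up for illustration; the actual publishing would be done with an MQTT client library such as paho-mqtt, which is not shown here):

```python
import base64

def encode_audio_to_base64(path):
    """Read the audio file's raw bytes and return a base64 text string
    suitable for use as an MQTT message payload."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

def decode_base64_to_audio(b64_string, out_path):
    """On the receiving side, reverse the encoding to recover the
    original audio bytes before feeding them to the model."""
    with open(out_path, "wb") as f:
        f.write(base64.b64decode(b64_string))
```

The key point is that base64 is lossless: after decoding, the bytes are identical to the original recording, so the model sees a normal audio file.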
Hi, I'm trying to take pictures and save them on Drive. I'm using the Takepicturestobase64 block to do it, and it works, but the resolution and scale of the picture are getting small. Should I try another kind of block to fix this?
If I remember correctly, the block you are using gets a preview image, not a full image from the camera.
It may be better for you to use the TakePicture block from the camera, then set the resulting image to an image component and convert this image to base64.
You should bear in mind that the base64 of an image is generally about 4/3 (roughly 1.33) times larger than the image itself.