How does App Inventor use the phone's sensors?

Hello everyone!
I wanted to ask a rather technical question about how App Inventor works. I've tried to search for information, but I can't find much about how App Inventor uses the phone's sensors, for example the accelerometer (maybe I haven't searched well).
Specifically, I'm interested in the limitations it may have. Could the sensors behave differently depending on the phone the application runs on?
I would really appreciate your guidance, and especially any literature you could point me to.

Thanks!

Yes, here it is:

http://ai2.appinventor.mit.edu/reference/components/sensors.html

Thank you! I have already read this.
Maybe I didn't phrase my question correctly, but I mean: what's underneath that? Does it run as a native app? As a web app? Or in some other way? What limitations does App Inventor's way of operating impose when using sensors? Does it introduce a significant delay? What does that delay depend on?

Honestly, this isn't my area of expertise, which is why I'm asking.

Yes, the companion app and your compiled apps are native apps. You can see the source code for the AccelerometerSensor on GitHub.
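To give a sense of what "native" means here, below is a minimal sketch of the standard Android SensorManager setup that a component like AccelerometerSensor builds on. This is illustrative only, not the actual App Inventor source; the class name `AccelerometerSketch` is made up.

```java
// Minimal sketch (not the actual App Inventor source) of how a native Android
// component typically listens to the accelerometer via the SensorManager API.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class AccelerometerSketch implements SensorEventListener {

    private final SensorManager sensorManager;
    private final Sensor accelerometer;   // may be null if the device has no accelerometer

    public AccelerometerSketch(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    public void startListening() {
        if (accelerometer != null) {
            // SENSOR_DELAY_GAME is one of several sampling rates the OS offers;
            // the rate actually delivered still varies by device hardware.
            sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
        }
    }

    public void stopListening() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // A component would raise its AccelerationChanged-style event here.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Sensor accuracy and availability vary per device, which is part of
        // why behavior can differ between phones.
    }
}
```

Because the sensor hardware, its sampling rates, and even its presence differ between phones, the same app can behave somewhat differently from device to device.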

As implemented, all sensor events are run on the application's UI thread. This means you should try to process the events as quickly as possible; otherwise the app will appear "slow" because the UI thread is tied up in computation. How you optimize your code to run as fast as possible depends on what the app is doing. We also use an intermediate language we call YAIL, which is actually a dialect of the Scheme language. When running in the companion, this code is interpreted by the app. When you compile the app, the code is compiled down to native code, so compiled apps will generally perform better than apps running in development mode.
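To illustrate the UI-thread point, here is a hedged sketch in plain Android Java: the listener does only a light update on the main thread and pushes anything expensive onto a worker thread. The class and helper names (ResponsiveSensorHandler, updateDisplay, expensiveProcessing) are hypothetical and are not part of App Inventor's API.

```java
// Hedged sketch: keep sensor handling on the UI thread short, and move heavy
// work to a worker thread so the interface stays responsive.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.os.Handler;
import android.os.Looper;

public class ResponsiveSensorHandler implements SensorEventListener {

    private final Handler uiHandler = new Handler(Looper.getMainLooper());

    @Override
    public void onSensorChanged(SensorEvent event) {
        final float x = event.values[0];
        final float y = event.values[1];
        final float z = event.values[2];

        // Keep the UI-thread portion small: just update whatever the user sees.
        uiHandler.post(new Runnable() {
            @Override
            public void run() {
                updateDisplay(x, y, z);   // hypothetical lightweight UI update
            }
        });

        // Anything expensive (filtering, logging, networking) belongs on a
        // worker thread; otherwise each sensor event blocks the UI while it runs.
        new Thread(new Runnable() {
            @Override
            public void run() {
                expensiveProcessing(x, y, z);   // hypothetical heavy computation
            }
        }).start();
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void updateDisplay(float x, float y, float z) { /* ... */ }
    private void expensiveProcessing(float x, float y, float z) { /* ... */ }
}
```

The same idea applies in blocks: do as little as possible inside the sensor event handler, and any delay you observe will mostly come from how much work that handler does (plus the interpretation overhead when running in the companion).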
