Example: MIT App Inventor using AI Teachable Machine with Google (for projects involving physically disabled people)

Hello, I want to ask about an idea for a student project using MIT App Inventor with Google's AI Teachable Machine: a simple program for helping disabled people, for example.

Welcome Amal.

Perhaps you are looking for something like this, Amal,

or like this

https://www.researchgate.net/publication/321350863_Teachable_machines_for_accessibility



Hello, I want to ask about an idea for a student project using MIT App Inventor with Google's AI Teachable Machine: a simple program for helping disabled people. I also need a program built with MIT App Inventor that students can learn and program easily, please.

@amal_abu_nassar1 please see the answers above. What have you tried?

Here is an extension for using Teachable Machine with App Inventor: TMIC: App Inventor Extension for the Deployment of Image Classification Models Exported from Teachable Machine. It might be the basis for your app for the physically disabled, as already mentioned. Have you tried it?
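If it helps, here is a minimal sketch (in Python, outside App Inventor) of how you might sanity-check a model exported from Teachable Machine before deploying it through TMIC. The file names keras_model.h5 and labels.txt are assumed from the usual Teachable Machine Keras export, and test_sign.jpg is just a placeholder for one of your own photos; this is not part of TMIC itself.

```python
import numpy as np
import tensorflow as tf
from PIL import Image

# Assumed file names from a standard Teachable Machine Keras export.
model = tf.keras.models.load_model("keras_model.h5")
with open("labels.txt") as f:
    # Each line looks like "0 Class 1"; keep only the class name.
    labels = [line.strip().split(" ", 1)[-1] for line in f]

def classify(image_path):
    # Teachable Machine image models expect 224x224 RGB input scaled to [-1, 1].
    img = Image.open(image_path).convert("RGB").resize((224, 224))
    x = (np.asarray(img, dtype=np.float32) / 127.5) - 1.0
    probs = model.predict(x[np.newaxis, ...], verbose=0)[0]
    best = int(np.argmax(probs))
    return labels[best], float(probs[best])

# "test_sign.jpg" is a placeholder; use one of your own sign photos.
print(classify("test_sign.jpg"))
```

If the model already misclassifies photos like this on your computer, the problem is in the training data rather than in the App Inventor side.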

Hello,
I tried the sign language model but it does not work well. Maybe there should be more pictures for every sign. Can you help me?

You have identified what you can do to increase reliability in reading the finger signs, Amal.
Why don't you increase the number of images in your data set? First try adding more images for one or two of the signs; if increasing the training set for those signs improves recognition, add more images for all the signs. Maybe that will help.
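To check whether adding images actually improves recognition, you could keep a few photos of each sign out of the training set and measure per-sign accuracy on them. A rough sketch, again assuming the Teachable Machine Keras export files keras_model.h5 and labels.txt, and a hypothetical folder layout test_signs/<sign name>/*.jpg:

```python
import os
import numpy as np
import tensorflow as tf
from PIL import Image

# Hypothetical layout: test_signs/<sign name>/*.jpg, photos kept OUT of the training set.
# Folder names should match the class names in labels.txt.
TEST_DIR = "test_signs"

model = tf.keras.models.load_model("keras_model.h5")
with open("labels.txt") as f:
    labels = [line.strip().split(" ", 1)[-1] for line in f]

def predict(path):
    img = Image.open(path).convert("RGB").resize((224, 224))
    x = (np.asarray(img, dtype=np.float32) / 127.5) - 1.0
    return labels[int(np.argmax(model.predict(x[None, ...], verbose=0)[0]))]

# Per-sign accuracy shows which signs need more training images.
for sign in sorted(os.listdir(TEST_DIR)):
    folder = os.path.join(TEST_DIR, sign)
    files = [os.path.join(folder, name) for name in os.listdir(folder)]
    correct = sum(predict(f) == sign for f in files)
    print(f"{sign}: {correct}/{len(files)} recognized")
```

Run it once with your current model, then again after retraining with the extra images; the per-sign counts will show whether the larger data set actually helped.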

Another possibility for the lack of recognition is that the hand signs you are making are not properly formed. If you use your hand signs with a disabled person, can he or she understand the sign you make? This would be a good test of whether you are forming signs well enough for a deaf person to understand.

Thank you, I will try it.

And do you have any project with the same idea that does the recognition easily?