How can I make a gesture recognition app that translates sign language, or at least recognizes the alphabet shown by gestures?

Hi everyone, this is my first time on this forum. For a school project I want to make a gesture recognition app that could translate sign language, or at least, for the purposes of the project, distinguish the letters of the alphabet shown by gestures. Right now I have the project design and layout in Thunkable, and I have trained a Teachable Machine model to recognize a few gestures as a test. I asked ChatGPT and DeepSeek for help, and they suggested a solution via the Web Viewer: basically, create an HTML page, upload it to some hosting, and link it in the Web Viewer. I've tried a bunch of options and nothing works. Has anyone encountered this problem? If so, I'd be glad to hear how you solved it.
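For anyone trying the same approach, here is a minimal sketch of what such a hosted HTML page could look like, based on the standard Teachable Machine image-model export snippet. The model URL is a hypothetical placeholder (`YOUR_MODEL_ID` must be replaced with the ID from your own exported model), and this assumes a webcam-based image model; it is a starting point to adapt, not a tested Thunkable integration.

```html
<!DOCTYPE html>
<html>
<head>
  <!-- TensorFlow.js and the Teachable Machine image library -->
  <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
  <script src="https://cdn.jsdelivr.net/npm/@teachablemachine/image"></script>
</head>
<body>
  <div id="webcam-container"></div>
  <div id="label"></div>
  <script>
    // Placeholder: replace YOUR_MODEL_ID with your exported model's ID.
    const MODEL_URL = "https://teachablemachine.withgoogle.com/models/YOUR_MODEL_ID/";

    let model, webcam;

    async function init() {
      // Load the exported model and its metadata.
      model = await tmImage.load(MODEL_URL + "model.json", MODEL_URL + "metadata.json");

      // Set up the webcam (width, height, flip horizontally).
      webcam = new tmImage.Webcam(200, 200, true);
      await webcam.setup(); // triggers the browser's camera-permission prompt
      await webcam.play();
      document.getElementById("webcam-container").appendChild(webcam.canvas);

      window.requestAnimationFrame(loop);
    }

    async function loop() {
      webcam.update();
      const predictions = await model.predict(webcam.canvas);

      // Show the class with the highest probability (e.g. the detected letter).
      const best = predictions.reduce((a, b) => (a.probability > b.probability ? a : b));
      document.getElementById("label").textContent = best.className;

      window.requestAnimationFrame(loop);
    }

    init();
  </script>
</body>
</html>
```

One thing to check when loading this inside Thunkable's Web Viewer: the page must be served over HTTPS, and the camera-permission prompt may behave differently inside an embedded web view than in a regular browser, which could be why a page that works on its own fails inside the app.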

Translated with DeepL.com (free version)

Hi faktor066797ft, welcome to Thunkable! :tada:

Be sure to check out How to ask Great Questions v2.0, the Community Guidelines, and our Getting Started Guide to make the most of your Thunkable Community experience!

Could you show us what you have done so far to get this working? It sounds quite advanced, but if you can share your project or screenshots of the blocks you've used, someone may have a suggestion for you.

Hey, @faktor066797ft, and welcome to Thunkable!

Check out this link on integrating Teachable Machine with Thunkable. I don't know whether it still works, but it may be a good starting point: