I thunk – I mean think – that there is no way to do this, but some of you are pretty clever about figuring out ways to do stuff in Thunkable.
I have a student who wants to make a drawing app where you can overlay an outline on a live video feed to help with tracing artwork. The drawing itself would be done on paper, so she’s not using the Canvas component, for example.
In short, she wants to turn on a phone’s camera and see the live view in the app while also displaying an image over it.
I can stack components in the new drag-and-drop interface, so I can put an Image component on top of a Video component. The problem is that as soon as video recording or photo capture starts, the camera view fills the whole screen – I think because Thunkable hands off to the native camera app on my iPhone. It’s not until the video is recorded or the photo is taken that I can see the image on top of it in the app.
I’m interested in a demo of a way to see a “live” view of what the phone’s camera sees with an image on top of it. I’m 99% sure there’s no way to do it, but I’m willing to have someone prove me wrong!
I’m afraid that displaying content over the Camera component is not possible. Your student will need to capture a photo, or grab a still from a video, and work with that in her app.
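For anyone wondering why this is a hard limit: the Camera component hands control to the operating system’s full-screen camera UI, so there’s nothing left in the app’s own layout to draw on top of. A custom native app gets around this by rendering its own camera preview inside its view hierarchy and simply layering an image view above it. Here’s a rough Swift/AVFoundation sketch of that idea – the “outline” asset name is just a placeholder, and it assumes camera permission has already been granted. It’s only meant to illustrate the difference; it isn’t something the Thunkable Camera component exposes.

```swift
import UIKit
import AVFoundation

// Minimal sketch: live camera preview with a semi-transparent image on top.
// Assumes camera permission is granted (NSCameraUsageDescription in Info.plist)
// and that an "outline" image asset exists in the app bundle (hypothetical name).
final class TracingOverlayViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        configureCameraPreview()
        configureOverlay()
    }

    private func configureCameraPreview() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview is just another layer in the app's own view hierarchy,
        // so anything added after it draws on top of the live feed.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        // startRunning() blocks, so kick it off away from the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }

    private func configureOverlay() {
        // The tracing outline the student wants to follow.
        let overlay = UIImageView(image: UIImage(named: "outline"))
        overlay.frame = view.bounds
        overlay.contentMode = .scaleAspectFit
        overlay.alpha = 0.5 // semi-transparent so the live view shows through
        view.addSubview(overlay)
    }
}
```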
I’m assuming this is still the case? I’m trying to find a way to prompt the user to record different things while they’re recording, but the workarounds I’ve tried didn’t work. Just wondering if there’s been any change to the component, or if someone clever has found anything.