OCR extension crashes app

Hi,

I was tinkering with this OCR extension (https://nmd-apps.jimdo.com/extensions/nmd-extensions/#5).

Every time I try to take a picture with the camera or pick one from the gallery, the app crashes.

Using an image URL works fine. Am I doing something wrong, or is the extension buggy?

Please show how you have set the blocks and how you tried it.

This is how it works:

Hi, thanks for the quick reply.

Yes, I was using the exact same blocks.
I have now figured out that I have to export and install the app as an APK first; then it seems to work.

If I test it via the AIA live connection, it simply crashes.

One more question: is it possible to make this extension work with the Google OCR (https://cloud.google.com/vision/docs/detecting-text), or does that require more tweaking?

Yes, most things like this work only on a real device as an APK, not during live testing via the Companion or USB live testing.
The extension (created by me) uses this service:

https://ocr.space/ocrapi

I don't have much time to build the same thing with other services,
so for now I'll leave it as it is.

In case you want to find out for @Mika why it crashes, use logcat; see here


Taifun


Hi, I have one more question.

I have now tried your extension and also got another OCR service, from Microsoft, working with Thunkable (https://westus.dev.cognitive.microsoft.com/docs/services/56f91f2d778daf23d8ec6739/operations/56f91f2e778daf14a499e1fc).

I just have one issue: I want to scan text that is formatted as a table.

An example picture:

The OCR spits out something similar to this:

Team1, Brazil, Mexico, Spain, Chile …
(Each item in the column)

What I want is:

Brazil, 3, 1, Croatia, Sao Paulo
(Each item in a row)

Any ideas? Can I somehow use the coordinates in the JSON response? (If yes, how?)
The problem would be that if the picture is slightly tilted, it might mix up the position of some text.

JSON response below.


```
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Content-Length: 4485
Content-Type: application/json; charset=utf-8
Expires: -1

{
  "language": "en",
  "textAngle": 0.0,
  "orientation": "Up",
  "regions": [{
    "boundingBox": "6,49,143,327",
    "lines": [
      {"boundingBox": "55,49,78,18", "words": [{"boundingBox": "55,49,78,18", "text": "Teaml"}]},
      {"boundingBox": "7,86,57,19", "words": [{"boundingBox": "7,86,57,19", "text": "Brazil"}]},
      {"boundingBox": "7,125,80,18", "words": [{"boundingBox": "7,125,80,18", "text": "Mexico"}]},
      {"boundingBox": "6,163,59,23", "words": [{"boundingBox": "6,163,59,23", "text": "Spain"}]},
      {"boundingBox": "6,200,54,19", "words": [{"boundingBox": "6,200,54,19", "text": "Chile"}]},
      {"boundingBox": "6,238,104,19", "words": [{"boundingBox": "6,238,104,19", "text": "Colombia"}]},
      {"boundingBox": "6,276,143,19", "words": [{"boundingBox": "6,276,51,19", "text": "Cöte"}, {"boundingBox": "65,276,84,19", "text": "d'Ivoire"}]},
      {"boundingBox": "7,315,94,23", "words": [{"boundingBox": "7,315,94,23", "text": "Uruguay"}]},
      {"boundingBox": "7,352,87,24", "words": [{"boundingBox": "7,352,87,24", "text": "England"}]}
    ]
  }, {
    "boundingBox": "223,49,76,322",
    "lines": [
      {"boundingBox": "223,49,76,18", "words": [{"boundingBox": "223,49,76,18", "text": "Scorel"}]},
      {"boundingBox": "255,87,12,18", "words": [{"boundingBox": "255,87,12,18", "text": "3"}]},
      {"boundingBox": "256,125,11,18", "words": [{"boundingBox": "256,125,11,18", "text": "1"}]},
      {"boundingBox": "256,163,11,18", "words": [{"boundingBox": "256,163,11,18", "text": "1"}]},
      {"boundingBox": "255,201,12,18", "words": [{"boundingBox": "255,201,12,18", "text": "3"}]},
      {"boundingBox": "255,239,12,18", "words": [{"boundingBox": "255,239,12,18", "text": "3"}]},
      {"boundingBox": "255,277,12,18", "words": [{"boundingBox": "255,277,12,18", "text": "2"}]},
      {"boundingBox": "256,315,11,18", "words": [{"boundingBox": "256,315,11,18", "text": "1"}]},
      {"boundingBox": "256,353,11,18", "words": [{"boundingBox": "256,353,11,18", "text": "1"}]}
    ]
  }, {
    "boundingBox": "369,49,76,322",
    "lines": [
      {"boundingBox": "369,49,76,18", "words": [{"boundingBox": "369,49,76,18", "text": "Score2"}]},
      {"boundingBox": "402,87,11,18", "words": [{"boundingBox": "402,87,11,18", "text": "1"}]},
      {"boundingBox": "401,125,12,18", "words": [{"boundingBox": "401,125,12,18", "text": "0"}]},
      {"boundingBox": "401,163,12,18", "words": [{"boundingBox": "401,163,12,18", "text": "5"}]},
      {"boundingBox": "402,201,11,18", "words": [{"boundingBox": "402,201,11,18", "text": "1"}]},
      {"boundingBox": "401,239,12,18", "words": [{"boundingBox": "401,239,12,18", "text": "0"}]},
      {"boundingBox": "402,277,11,18", "words": [{"boundingBox": "402,277,11,18", "text": "1"}]},
      {"boundingBox": "401,315,12,18", "words": [{"boundingBox": "401,315,12,18", "text": "3"}]},
      {"boundingBox": "401,353,12,18", "words": [{"boundingBox": "401,353,12,18", "text": "2"}]}
    ]
  }, {
    "boundingBox": "568,49,78,18",
    "lines": [
      {"boundingBox": "568,49,78,18", "words": [{"boundingBox": "568,49,78,18", "text": "Team2"}]}
    ]
  }, {
    "boundingBox": "485,87,139,289",
    "lines": [
      {"boundingBox": "486,87,78,18", "words": [{"boundingBox": "486,87,78,18", "text": "Croatia"}]},
      {"boundingBox": "486,125,116,18", "words": [{"boundingBox": "486,125,116,18", "text": "Cameroon"}]},
      {"boundingBox": "487,162,137,19", "words": [{"boundingBox": "487,162,137,19", "text": "Netherlands"}]},
      {"boundingBox": "485,200,97,19", "words": [{"boundingBox": "485,200,97,19", "text": "Australia"}]},
      {"boundingBox": "486,239,80,18", "words": [{"boundingBox": "486,239,80,18", "text": "Greece"}]},
      {"boundingBox": "485,277,63,23", "words": [{"boundingBox": "485,277,63,23", "text": "Japan"}]},
      {"boundingBox": "486,315,112,18", "words": [{"boundingBox": "486,315,60,18", "text": "Costa"}, {"boundingBox": "556,315,42,18", "text": "Rica"}]},
      {"boundingBox": "487,352,45,24", "words": [{"boundingBox": "487,353,25,18", "text": "Ita"}, {"boundingBox": "516,352,16,24", "text": "ly"}]}
    ]
  }, {
    "boundingBox": "739,48,166,323",
    "lines": [
      {"boundingBox": "805,48,44,24", "words": [{"boundingBox": "805,48,44,24", "text": "City"}]},
      {"boundingBox": "739,86,108,19", "words": [{"boundingBox": "739,87,39,18", "text": "Sao"}, {"boundingBox": "787,86,60,19", "text": "Paulo"}]},
      {"boundingBox": "740,124,55,19", "words": [{"boundingBox": "740,124,55,19", "text": "Natal"}]},
      {"boundingBox": "739,162,97,19", "words": [{"boundingBox": "739,162,97,19", "text": "Salvador"}]},
      {"boundingBox": "739,200,86,19", "words": [{"boundingBox": "739,200,86,19", "text": "Curitiba"}]},
      {"boundingBox": "740,238,165,19", "words": [{"boundingBox": "740,238,47,19", "text": "Belo"}, {"boundingBox": "796,239,109,18", "text": "Horizonte"}]},
      {"boundingBox": "740,276,67,19", "words": [{"boundingBox": "740,276,67,19", "text": "Recife"}]},
      {"boundingBox": "740,314,100,19", "words": [{"boundingBox": "740,314,100,19", "text": "Fortaleza"}]},
      {"boundingBox": "740,353,88,18", "words": [{"boundingBox": "740,353,88,18", "text": "Manaus"}]}
    ]
  }]
}
```
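The `boundingBox` values in the response above are "x,y,width,height", so one way to get rows instead of columns is to flatten all the words, bucket them by their vertical centre, and sort each bucket left to right. Below is a rough Python sketch of that idea using a trimmed copy of the response; the 15-pixel tolerance is my own guess and would need tuning, and it only absorbs a slight tilt (a strongly tilted picture would first need its coordinates rotated, e.g. using the `textAngle` field).

```python
# A trimmed version of the OCR response above (boundingBox = "x,y,width,height").
response = {
    "regions": [
        {"lines": [
            {"words": [{"boundingBox": "55,49,78,18", "text": "Teaml"}]},
            {"words": [{"boundingBox": "7,86,57,19", "text": "Brazil"}]},
        ]},
        {"lines": [
            {"words": [{"boundingBox": "223,49,76,18", "text": "Scorel"}]},
            {"words": [{"boundingBox": "255,87,12,18", "text": "3"}]},
        ]},
    ]
}

def to_rows(resp, tolerance=15):
    """Flatten all words, then bucket them by vertical centre into table rows."""
    words = []
    for region in resp["regions"]:
        for line in region["lines"]:
            for word in line["words"]:
                x, y, w, h = map(int, word["boundingBox"].split(","))
                words.append((y + h // 2, x, word["text"]))  # (centre-y, x, text)
    rows = []
    for cy, x, text in sorted(words):
        # Start a new row when the word's centre is farther than `tolerance`
        # from the first word of the current row.
        if rows and abs(cy - rows[-1][0][0]) <= tolerance:
            rows[-1].append((cy, x, text))
        else:
            rows.append([(cy, x, text)])
    # Sort each row left to right and keep only the text.
    return [[t for _, _, t in sorted(r, key=lambda w: w[1])] for r in rows]

print(to_rows(response))  # → [['Teaml', 'Scorel'], ['Brazil', '3']]
```

With the full response this would yield one list per table row ("Brazil, 3, 1, Croatia, Sao Paulo"-style) instead of one per column.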

Hi, could you please post an example of how to use the OCR from Microsoft? Or could you tell me how to do it? I tried image and emotion recognition and they work well, but OCR does not.

Did you look above?
All the information you need is there.

Here are the blocks I used for the Microsoft OCR API. Let me know whether it works; I may have forgotten something in my messy block order :stuck_out_tongue:

URL = https://southeastasia.api.cognitive.microsoft.com/vision/v1.0/ocr?language=unk&detectOrientation=true
Host = southeastasia.api.cognitive.microsoft.com
Content-Type = application/octet-stream (for uploading pictures directly; if you want to pass an image URL instead, use application/json)
Ocp-Apim-Subscription-Key = YOUR API KEY

Info from here: https://southeastasia.dev.cognitive.microsoft.com/docs/services/56f91f2d778daf23d8ec6739/operations/56f91f2e778daf14a499e1fc/console
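For anyone not using the Web blocks, the same request can be sketched in plain Python. The endpoint, header names, and the two Content-Type choices are the ones listed above; `API_KEY` is a placeholder for your own subscription key, and `build_ocr_request` is just an illustrative helper name, not part of any library.

```python
import json
from urllib import request

# Endpoint and headers as described in the post above.
ENDPOINT = ("https://southeastasia.api.cognitive.microsoft.com/"
            "vision/v1.0/ocr?language=unk&detectOrientation=true")
API_KEY = "YOUR API KEY"  # placeholder: replace with your own subscription key

def build_ocr_request(image_url=None, image_bytes=None):
    """Return a urllib Request for the Microsoft OCR call.

    Pass image_url for a picture hosted on the web (Content-Type
    application/json) or image_bytes for a local file
    (application/octet-stream), matching the two options above."""
    if image_url is not None:
        body = json.dumps({"url": image_url}).encode("utf-8")
        content_type = "application/json"
    else:
        body = image_bytes
        content_type = "application/octet-stream"
    return request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": content_type,
            "Ocp-Apim-Subscription-Key": API_KEY,
        },
        method="POST",
    )

# Usage (requires a valid key and network access):
# req = build_ocr_request(image_url="https://example.com/table.png")
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

The response body should then look like the JSON posted earlier in this thread.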