API return JSON puzzle

I’m working on giving my AI assistant a personality. To do this, I’ve crafted an intro prompt to prime it and then given it a set of adjustable personality traits as JSON. Since this is a lot of text and takes ChatGPT a while to process, I’ve asked it to generate a batch of responses up front so there’s less lag on each interaction.

Every interaction with the user requires three outputs from the AI agent: a greeting, an acknowledgment, and a callout. Each Set of three is an element in an array. The output JSON looks like this and works just fine:

[Screenshot of the output JSON]

Because this is coming back from ChatGPT, I first need to extract the actual response text, which is more involved than it was with GPT-3. Once that’s done, I’d like to iterate through the array / list in Thunkable with each new interaction with the user. I’m trying this, but getting nothing:

We’re all good up to number 1. Section 2 doesn’t work. I’ve tried a few different ways of extracting that info, but I really struggle with the logic of parsing JSON / objects.
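For reference, here’s roughly what I’m trying to do, written as a plain Python sketch rather than blocks (assuming the standard chat completions response shape; with GPT-3 the text came back at choices[0].text, but with ChatGPT it’s nested inside a message object, and that text then needs a second parse of its own):

```python
import json

# Trimmed-down example of what the ChatGPT (chat completions) endpoint returns.
response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": '[{"greeting": "Hi", "acknowledgment": "OK", "callout": "Right?"}]'
            }
        }
    ]
}

# 1. Pull the assistant's text out of the response wrapper.
content = response["choices"][0]["message"]["content"]

# 2. That text is itself JSON, so parse it again before iterating.
sets = json.loads(content)
for s in sets:
    print(s["greeting"], s["acknowledgment"], s["callout"])
```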

Section 2 looks right to me, and it’s helpful that you provided formatted JSON. But it’s best if you can post the full JSON response as text here. Make sure to format it using the </> button in the toolbar, which will keep any quotes from being converted to smart quotes. You can also, temporarily, create a variable or text string with that JSON response text and bypass the API call just to make sure your parsing blocks work. I can show you an example of that after you post the response here.
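In the meantime, the bypass idea in plain code terms is just a stub. Here’s a quick sketch where parse_sets is a made-up stand-in for whatever your parsing blocks do:

```python
import json

# Canned response text, pasted in by hand instead of coming from the API.
SAMPLE_RESPONSE = '[{"greeting": "Hi", "acknowledgment": "OK", "callout": "Right?"}]'

def parse_sets(text):
    # Stand-in for the parsing blocks under test.
    return json.loads(text)

# If this fails, the problem is in the parsing, not in the API call.
assert parse_sets(SAMPLE_RESPONSE)[0]["greeting"] == "Hi"
```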

So there were definite errors in that section 2, which I’ve picked up, but it’s still not working.

I altered the structure to make sure the JSON is being returned as below and it definitely is. Here’s what it should look like:

Here’s what the test looks like to capture the JSON and make sure it’s correct:

Here’s the JSON:

[{"greeting":"What do you want?","acknowledgment":"Ugh, fine.","callout":"Have I got that right?"},
{"greeting":"What now?","acknowledgment":"Got it.","callout":"Does this look good enough for you?"},
{"greeting":"Why are you bothering me?","acknowledgment":"Sure thing, boss.","callout":"Is this what you needed?"}]

I’m not sure this is going to work for me as ChatGPT is now so slow at returning this JSON that it’s nearly unusable!

Which endpoint are you using? GPT-4 isn’t super fast for me but the responses come back in a few seconds usually.

Oh… interesting… the JSON response actually contains the word “array”. This is why having the full text and not just a screenshot is so important when troubleshooting APIs. This is what I see when I paste your JSON text into https://codebeautify.org/jsonviewer:

As you can see, the “greeting” property has a path (at the top) that is array[1].greeting (Thunkable indexes lists starting at 1 whereas APIs typically start arrays at 0). You can either use that as the property name or you can modify your existing blocks to get the first list item of the property “array” and then from that get the property “greeting”.
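In plain code terms, the two routes come to the same thing. Here’s a rough Python sketch, assuming the response really is wrapped in an “array” property the way the viewer shows it (Python is 0-based, so Thunkable’s list item 1 is index 0 here):

```python
import json

# Hypothetical response text, wrapped the way the JSON viewer displays it.
text = '{"array": [{"greeting": "What do you want?", "acknowledgment": "Ugh, fine.", "callout": "Have I got that right?"}]}'
data = json.loads(text)

# Route 1: treat "array[1].greeting" as a single property path (what you can type into the blocks).
# Route 2: get the "array" property, take the first list item, then its "greeting" property.
first_set = data["array"][0]
print(first_set["greeting"])  # "What do you want?"
```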


So, I still don’t have GPT-4 access :frowning: I’ve requested it, but my understanding is that if I send anything to the GPT-4 endpoint I’ll get an access-denied error. I’m using the GPT-3.5-Turbo (ChatGPT) endpoint right now.

Wider context of what I’m trying to do might help then. Since I’m sending ChatGPT a long characterisation prompt along with the character traits JSON to get it into character, it’s a lot of tokens for just a greeting / acknowledgment / callout in return. So I figured I’d ask it to produce 3-5 Sets of these and store them all in a stored variable for later use. That way, it seems to work much faster for the user. The app will use greeting[1], then acknowledgment[1], then callout[1], and then it’ll increment to the second Set in the array. Once the array is used up I’ll refill it in the background.
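In rough code terms, the consumption side I have in mind looks something like this. It’s just a Python sketch to show the pattern; the names and the refill function are made up, and the real thing is all blocks:

```python
import json

def refill_from_chatgpt():
    # Stand-in for the real API call that asks ChatGPT for 3-5 fresh Sets in character.
    return json.loads('[{"greeting": "What now?", "acknowledgment": "Got it.", "callout": "Good enough?"}]')

convo_sets = []     # pre-generated Sets, stored for later use
current_index = 0   # which Set the next interaction should draw from

def next_set():
    """Return greeting/acknowledgment/callout for this interaction, then move to the next Set."""
    global convo_sets, current_index
    if current_index >= len(convo_sets):
        convo_sets = refill_from_chatgpt()  # refill when the array is used up
        current_index = 0
    s = convo_sets[current_index]
    current_index += 1
    return s["greeting"], s["acknowledgment"], s["callout"]
```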

RE: your answer then. Ahhh, that makes sense! Interesting that you can address it this way (via an object property); I thought I’d have to address the elements of the array as a list. Good to know! For those as dense as me, here’s what a working version looks like:

As always - THANK YOU. I’d have given up on Thunkable ages ago without your help. I appreciate it :slight_smile:


You’re welcome! And thanks for sharing a screenshot of working blocks. That helps others in the future.

Just one note about your blocks: accessing stored variables is a slow process in Thunkable. If at all possible, avoid doing it several times throughout your code. The approach I take is to assign the stored variable’s value to an app variable when the screen opens. That way, I retrieve the saved value once. Then I use the app variable throughout the rest of my code. If I need to change the value of the stored (saved) variable, I use a stored variable block. But any time I’m only reading that value, I use the app variable, which is much faster.
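Here’s the same pattern in plain code, just to make the idea concrete. It’s only a sketch: the _device_storage dict and the helper names stand in for Thunkable’s stored variable blocks, which go to device storage:

```python
_device_storage = {}   # stand-in for the on-device storage behind stored variables

def load_stored(key):
    return _device_storage.get(key)     # the slow read

def save_stored(key, value):
    _device_storage[key] = value        # the slow write

# In-memory copy: read storage once when the screen opens, then use this everywhere.
app_convo_set = None

def on_screen_open():
    global app_convo_set
    app_convo_set = load_stored("ConvoSet")   # one slow read per screen open

def read_convo_set():
    return app_convo_set                      # fast in-memory read, used everywhere else

def update_convo_set(new_value):
    global app_convo_set
    app_convo_set = new_value
    save_stored("ConvoSet", new_value)        # slow write, but only when the value actually changes
```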

I think you’ll find your Post function is considerably faster just replacing the [get object from JSON stored variable ConvoSet] with [get object from JSON app variable ConvoSet].

I haven’t done a timed test recently but just from memory, accessing a stored variable takes something like 0.3 seconds whereas accessing an app variable takes something like 0.03 seconds. Don’t quote me on that but the difference is significant!


Oh wow! That’s very good to know! I was putting the speed down to ChatGPT but perhaps it’s not entirely that. I’ll go looking for them all and see if there’s a way of trimming them back. Appreciate that :wink:

I was getting response times of 10 seconds from an API once and was frustrated that I couldn’t figure out how to speed it up. I finally realized it was the stored variables, and once I removed them the response times dropped to about 1-2 seconds.
