Runtime error "Stack size 8188kb" during big list slice

I'm trying to extract a small list from a much bigger list using the SLICE LIST block, but if the big list has, for example, 100,000 elements and I try to extract a piece of 5,000 elements with an initial index bigger than 60,000, I get the error "Stack size 8188kb".
If I extract a piece whose initial index is less than 50,000, everything is OK.
Does anybody know a FAST way to extract part of a huge list without using the "slice list" or "for each number/item" blocks?
P.S. The big list is the content of a file read in binary mode, and I need to process it byte by byte, but a normal "for each number/item..." block simply closes the app after processing about 100-200 KB (without any error message).
I found a workaround: dividing the huge list into pieces using a "for each number" block and processing each piece with a Clock. It works, but it's very, very slow (more than 10 minutes for a 1 MB file).
The "SLICE list" block is MUCH faster, but it gives a stack error if the file is bigger than 60-70 KB (and I need up to 2-3 MB).
P.S. I prepared a simple AIA that demonstrates both ERRORS (with SLICE LIST and with FOR EACH NUMBER when processing a big list). Please compile the AIA and select any file bigger than 800 KB on your smartphone.
Prova (1).aia (34.2 KB)

Put the data into a SQLite database to filter the data easily...
Select * from myTable where id > 60000 and id <= 65000

You need an extension to access the local sqlite database, for example App Inventor Extensions: SQlite | Pura Vida Apps


I thought AI2 only read text files.

Please download and post each of those event block(s)/procedures here ...
P.S. These blocks can be dragged directly into your Blocks Editor workspace.

See Download Block Images for a demo.

How did you get your byte code into an AI2 list ?

I'm using SAF.ReadAsByteArray.

It's not a problem to read the file as binary (I'm using the SAF extension). The problem is extracting 1,000 elements from a list with 1,000,000 elements.
The "for each item/number" block simply closes the app, and the SLICE LIST block gives the stack error when you try to extract elements starting from index 60,000.

Yes, and ...?

Oh, it has gone up from 100,000 to a million?

I prepared a simple AIA with the possibility to read a file as binary using SAF (maybe it will be useful for you 😀) and then to slice the list. The result is a STACK ERROR.
Prova.aia (33.4 KB)

Thank you Taifun. I was thinking about your SQLite extension (I've already bought it from you and used it successfully in my apps; it's PERFECT!!), but loading 2-3 million records into the DB each time I read a file seems a bit excessive (and will also take a lot of time).

The SLICE LIST block would be perfect; maybe somebody will be able to fix this problem (I prepared an AIA that shows the error).

You can't shard your data to load subsets?

Please compile the attached AIA and select any file bigger than 800 KB from your smartphone: you will receive a STACK ERROR during the "slice list" execution when extracting elements from index 700,000.

This is my first experience with a SAF ByteArray.

Is it text, or is it an AI2 list?

If it is text, why not use the text segment block?
If it is a list, why the SPLIT At SPACES?

Have you tried: JavaScript arrayBuffer.slice() Method ?
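For illustration, here is a minimal JavaScript sketch of that idea (it could be run from a WebViewer). The space-separated number string below is only a stand-in assumption for the SAF byte output described in this thread; the point is that a typed array's `slice()` copies the sub-range in native code, with no recursion and no stack growth:

```javascript
// Sketch: slicing a large byte sequence with JavaScript typed arrays.
// The space-separated number string mimics the SAF byte output
// described in this thread (an assumption, not the exact format).
const text = Array.from({ length: 1000000 }, (_, i) => (i % 256) - 128).join(" ");

// Parse once into a contiguous typed array.
const bytes = Int8Array.from(text.split(" "), Number);

// slice() copies the sub-range natively: no recursion, no stack error.
const chunk = bytes.slice(700000, 701000);
console.log(chunk.length); // 1000
```

Extracting a 1,000-element chunk this way is effectively instant even at index 700,000, because the cost is a single memory copy rather than a walk through a linked list.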

It's text where each byte of the file appears as a number (in digits). Each number is separated from the next by a space. The length of each number varies (values go up to 127, and there can also be a minus sign in front).

Unfortunately I don't know how to use JavaScript from AI2. Could you please give me a link to read?

Search the community for how to run javascript

To "Elaborate"....

Using these blocks, based on @ABG's suggestion to do the split using the segment block,

worked for me on image files up to about 2.5 MB in size; after that it crashed with an OOM error.

For larger file sizes, I suggest you segment the byte array into usable chunks from the beginning.

This was tested using companion app on Android 13 (Google Pixel 4a)

The byte list looks like strange JSON:
Note the leading '[' but spaces instead of commas.

Text segment is likely to split in the middle of a number.

True, hadn't thought of that :wink:
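One way around that, sketched here in JavaScript under the assumption that the data really is a '['-wrapped, space-separated string as described above: strip the brackets and split on whitespace first, so any later slicing works on whole tokens and can never cut a number in half.

```javascript
// Sketch: split on spaces before slicing, so a slice boundary can
// never fall in the middle of a multi-digit (or negative) number.
// The sample string mimics the format described above (assumption).
const raw = "[ 12 -5 127 0 33 -128 7 ]";
const tokens = raw.replace(/[\[\]]/g, "").trim().split(/\s+/);
const piece = tokens.slice(2, 5).map(Number);
console.log(piece); // [127, 0, 33]
```

The equivalent in AI2 blocks would be "split at spaces" on the whole text once, then slicing the resulting list rather than segmenting the raw text by character position.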

List operations might speed up if you avoid high item indices, since AI2 lists are linked lists.

Copy the list.
Loop until index1, removing item 1 as you go.
Loop until index2 - index1, adding each item onto a separate chunk list and deleting it from the copy as you go.

If that's not fast enough, use a Clock Timer and a Progress Notifier, 1000 bytes per chunk.
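For clarity, the loop above can be sketched in JavaScript (the function name is illustrative, and the 1-based indices follow App Inventor list conventions):

```javascript
// Sketch of the destructive-copy approach described above.
// Works on a copy, so the original list is untouched.
function extractChunk(list, index1, index2) {
  const copy = list.slice();                  // copy the list
  copy.splice(0, index1 - 1);                 // drop items before index1
  return copy.splice(0, index2 - index1 + 1); // peel off the chunk
}

const big = Array.from({ length: 100000 }, (_, i) => i + 1); // 1..100000
console.log(extractChunk(big, 60000, 60004)); // [60000, 60001, 60002, 60003, 60004]
```

The key design point is that every deletion happens at position 1, where a linked list is cheapest, instead of repeatedly indexing into the high end of the list.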

Brush up on your JavaScript.

Thank you ABG. By removing items, I was able to double the processing speed for a list with 25,000 elements.