I have a very long series of (time) data. I sequentially scan it once to count the entries that satisfy various “complicated” criteria. After that, I allocate storage data structures (arrays) of those sizes. Finally, I have to re-scan the data a second time to place the entries in the corresponding arrays.
Is there any smart way to bunch the matching data (ordered or unordered) temporarily and dynamically during the first scan, so that the second scan becomes unnecessary? I understand that allocating exact-size arrays is efficient both for memory handling and for the later calculations. Note that while the data are being bunched, no calculations are done at all. Then, once the required sizes are known, the data could be transferred permanently (in order or not) to the allocated arrays, and the memory “abused” for the temporary bunching released. I believe that transferring the data is cheaper than scanning it a second time, isn’t it?
Any suggestions? TIA.
Note (added later for clarification): The data have already been read from the file and are stored in a 1-d array. The entries matched during the first parse/loop/scan of that array need to be stored temporarily and then transferred to the final arrays once the parse is complete and the array sizes are known, so that a second parse of the array is avoided. The first answers mostly concern reading and storing the data from files.