Camel: split a big SQL result into smaller chunks


Because of memory limitations I need to split the result of the SQL component (a List<Map<column, value>>) into smaller chunks (a few thousand rows each).

I know about

    from("sql:...").split(body()).streaming().to(...)

and about

    .split().tokenize("\n", 1000).streaming()

but the latter does not work on a List<Map<>>; it expects a String body. Is there an out-of-the-box way to create such chunks? Do I need to add a custom aggregator behind the split? Or is there another way?
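One way to get fixed-size chunks is to stream-split the list and then re-aggregate the rows with a small custom AggregationStrategy. A minimal sketch (the endpoint URIs, the chunk size of 1000, and the ListAggregationStrategy class are my own placeholders, not anything the SQL component provides), written against the Camel 2.x API:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    import org.apache.camel.Exchange;
    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.processor.aggregate.AggregationStrategy;

    public class ChunkingRoute extends RouteBuilder {

        @Override
        public void configure() {
            from("sql:select * from lookup_table?dataSource=#lookupDs")
                .split(body()).streaming()                 // one exchange per row
                .aggregate(constant(true), new ListAggregationStrategy())
                    .completionSize(1000)                  // emit a chunk every 1000 rows
                    .completionTimeout(5000)               // flush the last, partial chunk
                .to("direct:processChunk");
        }

        // Re-collects single-row Maps into a List<Map<String, Object>> chunk
        private static class ListAggregationStrategy implements AggregationStrategy {
            @Override
            @SuppressWarnings("unchecked")
            public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
                Map<String, Object> row = newExchange.getIn().getBody(Map.class);
                if (oldExchange == null) {
                    List<Map<String, Object>> chunk = new ArrayList<Map<String, Object>>();
                    chunk.add(row);
                    newExchange.getIn().setBody(chunk);
                    return newExchange;
                }
                oldExchange.getIn().getBody(List.class).add(row);
                return oldExchange;
            }
        }
    }

Each exchange arriving at direct:processChunk then carries at most 1000 rows. Note that with useIterator=false the full result list is still materialized before the split, so this mainly bounds what each downstream step holds at once.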

EDIT (additional info requested by soilworker): At the moment the SQL endpoint is configured this way:

    SqlEndpoint endpoint = context.getEndpoint(
            "sql:select * from " + lookupTableName + "?dataSource=" + LOOK_UP_DS,
            SqlEndpoint.class);
    // useIterator=false returns the complete result in one List
    // instead of one exchange per row
    endpoint.getConsumerProperties().put("useIterator", false);
    // poll interval
    endpoint.getConsumerProperties().put("delay", LOOKUP_POLL_INTERVAL);

The route using it should poll once a day (we will add a CronScheduledRoutePolicy soon) and fetch the complete table (a view). The data is converted to CSV by a custom processor and sent to proprietary software via a custom component. The table has 5 columns (small strings) and around 20M entries. I don't know for sure whether there is a memory issue; I only know that on my local machine 3 GB isn't enough. Is there a way to approximate the memory footprint, to know whether a given amount of RAM would be enough?
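As a rough back-of-the-envelope estimate (the figures below are assumptions, not measurements): on a 64-bit JVM a small String of ~10 characters costs roughly 50–60 bytes, and each HashMap entry adds around 40–50 bytes of node and reference overhead. A 5-column row stored as a Map<String, Object> therefore easily costs 400–500+ bytes, so 20M rows materialized as a single List would be on the order of 8–10 GB, which would explain why 3 GB is not enough. Streaming the rows (useIterator=true) or chunking keeps only a bounded number of rows in memory at a time.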

Thanks in advance.

Setting maxMessagesPerPoll will result in batches.
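Concretely, that would mean keeping useIterator=true (the default, one exchange per row) and capping how many rows each poll emits. A sketch with placeholder names, assuming the SQL component's maxMessagesPerPoll consumer option:

    // Sketch: lookup_table and lookupDs are placeholders.
    // With useIterator=true (the default) each row becomes one exchange,
    // and maxMessagesPerPoll caps how many rows are processed per poll.
    from("sql:select * from lookup_table?dataSource=#lookupDs&maxMessagesPerPoll=1000")
        .to("direct:processRow");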

