Spring Batch - mark read data as "processing" via a table column flag, then restore it at the end
The relevant portions of my reader, processor, writer, and step/job configuration are shown below.
I have a requirement to update a flag column in the table whose data is being read (the source table), marking the rows currently being processed by the job so that other applications don't pick them up. Once processing of the read records is finished, I need to restore the column to its original value so that other applications can work on those records again.
My guess is that a listener is the approach to take (ItemReadListener?). A read listener seems suitable for the first part (i.e., updating the flag column) but not for restoring the flag at the end of the chunk. The challenge seems to be making the read data available again at the end of processing.
Can you suggest possible approaches?
    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory,
            ItemReader<RemittanceVO> reader,
            ItemWriter<RemittanceClaimVO> writer,
            ItemProcessor<RemittanceVO, RemittanceClaimVO> processor) {
        return stepBuilderFactory.get("step1")
                .<RemittanceVO, RemittanceClaimVO>chunk(Constants.SPRING_BATCH_CHUNK_SIZE)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .taskExecutor(simpleAsyncTaskExecutor)
                .throttleLimit(Constants.THROTTLE_LIMIT)
                .build();
    }

    @Bean
    public ItemReader<RemittanceVO> reader() {
        JdbcPagingItemReader<RemittanceVO> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(dataSource);
        reader.setRowMapper(new RemittanceRowMapper());
        reader.setQueryProvider(queryProvider);
        reader.setPageSize(Constants.SPRING_BATCH_READER_PAGE_SIZE);
        return reader;
    }

    @Bean
    public ItemProcessor<RemittanceVO, RemittanceClaimVO> processor() {
        return new MatchClaimProcessor();
    }

    @Bean
    public ItemWriter<RemittanceClaimVO> writer(DataSource dataSource) {
        return new MatchedClaimWriter();
    }
I started with Spring Batch only a few days ago, so I don't have much familiarity with the modeling and patterns it provides.
Firstly, a small hint on using an AsyncTaskExecutor: you have to synchronize the reader, otherwise you will run into concurrency problems. You can use a SynchronizedItemStreamReader for this:
    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory,
            ItemReader<RemittanceVO> reader,
            ItemWriter<RemittanceClaimVO> writer,
            ItemProcessor<RemittanceVO, RemittanceClaimVO> processor) {
        return stepBuilderFactory.get("step1")
                .<RemittanceVO, RemittanceClaimVO>chunk(Constants.SPRING_BATCH_CHUNK_SIZE)
                .reader(syncReader())
                .processor(processor)
                .writer(writer)
                .taskExecutor(simpleAsyncTaskExecutor)
                .throttleLimit(Constants.THROTTLE_LIMIT)
                .build();
    }

    @Bean
    public ItemReader<RemittanceVO> syncReader() {
        SynchronizedItemStreamReader<RemittanceVO> syncReader = new SynchronizedItemStreamReader<>();
        syncReader.setDelegate(reader());
        return syncReader;
    }

    @Bean
    public JdbcPagingItemReader<RemittanceVO> reader() {
        JdbcPagingItemReader<RemittanceVO> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(dataSource);
        reader.setRowMapper(new RemittanceRowMapper());
        reader.setQueryProvider(queryProvider);
        reader.setPageSize(Constants.SPRING_BATCH_READER_PAGE_SIZE);
        return reader;
    }
Secondly, a possible approach to your real question:
I would use a simple Tasklet in order to "mark" the entries you want to process. This can be done with one simple UPDATE statement, since you know your selection criteria. That way, you need only one call and therefore only one transaction.
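Such a marking tasklet could be sketched as follows. This is only an illustration: the table name REMITTANCE, the column PROCESSING_FLAG, the flag values, and the class name MarkRecordsTasklet are all assumptions, since the question does not show the schema.

```java
import javax.sql.DataSource;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.jdbc.core.JdbcTemplate;

// Hypothetical tasklet for the first step: flag every row this job run
// will handle, in a single statement and therefore a single transaction.
public class MarkRecordsTasklet implements Tasklet {

    private final JdbcTemplate jdbcTemplate;

    public MarkRecordsTasklet(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Table, column, and flag values are assumed for illustration.
        jdbcTemplate.update(
                "UPDATE REMITTANCE SET PROCESSING_FLAG = 'P' "
              + "WHERE PROCESSING_FLAG IS NULL AND STATUS = 'NEW'");
        return RepeatStatus.FINISHED;
    }
}
```

The restore tasklet for the last step would look the same, with the UPDATE statement inverted (setting PROCESSING_FLAG back to NULL for the marked rows).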
After that, implement a normal step with reader, processor, and writer. The reader only has to read the marked entries, which keeps your SELECT clause simple.
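With a JdbcPagingItemReader as in the question, the "read only marked entries" part could be expressed through the query provider, for example like this (a sketch; the table and column names again are assumptions):

```java
import javax.sql.DataSource;

import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;
import org.springframework.context.annotation.Bean;

public class QueryProviderConfig {

    // Paging query provider that selects only rows marked by the first step.
    @Bean
    public SqlPagingQueryProviderFactoryBean queryProvider(DataSource dataSource) {
        SqlPagingQueryProviderFactoryBean provider = new SqlPagingQueryProviderFactoryBean();
        provider.setDataSource(dataSource);
        provider.setSelectClause("SELECT ID, AMOUNT, STATUS");   // columns assumed
        provider.setFromClause("FROM REMITTANCE");               // table assumed
        provider.setWhereClause("WHERE PROCESSING_FLAG = 'P'");  // only marked rows
        provider.setSortKey("ID");                               // paging needs a sort key
        return provider;
    }
}
```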
In order to restore the flag, you could do that in a third step which is implemented as a Tasklet and uses an appropriate UPDATE statement (like the first step). To ensure the flag is restored even in case of an exception, configure your job flow appropriately so that step 3 is executed even if step 2 fails (see the answer to the question "Spring Batch Java Config: Skip step when exception and go to next steps").
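A job flow along those lines could be sketched like this. The bean and step names (remittanceJob, markStep, processStep, restoreStep) are assumptions; depending on your requirements you may also want to fail the job explicitly after restoring the flag on the FAILED path, rather than letting it complete normally.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.context.annotation.Bean;

public class JobFlowConfig {

    // Conditional flow: the restore step runs after the processing step
    // regardless of whether the processing step failed or completed.
    @Bean
    public Job remittanceJob(JobBuilderFactory jobBuilderFactory,
            Step markStep, Step processStep, Step restoreStep) {
        return jobBuilderFactory.get("remittanceJob")
                .start(markStep)
                .next(processStep)
                .on("FAILED").to(restoreStep)               // restore even on failure
                .from(processStep).on("*").to(restoreStep)  // and on any other outcome
                .end()
                .build();
    }
}
```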
Of course, you could also restore the flag when writing the chunk if you use a CompositeItemWriter. However, you would still need a strategy for restoring the flag in case of an exception in step 2.
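The CompositeItemWriter variant might look like the following sketch. FlagRestoringWriter is a hypothetical delegate (not from the question) that would reset the flag for the items of the chunk just written, inside the same chunk transaction.

```java
import java.util.Arrays;

import javax.sql.DataSource;

import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.CompositeItemWriter;
import org.springframework.context.annotation.Bean;

public class CompositeWriterConfig {

    // Chains the original writer with a hypothetical flag-restoring writer,
    // so the flag is cleared per chunk as part of the chunk transaction.
    @Bean
    public ItemWriter<RemittanceClaimVO> compositeWriter(DataSource dataSource) {
        CompositeItemWriter<RemittanceClaimVO> composite = new CompositeItemWriter<>();
        composite.setDelegates(Arrays.asList(
                new MatchedClaimWriter(),            // original writer from the question
                new FlagRestoringWriter(dataSource)  // hypothetical: resets the flag per chunk
        ));
        return composite;
    }
}
```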
IMO, using a listener is not a good idea here, since transaction handling works differently in listeners.