jdbc batch insert exception handling

Tags: java, sql, oracle, jdbc, oracle10g

I am performing a JDBC batch insert (approximately 1000 rows at a time) each time my program is executed, but I am not able to handle exceptions thrown by some of the records properly.

Suppose the 100th record out of 1000 throws an exception because of invalid data, or because the size of some value exceeds the column size. Once the exception has occurred, the remaining records are not inserted and the program fails partway through.

What I want is that even if the 100th record throws an exception, the remaining insertions should still happen as usual before my program ends.

I am not able to work out how to achieve this. Please suggest.


Here is some sample code I am using in my app for the batch insert. Assume that the result set has approximately 1000 records.

PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO my_table (age, name) VALUES (?, ?)"); // your insert SQL

while (rs.next()) {
    // Retrieve the values and bind them to the prepared statement
    String name = rs.getString("Name");
    int age = rs.getInt("Age");
    ps.setInt(1, age);
    ps.setString(2, name);

    // Queue this row in the batch
    ps.addBatch();
}

// Finally, call executeBatch to send the whole batch to the database
int[] updateCounts = ps.executeBatch();

If the 100th record throws an exception, I want to resume the process from the 100th record through to the 1000th. Is there some way to restart the insertions from the record that threw the exception and continue to the end? How can I achieve this?

asked Sep 29, 2015 by r3tt
0 votes

2 Answers

0 votes

You state that you are executing batches of SQL statements, so I'm assuming you are using executeBatch. When you execute a batch of statements, some of them may succeed and some may fail. If one statement fails, that does not mean the JDBC driver rolls back the statements that succeeded. The driver may choose to attempt all the statements in the batch even when one fails, or it may choose to stop executing the batch as soon as a statement fails (it sounds like the driver you're using stops at the first failure).

When one statement in a batch fails, you should get a BatchUpdateException. In your exception handler, call getUpdateCounts. That gives you an array of int in which each element is either the number of rows the statement updated, Statement.SUCCESS_NO_INFO (the statement succeeded but no row count was available), or Statement.EXECUTE_FAILED (the statement failed). If the first 99 statements succeed, the 100th generates an error, and the remaining statements are not executed, you should get back a 100-element array whose first 99 elements indicate success and whose 100th element is Statement.EXECUTE_FAILED. Your code would then need to retry the statements that were never executed (in this case, statements 101-1000).
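The retry logic above can be sketched as follows. This is only a sketch, not the asker's actual code: the class and method names are made up, and the JDBC calls that would surround it are shown in comments. The two helpers scan the array returned by getUpdateCounts and are plain Java, independent of any particular driver.

```java
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class BatchRetry {

    // Indices of statements the driver reported as failed
    // (Statement.EXECUTE_FAILED entries in the update counts).
    static List<Integer> failedIndexes(int[] updateCounts) {
        List<Integer> failed = new ArrayList<>();
        for (int i = 0; i < updateCounts.length; i++) {
            if (updateCounts[i] == Statement.EXECUTE_FAILED) {
                failed.add(i);
            }
        }
        return failed;
    }

    // Indices of statements the driver never executed: if it stopped at
    // the first failure, the update-counts array is shorter than the batch,
    // and everything past its end still needs to be retried.
    static List<Integer> notExecuted(int[] updateCounts, int batchSize) {
        List<Integer> retry = new ArrayList<>();
        for (int i = updateCounts.length; i < batchSize; i++) {
            retry.add(i);
        }
        return retry;
    }

    // Usage sketch (hypothetical surrounding code):
    //
    // try {
    //     ps.executeBatch();
    // } catch (BatchUpdateException e) {
    //     int[] counts = e.getUpdateCounts();
    //     List<Integer> skip  = failedIndexes(counts);          // bad rows: log them
    //     List<Integer> redo  = notExecuted(counts, batchSize); // rebuild and re-run these
    // }
}
```

With the 100th-of-1000 scenario from the answer, `failedIndexes` would report index 99 (the bad row) and `notExecuted` would report indices 100 through 999, which are the statements to re-batch.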

answered Sep 29, 2015 by ukohale
0 votes

Without seeing any code, the best advice I can give is to wrap the code that does the inserting in a try/catch block. You will also probably need to commit after each insert (assuming a transactional connection). This makes processing somewhat slower, but allows your program to continue past a bad record.
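The continue-on-failure shape this describes can be sketched as below. To keep it self-contained (no database needed), each insert is represented as a Callable; in real code the body of the try would be the ps.executeUpdate() and conn.commit() calls, as noted in the comments.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;

public class RowByRow {

    // Run each insert action independently. A failure on one row is
    // recorded and the loop continues with the next row, instead of
    // the whole run aborting at the first bad record.
    static List<Integer> runAll(List<Callable<Void>> inserts) {
        List<Integer> failedRows = new ArrayList<>();
        for (int i = 0; i < inserts.size(); i++) {
            try {
                inserts.get(i).call(); // e.g. ps.executeUpdate(); conn.commit();
            } catch (Exception e) {
                failedRows.add(i);     // log the bad row and move on
            }
        }
        return failedRows;
    }
}
```

The returned list tells you afterwards which rows need attention, which also answers the asker's "restart from the failed record" requirement without re-running the rows that already succeeded.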

I think a cleaner approach would be to pre-validate the records before inserting. This should not be too much of a processing load if we are only talking about 1000 records at a time.
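Pre-validation might look like the sketch below. The 30-character name limit and the age range are invented stand-ins; the real checks would mirror the actual column definitions in the asker's schema.

```java
public class RowValidator {

    // Assumed to match the column, e.g. VARCHAR2(30); check your schema.
    static final int NAME_COL_SIZE = 30;

    // Reject rows that would violate the (assumed) column constraints
    // before they ever reach addBatch(), so the batch itself cannot fail
    // on a size or data error.
    static boolean isValid(String name, int age) {
        if (name == null || name.isEmpty() || name.length() > NAME_COL_SIZE) {
            return false;
        }
        return age >= 0 && age <= 150;
    }
}
```

Rows that fail validation can be logged or written to a reject file, and everything that passes goes into the batch as before.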

answered Sep 29, 2015 by sandeep bhadauria