OOME on MySQL query before using the ResultSet

I have a simple table with lots of rows (well… about 400,000), at less than 100 bytes per row.

So I connect to the database, create a Statement, and call Statement.executeQuery(String)


public static void main(String[] args) throws Exception
{
   Class.forName("com.mysql.jdbc.Driver");
   java.sql.Connection con = DriverManager.getConnection(args[0], args[1], args[2]);
   String sql = "SELECT `id` FROM `SomeTable` WHERE `something` <> 'something else'";
   Statement stmt = con.createStatement();
   stmt.setFetchSize(64);
   ResultSet result = stmt.executeQuery(sql);
}

After a few seconds, I get the infamous OOME:


Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
        at com.mysql.jdbc.Buffer.getBytes(Buffer.java:198)
        at com.mysql.jdbc.Buffer.readLenByteArray(Buffer.java:318)
        at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1375)
        at com.mysql.jdbc.MysqlIO.readSingleRowSet(MysqlIO.java:2369)
        at com.mysql.jdbc.MysqlIO.getResultSet(MysqlIO.java:451)
        at com.mysql.jdbc.MysqlIO.readResultsForQueryOrUpdate(MysqlIO.java:2076)
        at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:1451)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1787)
        at com.mysql.jdbc.Connection.execSQL(Connection.java:3277)
        at com.mysql.jdbc.Connection.execSQL(Connection.java:3206)
        at com.mysql.jdbc.Statement.executeQuery(Statement.java:1232)
        at touchetracks.Launcher.main(Launcher.java:16)

As you can see, the code crashes inside executeQuery(…), so I’m not even traversing the ResultSet yet.
It seems like it’s loading all rows into RAM before returning.

When I add a condition to reduce the number of rows in the ResultSet, the problem goes away - but I need almost the whole table in this case.

Can anybody be kind enough to point me at the obvious thing I forgot?

The MySQL JDBC driver is quite bad with these things. I’m afraid you have to use “LIMIT [offset,] row_count” in your statement and traverse the chunks yourself.
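
Something like this rough sketch is what I mean, reusing your connection and query (untested; the chunk size is just a placeholder to tune against your memory budget):


final int CHUNK_SIZE = 10000;   // placeholder, tune to your memory budget
int offset = 0;
while (true)
{
    String sql = "SELECT `id` FROM `SomeTable` WHERE `something` <> 'something else'"
               + " LIMIT " + offset + ", " + CHUNK_SIZE;
    Statement stmt = con.createStatement();
    ResultSet rs = stmt.executeQuery(sql);
    int rows = 0;
    while (rs.next())
    {
        long id = rs.getLong(1);   // process each id here
        rows++;
    }
    rs.close();
    stmt.close();
    if (rows < CHUNK_SIZE)
        break;                     // last (partial) chunk: done
    offset += CHUNK_SIZE;
}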

:o

Using LIMITs is easy, but kinda ‘blegh’ from a technical point of view (and… how big should the chunk size be?).

Keeping memory usage low (it’s a Virtual Private Server, so free memory is scarce) forces me to use a fairly small chunk size, which is (probably) much slower than one query.

Can I connect to a MySQL database through another driver, or shouldn’t I even bother…

Great… there is an “ORDER BY” clause in my query, so every query with a LIMIT takes roughly as long as the original query (without LIMIT): the server still has to sort the matching rows before it can skip to the requested offset.

So splitting my query over N LIMIT chunks makes execution time grow by roughly a factor of N.

Nice way to kill your server!

grmbl

Hurray for Google…


         Statement stmt = con.createStatement();
         stmt.setFetchSize(Integer.MIN_VALUE);  // <------- tells Connector/J to stream rows one at a time
         ResultSet result = stmt.executeQuery(sql);
         return SQLUtil.grabIterator(stmt, result);

… did the job.
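
For reference, the plain-JDBC traversal behind that looks roughly like this (SQLUtil.grabIterator is just my own wrapper; the explicit TYPE_FORWARD_ONLY / CONCUR_READ_ONLY arguments are what createStatement() defaults to anyway, and are what Connector/J wants for streaming):


Statement stmt = con.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                     ResultSet.CONCUR_READ_ONLY);
stmt.setFetchSize(Integer.MIN_VALUE);   // ask Connector/J to stream rows
ResultSet result = stmt.executeQuery(sql);
try
{
    while (result.next())
    {
        long id = result.getLong(1);    // handle one row at a time
    }
}
finally
{
    // a streaming result set must be fully read or closed before the
    // connection can be used for any other query
    result.close();
    stmt.close();
}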

Hehe, I was just about to post the very same thing ;D

Seems like I’m not the only one doing raw SQL in Java, in this world of JEE and at least a dozen abstraction layers.

Well, the workaround is not that good; streaming rows this way is rather slow, since it seems to fetch one row at a time…

Have you tried this:

“When using Connector/J 5.0.1 along with more recent builds of the MySQL server, you can add “useCursorFetch=true” to your JDBC url parameters, and the driver will fetch rows in batches of size setFetchSize() as defined in the JDBC API.”
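
If I read the docs right, the setup would look something like this (the host/database in the URL and the batch size of 1000 are just example values):


java.sql.Connection con = DriverManager.getConnection(
        "jdbc:mysql://localhost/mydb?useCursorFetch=true", "user", "password");
Statement stmt = con.createStatement();
stmt.setFetchSize(1000);                // rows pulled from the server per batch
ResultSet result = stmt.executeQuery(sql);
while (result.next())
{
    // process rows here; memory stays bounded by the batch size
}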

JDBC URL parameters      Fetch size               Result
-------------------      ----------               ------
{no parameters}          64k rows                 OOME
{no parameters}          Integer.MIN_VALUE        27s
{no parameters}          1 row                    OOME
useCursorFetch=true      64k rows                 41s
useCursorFetch=true      Integer.MIN_VALUE        24s
useCursorFetch=true      1 row                    65s