How to optimize writing this data to a postgres database

I'm parsing poker hand histories and storing the data in a Postgres database.

I'm getting relatively bad performance, and parsing the files takes several hours. I can see that the database part takes 97% of the total program time, so even a small optimization there would make the whole thing a lot quicker.

The way I have it set up now is as follows:

  1. Read next file into a string.
  2. Parse one game and store it into object GameData.
  3. For every player, check if we have his name in the std::map. If so, store the playerids in an array and go to 5.
  4. Insert the player, add it to the std::map, and store the playerids in an array. (Steps 3 and 4 are sketched in code below this list.)
  5. Using the playerids array, insert the moves for this betting round, store the moveids in an array.
  6. Using the moveids array, insert a movesequence, store the movesequenceids in an array.
  7. If this isn't the last round played, go to 5.
  8. Using the movesequenceids array, insert a game.
  9. If this was not the final game, go to 2.
  10. If this was not the last file, go to 1.
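In code, steps 3 and 4 boil down to roughly this (simplified; insertPlayer wraps the actual INSERT ... RETURNING query):

#include <map>
#include <string>

// Simplified helper: runs INSERT INTO players ... RETURNING id.
int insertPlayer(const std::string &name);

// Step 3/4 cache: each player name is INSERTed only once.
std::map<std::string, int> playerIdCache;

int getPlayerId(const std::string &name) {
    auto it = playerIdCache.find(name);
    if (it != playerIdCache.end())
        return it->second;           // step 3: name already known
    int id = insertPlayer(name);     // step 4: one INSERT round-trip
    playerIdCache[name] = id;
    return id;
}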

Since I'm sending queries for every move, for every movesequence, for every game, I'm obviously doing too many queries. How should I bundle them for best performance? I don't mind rewriting a bit of code, so don't hold back. :)

Thanks in advance.

CX

Answers


It's very hard to answer this without seeing any queries, the schema, or your PostgreSQL version.

In general, though, the answer to these problems is to batch the work into bigger, coarser chunks to avoid repeating lots of per-statement overhead and, most importantly, to do it all in one transaction.

You haven't said anything about transactions, so I'm wondering if you're doing all this in autocommit mode. Bad plan. Try wrapping the whole process in a BEGIN and COMMIT. If it's a seriously long-running process, then COMMIT every few minutes / tens of games / whatever, write a checkpoint file or DB entry your program can use to resume the import from that point, and open a new transaction to carry on.
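A minimal sketch of that pattern with libpq (error handling is trimmed, and import_game stands in for your per-game insert logic):

#include <libpq-fe.h>
#include <cstdio>

static void exec_sql(PGconn *conn, const char *sql) {
    PGresult *res = PQexec(conn, sql);
    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        std::fprintf(stderr, "%s failed: %s", sql, PQerrorMessage(conn));
    PQclear(res);
}

// Hypothetical: stands in for steps 2-8 of your loop.
void import_game(PGconn *conn);

void import_all(PGconn *conn, int n_games) {
    exec_sql(conn, "BEGIN");
    for (int i = 0; i < n_games; ++i) {
        import_game(conn);
        if ((i + 1) % 1000 == 0) {    // checkpoint every 1000 games
            exec_sql(conn, "COMMIT");
            /* record i in a checkpoint file or DB row here, so a
               crashed run can resume from this point */
            exec_sql(conn, "BEGIN");
        }
    }
    exec_sql(conn, "COMMIT");
}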

It'll help to use multi-valued inserts where you're inserting multiple rows into the same table, e.g.:

INSERT INTO some_table(col1, col2, col3) VALUES
('a','b','c'),
('1','2','3'),
('bork','spam','eggs');
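From C++ you can assemble one such statement per batch; a sketch, with illustrative table/column names and values escaped via PQescapeLiteral (error handling trimmed):

#include <libpq-fe.h>
#include <array>
#include <string>
#include <vector>

// Build one multi-valued INSERT for a whole batch of rows instead of
// issuing one INSERT per row.
std::string build_insert(PGconn *conn,
                         const std::vector<std::array<std::string, 3>> &rows) {
    std::string sql = "INSERT INTO some_table(col1, col2, col3) VALUES ";
    for (std::size_t i = 0; i < rows.size(); ++i) {
        if (i) sql += ", ";
        sql += "(";
        for (std::size_t j = 0; j < 3; ++j) {
            if (j) sql += ", ";
            // PQescapeLiteral quotes and escapes the value for us.
            char *lit = PQescapeLiteral(conn, rows[i][j].c_str(),
                                        rows[i][j].size());
            sql += lit;
            PQfreemem(lit);
        }
        sql += ")";
    }
    return sql;    // execute with PQexec(conn, sql.c_str())
}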

You can improve commit rates with synchronous_commit=off and a commit_delay, but that's not very useful if you're batching work into bigger transactions.
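If you want to experiment with it anyway, it's just a session-level setting; a sketch:

#include <libpq-fe.h>

// synchronous_commit = off can lose the last few transactions on a
// crash but cannot corrupt the database; no config file edit needed.
void relax_commits(PGconn *conn) {
    PQclear(PQexec(conn, "SET synchronous_commit = off"));
}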

One very good option is to insert your new data into UNLOGGED tables (PostgreSQL 9.1 or newer) or TEMPORARY tables (all versions, but lost when the session disconnects), then at the end of the process copy all the new rows into the main tables and drop the import tables with commands like:

INSERT INTO the_table
SELECT * FROM the_table_import;

When doing this, CREATE TABLE ... (LIKE ...) is useful for giving the import table the same structure as its target.
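Put together, the import-table lifecycle could look like this (a sketch reusing the_table from above; UNLOGGED needs 9.1+):

#include <libpq-fe.h>

// Stage rows in an UNLOGGED table (no WAL overhead), then move them
// into the real table in one statement and drop the staging table.
void run_import(PGconn *conn) {
    PQclear(PQexec(conn,
        "CREATE UNLOGGED TABLE the_table_import "
        "(LIKE the_table INCLUDING DEFAULTS)"));

    /* ... bulk-insert all the new rows into the_table_import ... */

    PQclear(PQexec(conn,
        "INSERT INTO the_table SELECT * FROM the_table_import"));
    PQclear(PQexec(conn, "DROP TABLE the_table_import"));
}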

Another option - really a more extreme version of the above - is to write your results to CSV flat files as you read and convert them, then COPY them into the database. Since you're working in C++, I'm assuming you're using libpq - in which case you're hopefully also using libpqtypes. libpq offers access to the COPY API for bulk loading, so your app wouldn't need to call out to psql to load the CSV data once it had produced it.
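A sketch of that COPY path through plain libpq (moves_import and the CSV layout are illustrative, and error handling is minimal):

#include <libpq-fe.h>
#include <string>
#include <vector>

// Stream already-formatted CSV rows straight to the server through
// libpq's COPY API; no intermediate file or psql call needed.
bool copy_rows(PGconn *conn, const std::vector<std::string> &csvRows) {
    PGresult *res = PQexec(conn,
        "COPY moves_import FROM STDIN WITH (FORMAT csv)");
    if (PQresultStatus(res) != PGRES_COPY_IN) { PQclear(res); return false; }
    PQclear(res);

    for (const std::string &row : csvRows)      // each row ends in '\n'
        if (PQputCopyData(conn, row.c_str(), (int)row.size()) != 1)
            return false;

    if (PQputCopyEnd(conn, NULL) != 1)          // NULL: no client-side error
        return false;

    res = PQgetResult(conn);                    // outcome of the COPY
    bool ok = (PQresultStatus(res) == PGRES_COMMAND_OK);
    PQclear(res);
    return ok;
}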

