Exporting / Backing up the object store

I recently found out about Project Darkstar, and I’m interested in trying out a few MMO projects that use it. However, I have some questions related to the Object Store:

  1. From what I understand, the entire object store is kept in memory and persisted across the cluster using a distributed database backend that isn’t directly accessible through the API. If I needed to export some or all of the data currently in the object store (e.g., as a DB dump or an XML file), what would be the best way to do this?

  2. Is there an easy way to back up the database?

  3. It seems that every time I make a change to a GLO, and an old instance of that GLO is stored in the object store, I have to clear the store before starting up the server again. This means that with a patch to a server, all player data (players’ locations, inventories, stats, etc.) would be lost. What would be the best way around this? Should we store the player data separately, so the entire object store can be rebuilt when it starts up?

Thanks.

We are designing the export/import facilities for the big back-end right now. The idea we are currently looking at is that you will be able to do a live export to an external RDBMS in a known schema, work on that data with SQL, and then, if you wish, import part or all of it back.

The big back-end will support multiple levels of backup, from fail-over nodes to live snapshots for offsite backup.

The SDK uses Derby for the persistence layer, so you can back it up however you would back up any Derby database. The simplest approach is to recursively copy the persistant_store directory it creates. I’m not sure, however, whether Derby guarantees that directory is always in a referentially consistent state, or whether you would have to stop the games to be sure. Frankly, robust backup is not a big SDK priority, since the license does not allow you to use it in a production environment.
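For what it’s worth, a minimal sketch of that directory copy in plain Java (the source and destination paths are assumptions, and per the caveat above it is safest to stop the server before copying):

```java
import java.io.*;

public class StoreBackup {

    /** Recursively copies a file or an entire directory tree. */
    static void copy(File src, File dst) throws IOException {
        if (src.isDirectory()) {
            if (!dst.exists() && !dst.mkdirs()) {
                throw new IOException("could not create " + dst);
            }
            String[] children = src.list();
            if (children == null) {
                throw new IOException("could not list " + src);
            }
            for (String child : children) {
                copy(new File(src, child), new File(dst, child));
            }
        } else {
            FileInputStream in = new FileInputStream(src);
            try {
                FileOutputStream out = new FileOutputStream(dst);
                try {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) > 0) {
                        out.write(buf, 0, n);
                    }
                } finally {
                    out.close();
                }
            } finally {
                in.close();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Paths are assumptions; adjust them to your installation.
        copy(new File("persistant_store"), new File("persistant_store-backup"));
    }
}
```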

Set your serialVersionUID manually and you will not have this problem. Eclipse will do this for you with a click.
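For example, a minimal sketch (the PlayerState GLO and its fields are hypothetical; the point is only the explicitly declared serialVersionUID):

```java
import java.io.Serializable;

// Hypothetical player GLO; only the explicit serialVersionUID matters here.
public class PlayerState implements Serializable {
    // Fixed by hand, so the serialized form stays compatible across
    // recompiles and (compatible) code changes.
    private static final long serialVersionUID = 1L;

    private String name;
    private int x, y;  // player location

    // New fields can be added later; old stored instances will simply
    // deserialize with those fields left at their default values.
}
```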

Regarding restoring GLOs whose classes have changed: default deserialization with serialVersionUID set should work in most cases where there are only minor changes to the GLO’s data (no changes to existing fields, only new ones added). However, what if you make more complex changes to the GLO, such as deleting fields, changing their names or types, moving the GLO to a different package, etc.?

It seems like there would be some cases where the underlying data structure has changed enough that standard Java serialization/deserialization alone wouldn’t be enough to avoid clearing the persistent store. I was thinking that using Externalizable instead of Serializable and checking for old versions of GLOs to convert would solve most of the problems, but it doesn’t solve everything, such as moving a class to a different package or restructuring GLOs.
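A minimal sketch of that Externalizable idea, reusing the hypothetical PlayerState from above: write an explicit format version first, then branch on it when reading back old data. Everything here beyond the java.io API is an assumption:

```java
import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;

// Hypothetical GLO that versions its own serialized form.
public class PlayerState implements Externalizable {
    private static final int FORMAT_VERSION = 2;

    private String name;
    private int x, y;
    private int hitPoints;  // added in format version 2

    public PlayerState() {}  // public no-arg constructor required by Externalizable

    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeInt(FORMAT_VERSION);  // always write the current format
        out.writeUTF(name);
        out.writeInt(x);
        out.writeInt(y);
        out.writeInt(hitPoints);
    }

    public void readExternal(ObjectInput in)
            throws IOException, ClassNotFoundException {
        int version = in.readInt();
        name = in.readUTF();
        x = in.readInt();
        y = in.readInt();
        // Records written before version 2 have no hitPoints field,
        // so convert them by picking a sensible default.
        hitPoints = (version >= 2) ? in.readInt() : 100;
    }
}
```

As noted above, this handles field-level evolution but not a class moving to a different package, since the stored class descriptor still names the old class.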

If something like that is needed, would there be a way to manually convert the GLOs in the persistent store?

I believe you should be able to delete fields; the values will just get dropped on deserialization. I’ll run a quick test of that assumption today.

If you are making a change such as moving packages, I guess my first impulse would be to do one of two things:

(1) Write an in-system script to walk the old data structures and populate the new ones (a rough sketch follows this list).
(2) Externalize to an RDBMS, do your conversion via SQL scripts there, and then import it back.
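A rough sketch of option (1). The ObjectStore interface and its allIds/load/store methods are hypothetical stand-ins, not real SDK API; the point is only the shape of the walk: load each old-format GLO, build its replacement, and store it back under the same identity:

```java
import java.io.Serializable;

public class GloMigration {

    // Hypothetical stand-in for whatever raw access the store gives you.
    interface ObjectStore {
        Iterable<Long> allIds();
        Serializable load(long id);
        void store(long id, Serializable glo);
    }

    // Old and new shapes of the same GLO after a refactoring.
    static class OldPlayer implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        int x, y;
    }

    static class Point implements Serializable {
        private static final long serialVersionUID = 1L;
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    static class NewPlayer implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        Point location;  // x/y folded into a value object
    }

    /** Walks every stored object, converting old-format players in place. */
    static void migrate(ObjectStore store) {
        for (long id : store.allIds()) {
            Serializable glo = store.load(id);
            if (glo instanceof OldPlayer) {
                OldPlayer old = (OldPlayer) glo;
                NewPlayer converted = new NewPlayer();
                converted.name = old.name;
                converted.location = new Point(old.x, old.y);
                store.store(id, converted);  // same identity, new structure
            }
        }
    }
}
```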

I’ll think on it some more, but I’m not sure off the top of my head whether there is any more convenient solution for a problem as general as “any change to the program structure”…

Yup, deleting a field works just fine.

In general, as long as serialVersionUID is manually set, you should be able to change the code pretty freely, short of changing the package or making similarly structural changes.

Cool, thanks for the info.