DynamoDB: Changing table schema

I’ve seen many situations where a team realises the initial key schema they selected for a table isn’t a good fit: for performance reasons, for scale reasons, or simply because they recognise a need for a Local Secondary Index (which can only be created when the table is created).

Global Secondary Indexes let you index on an alternative Partition Key / Sort Key, but this isn’t always ideal: a GSI carries additional storage and capacity costs, and it only supports eventually consistent reads.

In this situation, you can leverage DynamoDB Streams and AWS Lambda to remodel your data into a new table with the key structure you actually need, following these steps:

  1. Create a new table with the desired key structure.
  2. Enable DynamoDB Streams on the existing table.
  3. Attach a Lambda function to the stream that writes each record into the new table, trimming off the `Migrated` attribute added in the next step.
  4. Update every item in the existing table with a migration flag (e.g. `"Migrated": { "S": "0" }`), which publishes the item to the stream. Use the UpdateItem API rather than PutItem so no existing data is overwritten or lost.
  5. The Lambda picks up each touched item, strips the flag off, and pushes it into the new table structure.
  6. Leave the trigger in place until all data is migrated and your code points at the new table.

The great part about DynamoDB Streams is that it preserves the order of changes for each item and delivers every change at least once, which means no updates are lost during the migration.

I’ve used this a few times to great effect.

Hit me up if you would like more details. Good luck!
