0 votes

I am new to Redshift. Below is my process flow:

  1. Create CSV copies of my database (Postgres RDS) tables in S3.
  2. Create staging tables (for ETL purposes) in Redshift using CREATE TABLE from a SQL client connected to Redshift.
  3. Move data from S3 to Redshift using the COPY command (sketched below).
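
For reference, my COPY step looks roughly like this (the bucket path, table name, and IAM role ARN are placeholders):

    -- load the CSV export from S3 into the Redshift staging table
    copy staging_table
    from 's3://my-bucket/exports/my_table.csv'
    iam_role 'arn:aws:iam::123456789012:role/MyRedshiftRole'
    csv;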

Problem:

My staging table (which will be dropped after the ETL process, along with the other staging tables) has the same schema as the source table in RDS. But every time I build a new staging table I have to write out a long CREATE TABLE statement, which becomes frustrating for tables with hundreds of columns. Is there an easy way to copy the schema, or do I need to change something in my current process to make this easier?


2 Answers

1 vote

We use Redshift's CREATE TABLE ... LIKE syntax: create table staging_table (like current_table);. Its one shortcoming is that it doesn't inherit the primary key and foreign key attributes of current_table, but we can live with that.
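
For example (a minimal sketch; staging_table and current_table are placeholder names, and INCLUDING DEFAULTS is optional, for when you also want the parent's column default values):

    -- creates an empty table with the same columns, data types, and
    -- NOT NULL settings as current_table; per the Redshift docs it
    -- also inherits the encodings, distribution key, and sort keys
    create table staging_table (like current_table including defaults);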

Look at the documentation for more details: http://docs.aws.amazon.com/redshift/latest/dg/performing-a-deep-copy.html

0 votes

Instead of dropping the staging table, you could truncate it after each ETL run, and set up your vacuuming inline with the ETL process to keep your Redshift tables healthy and performing well. A minimal sketch follows.
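
Something like this (staging_table and target_table are placeholder names):

    -- empty the staging table instead of dropping it; note that
    -- TRUNCATE commits immediately in Redshift
    truncate table staging_table;

    -- re-sort rows and reclaim deleted space on the loaded table,
    -- then refresh the planner statistics
    vacuum target_table;
    analyze target_table;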