Snowflake – Create a table from a CSV file by placing it into an S3 bucket

1. Create the desired table with the proper schema
CREATE TABLE "DEV"."DWH"."TBL_NAME" ("SEGMENT_CNT" NUMBER, "SEGMENT_NAME" STRING, "SEGMENT_TYPE" STRING, "SEGMENT_CATEGORY" STRING, "NEED" STRING);

2. Copy the CSV file from your local machine to the desired S3 bucket. (I had to SSH into our EMR cluster to pick up the proper AWS credentials for this step, but if your AWS credentials are set up correctly on your local machine, you can run the aws s3 cp directly.)

scp /path/to/file.csv dev-emr:~/

aws s3 cp file.csv s3://bucket/file.csv --sse
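
To sanity-check that the upload succeeded before moving on, you can list the object (same bucket and key as above):

aws s3 ls s3://bucket/file.csv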

3. Use the COPY INTO command to load the file from an 'external stage' within SF into the table (note the direction: COPY INTO reads from the stage and writes into the table).
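
This assumes an external stage named EXT_STG_NAME pointing at the bucket already exists. If it doesn't, here is a minimal sketch of creating one; the URL, key ID, and secret key are placeholders, and the ENCRYPTION option mirrors the --sse flag from step 2 (a storage integration is the more common production setup):

CREATE OR REPLACE STAGE EXT_STG_NAME
  URL = 's3://bucket/'
  CREDENTIALS = (AWS_KEY_ID = '<key_id>' AWS_SECRET_KEY = '<secret_key>')
  ENCRYPTION = (TYPE = 'AWS_SSE_S3');

With the stage in place, the load itself looks like this: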

COPY INTO TBL_NAME
FROM (
    SELECT t.$1,
           t.$2,
           t.$3,
           t.$4,
           t.$5
    FROM '@EXT_STG_NAME/file.csv' t
)
FILE_FORMAT = (
  TYPE = CSV
  RECORD_DELIMITER = '\n'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"' -- if your fields are wrapped in double quotes
  TRIM_SPACE = TRUE
)
-- PATTERN = '.*\.csv' -- a regex; useful when loading multiple files from a stage path
FORCE = TRUE; -- reload the file even if it was loaded before
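
If you'd rather eyeball the staged data before running the load, you can also query the stage directly. This sketch assumes a named file format, MY_CSV_FORMAT, defined with the same options as the inline format above:

CREATE OR REPLACE FILE FORMAT MY_CSV_FORMAT
  TYPE = CSV
  RECORD_DELIMITER = '\n'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  TRIM_SPACE = TRUE;

SELECT t.$1, t.$2, t.$3, t.$4, t.$5
FROM @EXT_STG_NAME/file.csv (FILE_FORMAT => 'MY_CSV_FORMAT') t
LIMIT 10;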

4. Now you can check whether the file loaded properly into the table you created

SELECT * FROM TBL_NAME LIMIT 100;
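
You can also verify the load at the metadata level with Snowflake's COPY_HISTORY table function (the one-hour lookback window here is arbitrary):

SELECT *
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'TBL_NAME',
  START_TIME => DATEADD(hour, -1, CURRENT_TIMESTAMP())
));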
