Load a CSV file from S3 to DynamoDB

Dipali Kulshrestha
1 min read · May 26, 2021


1. Create IAM Policy with the following configuration (a boto3 sketch follows this list):

S3: All actions, Resources: All
Add additional permissions:
DynamoDB: All actions, Resources: All
CloudWatch Logs: All actions, Resources: All
Policy name: lambda_csv_reader
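
If you prefer to create the policy programmatically, a minimal boto3 sketch along these lines should match the console configuration above. The wide-open "Resource": "*" grants mirror the console choices and should be scoped down outside of a training setup.

import json
import boto3

iam = boto3.client('iam')

# Broad permissions mirroring the console configuration above;
# tighten Action/Resource for anything beyond a training exercise.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:*", "Resource": "*"},
        {"Effect": "Allow", "Action": "dynamodb:*", "Resource": "*"},
        {"Effect": "Allow", "Action": "logs:*", "Resource": "*"}
    ]
}

iam.create_policy(
    PolicyName='lambda_csv_reader',
    PolicyDocument=json.dumps(policy_document)
)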

2. Create IAM Role to attach this policy, with the following configuration (a boto3 sketch follows this list):

Trusted service: Lambda
Attach policies: lambda_csv_reader, AWSLambdaBasicExecutionRole
Role name: role_lambda_csv_reader
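
The same role can be created with boto3. The sketch below assumes you substitute your own account id for the ACCOUNT_ID placeholder in the custom policy ARN.

import json
import boto3

iam = boto3.client('iam')

# Trust policy that lets the Lambda service assume this role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='role_lambda_csv_reader',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# Attach the AWS-managed basic execution policy and the custom policy
iam.attach_role_policy(
    RoleName='role_lambda_csv_reader',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
)
iam.attach_role_policy(
    RoleName='role_lambda_csv_reader',
    PolicyArn='arn:aws:iam::ACCOUNT_ID:policy/lambda_csv_reader'  # placeholder account id
)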

3. Create Lambda function:
Author from scratch
Function name: lambda_csv_reader
Runtime: Python 3.x (the Python 2.7 runtime has since been retired by Lambda)

Add trigger: S3 (a scripted version of this wiring follows the list)
Bucket: awscptraining1
Event type: All object create events
Suffix: .csv
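
When scripting the trigger instead of using the console, S3 must first be granted permission to invoke the function; the console does this step implicitly. In the sketch below, REGION and ACCOUNT_ID are placeholders for your own values.

import boto3

lam = boto3.client('lambda')
s3 = boto3.client('s3')

# Placeholder ARN; substitute your region and account id
function_arn = 'arn:aws:lambda:REGION:ACCOUNT_ID:function:lambda_csv_reader'

# Allow S3 to invoke the function (done implicitly by the console)
lam.add_permission(
    FunctionName='lambda_csv_reader',
    StatementId='s3-invoke-csv',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::awscptraining1'
)

# Invoke the function for every object-created event with a .csv suffix
s3.put_bucket_notification_configuration(
    Bucket='awscptraining1',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': function_arn,
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'suffix', 'Value': '.csv'}
            ]}}
        }]
    }
)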

4. Create table in DynamoDB (a create_table sketch follows this list)
Table name: result_data
Partition key: registration_id
Sort key: name
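
Programmatically, the table can be created with a call like the one below; BillingMode is my assumption, since the post doesn't specify capacity settings.

import boto3

dynamodb = boto3.client('dynamodb')

dynamodb.create_table(
    TableName='result_data',
    AttributeDefinitions=[
        {'AttributeName': 'registration_id', 'AttributeType': 'S'},
        {'AttributeName': 'name', 'AttributeType': 'S'}
    ],
    KeySchema=[
        {'AttributeName': 'registration_id', 'KeyType': 'HASH'},  # partition key
        {'AttributeName': 'name', 'KeyType': 'RANGE'}             # sort key
    ],
    BillingMode='PAY_PER_REQUEST'  # assumption: on-demand capacity
)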

5. Data file: Final_results_data.csv
registration_id,name
EI00001,Name1
EI00002,Name2
EI00003,Name3
EI00004,Name4
EP00001,Name5

6. Add code to the Lambda function

import boto3

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def lambda_handler(event, context):
    # Bucket and key of the object that fired the event
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']

    obj = s3.get_object(Bucket=bucket, Key=key)

    # Decode the object body and split it into lines
    rows = obj['Body'].read().decode('utf-8').split('\n')

    table = dynamodb.Table('result_data')

    # batch_writer buffers put_item calls and sends them in batches
    with table.batch_writer() as batch:
        for row in rows[1:]:  # skip the header row
            if not row.strip():
                continue  # skip blank lines, e.g. a trailing newline
            registration_id, name = row.split(',')
            batch.put_item(Item={
                'registration_id': registration_id,
                'name': name
            })
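
To smoke-test the handler without going through the console, you can call it locally with a hand-built event shaped like a real S3 notification (only the fields the handler reads are included). This assumes AWS credentials with access to the bucket and table are configured locally, and that the CSV object already exists in the bucket.

# Hypothetical local smoke test; not part of the deployed function
test_event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'awscptraining1'},
            'object': {'key': 'Final_results_data.csv'}
        }
    }]
}
lambda_handler(test_event, None)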

7. Upload the CSV file to the bucket and verify the flow (a quick verification sketch follows).
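
One quick way to verify is to scan the table after the upload; a scan is fine for a five-row test table, though it shouldn't be used on large ones.

import boto3

table = boto3.resource('dynamodb').Table('result_data')

# Print every item written by the Lambda function
response = table.scan()
for item in response['Items']:
    print(item['registration_id'], item['name'])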
