Load a CSV file from S3 to DynamoDB with Python 3.9

Dipali Kulshrestha
1 min read · Jul 27, 2022


1. Create an IAM policy with the following configuration:

S3: All actions, All resources
Add additional permissions:
DynamoDB: All actions, All resources
CloudWatch Logs: All actions, All resources
Policy name: lambda_csv_reader
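In JSON form, the policy above is roughly the following (deliberately broad for a demo; in real use, scope the actions and resources down):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:*", "dynamodb:*", "logs:*"],
      "Resource": "*"
    }
  ]
}
```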

2. Create an IAM role with the following configuration:

Create a role to attach this policy
Service: Lambda
Attach policies: lambda_csv_reader, AWSLambdaBasicExecutionRole
Role name: role_lambda_csv_reader

3. Create an S3 bucket (you may use an existing bucket)

4. Create a table in DynamoDB

Table name: user
Partition key: id
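The table can also be created from code instead of the console. A minimal sketch of the table definition, assuming a string partition key and on-demand billing (my assumptions — the article only specifies the table name and key name):

```python
def user_table_spec(table_name="user"):
    # Matches the console setup above: partition key "id".
    # The string ("S") key type and on-demand billing are assumptions.
    return {
        "TableName": table_name,
        "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
        "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
        "BillingMode": "PAY_PER_REQUEST",
    }
```

Pass the spec to boto3, e.g. `boto3.client("dynamodb").create_table(**user_table_spec())`.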

5. Create a Lambda function:

Author from scratch
Function name: lambda_csv_reader
Runtime: Python 3.9

Attach the role created in step 2.

Add an S3 trigger:

Bucket: <your bucket>, e.g. awscptraining1
Event type: PUT
Suffix: .csv

6. Create a data file, user.csv:
id,name,salary
1,emp1,10000
2,Emp2,20000
3,Emp3,40000
4,Emp4,50000

7. Add code to the Lambda function
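The article does not include the function body, so here is a minimal sketch of what the handler might look like. The table name `user` and the CSV columns come from the steps above; the `csv_to_items` helper name is my own:

```python
import csv
import io
import urllib.parse

try:
    import boto3  # preinstalled in the AWS Lambda Python runtime
except ImportError:  # lets the parsing helper run outside Lambda
    boto3 = None


def csv_to_items(body):
    """Parse CSV text with an id,name,salary header into a list of dicts.

    Values are kept as strings; convert salary to int if you want a
    DynamoDB Number attribute instead.
    """
    return list(csv.DictReader(io.StringIO(body)))


def lambda_handler(event, context):
    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("user")

    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # batch_writer batches PutItem calls and retries unprocessed items.
        with table.batch_writer() as writer:
            for item in csv_to_items(body):
                writer.put_item(Item=item)

    return {"statusCode": 200}
```

Each uploaded .csv triggers one invocation; every row becomes one item keyed by `id`, so re-uploading the same file simply overwrites the same items.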

8. Upload the file and verify the data in the table
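Besides eyeballing the console, you can verify from code. A sketch (the helper names are mine; the boto3 import is deferred so the comparison helper also runs without AWS credentials):

```python
def rows_from_items(items):
    # Flatten DynamoDB items (as returned by Table.scan) into sorted
    # (id, name, salary) tuples for easy comparison with the source CSV.
    return sorted((i["id"], i["name"], i["salary"]) for i in items)


def scan_user_table(table_name="user"):
    import boto3  # deferred: only needed when actually scanning the table

    table = boto3.resource("dynamodb").Table(table_name)
    return rows_from_items(table.scan()["Items"])
```

Note that Scan paginates at 1 MB; for this four-row demo a single call is enough.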
