Load a CSV file from S3 into DynamoDB with Python 3.9
1. Create IAM Policy — as per the following configuration
S3: All actions, All Resources
Add additional permissions:
DynamoDB: All actions, All Resources
CloudWatch Logs: All actions, All Resources
Policy Name: lambda_csv_reader
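The policy in step 1 can also be created programmatically instead of through the console. A minimal sketch with boto3, assuming the names above (the wildcard S3/DynamoDB/Logs permissions match the lab configuration and are far broader than you would use in production):

```python
import json

def build_policy_document():
    # Step 1's configuration: full access to S3, DynamoDB, and CloudWatch
    # Logs. Scope Action/Resource down for anything beyond a lab exercise.
    return {
        "Version": "2012-10-17",
        "Statement": [
            {"Effect": "Allow", "Action": "s3:*", "Resource": "*"},
            {"Effect": "Allow", "Action": "dynamodb:*", "Resource": "*"},
            {"Effect": "Allow", "Action": "logs:*", "Resource": "*"},
        ],
    }

def create_policy():
    import boto3  # imported here; requires AWS credentials to actually run
    boto3.client("iam").create_policy(
        PolicyName="lambda_csv_reader",
        PolicyDocument=json.dumps(build_policy_document()),
    )
```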
2. Create IAM Role — as per the given configuration
Create Role: to attach the policy above
Service: Lambda
Attach Policies: lambda_csv_reader, AWSLambdaBasicExecutionRole
Role Name: role_lambda_csv_reader
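When you select Lambda as the service in the console, AWS generates the role's trust policy for you. If you create the role with boto3 instead, you supply that trust policy yourself; a sketch assuming the role name from step 2:

```python
import json

def build_trust_policy():
    # Trust policy letting the Lambda service assume this role,
    # equivalent to choosing Service: Lambda in the console.
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

def create_role():
    import boto3  # imported here; requires AWS credentials to actually run
    iam = boto3.client("iam")
    iam.create_role(
        RoleName="role_lambda_csv_reader",
        AssumeRolePolicyDocument=json.dumps(build_trust_policy()),
    )
    # Attach the managed AWSLambdaBasicExecutionRole policy from step 2;
    # attach lambda_csv_reader the same way using its ARN.
    iam.attach_role_policy(
        RoleName="role_lambda_csv_reader",
        PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
    )
```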
3. Create an S3 bucket (you may use an existing bucket)
4. Create a table in DynamoDB
table_name: user
Partition key: id
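The table in step 4 can be created from code as well; a sketch assuming the table name and partition key above, with the key typed as a string to match the CSV data (PAY_PER_REQUEST billing is an assumption — the console default is provisioned capacity):

```python
def build_table_spec(table_name="user"):
    # Step 4's table: partition (HASH) key "id", stored as a string.
    return {
        "TableName": table_name,
        "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
        "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
        "BillingMode": "PAY_PER_REQUEST",  # assumption; on-demand billing
    }

def create_table():
    import boto3  # imported here; requires AWS credentials to actually run
    client = boto3.client("dynamodb")
    client.create_table(**build_table_spec())
    # Block until the table is ACTIVE before wiring up the Lambda.
    client.get_waiter("table_exists").wait(TableName="user")
```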
5. Create a Lambda function:
Author from scratch
Function name: lambda_csv_reader
Runtime: Python 3.9
Attach the role created in step 2
Add trigger: S3
Bucket: <> e.g. awscptraining1
Event type: PUT
Suffix: .csv
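Under the hood, the S3 trigger in step 5 is a bucket notification configuration. A sketch of what the console sets up, assuming the function's ARN is passed in (the console also adds the resource-based permission that lets S3 invoke the function; via the API you would call lambda add_permission yourself first):

```python
def build_notification_config(function_arn):
    # Step 5's trigger: invoke the Lambda on PUT of objects ending in .csv.
    # function_arn is a placeholder for the ARN of lambda_csv_reader.
    return {
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": function_arn,
            "Events": ["s3:ObjectCreated:Put"],
            "Filter": {"Key": {"FilterRules": [
                {"Name": "suffix", "Value": ".csv"},
            ]}},
        }]
    }

def add_trigger(bucket, function_arn):
    import boto3  # imported here; requires AWS credentials to actually run
    boto3.client("s3").put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=build_notification_config(function_arn),
    )
```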
6. Create a data file: user.csv
id,name,salary
1,emp1,10000
2,Emp2,20000
3,Emp3,40000
4,Emp4,50000
7. Add code to the Lambda function
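Step 7 leaves the function body unspecified; here is a minimal handler sketch, assuming the table name user from step 4 and the id,name,salary header from step 6 (values are stored as strings; convert salary with int() if you prefer numbers):

```python
import csv
import urllib.parse

def parse_rows(body):
    # Turn CSV text with an id,name,salary header into DynamoDB items.
    reader = csv.DictReader(body.splitlines())
    return [{"id": r["id"], "name": r["name"], "salary": r["salary"]}
            for r in reader]

def lambda_handler(event, context):
    import boto3  # preinstalled in the Lambda Python 3.9 runtime
    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("user")  # table from step 4
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in event payloads (spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        with table.batch_writer() as batch:  # batches the PutItem calls
            for item in parse_rows(body):
                batch.put_item(Item=item)
    return {"statusCode": 200}
```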
8. Upload the file and verify the data in the table
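Besides eyeballing the table in the console, the check in step 8 can be scripted; a sketch that scans the user table for the four ids from user.csv (a single Scan call is fine here; it paginates only for much larger tables):

```python
def ids_present(items, expected_ids):
    # Pure check: are all expected ids among the scanned items?
    found = {item["id"] for item in items}
    return set(expected_ids) <= found

def verify_table(table_name="user"):
    import boto3  # imported here; requires AWS credentials to actually run
    items = boto3.resource("dynamodb").Table(table_name).scan()["Items"]
    return ids_present(items, {"1", "2", "3", "4"})  # ids from user.csv
```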