Load a CSV file from S3 to DynamoDB
1. Create IAM Policy — as per the following configuration (a boto3 sketch of the same policy follows this step)
S3: All actions, Resources: All
Add additional permissions:
DynamoDB: All actions, Resources: All
CloudWatch Logs: All actions, Resources: All
Policy name: lambda_csv_reader
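Optionally, the policy above can be created with boto3 instead of the console. This is a minimal sketch, assuming AWS credentials with IAM access are already configured locally; the wildcard actions and resources simply mirror the training configuration above.

import json
import boto3

iam = boto3.client('iam')

# Wildcard actions/resources mirror the training configuration above
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:*", "Resource": "*"},
        {"Effect": "Allow", "Action": "dynamodb:*", "Resource": "*"},
        {"Effect": "Allow", "Action": "logs:*", "Resource": "*"}
    ]
}

response = iam.create_policy(
    PolicyName='lambda_csv_reader',
    PolicyDocument=json.dumps(policy_document)
)
print(response['Policy']['Arn'])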
2. Create IAM Role — as per the following configuration (a boto3 sketch follows this step)
Create a role to attach this policy to
Service: Lambda
Attach policies: lambda_csv_reader, AWSLambdaBasicExecutionRole
Role name: role_lambda_csv_reader
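A boto3 sketch of the same role setup, assuming the policy from step 1 already exists in your account:

import json
import boto3

iam = boto3.client('iam')
account_id = boto3.client('sts').get_caller_identity()['Account']

# Trust policy that lets the Lambda service assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='role_lambda_csv_reader',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# Attach the custom policy from step 1 and the AWS-managed basic execution policy
iam.attach_role_policy(
    RoleName='role_lambda_csv_reader',
    PolicyArn='arn:aws:iam::{}:policy/lambda_csv_reader'.format(account_id)
)
iam.attach_role_policy(
    RoleName='role_lambda_csv_reader',
    PolicyArn='arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
)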
3. Create Lambda function (a boto3 sketch follows this step):
Author from scratch
Function name: lambda_csv_reader
Runtime: Python 2.7
Add trigger: S3
Bucket: awscptraining1
Event type: All object create events
Suffix: .csv
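A boto3 sketch of the same function and trigger setup. It assumes the handler code from step 6 is saved as lambda_function.py and zipped locally as lambda_csv_reader.zip (both names are assumptions), and that the role from step 2 has finished propagating. The python2.7 runtime string follows the step above; newer accounts will need a currently supported Python 3 runtime.

import boto3

lambda_client = boto3.client('lambda')
s3 = boto3.client('s3')
account_id = boto3.client('sts').get_caller_identity()['Account']

# Zip archive containing the handler code from step 6
with open('lambda_csv_reader.zip', 'rb') as f:
    zipped_code = f.read()

fn = lambda_client.create_function(
    FunctionName='lambda_csv_reader',
    Runtime='python2.7',   # use a supported python3.x runtime on newer accounts
    Role='arn:aws:iam::{}:role/role_lambda_csv_reader'.format(account_id),
    Handler='lambda_function.lambda_handler',
    Code={'ZipFile': zipped_code},
)

# Let the bucket invoke the function, then register the .csv notification
lambda_client.add_permission(
    FunctionName='lambda_csv_reader',
    StatementId='s3-invoke-csv',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::awscptraining1',
)
s3.put_bucket_notification_configuration(
    Bucket='awscptraining1',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': fn['FunctionArn'],
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [{'Name': 'suffix', 'Value': '.csv'}]}},
        }]
    },
)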
4. Create table in DynamoDB (a boto3 sketch follows this step)
Table name: result_data
Partition key: registration_id
Sort key: name
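A boto3 sketch of the same table definition; the on-demand billing mode is an assumption, and the console default may differ.

import boto3

dynamodb = boto3.client('dynamodb')

dynamodb.create_table(
    TableName='result_data',
    KeySchema=[
        {'AttributeName': 'registration_id', 'KeyType': 'HASH'},   # partition key
        {'AttributeName': 'name', 'KeyType': 'RANGE'},             # sort key
    ],
    AttributeDefinitions=[
        {'AttributeName': 'registration_id', 'AttributeType': 'S'},
        {'AttributeName': 'name', 'AttributeType': 'S'},
    ],
    BillingMode='PAY_PER_REQUEST',   # billing mode is an assumption
)
dynamodb.get_waiter('table_exists').wait(TableName='result_data')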
5. Data file: Final_results_data.csv
registration_id,name
EI00001,Name1
EI00002,Name2
EI00003,Name3
EI00004,Name4
EP00001,Name5
6. Add code to the Lambda function (a local test sketch follows the code)
import boto3

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def lambda_handler(event, context):
    # Bucket and key come from the S3 event notification
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    # Fetch the CSV object and split it into lines
    obj = s3.get_object(Bucket=bucket, Key=key)
    rows = obj['Body'].read().decode('utf-8').split('\n')
    # Write every data row into the table in one batch
    table = dynamodb.Table('result_data')
    with table.batch_writer() as batch:
        for row in rows[1:]:        # skip the header row
            if not row.strip():     # skip blank/trailing lines
                continue
            registration_id, name = row.split(',')
            batch.put_item(Item={
                'registration_id': registration_id,
                'name': name
            })
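For a quick local smoke test before deploying, the handler can be called with a hand-built event shaped like an S3 put notification. This sketch assumes local AWS credentials, that the table from step 4 exists, and that the file is already in the bucket; the handler only reads the bucket name and object key from the event.

# Minimal event shaped like an S3 put notification (local test only)
test_event = {
    "Records": [{
        "s3": {
            "bucket": {"name": "awscptraining1"},
            "object": {"key": "Final_results_data.csv"}
        }
    }]
}

if __name__ == '__main__':
    lambda_handler(test_event, None)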
7. Upload the CSV file to the bucket and verify the flow (a boto3 sketch follows).
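A boto3 sketch of this verification step: upload the file, wait briefly for the trigger to fire, then scan the table. The ten-second wait is an arbitrary assumption; the CloudWatch Logs for lambda_csv_reader are the place to look if no items appear.

import time
import boto3

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

# Upload the data file; this should fire the S3 trigger on the bucket
s3.upload_file('Final_results_data.csv', 'awscptraining1', 'Final_results_data.csv')

# Give the Lambda invocation a few seconds, then read the table back
time.sleep(10)
items = dynamodb.Table('result_data').scan()['Items']
for item in items:
    print(item)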