aws-data-pipeline

AWS Data Pipeline: Issue with S3 access permissions for IAM role

Submitted by 梦想的初衷 on 2021-02-05 07:19:26
Question: I'm using the "Load S3 data into RDS MySQL table" template in AWS Data Pipeline to import CSVs from an S3 bucket into our RDS MySQL instance. However, I (an IAM user with full admin rights) run into a warning I can't resolve:

    Object:Ec2Instance - WARNING: Could not validate S3 Access for role. Please ensure role ('DataPipelineDefaultRole') has s3:Get*, s3:List*, s3:Put* and sts:AssumeRole permissions for DataPipeline.

Google told me not to use the default policies for the DataPipelineDefaultRole and …
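A minimal boto3 sketch of one way to grant what the warning asks for. The bucket name and inline policy name are placeholders, and note that sts:AssumeRole lives in the role's trust policy rather than in its permission policies, which is one reason attaching it as a plain permission often doesn't clear the warning:

    import json

    import boto3

    iam = boto3.client("iam")

    # Inline policy granting the S3 actions named in the warning.
    # "my-pipeline-bucket" is a placeholder bucket name.
    s3_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:Get*", "s3:List*", "s3:Put*"],
            "Resource": [
                "arn:aws:s3:::my-pipeline-bucket",
                "arn:aws:s3:::my-pipeline-bucket/*",
            ],
        }],
    }
    iam.put_role_policy(
        RoleName="DataPipelineDefaultRole",
        PolicyName="DataPipelineS3Access",  # hypothetical policy name
        PolicyDocument=json.dumps(s3_policy),
    )

    # sts:AssumeRole is a trust relationship, not a permission on the
    # role itself: the Data Pipeline service must be allowed to assume it.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": ["datapipeline.amazonaws.com",
                                      "elasticmapreduce.amazonaws.com"]},
            "Action": "sts:AssumeRole",
        }],
    }
    iam.update_assume_role_policy(
        RoleName="DataPipelineDefaultRole",
        PolicyDocument=json.dumps(trust_policy),
    )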

AWS Data Pipeline S3 CSV to DynamoDB JSON Error

Submitted by 大兔子大兔子 on 2020-07-23 07:17:29
Question: I'm trying to import several CSVs located in an S3 directory with AWS Data Pipeline, but I'm getting this error:

    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1844)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
    Caused by: com.google.gson.stream.MalformedJsonException: Expected ':' at line 1 column 10
    at com.google.gson.stream.JsonReader.syntaxError(JsonReader.java:1505)
    at com.google…
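One plausible reading of the trace, since Gson fails at line 1 column 10 of the input: the DynamoDB import job parses DynamoDB-export-style JSON (one typed item per line), not raw CSV, so it chokes on the first CSV row. A hedged sketch of a pre-conversion step, assuming string-only attributes and hypothetical file names (the exact type-descriptor format can depend on the connector version):

    import csv
    import json

    SRC = "items.csv"   # hypothetical local copy of the S3 CSV
    DST = "items.json"  # converted file to upload back to S3

    # Emit one JSON item per line, each attribute wrapped in a type
    # descriptor ("s" = string here); empty cells are skipped.
    with open(SRC, newline="") as src, open(DST, "w") as dst:
        for row in csv.DictReader(src):
            item = {col: {"s": val} for col, val in row.items() if val != ""}
            dst.write(json.dumps(item) + "\n")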

Permissions for creating and attaching an EBS volume to an EC2Resource in AWS Data Pipeline

Submitted by 纵然是瞬间 on 2020-02-25 01:23:57
Question: I need more local disk than is available to EC2Resources in an AWS Data Pipeline. The simplest solution seems to be to create and attach an EBS volume. I have added ec2:CreateVolume and ec2:AttachVolume policies to both DataPipelineDefaultRole and DataPipelineDefaultResourceRole. I have also tried setting AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY for an IAM role with the same permissions in the shell, but alas no luck. Is there some other permission needed, is it not using the roles it says it …
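Assuming the instance profile (DataPipelineDefaultResourceRole) is what actually needs ec2:CreateVolume and ec2:AttachVolume, here is a minimal boto3 sketch of what a shell activity on the EC2 resource could run once those permissions are in place; the size, volume type, and device name are illustrative:

    import json
    import urllib.request

    import boto3

    # Discover where we are from instance metadata (IMDSv1 for brevity).
    doc = json.load(urllib.request.urlopen(
        "http://169.254.169.254/latest/dynamic/instance-identity/document"))

    ec2 = boto3.client("ec2", region_name=doc["region"])

    # Create an extra 100 GiB volume in the instance's availability zone.
    vol = ec2.create_volume(AvailabilityZone=doc["availabilityZone"],
                            Size=100, VolumeType="gp2")
    ec2.get_waiter("volume_available").wait(VolumeIds=[vol["VolumeId"]])

    # Attach it; /dev/xvdf is a common Linux device name, not mandated.
    ec2.attach_volume(VolumeId=vol["VolumeId"],
                      InstanceId=doc["instanceId"],
                      Device="/dev/xvdf")

Note that boto3 on the instance picks up credentials from the instance profile automatically, so exporting AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the shell should not be necessary if DataPipelineDefaultResourceRole carries the permissions.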
