[ https://issues.apache.org/jira/browse/SPARK-10969?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15035873#comment-15035873 ]
Christoph Pirkl commented on SPARK-10969:
-----------------------------------------

While this commit is useful, it does not fix this issue. To fix it, it would be best to introduce a serializable parameter object that contains AWS credentials for Kinesis, DynamoDB and CloudWatch (and possibly other values). This would make it easier to add more parameters later.

> Spark Streaming Kinesis: Allow specifying separate credentials for Kinesis
> and DynamoDB
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-10969
>                 URL: https://issues.apache.org/jira/browse/SPARK-10969
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.5.1
>            Reporter: Christoph Pirkl
>            Priority: Critical
>
> {{KinesisUtils.createStream()}} allows specifying only one set of AWS
> credentials, which the Amazon KCL then uses for accessing Kinesis, DynamoDB
> and CloudWatch.
>
> h5. Motivation
>
> In a scenario where one needs to read from a Kinesis stream owned by a
> different AWS account, the user usually has minimal rights (i.e. only read
> access to the stream). In this case, creating the DynamoDB table in the KCL
> will fail.
>
> h5. Proposal
>
> My proposed solution would be to allow specifying separate credentials in
> {{KinesisUtils.createStream()}} for Kinesis, DynamoDB and CloudWatch. The
> additional credentials could then be passed to the constructor of
> {{KinesisClientLibConfiguration}} or to the method
> {{KinesisClientLibConfiguration.withDynamoDBClientConfig()}}.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
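The serializable parameter object suggested in the comment could be sketched as below. This is only an illustration: the names {{SerializableAWSCredentials}} and {{KinesisCredentialsConfig}} are assumptions, not actual Spark API, and real credentials handling would wrap the AWS SDK's credential providers rather than raw key strings.

```scala
// Hypothetical sketch of the proposed parameter object; names are
// illustrative, not part of the Spark or KCL API.

// A Serializable holder for one set of AWS credentials. A wrapper like
// this is needed because the AWS SDK's credential providers are not
// themselves serializable, and Spark must ship configuration to executors.
case class SerializableAWSCredentials(accessKeyId: String, secretKey: String)
  extends Serializable

// One default credential set for Kinesis plus optional overrides for
// DynamoDB and CloudWatch. Supporting another service later only means
// adding another optional field, keeping createStream()'s signature stable.
case class KinesisCredentialsConfig(
    kinesis: SerializableAWSCredentials,
    dynamoDB: Option[SerializableAWSCredentials] = None,
    cloudWatch: Option[SerializableAWSCredentials] = None)
  extends Serializable {

  // Fall back to the Kinesis credentials when no override is given.
  def dynamoDBCredentials: SerializableAWSCredentials =
    dynamoDB.getOrElse(kinesis)
  def cloudWatchCredentials: SerializableAWSCredentials =
    cloudWatch.getOrElse(kinesis)
}

// Usage: cross-account read, with a separate identity that is allowed
// to create the KCL checkpoint table in DynamoDB.
val config = KinesisCredentialsConfig(
  kinesis = SerializableAWSCredentials("streamReadKey", "streamReadSecret"),
  dynamoDB = Some(SerializableAWSCredentials("checkpointKey", "checkpointSecret")))
```

The three accessors map naturally onto the KCL constructor overload that accepts distinct credential providers for Kinesis, DynamoDB and CloudWatch, which is where such a parameter object would ultimately be consumed.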