[ https://issues.apache.org/jira/browse/BAHIR-122?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106004#comment-16106004 ]
ASF GitHub Bot commented on BAHIR-122:
--------------------------------------
Github user ckadner commented on the issue:
https://github.com/apache/bahir/pull/48
@ire7715 -- I created a [Google API service
account](https://console.developers.google.com/iam-admin/serviceaccounts/project?project=apache-bahir-pubsub)
and [added the generated key
files](https://support.cloudbees.com/hc/en-us/articles/203802500-Injecting-Secrets-into-Jenkins-Build-Jobs)
to our Jenkins server. All of your tests are now [enabled and complete
successfully](http://169.45.79.58:8080/job/bahir_spark_pr_builder/95/).
```
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-streaming-pubsub_2.11 ---
Discovery starting.
Google Pub/Sub tests that actually send data has been enabled by setting
the environment variable ENABLE_PUBSUB_TESTS to 1.
This will create Pub/Sub Topics and Subscriptions in Google cloud platform.
Please be aware that this may incur some Google cloud costs.
Set the environment variable GCP_TEST_PROJECT_ID to the desired project.
Discovery completed in 135 milliseconds.
Run starting. Expected test count is: 10
SparkGCPCredentialsBuilderSuite:
- should build application default
- should build json service account
- should provide json creds
- should build p12 service account
- should provide p12 creds
- should build metadata service account
- SparkGCPCredentials classes should be serializable
Using project apache-bahir-pubsub for creating Pub/Sub topic and subscription for tests.
PubsubStreamSuite:
- PubsubUtils API
- pubsub input stream
- pubsub input stream, create pubsub
Run completed in 14 seconds, 143 milliseconds.
Total number of tests run: 10
Suites: completed 3, aborted 0
Tests: succeeded 10, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```
---
Would you **please add a short paragraph** to the [PubSub
README](https://github.com/apache/bahir/blob/master/streaming-pubsub/README.md)
describing how to enable your unit tests by setting the environment variables
(and how to set up a Google API *service account*, generate *key files*, and
minimally configure the *Roles*, e.g. "Pub/Sub Publisher")? For example:
```Bash
mvn clean package -DskipTests -pl streaming-pubsub
export ENABLE_PUBSUB_TESTS=1
export GCP_TEST_ACCOUNT="apache-bahir-streaming-pub...@apache-bahir-pubsub.iam.gserviceaccount.com"
export GCP_TEST_PROJECT_ID="apache-bahir-pubsub"
export GCP_TEST_JSON_KEY_PATH=/path/to/pubsub/credential/files/Apache-Bahir-PubSub-1234abcd.json
export GCP_TEST_P12_KEY_PATH=/path/to/pubsub/credential/files/Apache-Bahir-PubSub-5678efgh.p12
mvn test -pl streaming-pubsub
```
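For the README, the service-account setup itself could be sketched roughly as below. This is a hedged sketch, not the exact procedure used for this Jenkins job: the account name `bahir-pubsub-tests` is hypothetical, and `roles/pubsub.editor` is a guess at a minimal role that can both publish and create topics/subscriptions (the thread mentions "Pub/Sub Publisher", but the tests also create topics):

```Bash
# Sketch only -- assumes the Google Cloud SDK (gcloud) is installed and you
# are authenticated with permission to administer the project.
PROJECT="apache-bahir-pubsub"      # project ID from this thread
ACCOUNT_NAME="bahir-pubsub-tests"  # hypothetical service-account name

# Create the service account
gcloud iam service-accounts create "$ACCOUNT_NAME" --project "$PROJECT"

SA_EMAIL="$ACCOUNT_NAME@$PROJECT.iam.gserviceaccount.com"

# Grant a role that covers publishing plus topic/subscription creation
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member "serviceAccount:$SA_EMAIL" --role roles/pubsub.editor

# Generate the JSON and P12 key files referenced by the test env variables
gcloud iam service-accounts keys create key.json \
  --iam-account "$SA_EMAIL" --key-file-type json
gcloud iam service-accounts keys create key.p12 \
  --iam-account "$SA_EMAIL" --key-file-type p12
```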
**Thank you!**
> [PubSub] Make "ServiceAccountCredentials" really broadcastable
> --------------------------------------------------------------
>
> Key: BAHIR-122
> URL: https://issues.apache.org/jira/browse/BAHIR-122
> Project: Bahir
> Issue Type: Improvement
> Components: Spark Streaming Connectors
> Reporter: Ire Sun
>
> The original implementation broadcasts the key file *path* to the Spark
> cluster, and the executors then read the key file from that path. This is
> absurd: if you are using a shared Spark cluster in a group/company, you
> certainly do not want to (and have no rights to) put your key file on every
> instance of the cluster.
> If you store the key file on the driver node and submit your job to a remote
> cluster, you get the following warning:
> {{WARN ReceiverTracker: Error reported by receiver for stream 0: Failed to
> pull messages - java.io.FileNotFoundException}}
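The fix the issue asks for can be sketched as: read the key file's bytes once on the driver and make the byte array itself part of the serialized credentials, so executors reconstruct the credential from memory and never touch the local filesystem. This is an illustrative Scala sketch, not Bahir's actual API; `KeyBytesCredentials` and `fromFile` are hypothetical names:

```scala
import java.io.ByteArrayInputStream
import java.nio.file.{Files, Paths}

// Hypothetical sketch: ship the key *bytes*, not the key-file path.
// The byte array is serialized along with the receiver, so no key file
// needs to exist on any worker node.
case class KeyBytesCredentials(jsonKeyBytes: Array[Byte]) extends Serializable {
  // Called on the executor: rebuild the credential from the in-memory bytes.
  def keyStream: ByteArrayInputStream = new ByteArrayInputStream(jsonKeyBytes)
}

object KeyBytesCredentials {
  // Called on the driver, where the key file actually exists.
  def fromFile(path: String): KeyBytesCredentials =
    KeyBytesCredentials(Files.readAllBytes(Paths.get(path)))
}
```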
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)