[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-22 Thread witgo
Github user witgo closed the pull request at:

https://github.com/apache/spark/pull/357


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-13 Thread witgo
Github user witgo commented on the pull request:

https://github.com/apache/spark/pull/357#issuecomment-40301947
  
@srowen mind reviewing the PR?




[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-13 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/357#issuecomment-40304539
  
I myself don't agree with this change, no. See the discussion in 
https://issues.apache.org/jira/browse/SPARK-1441 . For example, I think you can 
merely build with the yarn-alpha profile to get the artifacts you want.




[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-13 Thread witgo
Github user witgo commented on the pull request:

https://github.com/apache/spark/pull/357#issuecomment-40304913
  
So, if someone compiles Spark with Hadoop 0.23.x, how can the following profile be activated automatically?
```xml
<profile>
  <id>yarn-alpha</id>
  <dependencies>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
    </dependency>
  </dependencies>
</profile>
```
Maven does not support an activation like this:
```xml
<activation>
  <property>
    <name>hadoop.version</name>
    <value>0.23.*</value>
  </property>
</activation>
```





[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-13 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/357#issuecomment-40305010
  
Right, but you can just write `-Pyarn-alpha` and set `hadoop.version` and 
`yarn.version` as you like. That gets what you need.
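
As a concrete sketch of that suggestion (the profile name is from the discussion above, but the version numbers here are only illustrative, not taken from the PR):

```shell
# Hypothetical invocation: 0.23.10 is an example Hadoop 0.23.x release,
# not a version specified anywhere in this thread.
mvn -Pyarn-alpha -Dhadoop.version=0.23.10 -Dyarn.version=0.23.10 -DskipTests package
```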

A better change would be to change automatically based on versions, but 
that's not what this PR does.

I agree that Maven does not support ranges on property values. It does, however, support ranges on the artifact versions that a property like this controls. So it may work to 'query' the version of `hadoop-client`, for example.
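
For reference, a Maven version range like `[0.23,0.24)` means 0.23 inclusive up to 0.24 exclusive. A rough sketch of that matching logic (a deliberate simplification of Maven's actual `VersionRange` handling, in Python purely for illustration):

```python
def parse_version(v):
    """Split a dotted version string into a tuple of ints, e.g. '0.23.10' -> (0, 23, 10)."""
    return tuple(int(p) for p in v.split("."))

def in_range(version, lower, upper):
    """Check lower <= version < upper, mimicking the '[lower,upper)' range syntax."""
    v = parse_version(version)
    return parse_version(lower) <= v < parse_version(upper)

# '[0.23,0.24)' would match any 0.23.x release but not 0.24.0
print(in_range("0.23.10", "0.23", "0.24"))  # True
print(in_range("0.24.0", "0.23", "0.24"))   # False
```

This only handles plain numeric versions; real Maven versions may carry qualifiers (e.g. `-SNAPSHOT`) that this sketch ignores.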




[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-13 Thread witgo
Github user witgo commented on the pull request:

https://github.com/apache/spark/pull/357#issuecomment-40305799
  
```xml
<activation>
  <property>
    <name>hadoop.version</name>
    <value>[0.23,0.24)</value>
  </property>
</activation>
```
It doesn't work; see 
[PropertyProfileActivator.java](https://github.com/apache/maven/blob/master/maven-model-builder/src/main/java/org/apache/maven/model/profile/activation/PropertyProfileActivator.java).
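
For comparison, what `<property>` activation does handle is a literal string match against a single value (or a `!value` negation); a sketch of the only form that would work, using a hypothetical exact version:

```xml
<!-- Property activation compares the literal string only; wildcards and
     ranges in <value> are not interpreted. 0.23.10 is just an example. -->
<activation>
  <property>
    <name>hadoop.version</name>
    <value>0.23.10</value>
  </property>
</activation>
```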




[GitHub] spark pull request: SPARK-1441: Compile Spark Core error with Hado...

2014-04-13 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/357#issuecomment-40305906
  
That's not quite what I mean. `hadoop.version` affects the version of the 
various artifacts in the build of course, like `hadoop-client`. You can express 
activations based on artifact versions, IIRC. So you might activate based on 
this *effect* of setting `hadoop.version` rather than the property itself. It 
might still not work, but worth a shot.

