Re: New idea & POC for a Kubernetes/Clojure interface

2018-05-22 Thread Punit Naik
Is there any documentation around spinning up a K8s cluster on Amazon EC2 
instances?

On Tuesday, November 7, 2017 at 8:34:35 AM UTC+5:30, Blake Miller wrote:
>
> Here's a little something I cooked up this weekend, to interact with a 
> Kubernetes cluster from Clojure:
>
> https://github.com/blak3mill3r/keenest-rube
>
> It abstracts away the K8s API calls completely. Instead, you get the state 
> of the cluster as a value in an atom. Changes to the state of the cluster 
> are streamed to the Clojure client, which keeps the value in the atom 
> current, and attempts to mutate the atom will cause one cluster resource 
> (one at a time) to be modified/created/destroyed appropriately. So far I'm 
> finding it to be a real pleasure to use compared to `kubectl` (giving it 
> hand-edited json or yaml files) or worse: the Dashboard (poking around at a 
> web app with a mouse).
>
> I guess I could've just tried the Python library, but where's the fun in 
> that?
>
> I feel like this could turn into a pretty powerful tool for ops work, or 
> even for adding abstractions to manage resources automatically. I've been 
> wanting to use Clojure (more) for infrastructure-automation/dev-ops ... but 
> this is one area where the tooling available is a bit lacking, IMO.
>
> I've been successfully toying around with this project and a real 
> Kubernetes cluster in an AWS VPC.
>
> I'd be glad to hear any thoughts on this idea. If you're into k8s, 
> please give it a whirl.
>
> This is total toy-status right now, by the way, it's just a 
> proof-of-concept. Oh, and it mixes nicely with `cider-enlighten-mode`, if 
> you're into that sort of thing. I went ahead and published it to Clojars.
>
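
A rough, hypothetical sketch of the workflow described above (this is not
keenest-rube's actual API, just the shape of the atom-based idea):

;; the library keeps this atom in sync with the cluster by watching the K8s API
(def cluster (atom {}))

;; reading cluster state is just reading the value
(get-in @cluster [:deployments "my-app" :spec :replicas])

;; expressing a change is an ordinary swap!; the library would translate the
;; diff into the corresponding create/modify/delete API call
(swap! cluster assoc-in [:deployments "my-app" :spec :replicas] 3)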



Re: `partition-all` issue

2016-12-21 Thread Punit Naik
Okay Alex, thanks.

On Wednesday, December 21, 2016 at 6:23:34 PM UTC+5:30, Alex Miller wrote:
>
> This has come up before (http://dev.clojure.org/jira/browse/CLJ-764) and 
> we decided this was the desired behavior, so no plans to change it.



`partition-all` issue

2016-12-21 Thread Punit Naik
Hi Guys

I recently came across this issue: when the user passes 0 as the `n` 
parameter to `partition` or `partition-all`, the code hangs indefinitely 
without throwing an error. I was using this function inside my Elasticsearch 
back-end API, and whenever this condition occurred my ES instance would get 
stuck and hit an OOM error that I could only fix by restarting the instance.

I know this is really an issue in the calling code, but I think it would be 
great if we at least threw an error when the user accidentally passes `n` as 
0. That way the system would not hang.
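
In the meantime, a small guard avoids the hang without touching core (just a
sketch of the idea):

(defn safe-partition-all
  "Like partition-all, but rejects n <= 0 instead of lazily producing an
  endless sequence of empty partitions."
  [n coll]
  {:pre [(pos? n)]}
  (partition-all n coll))

;; (safe-partition-all 2 [1 2 3 4 5])  ;=> ((1 2) (3 4) (5))
;; (safe-partition-all 0 [1 2 3])      ;=> AssertionError instead of hanging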

I was thinking of creating a PR for this but wanted an approval from you 
guys.



Re: Secondary Sorting in Spark using clojure/flambo

2016-07-11 Thread Punit Naik


So I finally figured it out on my own. I basically had to write my custom 
ordering function as a separate Scala project and then call it from Clojure.

My Scala file is written in this manner:


import org.apache.spark.Partitioner
import org.apache.spark.rdd.RDD

case class RFMCKey(cId: String, R: Double, F: Long, M: Double, C: Double)

class RFMCPartitioner(partitions: Int) extends Partitioner {
  require(partitions >= 0, "Number of partitions ($partitions) cannot be negative.")
  override def numPartitions: Int = partitions
  override def getPartition(key: Any): Int = {
    val k = key.asInstanceOf[RFMCKey]
    k.cId.hashCode() % numPartitions
  }
}

object RFMCKey {
  implicit def orderingBycId[A <: RFMCKey]: Ordering[A] = {
    Ordering.by(k => (k.R, k.F * -1, k.M * -1, k.C * -1))
  }
}

class rfmcSort {
  def sortWithRFMC(a: RDD[(String, (((Double, Long), Double), Double))], parts: Int): RDD[(RFMCKey, String)] = {
    val x = a.map(v => v match {
      case (custId, (((rVal, fVal), mVal), cVal)) =>
        (RFMCKey(custId, rVal, fVal, mVal, cVal), rVal + "," + fVal + "," + mVal + "," + cVal)
    }).repartitionAndSortWithinPartitions(new RFMCPartitioner(parts))
    x
  }
}

I compiled it as a Scala project and used it in my Clojure code this way:


(:import [org.formcept.wisdom rfmcSort]
         [org.apache.spark.rdd RDD])

sorted-rfmc-records (.toJavaRDD (.sortWithRFMC (rfmcSort.) (.rdd rfmc-records) num_partitions))

Please notice how I am calling the sortWithRFMC function on the rfmcSort 
instance that I created. Also, one very important thing to note: when you 
pass your JavaPairRDD to the Scala function, you have to convert it into a 
normal Spark RDD first by calling the .rdd method on it, and then convert 
the resulting Spark RDD back to a JavaRDD to work with it in Clojure.
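
Put together, the call site looks roughly like this (rfmc-records and
num_partitions are from my project and only shown for illustration):

(let [sorter (rfmcSort.)
      ;; JavaPairRDD -> plain Spark RDD before handing it to the Scala helper
      sorted-rdd (.sortWithRFMC sorter (.rdd rfmc-records) num_partitions)
      ;; ... and back to a JavaRDD so it can be used from Clojure/flambo again
      sorted-rfmc-records (.toJavaRDD sorted-rdd)]
  sorted-rfmc-records)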


And sorry that I got your name wrong *Blake :)

On Tuesday, July 12, 2016 at 12:16:42 AM UTC+5:30, Punit Naik wrote:
>
> Hi Black
>
> Thanks for the reply, but I figured it out on my own. Posting the answer 
> after this.
>
> On Monday, July 11, 2016 at 11:42:10 PM UTC+5:30, Blake Miller wrote:
>>
>> Hi Punit
>>
>> The behavior you are referring to is a feature of the Scala compiler, 
>> which is why it does not happen automatically when you try to use it from 
>> Clojure.
>>
>> Please see the note here:
>>
>>
>> https://github.com/t6/from-scala/blob/4e1752aaa2ef835dd67a8404273bee067510a431/test/t6/from_scala/guide.clj#L161-L166
>>
>> You may find that library a useful resource, either as a dependency or 
>> simply as reference material.
>>
>> What you want to do is find the full method signature, including the 
>> implicits, and invoke _that_ from clojure, passing values for all implicit 
>> parameters (in this case, your custom ordering function).
>>
>> HTH
>>
>> On Saturday, July 9, 2016 at 6:13:17 AM UTC, Punit Naik wrote:
>>>
>>> Hi Ashish
>>>
>>> The "package" is indeed the full package name.
>>> On 09-Jul-2016 11:02 AM, "Ashish Negi"  wrote:
>>>
>>>> Should not be `package` in `:import` be the actual package name of  `
>>>> RFMCPartitioner` ?
>>>>
>>>> see examples at https://clojuredocs.org/clojure.core/import
>>>>
>>>> like :
>>>>
>>>> (ns foo.bar
>>>>   (:import (java.util Date
>>>>   Calendar)
>>>>(java.util.logging Logger
>>>>   Level)))
>>>>
>>>>
>>>>
>>>> (ns xyz
>>>>   (:import
>>>> [**  RFMCPartitioner]
>>>> [** RFMCKey]
>>>> )
>>>>   )
>>>>
>>>>
>>>> where ** is package full name.
>>>>
>>>>
>>>>
>>>> On Friday, 8 July 2016 21:31:27 UTC+5:30, Punit Naik wrote:
>>>>>
>>>>>
>>>>>  
>>>>>
>>>>> I have a scala program in which I have implemented a secondary sort 
>>>>> which works perfectly. The way I have written that program is:
>>>>>
>>>>> object rfmc {
>>>>>   // Custom Key and partitioner
>>>>>
>>>>>   case class RFMCKey(cId: String, R: Double, F: Double, M: Double, C: 
>>>>> Double)
>>>>>   class RFMCPartitioner(partitions: Int) extends Partitioner {
>>>>> require(partitions >= 0, "Number of partitions ($partitions) cannot 
>>>>> be negative.")
>>>>> o

Re: Secondary Sorting in Spark using clojure/flambo

2016-07-11 Thread Punit Naik
Hi Black

Thanks for the reply, but I figured it out on my own. Posting the answer 
after this.

On Monday, July 11, 2016 at 11:42:10 PM UTC+5:30, Blake Miller wrote:
>
> Hi Punit
>
> The behavior you are referring to is a feature of the Scala compiler, 
> which is why it does not happen automatically when you try to use it from 
> Clojure.
>
> Please see the note here:
>
>
> https://github.com/t6/from-scala/blob/4e1752aaa2ef835dd67a8404273bee067510a431/test/t6/from_scala/guide.clj#L161-L166
>
> You may find that library a useful resource, either as a dependency or 
> simply as reference material.
>
> What you want to do is find the full method signature, including the 
> implicits, and invoke _that_ from clojure, passing values for all implicit 
> parameters (in this case, your custom ordering function).
>
> HTH
>
> On Saturday, July 9, 2016 at 6:13:17 AM UTC, Punit Naik wrote:
>>
>> Hi Ashish
>>
>> The "package" is indeed the full package name.
>> On 09-Jul-2016 11:02 AM, "Ashish Negi"  wrote:
>>
>>> Should not be `package` in `:import` be the actual package name of  `
>>> RFMCPartitioner` ?
>>>
>>> see examples at https://clojuredocs.org/clojure.core/import
>>>
>>> like :
>>>
>>> (ns foo.bar
>>>   (:import (java.util Date
>>>   Calendar)
>>>(java.util.logging Logger
>>>       Level)))
>>>
>>>
>>>
>>> (ns xyz
>>>   (:import
>>> [**  RFMCPartitioner]
>>> [** RFMCKey]
>>> )
>>>   )
>>>
>>>
>>> where ** is package full name.
>>>
>>>
>>>
>>> On Friday, 8 July 2016 21:31:27 UTC+5:30, Punit Naik wrote:
>>>>
>>>>
>>>>  
>>>>
>>>> I have a scala program in which I have implemented a secondary sort 
>>>> which works perfectly. The way I have written that program is:
>>>>
>>>> object rfmc {
>>>>   // Custom Key and partitioner
>>>>
>>>>   case class RFMCKey(cId: String, R: Double, F: Double, M: Double, C: 
>>>> Double)
>>>>   class RFMCPartitioner(partitions: Int) extends Partitioner {
>>>> require(partitions >= 0, "Number of partitions ($partitions) cannot be 
>>>> negative.")
>>>> override def numPartitions: Int = partitions
>>>> override def getPartition(key: Any): Int = {
>>>>   val k = key.asInstanceOf[RFMCKey]
>>>>   k.cId.hashCode() % numPartitions
>>>> }
>>>>   }
>>>>   object RFMCKey {
>>>> implicit def orderingBycId[A <: RFMCKey] : Ordering[A] = {
>>>>   Ordering.by(k => (k.R, k.F * -1, k.M * -1, k.C * -1))
>>>> }
>>>>   }
>>>>   // The body of the code
>>>>   //
>>>>   //
>>>>   val x = rdd.map(RFMCKey(cust,r,f,m,c), r+","+f+","+m+","+c)
>>>>   val y = x.repartitionAndSortWithinPartitions(new RFMCPartitioner(1))}
>>>>
>>>> I wanted to implement the same thing using clojure's DSL for spark 
>>>> called flambo. Since I can't write partitioner using clojure, I re-used 
>>>> the 
>>>> code defind above, compiled it and used it as a dependency in my Clojure 
>>>> code.
>>>>
>>>> Now I am importing the partitioner and the key in my clojure code the 
>>>> following way:
>>>>
>>>> (ns xyz
>>>>   (:import
>>>> [package RFMCPartitioner]
>>>> [package RFMCKey]
>>>> )
>>>>   )
>>>>
>>>> But when I try to create RFMCKey by doing (RFMCKey. cust_id r f m c), 
>>>> it throws the following error:
>>>>
>>>> java.lang.ClassCastException: org.formcept.wisdom.RFMCKey cannot be cast 
>>>> to java.lang.Comparable
>>>> at 
>>>> org.spark-project.guava.collect.NaturalOrdering.compare(NaturalOrdering.java:28)
>>>> at 
>>>> scala.math.LowPriorityOrderingImplicits$$anon$7.compare(Ordering.scala:153)
>>>> at 
>>>> org.apache.spark.util.collection.ExternalSorter$$anon$8.compare(ExternalSorter.scala:170)
>>>> at 
>>>> org.apache.spark.util.collection.ExternalSorter$$anon$8.compare(ExternalSorter.scala:164)
>>>> at 
>>>> org.ap

Re: Secondary Sorting in Spark using clojure/flambo

2016-07-08 Thread Punit Naik
Hi Ashish

The "package" is indeed the full package name.
On 09-Jul-2016 11:02 AM, "Ashish Negi"  wrote:

> Shouldn't `package` in `:import` be the actual package name of
> `RFMCPartitioner`?
>
> see examples at https://clojuredocs.org/clojure.core/import
>
> like :
>
> (ns foo.bar
>   (:import (java.util Date
>                       Calendar)
>            (java.util.logging Logger
>                               Level)))
>
>
>
> (ns xyz
>   (:import
> [**  RFMCPartitioner]
> [** RFMCKey]
> )
>   )
>
>
> where ** is package full name.
>
>
>
> On Friday, 8 July 2016 21:31:27 UTC+5:30, Punit Naik wrote:
>>
>>
>>
>>
>> I have a scala program in which I have implemented a secondary sort which
>> works perfectly. The way I have written that program is:
>>
>> object rfmc {
>>   // Custom Key and partitioner
>>
>>   case class RFMCKey(cId: String, R: Double, F: Double, M: Double, C: Double)
>>   class RFMCPartitioner(partitions: Int) extends Partitioner {
>> require(partitions >= 0, "Number of partitions ($partitions) cannot be 
>> negative.")
>> override def numPartitions: Int = partitions
>> override def getPartition(key: Any): Int = {
>>   val k = key.asInstanceOf[RFMCKey]
>>   k.cId.hashCode() % numPartitions
>> }
>>   }
>>   object RFMCKey {
>> implicit def orderingBycId[A <: RFMCKey] : Ordering[A] = {
>>   Ordering.by(k => (k.R, k.F * -1, k.M * -1, k.C * -1))
>> }
>>   }
>>   // The body of the code
>>   //
>>   //
>>   val x = rdd.map(RFMCKey(cust,r,f,m,c), r+","+f+","+m+","+c)
>>   val y = x.repartitionAndSortWithinPartitions(new RFMCPartitioner(1))}
>>
>> I wanted to implement the same thing using Clojure's DSL for Spark, called
>> flambo. Since I can't write a partitioner in Clojure, I re-used the code
>> defined above, compiled it, and used it as a dependency in my Clojure code.
>>
>> Now I am importing the partitioner and the key in my clojure code the
>> following way:
>>
>> (ns xyz
>>   (:import
>> [package RFMCPartitioner]
>> [package RFMCKey]
>> )
>>   )
>>
>> But when I try to create RFMCKey by doing (RFMCKey. cust_id r f m c), it
>> throws the following error:
>>
>> java.lang.ClassCastException: org.formcept.wisdom.RFMCKey cannot be cast to 
>> java.lang.Comparable
>> at 
>> org.spark-project.guava.collect.NaturalOrdering.compare(NaturalOrdering.java:28)
>> at 
>> scala.math.LowPriorityOrderingImplicits$$anon$7.compare(Ordering.scala:153)
>> at 
>> org.apache.spark.util.collection.ExternalSorter$$anon$8.compare(ExternalSorter.scala:170)
>> at 
>> org.apache.spark.util.collection.ExternalSorter$$anon$8.compare(ExternalSorter.scala:164)
>> at 
>> org.apache.spark.util.collection.TimSort.countRunAndMakeAscending(TimSort.java:252)
>> at org.apache.spark.util.collection.TimSort.sort(TimSort.java:110)
>> at org.apache.spark.util.collection.Sorter.sort(Sorter.scala:37)
>> at 
>> org.apache.spark.util.collection.SizeTrackingPairBuffer.destructiveSortedIterator(SizeTrackingPairBuffer.scala:83)
>> at 
>> org.apache.spark.util.collection.ExternalSorter.partitionedIterator(ExternalSorter.scala:687)
>> at 
>> org.apache.spark.util.collection.ExternalSorter.iterator(ExternalSorter.scala:705)
>> at 
>> org.apache.spark.shuffle.hash.HashShuffleReader.read(HashShuffleReader.scala:64)
>> at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:92)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
>> at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:70)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
>> at 
>> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
>> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
>> at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
>> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>> at org.apache.spark.scheduler.Task.run(Task.scala:64)
>> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
>> at 
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>>
>

Secondary Sorting in Spark using clojure/flambo

2016-07-08 Thread Punit Naik
 
 

I have a scala program in which I have implemented a secondary sort which 
works perfectly. The way I have written that program is:

object rfmc {
  // Custom Key and partitioner

  case class RFMCKey(cId: String, R: Double, F: Double, M: Double, C: Double)

  class RFMCPartitioner(partitions: Int) extends Partitioner {
    require(partitions >= 0, "Number of partitions ($partitions) cannot be negative.")
    override def numPartitions: Int = partitions
    override def getPartition(key: Any): Int = {
      val k = key.asInstanceOf[RFMCKey]
      k.cId.hashCode() % numPartitions
    }
  }

  object RFMCKey {
    implicit def orderingBycId[A <: RFMCKey]: Ordering[A] = {
      Ordering.by(k => (k.R, k.F * -1, k.M * -1, k.C * -1))
    }
  }

  // The body of the code
  //
  //
  val x = rdd.map(RFMCKey(cust,r,f,m,c), r+","+f+","+m+","+c)
  val y = x.repartitionAndSortWithinPartitions(new RFMCPartitioner(1))
}

I wanted to implement the same thing using Clojure's DSL for Spark, called 
flambo. Since I can't write a partitioner in Clojure, I re-used the code 
defined above, compiled it, and used it as a dependency in my Clojure code.

Now I am importing the partitioner and the key in my Clojure code in the 
following way:

(ns xyz
  (:import
[package RFMCPartitioner]
[package RFMCKey]
)
  )

But when I try to create RFMCKey by doing (RFMCKey. cust_id r f m c), it 
throws the following error:

java.lang.ClassCastException: org.formcept.wisdom.RFMCKey cannot be cast to 
java.lang.Comparable
at 
org.spark-project.guava.collect.NaturalOrdering.compare(NaturalOrdering.java:28)
at 
scala.math.LowPriorityOrderingImplicits$$anon$7.compare(Ordering.scala:153)
at 
org.apache.spark.util.collection.ExternalSorter$$anon$8.compare(ExternalSorter.scala:170)
at 
org.apache.spark.util.collection.ExternalSorter$$anon$8.compare(ExternalSorter.scala:164)
at 
org.apache.spark.util.collection.TimSort.countRunAndMakeAscending(TimSort.java:252)
at org.apache.spark.util.collection.TimSort.sort(TimSort.java:110)
at org.apache.spark.util.collection.Sorter.sort(Sorter.scala:37)
at 
org.apache.spark.util.collection.SizeTrackingPairBuffer.destructiveSortedIterator(SizeTrackingPairBuffer.scala:83)
at 
org.apache.spark.util.collection.ExternalSorter.partitionedIterator(ExternalSorter.scala:687)
at 
org.apache.spark.util.collection.ExternalSorter.iterator(ExternalSorter.scala:705)
at 
org.apache.spark.shuffle.hash.HashShuffleReader.read(HashShuffleReader.scala:64)
at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:92)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:70)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

My guess is that it's not able to find the ordering that I have defined 
after the partitioner. But if it works in Scala, why doesn't it work in 
Clojure?



Re: ExceptionInInitialization error

2016-05-31 Thread Punit Naik
Nice explanation.

On Thursday, February 25, 2016 at 8:46:46 PM UTC+5:30, Gary Verhaegen wrote:
>
> The lein deps :tree message (on stderr, which may be why it was not 
> included in your mail?) said:
>
>
> Possibly confusing dependencies found:
> [cheshire "5.3.1"]
>  overrides
> [riemann "0.2.10"] -> [clj-http "1.1.2" :exclusions
> [org.clojure/tools.reader]] -> [chesh
> ire "5.4.0" :exclusions [org.clojure/clojure]]
>  and
> [riemann "0.2.10"] -> [cheshire "5.5.0"]
>
> Which means: "In your project.clj file, you explicitly tell me that you 
> want cheshire 5.3.1, but you also tell me that you want riemann 0.2.10. It 
> turns out that riemann tells me it wants clj-http 1.1.2, which itself wants 
> cheshire 5.4.0. In addition, riemann also tells me that it wants cheshire 
> 5.5.0. Now, you're the boss, and I can't load multiple versions of the same 
> library, so I'm going to guess that you really want cheshire 5.3.1 and not 
> one of the other ones, though I must say I am a bit confused.
>
> I'm also a bit shy, so if this is really what you want, could you please 
> tell riemann that you do not want him to give me orders regarding the 
> version of cheshire to be included? That would make things less confusing."
>
> so you have to choose: either follow Leiningen's advice and tell riemann 
> to shut up, or decide that you don't actually need to insist on that 
> specific version of cheshire and give riemann what it needs.
>
> What Leiningen suggests (putting in exclusions) is a way of making 
> explicit the choices that he is already guessing from your project.clj, so 
> in this case it does not help.
>
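
For reference, the two options Gary describes look roughly like this in
project.clj (a sketch, not the exact project file):

;; (a) keep cheshire 5.3.1 and tell riemann to stop weighing in on it
[riemann/riemann "0.2.10" :exclusions [cheshire]]

;; (b) or simply declare the version riemann already wants
[cheshire "5.5.0"]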



Re: ExceptionInInitialization error

2016-02-24 Thread Punit Naik
Thank you so much, Gary. Updating Cheshire totally worked! But may I ask how
you were able to deduce that Cheshire was the problem? Do tell me, so that in
the future I can fix these kinds of problems myself.

On Thu, Feb 25, 2016 at 12:07 PM, Punit Naik  wrote:

> Okay thanks a lot Gary. Will try that.
>
> On Thu, Feb 25, 2016 at 6:49 AM, Gary Verhaegen 
> wrote:
>
>> So when I do `lein deps :tree` with these dependencies (except for the
>> swissknife one, which my computer does not seem to find), I get a
>> *lot* of conflicts with riemann, starting with:
>>
>> $ lein deps :tree
>> Possibly confusing dependencies found:
>> [cheshire "5.3.1"]
>>  overrides
>> [riemann "0.2.10"] -> [clj-http "1.1.2" :exclusions
>> [org.clojure/tools.reader]] -> [chesh
>> ire "5.4.0" :exclusions [org.clojure/clojure]]
>>  and
>> [riemann "0.2.10"] -> [cheshire "5.5.0"]
>>
>> Consider using these exclusions:
>> [riemann "0.2.10" :exclusions [cheshire]]
>> [riemann "0.2.10" :exclusions [cheshire]]
>>
>>
>> I would try updating the cheshire version you're declaring in
>> project.clj, instead of adding exclusions, though. (On my machine this
>> gets rid of the conflicts.)
>>
>>
>> On 24 February 2016 at 10:24, Punit Naik  wrote:
>> > Okay. So this is my project.clj:
>> >
>> >
>> > (defproject chowkidar "0.3.0-SNAPSHOT"
>> >   :description "Formcept Monitoring Framework"
>> >   :dependencies [[org.clojure/clojure "1.6.0"]
>> >  [org.clojure/java.jmx "0.3.0"]
>> >  [org.clojure/tools.logging "0.3.1"]
>> >  [riemann/riemann "0.2.10"]
>> >  [riemann-clojure-client "0.4.2"]
>> >  [org.formcept/swissknife "0.6.0"]
>> >  [com.novemberain/langohr "3.0.1"]
>> >  [cheshire "5.3.1"]]
>> >   :java-source-paths ["src/java"]
>> >   :resource-paths ["resources" "conf"]
>> >   :jvm-opts ["-XX:+UseConcMarkSweepGC"]
>> >   :profiles {:ship {:aot :all
>> > :omit-source true}
>> >  :uberjar {:uberjar-name "formcept-chowkidar.jar"}})
>> >
>> > So I had changed the version of "riemann" from 0.2.6 to 0.2.10. Only
>> that.
>> >
>> >
>> > On Wednesday, February 24, 2016 at 2:41:14 PM UTC+5:30, Gary Verhaegen
>> > wrote:
>> >>
>> >> No sign of conflict there - that's a bit surprising. Can you post your
>> >> project.clj? Do you have any explicit exclusions? What did you change
>> last
>> >> before it broke? Is it possible that you somehow corrupted your Maven
>> >> repository and are missing the mentioned class?
>> >>
>> >> Maybe it's not a dependency issue at all.
>> >>
>> >> On Wednesday, 24 February 2016, Punit Naik 
>> wrote:
>> >>
>> >> Here it is:
>> >>
>> >> [cheshire "5.3.1"] [com.fasterxml.jackson.core/jackson-core "2.3.1"]
>> >> [com.fasterxml.jackson.dataformat/jackson-dataformat-smile "2.3.1"]
>> [tigris
>> >> "0.1.1"] [clojure-complete "0.2.3" :scope "test" :exclusions
>> >> [[org.clojure/clojure]]] [com.novemberain/langohr "3.0.1"]
>> >> [clojurewerkz/support "1.1.0"] [com.google.guava/guava "18.0"]
>> >> [com.rabbitmq/amqp-client "3.4.2"] [org.clojure/clojure "1.6.0"]
>> >> [org.clojure/java.jmx "0.3.0"] [org.clojure/tools.logging "0.3.1"]
>> >> [org.clojure/tools.nrepl "0.2.6" :scope "test" :exclusions
>> >> [[org.clojure/clojure]]] [org.formcept/swissknife "0.6.0"]
>> >> [com.twitter/carbonite "1.4.0"] [com.esotericsoftware.kryo/kryo "2.21"]
>> >> [com.esotericsoftware.minlog/minlog "1.2"]
>> >> [com.esotericsoftware.reflectasm/reflectasm "1.07" :classifier
>> "shaded"]
>> >> [org.ow2.asm/asm "4.0"] [org.objenesis/objenesis "1.2"]
>> >> [com.twitter/chill-java "0.3.5"] [riemann-clojure-client "0.4.2"]
>> >> [com.aphyr/riemann-java-client "0.4.1"]
>&g

Re: ExceptionInInitialization error

2016-02-24 Thread Punit Naik
Okay thanks a lot Gary. Will try that.

On Thu, Feb 25, 2016 at 6:49 AM, Gary Verhaegen 
wrote:

> So when I do `lein deps :tree` with these dependencies (except for the
> swissknife one, which my computer does not seem to find), I get a
> *lot* of conflicts with riemann, starting with:
>
> $ lein deps :tree
> Possibly confusing dependencies found:
> [cheshire "5.3.1"]
>  overrides
> [riemann "0.2.10"] -> [clj-http "1.1.2" :exclusions
> [org.clojure/tools.reader]] -> [chesh
> ire "5.4.0" :exclusions [org.clojure/clojure]]
>  and
> [riemann "0.2.10"] -> [cheshire "5.5.0"]
>
> Consider using these exclusions:
> [riemann "0.2.10" :exclusions [cheshire]]
> [riemann "0.2.10" :exclusions [cheshire]]
>
>
> I would try updating the cheshire version you're declaring in
> project.clj, instead of adding exclusions, though. (On my machine this
> gets rid of the conflicts.)
>
>
> On 24 February 2016 at 10:24, Punit Naik  wrote:
> > Okay. So this is my project.clj:
> >
> >
> > (defproject chowkidar "0.3.0-SNAPSHOT"
> >   :description "Formcept Monitoring Framework"
> >   :dependencies [[org.clojure/clojure "1.6.0"]
> >  [org.clojure/java.jmx "0.3.0"]
> >  [org.clojure/tools.logging "0.3.1"]
> >  [riemann/riemann "0.2.10"]
> >  [riemann-clojure-client "0.4.2"]
> >  [org.formcept/swissknife "0.6.0"]
> >  [com.novemberain/langohr "3.0.1"]
> >  [cheshire "5.3.1"]]
> >   :java-source-paths ["src/java"]
> >   :resource-paths ["resources" "conf"]
> >   :jvm-opts ["-XX:+UseConcMarkSweepGC"]
> >   :profiles {:ship {:aot :all
> > :omit-source true}
> >  :uberjar {:uberjar-name "formcept-chowkidar.jar"}})
> >
> > So I had changed the version of "riemann" from 0.2.6 to 0.2.10. Only
> that.
> >
> >
> > On Wednesday, February 24, 2016 at 2:41:14 PM UTC+5:30, Gary Verhaegen
> > wrote:
> >>
> >> No sign of conflict there - that's a bit surprising. Can you post your
> >> project.clj? Do you have any explicit exclusions? What did you change
> last
> >> before it broke? Is it possible that you somehow corrupted your Maven
> >> repository and are missing the mentioned class?
> >>
> >> Maybe it's not a dependency issue at all.
> >>
> >> On Wednesday, 24 February 2016, Punit Naik  wrote:
> >>
> >> Here it is:
> >>
> >> [cheshire "5.3.1"] [com.fasterxml.jackson.core/jackson-core "2.3.1"]
> >> [com.fasterxml.jackson.dataformat/jackson-dataformat-smile "2.3.1"]
> [tigris
> >> "0.1.1"] [clojure-complete "0.2.3" :scope "test" :exclusions
> >> [[org.clojure/clojure]]] [com.novemberain/langohr "3.0.1"]
> >> [clojurewerkz/support "1.1.0"] [com.google.guava/guava "18.0"]
> >> [com.rabbitmq/amqp-client "3.4.2"] [org.clojure/clojure "1.6.0"]
> >> [org.clojure/java.jmx "0.3.0"] [org.clojure/tools.logging "0.3.1"]
> >> [org.clojure/tools.nrepl "0.2.6" :scope "test" :exclusions
> >> [[org.clojure/clojure]]] [org.formcept/swissknife "0.6.0"]
> >> [com.twitter/carbonite "1.4.0"] [com.esotericsoftware.kryo/kryo "2.21"]
> >> [com.esotericsoftware.minlog/minlog "1.2"]
> >> [com.esotericsoftware.reflectasm/reflectasm "1.07" :classifier "shaded"]
> >> [org.ow2.asm/asm "4.0"] [org.objenesis/objenesis "1.2"]
> >> [com.twitter/chill-java "0.3.5"] [riemann-clojure-client "0.4.2"]
> >> [com.aphyr/riemann-java-client "0.4.1"]
> [com.google.protobuf/protobuf-java
> >> "2.6.1"] [io.netty/netty "3.6.1.Final"] [riemann "0.2.10"] [amazonica
> >> "0.3.28" :exclusions [[joda-time]]] [com.amazonaws/amazon-kinesis-client
> >> "1.1.0" :exclusions [[joda-time]]] [com.taoensso/nippy "2.7.0"]
> >> [com.taoensso/encore "1.11.2"] [net.jpountz.lz4/lz4 "1.2.0"]
> >> [org.clojure/tools.reader "0.8.9"] [org.iq80.snappy/snappy "0.3"]
> >> [org.tukaani/xz "1.5"] [robert/hooke "1.3.0&

Re: ExceptionInInitialization error

2016-02-24 Thread Punit Naik
Okay. So this is my project.clj:


(defproject chowkidar "0.3.0-SNAPSHOT"
  :description "Formcept Monitoring Framework"
  :dependencies [[org.clojure/clojure "1.6.0"]
 [org.clojure/java.jmx "0.3.0"]
 [org.clojure/tools.logging "0.3.1"]
 [riemann/riemann "0.2.10"]
 [riemann-clojure-client "0.4.2"]
 [org.formcept/swissknife "0.6.0"]
 [com.novemberain/langohr "3.0.1"]
 [cheshire "5.3.1"]]
  :java-source-paths ["src/java"]
  :resource-paths ["resources" "conf"]
  :jvm-opts ["-XX:+UseConcMarkSweepGC"]
  :profiles {:ship {:aot :all
:omit-source true}
 :uberjar {:uberjar-name "formcept-chowkidar.jar"}})

So I had changed the version of "riemann" from 0.2.6 to 0.2.10. Only that.


On Wednesday, February 24, 2016 at 2:41:14 PM UTC+5:30, Gary Verhaegen 
wrote:
>
> No sign of conflict there - that's a bit surprising. Can you post your 
> project.clj? Do you have any explicit exclusions? What did you change last 
> before it broke? Is it possible that you somehow corrupted your Maven 
> repository and are missing the mentioned class?
>
> Maybe it's not a dependency issue at all.
>
> On Wednesday, 24 February 2016, Punit Naik  > wrote:
>
> Here it is:
>
> [cheshire "5.3.1"] [com.fasterxml.jackson.core/jackson-core "2.3.1"] [com.
> fasterxml.jackson.dataformat/jackson-dataformat-smile "2.3.1"] [tigris 
> "0.1.1"] [clojure-complete "0.2.3" :scope "test" :exclusions [[org.clojure
> /clojure]]] [com.novemberain/langohr "3.0.1"] [clojurewerkz/support 
> "1.1.0"] [com.google.guava/guava "18.0"] [com.rabbitmq/amqp-client "3.4.2"
> ] [org.clojure/clojure "1.6.0"] [org.clojure/java.jmx "0.3.0"] [org.
> clojure/tools.logging "0.3.1"] [org.clojure/tools.nrepl "0.2.6" :scope 
> "test" :exclusions [[org.clojure/clojure]]] [org.formcept/swissknife 
> "0.6.0"] [com.twitter/carbonite "1.4.0"] [com.esotericsoftware.kryo/kryo 
> "2.21"] [com.esotericsoftware.minlog/minlog "1.2"] [com.esotericsoftware.
> reflectasm/reflectasm "1.07" :classifier "shaded"] [org.ow2.asm/asm "4.0"] 
> [org.objenesis/objenesis "1.2"] [com.twitter/chill-java "0.3.5"] [riemann-
> clojure-client "0.4.2"] [com.aphyr/riemann-java-client "0.4.1"] [com.
> google.protobuf/protobuf-java "2.6.1"] [io.netty/netty "3.6.1.Final"] 
> [riemann 
> "0.2.10"] [amazonica "0.3.28" :exclusions [[joda-time]]] [com.amazonaws/
> amazon-kinesis-client "1.1.0" :exclusions [[joda-time]]] [com.taoensso/nippy 
> "2.7.0"] [com.taoensso/encore "1.11.2"] [net.jpountz.lz4/lz4 "1.2.0"] [org
> .clojure/tools.reader "0.8.9"] [org.iq80.snappy/snappy "0.3"] [org.tukaani
> /xz "1.5"] [robert/hooke "1.3.0"] [capacitor "0.4.3" :exclusions [[http-
> kit]]] [org.clojure/core.async "0.1.319.0-6b1aca-alpha"] [org.clojure/
> tools.analyzer.jvm "0.1.0-beta12"] [org.clojure/core.memoize "0.5.6"] [org
> .clojure/tools.analyzer "0.1.0-beta12"] [org.ow2.asm/asm-all "4.1"] 
> [clj-antlr 
> "0.2.2"] [org.antlr/antlr4-runtime "4.2.2"] [org.abego.treelayout/org.
> abego.treelayout.core "1.0.1"] [org.antlr/antlr4-annotations "4.2.2"] [org
> .antlr/antlr4 "4.2.2"] [org.antlr/ST4 "4.0.8"] [org.antlr/antlr-runtime 
> "3.5.2"] [clj-campfire "2.2.0"] [http.async.client "0.5.2"] [com.ning/
> async-http-client "1.7.10"] [clj-http "1.1.2" :exclusions [[org.clojure/
> tools.reader]]] [com.cognitect/transit-clj "0.8.271" :exclusions [[org.
> clojure/clojure]]] [com.cognitect/transit-java "0.8.287"] [com.fasterxml.
> jackson.datatype/jackson-datatype-json-org "2.3.2"] [org.json/json 
> "20090211"] [org.apache.directory.studio/org.apache.commons.codec "1.8"] [
> org.msgpack/msgpack "0.6.10"] [com.googlecode.json-simple/json-simple 
> "1.1.1" :exclusions [[junit]]] [org.javassist/javassist "3.18.1-GA"] [
> commons-codec "1.10" :exclusions [[org.clojure/clojure]]] [commons-io 
> "2.4" :exclusions [[org.clojure/clojure]]] [crouton "0.1.2" :exclusions [[
> org.clojure/clojure]]] [org.jsoup/jsoup "1.7.1"] [org.ap

core.async channels

2016-02-17 Thread Punit Naik
I want to create a Clojure core.async channel which will perform a specific 
task after it is filled to its capacity, or after a certain amount of time 
has passed since its activation. How do I do this?
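
One possible shape for this, sketched with core.async (names are illustrative
and the code is only a sketch):

(require '[clojure.core.async :as async :refer [chan go-loop timeout alts!]])

(defn batching-chan
  "Returns a channel. Items put on it are collected and passed to f either
  when `capacity` items have accumulated or when `interval-ms` has elapsed."
  [capacity interval-ms f]
  (let [in (chan)]
    (go-loop [batch [] deadline (timeout interval-ms)]
      (let [[v port] (alts! [in deadline])]
        (cond
          ;; input closed: flush whatever is left and stop
          (and (= port in) (nil? v))
          (when (seq batch) (f batch))

          ;; got an item: flush when the batch is full
          (= port in)
          (let [batch (conj batch v)]
            (if (>= (count batch) capacity)
              (do (f batch) (recur [] (timeout interval-ms)))
              (recur batch deadline)))

          ;; timer fired: flush whatever has accumulated
          :else
          (do (when (seq batch) (f batch))
              (recur [] (timeout interval-ms))))))
    in))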



clojure future method

2016-02-15 Thread Punit Naik
I have a function which calls a number of functions in sequence. A few of 
them are normal functions and some of them use Clojure's `future`. When I run 
this function, all the normal functions run, but the functions which use 
`future` don't run at all. If I run the functions which use `future` 
individually, they run perfectly fine.
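
For concreteness, the shape of the code is roughly this (hypothetical names,
not the actual functions):

(defn step-a []
  (println "normal work"))            ;; runs and prints as expected

(defn step-b []
  ;; the future starts its work on another thread; nothing here waits for it,
  ;; so the caller returns immediately whether or not the body has run yet
  (future (println "background work")))

(defn run-all []
  (step-a)
  (step-b))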

Can anyone point me to a solution?

Please help!
