Re: [julia-users] Re: how to run julia without the llvm jit

2016-02-10 Thread Steve Kelly
Jameson, this is really great. I think many of us have been nibbling around
the edges of the compiler for a long time and this will give us a great
boost to make more integrations happen!

I have a question regarding generated functions. The claim that generated
functions are black boxes during compilation seems unclear to me. I believe
that on 0.4 such code would be inlined if types are inferred correctly, so
my confusion is about the role of inference in generated functions. If a
call to a generated function is type stable and is made from a precompiled
function in the same module, the behavior seems to be that the generated
function gets the usual compilation treatment. Is this analysis roughly
correct?
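
For concreteness, here is the kind of call I mean (a rough sketch, not from
the blog post):

@generated function ndims_of(x)
    # inside the body, `x` is bound to the argument's type
    N = ndims(x)
    :( $N )
end

f(a) = ndims_of(a) + 1   # a type-stable caller
f(rand(2, 3))            # 3; the constant body can be inlined like any other method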

Thanks,
Steve
On Feb 10, 2016 12:48 PM, "Simon Danisch"  wrote:

> @Jameson, thanks for the interesting read!
>
> @Tim, I'd like to hear about anything you come up with :)
>
> Am Mittwoch, 10. Februar 2016 04:12:19 UTC+1 schrieb Jameson:
>>
>> I've been working on a static (no-JIT) Julia mode and related
>> ahead-of-time-compilation tools. I put together a blog post describing this
>> work, and also covering some of the build stages of Julia as well, that I
>> felt may be of interest to some users. It is available at:
>>
>> http://juliacomputing.com/blog/2016/02/09/static-julia.html
>>
>> -jameson
>>
>


Re: [julia-users] Using Dot Syntax in Julia

2016-01-17 Thread Steve Kelly
It has always been this way because of multiple dispatch. However, you can
do something like:

type Wallet
  dotTest::Function
end

which may have an unclear, and possibly negative, performance impact, since
the compiler cannot specialize calls made through an untyped Function field.
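
For example, a rough sketch (the field and method names are only
illustrative) of storing a closure that captures the instance:

type Wallet
    balance::Int
    dotTest::Function
    function Wallet(balance::Int)
        w = new(balance)
        w.dotTest = v -> (w.balance += v)   # closure over the new instance
        w
    end
end

wallet = Wallet(100)
wallet.dotTest(5)    # wallet.balance is now 105

Each instance then carries its own Function object, which is why the
performance impact mentioned above is hard to predict.
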
On Jan 17, 2016 12:45 PM, "Bryan Rivera"  wrote:

> I have seen some code out in the wild that allows us to use dot syntax
> like so:
>
> function dotTest!(wallet::Wallet, valueToAdd::Int):
>
> ...
>
> end
>
> wallet = Wallet(100)
>
> wallet.dotTest!(5)  # Does not work
> dotTest!(wallet, 5)  # Works
>
> However I cannot get it to work, the method is not found because I am not
> passing wallet as the arg.
>
> So did the language change, or am I doing it wrong?
>


Re: [julia-users] Re: ANN: Kip.jl an alternative module system

2015-12-20 Thread Steve Kelly
I've been using NixOS for the past month to try to gain some perspective on
this problem. Their approach is very good, and I recommend reading the
white papers and source notes.
On Dec 20, 2015 10:05 AM, "Tom Breloff"  wrote:

> My reaction is the same as Tim Holy's.  While I agree there are aspects of
> Julia's modules which are not perfect, it's not clear at all how your
> package changes the workflow.  In fact, I don't understand anything about
> your package.  Could you please try to write up the design thoughts behind
> what you are doing, so that we can understand the high level concepts that
> you are attempting?
>
> On Sun, Dec 20, 2015 at 6:24 AM, Tim Holy  wrote:
>
>> After reading your README example, I'm still left wondering how one works
>> with
>> Kip, or how it fixes the problems you're describing. To me it's not at all
>> obvious how your example "illustrates" the statements you make in the
>> prose.
>> You might consider explaining the meaning of the various arguments to
>> @require, what an "index" file is and what its format should be, and
>> exactly
>> what the call to the emit function is supposed to demonstrate.
>>
>> Best,
>> --Tim
>>
>> On Saturday, December 19, 2015 11:26:53 PM Jake Rosoman wrote:
>> > I forgot to actually link to the project <
>> https://github.com/jkroso/Kip.jl>
>> >
>> > On Sunday, December 20, 2015 at 8:25:04 PM UTC+13, Jake Rosoman wrote:
>> > > Julia's module system is the one part of it I feel confident enough
>> to say
>> > > is bad. It can't handle several versions of the same package. Is hard
>> (or
>> > > impossible?) to depend on packages that aren't in the registry and
>> hard to
>> > > add (controversial) things to the registry. I also find it ugly and
>> hard
>> > > to
>> > > use but now I'm getting into opinions so I'll stop.
>> > >
>> > > Kip solves all these problems and works fine alongside Julia's current
>> > > module system so you can try it out now. I hope that eventually we can
>> > > replace Julia's module system if people generally agree that it's
>> worth
>> > > doing. I've created a poll to measure the community's opinion
>> > >  and you can change your vote at any time so
>> feel
>> > > free to say no now but follow the discussion.
>>
>>
>


Re: [julia-users] Re: Ray tracing for complex geometry

2015-11-23 Thread Steve Kelly
I think most of the people working on mesh-related geometry are doing it
part-time, and building these libraries is part of the learning experience.
Asking questions is important, since it is the only way to gain insight. If
you have more questions about Meshes, I'd be happy to help on the GitHub
bug tracker.

My senior thesis is on polyhedra, and I thought it would be easy since I
have been working with mesh types for over a year. It turned out I was
wrong: the needs of the application revealed a whole set of types (and,
more importantly, relations between them) that are very important for
combinatorial geometry. Stretching the application of libraries is really
valuable, and I am sure in time you will be contributing back if you stick
with it!

On Mon, Nov 23, 2015 at 7:31 AM, kleinsplash 
wrote:

> Nope. I have obj files. Your hypothesis is correct. I have attached one of
> them. Your script works just fine (is there an easy way to save this
> image?).
>
> As a side note: I do collect point clouds using V-REP, and I can generate
> pointclouds (pcd) using pcl - but I want to work with the obj mesh files
> because the clouds are too sparse.
>
> I probably could have explained myself better, point taken. I will aim to
> try harder next time, I feel horrid when I am asking basic questions and on
> top of that writing an essay.
>
> The only other person I know who uses the term interwebz is Richard on
> Fast and Loud - I am an avid supporter.
>
> On Monday, 23 November 2015 13:41:40 UTC+2, Simon Danisch wrote:
>>
>> I'm not sure what you mean by virtual objects. Obj, in the context of 3D
>> objects, is usually the Wavefront format.
>> If you have an object database with *.obj's in it, the probability is
>> very high, that you don't have pointclouds whatsoever.
>> You can try this, to confirm my hypothesis:
>>
>> using GLVisualize, FileIO
>> obj = load("file.obj")
>> w,r = glscreen()
>> view(visualize(obj))
>> r()
>>
>> Or just download any obj viewer from the interwebzz and look at that
>> thing.
>> If you have nice smooth surfaces, you're getting it all wrong with the
>> pointclouds and ray tracing.
>> I could give you some hacky way of extracting depth images with
>> GLVisualize, if that's what you're after.
>> In that case, just try the example above and if that works, open an issue
>> at GLVisualize that you want depth images. Then we can take it from there.
>>
>>
>> If by any chance you DO have pointclouds stored in an obj file, things
>> are more complicated since you then need to approximate the surface of that
>> cloud.
>> Still, raytracing wouldn't be your friend ;) If you have infinitely small
>> points, there is no magic that lets a ray hit these points any better than
>> some other visualization algorithm.
>> Even if it's really dense, you still have infinitely small points. You
>> can treat the points as particles, to give them some "body" that you can
>> see, but then it's not really a surface anymore.
>>
>> Just google for pointcloud surface approximation and see where that gets
>> you.
>>
>> I'm guessing here, that you have some sensor that outputs depth images
>> and you want to recognize objects in these depth images.
>> To train your depth image classifier, you need depth images from a lot of
>> perspectives from a lot of random 3D objects, which is why you searched for
>> a 3D object database, which got you to the obj files of random 3D objects.
>>
>> It'd have been a lot easier, if you just stated this in your problem
>> description, probably even with links to the obj database.
>>
>> Best,
>> Simon
>>
>> Am Freitag, 20. November 2015 16:18:46 UTC+1 schrieb kleinsplash:
>>>
>>> I was wondering if someone could help me out with a decision/offer an
>>> opinion:
>>>
>>> I need a ray tracer that deals with complex geometry (a fast ray tracer
>>> that can create 1000's of point clouds in minimal time)
>>> Python has methods: http://pyopengl.sourceforge.net/ that I could get
>>> to grips with. But I want to stick with Julia.
>>>
>>> I have found these resources:
>>> https://github.com/JuliaGL/ModernGL.jl - not sure if this has a ray
>>> tracing option
>>> http://www.cs.columbia.edu/~keenan/Projects/QuaternionJulia/ - looks
>>> crazy complicated
>>>
>>> https://github.com/JuliaLang/julia/blob/master/test/perf/kernel/raytracer.jl
>>> - looks like only handles simple geometry
>>>
>>> Could someone point me in the right direction?
>>>
>>>
>>>
>>


Re: [julia-users] Ray tracing for complex geometry

2015-11-20 Thread Steve Kelly
How is your geometry defined? If it is an implicit function, ShaderToy.jl
(built on GLVisualize) has a raymarching example.
https://github.com/SimonDanisch/ShaderToy.jl/blob/master/examples/rayprimitive.frag

The method can be generalized to generating distance fields, but I haven't
gotten to it yet. I'd also recommend taking a look at the link in the
comments. Inigo has some great stuff on ray tracing techniques for the GPU.

I've been working on a solid modeler that makes describing primitives as
functions much easier.
https://github.com/FactoryOS/Descartes.jl/tree/master/examples The eventual
goal is to get all the geometric realization code on the GPU (SDFs and Dual
Contours).
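
As a very rough sketch of the idea in plain Julia (not the GLSL example
above; the scene here is a single hypothetical sphere):

sphere_sdf(p, center, r) = norm(p - center) - r   # signed distance to a sphere

function raymarch(origin, dir; maxsteps=100, eps=1e-4)
    t = 0.0
    for i in 1:maxsteps
        p = origin + t*dir
        d = sphere_sdf(p, [0.0, 0.0, 5.0], 1.0)
        d < eps && return t     # close enough to the surface: report a hit
        t += d                  # safe step: nothing is closer than d
    end
    return Inf                  # no hit along this ray
end

raymarch([0.0, 0.0, 0.0], [0.0, 0.0, 1.0])   # ≈ 4.0, the distance to the sphere
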
On Nov 20, 2015 11:15 AM, "Tom Breloff"  wrote:

> Could you describe a little more about your use-case?  I'm not sure that
> ray-tracing is necessarily what you want if you're displaying point
> clouds.  I would check out GLVisualize.jl as a first step.
>
> On Fri, Nov 20, 2015 at 10:18 AM, kleinsplash 
> wrote:
>
>> I was wondering if someone could help me out with a decision/offer an
>> opinion:
>>
>> I need a ray tracer that deals with complex geometry (a fast ray tracer
>> that can create 1000's of point clouds in minimal time)
>> Python has methods: http://pyopengl.sourceforge.net/ that I could get to
>> grips with. But I want to stick with Julia.
>>
>> I have found these resources:
>> https://github.com/JuliaGL/ModernGL.jl - not sure if this has a ray
>> tracing option
>> http://www.cs.columbia.edu/~keenan/Projects/QuaternionJulia/ - looks
>> crazy complicated
>>
>> https://github.com/JuliaLang/julia/blob/master/test/perf/kernel/raytracer.jl
>> - looks like only handles simple geometry
>>
>> Could someone point me in the right direction?
>>
>>
>>
>
>


Re: [julia-users] Google releases TensorFlow as open source

2015-11-10 Thread Steve Kelly
FWIW I think this may be the C API:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/public/tensor_c_api.h
On Nov 10, 2015 11:05 AM, "Stefan Karpinski"  wrote:

> Time for a JuliaML org?
>
> On Tuesday, November 10, 2015, Tom Breloff  wrote:
>
>> I'm interested as well.  Who wants to claim TensorFlow.jl?
>>
>> On Tue, Nov 10, 2015 at 9:11 AM, Ben Moran  wrote:
>>
>>> I'm very interested in this.  I haven't gone through the details yet but
>>> they say that C++ API currently only supports a subset of the Python API
>>> (weird!).
>>>
>>> One possibility is to use PyCall to wrap the Python version, like was
>>> done for PyPlot, SymPy and like I began tentatively for Theano here -
>>> https://github.com/benmoran/MochaTheano.jl
>>>
>>>
>>> On Monday, 9 November 2015 21:06:41 UTC, Phil Tomson wrote:

 Looks like they used SWIG to create the Python bindings.  I don't see
 Julia listed as an output target for SWIG.



 On Monday, November 9, 2015 at 1:02:36 PM UTC-8, Phil Tomson wrote:
>
> Google has released it's deep learning library called TensorFlow as
> open source code:
>
> https://github.com/tensorflow/tensorflow
>
> They include Python bindings, Any ideas about how easy/difficult it
> would be to create Julia bindings?
>
> Phil
>

>>


Re: [julia-users] Using Meshes.jl

2015-11-09 Thread Steve Kelly
The faces can be accessed with faces(load("foo.obj")) or mesh.faces.

Probably the easiest way to display the mesh at this point is with
ThreeJS.jl:
https://github.com/rohitvarkey/ThreeJS.jl/blob/master/examples/mesh.jl.
This approach should work in IJulia and Blink.

GLVisualize has some good demos and a much more responsive backend, but it
needs some work to run on OpenGL < 3.3, and the working commits aren't in
METADATA yet. Meshes is in kind of a weird state right now, and most of its
functionality can be had with GeometryTypes, Meshing, and MeshIO. We have
been working for the past few months to finish the coupling between the data
structures for geometry and visualization. It would be great to hear about
your application and see whether we could achieve something in the short
term that works for you. Personally, I use MeshLab when I do solid modelling
in Julia, which slows down my iteration time, so it would be nice to have a
mesh viewer in the workflow.
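
For example, a rough sketch (assuming the MeshIO/FixedSizeArrays types from
your snippet; the file name is illustrative) of pulling the coordinates out
for a quick PyPlot scatter:

using FileIO, MeshIO, PyPlot

obj = load("foo.obj")
vts = obj.vertices                  # Array{Point{3,Float32},1}

xs = Float64[p[1] for p in vts]
ys = Float64[p[2] for p in vts]
zs = Float64[p[3] for p in vts]

scatter3D(xs, ys, zs)               # quick point plot of the vertices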

Best,
Steve
On Nov 9, 2015 9:55 AM, "Ashley Kleinhans" 
wrote:

> Hi,
>
> I am new at this - but have decided that Julia is my language of choice.
> So I begin silly question stage:
>
> Could someone talk me through how to access and display an .obj file?
>
> I have gotten so far:
>
> using Meshes
> using PyPlot
> using FileIO
> using MeshIO
>
> obj = load(filename)
> vts = obj.vertices
>
>
> Which gives me:
>
> 502-element Array{FixedSizeArrays.Point{3,Float32},1}:
>
>
>
> One example point being:
>
> Point(0.00117,-0.02631,0.03907)
>
>
>
>
>
> How do I access the verticies to use them with plot?
>
> -A
>
>
>


Re: [julia-users] Anaconda Python

2015-11-02 Thread Steve Kelly
The context is not clear. Is this regarding Conda.jl?

On Mon, Nov 2, 2015 at 4:00 PM,  wrote:

> 
> I don't think you should support Anaconda Python.  I realize it is
> convenient.  Providing a sort of private copy of Python and its packages
> makes sense.  It simplifies installation and maintenance of key Julia
> dependencies for users.  I just don't think you should use Anaconda to do
> it.
>
> Anaconda is a fork of Python, its package management, its primary package
> repository, and many of the packages themselves.  Forks are BAD.  It
> borders on a commercial lock-in or, at a minimum, a technical lock-in to
> Anaconda.
>
> I am a commercial software guy by experience.  I made a living from
> commercial software and find that to be completely honorable.  This is not
> an anti-commercial rant.  It IS, on the other hand, an anti-fork rant.
>
> Python is a vibrant community.  Julia is a vibrant community on a very
> nice trajectory.  May they both continue.  Rather than a philosophical
> discussion of Continuum and various open source license types, let's think
> about this from the standpoint of Julia.
>
> Would you like it if someone came along and forked all of Julia,
> especially Pkg, and created forks of every package?   To do so would be
> entirely compliant with the MIT open source license.  So, it would be legal
> (not that license enforcement is common in the open source world).  But,
> would it be DESIRABLE?  You've done a fine thing to rely largely on git and
> github.
>
> Probably not.  Is it possible that someone proposing enhancements found
> that their suggestions were rejected?  Well, that can happen.  Perhaps that
> would lead to a fork.  But, if there was community endorsement of the
> suggestions from some reasonable plurality of members and enhancements
> could be made without injury to those preferring some other code path, it
> would be reasonable to accommodate particularly if the proposers backed
> their suggestions with effort--working code that could be integrated under
> the conditions mentioned.  I depict this in a somewhat negative way, but my
> point is to confirm that *contributing is better than forking*.
>
> Typically, it is easier--and less negative--than the scenario I depicted.
> There is a community.  Some leaders are very technically adept and have a
> vision (e.g., Julia is not C, Python, R, or Java so it won't do things just
> like those or other languages...) so they have some sway over final
> inclusion decisions.  And these technical leaders do care what the
> community suggests; are open to suggestions and contributions; occasionally
> reject some input with transparent reasons (transparency may not convince
> the proposer, but it is good for everyone to see the dialog and decisions);
> and often accept suggestions--implementing the suggestions themselves or
> accepting pull requests.  But, realistically the core team makes most of
> the commits and carries most of the work.
>
> This is probably how we want it to work.  We probably don't want a fork of
> Julia and hope to avoid it and we will see Julia grow and be enhanced--most
> often on the path and vision of the founders and sometimes with the
> contributions of others. In the spirit of "do unto others...", let's not
> encourage a fork of Python.
>
> This would mean using Python releases of the Python Software Foundation
> (PSF) and its package repository PyPI.  There will be some inconvenience.
> Perhaps not all of the Python "cousins" are enamored of Julia and aren't
> eager to be helpful.  Or perhaps they are merely neutral and busy.  But, it
> supports their community to endorse it.  Do unto others...   As a serious
> practical matter, the communities are not distinct.  Many user-developers
> do use both Julia and Python.  We'd like both communities to thrive.  I
> think Continuum would probably concur with the broad sentiment, though not
> with my personal opinion about using Anaconda as a Julia dependency.
>
> This requires some deep thought.  Using Anaconda is certainly a near-term
> convenience.   On Windows, it is possible to get most of the same benefits
> from the less commercially oriented release WinPython.  On Mac, ...from
> Homebrew, which is also quite non-commercial.  On Linux, ...well, that is
> another kettle of fragmentation--and probably better to rely on PSF than a
> bunch of package repositories.  Consider:  how do you want the Julia
> community to develop?  How does the Julia community overlap with the Python
> community (and to a lesser extent the R community)?  How do choices affect
> the healthy, long-term evolution of an open source community?
> 
>
> I've left out discussion of how open source communities can attract
> commercial participants.  That is indeed beneficial.  Look to Cloudera's
> role in the Hadoop community for a good example of how this can work.
>


Re: [julia-users] Re: Anaconda Python

2015-11-02 Thread Steve Kelly
Conda.jl is a community-contributed package, not part of the core language,
and it is not strictly required. It does make using PyCall significantly
more convenient. Perhaps you could make a Pip.jl package that does something
similar?

On Mon, Nov 2, 2015 at 4:18 PM,  wrote:

> Yes.
>
> The practical problem is out of date packages and subtle incompatibilities
> between "system" installs of Python and the private one.
>
> I now use the "Julia" Python as my 2.7 version and a system Python from
> PSF for 3.5.
>
> Conda is limited to Python 3.4 and matplotlib 1.4.3.
>
> Conda et al will always lag a bit with the risk of more incompatibilities
> creeping in.
>


Re: [julia-users] Travis build failing

2015-10-02 Thread Steve Kelly
Looks like it is reported:

https://github.com/JuliaLang/julia/issues/13399
On Oct 2, 2015 7:07 PM, "Júlio Hoffimann"  wrote:

> Hi,
>
> All my packages are failing the build on Travis/Linux even though I didn't
> touch the code recently. The build on Travis/Mac is working.
>
> For instance: https://github.com/juliohm/GeoStatsImages.jl
>
> Could you please confirm the error is in the Travis/Julia side?
>
> -Júlio
>
>
>


Re: [julia-users] Re: [ANN] FixedSizeArrays

2015-09-05 Thread Steve Kelly
It is 0.4-only, so you can't add it from 0.3.

On Sat, Sep 5, 2015 at 7:06 PM, David P. Sanders 
wrote:

>
>
> El sábado, 5 de septiembre de 2015, 22:09:20 (UTC+1), Simon Danisch
> escribió:
>>
>> Hi everyone,
>>
>> FixedSizeArrays offers
>> an abstract interface to turn arbitrary types into arrays of fixed size
>> with most array functionality defined.
>> The types Point, Vec and Mat are already included per default.
>>
>
> This is great news, congrats!
>
> There doesn't seem to be a version tagged. Should we do Pkg.clone()? Could
> you add this info in the repo?
>
> David.
>
>
>>
>> Advantages are that they're stack allocated, often have less overhead
>> than Julia arrays and you can dispatch on the dimension.
>> Disadvantages are, that they're still immutable (advantage?!), 0.4 only
>> and I still have some problems with compilation times for first calls of
>> the function even with precompilation.
>> Also the constructor code is pretty messy, as it is currently relatively
>> hard to write constructors for abstract types.
>>
>> In the future I want to move this into base, by simply inheriting from
>> AbstractArray, which is currently not possible for immutable arrays.
>> Major blocking issue for this is my restricted time and #11610.
>> Also, we might use Ref{NTuples} for fixed size arrays in the future,
>> which would resolve quite a few issues.
>> It's pretty fast but might get even faster as soon as NTuple gets
>> translated into LLVM's vector type (currently array).
>>
>> Here's a short code sample:
>>
>> immutable RGB{T} <: FixedVectorNoTuple{T, 3}
>>     r::T
>>     g::T
>>     b::T
>> end
>>
>> immutable Vec{N, T} <: FixedVector{N, T} # defined in GeometryTypes.jl
>>     _::NTuple{N, T}
>> end
>>
>> Vec{3, Float32}(0) # constructor with 1 argument already defined
>> rand(Vec{3, Int}) + sin(Vec(0,2,2)) # a lot of array functions are already defined
>> # There is also a matrix type
>> eye(Mat{3,3,Float32}) * rand(Vec{3, Float32}) # will also "just work"
>> a = Vec(1,2,3)[1:2] # returns (1,2)
>>
>>
>> Best,
>> Simon
>>
>


Re: [julia-users] Julia will always be open source

2015-05-09 Thread Steve Kelly
Could Julia Computing be a way to sponsor the core team to do full-time
development? How is it going to work when there are consulting jobs that
bring in revenue, yet take time away from core development?

On Sat, May 9, 2015 at 4:20 PM, Viral Shah vi...@mayin.org wrote:

 Hello all,

 You may have seen today’s Hacker News story about Julia Computing:
 https://news.ycombinator.com/item?id=9516298

 As you all know, we are committed to Julia being high quality and open
 source.

 The existence of Julia Computing was discussed a year ago at JuliaCon
 2014, though we recognize that not everyone is aware. We set up Julia
 Computing to assist those who asked for help building Julia applications
 and deploying Julia in production.  We want Julia to be widely adopted by
 the open source community, for research in academia, and for production
 software in companies.  Julia Computing provides support, consulting, and
 training for customers, in order to help them build and deploy Julia
 applications.

 We are committed to all the three organizations that focus on different
 users and use cases of Julia:

 1. The open source Julia project is housed at the NumFocus Foundation.
 http://numfocus.org/projects/

 2. Research on various aspects of Julia is anchored in Alan’s group at
 MIT. http://www-math.mit.edu/~edelman/research.php

 3. Julia Computing works with customers who are building Julia
 applications. http://www.juliacomputing.com/

 Our customers make Julia Computing self-funded. We are grateful that they
 have created full time opportunities for us to follow our passions. Open
 source development will never cease.

 You may have questions. Please shoot them here. We will respond back with
 a detailed blog post.

 -viral




Re: [julia-users] [ANN] JuliaIO and FileIO

2015-04-04 Thread Steve Kelly
Simon,

I think this is a great idea! However, I am a bit confused about the
separation of interfaces here. Is the idea that FileIO.jl will have
definitions for each package?

I think the power of the Requires.jl package is that packages can define
their own interface. For example, Meshes.jl can have `@require FileIO begin
... end`. I think FileIO is more likely to have a stable API than Meshes,
so it seems better IMO to keep the FileIO method extensions in Meshes. In
the case of Images.jl, putting the FileIO code behind a @require
would eliminate the need for the ImageIO package and make it easier to
update if types change in Images.
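
As a rough sketch of that pattern (assuming the Requires.jl @require macro;
the body is only illustrative):

using Requires

@require FileIO begin
    # This block only runs if/when the user also loads FileIO, so the package
    # doesn't pay FileIO's load time otherwise. FileIO-specific method
    # extensions for the package's types would go here.
    println("FileIO detected; defining load/save methods here")
end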

Best,
Steve

On Sat, Apr 4, 2015 at 5:23 PM, Matt Bauman mbau...@gmail.com wrote:

 Very cool. I've been intersted in doing something like this myself for
 quite some time now. Have you seen
 https://github.com/JuliaLang/julia/issues/7299 ?

 I'll take a look when I get a chance.


Re: [julia-users] Julia on Raspberry Pi 2

2015-02-14 Thread Steve Kelly
Sto,

I got Julia running on a BeagleBone Black running Debian Jessie a couple
months back using this process:
https://github.com/JuliaLang/julia/blob/master/README.arm.md. It depends on
a few system libraries to run, so I needed to update from Wheezy to Jessie
so it would work. I think some improvements have been made since then so
the build is more self contained. I am pretty sure Raspbian is based on
Wheezy, but it might be worth a shot with the latest master.

Best,
Steve

On Sat, Feb 14, 2015 at 3:11 PM, Sto Forest stochastic.for...@gmail.com
wrote:

 Is there a way to get Julia running on the new Raspberry Pi 2, perhaps
 under raspbian ?





[julia-users] ANN: Clipper.jl

2015-02-12 Thread Steve Kelly
Julians,

The past few months we have been working on using Cxx.jl to wrap Clipper (
http://www.angusj.com/delphi/clipper.php ) for polygon and polyline set
operations and offsetting. We have been testing it with our path planning
software (written in Julia) and the results are really exciting. The
library definitely needs some more polish, but it serves our needs. For
example, the quadrotor on our site ( http://www.voxel8.co/ ) was printed
using Julia for the whole pipeline from mesh input to exported Gcode. Cxx
has been an excellent FFI and Julia has helped us move quickly.

We hope that this library will be useful to the community, and we hope to
register it once 0.4 and Cxx become more stable. This is the link to the
package: https://github.com/Voxel8/Clipper.jl

Best,
Steve

P.S. We are a growing company that uses Julia heavily, and likes to
contribute back to the community. If you are interested in working with us,
please contact (j...@voxel8.co) off-list.

P.P.S. I hope job solicitation is okay!


Re: [julia-users] Re: why does in(x,y) not follow the usual semantics of built-ins?

2015-01-11 Thread Steve Kelly
The contains method also has its argument order switched (the collection
comes first, then the item).
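
For illustration (a quick sketch, not from the thread):

in(2, [1, 2, 3])          # true; same as the infix form below
2 in [1, 2, 3]            # true
contains("hello", "ell")  # true; haystack first, then needle
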
On Jan 11, 2015 5:28 PM, Stefan Karpinski stefan.karpin...@gmail.com
wrote:

 We also, when it makes linguistic sense, tend to have the order
 verb(subj, obj). The in operator is a case of this.


  On Jan 11, 2015, at 8:10 PM, Steven G. Johnson stevenj@gmail.com
 wrote:
 
  For any infix operator O, O(x,y) is equivalent to x O y. Since in is
 an infix operator, it follows the same convention.



Re: [julia-users] very rough sketch of future release targets

2014-12-20 Thread Steve Kelly
I think the closest thing is the GitHub milestones:
https://github.com/julialang/julia/milestones


On Sat, Dec 20, 2014 at 7:40 PM, ivo welch ivo...@gmail.com wrote:


 is there a (living) document that sketches rough planned release target
 dates and release features?  note I did not call them schedule.
  obviously, it would and should change a lot over time.  but, presumably,
 it would now have some target release date for 0.40-dev to become 0.40,
 too, for example.  ;-)  and it could list some long-term goals, like focus
 on debugging support, focus on x, etc.  it may already exist, but I may
 have overlooked it.  google release schedule for julialang didn't show
 something useful.




Re: [julia-users] Rust and Julia - Together? Advice

2014-12-17 Thread Steve Kelly
I am currently building a path planner for 3D printers in Julia. We are
also using a ZeroMQ interface to separate the web interface from the path
planner. This is working very well for us now. We will also be using
JuliaBox for packaging our application.

On Wed, Dec 17, 2014 at 8:05 PM, Eric Forgy eric.fo...@gmail.com wrote:

 Thank you Viral and thank you Stefan.

 The Rust solution would be an interesting and longer term effort because
 the person I would rely on to do most of the development is not yet ready
 to take the Rust plunge until it matures/stabilizes a bit more.

 The other two ideas REST and zeromq look interesting and I'd like to learn
 more. I spent what little available time I had yesterday reading about
 message queueing with zeromq and others. That is very interesting stuff. I
 had a quick look at JuliaBox. That is pretty awesome :) I would love to
 understand how it works. Like I said, I am pretty much a blank slate and
 learning everything as I go, so looking at the JuliaBox code, it wasn't
 immediately obvious (.t? .lua? :)). Does there exist a simple document
 describing the process flow of how JuliaBox works?

 In very general terms, could you help sketch out and point me to some
 reading material to help me make progress building a RESTful interface to
 Julia running on a server? Should it be built entirely in Julia? Is it
 already done in JuliaBox or elsewhere? I think initially, I will focus on
 REST. Another platform (OpenGamma) I'm working with also has a RESTful API
 so any effort I make will not be wasted even if I eventually start work
 with zeromq (which I probably will).

 Thank you again.

 Cheers,
 Eric




Re: [julia-users] Re: Displaying a polygon mesh

2014-11-11 Thread Steve Kelly
If you pull from master you should be able to just use threejs in IJulia:
https://baconscript.github.io/Meshes.jl/

On Tue, Nov 11, 2014 at 3:13 PM, Tracy Wadleigh tracy.wadle...@gmail.com
wrote:

 When developing Meshes.jl, I would dump a PLY (which, unlike STL,
 preserves topology) and view it with meshlab
 http://meshlab.sourceforge.net/.



Re: [julia-users] Re: PSA: Choosing between Julia 0.3 vs Julia 0.4

2014-09-26 Thread Steve Kelly
This is how I set up my environment to stay involved:

julia - master
julia3 - release-0.3
julia4on3 - use 0.4 packages on julia3 (this is helpful since I like to
develop in the v0.4 directory)
julia-multi - run something with 0.4 packages on both julia and julia3 (I
normally only use this with 'julia-multi ./test/runtests.jl')

I've put the scripts I use for the last two on GitHub:
https://github.com/sjkelly/julia_scripts

These four commands give me the satisfaction of seeing stuff break, and
they also provide comfort when there are deadlines to meet :P.


On Fri, Sep 26, 2014 at 10:49 AM, Stefan Karpinski 
stefan.karpin...@gmail.com wrote:

 It's a bit odd for there to be simultaneous complaints about 0.4 being
 unstable (ie under rapid development) and not going anywhere. It's been,
 what, 13 years since the plans to release Perl 6 were announced? Seems a
 bit early to worry about that kind of problem a couple of months after the
 last significant release of Julia. If 0.4 isn't out by 2020 we can start to
 worry.


 On Sep 26, 2014, at 10:12 AM, John Myles White johnmyleswh...@gmail.com
 wrote:

 Hans,

 The tone of your e-mail is a little odd in my opinion. It seems to imply
 distrust and even possibly anger for a project that would be substantially
 better served by participating actively in the issue discussions that Tim
 Holy discussed. I don't think anyone who's following 0.4's progress would
 ever believe that 0.4 is not on track.

  -- John

 On Sep 26, 2014, at 3:30 AM, Hans W Borchers hwborch...@gmail.com wrote:

 Ivar,

 thanks for this clarification; I was really under the impression that --
 like
 for Perl and other projects -- I might never ever again hear from a Julia
 0.4
 version.

 A question I asked got buried in another thread and never answered, so I'd
 like
 to repeat it here:

   Will the NEWS.md file immediately document the (disruptive or
 non-disruptive)
   changes? That would be very helpful, even if the change is withdrawn
 later on.
   Also, every NEWS entry could include a date to make it easier to follow
 the
   development.

 By the way, I am a bit worried about some of the names that seem to come
 up in a
 next version of Julia. For example, 'Nullable' or 'NullableArray' sound
 strange
 for me in a technical computing environment.


 On Friday, September 26, 2014 9:19:37 AM UTC+2, Ivar Nesje wrote:

>  I think this is too strong a statement. There is definitely a lot happening
>  on the master (0.4-dev) branch, but it should be quite usable even
>  without reading the majority of GitHub issues. The more users we have, the
>  earlier concerns are raised, and the earlier we can fix them and prepare for
>  the final release. You should definitely avoid master on any project with a
>  deadline, though.






Re: [julia-users] Re: PSA: Choosing between Julia 0.3 vs Julia 0.4

2014-09-25 Thread Steve Kelly
Case in point:
https://github.com/JuliaLang/julia/commit/2ef8d31b6b05ed0a8934c7a13f6490939a30b24b

:)

On Thu, Sep 25, 2014 at 11:46 PM, Isaiah Norton isaiah.nor...@gmail.com
wrote:

 Checking out the release branch is fine; the 0.3.1 tag is on that branch.

 On Thu, Sep 25, 2014 at 11:12 PM, John Myles White 
 johnmyleswh...@gmail.com wrote:

 I think it's more correct to check out tags since there seems to be work
 being done progressively on that branch to keep up with backports.

 Not totally sure, though.

  -- John

 On Sep 25, 2014, at 7:58 PM, David P. Sanders dpsand...@gmail.com
 wrote:



 El jueves, 25 de septiembre de 2014 19:59:41 UTC-5, John Myles White
 escribió:

 I just wanted to suggest that almost everyone on this mailing list
 should be using Julia 0.3, not Julia 0.4. Julia 0.4 changes dramatically
 from day to day and is probably not safe for most use cases.

 I'd suggest the following criterion: are you reading the comment
 threads for the majority of issues being filed on the Julia GitHub repo?
 If the answer is no, you probably should use Julia 0.3.


 Thanks for the nice, clear statement, John!

 Currently I have been using

 git checkout release-0.3

 and compiling from there.

 Is this the correct thing to do?  I notice there is now a v0.3.1 tag.

 David.


  -- John






Re: [julia-users] Re: new REPL

2014-09-22 Thread Steve Kelly
The prompt is defined here:
https://github.com/JuliaLang/julia/blob/2eee58701052caf6b6927604732e197b1054ac7b/base/REPL.jl#L191

I was looking into it so I could number lines for my SaveREPL.jl package,
but it looked like a lot of refactoring.
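
As an untested sketch (assuming the 0.3/0.4 REPL internals at that link),
the prompt string of the main mode can be poked from a running session with
something like:

Base.active_repl.interface.modes[1].prompt = "j~> "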

On Mon, Sep 22, 2014 at 10:10 PM, cdm cdmclean@gmail.com wrote:


 i suspect that there is a way to do this ...

 suppose that one needed to change the
 Julia prompt from

 julia


 to say, something like

 j~


 is this defined in Base, or somewhere else?

 many thanks,

 cdm



Re: [julia-users] Faster than CGAL!

2014-09-17 Thread Steve Kelly
Your analysis was excellent and informative, thanks for sharing! I have
noticed similar performance characteristics with the Greiner-Hormann
clipping algorithm.

On Wed, Sep 17, 2014 at 1:03 PM, Toivo Henningsson toivo@gmail.com
wrote:

 Nice!



[julia-users] C Global Structs

2014-09-15 Thread Steve Kelly
I'd like to see if Julia is running in code-coverage mode or not.

In the REPL I can do the following:

julia> a = cglobal(:(jl_compileropts))
Ptr{Void} @0x7fa3e01bbc90


The struct is defined like so:
https://github.com/JuliaLang/julia/blob/aab2c6e67b5aaee7f23bc5a52897f7219473c153/src/julia.h#L1330-L1338

How can I access the code_coverage member?

Thanks,
Steve


Re: [julia-users] Re: C Global Structs

2014-09-15 Thread Steve Kelly
This works excellent, thank you both!

Do you think it would be useful to have these user-facing in Base?

On Mon, Sep 15, 2014 at 12:46 PM, Jake Bolewski jakebolew...@gmail.com
wrote:

 julia> immutable CompilerOpts
            build_path::Ptr{Cchar}
            code_coverage::Int8
            malloc_log::Int8
            check_bounds::Int8
            dumpbitcode::Int8
            int_literals::Cint
            compile_enabled::Int8
        end

 julia> a = cglobal(:jl_compileropts, CompilerOpts)
 Ptr{CompilerOpts} @0x00010f840a88

 julia> unsafe_load(a)
 CompilerOpts(Ptr{Int8} @0x,0,0,0,0,0,1)

 On Monday, September 15, 2014 12:30:48 PM UTC-4, Steve Kelly wrote:

 I'd like to see if Julia is running in code-coverage mode or not.

 In the REPL I can do the following:

  julia> a = cglobal(:(jl_compileropts))
 Ptr{Void} @0x7fa3e01bbc90


 The struct is defined like so:
 https://github.com/JuliaLang/julia/blob/aab2c6e67b5aaee7f23bc5a52897f7
 219473c153/src/julia.h#L1330-L1338

 How can I access the code_coverage member?

 Thanks,
 Steve




Re: [julia-users] Re: BeagleBone Black

2014-09-11 Thread Steve Kelly
Viral, I started the build on our BBB on master. We are running Debian sid
with the stock BBB kernel. I tried running the build with the stock
arm-make3 branch on Monday and the LLVM build failed. I tried to use the
system LLVM 3.5, but that failed as well.

Are you using Clang or GCC?

On Thu, Sep 11, 2014 at 8:09 AM, Viral Shah vi...@mayin.org wrote:

 Could you try building master on the BBB again? It should hopefully have a
 comparable CPU as the Chromebook. It also helps if it is running Ubuntu, to
 install various dependencies to cut down the build time.

 -viral


 On Thursday, September 4, 2014 11:14:18 AM UTC+5:30, Steve Kelly wrote:

 I can confirm the arm-make3 branch does not build on the BBB. My company
 is interested in getting Julia on ARM. I think the biggest issue is that
 ubiquitous ARM devices are poor for building software quickly. Also with
 Red Hat and possibly Amazon entering the ARM server space this will become
 more feasible in the next year.


 On Thu, Sep 4, 2014 at 1:01 AM, Viral Shah vi...@mayin.org wrote:

 See the arm-make3 branch. I have an ARM chromebook, and it currently
 crashes in building the system image. I haven't been able to look into this
 further. I am guessing we will be ready in the 0.5 timeframe.

 If someone does want to try their hand at the ARM port, I can set up ssh
 on my chromebook.

 -viral


 On Wednesday, September 3, 2014 7:05:35 PM UTC-7, 2Cubed wrote:

 I would love to get Julia and/or IJulia up and running on my BeagleBone
 Black, but it seems that Julia is currently x86-specific.  Any idea when it
 will be available for ARM?





Re: [julia-users] Re: BeagleBone Black

2014-09-11 Thread Steve Kelly
Okay I am on GCC 4.9. I will try to get the build running on Wheezy as well.

On Thu, Sep 11, 2014 at 12:02 PM, Viral Shah vi...@mayin.org wrote:

 I am using GCC 4.8 and building LLVM 3.5 as part of the build.

 -viral
 On 11 Sep 2014 21:21, Steve Kelly kd2...@gmail.com wrote:

 Viral, I started the build on our BBB on master. We are running Debian
 sid with the stock BBB
  kernel. I tried running the build with the stock arm-make3 branch on
 Monday and the LLVM build failed. I tried to use the system 3.5 but that
 failed as well.

 Are you using Clang or GCC?

 On Thu, Sep 11, 2014 at 8:09 AM, Viral Shah vi...@mayin.org wrote:

 Could you try building master on the BBB again? It should hopefully have
 a comparable CPU as the Chromebook. It also helps if it is running Ubuntu,
 to install various dependencies to cut down the build time.

 -viral


 On Thursday, September 4, 2014 11:14:18 AM UTC+5:30, Steve Kelly wrote:

 I can confirm the arm-make3 branch does not build on the BBB. My
 company is interested in getting Julia on ARM. I think the biggest issue is
 that ubiquitous ARM devices are poor for building software quickly. Also
 with Red Hat and possibly Amazon entering the ARM server space this will
 become more feasible in the next year.


 On Thu, Sep 4, 2014 at 1:01 AM, Viral Shah vi...@mayin.org wrote:

 See the arm-make3 branch. I have an ARM chromebook, and it currently
 crashes in building the system image. I haven't been able to look into 
 this
 further. I am guessing we will be ready in the 0.5 timeframe.

 If someone does want to try their hand at the ARM port, I can set up
 ssh on my chromebook.

 -viral


 On Wednesday, September 3, 2014 7:05:35 PM UTC-7, 2Cubed wrote:

 I would love to get Julia and/or IJulia up and running on my
 BeagleBone Black, but it seems that Julia is currently x86-specific.  Any
 idea when it will be available for ARM?






Re: [julia-users] Re: BeagleBone Black

2014-09-03 Thread Steve Kelly
I can confirm the arm-make3 branch does not build on the BBB. My company is
interested in getting Julia on ARM. I think the biggest issue is that
ubiquitous ARM devices are poor for building software quickly. Also with
Red Hat and possibly Amazon entering the ARM server space this will become
more feasible in the next year.


On Thu, Sep 4, 2014 at 1:01 AM, Viral Shah vi...@mayin.org wrote:

 See the arm-make3 branch. I have an ARM chromebook, and it currently
 crashes in building the system image. I haven't been able to look into this
 further. I am guessing we will be ready in the 0.5 timeframe.

 If someone does want to try their hand at the ARM port, I can set up ssh
 on my chromebook.

 -viral


 On Wednesday, September 3, 2014 7:05:35 PM UTC-7, 2Cubed wrote:

 I would love to get Julia and/or IJulia up and running on my BeagleBone
 Black, but it seems that Julia is currently x86-specific.  Any idea when it
 will be available for ARM?




Re: [julia-users] Conditional import within a function

2014-08-22 Thread Steve Kelly
What I've been doing is very similar. I leave the plotting package up to
the user, but they are first required to run either `using PyPlot` or
`using Gadfly` in their script.

I then have a function similar to yours (named plot(x::MyType), to take
advantage of multiple dispatch) that checks for isdefined(:PyPlot) or
isdefined(:Gadfly). If neither is defined I throw an error telling the user
to include a plotting package in their script. This keeps load times quick
if they are not using plotting functionality.
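
Roughly, the pattern looks like this (a sketch; MyType and the plotting
calls are only illustrative):

type MyType
    data::Vector{Float64}
end

function plot(x::MyType)
    if isdefined(Main, :PyPlot)
        Main.PyPlot.plot(x.data)
    elseif isdefined(Main, :Gadfly)
        Main.Gadfly.plot(y=x.data)
    else
        error("load a plotting package first: `using PyPlot` or `using Gadfly`")
    end
end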

To answer your question, conditional import inside a function is not
possible.



On Fri, Aug 22, 2014 at 11:56 AM, Spencer Lyon spencerly...@gmail.com
wrote:

 I am working on a library that defines various types as well as a few
 “helper” functions to plot those types with PyPlot.

 If I do [import|using] PyPlot at the top level of any file in my package,
 PyPlot is loaded when I do [using|import] MyPackage. This makes the
 startup time for my package much much longer.

 What I would like to do is instead of having to load it when my package
 loads, I could load it when someone calls one of the functions that needs
 it.

 Here is an example of what I would like to do:

 function plot_my_type(x::MyType)
 if !isdefined(:PyPlot)
 using PyPlot
 end
 # finish the function by plotting with PyPlot
 end

 I haven’t been able to get a solution that works for this. Does anyone
 know if it is possible?
 ​



Re: [julia-users] Re: Conditional import within a function

2014-08-22 Thread Steve Kelly
Peter Simon, very cool. When I hit this problem I saw it as an
opportunity to make my code more Julian. :P Eval FTW.


On Fri, Aug 22, 2014 at 12:16 PM, Peter Simon psimon0...@gmail.com wrote:

 From https://groups.google.com/d/topic/julia-users/AWCerAdDLQo/discussion
 :

  eval(Expr(:using,:PyPlot))

 can be used inside a conditional.

 --Peter


 On Friday, August 22, 2014 8:56:53 AM UTC-7, Spencer Lyon wrote:

 I am working on a library that defines various types as well as a few
 “helper” functions to plot those types with PyPlot.

 If I do [import|using] PyPlot at the top level of any file in my
 package, PyPlot is loaded when I do [using|import] MyPackage. This makes
 the startup time for my package much much longer.

 What I would like to do is instead of having to load it when my package
 loads, I could load it when someone calls one of the functions that needs
 it.

 Here is an example of what I would like to do:

 function plot_my_type(x::MyType)
 if !isdefined(:PyPlot)
 using PyPlot
 end
 # finish the function by plotting with PyPlot
 end

 I haven’t been able to get a solution that works for this. Does anyone
 know if it is possible?
 ​




Re: [julia-users] Re: Conditional import within a function

2014-08-22 Thread Steve Kelly
Changing plot() to PyPlot.plot() fixes it for me.

type Foo
    x::Int
end

function check_defined_err(x::Symbol)
    if !isdefined(x)
        error("Module $x not defined. run `using $x` to fix the problem")
    end
end

function check_defined_eval(x::Symbol)
    if !isdefined(x)
        eval(Expr(:using, x))
    end
end

function plot_err(x::Foo)
    check_defined_err(:PyPlot)
    d = 1:x.x
    PyPlot.plot(d, d.^2)
end

function plot_eval(x::Foo)
    check_defined_eval(:PyPlot)
    d = 1:x.x
    PyPlot.plot(d, d.^2)
end

a = Foo(10)
try
    plot_err(a)
catch err
    println(err)
end
plot_eval(a)

This also eliminates the warning:

Warning: using PyPlot.plot in module Main conflicts with an existing identifier.

Which makes me think your analysis is correct. When Julia evaluates the
function, it must be binding the name `plot` to something. Maybe somebody
more knowledgeable about the pipeline can chime in. I don't think this is a
scoping issue so much as it is intrinsic to the eval and compilation steps.

On Fri, Aug 22, 2014 at 12:49 PM, Spencer Lyon spencerly...@gmail.com
wrote:

 I’m actually still having issues with both of these options — I’ll try to
 enumerate what I think the problem is here.

- When I do using MyPackage and my code is loaded (including the
plotting routines).
- Then when I call plot(x:MyType) (sorry for the shorthand), the
function is compiled.
- If PyPlot is not defined I try to either define it for them or emit
an error message telling them to define it.
- After PyPlot is defined I try to run the file again, and I get
errors telling me the functions from PyPlot that I use in the
plot(x::MyType) function are not defined.

 I think the reason they are not defined within the function is that it was
 compiled the first time I called it, when the functions weren’t actually
 available.

 Does my analysis seem correct?

 Does anyone know a way to accomplish this?
 --

 I will provide a quick usable example so it is easy for people to
 experiment

 type Foo
     x::Int
 end

 function check_defined_err(x::Symbol)
     if !isdefined(x)
         error("Module $x not defined. run `using $x` to fix the problem")
     end
 end

 function check_defined_eval(x::Symbol)
     if !isdefined(x)
         eval(Expr(:using, x))
     end
 end

 function plot_err(x::Foo)
     check_defined_err(:PyPlot)
     d = 1:x
     plot(d, d.^2)
 end

 function plot_eval(x::Foo)
     check_defined_eval(:PyPlot)
     d = 1:x
     plot(d, d.^2)
 end

 a = Foo(10)
 # plot_err(a)
 # plot_eval(a)

 Then when I have run the code above, I get the following at the console
 (it says string because I :

 julia> plot_err(a)
 ERROR: Module PyPlot not defined. run `using PyPlot` to fix the problem
  in error at /usr/local/julia/usr/lib/julia/sys.dylib
  in check_defined_err at none:7
  in plot_err at string:18

 julia> using PyPlot
 INFO: Loading help data...
 Warning: using PyPlot.plot in module Main conflicts with an existing 
 identifier.

 julia> plot_err(a)
 ERROR: plot not defined
  in plot_err at none:20

 In another session to see plot_eval(a):

 julia> plot_eval(a)
 INFO: Loading help data...
 Warning: using PyPlot.plot in module Main conflicts with an existing 
 identifier.
 ERROR: plot not defined
  in plot_eval at none:4

 Thank you for your help.

 On Friday, August 22, 2014 12:18:14 PM UTC-4, Steve Kelly wrote:

 Peter Simon, very cool. When I ran hit this problem I saw it as an
 opportunity to make my code more Julian. :P Eval FTW.


 On Fri, Aug 22, 2014 at 12:16 PM, Peter Simon psimo...@gmail.com wrote:

 From https://groups.google.com/d/topic/julia-users/
 AWCerAdDLQo/discussion :

  eval(Expr(:using,:PyPlot))

 can be used inside a conditional.

 --Peter


 On Friday, August 22, 2014 8:56:53 AM UTC-7, Spencer Lyon wrote:

 I am working on a library that defines various types as well as a few
 “helper” functions to plot those types with PyPlot.

 If I do [import|using] PyPlot at the top level of any file in my
 package, PyPlot is loaded when I do [using|import] MyPackage. This
 makes the startup time for my package much much longer.

 What I would like to do is instead of having to load it when my package
 loads, I could load it when someone calls one of the functions that needs
 it.

 Here is an example of what I would like to do:

 function plot_my_type(x::MyType)
 if !isdefined(:PyPlot)
 using PyPlot
 end
 # finish the function by plotting with PyPlot
 end

 I haven’t been able to get a solution that works for this. Does anyone
 know if it is possible?
 ​


  ​



Re: [julia-users] Re: Clipping Algorithm

2014-08-14 Thread Steve Kelly
I implemented the Greiner-Hormann algorithm for clipping and the
Hormann-Agathos algorithm for point-in-polygon here:
https://github.com/sjkelly/PolygonClipping.jl
I made a specialized implementation for generating infill for 3D-printing
paths.

I am currently working on merging this package with Geometry2D.jl. It can
also be about 2x faster in the general case and handle more than two polys,
but I need some things in Geometry2D.jl to make that possible.
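
For reference, a generic even-odd (ray-crossing) point-in-polygon test, as a
sketch of the primitive involved (this is not the Hormann-Agathos
implementation from the package):

function inpolygon(p, poly)
    inside = false
    n = length(poly)
    j = n
    for i in 1:n
        xi, yi = poly[i]
        xj, yj = poly[j]
        if ((yi > p[2]) != (yj > p[2])) &&
           (p[1] < (xj - xi) * (p[2] - yi) / (yj - yi) + xi)
            inside = !inside   # the ray from p crossed the edge (j, i)
        end
        j = i
    end
    inside
end

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
inpolygon((0.5, 0.5), square)   # true
inpolygon((2.0, 0.5), square)   # false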


On Thu, Aug 14, 2014 at 10:12 AM, Tim Holy tim.h...@gmail.com wrote:

 On Thursday, August 14, 2014 06:38:29 AM Simon Danisch wrote:
  I'm not talking theoretically about this, but I mention this because this
  is the reason why I can't use two very nice packages (namely Color.jl and
  Meshes.jl), without a lot of unnecessary conversions.

 The latest version of Color has fixed this.

 --Tim




Re: [julia-users] Re: Clipping Algorithm

2014-08-14 Thread Steve Kelly
I'm not talking theoretically about this, but I mention this because this
is the reason why I can't use two very nice packages (namely Color.jl and
Meshes.jl), without a lot of unnecessary conversions.

I've been hacking on Meshes.jl the past few days. I will file an issue for
this and try to fix it ASAP.


On Thu, Aug 14, 2014 at 9:38 AM, Simon Danisch sdani...@gmail.com wrote:

 Please excuse me for my OpenCL lobbying, but I pretty much came to Julia,
 to have these kind of algorithms as fast as possible :)

 These considerations won't necessarily slow down the development of any
 Julia algorithm, as the implementation is rather orthogonal
 - which is why I don't see any harm in getting OpenCL into the discussion.

 I just mentioned, what I will be doing, in the hope, that people who
 develop any geometric algorithm in Julia keep it in mind, while designing
 the library.
 For example, the library should only use parametric types, which would
 enable me to use Float32.
 It's a small thing if you do it from the beginning, but gives you quite a
 headache if you try to integrate it later on.
 This way it becomes a lot simpler, to smuggle in some OpenCL
 implementations( e.g. via multiple dispatch) parallel to the already
 established algorithms written in Julia.
 I'm not talking theoretically about this, but I mention this because this
 is the reason why I can't use two very nice packages (namely Color.jl and
 Meshes.jl), without a lot of unnecessary conversions.

 I'm very well aware, that not everyone wants to put the extra effort into
 developing in OpenCL.
 That's why I wrote: lets start with julia and add OpenCL later




 Am Freitag, 9. Mai 2014 23:13:06 UTC+2 schrieb Steve Kelly:

 I am going to be developing some software for 3D printing. For path
 planning, we will need to use the clipping algorithm.

 Graphics.jl mentions a clip function. https://github.com/JuliaLang/
 julia/blob/master/base/graphics.jl
 Cairo.jl uses the C implementation in Cairo.

 I would like to implement this algorithm natively in Julia. My question
 to the community is whether it be more appropriate to create a new package
 or optionally add the algorithm to Graphics.jl (or another package)?

 Thanks,
 Steve




[julia-users] Type Expr Construction

2014-08-06 Thread Steve Kelly
Julians,

I have a question about the construction of the type Expr.
I would like to create a macro to generate types.

julia> a = :(type FooBar{T<:Number}
           a::T
           b::T
       end)
:(type FooBar{T<:Number} # line 2:
    a::T # line 3:
    b::T
end)

julia> dump(a)
Expr
  head: Symbol type
  args: Array(Any,(3,))
    1: Bool true
    2: Expr
      head: Symbol curly
      args: Array(Any,(2,))
        1: Symbol FooBar
        2: Expr
          head: Symbol <:
          args: Array(Any,(2,))
          typ: Any
      typ: Any
    3: Expr
      head: Symbol block
      args: Array(Any,(4,))
        1: LineNumberNode
          line: Int64 2
        2: Expr
          head: Symbol ::
          args: Array(Any,(2,))
          typ: Any
        3: LineNumberNode
          line: Int64 3
        4: Expr
          head: Symbol ::
          args: Array(Any,(2,))
          typ: Any
      typ: Any
  typ: Any

My first question is: why is args[1] = true?
Why can I not see a::T and b::T, or the T<:Number?

I'm inclined to think it is bad form to generate types in macros. I am
trying to create a package for bounding boxes. I'd like to do something
like
   @boundingbox BoundsXYZ, x, y, z
and get the type and method definitions.

Re: [julia-users] Type Expr Construction

2014-08-06 Thread Steve Kelly
Guys, thanks for the help! The args: Array(Any,(2,)) was very confusing.

Tony, TermWin is a pretty awesome package! It works great on Debian.


On Thu, Aug 7, 2014 at 12:26 AM, Isaiah Norton isaiah.nor...@gmail.com
wrote:

  Probably a bug in dump

 (or maybe just a depth limit?)


 On Thu, Aug 7, 2014 at 12:25 AM, Isaiah Norton isaiah.nor...@gmail.com
 wrote:

 My first question is why is args[1] = true?


 This is the `mutable` flag (`immutable FooBar` will be false).

 Why can I not see a::T and b::T or the T:Number?


 Probably a bug in dump:

 julia> a.args[2].args
 2-element Array{Any,1}:
  :FooBar
  :(T<:Number)

 julia> a.args[3].args
 4-element Array{Any,1}:
  :( # line 2:)
  :(a::T)
  :( # line 3:)
  :(b::T)




 On Thu, Aug 7, 2014 at 12:09 AM, Steve Kelly kd2...@gmail.com wrote:

 Julians,

 I have a question about the construction of the type Expr.
 I would like to create a macro to generate types.

 julia a = :(type FooBar{T:Number}
  a::T
  b::T
 end)
 :(type FooBar{T:Number} # line 2:
 a::T # line 3:
 b::T
 end)

 julia dump(a)
 Expr
   head: Symbol type
   args: Array(Any,(3,))
 1: Bool true
 2: Expr
   head: Symbol curly
   args: Array(Any,(2,))
 1: Symbol FooBar
 2: Expr
   head: Symbol :
   args: Array(Any,(2,))
   typ: Any
   typ: Any
 3: Expr
   head: Symbol block
   args: Array(Any,(4,))
 1: LineNumberNode
   line: Int64 2
 2: Expr
   head: Symbol ::
   args: Array(Any,(2,))
   typ: Any
 3: LineNumberNode
   line: Int64 3
 4: Expr
   head: Symbol ::
   args: Array(Any,(2,))
   typ: Any
   typ: Any
   typ: Any

 My first question is why is args[1] = true?
 Why can I not see a::T and b::T or the T:Number?

 I'm inclined to think this is bad form to generate types in macros. I am
 trying to create a package for bounding boxes. I'd like to do something
 like
@boundingbox BoundsXYZ, x, y, z
 and get the type and method definitions.






Re: [julia-users] Pitching Julia to company. IJulia live slideshow mode?

2014-07-28 Thread Steve Kelly
+1 for what Stefan said. In addition, if it is a small and keen audience,
you can easily add examples when someone asks a question.


On Mon, Jul 28, 2014 at 4:05 PM, Daniel Jones danielcjo...@gmail.com
wrote:


 It's a little hidden, but from the notebook you change the “Cell Toolbar”
 option at the top to “Slideshow”, then you can define slides, etc. To view
 the slideshow run ipython nbconvert --post serve --to=slides
 SomeNotebook.ipynb.

 I've a similar experience as Stefan with this, though. It works alright if
 your slides are static and simple, but it's kind of glitchy overall.


 On Monday, July 28, 2014 12:55:57 PM UTC-7, Jay Kickliter wrote:

 I'm still curious though, I don't see an option for slide mode in my
 IJulia interface. Is it something you enable in the config files? I found
 the following in custom.jl, but don't know what you're supposed to do with
 it:

  *// to load the metadata ui extension to control slideshow mode /
 reveal js for nbconvert
  *$.getScript('/static/js/celltoolbarpresets/slideshow.js');

 On Monday, July 28, 2014 1:40:03 PM UTC-6, Stefan Karpinski wrote:

 My experience has been that the slideshow mode in IJulia is too buggy to
 use for live coding. You can definitely use it to make static slides in
 which code has already been evaluated, but I like to type code in live and
 evaluate it, which did some rather strange things last I tried. I've found
 presenting in the normal IJulia mode to be pretty effective. You just
 scroll instead of clicking through to the next slide.


 On Mon, Jul 28, 2014 at 3:33 PM, Jay Kickliter jay.ki...@gmail.com
 wrote:

 This week I'm giving a presentation on Julia at my company's technology
 review. I'm pretty sure no one here has heard of it. I'd like to do
 something different than powerpoint. Is *live* slideshow mode possible
 with IJulia? Google is failing me, and I don't have much time to get
 prepared (otherwise I'd do more research before posting, sorry).





Re: [julia-users] Static Analysis in Julia talk

2014-07-21 Thread Steve Kelly
This was excellent and informative. Thanks for sharing!


On Mon, Jul 21, 2014 at 6:19 PM, Leah Hanson astriea...@gmail.com wrote:

 I've spoken twice about TypeCheck.jl in the past couple of weeks (at
 JuliaCon and Midwest.io). The Midwest.io talk has been posted:
 https://www.youtube.com/watch?v=mL1Ow03G8tk&feature=youtu.be

 Because Midwest.io is a general programming conference, there's background
 on Julia before we get into the actual analysis. However, since it's the
 second time I gave the talk, it's more polished. But otherwise, it's the
 same content as the one I gave at JuliaCon.

 -- Leah



[julia-users] Pass Keyword Argument Dictionary

2014-06-18 Thread Steve Kelly
I have the following code that takes some keywords and does some
operations. This is for a wrapper for Gcode (talking to 3D printers), so
I'd like to specify an arbitrary axis to control. Below is what I would
like to do.

function print_args(;args...)
    # do some formatting
    str = ""
    for (key, val) in args
        str *= string(key, " ", val, " ")
    end
    println(str)
end

function arg_collector(;args...)
    # do some operation on args
    print_args(args) # always print args
end

arg_collector(x=5, y=6, z=7)


The obvious workaround seems to be to dispatch onto a
print_args(x::Array{Any,1}) method and handle the formatting there. I'm not
inclined to do that since it creates redundant code.

Is there a succinct way to re-pass the keyword arguments to another
function that expects only keyword arguments?
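
To be concrete, I would like the call site to look something like the
following (whether this splat actually forwards them as keywords is exactly
my question):

function arg_collector(;args...)
    # do some operation on args
    print_args(; args...)   # forward x=5, y=6, z=7 as keyword arguments again
end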


Re: [julia-users] Compilation to hardware (ASICs)

2014-05-29 Thread Steve Kelly
I am currently working on a path planner for 3D printing written in
Julia. I also am going to be working on solid modeling with Julia
scripts and functional representation for my undergraduate thesis.

So maybe we can meet half way :)

On Thu, May 29, 2014 at 6:00 PM, Matt Bauman mbau...@gmail.com wrote:
 It seems like there are several groups working on an LLVM IR to FPGA/ASIC
 compiler.  That'd be the way to do it.  Make julia emit the IR, and then
 compile that to your ASIC.

 http://stackoverflow.com/questions/3664692/creating-a-vhdl-backend-for-llvm
 Google search: llvm ir hardware (asic|fpga)

 On Thursday, May 29, 2014 5:26:40 PM UTC-4, John Myles White wrote:

 If someone wrote code to do that, I don't see why it wouldn't be possible.

  -- John

 On May 29, 2014, at 11:44 AM, David Ainish david@gmail.com wrote:

 3D printing is growing at a rapid pace and in a few years it will be
 possible to 3D print our own integrated circuits and microprocessors.

 Would it be possible for Julia in the future to do Hardware compilation
 and 3D print ASICs from our Julia code?






Re: [julia-users] What's with the Nothing type?

2014-05-24 Thread Steve Kelly
I've done some quick experiments with None and Nothing, and I am
seeing nearly identical performance and memory allocation. It seems
'Nothing' is more idiomatic Julia considering the role it plays in
function return.

Another insight I would like to add is the role it plays in type
unions. In the REPL you see

julia> Union(Int64, None)
Int64

julia> Union(Int64, Nothing)
Union(Nothing,Int64)

I am wondering if this is the primary reason Nothing is preferred over None.

Best,
Steve

On Sat, May 24, 2014 at 11:58 AM, Dom Luna dluna...@gmail.com wrote:
 Thanks for all the helpful messages everyone, much appreciated:)


Re: [julia-users] Re: Clipping Algorithm

2014-05-12 Thread Steve Kelly
Andreas,

Thank you for your input. I think a library similar to CGAL would be
of use. Writing it in Julia might be helpful for reference and
community contribution.

Best,
Steve

On Sat, May 10, 2014 at 8:57 AM, Andreas Lobinger lobing...@gmail.com wrote:
 Hello colleague,


 On Friday, May 9, 2014 11:13:06 PM UTC+2, Steve Kelly wrote:

 I am going to be developing some software for 3D printing. For path
 planning, we will need to use the clipping algorithm.

 Graphics.jl mentions a clip function.
 https://github.com/JuliaLang/julia/blob/master/base/graphics.jl
 Cairo.jl uses the C implementation in Cairo.

 i'm not so sure if the call to cairo's clip - which is just a hint to the
 internal polygon/tesselation units where to put paint and where not - is the
 thing you need (especially as cairo is 2D only). So you probably need to
 implement a 3D clipping anyway


 I would like to implement this algorithm natively in Julia. My question to
 the community is whether it be more appropriate to create a new package or
 optionally add the algorithm to Graphics.jl (or another package)?


 I think Julia would benefit from something like a computational geometry pkg
 that deals with the basic problems of point/line/polygon/meshes and
 inside/outside testing, clipping, intersection.

 When i started looking at julia i was missing a inpolygon (matlab basic
 command) counterpart. It wasn't there, i implemented a local julia solution,
 but never found a place to contribute.

 Wishing a happy day,
 Andreas