Hey Nils,

That is very interesting. I hadn't noticed any cases where `quadgk` is faster.
I'm not sure I can give a great answer here without seeing your code and really digging into the `quadgk` implementation. I can say, however, that [`do_quad` is a very simple function](https://github.com/QuantEcon/QuantEcon.jl/blob/master/src/quad.jl#L602): it simply evaluates the function at the nodes and computes the dot product between that result and the weights. You should get consistent timing results from `do_quad` for an interval of any length, given a fixed number of nodes.

One more comment: the example with integrating over cos used 65 nodes. This is much more than needed for "reasonable" accuracy:

```
julia> nodes, weights = qnwlege(65, -2pi, 2pi);

julia> integral = do_quad(x -> cos(x), nodes, weights)
-2.9819896552041314e-15

julia> nodes, weights = qnwlege(21, -2pi, 2pi);

julia> do_quad(x -> cos(x), nodes, weights)
-1.887379141862766e-14
```

As the computational complexity of the function being integrated goes up, using fewer nodes will likely be a good tradeoff between accuracy and efficiency. In [this example](https://github.com/QuantEcon/QuantEcon.jl/commit/f89abfed0bb4f478cb131c2055916ea6c527c29c) I switched from `quadgk` to computing the nodes/weights once and then calling `do_quad`, and achieved a 115x speedup in the total time for the `compute_lt_price` function to converge.

If you would be willing to share your code I might be able to give better advice.

On Wednesday, October 8, 2014 6:26:38 AM UTC-4, Nils Gudat wrote:
>
> Hi Spencer,
>
> Can I just hop in with a question on your integration/quadrature routine?
> I'm working on a value function iteration in a fairly large state space
> (think of the order of 500,000 to 1m grid points per period) and hence have
> to calculate a lot of expected values. Naturally, I tried out the
> QuantEcon package right away after seeing this thread here and I'm a bit
> puzzled by its performance.
> When timing the inner part of my loop, in which
> the quadrature is used (and around 400 integrations are performed), using
> `quadgk` takes between 3 and 4 seconds to do the job, while using `qnwlege`
> and `do_quad` clocks in at 18 to 20 seconds. However, for some iterations,
> the time used with `quadgk` explodes to 350 to 450 seconds.
>
> I checked the examples on the quant-econ website and tried to reproduce
> the integration of the cos function found here
> <http://quant-econ.net/jl/julia_libraries.html#optimization-roots-and-fixed-points>.
>
> While I can reproduce the result on the website, the relative advantage of
> `do_quad` seems to vanish when I narrow the interval, e.g. when integrating
> from pi/2 to pi, `quadgk` seems to be an order of magnitude faster than
> `do_quad`, especially when timing both steps for `do_quad` (i.e. including
> the calculation of nodes and weights).
>
> Do you have a general idea about when `do_quad` outperforms `quadgk` and
> vice versa? Ideally, I'd like to write my code in a way that chooses the
> better option in each iteration, but at the moment I'm stumped as to how
> to figure out which situation suits which routine.
>
> Thanks!
>
> On Saturday, September 20, 2014 1:06:41 AM UTC+1, Spencer Lyon wrote:
>>
>> Hi David,
>>
>> Thanks for the questions, I'll respond in chunks.
>>
>> Any experiences/opinions/pitfalls your group discovered on the Python vs
>> Julia question?
>>
>> I personally love both languages and use both regularly. I find myself
>> reaching for Julia for most things right now — I love the more advanced
>> features like metaprogramming and parallel processing that are either
>> non-existent or not as well integrated into the Python language itself (I
>> know there are many packages that provide similar features for Python, but
>> they don't feel as natural as they do in Julia).
>>
>> Did you find one is faster than the other?
>>
>> As you might expect, we found that Julia was faster than Python for most
>> things. This is probably an artifact of the type of algorithms we used.
>> Often we would define an operator that loops over a grid, and then iterate
>> on that operator until we find a fixed point. Because of this looping,
>> Julia has a predictable speed advantage.
>>
>> Were there major areas where Julia lagged behind Python?
>>
>> I can think of two places where performance in Julia wasn't as good as
>> performance in Python:
>>
>> 1. We often need to do quadrature in the innermost part of our loops
>> (to approximate expected values) and we found that Julia's quadgk
>> routine was much slower than scipy.integrate.fixed_quad. This is *not* a
>> fair comparison because the algorithms employed by the two functions are
>> very different. Specifically, quadgk uses an adaptive algorithm while
>> fixed_quad just uses standard Gaussian quadrature. To get around this
>> we actually implemented a whole suite of quadrature routines in the file
>> quad.jl
>> <https://github.com/QuantEcon/QuantEcon.jl/blob/master/src/quad.jl>.
>> After switching the Julia code from using quadgk to our own internal
>> quadrature methods, we got approximately the same solutions, but the code
>> was between 35-115x faster.
>> 2. Linear interpolation. The function numpy.interp does simple
>> piecewise linear interpolation. We ended up using the CoordInterpGrid
>> type from Grid.jl to accomplish this. As of right now the
>> interpolation steps are the biggest bottleneck in most of the functions
>> we have.
>>
>> We also didn't really find that Julia was lagging behind Python in terms
>> of libraries that we needed. All the functionality we needed was already
>> available to us. We are using PyPlot.jl to generate graphics, so I guess
>> some of the Julia code is dependent on Python. Come to think of it, the
>> one thing we would like to have that we haven't been able to find is
>> arbitrary precision linear algebra.
>> In Python this can be achieved through
>> SymPy's wrapping of mpmath, but as far as I know we don't yet have
>> arbitrary precision linear algebra in Julia.
>>
>> On Friday, September 19, 2014 12:40:01 PM UTC-4, David Anthoff wrote:
>>>
>>> This is fantastic!
>>>
>>> Any experiences/opinions/pitfalls your group discovered on the Python vs
>>> Julia question? I guess you pretty much implemented the same algorithms in
>>> both languages. Did you find one is faster than the other? Were there major
>>> areas where Julia lagged behind Python?
>>>
>>> Thanks,
>>>
>>> David
>>>
>>> *From:* julia...@googlegroups.com [mailto:julia...@googlegroups.com] *On
>>> Behalf Of *Spencer Lyon
>>> *Sent:* Thursday, September 18, 2014 7:14 PM
>>> *To:* julia...@googlegroups.com
>>> *Subject:* [julia-users] ANN: QuantEcon.jl
>>>
>>> New package QuantEcon.jl <https://github.com/QuantEcon/QuantEcon.jl>.
>>>
>>> This package collects code for quantitative economic modeling. It is
>>> currently comprised of two main parts:
>>>
>>> 1. A toolbox of routines useful when doing economics
>>>
>>> 2. Implementations of types and solution methods for common
>>> economic models.
>>>
>>> This library has a Python twin: QuantEcon.py
>>> <https://github.com/QuantEcon/QuantEcon.py>. The same development team
>>> is working on both projects, so we hope to keep the two libraries in sync
>>> very closely as new functionality is added.
>>>
>>> The library contains all the code necessary to do the computations found
>>> on http://quant-econ.net/, a website dedicated to providing lectures
>>> that teach economics and programming. The website currently (as of 9/18/14)
>>> has only a Python version, but the Julia version is in late stages of
>>> refinement and should be live very soon (hopefully within a week).
>>>
>>> The initial version of the website will feature 6 lectures dedicated to
>>> helping a new user set up a working Julia environment and learn the basics
>>> of the language. In addition to this language-specific section, the website
>>> will include 22 other lectures on topics including
>>>
>>> · statistics: Markov processes (continuous and discrete state),
>>> auto-regressive processes, the Kalman filter, covariance stationary
>>> processes, etc.
>>>
>>> · economic models: the income fluctuation problem, an asset
>>> pricing model, the classic optimal growth model, optimal (Ramsey)
>>> taxation, the McCall search model
>>>
>>> · dynamic programming: shortest path, as well as recursive
>>> solutions to economic models
>>>
>>> All the lectures have code examples in Julia and most of the 22 will
>>> display code from the QuantEcon.jl library.
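For readers following the thread: the pattern Spencer describes (compute fixed Gauss-Legendre nodes and weights once, then reduce each integral to a dot product) can be sketched in plain Julia without QuantEcon.jl. This is only an illustrative sketch; the function names `gauss_legendre` and `fixed_quad` below are my own, not the library's API (the library uses `qnwlege` and `do_quad`, which may differ in details):

```julia
using LinearAlgebra

# Golub-Welsch: nodes/weights for n-point Gauss-Legendre quadrature on [a, b],
# obtained from the eigendecomposition of the Jacobi matrix for Legendre polynomials.
function gauss_legendre(n, a, b)
    β = [k / sqrt(4k^2 - 1) for k in 1:n-1]   # off-diagonal of the Jacobi matrix
    F = eigen(SymTridiagonal(zeros(n), β))
    x = F.values                              # nodes on [-1, 1]
    w = 2 .* F.vectors[1, :] .^ 2             # weights on [-1, 1]
    # affine map from [-1, 1] to [a, b]
    (b - a) / 2 .* x .+ (a + b) / 2, (b - a) / 2 .* w
end

# The "do_quad" step is then just a weighted sum of function evaluations,
# so its cost is independent of the interval length for a fixed node count.
fixed_quad(f, nodes, weights) = dot(f.(nodes), weights)

nodes, weights = gauss_legendre(21, -2pi, 2pi)
println(fixed_quad(cos, nodes, weights))   # ≈ 0, matching the session above
```

Because `gauss_legendre` is the expensive step and `fixed_quad` is a single dot product, hoisting the node/weight computation out of an inner loop is exactly where the large speedups reported above come from.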