Simon,

Thanks again for the input. I searched the "julia stats" forum but there 
did not seem to be any posts regarding spatial econometric models.

I might post the revised code there to see if there is any mutual interest. 
I have a few more items to code and the final piece will be a "pretty 
printing" function somewhat akin to the output of R, etc.

Thanks,
Don

On Wednesday, July 9, 2014 5:23:19 AM UTC-4, Simon Byrne wrote:
>
> Hi Don,
>
> On Tuesday, 8 July 2014 20:51:42 UTC+1, Donald Lacombe wrote:
>>
>> 2) When I use A = sparse(In - rho*W) I get the following warning from 
>> Julia and a break:
>>
>> julia> sar_gibbs
>>
>> LoadError("C:/Users/Don/Desktop/Julia Materials/SAR 
>> Gibbs/sar_gibbs.jl",13,ErrorException("The inverse of a sparse matrix can 
>> often be dense and can cause the computer to run out of memory. If you are 
>> sure you have enough memory, please convert your matrix to a dense matrix."))
>>
>>
>> Apparently, it does not like it when I invert the A matrix.
>>
>>
> For reasons of performance and numerical accuracy, you should generally 
> try to avoid explicitly constructing matrix inverses where possible: 
> instead of writing inv(A) * y, you should use the left division operator A 
> \ y.
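>
> For example, a minimal sketch (assuming A and y are the square matrix and 
> vector from your code):
>
>     x1 = inv(A) * y   # forms the dense inverse explicitly -- avoid this
>     x2 = A \ y        # solves the same system via a factorization, no explicit inverse
>
> Both give the same solution, but the backslash form is faster, more 
> accurate, and never materialises a dense inverse.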
>
>> 3) Can I wrap the code with a "blank" function call like so: function 
>> sar_gibbs()? 
>>
>>
> Yes, that's what I meant.
>  
>
>> 4) Are you indicating that I can include the knn_weights_matrix code in the 
>> same file?
>>
>>
> Yes: obviously this is up to you, but for short functions like this I find 
> it much easier to keep track of everything in the same file.
>  
>
>> Once again, thank you for the response. My goal is to produce a suite of 
>> Bayesian spatial econometric models for public consumption and your help is 
>> greatly appreciated.
>>
>>
> There are quite a few people interested in similar models: you might also 
> be interested in subscribing to the julia-stats mailing list.
>
> Good luck,
>
> Simon
>
>  
>
>> On Tuesday, July 8, 2014 5:06:14 AM UTC-4, Simon Byrne wrote:
>>>
>>> Hi Don,
>>>
>>> I think the reason they're not sparse is that in the line
>>>     A = (In - rho*W);
>>> the matrix In is not sparse: if you replace the line
>>>     In = eye(n);
>>> by 
>>>     In = speye(n);
>>> the result should then be sparse. However at the moment there doesn't 
>>> seem to be a sparse det method (I just filed issue #7543 
>>> <https://github.com/JuliaLang/julia/issues/7543>): you can get around 
>>> this by using log(det(lufact(A))) instead of log(det(A)).
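>>>
>>> Putting those two changes together, a minimal sketch (assuming n, rho, and 
>>> the sparse weight matrix W are already defined as in your code):
>>>
>>>     In = speye(n)              # sparse identity, so A = In - rho*W stays sparse
>>>     A  = In - rho*W
>>>     ld = log(det(lufact(A)))   # log-determinant via the sparse LU factorization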
>>>
>>> You might also want to have a quick look at the performance tips 
>>> <http://docs.julialang.org/en/latest/manual/performance-tips/>, in 
>>> particular:
>>>
>>> * You should use randn() instead of randn(1): randn() samples a single 
>>> Float64 scalar, whereas randn(1) samples a vector of length 1. On my machine 
>>> (using the 0.3 pre-release) your code doesn't work, as this promotes some 
>>> vectors to matrices, which doesn't work with dot. You also then won't need 
>>> to index variables like rho_proposed[1].
>>> * Try wrapping everything in a function: this makes it easier for the 
>>> JIT compiler to work its magic, as well as making it easier to use 
>>> profiling tools (a rough sketch follows after this list).
>>> * If you're using dense matrices, you can use logdet(A) instead of 
>>> log(det(A)): this can avoid some underflow and overflow problems 
>>> (unfortunately this doesn't work for sparse lu factorizations yet, see the 
>>> issue mentioned above).
>>> * Purely from a style point of view, there's no need to keep functions 
>>> in separate files, or end lines with semi-colons.
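>>>
>>> As a rough sketch combining the first two tips with the sparse 
>>> log-determinant workaround above (the function name sar_rho_step is just 
>>> for illustration; y, x, beta, sigma, zta, and the sparse weight matrix W 
>>> are assumed to be defined as in your code):
>>>
>>>     function sar_rho_step(y, x, beta, sigma, W, rho, zta)
>>>         n  = size(W, 1)
>>>         In = speye(n)
>>>         A  = In - rho*W
>>>         r  = A*y - x*beta
>>>         denominator = log(det(lufact(A))) - 0.5*dot(r, r)/sigma^2
>>>         rho_proposed = rho + zta*randn()    # randn() returns a Float64 scalar
>>>         while !(-1 < rho_proposed < 1)      # re-propose until rho is in (-1, 1)
>>>             rho_proposed = rho + zta*randn()
>>>         end
>>>         B = In - rho_proposed*W
>>>         r = B*y - x*beta
>>>         numerator = log(det(lufact(B))) - 0.5*dot(r, r)/sigma^2
>>>         if rand() < min(1, exp(numerator - denominator))  # Metropolis accept step
>>>             rho = rho_proposed
>>>         end
>>>         return rho
>>>     end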
>>>
>>> Simon
>>>
>>>
>>>
>>> On Tuesday, 8 July 2014 01:48:56 UTC+1, Donald Lacombe wrote:
>>>>
>>>> Dear Julia Users,
>>>>
>>>> I am currently developing/converting several Bayesian spatial 
>>>> econometric models from MATLAB into Julia. I have successfully coded the 
>>>> spatial autoregressive model (SAR) with diffuse priors in Julia but have a 
>>>> question regarding the use of sparse matrix algorithms. The models use a 
>>>> spatial weight matrix that is usually sparse and that appears in many of 
>>>> the expressions, especially in the random walk Metropolis algorithm used 
>>>> to obtain the marginal distribution of the spatial autocorrelation 
>>>> parameter rho. Here is a code snippet; the complete code is attached:
>>>>
>>>> # Draw for rho (random walk Metropolis step)
>>>>     A = (In - rho*W);
>>>>     denominator = log(det(A))-.5*sigma^-2*dot(A*y-x*beta,A*y-x*beta);
>>>>     accept = 0;
>>>>     rho_proposed = rho + zta*randn(1);
>>>>     # re-propose until rho_proposed lies in (-1, 1)
>>>>     while accept == 0
>>>>         if ((rho_proposed[1] > -1) & (rho_proposed[1] < 1));
>>>>             accept = 1;
>>>>         else
>>>>             rho_proposed = rho + zta*randn(1);
>>>>         end
>>>>     end
>>>>     B = (In - rho_proposed[1]*W);
>>>>     numerator = log(det(B))-.5*sigma^-2*dot(B*y-x*beta,B*y-x*beta);
>>>>     # accept or reject the proposed rho
>>>>     u = rand(1);
>>>>     if ((numerator[1] - denominator[1]) > exp(1))
>>>>         pp = 1;
>>>>     else
>>>>         ratio = exp(numerator[1] - denominator[1]);
>>>>         pp = min(1,ratio);
>>>>     end
>>>>     if (u[1] < pp[1])
>>>>         rho = rho_proposed[1];
>>>>         acc = acc + 1;
>>>>     end
>>>>     # tune the step size zta toward a 40-60% acceptance rate
>>>>     ar = acc/gibbs;
>>>>     if ar < 0.4
>>>>         zta = zta/1.1;
>>>>     end
>>>>     if ar > 0.6
>>>>         zta = zta*1.1;
>>>>     end
>>>>     A = (In - rho*W);
>>>>
>>>>
>>>> Basically, I'm wondering if there is any way to make the "A" and "B" 
>>>> matrices sparse and possibly make the code run faster, especially the 
>>>> log(det(A)) terms. Currently, 110 draws (with 10 burn-in) take approximately 
>>>> 8 seconds on my 64-bit Core i7 laptop. Computation slows down as the sample 
>>>> size n grows because the weight matrices are treated as full (dense).
>>>>
>>>>
>>>> Any help would be greatly appreciated. If anyone is interested in 
>>>> running the code, the Distributions and Distance packages must be 
>>>> installed and loaded first.
>>>>
>>>>
>>>> Regards,
>>>>
>>>> Don
>>>>
