I can't make roots fail with the example you gave.

You're right that there aren't yet eigfact methods for the Hessenberg type
and that the LAPACK routines are not wrapped. The latter wouldn't be
difficult, but we need to think about the Hessenberg type. Right now it is
a Factorization and it stores the elementary reflectors for the
transformation to Hessenberg form. We might want a
Hessenberg<:AbstractMatrix, similar to e.g. Triangular.
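
Here is a rough sketch of the kind of wrapper I mean (the HessMatrix name
and the storage layout are only placeholders for illustration, nothing
settled):

    import Base: size, getindex

    # View a full matrix as upper Hessenberg, analogous to Triangular:
    # entries below the first subdiagonal are simply ignored.
    immutable HessMatrix{T,S<:AbstractMatrix} <: AbstractMatrix{T}
        data::S
    end
    HessMatrix(A::AbstractMatrix) = HessMatrix{eltype(A),typeof(A)}(A)

    size(H::HessMatrix) = size(H.data)
    getindex(H::HessMatrix, i::Integer, j::Integer) =
        j >= i - 1 ? H.data[i,j] : zero(eltype(H.data))

eigvals/eigfact methods for such a type could then dispatch directly to a
wrapped {sdcz}hseqr instead of going through geevx.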


2014-06-17 16:46 GMT+02:00 Tony Kelman <t...@kelman.net>:

> The implementation in https://github.com/Keno/Polynomials.jl looks a bit
> better-scaled (not taking reciprocals of the eigenvalues of the companion
> matrix), though it still might be better off on the original Wilkinson
> example if the companion matrix were transposed.
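>
> Roughly the kind of construction I mean, sketched for a monic polynomial
> p(x) = c[1] + c[2]*x + ... + c[n]*x^(n-1) + x^n (my own illustration, not
> the package's actual code):
>
>     function roots_companion(c::Vector{Float64})
>         n = length(c)
>         A = zeros(n, n)         # upper Hessenberg companion matrix
>         for i = 1:n-1
>             A[i+1, i] = 1.0     # ones on the subdiagonal
>         end
>         A[:, n] = -c            # coefficients go straight into the last
>                                 # column; no reciprocals taken anywhere
>         eigvals(A)              # eigvals(A') is the transposed variant,
>     end                         # identical in exact arithmetic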
>
> Doesn't look like Julia has a nicely usable Hessenberg type yet (
> https://github.com/JuliaLang/julia/issues/6434 - there is hessfact and a
> Hessenberg factorization object, but those don't look designed to be
> user-constructed), and I don't see any sign of ccalls into the Hessenberg
> routine {sdcz}hseqr either.
>
>
> On Tuesday, June 17, 2014 7:08:11 AM UTC-7, Alan Edelman wrote:
>>
>> I just tried roots in the Polynomial package
>>
>> here's what happened
>>
>> @time roots(Poly([randn(100)]));
>>
>> LAPACKException(99)
>> while loading In[10], in expression starting on line 44
>>  in geevx! at linalg/lapack.jl:1225
>>  in eigfact! at linalg/factorization.jl:531
>>  in eigfact at linalg/factorization.jl:554
>>  in roots at /Users/julia/.julia/v0.3/Polynomial/src/Polynomial.jl:358
>>
>>
>> my first question would be: why are we calling geevx for a matrix known
>> to be Hessenberg?
>>
>>
>> I'd be happy to have a time comparable to MATLAB's, though I'm sure there
>> are faster algorithms out there as well.
>>
>> On Friday, May 9, 2014 11:21:11 PM UTC-4, Tony Kelman wrote:
>>>
>>> By default GitHub doesn't enable issue tracking in forked repositories;
>>> the person who makes the fork has to enable it manually under Settings.
>>>
>>>
>>> On Friday, May 9, 2014 9:39:56 AM UTC-7, Hans W Borchers wrote:
>>>>
>>>> @Jameson
>>>> I am writing a small report on scientific programming with Julia. I have
>>>> changed the section on polynomials so that it is now based on the newer(?)
>>>> Polynomials.jl. This works quite well, and roots() computes the zeros of
>>>> the Wilkinson polynomial to quite satisfying accuracy.
>>>>
>>>> It's a bit irritating that the README file still documents the old
>>>> coefficient order while the code already stores the coefficients in
>>>> increasing order of exponent. I see there is a pull request for an
>>>> updated README, but it is almost 4 weeks old.
>>>>
>>>> Testing one of my examples,
>>>>
>>>>     julia> using Polynomials
>>>>
>>>>     julia> p4 = poly([1.0, 1im, -1.0, -1im])
>>>>     Poly(--1.0 + 1.0x^4)
>>>>
>>>>
>>>> This appears to indicate a bug in printing the polynomial. The stored
>>>> coefficient really is -1.0, as can be seen from
>>>>
>>>>     julia> p4[0]
>>>>     -1.0 + 0.0im
>>>>
>>>>
>>>> I wanted to report that as an issue on the project page, but I did not
>>>> find a button for starting the issue tracker. Does this mean the
>>>> Polynomial.jl project is still 'private' in some sense?
>>>>
>>>> I know there have been long discussions about which order is right for
>>>> the coefficients of a polynomial. But it makes me uneasy that the order
>>>> used in MATLAB and other numerical computing systems has been changed so
>>>> drastically. Well, we have to live with it.
>>>>
>>>>
>>>> On Friday, May 9, 2014 7:53:30 AM UTC+2, Hans W Borchers wrote:
>>>>>
>>>>> Thanks a lot. Just a few minutes ago I saw here on the list an
>>>>> announcement of the "Least-squares curve fitting package" with poly_fit,
>>>>> among others. I think this is good enough for me at the moment.
>>>>>
>>>>> I will come back to your suggestion concerning polynomials when I have a
>>>>> better command of the type system. For polynomials there is surprisingly
>>>>> much more interesting functionality than is usually implemented.
>>>>>
>>>>>
>>>>> On Friday, May 9, 2014 6:30:06 AM UTC+2, Jameson wrote:
>>>>>>
>>>>>> As the author of Polynomial.jl, I'll say that being "a bit
>>>>>> unsatisfied" is a good reason to make pull requests for any and all
>>>>>> improvements :)
>>>>>>
>>>>>> While loladiro is now the official maintainer of Polynomials.jl (since
>>>>>> he volunteered to do the badly-needed work to switch the coefficient
>>>>>> order), if I had access, I would accept a pull request for additional
>>>>>> roots() methods (parameterized by an enum type, for overloading, and
>>>>>> possibly also a realroots function), Horner method functions, polyfit,
>>>>>> etc.
>>>>>>
>>>>>> I would not accept a pull request for allowing a vector instead of a
>>>>>> Polynomial in any method, however. IMHO, this is a completely
>>>>>> unnecessary "optimization", which encourages the user to conflate the
>>>>>> concept of a Vector and a Polynomial without benefit. It could even
>>>>>> potentially lead to subtle bugs (since indexing a polynomial is
>>>>>> different from indexing a vector), or passing in the roots instead of
>>>>>> the polynomial.
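>>>>>>
>>>>>> (A small illustration of the indexing difference, using the degree-based
>>>>>> indexing Poly already has:
>>>>>>
>>>>>>     julia> using Polynomials
>>>>>>
>>>>>>     julia> c = [1.0, 2.0, 3.0];    # as a Vector, c[1] is 1.0
>>>>>>
>>>>>>     julia> p = Poly(c); p[0]       # as a Poly, the index is the exponent
>>>>>>     1.0
>>>>>>
>>>>>> so code that treats one like the other silently picks up the wrong
>>>>>> coefficient.)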
>>>>>>
>>>>>> I think merging your proposal for a polyfit function with
>>>>>> StatsBase.fit makes sense. You could use a tuple parameter to combine
>>>>>> the Polynomial parameter with the degree information:
>>>>>>
>>>>>> function fit(T::(Type{Polynomial},Int), data)
>>>>>>     P, deg = T
>>>>>>     # where pfit represents the calculation of the polynomial of best
>>>>>>     # fit, and may or may not be a separate function
>>>>>>     return Poly( pfit(deg, data) )
>>>>>> end
>>>>>> fit((Polynomial,3), data)
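>>>>>>
>>>>>> For concreteness, one way the hypothetical pfit could look (a plain
>>>>>> Vandermonde least-squares fit; the name, the (deg, data) argument
>>>>>> order, and data being a vector of (x, y) pairs are just assumptions
>>>>>> carried over from the sketch above):
>>>>>>
>>>>>>     function pfit(deg::Int, data)
>>>>>>         x = Float64[d[1] for d in data]
>>>>>>         y = Float64[d[2] for d in data]
>>>>>>         V = [xi^j for xi in x, j in 0:deg]  # Vandermonde matrix
>>>>>>         V \ y       # least-squares coefficients, lowest degree first
>>>>>>     end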
>>>>>>
>>>>>> David de Laat put together a pull request to add his content to
>>>>>> Polynomial: https://github.com/vtjnash/Polynomial.jl/pull/25. He also
>>>>>> indicated he would update it for Polynomials.jl so that it could be
>>>>>> merged.
>>>>>>
>>>>>>


-- 
Best regards

Andreas Noack Jensen
