Hey Heiko,

Spark 0.9 introduced a common config class (SparkConf) for Spark
applications. It also (initially) supported loading config files in the
nested Typesafe format, but this was removed at the last minute due to a
bug. In 1.0 we'll probably add support for config files, though it may
not support Typesafe's tree-style config files, because that format
conflicts with the naming style of several Spark options (we have
options where both x.y and x.y.z are valid keys, and the Typesafe parser
doesn't allow that).
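
To make the conflict concrete, here's a minimal sketch. It assumes the
Typesafe Config library (com.typesafe.config) and uses the existing
spark.speculation / spark.speculation.quantile options as the example
key pair; the object name is just for illustration, and the SparkConf
calls mirror the flat style the new config class takes.

  import com.typesafe.config.ConfigFactory
  import org.apache.spark.SparkConf

  object ConfigStyleSketch extends App {
    // Flat Spark style: a key and a "child" of that key can coexist.
    val sparkConf = new SparkConf()
      .set("spark.speculation", "true")
      .set("spark.speculation.quantile", "0.75")

    // Tree-style HOCON: the same pair can't both survive, because a
    // path can't be both a leaf value and an object. Under HOCON's
    // duplicate-key rules the later (object) form wins here.
    val hocon = ConfigFactory.parseString(
      """
        |spark.speculation = true
        |spark.speculation.quantile = 0.75
      """.stripMargin)

    println(hocon.getDouble("spark.speculation.quantile")) // 0.75
    println(hocon.getBoolean("spark.speculation"))         // throws WrongType
  }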

- Patrick

On Mon, Jan 27, 2014 at 8:59 AM, Heiko Braun <ike.br...@googlemail.com> wrote:
> Thanks. I found the discussion myself ;)
>
> /heiko
>
>> Am 27.01.2014 um 17:34 schrieb Mark Hamstra <m...@clearstorydata.com>:
>>
>> And it would be more helpful if I gave you a usable link:
>> http://apache-spark-developers-list.1001551.n3.nabble.com/Config-properties-broken-in-master-td208.html
>>
>> Sent from my iPhone
>>
>>> On Jan 27, 2014, at 8:13 AM, Heiko Braun <ike.br...@googlemail.com> wrote:
>>>
>>> Thanks Mark.
>>>
>>>> On 27 Jan 2014, at 17:05, Mark Hamstra <m...@clearstorydata.com> wrote:
>>>>
>>>> Been done and undone, and will probably be redone for 1.0.  See
>>>> https://mail.google.com/mail/ca/u/0/#search/config/143a6c39e3995882
>>>>
>>>>
>>>> On Mon, Jan 27, 2014 at 7:58 AM, Heiko Braun 
>>>> <ike.br...@googlemail.com>wrote:
>>>>
>>>>>
>>>>> Is there any interest in moving to a more structured approach for
>>>>> configuring Spark components, i.e. moving to the Typesafe Config [1]?
>>>>> Since Spark already leverages Akka, this seems like a reasonable
>>>>> choice, IMO.
>>>>>
>>>>> [1] https://github.com/typesafehub/config
>>>>>
>>>>> Regards, Heiko
>>>
