Thanks,
     Yes, I'm using a custom hadoop cluster. But I also want to use a custom 
hadoop configuration path instead of ./fe/src/test/resources.
     So I changed the HADOOP_CONF_DIR variable in ./impala-config.sh, but it 
does not take effect. Is there any other way?
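For reference, the change I made was roughly the following (the path below is 
just an example, not my real layout):

```shell
# In ./bin/impala-config.sh: point HADOOP_CONF_DIR at a custom hadoop
# configuration directory instead of ./fe/src/test/resources.
# /opt/hadoop/etc/hadoop is a placeholder path.
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
```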







At 2017-07-26 07:48:10, "黄权隆" <huang_quanl...@126.com> wrote:
>Hi sky, 
>
>
>Do you want to use a customized hadoop cluster instead of the mini cluster? For 
>example, testing the latest version of Impala on top of your existing Hive cluster.
>If so, you can modify the configuration files in ./fe/src/test/resources. 
>They're just symbolic links. You can re-link them to your hadoop configuration 
>files. Then the impala cluster will use your hadoop cluster.
>
>At 2017-07-18 18:14:32, "sky" <x_h...@163.com> wrote:
>>     After I changed the impala-config.sh file, the HADOOP_CONF_DIR variable 
>> did not take effect. This is still the case after running 
>> ./bin/create-test-configuration.sh.
>>It seems that this variable does not take effect: the configuration is only 
>>recognized when I put the hadoop configuration files into the same directory 
>>as ./bin/start-impala-cluster.sh. So I want to know how to start up from 
>>anywhere, without depending on the hadoop configuration path.
>>
>>
>>
>>
>>
>>
>>
>>At 2017-07-18 02:33:11, "Tim Armstrong" <tarmstr...@cloudera.com> wrote:
>>>I'm not sure that I fully understand the question.
>>>
>>>There isn't a way to override HADOOP_CONF_DIR - most scripts source
>>>impala-config.sh.
>>>
>>>On Sun, Jul 16, 2017 at 8:31 PM, sky <x_h...@163.com> wrote:
>>>
>>>> Hi Tim,
>>>>     I found from ./bin/create-test-configuration.sh that it generates the
>>>> ./fe/src/test/resources configurations, and the HADOOP_CONF_DIR variable also
>>>> points to this directory. But changing this variable does not take effect. Is
>>>> it hard-coded?
