[ https://issues.apache.org/jira/browse/HAWQ-1391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15944761#comment-15944761 ]
ketan commented on HAWQ-1391:
-----------------------------

Hi,

The above setup issue for HAWQ is also resolved. I am now encountering unit test failures in the pxffilters module on the s390x platform. The following tests fail:

unit_test(test__list_const_to_str__int)
unit_test(test__list_const_to_str__boolean)
unit_test(test__list_const_to_str__text)

On debugging, I observed that in file 'src/backend/access/external/pxffilters.c', in function list_const_to_str(), the call at line 1078,

deconstruct_array(arr, INT2OID, sizeof(value), true, 's', &dats, NULL, &len);

sets len = 0. As a result, the 'for' loop starting at line 1084 never executes and the tests fail. The same call on x86 (Intel) sets len to a non-zero value, and the tests pass there.

Any inputs on the cause of the above would be helpful in resolving the failure on s390x.

Thanks,
Ketan Kunde

> s390x support for HWCRC32c
> --------------------------
>
> Key: HAWQ-1391
> URL: https://issues.apache.org/jira/browse/HAWQ-1391
> Project: Apache HAWQ
> Issue Type: Bug
> Components: libhdfs
> Reporter: ketan
> Assignee: Ed Espino
>
> Hi,
> I am building Apache HAWQ on s390x, following the instructions at
> https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install
> During the build stage I encounter:
> undefined reference to vtable for Hdfs::Internal::HWCrc32c
> On further debugging I observed that libhdfs3/src/common/HWCRC32c.cpp has no support for s390x.
> My questions are as follows:
> 1) Does this check happen as part of the unit testing of libhdfs3?
> 2) If yes to 1), is this test specific to SSE-based platforms?
> 3) Can we get some information on exactly what this check does?
> 4) Is the HAWQ source supported on SSE-based platforms only?
> Help would be appreciated.
> Adding log for reference.
> **************************************************
> make[3]: Leaving directory `//incubator-hawq/src/backend/cdb'
> g++ -O3 -std=gnu99 -Wall -Wmissing-prototypes -Wpointer-arith -Wendif-labels -Wformat-security -fno-strict-aliasing -fwrapv -fno-aggressive-loop-optimizations -I/usr/include/libxml2 -L../../src/port -L../../src/port -Wl,--as-needed -L/scratch/ecos0013/ketan/incubator-hawq/depends/libhdfs3/build/install/usr/local/hawq/lib -L/scratch/ecos0013/ketan/incubator-hawq/depends/libyarn/build/install/usr/local/hawq/lib -Wl,-rpath,'/usr/local/hawq/lib',--enable-new-dtags -Wl,-E access/SUBSYS.o bootstrap/SUBSYS.o catalog/SUBSYS.o parser/SUBSYS.o commands/SUBSYS.o executor/SUBSYS.o foreign/SUBSYS.o lib/SUBSYS.o libpq/SUBSYS.o gp_libpq_fe/SUBSYS.o main/SUBSYS.o nodes/SUBSYS.o optimizer/SUBSYS.o port/SUBSYS.o postmaster/SUBSYS.o regex/SUBSYS.o rewrite/SUBSYS.o storage/SUBSYS.o tcop/SUBSYS.o utils/SUBSYS.o resourcemanager/SUBSYS.o ../../src/timezone/SUBSYS.o cdb/SUBSYS.o ../../src/port/libpgport_srv.a -lprotobuf -lboost_system -lboost_date_time -lstdc++ -lhdfs3 -lgsasl -lxml2 -ljson-c -levent -lyaml -lsnappy -lbz2 -lrt -lz -lcrypt -ldl -lm -lcurl -lyarn -lkrb5 -lpthread -lthrift -lsnappy -o postgres
> /scratch/ecos0013/ketan/incubator-hawq/depends/libhdfs3/build/install/usr/local/hawq/lib/libhdfs3.so: undefined reference to `Hdfs::Internal::HWCrc32c::available()'
> /scratch/ecos0013/ketan/incubator-hawq/depends/libhdfs3/build/install/usr/local/hawq/lib/libhdfs3.so: undefined reference to `vtable for Hdfs::Internal::HWCrc32c'
> collect2: error: ld returned 1 exit status
> make[2]: *** [postgres] Error 1
> make[2]: Leaving directory `incubator-hawq/src/backend'
> make[1]: *** [all] Error 2
> make[1]: Leaving directory `/incubator-hawq/src'
> make: *** [all] Error 2
> ******************************************************************************
> Regards
> Ketan

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
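A note on the `deconstruct_array()` result described in the comment above: s390x is big-endian while x86 is little-endian, so one plausible cause of `len` coming back as 0 (or garbage) is a multi-byte field, such as an element count, that was produced with x86 byte order somewhere in the test input and then reinterpreted natively on s390x. The sketch below is not HAWQ code; `read_le32`, `read_be32`, and the hardcoded `nelems_bytes` fixture are hypothetical, and only illustrate how the same four bytes decode to different counts on the two platforms.

```cpp
#include <cstdint>
#include <cstdio>

// Read a 32-bit field from a buffer serialized little-endian (x86 order).
static uint32_t read_le32(const unsigned char *p) {
    return static_cast<uint32_t>(p[0]) |
           (static_cast<uint32_t>(p[1]) << 8) |
           (static_cast<uint32_t>(p[2]) << 16) |
           (static_cast<uint32_t>(p[3]) << 24);
}

// Read the same bytes in big-endian order, as a naive native load
// of the buffer would on s390x.
static uint32_t read_be32(const unsigned char *p) {
    return (static_cast<uint32_t>(p[0]) << 24) |
           (static_cast<uint32_t>(p[1]) << 16) |
           (static_cast<uint32_t>(p[2]) << 8) |
           static_cast<uint32_t>(p[3]);
}

int main() {
    // Hypothetical element-count field hardcoded as little-endian
    // bytes, as a fixture built on x86 might contain.
    const unsigned char nelems_bytes[4] = {3, 0, 0, 0};

    std::printf("LE read: %u\n", read_le32(nelems_bytes));  // 3
    std::printf("BE read: %u\n", read_be32(nelems_bytes));  // 50331648
    return 0;
}
```

If the test fixture is the culprit, building it with explicit byte-order helpers (or via the normal datum/array constructors rather than raw bytes) would make it portable across endianness.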
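On the original linker error: an "undefined reference to vtable" typically means the key virtual functions of Hdfs::Internal::HWCrc32c were never compiled, which is consistent with the report that HWCRC32c.cpp guards its implementation with SSE-specific checks that fail on s390x. A minimal sketch of the usual fix is to keep the symbols defined on every platform by pairing the hardware path with a portable bitwise CRC-32C fallback. The `Crc32c` class below is hypothetical, not the libhdfs3 API:

```cpp
#include <cstddef>
#include <cstdint>

// Sketch of a checksum class whose hardware path is only taken on
// SSE4.2-capable x86; on other platforms the same virtual functions
// fall back to a portable implementation, so the vtable is always
// emitted and the linker error cannot occur.
class Crc32c {
public:
    virtual ~Crc32c() = default;

    virtual void update(const void *buf, std::size_t len) {
#if defined(__SSE4_2__)
        // The hardware path would use _mm_crc32_* intrinsics here;
        // this sketch delegates to the portable loop as a placeholder.
        updateSoftware(buf, len);
#else
        updateSoftware(buf, len);   // portable path for s390x, ARM, ...
#endif
    }

    virtual uint32_t value() const { return crc_ ^ 0xFFFFFFFFu; }

    static bool available() {
#if defined(__SSE4_2__)
        return true;                // a real build might also check CPUID
#else
        return false;
#endif
    }

private:
    void updateSoftware(const void *buf, std::size_t len) {
        // Bitwise CRC-32C (Castagnoli), reflected polynomial 0x82F63B78.
        const unsigned char *p = static_cast<const unsigned char *>(buf);
        for (std::size_t i = 0; i < len; ++i) {
            crc_ ^= p[i];
            for (int b = 0; b < 8; ++b)
                crc_ = (crc_ >> 1) ^ (0x82F63B78u & (0u - (crc_ & 1u)));
        }
    }

    uint32_t crc_ = 0xFFFFFFFFu;
};
```

With this shape, `available()` lets callers prefer the hardware implementation where it exists while the software path keeps the build linking on non-SSE platforms.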