Re: PyArrow build problems

2019-12-15 Thread Sutou Kouhei
Hi,

Did you set -DARROW_PYTHON_INCLUDE_DIR,
-DARROW_PYTHON_LIB_DIR=, ... via the PYARROW_CMAKE_OPTIONS
environment variable? Normally, you should not define these
variables by hand; they should be defined automatically. You
may need to pass
-DCMAKE_MODULE_PATH=/home/bubba/work/arrow/dist/lib/cmake/arrow
via the PYARROW_CMAKE_OPTIONS environment variable instead.
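
For example, a minimal sketch assuming the install prefix shown in the
transcript below (adjust ARROW_HOME to your layout):

```shell
# Assumed install prefix (taken from the transcript in this thread).
export ARROW_HOME="$HOME/work/arrow/dist"
# Point CMake at the installed Arrow CMake modules so that helpers like
# arrow_find_package are defined, instead of passing -DARROW_PYTHON_*_DIR
# flags by hand.
export PYARROW_CMAKE_OPTIONS="-DCMAKE_MODULE_PATH=$ARROW_HOME/lib/cmake/arrow"
# Then rerun the build:
#   python setup.py build_ext --inplace
```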


Thanks,
--
kou


In 
  "Re: PyArrow build problems" on Thu, 12 Dec 2019 11:44:08 -0500,
  Justin Polchlopek  wrote:

> Thanks for the response.  I wasn't able to figure out what happened, but I
> was able to get a docker environment set up to do the build.  There must be
> something really peculiar with my environment.  (Which is probably not
> something I should be surprised by.  I suppose I was just hoping this was a
> thing that had been seen before.)
> 
> 
> On Thu, Dec 12, 2019 at 3:27 AM Micah Kornfield 
> wrote:
> 
>> I'm not sure exactly what is going on, but somehow FindArrow.cmake [1]
>> doesn't seem to be getting imported when the command is run from
>> setup.py. I'm not sure why this would be different than running the same
>> command by hand.
>>
>> [1]
>>
>> https://github.com/apache/arrow/blob/b282838d3ffea8285a06addab202014825b21993/cpp/cmake_modules/FindArrow.cmake
>>
>>
>> On Wed, Dec 4, 2019 at 10:50 AM Justin Polchlopek 
>> wrote:
>>
>> > I'm trying to get pyarrow built from source after making some changes to
>> > current master.  I've successfully built the C++ libraries, but `python
>> > setup.py build_ext`, and all variations thereof that I can try are
>> failing,
>> > unable to find the arrow libs, even though the ARROW_HOME environment
>> > variable is set properly.
>> >
>> > It appears that the way cmake is being spawned is a problem, since I can
>> > copy/paste the cmake command reported by the setup script into the
>> console
>> > and it runs just fine.  The build gives similar problems, but again
>> > succeeds when run manually.  Obviously, more is going on in the setup
>> > script than just the cmake invocations, so I need to get past this
>> problem
>> > so that the build can complete.
>> >
>> > FWIW, I'm running this from a virtualenv.
>> >
>> > A transcript of the session:
>> >
>> > (arrow-dev) bubba@localhost ~/work/arrow/python $ python setup.py
>> > build_ext
>> > --inplace
>> > running build_ext
>> > -- Running cmake for pyarrow
>> > cmake -DARROW_PYTHON_INCLUDE_DIR=/home/bubba/work/arrow/dist/include
>> > -DARROW_PYTHON_LIB_DIR=/home/bubba/work/arrow/dist/lib
>> > -DARROW_FLIGHT_INCLUDE_DIR=/home/bubba/work/arrow/dist/include
>> > -DARROW_FLIGHT_LIB_DIR=/home/bubba/work/arrow/dist/lib
>> > -DPYTHON_EXECUTABLE=/home/bubba/.python_envs/arrow-dev/bin/python
>> >  -DPYARROW_BUILD_CUDA=off -DPYARROW_BUILD_FLIGHT=on
>> > -DPYARROW_BUILD_GANDIVA=off -DPYARROW_BUILD_ORC=off
>> > -DPYARROW_BUILD_PARQUET=off -DPYARROW_BUILD_PLASMA=off
>> > -DPYARROW_BUILD_S3=off -DPYARROW_USE_TENSORFLOW=off
>> > -DPYARROW_BUNDLE_ARROW_CPP=off -DPYARROW_BUNDLE_BOOST=off
>> > -DPYARROW_GENERATE_COVERAGE=off -DPYARROW_BOOST_USE_SHARED=on
>> > -DPYARROW_PARQUET_USE_SHARED=on -DCMAKE_BUILD_TYPE=release
>> > /home/bubba/work/arrow/python
>> > -- Arrow build warning level: PRODUCTION
>> > Using ld linker
>> > Configured for RELEASE build (set with cmake
>> > -DCMAKE_BUILD_TYPE={release,debug,...})
>> > -- Build Type: RELEASE
>> > -- Build output directory:
>> > /home/bubba/work/arrow/python/build/temp.linux-x86_64-3.6/release
>> > -- Searching for Python libs in
>> >
>> >
>> /home/bubba/.python_envs/arrow-dev/lib64;/home/bubba/.python_envs/arrow-dev/lib;/usr/lib64/python3.6/config-3.6m-x86_64-linux-gnu
>> > -- Looking for python3.6m
>> > -- Found Python lib /home/bubba/.python_envs/arrow-dev/bin/python
>> > -- Searching for Python libs in
>> >
>> >
>> /home/bubba/.python_envs/arrow-dev/lib64;/home/bubba/.python_envs/arrow-dev/lib;/usr/lib64/python3.6/config-3.6m-x86_64-linux-gnu
>> > -- Looking for python3.6m
>> > -- Found Python lib /home/bubba/.python_envs/arrow-dev/bin/python
>> > -- Found the Arrow Python by
>> > -- Found the Arrow Python shared library:
>> > -- Found the Arrow Python import library:
>> > -- Found the Arrow Python static library:
>> > -- Found the Arrow Flight by
>> > -- Found the Arrow Flight shared library:
>> > -- Found the Arrow Flight import library:
>> > -- Found the Arrow Flight static library:
>> > CMake Error at cmake_modules/FindArrowPythonFlight.cmake:52
>> > (arrow_find_package):
>> >   Unknown CMake command "arrow_find_package".
>> > Call Stack (most recent call first):
>> >   CMakeLists.txt:453 (find_package)
>> >
>> >
>> > -- Configuring incomplete, errors occurred!
>> > See also
>> >
>> >
>> "/home/bubba/work/arrow/python/build/temp.linux-x86_64-3.6/CMakeFiles/CMakeOutput.log".
>> > See also
>> >
>> >
>> "/home/bubba/work/arrow/python/build/temp.linux-x86_64-3.6/CMakeFiles/CMakeError.log".
>> > error: command 'cmake' failed with exit status 1
>> > (arrow-dev) bubba@localhost ~/work/arrow/python $ cmake
>> > -DARRO

[jira] [Created] (ARROW-7396) [Format] Register media types (MIME types) for Apache Arrow formats to IANA

2019-12-15 Thread Kouhei Sutou (Jira)
Kouhei Sutou created ARROW-7396:
---

 Summary: [Format] Register media types (MIME types) for Apache 
Arrow formats to IANA
 Key: ARROW-7396
 URL: https://issues.apache.org/jira/browse/ARROW-7396
 Project: Apache Arrow
  Issue Type: Improvement
  Components: Format
Reporter: Kouhei Sutou


See "MIME types" thread for details: 
https://lists.apache.org/thread.html/b15726d0c0da2223ba1b45a226ef86263f688b20532a30535cd5e267%40%3Cdev.arrow.apache.org%3E

Summary:

  * If we don't register media types for the Apache Arrow formats (IPC File 
Format and IPC Streaming Format) with IANA, we should use an "x-" prefix, 
such as "application/x-apache-arrow-file".
  * It may be better to follow the same convention as Apache Thrift, which 
registers its media types as "application/vnd.apache.thrift.XXX". Following 
that convention, we would use "application/vnd.apache.arrow.file" or 
something similar.

TODO:

  * Decide which media types we should register. (Do we need a vote?)
  * Register our media types with IANA.
  ** Media types page: 
https://www.iana.org/assignments/media-types/media-types.xhtml
  ** Application form for new media types: https://www.iana.org/form/media-types
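
The naming options above can be sketched as a tiny lookup (the
"vnd.apache.arrow.*" names are the proposal in this issue and are not
registered with IANA; the "x-" forms are the interim fallback):

```shell
# Media types discussed in this issue. The "vnd.apache.arrow.*" names follow
# the Apache Thrift convention but are NOT registered with IANA yet, so the
# "x-" prefixed forms are the safe interim choice.
arrow_stream_interim="application/x-apache-arrow-stream"
arrow_file_interim="application/x-apache-arrow-file"
arrow_stream_proposed="application/vnd.apache.arrow.stream"
arrow_file_proposed="application/vnd.apache.arrow.file"

# Example: label an Arrow IPC file in an HTTP response header.
printf 'Content-Type: %s\r\n' "$arrow_file_interim"
```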




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Re: MIME type

2019-12-15 Thread Sutou Kouhei
Done: https://issues.apache.org/jira/browse/ARROW-7396

In 
  "Re: MIME type" on Thu, 12 Dec 2019 16:46:30 -0600,
  Wes McKinney  wrote:

> Shall we create a JIRA issue to memorialize this task?
> 
> On Thu, Dec 12, 2019 at 1:16 AM Micah Kornfield  wrote:
>>
>> Yeah, I was thinking to wait for the actual formats to become "official",
>> but I agree the names won't change.
>>
>> On Thu, Dec 5, 2019 at 9:55 PM Sutou Kouhei  wrote:
>>
>> > I think we don't need to wait for 1.0.0, because we
>> > won't change the existing format names. ("streaming" and "file"
>> > aren't changing.)
>> >
>> > In 
>> >   "Re: MIME type" on Sun, 1 Dec 2019 20:45:43 -0500,
>> >   Micah Kornfield  wrote:
>> >
>> > >>
>> > >> Should we register our MIME types to IANA?
>> > >
>> > > Are there any downsides to doing this? should we wait for 1.0.0?
>> > >
>> > > On Thu, Nov 21, 2019 at 5:04 AM Sutou Kouhei  wrote:
>> > >
>> > >> I found Apache Thrift registers the following MIME types:
>> > >>
>> > >>   * application/vnd.apache.thrift.binary
>> > >>   * application/vnd.apache.thrift.compact
>> > >>   * application/vnd.apache.thrift.json
>> > >>
>> > >> https://www.iana.org/assignments/media-types/media-types.xhtml
>> > >>
>> > >> Thrift uses "vnd.apache." prefix[1].
>> > >>
>> > >> [1] https://tools.ietf.org/html/rfc6838
>> > >> > Vendor-tree registrations will be distinguished by the leading facet
>> > >> > "vnd.".  That may be followed, at the discretion of the registrant,
>> > >> > by either a media subtype name from a well-known producer (e.g.,
>> > >> > "vnd.mudpie") or by an IANA-approved designation of the producer's
>> > >> > name that is followed by a media type or product designation (e.g.,
>> > >> > vnd.bigcompany.funnypictures).
>> > >>
>> > >> vnd.apache.thrift.binary was registered on 2014-09-09:
>> > >>
>> > >> https://www.iana.org/assignments/media-types/application/vnd.apache.thrift.binary
>> > >>
>> > >> Should we register our MIME types to IANA?
>> > >>
>> > >> It seems that Apache Thrift used application/x-thift (typo?)
>> > >> before it registered these MIME types.
>> > >>
>> > >> > The application/x-thift media type is currently used to describe
>> > >> > multiple formats/protocols. Communications endpoints need to know
>> > >> > the format/protocol used, so this media type should be used
>> > >> > preferentially when it is appropriate to do so.
>> > >>
>> > >> In 
>> > >>   "Re: MIME type" on Wed, 20 Nov 2019 12:01:54 +0100,
>> > >>   Antoine Pitrou  wrote:
>> > >>
>> > >> >
>> > >> > If it's not standardized, shouldn't it be prefixed with x-?
>> > >> >
>> > >> > e.g. application/x-apache-arrow-stream
>> > >> >
>> > >> >
>> > >> >> On 20/11/2019 at 08:29, Micah Kornfield wrote:
>> > >> >> I would propose:
>> > >> >> application/apache-arrow-stream
>> > >> >> application/apache-arrow-file
>> > >> >>
>> > >> >> I'm not attached to those names but I think there should be two
>> > >> different
>> > >> >> mime-types, since the formats are not interchangeable.
>> > >> >>
>> > >> >> On Tue, Nov 19, 2019 at 10:31 PM Sutou Kouhei 
>> > >> wrote:
>> > >> >>
>> > >> >>> Hi,
>> > >> >>>
>> > >> >>> What MIME type should be used for Apache Arrow data?
>> > >> >>> application/arrow?
>> > >> >>>
>> > >> >>> Should we use the same MIME type for IPC Streaming Format[1]
>> > >> >>> and IPC File Format[2]? Or should we use different MIME
>> > >> >>> types for them?
>> > >> >>>
>> > >> >>> [1]
>> > >> >>>
>> > >> https://arrow.apache.org/docs/format/Columnar.html#ipc-streaming-format
>> > >> >>> [2]
>> > https://arrow.apache.org/docs/format/Columnar.html#ipc-file-format
>> > >> >>>
>> > >> >>>
>> > >> >>> Thanks,
>> > >> >>> --
>> > >> >>> kou
>> > >> >>>
>> > >> >>
>> > >>
>> >


[jira] [Created] (ARROW-7395) Logical "or" with constants is a Clang warning

2019-12-15 Thread Jim Apple (Jira)
Jim Apple created ARROW-7395:


 Summary: Logical "or" with constants is a Clang warning
 Key: ARROW-7395
 URL: https://issues.apache.org/jira/browse/ARROW-7395
 Project: Apache Arrow
  Issue Type: Bug
  Components: C++
Affects Versions: 0.15.0
Reporter: Jim Apple
Assignee: Jim Apple


With clang version 9.0.1, the C++ debug build fails with:

{noformat}
In file included from 
/home/jbapple/code/arrow/cpp/src/arrow/vendored/xxhash/xxhash.h:532:
/home/jbapple/code/arrow/cpp/src/arrow/vendored/xxhash/xxhash.c:810:11: error: 
use of logical '||' with constant operand [-Werror,-Wconstant-logical-operand]
if (0 || 0) {
  ^  ~
/home/jbapple/code/arrow/cpp/src/arrow/vendored/xxhash/xxhash.c:810:11: note: 
use '|' for a bitwise operation
if (0 || 0) {
  ^~
  |
{noformat}

The simple fix is to add {{-Wno-constant-logical-operand}} to 
SetupCxxFlags.cmake.
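
As a hedged sketch, the flag can also be tried locally on the configure line
before the SetupCxxFlags.cmake change lands (the flag name comes from the
clang diagnostic above; the configure invocation is hypothetical):

```shell
# Suppression flag named in the clang diagnostic; the proper fix per this
# report is to add it in cpp/cmake_modules/SetupCxxFlags.cmake.
extra_cxx_flags="-Wno-constant-logical-operand"
# Hypothetical local workaround on the configure line:
#   cmake -DCMAKE_BUILD_TYPE=Debug -DCMAKE_CXX_FLAGS="$extra_cxx_flags" ..
```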



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


Re: [Gandiva] How to optimize per CPU feature

2019-12-15 Thread Yibo Cai

On 12/13/19 7:45 PM, Ravindra Pindikura wrote:

On Fri, Dec 13, 2019 at 3:41 PM Yibo Cai  wrote:


Hi,

Thanks to pravindra's patch [1], Gandiva loop vectorization is okay now.

Does Gandiva detect CPU features at runtime? My test CPU supports SSE through
AVX2, but I only see "target-features"="+fxsr,+mmx,+sse,+sse2,+x87" in the IR,
and the final code doesn't use registers wider than 128 bits.



Can you please give some details about the hardware/OS-version you are
running this on ? Also, are you building the binaries and running them on
the same host ?



I'm building and running on the same host.

Build: cmake -DCMAKE_BUILD_TYPE=RelWithDebInfo -DARROW_BUILD_TESTS=ON 
-DARROW_GANDIVA=ON ..

OS: ubuntu 18.04

CPU: lscpu outputs below

Architecture:        x86_64
CPU op-mode(s):  32-bit, 64-bit
Byte Order:  Little Endian
CPU(s):  8
On-line CPU(s) list: 0-7
Thread(s) per core:  2
Core(s) per socket:  4
Socket(s):   1
NUMA node(s):1
Vendor ID:   GenuineIntel
CPU family:  6
Model:   60
Model name:  Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz
Stepping:3
CPU MHz: 3591.845
CPU max MHz: 4000.
CPU min MHz: 800.
BogoMIPS:7183.72
Virtualization:  VT-x
L1d cache:   32K
L1i cache:   32K
L2 cache:256K
L3 cache:8192K
NUMA node0 CPU(s):   0-7
Flags:   fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca 
cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx 
pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology 
nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est 
tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid sse4_1 sse4_2 x2apic movbe popcnt 
tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm cpuid_fault epb 
invpcid_single pti ssbd ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid 
fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid xsaveopt dtherm ida arat 
pln pts md_clear flush_l1d




[1] https://github.com/apache/arrow/pull/6019






Re: Human-readable version of Arrow Schema?

2019-12-15 Thread Wes McKinney
I'd be open to looking at a proposal for a human-readable text
representation, but I'm definitely wary about making any kind of
cross-version compatibility guarantees (beyond "we will do our best").
If we were to make the same kinds of forward/backward compatibility
guarantees as with Flatbuffers it could create a lot of work for
maintainers.
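
The Flatbuffers JSON functionality discussed in this thread can be exercised
with flatc, the FlatBuffers compiler; a hedged sketch (file names here are
hypothetical; the schema is format/Schema.fbs from the Arrow repository):

```shell
# Guarded so this is a no-op when flatc or the input file is absent.
if command -v flatc >/dev/null 2>&1 && [ -f schema_message.bin ]; then
  # Binary Schema message -> JSON:
  flatc --json --strict-json --raw-binary format/Schema.fbs -- schema_message.bin
  # JSON -> binary Schema message:
  flatc --binary format/Schema.fbs schema_message.json
fi
```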

On Thu, Dec 12, 2019 at 12:43 AM Micah Kornfield  wrote:
>
> >
> > With these two together, it would seem not too difficult to create a text
> > representation for Arrow schemas that (at some point) has some
> > compatibility guarantees, but maybe I'm missing something?
>
>
> I think the main risk is if somehow flatbuffers JSON parsing doesn't handle
> backward compatible changes to the arrow schema message.  Given the way the
> documentation is describing the JSON functionality I think this would be
> considered a bug.
>
> The one downside to calling the "schema" canonical is the flatbuffers JSON
> functionality only appears to be available in C++ and Java via JNI, so it
> wouldn't have cross language support.  I think this issue is more one of
> semantics though (i.e. does the JSON description become part of the "Arrow
> spec" or does it live as a C++/Python only feature).
>
> -Micah
>
>
> On Tue, Dec 10, 2019 at 10:51 AM Christian Hudon 
> wrote:
>
> > Micah: I didn't know that Flatbuffers supported serialization to/from JSON,
> > thanks. That seems like a very good start, at least. I'll aim to create a
> > draft pull request that at least wires everything up in Arrow so we can
> > load/save a Schema.fbs instance from/to JSON. At least it'll make it easier
> > for me to see how Arrow schemas would look in JSON with that.
> >
> > Otherwise, I'm still gathering requirements internally here. For example,
> > one thing that would be nice would be to be able to output a JSON Schema
> > from at least a subset of the Arrow schema. (That way our users could start
> > by passing around JSON with a given schema, and transition pieces of a
> > workflow to Arrow as they're ready.) But that part can also be done outside
> > of the Arrow code, if deemed not relevant to have in the Arrow codebase
> > itself.
> >
> > One core requirement for us, however, would be eventual compatibility
> > between Arrow versions for a given text representation of a schema.
> > Meaning, if you have a text description of a given Arrow schema, you can
> > load it into different versions of Arrow and it creates a valid Schema
> > Flatbuffer description, that Arrow can use. Wes, were you thinking of that,
> > or of something else, when you wrote "only makes sense if it is offered
> > without any backward/forward compatibility guarantees"?
> >
> > For now, for me, assuming the JSON serialization done by the Flatbuffers
> > libraries is usable, it seems we have all the pieces to make this happen:
> > 1) The binary Schema.fbs data structures have to be compatible between
> > different versions of Arrow; otherwise two processes with different Arrow
> > versions won't be able to interoperate, no?
> > 2) The Flatbuffer <-> JSON serialization supplied by the Flatbuffers
> > library also has to be compatible between different versions of the
> > Flatbuffers library, since the main use case seems to be storing
> > Flatbuffers assets into version control. Breaking changes there will also
> > be painful to their users.
> >
> > With these two together, it would seem not too difficult to create a text
> > representation for Arrow schemas that (at some point) has some
> > compatibility guarantees, but maybe I'm missing something?
> >
> > Thanks,
> >
> >   Christian
> >
> > On Mon., Dec. 9, 2019, at 07:00, Wes McKinney  wrote:
> >
> > > The only "canonical" representation of schemas at the moment is the
> > > Flatbuffers data structure [1]
> > >
> > > Having a human-readable/parseable text representation I think only
> > > makes sense if it is offered without any backward/forward
> > > compatibility guarantees.
> > >
> > > Note I had previously opened
> > > https://issues.apache.org/jira/browse/ARROW-3730 where I noted that
> > > there's no way (aside from generating the Flatbuffers messages) to
> > > generate a schema representation that can be used later to reconstruct
> > > a schema in a program. If such a representation were human
> > > readable/editable that seems beneficial.
> > >
> > >
> > >
> > > [1]: https://github.com/apache/arrow/blob/master/format/Schema.fbs
> > >
> > > On Sat, Dec 7, 2019 at 11:56 AM Maarten Ballintijn 
> > > wrote:
> > > >
> > > >
> > > > Is there a syntax specified for schemas?
> > > >
> > > > Cheers,
> > > > Maarten.
> > > >
> > > >
> > > > > On Dec 6, 2019, at 5:01 PM, Micah Kornfield 
> > > wrote:
> > > > >
> > > > > Hi Christian,
> > > > > As far as I know no-one is working on a canonical text representation
> > > for
> > > > > schemas.  A JSON serializer exists for integration test purposes, but
> > > > > IMO it shouldn't be relied upon as canonical.
> > > > >
> > > > > It

[NIGHTLY] Arrow Build Report for Job nightly-2019-12-15-0

2019-12-15 Thread Crossbow


Arrow Build Report for Job nightly-2019-12-15-0

All tasks: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0

Failed Tasks:
- conda-osx-clang-py27:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-osx-clang-py27
- conda-osx-clang-py36:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-osx-clang-py36
- conda-osx-clang-py37:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-osx-clang-py37
- gandiva-jar-osx:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-travis-gandiva-jar-osx
- macos-r-autobrew:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-travis-macos-r-autobrew
- test-ubuntu-16.04-cpp:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-ubuntu-16.04-cpp
- test-ubuntu-18.04-r-3.6:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-ubuntu-18.04-r-3.6
- wheel-manylinux1-cp35m:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-wheel-manylinux1-cp35m
- wheel-manylinux2010-cp36m:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-wheel-manylinux2010-cp36m
- wheel-osx-cp37m:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-travis-wheel-osx-cp37m

Succeeded Tasks:
- centos-6:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-centos-6
- centos-7:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-centos-7
- centos-8:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-centos-8
- conda-linux-gcc-py27:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-linux-gcc-py27
- conda-linux-gcc-py36:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-linux-gcc-py36
- conda-linux-gcc-py37:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-linux-gcc-py37
- conda-win-vs2015-py36:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-win-vs2015-py36
- conda-win-vs2015-py37:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-conda-win-vs2015-py37
- debian-buster:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-debian-buster
- debian-stretch:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-azure-debian-stretch
- gandiva-jar-trusty:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-travis-gandiva-jar-trusty
- homebrew-cpp:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-travis-homebrew-cpp
- test-conda-cpp:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-cpp
- test-conda-python-2.7-pandas-latest:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-2.7-pandas-latest
- test-conda-python-2.7:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-2.7
- test-conda-python-3.6:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.6
- test-conda-python-3.7-dask-latest:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-dask-latest
- test-conda-python-3.7-hdfs-2.9.2:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-hdfs-2.9.2
- test-conda-python-3.7-pandas-latest:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-pandas-latest
- test-conda-python-3.7-pandas-master:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-pandas-master
- test-conda-python-3.7-spark-master:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-spark-master
- test-conda-python-3.7-turbodbc-latest:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-turbodbc-latest
- test-conda-python-3.7-turbodbc-master:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7-turbodbc-master
- test-conda-python-3.7:
  URL: 
https://github.com/ursa-labs/crossbow/branches/all?query=nightly-2019-12-15-0-circle-test-conda-python-3.7
- test-conda-python-3.8-dask-master: