Flyway DB Migrations

2015-01-14 Thread Josh Mahonin
Hi all, as an FYI, I've got a pull request into Flyway (http://flywaydb.org/) for Phoenix support: https://github.com/flyway/flyway/pull/930. I don't know what everyone else is using for schema management, if anything at all, but the preliminary support works well enough for Flyway's various
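
A minimal sketch of what driving Flyway against Phoenix could look like with the Flyway 3.x Java API; the JDBC URL and migration location below are placeholder assumptions, not taken from the pull request:

import org.flywaydb.core.Flyway;

public class PhoenixMigrations {
    public static void main(String[] args) {
        Flyway flyway = new Flyway();
        // Phoenix JDBC URL: jdbc:phoenix:<zookeeper quorum>; adjust for your cluster.
        flyway.setDataSource("jdbc:phoenix:localhost:2181", null, null);
        // Directory holding versioned migrations such as V1__create_tables.sql.
        flyway.setLocations("filesystem:sql/migrations");
        // Applies any pending migrations in version order.
        flyway.migrate();
    }
}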

Re: Flyway DB Migrations

2015-01-14 Thread Josh Mahonin
Hi James, I had actually hoped to use the DatabaseMetaData originally, but I was getting some interesting behaviour from the 'getTables()' call when 'schemaPattern' was null. I'm not at my dev machine to check for sure, but I think I was getting back a list of all tables, rather than just
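
The behaviour described is standard JDBC: a null schemaPattern means "do not filter by schema", so getTables() returns tables from every schema, while an empty string matches only tables without a schema. A short sketch (the connection URL is a placeholder):

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class ListTables {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")) {
            DatabaseMetaData md = conn.getMetaData();
            // "" restricts the result to tables without a schema; null would match all schemas.
            try (ResultSet rs = md.getTables(null, "", null, new String[] {"TABLE"})) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"));
                }
            }
        }
    }
}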

rpc timeout when count on large table

2015-01-14 Thread su...@certusnet.com.cn
Hi all, when counting on a large table, we got the following exception: org.apache.hadoop.hbase.ipc.RpcClient$CallTimeoutException: Call id=, waitTime=69714 rpcTimeout=6 How can that be resolved? The table size is 17.3 GB according to hdfs dfs -du. The table has 90+ columns and only one
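
One common remedy (a sketch, not from the thread) is to raise the client-side timeouts on the Phoenix connection; the table name and the ten-minute value below are arbitrary examples:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class LongRunningCount {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("phoenix.query.timeoutMs", "600000"); // Phoenix query timeout
        props.setProperty("hbase.rpc.timeout", "600000");       // HBase RPC timeout
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181", props);
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM MY_TABLE")) {
            while (rs.next()) {
                System.out.println(rs.getLong(1));
            }
        }
    }
}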

Re: Problems upgrading 2.2.3-incubating to 3.2.1

2015-01-14 Thread Kristoffer Sjögren
Yes, single quotes for the default column family work. CREATE TABLE TABLE ( C1 INTEGER NOT NULL, C2 INTEGER NOT NULL, C3 BIGINT NOT NULL, C4 BIGINT NOT NULL, C5 CHAR(2) NOT NULL, V BIGINT CONSTRAINT PK PRIMARY KEY ( C1, C2, C3, C4,
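
A reconstruction of the working DDL, assuming the truncated primary key ends with C5 and that the DEFAULT_COLUMN_FAMILY option is what needed the single quotes; the table name T and family name 'CF' are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

public class CreateWithDefaultCf {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")) {
            conn.createStatement().execute(
                "CREATE TABLE IF NOT EXISTS T (" +
                " C1 INTEGER NOT NULL, C2 INTEGER NOT NULL," +
                " C3 BIGINT NOT NULL, C4 BIGINT NOT NULL," +
                " C5 CHAR(2) NOT NULL, V BIGINT" +
                " CONSTRAINT PK PRIMARY KEY (C1, C2, C3, C4, C5)" +
                ") DEFAULT_COLUMN_FAMILY='CF'"); // note the single quotes around the family name
        }
    }
}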

Re: Flyway DB Migrations

2015-01-14 Thread James Taylor
Wow, that's really awesome, Josh. Nice work. Can you let us know if/when it makes it in? One modification you may want to consider in a future revision to protect yourself in case the SYSTEM.CATALOG schema changes down the road: Use the DatabaseMetaData APIs[1] instead of querying the
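
A sketch of the suggestion: resolve schema information through the JDBC DatabaseMetaData interface rather than selecting from SYSTEM.CATALOG directly, so the code keeps working if the catalog's internal layout changes. The table name below is a hypothetical example:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class TableExists {
    // Returns true if the named table is visible through the JDBC metadata.
    static boolean tableExists(Connection conn, String schema, String table) throws Exception {
        try (ResultSet rs = conn.getMetaData().getTables(null, schema, table, new String[] {"TABLE"})) {
            return rs.next();
        }
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")) {
            System.out.println(tableExists(conn, null, "SCHEMA_VERSION"));
        }
    }
}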

RE: MapReduce bulk load into Phoenix table

2015-01-14 Thread Ciureanu, Constantin (GfK)
Hello James, yes, as low as 1500 rows/sec, using Phoenix JDBC with batch inserts of 1000 records at once, but there are at least 100 dynamic columns for each row. I was expecting higher values, of course, but I will soon finish coding an MR job to load the same data using Hadoop. The code I
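
For reference, a sketch of the batch-insert pattern being described: Phoenix UPSERTs accumulated with addBatch() and flushed every 1000 rows; the table and columns are placeholders. Phoenix buffers mutations client-side, so the commit() after each batch is what actually sends them to the server:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchLoad {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")) {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement("UPSERT INTO T (C1, V) VALUES (?, ?)")) {
                for (int i = 0; i < 100000; i++) {
                    ps.setInt(1, i);
                    ps.setLong(2, i * 10L);
                    ps.addBatch();
                    if (i % 1000 == 999) { // flush a batch of 1000 rows
                        ps.executeBatch();
                        conn.commit();
                    }
                }
                ps.executeBatch(); // flush the remainder
                conn.commit();
            }
        }
    }
}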

java.math.BigInteger in phoenix

2015-01-14 Thread Lavrenty Eskin
Hi all, what is the best way to store the java.math.BigInteger datatype in Phoenix? Phoenix's BIGINT datatype maps to java.lang.Long. Is there any option other than VARCHAR(19)? Thanks
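
A hedged suggestion, not an answer from the thread: Phoenix's DECIMAL type supports up to 38 digits of precision, so a BigInteger that fits in 38 digits can round-trip through java.math.BigDecimal without resorting to VARCHAR. The table and value below are illustrative:

import java.math.BigDecimal;
import java.math.BigInteger;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class BigIntegerColumn {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")) {
            conn.createStatement().execute(
                "CREATE TABLE IF NOT EXISTS BIG_VALS (ID INTEGER PRIMARY KEY, N DECIMAL(38,0))");
            BigInteger n = new BigInteger("98765432109876543210"); // too large for BIGINT
            try (PreparedStatement ps = conn.prepareStatement("UPSERT INTO BIG_VALS VALUES (?, ?)")) {
                ps.setInt(1, 1);
                ps.setBigDecimal(2, new BigDecimal(n)); // exact conversion, scale 0
                ps.executeUpdate();
            }
            conn.commit();
            try (ResultSet rs = conn.createStatement().executeQuery("SELECT N FROM BIG_VALS WHERE ID = 1")) {
                rs.next();
                System.out.println(rs.getBigDecimal(1).toBigIntegerExact()); // back to BigInteger
            }
        }
    }
}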