[
https://issues.apache.org/jira/browse/BAHIR-99?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16097668#comment-16097668
]
ASF GitHub Bot commented on BAHIR-99:
-------------------------------------
Github user rmetzger commented on a diff in the pull request:
https://github.com/apache/bahir-flink/pull/17#discussion_r128921802
--- Diff:
flink-connector-kudu/src/main/java/es/accenture/flink/Sources/KuduInputFormat.java
---
@@ -0,0 +1,340 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package es.accenture.flink.Sources;
+
+import es.accenture.flink.Utils.RowSerializable;
+import org.apache.flink.api.common.io.InputFormat;
+import org.apache.flink.api.common.io.LocatableInputSplitAssigner;
+import org.apache.flink.api.common.io.statistics.BaseStatistics;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.core.io.InputSplitAssigner;
+import org.apache.kudu.client.*;
+import org.apache.log4j.Logger;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.List;
+
+/**
+ * {@link InputFormat} subclass that wraps the access for KuduTables.
+ */
+public class KuduInputFormat implements InputFormat<RowSerializable, KuduInputSplit> {
+
+ private String KUDU_MASTER;
+ private String TABLE_NAME;
+
+ private transient KuduTable table = null;
+ private transient KuduScanner scanner = null;
+ private transient KuduClient client = null;
+
+ private transient RowResultIterator results = null;
+ private List<RowSerializable> rows = null;
+ private List<KuduScanToken> tokens = null;
+ private boolean endReached = false;
+ private int scannedRows = 0;
+
+ private static final Logger LOG = Logger.getLogger(KuduInputFormat.class);
+
+ private List<String> projectColumns;
+
+ /**
+ * Constructor of class KuduInputFormat
+ * @param tableName Name of the Kudu table to read from
+ * @param IP Kudu master server's IP address
+ */
+ public KuduInputFormat(String tableName, String IP){
+ LOG.info("1. CONSTRUCTOR");
+ KUDU_MASTER = IP;
+ TABLE_NAME = tableName;
--- End diff ---
Why are these fields all-uppercase?
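For reference, the convention the question points at: UPPER_SNAKE_CASE is normally reserved for `static final` constants, while mutable instance fields use lowerCamelCase. A hypothetical sketch of the two fields renamed accordingly (not the PR's code, just an illustration):

```java
// Hypothetical sketch: conventional Java naming for the two fields
// questioned above. These are per-instance values, not constants,
// so lowerCamelCase is the idiomatic choice.
public class NamingSketch {
    private final String kuduMaster; // was KUDU_MASTER
    private final String tableName;  // was TABLE_NAME

    public NamingSketch(String tableName, String kuduMaster) {
        this.kuduMaster = kuduMaster;
        this.tableName = tableName;
    }

    public String kuduMaster() { return kuduMaster; }
    public String tableName()  { return tableName; }
}
```

Marking them `final` also documents that they are set once in the constructor and never reassigned.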
> Kudu connector to read/write from/to Kudu
> -----------------------------------------
>
> Key: BAHIR-99
> URL: https://issues.apache.org/jira/browse/BAHIR-99
> Project: Bahir
> Issue Type: New Feature
> Components: Flink Streaming Connectors
> Affects Versions: Flink-1.0
> Reporter: Rubén Casado
> Assignee: Rubén Casado
> Fix For: Flink-Next
>
>
> Java library to integrate Apache Kudu and Apache Flink. The main goal is to
> be able to read/write data from/to Kudu using Flink's DataSet and DataStream
> APIs.
> Data flow patterns:
> Batch
> - Kudu -> DataSet<RowSerializable> -> Kudu
> - Kudu -> DataSet<RowSerializable> -> other source
> - Other source -> DataSet<RowSerializable> -> other source
> Stream
> - Other source -> DataStream <RowSerializable> -> Kudu
> Code is available at https://github.com/rubencasado/Flink-Kudu
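All four patterns above share one shape: a source of RowSerializable-like rows, a per-row transformation, and a sink. A dependency-free sketch of that shape (plain-Java stand-ins only; the `transform` helper and `List<Object>` rows are hypothetical, since the real connector wires KuduInputFormat into Flink's DataSet/DataStream APIs and needs a running Kudu cluster):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Dependency-free sketch of the source -> transform -> sink shape shared
// by the batch patterns listed above. A row is modelled as List<Object>;
// in the connector it would be a RowSerializable flowing through a DataSet.
public class FlowSketch {
    static List<List<Object>> transform(List<List<Object>> sourceRows,
                                        Function<List<Object>, List<Object>> fn) {
        List<List<Object>> sink = new ArrayList<>();
        for (List<Object> row : sourceRows) {
            sink.add(fn.apply(row)); // per-row map, as DataSet.map would do
        }
        return sink;
    }
}
```

Swapping which end is Kudu-backed (source, sink, both, or neither) yields the four batch/stream patterns without changing the middle of the pipeline.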
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)