I’ve filed an issue [1] for this.
I also noticed that we have two classes named FeedMessageService; I think one of them can be removed [2].

Cheers,
Till

[1] https://issues.apache.org/jira/browse/ASTERIXDB-1175
[2] https://issues.apache.org/jira/browse/ASTERIXDB-1176

On 13 Nov 2015, at 3:17, abdullah alamoudi wrote:

I am also getting this exception, although I am not seeing any problems caused by it. Still, we should look into it.

Amoudi, Abdullah.

On Fri, Nov 13, 2015 at 6:23 AM, Murtadha Hubail <[email protected]>
wrote:

Hi Yingyi,

I think this merge (https://asterix-gerrit.ics.uci.edu/#/c/487/) caused the Asterix recovery test cases to get stuck when a duplicate key exception occurs.

Could you please have a look at it? You can reproduce it with the
statements below.

@Others,
I’m also getting the exception below on the current master, every time I start AsterixHyracksIntegrationUtil and during tests. Is anyone else seeing the same?

java.net.ConnectException: Connection refused
     at java.net.PlainSocketImpl.socketConnect(Native Method)
     at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
     at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
     at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
     at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
     at java.net.Socket.connect(Socket.java:589)
     at java.net.Socket.connect(Socket.java:538)
     at java.net.Socket.<init>(Socket.java:434)
     at java.net.Socket.<init>(Socket.java:211)
     at org.apache.asterix.common.feeds.FeedMessageService$FeedMessageHandler.run(FeedMessageService.java:101)
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
     at java.lang.Thread.run(Thread.java:745)
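For anyone unfamiliar with this failure mode: a ConnectException like the one in the trace means nothing was listening on the target socket when FeedMessageService$FeedMessageHandler tried to connect. A minimal, self-contained sketch (plain Java, not AsterixDB code) that reproduces the same exception by connecting to a port with no listener:

```java
import java.io.IOException;
import java.net.ConnectException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class ConnectRefusedDemo {

    /** Returns true if connecting to host:port fails with ConnectException. */
    static boolean isRefused(String host, int port) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 1000);
            return false;                       // something was listening
        } catch (ConnectException e) {
            return true;                        // no listener: "Connection refused"
        } catch (IOException e) {
            return false;                       // some other I/O failure
        }
    }

    public static void main(String[] args) throws IOException {
        // Find a port that is currently free by binding it and releasing it.
        int freePort;
        try (ServerSocket ss = new ServerSocket(0)) {
            freePort = ss.getLocalPort();
        }
        System.out.println("refused: " + isRefused("127.0.0.1", freePort));
    }
}
```

This matches the symptom described: harmless-looking if the connecting thread only logs the failure, but worth checking whether the feed message endpoint is expected to be up at that point.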

-Murtadha

drop dataverse recovery if exists;
create dataverse recovery;
use dataverse recovery;

/* For raw Fragile data */
create type FragileTypeRaw as closed {
  row_id: int32,
  sid: int32,
  date: string,
  day: int32,
  time: string,
  bpm: int32,
  RR: float
};

/* For cleaned Fragile data */
create type FragileType as closed {
  row_id: int32,
  sid: int32,
  date: date,
  day: int32,
  time: time,
  bpm: int32,
  RR: float
};

/* Create dataset for loading raw Fragile data */
create dataset Fragile_raw (FragileTypeRaw)
primary key row_id;

/* Create dataset for cleaned Fragile data */
create dataset Fragile (FragileType)
primary key row_id;

use dataverse recovery;

load dataset Fragile_raw using
"org.apache.asterix.external.dataset.adapter.NCFileSystemAdapter"
(("path"="127.0.0.1://data/csv/fragile_01.csv"),("format"="delimited-text"),("delimiter"=","))
pre-sorted;

use dataverse recovery;

/* Load Fragile data from raw dataset into cleaned dataset */
insert into dataset Fragile (
  for $t in dataset Fragile_raw
  return {
    "row_id": $t.row_id % 28000,
    "sid": $t.sid,
    "date": date($t.date),
    "day": $t.day,
    "time": parse-time($t.time, "h:m:s"),
    "bpm": $t.bpm,
    "RR": $t.RR
  }
);
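For context on why this repro presumably triggers a duplicate key exception: the insert keys the cleaned dataset by $t.row_id % 28000, so any two source rows whose row_ids differ by a multiple of 28000 collide on the primary key. A trivial illustration (plain Java, not AsterixDB code; the 28000 modulus mirrors the AQL above):

```java
public class DupKeyDemo {

    /** Mirrors the "row_id % 28000" key expression from the AQL insert. */
    static int key(int rowId) {
        return rowId % 28000;
    }

    public static void main(String[] args) {
        // Distinct source rows 5 and 28005 map to the same primary key.
        System.out.println(key(5) == key(28005));  // prints "true"
    }
}
```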

