Re: [ceph-users] ceph-users Digest, Vol 60, Issue 26

2019-05-25 Thread Lazuardi Nasution
Hi Orlando and Haodong,

Is there any response to this thread? I'm interested in this too.

Best regards,

> Date: Fri, 26 Jan 2018 21:53:59 +
> From: "Moreno, Orlando"
> To: "ceph-users@lists.ceph.com" , Ceph Development
> Cc: "Tang, Haodong"
> Subject: [ceph-users] Ceph OSDs fail

Re: [ceph-users] Major ceph disaster

2019-05-25 Thread Paul Emmerich
On Sat, May 25, 2019 at 7:45 PM Paul Emmerich wrote:
>
> On Fri, May 24, 2019 at 5:22 PM Kevin Flöh wrote:
>> ok, this just gives me:
>>
>> error getting xattr ec31/10004dfce92./parent: (2) No such file or
>> directory
>>
> Try to run it on the replicated main data pool which contains
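The suggested lookup would then be run with rados against the main
(replicated) CephFS data pool rather than the EC pool. A minimal sketch,
assuming a pool named cephfs_data and the full object name ending in
.00000000 (the name in the quote above looks truncated); both are
placeholders, not values from the thread:

    # CephFS keeps a backtrace in the "parent" xattr of an inode's
    # first object in the main data pool:
    rados -p cephfs_data getxattr 10004dfce92.00000000 parent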

Re: [ceph-users] Major ceph disaster

2019-05-25 Thread Paul Emmerich
On Fri, May 24, 2019 at 5:22 PM Kevin Flöh wrote:
> ok, this just gives me:
>
> error getting xattr ec31/10004dfce92./parent: (2) No such file or
> directory
>
Try to run it on the replicated main data pool which contains an empty
object for each file, not sure where the xattr is stored in

Re: [ceph-users] performance in a small cluster

2019-05-25 Thread Marc Roos
Maybe my data can be useful to compare with? I have the Samsung SM863.
This[0] is what I get from fio directly on the SSD, and from an rbd SSD
pool with 3x replication[1]. I also have included a comparison with
cephfs[3], it would be nice if there were some sort of manual page describing
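For comparison on your own hardware, the usual test on this list is a 4k
sync write at queue depth 1, once against the raw device and once against
an RBD image in the replicated pool. A sketch, not the exact fio jobs
behind [0] and [1]; the device path, pool name, and image name are
assumptions:

    # Raw SSD (destroys data on /dev/sdX!):
    fio --name=ssd-test --filename=/dev/sdX --direct=1 --sync=1 \
        --rw=write --bs=4k --iodepth=1 --runtime=60 --time_based

    # Same workload via the rbd ioengine against a 3x replicated pool:
    fio --name=rbd-test --ioengine=rbd --clientname=admin --pool=rbd-ssd \
        --rbdname=test-image --direct=1 --rw=write --bs=4k --iodepth=1 \
        --runtime=60 --time_based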

Re: [ceph-users] performance in a small cluster

2019-05-25 Thread Marc Schöchlin
Hello Robert,

probably the following tool provides deeper insight into what's happening
on your OSDs:

https://github.com/scoopex/ceph/blob/master/src/tools/histogram_dump.py
https://github.com/ceph/ceph/pull/28244
https://user-images.githubusercontent.com/288876/58368661-410afa00-7ef0-11e9-9aca-b09d9
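The script renders the OSD perf histograms; the raw data it visualizes can
also be dumped by hand from the admin socket. A sketch, assuming osd.0 runs
on the local host (the script's own invocation may differ, see its --help):

    # Raw 2D latency/size histograms from the OSD admin socket:
    ceph daemon osd.0 perf histogram dump

    # Schema describing the histogram axes:
    ceph daemon osd.0 perf histogram schema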