Ali,
Without knowing the details of the data streams, the nature of each event
and the operations that will be performed against them, or how the
processors themselves will work, I cannot give you a solid answer. Do
I think it is possible? Absolutely. Do I think there will be hurdles
to overcome?
Thanks Lee. Your response was awesome and really made me want to get
my hands on a set of boxes like this so we could do some testing.
Thanks
Joe
On Mon, Oct 17, 2016 at 11:32 AM, Lee Laim wrote:
Joe,
Good points regarding throughput on real flows and on a sustained basis. My
test was only pushing one aspect of the system.
That said, I would be interested in discussing/developing a more
comprehensive test flow to capture more real-world use cases. I'll check to
see if that conversation has
Dear Joe,
Thank you very much.
Best regards
Prabhu,
Certainly, the performance that you are seeing, taking 4-5 hours to move
300,000 rows into SQLServer, is far from ideal, but the good news is that
it is also far from typical. You should be able to see far better results.
To help us understand what is limiting the performance, and to make
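For a rough sense of scale, batched JDBC inserts over a single connection
will usually load a few hundred thousand rows in minutes, not hours. Below
is a minimal sketch of that pattern; the connection URL, credentials, table
name, columns, and batch size are all made up for illustration and are not
from your flow:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical SQLServer URL and schema -- substitute your own.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://localhost:1433;databaseName=test",
                "user", "pass")) {
            conn.setAutoCommit(false); // commit per batch, not per row
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO events (id, payload) VALUES (?, ?)")) {
                for (int i = 0; i < 300_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if ((i + 1) % 1000 == 0) { // flush every 1000 rows
                        ps.executeBatch();
                        conn.commit();
                    }
                }
                ps.executeBatch(); // flush any remainder
                conn.commit();
            }
        }
    }
}

The key point is that each row does not pay its own network round trip and
commit; if memory serves, the analogous knobs in the flow are PutSQL's
Batch Size and Support Fragmented Transactions properties.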
Ali
I suspect bottlenecks in the software itself and the flow design will
become a factor before you hit 800 MB/s. You'd likely hit CPU efficiency
issues before that, caused by the flow processors themselves and by
garbage collection. Probably the most important factor though will be
the
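As a quick check on the garbage collection point: you can turn on GC
logging in NiFi's conf/bootstrap.conf and watch for long pauses under
load. A sketch, assuming a Java 8 JVM; the java.arg numbers below are
illustrative and must not collide with the arguments already in your file:

# in conf/bootstrap.conf
java.arg.20=-verbose:gc
java.arg.21=-XX:+PrintGCDetails
java.arg.22=-XX:+PrintGCTimeStamps
java.arg.23=-Xloggc:./logs/gc.log

Frequent full collections in gc.log would point at heap sizing or flow
design rather than raw disk or network throughput.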
Hi All,
I have tried to perform the below operation:
.dat file (input) --> JSON --> SQL --> SQLServer
GetFile-->SplitText-->SplitText-->ExtractText-->ReplaceText-->ConvertJsonToSQL-->PutSQL.
My input file (.dat) has 3,00,000 (300,000) rows.
*Objective:* Move the data from the '.dat' file into SQLServer.
I am able
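For illustration, here is a minimal Java sketch of the per-row work the
ExtractText/ReplaceText pair is doing before ConvertJsonToSQL. The
pipe-delimited layout and column names are made up, since the real record
format isn't shown above:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RowToJsonSketch {
    // Hypothetical layout: id|name|amount -- the real .dat columns differ.
    private static final Pattern ROW =
            Pattern.compile("^([^|]+)\\|([^|]+)\\|([^|]+)$");

    // ExtractText captures groups into attributes; ReplaceText rewrites
    // the row as flat JSON, which is what ConvertJsonToSQL expects.
    static String toJson(String line) {
        Matcher m = ROW.matcher(line.trim());
        if (!m.matches()) {
            throw new IllegalArgumentException("Unexpected row: " + line);
        }
        return String.format(
                "{\"id\": \"%s\", \"name\": \"%s\", \"amount\": \"%s\"}",
                m.group(1), m.group(2), m.group(3));
    }

    public static void main(String[] args) {
        // Prints {"id": "42", "name": "widget", "amount": "19.99"}
        System.out.println(toJson("42|widget|19.99"));
    }
}

In the flow itself this is one capture-group regex on ExtractText and a
flat-JSON Replacement Value on ReplaceText built from the captured
attributes.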