Folks, I have to send several thousand database entities of different types
to both a Silverlight 5 and WPF app for display in a grid. I can't page
the data because it's all got to be loaded to allow a snappy response to
filtering it. I'm fishing for ways of getting the data across with the
least
Hi Greg,
What I did with my Motion Chart software (
http://www.eshiftlog.com/Silverlight/MotionGraphTestPage.html) to get
better download performance was:
• Move away from many small WCF data transfers to transferring a single
large compressed, encoded text file
• Only transfer raw data (no JSON/XML
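The idea above — ship raw delimited values plus compression instead of tag-per-field XML/JSON payloads — can be sketched roughly as follows. This is a Python illustration of the concept, not the original Silverlight/C# code, and the sample records are made up:

```python
import gzip

# Hypothetical sample entities; the real app sends thousands of DB rows.
rows = [(i, f"Entity{i}", i * 1.5) for i in range(10_000)]

# Verbose, tag-per-field encoding (roughly what an XML transfer looks like).
xml = "".join(
    f"<row><id>{i}</id><name>{n}</name><value>{v}</value></row>"
    for i, n, v in rows
).encode()

# Raw delimited text: just the values, one record per line.
raw = "\n".join(f"{i}|{n}|{v}" for i, n, v in rows).encode()

xml_gz, raw_gz = gzip.compress(xml), gzip.compress(raw)
print(len(xml), len(raw))        # raw is far smaller before compression
print(len(xml_gz), len(raw_gz))  # and still smaller after gzip
```

The repetitive tags compress well, but dropping them entirely still wins — the client just needs to know the column order to rebuild the objects.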
Hi Greg,
We've used a technique called chunking to move large quantities of binary
data around. We move ~50 MB files (LAN only) this way and it works pretty
well.
http://msdn.microsoft.com/en-us/library/aa717050.aspx
Rob
Howdy Greg #2 (or 3?)
Haven't seen you since the Silverlight weekend in Docklands a few years ago.
Very interesting! You have implemented your own data compression, and we
used to do very similar things back in the late '70s and '80s, when
mainframe disk space was precious. Compression algorithms
Greg,
I saw the TED talk that you note was the inspiration for this. I thought at
the time it was a brilliant way to present and understand data. Both it and
the presenter had the audience totally amused, but it really made the data
talk.
Is this something you will use yourself or for a
Hi Greg #N+1,
That Silverlight weekend in Docklands was a great event, thank you to the
guys that organised it!
I would not say that I implemented my own data compression; it was more
that I avoided any extra fat in the data. I agree that I would not have
been getting much extra mileage out of the
Hi Paul,
Is this something you will use yourself or for a client, or propose to
make available one way or another?
This is work that I did myself as a side project some years ago to cement
my Silverlight and C# knowledge. I tried to find some commercial interest
in it, but it just was not
Hi Rob, I actually implemented a similar chunking technique from scratch
last year between WCF over HTTP and an SL4 app. I simulated a simple
torrent idea where numbered chunks are sent to the server, which assembles
them into a receiving array of the correct size. The only tricky code was on
the SL
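The reassembly idea described above — numbered chunks, possibly arriving out of order, slotted into a preallocated buffer of the known total size — can be sketched like this. It's a Python illustration with made-up names, not the original WCF/SL4 code:

```python
class ChunkReceiver:
    """Receiver that assembles numbered chunks into a buffer of known size."""

    def __init__(self, total_size: int, chunk_size: int):
        self.buffer = bytearray(total_size)   # receiving array of the correct size
        self.chunk_size = chunk_size
        self.expected = -(-total_size // chunk_size)  # ceiling division
        self.received = set()

    def accept(self, index: int, data: bytes) -> bool:
        """Store chunk `index`; return True once every chunk has arrived."""
        start = index * self.chunk_size
        self.buffer[start:start + len(data)] = data
        self.received.add(index)
        return len(self.received) == self.expected


payload = bytes(range(256)) * 40          # 10,240 bytes of test data
size = 4096
chunks = [(i, payload[off:off + size])
          for i, off in enumerate(range(0, len(payload), size))]

rx = ChunkReceiver(len(payload), size)
for index, data in reversed(chunks):      # deliver chunks out of order
    done = rx.accept(index, data)

assert bytes(rx.buffer) == payload        # reassembled correctly
```

Because each chunk carries its own index, the receiver doesn't care about delivery order, and a retry of a lost chunk just overwrites the same slot.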
Hi,
I'm looking for some recommendations for a system that does Server
Application Monitoring for our servers.
I've tried New Relic and while it's an impressive product, it doesn't have
very good support for custom metrics and instrumentation. I'd like to be
able to publish my own metrics that
In this age of 'big data' you'd think there would be a big commercialisation
opportunity for visualising both small and large data sets in that way.
Standardise the input data formats so people can prepare their own data and
interpolate missing points and it would have to be huge for management
PRTG is the go. It was recommended to me by someone on this list (thanks!)
and it is boss. We use it to monitor a truckload of stuff and it is very
flexible. Some parts of it that are boss:
- It is entirely agent-based. Even in a single-server install it uses a
local probe to do the polls.
We are now big enough to require a ticketing system to manage customer
requests/tasks.
Internally (8 of us) we use cloud TFS (visualstudio.com) and Office 365, so
a cloud-based solution suits us better than having to manage it ourselves.
In fact our whole infrastructure is slowly migrating to Azure
David,
Thanks for the reference. I'll check
1. I don't have a server to allocate for this. My servers are all in the
cloud in various locations. I really want a service not a product :)
2. I don't see anything in their list specific to .NET. New Relic has very
good profiling for .NET
Hey all,
Am stuck with some code-first migrations to Azure. Have Googled but not
found anything that seems to help.
Basically I've got the local stuff working and I understand the workflow,
i.e. Add-Migration <name> generates the code to upgrade the model with the
recent changes, and Update-Database
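For reference, the usual EF code-first flow in the Package Manager Console looks roughly like this (the migration name is just an example; the connection-string values are placeholders you'd point at your Azure SQL database):

```
Enable-Migrations                  # once per project
Add-Migration AddCustomerTable     # scaffolds code for recent model changes
Update-Database                    # applies pending migrations locally
Update-Database -ConnectionString "..." -ConnectionProviderName "System.Data.SqlClient"
```

The last form lets you run the same migrations against a remote database without changing the config file.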