On Wednesday, 18 May 2016 at 21:49:34 UTC, Joseph Rushton Wakeling wrote:
> On Wednesday, 18 May 2016 at 20:29:27 UTC, Walter Bright wrote:
>> I do not understand the tolerance for bad results in
>> scientific, engineering, medical, or finance applications.
> I don't think anyone has suggested tolerance for bad results
> in any of those applications.
I don't think it's about tolerance for bad results, so much as
the ability to make the trade-off between speed and precision
when you need to.
Just thinking of finance: a market maker has to provide quotes on
potentially thousands of instruments in real-time. This might
involve some heavy calculations for options pricing. When dealing
with real-time tick data (the highest frequency of financial
data), sometimes you take shortcuts that you wouldn't be willing
to take if you were working with lower-frequency data. It's not
that you don't care about precision. It's just that sometimes
it's more important to be fast than accurate.
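To make that trade-off concrete, here's a rough sketch (in Python rather than D, just for brevity; the inputs are made up). The "accurate" Black-Scholes call price uses the erf-based normal CDF, while the "fast" version swaps in the well-known logistic approximation N(x) ≈ 1/(1 + e^(-1.702x)), which is cheaper to evaluate but only good to about two decimal places in the CDF:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via erf: accurate to double precision.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_cdf_fast(x):
    # Cheap logistic approximation: max CDF error on the order of 0.01.
    return 1.0 / (1.0 + math.exp(-1.702 * x))

def bs_call(S, K, r, sigma, T, cdf=norm_cdf):
    # Black-Scholes European call price, with a pluggable normal CDF.
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * cdf(d1) - K * math.exp(-r * T) * cdf(d2)

precise = bs_call(100, 100, 0.05, 0.2, 1.0)              # textbook value ~10.45
quick = bs_call(100, 100, 0.05, 0.2, 1.0, norm_cdf_fast) # off by a few tenths
```

The two prices differ by a few tenths; whether that's acceptable depends on whether you're quoting thousands of instruments in real time or settling a single trade.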
I'm not a market maker and don't work with high-frequency data.
I usually look at low enough frequency data that I generally
care more about accurate results than speed.
Nevertheless, sometimes with hefty simulations that take several
hours or days to run, I might be willing to take some shortcuts
to get a general idea of the results. Then, when I implement the
strategy, I might do something different.
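In the same spirit, the usual shortcut in a long simulation is simply fewer samples: a Monte Carlo run with a small path count gives a general idea in a fraction of the time, and you only pay for the full run when it matters. A toy sketch (Python again, pricing the same kind of European call under GBM; the 1/sqrt(n) error scaling is the point, not the particular numbers):

```python
import math
import random

def mc_call(S, K, r, sigma, T, n_paths, seed=0):
    # Monte Carlo price of a European call under geometric Brownian motion.
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = S * math.exp(drift + vol * z)
        total += max(s_t - K, 0.0)
    return math.exp(-r * T) * total / n_paths

rough = mc_call(100, 100, 0.05, 0.2, 1.0, n_paths=1_000)     # quick and dirty
better = mc_call(100, 100, 0.05, 0.2, 1.0, n_paths=200_000)  # much less noise
```

Since the standard error shrinks as 1/sqrt(n), 200x the paths only buys about 14x the accuracy, which is exactly why a cheap rough pass is often worth it before committing to the full run.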