cognitive largess (was Re: reductionism)
Posted by Marcus G. Daniels
URL: http://friam.383.s1.nabble.com/Seminal-Papers-in-Complexity-tp524047p524146.html
Glen E. P. Ropella wrote:
> Well, I suppose that begs the question of what we mean by "system". In
> the case of the financial machinery, it is clear how that part of the
> thing works. But, it is not at all clear how the whole system works.
> If it were, then predictive algorithms would be reasonably simple to
> develop.
It's price and risk that matter to quantitative traders, and both are
measurable on any historical timescale that might be of interest. Bias
in data collection takes the form of "How do I reduce all this
individual trading activity to something I can manage?" That sort of
`bias' can be greatly attenuated using terabyte-RAM multiprocessor
systems. No lie: with thousands of processors at your disposal, it's
feasible to correlate all sorts of things against one another. Why
model when you can measure?
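To make that concrete, here's a minimal sketch of the "measure, don't
model" approach in Python: brute-force every pairwise correlation
across a set of instruments, farmed out over however many cores you
have. The file name prices.csv, its column layout (one column per
instrument, rows indexed by timestamp), and the use of pandas/numpy
are all my own assumptions for illustration, not anything from a real
trading system.

from itertools import combinations
from multiprocessing import Pool

import numpy as np
import pandas as pd

_returns = None  # per-worker copy of the return series, set once below

def _init(returns):
    # Pool initializer: hand each worker its own copy of the data
    # instead of pickling the whole DataFrame for every pair.
    global _returns
    _returns = returns

def corr_pair(pair):
    # Correlation of one pair of return series.
    a, b = pair
    return a, b, _returns[a].corr(_returns[b])

def main():
    # Hypothetical wide table of prices; convert to log returns.
    prices = pd.read_csv("prices.csv", index_col=0, parse_dates=True)
    returns = np.log(prices).diff().dropna()
    pairs = list(combinations(returns.columns, 2))
    with Pool(initializer=_init, initargs=(returns,)) as pool:
        results = pool.map(corr_pair, pairs, chunksize=1024)
    # Rank pairs by absolute correlation: no model, just measurement.
    for a, b, r in sorted(results, key=lambda t: -abs(t[2]))[:10]:
        print(f"{a} ~ {b}  r={r:+.3f}")

if __name__ == "__main__":
    main()

On a laptop this is a toy; the point of the terabyte-RAM,
thousand-processor setup is that the same exhaustive scan stays
feasible when the pair count runs into the millions.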
I suppose a lot of `ideal experiments' are obvious enough to experts
but are perceived as too expensive to run. In other situations, such
as the study of social phenomena, it may not be clear what to measure
or, for that matter, how to measure it, so there's a risk that
conventional wisdom comes to dominate the kinds of data that are
sought.