Harlan Grove
(e-mail address removed) wrote...
> because I can interpolate with real data using olap
> i can cartesian a couple of tables--
Linear interpolation involves *one* table. What's this 'couple of
tables'? Do you understand the concept? Tables showing the standard
normal distributions may be found in most statistics texts. One of the
more common tasks is figuring out critical values for significance
testing. What's the critical value for the two-tailed 20% significance
level? The relevant portion of the standard normal distribution table
would be
1.28 .89973
1.29 .90147
A two-tailed 20% significance level means a cumulative probability of
.90. The critical value to 3 decimal places is then 1.282 =
(1.28*(0.90147-0.9)+1.29*(0.9-0.89973))/(0.90147-0.89973).
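The arithmetic in that expression is just inverse linear interpolation between the two bracketing table rows. A minimal Python sketch (the function name is mine, not from the thread):

```python
# Inverse linear interpolation in a standard normal table:
# find z such that Phi(z) = p, given two bracketing table rows.

def interp_z(p, z0, p0, z1, p1):
    """Interpolate z for cumulative probability p between
    table rows (z0, p0) and (z1, p1)."""
    return (z0 * (p1 - p) + z1 * (p - p0)) / (p1 - p0)

# Two-tailed 20% significance level -> cumulative probability 0.90.
z = interp_z(0.90, 1.28, 0.89973, 1.29, 0.90147)
print(round(z, 3))  # 1.282
```

This is the same weighted average as the expression above, written once as a function instead of inline.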
Could you use a cartesian product of the table with itself? I suppose
so. Seems wasteful, but I suppose database developers like you have yet
to figure out how to fetch numbers taking up a total of 32 bytes of
storage without creating multi-megabyte temporary data structures.
> i just think that it's funny
Whereas I'd call your approach sad. Perhaps pitiful would be closer.
> the only thing that you're talking about doing is trying data for lots
> and lots of time slices
Wrong in the particulars.
> you store those time slices in a TABLE and then you can cartesian date
> and flow where you want and you'll get whatever kind of ratio you're
> looking for.
Cartesian is still gross overkill compared to simple array indexing,
but I'll take your word for it that it may be the best databases can
manage.
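"Simple array indexing" can be made concrete: with the table held as a sorted in-memory list, a binary search fetches the two bracketing rows with no join at all. A Python sketch (the table excerpt and names are my own):

```python
# Locate the two bracketing table rows with a binary search
# instead of any join. The table is a sorted list of
# (z, Phi(z)) pairs -- a small excerpt here.
import bisect

table = [(1.27, 0.89796), (1.28, 0.89973),
         (1.29, 0.90147), (1.30, 0.90320)]

def bracket(p):
    """Return the rows just below and just above probability p."""
    probs = [row[1] for row in table]
    i = bisect.bisect_left(probs, p)
    return table[i - 1], table[i]

lo, hi = bracket(0.90)
print(lo, hi)  # (1.28, 0.89973) (1.29, 0.90147)
```

Two index lookups, no cartesian product, no temporary structures beyond the list of probabilities.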
> and it won't be iterative. you won't write a loop. it'll be a JOIN.
Go back a few of my posts. You'll see two simple SELECT queries for the
two records needed. Simple enough to join them to put everything into a
single record, then use an expression based on those fields to produce
the linear interpolation result.
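That two-record approach can be sketched with sqlite3. The table name, layout, and queries below are my reconstruction, not the actual SELECTs from the earlier posts (which aren't reproduced in this thread):

```python
# Two single-row subqueries (one row below the target, one above),
# joined into a single record, with the interpolation expression
# computed over its fields. Table layout is assumed.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ztable (z REAL, p REAL)")
con.executemany("INSERT INTO ztable VALUES (?, ?)",
                [(1.27, 0.89796), (1.28, 0.89973),
                 (1.29, 0.90147), (1.30, 0.90320)])

row = con.execute("""
    SELECT (lo.z * (hi.p - :t) + hi.z * (:t - lo.p)) / (hi.p - lo.p)
    FROM (SELECT z, p FROM ztable WHERE p <= :t
          ORDER BY p DESC LIMIT 1) AS lo,
         (SELECT z, p FROM ztable WHERE p >= :t
          ORDER BY p ASC LIMIT 1) AS hi
""", {"t": 0.90}).fetchone()
print(round(row[0], 3))  # 1.282
```

Since each subquery returns exactly one row, the join produces a single record and no loop is needed, though the approach divides by zero if the target probability lands exactly on a table row.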
Other than the obvious (you have no clue how to do this), why would you
bother with a cartesian?
> the only single reason that you think that databases 'aren't as
> powerful' is because they don't have all the cheesy little functions
> you're looking for.
Excel doesn't have a linear interpolation function either, but it's
MUCH EASIER to write an expression to return a linear interpolation
result in Excel than it is in Access or SQL Server with all the add-on
software you seem to need to use.
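To make that claim concrete: with the two bracketing table rows in A2:B3 (z in column A, the cumulative probability in column B -- a layout I'm assuming, not one from the thread), one built-in worksheet function does the whole interpolation:

```
=FORECAST(0.9, A2:A3, B2:B3)
```

FORECAST fits a line through the known points, which with exactly two points is plain linear interpolation, returning approximately 1.282.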