Date Format Conversion?


Harlan Grove

(e-mail address removed) wrote...
because I can interpolate with real data using olap

i can cartesian a couple of tables--

Linear interpolation involves *one* table. What's this 'couple of
tables'? Do you understand the concept? Tables showing the standard
normal distribution may be found in most statistics texts. One of the
more common tasks is figuring out critical values for significance
testing. What's the critical value for the two-tailed 20% significance
level? The relevant portion of the standard normal distribution table
would be

z      cum. prob.
1.28   .89973
1.29   .90147

Two-tailed 20% significance level means a cumulative probability of
.90. The critical value to
3 decimal places is then 1.282 =
(1.28*(0.90147-0.9)+1.29*(0.9-0.89973))/(0.90147-0.89973).
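
(In general, for a target probability p between adjacent table entries
(z1, p1) and (z2, p2), the interpolated value is
z = (z1*(p2 - p) + z2*(p - p1))/(p2 - p1); the expression above is
just that with p = .9 plugged in.)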

Could you use a cartesian product of the table with itself? I suppose
so. Seems wasteful, but I suppose database developers like you have yet
to figure out how to fetch numbers taking up total storage of 32 bytes
without creating multiple megabyte temporary data structures.
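
For the record, here's roughly how that self-join might look, using a
hypothetical table norm(z, p) to hold the distribution entries (the
table and column names are mine, for illustration only):

-- Cartesian product of norm with itself, filtered down to the one
-- adjacent pair of rows bracketing the target probability .9
SELECT (lo.z * (hi.p - 0.9) + hi.z * (0.9 - lo.p)) / (hi.p - lo.p)
       AS z_crit
FROM norm AS lo
CROSS JOIN norm AS hi
WHERE lo.p <= 0.9
  AND hi.p > 0.9
  AND NOT EXISTS (SELECT 1 FROM norm AS m
                  WHERE m.p > lo.p AND m.p < hi.p)

Conceptually that pairs every row with every other row just to locate
two neighbors.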
i just think that it's funny

Whereas I'd call your approach sad. Perhaps pitiful would be closer.
the only thing that you're talking about doing is trying data for lots
and lots of time slices

Wrong in the particulars.
you store those time slices in a TABLE and then you can cartesian date
and flow where you want and you'll get whatever kind of ratio you're
looking for.

Cartesian is still gross overkill compared to simple array indexing,
but I'll take your word for it that it may be the best databases can
manage.
and it wont be iterative. you wont write a loop. it'll be a JOIN.

Go back a few of my posts. You'll see two simple SELECT queries for the
two records needed. Simple enough to join them to put everything into a
single record, then use an expression based on those fields to produce
the linear interpolation result.
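
A minimal sketch of that, against the same hypothetical norm(z, p)
table as above (again, the names are mine, not from any real schema):

-- two targeted one-row SELECTs joined into a single record;
-- the expression over the joined fields does the interpolation
SELECT (lo.z * (hi.p - 0.9) + hi.z * (0.9 - lo.p)) / (hi.p - lo.p)
       AS z_crit
FROM (SELECT TOP 1 z, p FROM norm WHERE p <= 0.9 ORDER BY p DESC) AS lo
CROSS JOIN
     (SELECT TOP 1 z, p FROM norm WHERE p > 0.9 ORDER BY p ASC) AS hi

Each subquery touches only the row it needs, and the join is one
record to one record.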

Other than the obvious (you have no clue how to do this), why would you
bother with a cartesian?
the only single reason that you think that databases 'arent as
powerful' is because they dont have all the cheesy little functions
you're looking for.

Excel doesn't have a linear interpolation function either, but it's
MUCH EASIER to write an expression to return a linear interpolation
result in Excel than it is in Access or SQL Server with all the add-on
software you seem to need to use.
 

aaron.kempf

no that's not true

if i knew what the formula was; it would be easier on the database
side-- because i'd write this function ONCE and use it anywhere--
instead of needing to copy this function to 2,000 different
spreadsheets

i just know that there isn't a damn thing that i can calculate in plain
old sql server. sql server-- MSDE-- is free with Windows effectively.

i just think that it's a shame that you guys haven't grown up. I
mean-- it is 2005. It's not really like Excel is MORE POWERFUL than
databases.

it's just sad that you work for a company that doesn't provide
databases for you.

i mean-- databases should be AVAILABLE
instead of each person having their own H drive; each person should
have their own SQL Server database-- start keeping this data in a place
where you can USE it instead of having to always be copying and pasting
data around

write simple reports off of this data

i mean-- you guys sit there and say 'oh but i dont make reports'-- yes
you do make reports.

you sit there and copy and paste your life away.

and there is a better way.

there are more efficient ways to do math than to have 20k copies of the
same formula with different arguments


-Aaron
 

Harlan Grove

(e-mail address removed) wrote...
no that's not true
¿Qué?

if i knew what the formula was; it would be easier on the database
side-- because i'd write this function ONCE and use it anywhere--
instead of needing to copy this function to 2,000 different
spreadsheets

On the spreadsheet side it's simple enough to write an interpolation
formula once in VBA.


Function li(xx As Double, x As Variant, y As Variant) As Variant
    'Linear interpolation of y at xx. Minimal error checking.
    'Assumes x sorted ascending, with y the corresponding values.
    Dim n As Long
    On Error Resume Next
    'Match with its 3rd argument omitted finds the last x entry <= xx.
    n = Application.WorksheetFunction.Match(xx, x)
    If Err.Number <> 0 Then
        li = CVErr(xlErrValue)  'no match: xx below the first x entry
    Else
        'weighted average of the two bracketing y values
        li = (y(n) * (x(n + 1) - xx) + y(n + 1) * (xx - x(n))) _
          / (x(n + 1) - x(n))
        If Err.Number <> 0 Then li = CVErr(xlErrRef)  'xx past last entry
    End If
    Err.Clear
End Function
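
For instance, called from a worksheet (the ranges here are just for
illustration, with the x values in A2:A100 and the y values in
B2:B100):

=li(0.9,A2:A100,B2:B100)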


Then I could call it as needed. Only problem is that UDFs in Excel are
slow, so I'd stick to the TREND formula approach.

=TREND(OFFSET(y,MATCH(xx,x),0,2,1),OFFSET(x,MATCH(xx,x),0,2,1),xx)

OFFSET carves the two bracketing entries out of x and y, and TREND
returns the value at xx of the straight line through those two points
(assuming the x and y names are anchored so the OFFSET lands on the
bracketing pair).
i just know that there isn't a damn thing that i can calculate in plain
old sql server. sql server-- MSDE-- is free with Windows effectively.

I agree. There isn't a damn thing you *CAN* calculate in plain old SQL
Server. [Proof reading's a bitch, ain't it?]
it's just sad that you work for a company that doesn't provide
databases for you.

As I've said before, I have Access on my own PC and Oracle on the
servers. They're not real RDBMSs? Only SQL Server suffices for you?
i mean-- you guys sit there and say 'oh but i dont make reports'-- yes
you do make reports.

Define 'report'.

I generate exhibits other people use. Most of them are based on tested
templates that haven't needed major revisions in years. And nearly all
the data in them comes from e-mail. Some cut & paste? From time to
time. More often it's just detaching files and parsing them. Generally
less than a half hour from e-mail receipt to completion of exhibit.
It'd take longer to bash the data into a database (it's from different
customers, so the formats are never the same).

Putting it differently, I don't use much in-house data, and when I do I
tend to use Access (believe it or not). I definitely don't produce
canned reports, which you seem to believe are the pinnacle of business
work product.
there are more efficient ways to do math than to have 20k copies of the
same formula with different arguments

If all these formulas were in memory at the same time, you'd have a
point. If spreadsheets like Excel didn't provide minimal recalculation,
you'd have a point. However, only the formulas in the open workbooks
are in memory and using the CPU, and only the formulas that need to be
recalced are recalced, and whether it's a spreadsheet with 1,000
identical formulas or a database calculating the same expression for
1,000 records, 1,000 calculations in one isn't much different than
1,000 calculations in the other.

If you mean storage redundancy, then you have a fair point. However,
it's a trade-off between storage usage and flexibility. You just don't
grasp that concept.
 