OT? Strategy For Handling Daily Accruals?

(PeteCresswell)

Bond trading system.

Every bond has a "Coupon", which is the percent of held value
that gets paid out to the owner.

The "Coupon" can change over time. For some bonds it changes
daily; others change monthly; still others change quarterly,
annually, or semi-annually.

The coupon changes are driven by some external reference rate -
the federal funds opening rate on the day in question, for
instance. That is, if a security's reference rate is fed funds open,
that index's rate on a particular day is added to something in
the security called a "Spread". If the fed funds opening rate
on 11/3 is 5 and Security.Spread is 2, the Security.Coupon for
that day is 7.
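
In table terms (with purely illustrative table and column names, and a
placeholder year in the date literal) that lookup might be something like:

    SELECT r.Rate + s.Spread AS EffectiveCoupon
    FROM tblSecurity s
    JOIN tblRate r ON r.RateTypeID = s.RateTypeID
    WHERE s.SecurityID = 123            -- some security whose reference rate is fed funds open
      AND r.RateDate = '2006-11-03';    -- returns 7 when that day's rate is 5 and the spread is 2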

Each bond also has a frequency of payment. Some pay daily,
others pay monthly, still others pay quarterly... and so forth.


Now, here's the fun part:

The users want to know the daily accrued payments for all bonds.

If something only pays monthly - say, on the first of the month -
they still want to know what fraction of that payment accrued
on, say, 9/5.

A bond's actual payment amount (the one that happens once a month
or whatever) is the sum of all the daily accruals since the last
payment.
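
By way of illustration - assuming a per-security/per-day accrual table
like the tblAccrualDaily proposed further down, and treating the dates and
IDs as placeholders - the roll-up is just an aggregate:

    SELECT SecurityID, SUM(PaymentAccrual) AS PaymentAmount
    FROM tblAccrualDaily
    WHERE SecurityID = 123
      AND AccrualDate >  '2006-09-01'    -- day of the last actual payment
      AND AccrualDate <= '2006-10-01'    -- day of the payment being accrued toward
    GROUP BY SecurityID;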

One of the jokers in the deck is that those reference rates might
change retroactively. Maybe somebody will mis-type a Libor
3-month rate on 9/3 and discover it on 9/18. That would change
the daily accruals of all bonds whose coupons are tied to the
Libor 3-month index. Ditto any change to the security's
properties like reset frequency, payment frequency, spread, or
which index is to be used for the reference rate.

So far, this sounds like an argument for a 100% normalized
architecture.


I probably should try it first, but my instinct is that
computing those payment amounts on the fly, in a timely manner,
as a security is loaded onto a screen (there could be hundreds or
thousands of them) is going to be beyond the capabilities of the
user's PC. And they *do* want to see that payment stream.

With that in mind, the only alternative I can come up with is a
denormalized architecture in which we have a "tblAccrualDaily"
which has one record for each security/day.

The columns would be SecurityID, Date, EffectiveCoupon,
CurrentHoldings, and PaymentAccrual
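
A minimal sketch of that table, with column types as assumptions (and
AccrualDate used in place of Date, since Date is typically a reserved
word):

    CREATE TABLE tblAccrualDaily (
        SecurityID      INTEGER NOT NULL,   -- FK to the security table
        AccrualDate     DATE    NOT NULL,
        EffectiveCoupon DOUBLE PRECISION,   -- that day's reference rate + spread
        CurrentHoldings DECIMAL(18,4),      -- position held on that day
        PaymentAccrual  DECIMAL(18,4),      -- that day's accrued payment
        PRIMARY KEY (SecurityID, AccrualDate)
    );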

We'd generate all a security's records at one time when the
security was created - populating only SecurityID and Date,
leaving the Coupon, Holdings, and Accrual to be filled in each time
the calendar rolls over and/or one of the properties that affects
them changes. e.g. Somebody changes or adds a reference rate, we
seek out all the affected records and update them. Somebody
sells some of a security, same thing.
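
As a hedged sketch of that "seek out and update" step - assuming a
tblRate table holding one row per rate type per day, and a tblSecurity
table carrying each security's RateTypeID and Spread (all names invented
for illustration):

    -- Restate the effective coupon for every security tied to the corrected rate/day
    UPDATE tblAccrualDaily
    SET EffectiveCoupon = (
            SELECT r.Rate + s.Spread
            FROM tblRate r, tblSecurity s
            WHERE s.SecurityID = tblAccrualDaily.SecurityID
              AND r.RateTypeID = s.RateTypeID
              AND r.RateDate   = tblAccrualDaily.AccrualDate
        )
    WHERE AccrualDate = '2006-09-03'             -- the day whose rate was mistyped (year is a placeholder)
      AND SecurityID IN (SELECT SecurityID
                         FROM tblSecurity
                         WHERE RateTypeID = 3);  -- the corrected index, e.g. 3-month Libor

    -- PaymentAccrual would then be recomputed from the new coupon and the
    -- holdings, using whatever accrual formula the business actually uses.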


Bottom Line: I'd beg somebody with experience to comment on this
strategy. I'm relatively weak on SQL and suspect that there
might be an SQL-based solution that could get around the
denormalization.
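
For what it's worth, one sketch of the kind of set-based query that might
compute the accruals on the fly from normalized tables - assuming a daily
rate table (tblRate), a daily holdings table (tblHoldingDaily), and a
simple coupon/100/365 formula standing in for whatever day-count convention
actually applies (all of these are assumptions):

    SELECT s.SecurityID,
           r.RateDate,
           r.Rate + s.Spread                                AS EffectiveCoupon,
           h.Holdings                                       AS CurrentHoldings,
           h.Holdings * (r.Rate + s.Spread) / 100.0 / 365.0 AS PaymentAccrual
    FROM tblSecurity s
    JOIN tblRate r         ON r.RateTypeID  = s.RateTypeID
    JOIN tblHoldingDaily h ON h.SecurityID  = s.SecurityID
                          AND h.HoldingDate = r.RateDate
    WHERE s.SecurityID = 123
      AND r.RateDate BETWEEN '2006-09-01' AND '2006-09-30';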
 

Rod Plastow

Pete,

My instinctive reaction is to stick with your first intuition and normalise
the data. Have a table of external rates - whether you denormalise and hold
a rate for each and every day even if there is no change, or just hold the
rate when it changes, is a matter of choice - weigh larger file size and more
update activity against more complex programming (SQL). (You don't mention
whether rates are entered daily or just when they change.)

This rate table will be a child of a rate type/source table. The rate
type/source table is also a parent for the bond coupon table. You don't say
whether the spread is related to the coupon or the rate type - I suspect the
latter.
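
A rough sketch of that parent/child structure, with names invented for
illustration (Pete confirms further down that the spread belongs to the
security itself, so it is shown there):

    CREATE TABLE tblRateType (       -- rate type/source, e.g. Fed Funds Open, 3-month Libor
        RateTypeID   INTEGER PRIMARY KEY,
        RateTypeName VARCHAR(50)
    );

    CREATE TABLE tblRate (           -- child: one row per rate type per day (or per change)
        RateTypeID   INTEGER REFERENCES tblRateType (RateTypeID),
        RateDate     DATE,
        Rate         DOUBLE PRECISION,
        PRIMARY KEY (RateTypeID, RateDate)
    );

    CREATE TABLE tblSecurity (       -- each security points at its reference rate type
        SecurityID       INTEGER PRIMARY KEY,
        RateTypeID       INTEGER REFERENCES tblRateType (RateTypeID),
        Spread           DOUBLE PRECISION,
        PaymentFrequency VARCHAR(20)
    );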

OK, you're worried about performance in calculating the accruals. Some
questions here: do your users need the accruals for every coupon at the same
time or are they interrogating accruals on an individual basis; what is an
acceptable response time for them, what is the frequency of these
interrogations? Unless you are working with huge data and your users want
'instantaneous' accruals for every coupon every 5 minutes I think a
normalised design will hold up.

But even if it doesn't, you have not wasted your time; your data is in
good shape and you can cope with changes and new requirements all the more
easily. If you do need to improve performance, use a data warehousing
technique of generating processed denormalised records just for interrogation
purposes; there is no update of these records; they are in fact 'thrown away'
and replaced as and when necessary. The generation of these records can take
place at quieter times or in the background. (Does a correction to a rate have
to be reflected immediately? After all, in your example the users have lived
with incorrect data for a week or so. Can the correction wait? Even if not,
you only need to regenerate the changes.)
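
In that 'throw away and regenerate' style, the interrogation copy could be
rebuilt in the background with a delete-and-reload, e.g. (tblAccrualReport
and the coupon/100/365 formula are stand-ins, as before):

    -- Throw away the reporting rows for securities tied to the corrected rate type...
    DELETE FROM tblAccrualReport
    WHERE SecurityID IN (SELECT SecurityID FROM tblSecurity WHERE RateTypeID = 3);

    -- ...and regenerate them from the normalised tables
    INSERT INTO tblAccrualReport
        (SecurityID, AccrualDate, EffectiveCoupon, CurrentHoldings, PaymentAccrual)
    SELECT s.SecurityID, r.RateDate, r.Rate + s.Spread, h.Holdings,
           h.Holdings * (r.Rate + s.Spread) / 100.0 / 365.0
    FROM tblSecurity s
    JOIN tblRate r         ON r.RateTypeID  = s.RateTypeID
    JOIN tblHoldingDaily h ON h.SecurityID  = s.SecurityID
                          AND h.HoldingDate = r.RateDate
    WHERE s.RateTypeID = 3;    -- only the affected securities need regenerating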

I think you may have had something like this in the back of your mind when
you wrote the post. My reason for responding is to urge you to develop and
implement the normalised solution first. You won't regret it.

I have in the past written something similar. My problem was to make
up-to-date debtor ledger and chart of account balances available at all
times. Rather than calculate the balance every time it was required I
decided to denormalise the design and maintain balances. I then wrote an
'engine' that would propagate changes up through the chart of accounts and
forward through open accounting periods on sections of both ledgers - in fact
it did not propagate a 'change' but resummed all the individual transactions
including the new or changed transactions. Then I did not want to do this
every time something was added, deleted or changed so I recorded transaction
activity in a queue. When a balance was required the queue was interrogated
and if an entry in the queue affected the requested balance the whole queue
was processed.
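
A very rough sketch of that queue idea, translated to the accruals problem
(table and column names are invented; the regeneration step would be whatever
rebuild query is eventually settled on):

    -- Record that something affecting a security changed, without recomputing anything yet
    INSERT INTO tblRecalcQueue (SecurityID, QueuedAt)
    VALUES (123, CURRENT_TIMESTAMP);

    -- When an accrual or payment is requested, check whether anything queued affects it...
    SELECT COUNT(*) AS PendingChanges
    FROM tblRecalcQueue
    WHERE SecurityID = 123;

    -- ...and if the count is non-zero, resum/regenerate the affected rows and clear the queue
    DELETE FROM tblRecalcQueue;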

I'd be very interested to know what you decide and how you get on.

Regards,

Rod
 

(PeteCresswell)

Per Rod Plastow:
> Pete,
>
> My instinctive reaction is to stick with your first intuition and normalise
> the data. Have a table of external rates - whether you denormalise and hold
> a rate for each and every day even if there is no change, or just hold the
> rate when it changes, is a matter of choice - weigh larger file size and more
> update activity against more complex programming (SQL). (You don't mention
> whether rates are entered daily or just when they change.)

Thanks.... expanding the external rate table to one per day
hadn't occurred to me... obvious as it now seems....

> This rate table will be a child of a rate type/source table. The rate
> type/source table is also a parent for the bond coupon table. You don't say
> whether the spread is related to the coupon or the rate type - I suspect the
> latter.

Spread is related to the security itself.

> OK, you're worried about performance in calculating the accruals. Some
> questions here: do your users need the accruals for every coupon at the same
> time or are they interrogating accruals on an individual basis; what is an
> acceptable response time for them, what is the frequency of these
> interrogations? Unless you are working with huge data and your users want
> 'instantaneous' accruals for every coupon every 5 minutes I think a
> normalised design will hold up.

They want the accruals (by implication, since accruals roll up
into payments...) every time they walk a list of securities. As
they move to the next security, all kinds of stuff related to the
security gets loaded. viz: http://tinyurl.com/ytdcff

> But even if it doesn't, you have not wasted your time; your data is in
> good shape and you can cope with changes and new requirements all the more
> easily. If you do need to improve performance, use a data warehousing
> technique of generating processed denormalised records just for interrogation
> purposes; there is no update of these records; they are in fact 'thrown away'
> and replaced as and when necessary. The generation of these records can take
> place at quieter times or in the background. (Does a correction to a rate have
> to be reflected immediately? After all, in your example the users have lived
> with incorrect data for a week or so. Can the correction wait? Even if not,
> you only need to regenerate the changes.)
>
> I think you may have had something like this in the back of your mind when
> you wrote the post. My reason for responding is to urge you to develop and
> implement the normalised solution first. You won't regret it.

That's my gut feeling - in spite of the horsepower-related
doubts.

Seems like whenever I deviate from the Good-Right-And-Holy path
in that respect, I live to regret it.... stuff just pyramids so
rapidly...

> I have in the past written something similar. My problem was to make
> up-to-date debtor ledger and chart of account balances available at all
> times. Rather than calculate the balance every time it was required I
> decided to denormalise the design and maintain balances. I then wrote an
> 'engine' that would propagate changes up through the chart of accounts and
> forward through open accounting periods on sections of both ledgers - in fact
> it did not propagate a 'change' but resummed all the individual transactions
> including the new or changed transactions. Then I did not want to do this
> every time something was added, deleted or changed so I recorded transaction
> activity in a queue. When a balance was required the queue was interrogated
> and if an entry in the queue affected the requested balance the whole queue
> was processed.
>
> I'd be very interested to know what you decide and how you get on.

I'll go the straight normalized path first.

If that bogs down too much, I'll do the read-only shadow file
thing.

Give me a week or so.

If you don't hear anything, ping me at FirstName dot LastName at
Fatbelly fullstop Com.

Thanks for the expertise.
 
