NickV
Hi,
I was wondering if somebody knows the answer to my problem, which is
the following:
I have a dataset of 100 columns, each representing a year. Every row
represents a house that has been sold several times during those 100
years. For example, if a house is sold in 1843 for $3000, next in 187
for $4000, and in 1900 for $10000, I would like to compute the return
for each of those transactions.
The problem is that a varying number of empty columns lies between
the columns in which the transactions are recorded. Does anybody know
how to compute the differences between these values within a row?
Thanks,
regards,
Nic
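The computation described in the question can be sketched in plain Python, assuming each row is a list of sale prices aligned to consecutive year columns, with a missing-value marker (here `None`) in years with no transaction. The helper name `row_returns` and the positions of the example sales are hypothetical; only the prices $3000, $4000, and $10000 come from the question.

```python
def row_returns(prices):
    """Given one row: a list of sale prices aligned to consecutive
    year columns (None where no transaction occurred), return the
    simple return between each pair of consecutive recorded sales."""
    # Drop the empty columns so only the recorded sales remain,
    # in chronological (column) order.
    sales = [p for p in prices if p is not None]
    # Return between consecutive sales: (new - old) / old.
    return [(new - old) / old for old, new in zip(sales, sales[1:])]


# Example: a 100-column row with three sales of $3000, $4000, $10000.
# The column positions chosen here are arbitrary illustrations.
row = [None] * 100
row[0], row[30], row[57] = 3000, 4000, 10000
print(row_returns(row))
```

The same idea carries over to any table library: filter out the missing cells of the row first, then difference the survivors, so the varying number of empty columns between transactions never matters.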