Calculate actual trunk utilization based on call start/duration


nathaniel.watkins

I am 'attempting' to calculate our phone line utilization at work. The
end goal is to determine how many lines we actually use at any given
point. I thought this would be an easy problem; however, the more I
work on solving it, the more difficult it becomes.



I have roughly 250,000 calls (I'm obviously using Excel 2007) that I
am attempting to run calculations against. Here is a sampling of the
data.



Start Duration End

3/19/07 6:39:59 AM 143 3/19/07 6:42:22 AM
3/19/07 6:45:7 AM 9 3/19/07 6:45:16 AM
3/19/07 6:54:25 AM 14 3/19/07 6:54:39 AM
3/19/07 7:03:25 AM 50 3/19/07 7:04:15 AM
3/19/07 7:04:4 AM 18 3/19/07 7:04:22 AM
3/19/07 7:07:6 AM 25 3/19/07 7:07:31 AM
3/19/07 7:10:41 AM 267 3/19/07 7:15:8 AM
3/19/07 7:12:30 AM 29 3/19/07 7:12:59 AM
3/19/07 7:12:41 AM 18 3/19/07 7:12:59 AM
3/19/07 7:37:45 AM 37 3/19/07 7:38:22 AM
3/19/07 7:42:50 AM 9 3/19/07 7:42:59 AM
3/19/07 7:43:37 AM 7 3/19/07 7:43:44 AM
3/19/07 7:44:58 AM 43 3/19/07 7:45:41 AM
3/19/07 7:53:46 AM 75 3/19/07 7:55:1 AM
3/19/07 7:56:30 AM 34 3/19/07 7:57:4 AM
3/19/07 7:59:38 AM 312 3/19/07 8:04:50 AM
3/19/07 8:00:2 AM 9 3/19/07 8:00:11 AM
3/19/07 8:00:7 AM 28 3/19/07 8:00:35 AM
3/19/07 8:01:7 AM 32 3/19/07 8:01:39 AM
3/19/07 8:01:13 AM 45 3/19/07 8:01:58 AM
3/19/07 8:02:36 AM 23 3/19/07 8:02:59 AM
3/19/07 8:05:53 AM 2 3/19/07 8:05:55 AM
3/19/07 8:10:33 AM 21 3/19/07 8:10:54 AM
3/19/07 8:12:14 AM 21 3/19/07 8:12:35 AM
3/19/07 8:13:5 AM 43 3/19/07 8:13:48 AM
3/19/07 8:14:13 AM 32 3/19/07 8:14:45 AM
3/19/07 8:14:17 AM 33 3/19/07 8:14:50 AM
3/19/07 8:14:21 AM 6 3/19/07 8:14:27 AM
3/19/07 8:14:21 AM 33 3/19/07 8:14:54 AM
3/19/07 8:14:29 AM 32 3/19/07 8:15:1 AM
3/19/07 8:14:39 AM 41 3/19/07 8:15:20 AM
3/19/07 8:15:30 AM 81 3/19/07 8:16:51 AM
3/19/07 8:18:21 AM 587 3/19/07 8:28:8 AM
3/19/07 8:18:55 AM 56 3/19/07 8:19:51 AM
3/19/07 8:20:44 AM 148 3/19/07 8:23:12 AM
3/19/07 8:20:52 AM 1002 3/19/07 8:37:34 AM
3/19/07 8:21:42 AM 23 3/19/07 8:22:5 AM
3/19/07 8:22:43 AM 33 3/19/07 8:23:16 AM
3/19/07 8:22:59 AM 855 3/19/07 8:37:14 AM
3/19/07 8:23:50 AM 1087 3/19/07 8:41:57 AM
3/19/07 8:25:58 AM 22 3/19/07 8:26:20 AM
3/19/07 8:27:57 AM 42 3/19/07 8:28:39 AM
3/19/07 8:28:53 AM 17 3/19/07 8:29:10 AM
3/19/07 8:29:8 AM 10 3/19/07 8:29:18 AM
3/19/07 8:30:1 AM 58 3/19/07 8:30:59 AM
3/19/07 8:30:27 AM 196 3/19/07 8:33:43 AM
3/19/07 8:31:49 AM 2 3/19/07 8:31:51 AM
3/19/07 8:31:56 AM 221 3/19/07 8:35:37 AM
3/19/07 8:32:16 AM 535 3/19/07 8:41:11 AM
3/19/07 8:32:26 AM 178 3/19/07 8:35:24 AM
3/19/07 8:32:35 AM 18 3/19/07 8:32:53 AM
3/19/07 8:32:47 AM 35 3/19/07 8:33:22 AM
3/19/07 8:33:5 AM 28 3/19/07 8:33:33 AM
3/19/07 8:33:11 AM 1 3/19/07 8:33:12 AM
3/19/07 8:33:19 AM 68 3/19/07 8:34:27 AM
3/19/07 8:34:11 AM 143 3/19/07 8:36:34 AM
3/19/07 8:34:17 AM 177 3/19/07 8:37:14 AM
3/19/07 8:36:20 AM 17 3/19/07 8:36:37 AM
3/19/07 8:36:26 AM 57 3/19/07 8:37:23 AM
3/19/07 8:36:39 AM 333 3/19/07 8:42:12 AM
3/19/07 8:38:22 AM 44 3/19/07 8:39:6 AM
3/19/07 8:39:0 AM 2 3/19/07 8:39:2 AM
3/19/07 8:40:4 AM 85 3/19/07 8:41:29 AM
3/19/07 8:40:28 AM 2 3/19/07 8:40:30 AM
3/19/07 8:41:3 AM 30 3/19/07 8:41:33 AM
3/19/07 8:42:14 AM 511 3/19/07 8:50:45 AM
3/19/07 8:42:22 AM 882 3/19/07 8:57:4 AM
3/19/07 8:42:24 AM 94 3/19/07 8:43:58 AM






This 'simple' problem is probably the most difficult issue I have
encountered. The trouble is that a short phone call ties up a line but
then releases it again - so getting totals has proved difficult.



In the end, I would like to be able to graph the results so I can see
visually how many times we have hit certain line counts.



I am hoping to merge several sites together via VOIP and need a way to
determine how many lines we actually need.
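The question being asked, the peak number of overlapping time intervals, has a standard answer outside Excel as well: sort the call boundaries and sweep through them. A minimal Python sketch (the three calls below are hypothetical, modeled on the sample data):

```python
from datetime import datetime

# Hypothetical (start, end) pairs modeled on the sample rows above.
calls = [
    (datetime(2007, 3, 19, 8, 0, 2),   datetime(2007, 3, 19, 8, 0, 11)),
    (datetime(2007, 3, 19, 8, 0, 7),   datetime(2007, 3, 19, 8, 0, 35)),
    (datetime(2007, 3, 19, 7, 59, 38), datetime(2007, 3, 19, 8, 4, 50)),
]

def max_concurrent(calls):
    """Sweep-line: +1 at each call start, -1 at each call end."""
    events = []
    for start, end in calls:
        events.append((start, 1))
        events.append((end, -1))
    # Sort ends before starts at the same instant, so a call ending
    # exactly when another begins does not count as an overlap.
    events.sort(key=lambda e: (e[0], e[1]))
    active = peak = 0
    for _, delta in events:
        active += delta
        peak = max(peak, active)
    return peak

print(max_concurrent(calls))  # all three calls overlap around 8:00:07 -> 3
```

Because this only sorts and scans the boundaries, 250,000 calls pose no problem.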



Good Luck and Thanks,



Nathaniel
 

Barb Reinhardt

Let's say the Start data is in column A1:A68 and the End data is in column
C1:C68.

Put this in D1 and copy down.

=SUMPRODUCT(--($A$1:$A$68>=A1),--($C$1:$C$68<=C1))
 

Barb Reinhardt

Now that I think of it, this may not be exactly what you want. But it's
close.

You could set up another table with START/END time increments of 1 minute or
10 minutes (whatever you want) and use a similar SUMPRODUCT calculation. If
you need help, let me know.
 

Joel

I haven't seen this problem in almost 30 years, since I was going for my
masters in computer science at Stevens Institute of Technology, where many
of the courses were taught by engineers from Bell Labs.

I'm using two worksheets. Sheet1 contains your input data. I'm not sure
which data is in which columns; I assumed the following:

A: Date
B: Start Time
C: Phone Line
D: Date
E: End time

In Sheet2 the macro creates a row for each minute (0 to 1439; there are
1,440 minutes in a day). The code then looks at each phone call and adds 1
to column B of Sheet2 for every minute the call was active, then repeats
the process for the next call.


The code is only set up for one day. You will probably want to modify it so
that a different column in Sheet2 is used for each day. If you need
additional help, let me know.



Sub GetUsage()

    Dim RowCount As Long, MyMinute As Long, RowOffset As Long
    Dim StartTime As Date, EndTime As Date, MyTime As Date
    Dim c As Range

    'Fill Sheet2 column A with one row per minute of the day
    RowCount = 1
    For MyMinute = 0 To 1439
        With Sheets("Sheet2")
            .Range("A" & RowCount) = _
                TimeSerial(Int(MyMinute / 60), MyMinute Mod 60, 0)
            RowCount = RowCount + 1
        End With
    Next MyMinute

    'For each call, add 1 to column B of every minute row it was active in
    RowCount = 1
    With Sheets("Sheet1")
        Do While .Range("A" & RowCount) <> ""
            StartTime = .Range("B" & RowCount)
            EndTime = .Range("E" & RowCount)
            'Round the start time down to the minute it falls in
            MyTime = TimeSerial(Hour(StartTime), Minute(StartTime), 0)

            With Sheets("Sheet2")
                Set c = .Columns("A:A").Find(What:=MyTime, _
                    LookIn:=xlValues, LookAt:=xlWhole)
                If Not c Is Nothing Then
                    RowOffset = 0
                    Do While c.Offset(RowOffset, 0) <= EndTime
                        c.Offset(RowOffset, 1) = _
                            c.Offset(RowOffset, 1) + 1
                        RowOffset = RowOffset + 1
                        'Stop at the last minute row of the day
                        If c.Row + RowOffset > 1440 Then Exit Do
                    Loop
                End If
            End With
            RowCount = RowCount + 1
        Loop
    End With

End Sub
 

Gary's Student

This is a GREAT problem! It falls in the area of Resource Utilization and
many thick books have been written on the topic. You can use Excel with some
simple formulas to get information about your data. Let’s say we split the
day up into 1 minute intervals and we want to know how many calls fell into
each interval.

I took your data and put the start date/time in column A, the duration in
column B, and the end date/time in column C. I began in row #3. In D1 and
D2 we enter:

3/19/2007 8:00:00
3/19/2007 8:01:00

the beginning and end of a typical one minute window. In D3 we enter:
=($D$2>A3)*($D$1<C3) and copy down. We see in A1 thru D70:

3/19/2007 8:00:00
3/19/2007 8:01:00
3/19/2007 6:39:59 143 3/19/2007 6:42:22 0
3/19/2007 6:45:07 9 3/19/2007 6:45:16 0
3/19/2007 6:54:25 14 3/19/2007 6:54:39 0
3/19/2007 7:03:25 50 3/19/2007 7:04:15 0
3/19/2007 7:04:04 18 3/19/2007 7:04:22 0
3/19/2007 7:07:06 25 3/19/2007 7:07:31 0
3/19/2007 7:10:41 267 3/19/2007 7:15:08 0
3/19/2007 7:12:30 29 3/19/2007 7:12:59 0
3/19/2007 7:12:41 18 3/19/2007 7:12:59 0
3/19/2007 7:37:45 37 3/19/2007 7:38:22 0
3/19/2007 7:42:50 9 3/19/2007 7:42:59 0
3/19/2007 7:43:37 7 3/19/2007 7:43:44 0
3/19/2007 7:44:58 43 3/19/2007 7:45:41 0
3/19/2007 7:53:46 75 3/19/2007 7:55:01 0
3/19/2007 7:56:30 34 3/19/2007 7:57:04 0
3/19/2007 7:59:38 312 3/19/2007 8:04:50 1
3/19/2007 8:00:02 9 3/19/2007 8:00:11 1
3/19/2007 8:00:07 28 3/19/2007 8:00:35 1
3/19/2007 8:01:07 32 3/19/2007 8:01:39 0
3/19/2007 8:01:13 45 3/19/2007 8:01:58 0
3/19/2007 8:02:36 23 3/19/2007 8:02:59 0
3/19/2007 8:05:53 2 3/19/2007 8:05:55 0
3/19/2007 8:10:33 21 3/19/2007 8:10:54 0
3/19/2007 8:12:14 21 3/19/2007 8:12:35 0
3/19/2007 8:13:05 43 3/19/2007 8:13:48 0
3/19/2007 8:14:13 32 3/19/2007 8:14:45 0
3/19/2007 8:14:17 33 3/19/2007 8:14:50 0
3/19/2007 8:14:21 6 3/19/2007 8:14:27 0
3/19/2007 8:14:21 33 3/19/2007 8:14:54 0
3/19/2007 8:14:29 32 3/19/2007 8:15:01 0
3/19/2007 8:14:39 41 3/19/2007 8:15:20 0
3/19/2007 8:15:30 81 3/19/2007 8:16:51 0
3/19/2007 8:18:21 587 3/19/2007 8:28:08 0
3/19/2007 8:18:55 56 3/19/2007 8:19:51 0
3/19/2007 8:20:44 148 3/19/2007 8:23:12 0
3/19/2007 8:20:52 1002 3/19/2007 8:37:34 0
3/19/2007 8:21:42 23 3/19/2007 8:22:05 0
3/19/2007 8:22:43 33 3/19/2007 8:23:16 0
3/19/2007 8:22:59 855 3/19/2007 8:37:14 0
3/19/2007 8:23:50 1087 3/19/2007 8:41:57 0
3/19/2007 8:25:58 22 3/19/2007 8:26:20 0
3/19/2007 8:27:57 42 3/19/2007 8:28:39 0
3/19/2007 8:28:53 17 3/19/2007 8:29:10 0
3/19/2007 8:29:08 10 3/19/2007 8:29:18 0
3/19/2007 8:30:01 58 3/19/2007 8:30:59 0
3/19/2007 8:30:27 196 3/19/2007 8:33:43 0
3/19/2007 8:31:49 2 3/19/2007 8:31:51 0
3/19/2007 8:31:56 221 3/19/2007 8:35:37 0
3/19/2007 8:32:16 535 3/19/2007 8:41:11 0
3/19/2007 8:32:26 178 3/19/2007 8:35:24 0
3/19/2007 8:32:35 18 3/19/2007 8:32:53 0
3/19/2007 8:32:47 35 3/19/2007 8:33:22 0
3/19/2007 8:33:05 28 3/19/2007 8:33:33 0
3/19/2007 8:33:11 1 3/19/2007 8:33:12 0
3/19/2007 8:33:19 68 3/19/2007 8:34:27 0
3/19/2007 8:34:11 143 3/19/2007 8:36:34 0
3/19/2007 8:34:17 177 3/19/2007 8:37:14 0
3/19/2007 8:36:20 17 3/19/2007 8:36:37 0
3/19/2007 8:36:26 57 3/19/2007 8:37:23 0
3/19/2007 8:36:39 333 3/19/2007 8:42:12 0
3/19/2007 8:38:22 44 3/19/2007 8:39:06 0
3/19/2007 8:39:00 2 3/19/2007 8:39:02 0
3/19/2007 8:40:04 85 3/19/2007 8:41:29 0
3/19/2007 8:40:28 2 3/19/2007 8:40:30 0
3/19/2007 8:41:03 30 3/19/2007 8:41:33 0
3/19/2007 8:42:14 511 3/19/2007 8:50:45 0
3/19/2007 8:42:22 882 3/19/2007 8:57:04 0
3/19/2007 8:42:24 94 3/19/2007 8:43:58 0

This means that three calls fell into the window. Now we just continue with
column E for the next interval, etc. In the end we can create a
minute-by-minute histogram of phone usage.

But this is only the beginning. If the three calls were tiny, they may not
overlap at all, or they may overlap completely! I suggest you begin by
checking out:

http://www.google.com/search?hl=en&q=algorithm+resource+utilization&btnG=Google+Search

to see if a solution has already been published.
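The D3 formula is the standard interval-overlap test: a call touches the window iff it starts before the window ends and ends after the window starts. The same predicate in Python, with times reduced to seconds since midnight for brevity (the `hms` helper is illustrative, not from the thread):

```python
def hms(h, m, s=0):
    """Convert hours/minutes/seconds to seconds since midnight."""
    return h * 3600 + m * 60 + s

def overlaps(win_start, win_end, call_start, call_end):
    """Equivalent of the sheet formula =($D$2>A3)*($D$1<C3)."""
    return win_end > call_start and win_start < call_end

window = (hms(8, 0), hms(8, 1))           # the 8:00:00-8:01:00 window
call_a = (hms(7, 59, 38), hms(8, 4, 50))  # spans the window -> counted
call_b = (hms(8, 1, 7), hms(8, 1, 39))    # starts after it  -> not counted

print(overlaps(*window, *call_a), overlaps(*window, *call_b))  # True False
```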
 

nathaniel.watkins

All very good advice. Gary's Student - as you point out, I am trying
to avoid doing a time-based analysis since it doesn't tell me how many
lines were overlapping (thirty 2-second calls could tie up anywhere from 1
line to 30). We have a software package that does trunk analysis, but
it does the same thing - how 'utilized' is each individual trunk on an
hourly basis...

My end goal would be to have a number for each row that shows how many
lines are being used at that instant. Here is my initial
thought process (it is of course not right, but I'm hoping it will be
the building blocks of a possible solution):

Start Duration End 1 2 3 4 5 6 7
3/19/07 12:25:12 AM 73 3/19/07 12:26:25 AM 0 0 0 0 0 0 0

in E2 - I have the following formula:
=IF($C2>=INDIRECT("A" & ROW(C2)+E$1),1,0)

I then copied this formula to the right about 40 columns and down the
entire range of my spreadsheet. This then shows me (line by line) how
many calls the current call spans.

in AR2 - I have:
=SUM(INDIRECT("E"&ROW(AT2)&":E"&ROW(AT2)+SUM(E2:AQ2))) - this will
basically take and sum the 1s in column E, starting with AR2 and going
down however many calls this call spans (which in this example is 0)
- but longer duration calls will obviously span more calls.

I will copy a few thousand of the call records and upload them to a
website in the event anyone wants to actually see this spreadsheet. I
find this problem fascinating; it seems so simple at first glance... but
it is certainly one of the hardest (if not the hardest) real-life Excel
crunching scenarios I have seen.

Thanks again for your skills.

Nathaniel
 

Ivyleaf

Hi,

I might be on the wrong track altogether, but this looks like an
Access problem to me. How I tackled it (and I know there are better
ways in Access, but it's late and I'm not thinking clearly) was as
follows:

Table 1: (Named "Sheet1")

Start Duration End
19/03/2007 6:39:59 AM 143 19/03/2007 6:42:22 AM
19/03/2007 6:45:07 AM 9 19/03/2007 6:45:16 AM
19/03/2007 6:54:25 AM 14 19/03/2007 6:54:39 AM
etc.
(Duration column is redundant)

Table 2: (Named "Sheet2")

Time
19/03/2007 6:00:00 AM
19/03/2007 6:01:00 AM
19/03/2007 6:02:00 AM
19/03/2007 6:03:00 AM
19/03/2007 6:04:00 AM
19/03/2007 6:05:00 AM
19/03/2007 6:06:00 AM
etc.

Query1:

SELECT "Data" AS Data, Sheet2.Time, Sum(IIf([time] Between [Start] And
[End],1,0)) AS Engaged
FROM Sheet2, Sheet1
GROUP BY "Data", Sheet2.Time;

This gives you a list of times, one row per increment in Table 2. In the
Engaged column, there will be a number representing the total number
of lines in use at that time. From here, there are two options:

1. Switch to PivotChart View
Drag the "Engaged" field to the data area
Drag an appropriate time interval to the category axis (I chose
Hours)

2. Build a Crosstab from this query:
TRANSFORM Sum(Query1.Engaged) AS SumOfEngaged
SELECT Query1.Data
FROM Query1
GROUP BY Query1.Data
PIVOT Format(Query1.Time,"hh");

This will give you a summarised table that you can then chart with
Excel, showing the total phone minutes in each hour.
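Query1's cross-join amounts to: for every sampled time, count the calls whose interval contains it. A small Python sketch of that logic (the intervals and sample step are toy values, not from the thread):

```python
def engaged_at(t, calls):
    """Mirror of Sum(IIf([time] Between [Start] And [End],1,0)):
    count the calls whose inclusive [start, end] interval contains t."""
    return sum(1 for start, end in calls if start <= t <= end)

calls = [(10, 20), (15, 40), (35, 50)]  # toy call intervals
samples = range(0, 60, 5)               # "Table 2": one row per 5 units

# One Engaged value per sampled time, like the query's result set.
profile = {t: engaged_at(t, calls) for t in samples}
print(profile[15], profile[45])  # 2 1
```

Note that, as with the SQL, this is a full cross join: every sample is tested against every call, which is why the row count matters so much later in the thread.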

Cheers,
Ivan.
 

Ivyleaf

Hi Nathaniel,

Reading your reply to Gary's Student, I changed Table 2 to intervals
of 1 second rather than 1 minute, which avoids the problem of two calls
placed in the same window that don't actually overlap. You could then
still run the crosstab query, but instead of the Sum function change to
the Max function. This would then give you a profile of the maximum
number of simultaneous calls in any specified interval. From this
analysis, your busiest period in the data you provided was 10
simultaneous calls at 8:33 AM. The only issue I can see here is that
the 'Between' operator may have to be replaced with a different formula
that checks for "between or equal to", just to
catch those one or two calls that are less than three seconds.
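The Max-of-Engaged crosstab reduces, conceptually, to taking the peak engaged count within each time bucket. A Python sketch with made-up per-second values:

```python
from collections import defaultdict

# Toy data: second-of-day -> number of lines engaged at that second.
per_second = {0: 1, 1: 2, 3599: 5, 3600: 3, 7200: 0}

# Peak simultaneous calls within each hour bucket (second // 3600).
peak_per_hour = defaultdict(int)
for second, engaged in per_second.items():
    hour = second // 3600
    peak_per_hour[hour] = max(peak_per_hour[hour], engaged)

print(dict(peak_per_hour))  # {0: 5, 1: 3, 2: 0}
```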

Cheers,
Ivan.

 

Ivyleaf

Just out of interest (to see if it agrees with anyone else's results),
here is the data summary that I ended up with:

Max of Engaged
Hours Time Total
6 AM :39 1
:40 1
:41 1
:42 1
:43 0
:44 0
:45 1
:46 0
:47 0
:48 0
:49 0
:50 0
:51 0
:52 0
:53 0
:54 1
:55 0
:56 0
:57 0
:58 0
:59 0
7 AM :00 0
:01 0
:02 0
:03 1
:04 2
:05 0
:06 0
:07 1
:08 0
:09 0
:10 1
:11 1
:12 3
:13 1
:14 1
:15 1
:16 0
:17 0
:18 0
:19 0
:20 0
:21 0
:22 0
:23 0
:24 0
:25 0
:26 0
:27 0
:28 0
:29 0
:30 0
:31 0
:32 0
:33 0
:34 0
:35 0
:36 0
:37 1
:38 1
:39 0
:40 0
:41 0
:42 1
:43 1
:44 1
:45 1
:46 0
:47 0
:48 0
:49 0
:50 0
:51 0
:52 0
:53 1
:54 1
:55 1
:56 1
:57 1
:58 0
:59 1
8 AM :00 3
:01 3
:02 2
:03 1
:04 1
:05 1
:06 0
:07 0
:08 0
:09 0
:10 1
:11 0
:12 1
:13 1
:14 5
:15 2
:16 1
:17 0
:18 2
:19 2
:20 3
:21 4
:22 5
:23 5
:24 4
:25 5
:26 5
:27 5
:28 5
:29 5
:30 5
:31 5
:32 9
:33 10
:34 9
:35 8
:36 8
:37 7
:38 4
:39 5
:40 5
:41 5
:42 3
:43 3
:44 2
:45 2
:46 2
:47 2
:48 2
:49 2
:50 2
:51 1
:52 1
:53 1
:54 1
:55 1
:56 1
:57 1
:58 0
:59 0
9 AM :00 0
Grand Total 10


 

nwatkins

You have solved the issue of using minutes (seconds will certainly
work) - my only concern is that I have 250,000 calls that will span
roughly 23 million seconds. If I understand you correctly, I will
have 23 million rows (one for each second) that will show how many
calls were in session at that second. Not sure how well Access will
do returning that quantity of data. Doing a max will give me the
total results (which is farther than I have gotten) - but doing a max
will not let me see how often that max is reached.

Perhaps I could take the dataset that is returned and group based on
the result and then do a count for how many times that interval is
reached.

i.e.
0 - 5,000
1 - 10,000
2 - 6,000
.....
18 - 5
19 - 2

etc...
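That grouping step is a plain frequency tally over the per-second counts. In Python terms, with toy data:

```python
from collections import Counter

# Toy data: number of lines engaged during each successive second.
per_second_counts = [0, 0, 1, 2, 2, 2, 1, 0, 3, 2]

# Tally how many seconds saw each concurrency level.
frequency = Counter(per_second_counts)
for lines_in_use in sorted(frequency, reverse=True):
    print(lines_in_use, frequency[lines_in_use])
```

This yields exactly the "level - count" table sketched above, without ever materializing 23 million rows in a query result.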

I did upload the entire spreadsheet in case anyone is interested. I
don't generally have a lot of time to be in front of my PC.
"http://www.myitcs.com/Courthouse analysis - nathaniel.xlsx"
The smaller version (only a thousand calls), "http://www.myitcs.com/callanalysis.xls",
is also up there.

I'll try taking the results to a table and then using that to get my
statistics. If anyone else has any luck, I would be curious to know.

Step two of this project will be for me to then evaluate how many of
these calls are local vs. long distance, and then how many were
building to building (which we could then eliminate, because we will
be doing VOIP over a T1).

Thanks again for all your help. This is my very first Usenet posting
- it's good to know there are some nice/intelligent people out there.

Nathaniel
 

Rick Rothstein (MVP - VB)

Okay, here is my take on it. Assuming your start date/time is in Column A,
your duration is in Column B, and your ending date/time is in Column C, run
the following macro from the code page that your data is located on. When
the code is finished running (I have no idea how long it will take to
process 250,000 rows of data), a message box will appear showing the usage
frequency on a per second basis.

Rick

Sub GetMaxUsageBySecond()

    Dim X As Long
    Dim Z As Long
    Dim Offset As Long
    Dim LastRow As Long
    Dim TotalSeconds As Long
    Dim MaxStartTimeSeconds As Long
    Dim Duration As Long
    Dim MaxSeconds As Long
    Dim Seconds() As Long
    Dim Frequency() As Long
    Dim FirstDateTime As Double
    Dim LastDateTime As Double
    Dim LastStartTime As Double
    Dim MaxTime As Date
    Dim MaximumDate As Date
    Dim Summary As String

    LastRow = Cells(Rows.Count, "A").End(xlUp).Row
    FirstDateTime = Range("A1").Value
    LastDateTime = Cells(LastRow, "C").Value
    LastStartTime = Cells(LastRow, "A").Value
    MaxStartTimeSeconds = DateDiff("s", FirstDateTime, LastStartTime)
    MaximumDate = Application.Evaluate("=MAX(C1:C" & _
        Cells(Rows.Count, "C").End(xlUp).Row & ")")
    TotalSeconds = DateDiff("s", FirstDateTime, MaximumDate)
    ReDim Seconds(0 To TotalSeconds)

    'Tally each second that every call was active
    For X = 1 To LastRow
        Duration = Cells(X, "B").Value
        Offset = DateDiff("s", FirstDateTime, Cells(X, "A").Value)
        For Z = Offset To Offset + Duration
            Seconds(Z) = Seconds(Z) + 1
        Next
    Next

    'Find the peak number of simultaneous calls
    For X = 0 To TotalSeconds
        If Seconds(X) > MaxSeconds Then
            MaxSeconds = Seconds(X)
            MaxTime = FirstDateTime + TimeSerial(0, 0, X)
        End If
    Next

    'Tally how many seconds saw each level of usage
    ReDim Frequency(0 To MaxSeconds)
    For X = 0 To TotalSeconds
        Frequency(Seconds(X)) = Frequency(Seconds(X)) + 1
    Next

    For X = MaxSeconds To 0 Step -1
        Summary = Summary & "There were " & Frequency(X) & " times " & X & _
            " lines were in use at the same time." & vbNewLine
    Next
    MsgBox Summary

End Sub
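For anyone following along outside VBA, the macro's strategy (one counter for every second of the whole span, incremented while each call is active, then a frequency table over those counters) can be sketched in Python. The call tuples below are toy data, not the thread's:

```python
def usage_frequency(calls):
    """calls: list of (start_second, duration_seconds) tuples.
    Returns (seconds, freq): lines in use at each second of the span,
    and how many seconds saw each concurrency level."""
    first = min(s for s, _ in calls)
    last = max(s + d for s, d in calls)
    seconds = [0] * (last - first + 1)
    for start, duration in calls:
        # Both endpoints inclusive, matching the macro's inner loop.
        for t in range(start - first, start - first + duration + 1):
            seconds[t] += 1
    freq = [0] * (max(seconds) + 1)
    for n in seconds:
        freq[n] += 1  # how many seconds saw n lines in use
    return seconds, freq

calls = [(0, 5), (3, 4), (10, 2)]  # toy (start, duration) pairs
seconds, freq = usage_frequency(calls)
print(max(seconds), freq)  # 2 [2, 8, 3]
```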
 

Ivyleaf

Hi Rick,

I thought of trying the same process this morning (in the shower, of
course, where great ideas always start), but I thought there might be a
limitation to the size of an array? I haven't looked around yet, but
does anyone know if an array can handle 23 million items? If not, I
was thinking that maybe you could integrate sub-arrays into the code...
for example, depending on how Nathaniel wants to summarise his data,
the code could take a chunk of data at a time into the "seconds" array
(say an hour's worth), then at the end of that loop the max of the
"seconds" array could be stored in the "hours" array, and so on. The
same could be done for "days", "weeks" etc. if necessary. You could
also then do a frequency analysis in parallel by looping through the
resulting arrays and doing a count for each "max" figure. Might be
worth running the array to a collection to remove duplicates.

Just some thoughts.

PS. This might be an appropriate situation for some sort of progress
box / status bar text or a small server farm :)

Cheers,
Ivan.
 

Rick Rothstein (MVP - VB)

By the way, I forgot to mention: if you want to plot the number of
concurrent lines in use per second, use the Seconds array. The zeroth array
element contains the number of lines in use for the first "date/time" in
your list, array element one contains the number of lines in use one second
later, and so on.

Rick
 

nwatkins

I've just gotten back to my house. I'm getting ready to run Rick's
code - I'll keep you posted on the progress...
 

nwatkins

Just started running the code - I'm getting an overflow error with the
TimeSerial function - it apparently dies out at 32,767. I will modify
the code to divide this out and put it into hours/minutes once it gets
that large... I'll check back once I change this.
 

Rick Rothstein (MVP - VB)

I just tested Dim'ming the array to 31,622,400, the number of seconds in a
leap year (I'm assuming the 23,000,000 number is supposed to be the number
of seconds across the "working days" in a year), as the array element span
from first to last date needs to be contiguous, and it worked. The main
loop will only have to iterate 250,000 times (the number of actual
records), but there are two internal loops (one for the max value and the
other for the summary of frequencies) that iterate through all the seconds
across the time span. I guess one of those loops could be eliminated, or
their functionality combined into a single loop (the summary of
frequencies was an afterthought that I threw together before going out for
the evening). I'd like to get some feedback from the OP before "slicing
and dicing" it any further. As for your "sub-arrays" idea... yeah,
splitting this out into, say, weeks or months is definitely a possibility;
although if calculated all together, the overall memory requirements would
be roughly the same as doing it as I originally proposed. Again, I would
like some more feedback from the OP before attempting one of those also.

Rick


 

Rick Rothstein (MVP - VB)

If you come back to this thread before your test run, you can eliminate the
MaxTime declaration and all lines containing it (except for the MsgBox
line... just remove the section of the output dealing with it for that one).
The time it reports is only one of possibly many, many times where the
maximum is sure to occur, so it is kind of a meaningless thing to display.
Removing it will make the loop it appears in faster, as the program won't
have to waste time performing calculations involving it.

Rick
 

nwatkins

The entire thing took less than 1 minute to run - here are the
results:

"There were 15 times 18 lines were in use at the same time.
There were 120 times 17 lines were in use at the same time.
There were 396 times 16 lines were in use at the same time.
There were 1238 times 15 lines were in use at the same time.
There were 4113 times 14 lines were in use at the same time.
There were 10571 times 13 lines were in use at the same time.
There were 24887 times 12 lines were in use at the same time.
There were 54191 times 11 lines were in use at the same time.
There were 112575 times 10 lines were in use at the same time.
There were 213338 times 9 lines were in use at the same time.
There were 361482 times 8 lines were in use at the same time.
There were 561049 times 7 lines were in use at the same time.
There were 773832 times 6 lines were in use at the same time.
There were 938616 times 5 lines were in use at the same time.
There were 993990 times 4 lines were in use at the same time.
There were 932789 times 3 lines were in use at the same time.
There were 1085005 times 2 lines were in use at the same time.
There were 3297372 times 1 lines were in use at the same time.
There were 15515978 times 0 lines were in use at the same time."

Rick - that was beautiful - it would have taken me forever to get
there without your help. Thanks to everyone that contributed!

Nathaniel Watkins
Garrett County Government
 

Rick Rothstein (MVP - VB)

Less than one minute? Really? Wow, I am surprised. I just added up all the
frequencies and it totals 24,881,557 one-second intervals that were
examined. I would have thought that many loops (even though all they are
actually doing is incrementing values by one most of the time), plus the
original 250,000 loops for your records, would have taken a lot, lot, lot
longer to run. Anyway, I'm glad the code worked out for you. Just a
reminder: if you want to actually plot the number of lines in use per
second, you can use the Seconds array for that (I'm guessing you saw my
other post to Ivan where I mentioned that, as you responded to the message,
although not regarding its content).

Rick
 

Rick Rothstein (MVP - VB)

Actually, in thinking about it, while there were 24,881,557 one-second
intervals in your time span, 15,515,978 had no lines in use, which means
there was no increment activity for them. Hence, only 9,365,579 intervals
were actively being incremented while the code was creating the frequency
summary... a lot, to be sure, but significantly less than 24,881,557.
Still, less than a minute to execute it all is surprising to me.

Rick
 
