huge file

mmonroe

Help.
I have a HUGE text file that won't import into Access, so I downloaded
SQLEXPRESS, created a table, then linked it to Access and tried to import the
data into that table, knowing SQL Express has a 4 GB limit, as opposed to
Access's 2 GB limit. However, the wizard fails several minutes in, saying
"Not enough space on temp disk". What do I do?? I have tried a BULK INSERT
in SQL, but I must have the syntax wrong because it fails as well. My only
thought is that I should open the file in Access and write it one line at
a time to the table. Not quite sure how to do that.

HELP
 
Douglas J. Steele

How much free space do you have on your hard drive (specifically on the
partition where your TEMP and/or TMP environment variables point)?

Imports often need considerably more temporary disk space than the size of
the file would suggest, because temporary files get created during the
process.

Any chance you could break the file into smaller pieces and import them
separately?
 
mmonroe

Douglas, first, thank you for your quick response. I am not very hardware
savvy. I just bought this computer and it has a 160 GB hard drive, of which I
still have around 140 GB free. Where would I change the size of my temp drive?
Also, I tried just counting the rows in the file: there are 106 million rows,
and it took Access about 10 minutes. I would much prefer to just copy them to
the SQL 2005 table, as I will be downloading a file this size each week, and
can't take the time to split it up right now. As soon as I wrap my arms around
this, I will make it better. I just REALLY need to get that data in a table
so I can use it.

Thanks,

Melissa
 
Douglas J. Steele

No guarantee that SQL Server will be any faster.

To determine where your temp files are, open a DOS box (easiest way is to go
to Start | Run..., type cmd and hit enter). In the DOS box, type set t and
hit Enter. You'll get a list of all of the environment variables that start
with the letter T. Check where Temp and Tmp point, although unless you've
partitioned your hard drive, it's doubtful that's the issue.
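
For example, the output will look something like this (the paths will
differ on your machine):

C:\>set t
TEMP=C:\Documents and Settings\<user>\Local Settings\Temp
TMP=C:\Documents and Settings\<user>\Local Settings\Temp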
 
mmonroe

I went into Control Panel | System, etc., and changed the size of my
temp/page file to 2 GB; it still FAILS with the same error about space. What
about simply opening the file in Access through some VBA (Open, Line Input,
etc.), then reading it into the table one line at a time?

Not worried whether SQL is FASTER, just that it won't run out of table space.

I tried to write it to the table in Access. My Open worked fine, but I
must be referencing the table wrong. Could you give me a line of sample code
to write to a table?

Melissa
 
Douglas J. Steele

Afraid I don't understand what you're asking for a sample of.

Show me what you've got, pointing out what doesn't work (and what it's
supposed to do).
 
John Nurick

Hi Melissa (and Doug),

Just how huge is this file? If it really contains 106 million rows of
data, then even if there are only 10 bytes of data per row you have 1 GB
of data, and with a few bytes per record storage overhead you're rapidly
approaching the 2 GB file size limit before you can do anything with the
data.

And if this is to be a weekly import the difference between Jet's 2 GB
and SQL Express's 4 GB will be insignificant: if you want to get a year's
worth of data into a database you're looking at full SQL Server
territory.

But what do you need to do with the data? If you're only interested in a
small subset of the records, you can use text file tools to filter them
out. Or you can write code in your favourite language to read the file a
line at a time, parsing each line and doing whatever grouping and
totalling or analysis you need without ever importing the whole thing
into a table.

If the file is smaller than 2GB you *may* be able to get Access to link
to it rather than import it.

If you post back with more information about the file we should be able
to suggest something.
 
mmonroe

OK, here is the scoop. There is a 1.2 GB file; it is the Do Not Call List.
There are currently 106 million numbers on it. It changes weekly, so I will
be downloading it every week and deleting the old one. Then I need to run a
set of numbers against it. Some of my tables have 200k numbers/names, some
only have 50k. The DNC list, however, is simply areacode,phone, so
xxx,xxxxxxx. Once it's downloaded, though, I can't figure out how to get it
into a table so I can run a delete query against it. I am going crazy. I had
written some code in Access to open it and count it, but I lost the code,
and when I use the help, some of the Access code it offers is no longer
valid. So I paste it, change the names, and add the functionality of parsing
the DNC number into two variables, one for areacode and the other for phone.
The tables I want to run the delete query against have the phone in two
fields, areacode and phone. This makes for a quick search if I make the
areacode a key. But I can't get the file IN!!! UGGHHGHHHH! I downloaded SQL,
as someone suggested, the SQL 2005 Express, as it can have a 4 GB table, but
the management tool does not offer "import". So I found the BULK INSERT
command on the 'net and can't seem to get the syntax right. Can either of
you help? I have spent hours, and am becoming an angry mom with my kids! LOL
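
In case it helps, the kind of delete query I eventually want to run against
my lead tables is something like this (MyLeads is made up; my real tables
vary):

DELETE FROM MyLeads
WHERE EXISTS
  (SELECT * FROM DoNotCall
   WHERE DoNotCall.Area = MyLeads.AreaCode
     AND DoNotCall.Phone = MyLeads.Phone);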
 
John Nurick

Hi Melissa,

Before getting into the SQL Server tools, I'd try this:

1) create a table in SQL Server with fields for AreaCode and
PhoneNumber, both indexed.

2) create a linked table in Access connected to this table.

3) Use the import wizard or DoCmd.TransferText to import the data into
the linked table.
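
For step 3, the code version is a one-liner, something like this (the
specification name is whatever you save from the wizard's Advanced...
dialog; the table and path names here are just examples):

DoCmd.TransferText acImportDelim, "DNC Import Spec", _
    "dbo_DoNotCall", "C:\Temp\Phones.txt", False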

Let us know how this goes.
 
mmonroe

John,

That is what I have been doing. I created the table in SQL, linked it, then
tried to IMPORT the data from the txt file into the linked table. When I do
this, I get an error that says "Not enough space on temp disk". I have
changed my temp or page file to 2gb, and it still errors out. I have tried
to write code in Access, but am getting confused. I used

open <filename> as # 1
do while not eof (1)
input line #1, phone
loop
close #1

and stepped through it, but it bombs; not sure how to read it any other way.
Tried the openfilestream command. I have seen/written so much code that
hasn't worked, I am a bit confused. I have never seen the
docmd.TransferText before; how do I use that?

Melissa
 
Guest

I think /Access/ will do append queries by writing a
transaction to xxxxxx.tmp in the temp folder, so I wouldn't
expect TransferText (which is an Access method) to work,
regardless of the target.

(and I wouldn't even be sure about db.Execute)

Also, I'm worried about the primary key/indexing thing.
Obviously, the import is going to be much slower to an
indexed table, but it's not going to be possible to import
unless there is a primary key on the SQL Server table, is
it? And then, as we sometimes see with subforms, if
Access/Jet actually tries to use the primary key instead of
doing 'append all', we will definitely be here all night.

(david)
 
John Nurick

Melissa,

1) David's post explains why importing to the linked table isn't
working.

2) this
open <filename> as # 1
do while not eof (1)
input line #1, phone
loop
close #1

doesn't actually import anything; it merely reads the text file line by
line. It should be possible to import the data this way by using a
single-record append query inside the loop, as follows, but 106m times
round the loop may take too long:

Sub ImportPhone()
    Dim Phone As String
    Dim Area As String
    Dim FN As Long
    Dim dbD As DAO.Database

    FN = FreeFile()
    Open "C:\Temp\Phones.txt" For Input As #FN
    Set dbD = CurrentDb

    Do Until EOF(FN)
        Input #FN, Area, Phone   ' Input # parses the comma-separated fields
        dbD.Execute "INSERT INTO DoNotCall (Area, Phone) VALUES ('" _
            & Area & "', '" & Phone & "');", dbFailOnError
    Loop

    Close #FN   ' close the file number we opened, not #1
End Sub

So maybe the next thing to try is to download the Windows port of the
Gnu utilities from http://unxutils.sourceforge.net/. These include the
"split" utility, which will split the 1.2 GB file into a dozen or so
pieces of roughly 100 MB each; these should import one at a time via
Access and the linked SQL Server table.
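
If you go that route, the command is along these lines (run it in the
folder containing the file; the last argument is just a prefix for the
output pieces, which come out named dnc_aa, dnc_ab, and so on):

split -l 10000000 Phones.txt dnc_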

DoCmd.TransferText is the VBA equivalent of importing text using
File|Get External Data.
 
John Nurick

Since it's a weekly import, taking all night presumably isn't a
problem<g>. But it's a lot of data for routine Access operations, and if
the suggestions I've made in my other post just now don't do the job I'm
beginning to feel it's time to pass this one over to a welcoming SQL
Server newsgroup - if there is one<g>.
 
mmonroe

John,

I am on it! I knew the previous code did not write anywhere; I was having
trouble getting it to write to a table. I used that code to time the simple
read. It took over 10 minutes, but once it is written AND indexed, I suspect
it will be fine. I don't think I will need the split command, as I intend to
populate the SQL table I have attached; we shall see if that works.

I will try your code, of course changing names, etc. Thank you for all your
help. I will post the results, as it would be unfair not to tell you after
all your effort to help a stranger!

Melissa
 
mmonroe

John,

Basically, that is not going to be the solution. After some manipulating, I
set it up to read/parse the lines and let it run; it was still running 1.5
hours later, when I interrupted it. It had read several hundred thousand
numbers into the table, but not all of them, and Ctrl-Alt-Del said
that Access had stopped responding. I am working from the BULK INSERT
command on the SQLEXPRESS download now. It is having trouble with the
syntax... or I am. But that may be the fastest way to get the numbers in.

Thanks for all your help

Melissa
 
John Nurick

I'm not surprised: executing 106m append queries was going to take time.

If you post the code you were using someone may be able to suggest a way
of shaving a few milliseconds off the loop. The way you say "read/parse"
makes me wonder if you were using Line Input # and then parsing the
result instead of using Input # to parse as you read: that would waste
time.
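
If you want to try shaving more than milliseconds (a sketch only, not
tested at anything like this scale; table and field names assumed from the
earlier code): open a DAO Recordset on the linked table in append-only mode
and use AddNew/Update instead of building and parsing an INSERT statement
for every row:

Sub ImportPhoneRS()
    Dim Phone As String
    Dim Area As String
    Dim FN As Long
    Dim rs As DAO.Recordset

    FN = FreeFile()
    Open "C:\Temp\Phones.txt" For Input As #FN
    ' dbSeeChanges is required by Jet if the linked SQL Server table
    ' has an identity column; it's harmless otherwise
    Set rs = CurrentDb.OpenRecordset("DoNotCall", dbOpenDynaset, _
        dbAppendOnly + dbSeeChanges)

    Do Until EOF(FN)
        Input #FN, Area, Phone
        rs.AddNew
        rs!Area = Area
        rs!Phone = Phone
        rs.Update
    Loop

    rs.Close
    Close #FN
End Sub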

Access stopping responding is, I think, normal. You'd need to call
DoEvents inside your loop to get it to respond - and that would slow
things down enormously. You could use a counter so as to call DoEvents
occasionally, but even doing this 106m times would materially increase
the run time:
Dim Counter As Long
...
Do Until EOF(FN)
    ...
    Counter = Counter + 1
    If Counter Mod 1000 = 0 Then
        Counter = 0
        DoEvents
    End If
Loop

So maybe BULK INSERT (or the bcp.exe utility) is your best bet. From a
quick look at the documentation, bcp looks simpler to me.
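
For what it's worth, the BULK INSERT syntax for a two-column
comma-delimited file is roughly this (database, table and path names are
examples; run it from the Express query tool or sqlcmd):

BULK INSERT dbo.DoNotCall
FROM 'C:\Temp\Phones.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

And the bcp equivalent from a command prompt would be something like:

bcp YourDatabase.dbo.DoNotCall in C:\Temp\Phones.txt -c -t, -S .\SQLEXPRESS -T

(-c = character data, -t, = comma as field terminator, -S = server\instance,
-T = Windows authentication.)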

Good luck!
 
