David
Hello,
I have a "small" problem...
I can take a .txt file and import it into Excel manually, and my long
text/memo fields all come in fine.
When I use the following command to automate this process to create an .xls
file, these long text fields are truncated at 255 characters.
I need to end up with an Excel file that I can import into Access.
Two questions:
1. To test this, I am using the following command, knowing that columns
29-32 contain "memo" data. For some reason, those fields are truncated whether
I use a data type of 1 (general) or 2 (text). When I import manually, Excel
sets the fields to the GENERAL data type.
xlApp.Workbooks.OpenText FileName:=strInputFileName, _
    DataType:=xlDelimited, ConsecutiveDelimiter:=True, _
    Tab:=True, FieldInfo:=Array(Array(29, 1), Array(30, 1), _
    Array(31, 1), Array(32, 1))
2. Ideally, I want to parse the .txt file to determine which field names it
contains (they are user definable), so I can build the "FieldInfo" argument
above dynamically, roughly along the lines of the sketch below.
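
This is untested and only a sketch of the direction I have in mind. It assumes
the first line of the .txt file holds the tab-delimited field names, and that
strMemoNames is a hypothetical, user-definable list of the memo field names
(pipe-delimited here just for illustration). It also assumes a reference to the
Excel object library so xlDelimited is available.

' Untested sketch: read the header line and build FieldInfo dynamically.
Sub ImportWithDynamicFieldInfo(xlApp As Excel.Application, _
                               strInputFileName As String, _
                               strMemoNames As String)
    Dim intFile As Integer
    Dim strHeader As String
    Dim varNames As Variant
    Dim varFieldInfo() As Variant
    Dim i As Long

    ' Read only the first line, which holds the tab-delimited field names.
    intFile = FreeFile
    Open strInputFileName For Input As #intFile
    Line Input #intFile, strHeader
    Close #intFile

    varNames = Split(strHeader, vbTab)
    ReDim varFieldInfo(0 To UBound(varNames))

    ' FieldInfo column numbers are 1-based; 1 = general, 2 = text.
    For i = 0 To UBound(varNames)
        If InStr(1, "|" & strMemoNames & "|", _
                 "|" & Trim$(varNames(i)) & "|", vbTextCompare) > 0 Then
            varFieldInfo(i) = Array(i + 1, 2)
        Else
            varFieldInfo(i) = Array(i + 1, 1)
        End If
    Next i

    xlApp.Workbooks.OpenText FileName:=strInputFileName, _
        DataType:=xlDelimited, ConsecutiveDelimiter:=True, _
        Tab:=True, FieldInfo:=varFieldInfo
End Sub

The workbook that OpenText opens would then be saved as the .xls file I import
into Access.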
Thanks in advance.