
I am using MS Access 2003 and 2016 (365), and I have an Excel 2016 file saved as a CSV. The file comes from an application that I have no control over, so it cannot be normalized at the source. The application uses some columns only for specific record types, which is why the output spans columns A to XU.

The CSV file has more than 255 columns, which is more than Access allows in a single table.

I want to take a single key column, column 2 (part number), and load the remaining columns into multiple tables, using the part number to link the tables together.

I.e., table one will have part number plus column 1, column 3, column 4, ..., column 200.

Then table two will have part number plus column 201, column 202, ..., column 400.

Then table three, etc., until all the columns are loaded. The total is variable but around 650 columns (currently ending at column XU in Excel).
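For reference, a split table like the ones described above could be created from VBA by building the DDL in a loop. This is only a sketch; the table name, field names, and field sizes are placeholders, not from the actual file:

    Sub sCreateSplitTable()
        'Creates one of the split tables; repeat with different names/ranges for the others
        Dim db As DAO.Database
        Dim strSQL As String
        Dim i As Integer
        Set db = CurrentDb
        strSQL = "CREATE TABLE tblPPL1 (PartNumber TEXT(50) CONSTRAINT PK1 PRIMARY KEY"
        For i = 1 To 200
            strSQL = strSQL & ", Col" & i & " TEXT(255)"
        Next i
        db.Execute strSQL & ");", dbFailOnError
        Set db = Nothing
    End Sub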

'The first part
'API declaration for a message box that dismisses itself after a timeout (in milliseconds)
#If Win64 Then  '64-bit Office
    Private Declare PtrSafe Function MsgBoxTimeout _
        Lib "user32" _
        Alias "MessageBoxTimeoutA" ( _
            ByVal hwnd As LongPtr, _
            ByVal lpText As String, _
            ByVal lpCaption As String, _
            ByVal wType As VbMsgBoxStyle, _
            ByVal wLanguageId As Long, _
            ByVal dwTimeout As Long) _
    As Long
#Else           '32-bit Office
    Private Declare Function MsgBoxTimeout _
        Lib "user32" _
        Alias "MessageBoxTimeoutA" ( _
            ByVal hwnd As Long, _
            ByVal lpText As String, _
            ByVal lpCaption As String, _
            ByVal wType As VbMsgBoxStyle, _
            ByVal wLanguageId As Long, _
            ByVal dwTimeout As Long) _
    As Long
#End If
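For example, the declaration above can be used to show a self-dismissing progress message; note that the last argument, the timeout, is in milliseconds:

    'Shows a message box that closes itself after 2 seconds
    Call MsgBoxTimeout(0, "PPL has been loaded", "Status", vbInformation, 0, 2000)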

Sub Insert_PPL()
'
' Insert_PPL Macro
' Copies the PPL external data into the PPL table
'
    Dim MyFile As Variant

    'Error handling
    On Error GoTo Err_Insert

    Application.ScreenUpdating = False

    'Clear the existing PPL sheet before importing the new file
    Worksheets("PPL").Cells.Delete

    'Moved the MyFile open to after the PPL delete
    MyFile = Application.GetOpenFilename("CSV Files (*.csv),*.csv", , "Select TechnoSearch Download CSV File", "Open", False)
    If MyFile = False Then GoTo Exit_Insert     'user cancelled the file dialog

    Workbooks.Open MyFile

    'Copy everything from the CSV straight into the PPL sheet (no clipboard needed)
    ActiveSheet.Cells.Copy Destination:=ThisWorkbook.Worksheets("PPL").Range("A1")

    'Close the CSV without save or clipboard prompts
    Application.DisplayAlerts = False
    ActiveWorkbook.Close SaveChanges:=False
    Application.DisplayAlerts = True

    MsgBox "PPL has been loaded"

    Remove_More_Text
    Filter_PPL

Exit_Insert:
    Application.ScreenUpdating = True
    Exit Sub

Err_Insert:
    MsgBox Err.Description, vbCritical, Err.Number
    Resume Exit_Insert

End Sub

Sub Remove_More_Text()
'
' Remove_More_Text Macro
' Removes the additional "more text" suffix from column E of the TechnoSearch file
'

Dim sht As Worksheet
Dim LastRow As Long
Dim rng As Range
Dim strCell As String
Dim x As Long
Dim cnt As Long


Set sht = ThisWorkbook.Worksheets("PPL")

    'Find the last used row in column E rather than scanning the whole sheet
    LastRow = sht.Cells(sht.Rows.Count, "E").End(xlUp).Row

    For cnt = 2 To LastRow
        Set rng = sht.Range("E" & cnt)

        strCell = rng.Value

        'Get the character position of "more text"
        If InStr(strCell, "more text") = 0 Then
            x = Len(strCell) + 3
        ElseIf InStr(strCell, "more text") < 4 Then
            x = 3
        Else
            x = InStr(strCell, "more text")
        End If

        'Replace the original with the shortened string
        rng.Value = Left$(strCell, x - 3)

        'Per-row progress flash (1 ms timeout); Application.StatusBar would be lighter-weight
        Call MsgBoxTimeout(0, cnt & " of " & LastRow, "", vbInformation, 0, 1)

    Next cnt

End Sub
Comment: What is your question? I see a posted title, but the body does not explain the issue with the code. – Parfait

2 Answers


You can use VBA to open the CSV file as a text file, read it in line by line, and then add it to recordsets based on the three tables. Something like the code below should get you going in the right direction:

Sub sGetWideCSV()
    On Error GoTo E_Handle
    Dim db As DAO.Database
    Dim rs1 As DAO.Recordset
    Dim rs2 As DAO.Recordset
    Dim rs3 As DAO.Recordset
    Dim strFile As String
    Dim intFile As Integer
    Dim strInput As String
    Dim astrInput() As String
    Dim intLoop1 As Integer
    strFile = "C:\test\data.csv"
    intFile = FreeFile
    Open strFile For Input As intFile
    Set db = CurrentDb
    Set rs1 = db.OpenRecordset("SELECT * FROM tbl1 WHERE 1=2;")
    Set rs2 = db.OpenRecordset("SELECT * FROM tbl2 WHERE 1=2;")
    Set rs3 = db.OpenRecordset("SELECT * FROM tbl3 WHERE 1=2;")
    'Skip the header row, assuming the first line of the CSV holds column names
    Line Input #intFile, strInput
    Do
        Line Input #intFile, strInput
        astrInput = Split(strInput, ",")
        With rs1
            .AddNew
            rs1(0) = astrInput(0)
            For intLoop1 = 1 To 10
                rs1(intLoop1) = astrInput(intLoop1)
            Next intLoop1
            .Update
        End With
        With rs2
            .AddNew
            rs2(0) = astrInput(0)
            For intLoop1 = 1 To 10
                rs2(intLoop1) = astrInput(intLoop1 + 10)
            Next intLoop1
            .Update
        End With
        With rs3
            .AddNew
            rs3(0) = astrInput(0)
            For intLoop1 = 1 To 10
                rs3(intLoop1) = astrInput(intLoop1 + 20)
            Next intLoop1
            .Update
        End With
    Loop Until EOF(intFile)
sExit:
    On Error Resume Next
    rs1.Close
    rs2.Close
    rs3.Close
    Set rs1 = Nothing
    Set rs2 = Nothing
    Set rs3 = Nothing
    Set db = Nothing
    Reset
    Exit Sub
E_Handle:
    MsgBox "sGetWideCSV", vbOKOnly + vbCritical, "Error: " & Err.Number
    Resume sExit
End Sub

In the example above, I've only used 10 columns per recordset, but you should get the idea. If the number of columns in the CSV is variable, you will need to handle that (probably by checking UBound after the data is split into the array). Note also that a plain Split on commas will misparse any field that contains a quoted comma.
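That UBound guard could look like the following sketch (variable names follow the code above; the offset of 20 matches the third recordset):

    Dim intLast As Integer
    astrInput = Split(strInput, ",")
    intLast = UBound(astrInput)
    For intLoop1 = 1 To 10
        'Only write the field if the source column exists on this row
        If intLoop1 + 20 <= intLast Then
            rs3(intLoop1) = astrInput(intLoop1 + 20)
        End If
    Next intLoop1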

However, rather than reading it into Access and getting around the column limitation by using multiple tables, it is almost certainly a better idea to normalise the data when you read it in.
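For example, instead of three wide tables, each part/column pair could be written to one tall table. A sketch, assuming a table tblPartAttribute with fields PartNumber, ColName, and ColValue (this design is mine, not from the question):

    Dim rsTall As DAO.Recordset
    Set rsTall = db.OpenRecordset("SELECT * FROM tblPartAttribute WHERE 1=2;")
    For intLoop1 = 1 To UBound(astrInput)
        If Len(astrInput(intLoop1)) > 0 Then    'skip empty cells
            rsTall.AddNew
            rsTall!PartNumber = astrInput(0)
            rsTall!ColName = "Col" & intLoop1
            rsTall!ColValue = astrInput(intLoop1)
            rsTall.Update
        End If
    Next intLoop1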



Consider pure SQL, as Access can directly query CSV files (and even Excel workbooks) as if they were tables:

INSERT INTO myTable1 (PartNumber, Col1, Col2, ..., Col200)
SELECT PartNumber, Col1, Col2, ..., Col200
FROM [text;database=C:\Path\To\Folder].myCSVFile.csv AS t;

INSERT INTO myTable2 (PartNumber, Col1, Col2, ..., Col200)
SELECT PartNumber, Col201, Col202, ..., Col400
FROM [text;database=C:\Path\To\Folder].myCSVFile.csv AS t;

INSERT INTO myTable3 (PartNumber, Col1, Col2, ..., Col200)
SELECT PartNumber, Col401, Col402, ..., Col600
FROM [text;database=C:\Path\To\Folder].myCSVFile.csv AS t;

The above can be saved as stored Access queries and run at any time in the future with DoCmd.OpenQuery "myQueryName" or CurrentDb.Execute "myQueryName", assuming the paths, file names, and headers do not change.
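For example, assuming the three INSERT statements above were saved as queries named qryLoad1 to qryLoad3 (names are placeholders):

    Dim i As Integer
    For i = 1 To 3
        'Execute raises a trappable error on failure and skips the confirmation prompts that OpenQuery shows
        CurrentDb.Execute "qryLoad" & i, dbFailOnError
    Next i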


To quickly retrieve the column names from the CSV (which you only need to do once for the above queries):

  1. In Excel, copy the needed batch of 200 headers.
  2. Paste the tab-separated column headers into Notepad or another text editor.
  3. Replace all tabs (\t) with a comma and a space (", "). Specifically, copy the invisible tab character between any two column names and run a Find/Replace All (Ctrl+H).
    • If column names contain spaces or special characters, wrap them in square brackets: replace each tab with a closing bracket, comma, space, and opening bracket ("], ["), then add an opening bracket before the very first column name and a closing bracket after the very last one.
  4. Copy the resulting comma-separated column names.
  5. Paste them into the SELECT clause of the query (or into the INSERT INTO column listing if the headers match the table columns).
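The tab-to-bracketed-list transformation in those steps can also be done in VBA rather than a text editor; a sketch (the function name is mine):

    Function BracketedList(strTabHeaders As String) As String
        'Wraps each tab-separated header in [brackets] and joins with commas
        'Debug.Print BracketedList("Part No" & vbTab & "Desc 1")  -> [Part No], [Desc 1]
        BracketedList = "[" & Join(Split(strTabHeaders, vbTab), "], [") & "]"
    End Function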