Wednesday, October 16, 2024

Convert date in Databricks - JD Edwards Julian date to calendar date



CREATE FUNCTION yourdatabase.date_convert_j_to_c(gldgj INT)
RETURNS DATE
-- JDE Julian format CYYDDD: CYY = years past 1900, DDD = day of year (1-based)
RETURN date_sub(
    dateadd(DAY, gldgj % 1000,
        dateadd(YEAR, gldgj div 1000, DATE'1900-01-01')
    ),
    1
);
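The same CYYDDD arithmetic can be sanity-checked outside Databricks. A minimal Python sketch (the function name `jde_to_date` is mine, not from the post):

```python
from datetime import date, timedelta

def jde_to_date(gldgj: int) -> date:
    """Convert a JD Edwards Julian date (CYYDDD) to a calendar date.

    CYYDDD: CYY = number of years past 1900 (C is the century digit),
    DDD = day of year, 1-based.
    """
    years_from_1900 = gldgj // 1000   # CYY part
    day_of_year = gldgj % 1000        # DDD part
    return date(1900 + years_from_1900, 1, 1) + timedelta(days=day_of_year - 1)
```

For example, `jde_to_date(124001)` gives `date(2024, 1, 1)`, matching what the SQL function above returns for the same input.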


Monday, June 10, 2024

Load data from SQL Server to Databricks using Excel VBA

Sub Upload_Data_To_SQLServer_Dynamic()
    ' Define variables
    Dim Cn As ADODB.Connection
    Dim Cn2 As ADODB.Connection
    Dim rs As ADODB.Recordset
    Dim uploadQuery As String
    Dim batchCounter As Long
    Dim batchSize As Long
    Dim batchQuery As String
    Dim headerArray() As String
    Dim numColumns As Integer
    Dim columnList As String
    Dim sqlQuery As String
    Dim TableName As String
    Dim dsnName As String
    Dim i As Integer
    Dim StartTime As Double

    StartTime = Timer

    ' Establish connection to SQL Server using Windows Authentication
    Set Cn = New ADODB.Connection
    Cn.ConnectionString = "Provider=SQLOLEDB;Data Source=YourServerName;Initial Catalog=mfdb;Integrated Security=SSPI;"
    Cn.Open

    ' SQL query to get data from the source table (modify as needed)
    sqlQuery = "SELECT * FROM dbo.tblco_list"

    ' Execute the query and get a Recordset
    Set rs = New ADODB.Recordset
    rs.Open sqlQuery, Cn, adOpenStatic, adLockReadOnly

    ' Initialize variables
    batchCounter = 0
    batchSize = 80
    TableName = "DatabricksDatabasename.tblCoListCon_Temp"
    uploadQuery = "INSERT INTO " & TableName & " ("

    ' Read the header row to determine the number of columns
    numColumns = rs.Fields.Count
    ReDim headerArray(numColumns - 1)
    For i = 0 To numColumns - 1
        headerArray(i) = rs.Fields(i).Name
        columnList = columnList & rs.Fields(i).Name
        If i < numColumns - 1 Then
            columnList = columnList & ", "
        End If
    Next i
    uploadQuery = uploadQuery & columnList & ") "

    ' Initialize the batch query
    batchQuery = ""

    ' Establish connection to Databricks via DSN (replace details with your own)
    dsnName = "_DL_PRD_DSN_DBLT"
    Set Cn2 = New ADODB.Connection
    Cn2.Open "DSN=" & dsnName & ";Uid=databricksuserid;Pwd=databricksPassword;"

    ' Read and process the data from the recordset
    Do While Not rs.EOF
        ' Build the SELECT part dynamically based on data types (modify as needed)
        batchQuery = batchQuery & "SELECT "
        For i = 0 To numColumns - 1
            If IsNull(rs.Fields(i).Value) Then
                batchQuery = batchQuery & "NULL"
            Else
                batchQuery = batchQuery & "'" & Replace(rs.Fields(i).Value, "'", "''") & "'"
            End If
            If i < numColumns - 1 Then
                batchQuery = batchQuery & ", "
            End If
        Next i
        batchQuery = batchQuery & " UNION ALL " & Chr(10)
        batchCounter = batchCounter + 1

        ' Check if batch size is reached
        If batchCounter >= batchSize Then
            ' Remove the last 'UNION ALL' and execute the batch query
            If Len(batchQuery) > 0 Then
                batchQuery = Left(batchQuery, Len(batchQuery) - Len(" UNION ALL " & Chr(10)))
                Debug.Print uploadQuery & Chr(10) & batchQuery
                Cn2.Execute uploadQuery & Chr(10) & batchQuery
            End If
            ' Reset the batch variables
            batchQuery = ""
            batchCounter = 0
        End If
        rs.MoveNext
    Loop

    ' Execute any remaining batch query
    If batchCounter > 0 And Len(batchQuery) > 0 Then
        batchQuery = Left(batchQuery, Len(batchQuery) - Len(" UNION ALL " & Chr(10)))
        Cn2.Execute uploadQuery & Chr(10) & batchQuery
    End If

    ' Close the recordset and both connections
    rs.Close
    Cn.Close
    Cn2.Close

    ' Inform user of successful upload
    MsgBox Format((Timer - StartTime) / 86400, "hh:mm:ss") & " Data upload to Databricks completed!", vbInformation

    ' Clean up
    Set rs = Nothing
    Set Cn = Nothing
    Set Cn2 = Nothing
End Sub
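The batching trick the macro relies on (packing rows into one `INSERT INTO ... SELECT ... UNION ALL ...` statement, escaping quotes by doubling them) can be sketched in Python. The helper names `escape` and `build_batches` are illustrative, not from the post:

```python
def escape(value):
    """Render a value as a SQL string literal; None becomes NULL,
    and embedded single quotes are doubled, as in the VBA Replace call."""
    if value is None:
        return "NULL"
    return "'" + str(value).replace("'", "''") + "'"

def build_batches(table, columns, rows, batch_size=80):
    """Yield one INSERT statement per batch of up to batch_size rows,
    each row contributing a SELECT joined by UNION ALL."""
    insert = "INSERT INTO %s (%s)" % (table, ", ".join(columns))
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        selects = "\nUNION ALL\n".join(
            "SELECT " + ", ".join(escape(v) for v in row) for row in chunk
        )
        yield insert + "\n" + selects
```

Batching matters here because each ODBC round trip to a Databricks SQL endpoint is comparatively slow; sending 80 rows per statement instead of one cuts the number of `Execute` calls by the same factor.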

Thursday, June 6, 2024

How to load data (CSV file) into a Databricks Delta table using Excel, VBA and a DSN


1. Create a DSN on your machine using the Simba Spark ODBC driver.

2. Generate a personal access token in Databricks and note the credentials the DSN expects.

3. The CSV column headers must match the target table's column names.


Sub UploadDataToSQLServer_Dynamic()


    ' Define variables

    Dim sFile As String

    Dim Cn As ADODB.Connection

    Dim UploadQuery As String

    Dim fso As Object

    Dim ts As Object

    Dim line As String

    Dim DataArray() As String

    Dim i As Long

    Dim batchCounter As Long

    Dim batchSize As Long

    Dim batchQuery As String

    Dim headerArray() As String

    Dim numColumns As Integer

    Dim columnList As String

    Dim dsnName As String

    Dim TableName As String

    Dim j As Long

    Dim StartTime As Double

     StartTime = Timer


    ' Get the CSV file to upload

    sFile = Application.GetOpenFilename("CSV Files (*.csv), *.csv", , "Select CSV File to Upload")

    If sFile = "False" Then Exit Sub ' User canceled the file selection


    ' Open the CSV file

    Set fso = CreateObject("Scripting.FileSystemObject")

    Set ts = fso.OpenTextFile(sFile, 1) ' 1 for reading


    ' Establish connection to SQL Server (replace details with your own)

    dsnName = "YourDSNName"

    Set Cn = New ADODB.Connection

    Cn.Open "DSN=" & dsnName & ";Uid=YourTokenFromDatabricks;Pwd=YourPasswordFromDatabricks;"


    ' Initialize variables

    batchCounter = 0

    batchSize = 80

    TableName = "DatabaseName.TableName"

    

    UploadQuery = "INSERT INTO " & TableName & "("


    ' Read the header row to determine the number of columns

    If Not ts.AtEndOfStream Then

        line = ts.ReadLine

        headerArray = Split(line, ",")

        numColumns = UBound(headerArray) + 1

        columnList = Join(headerArray, ", ")

        UploadQuery = UploadQuery & columnList & ") "

    End If


    ' Initialize the batch query

    batchQuery = ""


    ' Read and process the CSV file

    i = 0

    Do Until ts.AtEndOfStream

        line = ts.ReadLine

        DataArray = Split(line, ",")


        ' Build the SELECT part dynamically based on data types (modify as needed)

        If UBound(DataArray) = UBound(headerArray) Then

            batchQuery = batchQuery & "SELECT "

            For j = 0 To numColumns - 1

                batchQuery = batchQuery & "'" & Replace(DataArray(j), "'", "''") & "'"

                If j < numColumns - 1 Then

                    batchQuery = batchQuery & ", "

                End If

            Next j

            batchQuery = batchQuery & " UNION ALL " & Chr(10)

            batchCounter = batchCounter + 1


            ' Check if batch size is reached

            If batchCounter >= batchSize Then

                ' Remove the last 'UNION ALL' and execute the batch query

                If Len(batchQuery) > 0 Then

                    batchQuery = Left(batchQuery, Len(batchQuery) - Len(" UNION ALL " & Chr(10)))

                 '   Debug.Print UploadQuery & Chr(10) & batchQuery

                    

                    Cn.Execute UploadQuery & Chr(10) & batchQuery

                End If

                ' Reset the batch variables

                batchQuery = ""

                batchCounter = 0

            End If

        End If

        i = i + 1

    Loop


    ' Execute any remaining batch query

    If batchCounter > 0 And Len(batchQuery) > 0 Then

        batchQuery = Left(batchQuery, Len(batchQuery) - Len(" UNION ALL " & Chr(10)))

        Debug.Print UploadQuery & Chr(10) & batchQuery


        Cn.Execute UploadQuery & Chr(10) & batchQuery

    End If


    ' Close the text stream

    ts.Close


    ' Close the connection

    Cn.Close


    ' Inform user of successful upload

    MsgBox Format((Timer - StartTime) / 86400, "hh:mm:ss") & " Data upload to Databricks SQL completed!", vbInformation


    ' Clean up

    Set Cn = Nothing

    Set fso = Nothing

    Set ts = Nothing


End Sub



Thursday, February 15, 2024

Databricks - How to create function UDF

A user-defined function (UDF) is a means for a user to extend the native capabilities of Apache Spark™ SQL. SQL on Databricks has supported external user-defined functions written in Scala, Java, Python and R programming languages since 1.3.0. While external UDFs are very powerful, they also come with a few caveats:

  • Security. A UDF written in an external language can execute dangerous or even malicious code. This requires tight control over who can create UDFs.
  • Performance. UDFs are black boxes to the Catalyst Optimizer. Because Catalyst is not aware of the inner workings of a UDF, it cannot do any work to improve the performance of the UDF within the context of a SQL query.
  • SQL Usability. For a SQL user it can be cumbersome to write UDFs in a host language and then register them in Spark. Also, there is a set of rather simple extensions many users may want to make to SQL for which developing an external UDF is overkill.

 https://www.databricks.com/blog/2021/10/20/introducing-sql-user-defined-functions.html

Thursday, January 18, 2024

OpenAI with Databricks SQL for queries in natural language

Modern data platforms store and collect an incredible amount of useful data and metadata. However, even knowing the metadata itself might not be useful for end-users who don't have enough experience with the classical components of a relational data model. One of the challenges is not only the ability to write proper SQL statements to select the relevant information, but also understanding what needs to be joined (and how exactly) even to get the simplest insights (e.g. the top-5 customers from a given region by number of orders).

 https://polarpersonal.medium.com/using-openai-with-databricks-sql-for-queries-in-natural-language-cf6521e88148