Convert date in Databricks - JD Edwards Julian date to calendar date
This is my technical area for troubleshooting and learning new programming skills and more. Here you will find answers for many technologies such as ASP.NET 2.0/3.5/4.0, C#, Access, MySQL, Amazon Web Services, SQL Server, JD Edwards, SAS, Salesforce, APIs, MVC and more. Please visit and discuss.
Wednesday, October 16, 2024
Convert date in Databricks - JD Edwards Julian date to calendar date
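A JD Edwards Julian date is stored as a six-digit integer in CYYDDD form: C is the century offset from 1900, YY the two-digit year, and DDD the day of the year. In Databricks SQL the conversion is commonly written as `to_date(cast(jde_col + 1900000 AS string), 'yyyyDDD')` (assuming a Spark 3+ datetime pattern). The arithmetic behind that expression can be sketched in plain Python (function name is hypothetical):

```python
from datetime import date, timedelta

def jde_julian_to_date(jde: int) -> date:
    """Convert a JD Edwards Julian date (CYYDDD) to a calendar date.

    CYYDDD: C = century offset from 1900 (0 -> 19xx, 1 -> 20xx),
    YY = two-digit year, DDD = ordinal day of the year (1-366).
    """
    year = 1900 + jde // 1000      # CYY gives years since 1900
    day_of_year = jde % 1000       # DDD is the day within that year
    return date(year, 1, 1) + timedelta(days=day_of_year - 1)

# Example: 124290 -> year 2024, day 290 -> 2024-10-16
```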
Friday, June 21, 2024
Upload CSV file from local machine to Azure Databricks using Excel, VBA
Monday, June 10, 2024
Load data from SQL Server to Databricks using Excel VBA
Thursday, June 6, 2024
How to load data (CSV file) in databricks delta table using Excel, VBA and DSN
1. Create a DSN using the Simba Spark ODBC driver on your machine.
2. Get an access token from Databricks (used as the connection password).
3. The CSV column header names must match the target table's column names.
Sub UploadDataToDatabricks_Dynamic()
    ' Define variables
    Dim sFile As String
    Dim Cn As ADODB.Connection
    Dim UploadQuery As String
    Dim fso As Object
    Dim ts As Object
    Dim line As String
    Dim DataArray() As String
    Dim headerArray() As String
    Dim i As Long
    Dim j As Long
    Dim batchCounter As Long
    Dim batchSize As Long
    Dim batchQuery As String
    Dim numColumns As Integer
    Dim columnList As String
    Dim dsnName As String
    Dim TableName As String
    Dim StartTime As Double

    StartTime = Timer

    ' Get the CSV file to upload
    sFile = Application.GetOpenFilename("CSV Files (*.csv), *.csv", , "Select CSV File to Upload")
    If sFile = "False" Then Exit Sub ' User cancelled the file selection

    ' Open the CSV file for reading
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile(sFile, 1) ' 1 = ForReading

    ' Establish connection to Databricks via the DSN (replace details with your own)
    dsnName = "YourDSNName"
    Set Cn = New ADODB.Connection
    Cn.Open "DSN=" & dsnName & ";Uid=token;Pwd=YourDatabricksAccessToken;"

    ' Initialize variables
    batchCounter = 0
    batchSize = 80
    TableName = "DatabaseName.TableName"
    UploadQuery = "INSERT INTO " & TableName & "("

    ' Read the header row to determine the column list
    If Not ts.AtEndOfStream Then
        line = ts.ReadLine
        headerArray = Split(line, ",")
        numColumns = UBound(headerArray) + 1
        columnList = Join(headerArray, ", ")
        UploadQuery = UploadQuery & columnList & ") "
    End If

    ' Initialize the batch query
    batchQuery = ""

    ' Read and process the CSV file
    i = 0
    Do Until ts.AtEndOfStream
        line = ts.ReadLine
        DataArray = Split(line, ",")
        ' Build the SELECT part dynamically; skip rows whose column count does not match the header
        If UBound(DataArray) = UBound(headerArray) Then
            batchQuery = batchQuery & "SELECT "
            For j = 0 To numColumns - 1
                ' Quote each value and escape embedded single quotes
                batchQuery = batchQuery & "'" & Replace(DataArray(j), "'", "''") & "'"
                If j < numColumns - 1 Then
                    batchQuery = batchQuery & ", "
                End If
            Next j
            batchQuery = batchQuery & " UNION ALL " & Chr(10)
            batchCounter = batchCounter + 1
            ' Execute the batch once the batch size is reached
            If batchCounter >= batchSize Then
                ' Remove the trailing 'UNION ALL' and execute the batch query
                If Len(batchQuery) > 0 Then
                    batchQuery = Left(batchQuery, Len(batchQuery) - Len(" UNION ALL " & Chr(10)))
                    ' Debug.Print UploadQuery & Chr(10) & batchQuery
                    Cn.Execute UploadQuery & Chr(10) & batchQuery
                End If
                ' Reset the batch variables
                batchQuery = ""
                batchCounter = 0
            End If
        End If
        i = i + 1
    Loop

    ' Execute any remaining batch query
    If batchCounter > 0 And Len(batchQuery) > 0 Then
        batchQuery = Left(batchQuery, Len(batchQuery) - Len(" UNION ALL " & Chr(10)))
        Cn.Execute UploadQuery & Chr(10) & batchQuery
    End If

    ' Close the text stream and the connection
    ts.Close
    Cn.Close

    ' Inform the user of the elapsed time and completion
    MsgBox Format((Timer - StartTime) / 86400, "hh:mm:ss") & " - Data upload to Databricks SQL completed!", vbInformation

    ' Clean up
    Set Cn = Nothing
    Set fso = Nothing
    Set ts = Nothing
End Sub
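The core of the routine above is string-building: rows are accumulated as quoted SELECT clauses joined with UNION ALL, and each batch is sent as one INSERT statement. A hypothetical Python sketch of that same batching logic (names are illustrative, not part of the original macro):

```python
import csv
import io

def build_batched_inserts(csv_text: str, table: str, batch_size: int = 80):
    """Yield batched INSERT ... SELECT ... UNION ALL statements,
    mirroring the VBA macro's approach."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    insert_head = f"INSERT INTO {table} ({', '.join(header)})"
    batch = []
    for row in reader:
        if len(row) != len(header):
            continue  # skip malformed rows, as the macro does
        # quote each value and escape embedded single quotes
        values = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        batch.append("SELECT " + values)
        if len(batch) >= batch_size:
            yield insert_head + "\n" + "\nUNION ALL\n".join(batch)
            batch = []
    if batch:  # flush any remaining partial batch
        yield insert_head + "\n" + "\nUNION ALL\n".join(batch)
```

Each yielded statement could then be executed over the same DSN connection (e.g. with pyodbc), one round trip per batch instead of per row.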
Thursday, February 15, 2024
Databricks - How to create function UDF
A user-defined function (UDF) is a means for a user to extend the native capabilities of Apache Spark™ SQL. SQL on Databricks has supported external user-defined functions written in Scala, Java, Python and R since Spark 1.3.0. While external UDFs are very powerful, they come with a few caveats:
- Security. A UDF written in an external language can execute dangerous or even malicious code. This requires tight control over who can create UDFs.
- Performance. UDFs are black boxes to the Catalyst Optimizer. Since Catalyst is not aware of the inner workings of a UDF, it cannot do anything to improve the UDF's performance within the context of a SQL query.
- SQL usability. For a SQL user it can be cumbersome to write UDFs in a host language and then register them in Spark. Also, many users want to make simple extensions to SQL for which developing an external UDF is overkill.
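SQL UDFs, introduced in the linked post, address these caveats by letting you define the function directly in SQL, where it stays transparent to the optimizer. A minimal sketch (function name and logic are illustrative):

```sql
-- A simple SQL UDF: defined and registered entirely in SQL
CREATE FUNCTION to_fahrenheit(celsius DOUBLE)
  RETURNS DOUBLE
  COMMENT 'Converts degrees Celsius to Fahrenheit'
  RETURN celsius * 9 / 5 + 32;

SELECT to_fahrenheit(100);  -- 212.0
```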
https://www.databricks.com/blog/2021/10/20/introducing-sql-user-defined-functions.html
Thursday, January 18, 2024
OpenAI with Databricks SQL for queries in natural language
Modern data platforms store and collect an incredible amount of useful data and metadata. However, even knowing the metadata may not help end-users who lack experience with the classical components of a relational data model. The challenge is not only writing proper SQL statements to select the relevant information, but also understanding what needs to be joined (and how exactly) even to get the simplest insights (e.g. the top 5 customers from a given region by number of orders).
https://polarpersonal.medium.com/using-openai-with-databricks-sql-for-queries-in-natural-language-cf6521e88148