I am trying to connect from Databricks to a SQL Server instance (the one I manage through SSMS) running on my personal computer, using JDBC.
Here is my code:
database_host = "192.168.xxxxx.xxxxx"
database_port = "1433"  # update if you use a non-default port
database_name = "PRACTICE"
table = "TblGender"
user = "sa"
password = "somepassword"
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"  # SQL Server JDBC driver class
url = f"jdbc:sqlserver://{database_host}:{database_port};database={database_name}"

remote_table = (
    spark.read.format("jdbc")
    .option("driver", driver)
    .option("url", url)
    .option("dbtable", table)
    .option("user", user)
    .option("password", password)
    .load()
)
remote_table.show()
I have enabled TCP/IP and set up the inbound firewall rule, but I am still not able to connect. I am getting this error:
INTERNAL: The TCP/IP connection to the host 192.168.xxx.xxx, port 1433 has failed. Error: "Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
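To separate a network problem from a JDBC configuration problem, I think a raw TCP probe from a Databricks notebook cell could help (a minimal sketch, plain Python with no JDBC involved; HOST stands in for my redacted IPv4 address):

import socket

HOST = "192.168.xxx.xxx"  # placeholder for the SQL Server machine's IPv4
PORT = 1433

try:
    # open a plain TCP connection to the same host/port the JDBC URL uses
    with socket.create_connection((HOST, PORT), timeout=5):
        print("TCP connection succeeded, so the problem is in the JDBC options")
except OSError as exc:
    print(f"TCP connection failed at the network level: {exc}")

If this also times out, the issue is reachability (routing, firewall, or SQL Server not listening on that interface) rather than anything in the Spark code itself.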
I am not sure what is incorrect in this code, but I wanted to ask:
- Is the host name correct? I am using the IPv4 address fetched by running ipconfig in cmd (see the check after this list for how I could verify it on the server side).
- If I use the server name instead of the IP address, I get the same error.
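For the host-name question above, here is a small check I could run on the SQL Server machine itself to see which addresses actually accept connections on port 1433 (a sketch; the second address is a placeholder for the IPv4 from ipconfig):

import socket

for host in ("127.0.0.1", "192.168.xxx.xxx"):  # second entry: the LAN IPv4 from ipconfig
    try:
        # try to connect locally to see whether SQL Server listens on this address
        with socket.create_connection((host, 1433), timeout=3):
            print(f"{host}:1433 is accepting TCP connections")
    except OSError as exc:
        print(f"{host}:1433 is not reachable: {exc}")

If 127.0.0.1 connects but the LAN address does not, TCP/IP is likely enabled only on the loopback interface (the IP Addresses settings in SQL Server Configuration Manager).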
Please help me identify the problem.