2,996 questions
-3
votes
0
answers
25
views
How to read files dynamically from different storage accounts [closed]
I am writing Synapse Spark code to dynamically read files from different storage accounts. I don't want the account to be hard coded, as the Spark notebook will be attached to a pipeline - see the image below. NB: the ...
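A minimal sketch of the parameterized-read idea, assuming the storage account, container, and relative path arrive as notebook parameters from the pipeline; the parameter names and path layout below are hypothetical:
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical values; in Synapse these would typically be supplied by the
# pipeline through a parameters cell rather than hard coded.
storage_account = "examplestorage"   # assumption: passed in by the pipeline
container = "raw"                    # assumption: passed in by the pipeline
relative_path = "sales/2024/"        # assumption: passed in by the pipeline

# Build the abfss path from the parameters so no account name is hard coded.
path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/{relative_path}"

df = spark.read.parquet(path)
df.show(5)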
0
votes
1
answer
54
views
How to export an Artifacts ARM Template for an Azure Synapse Workspace?
Is there a way to export an "Artifacts ARM Template" from a specific resource?
I have been trying to implement an Azure DevOps CI/CD pipeline that migrates artifacts (Pipeline, ...
1
vote
1
answer
42
views
Issue Passing Parameters to Pipeline in Azure Synapse
Hello, I am having a problem triggering a pipeline on Azure Synapse from my local machine, with input parameters (parameters that I then use in a notebook).
Here is my local code:
import json
import time
import ...
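A minimal sketch of triggering a Synapse pipeline from local Python with parameters, assuming the azure-synapse-artifacts and azure-identity packages are what the question is using; the workspace endpoint, pipeline name, and parameter names below are hypothetical:
from azure.identity import DefaultAzureCredential
from azure.synapse.artifacts import ArtifactsClient

# Hypothetical workspace endpoint and pipeline name.
endpoint = "https://myworkspace.dev.azuresynapse.net"
client = ArtifactsClient(credential=DefaultAzureCredential(), endpoint=endpoint)

# Parameters are passed as a plain dict and surfaced to the pipeline
# (and from there to the notebook) by name.
run = client.pipeline.create_pipeline_run(
    "my_pipeline",
    parameters={"input_date": "2024-01-01", "env": "dev"},
)
print(run.run_id)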
0
votes
1
answer
46
views
Access filters for Synapse DB files for a particular user
I have 5 files in my Azure Synapse, i.e. file1, file2, file3, file4, file5, and 3 users for those 5 files. Let's say user 1 should see file1 and file2 only, user 2 should see file3 and file4, and user ...
0
votes
1
answer
55
views
Synapse Web Activity could be changing datetime2 format
I have a Web Activity that sends a POST request to an API created in Azure Spring Apps. The body is built from the result of a previous query, in JSON format. The Web Activity sends the request properly ...
0
votes
1
answer
47
views
Connecting Local Visual Studio Notebook to Azure Synapse and Using a Spark Pool
Hi, is it possible to connect from a local Visual Studio notebook to Azure Synapse and use a Spark pool created in Synapse?
Something like this:
spark = SparkSession.builder \
.appName("...
0
votes
1
answer
61
views
Spark Hadoop error reading a file in Azure - java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azure.NativeAzureFileSystem not found
I am working locally in a Visual Studio notebook and would like to connect to the Spark pool of Azure Synapse, but I keep running into a problem with Hadoop.
The error appears when Spark reads the Parquet ...
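One commonly suggested direction (a sketch, not a verified fix for this exact setup) is to make sure the hadoop-azure and Azure storage jars are on the local Spark classpath, for example via spark.jars.packages; the coordinates and versions below are placeholders and must match your local Hadoop/Spark build:
from pyspark.sql import SparkSession

# Placeholder package versions; pick the ones matching your Hadoop version.
spark = (
    SparkSession.builder
    .appName("local-azure-read")
    .config(
        "spark.jars.packages",
        "org.apache.hadoop:hadoop-azure:3.3.4,"
        "com.microsoft.azure:azure-storage:8.6.6",
    )
    .getOrCreate()
)

# Auth configuration (account key / SAS) is omitted here; the point is only
# that the wasbs/abfss filesystem classes resolve once the jars are present.
df = spark.read.parquet("wasbs://container@account.blob.core.windows.net/path/")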
-1
votes
1
answer
75
views
Synapse Link for Cosmos DB vCore (MongoDB)
I need to set up the Cosmos DB Analytical Store and Synapse Link for my vCore-based Azure Cosmos DB for MongoDB database. As it is not stated in the documentation, I would like to ask whether this is supported ...
0
votes
2
answers
75
views
Efficiently updating a single column value for many rows in MS Fabric / pyspark / delta
I have a data set on a pretty small Microsoft Fabric capacity (which, if you don't know, is basically Azure Synapse, which is basically Apache Spark).
Due to limitations with the data source, I am ...
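A minimal sketch of a targeted single-column update with the Delta Lake Python API, assuming the table is Delta-backed; the table path, predicate, and column names are hypothetical:
from delta.tables import DeltaTable
from pyspark.sql import SparkSession
from pyspark.sql.functions import lit

spark = SparkSession.builder.getOrCreate()

# Hypothetical Delta table path and columns.
dt = DeltaTable.forPath(spark, "Tables/my_table")

# Update only the rows matching the predicate; files that cannot contain
# matching rows are skipped where data skipping applies.
dt.update(
    condition="status = 'pending'",
    set={"status": lit("processed")},
)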
0
votes
1
answer
49
views
Azure Synapse External Table not accessible from Power BI
I have a delta table in a directory in a storage account, and I am creating an external table in Azure Synapse using this query:
IF NOT EXISTS (SELECT * FROM sys.external_file_formats WHERE name = '...
0
votes
1
answer
70
views
Modifying Spark Partition Key Without Shuffling
I am working in Azure Synapse Analytics, in PySpark. Say I have a PySpark dataframe df with a partition key 'DATE'. However, say that 'DATE' is a string type and we would like to cast it to a date by ...
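For the cast itself, a minimal self-contained PySpark sketch (the column name is taken from the question, the toy data and date format are assumptions; whether the existing partitioning survives the cast is exactly what the question asks):
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Toy frame standing in for the question's df with a string 'DATE' column.
df = spark.createDataFrame(
    [("2024-01-01", 1), ("2024-01-02", 2)], ["DATE", "value"]
)

# Cast the string to a date; withColumn is a narrow transformation, so the
# cast itself does not trigger a shuffle.
df = df.withColumn("DATE", F.to_date(F.col("DATE"), "yyyy-MM-dd"))
df.printSchema()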
0
votes
1
answer
81
views
Spark gives an error when writing a limited-length column of type varbinary to a Synapse database
I am writing output to an Azure Synapse table that contains a varbinary(8000) column. When writing with Spark, it gives an UNSUPPORTED_DATATYPE error because I am trying to limit the length from ...
0
votes
1
answer
27
views
How can I return an empty result instead of error 13807 when querying a path that might not exist with OPENROWSET?
I have a Synapse SQL Serverless query as follows
SELECT *
FROM OPENROWSET
(
BULK 'mytestpath/*/*',
DATA_SOURCE = 'LocalDataLake',
FORMAT = 'Parquet'
)
WITH(
Foo INT,
Bar INT
) X
The ...
0
votes
1
answer
69
views
RLS in Synapse serverless SQL pool using views
Issue: in a Synapse serverless SQL pool, when I try to access view data created from a file stored in ADLS with AD/Entra users who have "Synapse SQL pool admin" rights on the Synapse workspace
and ...
2
votes
1
answer
87
views
Delta lake MERGE error: [INVALID_EXTRACT_BASE_FIELD_TYPE]
I'm trying to implement the PySpark code below to read delta files saved in the data lake (delta_table) and join them with a data frame containing updated records (novos_registros).
#5. Build the matching ...
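For context, a minimal sketch of the MERGE pattern the question appears to be using, written with the Delta Lake Python API; the target path, key column, and source read are hypothetical stand-ins for delta_table and novos_registros:
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical target table and updated-records frame.
target = DeltaTable.forPath(spark, "/lake/delta_table")
novos_registros = spark.read.parquet("/lake/updates/")  # updated records

# Standard upsert: update rows whose keys match, insert the rest.
(
    target.alias("t")
    .merge(novos_registros.alias("s"), "t.id = s.id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)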