I'm working on a Databricks pipeline and trying to create and apply expectations on it. I have the code, but I keep getting an error that I cannot resolve. There is not much to go on; I keep trying different approaches, resolving all the intermediate errors, and still end up with the same failure, and I don't understand what is going wrong. I've checked whether it's a permission issue, and I have tried displaying the source table, which works fine. In the pipeline view I should be able to see my expectations, but because the update fails, nothing shows up.
The error is: update be7a33 is FAILED. Error class: _UNCLASSIFIED_PYTHON_COMMAND_ERROR
%python
from pyspark.sql.functions import col
from pyspark import pipelines as dp

@dp.table(
    name="orders",
    comment="Orders table with data quality constraints"
)
@dp.expect_all_or_fail(
    "expect_table_row_count_to_be_between", "COUNT(*) > 100",
    "customer_id_not_null", "customer_id IS NOT NULL",
    "expect_column_values_to_be_in_set", "currency IN ('USD', 'EUR', 'GBP')"
)
def orders():
    return dp.read("Xyntrel_bronze.bronze.orders").filter(
        col("customer_id").isNotNull()
    )
I don't understand it, because the parser says the code is correct, yet on execution the update fails. This is the full event from the pipeline log:
{
    "timestamp": "2025-12-10T09:13:32.863Z",
    "message": "Update be7a33 is FAILED.",
    "level": "ERROR",
    "error": {
        "exceptions": [
            {
                "message": "",
                "error_class": "_UNCLASSIFIED_PYTHON_COMMAND_ERROR",
                "short_message": ""
            }
        ],
        "fatal": true
    },
    "details": {
        "update_progress": {
            "state": "FAILED"
        }
    },
    "event_type": "update_progress",
    "maturity_level": "STABLE"
}
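One thing that may explain a failure that only appears at execution: as far as I can tell from the Databricks expectations docs, expect_all_or_fail is documented to take a single dict mapping expectation names to SQL constraint strings, not alternating positional strings as in the code above. Below is a minimal stub (NOT the real Databricks API, just a sketch mirroring that documented one-dict signature) showing how the two call styles behave differently:

```python
# Hypothetical stub mirroring the documented shape of the Databricks
# expect_all_or_fail decorator: ONE dict of {name: SQL constraint}.
def expect_all_or_fail(expectations: dict):
    def decorator(fn):
        fn.expectations = expectations  # attach for inspection only
        return fn
    return decorator

# Dict-style call, matching the documented signature:
@expect_all_or_fail({
    "row_count_over_100": "COUNT(*) > 100",
    "customer_id_not_null": "customer_id IS NOT NULL",
    "currency_in_set": "currency IN ('USD', 'EUR', 'GBP')",
})
def orders():
    return "rows"

print(orders.expectations["customer_id_not_null"])
# -> customer_id IS NOT NULL

# The call style in the question passes six positional strings; against
# a one-argument signature that raises TypeError when the decorator
# line runs, which a pipeline could surface as an unclassified Python
# command error rather than a parse error.
try:
    expect_all_or_fail("name", "constraint")
except TypeError as e:
    print("rejected:", type(e).__name__)
```

If that is the cause, the fix would be to wrap the three name/constraint pairs into one dict argument; I can't confirm that's the only issue here, but it matches the symptom of code that parses fine and fails only when the pipeline actually evaluates the decorators.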