1,285 questions
0 votes · 1 answer · 26 views
Tasks in "removed" status keep appearing and disappearing in Google Cloud Composer UI
I'm using Google Cloud Composer (Apache Airflow), and recently I started noticing a strange issue in the UI. Some tasks periodically appear and disappear with the "removed" status.
Has ...
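One common cause of intermittent "removed" task instances is a DAG whose task set changes between scheduler parses, for example because tasks are generated from non-deterministic top-level code. A minimal sketch that reproduces the symptom (all names below are illustrative, not taken from the question):

```python
# Minimal illustration (an assumption about the cause, not a fix): tasks generated
# from a value that changes between scheduler parses will intermittently vanish
# from the DAG, and their old task instances are then shown as "removed" in the UI.
import random
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(
    dag_id="flaky_dynamic_tasks",   # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    # Non-deterministic top-level code: each parse may produce a different task set.
    for i in range(random.randint(1, 3)):
        EmptyOperator(task_id=f"task_{i}")
```

If the task list must be dynamic, deriving it from a stable source (an Airflow Variable or a static config file) keeps the task set consistent across parses.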
0 votes · 0 answers · 37 views
CloudRunExecuteJobOperator fails to trigger job run on GCP
I have a Cloud Composer environment that mainly consists of DAGs that trigger jobs on GCP using the CloudRunExecuteJobOperator.
Without any changes to my instance, the tasks began to get stuck ...
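For reference, a minimal sketch of how CloudRunExecuteJobOperator is typically wired into a DAG, assuming a recent apache-airflow-providers-google package; the project, region, and job names below are placeholders:

```python
# Minimal sketch, assuming apache-airflow-providers-google is installed and that
# "my-project", "europe-west1", and "my-cloud-run-job" are placeholders for your
# own project, region, and Cloud Run job name.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.cloud_run import CloudRunExecuteJobOperator

with DAG(
    dag_id="cloud_run_job_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    execute_job = CloudRunExecuteJobOperator(
        task_id="execute_job",
        project_id="my-project",        # placeholder
        region="europe-west1",          # placeholder
        job_name="my-cloud-run-job",    # placeholder
        # deferrable=True,  # optional: hand off the status polling to the triggerer
    )
```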
0 votes · 0 answers · 14 views
Is there a way to schedule a Cloud Composer environment on and off?
I'm exploring building a data pipeline that runs on Google Cloud Composer 3. The pipeline only needs to run once a week for about 12-18 hours. I'm trying to minimize costs, as the environment incurs ...
0 votes · 0 answers · 30 views
Why did my Airflow DAGs suddenly fail with "PythonVirtualenvOperator only supports functions for python_callable arg"?
Issue Description
I have multiple Airflow DAGs using a custom operator. All of these DAGs were working perfectly fine a few hours ago, but now they're all showing "Broken dag" status with ...
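The error in the title is raised when python_callable is not a plain function, for example a functools.partial, a lambda, or a bound method, which custom operators sometimes introduce. A minimal sketch of the accepted shape (the DAG id, function body, and requirements pin are illustrative):

```python
# Minimal sketch of the constraint the error message refers to: python_callable
# must be a plain, module-level function, not a partial, lambda, or callable object.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator


def my_task():
    # Runs inside a freshly built virtualenv; imports must happen inside the function.
    import pandas as pd
    print(pd.__version__)


with DAG(
    dag_id="virtualenv_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    PythonVirtualenvOperator(
        task_id="run_in_venv",
        python_callable=my_task,          # a plain function, as the error requires
        requirements=["pandas==2.1.4"],   # placeholder pin
        system_site_packages=False,
    )
```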
0 votes · 0 answers · 50 views
Cloud Composer 2 is not responding after environment size change
I'm using a Cloud Composer environment on GCP.
Last month, we tried to change our environment size from ENVIRONMENT_SIZE_MEDIUM to ENVIRONMENT_SIZE_SMALL (using Terraform).
After 45 minutes, the ...
0 votes · 1 answer · 102 views
Google Cloud Composer - Missing "Trigger DAG w/ config" option?
I have been looking everywhere in Cloud Composer Airflow for the "Trigger DAG w/ config" option when executing a DAG but it seems like it's simply not available anywhere.
Could anyone point ...
3 votes · 0 answers · 108 views
Cloud Composer/Airflow Logging: Line breaks as \n
We've recently upgraded GCP's Cloud Composer to version 3 and Airflow to version 2.10.2-build.5.
After this change, all the logs that were previously multi-line now appear on a single line with \n ...
0 votes · 0 answers · 28 views
Mount Google Composer's data folder in a custom Kubernetes pod
To execute compute-intensive tasks (such as unzipping files), I decided to use the KubernetesPodOperator, which runs Bash commands on a pod with a custom Docker image inside the composer-user-workload ...
0 votes · 1 answer · 49 views
How to authenticate to Cloud Composer Airflow REST API
I'm trying to authenticate an HTTP call to the Cloud Composer Airflow REST API from a Cloud Run service in a different project. I want to programmatically trigger a DAG run. I don't understand how to do it....
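A minimal sketch of the pattern documented for Composer 2 with the Airflow 2 stable REST API: obtain default Google credentials for the Cloud Run service's service account and call the API through an AuthorizedSession. The web server URL and DAG id below are placeholders (the URL is the environment's config.airflowUri), and the caller is assumed to have the Composer User role on the Composer project.

```python
# Sketch, assuming Composer 2 / Airflow 2 and that the Cloud Run service account
# has been granted the Composer User role. WEB_SERVER_URL and DAG_ID are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

WEB_SERVER_URL = "https://example.composer.googleusercontent.com"  # placeholder
DAG_ID = "my_dag"                                                   # placeholder

# Default credentials resolve to the Cloud Run service's service account.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# Trigger a DAG run via the Airflow 2 stable REST API.
response = session.post(
    f"{WEB_SERVER_URL}/api/v1/dags/{DAG_ID}/dagRuns",
    json={"conf": {}},
)
response.raise_for_status()
print(response.json())
```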
0 votes · 0 answers · 25 views
Cloud Composer 2 fails to check Dataflow job status
I need help with this issue.
We recently migrated from Cloud Composer 1 to Cloud Composer 2, but we have issues when Composer tries to check the Dataflow job status.
Dataflow starts correctly, but often,...
0 votes · 0 answers · 69 views
Accessing Sheets from operators on Google Cloud Composer 3
We are trying to migrate our DAGs from Airflow 2.9.3 on Composer 2 to Airflow 2.10.2 on Composer 3, but can't seem to get operators that access Google Sheets to work.
What works on Composer 2 is ...
0 votes · 0 answers · 78 views
How to load common dependencies into Dataflow?
Our team has a set of data pipelines built as DAGs triggered on Composer (Airflow) that run Beam (Dataflow) jobs.
Across these Dataflow pipelines, there is a set of common utilities engineers need to ...
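One common way to ship shared utilities to Dataflow workers is to package them and point the pipeline at a setup.py via the setup_file option (or at pre-built wheels via extra_packages). A sketch under that assumption; the project, bucket, and setup.py path are placeholders:

```python
# Sketch, assuming a Beam Python pipeline whose shared utilities are packaged by
# ./setup.py. Project, region, and bucket names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, SetupOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                 # placeholder
    region="europe-west1",                # placeholder
    temp_location="gs://my-bucket/tmp",   # placeholder
)
# Ship the package containing the shared utilities to the Dataflow workers.
options.view_as(SetupOptions).setup_file = "./setup.py"

with beam.Pipeline(options=options) as p:
    (p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2))
```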
0 votes · 0 answers · 61 views
Incompatible table partitioning specification using the Spark BigQuery connector
I want to run a Spark job that was originally built on Scala 2.12.15 and Spark 3.3.4 with sparkBigQueryConnectorVersion = "0.28.1".
However, I just upgraded the runtime of Dataproc ...
1 vote · 1 answer · 64 views
Running dbt in Airflow using GKEStartPodOperator
I would like to run dbt using the GKEStartPodOperator Airflow operator, but I am struggling to find the proper way to authenticate dbt so that it can perform operations on Google Cloud BigQuery.
So ...
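One way to handle the BigQuery authentication, assuming Workload Identity is configured on the cluster: run the pod under a Kubernetes service account bound to a Google service account with BigQuery permissions, and have the profiles.yml baked into the dbt image use method: oauth so dbt picks up those credentials automatically. A sketch with placeholder project, cluster, namespace, image, and service account names:

```python
# Sketch of one possible approach, assuming Workload Identity: the Kubernetes
# service account "dbt-ksa" is bound to a Google service account with BigQuery
# permissions. All names, paths, and the image tag are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.kubernetes_engine import GKEStartPodOperator

with DAG(
    dag_id="dbt_on_gke_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    dbt_run = GKEStartPodOperator(
        task_id="dbt_run",
        project_id="my-project",           # placeholder
        location="europe-west1",           # placeholder
        cluster_name="my-gke-cluster",     # placeholder
        namespace="dbt",                   # placeholder
        name="dbt-run",
        image="ghcr.io/dbt-labs/dbt-bigquery:1.7.4",        # placeholder image/tag
        cmds=["dbt", "run", "--profiles-dir", "/dbt/profiles"],  # placeholder path
        service_account_name="dbt-ksa",    # KSA bound to a GSA via Workload Identity
    )
```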
1 vote · 1 answer · 139 views
Why is BeamRunPythonPipelineOperator unable to track Dataflow job status, and why does it keep waiting until the job ends without returning Dataflow logs?
I am triggering a Dataflow pipeline using the BeamRunPythonPipelineOperator() in Airflow on Cloud Composer (composer-2.9.8-airflow-2.9.3). The job is submitted successfully to Dataflow; however, the ...
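One thing worth checking is whether a DataflowConfiguration with an explicit job_name, project_id, and location is passed to the operator; it is what the operator uses to look up the submitted Dataflow job and poll its status. A sketch under that assumption; every name, path, and bucket below is a placeholder:

```python
# Sketch, assuming the missing piece is the DataflowConfiguration that lets the
# operator associate the submitted Beam pipeline with a Dataflow job and poll it.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
from airflow.providers.google.cloud.operators.dataflow import DataflowConfiguration

with DAG(
    dag_id="beam_dataflow_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    run_pipeline = BeamRunPythonPipelineOperator(
        task_id="run_pipeline",
        runner="DataflowRunner",
        py_file="gs://my-bucket/pipelines/my_pipeline.py",   # placeholder
        pipeline_options={
            "temp_location": "gs://my-bucket/tmp",           # placeholder
        },
        dataflow_config=DataflowConfiguration(
            job_name="my-beam-job",          # placeholder; operator appends a suffix
            project_id="my-project",         # placeholder
            location="europe-west1",         # placeholder
            wait_until_finished=True,
        ),
    )
```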