All Questions
Tagged with google-cloud-composer python
230 questions
1
vote
0
answers
31
views
How do I correctly configure the staging and temp buckets in DataprocCreateClusterOperator?
I'm trying to find out how to set the temp and staging buckets on the DataprocOperator. I've searched all over the internet and didn't find a good answer.
import pendulum
from datetime import timedelta
...
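One likely answer: the Dataproc ClusterConfig exposes config_bucket (the staging bucket) and temp_bucket fields, and DataprocCreateClusterOperator accepts both through its cluster_config dict. A minimal sketch, with hypothetical project and bucket names:

from airflow.providers.google.cloud.operators.dataproc import DataprocCreateClusterOperator

create_cluster = DataprocCreateClusterOperator(
    task_id="create_cluster",
    project_id="my-project",        # hypothetical
    region="us-central1",
    cluster_name="my-cluster",
    cluster_config={
        "config_bucket": "my-staging-bucket",  # staging bucket (hypothetical name)
        "temp_bucket": "my-temp-bucket",       # temp bucket (hypothetical name)
        "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
        "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
    },
)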
0
votes
0
answers
181
views
Google Cloud Composer 3 not using Default Service Account in DAG Task
Trying to invoke a Cloud Function from Cloud Composer for a customer POC in a Free Trial account.
I am using
Cloud Functions version 2, powered by Cloud Run - the default Python Hello World Cloud Function
...
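Gen-2 functions sit behind Cloud Run, so calling one from a DAG task requires an ID token minted for the function's URL; the token is issued for whatever service account the Composer workers actually run as, which makes it a good way to verify the identity in play. A minimal sketch using google-auth, with a hypothetical function URL:

import google.auth.transport.requests
import google.oauth2.id_token
import requests

FUNCTION_URL = "https://my-function-abc123-uc.a.run.app"  # hypothetical

def invoke_function():
    # fetch_id_token uses the environment's active credentials, i.e. the
    # service account the Composer workers run under
    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, FUNCTION_URL)
    resp = requests.get(FUNCTION_URL, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.text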
0
votes
1
answer
72
views
How to retrieve which DAG has updated a Composer Airflow Dataset
Regarding Google Cloud Composer, I have defined a DAG in this way:
dataset = Dataset("//my_Dataset")
dag = DAG(
    dag_id='my_dag',
    default_args=default_args,
    schedule=[dataset],
    catchup=...
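For dataset-triggered runs, Airflow 2.4+ puts triggering_dataset_events into the task context; each DatasetEvent records the producing DAG via source_dag_id. A minimal sketch, assuming a TaskFlow task inside the consumer DAG:

from airflow.decorators import task

@task
def report_producers(triggering_dataset_events=None):
    # maps dataset URI -> list of DatasetEvent rows for this run
    for uri, events in (triggering_dataset_events or {}).items():
        for event in events:
            print(uri, event.source_dag_id, event.source_run_id)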
1
vote
0
answers
138
views
How to get the XCom pull as a dictionary
I have a DAG where I have pushed a dictionary to XCom and want to pull it in a BigQuery operator. I have also set render_template_as_native_obj=True, but it still gives the following error:
time ...
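With render_template_as_native_obj=True set on the DAG, a templated xcom_pull renders back to the original Python object instead of its string form. A minimal sketch of the working pattern, with PythonOperator standing in for the BigQuery operator:

import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator

with DAG(
    dag_id="native_xcom_demo",                  # hypothetical
    start_date=pendulum.datetime(2024, 1, 1),
    schedule=None,
    render_template_as_native_obj=True,         # keep rendered templates as Python objects
) as dag:
    push = PythonOperator(
        task_id="push",
        python_callable=lambda: {"table": "my_table", "rows": 10},
    )
    consume = PythonOperator(
        task_id="consume",
        # resolves to the dict itself, not its str(), under native rendering
        op_kwargs={"cfg": "{{ ti.xcom_pull(task_ids='push') }}"},
        python_callable=lambda cfg: print(cfg["table"], cfg["rows"]),
    )
    push >> consume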
2
votes
1
answer
179
views
The same DAGs with different parameters or the same tasks with different parameters in the one DAG
In our project we have different clients and identical DAGs for them with different prefixes and parameters. For instance, we have a mssql_to_bigquery DAG, but it's separate for every client. That leads ...
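One common way out is a DAG factory: a single module builds one DAG object per client and registers each at module level so the scheduler discovers them. A minimal sketch with hypothetical client names:

import pendulum
from airflow import DAG

CLIENTS = {"acme": {"db": "acme_db"}, "globex": {"db": "globex_db"}}  # hypothetical

def build_dag(client: str, params: dict) -> DAG:
    with DAG(
        dag_id=f"{client}_mssql_to_bigquery",
        start_date=pendulum.datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        ...  # shared task definitions go here, parameterized by `params`
    return dag

for client, params in CLIENTS.items():
    # top-level names are how the DagBag finds generated DAGs
    globals()[f"{client}_mssql_to_bigquery"] = build_dag(client, params)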
1
vote
0
answers
277
views
Dynamic Task Mapping in Airflow - setting sequential dependencies amongst mapped tasks
I have a DAG that queries a table in BigQuery and returns a list of stored procedures to execute, which can vary in both the number of stored procedures and the actual stored procedures needed ...
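Mapped task instances run in parallel by default; Airflow 2.6+ adds max_active_tis_per_dagrun, which caps how many mapped copies run at once within a single run (setting it to 1 serializes them, though strict list order is not guaranteed). A minimal sketch with hypothetical procedure names:

import pendulum
from airflow.decorators import dag, task

@dag(start_date=pendulum.datetime(2024, 1, 1), schedule=None)
def sequential_procs():
    @task(max_active_tis_per_dagrun=1)  # one mapped instance at a time per run
    def run_proc(proc_name: str):
        print(f"executing {proc_name}")

    run_proc.expand(proc_name=["sp_load_a", "sp_load_b", "sp_load_c"])

sequential_procs()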
2
votes
0
answers
135
views
DbtCloudRunJobOperator in Airflow Fails to Detect Successful Job Completion in DBT Cloud: How to Resolve?
I am using the DbtCloudRunJobOperator in a managed (Google Cloud Composer) Apache Airflow instance to trigger jobs in DBT Cloud. While the DBT Cloud jobs themselves run successfully and complete ...
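The operator only polls dbt Cloud for the run status when wait_for_termination is enabled, and it gives up after its timeout; long jobs need both set generously. A minimal sketch with a hypothetical job ID:

from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

trigger_job = DbtCloudRunJobOperator(
    task_id="trigger_dbt_job",
    dbt_cloud_conn_id="dbt_cloud_default",
    job_id=12345,                # hypothetical
    wait_for_termination=True,   # poll until the run reaches a terminal state
    check_interval=60,           # seconds between status polls
    timeout=3600,                # fail if the run hasn't finished within an hour
)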
0
votes
0
answers
81
views
XCom pull returning None
I have a DAG where I am doing some data loading activity, and at the end I store the DAG run state in an XCom; I want to pull that XCom's value in another DAG. While doing so I am ...
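A frequent cause: XCom is scoped to a (dag_id, run) pair, so pulling from another DAG silently returns None unless the foreign dag_id is named and the runs share a logical date, or include_prior_dates=True is set. A minimal sketch with a hypothetical producer DAG:

from airflow.operators.python import PythonOperator

def pull_from_other_dag(ti):
    value = ti.xcom_pull(
        dag_id="producer_dag",       # hypothetical upstream DAG
        task_ids="final_task",       # hypothetical task that pushed the state
        key="dag_state",
        include_prior_dates=True,    # also match the most recent earlier run
    )
    print(value)

# meant to live inside a DAG definition
pull = PythonOperator(task_id="pull", python_callable=pull_from_other_dag)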
0
votes
0
answers
543
views
Generate dynamic tasks in Airflow with .expand
I have a DAG with the following structure:
with DAG(
    # configs go here
) as dag:
    def get_intervals():
        # logic to retrieve some data from CloudSQL
        ...
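For comparison, a minimal self-contained sketch of the .expand() pattern: the upstream task returns a list (here a stand-in for the CloudSQL query), and the downstream task maps over it:

import pendulum
from airflow import DAG
from airflow.decorators import task

with DAG(
    dag_id="expand_demo",       # hypothetical
    start_date=pendulum.datetime(2024, 1, 1),
    schedule=None,
) as dag:

    @task
    def get_intervals():
        # stand-in for the CloudSQL lookup; .expand() needs a list at runtime
        return [("2024-01-01", "2024-01-02"), ("2024-01-02", "2024-01-03")]

    @task
    def process(interval):
        print(interval)

    process.expand(interval=get_intervals())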
1
vote
1
answer
51
views
Using Composer version 2.6.2 and Airflow version 2.5.3, getting AttributeError: 'bytes' object has no attribute 'encode' error in pubsub message [duplicate]
In the logs I can see the following error and the DAG fails:
[2024-03-27, 06:25:37 UTC] {logging_mixin.py:137} INFO - data:b'DNB-MS-MY,2024-03-26,Location' [2024-03-27, 06:25:37 UTC] {taskinstance.py:...
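The log shows the payload already arriving as bytes (data:b'DNB-MS-MY,...'), so calling .encode() on it raises exactly this AttributeError. A minimal defensive sketch:

def decode_payload(data):
    # the message data is already bytes; decode it rather than re-encoding
    return data.decode("utf-8") if isinstance(data, bytes) else str(data)

print(decode_payload(b"DNB-MS-MY,2024-03-26,Location"))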
0
votes
1
answer
1k
views
KubernetesPodOperator shows completed in Airflow UI but keeps running in the backend
I am working with KubernetesPodOperator for one of our developments.
In it, I have used this operator to generate files in GCP.
I tried the approach below and it was working fine too.
The ...
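When the pod lingers after Airflow marks the task done, the usual knob is pod cleanup on the operator; recent cncf-kubernetes provider versions expose this as on_finish_action (older ones use is_delete_operator_pod). A minimal sketch with a hypothetical image and command:

from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

gen_files = KubernetesPodOperator(
    task_id="generate_files",
    name="generate-files",
    namespace="default",
    image="python:3.11-slim",          # hypothetical
    cmds=["python", "-c", "print('done')"],
    get_logs=True,
    on_finish_action="delete_pod",     # remove the pod once the task finishes
)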
0
votes
1
answer
85
views
How to install Composer through a Cloud Function in GCP?
How do I install Composer through a Cloud Function in GCP? I did not find appropriate documentation or examples on the web. I want a Cloud Function to install or create a Composer environment.
I did not find or ...
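The google-cloud-orchestration-airflow client can create an environment from function code; creation is a long-running operation, so return the operation name rather than blocking the function on it. A minimal sketch with hypothetical project, region, and environment names:

from google.cloud.orchestration.airflow import service_v1

def create_composer_env(request):
    client = service_v1.EnvironmentsClient()
    parent = "projects/my-project/locations/us-central1"    # hypothetical
    environment = service_v1.Environment(
        name=f"{parent}/environments/my-composer",          # hypothetical
    )
    # returns immediately with a long-running operation handle
    operation = client.create_environment(parent=parent, environment=environment)
    return f"creation started: {operation.operation.name}"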
0
votes
1
answer
436
views
Poetry Install Subprocess PoetryException
I am facing the issue below with poetry install in Cloud Composer.
It fails at the installation of URLObject.
Looking for suggestions on resolving this, as my online search didn't turn up any results.
The ...
4
votes
0
answers
485
views
Removing Python 3.10+ type hints for Python 3.8 runtime
I have a codebase that was developed using Python 3.10 and was originally designed to run on that version of Python. However, we have migrated to Google Cloud Composer, which only supports ...
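One low-effort bridge: PEP 563's from __future__ import annotations makes annotations lazily evaluated strings, so 3.10-style hints like X | None parse fine on Python 3.8 as long as nothing evaluates them at runtime. A minimal sketch:

from __future__ import annotations  # PEP 563: annotations become lazy strings

def merge(a: dict[str, int], b: dict[str, int] | None = None) -> dict[str, int]:
    # these 3.10-style hints are never evaluated on 3.8 under the future import
    return {**a, **(b or {})}

Runtime uses of the hints (e.g. pydantic models or typing.get_type_hints) still break on 3.8, so those spots need typing.Optional / typing.Dict instead.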
0
votes
0
answers
59
views
Is there any way to get the latest S3 file via GCP Storage Transfer?
We're running Airflow/Composer on GCP and trying to get the latest S3 file with any Python operator. Currently we fetch the S3 file once a day, but several files can happen to be exported on the same day ...
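One approach that sidesteps Storage Transfer entirely: list the matching keys from a task with the Amazon provider's S3Hook and pick the newest by LastModified. A minimal sketch, assuming an aws_default connection:

from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def latest_s3_key(bucket: str, prefix: str) -> str:
    hook = S3Hook(aws_conn_id="aws_default")
    client = hook.get_conn()  # underlying boto3 S3 client
    resp = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
    # note: list_objects_v2 pages at 1000 keys; paginate for larger prefixes
    newest = max(resp.get("Contents", []), key=lambda obj: obj["LastModified"])
    return newest["Key"]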