0 votes
0 answers
50 views

I’m running Apache Airflow inside a Docker container and trying to use the KubernetesPodOperator to run a simple “hello world” pod in an external Kubernetes cluster (not the same one where Airflow ...
Denis Victor
0 votes
0 answers
42 views

To execute compute-intensive tasks (such as unzipping files), I decided to use the KubernetesPodOperator, which runs Bash commands on a pod with a custom Docker image inside the composer-user-workload ...
Floyd
  • 1
0 votes
1 answer
180 views

Looking for help on using the EKSPodOperator. My setup is as follows: Airflow version 2.6.2, deployed with the official Helm chart v1.15.0; Kubernetes cluster: EKS 1.30; Executor: LocalExecutor ...
Fidel
0 votes
0 answers
69 views

I have an Airflow DAG with six tasks, all using GKEPodOperator. My DAG has an execution_timeout of 1 hour, but sometimes Task 5 takes longer to execute, causing failures. I want to set a dynamic ...
sammortals
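A minimal stdlib sketch of the per-task-timeout idea from the question above. In Airflow, `execution_timeout` can be passed to each operator individually rather than relying on one DAG-wide default; the task ids and durations here are hypothetical.

```python
from datetime import timedelta

# Hypothetical per-task overrides -- in Airflow you would pass execution_timeout
# to each operator individually instead of relying on one DAG-level default.
DEFAULT_TIMEOUT = timedelta(hours=1)
TIMEOUT_OVERRIDES = {"task_5": timedelta(hours=2)}

def timeout_for(task_id: str) -> timedelta:
    """Return the execution timeout for a task, falling back to the default."""
    return TIMEOUT_OVERRIDES.get(task_id, DEFAULT_TIMEOUT)

print(timeout_for("task_5"))  # 2:00:00
print(timeout_for("task_1"))  # 1:00:00
```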
0 votes
1 answer
77 views

I tried creating a decorator that would automatically assign a pod name based on the wrapped function, something like: def default_kubernetes_task( **task_kwargs: Any, ) -> Callable[[Callable[.....
wasabigeek
  • 3,252
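A sketch of the decorator-factory pattern the question above describes, with a stub standing in for Airflow's `@task.kubernetes` decorator (the stub, image tag, and function names are illustrative only).

```python
from typing import Any, Callable

# Stub standing in for Airflow's @task.kubernetes decorator, for illustration
# only -- the real decorator builds a KubernetesPodOperator-backed task.
def stub_kubernetes_task(func: Callable, **kwargs: Any) -> dict:
    return {"callable": func, **kwargs}

def default_kubernetes_task(**task_kwargs: Any) -> Callable:
    """Decorator factory that defaults the pod name to the function's name."""
    def decorator(func: Callable) -> dict:
        # Pod names must be DNS-compatible, so swap underscores for hyphens.
        task_kwargs.setdefault("name", func.__name__.replace("_", "-"))
        return stub_kubernetes_task(func, **task_kwargs)
    return decorator

@default_kubernetes_task(image="python:3.11")
def unzip_files():
    pass

print(unzip_files["name"])  # unzip-files
```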
0 votes
1 answer
238 views

I have recently started to use the KubernetesPodOperator, which provides a lot of scale to Airflow tasks (on 2.10) and has been working very well. However, I would also like to use kubernetes pods as sensor ...
user3124206
1 vote
1 answer
445 views

Trying to launch the GKEStartPodOperator to run a simple test against a GKE cluster in GCP. Using a service account with these roles: Kubernetes Engine Admin, Kubernetes Engine Developer, Kubernetes ...
Cir02
  • 127
0 votes
1 answer
229 views

I have an Airflow instance running in Azure Kubernetes Service, deployed with the official Helm chart, and I noticed the CeleryExecutor is used there. I have KubernetesPodOperator tasks, and wondering how ...
Aleksandr Semenov
2 votes
0 answers
589 views

It seems like the only way to pull xcom values in a KubernetesPodOperator task is by setting do_xcom_push=True and using a templated xcom_pull, i.e. "{{ ti.xcom_pull(task_ids='foo', key='bar') }}" ...
Chris
  • 21
0 votes
0 answers
88 views

I have an Airflow managed instance provisioned in ADF, and I'm trying to schedule a DAG. In this DAG, I'm trying to run a shell script which is present in the hostPath "/opt/airflow/dags" ...
Tad
  • 913
0 votes
1 answer
2k views

I am working with the KubernetesPodOperator for one of my developments. In it, I have used this to generate files in GCP. I tried the below approach and it was working fine too. The ...
Raj
  • 1
0 votes
1 answer
214 views

We have DAGs running in Astronomer 9.4.0 using Airflow version 2.7.2+astro.3. I have a DAG which has a KubernetesPodOperator task, which runs a Golang application. Inside the Golang application ...
Ashish Bhatt
1 vote
0 answers
43 views

I'm working on an Airflow project where I have a KubernetesPodOperator task that sometimes fails but remains in a running state after exhausting the defined number of retries. The status in the UI is Running, but no ...
David Shpilfoygel
1 vote
0 answers
203 views

In my KubernetesPodOperator I want to assign dynamic values to namespace, service_account_name and image_pull_secrets. I am using a Jinja template. Below is the code for my custom KubernetesPodOperator. ...
Niketa
  • 11
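A stdlib sketch of the mechanism behind the templating question above: Airflow only renders Jinja in attributes listed in an operator's `template_fields`, so a custom subclass typically extends that tuple. The stub base class and its field names stand in for KubernetesPodOperator here.

```python
# Minimal sketch with a stub base class (the real base would be
# KubernetesPodOperator): Airflow only renders Jinja in fields that the
# operator lists in template_fields, so a subclass must extend that tuple
# for any extra attributes it wants templated.
class StubPodOperator:
    template_fields = ("image", "cmds", "arguments")

class MyPodOperator(StubPodOperator):
    template_fields = StubPodOperator.template_fields + (
        "namespace",
        "service_account_name",
        "image_pull_secrets",
    )

print("namespace" in MyPodOperator.template_fields)  # True
```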
1 vote
0 answers
664 views

I am running Airflow 2.5.3 and I am tearing my hair out over this one: a task in my DAG (KubernetesPodOperator) was configured to request 3Gi memory and it runs a loop. At the bottom of the loop, I log ...
Hui
  • 127
0 votes
1 answer
239 views

I have a conf JSON that looks like this: { "customer": "customer1", "region": "south" } I want to generate a cmds: ["aws", "domain_name", "...
Jason LiLy
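A stdlib sketch of building a `cmds`-style list from a DAG run conf, as the question above asks. The `"aws"` executable and the flag naming are placeholders, since the original command is truncated.

```python
import json

# Sketch: flatten a DAG run conf into an argument list; "aws" and the
# --flag naming are placeholders, since the original command is truncated.
conf = json.loads('{"customer": "customer1", "region": "south"}')

def build_cmds(conf: dict) -> list:
    """Turn each conf key/value pair into a --key value argument pair."""
    cmds = ["aws"]
    for key, value in conf.items():
        cmds += ["--" + key, value]
    return cmds

print(build_cmds(conf))  # ['aws', '--customer', 'customer1', '--region', 'south']
```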
-1 votes
1 answer
2k views

I have a job that I want Airflow to do, but this job cannot run in parallel. This job is based on a Docker image that Airflow will trigger using KubernetesPodOperator with a specific command. I want ...
David Belhamou
0 votes
1 answer
1k views

I am running a kubernetes pod using the Airflow KubernetesPodOperator, then executing a jar file in the pod and writing the output to /airflow/xcom/return.json. When checking the task's XCom value, it ...
Sudiv
  • 27
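A stdlib sketch of the XCom sidecar contract the question above relies on: the sidecar parses `/airflow/xcom/return.json` as JSON, so the file must be valid JSON (e.g. written with `json.dump`, since `str(dict)` produces single quotes that fail to parse). A temp directory and the payload stand in for the real path and output here.

```python
import json
import os
import tempfile

# The xcom sidecar parses /airflow/xcom/return.json as JSON, so write it with
# json.dump. A temp directory stands in for the real path in this sketch.
xcom_dir = tempfile.mkdtemp()
xcom_path = os.path.join(xcom_dir, "return.json")

result = {"rows_processed": 123}  # hypothetical payload
with open(xcom_path, "w") as f:
    json.dump(result, f)

with open(xcom_path) as f:
    print(json.load(f))  # {'rows_processed': 123}
```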
0 votes
2 answers
891 views

In a task, I serialise a dict (converting the dict to a string) and push it to XCom: result[data] = json.dumps({"agents": ["[email protected]"], "houses": ["jane....
Alexander Montoya Ángel
0 votes
1 answer
1k views

I have a task defined using KubernetesPodOperator and I am passing value of the templated variable run_id as an environment variable. What baffles me is that when I use the Jinja template syntax, it ...
Hui
  • 127
0 votes
2 answers
5k views

I have an Airflow DAG that uses a KubernetesPodOperator to run a containerized task. I would like to be able to parameterize the resources (memory and CPU) allocated to the container, so that I can ...
BlablaAT
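A stdlib sketch of parameterizing container resources, as the question above asks. Plain dicts mirror the shape of `kubernetes.client.V1ResourceRequirements` (which recent provider versions accept via `container_resources`); the default values are arbitrary.

```python
# Plain-dict sketch mirroring the shape of kubernetes.client.V1ResourceRequirements;
# recent provider versions accept such an object via the container_resources
# parameter. The defaults here are arbitrary placeholders.
def build_resources(memory: str = "512Mi", cpu: str = "500m") -> dict:
    """Build a requests/limits mapping from two parameters."""
    return {
        "requests": {"memory": memory, "cpu": cpu},
        "limits": {"memory": memory, "cpu": cpu},
    }

print(build_resources(memory="2Gi", cpu="1"))
```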
1 vote
1 answer
4k views

I am running Airflow via MWAA on AWS and the worker nodes are running k8s. The pods are getting scheduled just fine, but when I try to use pod_template_file with KubernetesPodOperator, it gives me ...
Naxi
  • 2,264
1 vote
1 answer
2k views

I am using kubernetes version 1.25 (client and server), and I have deployed Airflow using the official Helm charts on the environment. I want the Airflow dags' kubernetes pod operator that has code to ...
SAGE
  • 51
1 vote
1 answer
1k views

Airflow's KubernetesPodOperator provides an init_containers parameter, with which you can specify kubernetes init_containers. However init_containers expects a list of kubernetes.models.V1Container, ...
ForeverWintr
  • 6,157
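A stdlib sketch of the shape `init_containers` expects, using plain dicts as stand-ins for the `kubernetes.client.V1Container` objects the real parameter takes; the container name, image, and command are placeholders.

```python
# Plain dicts standing in for kubernetes.client.V1Container objects, which is
# what the real init_containers parameter expects; name, image, and command
# are placeholders.
def make_init_container(name: str, image: str, command: list) -> dict:
    """Assemble one init-container spec as a plain mapping."""
    return {"name": name, "image": image, "command": command}

init_containers = [
    make_init_container("fetch-config", "busybox:1.36", ["sh", "-c", "echo ready"]),
]
print(init_containers[0]["name"])  # fetch-config
```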
1 vote
2 answers
3k views

I'm trying to pass an XCom value as arguments inside a KubernetesPodOperator, but the values are not populating. I checked the XCom list in the Airflow UI and I am able to see the key and value. This is my syntax: ...
Mark Antony
0 votes
1 answer
1k views

I am using Airflow 2.4.3 and running the KubernetesPodOperator. Below is the code and error. Please help me with creating a KubernetesPodOperator in Python. I have tried on both GCP and Azure. Also adding ...
Himanshu Malhotra
2 votes
0 answers
243 views

So I have two V2 Composers running in the same project, the only difference in these two is that in one of them I'm using the default subnet and default values/autogenerated values for cluster-ipv4-...
Anton
  • 601
1 vote
1 answer
1k views

When using the newest Airflow version (2.4.2), the on_failure_callback doesn't get triggered on KubernetesPodOperator tasks. The on_success_callback works just fine. For each KubernetesPodOperator task ...
mato777
  • 11
0 votes
1 answer
540 views

I want to run a Docker container inside a Kubernetes pod via Cloud Composer, so to do that I followed all the steps mentioned here and also verified the binding. I have used the default service account along with ...
sridar1992
0 votes
1 answer
1k views

I am pulling my dags from github using git-sync. The only changes I made to my values.yaml file were using KubernetesExecutor, configuring git-sync and logs. Is there something else missing, because ...
Tini
  • 179
1 vote
0 answers
496 views

I am trying to make the airflow KubernetesPodOperator work with minikube. But unfortunately, the operator does not find the kubernetes cluster. The dag returned to me the following error: ERROR - ...
Elie Ladias
1 vote
1 answer
2k views

I am using Composer version 2.0.0 with Airflow 2.1.4, and I have created a KubernetesPodOperator that is trying to access the Airflow connection stored in Google Secrets Manager. But it isn't able ...
tank
  • 525
1 vote
0 answers
754 views

I am trying to pass environment variables to my container using env_vars parameters of the KubernetesPodOperator. task_fetch = KubernetesPodOperator( task_id=FETCH, name=FETCH, ...
Elie Ladias
0 votes
1 answer
1k views

I am trying to run one Kubernetes pod job 6 times. Each time it will print a number and sleep for 5 seconds. However, it only runs once, and then it stops. Here is the full code for the dag file: from ...
user2771708
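A stdlib sketch of the loop idea from the question above. A common pitfall is reusing one task_id, which yields a single task; building six operators needs a unique task_id per iteration. Plain dicts stand in for KubernetesPodOperator here, and the id pattern is hypothetical.

```python
# Plain dicts stand in for KubernetesPodOperator. The key point: generate a
# unique task_id per iteration, or Airflow sees only one task.
def make_task(task_id: str, number: int) -> dict:
    return {"task_id": task_id, "cmds": ["bash", "-c", "echo %d; sleep 5" % number]}

tasks = [make_task("print_number_%d" % i, i) for i in range(6)]
print(len(tasks))  # 6
```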
0 votes
1 answer
163 views

I am trying to make Elyra use a custom Jinja template for my custom component. I have tried modifying Elyra's configuration file for these two items: c.ElyraApp.template_paths = ['/home/templates'] ...
el-aasi
  • 312
1 vote
1 answer
276 views

Trying to make my own component based on KubernetesPodOperator. I am able to define and add the component to the list of components but when trying to run it, I get: Operator 'KubernetesPodOperator' ...
el-aasi
  • 312
1 vote
0 answers
1k views

We are using hosted Airflow 1.10.2 in Google Composer 1.7.5 to launch jobs via the KubernetesPodOperator (tasks that will be run in a Kubernetes pod inside a worker cluster). There have been several ...
Savir
  • 18.6k
3 votes
1 answer
1k views

Airflow - unable to use a Jinja template for resources in a Kubernetes Pod Operator task. Able to use Jinja templates for environment variables and image, but not able to use them for resources to specify CPU and ...
Raja Sekhar
0 votes
1 answer
580 views

I am using Airflow 2.0 with the KubernetesPodOperator and want to run a command that uses, as a parameter, a file from inside the image run by the operator. This is what I used: KubernetesPodOperator( ...
salvob
  • 1,380
0 votes
1 answer
2k views

I'm writing an Airflow DAG using the KubernetesPodOperator. A Python process running in the container must open a file with sensitive data: with open('credentials/jira_credentials.json', 'r') as f: ...
balkon16
  • 1,467
3 votes
1 answer
3k views

I am currently using the KubernetesPodOperator to run a Pod on a Kubernetes cluster. I am getting the below error: kubernetes.client.rest.ApiException: (403) Reason: Forbidden HTTP response headers: ...
adan11
  • 807
0 votes
0 answers
745 views

We have a sort of self-serve Airflow cluster that mandates that all tasks are wrapped as KubernetesPodOperator tasks. With this setup, what are the possible options to test the dags in CI/CD ...
Somasundaram Sekar
0 votes
1 answer
1k views

I'm having a very weird Airflow bug. Problem: I have a dag that has a bash operator as step 1 and a KubernetesPodOperator as step 2. The issue is regarding the KubernetesPodOperator. Basically, I was ...
JaMo
  • 105
1 vote
2 answers
752 views

How do I docker run a private image from Container Registry in GCP using --privileged? Running this locally works fine: docker run -it --privileged --entrypoint /bin/bash ${GKE_APP} I followed this, but ...
user6308605
-1 votes
1 answer
303 views

We have airflow running on kubernetes. Below is my airflowLocalSettings: airflowLocalSettings: | from airflow.contrib.kubernetes.pod import Pod, Resources from airflow.configuration import ...
yahoo
  • 331
0 votes
1 answer
945 views

I am scheduling some tasks on airflow using KubernetesPodOperator; I want to deploy my pod with custom dns configuration: spec: dnsPolicy: "None" dnsConfig: nameservers: - 10.10....
Mohsen Eimany
3 votes
2 answers
9k views

I am trying to create and run a pod using Airflow kubernetes pod operator. The command below is tried and confirmed to be working and I am trying to replicate the same using the kubernetes pod ...
idk
  • 51
8 votes
5 answers
9k views

I have an Airflow DAG "example_ml.py" which has a task "train_ml_model", and this task is calling/running a python script "training.py". -Dags/example_ml.py -Dags/training....
veeresh patil
4 votes
0 answers
5k views

I am trying to pass secret variables to my KubernetesPodOperator in Airflow. Here is what I have done: create a secret.yaml file that looks like the following: apiVersion: v1 kind: Secret metadata: ...
Luis Blanche
6 votes
0 answers
1k views

I have a Kubernetes ConfigMap called test that contains a key foobar with some value. I would like to use that key's value in an environment variable. import datetime import os from airflow import ...
P-S
  • 4,054