2025, Oct 01 17:00
Airflow ImportError and Disappearing DAGs in Google Cloud Composer after BigQuery Operator Upgrade
Fix Airflow ImportError and missing DAGs in Google Cloud Composer after BigQuery operator upgrades: update imports, clear __pycache__, and restart the scheduler and webserver.
Airflow upgrades in Google Cloud Composer can surface in subtle ways. One moment you fix an ImportError, the next your entire set of DAGs seems to vanish from the UI. This guide walks through a concrete case around BigQuery operators, why an ImportError can cascade into missing DAGs, and how to resolve it cleanly without side effects.
What happened
Several Composer DAGs relied on BigQueryCreateEmptyTableOperator. After updating to use the newer BigQueryCreateTableOperator to resolve an ImportError, the ImportError disappeared but all DAGs stopped showing up in the Airflow UI. The DAG files were still present in the GCS /dags folder, and scheduler logs did not reveal obvious parsing failures.
Minimal example that triggers the issue
The problem starts with an import that no longer resolves in the environment:
from airflow import DAG
# This import fails when the installed google provider no longer ships the operator
from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateEmptyTableOperator
from datetime import datetime
with DAG(
    dag_id="dw_pipeline_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
):
    pass
Airflow raises ImportError: cannot import name 'BigQueryCreateEmptyTableOperator' from 'airflow.providers.google.cloud.operators.bigquery'.
Why the ImportError turned into “DAGs disappeared”
After replacing the deprecated import with BigQueryCreateTableOperator, the ImportError went away, yet the UI no longer listed the DAGs. The underlying reason is that Airflow’s DAG parsing can be thrown off by stale __pycache__ bytecode compiled against the old import path, and by scheduler and web server processes that keep serving cached state. When the import surface changes, those components may need a full refresh to rebuild their view of the DAGs; until then, the parser may not register them, which looks like they disappeared.
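A quick way to confirm whether the files actually parse is to load the DAGs folder into a DagBag yourself and inspect its import errors. This is a minimal sketch; /home/airflow/gcs/dags is the usual Composer mount for the GCS /dags folder, so adjust the path for your environment.
from airflow.models import DagBag

# Parse the DAGs folder directly and report what registered and what failed.
# The /home/airflow/gcs/dags path is an assumption; point this at your own DAGs path.
dag_bag = DagBag(dag_folder="/home/airflow/gcs/dags", include_examples=False)

print("Parsed DAG ids:", list(dag_bag.dags))
print("Import errors:", dag_bag.import_errors)  # maps file path to traceback text
If import_errors comes back empty while DAG ids are still missing from the UI, that points at stale parser state rather than a problem in the code itself.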
There is an additional angle worth considering when tracking down the initial ImportError. If provider versions differ across environments, one environment can pull the latest google provider release and break on imports while another pins a stable version and keeps working. Installing and pinning providers the same way in staging and production avoids these environment-specific ImportError surprises.
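To see which provider each environment actually resolved, print the installed versions from within that environment. This is a small sketch using only the standard library; the pinned version shown in the comment is an example, not a recommendation.
from importlib.metadata import version

# Compare these values between staging and production. Pinning the provider,
# for example apache-airflow-providers-google==10.26.0 (an example version),
# in both environments keeps the operator import surface identical.
print("google provider:", version("apache-airflow-providers-google"))
print("airflow core:", version("apache-airflow"))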
The fix
First, update the import to use the supported operator. Then, clear the __pycache__ directories and restart the Airflow components to force a clean DAG re-parse. This combination resolves the ImportError and prevents the “missing DAGs” symptom by refreshing the parser state.
from airflow import DAG
# Supported replacement for the deprecated BigQueryCreateEmptyTableOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateTableOperator
from datetime import datetime
with DAG(
    dag_id="dw_pipeline_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
):
    pass
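For a pipeline that actually creates a table, the task needs table details as well. The snippet below is a hedged sketch that replaces the pass above: it assumes your pinned provider’s BigQueryCreateTableOperator accepts dataset_id, table_id, and a table_resource dict, and the project, dataset, and schema values are placeholders, so verify the arguments against the documentation for your provider release.
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateTableOperator
from datetime import datetime

with DAG(
    dag_id="dw_pipeline_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
):
    # Placeholder project, dataset, table, and schema values; verify the
    # accepted arguments against the provider version pinned in your environment.
    BigQueryCreateTableOperator(
        task_id="create_staging_table",
        project_id="my-gcp-project",
        dataset_id="dw_staging",
        table_id="daily_events",
        table_resource={
            "schema": {
                "fields": [
                    {"name": "event_id", "type": "STRING", "mode": "REQUIRED"},
                    {"name": "event_ts", "type": "TIMESTAMP", "mode": "NULLABLE"},
                ]
            }
        },
    )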
After deploying the change, clear the __pycache__ directories in your DAGs path and restart the Airflow scheduler and web server. This ensures Composer drops stale bytecode, reloads modules with the updated import path, and re-registers your DAGs in the UI.
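Composer does not expose a single command for this step, so the bytecode cleanup can be scripted against the mounted DAGs folder. This is a minimal sketch assuming the standard /home/airflow/gcs/dags mount; the scheduler and web server restart itself happens on the Composer side rather than in this script.
import pathlib
import shutil

# Remove stale bytecode under the DAGs folder so the next parse recompiles
# against the updated import path. The mount path is an assumption; adjust it.
DAGS_ROOT = pathlib.Path("/home/airflow/gcs/dags")

for cache_dir in list(DAGS_ROOT.rglob("__pycache__")):
    if cache_dir.is_dir():
        shutil.rmtree(cache_dir)
        print(f"Removed {cache_dir}")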
Why this matters
Airflow’s dynamic DAG parsing relies on a consistent import graph and clean module state. When operators move or change names, it is not enough to fix the import line; the runtime must also discard cached artifacts. Otherwise, the UI may not reflect what is actually in the GCS /dags folder, leading to confusion and wasted debugging cycles. Consistent provider management across environments further reduces noise, so a change that works in one place doesn’t fail somewhere else with an ImportError.
Takeaways
When you replace BigQueryCreateEmptyTableOperator with BigQueryCreateTableOperator to fix an ImportError, follow through by clearing __pycache__ and restarting the Airflow scheduler and web server. This forces a fresh DAG parse and prevents the UI from “losing” your DAGs. Keep an eye on provider version consistency across environments so the same ImportError doesn’t reappear elsewhere. With these two habits, a clean operator replacement and a clean restart cycle, you get predictable, visible DAGs after code changes.
The article is based on a question from StackOverflow by Laura and an answer by Pauls Creationids Interior Des.