
CLI Commands

Basic Commands

airflow db init Initialize database (deprecated in 2.7+; use airflow db migrate)
airflow webserver -p 8080 Start web server
airflow scheduler Start scheduler
airflow users create --username admin --role Admin Create user
airflow dags list List DAGs
airflow dags trigger dag_id Trigger DAG
airflow tasks test dag_id task_id 2024-01-01 Test task
airflow dags backfill -s 2024-01-01 -e 2024-01-31 dag_id Backfill
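Putting the commands above together, a minimal local bootstrap might look like the following sketch (assumes an Airflow 2.x install in the current environment; the username, password, name, and email are placeholders):

```shell
# Create/upgrade the metadata DB (Airflow <2.7; use `airflow db migrate` on 2.7+)
airflow db init

# Create an admin user for the web UI (all values below are placeholders)
airflow users create \
  --username admin --password admin \
  --firstname Ada --lastname Admin \
  --role Admin --email admin@example.com

# Start the UI in the background and the scheduler in the foreground
airflow webserver -p 8080 &
airflow scheduler
```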

DAG Definition

Creating DAGs

Basic DAG
from airflow import DAG
from datetime import datetime, timedelta

default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "email_on_failure": False,
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="my_dag",
    default_args=default_args,
    description="My first DAG",
    schedule="0 0 * * *",  # Daily at midnight
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
) as dag:
    pass

TaskFlow API
from airflow.decorators import dag, task
from datetime import datetime

@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def my_dag():
    @task()
    def extract():
        return {"data": [1, 2, 3]}
    
    @task()
    def transform(data: dict):
        return [x * 2 for x in data["data"]]
    
    @task()
    def load(data: list):
        print(f"Loading: {data}")
    
    # Define dependencies
    data = extract()
    transformed = transform(data)
    load(transformed)

my_dag()
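Stripped of the Airflow decorators, the TaskFlow pipeline above is plain function composition: each task's return value feeds the next task. A minimal plain-Python sketch of the same data flow:

```python
# Plain-Python equivalent of the TaskFlow pipeline: extract -> transform -> load.
def extract():
    return {"data": [1, 2, 3]}

def transform(data: dict):
    return [x * 2 for x in data["data"]]

def load(data: list):
    return f"Loading: {data}"

result = load(transform(extract()))  # "Loading: [2, 4, 6]"
```

In the real DAG, Airflow inserts XCom pushes and pulls between these calls and builds the dependency graph from the call order.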

Operators

Common Operators

Python & Bash
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator

def my_function(**context):
    print(context["ds"])  # Logical date as YYYY-MM-DD
    return "result"

python_task = PythonOperator(
    task_id="python_task",
    python_callable=my_function,
)

bash_task = BashOperator(
    task_id="bash_task",
    bash_command="echo Hello {{ ds }}",
)

Branching
from airflow.operators.python import BranchPythonOperator
from airflow.operators.empty import EmptyOperator

def choose_branch(**context):
    if context["ds_nodash"] > "20240101":
        return "branch_a"
    return "branch_b"

branch = BranchPythonOperator(
    task_id="branch",
    python_callable=choose_branch,
)

branch_a = EmptyOperator(task_id="branch_a")
branch_b = EmptyOperator(task_id="branch_b")
join = EmptyOperator(task_id="join", trigger_rule="none_failed_min_one_success")

branch >> [branch_a, branch_b] >> join
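The string comparison in `choose_branch` works because zero-padded `YYYYMMDD` strings sort lexicographically in the same order as the dates they encode. A plain-Python sketch of that decision (the `pick` helper name is illustrative):

```python
def pick(ds_nodash: str) -> str:
    # Zero-padded YYYYMMDD strings compare lexicographically in
    # chronological order, so a plain string comparison is safe here.
    return "branch_a" if ds_nodash > "20240101" else "branch_b"

pick("20240315")  # "branch_a"
pick("20231231")  # "branch_b"
```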

Sensors

File & HTTP sensors
from airflow.sensors.filesystem import FileSensor
from airflow.providers.http.sensors.http import HttpSensor

file_sensor = FileSensor(
    task_id="wait_for_file",
    filepath="/path/to/file.csv",
    poke_interval=60,  # Check every 60 seconds
    timeout=3600,      # Timeout after 1 hour
    mode="poke",       # or "reschedule"
)

http_sensor = HttpSensor(
    task_id="wait_for_api",
    http_conn_id="api_conn",
    endpoint="/health",
    response_check=lambda response: response.status_code == 200,
)
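In `"poke"` mode the sensor holds its worker slot and re-runs its check every `poke_interval` seconds until success or `timeout`; `"reschedule"` mode frees the slot between checks. A minimal plain-Python sketch of the poke loop (not Airflow's implementation; `clock` and `sleep` are injectable only to make the sketch testable):

```python
import time

def poke_until(check, poke_interval=60, timeout=3600,
               clock=time.monotonic, sleep=time.sleep):
    """Re-run `check` every `poke_interval` seconds; succeed as soon as
    it returns True, fail once `timeout` seconds have elapsed."""
    deadline = clock() + timeout
    while clock() < deadline:
        if check():
            return True
        sleep(poke_interval)
    return False
```

For example, `poke_until(lambda: os.path.exists("/path/to/file.csv"))` mirrors what the `FileSensor` above does.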

Dependencies

Task Dependencies

Setting dependencies
# Using >> and <<
task1 >> task2 >> task3
task1 >> [task2, task3] >> task4

# Using set_upstream/downstream
task2.set_upstream(task1)
task2.set_downstream(task3)

# Cross-dependencies
from airflow.models.baseoperator import cross_downstream
cross_downstream([task1, task2], [task3, task4])

# Chain
from airflow.models.baseoperator import chain
chain(task1, [task2, task3], task4)
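`cross_downstream` wires every task in the first list to every task in the second. A plain-Python sketch of the edge set it produces (the `cross_edges` helper is illustrative, not an Airflow API):

```python
def cross_edges(upstream, downstream):
    # Every upstream task gains an edge to every downstream task,
    # mirroring what cross_downstream([...], [...]) wires up.
    return [(u, d) for u in upstream for d in downstream]

cross_edges(["task1", "task2"], ["task3", "task4"])
# [("task1", "task3"), ("task1", "task4"), ("task2", "task3"), ("task2", "task4")]
```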

XComs

XCom Usage

Push & Pull
def push_function(**context):
    # Explicit push under a custom key
    context["ti"].xcom_push(key="my_key", value="my_value")
    # Returning a value pushes it automatically under "return_value"
    return {"key": "value"}

def pull_function(**context):
    # Pull by task_id
    value = context["ti"].xcom_pull(task_ids="push_task")
    
    # Pull by key
    value = context["ti"].xcom_pull(task_ids="push_task", key="my_key")

# In templates
bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo {{ ti.xcom_pull(task_ids="push_task") }}',
)
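With the default XCom backend, pushed values are serialized (JSON by default in Airflow 2.x, with a few extra types handled by Airflow's own serializer), so as a rule of thumb a value should survive a JSON round-trip. A quick way to sanity-check a value before pushing it (the helper name is illustrative):

```python
import json

def xcom_roundtrip(value):
    # Approximates the default XCom backend's constraint: the value
    # must be JSON-serializable, or dumps() raises TypeError.
    return json.loads(json.dumps(value))

xcom_roundtrip({"key": "value"})  # fine
# xcom_roundtrip({1, 2, 3})      # raises TypeError: sets aren't JSON
```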

Connections

Database Connections

Using hooks
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def query_postgres(**context):
    hook = PostgresHook(postgres_conn_id="my_postgres")
    records = hook.get_records("SELECT * FROM table")
    return records

def upload_to_s3(**context):
    hook = S3Hook(aws_conn_id="my_aws")
    hook.load_string(
        string_data="data",
        key="my-key",
        bucket_name="my-bucket",
    )
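Besides the UI, connections like `my_postgres` can also be supplied as URIs via `AIRFLOW_CONN_<CONN_ID>` environment variables (e.g. `AIRFLOW_CONN_MY_POSTGRES`); the exact URI scheme depends on the provider. A sketch of how such a URI decomposes into the fields a hook uses:

```python
from urllib.parse import urlparse

# Illustrative URI; in practice this would come from
# an AIRFLOW_CONN_MY_POSTGRES environment variable.
uri = "postgresql://user:pass@localhost:5432/mydb"
parsed = urlparse(uri)
host, port = parsed.hostname, parsed.port        # "localhost", 5432
user, database = parsed.username, parsed.path.lstrip("/")  # "user", "mydb"
```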