In a recent LinkedIn post, Astronomer promoted a live, hands-on workshop on building ETL and ELT data pipelines with Apache Airflow. The session is led by in-house experts and is positioned as a way for participants to build pipelines directly in their browsers while learning best practices for reliability and data quality.
The post highlights topics such as SQL-based transformations, data quality gates, asset-aware scheduling, dynamic task mapping, and new Airflow 3 capabilities including human-in-the-loop controls, DAG versioning, and backfills. For investors, this emphasis on education and technical enablement suggests Astronomer is seeking to deepen engagement with data engineering teams, which could support product adoption, strengthen its ecosystem position in the Airflow community, and potentially drive longer-term customer retention and expansion.
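To give a sense of one of the topics named above, dynamic task mapping in Airflow lets a pipeline fan out over work discovered at runtime (via the TaskFlow `.expand()` method). The following is a minimal plain-Python sketch of that fan-out pattern, not an Airflow DAG; the function names and file list are illustrative stand-ins.

```python
# Sketch of the fan-out pattern that Airflow's dynamic task mapping
# automates: an upstream step produces a list at runtime, and a
# downstream step runs once per element. Names are illustrative.

def list_new_files():
    # Upstream task: discover work at runtime (hardcoded stand-in).
    return ["orders.csv", "users.csv", "events.csv"]

def process_file(filename):
    # Downstream task: handle a single file (stand-in logic).
    return f"{filename}: processed"

# Dynamic mapping expands the downstream task over the runtime list,
# roughly what `process_file.expand(filename=list_new_files())`
# expresses in an Airflow TaskFlow DAG.
results = [process_file(f) for f in list_new_files()]
```

In Airflow itself, the scheduler creates one mapped task instance per element, so the number of parallel tasks adjusts to the data each run.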

