One Python expression, 22+ SQL dialects, zero rewrites 🐍

Running queries across multiple databases often means rewriting the same logic for each backend's SQL dialect. A query that works in DuckDB may require syntax changes for PostgreSQL, and another rewrite for BigQuery. Ibis removes that friction by compiling Python expressions into each backend's native SQL. Swap the connection, and the same code runs across 22+ databases.

Key features:
• Write once, run on DuckDB, PostgreSQL, BigQuery, Snowflake, and 18+ more
• Lazy execution that builds and optimizes the query plan before sending it to the database
• Intuitive chaining syntax similar to Polars

🚀 Article comparing Ibis with other libraries: https://bit.ly/3MnsHs7

#Python #DataScience #SQL
📦 Link to the repository: https://github.com/ibis-project/ibis
Ibis uses sqlglot for formatting and transpiling the SQL code (write in one SQL dialect and transpile it into another). Additionally, sqlglot can be used to performance-tune SQL (see the sqlglot optimizer). https://sqlglot.com/sqlglot.html
This is a big unlock for teams dealing with multi-warehouse setups. The real value isn’t just less rewriting, it’s consistent logic across environments.
That kind of flexibility saves so much time, especially when juggling projects with different databases. It really lowers the barrier for scaling analytics.
I've never worked on a project that required running the same query across different databases
I wish they already had support for the Oracle dialect
And how not to love Python?
SQLAlchemy?
One abstraction layer that compiles to every backend means data teams can finally stop rewriting the same query five times.