Moving 10 million rows from Oracle Fusion GL_BALANCES to Snowflake via Azure Data Factory
Extracting large volumes of data from Oracle Fusion into modern data warehouses like Snowflake, Azure SQL, or Oracle DB remains a persistent challenge for enterprises. Most rely on Oracle BI Cloud Connector (BICC) to generate file extracts, then use ETL tools to upload them—an error-prone and fragile process. A U.S.-based manufacturing giant shared that their extract pipeline fails at least twice a week, delaying data access for 50 analysts and costing over $250,000 in lost productivity weekly.
BI Connector eliminates this complexity by enabling a direct, reliable connection to Oracle Fusion. Paired with Microsoft Fabric or Azure Data Factory, teams can set up scalable, automated dataflows that move data quickly and consistently. We decided to stress-test BI Connector in a high-volume scenario—here’s what we found:
Test Setup
- Source Table: GL_BALANCES (Oracle Financials)
- Rows: 10,253,077
- Columns: 39
- Total Data Size: 3.776 GB
- Destination: Snowflake (X-Small warehouse, deliberately undersized to maximize difficulty)
- Integration: BI Connector + Azure Data Factory (Fabric can also be used)
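For a sense of scale, the figures above imply an average row of roughly 395 bytes. A quick back-of-the-envelope check (treating GB as GiB; the 100,000-row batch size is a hypothetical tuning parameter, not a BI Connector or ADF default):

```python
# Rough sizing math for the GL_BALANCES extract described above.
ROWS = 10_253_077
TOTAL_BYTES = 3.776 * 1024**3  # 3.776 GB, read as GiB

avg_row_bytes = TOTAL_BYTES / ROWS  # ~395 bytes per row

# Assumed copy-activity batch size (hypothetical; tune in ADF).
BATCH_ROWS = 100_000
batches = -(-ROWS // BATCH_ROWS)  # ceiling division

print(f"avg row size: {avg_row_bytes:.0f} bytes")
print(f"batches of {BATCH_ROWS:,} rows: {batches}")
```

Numbers like these help when choosing batch sizes and estimating copy durations before committing warehouse credits.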
Performance Metrics

Note: Most enterprises run at least a Medium warehouse in Snowflake, which offers roughly 3x faster load speeds than the X-Small used here.
Key Takeaway
Even under constrained conditions, BI Connector combined with Azure Data Factory delivered stable and consistent performance—proving itself a reliable solution for moving Oracle Fusion data to a data warehouse. Once the initial load is complete, teams can configure incremental refreshes to keep data continuously up to date for business users. For organizations standardizing on the Microsoft Data Platform, BI Connector offers a scalable, cost-effective, and production-ready pipeline for integrating Oracle Fusion data into their data warehouse environment.
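Incremental refreshes typically key off an audit column and a stored watermark. A minimal sketch of building such an extract query (the `LAST_UPDATE_DATE` column is an assumption based on standard Oracle audit columns—verify against your actual GL_BALANCES schema, and store the watermark in a control table rather than hard-coding it):

```python
from datetime import datetime, timezone

def incremental_extract_sql(table: str, watermark_col: str,
                            last_loaded: datetime) -> str:
    """Build a SELECT that pulls only rows changed since the last load.

    In an ADF pipeline this would feed the source query of a Copy
    activity; each successful run then advances the stored watermark.
    """
    ts = last_loaded.strftime("%Y-%m-%d %H:%M:%S")
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} > "
        f"TO_TIMESTAMP('{ts}', 'YYYY-MM-DD HH24:MI:SS')"
    )

# Example: pull only rows touched since the last successful load.
sql = incremental_extract_sql(
    "GL_BALANCES", "LAST_UPDATE_DATE",
    datetime(2024, 6, 1, tzinfo=timezone.utc))
print(sql)
```

This keeps each refresh proportional to the change volume rather than re-copying all ten million rows.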
Ditch the extract-and-wait cycle. Deploy BI Connector with Azure Data Factory or Fabric and turn Oracle Fusion into a reliable data source for your analytics. It’s time to make your data warehouse work smarter.