O’Neal Steel, a century‑strong metal distributor, was preparing a major ERP transition just as its 15‑year‑old analytics stack began to show its age. DevIQ partnered on a phased modernization focused on outcomes: unify data under a single governed foundation on Azure Databricks, simplify and standardize integrations, and embed DataOps practices that accelerate delivery and reduce operational overhead for the in-house data analytics team – positioning ONS to continue improving data-driven decision‑making across the business.
The Client & The Challenge
O’Neal Steel (ONS) is one of the largest family‑owned metal distributors and service centers in the U.S., operating in 19 locations nationwide. Data‑driven by culture, ONS relied on a home‑grown, aging data platform while preparing a significant ERP transition.
Challenges & Pain Points
Aging, fragmented platform
A patchwork of overlapping tools required constant “eyes on glass,” with manual monitoring and brittle integrations across over 15 different source systems.
Multiple integration patterns
Half a dozen approaches for data ingestion and transformation created complexity, risk, and higher support overhead – precisely when ERP dependencies required reliability.
Governance and security gaps
Fine‑grained access was difficult to enforce consistently. Leaders wanted clearer lineage, role/row/column security, and a single governed catalog.
Timing with ERP transformation
Modernizing analytics in isolation from the ERP transformation would double the work. Aligning the data platform cutover with the ERP reduced rework – but raised coordination risk.
Consequences of Inaction
Without a unified and well-governed platform, ONS would face escalating support costs, prolonged decision-making timelines, and growing challenges from vendor deprecations. Failure to upgrade the analytics infrastructure would result in a complete loss of analytics data from the new ERP system, forcing stakeholders to make critical business decisions without access to essential information. In addition to this loss of data visibility, inaction would slow progress toward AI-enabled capabilities and create costly rework during future system migrations.
DevIQ's expertise in data architecture and business intelligence helped us plan, implement, and enable our teams to build a new, modern data analytics solution primed for future growth.
— Clayton Holderfield, Director of Data & Analytics, O’Neal Steel
The Partnership & The Proposition
DevIQ engaged in two phases – inception for strategy and platform selection, then enablement to stand up the Databricks Lakehouse and transfer operating patterns. Alongside the ONS analytics team, DevIQ co‑created the architecture, defined patterns, and guided spikes while ONS scaled the implementation.
Expert Guidance and a Flexible Process
- Strategy first, then Selection – Collaborative multi-team workshops translated business goals into evaluation criteria and vendor scorecards.
- Transparent Trade‑offs – Comparative studies evaluated Azure‑native options and Databricks vs. alternatives, aligning on a future‑proofed Lakehouse.
- Detailed Evaluation – Guided real-world testing of competing analytics tools provided confidence that the platform would exceed requirements with predictable costs.
- Advisory + Hands‑on Spikes – DevIQ built first‑of‑kind patterns while ONS replicated across pipelines at scale.
Empowering Internal Data & Analytics Teams
- Knowledge Transfer by Default – Continuous enablement replaced a big‑bang handoff.
- DataOps Discipline – Branching, testing, and promotion standards brought software‑grade rigor to data transformation code.
- Cultural Fit – A partner‑oriented rhythm – cadence, artifacts, and constant communication – kept stakeholders aligned.
The Solution
Strategy & Approach
DevIQ and ONS aligned on a pragmatic roadmap – validate strategy, select the platform, establish core patterns, then scale with the client team owning more each sprint. The plan synchronized analytics milestones with ERP to avoid rework and reduce risk.
Inception & Roadmap Alignment
- Co‑defined north‑star outcomes, selection criteria, and a phased rollout aligned to ERP cutovers.
- Mapped top data domains and early use cases to quick wins that unlocked adoption.
Enablement & Knowledge Transfer
- Delivered the first wave of ingestion and transformation patterns – then paired with ONS to replicate.
- Embedded DataOps practices – versioning, PR reviews, environment promotion, and rollback standards.
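
The promotion and rollback standards above can be pictured as a simple gate. The sketch below is illustrative only – the environment names, test names, and approval rule are hypothetical, not the team's actual pipeline code:

```python
# Illustrative sketch of a Dev -> Test -> Prod promotion gate in the spirit of
# the DataOps standards described above. All names and rules are hypothetical.

ENVIRONMENTS = ["dev", "test", "prod"]

def can_promote(current_env: str, target_env: str,
                test_results: dict[str, bool], approved: bool) -> bool:
    """Allow promotion only one step forward, with all tests green and,
    for prod, an explicit approval."""
    if current_env not in ENVIRONMENTS or target_env not in ENVIRONMENTS:
        raise ValueError("unknown environment")
    one_step = ENVIRONMENTS.index(target_env) == ENVIRONMENTS.index(current_env) + 1
    tests_pass = all(test_results.values())
    needs_approval = target_env == "prod"
    return one_step and tests_pass and (approved or not needs_approval)

# A release with a failing test cannot leave dev:
print(can_promote("dev", "test",
                  {"not_null_order_id": True, "unique_order_id": False},
                  approved=False))  # False
```

The point of such a gate is that promotion becomes a checked, auditable decision rather than a manual convention.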
ERP‑Aligned Delivery Governance
- Weekly working sessions, sprint reviews, and decision logs ensured traceability.
- Risk controls for reconciliation, dual‑run validation, and cutover timing reduced downstream surprises.
Technical Execution
- Languages: SQL, Python
- Components: Databricks Lakehouse, dbt, Unity Catalog, Power BI
- Integrations: Fivetran pre-built + custom connectors for ERP, SaaS & proprietary data sources
- Infrastructure: Azure Databricks, Microsoft Entra ID, Azure DevOps/Git
- Reusable Templates: Built reusable pipeline templates to accelerate new sources and enforce conventions.
- Release Pipelines: Implemented release pipelines with approval gates and documented rollback playbooks.
- Git Workflows: Established Git‑based workflows for SQL and dbt with branch policies, automated tests, and environment promotions.
- Infrastructure as Code: Implemented Terraform IaC for rapid and repeatable environment provisioning.
- Semantic Models: Centralized semantic models and certified datasets with a clear workspace strategy (shared vs. self‑service).
- Deployment Pipelines: Implemented deployment pipelines (Dev → Test → Prod) for datasets, pipelines, and reports with approval processes.
- Role-Based Reports: Adapted Power BI reports for curated, role‑based content distribution across business units.
- Adoption Monitoring: Monitored adoption with usage dashboards (active users, views, refresh status) to guide enablement.
- Unity Catalog: Implemented Unity Catalog for lineage, ownership, and fine‑grained permissions; modeled row/column security to business roles.
- Metadata Standards: Created table and column metadata to provide semantic context to the cleansed and transformed data.
- SSO Integration: Enabled SSO via Microsoft Entra ID with least‑privilege, role‑based access reviews.
- Cost Monitoring: Implemented native Databricks cost monitoring/tagging and AI-based predictive optimization to manage costs.
- Workload Optimization: Leveraged usage analytics to schedule workloads and tune refresh cadences, balancing speed and spend.
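
To make the reusable-template idea concrete, here is a minimal sketch of a config-driven pipeline plan. The bronze/silver naming and snake_case rule are assumptions for illustration, not ONS's actual conventions:

```python
# Illustrative sketch: a config-driven pipeline template that standardizes
# table names and steps for each new source. Conventions here are assumed.

import re

def build_pipeline_plan(source: dict) -> dict:
    """Turn a small source config into standardized table names and steps."""
    name = source["name"]
    if not re.fullmatch(r"[a-z][a-z0-9_]*", name):
        raise ValueError(f"source name must be snake_case: {name!r}")
    return {
        "bronze_table": f"bronze.{name}_raw",
        "silver_table": f"silver.{name}",
        "steps": ["ingest", "validate", "transform", "publish"],
        "schedule": source.get("schedule", "daily"),
    }

plan = build_pipeline_plan({"name": "erp_orders", "schedule": "hourly"})
print(plan["bronze_table"])  # bronze.erp_orders_raw
```

Encoding conventions in a template like this is what lets a team onboard a new source by writing a config entry rather than a new pipeline.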
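
The effect of mapping row security to business roles can be pictured with a small Python sketch. In practice Unity Catalog expresses row filters in SQL; the roles and regions below are invented for illustration:

```python
# Illustrative sketch of role-based row filtering, analogous in effect to a
# Unity Catalog row filter. The role-to-region mapping is hypothetical.

ROLE_REGIONS = {
    "regional_analyst_south": {"south"},
    "corporate_analyst": {"south", "north", "west"},
}

def visible_rows(rows: list[dict], role: str) -> list[dict]:
    """Return only the rows whose region the given role may see."""
    allowed = ROLE_REGIONS.get(role, set())
    return [r for r in rows if r["region"] in allowed]

orders = [{"id": 1, "region": "south"}, {"id": 2, "region": "north"}]
print(len(visible_rows(orders, "regional_analyst_south")))  # 1
```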
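
Tag-based cost monitoring ultimately reduces to grouping spend by workload tag so the heaviest jobs stand out. A toy sketch with made-up records:

```python
# Illustrative sketch: summing compute spend per workload tag, in the spirit
# of the Databricks tagging described above. The records are fabricated.

from collections import defaultdict

def spend_by_tag(usage_records: list[dict]) -> dict[str, float]:
    """Sum dollar cost per workload tag."""
    totals: defaultdict[str, float] = defaultdict(float)
    for rec in usage_records:
        totals[rec["tag"]] += rec["cost_usd"]
    return dict(totals)

usage = [
    {"tag": "erp_ingest", "cost_usd": 120.0},
    {"tag": "bi_refresh", "cost_usd": 45.5},
    {"tag": "erp_ingest", "cost_usd": 80.0},
]
print(spend_by_tag(usage)["erp_ingest"])  # 200.0
```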
The Results
Built and Shipped
O'Neal Steel now operates on a unified, Azure‑native data foundation with governed access and repeatable pipelines. The analytics team ships with confidence – moving from manual babysitting to observable workflows – and is beginning to explore advanced analytics/models such as customer segmentation and supply‑chain optimization.
- One Governed Platform – Disparate sources centralized into a Databricks Lakehouse governed with Unity Catalog.
- Integration Simplified – Consolidated from 5 ingestion approaches to 1–2 standard patterns across 15 sources.
- Faster, Safer Delivery – DataOps practices reduced rework and made releases auditable and reversible.
- AI‑ready Posture – Architecture, data quality checks, and governance establish a runway for ML and future GenAI use cases.
By the Numbers
- Consolidation – Reduced the number of data products and environments by 50%.
- Maintainability – Migrated more than 20,000 lines of transformation logic from multiple legacy platforms into a single unified codebase.
- Platform Efficiency – Optimized Databricks platform spend by 38% post‑go‑live through compute tagging and job consolidation.
- Operational Effort – Eliminated manual pipeline monitoring, saving 20 hours per month.
The Conclusion
O'Neal Steel treated data as a strategic asset and chose to modernize with intent – aligning analytics with ERP, standardizing on Azure, and embracing a governed Databricks Lakehouse. The outcome is control and clarity – fewer moving parts, stronger governance, and a foundation that scales with the business.
Looking ahead, with DataOps norms and secure access in place, ONS can expand into advanced analytics at its own pace. Customer segmentation and supply‑chain initiatives are already emerging, with data quality, lineage, and access controls ready to support future ML and, when the time is right, GenAI.