Contract - Senior Azure Data Engineer
Senior/Lead Azure Data Engineer (Contract — 3 Months)
Location: Remote (U.S.), core overlap with CST
Engagement: Contract only (1099/C2C) — no C2H for this role
Start: Immediate | Duration: 3 months (extension possible)
Comp: Competitive hourly rate (DOE)
About Smartbridge
Smartbridge simplifies business transformation across App Development, Automation, Data & Analytics, and Modernization. We build production systems for clients in Energy, Life Sciences, and Food & Beverage.
The Opportunity
Our project needs a senior/lead who can both design and build modern Azure data platforms—someone with strong coding in T-SQL and Python/PySpark, architectural judgment, and deep database chops (modeling, performance, reliability). Because this is contract-only, the fit must be tight and the impact immediate.
What You’ll Own (Architecture + Build)
- Architecture: Define the target-state Azure data architecture (ingestion, orchestration, storage zones, serving patterns), security/networking boundaries, cost/perf tradeoffs, and promotion strategy (Dev→Test→Prod).
- Pipelines & Code: Implement robust ELT/ETL with ADF/Synapse Pipelines (parameters, reusable templates, CI/CD). Hands-on in T-SQL and Python/PySpark for transformations, utilities, and tests.
- Database Excellence: Physical/semantic modeling, partitioning, columnstore strategies, statistics management, query plan analysis, index design, concurrency & transaction isolation, workload management.
- Observability & Reliability: SLA/SLO definitions, Azure Monitor / Log Analytics / App Insights dashboards and alerts; error handling, retries/backoff, idempotency, CDC and schema drift strategies.
- Security & Governance: RBAC, Key Vault, managed identities, private endpoints/VNet, data masking patterns; document data contracts and access patterns.
- Leadership: Code reviews, PR discipline, mentoring, and crisp documentation/runbooks for client handoff.
Must Have
- 8–12+ years in data engineering (recent Azure focus).
- Expert with ADF (linked services, datasets, IRs—including self-hosted), Synapse (SQL pools/serverless, pipelines), and ADLS/Blob.
- T-SQL: advanced query tuning, execution plan analysis, windowing, TVFs/stored procs, temp tables vs CTE tradeoffs, cardinality estimator know-how.
- Python/PySpark: production data transforms, packaging, and testing.
- CI/CD: Azure DevOps or GitHub Actions (multi-stage releases, approvals, infra + data deployments).
- Proven delivery of production-grade platforms at scale (TB-level data, strict SLAs).
Nice to Have
- Experience designing data validation procedures that verify the completeness of data transfers (e.g., record-count reconciliation) and trigger alerts on discrepancies.
- Experience working with large SQL tables (100 million+ rows).
- IaC (Bicep/Terraform) for data resources.
- Event-driven integration (Service Bus/Event Grid, CDC tooling).
- Certifications: DP-203 or AZ-204.