Hybrid or fully remote | Full time

Role Overview
The Senior SQL / Data Engineer owns the data and analytics support queue, responsible for monitoring, troubleshooting, and remediating failures across cloud data platforms, ETL/ELT pipelines, and Power BI reporting. The role also handles SQL-level data fixes and record corrections that overflow from the application support queues. A minimum of 5 years of relevant experience is required per contractual staffing requirements.

Responsibilities
- Own the data and analytics support queue: pipeline failures, ETL/ELT errors, Power BI dashboard issues
- Monitor and remediate data pipelines across Azure and AWS environments
- Write and execute SQL scripts for data fixes, record corrections, and diagnostic queries
- Support Snowflake and Databricks: query optimization, job failure diagnosis, cluster management
- Maintain CI/CD pipelines supporting data operations (Tier 1 monitoring, Tier 2 remediation)
- Handle SQL-heavy overflow tickets from the application support queues (HUB-Report, data corrections)
- Coordinate with the client data team on schema changes, pipeline updates, and capacity planning
- Perform minor data platform enhancements as assigned (up to 80 engineering hours)
- Support vehicle telematics data infrastructure and related ingestion pipelines

Required Technical Skills
- Azure: Data Factory, Azure SQL, Synapse Analytics — 5+ years required per contractual staffing requirements
- AWS: S3, RDS, Glue — data pipeline and infrastructure support
- Snowflake: query writing, data loading, performance troubleshooting, Snowpipe
- Databricks: notebook execution, cluster management, job failure diagnosis, Delta Lake basics
- Power BI: dashboard connectivity, data source troubleshooting, refresh failures, DAX basics
- SQL: advanced — complex queries, stored procedures, performance tuning across SQL Server and cloud databases
- Pipeline tools: Apache Airflow or Azure Data Factory — DAG/pipeline monitoring and repair
- Python: scripting for data transformation, automation, and diagnostic tasks
- Version control: GitLab — CI/CD pipeline basics for data workflows

Experience
5+ years in data engineering or data platform support, mandatory per contractual staffing requirements. Hands-on experience with at least 3 of: Azure, AWS, Snowflake, Databricks. Production pipeline support experience strongly preferred.

On-Call Requirements
⚠ ON-CALL ROTATION — data incidents
- Covers P1/P2 data pipeline and platform failures outside business hours
- Shared rotation between Ukraine and Argentina Data Engineers, approximately every other week per person
- Activation expected when a critical pipeline failure impacts business operations or reporting delivery
- Response expected within 1 hour of activation
- On-call compensation applies per company policy

Language
- English — required (all tickets, escalations, client communication)
- Spanish

About Us
Established in 2011, Trinetix is a dynamic tech service provider supporting enterprise clients around the world. Headquartered in Nashville, Tennessee, we have a global team of over 1,000 professionals and delivery centers across Europe, the United States, and Argentina. We partner with leading brands, delivering innovative digital solutions across Fintech, Professional Services, Logistics, Healthcare, and Agriculture.

Our operations are driven by a strong business vision, a people-first culture, and a commitment to responsible growth. We actively give back to the community through various CSR activities and adhere to international principles for sustainable development and business ethics.

To learn more about how we collect, process, and store your personal data, please review our Privacy Notice: https://www.trinetix.com/corporate-policies/privacy-notice