G-360 | Data Engineer - EY Global Delivery Services

EY

  • Buenos Aires
  • Permanent
  • Full-time
  • 1 day ago
**CT Data Engineer**

EY is a global leader in assurance, tax, transaction and advisory services. Technology is at the heart of what we do and deliver at EY. Technology solutions are integrated in the client services we deliver and are key to our innovation as an organization.

Fueled by strategic investment in technology and innovation, Client Technology seeks to drive growth opportunities and solve complex business problems for our clients through building a robust platform for business and a powerful product engine that are vital to innovation at scale. As part of Client Technology, you’ll work with technologists and business experts, blending EY’s deep industry knowledge and innovative ideas with our platforms, capabilities, and technical expertise. As a catalyst for change and growth, you’ll be at the forefront of integrating emerging technologies, from AI to Data Analytics, into every corner of what we do at EY. That means more growth for you, exciting learning opportunities, career choices, and the chance to make a real impact.
- Leads the delivery of processes to extract, transform and load data from disparate sources into a form that is consumable by analytics processes, for projects with moderate complexity, using strong technical capabilities
- Designs, develops and produces data models of relatively high complexity, leveraging a sound understanding of data modeling standards to ensure high quality
- Builds networks with other functional teams across the business to help define and deliver business value, and may interface and communicate with program teams, management and stakeholders as required to deliver small to medium-sized projects

**Your key responsibilities include**
- Leading the production of high-quality data engineering deliverables, helping to ensure project timelines are met, and providing informal mentoring and training to junior members of the team
- Translating requirements, design and solution architecture deliverables into detailed design specifications
- Leading the delivery of data quality reviews including data cleansing where required to ensure integrity and quality
- Leading the delivery of data models, data storage models and data migration to manage data within the organization, for a small to medium-sized project
- Resolving escalated design and implementation issues with moderate to high complexity
- Analyzing the latest industry trends, such as cloud computing and distributed processing, and assessing the risks and benefits of their use in the business
- Developing working relationships with peers across other engineering teams and collaborating to develop leading data engineering solutions
- Driving adherence to the relevant data engineering and data modeling processes, procedures and standards

**Skills and attributes for success**
- **Batch Processing** - Capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period of time
- **Data Integration (Sourcing, Storage and Migration)** - Capability to design and implement models, capabilities and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.). This includes the data models, storage requirements and migration of data from one system to another
- **Data Quality, Profiling and Cleansing** - Capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate the data
- **Stream Systems** - Capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it’s produced, in any format, and at any quality (see the sketch after this list)
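
As a rough illustration of the stream-ingestion capability described in this list (not part of the role description itself), the sketch below uses Spark Structured Streaming to read a Kafka topic and land raw events for downstream batch processing. The broker address, topic name and storage paths are hypothetical, and the Kafka connector package is assumed to be available on the cluster.

```python
# Illustrative sketch only, not part of the job description. Reads a Kafka
# topic with Spark Structured Streaming and lands raw events as Parquet.
# Broker, topic and paths are hypothetical; the spark-sql-kafka connector
# is assumed to be on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-ingest-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
    .select(F.col("value").cast("string").alias("payload"), "timestamp")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/raw/events")           # hypothetical landing path
    .option("checkpointLocation", "/chk/events")  # hypothetical checkpoint
    .trigger(processingTime="1 minute")           # micro-batch once per minute
    .start()
)
query.awaitTermination()
```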

**Required Technical Skills**
- Experience designing and building data platforms integrating disparate data sources
- Knowledge of distributed computing
- Expertise in ETL, SQL
- Expertise working with MPP (massively parallel processing) solutions to handle massive amounts of data
- Expertise in Azure, Azure Databricks, Azure SQL and Synapse
- Expertise developing dataflows using NiFi/ADF and Databricks
- Advanced, hands-on experience implementing large-scale data warehouses and data lakes
- Advanced, hands-on experience in Spark architecture and implementation
- Experience working with distributed messaging systems such as Kafka
- Hands-on experience in Python, PySpark or R (see the sketch after this list)
- Knowledge of working with structured and unstructured data sources
- Expertise in writing complex SQL and procedures
- Experience creating ingestion workflows using Oozie or similar tools
- Knowledge of security measures such as HTTPS and Kerberos
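
To make the ETL, SQL and PySpark expectations in this list concrete, here is a minimal batch sketch (again illustrative rather than a prescribed solution). The source and target paths, the "orders" view and its columns are hypothetical.

```python
# Illustrative batch ETL sketch only. Paths, the "orders" view and its
# columns are hypothetical, not taken from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

# Extract: read a raw batch collected over a period of time.
raw = spark.read.parquet("/data/raw/orders")  # hypothetical source
raw.createOrReplaceTempView("orders")

# Transform: an aggregation expressed in SQL.
daily = spark.sql("""
    SELECT order_date, customer_id, SUM(amount) AS total_amount
    FROM orders
    WHERE amount IS NOT NULL
    GROUP BY order_date, customer_id
""")

# Simple profiling check before loading: count rows that fail a defined rule.
bad_rows = daily.filter(F.col("total_amount") < 0).count()
if bad_rows > 0:
    print(f"Data quality warning: {bad_rows} rows with negative totals")

# Load: write the curated output for downstream consumption.
daily.write.mode("overwrite").parquet("/data/curated/daily_orders")  # hypothetical target
```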

**Beneficial Technical Skills**
- Knowledge of graph databases, preferably Neo4j, Cypher and Cosmos DB
- Graph data modelling
- Azure Data Lake Store and Databricks Delta Lake
- Spark ML
- Experience developing microservices

**Education**
- B.S. in Computer Science, Data Analytics, Data Science, Engineering, IT, or a related field preferred
