Lead Data Engineer
JPMorgan Chase
- Buenos Aires
- Permanent
- Part-time
- Lead the design, development, and optimization of complex SQL and PL/SQL solutions, with a strong focus on Oracle Database environments.
- Generate and maintain reusable reporting datasets to support multiple dashboards, reports, and business use cases, ensuring data quality, completeness, and consistency.
- Deliver data collection, storage, access, analytics, and machine learning platform solutions in a secure, stable, and scalable way.
- Implement and oversee database backup, recovery, and archiving strategies, ensuring data integrity and security.
- Collaborate closely with data analysts, product managers, and business stakeholders to gather requirements and translate them into efficient database views and curated data layers for reporting and API consumption.
- Evaluate and report on access control processes to determine the effectiveness of data asset security, working with minimal supervision.
- Design, implement, and maintain materialized views, indexing, and schema changes for performance optimization and data aggregation.
- Build and maintain clear KPI, metric, and dimension logic with consistent business meaning, simplifying and standardizing reporting logic for maintainability and reuse.
- Apply advanced data validation, reconciliation, and root-cause analysis to ensure reporting accuracy and trust.
- Develop and deploy ETL/ELT workflows using tools such as Pentaho Data Integration (Kettle) and modern data transformation platforms.
- Apply machine learning, generative AI (GenAI), Retrieval-Augmented Generation (RAG), and vector embeddings for advanced search, analytics, and insight-led reporting.
- Contribute to a team culture of diversity, opportunity, inclusion, and respect.
- Extensive hands-on experience with SQL and PL/SQL, ideally in Oracle Database environments.
- Proven ability to write, debug, and optimize complex SQL queries for large-scale, high-performance environments.
- Experience designing, implementing, and maintaining materialized views, packages, procedures, functions, and triggers in Oracle.
- Strong skills in database performance tuning, query optimization, and troubleshooting.
- Experience collaborating with business stakeholders to translate requirements into efficient database solutions.
- Experience and proficiency across the data lifecycle, including data collection, storage, access, and analytics.
- Experience with database backup, recovery, and archiving strategies.
- Strong focus on data quality, completeness, and consistency for reporting purposes.
- Ability to define, document, and maintain clear KPI, metric, and dimension logic.
- Strong analytical skills, including the ability to identify trends, anomalies, inconsistencies, and opportunities within data.
- Experience in data validation, reconciliation, and root-cause analysis.
- Familiarity with Pentaho Data Integration (Kettle) or similar ETL/ELT tools.
- Experience working with BI/reporting platforms and understanding how they consume curated datasets.
- Strong capability in Excel for validation, reconciliation, and ad hoc analysis.
- Familiarity with version control and structured development practices.
- Exposure to modern data transformation or analytical tools.
- Experience with Python or similar analytical tooling.
- Working experience with NoSQL databases.
- Familiarity with Generative AI (GenAI) concepts, Retrieval-Augmented Generation (RAG), and vector embeddings in databases.
- Solid knowledge of linear algebra, statistics, and geometric algorithms.
- Experience with software development best practices in enterprise environments.
- Excellent communication skills for cross-functional teamwork and documentation of database solutions.