28 feb
Scala or Java Data Engineer (Senior/Lead) Id28611
Company: AgileEngine
What you will do:
1. Design, develop, maintain, and enhance highly scalable data engineering solutions leveraging AWS services.
2. Design, build, document, and implement scalable pipelines with a clear focus on data quality and reliability.
3. Ingest and transform structured, semi-structured, and unstructured data from multiple sources.
4. Build an enterprise-level ETL/ELT solution.
5. Innovate and build proprietary algorithms to tackle complex problems involving interesting data challenges.
6. Execute and continually optimize new customer data ingestion and model implementation processes.
7. Integrate business knowledge with technical functionalities.
8. Develop solutions at the intersection of data and ML.
9. Monitor workflow performance and reliability, and ensure SLA targets are met.
10. Automate existing code and processes using scripting, CI/CD, infrastructure-as-code, and configuration management tools.
11. Work with AI on problems such as NLP, image analysis and featurization, and OCR labeling.
Must haves:
1. 5+ years of experience with Scala (preferred) OR Java for data engineering and ETL.
2. 5+ years of experience with data pipeline tools such as Spark.
3. 5+ years of experience working in the AWS ecosystem (preferred), or GCP.
4. High proficiency in SQL programming with relational databases - experience writing complex SQL queries is a must.
5. Experience applying best practices for cloud-provider AI services.
6. Ability to contribute in an agile, collaborative, and fast-paced environment.
7. Excellent problem-solving skills and the ability to think outside the box.
8. Upper-intermediate English level.
Nice to haves:
1. 5+ years of experience with DAG orchestration and workflow management tools like Airflow or AWS Step Functions.
2. 3+ years of experience using cloud-provider AI services.
3. 3+ years of experience with Kubernetes.
4. 3+ years of hands-on experience developing ETL solutions using RDS and warehouse solutions using AWS services (S3, IAM, Lambda, RDS, Redshift, Glue, SQS, EKS, ECR).
5. Experience working with distributed computing tools like Hive.
6. Experience with Git and CI/CD tools like Jenkins, GitLab CI/CD, or GitHub Actions.
7. Experience with containers/orchestration tools like Docker or Helm.
8. Experience in a fast-paced agile development environment.
9. AWS certifications (AWS Certified Solutions Architect, Developer, or DevOps).
10. Knowledge of commercial claims management systems.
Show the company your skills: fill out the form and add a personal touch to your cover letter; it will help the recruiter choose the right candidate.