Senior Java Developer (Data Platform)

Technology Stack:

Java, Scala, Apache Spark, Apache Airflow, Apache Hadoop (HDFS, YARN), Apache Flume, HP Vertica, MongoDB, AWS ecosystem, and Snowflake.

What you'll do:

As a member of the Data Platform Team, you will design, implement, and maintain robust data platforms that handle large-scale data processing and management. You’ll collaborate with cross-functional teams to model data pipelines, develop solutions for complex data challenges, and ensure our platform is built to scale efficiently across a wide variety of use cases and systems.

Your responsibilities will include:

  • Design, implement, and support data platform applications using modern technologies in a dynamic, fast-evolving environment;

  • Develop and optimize large-scale data pipelines, ensuring high performance, reliability, and scalability;

  • Collaborate with various teams to model complex data relationships and provide insights to support data-driven decisions;

  • Ensure data platform architecture and infrastructure remain resilient, efficient, and scalable to meet the company's growing data needs;

  • Promote a knowledge-sharing environment, mentoring peers and contributing to the team's success.

Skills & Requirements:

  • 5+ years of Java/Scala programming experience;

  • Strong grasp of Java concepts (collections, serialization, multi-threading, lambda expressions, JVM architecture, etc.);

  • Proficiency in ANSI SQL, including query syntax, performance tuning, and knowledge of OLAP vs. OLTP;

  • Ability to quickly learn new technologies and integrate them into existing infrastructure;

  • A deep understanding of architecture patterns in distributed systems, especially in data platform environments.

Preferred Qualifications:

  • Experience working with Hadoop ecosystem components and big data frameworks;

  • Hands-on experience with Hadoop, Spark, Kafka, or other data streaming technologies;

  • Familiarity with designing and implementing ETL (Extract, Transform, Load) processes;

  • Basic knowledge of Python and its use in data engineering tasks;

  • Experience with data visualization/analysis tools (e.g., Tableau) to drive insights;

  • Proficiency with Linux and cloud-based infrastructure (especially AWS);

  • Intermediate or higher proficiency in written and spoken English.

We offer:

  • A well-coordinated, professional team.

  • Cutting-edge technologies, interesting and challenging tasks, a dynamic project, and great opportunities for self-realization and for professional and career growth.

  • Additional Health and Life Insurance Package.

  • Employee Assistance Program.

  • 25 vacation days.

  • ReBenefit Platform Account with a monthly value of 400 BGN.

Contact us

Write to us at jobs@jettycloud.com or send a message to our recruiters.