About the company
Coherent Solutions is headquartered in the United States with 1500+ employees across several development centers in Eastern and Central Europe. We are a top software product engineering and consulting services company, offering custom digital solutions, web and mobile application development, DevOps and data services, and emerging technologies such as blockchain and IoT—with 1000+ completed projects going back to the company’s inception in 1995.
With the goal of growing our Moldovan R&D center to 150 professionals by next year, we are actively looking for a strong Big Data Developer.
You will help us further build our Big Data capability and have a strong impact on the development of the overall team in Moldova.
Our client is a leading software provider for converged TV and video advertising. It empowers marketers, agencies and media companies to make smarter advertising decisions and optimize their digital video and TV advertising by simplifying big data.
We are looking for a talented candidate to join our Data Team, which is responsible for the whole data lifecycle: from collecting and cleaning raw logs and third-party data (15 TB daily) to preparing a variety of reports for both internal and client use. Our team helps extract insights from data and makes targeting for advertising campaigns as accurate as possible. We are constantly working to find a balance between delivery speed, algorithm accuracy, and cloud infrastructure cost.
If you are passionate about distributed computing and large-scale data pipelines, our project is your arena: join a world-class team of engineers innovating leading-edge solutions.
Responsibilities
• Support, manage and develop large-scale ETL pipelines;
• Implement fault-tolerant and robust Spark streaming and batch jobs running on EMR;
• Optimize performance;
• Work with DevOps team to streamline deployments and cut infrastructure cost;
• Work on reporting and orchestration tools written in Java;
• Assess new technologies and how they fit the existing architecture and business needs;
• Develop and prove new architectural solutions;
• Work directly with the client to gather requirements for ongoing development;
• Troubleshoot application operational issues;
Requirements
• Strong knowledge of common algorithms and data structures;
• Strong communication skills;
• Experience developing robust and fault-tolerant ETL pipelines;
• Experience with Hadoop, Spark on YARN;
• 2+ years of experience with Java or Scala;
• Experience with Amazon Web Services;
• Proficiency in relational databases;
• Experience with Aurora, Snowflake, or Looker is preferred;
• Proficiency in NoSQL databases is preferred;
• Experience with Linux-based operating systems is preferred;
• Strong presentation skills;
• Persistence and the ability to convince others;
• Flexibility and openness to others’ ideas;
• Fluent English (Upper-Intermediate at least).
What’s in it for you?
• Attractive compensation and benefits package;
• Career coaching;
• Fantastic opportunities to develop your career in a fast-growing company.
Contact us directly at: email@example.com or firstname.lastname@example.org