Bayer
Published
15/11/2019
Location
St. Louis, MO
Work Hours
9-17
Street Address
800 N Lindbergh Blvd, Creve Coeur

Description

At Bayer, you have the opportunity to be part of a culture where we value the passion of our employees to innovate and give them the power to change.

Your success will be driven by your demonstration of our LIFE values.
This role is located in St. Louis, MO and relocation will be available for qualified candidates.

Responsibilities

Work on the deployment, delivery, and expansion of breeding analytics algorithms;
Work on all aspects of the design, development, validation, scaling, and delivery of analytical solutions;
Work on data pipelining and data integration projects for all phenotype and genotype data, and standardize data access and storage patterns for all analytical models;
Collaborate with interdisciplinary scientists to gather requirements for data pipelines;
Collaborate with APC and P&E to streamline data storage and access patterns;
Optimize and tune algorithms and data workers to scale horizontally, and contribute to the development of new algorithms and capabilities that enable connected analytics across all pipelines;
Own and modify algorithms to expand to multiple crops and regions with minimal Data Scientist involvement;
Connect models, generate feedback, and help train models with different data sets;
Manage analytics activities based on the harvest schedule;
Work with breeding/regional leads to maintain the analytics harvest schedule and deliver all advancement-related analyses;
Act as the point of contact for domain- and data-related issues.

Qualifications

PMP Certification;
Java programming experience.

Experience Requirements

5+ years of relevant software development experience;
2+ years of network and database administration, and experience with R, Python, Java, and/or Scala;
Proven ability to plan, schedule and deliver quality software;
Experience working with large data sets;
Experience working with distributed computing tools (MapReduce, Hadoop, Hive, HBase, Scala, Spark, etc.);
Experience authoring workflows (using tools such as Apache Airflow);
Experience running production workloads in the cloud, and diagnosing and fixing problems;
Experience managing support.

Education Requirements

Minimum of a Master’s Degree in Computer Science, Electrical Engineering, or a closely related field.

Skills

Creativity in validating and optimizing analytical solutions.
