This position is responsible for building Machine Learning-based tools and processes, such as recommendation engines and automated customer scoring systems, using the Apache Spark Machine Learning libraries to mine data from a data lake or other large sources of customer data. S/he should be able to work with big data, experiment with the Machine Learning algorithms best suited to a task, create prototypes, and productionize those prototypes. S/he should have hands-on experience with Scala and the Spark ML APIs to build reliable predictive models.
S/he should have an extensive background in data mining and statistical analysis. The candidate will be required to clarify business requirements for new initiatives; interface with users and with external and internal support teams to identify and help resolve production problems; prepare detailed specifications; and provide technical guidance to his/her development peers on their efforts. The candidate will devise and/or suggest modifications to procedures that help solve complex problems, taking into account system applications, computer equipment capacity, operating time, and desired results.
The candidate will report progress on initiatives to the Director of Master Data Management, providing updates on critical requests and the fulfillment of business and development needs. S/he will regularly provide guidance and direction to less experienced technologists and has direct responsibility for the quality of the unit's deliverables.
ESSENTIAL DUTIES AND RESPONSIBILITIES
- Build predictive models with the Apache Spark Machine Learning libraries by mining data from a data lake or other sources, using Spark/Scala and other languages such as Python on Hadoop
- Unit test applications
- Document the applications
- Provide 3rd tier support for production systems
- Mentor junior developers
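As a loose illustration of the customer-scoring work described above, here is a minimal sketch in plain Scala. The feature names and weights are hypothetical, invented purely for illustration; a production version would train and serve a model with Spark ML (e.g., a LogisticRegressionModel over DataFrames) rather than hand-coding the scoring step:

```scala
// Minimal, self-contained sketch of a logistic customer-scoring step.
// Feature names and weights are hypothetical placeholders; in practice
// these would come from a model trained with Spark ML on lake data.
object CustomerScoring {
  // Hypothetical model weights, assumed learned offline
  val weights: Map[String, Double] = Map(
    "monthsTenure"   -> 0.04,
    "supportTickets" -> -0.30,
    "monthlySpend"   -> 0.01
  )
  val intercept: Double = -1.0

  // Standard logistic function: maps any real number into (0, 1)
  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  // Score one customer: weighted sum of features plus intercept,
  // squashed to a probability-like value in (0, 1)
  def score(features: Map[String, Double]): Double = {
    val z = intercept + features.map { case (name, value) =>
      weights.getOrElse(name, 0.0) * value
    }.sum
    sigmoid(z)
  }
}
```

A caller would pass a customer's feature map, e.g. `CustomerScoring.score(Map("monthsTenure" -> 24.0, "supportTickets" -> 1.0, "monthlySpend" -> 80.0))`, and route the resulting score to downstream retention or sales workflows.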
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the level of financial impact this job has on the organization. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
⦁ Specify the measure and size of this job’s financial responsibility.
The employee’s efforts directly affect the company’s direct sales and retention initiatives; this position is responsible for the additional cost savings and opportunities the company realizes through predictive analysis of customer data, which provides insights to business teams and influences business strategies and the roadmap.
⦁ Describe any responsibility for design/development of programs/strategies/policies.
Design and development of predictive models and Big Data applications
⦁ Describe any responsibility for implementation and/or administration of programs/policies/procedures.
Directly involved in the implementation of software programs and predictive models
⦁ Describe the kinds of decisions made by the incumbent as well as the recommendations referred to the next level of management or others for approval.
Decisions Made by this Position:
1. Software design and implementation decisions
Recommendations to Others:
1. Recommendations on system/software architecture at all levels
⦁ List the most frequent internal and external contacts.
MAX OPS Team
Carrier OPS Team
Carrier Audit Team
Loss and Prevention Team
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
⦁ List the minimum formal education, if any, required to perform this job.
Bachelor’s Degree in a technical discipline (Computer Science, Mathematics, Statistics, etc.)
⦁ List the minimum experience required to perform this job.
7+ years of hands-on experience developing Big Data applications using Spark/Scala and building predictive models using the Spark Machine Learning APIs; experience working with the Hadoop ecosystem and related technologies (Cloudera, Hive, Impala, Spark/Scala/Spark ML, etc.)
⦁ List minimum job content knowledge required to perform this job.
Experience with wireless carriers preferred
⦁ Describe any physical demands required to perform the essential functions of this job.
While performing the duties of this job, the employee is regularly required to sit; use hands to finger, handle, or feel; and talk or hear. The employee is occasionally required to stand and walk. Specific vision abilities required by this job include close vision, and color vision.
⦁ Describe the work environment that this position encounters (please include any work hazards).
Normal office environment. The noise level in the work environment is usually moderate.
⦁ List specific jobs which could prepare an individual for this job.
Machine Learning Engineer