Data Engineer/Technology Leader - ETL/Machine Learning (4-10 Yrs) Bangalore (Systems/Product Software) by The Modern Dimension
Job Summary
Salary - Negotiable
Job Type - Full-time, office-based
Employment Type - On company payroll
Posted 10 days ago by XpatJobs (registered since November 2017)
Criteria for the Job
Minimum Experience - Fresher
Who Can Apply - Both male and female candidates
Job Description
Job Title: Technology Leader
Experience: 4 years
Job Band:
Corresponding Designation: Team Lead (Data Engineer)
Career Stream: Technology
Reporting Position: Director of Engineering
No. of Direct Reportees: 2
Total Span of Control: Technology and Product
Location: Bangalore
Number of Working Days: 5
Shift Working: Regular

Key Job Purpose: To amplify the Data Science progress in the organisation. The role's span of influence covers Product and Technology.

Key Tasks/Responsibilities:
-Machine learning pipelines and ETL processes that operate at arbitrary scale (a hedged Spark sketch follows this responsibilities list).
-Reliable, stateless services that can be dynamically scaled and continuously deployed.
-Analytical jobs and services to measure model performance offline and in production.
-Tooling to help us productionize machine learning models faster.
-We expect you to own modules end to end and take complete ownership of the products you deliver.
-Lead from the front when it comes to delivering high-quality work products. Serve as a mentor to the team.
-Ability to work in an ambiguous environment without day-to-day guidance and direction; can handle the uncertainty of unsolved problems.
-Create and define performance metrics. Ideate, innovate, and hack through existing systems to improve performance.
-Develop a thorough understanding of the product specs and workflows. Break down requirements into smaller tasks and deliver via agile methodologies.
-Perform code reviews, set coding practices and guidelines within the team.
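To make the first responsibility above concrete, here is a minimal sketch of an ETL and feature-extraction step written as a functional Spark job in Scala, the stack this posting names. It is illustrative only: the input path, event and column names, and output location are assumptions, not details from the posting.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ClickstreamFeatureEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("clickstream-feature-etl")
      .master("local[*]") // local master for illustration; a real job would be submitted to a cluster
      .getOrCreate()
    import spark.implicits._

    // Hypothetical input: raw clickstream events as JSON
    val events = spark.read.json("s3://example-bucket/clickstream/raw/")

    // Declarative, side-effect-free transformations: filter, aggregate, derive features
    val userFeatures = events
      .filter($"eventType" === "click")
      .groupBy($"userId")
      .agg(
        count(lit(1)).as("clicks"),
        countDistinct($"sessionId").as("sessions")
      )
      .withColumn("clicksPerSession", $"clicks" / $"sessions")

    // Write features for downstream model training
    userFeatures.write.mode("overwrite").parquet("s3://example-bucket/features/user_clicks/")
    spark.stop()
  }
}

The same structure scales from a laptop to a cluster such as EMR by changing only the master setting and the submit configuration, which is what "operate at arbitrary scale" typically implies in practice.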
Performance Metrics:
-Model deployment
-Data Scientist iteration velocity
-Team management

Job Incumbent Requirements (Knowledge/Skill/Experience):
Education: BE/B.Tech from a Tier I institute
Training Requirement:

Functional Competencies (Mandatory):
-Functional programming in languages like Scala, Haskell, or F#. Experience with functional programming on Spark is a plus.
-Object-oriented programming.
-Data modeling and schema evolution tools.
-Parallelism, concurrency, and distributed systems concepts (knowledge of CQRS, Event Sourcing, and the CAP theorem).
-Understand what "turning the database inside out" means and its relevance to distributed commit logs and stream processing frameworks.
-Expert knowledge of SQL and data querying tools & libraries.
-Microservice architecture design. Prior experience with A/B testing machine learning services before deployment is a plus (see the hedged routing sketch after the requirements section below).
-Machine learning fundamentals.
-API design and evolution.
-Big data frameworks like Spark and Hadoop.
-Modern distributed databases like MongoDB, Cassandra, and Riak.
-Stream processing frameworks such as Kafka, Flink, Storm, and Samza.
-AWS systems like EC2, SQS, EMR, and Redshift.

Functional Competencies (Desirable):
-Machine learning libraries like scikit-learn and TensorFlow.
-Schema evolution tools such as Thrift, Avro, or Protocol Buffers (see the hedged Avro example below).
-Infrastructural tools like Docker, Zookeeper, Mesos, and Marathon, and the ability to deploy machine learning services reliably with these tools.

Foundational Competencies (Mandatory): Drive & Ownership, Empathy, Building Winning Teams, Decision Making.
Foundational Competencies (Desirable):

Key Relationships/Customer Touch Points: Internal - Data Sciences, New Business, GB.
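The requirements above list A/B testing of machine learning services before deployment as a plus. Below is a minimal, hedged Scala sketch of one common approach: deterministic hash-based bucketing of users between a control model and a candidate model. The model type, rollout fraction, and names are illustrative assumptions, not details from this posting.

import scala.util.hashing.MurmurHash3

object ModelAbRouter {
  // A "model" here is just a scoring function over a feature map (an assumption kept simple for brevity).
  type Model = Map[String, Double] => Double

  // Deterministically map each user to a bucket in [0, 1), so the same user
  // always sees the same model variant across requests.
  def bucket(userId: String): Double =
    ((MurmurHash3.stringHash(userId) & 0x7fffffff) % 10000) / 10000.0

  // Route a fixed fraction of traffic to the candidate model; the rest stays on control.
  def score(userId: String, features: Map[String, Double],
            rolloutFraction: Double, control: Model, candidate: Model): Double =
    if (bucket(userId) < rolloutFraction) candidate(features) else control(features)
}

In practice the chosen variant and its score would typically also be logged, so that the offline and production model metrics mentioned earlier can be compared per variant.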
Required Skills: English
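Schema evolution with Thrift, Avro, or Protocol Buffers appears in both the mandatory and desirable competencies above. As a hedged illustration, the Scala snippet below parses two versions of a hypothetical Avro record where the newer version adds an optional field with a default, the standard way to keep the change backward compatible; the record and field names are invented for the example.

import org.apache.avro.Schema

object AvroEvolutionSketch {
  // Version 1 of a hypothetical event record.
  val v1: Schema = new Schema.Parser().parse(
    """{"type": "record", "name": "UserEvent", "fields": [
      |  {"name": "userId", "type": "string"},
      |  {"name": "eventType", "type": "string"}
      |]}""".stripMargin)

  // Version 2 adds sessionId as a nullable field with a default, so data
  // written with v1 can still be read by a v2 reader (backward compatible).
  val v2: Schema = new Schema.Parser().parse(
    """{"type": "record", "name": "UserEvent", "fields": [
      |  {"name": "userId", "type": "string"},
      |  {"name": "eventType", "type": "string"},
      |  {"name": "sessionId", "type": ["null", "string"], "default": null}
      |]}""".stripMargin)

  def main(args: Array[String]): Unit =
    println(s"v1 fields: ${v1.getFields.size}, v2 fields: ${v2.getFields.size}")
}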
Company Profile
Posted by XpatJobs on behalf of The Modern Dimension
Contact XpatJobs
Address: Bangalore, Karnataka, India