Senior Data Scientist – Job Description
The Senior Data Scientist is a creative problem solver who applies statistical/mathematical principles and modelling skills to uncover new insights that significantly and meaningfully impact business decisions and actions. They apply their data science expertise to identifying, defining, and executing state-of-the-art techniques for academic opportunities and business objectives in collaboration with other Analytics team members. The Senior Data Scientist will execute analyses & outputs spanning test design and measurement, predictive analytics, multivariate analysis, data/text mining, pattern recognition, artificial intelligence, and machine learning.
- Perform the full range of data science activities, including test design and measurement, predictive/advanced analytics, data mining, and analytic dashboards.
- Extract, manipulate, analyse & interpret data from various corporate data sources; develop advanced analytic solutions, derive key observations, findings, and insights, and formulate actionable recommendations.
- Generate clear, intuitive data science / advanced analytics outputs.
- Provide thought leadership and recommendations on business process improvements and analytic solutions to complex problems.
- Participate in best-practice sharing and communication platforms to advance the data science discipline.
- Coach and collaborate with other data scientists and data analysts.
- Present impact, insights, outcomes & recommendations to key business partners and stakeholders.
- Comply with established Service Level Agreements to ensure timely, high-quality deliverables with value-add recommendations and clearly articulated key findings and observations.
- Bachelor's Degree (B.A./B.S.) or Master's Degree (M.A./M.S.) in Computer Science, Statistics, Mathematics, Machine Learning, Physics, or a similar field
- 5+ years of experience in data science in a digitally advanced industry focusing on strategic initiatives, marketing and/or operations.
- Advanced knowledge of best-in-class analytic software tools and languages: Python, SQL, R, SAS, Tableau, Excel, PowerPoint.
- Expertise in statistical methods, statistical analysis, data visualization, and data mining techniques.
- Experience in test design, Design of Experiments, A/B testing, and measurement science
- Strong influencing skills to drive a robust testing agenda and data-driven decision making for process improvements
- Strong critical thinking skills to track down complex data and engineering issues, evaluate different algorithmic approaches, and analyse data to solve problems.
- Experience in partnering with IT, marketing operations & business operations to deploy predictive analytic solutions.
- Ability to translate and communicate complex analytical/statistical/mathematical concepts to a non-technical audience.
- Strong written and verbal communications skills, as well as presentation skills.
Dori AI enables enterprises with AI-powered video analytics to significantly increase human productivity and improve process compliance. We leverage a proprietary full-stack end-to-end computer vision and deep learning platform to rapidly build and deploy AI solutions for enterprises. The platform was built with enterprise considerations including time-to-value, time-to-market, security, and scalability across a range of use cases. Capture visual data across multiple sites, leverage AI + Computer Vision to gather key insights, and make decisions with actionable visual insights. Launch CV applications in a matter of weeks that are optimized for both cloud and edge deployments.
We are looking for a Python Developer to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.
In this role, you should be highly analytical with a knack for math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research.
Your goal will be to help our company analyze trends to make better decisions.
1. 2 to 8 years of relevant industry experience
2. Experience in linear algebra, statistics, and probability (e.g., distributions), as well as deep learning and machine learning
3. Strong mathematical and statistics background is a must
4. Experience in machine learning frameworks such as TensorFlow, Caffe, PyTorch, or MXNet
5. Strong industry experience in using design patterns, algorithms and data structures
6. Industry experience in using feature engineering, model performance tuning, and optimizing machine learning models
7. Hands-on development experience in Python and packages such as NumPy, scikit-learn, and Matplotlib.
- Minimum 3 years of technical experience in AI/ML (you can include internships & freelance work towards this)
- Excellent proficiency in Python (NumPy, pandas)
- Experience working with SQL/NoSQL databases
- Experience working with AWS, Docker
- Should have worked with large datasets
- Should be familiar with ML model building and deployment on AWS.
- Good communication skills and very good problem-solving skills
Perks & Benefits @Delivery Solutions:
- Permanent Remote work - (Work from anywhere)
- Broadband reimbursement
- Flexi work hours - (Login/Logout flexibility)
- 21 Paid leaves in a year (Jan to Dec) and 7 COVID leaves
- Two appraisal cycles in a year
- Encashment of unused leaves on gross salary
- RnR (Rewards & Recognition) - Amazon gift vouchers
- Employee Referral Bonus
- Technical & Soft skills training
- Sodexo meal card
- Surprise gifts on birthdays, service anniversaries, new babies, and weddings
- Annual trip
- 3+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, and Fluentd, and streaming databases like Druid
- Strong industry expertise with containerization technologies, including Kubernetes and Docker Compose
- 2+ years of industry experience in developing scalable data ingestion processes and ETLs
- Experience with cloud platform services such as AWS, Azure or GCP especially with EKS, Managed Kafka
- Experience with scripting languages. Python experience highly desirable.
- 2+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building cloud-native applications
- Experience in administering (including setting up, managing, monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, ELK Stack, Fluentd
- Experience in API development using Swagger
- Strong expertise with containerization technologies, including Kubernetes and Docker Compose
- Experience with cloud platform services such as AWS, Azure or GCP.
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration tools such as Jenkins
- Design and implement large-scale data processing pipelines using Kafka, Fluentd, and Druid
- Assist in DevOps operations
- Develop data ingestion processes and ETLs
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution.
- Mentor team members on best practices
- The Machine Learning & Deep Learning Software Engineer (expertise in Computer Vision) will be an early member of a growing team, with responsibility for designing and developing highly scalable machine learning solutions that impact many areas of our business.
- The individual in this role will help in the design and development of Neural Network (especially Convolutional Neural Network) & ML solutions based on our reference architecture, which is underpinned by big data & cloud technology, microservice architecture, and high-performing compute infrastructure.
- Typical daily activities include contributing to all phases of algorithm development, including ideation, prototyping, design, development, and production implementation.
- An ideal candidate will have a background in software engineering and data science with expertise in machine learning algorithms, statistical analysis tools, and distributed systems.
- Experience in building machine learning applications, and broad knowledge of machine learning APIs, tools, and open-source libraries
- Strong coding skills and fundamentals in data structures, predictive modeling, and big data concepts
- Experience in designing full stack ML solutions in a distributed computing environment
- Experience working with Python, TensorFlow, Keras, scikit-learn, pandas, NumPy, Azure, and AWS GPU instances
- Excellent communication skills across multiple levels of the organization
- Experience with image CNNs, image processing, Mask R-CNN, and Faster R-CNN is a must.
- Build and mentor the computer vision team at TransPacks
- Drive to productionize algorithms (to an industrial level) developed through hard-core research
- Own the design, development, testing, deployment, and craftsmanship of the team’s infrastructure and systems capable of handling massive amounts of requests with high reliability and scalability
- Leverage deep and broad technical expertise to mentor engineers and provide leadership on resolving complex technology issues
- Entrepreneurial and out-of-the-box thinking essential for a technology startup
- Guide the team in unit-testing code for robustness, including edge cases, usability, and general reliability
- B.Tech in Computer Science and Engineering/Electronics/Electrical Engineering, with demonstrated interest in Image Processing/Computer Vision (courses, projects, etc.) and 6-8 years of experience
- M.Tech in Computer Science and Engineering/Electronics/Electrical Engineering, with demonstrated interest in Image Processing/Computer Vision (thesis work) and 4-7 years of experience
- Ph.D. in Computer Science and Engineering/Electronics/Electrical Engineering, with demonstrated interest in Image Processing/Computer Vision (Ph.D. dissertation) and an inclination to work in industry to provide innovative solutions to practical problems
- In-depth understanding of image processing algorithms, pattern recognition methods, and rule-based classifiers
- Experience in feature extraction, object recognition and tracking, image registration, noise reduction, image calibration, and correction
- Ability to understand, optimize and debug imaging algorithms
- Understanding of and experience with the OpenCV library
- Fundamental understanding of mathematical techniques involved in ML and DL schemes (instance-based methods, boosting methods, PGMs, neural networks, etc.)
- Thorough understanding of state-of-the-art DL concepts (sequence modeling, attention, convolution, etc.), along with the knack to imagine new schemes that work for the given data.
- Understanding of engineering principles and a clear grasp of data structures and algorithms
- Experience in writing production-level code using either C++ or Java
- Experience with technologies/libraries such as pandas, NumPy, and SciPy
- Experience with TensorFlow and scikit-learn.
Job Description
We are looking for applicants who have a demonstrated research background in machine learning, a passion for independent research and technical problem-solving, and a proven ability to develop and implement ideas from research. The candidate will collaborate with researchers and engineers of multiple disciplines within Ideapoke, in particular with researchers in data collection and development teams, to develop advanced data analytics solutions. The role involves working with massive amounts of data collected from various sources.
- 4 to 5 years of academic or professional experience in Artificial Intelligence and Data Analytics, Machine Learning, Natural Language Processing/Text Mining, or a related field.
- Technical ability and hands-on expertise in Python, R, XML parsing, Big Data, NoSQL, and SQL
- Hands-on development/maintenance experience in Tableau: Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions using Tableau
- Reviewing and improving existing Tableau dashboards and data models/systems, and collaborating with teams to integrate new systems
- Provide support and expertise to the business community to assist with better utilization of Tableau
- Understand business requirements, conduct analysis and recommend solution options for intelligent dashboards in Tableau
- Experience with data Extraction, Transformation, and Load (ETL): knowledge of how to extract, transform, and load data
- Execute SQL data queries across multiple data sources in support of business intelligence reporting needs. Format query results / reports in various ways
- Participate in QA testing, liaising with other project team members and being responsive to the client's needs, all with an eye for detail in a fast-paced environment
- Performing and documenting data analysis, data validation, and data mapping/design
Key Performance Indicators (Indicate how performance will be measured: indicators, activities…)
KPIs will be outlined in detail in the goal sheet
Ideal Background (State the minimum and desirable education and experience level)
Minimum: Graduation, preferably in Science
· Minimum: 2-3 years’ relevant work experience in the field of reporting and data analytics using Tableau.
· Tableau certifications would be preferred
· Work experience in the regulated medical device / Pharmaceutical industry would be an added advantage, but not mandatory
Minimum: English (written and spoken)
About WheelsEye :
Logistics in India is a complex business - layered with multiple stakeholders, unorganized, primarily offline, and with many trivial yet deep-rooted problems. Though this industry contributes 14% to the GDP, its problems have gone unattended and ignored, until now.
WheelsEye is a logistics company, building a digital infrastructure around fleet owners. Currently, we offer solutions to empower truck fleet owners. Our proprietary software & hardware solutions help automate operations, secure fleet, save costs, improve on-time performance, and streamline their business.
- Work on a real Indian problem of scale: impact the lives of 5.5 crore fleet owners, drivers, and their families in a meaningful way
- Different from current market players: heavily focused on and built around truck owners
- Problem-solving and learning-oriented organization
- Audacious goals, high speed, and action orientation
- Opportunity to scale the organization across the country
- Opportunity to build and execute the culture
- Contribute to and become a part of the action plan for building the tech, finance, and service infrastructure for the logistics industry. It's tough!
- Bachelor’s degree with an additional 2-5 years of experience in the analytics domain
- Experience in articulating and translating business questions and using statistical techniques to arrive at an answer using available data
- Proficient with scripting and/or programming languages, e.g. Python, R (optional), and advanced SQL; advanced knowledge of data processing, database programming, and data analytics tools and techniques
- Extensive background in data mining, modelling, and statistical analysis; able to understand various data structures and common methods in data transformation, e.g. linear and logistic regression, clustering, decision trees, etc.
- Working knowledge of tools like Mixpanel, Metabase, Google Sheets, Google BigQuery, and Data Studio is preferred
- Ability to self-start and work in a self-directed manner in a fast-paced environment
If you are willing to work on solving real-world problems for truck owners, join us!