DATA SCIENTIST-MACHINE LEARNING

at Gormalone LLP

Posted by Dhwani Rambhia
Bengaluru (Bangalore)
3 - 7 yrs
₹6L - ₹30L / yr
Full time
Skills
TensorFlow
Machine Learning (ML)
Artificial Intelligence (AI)
Data Science
Natural Language Processing (NLP)
Computer Vision
Data Analytics
EDA
ETL
recommendation algorithm
MLflow
Airflow
Cloud Technologies
MLOps

DATA SCIENTIST-MACHINE LEARNING                           

GormalOne LLP, Mumbai, India

 

Job Description

GormalOne is a social impact agri-tech enterprise focused on farmer-centric projects. Our vision is to make farming highly profitable for the smallest farmer, thereby ensuring India's “Nutrition security”. Our mission is driven by the use of advanced technology. Our technology will be highly user-friendly for the majority of farmers, who are digitally naive. We are looking for people who are keen to use their skills to transform farmers' lives. You will join a highly energized and competent team that is working on advanced global technologies such as OCR, facial recognition, and AI-led disease prediction, among others.

 

GormalOne is looking for a machine learning engineer to join its team. This collaborative yet dynamic role is suited to candidates who enjoy the challenge of building, testing, and deploying end-to-end ML pipelines and incorporating MLOps best practices across different technology stacks supporting a variety of use cases. We seek candidates who are curious not only about furthering their own knowledge of MLOps best practices through hands-on experience, but who can simultaneously help uplift the knowledge of their colleagues.

 

Location: Bangalore

 

Roles & Responsibilities

  • Individual contributor
  • Developing and maintaining end-to-end data science projects
  • Deploying scalable applications on different platforms
  • Analyzing and enhancing the efficiency of existing products

 

What are we looking for?

  • 3 to 5 years of experience as a Data Scientist
  • Skilled in data analysis, EDA, model building, and model analysis
  • Basic coding skills in Python
  • Decent knowledge of statistics
  • Creating pipelines for ETL and ML models (a minimal pipeline sketch follows this list)
  • Experience in the operationalization of ML models
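
By way of illustration only (none of this comes from the posting), an ETL-plus-model pipeline of the kind described above might be sketched in scikit-learn as follows; the CSV path, column names, and target are hypothetical placeholders.

```python
# Hypothetical sketch: a small ETL + model pipeline with scikit-learn.
# The CSV path and column names are illustrative placeholders.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("farm_records.csv")          # extract (placeholder file)
X, y = df.drop(columns=["target"]), df["target"]

numeric = ["milk_yield", "cattle_count"]      # hypothetical numeric features
categorical = ["district", "breed"]           # hypothetical categorical features

preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer()), ("scale", StandardScaler())]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

model = Pipeline([("prep", preprocess), ("clf", LogisticRegression(max_iter=1000))])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model.fit(X_train, y_train)                   # transform + train in one step
print("holdout accuracy:", model.score(X_test, y_test))
```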

 

 

Basic Qualifications

  • Tech/BE in Computer Science or Information Technology
  • Certification in AI, ML, or Data Science is preferred.
  • Masters/Ph.D. in a relevant field is preferred.

 

 

Preferred Requirements

  • Experience with tools and packages such as TensorFlow, MLflow, and Airflow (a tracking sketch follows this list)
  • Exposure to cloud technologies
  • Operationalization of ML models
  • Good understanding of and exposure to MLOps
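
As a loose sketch of the MLflow piece of that tooling (the experiment name, model, and metric are assumptions, not GormalOne's setup, and a local ./mlruns tracking store is assumed), a training run could log its parameters, metrics, and fitted model for later operationalization:

```python
# Hypothetical sketch: tracking a training run with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-disease-prediction")   # placeholder experiment name
with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    clf = RandomForestClassifier(**params, random_state=0).fit(X_tr, y_tr)
    mlflow.log_params(params)
    mlflow.log_metric("test_accuracy", clf.score(X_te, y_te))
    mlflow.sklearn.log_model(clf, artifact_path="model")   # logged artifact for later serving
```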

 

 

Kindly note: Salary shall be commensurate with qualifications and experience.

 

 

 

 

About Gormalone LLP

Founded: 2017
Type: Products & Services
Size: 20-100 employees
Stage: Bootstrapped

Similar jobs

AI/ML Developer

at a US-based firm offering permanent WFH

Agency job
via Jobdost
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
Remote only
2 - 8 yrs
₹8L - ₹18L / yr

This person MUST have:

  • B.E. in Computer Science or equivalent.
  • In-depth knowledge of machine learning algorithms and their applications, including practical experience with and theoretical understanding of algorithms for classification, regression, and clustering.
  • Hands-on experience in computer vision and deep learning projects that solve real-world vision tasks such as object detection, object tracking, instance segmentation, activity detection, depth estimation, optical flow, multi-view geometry, domain adaptation, etc. (a toy detection sketch follows this list).
  • Strong understanding of modern and traditional computer vision algorithms.
  • Experience with at least one of the deep learning frameworks/networks: PyTorch, TensorFlow, Darknet (YOLO v4/v5), U-Net, Mask R-CNN, EfficientDet, BERT, etc.
  • Proficiency with CNN architectures such as ResNet, VGG, UNet, MobileNet, pix2pix, and CycleGAN.
  • Experienced user of libraries such as OpenCV, scikit-learn, matplotlib, and pandas.
  • Ability to transform research articles into working solutions for real-world problems.
  • High proficiency in Python programming.
  • Familiarity with software development practices and pipelines (DevOps: Kubernetes, Docker containers, CI/CD tools).
  • Strong communication skills.
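
For flavour only, and not taken from the job post, a toy object-detection pass with a pretrained torchvision model might look like this; the image path is a placeholder and the weights enum assumes torchvision 0.13 or newer.

```python
# Hypothetical sketch: running a pretrained Faster R-CNN detector with torchvision.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

weights = torchvision.models.detection.FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=weights).eval()

img = convert_image_dtype(read_image("street.jpg"), torch.float)  # placeholder image path
with torch.no_grad():
    pred = model([img])[0]          # boxes, labels, and scores for one image

keep = pred["scores"] > 0.7         # simple confidence threshold
labels = [weights.meta["categories"][int(i)] for i in pred["labels"][keep]]
print(pred["boxes"][keep], labels)
```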


Experience:

  • Minimum 2 years of experience
  • Startup experience is a must. 

Location:

  • Remote developer

Timings:

  • 40 hours a week, with 4 hours a day overlapping with the client's time zone. Clients are typically in the California (PST) time zone.

Position:

  • Full time/Direct
  • We have great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO leaves per year, annual increments, Diwali bonus, spot bonuses, and other incentives.
  • We don't believe in locking people in with long notice periods. You will stay here because you love the company. Our notice period is only 15 days.
Job posted by
Mamatha A

Sr. Analyst - SAS Modelling

at Analytics and IT MNC

Agency job
via Questworkx
SAS
SQL
Linear regression
Logistic regression
Data Science
Bengaluru (Bangalore)
2 - 5 yrs
₹15L - ₹25L / yr
Responsibilities
- Design, development, and optimization of Anti-Money Laundering scenarios using statistical tools such as SAS
- Experience in extracting and manipulating large data sets
- Proficiency in analysing data using statistical techniques
- Experience in summarizing and visualizing analysis results
- Machine learning and anti-fraud modelling experience would be a plus
- Should have experience with SAS AML, SAS programming, R, Python, analytics, and BI tools like SAS VA
- Consultant must be ready for short-term and long-term travel within and outside India for project implementation

Must haves:
- SAS, SQL 
- Modelling techniques (linear regression, logistic regression); a minimal Python sketch follows this list
- Data Science, Excel, PowerPoint
- BFSI Domain Experience
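
Purely as an illustration of the modelling techniques listed above (the transaction data is synthetic and invented; a SAS shop would typically reach for PROC LOGISTIC instead), a logistic regression in Python could be sketched as:

```python
# Hypothetical sketch: a logistic-regression risk model on synthetic "transaction" data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "amount": rng.lognormal(3, 1, n),          # hypothetical transaction amount
    "cross_border": rng.integers(0, 2, n),     # hypothetical binary flag
})
risk = 0.002 * df["amount"] + 1.2 * df["cross_border"] - 3
df["suspicious"] = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)

X = sm.add_constant(df[["amount", "cross_border"]])
result = sm.Logit(df["suspicious"], X).fit()
print(result.summary())                        # coefficients, p-values, fit statistics
```
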
Job posted by
Jyoti Garach

Data Scientist

at a machine learning company based in New Delhi

Agency job
via Jobdost
Data Science
R Programming
Python
Machine Learning (ML)
Entity Framework
Natural Language Processing (NLP)
Computer Vision
NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹10L - ₹25L / yr

Job Responsibilities

  • Design machine learning systems
  • Research and implement appropriate ML algorithms and tools
  • Develop machine learning applications according to requirements
  • Select appropriate datasets and data representation methods
  • Run machine learning tests and experiments
  • Perform statistical analysis and fine-tuning using test results
  • Train and retrain systems when necessary

 

Requirements for the Job

 

  1. Bachelor's/Master's/PhD in Computer Science, Mathematics, Statistics, or an equivalent field from a tier-one college, with a minimum of 2 years of overall experience
  2. Minimum 1 year of experience working as a Data Scientist deploying ML at scale in production
  3. Experience in machine learning techniques (e.g. NLP, Computer Vision, BERT, LSTM, etc.) and frameworks (e.g. TensorFlow, PyTorch, scikit-learn, etc.)
  4. Working knowledge of deploying Python systems (using Flask, TensorFlow Serving); a minimal Flask sketch follows this list
  5. Previous experience in the following areas will be preferred: Natural Language Processing (NLP) using LSTM and BERT; chatbots or dialogue systems; machine translation; text comprehension; text summarization
  6. Computer Vision: deep neural networks/CNNs for object detection and image classification, transfer-learning pipelines, and object detection/instance segmentation (Mask R-CNN, YOLO, SSD)
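
As a minimal sketch of the Flask-style deployment mentioned in point 4 (the model file, feature layout, and endpoint are assumptions, not the client's actual service):

```python
# Hypothetical sketch: serving a pickled model behind a Flask JSON endpoint.
import pickle

import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)
with open("model.pkl", "rb") as fh:        # placeholder artifact produced at training time
    model = pickle.load(fh)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)           # expects {"features": [[...], ...]}
    features = np.asarray(payload["features"], dtype=float)
    preds = model.predict(features).tolist()
    return jsonify({"predictions": preds})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```
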
Job posted by
Sathish Kumar
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Analytics
Data management
Operations
databricks
snowflake
Remote only
3 - 6 yrs
₹8L - ₹10L / yr
Data Platform Operations
Remote Work, US shift

General Scope and Summary

The Data and Analytics Team sits in the Digital and Enterprise Capabilities Group and is responsible for driving the strategy, implementation, and delivery of Data, Analytics, and Automation capabilities across the Enterprise.
This global team will deliver “Next-Gen Value” by establishing the core Data and Analytics capabilities needed to effectively manage and exploit Data as an Enterprise Asset. Data Platform Operations will be responsible for implementing and supporting Enterprise Data Operations tools and capabilities which will enable teams to answer strategic and business questions through data.

Roles and Responsibilities

● Manage overall data operations, ensuring adherence to data quality metrics by establishing standard operating procedures and best practices/playbooks.
● Champion the advocacy and adoption of enterprise data assets for analytics through optimal operating models.
● Provide day-to-day ownership and project management of data operations activities, including data quality/data management support cases and other ad-hoc requests.
● Create standards and frameworks for CI/CD pipelines and DevOps.
● Collaborate cross-functionally to develop and implement data operations policies, balancing centralized control and standardization with decentralized speed and flexibility.
● Identify areas for improvement. Create procedures, teams, and policies to support near real-time clean data where applicable, or a batch-and-close process where applicable.
● Improve processes by tactically focusing on business outcomes. Drive prioritization based on business needs and strategy.
● Lead and control workflow operations by driving critical issues and discussions with partners to identify and implement improvements.
● Responsible for defining, measuring, monitoring, and reporting key SLA metrics in support of the team's vision.

Experience, Education and Specialized Knowledge and Skills

Must thrive working in a fast-paced, innovative environment while remaining flexible, proactive, resourceful, and efficient. Strong interpersonal skills, ability to understand
stakeholder pain points, ability to analyze complex issues to develop relevant and realistic solutions and recommendations. Demonstrated ability to translate strategy into action; excellent technical skills and an ability to communicate complex issues in a simple way and to orchestrate solutions to resolve issues and mitigate risks.
Job posted by
Rijooshri Saikia

Data Scientist

at Networking & Cybersecurity Solutions

Agency job
via Multi Recruit
Data Science
Data Scientist
R Programming
Python
Amazon Web Services (AWS)
Spark
Kafka
Bengaluru (Bangalore)
4 - 8 yrs
₹40L - ₹60L / yr
  • Research and develop statistical learning models for data analysis
  • Collaborate with product management and engineering departments to understand company needs and devise possible solutions
  • Keep up-to-date with latest technology trends
  • Communicate results and ideas to key decision makers
  • Implement new statistical or other mathematical methodologies as needed for specific models or analysis
  • Optimize joint development efforts through appropriate database use and project design

Qualifications/Requirements:

  • Master's or PhD in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field with a strong mathematical background
  • Excellent understanding of machine learning techniques and algorithms, including clustering, anomaly detection, optimization, neural networks, etc. (a toy anomaly-detection sketch follows this list)
  • 3+ years of experience building data science-driven solutions, including data collection, feature selection, model training, and post-deployment validation
  • Strong hands-on coding skills (preferably in Python) for processing large-scale data sets and developing machine learning models
  • Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
  • Good team worker with excellent written, verbal, and presentation communication skills
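
For illustration only (the flow features and counts are invented, not part of the requirements), anomaly detection of the kind mentioned above could start from an Isolation Forest:

```python
# Hypothetical sketch: unsupervised anomaly scoring with an Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Placeholder "network flow" features: bytes transferred and packet count.
normal = rng.normal(loc=[500, 40], scale=[100, 5], size=(2000, 2))
outliers = rng.normal(loc=[5000, 400], scale=[500, 50], size=(20, 2))
X = np.vstack([normal, outliers])

iso = IsolationForest(n_estimators=200, contamination=0.01, random_state=7).fit(X)
scores = iso.decision_function(X)     # lower scores = more anomalous
flags = iso.predict(X)                # -1 marks predicted anomalies
print("flagged:", int((flags == -1).sum()), "of", len(X))
```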

Desired Experience:

  • Experience with AWS, S3, Flink, Spark, Kafka, and Elasticsearch
  • Knowledge and experience with NLP technology
  • Previous work in a start-up environment
Job posted by
Ashwini Miniyar

Sr. Data Engineer ( a Fintech product company )

at Velocity.in

Founded 2019  •  Product  •  20-100 employees  •  Raised funding
Data engineering
Data Engineer
Big Data
Big Data Engineer
Python
Data Visualization
Data Warehouse (DWH)
Google Cloud Platform (GCP)
Data-flow analysis
Amazon Web Services (AWS)
PL/SQL
NOSQL Databases
PostgreSQL
ETL
data pipelining
Bengaluru (Bangalore)
4 - 8 yrs
₹20L - ₹35L / yr

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies, including NodeJS, Ruby on Rails, reactive programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, and more.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume the data from multiple sources

  • Writing a data transformation layer using DBT to transform millions of records into the data warehouse (a loose orchestration sketch follows this list of responsibilities).

  • Implement Data warehouse entities with common re-usable data model designs with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
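
A loose orchestration sketch, assuming Airflow 2.x and a dbt project available on the worker; the DAG id, schedule, and commands are placeholders rather than Velocity's actual pipeline:

```python
# Hypothetical sketch: an Airflow 2.x DAG that extracts source data and then runs dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_sources():
    # Placeholder: pull from upstream APIs/databases and land raw files or staging tables.
    print("extracting raw data from source systems")

with DAG(
    dag_id="elt_daily",                       # placeholder name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_sources", python_callable=extract_sources)
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --profiles-dir ~/.dbt")
    test = BashOperator(task_id="dbt_test", bash_command="dbt test --profiles-dir ~/.dbt")

    extract >> transform >> test              # standard ELT ordering
```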

 

What To Bring

  • 3+ years of software development experience; startup experience is a plus.

  • Past experience of working with Airflow and DBT is preferred

  • 2+ years of experience working in any backend programming language. 

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)

  • Experienced in formulating ideas, building proofs of concept (POCs), and converting them into production-ready projects

  • Experience building and deploying applications on on-premise and cloud-based (AWS or Google Cloud) infrastructure

  • Basic understanding of Kubernetes & docker is a must.

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English.

 

 

 

Job posted by
chinnapareddy S

Senior Product Analyst

at Jar

Founded 2021  •  Product  •  0-20 employees  •  Raised funding
Data Analytics
Tableau
SQL
Business Analysis
Bengaluru (Bangalore)
2 - 6 yrs
₹15L - ₹25L / yr

More about us: bit.ly/workatjarapp

 

Jar is seeking a talented Senior Product Analyst to join our team. If you are intellectually curious, eat/sleep/drink data, are committed to translating data into insights and insights into actionable work items, want new challenges daily, and want to impact the lives of hundreds of thousands of users, this is the role for you!

What You Will Do

  • Deliver insight and analysis using statistical tools, data visualization, and business use cases with the Product and Business teams
  • Understanding of product analytics and engagement platforms such as CleverTap, Amplitude, and Apxor
  • Conduct analysis to determine new project pilot settings, new features, user behaviour, and in-app behaviour
  • Build & maintain dashboards for tracking business performance and product adoption
  • Assist Product Managers and Business teams in creating data-backed decisions
  • Collaborate with Consumer Platform's Product and Business teams in identifying new avenues for growth and opportunities, and back their product delivery with experimentation
  • Build first cut Machine Learning models based on product requirements
  • Automate data extraction by creating de-normalized tables

What You Will Need

  • At least 2 years of work experience dealing with product analytics, data, and statistics
  • Expertise in SQL with experience using data visualization and dashboarding tools (e.g. Tableau, Metabase, Google Data Studio, CleverTap, Python)
  • Experience in machine learning technologies (e.g. forecasting, clustering, statistical significance testing, predictive modeling, and text mining)
  • Experience in delivering products as end-to-end data solutions (from data pipelining to analysis, presentation, and scalable adaptation)
  • A strong business sense with the ability to transform ambiguous business and product issues into well-scoped, impactful analysis
  • Strong ability to design and conduct simple experiments (a toy significance-test sketch follows this list)
  • A goal-oriented, critical-thinking mindset with the ability to work equally well within a team and independently with minimal supervision
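
As a toy example of the simple experiments mentioned above (the conversion counts are invented), a two-proportion z-test could check whether an A/B variant actually moved a metric:

```python
# Hypothetical sketch: two-proportion z-test for an A/B experiment on conversion rate.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 465]       # control, variant (placeholder counts)
exposures = [10000, 10000]     # users exposed to each arm (placeholder counts)

stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is statistically significant at the 5% level")
else:
    print("no significant difference detected")
```
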
Job posted by
Misbah Ashraf

Data Scientist

at Dunzo

Agency job
via zyoin
Data Science
Machine Learning (ML)
NumPy
R Programming
Python
Bengaluru (Bangalore)
8 - 12 yrs
₹50L - ₹90L / yr
  • B.Tech/M.Tech from a tier-1 institution
  • 8+ years of experience with machine learning techniques such as logistic regression, random forests, boosting, trees, neural networks, etc.
  • Demonstrated experience with Python and SQL, and proficiency in scikit-learn, pandas, NumPy, Keras, and TensorFlow/PyTorch
  • Experience working with Qlik Sense or Tableau is a plus
  • Experience working in a product company is a plus
Job posted by
Pratibha Yadav

NLP Engineer

at India's first Options Trading Analytics platform on mobile.

Natural Language Processing (NLP)
NLP
Keras
Scikit-Learn
Mumbai
1 - 3 yrs
₹3.5L - ₹5L / yr
Experience: 1-2 years

Location: Andheri East

Notice Period: Immediate to 15 days

Responsibilities:
1. Study and transform data science prototypes.
2. Design NLP applications.
3. Select appropriate annotated datasets for Supervised Learning methods.
4. Find and implement the right algorithms and tools for NLP tasks.
5. Develop NLP systems according to requirements.
6. Train the developed model and run evaluation experiments.
7. Perform statistical analysis of results and refine models.
8. Extend ML libraries and frameworks to apply in NLP tasks.
9. Use effective text representations to transform natural language into useful features.
10. Develop APIs to deliver deciphered results, optimized for time and memory.

Requirements:
1. Proven experience as an NLP Engineer of at least one year.
2. Understanding of NLP techniques for text representation, semantic extraction techniques, data structures, and modeling.
3. Ability to effectively design software architecture.
4. Deep understanding of text representation techniques (such as n-grams, bag of words, sentiment analysis, etc.), statistics, and classification algorithms; a toy featurization sketch follows this list.
5. More than a year of hands-on Python experience.
6. Ability to write robust and testable code.
7. Experience with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn).
8. Strong communication skills.
9. An analytical mind with problem-solving abilities.
10. Bachelor's degree in Computer Science, Mathematics, or Computational Linguistics.
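
As a toy illustration of the text-representation point in item 4 (the sample sentences and labels are invented, not from the posting), an n-gram TF-IDF featurizer feeding a linear classifier might look like:

```python
# Hypothetical sketch: n-gram TF-IDF features feeding a linear sentiment classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "the results call was very positive for the stock",
    "weak guidance, the quarter was a disappointment",
    "strong option premiums and bullish momentum",
    "heavy selling pressure and bearish sentiment",
]                                              # placeholder documents
labels = [1, 0, 1, 0]                          # 1 = positive, 0 = negative (invented)

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),       # unigram + bigram bag-of-words with TF-IDF weights
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, labels)
print(clf.predict(["bullish breakout expected"]))    # toy inference
```
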
Job posted by
Kavita Verma

Machine learning Developer

at Chariot Tech

Founded 2017  •  Product  •  20-100 employees  •  Raised funding
Machine Learning (ML)
Big Data
Data Science
NCR (Delhi | Gurgaon | Noida)
1 - 5 yrs
₹15L - ₹16L / yr
We are looking for a Machine Learning Developer who possesses a passion for machine technology and big data and will work with our next-generation Universal IoT platform.

Responsibilities:
• Design and build machines that learn, predict, and analyze data.
• Build and enhance tools to mine data at scale.
• Enable the integration of machine learning models in the Chariot IoT Platform.
• Ensure the scalability of machine learning analytics across millions of networked sensors.
• Work with other engineering teams to integrate our streaming, batch, or ad-hoc analysis algorithms into Chariot IoT's suite of applications.
• Develop generalizable APIs so other engineers can use our work without needing to be machine learning experts (a rough sketch follows below).
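
A rough sketch of what such a generalizable API could look like, assuming a pandas/scikit-learn stack; the class name, columns, and model choice are invented for illustration and are not Chariot's code:

```python
# Hypothetical sketch: a thin wrapper so non-ML engineers can train/score without ML internals.
from dataclasses import dataclass, field

import pandas as pd
from sklearn.ensemble import RandomForestRegressor

@dataclass
class SensorModel:
    """Trains on a labelled sensor DataFrame and scores new readings."""
    target: str
    model: RandomForestRegressor = field(default_factory=RandomForestRegressor)

    def train(self, df: pd.DataFrame) -> "SensorModel":
        self.features_ = [c for c in df.columns if c != self.target]
        self.model.fit(df[self.features_], df[self.target])
        return self

    def predict(self, readings: pd.DataFrame) -> list:
        return self.model.predict(readings[self.features_]).tolist()

# Usage (placeholder column names):
# SensorModel(target="temperature").train(history_df).predict(latest_df)
```
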
Job posted by
Raj Garg