Data Engineer
Graasai
Posted by Vineet A
3 - 7 yrs
₹10L - ₹30L / yr
Pune
Skills
PySpark
Data engineering
Big Data
Hadoop
Spark
SQL
Python
Django
Amazon Web Services (AWS)
AWS Lambda
Snowflake schema
Amazon Redshift
Keras
PyTorch

Graas uses predictive AI to turbo-charge growth for eCommerce businesses. We are “Growth-as-a-Service”. Graas integrates traditional data silos and applies a machine-learning AI engine that acts as an in-house data scientist, predicting trends and giving brands real-time insights and actionable recommendations. The platform can also turn insights into action by executing these recommendations across marketplace storefronts, brand.coms, social and conversational commerce, performance marketing, inventory management, warehousing, and last-mile logistics, all of which impact a brand’s bottom line and drive profitable growth.


Roles & Responsibilities:

  • Implement real-time and batch data pipelines for disparate data sources (a minimal pipeline sketch follows this list).
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Build and maintain an analytics layer that utilizes the underlying data to generate dashboards and provide actionable insights.
  • Identify improvement areas in the current data system and implement optimizations.
  • Work on specific areas of data governance including metadata management and data quality management.
  • Participate in discussions with Product Management and Business stakeholders to understand functional requirements and interact with other cross-functional teams as needed to develop, test, and release features.
  • Develop Proof-of-Concepts to validate new technology solutions or advancements.
  • Work in an Agile Scrum team and help with planning, scoping and creation of technical solutions for the new product capabilities, through to continuous delivery to production.
  • Work on building intelligent systems using various AI/ML algorithms. 
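For illustration only, a minimal PySpark batch-ETL sketch of the kind of pipeline described above; the bucket paths, schema and column names are hypothetical, not Graas's actual data:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical paths and schema -- illustrative only.
RAW_PATH = "s3a://example-bucket/raw/orders/"
CURATED_PATH = "s3a://example-bucket/curated/daily_order_metrics/"

spark = SparkSession.builder.appName("daily-order-metrics").getOrCreate()

# Extract: read raw order events (assumed to be JSON lines).
orders = spark.read.json(RAW_PATH)

# Transform: basic cleansing and a daily aggregate per marketplace.
daily_metrics = (
    orders
    .filter(F.col("order_amount").isNotNull())
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "marketplace")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("gross_revenue"),
    )
)

# Load: write partitioned Parquet for the analytics layer to query.
(
    daily_metrics.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet(CURATED_PATH)
)
```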

 

Desired Experience/Skill:

 

  • Must have worked on Analytics Applications involving Data Lakes, Data Warehouses and Reporting Implementations.
  • Experience with private and public cloud architectures and their trade-offs.
  • Ability to write robust code in Python and SQL for data processing. Experience with libraries such as Pandas is a must; knowledge of a framework such as Django or Flask is a plus.
  • Experience in implementing data processing pipelines using AWS services: Kinesis, Lambda, Redshift/Snowflake, RDS (an illustrative Lambda sketch follows this list).
  • Knowledge of Kafka and Redis is preferred.
  • Experience in designing and implementing real-time and batch pipelines; knowledge of Airflow is preferred.
  • Familiarity with machine learning frameworks (such as Keras or PyTorch) and libraries (such as scikit-learn).
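As a further illustration of the Kinesis/Lambda piece, a minimal handler that decodes records from a Kinesis trigger; the record contents are assumed to be JSON, and the warehouse-load step is only hinted at in a comment:

```python
import base64
import json


def handler(event, context):
    """Decode records delivered by a Kinesis trigger and collect them for loading.

    The event shape follows the standard Kinesis -> Lambda integration;
    any stream, bucket or table names involved downstream are hypothetical.
    """
    rows = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        rows.append(json.loads(payload))

    # In a real pipeline the rows would be staged to S3 and loaded into
    # Redshift/Snowflake (e.g. via the warehouse's COPY command); that part
    # is deployment-specific and omitted here.
    return {"processed": len(rows)}
```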

About Graasai

Founded: 2013
Stage: Bootstrapped
About
Graas uses predictive AI to turbo-charge growth for eCommerce businesses. We are “Growth-as-a-Service”.

Similar jobs

Gurugram, Bengaluru (Bangalore), Chennai
2 - 9 yrs
₹9L - ₹27L / yr
DevOps
Microsoft Windows Azure
GitLab
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+15 more
Greetings!!

We are looking for a technically driven "MLOps Engineer" for one of our premium clients.

COMPANY DESCRIPTION:
This Company is a global management consulting firm. We are the trusted advisor to the world's leading businesses, governments, and institutions. We work with leading organizations across the private, public and social sectors. Our scale, scope, and knowledge allow us to address


Key Skills
• Excellent hands-on expert knowledge of cloud platform infrastructure and administration (Azure/AWS/GCP), with strong knowledge of cloud services integration and cloud security
• Expertise setting up CI/CD processes and building and maintaining secure DevOps pipelines with at least 2 major DevOps stacks (e.g., Azure DevOps, GitLab, Argo)
• Experience with modern development methods and tooling: containers (e.g., Docker) and container orchestration (K8s), CI/CD tools (e.g., CircleCI, Jenkins, GitHub Actions, Azure DevOps), version control (Git, GitHub, GitLab), orchestration/DAG tools (e.g., Argo, Airflow, Kubeflow)
• Hands-on coding skills in Python 3 (e.g., APIs), including automated testing frameworks and libraries (e.g., pytest), Infrastructure as Code (e.g., Terraform) and Kubernetes artifacts (e.g., deployments, operators, Helm charts); a short pytest sketch follows this list
• Experience setting up at least one contemporary MLOps tool (e.g., experiment tracking, model governance, packaging, deployment, feature store)
• Practical knowledge delivering and maintaining production software such as APIs and cloud infrastructure
• Knowledge of SQL (intermediate level or higher preferred) and familiarity working with at least one common RDBMS (MySQL, Postgres, SQL Server, Oracle)
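To illustrate the automated-testing style referred to above, a minimal pytest sketch; `apply_discount` is a hypothetical helper, not part of any client codebase:

```python
# test_pricing.py -- minimal pytest example with a hypothetical helper function.
import pytest


def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


@pytest.mark.parametrize(
    "price, percent, expected",
    [(100.0, 0, 100.0), (100.0, 25, 75.0), (19.99, 10, 17.99)],
)
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected


def test_apply_discount_rejects_bad_percent():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```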
A Leading Edtech Company
Noida
3 - 6 yrs
₹12L - ₹15L / yr
MongoDB
MySQL
SQL
  • Sound knowledge of MongoDB as the primary skill
  • Hands-on experience with MySQL as a secondary skill
  • Experience with replication, sharding and scaling (a short pymongo sketch follows this list)
  • Design, install and maintain highly available systems (including monitoring, security, backup and performance tuning)
  • Implement secure database and server installations (privileged-access methodology / role-based access)
  • Help the application team with query writing, performance tuning and other day-to-day issues
  • Deploy automation techniques for day-to-day operations
  • Must possess good analytical and problem-solving skills
  • Must be willing to work flexible hours as needed
  • Scripting experience is a plus
  • Ability to work independently and as a member of a team
  • Good verbal and written communication skills
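A small pymongo sketch of the kind of routine task this role covers (connecting to a replica set, creating an index, checking replication status); host, database and collection names are hypothetical:

```python
from pymongo import ASCENDING, MongoClient

# Hypothetical replica-set connection string -- illustrative only.
client = MongoClient(
    "mongodb://db1.example.com:27017,db2.example.com:27017/?replicaSet=rs0"
)

# Index to support a common query pattern on an orders collection.
orders = client["shop"]["orders"]
orders.create_index([("customer_id", ASCENDING), ("created_at", ASCENDING)])

# Check replica-set health (requires admin privileges).
status = client.admin.command("replSetGetStatus")
for member in status["members"]:
    print(member["name"], member["stateStr"])
```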
An analytics consulting start-up
Remote only
7 - 12 yrs
₹10L - ₹15L / yr
Machine Learning (ML)
Data Science
MS-Office
Artificial Intelligence (AI)
Python
+2 more

 

  • A Data and MLOps Engineering lead who has a good understanding of modern data engineering frameworks, with a focus on Microsoft Azure and Azure Machine Learning, its development lifecycle and DevOps.
  • Aims to solve the problems encountered when turning data transformations and data science code into production Machine Learning systems. Some of these challenges include:
    • ML orchestration - how can I automate my ML workflows across multiple environments?
    • Scalability - how can I take advantage of the huge computational power available in the cloud?
    • Serving - how can I make my ML models available to make predictions reliably when needed?
    • Monitoring - how can I effectively monitor my ML system in production to ensure reliability? Not just system metrics, but also insight into how my models are performing over time.
    • Reuse - how can I promote reuse of the artefacts built, and establish templates and patterns?


The MLOps team works closely with the ML Engineering and DevOps teams. Rather than focusing on individual use cases, the team specialises in building the platforms and tools that help adoption of MLOps across the organisation, and in developing best practices and ways of working towards a state-of-the-art MLOps capability.

You will need a good understanding of AI/Machine Learning and of software engineering best practices such as Cloud Engineering, Infrastructure-as-Code, and CI/CD.

You should have excellent communication and consulting skills, while delivering innovative AI solutions on Azure.

Responsibilities will include:

  • Building state-of-the-art MLOps platforms and tooling to help adoption of MLOps across the organization
  • Designing cloud ML architectures and providing a roadmap for flexible patterns
  • Optimizing solutions for performance and scalability
  • Leading and driving the evolving best practices for MLOps
  • Helping to showcase expertise and leadership in this field

 

Tech stack

These are some of the tools and technologies that we use day to day. Key to success will be attitude and aptitude, with a vision to build the next big thing in the AI/ML field.

  • Python - including poetry for dependency management, pytest for automated testing and fastapi for building APIs (a minimal serving sketch follows this list)
  • Microsoft Azure Platform - primarily focused on Databricks, Azure ML
  • Containers
  • CI/CD – Azure DevOps
  • Strong programming skills in Python
  • Solid understanding of cloud concepts
  • Demonstrable interest in Machine Learning
  • Understanding of IaC and CI/CD concepts
  • Strong communication and presentation skills.
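A minimal model-serving sketch with FastAPI and pydantic, touching the "Serving" challenge above; the model, endpoint and feature names are placeholders rather than the team's actual artefacts:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="demo-scoring-api")


class PredictionRequest(BaseModel):
    feature_a: float
    feature_b: float


class PredictionResponse(BaseModel):
    score: float


def score(features: PredictionRequest) -> float:
    # Stand-in for a real model loaded from a registry (e.g. Azure ML / MLflow).
    return 0.3 * features.feature_a + 0.7 * features.feature_b


@app.post("/predict", response_model=PredictionResponse)
def predict(request: PredictionRequest) -> PredictionResponse:
    return PredictionResponse(score=score(request))
```

Locally this would be run with an ASGI server such as uvicorn (e.g. `uvicorn main:app`); in practice the model would come from a registry and be deployed behind the container and CI/CD tooling listed above.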


Remuneration: Best in the industry


Connect: https://www.linkedin.com/in/shweta-gupta-a361511

InfoCepts
Lalsaheb Bepari
Posted by Lalsaheb Bepari
Chennai, Pune, Nagpur
7 - 10 yrs
₹5L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more

Responsibilities:

 

• Designing Hive/HCatalog data models, including table definitions, file formats and compression techniques for structured & semi-structured data processing (a table-definition sketch follows this list)

• Implementing Spark-based ETL frameworks

• Implementing Big Data pipelines for data ingestion, storage, processing & consumption

• Modifying the Informatica-Teradata & Unix-based data pipelines

• Enhancing the Talend-Hive/Spark & Unix-based data pipelines

• Developing and deploying Scala/Python-based Spark jobs for ETL processing

• Applying strong SQL & DWH concepts
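As a sketch of the Hive table-definition work mentioned above, a PySpark snippet creating a partitioned, Parquet-backed table with Snappy compression; the database, table and columns are hypothetical:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-ddl-sketch")
    .enableHiveSupport()   # required for Hive/HCatalog table definitions
    .getOrCreate()
)

# Hypothetical database, table and columns -- illustrative only.
spark.sql("CREATE DATABASE IF NOT EXISTS sales")

spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.orders (
        order_id     STRING,
        customer_id  STRING,
        order_amount DECIMAL(12, 2)
    )
    PARTITIONED BY (order_date DATE)
    STORED AS PARQUET
    TBLPROPERTIES ('parquet.compression' = 'SNAPPY')
""")
```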

 

Preferred Background:

 

• Function as an integrator between business needs and technology, helping to create solutions that meet clients’ business needs

• Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives

• Understand the business’s EDW system and create high-level design and low-level implementation documents

• Understand the business’s Big Data Lake system and create high-level design and low-level implementation documents

• Design Big Data pipelines for data ingestion, storage, processing & consumption

IntraEdge
Poornima V
Posted by Poornima V
Remote only
4 - 16 yrs
₹11L - ₹27L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more

Company Name: Intraedge Technologies Ltd (https://intraedge.com/)

Type: Permanent, Full time

Location: Any

A Bachelor’s degree in computer science, computer engineering, another technical discipline, or equivalent work experience

  • 4+ years of software development experience
  • 4+ years of experience in programming languages and tools: Python, Spark, Scala, Hadoop, Hive
  • Demonstrated experience with Agile or other rapid application development methods
  • Demonstrated experience with object-oriented design and coding

Please mail your resume to poornimak [at the rate] intraedge [dot] com, along with your notice period (NP), how soon you can join, ECTC, availability for interview, and location.
Spica Systems
Priyanka Bhattacharya
Posted by Priyanka Bhattacharya
Kolkata
3 - 5 yrs
₹7L - ₹12L / yr
Python
Apache Spark
We are a Silicon Valley based start-up, established in 2019, recognized as experts in building products and providing R&D and software development services in a wide range of leading-edge technologies such as LTE, 5G, cloud services (public: AWS, Azure, GCP; private: OpenStack) and Kubernetes. We have built a highly scalable and secure 5G Packet Core Network, orchestrated by an ML-powered Kubernetes platform, which can be deployed in various multi-cloud modes along with a test tool. Headquartered in San Jose, California, we have our R&D centre in Sector V, Salt Lake, Kolkata.
 

Requirements:

  • Overall 3 to 5 years of experience in designing and implementing complex, large-scale software
  • Strong Python skills are a must
  • Experience in Apache Spark, Scala, Java and Delta Lake
  • Experience in designing and implementing templated ETL/ELT data pipelines
  • Expert-level experience in data pipeline orchestration using Apache Airflow for large-scale production deployments (a minimal DAG sketch follows this list)
  • Experience in visualizing data from various tasks in the data pipeline using Apache Zeppelin/Plotly or any other visualization library
  • Log management and log monitoring using ELK/Grafana
  • GitHub integration
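To illustrate the Airflow orchestration requirement, a minimal DAG sketch in the Airflow 2.x style; the task bodies, IDs and schedule are placeholders, not Spica's actual pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def transform():
    print("run Spark / Delta Lake transformations")


def load():
    print("publish curated tables for consumption")


with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Simple linear dependency: extract -> transform -> load.
    t_extract >> t_transform >> t_load
```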

 

Technology Stack: Apache Spark, Apache Airflow, Python, AWS, EC2, S3, Kubernetes, ELK, Grafana, Apache Arrow, Java

Fragma Data Systems
Sudarshini K
Posted by Sudarshini K
Bengaluru (Bangalore)
2 - 6 yrs
₹8L - ₹14L / yr
ETL
Big Data
Hadoop
PySpark
SQL
+4 more
Roles and Responsibilities:

• Responsible for developing and maintaining applications with PySpark 
• Contribute to the overall design and architecture of the application developed and deployed.
• Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
• Interact with business users to understand requirements and troubleshoot issues.
• Implement Projects based on functional specifications.

Must Have Skills:

• Good experience in PySpark, including DataFrame core functions and Spark SQL (see the sketch after this list)
• Good experience with SQL databases; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformation and aggregations
• Good at ETL architecture: business-rules processing and data extraction from the Data Lake into data streams for business consumption
• Good customer communication skills
• Good analytical skills
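A short illustrative snippet of DataFrame core functions alongside Spark SQL, the combination called out above; the data and column names are made up:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dataframe-and-sql-sketch").getOrCreate()

# Hypothetical transaction data -- illustrative only.
transactions = spark.createDataFrame(
    [("2024-01-01", "electronics", 1200.0),
     ("2024-01-01", "grocery", 85.5),
     ("2024-01-02", "electronics", 499.0)],
    ["txn_date", "category", "amount"],
)

# DataFrame API: aggregate spend per category.
by_category = (
    transactions.groupBy("category")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("txn_count"))
)

# Spark SQL: the same data exposed as a temporary view.
transactions.createOrReplaceTempView("transactions")
daily_totals = spark.sql("""
    SELECT txn_date, SUM(amount) AS total_amount
    FROM transactions
    GROUP BY txn_date
    ORDER BY txn_date
""")

by_category.show()
daily_totals.show()
```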
codeMantra
saranya v
Posted by saranya v
Chennai
14 - 18 yrs
₹20L - ₹25L / yr
Machine Learning (ML)
Data Science
R Programming
Python

ML ARCHITECT

 

Job Overview

We are looking for an ML Architect to help us discover the information hidden in vast amounts of data and make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, doing statistical analysis, and building high-quality prediction systems integrated with our products. You must have strong experience using a variety of data mining and data analysis methods, building and implementing models, using/creating algorithms and creating/running simulations, and must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes. The role also involves automating the identification of textual data, along with its properties and structure, from various types of documents.

 

Responsibilities

  • Selecting features, building and optimizing classifiers using machine learning techniques (a short sketch follows this list)
  • Data mining using state-of-the-art methods
  • Enhancing data collection procedures to include information that is relevant for building analytic systems
  • Processing, cleansing, and verifying the integrity of data used for analysis
  • Creating automated anomaly detection systems and constantly tracking their performance
  • Assembling large, complex data sets that meet functional and non-functional business requirements
  • Securing and managing GPU cluster resources for events when needed
  • Writing comprehensive internal feedback reports and finding opportunities for improvements
  • Managing GPU instances/machines to increase the performance and efficiency of ML/DL models
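A compact, illustrative sketch of "selecting features, building and optimizing classifiers" using scikit-learn on a bundled toy dataset; real work would of course use domain data and richer tuning:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Feature selection and classification chained into one pipeline.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif)),
    ("clf", LogisticRegression(max_iter=1000)),
])

# Optimize the number of selected features and the regularization strength.
search = GridSearchCV(
    pipeline,
    param_grid={"select__k": [5, 10, 20], "clf__C": [0.1, 1.0, 10.0]},
    cv=5,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```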

 

Skills and Qualifications

  • Strong hands-on experience in Python programming
  • Working experience with computer vision models: object detection and image classification
  • Good experience in feature extraction, feature selection techniques and transfer learning
  • Working experience in building deep learning models for NLP text classification and image analytics (CNN, RNN, LSTM); a minimal Keras sketch follows this list
  • Working experience with the AWS or GCP cloud platforms; exposure to fetching data from various sources
  • Good experience in exploratory data analysis, data visualisation and other data preprocessing techniques
  • Knowledge of at least one DL framework such as TensorFlow, PyTorch, Keras or Caffe
  • Good knowledge of statistics, data distributions, and supervised and unsupervised machine learning algorithms
  • Exposure to OpenCV; familiarity with GPUs + CUDA
  • Experience with NVIDIA software for cluster management and provisioning, such as nvsm, dcgm and DeepOps
  • We are looking for a candidate with 14+ years of experience who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with AWS cloud services: EC2, RDS, AWS SageMaker (added advantage)
  • Experience with object-oriented / functional scripting languages: Python, Java, C++, Scala, etc.
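For the deep-learning side, a minimal Keras CNN sketch for image classification; the architecture and hyperparameters are illustrative only, not a production model:

```python
from tensorflow.keras import layers, models

# Small CNN for 28x28 grayscale images (e.g. MNIST-sized inputs).
model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Training would follow with e.g. model.fit(x_train, y_train, epochs=5),
# typically on the GPU instances mentioned in the responsibilities above.
model.summary()
```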

 

 

Artivatic
Layak Singh
Posted by Layak Singh
Bengaluru (Bangalore)
3 - 10 yrs
₹6L - ₹12L / yr
Python
Machine Learning (ML)
Artificial Intelligence (AI)
Natural Language Processing (NLP)
TensorFlow
+3 more
Responsibilities:

  • Define the short-term tactics and long-term technology strategy.
  • Communicate that technical vision to technical and non-technical partners, customers and investors.
  • Lead the development of AI/ML-related products as it matures into lean, high-performing agile teams.
  • Scale the AI/ML teams by finding and hiring the right mix of on-shore and off-shore resources.
  • Work collaboratively with the business, partners, and customers to consistently deliver business value.
  • Own the vision and execution of developing and integrating AI & machine learning into all aspects of the platform.
  • Drive innovation through the use of technology and unique ways of applying it to business problems.

Experience and Qualifications:

  • Masters or Ph.D. in AI, computer science, ML, electrical engineering or related fields (statistics, applied math, computational neuroscience)
  • Relevant experience leading & building teams and establishing technical direction
  • A well-developed portfolio of past software development, composed of some mixture of professional work, open source contributions, and personal projects
  • Experience in leading and developing remote and distributed teams
  • Ability to think strategically and apply that through to innovative solutions
  • Experience with cloud infrastructure
  • Experience working with machine learning, artificial intelligence, and large datasets to drive insights and business value
  • Experience in agent architectures, deep learning, neural networks, computer vision and NLP
  • Experience with distributed computational frameworks (YARN, Spark, Hadoop)
  • Proficiency in Python and C++; familiarity with DL frameworks (e.g. neon, TensorFlow, Caffe, etc.)

Personal Attributes:

  • Excellent communication skills
  • Strong fit with the culture
  • Hands-on approach, self-motivated with a strong work ethic
  • Ability to learn quickly (technology, business models, target industries)
  • Creative and inspired

Superpowers we love:

  • Entrepreneurial spirit and a vibrant personality
  • Experience with the lean startup build-measure-learn cycle
  • Vision for AI
  • Extensive understanding of why things are done the way they are done in agile development
  • A passion for adding business value

Note: The selected candidate will be offered ESOPs too.

Employment Type: Full Time
Salary: 8-10 Lacs + ESOP
Function: Systems/Product Software
Experience: 3 - 10 Years
LatentView Analytics
Kannikanti madhuri
Posted by Kannikanti madhuri
Chennai
3 - 5 yrs
₹0L / yr
SAS
SQL server
Python
SOFA Statistics
Analytics
+11 more
Looking for Immediate Joiners

At LatentView, we would expect you to:

  • Independently handle delivery of analytics assignments
  • Mentor a team of 3 - 10 people and deliver to exceed client expectations
  • Co-ordinate with onsite LatentView consultants to ensure high-quality, on-time delivery
  • Take responsibility for technical skill-building within the organization (training, process definition, research of new tools and techniques, etc.)

You'll be a valuable addition to our team if you have:

  • 3 - 5 years of hands-on experience in delivering analytics solutions
  • Great analytical skills and a detail-oriented approach
  • Strong experience in R, SAS, Python, SQL, SPSS, Statistica, MATLAB or similar analytic tools (preferable)
  • Working knowledge of MS Excel, PowerPoint and data visualization tools like Tableau, etc.
  • Ability to adapt and thrive in the fast-paced environment that young companies operate in
  • A background in Statistics / Econometrics / Applied Math / Operations Research / MBA, or alternatively an engineering degree from a premier institution