Data Engineer
at DemandMatrix

Posted by Harwinder Singh
9 - 12 yrs
₹25L - ₹30L / yr
Remote only
Skills
Big Data
PySpark
Apache Hadoop
Spark
Python
Design patterns
Data Structures
Algorithms

Only a solid grounding in computer engineering, Unix, data structures and algorithms would enable you to meet this challenge.

7+ years of experience architecting, developing, releasing, and maintaining large-scale big data platforms on AWS or GCP

Understanding of how big data technologies and NoSQL stores like MongoDB, HBase/HDFS and Elasticsearch work together to power applications in analytics, AI and knowledge graphs

Understanding of how data processing models, data locality patterns, disk I/O, network I/O and shuffling affect large-scale text processing such as feature extraction and searching

Expertise with a variety of data processing systems, including streaming, event, and batch (Spark,  Hadoop/MapReduce)

5+ years proficiency in configuring and deploying applications on Linux-based systems

5+ years of experience with Spark, especially PySpark, for transforming large unstructured text data and creating highly optimized pipelines (a brief sketch follows these requirements)

Experience with RDBMS, ETL techniques and frameworks (Sqoop, Flume) and big data querying tools (Pig, Hive)

A stickler for world-class best practices, uncompromising on engineering quality; understands standards and reference architectures; steeped in the Unix philosophy, with an appreciation of big data design patterns, orthogonal code design and functional computation models
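For illustration, here is a minimal PySpark sketch of the kind of shuffle-aware text processing these requirements describe: tokenize unstructured documents, filter early on the map side, and keep a single wide aggregation. The S3 paths, the "body" column and the shuffle-partition setting are placeholders, not DemandMatrix's actual pipeline.

# Minimal PySpark sketch: tokenize unstructured text and count term frequencies
# while keeping shuffle volume small. Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("text-feature-extraction")
    .config("spark.sql.shuffle.partitions", "200")  # tune to cluster and data size
    .getOrCreate()
)

# hypothetical input: one JSON record per document with a "body" field
docs = spark.read.json("s3://example-bucket/raw-docs/*.json")

term_counts = (
    docs
    .select(F.explode(F.split(F.lower(F.col("body")), r"\W+")).alias("term"))
    .where(F.length("term") > 2)   # drop short tokens before the shuffle (map side)
    .groupBy("term")               # the only wide, shuffling stage
    .count()
)

term_counts.write.mode("overwrite").parquet("s3://example-bucket/term-counts/")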

About DemandMatrix

Founded: 2015
Stage: Profitable
About
DemandMatrix is an AI-powered Technographics and Intent Data provider that helps B2B marketing and sales teams identify and target the right accounts based on their propensity to buy into a particular technology.
Connect with the team
Sudarshana Mukherjee
Rutuja Pawar
Harwinder Singh

Similar jobs

Intuitive Technology Partners
Posted by shalu Jain
Remote only
9 - 20 yrs
Best in industry
Architecture
Presales
Postsales
Amazon Web Services (AWS)
databricks
+13 more

Intuitive Cloud (www.intuitive.cloud) is one of the fastest-growing top-tier cloud solutions and SDx engineering services companies, supporting 80+ global enterprise customers across the Americas, Europe and the Middle East.

Intuitive is a recognized professional and managed services partner for core superpowers in cloud (public/hybrid), security, GRC, DevSecOps, SRE, application modernization/containers/K8s-as-a-service and cloud application delivery.


Data Engineering:

  • 9+ years’ experience as a data engineer.
  • Must have 4+ years of experience implementing data engineering solutions with Databricks.
  • This is a hands-on role building data pipelines using Databricks; hands-on technical experience with Apache Spark is required.
  • Must have deep expertise in one of the programming languages used for data processing (Python, Scala). Experience with Python, PySpark, Hadoop, Hive and/or Spark for writing data pipelines and data processing layers.
  • Must have worked with relational databases like Snowflake. Good SQL experience for writing complex SQL transformations.
  • Performance tuning of Spark SQL running on S3/data lake/Delta Lake storage, and strong knowledge of Databricks and cluster configurations (see the sketch after this list).
  • Hands-on architectural experience.
  • Nice to have: Databricks administration, including Databricks security and infrastructure features.
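For illustration, a minimal PySpark sketch of two common Spark SQL tuning moves on Delta storage: partition pruning via a filter on the partition column, and a broadcast join hint for a small dimension table. It assumes a Databricks/Delta Lake environment; the table and column names (analytics.events, analytics.accounts, event_date, account_id) are hypothetical, not this team's actual pipeline.

# Minimal Spark SQL tuning sketch on a Delta table. Assumes a Databricks/Delta
# environment; all table and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("delta-tuning-sketch").getOrCreate()

# partition pruning: filter on the (assumed) partition column event_date
events = spark.read.table("analytics.events").where(F.col("event_date") == "2024-01-01")

# broadcast the small dimension table to avoid a shuffle join
accounts = spark.read.table("analytics.accounts")
joined = events.join(F.broadcast(accounts), on="account_id", how="left")

daily = joined.groupBy("account_id").agg(F.count("*").alias("event_count"))
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_event_counts")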
A logistics company
Agency job
via Anzy by Dattatraya Kolangade
Bengaluru (Bangalore)
5 - 7 yrs
₹18L - ₹25L / yr
Data engineering
ETL
SQL
Hadoop
Apache Spark
+13 more
Key responsibilities:
• Create and maintain data pipelines
• Build and deploy ETL infrastructure for optimal data delivery
• Work with various teams, including product, design and the executive team, to troubleshoot data-related issues
• Create tools for data analysts and scientists to help them build and optimise the product
• Implement systems and processes for data access controls and guarantees
• Distill knowledge from experts in the field outside the org and optimise internal data systems
Preferred qualifications/skills:
• 5+ years of experience
• Strong analytical skills

• Degree in Computer Science, Statistics, Informatics, Information Systems
• Strong project management and organisational skills
• Experience supporting and working with cross-functional teams in a dynamic environment
• SQL guru with hands-on experience across various databases
• NoSQL databases like Cassandra, MongoDB
• Experience with Snowflake, Redshift
• Experience with tools like Airflow, Hevo (a minimal Airflow sketch follows this list)
• Experience with Hadoop, Spark, Kafka, Flink
• Programming experience in Python, Java, Scala
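For illustration, a minimal Airflow 2.x sketch of a daily extract-transform-load DAG of the kind this role would build; the task bodies and the DAG id are placeholders, not this company's pipeline.

# Minimal Airflow 2.x sketch of a daily ETL DAG. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    print("extracting for", context["ds"])    # placeholder: pull source data to staging


def transform(**context):
    print("transforming for", context["ds"])  # placeholder: clean and reshape staged data


def load(**context):
    print("loading for", context["ds"])       # placeholder: write to the warehouse


with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",      # Airflow 2.4+; use schedule_interval on older 2.x versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load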
iLink Systems
1 video
1 recruiter
Posted by Ganesh Sooriyamoorthu
Chennai, Pune, Bengaluru (Bangalore)
5 - 15 yrs
₹14L - ₹25L / yr
PowerBI
Data storage
Data Structures
Algorithms
Data Lake
+2 more
Job Description
The Azure Data Engineer is responsible for building, implementing and supporting Microsoft BI solutions to meet market and/or client requirements. They apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
 
We are currently looking for programmers with good technical expertise in Azure Data Lake, Azure Synapse and Power BI reporting. As part of a collaborative team and under the supervision of a project head, he/she will be responsible for designing and developing software products to implement new features and support current applications.
 
 
Responsibilities:
- Create ER diagrams and write relational database queries
- Create database objects and maintain referential integrity
- Configure, deploy and maintain databases
- Participate in development and maintenance of Data warehouses
- Design, develop and deploy packages
- Create and deploy reports
- Provide technical design, coding assistance to the team to accomplish the project deliverables as planned/scoped.

Requirements

 
Required Skills:
- At least 3 years of experience in Azure Data Lake Storage
- At least 3 years of experience in Azure Synapse Pipelines
- At least 3 years of experience in Power BI
- At least 3 years of experience in Azure Machine Learning
- At least 3 years of experience in Azure Databricks
- Should be well versed with data structures and algorithms
- Understanding of software development lifecycle
- Excellent analytical and problem-solving skills.
- Ability to work independently as a self-starter, and within a team environment.
- Good communication skills, written and verbal
Railofy
1 video
1 recruiter
Posted by Manan Jain
Mumbai
2 - 5 yrs
₹5L - ₹12L / yr
Data Science
Python
R Programming

About Us:

We are a VC-funded startup solving one of the biggest transportation problems India faces. Most passengers in India travel long distances on IRCTC trains. At the time of booking, approximately 1 out of every 2 passengers ends up with a Waitlisted or RAC ticket. This creates a lot of anxiety for passengers, as the Railways announce only 4 hours before departure whether they have a confirmed seat. We solve this problem through our Waitlist & RAC Protection. Protection can be bought against each IRCTC ticket at the time of booking. If the train ticket is not confirmed, we fly the passenger to the destination. Our team consists of 3 founders from IIT, IIM and ISB.

Functional Experience:

  • Computer Science or IT Engineering background with solid understanding of basics of Data Structures and Algorithms
  • 2+ years of data science experience working with large datasets
  • Expertise in Python packages like pandas, NumPy, scikit-learn, Matplotlib, seaborn, Keras and TensorFlow (a brief sketch follows this list)
  • Expertise in Big Data technologies like Hadoop, Cassandra and PostgreSQL
  • Expertise in Cloud computing on AWS with EC2, AutoML, Lambda and RDS
  • Good knowledge of Machine Learning and Statistical time series analysis (optional)
  • Unparalleled logical ability, making you the go-to person for all things related to data
  • You love coding as a hobby and are up for a challenge!
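For illustration, a minimal pandas sketch of the kind of time-series work listed above (daily aggregation plus a rolling-mean feature). The CSV path and column names are placeholders, not Railofy's data.

# Minimal pandas time-series sketch: daily aggregation and a 7-day rolling mean.
# The file and column names are placeholders.
import pandas as pd

df = pd.read_csv("bookings.csv", parse_dates=["booking_time"])   # hypothetical data

daily = (
    df.set_index("booking_time")
      .resample("D")["waitlisted"]      # assumes a numeric 0/1 "waitlisted" column
      .sum()
      .to_frame("waitlisted_count")
)

# smoothed trend feature for downstream modelling
daily["rolling_7d"] = daily["waitlisted_count"].rolling(window=7, min_periods=1).mean()
print(daily.tail())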

 

Cultural:

  • Assume a strong sense of ownership of analytics: design, develop & deploy
  • Collaborate with senior management, operations & business team
  • Ensure Quality & sustainability of the architecture
  • Motivation to join an early stage startup should go beyond compensation
Thinkdeeply
5 recruiters
Posted by Aditya Kanchiraju
Hyderabad
5 - 15 yrs
₹5L - ₹35L / yr
Machine Learning (ML)
R Programming
TensorFlow
Deep Learning
Python
+2 more

Job Description

Want to make every line of code count? Tired of being a small cog in a big machine? Like a fast-paced environment where stuff gets DONE? Wanna grow with a fast-growing company (both career and compensation)? Like to wear different hats? Join ThinkDeeply in our mission to create and apply Enterprise-Grade AI for all types of applications.

 

Seeking an ML Engineer with a high aptitude for development. Will also consider coders with a high aptitude for ML. Years of experience are important, but we are also looking for interest and aptitude. As part of the early engineering team, you will have a chance to make a measurable impact on the future of Thinkdeeply, as well as take on a significant amount of responsibility.

 

Experience

10+ Years

 

Location

Bozeman/Hyderabad

 

Skills

Required Skills:

Bachelor's/Master's or PhD in Computer Science, or related industry experience

3+ years of industry experience with deep learning frameworks such as PyTorch or TensorFlow

7+ years of industry experience with scripting languages such as Python and R

7+ years in software development, including at least some research/POCs, prototyping, productizing, process improvement and large-data processing/performance computing

Familiar with non-neural-network methods such as Bayesian methods, SVMs, AdaBoost and random forests (a brief sketch follows this list)

Some experience in setting up large-scale training data pipelines

Some experience in using cloud services such as AWS, GCP and Azure
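For illustration, a minimal scikit-learn sketch comparing the non-neural baselines named above (SVM, AdaBoost, random forest) on a synthetic dataset; purely illustrative, not Thinkdeeply's workload.

# Minimal scikit-learn comparison of non-neural baselines on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

baselines = {
    "svm": SVC(kernel="rbf"),
    "adaboost": AdaBoostClassifier(n_estimators=100, random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

for name, model in baselines.items():
    scores = cross_val_score(model, X, y, cv=5)    # 5-fold cross-validation
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")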

Desired Skills:

Experience in building deep learning models for Computer Vision and Natural Language Processing domains

Experience in productionizing/serving machine learning in industry setting

Understand the principles of developing cloud native applications

 

Responsibilities

 

Collect, Organize and Process data pipelines for developing ML models

Research and develop novel prototypes for customers

Train, implement and evaluate shippable machine learning models

Deploy and iterate improvements of ML Models through feedback

first principle labs
1 recruiter
Posted by Ankit Goenka
Pune
3 - 7 yrs
₹12L - ₹18L / yr
Data Science
Python
R Programming
Big Data
Hadoop
The selected candidate would be part of the in-house Data Labs team and would be responsible for creating an insights-driven decision structure.

This will include:

Scorecards
Strategies
MIS

The verticals included are:

Risk
Marketing
Product
Japan Based Leading Company
Bengaluru (Bangalore)
3 - 10 yrs
₹0L - ₹20L / yr
Big Data
Amazon Web Services (AWS)
Java
Python
MySQL
+2 more
We are looking for a data engineer with AWS cloud infrastructure experience to join our Big Data Operations team. This role will provide advanced operations support, contribute to automation and system improvements, and work directly with enterprise customers to provide excellent customer service.
The candidate,
1. Must have very good hands-on technical experience of 3+ years with Java or Python
2. Working experience and good understanding of AWS Cloud; advanced experience with IAM policy and role management
3. Infrastructure Operations: 5+ years supporting systems infrastructure operations, upgrades, deployments using Terraform, and monitoring
4. Hadoop: Experience with Hadoop (Hive, Spark, Sqoop) and/or AWS EMR
5. Knowledge of PostgreSQL/MySQL/DynamoDB backend operations
6. DevOps: Experience with DevOps automation - orchestration/configuration management and CI/CD tools (Jenkins)
7. Version Control: Working experience with one or more version control platforms like GitHub or GitLab
8. Knowledge of AWS QuickSight reporting
9. Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch, AWS CloudTrail, Datadog and Elasticsearch (a brief sketch follows this list)
10. Networking: Working knowledge of TCP/IP networking, SMTP, HTTP, load-balancers (ELB) and high availability architecture
11. Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment. Familiar with penetration testing and scan tools for remediation of security vulnerabilities.
12. Demonstrated successful experience learning new technologies quickly
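For illustration, a minimal boto3 sketch of the CloudWatch monitoring mentioned in item 9: publish a custom metric and create an alarm on it. The namespace, metric, alarm name and region are placeholders, and AWS credentials and IAM permissions are assumed to be configured already.

# Minimal boto3 CloudWatch sketch: custom metric plus an alarm. All names are
# placeholders; assumes credentials and IAM permissions are already in place.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# publish a custom metric, e.g. a pipeline's failed-record count
cloudwatch.put_metric_data(
    Namespace="BigDataOps",
    MetricData=[{"MetricName": "FailedRecords", "Value": 3, "Unit": "Count"}],
)

# alarm when the metric exceeds a threshold over a 5-minute period
cloudwatch.put_metric_alarm(
    AlarmName="failed-records-high",
    Namespace="BigDataOps",
    MetricName="FailedRecords",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=100,
    ComparisonOperator="GreaterThanThreshold",
)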
WHAT WILL BE THE ROLES AND RESPONSIBILITIES?
1. Create procedures/run books for operational and security aspects of AWS platform
2. Improve AWS infrastructure by developing and enhancing automation methods
3. Provide advanced business and engineering support services to end users
4. Lead other admins and platform engineers through design and implementation decisions to achieve balance between strategic design and tactical needs
5. Research and deploy new tools and frameworks to build a sustainable big data platform
6. Assist with creating programs for training and onboarding for new end users
7. Lead Agile/Kanban workflows and team process work
8. Troubleshoot issues to resolve problems
9. Provide status updates to Operations product owner and stakeholders
10. Track all details in the issue tracking system (JIRA)
11. Provide issue review and triage problems for new service/support requests
12. Use DevOps automation tools, including Jenkins build jobs
13. Fulfil any ad-hoc data or report request queries from different functional groups
AthenasOwl
1 video
1 recruiter
Posted by Ericsson Fernandes
Mumbai
3 - 7 yrs
₹10L - ₹20L / yr
Deep Learning
Natural Language Processing (NLP)
Machine Learning (ML)
Computer Vision
Python
+1 more

Company Profile and Job Description  

About us:  

AthenasOwl (AO) is our “AI for Media” solution that helps content creators and broadcasters create and curate smarter content. We launched the product in 2017 as an AI-powered suite for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, making better targeting decisions, automating hours of post-production work and monetizing massive content libraries.

For more details visit: www.athenasowl.tv   

  

Role:   

Senior Machine Learning Engineer  

Experience Level:   

4 -6 Years of experience  

Work location:   

Mumbai (Malad W)   

  

Responsibilities:   

  • Develop cutting edge machine learning solutions at scale to solve computer vision problems in the domain of media, entertainment and sports
  • Collaborate with media houses and broadcasters across the globe to solve niche problems in the field of post-production, archiving and viewership
  • Manage a team of highly motivated engineers to deliver high-impact solutions quickly and at scale

 

 

The ideal candidate should have:   

  • Strong programming skills in any one or more programming languages like Python and C/C++
  • Sound fundamentals of data structures, algorithms and object-oriented programming
  • Hands-on experience with any one popular deep learning framework like TensorFlow, PyTorch, etc.
  • Experience in implementing deep learning solutions for computer vision, NLP, etc. (a brief sketch follows this list)
  • Ability to quickly learn and communicate the latest findings in AI research
  • Creative thinking for leveraging machine learning to build end-to-end intelligent software systems
  • A pleasantly forceful personality and charismatic communication style
  • Someone who will raise the average effectiveness of the team and has demonstrated exceptional abilities in some area of their life. In short, we are looking for a “Difference Maker”
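For illustration, a minimal PyTorch sketch of a tiny convolutional classifier and one forward pass on random data; the architecture, input shape and class count are arbitrary placeholders, not AthenasOwl's production models.

# Minimal PyTorch sketch: tiny CNN classifier and a single forward pass.
import torch
from torch import nn


class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)        # (N, 32, 1, 1)
        x = torch.flatten(x, 1)     # (N, 32)
        return self.classifier(x)   # (N, num_classes)


model = TinyCNN()
dummy_batch = torch.randn(4, 3, 224, 224)   # four random RGB frames
print(model(dummy_batch).shape)             # torch.Size([4, 10])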

 

Quantiphi Inc.
1 video
10 recruiters
Posted by Anwar Shaikh
Mumbai
1 - 5 yrs
₹4L - ₹15L / yr
Python
Machine Learning (ML)
Deep Learning
TensorFlow
Keras
+1 more
1. The candidate should be passionate about machine learning and deep learning.
2. Should understand the importance and know-how of taking the machine-learning-based solution to the consumer.
3. Hands-on experience with statistical, machine-learning tools and techniques
4. Good exposure to deep learning libraries like TensorFlow and PyTorch.
5. Experience in implementing deep learning techniques, computer vision and NLP. The candidate should be able to develop the solution from scratch, working from openly available GitHub code.
6. Should be able to read research papers and pick up ideas to quickly reproduce the research in the deep learning library they are most comfortable with.
7. Should be strong in data structures and algorithms. Should be able to do code complexity analysis/optimization for smooth delivery to production (an illustrative example follows this list).
8. Expert-level coding experience in Python.
9. Technologies: Backend - Python (programming language)
10. Should have the ability to think about long-term solutions, modularity, and reusability of components.
11. Should be able to work collaboratively, be open to learning from peers, and constantly bring new ideas to the table.
12. Self-driven. Open to peer criticism and feedback, and able to take it positively. Ready to be held accountable for the responsibilities undertaken.
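To illustrate the code-complexity point in item 7, here is a small Python example that replaces an O(n*m) list-membership scan with an O(n + m) set-based lookup; the data is synthetic.

# Finding which queries also appear in a catalog: the first version rescans the
# list for every query (O(n*m)); the second builds a set once (O(n + m)).

def common_items_quadratic(queries, catalog):
    return [q for q in queries if q in catalog]        # list membership scans catalog

def common_items_linear(queries, catalog):
    catalog_set = set(catalog)                         # built once, O(1) average lookups
    return [q for q in queries if q in catalog_set]

queries = list(range(0, 2000, 2))
catalog = list(range(1000))
assert common_items_quadratic(queries, catalog) == common_items_linear(queries, catalog)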
Precily Private Limited
5 recruiters
Posted by Bharath Rao
NCR (Delhi | Gurgaon | Noida)
1 - 3 yrs
₹3L - ₹9L / yr
Data Science
Artificial Neural Network (ANN)
Artificial Intelligence (AI)
Machine Learning (ML)
Python
+3 more
Precily AI: Automatic summarization, shortening a business document or book with our AI to create a summary of the major points of the original document. The AI can produce a coherent summary, taking into account variables such as length, writing style and syntax. We are also working in the legal domain to reduce the high number of pending cases in India. We use Artificial Intelligence and Machine Learning capabilities such as NLP and neural networks to process data and provide solutions for industries such as enterprise, healthcare and legal.
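For illustration, a minimal extractive-summarization sketch that scores sentences by their mean TF-IDF weight and keeps the top-scoring ones in their original order; this is a generic baseline, not Precily's production approach.

# Generic extractive-summarization baseline: rank sentences by mean TF-IDF weight.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def summarize(text: str, max_sentences: int = 2) -> str:
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    if len(sentences) <= max_sentences:
        return text
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
    scores = np.asarray(tfidf.mean(axis=1)).ravel()     # mean TF-IDF per sentence
    top = sorted(np.argsort(scores)[-max_sentences:])   # keep original sentence order
    return ". ".join(sentences[i] for i in top) + "."

doc = ("Legal teams face a growing backlog of cases. "
       "Summaries of filings help lawyers review documents faster. "
       "The weather was pleasant on the day of the hearing.")
print(summarize(doc))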