Sr. Data Engineer

at A leader in Cognitive and Emerging Technologies Business

Agency job
Remote, Hyderabad
5 - 10 yrs
₹15L - ₹18L / yr
Full time
Skills
PySpark
Data engineering
Big Data
Hadoop
Spark
RESTful APIs
Python
SFTP
Amazon Web Services (AWS)
Amazon S3
SQL
Linux/Unix
Cassandra
MySQL
ETL

Job Title: Sr. Data Engineer


Experience: 5 to 8 years


Work Location: Hyderabad (option to work remotely)


Skillset: Python, PySpark, Kafka, Airflow, SQL, NoSQL, API integration, data pipelines, Big Data, AWS/GCP/OCI/Azure


Selection Process:

1. Assignment

2. Tech Round I

3. Tech Round II

4. HR Round


Calling all Python ninjas to showcase their expertise in a stimulating environment geared towards building cutting-edge products and services. If you have a knack for data processing and scripting, and are excited about delivering scalable, high-quality data ingestion and API integration solutions, then we are looking for you!

You will get a chance to work on exciting projects at our state-of-the-art office, grow along with the company and be fruitfully rewarded for your efforts!


Requirements:

● Understanding our data sets and how to bring them together.

● Working with our engineering team to support custom solutions offered to product development.

● Bridging the gap between development, engineering, and data ops.

● Creating, maintaining and documenting scripts to support ongoing custom solutions.

● Excellent organizational skills, including precise attention to detail

● Strong multitasking skills and ability to work in a fast-paced environment

● 5+ years of experience developing scripts with Python.

● Know your way around RESTful APIs (able to integrate; publishing not necessary).

● Familiarity with pulling and pushing files to and from SFTP servers and AWS S3.

● Experience with any Cloud solutions including GCP / AWS / OCI / Azure.

● Familiarity with SQL programming to query and transform data in relational databases.

● Familiarity with Linux (and Linux-based work environments).

● Excellent written and verbal communication skills

● Extracting, transforming, and loading data into internal databases and Hadoop

● Optimizing our new and existing data pipelines for speed and reliability

● Deploying product build and product improvements

● Documenting and managing multiple repositories of code

● Experience with SQL and NoSQL databases (Cassandra, MySQL)

● Hands-on experience in data pipelining and ETL (any of these frameworks/tools: Hadoop, BigQuery, Redshift, Athena)

● Hands-on experience with Airflow

● Understanding of best practices and common coding patterns around storing, partitioning, warehousing, and indexing of data

● Experience reading data from Kafka topics (both live streams and offline)

● Experience with PySpark and DataFrames (see the sketch after this list)
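
For a purely illustrative sense of this kind of ingestion work, the sketch below batch-reads an offline Kafka topic with PySpark and lands it in S3 as partitioned Parquet. The broker, topic, bucket, and schema are assumptions made for the example, not details from this posting, and the job needs the Spark Kafka connector on its classpath.

```python
# Hypothetical sketch: batch-read a Kafka topic with PySpark and write
# date-partitioned Parquet to S3. Broker, topic, bucket, and schema are placeholders.
# Requires the spark-sql-kafka connector package when submitting the job.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Expected shape of the JSON payload on the topic (assumed).
payload_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("created_at", TimestampType()),
])

# Offline (batch) read of everything currently on the topic.
raw = (
    spark.read
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                      # placeholder topic
    .option("startingOffsets", "earliest")
    .load()
)

# Kafka delivers key/value as binary; parse the value as JSON and derive a date partition.
orders = (
    raw.select(F.from_json(F.col("value").cast("string"), payload_schema).alias("o"))
    .select("o.*")
    .withColumn("dt", F.to_date("created_at"))
)

# Land the result in S3 as date-partitioned Parquet (bucket is a placeholder).
orders.write.mode("append").partitionBy("dt").parquet("s3a://example-bucket/orders/")
```

In production a job like this would typically be scheduled and parameterized per run rather than run by hand.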


Responsibilities:

You’ll be:

● Collaborating across an agile team to continuously design, iterate, and develop big data systems.

● Extracting, transforming, and loading data into internal databases.

● Optimizing our new and existing data pipelines for speed and reliability (a minimal Airflow sketch follows this list).

● Deploying new products and product improvements.

● Documenting and managing multiple repositories of code.
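
By way of a hedged illustration of the orchestration side (the DAG id, schedule, and script paths are assumptions, not details from this posting), a minimal Airflow 2.x DAG tying an extract, a PySpark transform, and a warehouse load together could look like this:

```python
# Hypothetical sketch of a daily ETL DAG; all ids, paths, and commands are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:

    # Pull the day's files from SFTP/S3 (placeholder script).
    extract = BashOperator(
        task_id="extract_files",
        bash_command="python /opt/jobs/extract_files.py --date {{ ds }}",
    )

    # Run the PySpark ingestion sketched above (placeholder spark-submit).
    transform = BashOperator(
        task_id="transform_orders",
        bash_command="spark-submit /opt/jobs/transform_orders.py --date {{ ds }}",
    )

    # Load curated output into the warehouse (placeholder script).
    load = BashOperator(
        task_id="load_warehouse",
        bash_command="python /opt/jobs/load_warehouse.py --date {{ ds }}",
    )

    extract >> transform >> load
```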


Similar jobs

at codersbrain
1 recruiter
Posted by Tanuj Uppal
Delhi
4 - 8 yrs
₹2L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more
  • Mandatory: hands-on experience in Python and PySpark.
  • Build PySpark applications using Spark DataFrames in Python, using Jupyter Notebook and PyCharm (IDE).
  • Experience optimizing Spark jobs that process huge volumes of data.
  • Hands-on experience with version control tools like Git.
  • Experience with Amazon's analytics services such as Amazon EMR and Lambda functions.
  • Experience with Amazon's compute services such as AWS Lambda and Amazon EC2, storage services such as S3, and other services such as SNS.
  • Experience/knowledge of bash/shell scripting is a plus.
  • Experience working with fixed-width, delimited, and multi-record file formats (see the sketch after this list).
  • Hands-on experience with tools like Jenkins to build, test, and deploy applications.
  • Awareness of DevOps concepts and the ability to work in an automated release pipeline environment.
  • Excellent debugging skills.
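
As flagged in the list above, here is a minimal, purely illustrative PySpark sketch of reading a delimited and a fixed-width file; the paths, delimiter, and column layout are assumptions rather than details of this role:

```python
# Hypothetical sketch: reading delimited and fixed-width files with PySpark.
# Paths, delimiter, widths, and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("file-ingest-demo").getOrCreate()

# Pipe-delimited file with a header row.
delimited = (
    spark.read
    .option("header", True)
    .option("delimiter", "|")
    .csv("s3://example-bucket/input/customers.psv")
)

# Fixed-width file: read each line as one string column, then slice by position.
# Assumed layout: account_id = cols 1-10, name = cols 11-40, balance = cols 41-50.
raw = spark.read.text("s3://example-bucket/input/accounts.dat")
fixed_width = raw.select(
    F.trim(F.substring("value", 1, 10)).alias("account_id"),
    F.trim(F.substring("value", 11, 30)).alias("name"),
    F.trim(F.substring("value", 41, 10)).cast("double").alias("balance"),
)

delimited.show(5)
fixed_width.show(5)
```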
Information Solution Provider Company
Agency job
via Jobdost by Sathish Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 7 yrs
₹10L - ₹15L / yr
SQL
Hadoop
Spark
Machine Learning (ML)
Data Science
+3 more

Job Description:

The data science team is responsible for solving business problems with complex data. Data complexity can be characterized in terms of volume, dimensionality, and multiple touchpoints/sources. We understand the data, ask fundamental, first-principles questions, and apply our analytical and machine learning skills to solve the problem in the best way possible.

 

Our ideal candidate

The role is a client-facing one, hence good communication skills are a must.

The candidate should have the ability to communicate complex models and analysis in a clear and precise manner. 

 

The candidate would be responsible for:

  • Comprehending business problems properly: what to predict, how to build the dependent variable (DV), what value addition he/she is bringing to the client, etc.
  • Understanding and analyzing large, complex, multi-dimensional datasets and building features relevant to the business
  • Understanding the math behind algorithms and choosing one over another
  • Understanding approaches like stacking and ensembling, and applying them correctly to increase accuracy (see the sketch after this list)
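
As a hedged illustration of the stacking point above (the synthetic data and estimator choices are ours, not the employer's stack), scikit-learn's StackingClassifier trains a meta-model on out-of-fold predictions of the base learners:

```python
# Hypothetical sketch: stack two base models under a logistic-regression meta-learner
# on a synthetic dataset; model choices are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    GradientBoostingClassifier,
    RandomForestClassifier,
    StackingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=42)),
        ("gb", GradientBoostingClassifier(random_state=42)),
    ],
    final_estimator=LogisticRegression(),  # meta-model over out-of-fold predictions
    cv=5,
)

# Compare the stack against a single base model via cross-validation.
print("stacked accuracy:", cross_val_score(stack, X, y, cv=5).mean())
print("rf-only accuracy:",
      cross_val_score(RandomForestClassifier(n_estimators=200, random_state=42), X, y, cv=5).mean())
```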

Desired technical requirements

  • Proficiency with Python and the ability to write production-ready code.
  • Experience in PySpark, machine learning, and deep learning
  • Big data experience (e.g., familiarity with Spark, Hadoop) is highly preferred
  • Familiarity with SQL or other databases.
world’s fastest growing consumer internet company
Agency job
via Hunt & Badge Consulting Pvt Ltd by Chandramohan Subramanian
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹35L / yr
Big Data
Data engineering
Big Data Engineering
Data Engineer
ETL
+5 more

Data Engineer JD:

  • Designing, developing, constructing, installing, testing, and maintaining complete data management & processing systems.
  • Building highly scalable, robust, fault-tolerant, & secure user data platforms adhering to data protection laws.
  • Taking care of the complete ETL (Extract, Transform & Load) process.
  • Ensuring the architecture is planned in such a way that it meets all the business requirements.
  • Exploring new ways of using existing data to provide more insights from it.
  • Proposing ways to improve data quality, reliability & efficiency of the whole system.
  • Creating data models to reduce system complexity and hence increase efficiency & reduce cost.
  • Introducing new data management tools & technologies into the existing system to make it more efficient.
  • Setting up monitoring and alerting on data pipeline jobs to detect failures and anomalies

What do we expect from you?

  • BS/MS in Computer Science or equivalent experience
  • 5 years of recent experience in Big Data Engineering.
  • Good experience in working with Hadoop and Big Data technologies like HDFS, Pig, Hive, Zookeeper, Storm, Spark, Airflow and NoSQL systems
  • Excellent programming and debugging skills in Java or Python.
  • Apache Spark, Python, and hands-on experience deploying ML models
  • Has worked on streaming and real-time pipelines
  • Experience with Apache Kafka, or with any of Spark Streaming, Flume, or Storm (see the sketch after this list)
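
As referenced in the last point above, a minimal Spark Structured Streaming sketch of consuming a Kafka topic could look like the following; the broker, topic, and sink/checkpoint paths are placeholders we have assumed, and the Spark Kafka connector is required:

```python
# Hypothetical sketch: Structured Streaming job reading a Kafka topic and writing
# micro-batches to Parquet. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(
        F.col("key").cast("string"),
        F.col("value").cast("string"),
        F.col("timestamp"),
    )
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/events/")                     # placeholder sink
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```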

Focus Area:

R1: Data Structures & Algorithms
R2: Problem Solving + Coding
R3: Design (LLD)

 

A content consumption and discovery app which provides news
Agency job
via Jobdost by Mamatha A
Noida
2 - 5 yrs
₹30L - ₹40L / yr
Data Science
Deep Learning
R Programming
Python

Data Scientist

Requirements

● B.Tech/Masters in Mathematics, Statistics, Computer Science, or another quantitative field
● 2-3+ years of work experience in the ML domain (2-5 years of overall experience)
● Hands-on coding experience in Python
● Experience in machine learning techniques such as regression, classification, predictive modeling, clustering, the deep learning stack, and NLP
● Working knowledge of TensorFlow/PyTorch

Optional Add-ons

● Experience with distributed computing frameworks: MapReduce, Hadoop, Spark, etc.
● Experience with databases: MongoDB

at Marktine
1 recruiter
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹24L / yr
Data Science
R Programming
Python
SQL
Machine Learning (ML)
+1 more

Responsibilities:

  • Design and develop strong analytics systems and predictive models
  • Managing a team of data scientists, machine learning engineers, and big data specialists
  • Identify valuable data sources and automate data collection processes
  • Undertake pre-processing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams

Requirements:

  • Proven experience as a seasoned Data Scientist
  • Good experience in data mining processes
  • Understanding of machine learning; knowledge of operations research is a value addition
  • Strong understanding of and experience in R, SQL, and Python; knowledge of Scala, Java, or C++ is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
  • Strong math skills (e.g. statistics, algebra)
  • Problem-solving aptitude
  • Excellent communication and presentation skills
  • Experience in Natural Language Processing (NLP)
  • Strong competitive coding skills
  • BSc/BA in Computer Science, Engineering, or a relevant field; a graduate degree in Data Science or another quantitative field is preferred
A Pre-series A funded FinTech Company
Agency job
via GoHyre by Avik Majumder
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own Data integrity across distributed systems.
  • Extract, Transform and Load data from multiple systems for reporting into BI platform.
  • Create Data Sets and Data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the Project manager to deliver on business requirements OTIF (on time in full)
  • Understand the cross-functional business data points thoroughly and be SPOC for all data-related queries.
  • Work with both Web Analytics and Backend Data analytics.
  • Support the rest of the BI team in generating reports and analysis
  • Quickly learn and use bespoke & third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for Leadership.

 Requirements:

  • Strong experience in data warehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports, and dashboards based on business requirements
  • Knowledge and experience in prototyping, designing, and requirement analysis
  • Be able to implement row-level security on data and understand application security layer models in Power BI
  • Proficiency in making DAX queries in Power BI desktop.
  • Expertise in using advanced level calculations on data sets
  • Experience in the Fintech domain and stakeholder management.
Remote, Bengaluru (Bangalore)
6 - 12 yrs
₹10L - ₹18L / yr
Tableau
Analytical Skills
Dashboard
Data extraction
ETL
+3 more
  • Hands-on development/maintenance experience in Tableau: Developing, maintaining, and managing advanced reporting, analytics, dashboards and other BI solutions using Tableau
  • Reviewing and improving existing Tableau dashboards and data models/systems, and collaborating with teams to integrate new systems
  • Provide support and expertise to the business community to assist with better utilization of Tableau
  • Understand business requirements, conduct analysis and recommend solution options for intelligent dashboards in Tableau
  • Experience with Data Extraction, Transformation and Load (ETL) – knowledge of how to extract, transform and load data
  • Execute SQL data queries across multiple data sources in support of business intelligence reporting needs. Format query results / reports in various ways
  • Participate in QA testing, liaising with other project team members and being responsive to the client's needs, all with an eye for detail in a fast-paced environment
  • Performing and documenting data analysis, data validation, and data mapping/design

 

 

Key Performance Indicators

KPIs will be outlined in detail in the goal sheet

 

 

Ideal Background

 

Education

Minimum:  Graduation, preferably in Science

Experience requirement:                                      

  • Minimum: 2-3 years' relevant work experience in the field of reporting and data analytics using Tableau.

  • Tableau certifications would be preferred.

  • Work experience in the regulated medical device / pharmaceutical industry would be an added advantage, but not mandatory.

Languages:

Minimum: English (written and spoken)

 

 

 

Specific Professional Competencies

 

  • Extensive experience in developing, maintaining, and managing Tableau-driven dashboards & analytics, and working knowledge of Tableau administration/architecture.
  • A solid understanding of SQL, relational databases, and normalization
  • Proficiency in use of query and reporting analysis tools
  • Competency in Excel (macros, pivot tables, etc.)
  • Degree in Mathematics, Computer Science, Information Systems, or related field.

 

at Alien Brains
5 recruiters
Posted by Praveen Baheti
Kolkata
0 - 15 yrs
₹4L - ₹8L / yr
Python
Deep Learning
Machine Learning (ML)
Data Analytics
Data Science
+3 more
You'll be giving industry-standard training to engineering students and mentoring them to develop their own custom mini-projects.
at FarmGuide
1 recruiter
Posted by Anupam Arya
NCR (Delhi | Gurgaon | Noida)
0 - 8 yrs
₹7L - ₹14L / yr
Computer Security
Image processing
OpenCV
Python
Rational ClearCase
+8 more
FarmGuide is a data-driven tech startup aiming to digitize periodic processes and bring information symmetry to the agriculture supply chain through transparent, dynamic & interactive software solutions. We at FarmGuide (https://angel.co/farmguide) help the Government with relevant and efficient policy making by ensuring a seamless flow of information between stakeholders.

Job Description:

We are looking for individuals who want to help us design cutting-edge, scalable products to meet our rapidly growing business. We are building out the data science team and looking to hire across levels.

  • Solving complex problems in the agri-tech sector, which are long-standing open problems at the national level.
  • Applying computer vision techniques to satellite imagery to deduce artefacts of interest.
  • Applying various machine learning techniques to digitize the existing physical corpus of knowledge in the sector.

Key Responsibilities:

  • Develop computer vision algorithms for production use on satellite and aerial imagery
  • Implement models and data pipelines to analyse terabytes of data
  • Deploy built models in a production environment
  • Develop tools to assess algorithm accuracy
  • Implement algorithms at scale in the commercial cloud

Skills Required:

  • B.Tech/M.Tech in CS or other related fields such as EE or MCA from IIT/NIT/BITS, but not compulsory
  • Demonstrable interest in Machine Learning and Computer Vision, such as coursework, open-source contributions, etc.
  • Experience with digital image processing techniques
  • Familiarity/experience with geospatial, planetary, or astronomical datasets is valuable
  • Experience in writing algorithms to manipulate geospatial data
  • Hands-on knowledge of GDAL or open-source GIS tools is a plus (a toy sketch follows this listing)
  • Familiarity with cloud systems (AWS/Google Cloud) and cloud infrastructure is a plus
  • Experience with high-performance or large-scale computing infrastructure might be helpful
  • Coding ability in R or Python
  • Self-directed team player who thrives in a continually changing environment

What is on offer:

  • High-impact role in a young start-up with colleagues from IITs and other Tier 1 colleges
  • Chance to work on the cutting edge of ML (yes, we do train Neural Nets on GPUs)
  • Lots of freedom in terms of the work you do and how you do it
  • Flexible timings
  • Best start-up salary in the industry with additional tax benefits
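
As a toy illustration of the satellite-imagery work above (using rasterio, a GDAL-based Python library; the file name and band order are assumptions, not details from this listing), computing an NDVI raster might look like this:

```python
# Hypothetical sketch: compute NDVI from a multi-band GeoTIFF with rasterio + numpy.
# The file name and the band order (red = band 3, NIR = band 4) are assumptions.
import numpy as np
import rasterio

with rasterio.open("scene.tif") as src:           # placeholder satellite scene
    red = src.read(3).astype("float32")
    nir = src.read(4).astype("float32")
    profile = src.profile

# NDVI = (NIR - Red) / (NIR + Red); guard against divide-by-zero pixels.
denom = nir + red
ndvi = np.where(denom == 0, 0, (nir - red) / denom).astype("float32")

# Write the NDVI band out as a single-band GeoTIFF.
profile.update(count=1, dtype="float32")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```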
at OpexAI
1 recruiter
Posted by Jasmine Shaik
Hyderabad
0 - 1 yrs
₹0L - ₹1L / yr
OpenCV
Python
Deep Learning
Benefits:
1) Working with leaders who are analytics gurus
2) Working under leaders with 25+ years of business experience
3) Learning to market and pitch a product using social media
4) Work from home, on your own schedule
5) Experience letter from a leading analytics firm for the 3-month period
6) No stipend