Lead Data Engineer

at Top 3 Fintech Startup

Agency job
6 - 9 yrs
₹16L - ₹24L / yr
Bengaluru (Bangalore)
Skills
SQL
Amazon Web Services (AWS)
Spark
PySpark
Apache Hive

We are looking for an exceptionally talented Lead Data Engineer with experience implementing AWS services to build data pipelines, integrate APIs, and design data warehouses. A candidate with both hands-on and leadership capabilities will be ideal for this position.

 

Qualification: At least a bachelor's degree in Science, Engineering, or Applied Mathematics; a master's degree is preferred.

 

Job Responsibilities:

• 6+ years of total experience as a Data Engineer, including 2+ years of experience managing a team

• A minimum of 3 years of AWS cloud experience

• Well versed in languages and tools such as Python, PySpark, SQL, NodeJS, etc.

• Extensive experience in the Spark ecosystem, having worked on both real-time and batch processing (see the sketch after this list)

• Experience with AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc.

• Experience with modern database systems such as Redshift, Presto, Hive, etc.

• Has built data lakes on S3 or Apache Hudi

• Solid understanding of data warehousing concepts

• Good to have: experience with streaming tools such as Kafka or Kinesis

• Good to have: AWS Developer Associate or Solutions Architect Associate certification

• Experience managing a team
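For illustration, here is a minimal sketch of the kind of PySpark batch job this role describes: read raw events from S3, clean them, and write partitioned Parquet back to a data-lake prefix. Every bucket name, path, and column below is a hypothetical placeholder, not a detail from this posting.

```python
# Minimal PySpark batch pipeline sketch (all names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-batch").getOrCreate()

# Read a day's worth of raw JSON events (placeholder path).
raw = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")

# Basic cleaning: drop malformed rows and derive a date partition column.
cleaned = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write Parquet partitioned by date, so Hive/Athena queries can prune partitions.
(cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-lake-bucket/events/"))

spark.stop()
```

The same read-clean-write shape would run on EMR or as an AWS Glue job with minor changes to how the SparkSession is obtained.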


Similar jobs

UpSolve Solutions LLP
Posted by Shaurya Kuchhal
Mumbai
1 - 4 yrs
₹3L - ₹5L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Data Analytics

Role Description

This is a full-time, client-facing, on-site role for a Data Scientist at UpSolve Solutions in Mumbai. The Data Scientist will handle day-to-day tasks spanning data science, statistics, data analytics, data visualization, and data analysis, using these skills to provide actionable insights that drive business decisions and solve complex problems.


Qualifications

  • Data science, statistics, and data analytics skills
  • Data visualization and data analysis skills
  • Strong problem-solving and critical thinking abilities
  • Ability to work with large datasets and perform data preprocessing
  • Proficiency in programming languages such as Python or R
  • Experience with machine learning algorithms and predictive modeling (see the sketch after this list)
  • Excellent communication and presentation skills
  • Bachelor's or Master's degree in a relevant field (e.g., Computer Science, Statistics, Data Science)
  • Experience in the field of video and text analytics is a plus
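As a rough illustration of the preprocessing and predictive-modeling workflow these qualifications describe, here is a minimal pandas/scikit-learn sketch; the CSV path, feature columns, target, and model choice are all hypothetical assumptions, not details from this posting.

```python
# Minimal predictive-modeling sketch (dataset and columns are hypothetical).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_data.csv")                # hypothetical dataset
df = df.dropna(subset=["age", "spend", "churned"])   # simple preprocessing

X = df[["age", "spend"]]   # assumed feature columns
y = df["churned"]          # assumed binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```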


FrugalTesting
Posted by Bharti Garg
Hyderabad
4 - 7 yrs
₹6L - ₹10L / yr
Performance Testing
Amazon Web Services (AWS)
Kubernetes
Docker

Roles and responsibilities:

  • Provide technical assistance to improve system performance, capacity, reliability, and scalability.  
  • Perform root cause analysis of performance issues and suggest corrective actions. 
  • Lead Technology Initiatives to address the stability, performance, and resilience of production systems.
  • Analyze test results and work with developers and engineers to perform bug fixes.  
  • Review operational trends to identify key focus areas for improving the environment's overall production readiness.
  • Lead complex engineering initiatives, developing plans and roadmaps to drive them; this requires deep expertise and technology leadership skills.
  • Analyze operational data to identify performance and stability issues; size infrastructure needs and optimize them for higher performance.
  • Analyze business requirements and data processing problems to oversee the development of test plans and strategies for various applications. 
  • Provide guidance to engineering teams on complex technology performance challenges and/or issues. 
  • Define performance engineering requirements and architecture for new applications. 
  • Develop/provide standard reports on performance and resilience testing results for senior management.

Key Requirements:

  • Must have 4-7 years of proven experience.
  • Must have experience working with cloud infrastructure technologies such as OpenShift, AWS, Kubernetes, and Docker.
  • Bachelor's degree in computer science or a related field.
  • Project Management Professional (PMP) certification preferred.
  • Proven ability to solve problems creatively.
MNC Company - Product Based
Agency job
via Bharat Headhunters by Ranjini C. N
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 9 yrs
₹10L - ₹15L / yr
Data Warehouse (DWH)
Informatica
ETL
Python
Google Cloud Platform (GCP)

Job Responsibilities

  • Design, build & test ETL processes using Python & SQL for the corporate data warehouse
  • Inform, influence, support, and execute our product decisions
  • Maintain advertising data integrity by working closely with R&D to organize and store data in a format that is accurate and allows the business to quickly identify issues.
  • Evaluate and prototype new technologies in the area of data processing
  • Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
  • High energy level, strong team player and good work ethic
  • Data analysis, understanding of business requirements and translation into logical pipelines & processes
  • Identification, analysis & resolution of production & development bugs
  • Support the release process including completing & reviewing documentation
  • Configure data mappings & transformations to orchestrate data integration & validation
  • Provide subject matter expertise
  • Document solutions, tools & processes
  • Create & support test plans with hands-on testing
  • Peer reviews of work developed by other data engineers within the team
  • Establish good working relationships & communication channels with relevant departments

 

Skills and Qualifications we look for

  • University degree 2.1 or higher (or equivalent) in a relevant subject. Master’s degree in any data subject will be a strong advantage.
  • 4-6 years of experience in data engineering.
  • Strong coding ability and software development experience in Python.
  • Strong hands-on experience with SQL and Data Processing.
  • Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc); a minimal DAG sketch follows this list.
  • Good working experience in at least one ETL tool (Airflow preferred).
  • Should possess strong analytical and problem solving skills.
  • Good-to-have skills: Apache PySpark, CircleCI, Terraform.
  • Motivated, self-directed, able to work with ambiguity and interested in emerging technologies, agile and collaborative processes.
  • Understanding of and experience with agile/scrum delivery methodology.
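For context, here is a minimal Airflow DAG sketch of the kind of daily ETL step this stack implies, scheduling one BigQuery aggregation. The project, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
# Minimal Airflow + BigQuery ETL sketch (all names are hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_aggregates(**_):
    """Run an ELT-style aggregation query in BigQuery (names assumed)."""
    client = bigquery.Client(project="example-project")
    sql = """
        CREATE OR REPLACE TABLE analytics.daily_orders AS
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM raw.orders
        GROUP BY order_date
    """
    client.query(sql).result()  # block until the query job finishes


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_daily_aggregates",
        python_callable=load_daily_aggregates,
    )
```

On Cloud Composer the same DAG would be dropped into the environment's DAGs bucket; the query itself pushes the transformation down into BigQuery rather than moving data through the worker.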

 

Consulting & implementation services in the area of Oil & Gas, Mining and Manufacturing Industry
Agency job
via Jobdost by Sathish Kumar
Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹18L - ₹25L / yr
AWS Lambda
AWS Simple Notification Service (SNS)
AWS Simple Queuing Service (SQS)
Python
PySpark
Data Engineer

Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet 
  • Proficient in AWS S3 and data lake 
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices.
  • Scripting languages: Python and PySpark

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS 
  • Ingest data from different sources that expose it through various technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from proprietary systems; implement data ingestion and processing with the help of big data technologies
  • Process and transform data using technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
  • Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations (see the sketch after this list)
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
  • Define process improvement opportunities to optimize data collection, insights and displays.
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 
  • Identify and interpret trends and patterns from complex data sets 
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 
  • Key participant in regular Scrum ceremonies with the agile teams  
  • Proficient at developing queries, writing reports and presenting findings 
  • Mentor junior members and bring best industry practices 
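A minimal sketch of the kind of automated data quality check described above, written in PySpark; the stage path, columns, and thresholds are hypothetical placeholders, not details from this posting.

```python
# Minimal data-quality gate: validate a batch before it enters the platform.
# Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-stage-bucket/loans/")  # assumed path

total = df.count()
null_ids = df.filter(F.col("loan_id").isNull()).count()
dupes = total - df.dropDuplicates(["loan_id"]).count()
bad_amounts = df.filter(F.col("amount") <= 0).count()

# Fail the pipeline run if any check breaches its threshold.
errors = []
if null_ids > 0:
    errors.append(f"{null_ids} rows with null loan_id")
if dupes > 0:
    errors.append(f"{dupes} duplicate loan_id rows")
if bad_amounts > 0:
    errors.append(f"{bad_amounts} rows with non-positive amount")

if errors:
    raise ValueError("Data quality checks failed: " + "; ".join(errors))
print(f"All checks passed for {total} rows")
```

Raising inside the job makes the failure visible to whatever orchestrator (Step Functions, Airflow) wraps it, so bad batches never reach downstream tables.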

QUALIFICATIONS 

  • 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
  • Strong background in math, statistics, computer science, data science or related discipline
  • Advanced knowledge of one language: Java, Scala, Python, or C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  
  • Proficient with:
    • Data mining/programming tools (e.g. SAS, SQL, R, Python)
    • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
    • Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools. 
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 
  • Good written and oral communication skills and ability to present results to non-technical audiences 
  • Knowledge of business intelligence and analytical tools, technologies and techniques.

  

Familiarity and experience in the following is a plus:  

  • AWS certification
  • Spark Streaming 
  • Kafka Streaming / Kafka Connect 
  • ELK Stack 
  • Cassandra / MongoDB 
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Avhan Technologies Pvt Ltd
Posted by Aarti Vohra
Kolkata
7 - 10 yrs
₹8L - ₹20L / yr
MDX
DAX
SQL
SQL server
Microsoft Analysis Services
+3 more
Experience: 7 to 8 years
Notice Period: Immediate to 15 days
Job Location : Kolkata
 
Responsibilities:
• Develop and improve solutions spanning data processing activities from the data lake (stage) to star schemas and reporting views/tables, and finally into SSAS.
• Develop and improve Microsoft Analysis Services cubes (tabular and dimensional)
• Collaborate with other teams within the organization and be able to devise the technical solution as it relates to the business & technical requirements
• Mentor team members and be proactive in training and coaching team members to develop their proficiency in Analysis Services
• Maintain documentation for all processes implemented
• Adhere to and suggest improvements to coding standards, applying best practices
 
Skillsets:
• Proficient in MDX and DAX for querying in SSAS
Nexsys
Posted by Kiran Basavaraj Nirakari
Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹15L / yr
NumPy
pandas
MongoDB
SQL
NoSQL Databases

What we look for: 

We are looking for an associate who will crunch data from various sources and extract the key insights from it. The associate will also help us improve and build new pipelines as requested, visualize the data when required, and find flaws in our existing algorithms.

Responsibilities: 

  • Work with multiple stakeholders to gather data and analysis requirements and act on them. 
  • Write new data pipelines and maintain the existing pipelines. 
  • Gather data from various databases and derive the required metrics from them.

Required Skills: 

  • Experience with Python and libraries like pandas and NumPy (see the sketch after this list). 
  • Experience in SQL and an understanding of NoSQL databases. 
  • Hands-on experience in data engineering. 
  • Must have good analytical skills and knowledge of statistics. 
  • Understanding of data science concepts. 
  • Bachelor's degree in Computer Science or a related field. 
  • Problem-solving skills and the ability to work under pressure.
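A minimal sketch of the kind of metric crunching this role describes, using pandas and NumPy; the connection string, table, and columns are hypothetical placeholders.

```python
# Minimal metric-crunching sketch (all names are hypothetical).
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

# Pull raw rows from one of the source databases (connection string assumed).
engine = create_engine("postgresql://user:pass@localhost:5432/analytics")
orders = pd.read_sql("SELECT customer_id, amount, created_at FROM orders", engine)

# Derive the required metrics per customer.
metrics = (
    orders.groupby("customer_id")
          .agg(order_count=("amount", "size"),
               total_spend=("amount", "sum"),
               avg_spend=("amount", "mean"))
          .reset_index()
)

# A simple statistical flag: spend more than 2 standard deviations above the mean.
threshold = metrics["total_spend"].mean() + 2 * metrics["total_spend"].std()
metrics["outlier"] = np.where(metrics["total_spend"] > threshold, True, False)

print(metrics.head())
```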

Nice to have: 

  • Experience in MongoDB or any NoSQL database. 
  • Experience in Elasticsearch. 
  • Knowledge of Tableau, Power BI, or any other visualization tool.
Velocity Services
Posted by Newali Hazarika
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data engineering
Oracle

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies such as NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc. 

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume the data from multiple sources

  • Write a data transformation layer using DBT to transform millions of records in the data warehouse (see the sketch after this list).

  • Implement data warehouse entities with common, re-usable data model designs, with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
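DBT models themselves are written in SQL; as a hedged illustration of the ELT pattern they express, here is a minimal Python sketch that runs a transformation inside the warehouse rather than pulling data out. The connection string and table names are hypothetical placeholders, not Velocity's actual schema.

```python
# Minimal ELT-style transformation step: transform rows already loaded in the
# warehouse with SQL, the same pattern a DBT model expresses.
# Connection string and table names are hypothetical placeholders.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@localhost:5432/warehouse")

TRANSFORM_SQL = text("""
    CREATE TABLE IF NOT EXISTS analytics_daily_cashflow AS
    SELECT business_id,
           DATE(txn_ts) AS txn_date,
           SUM(amount)  AS net_cashflow,
           COUNT(*)     AS txn_count
    FROM raw_transactions
    GROUP BY business_id, DATE(txn_ts)
""")

with engine.begin() as conn:  # transactional: commits on success
    conn.execute(TRANSFORM_SQL)
```

A real DBT project would express this query as a model file and handle incremental refresh and testing declaratively; the sketch only shows the transform-in-warehouse idea.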

 

What To Bring

  • 5+ years of software development experience; startup experience is a plus.

  • Prior experience working with Airflow and DBT is preferred

  • 5+ years of experience working in any backend programming language. 

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)

  • Experienced with the formulation of ideas; building proof-of-concept (POC) and converting them to production-ready projects

  • Experience building and deploying applications on on-premise infrastructure and on AWS or Google Cloud

  • A basic understanding of Kubernetes and Docker is a must.

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English.

 

Peacock Engineering
Posted by Sangeeta Behera
Bengaluru (Bangalore)
8 - 10 yrs
₹9L - ₹15L / yr
Enterprise asset management
Asset management
Maximo
User Interface (UI) Development
Java

About the Company

Peacock Engineering Ltd is a Gold-accredited IBM Premier Business Partner which has amassed over 300 person-years of experience implementing business-critical EAM (Enterprise Asset Management) solutions across a range of industries such as oil & gas, pharmaceuticals, utilities, facilities management, transport, and power generation. 

Peacock Engineering Ltd specialises in providing consultancy services and support for the IBM Maximo EAM software product, and maintains a pool of highly experienced and capable consultants fully conversant with IBM Maximo and its functionality, capabilities, and opportunities for customisation to meet business needs. 

 

Main Purpose:

Peacock Engineering’s Technical Services team is now looking for an IBM Maximo Technical Professional to support the growing demand for Maximo enterprise asset management solutions, working from our office in Bangalore.

 

Specific Responsibilities:

 

  • Technical expert in IBM Maximo EAM technology.
  • Should be well versed in MBO customizations for Maximo 7.x version.
  • Advanced JAVA, SQL knowledge.
  • Building and deploying Maximo to various instances.
  • Business process management using workflow design and management.
  • Expert Knowledge of Maximo Integration Framework (MIF).
  • Provide technical services over the entire lifecycle of a project.
  • Strong communication skills (verbal and written) and the ability to multi-task.
  • Maximo installations and upgrade work experience
  • Participate in solution architecture design.
  • Perform application and solution development to meet project requirements.
  • Develop and document detailed technical designs to meet business requirements.
  • Manage multiple technical environments and support the development and testing processes.
  • Lead or assist in data conversion and migration efforts.
  • Configure Maximo and assist in the development of interfaces to external systems.
  • Identify areas of customization and optimization and provide solutions that meet the business requirements.
  • Conduct system testing, as necessary.

 

Skill Requirements - Essential:

 

  • B.Tech. in Computer Science, Engineering, or a business-related field, and/or equivalent work experience.
  • Strong Maximo technical knowledge required to help execute numerous projects.
  • Minimum eight (8) years of work experience in a technical position involving the implementation and utilization of a fully integrated enterprise asset management system.
  • Able to convert functional requirements into technical specifications, and to configure, tailor, and/or customize the solutions, including building interfaces.
  • Ability to create and update advanced technical documentation.
  • Strong communication skills and the ability to work well in a project team environment.
  • Drafting/Reviewing Functional Specifications
  • Drafting/Reviewing Technical Specifications

 

Skill Requirements - Preferable:

 

  • To bring industry knowledge, world-class capabilities, innovation, and cutting-edge technology to our clients in the Resources industry to deliver business value.
  • To work with leading Resources clients' major customers and suppliers to develop and execute projects and reliability strategies.
  • To harness extensive knowledge combined with an integrated suite of methods, people, and assets to deliver sustainable, long-term solutions.
  • IBM Maximo 7.x certification

 

Person Specification/Attributes:

 

  • Professional and committed, with a disciplined approach to work.
  • Motivated and driven by finding and providing solutions to problems.
  • Polite, tactful, helpful, empathic nature, able to deliver to the needs of customers.
  • Has respect for others and their views.
  • Technology minded and focused, enthusiastic about technologies.
  • Analytical, able to rise above the detail and see the bigger picture.
  • Dedicated to continually updating and upgrading own knowledge.
  • Carries a mind-set of continuous improvement, constantly looking for better and more efficient ways of doing things.
  • Values quality at the centre of all things in work.

 

Due to the considerable amount of virtual working and interaction with colleagues and customers in different physical locations internationally, it is essential that the successful applicant has the drive and ethic to succeed in small teams physically and in larger efforts virtually. Self-drive to communicate constantly using web collaboration and video conferencing is essential.

 

As an employee, you will be encouraged to continually develop your capability & attain certifications to reflect your growth as an individual.

Global SaaS product built to help revenue teams. (TP1)
Agency job
via Multi Recruit by Kavitha S
Bengaluru (Bangalore)
1 - 5 yrs
₹30L - ₹40L / yr
Spark
Data Engineer
Airflow
SQL
NoSQL
  • 3-6 years of relevant work experience in a Data Engineering role.
  • Advanced working SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
  • Experience building and optimizing data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • A good understanding of Airflow, Spark, NoSQL databases, and Kafka is nice to have.
  • Candidates from premier institutes only.
Fragma Data Systems
Posted by Priyanka U
Remote only
4 - 10 yrs
₹12L - ₹23L / yr
Informatica
ETL
Big Data
Spark
SQL
Skill: Informatica with Big Data Management (BDM)

1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working on Spark/SQL
3. Develops Informatica mappings/SQL
4. Should have experience in Hadoop, Spark, etc.

Work days: Sunday to Thursday
Day shift