Lead Data Engineer
at Top 3 Fintech Startup
Agency job
6 - 9 yrs
₹16L - ₹24L / yr
Bengaluru (Bangalore)
Skills
SQL
Amazon Web Services (AWS)
Spark
PySpark
Apache Hive

We are looking for an exceptionally talented Lead Data Engineer with experience implementing AWS services to build data pipelines, integrate APIs, and design data warehouses. A candidate with both hands-on and leadership capabilities will be ideal for this position.

 

Qualification: At least a bachelor's degree in Science, Engineering, or Applied Mathematics; a master's degree is preferred.

 

Job Responsibilities:

• 6+ years of total experience as a Data Engineer, including 2+ years of experience managing a team

• Minimum 3 years of AWS Cloud experience

• Well versed in languages such as Python, PySpark, SQL, and NodeJS

• Extensive experience in the Spark ecosystem, covering both real-time and batch processing

• Experience with AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc.

• Experience with modern database systems such as Redshift, Presto, and Hive

• Has built data lakes in the past, on S3 or with Apache Hudi

• Solid understanding of data warehousing concepts

• Good to have: experience with streaming tools such as Kafka or Kinesis

• Good to have: AWS Developer Associate or Solutions Architect Associate certification


Similar jobs

With a global provider of Business Process Management.
Agency job
via Jobdost by Saida Jabbar
Pune
3 - 8 yrs
₹8L - ₹18L / yr
Microsoft SharePoint
XML
.NET

Job Description

  • Meeting with the client and internal team to review website and application requirements.
  • Setting up project completion timelines with the client and tracking them to closure.
  • Configuring the company SharePoint systems to specified requirements.
  • Developing new web components using XML, .NET, SQL, and C#.
  • Designing, coding, and implementing scalable applications.
  • Extending SharePoint functionality with forms, web parts, and application technologies.
  • Testing and debugging code, troubleshooting software issues.
  • Reviewing website interface and software stability.
  • Maintaining and updating SharePoint applications.
  • Preparing solution design documentation for projects.
  • Providing systems training to staff and customers.
  • Reporting: dashboards, growth reports, and all project reports (MTD, QTD, YTD).
  • Collecting data from the project team to analyse it on a daily or monthly basis.
  • Expertise in MS Office and report generation.
  • Working closely with teams across delivery locations and with the client.
  • SharePoint and VBA skills.
  • Willingness to work in US shifts.
  • Performing requisite MIS (count sheets, internal & external reporting).
  • Adhering to reasonable operational requests from management.
  • Expert in managing new SharePoint sites with approval workflows and maintaining existing sites.
  • High-level coding skills.
  • Ability to solve complex software issues.
  • Detail-oriented.
  • Self-motivated.
Cubera Tech India Pvt Ltd
Posted by Surabhi Koushik
Bengaluru (Bangalore)
2 - 3 yrs
₹24L - ₹35L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
SQL

Data Scientist

 

Cubera is a data company revolutionizing big data analytics and adtech through data-share-value principles, wherein users entrust their data to us. We refine the art of understanding, processing, extracting, and evaluating the data that is entrusted to us. We are a gateway for brands to increase their lead efficiency as the world moves towards Web3.

 

What you’ll do?

 

  • Build machine learning models, perform proof-of-concept, experiment, optimize, and deploy your models into production; work closely with software engineers to assist in productionizing your ML models.
  • Establish scalable, efficient, automated processes for large-scale data analysis, machine-learning model development, model validation, and serving.
  • Research new and innovative machine learning approaches.
  • Perform hands-on analysis and modeling of enormous data sets to develop insights that increase Ad Traffic and Campaign Efficacy.
  • Collaborate with other data scientists, data engineers, product managers, and business stakeholders to build well-crafted, pragmatic data products.  
  • Actively take on new projects and constantly try to improve the existing models and infrastructure necessary for offline and online experimentation and iteration.
  • Work with your team on ambiguous problem areas in existing or new ML initiatives

What are we looking for?

  • Ability to write a SQL query to pull the data you need.
  • Fluency in Python and familiarity with its scientific stack, such as NumPy, pandas, scikit-learn, and matplotlib.
  • Experience with TensorFlow, PyTorch, and/or modelling in R.
  • Ability to understand a business problem and translate and structure it into a data science problem.
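As a rough illustration of the modeling loop this role describes (fit a model, check it, expose it for prediction), here is a minimal sketch using only the standard library. The data, the ad-spend framing, and the closed-form least-squares fit are invented for illustration; in practice the scientific stack (NumPy, scikit-learn) or TensorFlow/PyTorch would do this at scale.

```python
# Toy sketch: fit a simple model and "deploy" it as a function.
# All data and names here are hypothetical.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b, in closed form."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var                 # slope
    b = mean_y - a * mean_x       # intercept
    return a, b

# Hypothetical ad data: spend (x) vs. clicks (y).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

a, b = fit_linear(xs, ys)

def predict(x):
    """The 'deployed' model: clicks expected for a given spend."""
    return a * x + b
```

The same shape of workflow (train, validate, serve a `predict` callable) is what productionizing an ML model with software engineers amounts to, just with real frameworks and real data.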

 

Job Category: Data Science

Job Type: Full Time

Job Location: Bangalore

 

CoreStack
Posted by Dhivya R
Chennai
2 - 6 yrs
₹3L - ₹8L / yr
Amazon Web Services (AWS)
Microsoft Windows Azure
Google Cloud Platform (GCP)
Cloud Computing

Roles & Responsibilities

  • Part of a Cloud Governance product team responsible for installing, configuring, automating, and monitoring various cloud services (IaaS, PaaS, and SaaS)
  • Be at the forefront of cloud technology, assisting a global list of customers that consume multiple cloud environments.
  • Ensure availability of internal and customers' hosts and services through monitoring, analysing metric trends, and investigating alerts.
  • Explore and implement a broad spectrum of open-source technologies. Help the team/customer resolve technical issues.
  • Extremely customer-focused; flexible and available on-call for solving critical problems.
  • Contribute towards process improvement involving product deployments, Cloud Governance, and Customer Success.

Skills Required

    • Minimum 3 years of experience with a B.E./B.Tech
    • Experience in managing Azure IaaS and PaaS services for customer production environments
    • Well versed in DevOps technologies, automation, infrastructure orchestration, configuration management, and CI/CD
    • Experience in Linux and Windows administration, server hardening, and security compliance
    • Web and application server technologies (e.g. Apache, Nginx, IIS)
    • Good command of at least one scripting language (e.g. Bash, PowerShell, Ruby, Python)
    • Networking protocols such as HTTP, DNS, and TCP/IP
    • Experience in managing version control platforms (e.g. Git, SVN)
Extramarks
Posted by Prachi Sharma
Noida, Delhi, Gurugram, Ghaziabad, Faridabad
3 - 5 yrs
₹8L - ₹10L / yr
Tableau
Power BI
Data Analytics
SQL
Python

Required Experience

· 3+ years of relevant technical experience in a data analyst role

· Intermediate to expert skills with SQL and basic statistics

· Experience in Advanced SQL

· Python programming (an added advantage)

· Strong problem-solving and structuring skills

· Experience automating connections to various data sources and representing the data through dashboards

· Excellent with numbers; able to communicate data points through various reports/templates

· Ability to communicate effectively within and outside the Data Analytics team

· Proactively take up work responsibilities and ad-hoc requests as and when needed

· Ability and desire to take ownership of, and initiative for, analysis: from requirements clarification to deliverable

· Strong technical communication skills, both written and verbal

· Ability to understand and articulate the "big picture" and simplify complex ideas

· Ability to identify and learn applicable new techniques independently as needed

· Must have worked with various databases (relational and non-relational) and ETL processes

· Must have experience in handling large volumes of data and adhering to optimization and performance standards

· Should have the ability to analyse and provide relationship views of the data from different angles

· Must have excellent communication skills (written and oral)

· Knowledge of data science is an added advantage

Required Skills

MySQL, Advanced Excel, Tableau, reporting and dashboards, MS Office, VBA, analytical skills

Preferred Experience

· Strong understanding of relational databases such as MySQL

· Prior experience working remotely full-time

· Prior experience working in Advanced SQL

· Experience with one or more BI tools, such as Superset or Tableau

· High level of logical and mathematical ability in problem solving
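To make "Advanced SQL" concrete, here is a minimal, runnable sketch of the kind of query behind a growth dashboard (MTD/QTD-style running totals). SQLite is used purely so the snippet is self-contained; the table, column names, and figures are invented for illustration.

```python
import sqlite3

# Hypothetical sales table; schema and numbers are made up for this sketch.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, revenue INTEGER);
INSERT INTO sales VALUES
  ('North', '2023-01', 100), ('North', '2023-02', 150),
  ('South', '2023-01', 80),  ('South', '2023-02', 120);
""")

# Window function: running total of revenue per region, ordered by month.
rows = con.execute("""
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month)
               AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for r in rows:
    print(r)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern works in MySQL 8+, so the sketch carries over to the stack the posting names.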

Amagi Media Labs
Posted by Rajesh C
Bengaluru (Bangalore), Noida
5 - 9 yrs
₹10L - ₹17L / yr
Data engineering
Spark
Scala
Hadoop
We are looking for: Data Engineer
  • Spark
  • Scala
  • Hadoop
Experience: 5 to 9 years
Notice period: 15 to 30 days
Location: Bangalore / Noida
Product Development
Agency job
via Purple Hirez by Aditya K
Hyderabad
12 - 20 yrs
₹15L - ₹50L / yr
Analytics
Data Analytics
Kubernetes
PySpark
Python

Job Description

We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership in developing and managing continuously improving, robust, scalable software solutions.

Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.

 

Skills

  • Bachelor's/Master's/PhD in CS or equivalent industry experience
  • Demonstrated expertise in building and shipping cloud-native applications
  • 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents like Hive
  • Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
  • Experience with cloud platform services such as AWS, Azure, or GCP, especially with EKS and Managed Kafka
  • 5+ years of industry experience in Python
  • Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
  • Experience with scripting languages (Python highly desirable); experience in API development using Swagger
  • Implementing automated testing platforms and unit tests
  • Proficient understanding of code versioning tools, such as Git
  • Familiarity with continuous integration (e.g. Jenkins)

Responsibilities

  • Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
  • Create custom Operators for Kubernetes and Kubeflow
  • Develop data ingestion processes and ETLs
  • Assist in DevOps operations
  • Design and implement APIs
  • Identify performance bottlenecks and bugs, and devise solutions to these problems
  • Help maintain code quality, organization, and documentation
  • Communicate with stakeholders regarding various aspects of the solution
  • Mentor team members on best practices
Fragma Data Systems
Posted by Evelyn Charles
Remote, Bengaluru (Bangalore)
3.5 - 8 yrs
₹5L - ₹18L / yr
PySpark
Data engineering
Data Warehouse (DWH)
SQL
Spark
Must-Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience in SQL databases; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformations and aggregations
• Good at ELT architecture: business-rules processing and data extraction from a data lake into data streams for business consumption
• Good customer communication
• Good analytical skills
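Spark SQL is largely ANSI SQL, so a "query of fair complexity" of the kind this role expects can be sketched against SQLite purely to keep the snippet self-contained and runnable; in PySpark, essentially the same statement would be passed to `spark.sql(...)` after registering the DataFrames as temp views. The tables and data below are invented for illustration.

```python
import sqlite3

# Hypothetical customer/order tables, made up for this sketch.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER, segment TEXT);
CREATE TABLE orders (customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'retail'), (2, 'retail'), (3, 'corporate');
INSERT INTO orders VALUES (1, 50.0), (1, 70.0), (2, 30.0), (3, 200.0);
""")

# Join + aggregate + filter on the aggregate: total spend per segment,
# keeping only segments above a threshold.
rows = con.execute("""
    SELECT c.segment,
           COUNT(DISTINCT c.id) AS customers,
           SUM(o.amount)        AS total_spend
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.segment
    HAVING SUM(o.amount) > 100
    ORDER BY total_spend DESC
""").fetchall()

print(rows)
```

In PySpark the DataFrame-API equivalent would be a `join` followed by `groupBy("segment").agg(...)` and a `filter` on the aggregate, which is what "DataFrame core functions and Spark SQL" refers to.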
 
 
Technology Skills (Good to Have):
  • Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
  • Experience in migrating on-premise data warehouses to data platforms on the Azure cloud.
  • Designing and implementing data engineering, ingestion, and transformation functions
  • Azure Synapse or Azure SQL Data Warehouse
  • Spark on Azure (available via HDInsight and Databricks)

Good to Have:
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/StreamSets or Informatica
  • Experience with pre-sales activities (responding to RFPs, executing quick POCs)
  • Capacity planning and performance tuning on the Azure stack and Spark
Gulf client
Agency job
via Fragma Data Systems by Priyanka U
Remote, Bengaluru (Bangalore)
5 - 9 yrs
₹10L - ₹20L / yr
Power BI
Data Warehouse (DWH)
SQL
DAX
Power query
Key Skills:
• Strong knowledge of Power BI (DAX + Power Query + Power BI Service + Power BI Desktop visualisations) and Azure data storage
• Should have experience with Power BI mobile dashboards
• Strong knowledge of SQL
• Good knowledge of DWH concepts
• Work as an independent contributor at the client location
• Implementing access control and imposing the required security
• Candidate must have very good communication skills
Dailyhunt
Posted by Khushboo Jain
Bengaluru (Bangalore)
3 - 9 yrs
₹3L - ₹9L / yr
Java
Big Data
Hadoop
Pig
Apache Hive
What You'll Do:
- Develop analytic tools, working on Big Data in a distributed environment. Scalability will be key.
- Provide architectural and technical leadership in developing our core analytics platform.
- Lead development efforts on product features in Java.
- Help scale our mobile platform as we experience massive growth.

What We Need:
- Passion to build an analytics and personalisation platform at scale.
- 3 to 9 years of software engineering experience with a product-based company in the data analytics/big data domain.
- Passion for design and development from scratch.
- Expert-level Java programming and experience leading the full lifecycle of application development.
- Experience in analytics, Hadoop, Pig, Hive, MapReduce, Elasticsearch, and MongoDB is an additional advantage.
- Strong communication skills, verbal and written.
Diverse Lynx
Posted by Nandita Choudhary
Remote, Bengaluru (Bangalore)
5 - 14 yrs
₹11L - ₹22L / yr
Microsoft Dynamics
AppDynamics
Python
C#
Linux/Unix
Detailed Responsibilities & Skills:
·      Installation, configuration, and tuning of the following AppDynamics servers: Controller, Events Service cluster, End User Monitoring, ADA, ADRUM
·      Reviews system design and works to continuously improve stability and efficiency
·      Provides system backup and recovery methodology and makes recommendations regarding enhancements and/or improvements
·      Formulates policies, procedures, and standards relating to system management, and monitors system resource utilization
·      Responsible for reducing operational downtime for critical, scheduled, and unscheduled maintenance by accelerating deployments of approved changes/fixes/updates and solutions, and automating manual maintenance, deployment, diagnostic health checks, validation, and reporting
·      Responsible for creating proactive and reactive monitoring methods and generating customer alerts within the Enterprise Event Management and Monitoring capability
·      Skilled at gathering user requirements; can work independently to craft efficient monitoring and alarming solutions and dashboards
·      Understands the Agile process
·      Ability to operationally support the underlying database as necessary
·      Hands-on Java and/or .NET development
·      IT operations and application support
·      Application and systems performance management, measurement, and analysis
·      Deployment and configuration of complex enterprise software
·      Solid understanding of operating systems (Linux/Windows)
·      Experience with J2EE/LAMP/Microsoft stacks
·      Cloud and containerization experience
·      Strong understanding of built-in O/S monitoring and performance tools
·      Working with a wide variety of platforms and application stacks; ability to quickly understand new application frameworks in customer environments
·      Works with minimal direction as a seasoned resource
·      Supports customer initiatives in their transition towards modernization
·      Tracks own work and backlog; familiar with Agile methodology
·      Prioritizes own work in accordance with user priorities and stakeholder expectations
·      Communicates efficiently and effectively, both written and verbal

Mandatory:
·      Knowledge of APM tools (New Relic / AppDynamics / Datadog / OpenTracing)
·      .NET/Java
·      Linux/Windows
·      SQL
·      Reverse proxy administration (e.g. IIS)
·      APIs
·      Elastic knowledge
·      Fault and performance monitoring tools administration

Good to have: Grafana / Python / GoLang / Bash / PowerShell
