Senior Data Engineer
Fintech Company
Agency job
4 - 6 yrs
₹10L - ₹15L / yr
Bengaluru (Bangalore)
Skills
PySpark
Data engineering
Big Data
Hadoop
Spark

Purpose of Job:
We are looking for an exceptionally talented Senior Data Engineer with hands-on experience using AWS services to build data pipelines, integrate APIs, and design data warehouses.

 

Job Responsibilities:
• 4+ years of total experience as a Data Engineer
• Minimum 3 years of AWS Cloud experience
• Well versed in languages such as Python, PySpark, SQL, and NodeJS
• Extensive experience in the Spark ecosystem, covering both real-time and batch processing (a minimal batch sketch follows this list)
• Experience with AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc.
• Experience with modern database systems such as Redshift, Presto, and Hive
• Has built data lakes on S3 or Apache Hudi
• Solid understanding of data warehousing concepts
• Good to have: experience with tools such as Kafka or Kinesis
• Good to have: AWS Developer Associate or Solutions Architect Associate certification
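As a purely illustrative sketch of the kind of batch pipeline this role describes, the snippet below reads raw events from S3 with PySpark, applies light cleansing, and writes a partitioned Parquet dataset. The bucket names, paths, and fields are hypothetical placeholders, not details of the actual role.

```python
# Minimal PySpark batch job sketch (hypothetical paths and fields).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-batch").getOrCreate()

# Read raw JSON events landed in S3 (placeholder bucket/prefix).
events = spark.read.json("s3://example-raw-bucket/events/2024-01-01/")

# Basic cleansing and enrichment.
cleaned = (
    events
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_timestamp"))
    .filter(F.col("event_type").isNotNull())
)

# Write out as partitioned Parquet for downstream warehouse loads.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/events/")
)

spark.stop()
```

In practice, a job like this would typically run on AWS Glue or EMR and be scheduled with Airflow or Step Functions, per the tooling listed above.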


Qualifications:
At least a bachelor's degree in Science, Engineering, or Applied Mathematics.

Other Requirements: a learning attitude and ownership skills.



Similar jobs

Remote only
2 - 3 yrs
₹5L - ₹7L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+7 more

About the Role:


We are on the lookout for a dynamic Marketing Automation and Data Analytics Specialist, someone who is not only adept in marketing automation/operations but also possesses keen expertise in data analytics and visualization. This role is tailor-made for individuals who are proficient with tools like Eloqua, Marketo, Salesforce Pardot, and Power BI.


As our Marketing Automation and Data Analytics Specialist, your responsibilities will span across managing and optimizing marketing automation systems and overseeing the migration and enhancement of data systems and dashboards. You will play a pivotal role in blending marketing strategies with data analytics, ensuring the creation of visually appealing and effective reports and dashboards. Collaborating closely with marketing teams, you will help in making data-driven decisions that propel the company forward.


We believe in fostering an environment where initiative and self-direction are valued. While you will receive the necessary guidance and support, the autonomy of your role is a testament to our trust in your abilities and professionalism.


Responsibilities:


  • Manage and optimize marketing automation systems (Eloqua, Marketo, Salesforce Pardot) to map and improve business processes.
  • Develop, audit, and enhance data systems, ensuring accuracy and efficiency in marketing efforts.
  • Build and migrate interactive, visually appealing dashboards and reports.
  • Develop and maintain reporting and analytics for marketing efforts, database health, lead scoring, and dashboard performance.
  • Handle technical aspects of key marketing systems and integrate them with data visualization tools like Power BI.
  • Review and improve existing SQL data sources for effective integration and analytics (an illustrative sketch follows this list).
  • Collaborate closely with sales, marketing, and analytics teams to define requirements, establish best practices, and ensure successful outcomes.
  • Ensure all marketing data, dashboards, and reports are accurate and effectively meet business needs.
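As a purely illustrative sketch of the data-shaping side of this role, the snippet below pulls a marketing table from a SQL source with pandas and aggregates it into a lead-level summary that a Power BI dataset could refresh from. The connection string, table, and column names are hypothetical placeholders, not part of the actual stack.

```python
# Illustrative sketch: pull a marketing table from SQL and shape it for a dashboard.
# Connection string, table, and column names are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@example-host:5432/marketing")

# Load raw campaign touches and aggregate to a lead-level summary.
touches = pd.read_sql(
    "SELECT lead_id, campaign, touch_date, score FROM campaign_touches", engine
)

lead_summary = (
    touches
    .assign(touch_date=pd.to_datetime(touches["touch_date"]))
    .groupby("lead_id")
    .agg(touches=("campaign", "count"),
         last_touch=("touch_date", "max"),
         total_score=("score", "sum"))
    .reset_index()
)

# Export for a Power BI (or other BI tool) data source refresh.
lead_summary.to_csv("lead_summary.csv", index=False)
```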


Ideal Candidate Qualities:


  • Strong commitment to the role with a focus on long-term growth.
  • Exceptional communication and collaboration skills across diverse teams.
  • High degree of autonomy and ability to work effectively without micromanagement.
  • Strong attention to detail and organization skills.


Qualifications:


  • Hands-on experience with marketing automation systems and data analytics tools such as Eloqua, Marketo, Salesforce Pardot, and Power BI.
  • Proven experience in data visualization and dashboard creation using Power BI.
  • Experience with SQL, including building and optimizing queries.
  • Knowledge of ABM and Intent Signaling technologies is a plus.
  • Outstanding analytical skills with an ability to work with complex datasets.
  • Familiarity with data collection, cleaning, and transformation processes.


Benefits:


  • Work-from-home flexibility.
  • Career advancement opportunities and professional development support.
  • Supportive and collaborative team environment.


Hiring Process:


The hiring process at InEvolution is thoughtfully designed to ensure alignment between your career goals and our company's objectives. The process will include:


  • Initial Phone Screening: A brief conversation to discuss your background and understand your career aspirations.
  • Team Introduction Interview: Candidates who excel in the first round will engage in a deeper discussion with our team, providing insights into our work culture and the specificities of the role.
  • Technical Assessment: In the final round, you will meet our Technical Director for an in-depth conversation about your technical skills and how these align with the demands of the role.


buy now & pay later
Agency job
via Qrata by Rayal Rajan
Mumbai
2 - 7 yrs
₹5L - ₹10L / yr
Python
SQL
Data Analytics
Communication Skills
Data management

DATA ANALYST

About:

 

 We allow customers to "buy now and pay later" for goods and services purchased through online and offline portals. We are a rapidly growing organization opening up new avenues of payment for online and offline customers.

 

Role:

 

  • Define and continuously refine the analytics roadmap.
  • Build, deploy, and maintain the data infrastructure that supports all of the analysis, including the data warehouse and various data marts.
  • Build, deploy, and maintain the predictive models and scoring infrastructure that powers critical decision management systems (an illustrative scoring sketch follows this list).
  • Strive to devise ways to gather more alternate data and build increasingly enhanced predictive models.
  • Partner with business teams to systematically design experiments to continuously improve customer acquisition, minimize churn, reduce delinquency, and improve profitability.
  • Provide data insights to all business teams through automated queries, MIS, etc.
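For illustration only, here is a minimal sketch of the kind of scoring model mentioned above, using scikit-learn. The dataset, feature names, and target column are hypothetical placeholders, not details of the actual stack.

```python
# Illustrative delinquency-scoring sketch (hypothetical data and features).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

loans = pd.read_csv("loans_sample.csv")  # placeholder dataset
features = ["credit_score", "monthly_income", "utilisation_ratio", "tenure_months"]

X_train, X_test, y_train, y_test = train_test_split(
    loans[features], loans["delinquent"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Score the held-out set and report discrimination (AUC).
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))
```

A production version would add feature engineering, validation, monitoring, and deployment infrastructure around a model like this.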

 

Requirements:
  • 4+ years of deep, hands-on analytics experience in a management consulting, start-up, financial services, or fintech company.
  • Strong knowledge of SQL and Python.
  • Deep knowledge of problem-solving approaches using analytical frameworks.
  • Deep knowledge of frameworks for data management, deployment, and monitoring of performance metrics.
  • Hands-on exposure to delivering improvements through test-and-learn methodologies.
  • Excellent communication and interpersonal skills, with the ability to be pleasantly persistent.

 

 

Location: Mumbai

EnterpriseMinds
Posted by phani kalyan
Pune
9 - 14 yrs
₹20L - ₹40L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more
Job Id: SG0601

Hi,

Enterprise Minds is looking for a Data Architect for its Pune location.

Required Skills:
Python, PySpark, Hadoop, Java, Scala
Matellio India Private Limited
Posted by Harshit Sharma
Remote only
8 - 15 yrs
₹10L - ₹27L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Deep Learning
+7 more

Responsibilities include: 

  • Convert the machine learning models into application program interfaces (APIs) so that other applications can use them (a minimal serving sketch follows this list)
  • Build AI models from scratch and help the different components of the organization (such as product managers and stakeholders) understand what results they gain from the model
  • Build data ingestion and data transformation infrastructure
  • Automate infrastructure that the data science team uses
  • Perform statistical analysis and tune the results so that the organization can make better-informed decisions
  • Set up and manage AI development and product infrastructure
  • Be a good team player, as coordinating with others is a must
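As a hedged, minimal sketch of the first responsibility above (exposing a trained model as an API), the snippet below wraps a serialized model in a small FastAPI service. The model file, request schema, and route are hypothetical placeholders; the actual framework and interface would depend on the team's stack.

```python
# Minimal model-serving sketch (hypothetical model file, schema, and route).
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # placeholder for a previously trained model

class Features(BaseModel):
    values: list[float]  # feature vector expected by the model

@app.post("/predict")
def predict(payload: Features):
    # Run the model on the submitted feature vector and return a JSON response.
    prediction = model.predict([payload.values])[0]
    return {"prediction": float(prediction)}
```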
Cognologix Technologies
Posted by Priyal Wagh
Remote, Pune
4 - 9 yrs
₹10L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more

You will work on: 

 

We help many of our clients make sense of their large investments in data – be it building analytics solutions or machine learning applications. You will work on cutting-edge cloud-native technologies to crunch terabytes of data into meaningful insights. 

 

What you will do (Responsibilities):

 

Collaborate with product management & engineering to build highly efficient data pipelines. 

You will be responsible for:

 

  • Dealing with large customer data and building highly efficient pipelines
  • Building insights dashboards
  • Troubleshooting data loss, data inconsistency, and other data-related issues
  • Working in a product development environment, delivering stories using a scaled agile delivery methodology.

 

What you bring (Skills):

 

5+ years of experience in hands-on data engineering & large-scale distributed applications

 

  • Extensive experience in object-oriented programming languages such as Java or Scala
  • Extensive experience in RDBMS such as MySQL, Oracle, SQLServer, etc.
  • Experience in functional programming languages such as JavaScript, Scala, or Python
  • Experience in developing and deploying applications in Linux OS
  • Experience in big data processing technologies such as Hadoop, Spark, Kafka, Databricks, etc.
  • Experience in Cloud-based services such as Amazon AWS, Microsoft Azure, or Google Cloud Platform
  • Experience with Scrum and/or other Agile development processes
  • Strong analytical and problem-solving skills

 

Great if you know (Skills):

 

  • Some exposure to containerization technologies such as Docker, Kubernetes, or Amazon ECS/EKS
  • Some exposure to microservices frameworks such as Spring Boot, Eclipse Vert.x, etc.
  • Some exposure to NoSQL data stores such as Couchbase, Solr, etc.
  • Some exposure to Perl or shell scripting
  • Ability to lead R&D and POC efforts
  • Ability to learn new technologies independently
  • Team player with self-drive to work independently
  • Strong communication and interpersonal skills

Advantage Cognologix:

  •  A higher degree of autonomy, startup culture & small teams
  •  Opportunities to become an expert in emerging technologies
  •  Remote working options for the right maturity level
  •  Competitive salary & family benefits
  •  Performance-based career advancement


About Cognologix: 

 

Cognologix helps companies disrupt by reimagining their business models and innovating like a startup. We are at the forefront of digital disruption and take a business-first approach to help meet our clients’ strategic goals.

We are a data-focused organization helping our clients deliver their next generation of products in the most efficient, modern, and cloud-native way.

Benefits Working With Us:

  • Health & Wellbeing
  • Learn & Grow
  • Evangelize 
  • Celebrate Achievements
  • Financial Wellbeing
  • Medical and Accidental cover.
  • Flexible Working Hours.
  • Sports Club & much more.
Bengaluru (Bangalore)
5 - 7 yrs
₹12L - ₹17L / yr
Data Engineer
Hadoop
Spark
Kafka
Big Data
+3 more
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.

 

Basic Qualifications

 

  • 2+ years of experience in a Data Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases
    • Experience with data pipeline and workflow management tools
    • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
    • Experience with stream-processing systems: Storm, Spark Streaming, etc. (an illustrative streaming sketch follows this list)
    • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
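For illustration, here is a minimal Spark Structured Streaming sketch of the stream-processing experience referenced above: it consumes messages from Kafka and lands them in an Azure Data Lake path. The broker, topic, and storage paths are hypothetical placeholders, and running it assumes the Spark Kafka connector package is available on the cluster.

```python
# Illustrative Spark Structured Streaming sketch: Kafka -> data lake.
# Broker, topic, and storage paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-ingest").getOrCreate()

stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary; cast the value to string for downstream parsing.
parsed = stream.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "abfss://curated@examplelake.dfs.core.windows.net/events/")
    .option("checkpointLocation",
            "abfss://curated@examplelake.dfs.core.windows.net/_checkpoints/events/")
    .start()
)

query.awaitTermination()
```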
Numantra Technologies
Posted by Vandana Saxena
Mumbai, Navi Mumbai
2 - 8 yrs
₹5L - ₹12L / yr
Microsoft Windows Azure
ADF
NumPy
PySpark
Databricks
+1 more
- Experience and expertise in using Azure cloud services. Azure certification will be a plus.

- Experience and expertise in Python development and its different libraries like PySpark, pandas, and NumPy

- Expertise in ADF, Databricks.

- Creating and maintaining data interfaces across a number of different protocols (file, API); an illustrative sketch follows this list.

- Creating and maintaining internal business process solutions to keep our corporate system data in sync and reduce manual processes where appropriate.

- Creating and maintaining monitoring and alerting workflows to improve system transparency.

- Facilitate the development of our Azure cloud infrastructure relative to Data and Application systems.

- Design and lead development of our data infrastructure including data warehouses, data marts, and operational data stores.

- Experience in using Azure services such as ADLS Gen 2, Azure Functions, Azure messaging services, Azure SQL Server, Azure KeyVault, Azure Cognitive services etc.
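As a small, purely illustrative sketch of a file/API data interface of the kind described above, the snippet below pulls records from a REST endpoint and lands them as a raw JSON file for downstream ADF/Databricks processing. The endpoint URL and output path are hypothetical placeholders.

```python
# Illustrative file/API data interface sketch (hypothetical endpoint and path).
import json
import requests

response = requests.get("https://api.example.com/v1/orders", timeout=30)
response.raise_for_status()
orders = response.json()

# Land the raw payload as a dated JSON file for downstream ADF/Databricks processing.
with open("orders_2024-01-01.json", "w") as handle:
    json.dump(orders, handle)
```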
Bengaluru (Bangalore), Mumbai, Gurugram, Nashik, Pune, Visakhapatnam, Chennai, Noida
3 - 5 yrs
₹8L - ₹12L / yr
Oracle Analytics
OAS
OAC
Oracle OAS
Oracle
+8 more

Oracle OAS Developer

 

 

Senior OAS/OAC (Oracle Analytics) designer and developer with 3+ years of experience. Has worked on the new Oracle Analytics platform, using its latest features and designing custom plug-ins in Java. Has a good understanding of the various graph types and data points, and of their appropriate use for displaying financial data. Has worked on performance tuning and building complex data security requirements.

Qualifications



Bachelor's degree in Engineering/Computer Science.

Additional information

Knowledge of Financial and HR dashboards.

 

Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
+13 more
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, from text, video and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data Catalogue and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.

You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.

You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation and documentation of the technical aspects, including integration, to ensure the solution meets customer requirements. You will be working closely with fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out data-driven solutions.
Expertise:
  • Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics).
  • Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
  • Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
  • Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API, SFTP, etc.); an illustrative sketch follows this list.
  • Azure DevOps knowledge, essential for CI/CD of data ingestion pipelines and integrations.
  • Experience with object-oriented/object function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
  • Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
  • Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
  • Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus.
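To illustrate the feed-integration bullet above, here is a minimal sketch that normalises a structured (JSON) feed and a flat (CSV) feed into a common tabular shape with pandas. The file names, record path, and join keys are hypothetical placeholders.

```python
# Illustrative sketch: normalise a structured (JSON) feed and a flat (CSV) feed
# into one tabular shape. File names, fields, and keys are hypothetical placeholders.
import json
import pandas as pd

# Structured feed: nested JSON flattened into columns.
with open("suppliers_feed.json") as handle:
    suppliers_json = json.load(handle)
suppliers = pd.json_normalize(suppliers_json, record_path="suppliers")

# Flat feed: CSV read directly.
invoices = pd.read_csv("invoices_feed.csv")

# Join on a shared key for downstream warehouse loading.
combined = invoices.merge(suppliers, left_on="supplier_id", right_on="id", how="left")
print(combined.head())
```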
Essential Experience:
  • 5 or more years of hands-on experience in a data architect role, covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack.
  • Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
  • Microsoft Azure and Data certifications, at least at fundamentals level, are a must.
  • Experience using agile development methodologies, version control systems, and repositories is a must.
  • A good, applied understanding of the end-to-end data process development life cycle.
  • A good working knowledge of data warehouse methodology using Azure SQL.
  • A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must.
  • Experience working in the Public sector, or in an organisation servicing the Public sector, is a must.
  • Ability to work to demanding deadlines, keep momentum, and deal with conflicting priorities in an environment undergoing a programme of transformational change.
  • The ability to contribute and adhere to standards, with excellent attention to detail and a strong drive for quality.
Desirables:
  • Experience with AWS or Google Cloud platforms will be an added advantage.
  • Experience with Azure ML services will be an added advantage.

Personal Attributes:
  • Articulate and clear in communications to mixed audiences: in writing, through presentations, and one-to-one.
  • Ability to present highly technical concepts and ideas in business-friendly language.
  • Ability to effectively prioritise and execute tasks in a high-pressure environment.
  • Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment.
  • Extensive experience working in a team-oriented, collaborative environment as well as working independently.
  • Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect.
  • Excellent interpersonal skills within teams and in building trust with clients.
  • Ability to support and work with cross-functional teams in a dynamic environment.
  • A passion for achieving business transformation and the ability to energise and excite those you work with.
  • Initiative: the ability to work flexibly in a team and to work comfortably without direct supervision.
LatentView Analytics
Posted by talent acquisition
Chennai
3 - 5 yrs
₹0L / yr
Business Analysis
Analytics
Python
Looking for Immediate Joiners

At LatentView, we would expect you to:
  • Independently handle delivery of analytics assignments
  • Mentor a team of 3 - 10 people and deliver to exceed client expectations
  • Co-ordinate with onsite LatentView consultants to ensure high quality, on-time delivery
  • Take responsibility for technical skill-building within the organization (training, process definition, research of new tools and techniques, etc.)

You'll be a valuable addition to our team if you have:
  • 3 - 5 years of hands-on experience in delivering analytics solutions
  • Great analytical skills and a detail-oriented approach
  • Strong experience in R, SAS, Python, SQL, SPSS, Statistica, MATLAB or such analytic tools (preferable)
  • Working knowledge of MS Excel, PowerPoint and data visualization tools like Tableau, etc.
  • Ability to adapt and thrive in the fast-paced environment that young companies operate in
  • A background in Statistics / Econometrics / Applied Math / Operations Research / MBA, or alternatively an engineering degree from a premier institution.