Lead Big Data Engineer
at PayU

Posted by Vishakha Sonde

Remote, Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹40L / yr (ESOP available)

Skills
PySpark
SQL
SQL Azure
Amazon Web Services (AWS)
Python
Job description

We are looking to add a Lead Big Data Engineer to our team. We would love to connect, tell you more about what we're building, and learn more about what interests you.

Company Name: PayU - kindly visit our company site for more details.

Location: Bangalore, Mumbai & Gurgaon

Role and Background Information:

As a Lead/Manager - Data Engineering, you will execute the strategy and roadmap for data engineering at PayU. You will define, evolve, and mature PayU's existing data platform and processing framework, which handles terabytes of data from our various properties.

You will work with data engineering, analytics, and data science teams to identify needs and contribute to building the next-generation data ecosystem.

You will engage with analysts and leaders to research and develop new data engineering capabilities.

Ingest data from files, streams, and databases; process it with Python and PySpark and store it in a time-series database.

Develop programs in Python for data extraction, cleaning, transformation, and processing. Develop and maintain scalable data pipelines.

Develop REST APIs.

What you’d need to bring to the table:  

Overall 5-10 years of experience as a Data Engineer.

Advanced working SQL knowledge to create complex queries.

Experience working with time-series databases and relational databases, as well as familiarity with a variety of structured and unstructured data stores.

Hands-on experience with visualization tools like Grafana and Power BI.

Experience in designing and implementing scalable architecture.

Good experience with object-oriented programming in Python.

Very strong in Object-Oriented Analysis and Design (OOAD).

Strong knowledge of REST APIs.

Experience working with Azure cloud services (IaaS and PaaS).

Hands-on experience with Microsoft Azure services like ADLS/Blob Storage solutions, Event Hubs, Service Bus, scale sets, Load Balancers, Azure Functions, and Databricks.

Hands-on experience with Kafka.

Knowledge of continuous integration/continuous deployment (CI/CD).

Experience with data migration and deployment from on-premises to cloud environments and vice versa.

This is an individual contributor role.
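To make the advanced-SQL requirement above concrete, here is a hypothetical sketch of the kind of "complex query" it alludes to: a CTE feeding a window function. The `payments` table and its columns are invented for illustration, and SQLite merely stands in for whatever database the team actually uses.

```python
import sqlite3

# Illustrative only: a toy payments table with invented names.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE payments (merchant TEXT, day TEXT, amount REAL);
INSERT INTO payments VALUES
  ('m1', '2023-01-01', 100.0),
  ('m1', '2023-01-02', 250.0),
  ('m2', '2023-01-01', 80.0),
  ('m2', '2023-01-03', 40.0);
""")

# CTE aggregates per merchant/day; the window function then computes
# a running total of payment volume per merchant, ordered by day.
rows = conn.execute("""
WITH daily AS (
  SELECT merchant, day, SUM(amount) AS total
  FROM payments
  GROUP BY merchant, day
)
SELECT merchant, day,
       SUM(total) OVER (PARTITION BY merchant ORDER BY day) AS running_total
FROM daily
ORDER BY merchant, day
""").fetchall()

for merchant, day, running_total in rows:
    print(merchant, day, running_total)
```

Window functions require SQLite 3.25+, which ships with any recent Python.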

Education:

Bachelor of Engineering (IIT/NIT/BITS preferred)

About PayU
PayU is a fintech company that provides financial solutions for local and cross-border merchants in emerging markets, including POS credit and alternative payment methods.
Founded
2002
Type
Product
Size
500-1000 employees
Stage
Profitable
Similar jobs
Founded 2018  •  Products & Services  •  100-1000 employees  •  Profitable
PowerBI
Python
Tableau
SQL
Data modeling
Chennai, Bengaluru (Bangalore)
2 - 4 yrs
Best in industry
We are looking for a developer to design and deliver strategic data-centric insights leveraging next-generation analytics and BI technologies. We want someone who is data-centric and insight-centric rather than report-centric; someone wishing to make an impact by enabling innovation and growth, with passion for what they do and a vision for the future.

Responsibilities:
  • Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
  • Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
  • Become an expert on data and trends, both internal and external to Kaleidofin.
  • Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
  • Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
  • Automate scheduling and distribution of reports and support auditing and value realization.
  • Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
  • Design robust data-centric solutions and architecture that incorporates technology and strong BI solutions to scale up and eliminate repetitive tasks.
Requirements:
  • Experience leading development efforts through all phases of SDLC.
  • 2+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
  • Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
  • Hands on experience in SQL, data management, and scripting (preferably Python).
  • Strong data visualisation design skills, data modeling and inference skills.
  • Hands-on and experience in managing small teams.
  • Financial services experience preferred, but not mandatory.
  • Strong knowledge of architectural principles, tools, frameworks, and best practices.
  • Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
  • Candidates with less than 30 days' notice period preferred.
Job posted by Poornima B
at Agilisium
Agency job
Big Data
Spark
PySpark
ETL
Data engineering
Chennai
10 - 19 yrs
₹12L - ₹40L / yr

Job Sector: IT, Software

Job Type: Permanent

Location: Chennai

Experience: 10 - 20 Years

Salary: 12 – 40 LPA

Education: Any Graduate

Notice Period: Immediate

Key Skills: Python, Spark, AWS, SQL, PySpark

Job Description:

Requirements

  • Minimum 12 years' experience.
  • In-depth understanding of distributed computing with Spark.
  • Deep understanding of Spark architecture and internals.
  • Proven experience in data ingestion, data integration, and data analytics with Spark, preferably PySpark.
  • Expertise in ETL processes, data warehousing, and data lakes.
  • Hands-on with Python for big data and analytics.
  • Hands-on experience in an agile Scrum model is an added advantage.
  • Knowledge of CI/CD and orchestration tools is desirable.
  • AWS S3, Redshift, and Lambda knowledge is preferred.
Job posted by Moumita Santra
Spark
Python
SQL
Bengaluru (Bangalore)
2 - 5 yrs
₹7L - ₹12L / yr
Primary Responsibilities:
• Responsible for developing and maintaining applications with PySpark
• Contribute to the overall design and architecture of the application developed and deployed.
• Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
• Interact with business users to understand requirements and troubleshoot issues.
• Implement Projects based on functional specifications.
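The executor-sizing and partition-tuning responsibility above is usually exercised at submit time. The flags below are a hedged sketch only: the script name is hypothetical and the numbers depend entirely on the cluster and workload, but each flag is a standard `spark-submit` option.

```shell
# Illustrative executor sizing and partition tuning for a PySpark job.
# All values are placeholders to be derived from the actual cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 10 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=200 \
  --conf spark.dynamicAllocation.enabled=false \
  my_pipeline.py
```

Dynamic allocation is disabled here so the fixed executor count applies; enabling it instead lets YARN scale executors with load.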


Must-Have Skills:

• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good customer communication skills
• Good analytical skills
Job posted by Geeti Gaurav Mohanty
Founded 2014  •  Products & Services  •  20-100 employees  •  Bootstrapped
Data Science
R Programming
Python
SQL
Natural Language Processing (NLP)
Remote, Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹20L / yr

- Modeling complex problems, discovering insights, and identifying opportunities through the use of statistical, algorithmic, mining, and visualization techniques

- Experience working with the business to understand requirements, create problem statements, and build scalable and dependable analytical solutions

- Must have hands-on and strong experience in Python

- Broad knowledge of fundamentals and state-of-the-art in NLP and machine learning

- Strong analytical & algorithm development skills

- Deep knowledge of techniques such as Linear Regression, gradient descent, Logistic Regression, Forecasting, Cluster analysis, Decision trees, Linear Optimization, Text Mining, etc

- Ability to collaborate across teams and strong interpersonal skills

 

Skills

- Sound theoretical knowledge of ML algorithms and their applications

- Hands-on experience in statistical modeling tools such as R, Python, and SQL

- Hands-on experience in Machine learning/data science

- Strong knowledge of statistics

- Experience in advanced analytics / Statistical techniques – Regression, Decision trees, Ensemble machine learning algorithms, etc

- Experience in Natural Language Processing & Deep Learning techniques 

- Pandas, NLTK, Scikit-learn, SpaCy, Tensorflow
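One of the techniques listed above, fitting a linear regression by gradient descent, can be sketched in a few lines of plain Python. This is purely illustrative (toy data, hand-picked learning rate); real work would use scikit-learn or statsmodels.

```python
# Minimal gradient-descent fit of y = w*x + b on toy data.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]  # generated by y = 2x + 1

w, b = 0.0, 0.0
lr = 0.02
n = len(xs)

for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 3), round(b, 3))  # converges toward w ≈ 2, b ≈ 1
```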

Job posted by Vishal Sharma
Founded 1987  •  Product  •  500-1000 employees  •  Profitable
Amazon Web Services (AWS)
Python
Scala
Go Programming (Golang)
Java
AWS Lambda
ECS
NLB
Amazon S3
Apache Aurora
Spark
PySpark
Apache Kafka
Redis
Amazon VPC
athena
Amazon EMR
Serverless
Kubernetes
fargate
ALB
Glue
cloudwatch
container
Remote only
2 - 6 yrs
₹12L - ₹18L / yr

Designation: Specialist - Cloud Service Developer (ABL_SS_600)

Position description:

  • The person will be primarily responsible for developing solutions using AWS services, e.g. Fargate, Lambda, ECS, ALB, NLB, S3, etc.
  • Apply advanced troubleshooting techniques to provide solutions to issues pertaining to service availability, performance, and resiliency
  • Monitor and optimize performance using AWS dashboards and logs
  • Partner with engineering leaders and peers in delivering technology solutions that meet business requirements
  • Work with the cloud team in an agile approach and develop cost-optimized solutions

 

Primary Responsibilities:

  • Develop solutions using AWS services including Fargate, Lambda, ECS, ALB, NLB, S3, etc.

 

Reporting Team

  • Reporting Designation: Head - Big Data Engineering and Cloud Development (ABL_SS_414)
  • Reporting Department: Application Development (2487)

Required Skills:

  • AWS certification would be preferred
  • Good understanding of monitoring (CloudWatch, alarms, logs, custom metrics, Trust SNS configuration)
  • Good experience with Fargate, Lambda, ECS, ALB, NLB, S3, Glue, Aurora, and other AWS services
  • Knowledge of storage (S3, lifecycle management, event configuration) preferred
  • Good grasp of data structures and programming in PySpark / Python / Golang / Scala
Job posted by Naim Punasiya
Founded 2018  •  Products & Services  •  employees  •  Profitable
PowerBI
MS-Excel
SQL
DAX
SSIS
Tableau
ETL
Bengaluru (Bangalore)
1 - 3 yrs
₹3L - ₹5L / yr
Required Skills and Experience
• General or strong IT background, with at least 2 to 4 years of working experience
• Strong understanding of data integration and ETL methodologies
• Demonstrated ability to multi-task
• Excellent English communication skills
• A desire to be part of a growing company. You'll have two core responsibilities (client work and company building), and we expect dedication to both.
• Willingness to learn and work on new technologies
• Should be a quick self-learner

Tools:
1. Good knowledge of Power BI and Tableau
2. Good experience in handling data in Excel.
Job posted by Jerrin Thomas
Founded 2013  •  Products & Services  •  100-1000 employees  •  Profitable
Python
R Programming
SQL
Tableau
PowerBI
MS-Excel
Remote, Bengaluru (Bangalore)
2 - 4 yrs
₹4L - ₹12L / yr

The Company

We are a young, fast-growing AI company shaking up how work gets done across the enterprise. Every day, we help clients identify opportunities for automation, and then use a variety of AI and advanced automation techniques to rapidly model manual work in the form of code. Our impact has already been felt across some of the most reputable Fortune 500 companies, who are consequently seeing major gains in efficiency, client satisfaction, and overall savings. It’s an exciting experience to watch companies transform themselves rapidly with Soroco!

Based across US, UK, and India, our team includes several PhDs and graduates from top-notch universities such as MIT, Harvard, Carnegie Mellon, Dartmouth, and top rankers/medalists from the IITs and NITs. The senior leadership includes a former founder of a VC/hedge fund, a computer scientist from Harvard, and a former founder of a successful digital media firm. Our team has collectively published more than 100 papers in international journals and conferences and been granted over 20 patents. Our board members include some of the most well-known entrepreneurs across the globe, and our early clients include some of the most innovative Fortune 100 companies. 

 

The Role

In this individual contributor role, the Business Analyst (BA) will work closely with the Data Science Manager in India. BAs will be primarily responsible for analyzing improvement opportunities in business processes, people productivity, and application usage experience, along with other advanced analytics projects using data collected by the Soroco scout platform, for clients from diverse industries.

Responsibilities include (but are not limited to):

  • Understanding project objectives and frame analytics approach to provide the solution.
  • Take ownership in extracting, cleansing, structuring & analyzing data
  • Analyze data using statistical or rule-based techniques to identify actionable insights.
  • Prepare PowerPoint presentation/build visualization solutions for presenting the analysis & actionable insights to client.
  • Brainstorm and perform root cause analysis to provide suggestions to improve scout platform.
  • Work closely with product managers to build analytical features in the product.
  • Manage multiple projects simultaneously, in a fast-paced setting
  • Communicate effectively with client engagement, product, and engineering teams

The Candidate

An ideal BA should be passionate and entrepreneurial in nature, with a flexible attitude to learn anything and a willingness to provide the highest level of professional service.

  • 2-4 years of analytics work experience with a University degree in Engineering, preferably from Tier-1 or Tier-2 colleges.
  • Possess the skill to creatively solve analytical problems and propose solutions.
  • Ability to perform data manipulation and data modeling with complex data using SQL/Python
  • Knowledge of statistics and experience using statistical packages for analyzing datasets (R/Python)
  • Proficiency in Microsoft Office Excel and PowerPoint.
  • Impeccable attention to detail with excellent prioritization skills
  • Effective verbal, written and interpersonal communication skills.
  • Must be a team player and able to build strong working relationships with stakeholders
  • Strong capabilities and experience with programming in Python (Numpy & Pandas)

Bonus Skills:   

  • Knowledge of machine learning techniques (clustering, classification, and sequencing, among others)
  • Experience with visualization tools like Tableau, PowerBI, Qlik.

How You Will Grow:

Soroco believes in supporting you and your career. We will encourage you to grow by providing professional development opportunities across multiple business functions. Joining a young company will allow you to explore what is possible and have a high impact.

Job posted by Priyadarshini Rao
Founded 2015  •  Services  •  20-100 employees  •  Profitable
ETL
SQL
Informatica PowerCenter
Remote only
₹1L - ₹20L / yr

If you are an outstanding ETL Developer with a passion for technology and looking forward to being part of a great development organization, we would love to hear from you. We are offering technology consultancy services to our Fortune 500 customers with a primary focus on digital technologies. Our customers are looking for top-tier talents in the industry and willing to compensate based on your skill and expertise. The nature of our engagement is Contract in most cases. If you are looking for the next big step in your career, we are glad to partner with you. 

 

Below is the job description for your review.

Extensive hands-on experience in designing and developing ETL packages using SSIS

Extensive experience in performance tuning of SSIS packages

In-depth knowledge of data warehousing concepts and ETL systems, and relational databases like SQL Server 2012/2014.

Job posted by Karthik Padmanabhan
Founded 2007  •  Products & Services  •  20-100 employees  •  Raised funding
Python
Amazon Web Services (AWS)
Google Cloud Storage
Big Data
Data Analytics
Datawarehousing
Software Development
Data Science
Bengaluru (Bangalore)
₹7L - ₹20L / yr
DESCRIPTION:

We're looking for an experienced Data Engineer with strong cloud technology experience to join our team and help our big data team take our products to the next level. This is a hands-on role: you will be required to code and develop the product in addition to your leadership role. You need a strong software development background and a love of working with cutting-edge big data platforms. You are expected to bring extensive hands-on experience with Amazon Web Services (Kinesis streams, EMR, Redshift), Spark, and other big data processing frameworks and technologies, as well as advanced knowledge of RDBMS and data warehousing solutions.

REQUIREMENTS:

  • Strong background working on large-scale data warehousing and data processing solutions
  • Strong Python and Spark programming experience
  • Strong experience in building big data pipelines
  • Very strong SQL skills are an absolute must
  • Good knowledge of OO, functional, and procedural programming paradigms
  • Strong understanding of various design patterns
  • Strong understanding of data structures and algorithms
  • Strong experience with Linux operating systems
  • At least 2+ years of experience working as a software developer in a data-driven environment
  • Experience working in an agile environment
  • Lots of passion, motivation, and drive to succeed!

Highly desirable:

  • Understanding of agile principles, specifically Scrum
  • Exposure to Google Cloud Platform services such as BigQuery, Compute Engine, etc.
  • Docker, Puppet, Ansible, etc.
  • Understanding of the digital marketing and digital advertising space would be advantageous

BENEFITS:

Datalicious is a global data technology company that helps marketers improve customer journeys through the implementation of smart data-driven marketing strategies. Our team of marketing data specialists offers a wide range of skills suitable for any challenge and covers everything from web analytics to data engineering, data science, and software development.

Experience: Join us at any level and we promise you'll feel up-levelled in no time, thanks to the fast-paced, transparent, and aggressive growth of Datalicious.

Exposure: Work with ONLY the best clients in the Australian and SEA markets; every problem you solve would directly impact millions of real people at a large scale across industries.

Work Culture: Voted one of the Top 10 Tech Companies in Australia. Never a boring day at work, and we walk the talk. The CEO organises nerf-gun bouts in the middle of a hectic day.

Money: We'd love to have a long-term relationship because long-term benefits are exponential. We encourage people to get technical certifications via online courses or digital schools.

So if you are looking for the chance to work for an innovative, fast-growing business that will give you exposure across a diverse range of the world's best clients, products, and industry-leading technologies, then Datalicious is the company for you!
Job posted by Ramjee Ganti
Founded 2006  •  Products & Services  •  100-1000 employees  •  Profitable
Amazon Web Services (AWS)
Big Data
Business Intelligence (BI)
Pune
₹13L - ₹25L / yr
The hunt is for an AWS Big Data / DWH Architect with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within the internal teams and with partner organizations and suppliers.

We at Nitor Infotech, a Product Engineering Services company, are always on the hunt for the best talent in the IT industry, in keeping with our trend of "What next in IT." We are scouting for result-oriented resources with a passion for product, technology services, and creating great customer experiences; someone who can take the current expertise and footprint of Nitor Infotech Inc. to an altogether different dimension and level, in tune with emerging market trends, and ensure Brilliance @ Work continues to prevail in whatever we do. Nitor Infotech works with global ISVs to help them build and accelerate their product development. Nitor is able to do so because product development is its DNA, and this DNA is enriched by its 10 years of expertise, best practices, and frameworks and accelerators. Because of this ability, Nitor Infotech has been able to build business relationships with product companies having revenues from $50 million to $1 billion.

  • 7-12+ years of relevant experience working in the Database, BI, and Analytics space, with over 0-2 years of architecting and designing data warehouse experience, including 2 to 3 years in the Big Data ecosystem
  • Experience in data warehouse design in AWS
  • Strong architecting, programming, and design skills, and a proven track record of architecting and building large-scale, distributed big data solutions
  • Professional and technical advice on Big Data concepts and technologies, in particular highlighting the business potential through real-time analysis
  • Provides technical leadership in the Big Data space (Hadoop stack like M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores like MongoDB, Cassandra, HBase, etc.)
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Evaluate and recommend the Big Data technology stack for the platform
  • Drive significant technology initiatives end to end and across multiple layers of architecture
  • Should have breadth of BI knowledge, including MSBI, database design, and new visualization tools like Tableau, QlikView, and Power BI
  • Understand internals and intricacies of old and new DB platforms, including strong RDBMS fundamentals in any of SQL Server / MySQL / Oracle; DB and DWH design; designing semantic models using OLAP and tabular models using MS and non-MS tools; NoSQL DBs including document, graph, search, and columnar DBs
  • Excellent communication skills and a strong ability to build good rapport with prospective and existing customers
  • Be a mentor and go-to person for junior team members

Qualification & Experience:

  • Educational qualification: BE/ME/B.Tech/M.Tech, BCA/MCA/BCS/MCS, or any other degree with a relevant IT qualification.
Job posted by Balakumar Mohan