Data Engineer
Posted by Amala Baby
2 - 4 yrs
₹2L - ₹20L / yr
Pune
Skills
Data Analytics
Data Visualization
PowerBI
Tableau
QlikView
Spotfire
Apache Kafka
SQL
Amazon Web Services (AWS)
Big Data
DynamoDB
MongoDB
EMR
Amazon Redshift
ETL
Data architecture
Data modeling

Company Profile:

Easebuzz is a payment solutions (fintech) company that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a place where payments, lending, subscriptions, and eKYC all happen at the same time.

We have been consistently profitable and are constantly developing innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and closed a $4M fundraise in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

Easebuzz Pvt. Ltd. has a presence in Pune, Bangalore, and Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Roles and Responsibilities:

  • Experience with ETL, data modeling, and data architecture
  • Design, build, and operationalize large-scale enterprise data solutions and applications using AWS data and analytics services (Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue) in combination with third-party tools
  • Experience with an AWS cloud data lake for developing real-time or near-real-time use cases
  • Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing
  • Build data pipeline frameworks to automate high-volume and real-time data delivery
  • Create prototypes and proofs of concept for iterative development
  • Experience with NoSQL databases, such as DynamoDB, MongoDB, etc.
  • Create and maintain optimal data pipeline architecture
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies
  • Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader
  • Evangelize a high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into the engineering and science workflows
  • Build and enhance data pipeline architecture by designing and implementing data ingestion solutions
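The extract-transform-load pattern at the heart of these responsibilities can be sketched in plain Python. This is only an illustrative sketch: the module layout and field names are made up, and a production pipeline would target AWS services such as Kinesis, Glue, and Redshift rather than in-memory lists.

```python
# Minimal ETL sketch: extract records, apply a transform, load into a sink.
# The source and sink are in-memory lists so the flow is easy to follow.

def extract(source):
    """Yield raw payment records from a source iterable."""
    for record in source:
        yield record

def transform(record):
    """Normalize a raw record: amounts to paise (int), uppercase currency."""
    return {
        "merchant_id": record["merchant_id"],
        "amount_paise": int(round(record["amount"] * 100)),
        "currency": record.get("currency", "inr").upper(),
    }

def load(records, sink):
    """Append transformed records to the sink and return the count loaded."""
    count = 0
    for record in records:
        sink.append(record)
        count += 1
    return count

raw = [{"merchant_id": "m1", "amount": 49.99},
       {"merchant_id": "m2", "amount": 5.00, "currency": "usd"}]
warehouse = []
loaded = load((transform(r) for r in extract(raw)), warehouse)
```

Keeping each stage a separate function is what makes a pipeline easy to automate and monitor: each stage can be swapped for a managed service without touching the others.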

 

Employment Type

Full-time

 


About EASEBUZZ

Founded: 2016
Stage: Raised funding

Accept hassle-free payments with the Easebuzz online payment solution, with 50+ payment modes. It also enables businesses to accept, process, and disburse payments.

Connect with the team: Amala Baby

Similar jobs

Digit88
Posted by Khushboo Mishra
Pune
4 - 7 yrs
₹6L - ₹20L / yr
MS-Excel
SQL

To be successful in this role, you should possess:

  • Overall industry experience of 2-6 years
  • A bachelor's degree in an analytical subject area, e.g., Engineering, Statistics, etc.
  • Proficiency in advanced Excel functions and macros, involving complex calculations and pivots
  • Exceptional analytical, problem-solving, and logical skills
  • An understanding of relational database concepts and familiarity with SQL
  • Demonstrable aptitude for innovation and problem solving
  • Good communication skills and the ability to work across cross-functional teams
  • An understanding of complex utility tariffs, rates, and programs, and the ability to convert these into a model
  • Participation in sprint planning and other ceremonies, passionately working towards the committed sprint goals
  • The ability to automate routine tasks using Python is a PLUS


Preferred Qualifications:

  • Experience in the energy industry and familiarity with basic concepts of utility (electrical/gas...) tariffs
  • Experience and knowledge of tools such as Microsoft Excel macros
  • Familiarity with writing programs using Python or shell scripts
  • A passion for working with data and data analysis
  • 1+ years of experience with Agile methodology


Roles and responsibilities

  • Understand complex utility tariffs, rates, and programs, and convert these into a model
  • Analyze energy utilization data for cost and usage patterns, and derive meaningful patterns
  • Maintain the tariff models with timely updates for price, logic, or other enhancements per client requirements
  • Assist the delivery team in validating the input data received from the client for modelling work
  • Communicate and coordinate with the delivery team
  • Work with cross-functional teams to resolve issues in the modelling tool
  • Build and deliver compelling demonstrations/visualizations of products
  • Be a lifelong learner and develop your skills continuously
  • Contribute to the success of a rapidly growing and evolving organization


Additional Project/Soft Skills:


  • Should be able to work independently with India- and US-based team members
  • Strong verbal and written communication, with the ability to articulate problems and solutions over phone and email
  • A strong sense of urgency, with a passion for accuracy and timeliness
  • The ability to work calmly in high-pressure situations and manage multiple projects/tasks
  • The ability to work independently, with superior skills in issue resolution
Mumbai, Navi Mumbai
6 - 14 yrs
₹16L - ₹37L / yr
skill iconPython
PySpark
Data engineering
Big Data
Hadoop
+3 more

Role: Principal Software Engineer


We are looking for a passionate Principal Engineer - Analytics to build data products that extract valuable business insights for efficiency and customer experience. The role involves managing, processing, and analyzing large amounts of raw information in scalable databases. It will also involve developing unique data structures and writing algorithms for an entirely new set of products. The candidate will need critical thinking and problem-solving skills, must be experienced in software development with advanced algorithms, and must be able to handle large volumes of data. Exposure to statistics and machine learning algorithms is a big plus. The candidate should have some exposure to cloud environments, continuous integration, and Agile Scrum processes.



Responsibilities:


• Lead projects both as a principal investigator and as a project manager, responsible for meeting project requirements on schedule

• Software development that creates data-driven intelligence in products backed by Big Data

• Exploratory analysis of the data, to come up with efficient data structures and algorithms for given requirements

• The system may or may not involve machine learning models and pipelines, but will require advanced algorithm development

• Managing data in large-scale data stores (such as NoSQL DBs, time-series DBs, geospatial DBs, etc.)

• Creating metrics and evaluating algorithms for better accuracy and recall

• Ensuring efficient access and usage of data through indexing, clustering, etc.

• Collaborate with engineering and product development teams.
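Evaluating an algorithm for accuracy and recall, as mentioned above, comes down to comparing predictions against ground truth. A minimal sketch in plain Python (the labels are made-up example data; in practice a library such as scikit-learn would provide these metrics):

```python
# Compute accuracy, precision, and recall for binary predictions:
# the standard evaluation metrics referred to above.

def evaluate(y_true, y_pred):
    """Return (accuracy, precision, recall) for two equal-length label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

acc, prec, rec = evaluate([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

Precision measures how many flagged items were real; recall measures how many real items were flagged. Which one to optimize depends on the product requirement.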


Requirements:


• A master's or bachelor's degree in Engineering in one of these domains - Computer Science, Information Technology, Information Systems, or a related field - from a top-tier school

• OR a master's degree or higher in Statistics or Mathematics, with a hands-on background in software development

• 8 to 10 years of experience with product development, having done algorithmic work

• 5+ years of experience working with large data sets or doing large-scale quantitative analysis

• Understanding of SaaS-based products and services

• Strong algorithmic problem-solving skills

• Able to mentor and manage a team and take responsibility for team deadlines


Skill set required:


• In-depth knowledge of the Python programming language

• Understanding of software architecture and software design

• Must have fully managed a project with a team

• Experience with Agile project management practices

• Experience with data processing, analytics, and visualization tools in Python (such as pandas, matplotlib, SciPy, etc.)

• Strong understanding of SQL and of querying NoSQL databases (e.g. MongoDB, Cassandra, Redis)

Reqroots
Posted by Dhanalakshmi D
Chennai
2 - 5 yrs
₹5L - ₹10L / yr
Oracle Application Express (APEX)
PL/SQL
SQL
skill iconJavascript
skill iconHTML/CSS

#HiringAlert

We are looking for an "Oracle APEX" developer for a reputed client (permanent role).

Must Have Skills:

1. Experience in Oracle APEX and SQL/PLSQL concepts: collections, functions, procedures, packages, and types - Mandatory

2. Experience in JASPER / OBIEE reports

Good To Have Skills:

1. Knowledge of Oracle Forms and Reports, JavaScript, HTML, CSS, or any other web-based programming language is an added advantage (Optional)

Key Responsibilities:

1. Design, develop, test, and implement features and code using Oracle Application Express (APEX)

2. Perform requirements analysis, development, design, testing, and deployment of custom Oracle applications

3. Write SQL queries, views, materialized views, PL/SQL procedures, functions, packages, triggers, cursors, collections, ref cursors, cursor variables, system reference cursors, and dynamic SQL

4. Debug code and maintain PL/SQL procedures and functions

5. Optimize existing applications and scripts for both usability and performance

6. Adhere to policies, processes, and procedures within your areas of responsibility

7. Evaluate new features and technologies, overseeing their implementation into the environment as appropriate

8. Prioritize, manage, deliver, and report on multiple projects simultaneously; be highly motivated and able to work against aggressive deadlines and schedules
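Oracle-specific PL/SQL needs an Oracle instance to run, but the core SQL work listed above - tables, views, parameterized queries - can be sketched with Python's stdlib sqlite3 standing in for the database. The schema here is a made-up example, not from the posting:

```python
import sqlite3

# Sketch of the SQL work described above: create a table, a view over it,
# and run a parameterized query. sqlite3 stands in for an Oracle database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])

# A view aggregating order totals per customer (analogous to an Oracle view).
conn.execute("""CREATE VIEW customer_totals AS
                SELECT customer, SUM(amount) AS total
                FROM orders GROUP BY customer""")

# Parameterized query against the view; never interpolate values by hand.
row = conn.execute(
    "SELECT total FROM customer_totals WHERE customer = ?",
    ("acme",)).fetchone()
total = row[0]
```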

Professional Attributes:

1. Ability to work with minimal supervision.

2. Exhibits strong ethics, positive attitude, and calm demeanor under pressure.

3. Flexible, organized and possesses the ability to work in a fast-paced environment.

Location : Chennai

Agilisium
Agency job
via Recruiting India by Moumita Santra
Chennai
10 - 19 yrs
₹12L - ₹40L / yr
Big Data
Apache Spark
Spark
PySpark
ETL
+1 more

Job Sector: IT, Software

Job Type: Permanent

Location: Chennai

Experience: 10 - 20 Years

Salary: 12 – 40 LPA

Education: Any Graduate

Notice Period: Immediate

Key Skills: Python, Spark, AWS, SQL, PySpark

Contact at triple eight two zero nine four two double seven

 

Job Description:

Requirements

  • Minimum 12 years' experience
  • In-depth understanding and knowledge of distributed computing with Spark
  • Deep understanding of Spark architecture and internals
  • Proven experience in data ingestion, data integration, and data analytics with Spark, preferably PySpark
  • Expertise in ETL processes, data warehousing, and data lakes
  • Hands-on experience with Python for Big Data and analytics
  • Hands-on experience with the Agile Scrum model is an added advantage
  • Knowledge of CI/CD and orchestration tools is desirable
  • Knowledge of AWS S3, Redshift, and Lambda is preferred
Thanks
Tredence
Posted by Suchismita Das
Bengaluru (Bangalore), Gurugram, Chennai, Pune
8 - 10 yrs
Best in industry
skill iconMachine Learning (ML)
skill iconData Science
Natural Language Processing (NLP)
skill iconR Programming
SQL
+1 more

THE IDEAL CANDIDATE WILL

 

  • Engage with executive-level stakeholders from the client's team to translate business problems into a high-level solution approach
  • Partner closely with practice and technical teams to craft well-structured, comprehensive proposals / RFP responses, clearly highlighting Tredence's competitive strengths relevant to the client's selection criteria
  • Actively explore the client's business and formulate solution ideas that can improve process efficiency and cut costs, or achieve growth/revenue/profitability targets faster
  • Work hands-on across various MLOps problems and provide thought leadership
  • Grow and manage large teams with diverse skill sets
  • Collaborate, coach, and learn with a growing team of experienced Machine Learning Engineers and Data Scientists

 

 

 

ELIGIBILITY CRITERIA

 

  • BE/BTech/MTech (specialization/courses in ML/DS)
  • At least 7 years of consulting services delivery experience
  • Very strong problem-solving skills and work ethic
  • Strong analytical/logical thinking, storyboarding, and executive communication skills
  • 5+ years of experience in Python/R and SQL
  • 5+ years of experience in NLP algorithms, regression and classification modelling, and time series forecasting
  • Hands-on work experience in DevOps
  • Good knowledge of the different deployment types: PaaS, SaaS, IaaS
  • Exposure to cloud technologies such as Azure, AWS, or GCP
  • Knowledge of Python and packages for data analysis (scikit-learn, SciPy, NumPy, pandas, matplotlib)
  • Knowledge of deep learning frameworks: Keras, TensorFlow, PyTorch, etc.
  • Experience with one or more container ecosystems (Docker, Kubernetes)
  • Experience in building orchestration pipelines to convert plain Python models into a deployable API / RESTful endpoint
  • Good understanding of OOP and data structures concepts

 

 

Nice to Have:

 

  • Exposure to deployment strategies like: Blue/Green, Canary, AB Testing, Multi-arm Bandit
  • Experience in Helm is a plus
  • Strong understanding of data infrastructure, data warehouse, or data engineering

 

You can expect to -

  • Work with the world's biggest retailers and help them solve some of their most critical problems. Tredence is a preferred analytics vendor for some of the largest retailers across the globe
  • Create multi-million-dollar business opportunities by leveraging an impact mindset, cutting-edge solutions, and industry best practices
  • Work in a diverse environment that keeps evolving
  • Hone your entrepreneurial skills as you contribute to the growth of the organization

 

 

Fragma Data Systems
Posted by Minakshi Kumari
Remote only
1 - 5 yrs
₹10L - ₹15L / yr
SQL
PySpark
Responsible for developing and maintaining applications with PySpark.
• Contribute to the overall design and architecture of the applications developed and deployed.
• Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
• Interact with business users to understand requirements and troubleshoot issues.
• Implement projects based on functional specifications.

Must Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience with SQL databases; able to write queries of fair complexity.
• Excellent experience in Big Data programming for data transformation and aggregation
• Good at ELT architecture: business rules processing and data extraction from a data lake into data streams for business consumption.
• Good customer communication.
• Good analytical skills
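In PySpark, the "business rules processing and aggregation" above would use DataFrame operations (`filter`, `groupBy`, `agg`). Since running Spark needs a local session or cluster, the same filter-then-aggregate logic is sketched here in plain Python; the field names and the business rule are illustrative only:

```python
from collections import defaultdict

# Sketch of a Spark-SQL-style aggregation in plain Python: apply a business
# rule (filter), then group and sum. In PySpark this would be roughly:
#   df.filter(df.status == "SUCCESS").groupBy("merchant").agg(F.sum("amount"))

rows = [
    {"merchant": "m1", "status": "SUCCESS", "amount": 100.0},
    {"merchant": "m1", "status": "FAILED",  "amount": 40.0},
    {"merchant": "m2", "status": "SUCCESS", "amount": 70.0},
    {"merchant": "m1", "status": "SUCCESS", "amount": 25.0},
]

totals = defaultdict(float)
for row in rows:
    if row["status"] == "SUCCESS":   # business rule: successful transactions only
        totals[row["merchant"]] += row["amount"]

result = dict(totals)
```

The difference in Spark is that the same logic runs partitioned across executors, which is where the executor sizing and partition tuning mentioned above come in.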
Helical IT Solutions
Posted by Bhavani Thanga
Hyderabad
0 - 0 yrs
₹1.2L - ₹3.5L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more

Job description

About Company
Helical Insight, an open-source Business Intelligence tool from Helical IT Solutions Pvt. Ltd, based out of Hyderabad, is looking for freshers with strong knowledge of SQL. Helical Insight has more than 50+ clients from various sectors and has been awarded the most promising company in the Business Intelligence space. We are looking for a rockstar teammate to join our company.

Job Brief
We are looking for a Business Intelligence (BI) Developer to create and manage BI and analytics solutions that turn data into knowledge.
In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. If you also have business acumen and a problem-solving aptitude, we'd like to meet you. Excellent knowledge of SQL queries is required. Basic knowledge of HTML, CSS, and JS is required.
You would be working closely with customers from various domains to understand their data, understand their business requirements, and deliver the required analytics in the form of various reports, dashboards, etc. This is an excellent client-interfacing role with the opportunity to work across various sectors and geographies, as well as various kinds of databases, including NoSQL, RDBMS, graph DBs, columnar DBs, etc.
Skill set and qualifications required

Responsibilities
  • Attending client calls to gather requirements and show progress
  • Translate business needs into technical specifications
  • Design, build, and deploy BI solutions (e.g. reporting tools)
  • Maintain and support data analytics platforms
  • Conduct unit testing and troubleshooting
  • Evaluate and improve existing BI systems
  • Collaborate with teams to integrate systems
  • Develop and execute database queries and conduct analyses
  • Create visualizations and reports for requested projects
  • Develop and update technical documentation

Requirements
  • Excellent expertise in SQL queries
  • Proven experience as a BI Developer or Data Scientist
  • Background in data warehouse design (e.g. dimensional modeling) and data mining
  • In-depth understanding of database management systems, online analytical processing (OLAP), and the ETL (extract, transform, load) framework
  • Familiarity with BI technologies
  • Proven ability to take initiative and be innovative
  • An analytical mind with a problem-solving aptitude
  • BE in Computer Science/IT
Education: BE/ BTech/ MCA/BCA/ MTech/ MS, or equivalent preferred.
Interested candidates call us on +91 7569 765 162
MNC
Agency job
via Fragma Data Systems by Harpreet Kour
Pune
6 - 12 yrs
₹25L - ₹27L / yr
ETL QA
ETL
HP Quality Center administration
Oracle
Agile management
+4 more
• 5+ years of testing experience, preferably in a product organization
• Strong knowledge of SQL and ETL testing
• Extensive experience in ETL / data warehouse backend testing and BI reports testing
• Hands-on backend testing skills and strong RDBMS and testing methodologies
• Expertise in test management tools and defect tracking tools, i.e., HP Quality Center, Jira
• Proficient experience working with the SDLC and Agile methodology
• Excellent knowledge of database systems: Vertica / Oracle / Teradata
• Knowledge of security testing will be an added advantage
• Experience in Business Intelligence testing of various reports using Tableau
• Strong comprehension, analytical, and problem-solving skills
• Good interpersonal and communication skills, a quick learner, and good troubleshooting capabilities
• Good knowledge of the Python programming language
• Working knowledge of AWS
MNC
Agency job
via Fragma Data Systems by Priyanka U
Remote, Bengaluru (Bangalore)
2 - 6 yrs
₹6L - ₹15L / yr
Spark
Apache Kafka
PySpark
Internet of Things (IOT)
Real time media streaming

JD for IoT Data Engineer:

 

The role requires experience in Azure core technologies: IoT Hub / Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight / Databricks, and SQL Data Warehouse.

 

You Have:

  • Minimum 2 years of software development experience
  • Minimum 2 years of experience in IoT / streaming data pipeline solution development
  • Bachelor's and/or master's degree in computer science
  • Strong consulting skills in data management, including data governance, data quality, security, data integration, processing, and provisioning
  • Delivered data management projects with real-time / near-real-time data insights delivery on Azure Cloud
  • Translated complex analytical requirements into technical designs, including data models, ETLs, and dashboards/reports
  • Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
  • Experience with different computing paradigms in databases, such as in-memory, distributed, and massively parallel processing
  • Successfully delivered large-scale IoT data management initiatives covering the Plan, Design, Build, and Deploy phases, leveraging delivery methodologies including Agile
  • Experience in handling telemetry data with Spark Streaming, Kafka, Flink, Scala, PySpark, and Spark SQL
  • Hands-on experience with containers and Docker
  • Exposure to streaming protocols like MQTT and AMQP
  • Knowledge of OT network protocols like OPC UA, CAN bus, and similar protocols
  • Strong knowledge of continuous integration, static code analysis, and test-driven development
  • Experience in delivering projects in a highly collaborative delivery model with teams onsite and offshore
  • Excellent analytical and problem-solving skills
  • Delivered change management initiatives focused on driving data platform adoption across the enterprise
  • Strong verbal and written communication skills, and the ability to work effectively across internal and external organizations

Roles & Responsibilities
 

You Will:

  • Translate functional requirements into technical design
  • Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design
  • Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks
  • Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
  • Deliver data models on Azure platform, it could be on Azure Cosmos, SQL DW / Synapse, or SQL
  • Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
  • Automate core activities to minimize the delivery lead times and improve the overall quality
  • Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
  • Deploy Azure DevOps and CI/CD processes
  • Deploy logging and monitoring across the different integration points for critical alerts

 

The other Fruit
Posted by Dipendra Singh
Pune
1 - 5 yrs
₹3L - ₹15L / yr
skill iconMachine Learning (ML)
Artificial Intelligence (AI)
skill iconPython
Data Structures
Algorithms
+17 more
 
SD (ML and AI) job description:

  • An advanced degree in computer science, math, statistics, or a related discipline (a master's degree is a must)
  • Extensive data modeling and data architecture skills
  • Programming experience in Python and R
  • Background in machine learning frameworks such as TensorFlow or Keras
  • Knowledge of Hadoop or another distributed computing system
  • Experience working in an Agile environment
  • Advanced math skills (important): linear algebra, discrete math, differential equations (ODEs and numerical), theory of statistics, numerical analysis 1 (numerical linear algebra) and 2 (quadrature), abstract algebra, number theory, real analysis, complex analysis, and intermediate analysis (point-set topology)
  • Strong written and verbal communication
  • Hands-on experience with NLP and NLG
  • Experience in advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and experience with their application