Data Engineer

at Incubyte

Posted by Lifi Lawrance
Remote only
2 - 3 yrs
₹8L - ₹20L / yr
Full time
Skills
Data engineering
Spark
SQL
Windows Azure
MySQL
Python
ETL
ADF

Who are we?

 

We are incubators of high-quality, dedicated software engineering teams for our clients. We work with product organizations to help them scale or modernize their legacy technology solutions, and with startups to help them operationalize their ideas efficiently. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long-term commitments, with the aim of bringing a product mindset into services.

 

What we are looking for

 

We’re looking to hire software craftspeople: people who are proud of the way they work and the code they write, people who believe in and are evangelists of Extreme Programming principles, and high-quality, motivated, passionate people who make great teams. We strongly believe in being a DevOps organization, where developers own the entire release cycle and thus get to work not only with programming languages but also with infrastructure technologies in the cloud.

 

What you’ll be doing

 

First, you will be writing tests. You’ll be writing self-explanatory, clean code. Your code will produce the same, predictable results, over and over again. You’ll be making frequent, small releases. You’ll be working in pairs. You’ll be doing peer code reviews.
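Since writing tests first is central to this role, here is a minimal illustrative sketch of the test-first rhythm (hypothetical code, not from the posting — the helper `dedupe` and its behaviour are invented for the example):

```python
# Hypothetical test-first example: the test is written BEFORE the code it exercises.
def test_dedupe_preserves_order():
    assert dedupe([3, 1, 3, 2, 1]) == [3, 1, 2]
    assert dedupe([]) == []

# Only then is the implementation written -- just enough to make the test pass,
# producing the same, predictable result every run.
def dedupe(items):
    """Remove duplicates while keeping first-seen order (deterministic output)."""
    seen = set()
    return [x for x in items if not (x in seen or seen.add(x))]
```

The point is the order of work, not the helper itself: a failing test comes first, then the smallest change that makes it pass, then a refactor.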

 

You will work in a product team, building products and rapidly rolling out new features and fixes.

 

You will be responsible for all aspects of development – from understanding requirements, writing stories, analyzing the technical approach to writing test cases, development, deployment, and fixes. You will own the entire stack from the front end to the back end to the infrastructure and DevOps pipelines. And, most importantly, you’ll be making a pledge that you’ll never stop learning!

 

Skills you need in order to succeed in this role

Most Important: Integrity of character, diligence and the commitment to do your best

Must Have: SQL, Databricks (Scala / PySpark), Azure Data Factory, Test Driven Development

Nice to Have: SSIS, Power BI, Kafka, Data Modeling, Data Warehousing

 

Self-Learner: You must be extremely hands-on and obsessive about delivering clean code

 

  • Sense of Ownership: Do whatever it takes to meet development timelines
  • Experience in creating end-to-end data pipelines
  • Experience in Azure Data Factory (ADF), creating multiple pipelines and activities for full and incremental data loads into Azure Data Lake Store and Azure SQL DW
  • Working experience in Databricks
  • Strong in BI/DW/data lake architecture, design, and ETL
  • Strong in requirement analysis, data analysis, and data modeling
  • Experience in object-oriented programming, data structures, algorithms, and software engineering
  • Experience working in Agile and Extreme Programming methodologies in a continuous deployment environment
  • Interest in mastering technologies like relational DBMSs, TDD, CI tools like Azure DevOps, complexity analysis, and performance
  • Working knowledge of server configuration and deployment
  • Experience using source control and bug tracking systems, and writing user stories and technical documentation
  • Expertise in creating tables, procedures, functions, triggers, indexes, views, and joins, and in optimizing complex queries
  • Experience with database versioning, backups, and restores
  • Expertise in data security
  • Ability to perform database performance tuning
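As a hedged sketch of the full vs. incremental load pattern the role calls for: ADF typically drives incremental loads with a stored watermark (a Lookup activity reads it, a Copy activity filters on it). The same idea in plain SQL, with invented table and column names:

```python
import sqlite3

# Hypothetical illustration of the watermark pattern behind incremental loads.
# Table names, columns, and dates are invented for the example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (id INTEGER, amount REAL, modified_at TEXT);
    INSERT INTO source_orders VALUES
        (1, 10.0, '2024-01-01'), (2, 20.0, '2024-01-05'), (3, 30.0, '2024-01-09');
    CREATE TABLE watermark (last_loaded TEXT);
    INSERT INTO watermark VALUES ('2024-01-03');  -- high-water mark of the previous run
""")

def incremental_load(conn):
    """Copy only rows modified after the stored watermark, then advance it."""
    (wm,) = conn.execute("SELECT last_loaded FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT id, amount, modified_at FROM source_orders WHERE modified_at > ?", (wm,)
    ).fetchall()
    if rows:
        new_wm = max(r[2] for r in rows)  # highest timestamp seen this run
        conn.execute("UPDATE watermark SET last_loaded = ?", (new_wm,))
    return rows

delta = incremental_load(conn)  # only orders modified after the watermark qualify
```

A full load is the degenerate case: reset the watermark and copy everything. ADF expresses the same filter as a dynamic query in the Copy activity's source.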

About Incubyte

Founded: 2020
Type: Services
Size: 20-100
Stage: Bootstrapped

Who we are

We are Software Craftspeople. We are proud of the way we work and the code we write. We embrace and are evangelists of eXtreme Programming practices. We heavily believe in being a DevOps organization, where developers own the entire release cycle and thus own quality. And most importantly, we never stop learning!


We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their idea efficiently. We work with large established institutions to help them create internal applications to automate manual operations and achieve scale.

We design software, and design the team as well as the organizational strategy required to successfully release robust and scalable products. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long-term commitments, with the aim of bringing a product mindset into services. More on our website: https://incubyte.co

 

Join our team! We’re always looking for like-minded people!

Connect with the team
Rushali Parikh
Arohi Parikh
Karishma Shah
Lifi Lawrance
Gouthami Vallabhaneni
Shilpi Gupta
Pooja Karnani

Similar jobs

Posted by Kayum Ansari
Mumbai
5 - 7 yrs
₹5L - ₹9L / yr
Python
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
recommendation algorithm

Job Details:-

Designation - Data Scientist

Urgently required (notice period of 15 days maximum).

Location:- Mumbai

Experience:- 5-7 years.

Package Offered:- Rs.5,00,000/- to Rs.9,00,000/- pa.

 

Data Scientist

 

Job Description:-

Responsibilities:

  • Identify valuable data sources and automate collection processes
  • Undertake preprocessing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams

 

Requirements:

  • Proven experience as a Data Scientist or Data Analyst
  • Experience in data mining
  • Understanding of machine-learning and operations research
  • Knowledge of R, SQL and Python; familiarity with Scala, Java is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
  • Analytical mind and business acumen
  • Strong math skills (e.g. statistics, algebra)
  • Problem-solving aptitude
  • Excellent communication and presentation skills
  • BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
A global Business Process Management company
Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
4 - 10 yrs
₹15L - ₹22L / yr
SQL Azure
ADF
Business process management
Windows Azure
SQL
+12 more

Desired Competencies:

 

• Expertise in Azure Data Factory V2
• Expertise in other Azure components like Data Lake Store, SQL Database, Databricks
• Must have working knowledge of Spark programming
• Good exposure to data projects dealing with data design and source-to-target documentation, including defining transformation rules
• Strong knowledge of the CI/CD process
• Experience in building Power BI reports
• Understanding of different components like pipelines, activities, datasets & linked services
• Exposure to dynamic configuration of pipelines using datasets and linked services
• Experience in designing, developing, and deploying pipelines to higher environments
• Good knowledge of file formats for flexible usage and file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.)
• Strong knowledge of SQL queries
• Must have worked in full life-cycle development from functional design to deployment
• Should have working knowledge of Git, SVN
• Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.
• Should have working knowledge of different resources available in Azure like Storage Account, Synapse, Azure SQL Server, Azure Databricks, Azure Purview
• Any experience related to metadata management, data modelling, and related tools (Erwin, ER Studio, or others) would be preferred

 

Preferred Qualifications:

• Bachelor's degree in Computer Science or Technology
• Proven success in contributing to a team-oriented environment
• Proven ability to work creatively and analytically in a problem-solving environment
• Excellent communication (written and oral) and interpersonal skills

Qualifications

BE/BTECH

KEY RESPONSIBILITIES :

You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts, and other extracts, and delivering these via SSIS, SSRS, SSAS, and Power BI. The role plays a vital part in delivering a single version of the truth on the client’s data, and in delivering the MI & BI that enable both operational and strategic decision making.

You will be able to take responsibility for projects over the entire software lifecycle and work with minimum supervision. This would include technical analysis, design, development, and test support as well as managing the delivery to production.

The initial project being resourced is around the development and implementation of a Data Warehouse and associated MI/BI functions.

 

Principal Activities:

1. Interpret written business requirements documents.
2. Specify (high-level design and tech spec), code, and write automated unit tests for new aspects of the MI/BI service.
3. Write clear and concise supporting documentation for deliverable items.
4. Be a member of the skilled development team, willing to contribute, share experiences, and learn as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third-line support for internally developed software.
7. Create and maintain continuous deployment pipelines.
8. Help maintain Development Team standards and principles.
9. Contribute and share learning and experiences with the greater Development team.
10. Work within the company’s approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
13. Comply with any reasonable instructions or regulations issued by the Company from time to time, including those set out in the dealing and other manuals, staff handbooks, and all other group policies.


Location – Bangalore

 

Agency job
via Nu-Pie by Sanjay Biswakarma
Bengaluru (Bangalore)
8 - 10 yrs
₹12L - ₹18L / yr
Tableau Developer
Tableau
Dashboard
SQL
MySQL
JD:
The responsibilities of a Tableau developer include creating technical solutions, creating data storage tools, and conducting tests. To be successful as a Tableau developer, you should have a broad understanding of the business technology landscape, the ability to design reports, and strong analytical skills.
Remote only
7 - 13 yrs
₹15L - ₹35L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
Experience
Experience Range: 2 Years - 10 Years
Function: Information Technology
Desired Skills
Must Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience in SQL databases; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformations and aggregations
• Good at ELT architecture: business-rule processing and data extraction from the data lake into data streams for business consumption
• Good customer communication
• Good analytical skills
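The transformation-and-aggregation work described above can be sketched as follows. This is a hypothetical plain-Python stand-in with invented field names; in PySpark the same step would be a `groupBy`/`agg` on a DataFrame or an equivalent Spark SQL query:

```python
from collections import defaultdict

# Hypothetical business-rule aggregation. In PySpark this would be roughly:
#   df.filter(df.status == "ok").groupBy("region").agg(F.sum("amount"))
events = [
    {"region": "south", "amount": 120.0, "status": "ok"},
    {"region": "north", "amount": 80.0,  "status": "ok"},
    {"region": "south", "amount": 50.0,  "status": "rejected"},
]

def revenue_by_region(rows):
    """Apply a business rule (drop rejected rows), then aggregate per region."""
    totals = defaultdict(float)
    for row in rows:
        if row["status"] == "ok":  # business rule: only settled events count
            totals[row["region"]] += row["amount"]
    return dict(totals)

summary = revenue_by_region(events)
```

The design choice being illustrated: business rules run during the transform step, after raw extraction from the lake, so the stream consumed by the business only ever contains rule-compliant rows.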
Education
Education Type: Engineering
Degree / Diploma: Bachelor of Engineering, Bachelor of Computer Applications, Any Engineering
Specialization / Subject: Any Specialisation
Job Type: Full Time
Job ID: 000018
Department: Software Development
Sopra Steria
Agency job
via Mount Talent Consulting by Himani Jain
Chennai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 8 yrs
₹2L - ₹12L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+1 more
Good hands-on experience with Spark and Scala.
Should have experience in Big Data and Hadoop.
Currently providing WFH.
Immediate joiners, or those with a notice period of up to 30 days.
Posted by Ankit Raj
Bengaluru (Bangalore)
0 - 2 yrs
₹4L - ₹6L / yr
MS-Excel
SQL
Python
Business Analysis
Microsoft Excel
+1 more

About Kapiva:

Kapiva is a modern ayurvedic nutrition brand focused on bringing selectively sourced, natural foods to Indian consumers. Inculcating the wisdom of India's ancient food traditions, Kapiva's high-quality product range includes herbal juices, nutrition powders, ayurvedic gummies, healthy staples, and much more. Our products are top performers on online marketplaces such as Amazon, Flipkart, and Big Basket, and we're growing our presence offline in a big way (Nature’s Basket, Reliance Retail, Noble Plus, etc.). We’re also funded by India’s best consumer VC fund, Fireside Ventures.


About the role:

We are looking for a motivated data analyst with sound experience in handling web/ digital analytics, to join us as part of the Kapiva D2C Business Team. This team is primarily responsible for driving sales and customer engagement on our website (www.kapiva.in). This channel has grown 5x in revenue over the last 12 months and is poised to grow another 5x over the next six. It represents a high-growth, important part of Kapiva’s overall e-commerce growth strategy.

The mandate here is to run an end-to-end sustainable e-commerce business, boost sales through marketing campaigns, and build a cutting edge product (website) that optimizes the customer’s journey as well as increases customer lifetime value.

The Data Analyst will support the business heads by providing data-backed insights in order to drive customer growth, retention, and engagement. They will be required to set up and manage reports, test various hypotheses, and coordinate with various stakeholders on a day-to-day basis.


Location: Bangalore


Job Responsibilities:

Strategy and planning:

  • Work with the D2C functional leads and support analytics planning on a quarterly/ annual basis
  • Identify reports and analytics needed to be conducted on a daily/ weekly/ monthly frequency
  • Drive planning for hypothesis-led testing of key metrics across the customer funnel

Analytics:

  • Interpret data, analyze results using statistical techniques and provide ongoing reports
  • Analyze large amounts of information to discover trends and patterns
  • Work with business teams to prioritize business and information needs
  • Collaborate with engineering and product development teams to setup data infrastructure as needed

Reporting and communication:

  • Prepare reports / presentations to present actionable insights that can drive business objectives
  • Setup live dashboards reporting key cross-functional metrics
  • Coordinate with various stakeholders to collect useful and required data
  • Present findings to business stakeholders to drive action across the organization
  • Propose solutions and strategies to business challenges

 

Requirements sought:

Must-haves:

  • Bachelor’s/ Masters in Mathematics, Economics, Computer Science, Information Management, Statistics or related field
  • 0-2 years’ experience in an analytics role, preferably in a consumer business; proven experience as a Data Analyst / Data Scientist
  • High proficiency in MS Excel and SQL
  • Knowledge of one or more programming languages like Python/ R. Adept at queries, report writing and presenting findings
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy - working knowledge of statistics and statistical methods
  • Ability to work in a highly dynamic environment across cross-functional teams; good at coordinating with different departments and managing timelines
  • Exceptional English written/verbal communication
  • A penchant for understanding consumer traits and behavior and a keen eye to detail

Good to have:

  • Hands-on experience with one or more web analytics tools like Google Analytics, Mixpanel, Kissmetrics, Heap, Adobe Analytics, etc.
  • Experience in using business intelligence tools like Metabase, Tableau, Power BI is a plus
  • Experience in developing predictive models and machine learning algorithms
Posted by Newali Hazarika
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data engineering
Oracle
+7 more

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume the data from multiple sources

  • Writing a data transformation layer using DBT to transform millions of records into the data warehouse

  • Implement Data warehouse entities with common re-usable data model designs with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)

 

What To Bring

  • 5+ years of software development experience; startup experience is a plus

  • Past experience of working with Airflow and DBT is preferred

  • 5+ years of experience working in any backend programming language. 

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)

  • Experienced with the formulation of ideas; building proof-of-concept (POC) and converting them to production-ready projects

  • Experience building and deploying applications on on-premise and AWS or Google Cloud cloud-based infrastructure

  • Basic understanding of Kubernetes & Docker is a must

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English.

 

Remote only
5 - 20 yrs
₹10L - ₹30L / yr
Data Science
Mathematics
Python
Machine Learning (ML)
Amazon Web Services (AWS)
+1 more

Introduction

The Biostrap platform extracts many metrics related to health, sleep, and activity.  Many algorithms are designed through research and often based on scientific literature, and in some cases they are augmented with or entirely designed using machine learning techniques.  Biostrap is seeking a Data Scientist to design, develop, and implement algorithms to improve existing metrics and measure new ones. 

Job Description

As a Data Scientist at Biostrap, you will take on projects to improve or develop algorithms to measure health metrics, including:

  • Research: search literature for starting points of the algorithm
  • Design: decide on the general idea of the algorithm, in particular whether to use machine learning, mathematical techniques, or something else.
  • Implement: program the algorithm in Python, and help deploy it.  

The algorithms and their implementation will have to be accurate, efficient, and well-documented.
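As a hedged illustration of the research → design → implement flow described above (the signal values, window size, and resting-rate heuristic here are invented for the example, not Biostrap's actual algorithm):

```python
# Hypothetical sketch: smoothing a sampled heart-rate signal with a simple
# moving average, then applying a crude resting-rate heuristic.
def moving_average(signal, window):
    """Smooth a sampled signal with a fixed-size sliding window."""
    if window <= 0 or window > len(signal):
        raise ValueError("window must be between 1 and len(signal)")
    return [
        sum(signal[i : i + window]) / window
        for i in range(len(signal) - window + 1)
    ]

bpm = [62, 64, 90, 63, 61, 62]      # invented raw samples with one motion artifact
smoothed = moving_average(bpm, 3)   # the artifact's influence is damped
resting_estimate = min(smoothed)    # invented heuristic, stated for illustration only
```

In practice the design step decides between a mathematical technique like this and a learned model; either way the implementation must be accurate, efficient, and documented, as the description says.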

Requirements

  • A Master’s degree in a computational field, with a strong mathematical background. 
  • Strong knowledge of, and experience with, different machine learning techniques, including their theoretical background.  
  • Strong experience with Python
  • Experience with Keras/TensorFlow, and preferably also with RNNs
  • Experience with AWS or similar services for data pipelining and machine learning.  
  • Ability and drive to work independently on an open problem.
  • Fluency in English.
at Niro
Posted by Vinay Gurram
Bengaluru (Bangalore)
2 - 4 yrs
₹7L - ₹15L / yr
Risk assessment
Risk Management
Risk analysis
Python
SAS
+2 more
  • Gather information from multiple data sources and make approval decisions mechanically
  • Read and interpret credit-related information about borrowers
  • Interpret, analyze, and assess all forms of complex information
  • Embark on risk assessment analysis
  • Maintain the company's credit exposure within a certain risk level, with set limits in mind
  • Build strategies to minimize risk and increase approval rates
  • Design champion and challenger tests, implement them, and read the test results
  • Build line assignment strategies
Skills required:
- Credit risk modeling
- Statistical data understanding and interpretation
- Basic regression and advanced machine learning models
- Conversant with coding in Python, using libraries like scikit-learn
- Building and understanding decision trees
Posted by Karthik Padmanabhan
Remote only
2 - 15 yrs
₹1L - ₹20L / yr
ETL
SQL
Informatica PowerCenter

If you are an outstanding ETL Developer with a passion for technology and looking forward to being part of a great development organization, we would love to hear from you. We are offering technology consultancy services to our Fortune 500 customers with a primary focus on digital technologies. Our customers are looking for top-tier talents in the industry and willing to compensate based on your skill and expertise. The nature of our engagement is Contract in most cases. If you are looking for the next big step in your career, we are glad to partner with you. 

 

Below is the job description for your review.

Extensive hands-on experience in designing and developing ETL packages using SSIS

Extensive experience in performance tuning of SSIS packages

In-depth knowledge of data warehousing concepts and ETL systems, and relational databases like SQL Server 2012/2014.
