Gipfel & Schnell Consultings Pvt Ltd

Senior Data Engineer

Posted by TanmayaKumar Pattanaik
3 - 9 yrs
Best in industry
Bengaluru (Bangalore)
Skills
Data engineering
ADF
Data Factory
SQL Azure
Databricks
SQL
Relational Database (RDBMS)
Databases
CI/CD

Data Engineer

 

Brief Posting Description:

This person will work independently or with a team of data engineers on cloud technology products, projects, and initiatives. They will work with all customers, both internal and external, to ensure that all data-related features are implemented in each solution, and will collaborate with business partners and other technical teams across the organization as required to deliver proposed solutions.

 

Detailed Description:

• Works with Scrum masters, product owners, and others to identify new features for digital products.

• Works with IT leadership and business partners to design features for the cloud data platform.

• Troubleshoots production issues of all levels and severities, and tracks progress from identification through resolution.

• Maintains a culture of open communication, collaboration, mutual respect, and productive behaviors; participates in the hiring, training, and retention of top-tier talent and mentors team members toward new and fulfilling career experiences.

• Identifies risks, barriers, efficiencies, and opportunities when thinking through the development approach; presents possible platform-wide architectural solutions based on facts, data, and best practices.

• Explores all technical options when considering a solution, including homegrown coding, third-party sub-systems, enterprise platforms, and existing technology components.

• Actively participates in collaborative efforts through all phases of the software development life cycle (SDLC), including requirements analysis, technical design, coding, testing, release, and customer technical support.

• Develops technical documentation, such as system context diagrams, design documents, release procedures, and other pertinent artifacts.

• Understands the lifecycle of the various technology sub-systems that comprise the enterprise data platform (i.e., version, release, roadmap), including current capabilities, compatibilities, limitations, and dependencies; understands and advises on optimal upgrade paths.

• Establishes relationships with key IT, QA, and other corporate partners, and regularly communicates and collaborates accordingly while working on cross-functional projects or production issues.


Job Requirements:

 

EXPERIENCE:

2 years required; 3-5 years preferred, of experience in a data engineering role.

2 years required; 3-5 years preferred, of experience with Azure data services (Data Factory, Databricks, ADLS, Synapse, SQL DB, etc.).

 

EDUCATION:

Bachelor’s degree in information technology, computer science, or a data-related field preferred

 

SKILLS/REQUIREMENTS:

Expertise working with databases and SQL.

Strong working knowledge of Azure Data Factory and Databricks (see the illustrative sketch after this list)

Strong working knowledge of code management and continuous integration systems (Azure DevOps or GitHub preferred)

Strong working knowledge of cloud relational databases (Azure Synapse and Azure SQL preferred)

Familiarity with Agile delivery methodologies

Familiarity with NoSQL databases (such as CosmosDB) preferred.

Any experience with Python, DAX, Azure Logic Apps, Azure Functions, IoT technologies, Power BI, Power Apps, SSIS, Informatica, Teradata, Oracle DB, and Snowflake preferred but not required.

Ability to multi-task and reprioritize in a dynamic environment.

Outstanding written and verbal communication skills
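For illustration only, the following is a minimal PySpark sketch of the kind of Databricks work described above: reading raw files landed in ADLS Gen2 and writing a curated Delta table. The storage account, container, paths, key column, and table name are hypothetical placeholders, and the snippet assumes it runs in a Databricks notebook, where `spark` is provided by the runtime.

```python
# Minimal sketch (hypothetical names throughout): curate raw CSV from ADLS Gen2
# into a Delta table. `spark` is the SparkSession a Databricks notebook provides.
raw_path = "abfss://landing@examplestorage.dfs.core.windows.net/sales/"

df = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv(raw_path)
)

# Basic cleanup: drop exact duplicates and rows missing the business key.
cleaned = df.dropDuplicates().na.drop(subset=["order_id"])  # hypothetical key column

(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("curated.sales_orders")  # hypothetical schema.table
)
```

In practice, a job like this would typically be invoked from an Azure Data Factory pipeline (for example, via a Databricks notebook activity) and promoted through environments with Azure DevOps or GitHub, matching the CI/CD requirement above.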

 

Working Environment:

General Office – Work is generally performed within an office environment, with standard office equipment. Lighting and temperature are adequate and there are no hazardous or unpleasant conditions caused by noise, dust, etc. 

 

Physical Requirements:

Work is generally sedentary in nature but may require standing and walking for up to 10% of the time. 

 

Mental Requirements:

Employee required to organize and coordinate schedules.

Employee required to analyze and interpret complex data.

Employee required to problem-solve. 

Employee required to communicate with the public.


About Gipfel & Schnell Consultings Pvt Ltd

Founded: 2009
Size: 20-100
Stage: Profitable
About

Gipfel & Schnell was born from a seed of thought: to provide exceptional services to the ever-expanding conglomerate of businesses across the knowledge space. We have, over time, amassed a wide array of success stories based on our ability to function as a value-driven extension of all of our Partners. This has been forged in the areas of Talent Acquisition, HR Practice, BPR, and IT Consulting.


G&S, with its rich and diverse team boasting extensive experience across industries, functions, and geographic locations, strongly believes in all-round solutions that give its Partners a competitive edge in their respective areas.

Connect with the team:
Suma Latha
Mudassar Ahmed
Aravind Kumar
Sharath Simha
Chandragouda Patil

Similar jobs

Adastra India
Remote only
5 - 7 yrs
₹20L - ₹30L / yr
Google Cloud Platform (GCP)
Python
SQL
Bigquery
Data-flow analysis

Job Description

As a Senior Python Data Engineer on GCP, you will be responsible for:

  • Technical requirements gathering and development of functional specifications
  • Analysis of development alternatives for applications, information systems, and modules
  • Code development in the area of data management – cloud, data integration, analytics & reporting
  • Support for junior developers; team leadership/mentorship
  • Support for the presales team – technical whitepapers, solution reviews
  • Experience in cloud-based platforms, specifically GCP
  • Strong programming skills in Python and SQL
  • Career-path ambition – motivation for a management position in a foreign company
  • Strong communication, presentation, and networking skills
  • Work diligence & initiative – a "deliver no matter what" attitude
  • Experience working with GCP resources such as BigQuery, Cloud Functions, Dataflow, and Cloud Composer
  • Experience building data pipelines with Airflow or other orchestration tools preferred
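As a rough, hypothetical illustration of the last two bullets, here is a minimal Airflow 2.x DAG that runs a daily BigQuery aggregation, as it might on Cloud Composer. The project, dataset, and table names are invented, and the snippet assumes the `apache-airflow-providers-google` package and a configured GCP connection.

```python
# Hypothetical daily BigQuery aggregation DAG (Airflow 2.x, Google provider).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

AGG_SQL = """
CREATE OR REPLACE TABLE analytics.daily_orders AS  -- hypothetical dataset.table
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM raw.orders                                    -- hypothetical source table
GROUP BY order_date
"""

with DAG(
    dag_id="daily_orders_agg",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={"query": {"query": AGG_SQL, "useLegacySql": False}},
    )
```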


Amazech Systems pvt Ltd
Remote only
5 - 7 yrs
₹8L - ₹13L / yr
ADF
Apache Synapse
SSIS
SQL
ETL

Hiring for Azure Data Engineers.

Location: Bangalore

Employment type: Full-time, permanent

Website: www.amazech.com

 

Qualifications: 

B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, or Electrical/Electronics Engineering, with a good academic background.


Experience and Required Skill Sets:


•       Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob, and Azure Storage Explorer

•       Experience with data warehouse/analytical systems using Azure Synapse

•       Proficiency in creating Azure Data Factory pipelines for ETL processing: copy activity, custom Azure development, Synapse, etc.

•       Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, and Purview

•       Good technical knowledge of the Microsoft SQL Server BI suite (ETL, reporting, analytics, dashboards) using SSIS, SSAS, SSRS, and Power BI

•       Design and development of batch and real-time streaming data loads to data warehouse systems
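To make the last bullet concrete, here is a minimal, hypothetical Spark Structured Streaming sketch that appends landing-zone JSON events to a Delta table. The paths and schema are placeholders, and `spark` is assumed to be provided by a Databricks (or similar) runtime.

```python
# Hypothetical streaming load: landing-zone JSON events -> Delta table.
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .schema(event_schema)  # streaming file sources require an explicit schema
    .json("abfss://landing@examplestorage.dfs.core.windows.net/events/")
)

(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/checkpoints/events")  # restart/exactly-once bookkeeping
    .outputMode("append")
    .start("/delta/events")
)
```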

 

 Other Requirements:


A Bachelor's or Master's degree (Engineering or computer-related degree preferred)

Strong understanding of Software Development Life Cycles including Agile/Scrum


Responsibilities: 

•       Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.

•       Responsibility for the bottom line; strong project management abilities and the ability to keep the team on its timelines.

Institutional-grade tools to understand digital assets
Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore)
3 - 8 yrs
₹20L - ₹35L / yr
SQL
Python
Metrics management
Data Analytics

Responsibilities

  • Work with large and complex blockchain data sets and derive investment-relevant metrics in close partnership with financial analysts and blockchain engineers.
  • Apply knowledge of statistics, programming, data modeling, simulation, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to the development of the fundamental metrics needed to evaluate various crypto assets.
  • Build a strong understanding of existing metrics used to value various decentralized applications and protocols.
  • Build customer-facing metrics and dashboards.
  • Work closely with analysts, engineers, and product managers, and provide feedback as we develop our data analytics and research platform.

Qualifications

  • Bachelor's degree in Mathematics, Statistics, or a relevant technical field, or equivalent practical experience; or a degree in an analytical field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research, Management Science)
  • 3+ years of experience with data analysis and metrics development
  • 3+ years of experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results to stakeholders
  • 2+ years of experience writing SQL queries
  • 2+ years of experience scripting in Python
  • Demonstrated curiosity about and excitement for Web3/blockchain technologies
Top 3 Fintech Startup
Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
4 - 7 yrs
₹11L - ₹17L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Python
Responsible for leading a team of analysts to build and deploy predictive models that infuse core business functions with deep analytical insights. The Senior Data Scientist will also work closely with the Kinara management team to investigate strategically important business questions.

Lead a team through the entire analytical and machine learning model life cycle:

• Define the problem statement
• Build and clean datasets
• Exploratory data analysis
• Feature engineering
• Apply ML algorithms and assess the performance
• Code for deployment
• Code testing and troubleshooting
• Communicate analysis to stakeholders
• Manage data analysts and data scientists
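As a toy illustration of the "apply ML algorithms and assess the performance" steps in this lifecycle, here is a minimal scikit-learn sketch on synthetic data; the model and metric are arbitrary examples, not the team's actual stack.

```python
# Toy model-lifecycle sketch: synthetic data -> train/test split -> fit -> assess.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]  # probability of the positive class
print(f"ROC-AUC: {roc_auc_score(y_test, scores):.3f}")
```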
Digi Upaay Solutions Pvt Ltd
Sridhar Chakkravarthy
Posted by Sridhar Chakkravarthy
Remote only
8 - 11 yrs
₹11L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
PL/SQL

Required Skill Set:

• Project experience in any of the following: Data Management, Database Development, Data Migration, or Data Warehousing.

• Expertise in SQL and PL/SQL.


Role and Responsibilities:

• Work on a complex data management program for a multi-billion-dollar customer

• Work on customer projects related to data migration and data integration

• No troubleshooting

• Execute data pipelines, perform QA, and prepare project documentation for project deliverables

• Perform data profiling, data cleansing, and data analysis for migration data (a small profiling sketch follows this list)

• Participate in and contribute to project meetings

• Experience in data manipulation using Python preferred

• Proficient in using Excel and PowerPoint

• Perform other tasks as per project requirements
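For illustration, here is a small, hypothetical pandas sketch of the kind of pre-migration data profiling mentioned above; the file name and columns are placeholders.

```python
# Hypothetical pre-migration profiling: dtypes, null rates, cardinality, duplicates.
import pandas as pd

df = pd.read_csv("legacy_extract.csv")  # placeholder for a real source extract

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct_values": df.nunique(),
})
print(profile)
print("Exact duplicate rows:", df.duplicated().sum())
```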

Ganit Business Solutions
Vijitha VS
Posted by Vijitha VS
Remote only
2 - 5 yrs
₹10L - ₹30L / yr
SQL Azure
DevOps
Python
Spark
PySpark

Technologies & Languages

  • Azure
    • Databricks
    • SQL Server
    • ADF
  • Snowflake
    • Data Cleaning
    • ETL
  • Azure DevOps
  • Intermediate Python/PySpark
  • Intermediate SQL
  • Beginner's knowledge of, or willingness to learn, Spotfire
  • Data Ingestion
  • Familiarity with CI/CD or Agile

Must have:

  • Azure – VM, Data Lake, Data Bricks, Data Factory, Azure DevOps
  • Python/Spark (PySpark) 
  • SQL

Good to have:

  • Docker
  • Kubernetes
  • Scala

He/she should have a good understanding of:

  • How to build pipelines – ETL and ingestion (a short sketch follows this list)
  • Data warehousing
  • Monitoring
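A rough sketch of the "ETL and ingestion" point, using the must-have PySpark/Delta stack: an incremental upsert of source rows changed since a watermark. The paths, key column, and watermark handling are hypothetical, and `spark` is assumed to come from the Databricks runtime.

```python
# Hypothetical incremental ingestion: upsert rows changed since the last watermark.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

last_watermark = "2023-06-01 00:00:00"  # would normally be read from pipeline state

changes = (
    spark.read.format("delta").load("/delta/source_orders")   # placeholder source
    .where(F.col("modified_at") > F.lit(last_watermark))
)

target = DeltaTable.forPath(spark, "/delta/curated_orders")   # placeholder target
(
    target.alias("t")
    .merge(changes.alias("s"), "t.order_id = s.order_id")     # hypothetical key
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```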

Responsibilities:

Must be able to write quality code and build secure, highly available systems.

Assemble large, complex data sets that meet functional / non-functional business requirements.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., with guidance.

Create data tools for the analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.

Monitor performance and advise on any necessary infrastructure changes.

Define data retention policies.

Implement the ETL process and an optimal data pipeline architecture.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

Create design documents that describe the functionality, capacity, architecture, and process.

Develop, test, and implement data solutions based on finalized design documents.

Work with data and analytics experts to strive for greater functionality in our data systems.

Proactively identify potential production issues and recommend and implement solutions

Blue Sky Analytics
Balahun Khonglanoh
Posted by Balahun Khonglanoh
Remote only
1 - 5 yrs
Best in industry
NumPy
SciPy
Data Science
Python
pandas

About the Company

Blue Sky Analytics is a Climate Tech startup that combines the power of AI & satellite data to aid in the creation of a global environmental data stack. Our funders include Beenext and Rainmatter. Over the next 12 months, we aim to expand to 10 environmental datasets spanning water, land, heat, and more!


We are looking for a data scientist to join our growing team. This position will require you to think and act on the geospatial architecture and data needs (specifically geospatial data) of the company. The position is strategic and will also require you to collaborate closely with data engineers, data scientists, software developers, and even colleagues from other business functions. Come save the planet with us!


Your Role

Manage: It goes without saying that you will be handling large amounts of image and location datasets. You will develop dataframes and automated pipelines of data from multiple sources. You are expected to know how to visualize them and use machine learning algorithms to be able to make predictions. You will be working across teams to get the job done.

Analyze: You will curate and analyze vast amounts of geospatial datasets like satellite imagery, elevation data, meteorological datasets, OpenStreetMap data, demographic data, socio-econometric data, and topography to extract useful insights about the events happening on our planet.

Develop: You will be required to develop processes and tools to monitor and analyze data and its accuracy. You will develop innovative algorithms which will be useful in tracking global environmental problems like depleting water levels, illegal tree logging, and even tracking of oil-spills.

Demonstrate: Familiarity with geospatial libraries such as GDAL/Rasterio for reading and writing data, and with QGIS for making visualizations. This also extends to using advanced statistical techniques and applying concepts like regression and properties of distributions, and conducting other statistical tests.

Produce: With all the hard work being put into data creation and management, it has to be used! You will be able to produce maps showing (but not limited to) spatial distribution of various kinds of data, including emission statistics and pollution hotspots. In addition, you will produce reports that contain maps, visualizations and other resources developed over the course of managing these datasets.

Requirements

These are must have skill-sets that we are looking for:

  • Excellent coding skills in Python (including deep familiarity with NumPy, SciPy, pandas).
  • Significant experience with git, GitHub, SQL, AWS (S3 and EC2).
  • Worked on GIS and familiar with geospatial libraries such as GDAL and rasterio for reading/writing data, GIS software such as QGIS for visualisation and querying, and basic machine learning algorithms for making predictions (see the sketch after this list).
  • Demonstrable experience implementing efficient neural network models and deploying them in a production environment.
  • Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
  • Capable of writing clear and lucid reports and demystifying data for the rest of us.
  • Be curious and care about the planet!
  • Minimum 2 years of demonstrable industry experience working with large and noisy datasets.
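To make the GDAL/rasterio requirement concrete, here is a minimal, hypothetical sketch that computes NDVI from two single-band GeoTIFFs and writes the result; the file names are placeholders, and the bands are assumed to be co-registered.

```python
# Hypothetical NDVI computation from co-registered red and NIR GeoTIFF bands.
import numpy as np
import rasterio

with rasterio.open("scene_red.tif") as red_src, rasterio.open("scene_nir.tif") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")
    profile = red_src.profile  # reuse CRS, transform, and dimensions for the output

# Guard against division by zero where both bands are empty.
ndvi = np.where((nir + red) == 0, 0.0, (nir - red) / (nir + red))

profile.update(dtype="float32", count=1)
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```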

Benefits

  • Work from anywhere: Work by the beach or from the mountains.
  • Open source at heart: We are building a community whose tools you can use, contribute to, and collaborate on.
  • Own a slice of the pie: Possibility of becoming an owner by investing in ESOPs.
  • Flexible timings: Fit your work around your lifestyle.
  • Comprehensive health cover: Health cover for you and your dependents to keep you tension free.
  • Work Machine of choice: Buy a device and own it after completing a year at BSA.
  • Quarterly Retreats: Yes, there's work, but then there's all the non-work + fun aspect, aka the retreat!
  • Yearly vacations: Take time off to rest and get ready for the next big assignment by availing paid leave.
A Product Company
Agency job
via wrackle by Lokesh M
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹26L / yr
Looker
Big Data
Hadoop
Spark
Apache Hive
Job Title: Senior Data Engineer/Analyst
Location: Bengaluru
Department: Engineering

Bidgely is looking for an extraordinary and dynamic Senior Data Analyst to be part of its core team in Bangalore. You must have delivered exceptionally high-quality, robust products dealing with large data. Be part of a highly energetic and innovative team that believes nothing is impossible with some creativity and hard work.

Responsibilities
● Design and implement a high-volume data analytics pipeline in Looker for Bidgely's flagship product.
● Implement data pipelines in the Bidgely Data Lake.
● Collaborate with product management and engineering teams to elicit and understand their requirements and challenges, and develop potential solutions.
● Stay current with the latest tools, technology ideas, and methodologies; share knowledge by clearly articulating results and ideas to key decision makers.

Requirements
● 3-5 years of strong experience in data analytics and in developing data pipelines.
● Very good expertise in Looker.
● Strong in data modeling, developing SQL queries, and optimizing queries.
● Good knowledge of data warehouses (Amazon Redshift, BigQuery, Snowflake, Hive).
● Good understanding of big data applications (Hadoop, Spark, Hive, Airflow, S3, Cloudera).
● Attention to detail; strong communication and collaboration skills.
● BS/MS in Computer Science or equivalent from premier institutes.
They provide both wholesale and retail funding.
Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
ETL
Talend
OLAP
Data governance
SQL
  • The key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing, and loading into the Data Warehouse at real-time/near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
  • Provide technical expertise to design efficient data ingestion solutions to consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Development of complex data transformations using Talend (Big Data edition), Python/Java transformations in Talend, SQL/Python/Java UDXs, AWS S3, etc., to load into the OLAP Data Warehouse in structured/semi-structured form.
  • Development of data models and creation of transformation logic to populate models for faster data consumption with simple SQL.
  • Implementing automated audit & quality-assurance checks in the data pipeline (a small sketch follows this list).
  • Documenting and maintaining data lineage to enable data governance.
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
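As a simple illustration of the automated audit & quality-assurance point above, here is a hypothetical reconciliation check in pandas comparing row counts and a column total between source and target extracts; the DataFrames and column name are placeholders for real tables.

```python
# Hypothetical load audit: reconcile row count and a column total, source vs. target.
import pandas as pd

def audit_load(source: pd.DataFrame, target: pd.DataFrame, amount_col: str = "amount") -> None:
    checks = {
        "row_count": (len(source), len(target)),
        "amount_sum": (round(source[amount_col].sum(), 2), round(target[amount_col].sum(), 2)),
    }
    failures = {name: pair for name, pair in checks.items() if pair[0] != pair[1]}
    if failures:
        raise ValueError(f"Audit failed (source vs. target): {failures}")
    print("Audit passed:", checks)

# Toy usage:
src = pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.5]})
tgt = pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.5]})
audit_load(src, tgt)
```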

Requirements

  • Programming experience using Python / Java, to create functions / UDX
  • Extensive technical experience with SQL on RDBMS (Oracle/MySQL/Postgresql etc) including code optimization techniques
  • Strong ETL/ELT skillset using Talend BigData Edition. Experience in Talend CDC & MDM functionality will be an advantage.
  • Experience & expertise in implementing complex data pipelines, including semi-structured & unstructured data processing
  • Expertise to design efficient data ingestion solutions to consolidate data from RDBMSs, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Good understanding & working experience in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) and cloud-native Data Lake (S3, ADLS, BigQuery, etc) solutions
  • Familiarity with AWS tool stack for Storage & Processing. Able to recommend the right tools/solutions available to address a technical problem
  • Good knowledge of database performance, troubleshooting, and query optimization and tuning
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Good knowledge of Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model
  • Know-how of any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Data Governance & Quality Assurance
  • Distributed computing
  • Linux
  • Data structures and algorithm
  • Unstructured Data Processing
Techknomatic Services Pvt. Ltd.
Posted by Techknomatic Services
Pune, Mumbai
2 - 6 yrs
₹4L - ₹9L / yr
Tableau
SQL
Business Intelligence (BI)
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.

Key functions & responsibilities:
• Communication and interaction with the Project Manager to understand requirements
• Dashboard design, development, and deployment using the Tableau ecosystem
• Ensure delivery within the given time frame while maintaining quality
• Stay up to date with current tech and bring relevant ideas to the table
• Proactively work with the management team to identify and resolve issues
• Perform other related duties as assigned or advised
• Be a leader who sets the standard and expectations through example in conduct, work ethic, integrity, and character
• Contribute to dashboard design, R&D, and project delivery using Tableau

Candidate's Profile

Academics:
• Bachelor's degree, preferably in Computer Science.
• A Master's degree would be an added advantage.

Experience:
• Overall 2-5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
• At least 2 years of experience covering the Tableau implementation lifecycle, including hands-on development/programming, managing security, data modelling, data blending, etc.

Technology & Skills:
• Hands-on expertise in Tableau administration and maintenance
• Strong working knowledge of, and development experience with, Tableau Server and Desktop
• Strong knowledge of SQL, PL/SQL, and data modelling
• Knowledge of databases like Microsoft SQL Server, Oracle, etc.
• Exposure to alternative visualization technologies like QlikView, Spotfire, Pentaho, etc.
• Good communication and analytical skills, with excellent creative and conceptual thinking abilities
• Superior organizational skills, attention to detail/level of quality, and strong communication skills, both verbal and written