ETL management Jobs in Bangalore (Bengaluru)

11+ ETL management Jobs in Bangalore (Bengaluru) | ETL management Job openings in Bangalore (Bengaluru)

Apply to 11+ ETL management Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest ETL management Job opportunities across top companies like Google, Amazon & Adobe.

EnterpriseMinds

Posted by Rani Galipalli
Bengaluru (Bangalore), Pune, Mumbai
6 - 8 yrs
₹25L - ₹28L / yr
ETL
Informatica
Data Warehouse (DWH)
ETL management
SQL

Your key responsibilities

 

  • Create and maintain optimal data pipeline architecture. Should have experience building batch and real-time ETL data pipelines. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Responsible for solution design, integration, data sourcing, transformation, database design, and implementation of complex data warehousing solutions.
  • Responsible for the development, support, maintenance, and implementation of a complex project module.
  • Provide subject-matter expertise and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint.
  • Use advanced knowledge of system flow to develop standards for coding, testing, debugging, and implementation.
  • Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards.
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support, and deliver complete reporting solutions.
  • Prepare the high-level design (HLD) describing the application architecture.
  • Prepare the low-level design (LLD) covering job design, job descriptions, and detailed job-level information.
  • Prepare and execute unit test cases.
  • Provide technical guidance and mentoring to application development teams throughout all phases of the software development life cycle.

Skills and attributes for success

 

  • Strong experience in SQL. Proficient in writing performant SQL against large data volumes, and in writing and debugging complex SQL.
  • Strong experience with Microsoft Azure database systems; experienced with Azure Data Factory.
  • Strong in data warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
  • Should have solid experience with PowerShell scripting.
  • Able to guide the team through the development, testing, and implementation stages and review the completed work effectively.
  • Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation.
  • Primary owner of delivery and timelines; review code written by other engineers.
  • Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, and self-sustaining code with repeatable quality and predictability.
  • Must have an understanding of business intelligence development in the IT industry.
  • Outstanding written and verbal communication skills.
  • Should be adept in the SDLC process: requirement analysis, time estimation, design, development, testing, and maintenance.
  • Hands-on experience installing, configuring, operating, and monitoring CI/CD pipeline tools.
  • Should be able to orchestrate and automate pipelines.
  • Good to have: knowledge of distributed systems such as Hadoop, Hive, and Spark.
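The first skill above stresses writing performant SQL against large data volumes. As a rough, self-contained illustration (using Python's built-in sqlite3 as a stand-in, not any employer's actual warehouse), here is how an index turns a full-table scan into an index search:

```python
import sqlite3

# In-memory database as a stand-in for a large warehouse table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("APAC" if i % 2 else "EMEA", i * 1.5) for i in range(10_000)],
)

# Without an index, filtering on region scans the whole table.
query = "SELECT SUM(amount) FROM sales WHERE region = 'APAC'"
plan = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan)  # typically reports a full SCAN of sales

# Adding an index lets the optimizer seek instead of scan.
con.execute("CREATE INDEX idx_sales_region ON sales (region)")
plan = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan)  # now reports a SEARCH using idx_sales_region

total = con.execute(query).fetchone()[0]
print(total)
```

The same habit of checking the query plan before and after an index change carries over to warehouse engines, even though the plan output format differs.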

 

To qualify for the role, you must have

 

  • Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
  • More than 6 years of experience in ETL development projects
  • Proven experience in delivering effective technical ETL strategies
  • Microsoft Azure project experience
  • Technologies: ETL (ADF), SQL, Azure components (must-have), Python (nice to have)

 

Ideally, you’ll also have

globe teleservices
Posted by deepshikha thapar
Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹15L / yr
skill iconPython
SQL

RESPONSIBILITIES:

  • Requirement understanding and elicitation; analyze data/workflows; contribute to product, project, and proof-of-concept (POC) work.
  • Contribute to preparing design documents and effort estimations.
  • Develop AI/ML models using best-in-class ML techniques.
  • Build, test, and deploy AI/ML solutions.
  • Work with Business Analysts and Product Managers to assist with defining functional user stories.
  • Ensure deliverables across teams are of high quality and clearly documented.
  • Recommend ML best practices and industry standards for any ML use case.
  • Proactively take up R&D and recommend solution options for any ML use case.

REQUIREMENTS:

Required Skills

  • Overall experience of 4 to 7 years working on AI/ML framework development.
  • Good programming knowledge of Python is a must.
  • Good knowledge of R and SAS is desired.
  • Good hands-on working knowledge of SQL, data modeling, and CRISP-DM.
  • Proficiency with univariate/multivariate statistics, algorithm design, and predictive AI/ML modelling.
  • Strong knowledge of machine learning algorithms: linear regression, logistic regression, KNN, Random Forest, Support Vector Machines, and Natural Language Processing.
  • Experience with NLP and deep neural networks using synthetic and artificial data.
  • Involved in different phases of the SDLC, with good working exposure to different SDLC approaches such as Agile methodologies.
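Among the algorithms this listing expects, KNN is the simplest to sketch from scratch. The following is a minimal, illustrative implementation on toy data (not tied to any employer's codebase), classifying a query point by majority vote among its k nearest neighbours:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points."""
    # train: list of ((x, y), label) pairs; sort by Euclidean distance to query.
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two well-separated clusters.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((5.0, 5.0), "B"), ((5.1, 4.9), "B"), ((4.9, 5.2), "B")]

print(knn_predict(train, (0.3, 0.3)))  # → A
print(knn_predict(train, (5.0, 4.8)))  # → B
```

In practice a library implementation (e.g. scikit-learn's KNeighborsClassifier) would be used, but interviews for roles like this often ask for exactly this kind of from-scratch sketch.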

Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
ETL
Informatica
Data Warehouse (DWH)
PowerBI
databricks

About The Company


The client is a 17-year-old multinational company headquartered in Whitefield, Bangalore, with another delivery center in Hinjewadi, Pune. It also has offices in the US and Germany, works with several OEMs and product companies in about 12 countries, and has a 200+ strong team worldwide.


The Role


Power BI front-end developer in the Data Domain (Manufacturing, Sales & Marketing, Purchasing, Logistics, ...), responsible for the Power BI front-end design, development, and delivery of highly visible data-driven applications in the Compressor Technique business. You always take a quality-first approach, ensuring the data is visualized in a clear, accurate, and user-friendly manner. You ensure standards and best practices are followed and that documentation is created and maintained. Where needed, you take the initiative and make recommendations to drive improvements. In this role you will also be involved in the tracking, monitoring, and performance analysis of production issues and the implementation of bug fixes and enhancements.


Skills & Experience


• The ideal candidate has a degree in Computer Science or Information Technology, or equivalent experience.

• Strong knowledge of BI development principles, time intelligence, functions, dimensional modeling, and data visualization is required.

• Advanced knowledge of and 5-10 years' experience with professional BI development and data visualization is preferred.

• You are familiar with data warehouse concepts.

• Knowledge of MS Azure (Data Lake, Databricks, SQL) is considered a plus.

• Experience and knowledge of scripting languages such as PowerShell and Python to set up and automate Power BI platform-related activities is an asset.

• Good knowledge (oral and written) of English is required.

Bengaluru (Bangalore)
4 - 6 yrs
₹12L - ₹15L / yr
SQL
MySQL
MySQL DBA
MariaDB
MS SQLServer
Role Description
As a Database Administrator, you will be responsible for designing, testing, planning, implementing, protecting, operating, managing, and maintaining our company's databases. The goal is to provide a seamless flow of information throughout the company, considering both backend data structure and frontend accessibility for end-users. You get to work with some of the best minds in the industry at a place where opportunity lurks everywhere and in everything.
Responsibilities
Your responsibilities are as follows.
• Build database systems of high availability and quality appropriate to each end user's specialised role
• Design and implement databases in accordance with end users' information needs and views
• Define users and enable data distribution to the right user, in the appropriate format and in a timely manner
• Use high-speed transaction recovery techniques and back up data
• Minimise database downtime and manage parameters to provide fast query responses
• Provide proactive and reactive data management support and training to users
• Determine, enforce, and document database policies, procedures, and standards
• Perform tests and evaluations regularly to ensure data security, privacy, and integrity
• Monitor database performance, implement changes, and apply new patches and versions when required
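The backup-and-recovery responsibilities above can be made concrete with a small, self-contained sketch. This uses Python's built-in sqlite3 online-backup API purely as an illustration; production DBA work would use the vendor's native tooling (mysqldump, SQL Server backup jobs, etc.):

```python
import sqlite3

# Source database with some data worth protecting.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO users (name) VALUES (?)", [("alice",), ("bob",)])
src.commit()

# Online backup: copies pages while the source stays readable,
# the same principle behind hot backups in larger engines.
dst = sqlite3.connect(":memory:")
src.backup(dst)

# "Recovery" check: the copy is immediately queryable and complete.
names = [row[0] for row in dst.execute("SELECT name FROM users ORDER BY id")]
print(names)  # ['alice', 'bob']
```

Verifying that a backup restores to a queryable, complete state (as the last step does) is the part of the workflow most often skipped, and the part interviewers probe.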
Required Qualifications
We are looking for individuals who are curious, excited about learning, and comfortable navigating the uncertainties and complexities that come with a growing company. Some qualifications that we think would help you thrive in this role are:
• Minimum 4 years of experience as a Database Administrator
• Hands-on experience with database standards and end-user applications
• Excellent knowledge of data backup, recovery, security, integrity, and SQL
• Familiarity with database design, documentation, and coding
• Previous experience with DBA case tools (frontend/backend) and third-party tools
• Familiarity with programming language APIs
• Problem-solving skills and the ability to think algorithmically
• Bachelor's/Master's in CS/IT Engineering, BCA/MCA, or B.Sc/M.Sc in CS/IT

Preferred Qualifications
• Sense of ownership and pride in your performance and its impact on the company's success
• Critical thinking and problem-solving skills
• Team player
• Good time-management skills
• Great interpersonal and communication skills
Bengaluru (Bangalore), Gurugram
2 - 8 yrs
₹10L - ₹35L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Python
Greetings!!

We are looking for a Machine Learning engineer for one of our premium clients.
Experience: 2-9 years
Location: Gurgaon/Bangalore
Tech Stack:

Python, PySpark, and the Python scientific stack; MLflow, Grafana, and Prometheus for machine learning pipeline management and monitoring; SQL, Airflow, Databricks, our own open-source data pipelining framework called Kedro, and Dask/RAPIDS; Django, GraphQL, and ReactJS for horizontal product development; container technologies such as Docker and Kubernetes; CircleCI/Jenkins for CI/CD; and cloud solutions such as AWS, GCP, and Azure, as well as Terraform and CloudFormation for deployment.
Kaleidofin

Posted by Poornima B
Chennai, Bengaluru (Bangalore)
5 - 7 yrs
Best in industry
Business Intelligence (BI)
PowerBI
Python
SQL
R Language
We are looking for a leader to design, develop, and deliver strategic data-centric insights leveraging next-generation analytics and BI technologies. We want someone who is data-centric and insight-centric rather than report-centric: someone wishing to make an impact by enabling innovation and growth, with passion for what they do and a vision for the future.

Responsibilities:

  • Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
  • Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
  • Become an expert on data and trends, both internal and external to Kaleidofin.
  • Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
  • Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
  • Automate scheduling and distribution of reports and support auditing and value realization.
  • Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
  • Design robust data-centric solutions and architecture that incorporates technology and strong BI solutions to scale up and eliminate repetitive tasks

Requirements:

  • Experience leading development efforts through all phases of SDLC.
  • 5+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
  • Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
  • Hands on experience in SQL, data management, and scripting (preferably Python).
  • Strong data visualisation design skills, data modeling and inference skills.
  • Hands-on and experience in managing small teams.
  • Financial services experience preferred, but not mandatory.
  • Strong knowledge of architectural principles, tools, frameworks, and best practices.
  • Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
  • Team-handling experience preferred for candidates with 5+ years of experience.
  • Notice period less than 30 days.
Ushur Technologies Pvt Ltd

Posted by Priyanka N
Bengaluru (Bangalore)
6 - 12 yrs
Best in industry
MongoDB
Spark
Hadoop
Big Data
Data engineering
What You'll Do:
Our Infrastructure team is looking for an excellent Big Data Engineer to join a core group that designs the industry's leading Micro-Engagement Platform. This role involves the design and implementation of big data architectures and frameworks for the industry's leading intelligent workflow automation platform. As a specialist on the Ushur Engineering team, your responsibilities will be to:
● Use your in-depth understanding to architect and optimize databases and data ingestion pipelines
● Develop HA strategies, including replica sets and sharding, for highly available clusters
● Recommend and implement solutions to improve performance, resource consumption, and resiliency
● On an ongoing basis, identify bottlenecks in databases in development and production environments and propose solutions
● Help the DevOps team with your deep knowledge of database performance, scaling, tuning, migration, and version upgrades
● Provide verifiable technical solutions to support operations at scale and with high availability
● Recommend appropriate data processing toolsets and big data ecosystems to adopt
● Design and scale databases and pipelines across multiple physical locations on the cloud
● Conduct root-cause analysis of data issues
● Be self-driven; constantly research and suggest the latest technologies
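The sharding responsibility above rests on one core idea: a deterministic function maps a document's shard key to a shard, so related data stays together and load spreads evenly. A minimal hash-based routing sketch (illustrative only; real systems like MongoDB implement this, with chunk balancing, inside the cluster):

```python
import hashlib

def shard_for(key: str, num_shards: int = 4) -> int:
    """Route a document to a shard by hashing its shard key."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_shards

# Documents with the same shard key always land on the same shard,
# so lookups by key touch exactly one shard instead of all of them.
docs = ["user:1001", "user:1002", "user:1001", "order:77"]
placement = {d: shard_for(d) for d in docs}
print(placement)
```

Note the classic trade-off this sketch exposes: with a plain modulo, changing `num_shards` remaps almost every key, which is why production systems use chunk ranges or consistent hashing for rebalancing.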

The experience you need:
● Engineering degree in Computer Science or a related field
● 10+ years of experience working with databases, most of it with NoSQL technologies
● Expertise in implementing and maintaining distributed big data pipelines and ETL processes
● Solid experience in one of the following cloud-native data platforms: AWS Redshift, Google BigQuery, or Snowflake
● Exposure to real-time processing techniques such as Apache Kafka and CDC tools (Debezium, Qlik Replicate)
● Strong experience with the Linux operating system
● Solid knowledge of database concepts, MongoDB, SQL, and NoSQL internals
● Experience with backup and recovery for production and non-production environments
● Experience with security principles and their implementation
● Exceptionally passionate about always keeping the product quality bar at an extremely high level

Nice-to-haves
● Proficient in one or more of Python, Node.js, Java, or similar languages

Why you want to Work with Us:
● Great Company Culture. We pride ourselves on having a values-based culture that is welcoming, intentional, and respectful. Our internal NPS of over 65 speaks for itself - employees recommend Ushur as a great place to work!
● Bring your whole self to work. We are focused on building a diverse culture, with innovative ideas where you and your ideas are valued. We are a start-up and know that every person has a significant impact!
● Rest and Relaxation. 13 paid leaves, Wellness Friday offs (a day off to care for yourself, every last Friday of the month), 12 paid sick leaves, and more!
● Health Benefits. Preventive health checkups, medical insurance covering dependents, wellness sessions, and health talks at the office.
● Keep learning. One of our core values is Growth Mindset - we believe in lifelong learning. Certification courses are reimbursed, and Ushur Community offers wide resources for our employees to learn and grow.
● Flexible Work. In-office or hybrid working model, depending on position and location. We seek to create an environment for all our employees where they can thrive in both their professional and personal lives.
Marktine

Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹5L - ₹10L / yr
Data Warehouse (DWH)
Spark
Data engineering
Python
PySpark

Basic Qualifications

- Need to have a working knowledge of AWS Redshift.

- Minimum 1 year of designing and implementing a fully operational production-grade large-scale data solution on Snowflake Data Warehouse.

- 3 years of hands-on experience with building productized data ingestion and processing pipelines using Spark, Scala, Python

- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions

- Expertise and excellent understanding of Snowflake Internals and integration of Snowflake with other data processing and reporting technologies

- Excellent presentation and communication skills, both written and verbal

- Ability to problem-solve and architect in an environment with unclear requirements

BDI Plus Lab

Posted by Puja Kumari
Bengaluru (Bangalore)
1 - 3 yrs
₹3L - ₹6L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
Experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, and experience using the following software/tools:
● Experience with big data tools: Hadoop, Hive, Spark, Kafka, etc.
● Experience querying multiple SQL/NoSQL databases, including Oracle, MySQL, MongoDB, etc.
● Experience with Redis, RabbitMQ, and Elasticsearch is desirable.
● Strong experience with object-oriented/functional/scripting languages: Python (preferred), Core Java, JavaScript, Scala, shell scripting, etc.
● Must have skills in debugging complex code; experience with ML/AI algorithms is a plus.
● Experience with a version control tool (Git or similar) is mandatory.
● Experience with AWS cloud services: EC2, EMR, RDS, Redshift, S3
● Experience with stream-processing systems: Storm, Spark Streaming, etc.
Bengaluru (Bangalore)
1 - 8 yrs
₹5L - ₹40L / yr
Data engineering
Data Engineer
AWS Lambda
Microservices
ETL
Required Skills & Experience:
• 2+ years of experience in data engineering and a strong understanding of data engineering principles using big data technologies
• Excellent programming skills in Python are mandatory
• Expertise in relational databases (MSSQL/MySQL/Postgres) and in SQL; exposure to NoSQL databases such as Cassandra or MongoDB is a plus
• Exposure to deploying ETL pipelines using tools such as Airflow, Docker containers, and Lambda functions
• Experience with AWS cloud services such as AWS CLI, Glue, Kinesis, etc.
• Experience using Tableau for data visualization is a plus
• Ability to demonstrate a portfolio of projects (GitHub, papers, etc.) is a plus
• A motivated, can-do attitude and the desire to make a change is a must
• Excellent communication skills
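The ETL-pipeline requirement above boils down to chaining extract, transform, and load steps in dependency order. A minimal pure-Python sketch of that shape (illustrative only; orchestrators like Airflow add scheduling, retries, and monitoring around exactly these steps):

```python
# Each step is a plain function; run() executes them in dependency order,
# mirroring how an Airflow DAG would chain extract >> transform >> load.

def extract():
    # Stand-in for pulling raw rows from a source system.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "4.0"}]

def transform(rows):
    # Cast types and derive a field, leaving the raw input untouched.
    return [
        {**r, "amount": float(r["amount"]), "is_large": float(r["amount"]) > 5}
        for r in rows
    ]

def load(rows, sink):
    # Stand-in for writing to a warehouse table; returns rows loaded.
    sink.extend(rows)
    return len(rows)

def run(sink):
    return load(transform(extract()), sink)

warehouse = []
loaded = run(warehouse)
print(loaded, warehouse)
```

Keeping each step a pure function of its inputs, as here, is what makes pipelines easy to test, retry, and backfill once they are moved under an orchestrator.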
Read more
MNC
Bengaluru (Bangalore)
4 - 7 yrs
₹25L - ₹28L / yr
Data Science
Data Scientist
R Programming
Python
SQL
  • Banking domain.
  • Assist the team in building machine learning/AI/analytics models on an open-source stack using Python and the Azure cloud stack.
  • Be part of the internal data science team at Fragma Data, which provides data science consultation to large organizations such as banks, e-commerce companies, and social media companies on their scalable AI/ML needs on the cloud, and help build POCs and develop production-ready solutions.
  • Candidates will be provided with opportunities for on-the-job training and professional certifications in these areas: Azure Machine Learning services, Microsoft Customer Insights, Spark, chatbots, Databricks, NoSQL databases, etc.
  • Assist the team in conducting AI demos, talks, and workshops, occasionally to large audiences of senior stakeholders in the industry.
  • Work end-to-end on large enterprise-scale projects, involving domain-specific work across banking, finance, e-commerce, social media, etc.
  • Keen interest in learning new technologies and the latest developments, and applying them to assigned projects.
Desired Skills
  • Professional hands-on coding experience in Python: over 1 year for Data Scientist, and over 3 years for Sr. Data Scientist.
  • This is primarily a programming/development-oriented role, so strong programming skills in writing object-oriented and modular Python code and experience pushing projects to production are important.
  • Strong foundational knowledge of and professional experience in:
  • Machine Learning (compulsory)
  • Deep Learning (compulsory)
  • Strong knowledge of at least one of: Natural Language Processing, Computer Vision, Speech Processing, or Business Analytics
  • Understanding of database technologies and SQL (compulsory)
  • Knowledge of the following frameworks:
  • Scikit-learn (compulsory)
  • Keras/TensorFlow/PyTorch (at least one is compulsory)
  • API development in Python for ML models (good to have)
  • Excellent communication skills are necessary to succeed in this role, as it has high external visibility, with multiple opportunities to present data science results to large external audiences that will include external VPs, Directors, CXOs, etc. Communication skills will therefore be a key consideration in the selection process.