
ETL Jobs in Bangalore (Bengaluru)

Explore top ETL job opportunities in Bangalore (Bengaluru) at top companies and startups. All jobs are posted by verified employees, who can be contacted directly.

Data Engineer

Founded: 2011
Location: Bengaluru (Bangalore)
Experience: 5 - 9 years
Salary: ₹15,00,000 - ₹24,00,000 per year

This position is responsible for developing sustainable scripts to perform data extraction, manipulation, processing and visualization. We are looking for an individual with an analytical mindset and an eagerness to learn to join our analytics team.

You will be responsible for:
● Building data and intelligence collateral for Sales, Marketing and Customer Success teams.
● Working with the Business, Operations and Tech teams to understand and derive data.
● Conducting exploratory data analysis to identify patterns in data that eventually serve as an input to product.
● Identifying data trends, presenting the findings and making recommendations.
● Building ad hoc reports and/or other requested information, and meeting deadlines for customer delivery.
● Owning the quality and accuracy of data reports.
● Building automation and tooling to proactively surface data issues.

Experience and skills we are looking for:
● 5-8 years of experience in a data engineer role.
● Good working knowledge of MySQL/PostgreSQL/Redshift and understanding of data modelling and relational databases.
● Strong hands-on experience working in Python/R for data analysis and writing highly performant and advanced SQL queries.
● Demonstrated experience in timely creation of both standard and ad hoc reporting, data analysis and presentation of findings.
● Proficiency with Word, Excel and PowerPoint, with advanced skills in Excel (Pivot Tables, Vlookups, and Macros).
● Experience with Mode Analytics is good to have.
● Experience with dashboard and reporting tools.

What will make you successful in this role:
● Strong communication and collaboration skills, with the ability to work with technology and business partners.
● Desire to learn emerging technologies.
● Identifying a customer's needs before they ask.
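As a rough illustration of the "automation and tooling to proactively surface data issues" this listing asks for, here is a minimal Python sketch against SQLite. The `leads` table, its columns, and the specific checks are hypothetical, not the employer's actual stack; a real job would run against MySQL/PostgreSQL/Redshift with a fuller rule set.

```python
import sqlite3

def surface_data_issues(conn, table, required_columns):
    """Return a list of data-quality issue descriptions for `table`.

    A minimal sketch of "automation to proactively surface data issues":
    it flags NULLs in required columns and fully duplicated rows.
    """
    issues = []
    for col in required_columns:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        if nulls:
            issues.append(f"{table}.{col}: {nulls} NULL value(s)")
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    distinct = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT DISTINCT * FROM {table})").fetchone()[0]
    if total != distinct:
        issues.append(f"{table}: {total - distinct} duplicate row(s)")
    return issues

# Demo with a toy leads table (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO leads VALUES (?, ?)",
                 [(1, "a@x.com"), (2, None), (1, "a@x.com")])
print(surface_data_issues(conn, "leads", ["id", "email"]))
# prints ['leads.email: 1 NULL value(s)', 'leads: 1 duplicate row(s)']
```

In practice a check like this would run on a schedule after each load and alert the team, rather than being invoked ad hoc.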

Job posted by Sana Kausar

Senior ETL Developer

Founded: 2018
via Nu-Pie
Location: Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary: ₹8,00,000 - ₹13,00,000 per year

Requirements:
- Minimum of 4 years' experience working on DW/ETL projects and expert hands-on working knowledge of ETL tools.
- Experience with data management and data warehouse development: star schemas, data vaults, RDBMS, and ODS.
- Change data capture, slowly changing dimensions, data governance, data quality, partitioning and tuning, data stewardship, survivorship, fuzzy matching, concurrency, and vertical and horizontal scaling.
- ELT/ETL; Spark, Hadoop, MPP, RDBMS.
- Experience with DevOps architecture, implementation and operation.
- Hands-on working knowledge of Unix/Linux and building complex SQL queries.
- Expert SQL and data analysis skills; ability to debug and fix data issues.
- Complex ETL program design and coding.
- Experience in shell scripting and batch scripting.
- Good communication (oral and written) and interpersonal skills.

Responsibilities:
- Work closely with business teams to understand their needs; participate in requirements gathering while creating artifacts and seeking business approval.
- Help the business define new requirements; participate in end-user meetings to derive and define business requirements, propose cost-effective solutions for data analytics, and familiarize the team with customer needs, specifications, design targets and techniques to support task performance and delivery.
- Propose good designs and solutions, adhering to best design and standard practices.
- Review and propose industry-best tools and technology for ever-changing business rules and data sets.
- Conduct proofs of concept (POCs) with new tools and technologies to derive convincing benchmarks.
- Prepare the plan, design and document the architecture (high-level topology design, functional design) and review it with customer IT managers; provide detailed knowledge to the development team to familiarize them with customer requirements, specifications, design standards and techniques.
- Review code developed by other programmers; mentor, guide and monitor their work, ensuring adherence to programming and documentation policies.
- Work with functional business analysts to ensure that application programs function as defined.
- Capture user feedback on the delivered systems and document it for the client's and project manager's review.
- Review all deliverables before final delivery to the client for quality adherence.

Technologies (select based on requirement):
- Databases: Oracle, Teradata, Postgres, SQL Server, Big Data, Snowflake, or Redshift
- Tools: Talend, Informatica, SSIS, Matillion, Glue, or Azure Data Factory; utilities for bulk loading and extracting
- Languages: SQL, PL/SQL, T-SQL, Python, Java, or Scala; J/ODBC; JSON
- Data virtualization and data services development; service delivery via REST/web services; data virtualization delivery with Denodo
- ELT/ETL; Azure cloud certification; complex SQL queries; data ingestion, data modeling (domain), consumption (RDBMS)
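One of the concepts this listing names, slowly changing dimensions, can be sketched as a minimal Type 2 SCD update in Python over SQLite. The `dim_customer` table, its columns, and the tracked attribute (`city`) are hypothetical; a real implementation would typically use the ETL tool's built-in SCD or MERGE components rather than hand-rolled SQL.

```python
import sqlite3

def scd2_upsert(conn, customer_id, city, load_date):
    """Type 2 slowly-changing-dimension update: if the tracked attribute
    changed, expire the current row and insert a new current version."""
    cur = conn.execute(
        "SELECT rowid, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change detected: keep the current version
    if row:
        # close out the old version instead of overwriting history
        conn.execute(
            "UPDATE dim_customer SET is_current = 0, valid_to = ? "
            "WHERE rowid = ?", (load_date, row[0]))
    conn.execute(
        "INSERT INTO dim_customer "
        "(customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (customer_id, city, load_date))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer "
             "(customer_id, city, valid_from, valid_to, is_current)")
scd2_upsert(conn, 42, "Pune", "2023-01-01")
scd2_upsert(conn, 42, "Bengaluru", "2023-06-01")  # attribute change -> new version
print(conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall())
# prints [('Pune', 0), ('Bengaluru', 1)]
```

The point of Type 2 handling is that history is preserved: the old `Pune` row stays queryable via its validity dates instead of being overwritten.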

Job posted by Jerrin Thomas

Data Engineer

Founded: 2011
Location: Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary: ₹15,00,000 - ₹22,00,000 per year

Responsibilities:
- Building data and intelligence collateral for Sales, Marketing and Customer Success teams.
- Working with the Business, Operations and Tech teams to understand and derive data.
- Conducting exploratory data analysis to identify patterns in data that eventually serve as an input to product.
- Identifying data trends, presenting the findings and making recommendations.
- Building ad hoc reports and/or other requested information, and meeting deadlines for customer delivery.
- Owning the quality and accuracy of data reports.
- Building automation and tooling to proactively surface data issues.

Requirements:
- 5-8 years of experience in a data engineer role.
- Good working knowledge of MySQL/PostgreSQL/Redshift and understanding of data modelling and relational databases.
- Strong hands-on experience working in Python/R for data analysis and writing highly performant and advanced SQL queries.
- Demonstrated experience in timely creation of both standard and ad hoc reporting, data analysis and presentation of findings.
- Proficiency with Word, Excel and PowerPoint, with advanced skills in Excel (Pivot Tables, Vlookups, and Macros).
- Experience with Mode Analytics is good to have.
- Experience with dashboard and reporting tools.
- Strong communication and collaboration skills, with the ability to work with technology and business partners.
- Desire to learn emerging technologies.
- Identifying a customer's needs before they ask.
- Excellent presentation skills.

Job posted by Hemant Surdeo

Data Engineer

Founded: 2016
via slice
Location: Bengaluru (Bangalore)
Experience: 3 - 6 years
Salary: ₹10,00,000 - ₹20,00,000 per year

About slice:
slice is a fintech startup focused on India's young population. We aim to build a smart, simple, and transparent platform to redesign the financial experience for millennials and bring success and happiness to people's lives. Growing with the new generation is what we dream about and all that we want. We believe that personalization, combined with an extreme focus on superior customer service, is the key to building long-lasting relationships with young people.

About the team/role:
In this role, you will have the opportunity to create a significant impact on our business and, most importantly, our customers through your technical expertise on data, as we take on challenges that can reshape the financial experience for the next generation. If you are a highly motivated team player with a knack for solving problems through technology, we have the perfect job for you.

What you'll do:
- Work closely with Engineering and Analytics teams to assist in schema design, database normalization, query optimization, etc.
- Work with AWS cloud services: S3, EMR, Glue, RDS.
- Create new and improve existing infrastructure for ETL workflows from a wide variety of data sources, using SQL, NoSQL and AWS big data technologies.
- Manage and monitor the performance, capacity and security of database systems, and regularly perform server tuning and maintenance activities.
- Debug and troubleshoot database errors.
- Identify, design and implement internal process improvements: optimizing data delivery, redesigning infrastructure for greater scalability, data archival.

Qualifications:
- 2+ years of experience working as a Data Engineer.
- Experience with a scripting language, preferably Python.
- Experience with Spark and Hadoop technologies.
- Experience with AWS big data tools is a plus.
- Experience with SQL and NoSQL database technologies such as Redshift, MongoDB, Postgres/MySQL, BigQuery, Cassandra.
- Experience with graph databases (Neo4j and OrientDB) and search databases (Elasticsearch) is a plus.
- Experience in handling ETL jobs.
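The core of the ETL workflow work described here can be sketched in a few lines of Python. Everything below is hypothetical (the `users` table, the JSON export shape, the normalization rules); a production pipeline would read from S3 and load into Redshift or similar, but the extract-transform-load shape is the same.

```python
import json
import sqlite3

def run_etl(raw_json, conn):
    """Minimal extract-transform-load pass: parse a JSON export,
    normalize emails, drop incomplete records, and upsert into a table."""
    records = json.loads(raw_json)                       # extract
    cleaned = [
        {"user_id": r["id"], "email": r["email"].strip().lower()}
        for r in records
        if r.get("email")                                # transform: skip missing emails
    ]
    conn.executemany(                                    # load, idempotently
        "INSERT OR REPLACE INTO users (user_id, email) VALUES (:user_id, :email)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, email TEXT)")
raw = '[{"id": 1, "email": " A@X.com "}, {"id": 2, "email": null}]'
print(run_etl(raw, conn))  # prints 1  (one valid record loaded)
```

Making the load step idempotent (re-running the job does not duplicate rows) is what lets a scheduler safely retry failed runs.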

Job posted by Gunjan Sheth

Data Architect

Founded: 2018
via Nu-Pie
Location: Bengaluru (Bangalore)
Experience: 10 - 18 years
Salary: ₹14,00,000 - ₹25,00,000 per year

Responsibilities:
- Coordinate and provide experience-based solutions for teams deploying business intelligence, data platform, and big data solutions.
- Strong knowledge of data warehousing and big data / analytics platform solutions, and of how data architecture fits into larger data warehousing and database implementation projects as a major component of the effort.
- Ability to guide C-level executives and chief architects at major clients (Fortune 500) in data architecture decisions.
- Experience mapping out enterprise architecture transformations over a three-year period, and leading those implementations.
- Develop, implement and maintain data architecture best practices and standards.
- Using data architecture best practices and standards, define and implement technical solutions for the movement of data throughout an organization.
- Provide leadership in related technical areas of importance, such as business intelligence reporting and analytics.
- Gather requirements for data architecture through business and technical interviews and round-table discussions.
- Evaluate and make decisions regarding the alternative processes that can be followed in data movement throughout an organization: ETL, SOA / web services, bulk load.
- Evaluate and make decisions regarding the alternative tools and platforms that can be used for data collection, data distribution and reporting.
- Show experience with the concepts of data modeling for both transaction-based systems and reporting / data warehouse systems.
- Evaluate data-related requirements around data quality and master data management, and understand and articulate how these factors apply to data architecture.
- Understand the concepts of data quality, data ownership, and data governance, and how they apply within a data architecture framework.

Requirements:
- 15+ years of experience in IT; 10+ years of experience in data-related positions and responsibilities.
- Excellent knowledge of multiple toolsets: ETL tools, reporting tools, data quality, metadata management, multiple database management systems, cloud, security, MDM tools (anything the Insights & Data service line may support in future).
- Bachelor's degree or equivalent in Computer Science, Information Systems or a related field.
- Experience in architecting, designing, developing and implementing project work within highly visible data-driven applications in very large data warehousing / data repository environments with complex processing requirements.
- A proven track record in system design and performance.
- Demonstrated experience integrating systems in multi-user, multi-platform, multitasking operating system environments.
- Working knowledge of relational databases such as Oracle, DB2, SQL Server, etc.
- Ability to advocate ideas and to objectively participate in design critique.

Ideally, the candidate should also have:
- Superb team-building skills, with a predisposition to building consensus and achieving goals through collaboration rather than direct line authority.
- A positive, results-oriented style, evidenced by listening, motivating, delegating, influencing, and monitoring the work being done.
- Strong interpersonal and communication skills with professional staff, senior-level executives, and the business community at large.
- Experience delivering enterprise architecture for data and analytics for Fortune 500 companies.
- Ability to lead client architecture leadership (CIO, Chief Architect).
- Broad understanding of data platforms and tools (cloud platforms, infrastructure, security, data movement, data engineering, visualization, MDM, DQ, lineage), with proven experience deploying architectures for the largest clients globally.
- Strong communication and facilitation skills (ability to manage workshops with 20-30 client technical resources).
- Ability to interface with the CIO; train clients on cloud, Enterprise Data Platform, and the Capgemini POV for Data (Business DL, Perform AI, Factory Model, etc.).

Job posted by Jerrin Thomas

Senior Software Engineer - BODS / ETL

Founded: 2001
Location: Bengaluru (Bangalore)
Experience: 5 - 9 years
Salary: Best in industry

EDUCATION AND EXPERIENCE REQUIREMENTS:
- Bachelor's degree in Computer Science or IT preferred.
- 9+ years of experience as an ETL Developer creating the jobs, tables and views used to build reports and dashboards for internal and external use.

KNOWLEDGE AND SKILLS REQUIREMENTS:
- Experience in full-phase ETL implementation using tools such as Talend, Informatica, Data Services or other ETL tools required to work with the Galaxy data sources.
- Experience in the analysis, design and implementation of new reporting tables or views using various ETL tools associated with the Galaxy data sources.
- Experience with ETL process management, development, data modeling, warehouse architecture and testing the data, as well as maintaining it at the enterprise level as an administrator.
- Experience in technical implementations of business intelligence applications that deliver business decision-making capabilities.
- Strong SQL knowledge with scripting languages to deal with large data sets, and the ability to create ad hoc SQL reports for validation.
- Exposure to tools like Qlik, Power BI, Business Objects, Tableau, etc. is desirable for validation.
- Galaxy data source and Hawk Marketplace projects allow for a more flexible work schedule.

JOB RESPONSIBILITIES:
- Acting as an ETL Developer expert, advising customers on best practices in creating and deploying reporting tables.
- Creating functional and technical requirements as an input to application design, business solution components and prototypes.
- Designing and developing high-value views to be used in reports and dashboards.
- Delivering high-quality business intelligence solutions to our internal customers.
- Developing business intelligence applications for data analysis, optimized for the best performance and scalability, using various reporting tools like Qlik.
- Interacting with business leaders to understand business strategy and conditions, and being able to frame problems.
- Providing on-call support for the reporting platform and its supporting tables, views, and data sources.

Job posted by Arun Jayaraman

Software Architect/CTO

Founded: 2018
via Nu-Pie
Location: Remote, Bengaluru (Bangalore)
Experience: 6 - 9 years
Salary: ₹7,00,000 - ₹20,00,000 per year

Responsibilities:
- Coordinate and provide experience-based solutions for teams deploying business intelligence, data platform, and big data solutions.
- Strong knowledge of data warehousing and big data / analytics platform solutions, and of how data architecture fits into larger data warehousing and database implementation projects as a major component of the effort.
- Ability to guide C-level executives and chief architects at major clients (Fortune 500) in data architecture decisions.
- Experience mapping out enterprise architecture transformations over a three-year period, and leading those implementations.
- Develop, implement and maintain data architecture best practices and standards.
- Using data architecture best practices and standards, define and implement technical solutions for the movement of data throughout an organization.
- Provide leadership in related technical areas of importance, such as business intelligence reporting and analytics.
- Gather requirements for data architecture through business and technical interviews and round-table discussions.
- Evaluate and make decisions regarding the alternative processes that can be followed in data movement throughout an organization: ETL, SOA / web services, bulk load.
- Evaluate and make decisions regarding the alternative tools and platforms that can be used for data collection, data distribution and reporting.
- Show experience with the concepts of data modeling for both transaction-based systems and reporting / data warehouse systems.
- Evaluate data-related requirements around data quality and master data management, and understand and articulate how these factors apply to data architecture.
- Understand the concepts of data quality, data ownership, and data governance, and how they apply within a data architecture framework.

Requirements:
- 15+ years of experience in IT; 10+ years of experience in data-related positions and responsibilities.
- Excellent knowledge of multiple toolsets: ETL tools, reporting tools, data quality, metadata management, multiple database management systems, cloud, security, MDM tools (anything the Insights & Data service line may support in future).
- Bachelor's degree or equivalent in Computer Science, Information Systems or a related field.
- Experience in architecting, designing, developing and implementing project work within highly visible data-driven applications in very large data warehousing / data repository environments with complex processing requirements.
- A proven track record in system design and performance.
- Demonstrated experience integrating systems in multi-user, multi-platform, multitasking operating system environments.
- Working knowledge of relational databases such as Oracle, DB2, SQL Server, etc.
- Ability to advocate ideas and to objectively participate in design critique.

Ideally, the candidate should also have:
- Superb team-building skills, with a predisposition to building consensus and achieving goals through collaboration rather than direct line authority.
- A positive, results-oriented style, evidenced by listening, motivating, delegating, influencing, and monitoring the work being done.
- Strong interpersonal and communication skills with professional staff, senior-level executives, and the business community at large.
- Experience delivering enterprise architecture for data and analytics for Fortune 500 companies.
- Ability to lead client architecture leadership (CIO, Chief Architect).
- Broad understanding of data platforms and tools (cloud platforms, infrastructure, security, data movement, data engineering, visualization, MDM, DQ, lineage), with proven experience deploying architectures for the largest clients globally.
- Strong communication and facilitation skills (ability to manage workshops with 20-30 client technical resources).
- Ability to interface with the CIO; train clients on cloud, Enterprise Data Platform, and the Capgemini POV for Data (Business DL, Perform AI, Factory Model, etc.).

Job posted by Sanjay Biswakarma

Talend Developer /Architect

Location: Mumbai, Pune, Bengaluru (Bangalore)
Experience: 5 - 15 years
Salary: ₹7,00,000 - ₹20,00,000 per year

JD:
Project Role: Talend Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Work Experience: 4-9 years
Work Location: Remote (until COVID restrictions lift)
Must-Have Skills: Talend ETL
Good-to-Have Skills: ETL, SQL, data visualization

Key Responsibilities:
- Expertise in any RDBMS is a plus.
- Good understanding of Test-Driven Development (unit and integration testing), with a strong focus on clean code.
- Full-stack development experience is a plus.
- Basic understanding of CI/CD.

Technical Experience:
- Experience with CI/CD tools such as Jenkins.
- Know-how or experience with Ember JS / Angular JS / JavaScript, and HTML/CSS with Bootstrap.
- Experience developing applications on AWS.
- Experience with ETL tools such as Informatica, Talend, or Matillion.

Professional Attributes:
- Willingness to learn where knowledge is lacking.
- Work individually with the client and communicate 1:1.
- Coordinate team members, if any, and mentor junior resources.

Educational Qualification: 15 years of full-time education.

Job posted by KAUSHANKKHANDWALA

Data & Solution Architect

Founded: 2013
Location: Bengaluru (Bangalore)
Experience: 8 - 15 years
Salary: ₹20,00,000 - ₹35,00,000 per year

Responsibilities include:
- Drive the strategic design of the enterprise data warehouse / data lake platform, including logical and physical platform architecture, data modelling, and support throughout the development lifecycle.
- Work with technical product leads to design and deliver solutions that optimize the use of product data.
- Lead complex discussions and engagements that may involve multiple project teams.
- Construct and maintain business and technical architecture models that leverage industry standards and use cases to create operating efficiency.
- Provide strategic guidance and direction to development teams.
- Oversee the technical project delivery process to ensure alignment and consistency with sound architecture principles.
- Identify any product or functionality gaps, and collaborate with internal product and technology teams to define the development necessary to support solution delivery.
- Be a technical subject matter expert and represent the full internal and external capabilities of the platform.
- Work closely with internal teams, including compliance, privacy, risk, information security, support and operations, to ensure all business requirements are considered in the design process.
- Participate in the development of architectures, strategies, and policies around data governance, including master data management, metadata management, data quality and data profiling.

Skills:
- 7+ years' experience building large scalable platforms, specifically with a focus on solving cross-functional requirements for data.
- Experience in the architecture, design, and implementation of analytics systems/platforms such as reporting systems, financial reconciliation and reporting, customer feedback, etc.
- Experience with and understanding of data, and of how to design data models, is required.
- Expertise in designing for reliability, availability, scalability and performance in large-scale systems.
- Extensive knowledge of data management strategies and technologies, and the ability to advocate for the right approach based on business requirements.
- Experience in designing for various physical environments, both on-premises and in the cloud, while advocating for the most efficient approach based on business requirements.
- Good understanding of SQL Server, SQL Server Warehouse, multiple ETL tools, and modelling tools.
- Good understanding of ASP.NET and C# is a plus.
- Excellent communication skills and the ability to convey complex topics through effective documentation as well as presentation.

Job posted by Ayub Pasha

Power BI
at ConsultBae

Founded: 2020
Location: Bengaluru (Bangalore)
Experience: 4 - 10 years
Salary: ₹12,00,000 - ₹15,00,000 per year

Role: Power BI Engineer / Reporting Engineer
Department: Technology
Location: Bangalore (currently work from home)

The Power BI Engineer will be part of an agile development team, building and working on enterprise-grade software systems on top of the J2EE development stack. The Engineer will get the opportunity to create the BI tool setup and also develop reports hands-on.

Responsibilities / Accountabilities:
● Design, implement, and continuously expand Snowflake data pipelines by performing extraction, transformation, and loading activities.
● Serve as the main application and database developer/administrator.
● Perform stabilization and optimization of databases, data sets, data flows, queries and views.
● Develop and maintain the data models, data flows and applications that underpin our dashboards and reports.
● Design, develop and implement continual improvements to the Power Apps tools.
● Deliver ad hoc enhancements and improvements to the Power Apps tools upon request.
● Identify opportunities to simplify applications, data flows and databases to enable effective execution tracking and reporting in the business health dashboards.
● Support the Architect, Data Warehouse & Systems Integration in expanding the data models and tools.
● Work with the Data Engineer to develop queries and views for ad hoc data extraction from Snowflake.
● Enhance and maintain the patient de-duplication and matching in Snowflake.
● Utilize established development tools, guidelines and conventions to design, develop, and test.
● Develop and manage the ETL packages that utilize open-source tools.

Technical Qualifications:
● Experience with a variety of programming and scripting languages and frameworks, including but not limited to Python, DAX, Power Query Formula Language (M), SQL, MDX and R.
● Previous demonstrable development experience with reporting tools and technology (Power BI, Power Query, Dataflows, Power Apps, Canvas Apps, Business Objects, Tableau, Tableau Prep, MS SQL, Salesforce, Snowflake).
● Demonstrable experience with application development, infrastructure delivery automation, orchestration, configuration management and testing tools.
● Extensive experience with ETL and data transformation tools such as Matillion, Snowflake and AWS.
● In-depth understanding of database management systems and scripting (SQL, NoSQL and distributed databases).
● Experience engaging closely with Solution Architects and data-use teams to deliver highly available and scalable business intelligence (BI) services with minimal or zero downtime.

Job posted by Consult Bae

Data Engineer (Databricks)

Founded: 2013
Location: Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary: ₹20,00,000 - ₹30,00,000 per year

- Insurance P&C and Specialty domain experience is a plus.
- Experience in a cloud-based architecture preferred, such as Databricks, Azure Data Lake, Azure Data Factory, etc.
- Strong understanding of ETL fundamentals and solutions; proficiency in writing advanced/complex SQL, with expertise in performance tuning and optimization of SQL queries, is required.
- Strong experience in Python/PySpark and Spark SQL.
- Experience in troubleshooting data issues, analyzing end-to-end data pipelines, and working with various teams to resolve issues and solve complex problems.
- Strong experience developing Spark applications using PySpark and SQL for data extraction, transformation, and aggregation from multiple formats, analyzing and transforming the data to uncover insights and actionable intelligence for internal and external use.
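The "advanced SQL" this role emphasizes often means window-function aggregations. Here is a small stand-in using SQLite from Python, with a hypothetical insurance `claims` table; the identical `SUM(...) OVER (...)` expression can be run in Spark SQL against a DataFrame registered as a temp view.

```python
import sqlite3

# Toy claims table standing in for an insurance data pipeline (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (policy_id, claim_date, amount)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)", [
    ("P1", "2023-01-05", 100.0),
    ("P1", "2023-02-10", 250.0),
    ("P2", "2023-01-20", 400.0),
])

# Running total of claim amount per policy, ordered by claim date
rows = conn.execute("""
    SELECT policy_id, claim_date, amount,
           SUM(amount) OVER (
               PARTITION BY policy_id ORDER BY claim_date
           ) AS running_total
    FROM claims
    ORDER BY policy_id, claim_date
""").fetchall()
print(rows)
# prints [('P1', '2023-01-05', 100.0, 100.0),
#         ('P1', '2023-02-10', 250.0, 350.0),
#         ('P2', '2023-01-20', 400.0, 400.0)]
```

Computing this with a window function instead of a self-join is the kind of optimization the "performance tuning" bullet refers to: one pass over the data rather than a join that grows with partition size.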

Job posted by Manjunath Multirecruit

Data Engineer

Founded: 2018
Location: Remote, Mumbai, Pune, Bengaluru (Bangalore)
Experience: 4 - 10 years
Salary: Best in industry

What is Contentstack? Contentstack combines the best Content Management System (CMS) and Digital Experience Platform (DXP) technology. It enables enterprises to manage content across all digital channels and create inimitable digital experiences. The Contentstack platform was designed from the ground up for large-scale, complex, and mission-critical deployments. Recently recognized as the Gartner PeerInsights Customers' Choice for WCM, Contentstack is the preferred API-first, headless CMS for enterprises across the globe.    What Are We Looking For? Contentstack is looking for a Data Engineer.   Roles and responsibilities: Primary responsibilities included designing and scaling ETL pipelines, and ensuring data sanity. Collaborate with multiple groups and produce operational efficiency Develop, construct, test and maintain architectures Align architecture with business requirements Identify ways to improve data reliability, efficiency and quality Optimize database systems for performance and reliability Implementation of model workflows to prepare/analyse/learn/predict and supply the outcomes through API contract(s) Establishing programming patterns, documenting components and provide infrastructure for analysis and execution Set up practices on data reporting and continuous monitoring Provide excellence, open to new ideas and contribute to communities Industrialise the data science models and embed intelligence in product & business applications Find hidden patterns using data Prepare data for predictive and prescriptive modeling Deploy sophisticated analytics programs, machine learning and statistical methods   Mandatory Skills 3+ relevant work experience as a Data Engineer Working experience in HDFS, Big table, MR, Spark, Data warehouse, ETL etc.. Advanced proficiency in Java,Scala, SQL, NoSQL Strong knowledge in Shell/Perl/R/Python/Ruby Proficiency in Statistical procedures, Experiments and Machine Learning techniques. 
- Exceptional problem-solving abilities

Job type – Full-time employment
Job location – Mumbai / Pune / Bangalore / Remote
Work schedule – Monday to Friday, 10am to 7pm
Minimum qualification – Graduate
Years of experience – 3+ years
No. of positions – 2
Travel opportunities – On a need basis within/outside India; candidate should have a valid passport

What Really Gets Us Excited About You?
- Experience working with product-based start-up companies
- Knowledge of working with SaaS products

What Do We Offer?

Interesting Work | We hire curious trendspotters and brave trendsetters. This is NOT your boring, routine, cushy, rest-and-vest corporate job. This is the "challenge yourself" role where you learn something new every day, never stop growing, and have fun while you're doing it.

Tribe Vibe | We are more than colleagues, we are a tribe. We have a strict "no a**hole policy" and enforce it diligently. This means we spend time together - with spontaneous office happy hours, organized outings, and community volunteer opportunities. We are a diverse and distributed team, but we like to stay connected.

Bragging Rights | We are dreamers and dream makers, hustlers, and honey badgers. Our efforts pay off and we work with the most prestigious brands, from big-name retailers to airlines to professional sports teams. Your contribution will make an impact with many of the most recognizable names in almost every industry, including Chase, the Miami HEAT, Cisco, Shell, Express, Riot Games, Icelandair, Morningstar, and many more!

A Seat at the Table | One Team One Dream is one of our values, and it shows. We don't believe in artificial hierarchies. If you're part of the tribe, you get a seat at the table. This includes unfiltered access to our C-Suite and regular updates about the business and its performance. Which, btw, is through the roof, so it's a great time to be joining…
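The "data sanity" responsibility above can be illustrated with a short sketch. This is a minimal, hypothetical example (field names, thresholds and data are invented, and this is not Contentstack's actual stack): it computes the basic quality metrics a pipeline stage might report before loading a batch.

```python
# Minimal data-sanity checks for an ETL batch (illustrative only;
# field names and the sample records are hypothetical).
def sanity_report(rows, required_fields, key_field):
    """Return basic data-quality metrics for a batch of dict records."""
    total = len(rows)
    # Count missing/empty values per required field.
    null_counts = {
        f: sum(1 for r in rows if r.get(f) in (None, ""))
        for f in required_fields
    }
    # Count duplicate primary keys.
    keys = [r.get(key_field) for r in rows]
    duplicates = total - len(set(keys))
    return {"rows": total, "null_counts": null_counts, "duplicate_keys": duplicates}

batch = [
    {"id": 1, "amount": 100, "currency": "INR"},
    {"id": 2, "amount": None, "currency": "INR"},
    {"id": 2, "amount": 250, "currency": ""},
]
report = sanity_report(batch, required_fields=["amount", "currency"], key_field="id")
# report: {"rows": 3, "null_counts": {"amount": 1, "currency": 1}, "duplicate_keys": 1}
```

In a real pipeline such a report would be emitted to monitoring, with the batch rejected or quarantined when a metric breaches a threshold.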

Job posted by
Rahul Jana

QA Engineer (ETL Testing)

Founded 2015
Location icon
Remote, Bengaluru (Bangalore)
Experience icon
2 - 6 years
Salary icon
Best in industry (₹8,00,000 to ₹20,00,000 per year)

Must have – Please look for profiles with an ETL testing background (2-6 years' experience); the following three skills are mandatory:
- Python scripting / PySpark experience
- SQL querying
- Data warehousing implementation experience

Good to have:
- Knowledge of working on Azure Databricks using PySpark
- Knowledge of working with ADLS Gen 2 - Delta Lake (Delta Tables)
- Any ETL job orchestration experience
- Experience in Agile methodologies
- Code migration to QA/UAT/PROD through DevOps
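A common ETL testing pattern combining the mandatory SQL-querying and scripting skills is source-to-target reconciliation. The sketch below is illustrative only: SQLite stands in for the actual source and warehouse databases, and the table and column names are hypothetical.

```python
import sqlite3

# Source-to-target reconciliation: compare row counts and a control
# total between the source extract and the loaded warehouse table.
# SQLite is a stand-in here; table/column names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def reconcile(con, src, tgt):
    """Return True if row count and SUM(amount) match between tables."""
    src_cnt, src_sum = con.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_cnt, tgt_sum = con.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return src_cnt == tgt_cnt and src_sum == tgt_sum

match = reconcile(con, "src_orders", "tgt_orders")
```

In PySpark the same check would typically compare `df.count()` and aggregate sums between the source DataFrame and the target Delta table.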

Job posted by
Priyanka U

Business Analyst(Telecom)- For Career break Candidates

Location icon
Bengaluru (Bangalore), Gurgaon
Experience icon
4 - 7 years
Salary icon
Best in industry (₹8,00,000 to ₹13,00,000 per year)

- Strong analytical skills with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy
- Prior experience with statistical modelling techniques, AI/ML models, etc. will be a value add
- Working knowledge of reporting packages (Business Objects, Qlik, Power BI, etc.) and ETL frameworks will be an advantage
- Knowledge of statistics and experience using statistical packages for analysing datasets (MS Excel, SPSS, SAS, etc.)
- Experience in Python, R and other scripting languages is desirable, but not a must

Job posted by
Rampriya K

Data Engineer

Founded 2016
Location icon
Remote, Bengaluru (Bangalore)
Experience icon
2 - 5 years
Salary icon
Best in industry (₹6,00,000 to ₹12,00,000 per year)

We are actively seeking a Senior Data Engineer experienced in building data pipelines and integrations from third-party data sources by writing custom automated ETL jobs in Python. The role works in partnership with other members of the Business Analytics team to support the development and implementation of new and existing data warehouse solutions for our clients. This includes designing database import/export processes used to generate client data warehouse deliverables.

Requirements:
- 2+ years of experience as an ETL developer with strong data architecture knowledge around data warehousing concepts, SQL development and optimization, and operational support models
- Experience using Python to automate ETL and data-processing jobs
- Design and develop ETL and data processing solutions using data integration tools, Python scripts, and AWS / Azure / on-premise environments
- Experience with (or willingness to learn) AWS Glue / AWS Data Pipeline / Azure Data Factory for data integration
- Develop and create transformation queries, views, and stored procedures for ETL processes and process automation
- Document data mappings, data dictionaries, processes, programs, and solutions as per established standards for data governance
- Work with the data analytics team to assess and troubleshoot potential data quality issues at key intake points, such as validating control totals at intake and then upon transformation, and transparently build lessons learned into future data quality assessments
- Solid experience with data modeling, business logic, and RESTful APIs
- Solid experience in the Linux environment
- Experience with NoSQL / PostgreSQL preferred
- Experience working with databases such as MySQL, NoSQL, and Postgres, and enterprise-level connectivity experience (such as connecting over TLS and through proxies)
- Experience with NGINX and SSL
- Performance-tune data processes and SQL queries, and recommend and implement data process optimization and query-tuning techniques
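A custom Python ETL job of the kind described, with control totals validated at intake and again after transformation, can be sketched as follows. This is a minimal illustration under assumed names and data (the CSV columns, table name, and cents-to-units transform are all invented for the example):

```python
import csv
import io
import sqlite3

# Minimal custom ETL job: extract from CSV, transform, load into SQLite,
# validating a control total at intake and after transformation.
# All names and data are illustrative.
raw = "id,amount_cents\n1,10000\n2,25050\n3,7525\n"

# Extract, capturing a control total at intake.
rows = list(csv.DictReader(io.StringIO(raw)))
intake_total = sum(int(r["amount_cents"]) for r in rows)

# Transform: cents -> currency units, then re-check the control total.
transformed = [(int(r["id"]), int(r["amount_cents"]) / 100) for r in rows]
assert round(sum(amt for _, amt in transformed) * 100) == intake_total

# Load, then verify what landed.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO payments VALUES (?, ?)", transformed)
loaded = con.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
# loaded == (3, 425.75)
```

In production the same checks would run against the real source extract and warehouse, with a failed control total halting the load rather than raising an assertion.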

Job posted by
Pavel Gupta

Data Engineer

Founded 2002
via PayU
Location icon
Remote, Bengaluru (Bangalore)
Experience icon
2 - 5 years
Salary icon
Best in industry (₹5,00,000 to ₹20,00,000 per year)

Role: Data Engineer
Company: PayU
Location: Bangalore / Mumbai
Experience: 2-5 yrs

About Company:
PayU is the payments and fintech business of Prosus, a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities.

The leading online payment service provider in 36 countries, PayU is dedicated to creating a fast, simple and efficient payment process for merchants and buyers. Focused on empowering people through financial services and creating a world without financial borders where everyone can prosper, PayU is one of the biggest investors in the fintech space globally, with investments totalling $700 million to date. PayU also specializes in credit products and services for emerging markets across the globe. We are dedicated to removing risks to merchants, allowing consumers to use credit in ways that suit them and enabling a greater number of global citizens to access credit services.

Our local operations in Asia, Central and Eastern Europe, Latin America, the Middle East, Africa and South East Asia enable us to combine the expertise of high-growth companies with our own unique local knowledge and technology to ensure that our customers have access to the best financial services.

India is the biggest market for PayU globally, and the company has already invested $400 million in this region in the last 4 years. In its next phase of growth, PayU is developing a full regional fintech ecosystem providing multiple digital financial services in one integrated experience. We are going to do this through 3 mechanisms: build; co-build/partner; select strategic investments.

PayU supports over 350,000+ merchants and millions of consumers making payments online with over 250 payment methods and 1,800+ payment specialists.
The markets in which PayU operates represent a potential consumer base of nearly 2.3 billion people and a huge growth potential for merchants.

Job responsibilities:
- Design infrastructure for data, especially for (but not limited to) consumption in machine learning applications
- Define the database architecture needed to combine and link data, and ensure integrity across different sources
- Ensure performance of data systems for machine learning, from customer-facing web and mobile applications using cutting-edge open source frameworks, to highly available RESTful services, to back-end Java-based systems
- Work with large, fast, complex data sets to solve difficult, non-routine analysis problems, applying advanced data handling techniques where needed
- Build data pipelines, including implementing, testing, and maintaining infrastructural components related to the data engineering stack
- Work closely with Data Engineers, ML Engineers and SREs to gather data engineering requirements and to prototype, develop, validate and deploy data science and machine learning solutions

Requirements to be successful in this role:
- Strong knowledge of and experience in Python, Pandas, data wrangling, ETL processes, statistics, data visualisation, data modelling and Informatica
- Strong experience with scalable compute solutions such as Kafka and Snowflake
- Strong experience with workflow management libraries and tools such as Airflow, AWS Step Functions, etc.
- Strong experience with data engineering practices (i.e. data ingestion pipelines and ETL)
- A good understanding of machine learning methods, algorithms, pipelines, testing practices and frameworks
- (Preferred) MEng/MSc/PhD degree in computer science, engineering, mathematics, physics, or equivalent (preference: DS/AI)
- Experience with designing and implementing tools that support sharing of data, code,
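The core idea behind workflow managers like Airflow or AWS Step Functions is executing tasks in dependency order. The sketch below illustrates that concept with the standard library only; it is not Airflow's API, and the task names form a hypothetical extract/transform/validate/load chain.

```python
from graphlib import TopologicalSorter

# Conceptual sketch of DAG-ordered task execution, the idea Airflow
# provides as a managed service. Task names are hypothetical.
ran = []

def make_task(name):
    def task():
        ran.append(name)  # stand-in for real pipeline work
    return task

tasks = {name: make_task(name) for name in ["extract", "transform", "validate", "load"]}

# Each key lists the tasks it depends on.
deps = {"transform": {"extract"}, "validate": {"transform"}, "load": {"validate"}}

# Run every task after all of its dependencies have completed.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

# ran == ["extract", "transform", "validate", "load"]
```

Real orchestrators add what this sketch omits: scheduling, retries, backfills, parallel execution of independent tasks, and per-task monitoring.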