


ETL Jobs in Bangalore (Bengaluru)

Explore top ETL job opportunities in Bangalore (Bengaluru) at top companies and startups. All jobs are added by verified employees who can be contacted directly below.

Data Engineer

Founded 2017
Products and services
Location: Bengaluru (Bangalore)
Experience: 2 - 5 years
Salary: Best in industry, 5 - 20 lacs/annum

Who are we, and why Yulu for you?
Yulu's vision is to reduce traffic congestion by providing a scalable, affordable, efficient and clean solution for first-mile, last-mile and short-distance commutes. We strongly believe that short commutes can be made more efficient, affordable and green, ensuring that future generations have a healthy environment to live in. We want to replace large personal vehicles with a smaller form factor that does not require fossil fuel. Yulu uses IoT technology to create a vast network of shared bikes that can be rented easily through a user-friendly app on a pay-per-use model. We are a mission-driven company based in Bangalore with a solid founding team and big ambitions. At Yulu we believe that every day is a mission, and success is most exciting when it's attained with fun and as a team. If you want to solve the problem of traffic congestion, reduce pollution, positively impact society and be part of an international, dynamic team, join us in our journey to shape a better India for our future generations.

Responsibilities
- Work with Big Data tools and frameworks to provide requested capabilities
- Identify development needs in order to improve and streamline operations
- Develop and manage BI solutions
- Implement ETL processes and data warehousing
- Monitor performance and manage infrastructure

Skills required
- 2+ years of experience in a relevant field
- Proficient understanding of distributed computing principles
- Proficiency with Hadoop and Spark
- Experience building stream-processing systems using solutions such as Kafka and Spark Streaming (a sketch of this pattern follows below)
- Good knowledge of data querying tools such as SQL and Hive
- Experience integrating data from multiple data sources
- Knowledge of various ETL techniques and frameworks
- Experience with at least one of Python, Java or Scala
- Experience with cloud services such as AWS or GCP
- Experience with NoSQL databases such as DynamoDB or MongoDB is an advantage
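The stream-processing requirement above (Kafka plus Spark Streaming feeding a warehouse or BI layer) is the kind of pipeline this role centres on. Purely as a hedged illustration, here is a minimal PySpark Structured Streaming sketch that reads JSON ride events from a Kafka topic and lands them as Parquet for downstream ETL. The topic name, event schema, servers and paths are invented for the example, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("ride-events-etl").getOrCreate()

# Hypothetical event schema, for illustration only.
event_schema = StructType([
    StructField("ride_id", StringType()),
    StructField("bike_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("lat", DoubleType()),
    StructField("lon", DoubleType()),
])

# Read raw events from Kafka; bootstrap servers and topic are placeholders.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "ride-events")
    .load()
)

# Parse the JSON payload and keep only well-formed records.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .where(col("ride_id").isNotNull())
)

# Land the parsed stream as partitioned Parquet for downstream warehouse/BI loads.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/lake/ride_events")
    .option("checkpointLocation", "/data/checkpoints/ride_events")
    .partitionBy("bike_id")
    .start()
)
query.awaitTermination()
```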

Job posted by Abhisek Prasad

Data Engineer

Founded 2004
Products and services
Location: Bengaluru (Bangalore)
Experience: 2 - 5 years
Salary: Best in industry, 5 - 8 lacs/annum

Who we are
Searce is a cloud, automation & analytics led business transformation company focused on helping "futurify" businesses. We help our clients become successful by reimagining "what's next" and then enabling them to realize that "now". We processify, saasify, innovify & futurify businesses by leveraging Cloud | Analytics | Automation | BPM.

What we believe
- Best practices are overrated: implementing best practices can only make one "average".
- Honesty and transparency: we believe in the naked truth. We do what we say and say what we do.
- Client partnership: a client-vendor relationship? No. We partner with clients instead, and our sales team is made up 100% of our clients.

How we work
It's all about being happier first; the rest follows. Searce's work culture is defined by HAPPIER:
- Humble: happy people don't carry ego around. We listen to understand, not to respond.
- Adaptable: we are comfortable with uncertainty and accept change well, because that's what life is about.
- Positive: we are super positive about work and life in general. We forgive and forget; we don't hold grudges. We don't have the time or space for them.
- Passionate: we are as passionate about the great vada-pao vendor across the street as about Tesla's newest model. Passion is what drives us to work and deliver the quality we deliver.
- Innovative: innovate or die. We love to challenge the status quo.
- Experimental: we encourage curiosity and making mistakes.
- Responsible: driven, self-motivated, self-governing teams. We own it.

We welcome *really unconventional* creative thinkers who can work in an agile, flexible environment. We are a flat organization with unlimited growth opportunities and small team sizes, where flexibility is a must, mistakes are encouraged, creativity is rewarded, and excitement is required.

Introduction
When was the last time you thought about rebuilding your smartphone charger using solar panels on your backpack, or changed the sequencing of switches in your bedroom (on your own, of course) to make it more meaningful, or pointed out an engineering flaw in the sequencing of traffic-signal lights to a fellow passenger while he gave you a blank look? If the last time this happened was more than six months ago, you are a dinosaur for our needs. If it was less than six months ago, did you act on it? If yes, then let's talk.

We are quite keen to meet you if:
- You eat, dream, sleep and play with cloud data stores and engineering your processes on cloud architecture.
- You have an insatiable thirst for exploring improvements, optimizing processes, and motivating people.
- You like experimenting, taking risks and thinking big.

Three things this position is NOT about:
- This is NOT just a job; this is a passionate hobby for the right kind.
- This is NOT a boxed position. You will code, clean, test, build and recruit, and it will not feel like "work".
- This is NOT a position for people who like to spend more time talking than doing.

Three things this position IS about:
- Attention to detail matters.
- Roles, titles and ego do not matter; getting things done matters; getting things done quicker and better matters the most.
- Are you passionate about learning new domains and architecting solutions that could save a company millions of dollars?

Roles and responsibilities
- Drive and define database design and development of real-time, complex products.
- Strive for excellence in customer experience, technology, methodology, and execution.
- Define and own the end-to-end architecture from definition phase to go-live.
- Define reusable components/frameworks, common schemas, standards and tools, and help bootstrap the engineering team.
- Tune application and database performance and optimize code.
- Define database strategy, database design & development standards and SDLC, database customization and extension patterns, database deployment and upgrade methods, database integration patterns, and data governance policies.
- Architect and develop database schemas, indexing strategies, views, and stored procedures for cloud applications (a small star-schema sketch follows below).
- Assist in defining scope and sizing of work; analyze and derive NFRs; participate in proof-of-concept development.
- Contribute to innovation and continuous enhancement of the platform.
- Define and implement a strategy for data services to be used by cloud and web-based applications.
- Improve the performance, availability, and scalability of the physical database, including the database access layer, database calls, and SQL statements.
- Design robust cloud management implementations, including orchestration and catalog capabilities.
- Architect and design distributed data-processing solutions using big data technologies (an added advantage).
- Demonstrate thought leadership in cloud computing across multiple channels and become a trusted advisor to decision-makers.

Desired skills
- Experience with data warehouse design, ETL (extraction, transformation & load), and architecting efficient software designs for DW platforms.
- Hands-on experience in the big data space (Hadoop stack: M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; knowledge of NoSQL stores is a plus).
- Knowledge of other transactional database management systems, open database systems and NoSQL databases (MongoDB, Cassandra, HBase, etc.) is a plus.
- Good knowledge of data management principles such as data architecture, data governance, very large database (VLDB) design, distributed database design, data replication, and high availability.
- Must have experience designing large-scale, highly available, fault-tolerant OLTP data management systems.
- Solid knowledge of at least one industry-leading RDBMS such as Oracle, SQL Server, DB2 or MySQL.
- Expertise in providing data architecture solutions and recommendations that are technology-neutral.
- Experience in architecture consulting engagements is a plus.
- Deep understanding of technical and functional designs for databases, data warehousing, reporting, and data mining.

Education & experience
- Bachelor's in Engineering or Computer Science (preferably from a premier school); an advanced degree in Engineering, Mathematics, Computer Science or Information Technology is welcome. A highly analytical aptitude and a strong desire to deliver outlive those fancy degrees, more so if you have been a techie since the age of 12.
- 2-5 years of experience in database design and development.
- 0+ years of AWS, Google Cloud Platform, or Hadoop experience.
- Experience working in a hands-on, fast-paced, creative entrepreneurial environment in a cross-functional capacity.
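As a small, hedged illustration of the schema, indexing and view work described in the responsibilities above, the sketch below builds a toy star schema using Python's standard sqlite3 module. The table, column and view names are hypothetical and not Searce's actual model; a production warehouse would sit on an RDBMS or cloud warehouse rather than SQLite, but the design pattern is the same.

```python
import sqlite3

# In-memory database purely for the sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A small star schema: one fact table keyed to two dimensions, with indexes on
# the foreign keys that analytical queries will join and filter on.
cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region        TEXT
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240115
    full_date TEXT NOT NULL,
    month     INTEGER,
    year      INTEGER
);

CREATE TABLE fact_orders (
    order_id     INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL NOT NULL
);

CREATE INDEX idx_fact_orders_customer ON fact_orders(customer_key);
CREATE INDEX idx_fact_orders_date     ON fact_orders(date_key);
""")

# A typical reporting view joining the fact table to its dimensions.
cur.execute("""
CREATE VIEW v_revenue_by_region AS
SELECT c.region, d.year, d.month, SUM(f.amount) AS revenue
FROM fact_orders f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date     d ON d.date_key     = f.date_key
GROUP BY c.region, d.year, d.month
""")
conn.commit()
```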

Job posted by Vishal Jarsania

Senior BI & ETL Developer

via Wibmo
Founded 1999
Products and services
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: Best in industry, 10 - 15 lacs/annum

Critical tasks and expected contributions/results
The role is primarily focused on the design, development and testing of ETL workflows (using Talend), as well as the batch management and error-handling processes. You will also build business intelligence applications using tools like Power BI. Additional responsibilities include documenting technical specifications and related project artefacts.
- Gather requirements and propose possible ETL solutions for the in-house-designed data warehouse.
- Analyze and translate functional specifications and change requests into technical specifications.
- Design and create star-schema data models.
- Design, build and implement business intelligence solutions using Power BI.
- Develop, implement and test ETL program logic (a plain-Python sketch of the batch pattern follows below).
- Handle deployment and support any related issues.

Key competencies
- A good understanding of the concepts and best practices of data warehouse ETL design, and the ability to apply these to solve specific business needs.
- Expert knowledge of an ETL tool such as Talend.
- More than 8 years of experience designing and developing ETL work packages, with demonstrable expertise in Talend.
- Knowledge of BI tools such as Power BI is required.
- Ability to follow functional ETL specifications and challenge business logic and schema design where appropriate, as well as manage time effectively.
- Exposure to performance tuning is essential.
- Good organisational skills.
- A methodical and structured approach to design and development.
- Good interpersonal skills.
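Talend workflows are assembled in its graphical designer rather than hand-written, so purely as an illustration of the extract-transform-load and batch error-handling pattern described above, here is a plain-Python sketch using pandas. The file paths, column names and Parquet target are assumptions for the example, not part of any actual Talend job.

```python
import logging
import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_batch")

def extract(path: str) -> pd.DataFrame:
    """Read the raw source extract (a CSV is assumed for the sketch)."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple cleansing and derive the columns the warehouse expects."""
    df = df.dropna(subset=["txn_id", "amount"])
    df["amount"] = df["amount"].astype(float)
    df["load_date"] = pd.Timestamp.utcnow().date().isoformat()
    return df

def load(df: pd.DataFrame, target: str) -> None:
    """Write the curated batch; a real job would load into the warehouse instead."""
    df.to_parquet(target, index=False)

def run_batch(source: str, target: str) -> None:
    try:
        raw = extract(source)
        curated = transform(raw)
        load(curated, target)
        log.info("Batch loaded %d rows", len(curated))
    except Exception:
        # Batch management: log the failure and re-raise so the scheduler
        # can mark the run as failed and trigger error handling.
        log.exception("Batch failed for source %s", source)
        raise

if __name__ == "__main__":
    run_batch("transactions.csv", "transactions.parquet")
```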

Job posted by Shirin AM

Database Architect

Founded 2017
Products and services
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary: Best in industry, 10 - 20 lacs/annum

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization, driving high-visibility, cross-division outcomes. Expected deliverables include developing Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key responsibilities
- Create a GRAND Data Lake and warehouse that pools the data from GRAND's different regions and stores in GCC.
- Ensure source data quality measurement, enrichment, and reporting of data quality.
- Manage all ETL and data model update routines.
- Integrate new data sources into the DWH.
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure.

Skills needed
- Very strong SQL, with demonstrated RDBMS experience; Unix shell scripting preferred (e.g., SQL, Postgres, MongoDB, etc.).
- Experience with Unix and comfort working with the shell (bash or KRON preferred).
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce.
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments.
- Work with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for them.
- Handle cluster maintenance as well as creation and removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise, and others.
- Tune the performance of Hadoop clusters and Hadoop MapReduce routines.
- Screen Hadoop cluster job performance and plan capacity.
- Monitor Hadoop cluster connectivity and security.
- Manage and monitor the file system; support and maintain HDFS.
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
- Define, develop, document and maintain Hive-based ETL mappings and scripts (a PySpark SQL sketch of this pattern follows below).
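As a hedged sketch of the Hive-based ETL mapping duty listed above, the snippet below expresses one such mapping as PySpark SQL against Hive tables: it aggregates a raw point-of-sale landing table into a partitioned daily-sales warehouse table. The database, table and column names are illustrative assumptions, not GRAND's actual model, and dynamic-partition settings may differ per cluster.

```python
from pyspark.sql import SparkSession

# Spark session with Hive support so spark.sql() can read and write Hive tables.
spark = (
    SparkSession.builder.appName("store-sales-hive-etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Allow dynamic partitioning for the INSERT OVERWRITE below (cluster settings may vary).
spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

# Target warehouse table, partitioned by sale date (illustrative names).
spark.sql("""
CREATE TABLE IF NOT EXISTS dwh.daily_store_sales (
    store_id     STRING,
    total_amount DOUBLE,
    txn_count    BIGINT
)
PARTITIONED BY (sale_date STRING)
STORED AS PARQUET
""")

# The ETL mapping: aggregate the raw landing table into the daily partitioned table.
spark.sql("""
INSERT OVERWRITE TABLE dwh.daily_store_sales PARTITION (sale_date)
SELECT store_id,
       SUM(amount)                     AS total_amount,
       COUNT(*)                        AS txn_count,
       CAST(to_date(txn_ts) AS STRING) AS sale_date
FROM raw.pos_transactions
GROUP BY store_id, to_date(txn_ts)
""")
```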

Job posted by Rahul Malani