PL/SQL Developer

Posted by Gayatri Mane
3 - 5 yrs
₹2L - ₹8L / yr
Raipur
Skills
PL/SQL
Databases
Oracle SQL Developer

Job Title: Oracle PL/SQL Developer

Qualification: B.E./B.Tech or Master's degree in Computer Science or IT

Years of Experience: 3 – 7 years

No. of Open Positions: 3

Job Location: Jaipur

  1. Proven hands-on database development experience
  2. Develop, design, test, and implement complex database programs
  3. Strong experience with Oracle functions, procedures, triggers, packages, and performance tuning
  4. Ensure that database programs comply with V3 standards
  5. Hands-on development using Oracle PL/SQL (see the sketch after this list)
  6. Performance-tune SQL statements, application programs, and instances
  7. Evaluate new and upcoming technologies
  8. Provide technical assistance, problem resolution, and troubleshooting support
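For a flavor of the day-to-day work, here is a minimal PL/SQL sketch of a package with a function and a procedure. It is illustrative only; the emp_admin package, the employees table, and its columns are hypothetical, not from this posting.

  -- Package specification: the public interface
  CREATE OR REPLACE PACKAGE emp_admin AS
    FUNCTION get_salary(p_emp_id IN NUMBER) RETURN NUMBER;
    PROCEDURE give_raise(p_emp_id IN NUMBER, p_pct IN NUMBER);
  END emp_admin;
  /

  -- Package body: the implementation
  CREATE OR REPLACE PACKAGE BODY emp_admin AS
    FUNCTION get_salary(p_emp_id IN NUMBER) RETURN NUMBER IS
      v_salary employees.salary%TYPE;
    BEGIN
      SELECT salary INTO v_salary
      FROM   employees
      WHERE  employee_id = p_emp_id;
      RETURN v_salary;
    END get_salary;

    PROCEDURE give_raise(p_emp_id IN NUMBER, p_pct IN NUMBER) IS
    BEGIN
      -- Apply a percentage raise; transaction control is left to the caller
      UPDATE employees
      SET    salary = salary * (1 + p_pct / 100)
      WHERE  employee_id = p_emp_id;
    END give_raise;
  END emp_admin;
  /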

 


About Quantela

Founded: 2015
Stage: Raised funding

Connect with the team: Leena Choudhary, Sai Singh

Similar jobs

Lagozon Technologies Pvt Ltd
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 3 yrs
₹8L - ₹12L / yr
QlikView
Qlik Sense
QMC
QVD
PL/SQL

Business Intelligence Consultant – Qlik

 

Role

·      Work through customer specifications and develop solutions in line with defined requirements
·      Strategize and ideate the solution design (create prototypes and/or wireframes) before building the application or solution
·      Create load scripts and QVDs to support dashboards
·      Create data models in Qlik Sense to support dashboards
·      Lead data discovery, assessment, analysis, modeling, and mapping efforts for Qlik dashboards
·      Develop visual reports, dashboards, and KPI scorecards using Qlik
·      Connect to data sources (MS SQL Server, Oracle, SAP), import data, and transform it for business intelligence
·      Translate data into informative visuals and reports
·      Develop, publish, and schedule reports as per business requirements
·      Implement application security layer models in Qlik

 

Skills Required

·      Knowledge of data visualization and data analytics principles and skills, including good user experience/UI design
·      Hands-on Qlik Sense development experience
·      Knowledge of writing SQL queries (see the sketch after this list)
·      Exceptional analytical skills, problem-solving skills, and excellent communication skills
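For illustration, dashboards like these are typically fed by aggregate extract queries against the source database. A minimal Oracle-style sketch; the sales_orders table and its columns are hypothetical:

  -- Aggregate sales by region and category for a dashboard extract
  SELECT region,
         product_category,
         SUM(net_amount)          AS total_sales,
         COUNT(DISTINCT order_id) AS order_count
  FROM   sales_orders
  WHERE  order_date >= DATE '2024-01-01'
  GROUP  BY region, product_category;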

 

Qualifications

1.      Degree in a Computer Science or Engineering discipline, or MCA

2.      2-4 years of hands-on Qlik experience

3.      Qlik Sense certification preferred

 

 

LS Spectrum Solutions Private Limited
Mumbai
5 - 10 yrs
₹8L - ₹15L / yr
PostgreSQL
Database Design
Database architecture
Databases
Linux/Unix
+5 more

Job Description

▪ You are responsible for setting up, operating, and monitoring LS system solutions on premises and in the cloud
▪ You are responsible for the analysis and long-term elimination of system errors
▪ You provide support in the area of information and IT security
▪ You will work on the strategic further development and optimization of the platform in use
▪ You will work in a global, international team

Requirement profile

▪ You have successfully completed an apprenticeship or degree in the field of IT
▪ You can demonstrate in-depth knowledge and experience in the following areas:
▪ PostgreSQL databases
▪ Linux (e.g. Ubuntu, Oracle Linux, RHEL)
▪ Windows (e.g. Windows Server 2019/2022)
▪ Automation / IaC (e.g. Ansible, Terraform)
▪ Containerization with Kubernetes / virtualization with VMware is an advantage
▪ Service APIs (AWS, Azure)
▪ You have very good knowledge of English; knowledge of German is an advantage
▪ You are a born team player, show high commitment, and are resilient

Cloudera
Posted by Sushmitha Rengarajan
Bengaluru (Bangalore)
3 - 20 yrs
₹1L - ₹44L / yr
ETL
Informatica
Data Warehouse (DWH)
Relational Database (RDBMS)
Data Structures
+7 more

 

The Cloudera Data Warehouse Hive team is looking for a passionate senior developer to join our growing engineering team. This group targets the biggest enterprises wanting to use Cloudera's services in private and public cloud environments. Our product is built on open source technologies like Hive, Impala, Hadoop, Kudu, Spark, and many more, providing unlimited learning opportunities.

A Day in the Life

Over the past 10+ years, Cloudera has experienced tremendous growth, making us the leading contributor to Big Data platforms and ecosystems and a leading provider of enterprise solutions based on Apache Hadoop. You will work with some of the best engineers in the industry, who are tackling challenges that will continue to shape the Big Data revolution. We foster an engaging, supportive, and productive work environment where you can do your best work. The team culture values engineering excellence, technical depth, grassroots innovation, teamwork, and collaboration.

You will manage product development for our CDP components and develop engineering tools and scalable services to enable efficient development, testing, and release operations. You will be immersed in many exciting, cutting-edge technologies and projects, collaborating with developers, testers, product, field engineers, and our external partners, both software and hardware vendors.

Opportunity

Cloudera is a leader in the fast-growing big data platforms market. This is a rare chance to make a name for yourself in the industry and in the Open Source world. The candidate will be responsible for the Apache Hive and CDW projects, working on them both upstream and downstream. If you are curious about the project and its code quality, you can check out the Apache Hive project and its code, and you can even start development before you join; this is one of the beauties of the OSS world.

 

Responsibilities:

•Build robust and scalable data infrastructure software
•Design and create services and system architecture for your projects
•Improve code quality through writing unit tests, automation, and code reviews
•Write Java code and/or build several services in the Cloudera Data Warehouse
•Work with a team of engineers who review each other's code and designs and hold each other to an extremely high bar for quality
•Understand the basics of Kubernetes
•Build out the production and test infrastructure
•Develop automation frameworks to reproduce issues and prevent regressions
•Work closely with other developers providing services to our system
•Help analyze and understand how customers use the product, and improve it where necessary

Qualifications:

•Deep familiarity with the Java programming language
•Hands-on experience with distributed systems
•Knowledge of database concepts and RDBMS internals
•Knowledge of the Hadoop stack, containers, or Kubernetes is a strong plus (see the sketch after this list)
•Experience working in a distributed team
•3+ years of experience in software development
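For a flavor of the Hive SQL layer this team builds, a minimal HiveQL sketch; the page_views table and its columns are hypothetical, not from the posting:

  -- Partitioned, ORC-backed table (hypothetical schema)
  CREATE TABLE page_views (
    user_id BIGINT,
    url     STRING
  )
  PARTITIONED BY (view_date DATE)
  STORED AS ORC;

  -- Aggregation pruned to a single partition
  SELECT url, COUNT(*) AS views
  FROM page_views
  WHERE view_date = DATE '2024-06-01'
  GROUP BY url
  ORDER BY views DESC
  LIMIT 10;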

 

Sixt R&D
Bengaluru (Bangalore)
5 - 8 yrs
₹11L - ₹14L / yr
SQL
Python
RESTful APIs
Business Intelligence (BI)
QuickSight
+6 more

Technical Requirements: 

  • Bachelor's degree in Computer Science or a related technical field, and solid years of relevant experience. 
  • A strong grasp of SQL/Presto and at least one scripting or programming language (Python preferred). 
  • Experience with enterprise-class BI tools and their auditing, along with automation using REST APIs. 
  • Experience with reporting tools: QuickSight preferred, with at least 2 years hands-on. 
  • Tableau/Looker: either or both would suffice, with at least 5+ years hands-on. 
  • 5+ years of experience with and detailed knowledge of data warehouse technical architectures, data modelling, infrastructure components, ETL/ELT and reporting/analytic tools and environments, data structures, and hands-on SQL coding. 
  • 5+ years of demonstrated quantitative and qualitative business intelligence experience. 
  • Experience delivering significant business impact through product analysis. 
  • 4+ years of large IT project delivery for BI-oriented projects using an agile framework. 
  • 2+ years of working with very large data warehousing environments. 
  • Experience designing and delivering cross-functional custom reporting solutions. 
  • Excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical stakeholders. 
  • Proven ability to meet tight deadlines, multi-task, and prioritize workload. 
  • A work ethic based on a strong desire to exceed expectations. 
  • Strong analytical skills and the ability to challenge processes.
Mobile Programming LLC
Posted by Keerthi Varman
Bengaluru (Bangalore)
3 - 8 yrs
₹10L - ₹14L / yr
Oracle SQL Developer
PL/SQL
ETL
Informatica
Data Warehouse (DWH)
+4 more
The role and responsibilities of an Oracle PL/SQL Developer and Database Administrator:
• Working knowledge of XML, JSON, shell, and other DBMS scripts
• Hands-on experience with Oracle 11g and 12c; working knowledge of Oracle 18c and 19c
• Analysis, design, coding, testing, debugging, and documentation; complete knowledge of the Software Development Life Cycle (SDLC)
• Writing complex queries, stored procedures, functions, and packages
• Knowledge of REST services, Oracle's UTL and DBMS built-in packages, and data integration is required
• Good knowledge of table-level partitioning and row locks, and experience with OLTP
• Awareness of ETL tools, data migration, and data mapping functionalities
• Understand business requirements and transform/design them into business solutions; perform data modelling and implement business rules using Oracle database objects
• Define source-to-target data mapping and data transformation logic as per the business need
• Experience creating and maintaining materialized views, plus performance tuning and impact analysis (see the sketch after this list)
• Monitoring and optimizing database performance; planning for backup and recovery of database information; maintaining archived data; backing up and restoring databases
• Hands-on experience with Oracle SQL Developer
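For illustration, a minimal Oracle materialized-view sketch of the kind this role would create and maintain; the orders table and its columns are hypothetical, not from the posting:

  -- Pre-aggregated daily sales, refreshed on demand (hypothetical schema)
  CREATE MATERIALIZED VIEW mv_daily_sales
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
  AS
  SELECT TRUNC(order_date) AS order_day,
         SUM(amount)       AS total_amount,
         COUNT(*)          AS order_count
  FROM   orders
  GROUP  BY TRUNC(order_date);

  -- Complete refresh via Oracle's built-in DBMS_MVIEW package
  BEGIN
    DBMS_MVIEW.REFRESH('MV_DAILY_SALES', method => 'C');
  END;
  /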
Bengaluru (Bangalore)
8 - 12 yrs
₹35L - ₹50L / yr
Databases
MySQL
MongoDB
API
Kafka
+1 more

What is the role?

You will be responsible for building and maintaining highly scalable data infrastructure for our cloud-hosted SaaS product. You will work closely with the Product Managers and the technical team to define and implement data pipelines for customer-facing and internal reports.

Key Responsibilities

  • Design and develop resilient data pipelines.
  • Write efficient queries to fetch data from the report database.
  • Work closely with application backend engineers on data requirements for their stories.
  • Design and develop report APIs for the front end to consume.
  • Focus on building highly available, fault-tolerant report systems.
  • Constantly improve the architecture of the application by clearing the technical backlog.
  • Adopt a culture of learning and development to constantly keep pace with and adopt new technologies.

What are we looking for?

An enthusiastic individual with the following skills. Please do not hesitate to apply even if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.

  • Education: BE/MCA or equivalent
  • Overall 8+ years of experience
  • Expert-level understanding of database concepts and BI
  • Well versed in databases such as MySQL and MongoDB, with hands-on experience in creating data models
  • Must have designed and implemented low-latency data warehouse systems
  • Must have a strong understanding of Kafka and related systems
  • Experience with the ClickHouse database preferred (see the sketch after this list)
  • Must have good knowledge of APIs and be able to build interfaces for frontend engineers
  • Should be innovative and communicative in approach
  • Will be responsible for the functional/technical track of a project
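As a flavor of the reporting workload, a minimal ClickHouse sketch; the events table and its columns are hypothetical, not from the posting:

  -- Events table laid out for fast report scans (hypothetical schema)
  CREATE TABLE events
  (
      event_date Date,
      user_id    UInt64,
      event_type String
  )
  ENGINE = MergeTree
  ORDER BY (event_date, event_type);

  -- Low-latency aggregate of the kind a report API would serve
  SELECT event_type, count() AS events
  FROM events
  WHERE event_date >= today() - 7
  GROUP BY event_type
  ORDER BY events DESC;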

Whom will you work with?

You will work with a top-notch tech team, working closely with the CTO and product team.  

What can you look for?

A wholesome opportunity in a fast-paced environment that will enable you to juggle multiple concepts while maintaining quality, interact and share your ideas, and learn a great deal at work. Work with a team of highly talented young professionals and enjoy the benefits.

We are

A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore, and Dublin. We have three products in our portfolio: Plum, Empuls, and Compass. We work with over 1,000 global clients, helping them engage and motivate their employees, sales teams, channel partners, and consumers for better business results.

Marktine
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 7 yrs
₹5L - ₹10L / yr
Data Warehouse (DWH)
Spark
Data engineering
Python
PySpark
+5 more

Basic Qualifications

- Working knowledge of AWS Redshift

- Minimum 1 year of designing and implementing a fully operational, production-grade, large-scale data solution on Snowflake Data Warehouse (see the sketch after this list)

- 3 years of hands-on experience building productized data ingestion and processing pipelines using Spark, Scala, and Python

- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions

- Expertise in and an excellent understanding of Snowflake internals, and of integrating Snowflake with other data processing and reporting technologies

- Excellent presentation and communication skills, both written and verbal

- Ability to problem-solve and architect in an environment with unclear requirements
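For illustration, a minimal Snowflake ingestion sketch; the table, stage, and file layout are hypothetical, not from the posting:

  -- Land semi-structured JSON into a raw table (hypothetical names)
  CREATE TABLE raw_orders (payload VARIANT);

  COPY INTO raw_orders
  FROM @orders_stage/daily/
  FILE_FORMAT = (TYPE = 'JSON');

  -- Pull a field out of the JSON payload
  SELECT payload:order_id::NUMBER AS order_id
  FROM raw_orders;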

Bungee Tech India
Posted by Abigail David
Remote, NCR (Delhi | Gurgaon | Noida), Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Big Data
Hadoop
Apache Hive
Spark
ETL
+3 more

Company Description

At Bungee Tech, we help retailers and brands meet customers everywhere, on every occasion they are in. We believe that accurate, high-quality data matched with compelling market insights empowers retailers and brands to keep their customers at the center of all the innovation and value they are delivering.

 

We provide retailers and brands a clear and complete omnichannel picture of their competitive landscape. We collect billions of data points every day, multiple times a day, from publicly available sources. Using high-quality extraction, we uncover detailed information on products or services, which we automatically match and then proactively track for price, promotion, and availability. Plus, anything we do not match helps to identify a new assortment opportunity.

 

Empowered with this unrivalled intelligence, we unlock compelling analytics and insights that, once blended with verified partner data from trusted sources such as Nielsen, paint a complete, consolidated picture of the competitive landscape.

We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then maintaining, implementing, and monitoring them.

You will also be responsible for integrating them with the architecture used in the company.

 

We're working on the future. If you are seeking an environment where you can drive innovation, if you want to apply state-of-the-art software technologies to solve real-world problems, and if you want the satisfaction of providing visible benefit to end users in an iterative, fast-paced environment, this is your opportunity.

 

Responsibilities

As an experienced member of the team, in this role, you will:

 

  • Contribute to evolving the technical direction of analytical systems and play a critical role in their design and development

  • Research, design, code, troubleshoot, and support. What you create is also what you own.

  • Develop the next generation of automation tools for monitoring and measuring data quality, with associated user interfaces.

  • Broaden your technical skills and work in an environment that thrives on creativity, efficient execution, and product innovation.

 

BASIC QUALIFICATIONS

  • Bachelor's degree or higher in an analytical area such as Computer Science, Physics, Mathematics, Statistics, Engineering, or similar.
  • 5+ years of relevant professional experience in Data Engineering and Business Intelligence.
  • 5+ years with advanced SQL (analytical functions), ETL, and Data Warehousing (see the sketch after this list).
  • Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments, data structures, data modeling, and performance tuning.
  • Ability to effectively communicate with both business and technical teams.
  • Excellent coding skills in Java, Python, C++, or an equivalent object-oriented programming language.
  • Understanding of relational and non-relational databases and basic SQL.
  • Proficiency with at least one of these scripting languages: Perl, Python, Ruby, or shell script.
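For illustration, "analytical functions" here refers to SQL window functions; a minimal sketch, where the product_sales table and its columns are hypothetical:

  -- Rank products by revenue within each category
  SELECT category,
         product_id,
         revenue,
         RANK() OVER (PARTITION BY category ORDER BY revenue DESC) AS revenue_rank,
         SUM(revenue) OVER (PARTITION BY category)                 AS category_revenue
  FROM   product_sales;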

 

PREFERRED QUALIFICATIONS

 

  • Experience building data pipelines from application databases.
  • Experience with AWS services: S3, Redshift, Spectrum, EMR, Glue, Athena, ELK, etc.
  • Experience working with Data Lakes.
  • Experience providing technical leadership and mentoring other engineers on data engineering best practices.
  • Sharp problem-solving skills and the ability to resolve ambiguous requirements.
  • Experience working with Big Data.
  • Knowledge of and experience working with Hive and the Hadoop ecosystem.
  • Knowledge of Spark.
  • Experience working with Data Science teams.
Netconnect Pvt. Ltd.
Posted by Ruchika M
Pune
3 - 5 yrs
₹3L - ₹10L / yr
Perl
Shell Scripting
PL/SQL
Skills required:

• Experienced developer in shell scripting
• Perl scripting
• PL/SQL knowledge is required
• Advanced communication skills are a must
• Ability to learn new applications and technologies
Velocity Services
Bengaluru (Bangalore)
4 - 8 yrs
₹20L - ₹35L / yr
Data engineering
Data Engineer
Big Data
Big Data Engineer
Python
+10 more

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world-class tech team. We are a young, passionate team with a strong grip on this space and are looking to onboard enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative, fast-paced work environment and, most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc. 

 

Key Responsibilities

  • Build data and analytical engineering pipelines with standard ELT patterns, implement data compaction pipelines and data modelling, and oversee overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Write pipelines to consume data from multiple sources

  • Write a data transformation layer using DBT to transform millions of rows of data for the data warehouse (see the sketch after this list)

  • Implement data warehouse entities with common reusable data model designs, with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
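For illustration, a DBT transformation layer consists of SQL model files with Jinja references; a minimal sketch, where the stg_payments source model and its columns are hypothetical:

  -- models/daily_payments.sql: aggregate raw payments into a daily summary
  {{ config(materialized='table') }}

  SELECT
      payment_date,
      SUM(amount) AS total_amount,
      COUNT(*)    AS payment_count
  FROM {{ ref('stg_payments') }}
  GROUP BY payment_date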

 

What To Bring

  • 3+ years of software development experience; startup experience is a plus

  • Past experience working with Airflow and DBT is preferred

  • 2+ years of experience working in any backend programming language

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server, or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development)

  • Experience formulating ideas, building proofs of concept (POCs), and converting them into production-ready projects

  • Experience building and deploying applications on on-premise and AWS or Google Cloud infrastructure

  • A basic understanding of Kubernetes and Docker is a must

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English

 

 

 
